Updates from: 07/22/2023 01:32:06
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Add Api Connector Token Enrichment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/add-api-connector-token-enrichment.md
You can create an API endpoint using one of our [samples](api-connector-samples.
To use an [API connector](api-connectors-overview.md), you first create the API connector and then enable it in a user flow.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Under **Azure services**, select **Azure AD B2C**.
1. Select **API connectors**, and then select **New API connector**.
To use an [API connector](api-connectors-overview.md), you first create the API
Follow these steps to add an API connector to a sign-up user flow.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Under **Azure services**, select **Azure AD B2C**.
1. Select **User flows**, and then select the user flow you want to add the API connector to.
1. Select **API connectors**, and then select the API endpoint you want to invoke at the **Before sending the token (preview)** step in the user flow:
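At the **Before sending the token** step, the API endpoint returns a continuation response whose extra claims are merged into the token. A minimal sketch of such a response body (the `loyaltyNumber` claim and the `<extensions-app-id>` segment are hypothetical placeholders, not from this article):

```python
import json

# Sketch only: a continuation response an API connector endpoint might return
# at the "Before sending the token" step. The loyaltyNumber claim and the
# <extensions-app-id> segment are hypothetical placeholders.
def continuation_response(loyalty_number):
    return {
        "version": "1.0.0",
        "action": "Continue",
        "postalCode": "12349",  # overrides a built-in claim
        "extension_<extensions-app-id>_loyaltyNumber": loyalty_number,
    }

print(json.dumps(continuation_response("1234"), indent=2))
```

Claims returned this way appear in the issued token alongside the claims Azure AD B2C collected itself.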
active-directory-b2c Add Sign In Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/add-sign-in-policy.md
The sign-in policy lets users:
* Sign in with an Azure AD B2C Local Account
* Sign in with a social account
* Reset their password
-* Users cannot sign up for an Azure AD B2C Local Account. To create an account, an administrator can use [Azure portal](manage-users-portal.md#create-a-consumer-user), or [MS Graph API](microsoft-graph-operations.md).
+* Users cannot sign up for an Azure AD B2C Local Account. To create an account, an administrator can use the [Azure portal](manage-users-portal.md#create-a-consumer-user), or [Microsoft Graph API](microsoft-graph-operations.md).
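For the Microsoft Graph route, a local account is created by POSTing a user object whose `identities` collection holds an `emailAddress` sign-in identity. A minimal sketch of the request body (the display name, email, tenant name, and password are placeholder values):

```python
import json

# Sketch only: the request body for creating an Azure AD B2C local account
# via Microsoft Graph (POST https://graph.microsoft.com/v1.0/users).
# The display name, email, tenant name, and password are placeholders.
def build_local_account(display_name, email, tenant, password):
    return {
        "displayName": display_name,
        "identities": [
            {
                "signInType": "emailAddress",  # local account that signs in with an email address
                "issuer": f"{tenant}.onmicrosoft.com",
                "issuerAssignedId": email,
            }
        ],
        "passwordProfile": {
            "password": password,
            "forceChangePasswordNextSignIn": False,
        },
        "passwordPolicies": "DisablePasswordExpiration",
    }

body = build_local_account("Casey Jensen", "casey@contoso.com", "contoso", "placeholder-P@ssw0rd")
print(json.dumps(body, indent=2))
```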
![Sign-in user flow](./media/add-sign-in-policy/sign-in-user-flow.png)
active-directory-b2c Conditional Access User Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/conditional-access-user-flow.md
Azure AD B2C **Premium P2** is required to create risky sign-in policies. **Prem
To add a Conditional Access policy, disable security defaults:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Under **Azure services**, select **Azure Active Directory**. Or use the search box to find and select **Azure Active Directory**.
The claims transformation isn't limited to the `strongAuthenticationPhoneNumber`
To review the result of a Conditional Access event:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Under **Azure services**, select **Azure AD B2C**. Or use the search box to find and select **Azure AD B2C**.
active-directory-b2c Configure A Sample Node Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/configure-a-sample-node-web-app.md
-+ Last updated 07/07/2022
active-directory-b2c Configure Security Analytics Sentinel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/configure-security-analytics-sentinel.md
After you configure your Azure AD B2C instance to send logs to Azure Monitor, en
> [!IMPORTANT]
> To enable Microsoft Sentinel, obtain Contributor permissions to the subscription in which the Microsoft Sentinel workspace resides. To use Microsoft Sentinel, use Contributor or Reader permissions on the resource group to which the workspace belongs.
-1. Go to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select the subscription where the Log Analytics workspace is created.
3. Search for and select **Microsoft Sentinel**.
active-directory-b2c Configure User Input https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/configure-user-input.md
In this article, you collect a new attribute during your sign-up journey in Azur
## Add user attributes to your user flow
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Under **Azure services**, select **Azure AD B2C**. Or use the search box to find and select **Azure AD B2C**.
active-directory-b2c Custom Email Mailjet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/custom-email-mailjet.md
If you don't already have one, start by setting up a Mailjet account (Azure cust
Next, store the Mailjet API key in an Azure AD B2C policy key for your policies to reference.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
active-directory-b2c Custom Email Sendgrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/custom-email-sendgrid.md
Be sure to complete the section in which you [create a SendGrid API key](https:/
Next, store the SendGrid API key in an Azure AD B2C policy key for your policies to reference.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
active-directory-b2c Disable Email Verification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/disable-email-verification.md
Some application developers prefer to skip email verification during the sign-up
Follow these steps to disable email verification:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
The **LocalAccountSignUpWithLogonEmail** technical profile is a [self-asserted](
## Test your policy
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
active-directory-b2c Enable Authentication Python Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/enable-authentication-python-web-app.md
-+ Last updated 06/28/2022
active-directory-b2c Find Help Open Support Ticket https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/find-help-open-support-ticket.md
If you're unable to find answers by using self-help resources, you can open an o
> [!NOTE]
> For billing or subscription issues, use the [Microsoft 365 admin center](https://admin.microsoft.com).
-1. Sign in to [the Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the Azure Active Directory (Azure AD) tenant that contains your Azure subscription:
active-directory-b2c Identity Provider Azure Ad Multi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/identity-provider-azure-ad-multi-tenant.md
This article shows you how to enable sign-in for users using the multi-tenant en
## Register an Azure AD app
-To enable sign-in for users with an Azure AD account in Azure Active Directory B2C (Azure AD B2C), you need to create an application in [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md).
+To enable sign-in for users with an Azure AD account in Azure Active Directory B2C (Azure AD B2C), you need to create an application in the [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md).
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your organizational Azure AD tenant (for example, Contoso). Select the **Directories + subscriptions** icon in the portal toolbar.
If the sign-in process is successful, your browser is redirected to `https://jwt
- Learn how to [pass the Azure AD token to your application](idp-pass-through-user-flow.md).
- Check out the Azure AD multi-tenant federation [Live demo](https://github.com/azure-ad-b2c/unit-tests/tree/main/Identity-providers#azure-active-directory), and how to pass Azure AD access token [Live demo](https://github.com/azure-ad-b2c/unit-tests/tree/main/Identity-providers#azure-active-directory-with-access-token)
active-directory-b2c Identity Provider Azure Ad Single Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/identity-provider-azure-ad-single-tenant.md
As of November 2020, new application registrations show up as unverified in the
## Register an Azure AD app
-To enable sign-in for users with an Azure AD account from a specific Azure AD organization, in Azure Active Directory B2C (Azure AD B2C), you need to create an application in [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md).
+To enable sign-in for users with an Azure AD account from a specific Azure AD organization, in Azure Active Directory B2C (Azure AD B2C), you need to create an application in the [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md).
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your organizational Azure AD tenant (for example, Contoso):
active-directory-b2c Identity Provider Microsoft Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/identity-provider-microsoft-account.md
zone_pivot_groups: b2c-policy-type
## Create a Microsoft account application
-To enable sign-in for users with a Microsoft account in Azure Active Directory B2C (Azure AD B2C), you need to create an application in [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md). If you don't already have a Microsoft account, you can get one at [https://www.live.com/](https://www.live.com/).
+To enable sign-in for users with a Microsoft account in Azure Active Directory B2C (Azure AD B2C), you need to create an application in the [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md). If you don't already have a Microsoft account, you can get one at [https://www.live.com/](https://www.live.com/).
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
To enable sign-in for users with a Microsoft account in Azure Active Directory B
## Configure Microsoft as an identity provider
-1. Sign in to the [Azure portal](https://portal.azure.com/) as the global administrator of your Azure AD B2C tenant.
+1. Sign in to the [Azure portal](https://portal.azure.com) as the global administrator of your Azure AD B2C tenant.
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Choose **All services** in the top-left corner of the Azure portal, search for and select **Azure AD B2C**.
If you want to get the `family_name` and `given_name` claims from Azure AD, you
Now that you've created the application in your Azure AD tenant, you need to store that application's client secret in your Azure AD B2C tenant.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
active-directory-b2c Multi Factor Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/multi-factor-authentication.md
With [Conditional Access](conditional-access-identity-protection-overview.md) us
::: zone pivot="b2c-user-flow"
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
In Azure AD B2C, you can delete a user's TOTP authenticator app enrollment. Then
### Delete TOTP authenticator app enrollment using the Azure portal
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. In the left menu, select **Users**.
active-directory-b2c Partner Arkose Labs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-arkose-labs.md
The following diagram illustrates how the Arkose Labs platform integrates with A
### Create an ArkoseSessionToken custom attribute
-To create a custom attribute:
+To create a custom attribute:
-1. Go to the [Azure portal](https://ms.portal.azure.com/#home), then to **Azure AD B2C**.
+1. Sign in to the [Azure portal](https://portal.azure.com), then navigate to **Azure AD B2C**.
2. Select **User attributes**.
3. Select **Add**.
4. Enter **ArkoseSessionToken** as the attribute Name.
active-directory-b2c Partner Bloksec https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-bloksec.md
Learn more: [Send a sign out request](./openid-connect.md#send-a-sign-out-reques
For the following instructions, use the directory that contains your Azure AD B2C tenant.
-1. Sign-in to the [Azure portal](https://portal.azure.com/#home) as Global Administrator of your Azure AD B2C tenant.
+1. Sign in to the [Azure portal](https://portal.azure.com) as Global Administrator of your Azure AD B2C tenant.
2. In the portal toolbar, select **Directories + subscriptions**.
3. On the **Portal settings, Directories + subscriptions** page, in the **Directory name** list, find your Azure AD B2C directory.
4. Select **Switch**.
For the following instructions, ensure BlokSec is a new OIDC identity provider (
Store the client secret you noted in your Azure AD B2C tenant. For the following instructions, use the directory with your Azure AD B2C tenant.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the portal toolbar, select **Directories + subscriptions**.
3. On the **Portal settings, Directories + subscriptions** page, in the **Directory name** list, find your Azure AD B2C directory.
4. Select **Switch**.
In the following example, for the `CustomSignUpOrSignIn` user journey, the Refer
For the following instructions, use the directory with your Azure AD B2C tenant.
-1. Sign in to the [Azure portal](https://portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the portal toolbar, select **Directories + subscriptions**.
3. On the **Portal settings, Directories + subscriptions** page, in the **Directory name** list, find your Azure AD B2C directory.
4. Select **Switch**.
Learn more: [Tutorial: Register a web application in Azure Active Directory B2C]
* [Azure AD B2C custom policy overview](./custom-policy-overview.md)
* [Tutorial: Create user flows and custom policies in Azure AD B2C](./tutorial-create-user-flows.md?pivots=b2c-custom-policy)
active-directory-b2c Partner Nevis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-nevis.md
The diagram shows the implementation.
### Integrate Azure AD B2C with Nevis
-1. Go to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Switch to your Azure AD B2C tenant. Note: the Azure AD B2C tenant is usually a separate tenant.
3. In the menu, select **Identity Experience Framework (IEF)**.
4. Select **Policy Keys**.
The diagram shows the implementation.
## Next steps

- [Custom policies in Azure AD B2C](./custom-policy-overview.md)
- [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy)
active-directory-b2c Partner Web Application Firewall https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-web-application-firewall.md
To enable WAF, configure a WAF policy and associate it with the AFD for protecti
Create a WAF policy with Azure-managed default rule set (DRS). See, [Web Application Firewall DRS rule groups and rules](../web-application-firewall/afds/waf-front-door-drs.md).
-1. Go to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Create a resource**.
3. Search for Azure WAF.
4. Select **Azure Web Application Firewall (WAF)**.
active-directory-b2c Threat Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/threat-management.md
The first 10 lockout periods are one minute long. The next 10 lockout periods ar
To manage smart lockout settings, including the lockout threshold:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
active-directory-b2c Tutorial Create Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/tutorial-create-tenant.md
Before you create your Azure AD B2C tenant, you need to take the following consi
> [!NOTE]
> If you're unable to create an Azure AD B2C tenant, [review your user settings page](tenant-management-check-tenant-creation-permission.md) to ensure that tenant creation isn't switched off. If tenant creation is switched on, ask your *Global Administrator* to assign you a **Tenant Creator** role.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the Azure Active Directory (Azure AD) tenant that contains your subscription:
You can link multiple Azure AD B2C tenants to a single Azure subscription for bi
Azure AD B2C allows you to activate the Go-Local add-on on an existing tenant as long as your tenant stores data in a country/region that has a local data residency option. To opt in to the Go-Local add-on, use the following steps:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Make sure you're using the directory that contains your Azure AD B2C tenant:
You only need to perform this operation once. Before performing these steps, mak
1. In the **All services** search box, search for **Azure AD B2C**, hover over the search result, and then select the star icon in the tooltip. **Azure AD B2C** now appears in the Azure portal under **Favorites**.
1. If you want to change the position of your new favorite, go to the Azure portal menu, select **Azure AD B2C**, and then drag it up or down to the desired position.
- ![Azure AD B2C, Favorites menu, Microsoft Azure portal](media/tutorial-create-tenant/portal-08-b2c-favorite.png)
+ ![Azure AD B2C, Favorites menu, Azure portal](media/tutorial-create-tenant/portal-08-b2c-favorite.png)
## Next steps
active-directory-b2c View Audit Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/view-audit-logs.md
The activity details panel contains the following relevant information:
The Azure portal provides access to the audit log events in your Azure AD B2C tenant.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Switch to the directory that contains your Azure AD B2C tenant, and then browse to **Azure AD B2C**.
1. Under **Activities** in the left menu, select **Audit logs**.
active-directory-domain-services Ad Auth No Join Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/ad-auth-no-join-linux-vm.md
+ Last updated 01/29/2023 - # Active Directory authentication non domain joined Linux Virtual Machines
active-directory-domain-services Csp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/csp.md
The following important considerations apply when administering a managed domain
## Next steps
-To get started, [enroll in the Azure CSP program](/partner-center/enrolling-in-the-csp-program). You can then enable Azure AD Domain Services using [the Azure portal](tutorial-create-instance.md) or [Azure PowerShell](powershell-create-instance.md).
+To get started, [enroll in the Azure CSP program](/partner-center/enrolling-in-the-csp-program). You can then enable Azure AD Domain Services using the [Azure portal](tutorial-create-instance.md) or [Azure PowerShell](powershell-create-instance.md).
active-directory-domain-services Join Centos Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/join-centos-linux-vm.md
ms.assetid: 16100caa-f209-4cb0-86d3-9e218aeb51c6
+ Last updated 06/17/2021 - # Join a CentOS Linux virtual machine to an Azure Active Directory Domain Services managed domain
active-directory-domain-services Join Rhel Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/join-rhel-linux-vm.md
ms.assetid: 16100caa-f209-4cb0-86d3-9e218aeb51c6
+ Last updated 07/13/2020 - # Join a Red Hat Enterprise Linux virtual machine to an Azure Active Directory Domain Services managed domain
active-directory-domain-services Join Suse Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/join-suse-linux-vm.md
+ Last updated 01/29/2023 - # Join a SUSE Linux Enterprise virtual machine to an Azure Active Directory Domain Services managed domain
active-directory-domain-services Join Ubuntu Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/join-ubuntu-linux-vm.md
Last updated 01/29/2023 --+ # Join an Ubuntu Linux virtual machine to an Azure Active Directory Domain Services managed domain
active-directory Application Provisioning Config Problem Scim Compatibility https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/application-provisioning-config-problem-scim-compatibility.md
Below are sample requests to help outline what the sync engine currently sends v
## Upgrading from the older customappsso job to the SCIM job

Following the steps below will delete your existing customappsso job and create a new SCIM job.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the **Azure Active Directory > Enterprise Applications** section of the Azure portal, locate and select your existing SCIM application.
3. In the **Properties** section of your existing SCIM app, copy the **Object ID**.
4. In a new web browser window, go to https://developer.microsoft.com/graph/graph-explorer and sign in as the administrator for the Azure AD tenant where your app is added.
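Behind the Graph Explorer steps, the upgrade boils down to three calls against the Microsoft Graph synchronization API. A sketch of the request shapes (the service principal object ID and job ID are placeholders you copy from the portal and the GET response):

```python
# Sketch only: the Microsoft Graph synchronization API calls behind the
# Graph Explorer steps. The service principal object ID and job ID are
# placeholders copied from the portal / the GET response.
GRAPH = "https://graph.microsoft.com/v1.0"

def sync_job_requests(sp_object_id, old_job_id):
    base = f"{GRAPH}/servicePrincipals/{sp_object_id}/synchronization/jobs"
    return [
        ("GET", base, None),                       # list jobs; note the customappsso job's ID
        ("DELETE", f"{base}/{old_job_id}", None),  # delete the old customappsso job
        ("POST", base, {"templateId": "scim"}),    # create a new job from the SCIM template
    ]

for method, url, payload in sync_job_requests("<object-id>", "<job-id>"):
    print(method, url, payload or "")
```

Each request is issued with an admin token in Graph Explorer; only the final POST carries a body.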
Following the steps below will delete your existing customappsso job and create
## Downgrading from the SCIM job to the customappsso job (not recommended)

We allow you to downgrade back to the old behavior but don't recommend it, as customappsso doesn't benefit from some of the updates we make and may not be supported forever.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the **Azure Active Directory > Enterprise Applications > Create application** section of the Azure portal, create a new **Non-gallery** application.
3. In the **Properties** section of your new custom app, copy the **Object ID**.
4. In a new web browser window, go to https://developer.microsoft.com/graph/graph-explorer and sign in as the administrator for the Azure AD tenant where your app is added.
active-directory Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/known-issues.md
Previously updated : 06/08/2023 Last updated : 07/21/2023 zone_pivot_groups: app-provisioning-cross-tenant-synchronization
This article discusses known issues to be aware of when you work with app provis
### Unsupported synchronization scenarios

- Restoring a previously soft-deleted user in the target tenant
- Synchronizing groups, devices, and contacts into another tenant
- Synchronizing users across clouds
- Synchronizing photos across tenants
active-directory User Provisioning Sync Attributes For Mapping https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/user-provisioning-sync-attributes-for-mapping.md
Get-AzureADUser -ObjectId 0ccf8df6-62f1-4175-9e55-73da9e742690 | Select -ExpandP
## Create an extension attribute using cloud sync

Cloud sync will automatically discover your extensions in on-premises Active Directory when you go to add a new mapping. Use the steps below to auto-discover these attributes and set up a corresponding mapping to Azure AD.
-1. Sign-in to the Azure portal with a hybrid administrator account
-2. Select Azure AD Connect
-3. Select **Manage Azure AD cloud sync**
-4. Select the configuration you wish to add the extension attribute and mapping
-5. Under **Manage attributes** select **click to edit mappings**
+1. Sign in to the Azure portal with a hybrid administrator account.
+2. Select Azure AD Connect.
+3. Select **Manage Azure AD cloud sync**.
+4. Select the configuration to which you wish to add the extension attribute and mapping.
+5. Under **Manage attributes** select **click to edit mappings**.
6. Click **Add attribute mapping**. The attributes will automatically be discovered.
7. The new attributes will be available in the drop-down under **source attribute**.
8. Fill in the type of mapping you want and click **Apply**.
active-directory Workday Retrieve Pronoun Information https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/workday-retrieve-pronoun-information.md
Once you confirm that pronoun data is available in the *Get_Workers* response, g
To retrieve pronouns from Workday, update your Azure AD provisioning app to query Workday using v38.1 of the Workday Web Services. We recommend testing this configuration first in your test/sandbox environment before implementing the change in production.
-1. Sign-in to Azure portal as administrator.
+1. Sign in to the Azure portal as an administrator.
1. Open your *Workday to AD User provisioning* app OR *Workday to Azure AD User provisioning* app.
1. In the **Admin Credentials** section, update the **Tenant URL** to include the Workday Web Service version v38.1 as shown.
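The versioned Tenant URL follows the standard Workday Web Services endpoint shape. A small sketch of how the version segment fits in (the host and tenant names are placeholders for your Workday environment):

```python
# Sketch only: the versioned Tenant URL shape for Workday Web Services.
# The host and tenant names are placeholders for your Workday environment.
def workday_tenant_url(host, tenant, version="v38.1"):
    return f"https://{host}/ccx/service/{tenant}/Human_Resources/{version}"

print(workday_tenant_url("wd3-impl-services1.workday.com", "contoso4"))
```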
active-directory Application Proxy Configure Cookie Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-proxy/application-proxy-configure-cookie-settings.md
Additionally, if your back-end application has cookies that need to be available
## Set the cookie settings - Azure portal

To set the cookie settings using the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Navigate to **Azure Active Directory** > **Enterprise applications** > **All applications**. 3. Select the application for which you want to enable a cookie setting. 4. Click **Application Proxy**.
active-directory Application Proxy Integrate With Tableau https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-proxy/application-proxy-integrate-with-tableau.md
For:
**To publish your app**:
-1. Sign in to the [Azure portal](https://portal.azure.com) as an application administrator.
+1. Sign in to the [Azure portal](https://portal.azure.com) as an application administrator.
2. Select **Azure Active Directory > Enterprise applications**.
active-directory How To Mfa Registration Campaign https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/how-to-mfa-registration-campaign.md
No. This feature is available only for users using Azure AD Multi-Factor Authent
Nudge is available only on browsers and not on applications.
+**Can users be nudged on a mobile device?**
+
+Nudge is not available on mobile devices.
+ **How long will the campaign run for?** You can use the APIs to enable the campaign for as long as you like. Whenever you want to be done running the campaign, simply use the APIs to disable the campaign. + **Can each group of users have a different snooze duration?** No. The snooze duration for the prompt is a tenant-wide setting and applies to all groups in scope.
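Enabling and disabling the campaign through the APIs amounts to patching the registration-enforcement settings. As a minimal sketch (the property names are assumptions based on the `authenticationMethodsPolicy` resource; verify them against the Microsoft Graph documentation), the request body could be built like this:

```python
# Hypothetical sketch of the registrationEnforcement payload used to enable or
# disable the registration campaign via Microsoft Graph. Field names are
# assumptions; confirm them in the authenticationMethodsPolicy documentation.
def build_campaign_payload(state: str, snooze_days: int = 0) -> dict:
    """Build the PATCH body for the registration campaign settings."""
    if state not in {"enabled", "disabled", "default"}:
        raise ValueError("state must be 'enabled', 'disabled', or 'default'")
    return {
        "registrationEnforcement": {
            "authenticationMethodsRegistrationCampaign": {
                "state": state,
                "snoozeDurationInDays": snooze_days,
            }
        }
    }
```

Setting `snooze_days` to 0, as the FAQ notes, means users see the nudge on every MFA attempt.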
No. The feature, for now, aims to nudge users to set up the Authenticator app on
**Is there a way for me to hide the snooze option and force my users to set up the Authenticator app?**
-There is no way to hide the snooze option on the nudge. You can set the snoozeDuration to 0, which will ensure that users will see the nudge during each MFA attempt.
+Users in organizations with free and trial subscriptions can postpone the app setup up to three times. There is no way to hide the snooze option on the nudge for organizations with paid subscriptions yet. You can set the snoozeDuration to 0, which will ensure that users will see the nudge during each MFA attempt.
**Will I be able to nudge my users if I am not using Azure AD Multi-Factor Authentication?**
Yes, if they have been scoped for the nudge using the policy.
It's the same as snoozing.
-**Why don't some users see a nudge when there is a conditional access policy for "Register security information"?**
+**Why don't some users see a nudge when there is a Conditional Access policy for "Register security information"?**
+
+A nudge won't appear if a user is in scope for a Conditional Access policy that blocks access to the **Register security information** page.
-A nudge won't appear if a user is in scope for a conditional access policy that blocks access to the **Register security information** page.
+**Do users see a nudge when there is a terms of use (ToU) screen presented to the user during sign-in?**
+
+A nudge won't appear if a user is presented with the [terms of use (ToU)](/azure/active-directory/conditional-access/terms-of-use) screen during sign-in.
+
+**Do users see a nudge when Conditional Access custom controls are applicable to the sign-in?**
+
+A nudge won't appear if a user is redirected during sign-in due to [Conditional Access custom controls](/azure/active-directory/conditional-access/controls) settings.
## Next steps [Enable passwordless sign-in with Microsoft Authenticator](howto-authentication-passwordless-phone.md) +
active-directory Onboard Enable Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/cloud-infrastructure-entitlement-management/onboard-enable-tenant.md
Previously updated : 06/16/2023 Last updated : 07/21/2023
This article describes how to enable Microsoft Entra Permissions Management in y
> [!NOTE] > To complete this task, you must have *Microsoft Entra Permissions Management Administrator* permissions. You can't enable Permissions Management as a user from another tenant who has signed in via B2B or via Azure Lighthouse. ## Prerequisites
active-directory Product Reports https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/cloud-infrastructure-entitlement-management/product-reports.md
Previously updated : 02/23/2022 Last updated : 07/21/2023
Permissions Management offers the following reports for management associated wi
- **Use cases**: - Any task usage or specific task usage via User/Group/Role/App can be tracked with this report. -- **Identity privilege activity report**
- - **Summary of report**: Provides information about permission changes that have occurred in the selected duration.
- - **Applies to**: AWS, Azure, and GCP
- - **Report output type**: PDF
- - **Ability to collate report**: No
- - **Type of report**: **Summary**
- - **Use cases**:
- - Any identity permission change can be captured using this report.
- - The **Identity Privilege Activity** report has the following main sections: **User Summary**, **Group Summary**, **Role Summary**, and **Delete Task Summary**.
- - The **User** summary lists the current granted permissions and high-risk permissions and resources accessed in 1 day, 7 days, or 30 days. There are subsections for newly added or deleted users, users with PCI change, and High-risk active/inactive users.
- - The **Group** summary lists the administrator level groups with the current granted permissions and high-risk permissions and resources accessed in 1 day, 7 days, or 30 days. There are subsections for newly added or deleted groups, groups with PCI change, and High-risk active/inactive groups.
- - The **Role summary** lists similar details as **Group Summary**.
- - The **Delete Task summary** section lists the number of times the **Delete task** has been executed in the given time period.
- - **Permissions Analytics Report** - **Summary of report**: Provides information about the violation of key security best practices. - **Applies to**: AWS, Azure, and GCP
- - **Report output type**: CSV
+ - **Report output type**: CSV, PDF
- **Ability to collate report**: Yes - **Type of report**: **Detailed** - **Use cases**:
Permissions Management offers the following reports for management associated wi
- **Applies to**: AWS, Azure, GCP - **Report output type**: CSV - **Ability to collate report**: Yes
- - **Type of report**: **Detailed**
+ - **Type of report**: **Summary**
- **Use cases**: - This report lists all the assigned permissions for the selected identities.
active-directory Howto Add App Roles In Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-add-app-roles-in-apps.md
To create an app role by using the Azure portal's user interface:
| **Allowed member types** | Specifies whether this app role can be assigned to users, applications, or both.<br/><br/>When available to `applications`, app roles appear as application permissions in an app registration's **Manage** section > **API permissions > Add a permission > My APIs > Choose an API > Application permissions**. | `Users/Groups` | | **Value** | Specifies the value of the roles claim that the application should expect in the token. The value should exactly match the string referenced in the application's code. The value can't contain spaces. | `Survey.Create` | | **Description** | A more detailed description of the app role displayed during admin app assignment and consent experiences. | `Writers can create surveys.` |
- | **Do you want to enable this app role?** | Specifies whether the app role is enabled. To delete an app role, deselect this checkbox and apply the change before attempting the delete operation. | _Checked_ |
+ | **Do you want to enable this app role?** | Specifies whether the app role is enabled. To delete an app role, deselect this checkbox and apply the change before attempting the delete operation. This setting lets you temporarily or permanently disable an app role without removing it entirely. | _Checked_ |
1. Select **Apply** to save your changes.
+When the app role is enabled, any assigned users, applications, or groups have it included in their tokens. These can be access tokens when your app is the API being called by an app, or ID tokens when your app is signing in a user. If the app role is disabled, it becomes inactive and is no longer assignable. Previous assignees still have the app role included in their tokens, but it has no effect because it is no longer actively assignable.
+ ## Assign users and groups to roles Once you've added app roles in your application, you can assign users and groups to the roles. Assignment of users and groups to roles can be done through the portal's UI, or programmatically using [Microsoft Graph](/graph/api/user-post-approleassignments). When the users assigned to the various app roles sign in to the application, their tokens will have their assigned roles in the `roles` claim.
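Once the assigned roles arrive in the `roles` claim, the API-side check described above is a simple membership test. A minimal sketch (the claim values are illustrative; token signature validation is assumed to have already happened):

```python
# Minimal sketch: a web API checking the `roles` claim of an already-validated,
# decoded token payload. Claim values below are illustrative, not real.
def has_app_role(claims: dict, required_role: str) -> bool:
    """True if the token's roles claim includes required_role."""
    return required_role in claims.get("roles", [])

claims = {"aud": "api://survey-api", "roles": ["Survey.Create"]}
print(has_app_role(claims, "Survey.Create"))
```

A token for a user with no role assignments simply lacks the `roles` claim, which the `.get` default handles.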
The **Status** column should reflect that consent has been **Granted for \<tenan
## Usage scenario of app roles
-If you're implementing app role business logic that signs in the users in your application scenario, first define the app roles in **App registrations**. Then, an admin assigns them to users and groups in the **Enterprise applications** pane. These assigned app roles are included with any token that's issued for your application, either access tokens when your app is the API being called by an app or ID tokens when your app is signing in a user.
+If you're implementing app role business logic that signs in the users in your application scenario, first define the app roles in **App registrations**. Then, an admin assigns them to users and groups in the **Enterprise applications** pane. These assigned app roles are included with any token that's issued for your application.
If you're implementing app role business logic in an app-calling-API scenario, you have two app registrations. One app registration is for the app, and a second app registration is for the API. In this case, define the app roles and assign them to the user or group in the app registration of the API. When the user authenticates with the app and requests an access token to call the API, a roles claim is included in the token. Your next step is to add code to your web API to check for those roles when the API is called.
active-directory Howto Call A Web Api With Curl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-call-a-web-api-with-curl.md
The Microsoft identity platform requires your application to be registered befor
Follow these steps to create the web API registration:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. If access to multiple tenants is available, use the **Directories + subscriptions** filter :::image type="icon" source="media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to switch to the tenant in which you want to register the application. 1. Search for and select **Azure Active Directory**. 1. Under **Manage**, select **App registrations > New registration**.
Follow these steps to create the web app registration:
::: zone pivot="api"
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. If access to multiple tenants is available, use the Directories + subscriptions filter :::image type="icon" source="media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to switch to the tenant in which you want to register the application. 1. Search for and select **Azure Active Directory**. 1. Under **Manage**, select **App registrations** > **New registration**.
active-directory Howto Get List Of All Auth Library Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-get-list-of-all-auth-library-apps.md
No sign-in event that occurred *before* you configure Azure AD to send the event
Once you've integrated your Azure AD sign-in and audit logs with Azure Monitor as specified in the Azure Monitor integration, access the sign-ins workbook:
- 1. Sign into the Azure portal
- 1. Navigate to **Azure Active Directory** > **Monitoring** > **Workbooks**
- 1. In the **Usage** section, open the **Sign-ins** workbook
+ 1. Sign into the Azure portal.
+ 1. Navigate to **Azure Active Directory** > **Monitoring** > **Workbooks**.
+ 1. In the **Usage** section, open the **Sign-ins** workbook.
:::image type="content" source="media/howto-get-list-of-all-auth-library-apps/sign-in-workbook.png" alt-text="Screenshot of the Azure portal workbooks interface highlighting the sign-ins workbook.":::
active-directory Optional Claims https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/optional-claims.md
You can configure optional claims for your application through the Azure portal or application manifest.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search for and select **Azure Active Directory**. 1. Under **Manage**, select **App registrations**. 1. Choose the application for which you want to configure optional claims based on your scenario and desired outcome.
active-directory Reference Breaking Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/reference-breaking-changes.md
To help prevent phishing attacks, the device code flow now includes a prompt tha
The prompt that appears looks like this: ## May 2020
active-directory Test Setup Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/test-setup-environment.md
Replicating conditional access policies ensures you don't encounter unexpected b
Viewing your production tenant conditional access policies may need to be performed by a company administrator.
-1. Sign into the [Azure portal](https://portal.azure.com) using your production tenant account.
+1. Sign in to the [Azure portal](https://portal.azure.com) using your production tenant account.
1. Go to **Azure Active Directory** > **Enterprise applications** > **Conditional Access**. 1. View the list of policies in your tenant. Click the first one. 1. Navigate to **Cloud apps or actions**.
In a new tab or browser session, sign in to the [Azure portal](https://portal.az
Replicating permission grant policies ensures you don't encounter unexpected prompts for admin consent when moving to production.
-1. Sign into the [Azure portal](https://portal.azure.com) using your production tenant account.
+1. Sign in to the [Azure portal](https://portal.azure.com) using your production tenant account.
1. Click on **Azure Active Directory**. 1. Go to **Enterprise applications**. 1. From your production tenant, go to **Azure Active Directory** > **Enterprise applications** > **Consent and permissions** > **User consent** settings. Copy the settings there to your test tenant.
active-directory Troubleshoot Publisher Verification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/troubleshoot-publisher-verification.md
Below are some common issues that may occur during the process.
- **I am getting an error saying that my MPN ID is invalid or that I do not have access to it.** Follow the [remediation guidance](#mpnaccountnotfoundornoaccess). -- **When I sign into the Azure portal, I do not see any apps registered. Why?**
+- **When I sign in to the Azure portal, I do not see any apps registered. Why?**
Your app registrations may have been created using a different user account in this tenant, a personal/consumer account, or in a different tenant. Ensure you're signed in with the correct account in the tenant where your app registrations were created. - **I'm getting an error related to multi-factor authentication. What should I do?**
active-directory Howto Vm Sign In Azure Ad Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/howto-vm-sign-in-azure-ad-linux.md
-+ # Log in to a Linux virtual machine in Azure by using Azure AD and OpenSSH
active-directory Troubleshoot Mac Sso Extension Plugin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-mac-sso-extension-plugin.md
description: This article helps to troubleshoot deploying the Microsoft Enterpri
+ Last updated 02/02/2023
-#Customer intent: As an IT admin, I want to learn how to discover and fix issues related to the Microsoft Enterprise SSO plug-in on macOS and iOS.
-
+#Customer intent: As an IT admin, I want to learn how to discover and fix issues related to the Microsoft Enterprise SSO plug-in on macOS and iOS.
# Troubleshooting the Microsoft Enterprise SSO Extension plugin on Apple devices
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
# Product names and service plan identifiers for licensing
-When managing licenses in the [Azure portal](https://portal.azure.com/#blade/Microsoft_AAD_IAM/LicensesMenuBlade/Products) or the [Microsoft 365 admin center](https://admin.microsoft.com), you see product names that look something like *Office 365 E3*. When you use PowerShell v1.0 cmdlets, the same product is identified using a specific but less friendly name: *ENTERPRISEPACK*. When using PowerShell v2.0 cmdlets or [Microsoft Graph](/graph/api/resources/subscribedsku), the same product is identified using a GUID value: *6fd2c87f-b296-42f0-b197-1e91e994b900*. The following table lists the most commonly used Microsoft online service products and provides their various ID values. These tables are for reference purposes in Azure Active Directory (Azure AD), part of Microsoft Entra, and are accurate only as of the date when this article was last updated. Microsoft will continue to make periodic updates to this document.
+When [managing licenses in the Azure portal](https://portal.azure.com/#blade/Microsoft_AAD_IAM/LicensesMenuBlade/Products) or the [Microsoft 365 admin center](https://admin.microsoft.com), you see product names that look something like *Office 365 E3*. When you use PowerShell v1.0 cmdlets, the same product is identified using a specific but less friendly name: *ENTERPRISEPACK*. When using PowerShell v2.0 cmdlets or [Microsoft Graph](/graph/api/resources/subscribedsku), the same product is identified using a GUID value: *6fd2c87f-b296-42f0-b197-1e91e994b900*. The following table lists the most commonly used Microsoft online service products and provides their various ID values. These tables are for reference purposes in Azure Active Directory (Azure AD), part of Microsoft Entra, and are accurate only as of the date when this article was last updated. Microsoft will continue to make periodic updates to this document.
- **Product name**: Used in management portals - **String ID**: Used by PowerShell v1.0 cmdlets when performing operations on licenses or by the **skuPartNumber** property of the **subscribedSku** Microsoft Graph API
active-directory Code Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/code-samples.md
Last updated 04/06/2023
-+ - # Customer intent: As a tenant administrator, I want to bulk-invite external users to an organization from email addresses that I've stored in a .csv file.
active-directory Direct Federation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/direct-federation.md
Next, configure federation with the IdP configured in step 1 in Azure AD. You ca
### To configure federation in the Azure portal
-1. Sign in to the [Azure portal](https://portal.azure.com/) as an External Identity Provider Administrator or a Global Administrator.
+1. Sign in to the [Azure portal](https://portal.azure.com) as an External Identity Provider Administrator or a Global Administrator.
2. In the left pane, select **Azure Active Directory**. 3. Select **External Identities** > **All identity providers**. 4. Select **New SAML/WS-Fed IdP**.
On the **All identity providers** page, you can view the list of SAML/WS-Fed ide
You can remove your federation configuration. If you do, federation guest users who have already redeemed their invitations can no longer sign in. But you can give them access to your resources again by [resetting their redemption status](reset-redemption-status.md). To remove a configuration for an IdP in the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com/). In the left pane, select **Azure Active Directory**.
+1. Sign in to the [Azure portal](https://portal.azure.com). In the left pane, select **Azure Active Directory**.
1. Select **External Identities**. 1. Select **All identity providers**. 1. Under **SAML/WS-Fed identity providers**, scroll to the identity provider in the list or use the search box.
active-directory Google Federation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/google-federation.md
You'll now set the Google client ID and client secret. You can use the Azure por
You can delete your Google federation setup. If you do so, Google guest users who have already redeemed their invitation won't be able to sign in. But you can give them access to your resources again by [resetting their redemption status](reset-redemption-status.md). **To delete Google federation in the Azure portal**
-1. Sign in to the [Azure portal](https://portal.azure.com). On the left pane, select **Azure Active Directory**.
+1. Sign in to the [Azure portal](https://portal.azure.com). On the left pane, select **Azure Active Directory**.
2. Select **External Identities**. 3. Select **All identity providers**. 4. On the **Google** line, select the ellipsis button (**...**) and then select **Delete**.
active-directory Ops Guide Ops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/ops-guide-ops.md
If AD FS is only used for Azure AD federation, there are some endpoints that can
### Access to machines with on-premises identity components
-Organizations should lock down access to the machines with on-premises hybrid components in the same way as your on-premises domain. For example, a backup operator or Hyper-V administrator shouldn't be able to log in to the Azure AD Connect Server to change rules.
+Organizations should lock down access to the machines with on-premises hybrid components in the same way as your on-premises domain. For example, a backup operator or Hyper-V administrator shouldn't be able to sign in to the Azure AD Connect Server to change rules.
The Active Directory administrative tier model was designed to protect identity systems using a set of buffer zones between full control of the Environment (Tier 0) and the high-risk workstation assets that attackers frequently compromise.
active-directory Secure Resource Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/secure-resource-management.md
Azure AD [Conditional Access](../../role-based-access-control/conditional-access
![Diagram that shows the Conditional Access policy.](media/secure-resource-management/conditional-access.jpeg)
-For example, an administrator may configure a Conditional Access policy, which allows a user to sign into the Azure portal only from approved locations, and also requires either multifactor authentication (MFA) or a hybrid Azure AD domain-joined device.
+For example, an administrator may configure a Conditional Access policy, which allows a user to sign in to the Azure portal only from approved locations, and also requires either multifactor authentication (MFA) or a hybrid Azure AD domain-joined device.
## Azure Managed Identities
A key concept to address with the first two options is that there are two identi
* When you sign in to an Azure Windows Server VM via remote desktop protocol (RDP), you're generally logging on to the server using your domain credentials, which performs a Kerberos authentication against an on-premises AD DS domain controller or Azure AD DS. Alternatively, if the server isn't domain-joined then a local account can be used to sign in to the virtual machines.
-* When you sign into the Azure portal to create or manage a VM, you're authenticating against Azure AD (potentially using the same credentials if you've synchronized the correct accounts), and this could result in an authentication against your domain controllers should you be using Active Directory Federation Services (AD FS) or PassThrough Authentication.
+* When you sign in to the Azure portal to create or manage a VM, you're authenticating against Azure AD (potentially using the same credentials if you've synchronized the correct accounts), and this could result in an authentication against your domain controllers should you be using Active Directory Federation Services (AD FS) or PassThrough Authentication.
### Virtual machines joined to standalone Active Directory Domain Services
active-directory Entitlement Management Access Package Assignments https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/entitlement-management-access-package-assignments.md
na Previously updated : 05/31/2023 Last updated : 06/27/2023
# View, add, and remove assignments for an access package in entitlement management
-In entitlement management, you can see who has been assigned to access packages, their policy, and status. If an access package has an appropriate policy, you can also directly assign user to an access package. This article describes how to view, add, and remove assignments for access packages.
+In entitlement management, you can see who has been assigned to access packages, their policy, status, and user lifecycle (preview). If an access package has an appropriate policy, you can also directly assign users to an access package. This article describes how to view, add, and remove assignments for access packages.
## Prerequisites To use entitlement management and assign users to access packages, you must have one of the following licenses: -- Microsoft Azure AD Premium P2 or Microsoft Entra ID Governance
+- Azure AD Premium P2
- Enterprise Mobility + Security (EMS) E5 license
+- Microsoft Entra ID governance subscription
## View who has an assignment
$policy = $accesspackage.AssignmentPolicies[0]
$req = New-MgEntitlementManagementAccessPackageAssignmentRequest -AccessPackageId $accesspackage.Id -AssignmentPolicyId $policy.Id -TargetEmail "sample@example.com" ```
+## Manage user lifecycle (preview)
+
+Entitlement management also gives you visibility into the state of a guest user's lifecycle through the following viewpoints:
+
+- **Governed** - The user is set to be governed.
+- **Ungoverned** - The user is set to not be governed.
+- **Blank** - The lifecycle for the user is not determined. This happens when a user had an access package assigned before managing user lifecycle was possible.
+
+> [!NOTE]
+> When a guest user is set as **Governed**, their account is deleted or disabled a specified number of days after their last access package assignment expires, based on entitlement management tenant settings. Learn more about these settings here: [Manage external access with Azure Active Directory entitlement management](../fundamentals/6-secure-access-entitlement-managment.md).
+
+You can directly convert ungoverned users to governed by using the **Mark Guests as Governed (preview)** functionality in the top menu bar.
+
+To manage user lifecycle, follow these steps:
+
+**Prerequisite role:** Global administrator, User administrator, Catalog owner, Access package manager, or Access package assignment manager
+
+1. In the Azure portal, select **Azure Active Directory** and then select **Identity Governance**.
+
+1. In the left menu, select **Access packages** and then open the access package.
+
+1. In the left menu, select **Assignments**.
+
+1. On the assignments screen, select the user you want to manage the lifecycle for, and then select **Mark guest as governed (Preview)**.
+ :::image type="content" source="media/entitlement-management-access-package-assignments/govern-user-lifecycle.png" alt-text="Screenshot of the govern user lifecycle selection.":::
+1. Select **Save**.
+
+## Manage user lifecycle programmatically
+
+To manage user lifecycle programmatically using Microsoft Graph, see: [accessPackageSubject resource type](/graph/api/resources/accesspackagesubject).
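Marking a guest as governed through Microsoft Graph comes down to updating the subject's lifecycle state. As a hedged sketch (the property name `subjectLifecycle` and its values are assumptions; confirm them in the `accessPackageSubject` resource documentation), the PATCH body could be built like this:

```python
# Hypothetical sketch of a PATCH body for an accessPackageSubject that marks a
# guest user as governed. The property name and allowed values are assumptions;
# verify against the accessPackageSubject resource documentation.
def build_subject_lifecycle_patch(lifecycle: str) -> dict:
    """Build the PATCH body that sets a subject's lifecycle state."""
    allowed = {"governed", "notGoverned", "notDefined"}
    if lifecycle not in allowed:
        raise ValueError(f"lifecycle must be one of {sorted(allowed)}")
    return {"subjectLifecycle": lifecycle}
```

This mirrors the portal's **Mark guest as governed (Preview)** action described in the steps above.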
+ ## Remove an assignment You can remove an assignment that a user or an administrator had previously requested.
$req = New-MgEntitlementManagementAccessPackageAssignmentRequest -AccessPackageA
## Next steps - [Change request and settings for an access package](entitlement-management-access-package-request-policy.md)-- [View reports and logs](entitlement-management-reports.md)
+- [View reports and logs](entitlement-management-reports.md)
active-directory Entitlement Management Access Package First https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/entitlement-management-access-package-first.md
A resource directory has one or more resources to share. In this step, you creat
![Diagram that shows the users and groups for this tutorial.](./media/entitlement-management-access-package-first/elm-users-groups.png)
-1. Sign in to the [Azure portal](https://portal.azure.com) as a Global administrator or User administrator.
+1. Sign in to the [Azure portal](https://portal.azure.com) as a Global administrator or User administrator.
1. In the left navigation, select **Azure Active Directory**.
active-directory Entitlement Management Logic Apps Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/entitlement-management-logic-apps-integration.md
These triggers to Logic Apps are controlled in a tab within access package polic
**Prerequisite roles:** Global administrator, Identity Governance administrator, Catalog owner or Resource Group Owner
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In the Azure portal, select **Azure Active Directory** and then select **Identity Governance**.
These triggers to Logic Apps are controlled in a tab within access package polic
**Prerequisite roles:** Global administrator, Identity Governance administrator, Catalog owner, or Access package manager
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In the Azure portal, select **Azure Active Directory** and then select **Identity Governance**.
active-directory Tutorial Onboard Custom Workflow Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/tutorial-onboard-custom-workflow-portal.md
The pre-hire scenario can be broken down into the following:
## Create a workflow using prehire template Use the following steps to create a pre-hire workflow that generates a TAP and send it via email to the user's manager using the Azure portal.
- 1. Sign in to Azure portal.
+ 1. Sign in to the Azure portal.
2. On the right, select **Azure Active Directory**. 3. Select **Identity Governance**. 4. Select **Lifecycle workflows**.
active-directory Tutorial Prepare Azure Ad User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/tutorial-prepare-azure-ad-user-accounts.md
Some of the attributes required for the pre-hire onboarding tutorial are exposed
For the tutorial, the **mail** attribute only needs to be set on the manager account and the **manager** attribute set on the employee account. Use the following steps:
- 1. Sign in to Azure portal.
+ 1. Sign in to the Azure portal.
2. On the right, select **Azure Active Directory**. 3. Select **Users**. 4. Select **Melva Prince**.
active-directory Tutorial Scheduled Leaver Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/tutorial-scheduled-leaver-portal.md
The scheduled leaver scenario can be broken down into the following:
## Create a workflow using scheduled leaver template

Use the following steps to create a scheduled leaver workflow that will configure off-boarding tasks for employees after their last day of work with Lifecycle workflows using the Azure portal.
- 1. Sign in to Azure portal.
+ 1. Sign in to the Azure portal.
2. On the right, select **Azure Active Directory**. 3. Select **Identity Governance**. 4. Select **Lifecycle workflows**.
After running your workflow on-demand and checking that everything is working fine
## Next steps

- [Preparing user accounts for Lifecycle workflows](tutorial-prepare-azure-ad-user-accounts.md)
- [Automate employee offboarding tasks after their last day of work using Lifecycle Workflows APIs](/graph/tutorial-lifecycle-workflows-scheduled-leaver)
active-directory Migrate From Federation To Cloud Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/migrate-from-federation-to-cloud-authentication.md
On your Azure AD Connect server, follow steps 1-5 in [Option A](#option-a).
```powershell
Update-MgDomain -DomainId <domain name> -AuthenticationType "Managed"
```
- See [Update-MgDomain](/powershell/module/microsoft.graph.identity.directorymanagement/update-mgdomain?view=graph-powershell-1.0 &preserve-view=true)
+ 3. In the Azure portal, select **Azure Active Directory > Azure AD Connect**.
-4. Verify that the domain has been converted to managed by running the following command:
+4. Verify that the domain has been converted to managed by running the command below. The Authentication type should be set to managed.
```powershell
- Get-MgDomainFederationConfiguration -DomainId yourdomain.com
+ Get-MgDomain -DomainId yourdomain.com
```

## Complete your migration
active-directory Application Sign In Unexpected User Consent Prompt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/application-sign-in-unexpected-user-consent-prompt.md
Further prompts can be expected in various scenarios:
To ensure the permissions granted for the application are up-to-date, you can compare the permissions that are being requested by the application with the permissions already granted in the tenant.
-1. Sign-in to the Azure portal with an administrator account.
+1. Sign in to the Azure portal with an administrator account.
2. Navigate to **Enterprise applications**. 3. Select the application in question from the list. 4. Under Security in the left-hand navigation, choose **Permissions**.
To ensure the permissions granted for the application are up-to-date, you can co
If the application requires assignment, individual users can't consent for themselves. To check if assignment is required for the application, do the following:
-1. Sign-in to the Azure portal with an administrator account.
+1. Sign in to the Azure portal with an administrator account.
2. Navigate to **Enterprise applications**. 3. Select the application in question from the list. 4. Under Manage in the left-hand navigation, choose **Properties**.
If the application requires assignment, individual users can't consent for thems
Whether an individual user can consent to an application is configured by each organization and may differ from directory to directory. Even if no permission requires admin consent by default, your organization may have disabled user consent entirely, preventing individual users from consenting for themselves to an application. To view your organization's user consent settings, do the following:
-1. Sign-in to the Azure portal with an administrator account.
+1. Sign in to the Azure portal with an administrator account.
2. Navigate to **Enterprise applications**. 3. Under Security in the left-hand navigation, choose **Consent and permissions**. 4. View the user consent settings. If set to *Do not allow user consent*, users will never be able to consent on behalf of themselves for an application.
active-directory F5 Bigip Deployment Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-bigip-deployment-guide.md
You can deploy a BIG-IP in different topologies. This guide focuses on a network
To deploy BIG-IP VE from the [Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps):
-1. Log into the [Azure portal](https://portal.azure.com/#home) using an account with permissions to create VMs. For example, Contributor.
+1. Sign in to the [Azure portal](https://portal.azure.com/#home) using an account with permissions to create VMs, such as Contributor.
2. In the top ribbon search box, type **marketplace** 3. Select **Enter**. 4. Type **F5** into the Marketplace filter.
active-directory Grant Admin Consent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/grant-admin-consent.md
https://login.microsoftonline.com/{organization}/adminconsent?client_id={client-
where:

- `{client-id}` is the application's client ID (also known as app ID).
-- `{organization}` is the tenant ID or any verified domain name of the tenant you want to consent the application in. You can use the value `common`, which will cause the consent to happen in the home tenant of the user you sign in with.
+- `{organization}` is the tenant ID or any verified domain name of the tenant you want to consent the application in. You can use the value `organizations`, which will cause the consent to happen in the home tenant of the user you sign in with.
As always, carefully review the permissions an application requests before granting consent.
+For more information on constructing the tenant-wide admin consent URL, see [Admin consent on the Microsoft identity platform](../develop/v2-admin-consent.md).
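As an illustrative sketch only, the admin consent URL described above can be assembled from its two variable parts; the helper and the client ID below are placeholders, not a real app registration:

```python
def admin_consent_url(organization: str, client_id: str) -> str:
    # organization: tenant ID or a verified domain name of the target tenant
    #               (or the literal value "organizations").
    # client_id: the application's client (app) ID.
    return (f"https://login.microsoftonline.com/{organization}"
            f"/adminconsent?client_id={client_id}")

print(admin_consent_url("contoso.onmicrosoft.com",
                        "11111111-1111-1111-1111-111111111111"))
```

An administrator opens the resulting URL in a browser and signs in to grant tenant-wide consent.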
:::zone-end
active-directory Home Realm Discovery Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/home-realm-discovery-policy.md
The following JSON object is an example HRD policy definition:
{
  "AccelerateToFederatedDomain":true,
  "PreferredDomain":"federated.example.edu",
- "AllowCloudPasswordValidation":false,
+ "AllowCloudPasswordValidation":false
  }
}
```
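The trailing comma removed in the change above matters because the policy definition must be strict JSON, which forbids a comma after the last property; a quick illustrative check in plain Python:

```python
import json

# Strict JSON: no trailing comma after the last property.
valid = '{"AllowCloudPasswordValidation": false}'
assert json.loads(valid) == {"AllowCloudPasswordValidation": False}

# A trailing comma makes the document unparseable.
invalid = '{"AllowCloudPasswordValidation": false,}'
try:
    json.loads(invalid)
except json.JSONDecodeError:
    print("trailing comma rejected")
```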
active-directory Tenant Restrictions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/tenant-restrictions.md
Fiddler is a free web debugging proxy that can be used to capture and modify HTT
1. In the Fiddler Web Debugger tool, select the **Rules** menu and select **Customize Rules…** to open the CustomRules file.
- 2. Add the following lines at the beginning of the `OnBeforeRequest` function. Replace \<List of tenant identifiers\> with a domain registered with your tenant (for example, `contoso.onmicrosoft.com`). Replace \<directory ID\> with your tenant's Azure AD GUID identifier. You **must** include the correct GUID identifier in order for the logs to appear in your tenant.
+ 2. Add the following lines within the `OnBeforeRequest` function. Replace \<List of tenant identifiers\> with a domain registered with your tenant (for example, `contoso.onmicrosoft.com`). Replace \<directory ID\> with your tenant's Azure AD GUID identifier. You **must** include the correct GUID identifier in order for the logs to appear in your tenant.
```JScript.NET
// Allows access to the listed tenants.
```
active-directory Tutorial Manage Access Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/tutorial-manage-access-security.md
Using the information in this tutorial, an administrator learns how to:
For the application that the administrator added to their tenant, they want to set it up so that all users in the organization can use it and not have to individually request consent to use it. To avoid the need for user consent, they can grant consent for the application on behalf of all users in the organization. For more information, see [Consent and permissions overview](consent-and-permissions-overview.md).
-1. Sign in to the [Azure portal](https://portal.azure.com/) with one of the roles listed in the prerequisites.
+1. Sign in to the [Azure portal](https://portal.azure.com) with one of the roles listed in the prerequisites.
2. Search for and select **Azure Active Directory**. 3. Select **Enterprise applications**. 4. Select the application to which you want to grant tenant-wide admin consent.
In this tutorial, the administrator can find the basic steps to configure the ap
### Test multi-factor authentication 1. Open a new browser window in InPrivate or incognito mode and browse to the URL of the application.
-1. Sign in with the user account that you assigned to the application. You're required to register for and use Azure AD Multi-Factor Authentication. Follow the prompts to complete the process and verify you successfully sign into the Azure portal.
+1. Sign in with the user account that you assigned to the application. You're required to register for and use Azure AD Multi-Factor Authentication. Follow the prompts to complete the process and verify you successfully sign in to the Azure portal.
1. Close the browser window. ## Create a terms of use statement
active-directory How Manage User Assigned Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md
Last updated 05/10/2023 -+ zone_pivot_groups: identity-mi-methods
active-directory Qs Configure Cli Windows Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-cli-windows-vm.md
Last updated 01/11/2022 -+ ms.devlang: azurecli
active-directory Qs Configure Cli Windows Vmss https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-cli-windows-vmss.md
Last updated 05/25/2023 -+ ms.devlang: azurecli
active-directory Tutorial Vm Managed Identities Cosmos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/tutorial-vm-managed-identities-cosmos.md
Last updated 03/31/2023 -+ ms.tool: azure-cli, azure-powershell ms.devlang: azurecli #Customer intent: As an administrator, I want to know how to access Azure Cosmos DB from a virtual machine using a managed identity
Then read and write data as described in [these samples](../../cosmos-db/sql/sql
# [Portal](#tab/azure-portal)
-1. In the [portal](https://portal.azure.com), select the resource you want to delete.
+1. In the [Azure portal](https://portal.azure.com), select the resource you want to delete.
1. Select **Delete**.
active-directory Tutorial Windows Vm Ua Arm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/tutorial-windows-vm-ua-arm.md
You learn how to:
[!INCLUDE [msi-qs-configure-prereqs](../../../includes/active-directory-msi-qs-configure-prereqs.md)]

-- [Sign in to Azure portal](https://portal.azure.com)
+- Sign in to the [Azure portal](https://portal.azure.com)
- [Create a Windows virtual machine](../../virtual-machines/windows/quick-create-portal.md)
CanDelegate: False
For the remainder of the tutorial, you will work from the VM we created earlier.
-1. Sign in to the Azure portal at [https://portal.azure.com](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the portal, navigate to **Virtual Machines** and go to the Windows virtual machine and in the **Overview**, click **Connect**.
active-directory Cross Tenant Synchronization Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/multi-tenant-organizations/cross-tenant-synchronization-configure.md
Previously updated : 05/31/2023 Last updated : 07/21/2023
Attribute mappings allow you to define how data should flow between the source t
1. On the **Attribute Mapping** page, scroll down to review the user attributes that are synchronized between tenants in the **Attribute Mappings** section.
- The first attribute, alternativeSecurityIdentifier, is an internal attribute used to uniquely identify the user across tenants, match users in the source tenant with existing users in the target tenant, and ensure that each user only has one account. The matching attribute cannot be changed. Attempting to change the matching attribute will result in a `schemaInvalid` error.
+ The first attribute, alternativeSecurityIdentifier, is an internal attribute used to uniquely identify the user across tenants, match users in the source tenant with existing users in the target tenant, and ensure that each user only has one account. The matching attribute cannot be changed. Attempting to change the matching attribute or adding additional matching attributes will result in a `schemaInvalid` error.
:::image type="content" source="./media/cross-tenant-synchronization-configure/provisioning-attribute-mapping.png" alt-text="Screenshot of the Attribute Mapping page that shows the list of Azure Active Directory attributes." lightbox="./media/cross-tenant-synchronization-configure/provisioning-attribute-mapping.png":::
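Conceptually, the matching step behaves like a keyed join between the two tenants' user lists; the sketch below is a toy illustration of that semantics (the function and sample records are hypothetical, not the actual provisioning engine):

```python
def match_users(source_users, target_users, key="alternativeSecurityIdentifier"):
    # Index target users by the matching attribute, then pair each source
    # user with an existing target user or mark it for creation.
    index = {u[key]: u for u in target_users if key in u}
    matched, to_create = [], []
    for user in source_users:
        target = index.get(user.get(key))
        if target:
            matched.append((user, target))
        else:
            to_create.append(user)
    return matched, to_create

src = [{"alternativeSecurityIdentifier": "a1", "name": "Ann"},
       {"alternativeSecurityIdentifier": "b2", "name": "Ben"}]
tgt = [{"alternativeSecurityIdentifier": "a1", "name": "Ann (existing)"}]
matched, to_create = match_users(src, tgt)
print(len(matched), len(to_create))  # 1 1
```

Because the key uniquely identifies each user, a source user either updates exactly one existing target account or creates a new one, which is why the matching attribute cannot be changed.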
active-directory Quickstart Azure Monitor Route Logs To Storage Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/quickstart-azure-monitor-route-logs-to-storage-account.md
To use this feature, you need:
## Archive logs to an Azure storage account
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Azure Active Directory** > **Monitoring** > **Audit logs**.
active-directory Gainsight Saml Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/gainsight-saml-tutorial.md
+
+ Title: Azure Active Directory SSO integration with Gainsight SAML
+description: Learn how to configure single sign-on between Azure Active Directory and Gainsight SAML.
+Last updated : 07/14/2023
+# Azure Active Directory SSO integration with Gainsight SAML
+
+In this article, you'll learn how to integrate Gainsight SAML with Azure Active Directory (Azure AD). Use Azure AD to manage user access and enable single sign-on with Gainsight SAML. This integration requires an existing Gainsight SAML subscription. When you integrate Gainsight SAML with Azure AD, you can:
+
+* Control in Azure AD who has access to Gainsight SAML.
+* Enable your users to be automatically signed-in to Gainsight SAML with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+You'll configure and test Azure AD single sign-on for Gainsight SAML in a test environment. Gainsight SAML supports both **SP** and **IDP** initiated single sign-on.
+
+## Prerequisites
+
+To integrate Azure Active Directory with Gainsight SAML, you need:
+
+* An Azure AD user account. If you don't already have one, you can [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+* One of the following roles: Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal.
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Gainsight SAML single sign-on (SSO) enabled subscription.
+
+## Add application and assign a test user
+
+Before you begin the process of configuring single sign-on, you need to add the Gainsight SAML application from the Azure AD gallery. You need a test user account to assign to the application and test the single sign-on configuration.
+
+### Add Gainsight SAML from the Azure AD gallery
+
+Add Gainsight SAML from the Azure AD application gallery to configure single sign-on with Gainsight SAML. For more information on how to add an application from the gallery, see the [Quickstart: Add application from the gallery](../manage-apps/add-application-portal.md).
+
+### Create and assign Azure AD test user
+
+Follow the guidelines in the [create and assign a user account](../manage-apps/add-application-portal-assign-users.md) article to create a test user account in the Azure portal called B.Simon.
+
+Alternatively, you can use the [Enterprise App Configuration Wizard](https://portal.office.com/AdminPortal/home?Q=Docs#/azureadappintegration). In this wizard, you can add an application to your tenant, add users/groups to the app, and assign roles. The wizard also provides a link to the single sign-on configuration pane in the Azure portal. [Learn more about Microsoft 365 wizards](/microsoft-365/admin/misc/azure-ad-setup-guides).
+
+## Configure Azure AD SSO
+
+Complete the following steps to enable Azure AD single sign-on in the Azure portal.
+
+1. In the Azure portal, on the **Gainsight SAML** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, select the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type a value using one of the following patterns:
+
+ | **Identifier** |
+ |--|
+ | `urn:auth0:gainsight:<ID>` |
+ | `urn:auth0:gainsight-eu:<ID>` |
+
+ b. In the **Reply URL** textbox, type a URL using one of the following patterns:
+
+ | **Reply URL** |
+ |--|
+ | `https://secured.gainsightcloud.com/login/callback?connection=<ID>` |
+ | `https://secured.eu.gainsightcloud.com/login/callback?connection=<ID>` |
+
+1. Perform the following step, if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign on URL** textbox, type a URL using one of the following patterns:
+
+ | **Sign on URL** |
+ |--|
+ | `https://secured.gainsightcloud.com/samlp/<ID>` |
+ | `https://secured.eu.gainsightcloud.com/samlp/<ID>` |
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign on URL. Contact [Gainsight SAML support team](mailto:support@gainsight.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
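The Identifier, Reply URL, and Sign on URL patterns above share a single connection ID and differ only by region. A small helper (hypothetical, for illustration only — your real values come from the Gainsight support team) makes that relationship explicit:

```python
def gainsight_saml_values(connection_id: str, eu: bool = False) -> dict:
    # Fill in the URL patterns from the Basic SAML Configuration section
    # for a given connection ID and region.
    host = "secured.eu.gainsightcloud.com" if eu else "secured.gainsightcloud.com"
    realm = "gainsight-eu" if eu else "gainsight"
    return {
        "identifier": f"urn:auth0:{realm}:{connection_id}",
        "reply_url": f"https://{host}/login/callback?connection={connection_id}",
        "sign_on_url": f"https://{host}/samlp/{connection_id}",
    }

print(gainsight_saml_values("abc123"))
```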
+1. On the **Set-up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/certificatebase64.png "Certificate")
+
+1. On the **Set up Gainsight SAML** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows to copy configuration appropriate URL.](common/copy-configuration-urls.png "Metadata")
+
+## Configure Gainsight SAML SSO
+
+To configure single sign-on on the **Gainsight SAML** side, you need to send the downloaded **Certificate (Base64)** and the appropriate copied URLs from the Azure portal to the [Gainsight SAML support team](mailto:support@gainsight.com). They configure this setting so that the SAML SSO connection is set properly on both sides.
+
+### Create Gainsight SAML test user
+
+In this section, you create a user called Britta Simon in Gainsight SAML. Work with the [Gainsight SAML support team](mailto:support@gainsight.com) to add the users in the Gainsight SAML SSO platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This redirects to the Gainsight SAML Sign-on URL, where you can initiate the login flow.
+
+* Go to the Gainsight SAML Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Gainsight SAML instance for which you set up SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Gainsight SAML tile in My Apps, if configured in SP mode, you're redirected to the application sign-on page to initiate the login flow; if configured in IDP mode, you're automatically signed in to the Gainsight SAML instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Additional resources
+
+* [What is single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+* [Plan a single sign-on deployment](../manage-apps/plan-sso-deployment.md).
+
+## Next steps
+
+Once you configure Gainsight SAML you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Infinitecampus Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/infinitecampus-tutorial.md
See Infinite Campus [documentation](https://kb.infinitecampus.com/help/sso-servi
The SAML certificate that this integration relies on eventually needs to be renewed so users can continue logging into Infinite Campus through single sign-on. For districts with proper Campus Messenger Email Settings established, Infinite Campus sends warning emails as the certificate expiration approaches. (Subject: "Action required: Your certificate is expiring.") These are the steps to take to replace an expiring SAML certificate:
-1. Have your district's Microsoft Azure Active Directory admin sign-in to the Azure portal.
+1. Have your district's Microsoft Azure Active Directory admin sign in to the Azure portal.
1. On the left navigation pane, select the Azure Active Directory service. 1. Navigate to Enterprise Applications and select your Infinite Campus application set up previously. (If you have multiple Infinite Campus environments like a sandbox or staging site, you have multiple Infinite Campus applications set up here. You need to complete this process in each respective Infinite Campus environment for any with an expiring certificate.) 1. Select Single sign-on.
active-directory Jiramicrosoft Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/jiramicrosoft-tutorial.md
Use your Microsoft Azure Active Directory account with Atlassian JIRA server to
To configure Azure AD integration with JIRA SAML SSO by Microsoft, you need the following items:

- An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
-- JIRA Core and Software 6.4 to 9.8.1 or JIRA Service Desk 3.0 to 4.22.1 should be installed and configured on Windows 64-bit version.
+- JIRA Core and Software 6.4 to 9.10.0 or JIRA Service Desk 3.0 to 4.22.1 should be installed and configured on Windows 64-bit version.
- JIRA server is HTTPS enabled.
- Note the supported versions for the JIRA Plugin are mentioned in the section below.
- JIRA server is reachable on the Internet, particularly to the Azure AD login page for authentication, and should be able to receive the token from Azure AD.
To get started, you need the following items:
## Supported versions of JIRA
-* JIRA Core and Software: 6.4 to 9.8.1.
+* JIRA Core and Software: 6.4 to 9.10.0.
* JIRA Service Desk 3.0 to 4.22.1.
* JIRA also supports 5.2. For more details, click [Microsoft Azure Active Directory single sign-on for JIRA 5.2](jira52microsoft-tutorial.md).
active-directory Ms Confluence Jira Plugin Adminguide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/ms-confluence-jira-plugin-adminguide.md
Note the following information before you install the plug-in:
The plug-in supports the following versions of Jira and Confluence:
-* Jira Core and Software: 6.0 to 9.8.1
+* Jira Core and Software: 6.0 to 9.10.0
* Jira Service Desk: 3.0.0 to 4.22.1.
* JIRA also supports 5.2. For more details, click [Microsoft Azure Active Directory single sign-on for JIRA 5.2](./jira52microsoft-tutorial.md).
* Confluence: 5.0 to 5.10.
JIRA:
|Plugin Version | Release Notes | Supported JIRA versions |
|--|-|-|
| 1.0.20 | Bug Fixes: | Jira Core and Software: |
-| | JIRA SAML SSO add-on redirects to incorrect URL from mobile browser. | 7.0.0 to 9.8.1 |
+| | JIRA SAML SSO add-on redirects to incorrect URL from mobile browser. | 7.0.0 to 9.10.0 |
| | The mark log section after enabling the JIRA plugin. | |
| | The last login date for a user doesn't update when user signs in via SSO. | |
| | | |
No. The plug-in supports only on-premises versions of Jira and Confluence.
The plug-in supports these versions:
-* Jira Core and Software: 6.0 to 9.8.1.
+* Jira Core and Software: 6.0 to 9.10.0.
* Jira Service Desk: 3.0.0 to 4.22.1.
* JIRA also supports 5.2. For more details, click [Microsoft Azure Active Directory single sign-on for JIRA 5.2](./jira52microsoft-tutorial.md).
* Confluence: 5.0 to 5.10.
active-directory Tableau Online Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tableau-online-provisioning-tutorial.md
In June 2022, Tableau released a SCIM 2.0 connector. Completing the steps below
>Be sure to note any changes that have been made to the settings listed above before completing the steps below. Failure to do so will result in the loss of customized settings.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your current Tableau Cloud app under **Azure Active Directory > Enterprise Applications**.
active-directory Workplace By Facebook Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workplace-by-facebook-provisioning-tutorial.md
In December 2021, Facebook released a SCIM 2.0 connector. Completing the steps b
> [!NOTE] > Be sure to note any changes that have been made to the settings listed above before completing the steps below. Failure to do so will result in the loss of customized settings.
-1. Sign into the [Azure portal](https://portal.azure.com)
-2. Navigate to your current Workplace by Facebook app under Azure Active Directory > Enterprise Applications
+1. Sign in to the [Azure portal](https://portal.azure.com).
+2. Navigate to your current Workplace by Facebook app under Azure Active Directory > Enterprise Applications.
3. In the Properties section of your new custom app, copy the Object ID. ![Screenshot of Workplace by Facebook app in the Azure portal](./media/workplace-by-facebook-provisioning-tutorial/app-properties.png)
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/service-limits.md
If you would like to increase your limit, you can enable auto scaling on your re
#### Retrieve resource ID and region
-* Go to [Azure portal](https://portal.azure.com/)
+* Sign in to the [Azure portal](https://portal.azure.com)
* Select the Anomaly Detector resource for which you would like to increase the transaction limit
* Select Properties (Resource Management group)
* Copy and save the values of the following fields:
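Azure resource IDs follow a fixed segment layout (`/subscriptions/<id>/resourceGroups/<name>/providers/<namespace>/<type>/<name>`), so the fields a support request asks for can be read straight out of the copied ID. The helper below is illustrative only (the resource names are placeholders, and this is not part of any Azure SDK):

```python
def parse_resource_id(resource_id: str) -> dict:
    # /subscriptions/<sub>/resourceGroups/<rg>/providers/<ns>/<type>/<name>
    parts = resource_id.strip("/").split("/")
    return {
        "subscription_id": parts[1],
        "resource_group": parts[3],
        "provider": f"{parts[5]}/{parts[6]}",
        "name": parts[7],
    }

rid = ("/subscriptions/00000000-0000-0000-0000-000000000000"
       "/resourceGroups/my-rg/providers/Microsoft.CognitiveServices"
       "/accounts/my-anomaly-detector")
print(parse_resource_id(rid))
```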
If you would like to increase your limit, you can enable auto scaling on your re
To request a limit increase for your resource submit a **Support Request**:
-1. Go to [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com)
2. Select the Anomaly Detector Resource for which you would like to increase the limit 3. Select New support request (Support + troubleshooting group) 4. A new window will appear with auto-populated information about your Azure Subscription and Azure Resource
ai-services Cognitive Services Container Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-container-support.md
Previously updated : 05/08/2023 Last updated : 07/21/2023 keywords: on-premises, Docker, container, Kubernetes #Customer intent: As a potential customer, I want to know more about how Azure AI services provides and supports Docker containers for each service.
Azure AI services containers provide the following set of Docker containers, eac
| Service | Container | Description | Availability | |--|--|--|--|
-| [Speech Service API][sp-containers-stt] | **Speech to text** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-speechservices-speech-to-text)) | Transcribes continuous real-time speech into text. | Generally available. <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
-| [Speech Service API][sp-containers-cstt] | **Custom Speech to text** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-speechservices-custom-speech-to-text)) | Transcribes continuous real-time speech into text using a custom model. | Generally available <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
-| [Speech Service API][sp-containers-ntts] | **Neural Text to speech** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-speechservices-neural-text-to-speech)) | Converts text to natural-sounding speech using deep neural network technology, allowing for more natural synthesized speech. | Generally available. <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
-| [Speech Service API][sp-containers-lid] | **Speech language detection** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-speechservices-language-detection)) | Determines the language of spoken audio. | Gated preview |
+| [Speech Service API][sp-containers-stt] | **Speech to text** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/speech-to-text/about)) | Transcribes continuous real-time speech into text. | Generally available. <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
+| [Speech Service API][sp-containers-cstt] | **Custom Speech to text** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/custom-speech-to-text/about)) | Transcribes continuous real-time speech into text using a custom model. | Generally available <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
+| [Speech Service API][sp-containers-ntts] | **Neural Text to speech** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/neural-text-to-speech/about)) | Converts text to natural-sounding speech using deep neural network technology, allowing for more natural synthesized speech. | Generally available. <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
+| [Speech Service API][sp-containers-lid] | **Speech language detection** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/language-detection/about)) | Determines the language of spoken audio. | Gated preview |
### Vision containers | Service | Container | Description | Availability | |--|--|--|--|
-| [Azure AI Vision][cv-containers] | **Read OCR** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-vision-read)) | The Read OCR container allows you to extract printed and handwritten text from images and documents with support for JPEG, PNG, BMP, PDF, and TIFF file formats. For more information, see the [Read API documentation](./computer-vision/overview-ocr.md). | Generally Available. Gated - [request access](https://aka.ms/csgate). <br>This container can also [run in disconnected environments](containers/disconnected-containers.md). |
-| [Spatial Analysis][spa-containers] | **Spatial analysis** ([image](https://hub.docker.com/_/microsoft-azure-cognitive-services-vision-spatial-analysis)) | Analyzes real-time streaming video to understand spatial relationships between people, their movement, and interactions with objects in physical environments. | Preview |
+| [Azure AI Vision][cv-containers] | **Read OCR** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/vision/read/about)) | The Read OCR container allows you to extract printed and handwritten text from images and documents with support for JPEG, PNG, BMP, PDF, and TIFF file formats. For more information, see the [Read API documentation](./computer-vision/overview-ocr.md). | Generally Available. Gated - [request access](https://aka.ms/csgate). <br>This container can also [run in disconnected environments](containers/disconnected-containers.md). |
+| [Spatial Analysis][spa-containers] | **Spatial analysis** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/vision/spatial-analysis/about)) | Analyzes real-time streaming video to understand spatial relationships between people, their movement, and interactions with objects in physical environments. | Preview |
<!-- |[Personalizer](./personalizer/what-is-personalizer.md) |F0, S0|**Personalizer** ([image](https://go.microsoft.com/fwlink/?linkid=2083928&clcid=0x409))|Azure AI Personalizer is a cloud-based API service that allows you to choose the best experience to show to your users, learning from their real-time behavior.|
ai-services Commitment Tier https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/commitment-tier.md
For more information, see [Azure AI services pricing](https://azure.microsoft.co
## Create a new resource
-1. Sign into the [Azure portal](https://portal.azure.com/) and select **Create a new resource** for one of the applicable Azure AI services or Azure AI services listed.
+1. Sign in to the [Azure portal](https://portal.azure.com) and select **Create a new resource** for one of the applicable Azure AI services or Azure AI services listed.
2. Enter the applicable information to create your resource. Be sure to select the standard pricing tier.
For more information, see [Azure AI services pricing](https://azure.microsoft.co
## Purchase a commitment plan by updating your Azure resource
-1. Sign in to the [Azure portal](https://portal.azure.com/) with your Azure subscription.
+1. Sign in to the [Azure portal](https://portal.azure.com) with your Azure subscription.
2. In your Azure resource for one of the applicable features listed, select **Commitment tier pricing**. 3. Select **Change** to view the available commitments for hosted API and container usage. Choose a commitment plan for one or more of the following offerings: * **Web**: web-based APIs, where you send data to Azure for processing.
ai-services Storage Lab Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/Tutorials/storage-lab-tutorial.md
If you don't have an Azure subscription, create a [free account](https://azure.m
In this section, you'll use the [Azure portal](https://portal.azure.com?WT.mc_id=academiccontent-github-cxa) to create a storage account. Then you'll create a pair of containers: one to store images uploaded by the user, and another to store image thumbnails generated from the uploaded images.
-1. Open the [Azure portal](https://portal.azure.com?WT.mc_id=academiccontent-github-cxa) in your browser. If you're asked to sign in, do so using your Microsoft account.
+1. Sign in to the [Azure portal](https://portal.azure.com?WT.mc_id=academiccontent-github-cxa) in your browser. If you're asked to sign in, do so using your Microsoft account.
1. To create a storage account, select **+ Create a resource** in the ribbon on the left. Then select **Storage**, followed by **Storage account**. ![Creating a storage account](Images/new-storage-account.png)
ai-services Use Case Alt Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/use-case-alt-text.md
In general, we advise a confidence threshold of `0.4` for the Image Analysis 3.2
On rare occasions, image captions can contain embarrassing errors, such as labeling a male-identifying person as a "woman" or labeling an adult woman as a "girl". We encourage users to consider using the latest Image Analysis 4.0 API (preview), which eliminates some errors by supporting gender-neutral captions.
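A simple way to apply the advised `0.4` confidence threshold in application code is to filter candidate captions before using them as alt text. This is a minimal sketch; the candidate-caption shape below is illustrative, not the exact Image Analysis response schema:

```python
# Hedged sketch: keep only captions that meet the advised confidence
# threshold before surfacing them as alt text. The dictionaries below are
# illustrative stand-ins for the service's caption results.
CONFIDENCE_THRESHOLD = 0.4

def usable_captions(captions, threshold=CONFIDENCE_THRESHOLD):
    """Return only the captions the service is confident enough about."""
    return [c for c in captions if c["confidence"] >= threshold]

candidates = [
    {"text": "a person riding a bicycle", "confidence": 0.82},
    {"text": "a blurry object", "confidence": 0.21},
]
print(usable_captions(candidates))
```

Captions below the threshold are dropped, so low-confidence results never reach users as alt text.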
-Please report any embarrassing or offensive captions by going to the [Azure portal](https://ms.portal.azure.com/#home) and navigating to the **Feedback** button in the top right.
+Please report any embarrassing or offensive captions by going to the [Azure portal](https://portal.azure.com) and navigating to the **Feedback** button in the top right.
## Next Steps Follow a quickstart to begin automatically generating alt text by using image captioning on Image Analysis.
ai-services Disconnected Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/disconnected-containers.md
Access is limited to customers that meet the following requirements:
### Create a new resource
-1. Sign into the [Azure portal](https://portal.azure.com/) and select **Create a new resource** for one of the applicable Azure AI services or Azure AI services listed above.
+1. Sign in to the [Azure portal](https://portal.azure.com) and select **Create a new resource** for one of the applicable Azure AI services or Azure AI services listed above.
2. Enter the applicable information to create your resource. Be sure to select **Commitment tier disconnected containers** as your pricing tier.
If you run the container with an output mount and logging enabled, the container
## Next steps [Azure AI services containers overview](../cognitive-services-container-support.md)
ai-services Get Started Build Detector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/get-started-build-detector.md
If you don't have an Azure subscription, create a [free account](https://azure.m
## Create a new project
-In your web browser, navigate to the [Custom Vision web page](https://customvision.ai) and select __Sign in__. Sign in with the same account you used to sign into the Azure portal.
+In your web browser, navigate to the [Custom Vision web page](https://customvision.ai) and select __Sign in__. Sign in with the same account you used to sign in to the Azure portal.
![Image of the sign-in page](./media/browser-home.png)
In your web browser, navigate to the [Custom Vision web page](https://customvisi
1. Enter a name and a description for the project. Then select your Custom Vision Training Resource. If your signed-in account is associated with an Azure account, the Resource dropdown will display all of your compatible Azure resources. > [!NOTE]
- > If no resource is available, please confirm that you have logged into [customvision.ai](https://customvision.ai) with the same account as you used to log into the [Azure portal](https://portal.azure.com/). Also, please confirm you have selected the same "Directory" in the Custom Vision website as the directory in the Azure portal where your Custom Vision resources are located. In both sites, you may select your directory from the drop down account menu at the top right corner of the screen.
+ > If no resource is available, please confirm that you have logged into [customvision.ai](https://customvision.ai) with the same account as you used to sign in to the [Azure portal](https://portal.azure.com). Also, please confirm you have selected the same "Directory" in the Custom Vision website as the directory in the Azure portal where your Custom Vision resources are located. In both sites, you may select your directory from the drop down account menu at the top right corner of the screen.
1. Select __Object Detection__ under __Project Types__.
ai-services Getting Started Build A Classifier https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/getting-started-build-a-classifier.md
If you don't have an Azure subscription, create a [free account](https://azure.m
## Create a new project
-In your web browser, navigate to the [Custom Vision web page](https://customvision.ai) and select __Sign in__. Sign in with the same account you used to sign into the Azure portal.
+In your web browser, navigate to the [Custom Vision web page](https://customvision.ai) and select __Sign in__. Sign in with the same account you used to sign in to the Azure portal.
![Image of the sign-in page](./media/browser-home.png)
In your web browser, navigate to the [Custom Vision web page](https://customvisi
1. Enter a name and a description for the project. Then select your Custom Vision Training Resource. If your signed-in account is associated with an Azure account, the Resource dropdown will display all of your compatible Azure resources. > [!NOTE]
- > If no resource is available, please confirm that you have logged into [customvision.ai](https://customvision.ai) with the same account as you used to log into the [Azure portal](https://portal.azure.com/). Also, please confirm you have selected the same "Directory" in the Custom Vision website as the directory in the Azure portal where your Custom Vision resources are located. In both sites, you may select your directory from the drop down account menu at the top right corner of the screen.
+ > If no resource is available, please confirm that you have logged into [customvision.ai](https://customvision.ai) with the same account as you used to sign in to the [Azure portal](https://portal.azure.com). Also, please confirm you have selected the same "Directory" in the Custom Vision website as the directory in the Azure portal where your Custom Vision resources are located. In both sites, you may select your directory from the drop down account menu at the top right corner of the screen.
1. Select __Classification__ under __Project Types__. Then, under __Classification Types__, choose either **Multilabel** or **Multiclass**, depending on your use case. Multilabel classification applies any number of your tags to an image (zero or more), while multiclass classification sorts images into single categories (every image you submit will be sorted into the most likely tag). You'll be able to change the classification type later if you want to.
ai-services Logo Detector Mobile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/logo-detector-mobile.md
Follow these steps to run the app:
1. In Visual Studio Solution Explorer, select either the **VisualProvision.Android** project or the **VisualProvision.iOS** project. Choose a corresponding emulator or connected mobile device from the drop-down menu on the main toolbar. Then run the app. > [!NOTE]
- > You will need a MacOS device to run an iOS emulator.
+ > You will need a macOS device to run an iOS emulator.
1. On the first screen, enter your service principal client ID, tenant ID, and password. Select the **Login** button.
Follow these steps to run the app:
## Clean up resources
-If you've followed all of the steps of this scenario and used the app to deploy Azure services to your account, go to the [Azure portal](https://portal.azure.com/). There, cancel the services you don't want to use.
+If you've followed all of the steps of this scenario and used the app to deploy Azure services to your account, sign in to the [Azure portal](https://portal.azure.com). There, cancel the services you don't want to use.
If you plan to create your own object detection project with Custom Vision, you might want to delete the logo detection project you created in this tutorial. A free subscription for Custom Vision allows for only two projects. To delete the logo detection project, on the [Custom Vision website](https://customvision.ai), open **Projects** and then select the trash icon under **My New Project**.
ai-services Concept Insurance Card https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/concept-insurance-card.md
See how data is extracted from health insurance cards using the Document Intelli
* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
-* A [Document Intelligence instance](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your key and endpoint.
+* A [Document Intelligence instance](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your key and endpoint.
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot of keys and endpoint location in the Azure portal.":::
ai-services Create Document Intelligence Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/create-document-intelligence-resource.md
The Azure portal is a single platform you can use to create and manage Azure ser
Let's get started:
-1. Navigate to the Azure portal home page: [Azure home page](https://portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Create a resource** from the Azure home page.
ai-services Create Sas Tokens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/create-sas-tokens.md
To get started, you need:
## Upload your documents
-1. Go to the [Azure portal](https://portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
* Select **Your storage account** → **Data storage** → **Containers**. :::image type="content" source="media/sas-tokens/data-storage-menu.png" alt-text="Screenshot that shows the Data storage menu in the Azure portal.":::
To get started, you need:
The Azure portal is a web-based console that enables you to manage your Azure subscription and resources using a graphical user interface (GUI).
-1. Go to the [Azure portal](https://portal.azure.com/#home) and navigate as follows:
+1. Sign in to the [Azure portal](https://portal.azure.com).
- * **Your storage account** → **containers** → **your container**.
+1. Navigate to **Your storage account** > **containers** > **your container**.
1. Select **Generate SAS** from the menu near the top of the page.
ai-services Managed Identities Secured Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/managed-identities-secured-access.md
To get started, you need:
* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
-* A [**Document Intelligence**](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [**Azure AI services**](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource in the Azure portal. For detailed steps, _see_ [Create a multi-service resource](../../ai-services/multi-service-resource.md?pivots=azportal).
+* A [**Document Intelligence**](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [**Azure AI services**](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource in the Azure portal. For detailed steps, _see_ [Create a multi-service resource](../../ai-services/multi-service-resource.md?pivots=azportal).
* An [**Azure blob storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM) in the same region as your Document Intelligence resource. Create containers to store and organize your blob data within your storage account.
ai-services Try Document Intelligence Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/quickstarts/try-document-intelligence-studio.md
CORS should now be configured to use the storage account from Document Intellige
### Sample documents set
-1. Go to the [Azure portal](https://portal.azure.com/#home) and navigate as follows: **Your storage account** → **Data storage** → **Containers**
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to **Your storage account** > **Data storage** > **Containers**.
:::image border="true" type="content" source="../media/sas-tokens/data-storage-menu.png" alt-text="Screenshot: Data storage menu in the Azure portal.":::
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/service-limits.md
monikerRange: '<=doc-intel-3.0.0'
# Service quotas and limits
-<!-- markdownlint-disable MD033 -->
::: moniker range="doc-intel-3.0.0" [!INCLUDE [applies to v3.0](includes/applies-to-v3-0.md)]
If you would like to increase your transactions per second, you can enable auto
* Region * **How to get information (Base model)**:
- * Go to [Azure portal](https://portal.azure.com/)
+ * Sign in to the [Azure portal](https://portal.azure.com)
* Select the Document Intelligence Resource for which you would like to increase the transaction limit * Select *Properties* (*Resource Management* group) * Copy and save the values of the following fields:
If you would like to increase your transactions per second, you can enable auto
Initiate the increase of the transactions per second (TPS) limit for your resource by submitting a support request: * Ensure you have the [required information](#have-the-required-information-ready)
-* Go to [Azure portal](https://portal.azure.com/)
+* Sign in to the [Azure portal](https://portal.azure.com)
* Select the Document Intelligence Resource for which you would like to increase the TPS limit * Select *New support request* (*Support + troubleshooting* group) * A new window appears with autopopulated information about your Azure Subscription and Azure Resource
ai-services Tutorial Logic Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/tutorial-logic-apps.md
Before we jump into creating the Logic App, we have to set up a OneDrive folder.
At this point, you should have a Document Intelligence resource and a OneDrive folder all set. Now, it's time to create a Logic App resource.
-1. Navigate to the [Azure portal](https://ms.portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **➕ Create a resource** from the Azure home page.
ai-services Tag Utterances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/tag-utterances.md
Enable identity management for your Language resource using the following option
### [Azure portal](#tab/portal)
-Your Language resource must have identity management, to enable it using [Azure portal](https://portal.azure.com/):
+Your Language resource must have identity management. To enable it using the [Azure portal](https://portal.azure.com):
1. Go to your Language resource 2. From left hand menu, under **Resource Management** section, select **Identity**
Your Language resource must have identity management, to enable it using [Langua
After enabling managed identity, assign the role `Azure AI services User` to your Azure OpenAI resource using the managed identity of your Language resource.
- 1. Go to the [Azure portal](https://portal.azure.com/) and navigate to your Azure OpenAI resource.
+ 1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure OpenAI resource.
2. Select the Access Control (IAM) tab on the left. 3. Select Add > Add role assignment. 4. Select "Job function roles" and click Next.
ai-services Azure Machine Learning Labeling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom/azure-machine-learning-labeling.md
Before you can connect your labeling project to Azure Machine Learning, you need
* Only Azure Machine Learning's JSONL file format can be imported into Language Studio. * Projects with the multi-lingual option enabled can't be connected to Azure Machine Learning, and not all languages are supported. * Language support is provided by the Azure Machine Learning [TextDNNLanguages Class](/python/api/azureml-automl-core/azureml.automl.core.constants.textdnnlanguages?view=azure-ml-py&preserve-view=true&branch=main#azureml-automl-core-constants-textdnnlanguages-supported).
-* The Azure Machine Learning workspace you're connecting to must be assigned to the same Azure Storage account that Language Studio is connected to. Be sure that the Azure Machine Learning workspace has the storage blob data reader permission on the storage account. The workspace needs to have been linked to the storage account during the creation process in the [Azure portal](https://ms.portal.azure.com/#create/Microsoft.MachineLearningServices).
+* The Azure Machine Learning workspace you're connecting to must be assigned to the same Azure Storage account that Language Studio is connected to. Be sure that the Azure Machine Learning workspace has the storage blob data reader permission on the storage account. The workspace needs to have been linked to the storage account during the [creation process in the Azure portal](https://portal.azure.com/#create/Microsoft.MachineLearningServices).
* Switching between the two labeling experiences isn't instantaneous. It may take time to successfully complete the operation. ## Import your Azure Machine Learning labels into Language Studio
Language Studio supports the JSONL file format used by Azure Machine Learning. I
Before you connect to Azure Machine Learning, you need an Azure Machine Learning account with a pricing plan that can accommodate the compute needs of your project. See the [prerequisites section](#prerequisites) to make sure that you have successfully completed all the requirements to start connecting your Language Studio project to Azure Machine Learning.
-1. Use the [Azure portal](https://portal.azure.com/) to navigate to the Azure Blob Storage account connected to your language resource.
+1. Use the [Azure portal](https://portal.azure.com) to navigate to the Azure Blob Storage account connected to your language resource.
2. Ensure that the *Storage Blob Data Contributor* role is assigned to your AML workspace within the role assignments for your Azure Blob Storage account. 3. Navigate to your project in [Language Studio](https://language.azure.com/). From the left navigation menu of your project, select **Data labeling**. 4. Select **use Azure Machine Learning to label** in either the **Data labeling** description, or under the **Activity pane**.
Before you connect to Azure Machine Learning, you need an Azure Machine Learning
1. In the window that appears, follow the prompts. Select the Azure Machine Learning workspace you've created previously under the same Azure subscription. Enter a name for the new Azure Machine Learning project that will be created to enable labeling in Azure Machine Learning. >[!TIP]
- > Make sure your workspace is linked to the same Azure Blob Storage account and Language resource before continuing. You can create a new workspace and link to your storage account through the [Azure portal](https://ms.portal.azure.com/#create/Microsoft.MachineLearningServices). Ensure that the storage account is properly linked to the workspace.
+ > Make sure your workspace is linked to the same Azure Blob Storage account and Language resource before continuing. You can create a new workspace and [link to your storage account using the Azure portal](https://portal.azure.com/#create/Microsoft.MachineLearningServices). Ensure that the storage account is properly linked to the workspace.
1. (Optional) Turn on the vendor labeling toggle to use labeling vendor companies. Before choosing the vendor labeling companies, contact the vendor labeling companies on the [Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/consulting-services?search=AzureMLVend) to finalize a contract with them. For more information about working with vendor companies, see [How to outsource data labeling](/azure/machine-learning/how-to-outsource-data-labeling).
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/overview.md
Previously updated : 04/14/2023 Last updated : 07/19/2023
This section will help you decide which Language service feature you should use
| Extract categories of information without creating a custom model. | Unstructured text | The [preconfigured NER feature](./named-entity-recognition/overview.md) | |
| Extract categories of information using a model specific to your data. | Unstructured text | [Custom NER](./custom-named-entity-recognition/overview.md) | ✓ |
| Extract main topics and important phrases. | Unstructured text | [Key phrase extraction](./key-phrase-extraction/overview.md) | |
-| Determine the sentiment and opinions expressed in text. | Unstructured text | [Sentiment analysis and opinion mining](./sentiment-opinion-mining/overview.md) | |
+| Determine the sentiment and opinions expressed in text. | Unstructured text | [Sentiment analysis and opinion mining](./sentiment-opinion-mining/overview.md) | ✓ |
| Summarize long chunks of text or conversations. | Unstructured text, <br> transcribed conversations. | [Summarization](./summarization/overview.md) | |
| Disambiguate entities and get links to Wikipedia. | Unstructured text | [Entity linking](./entity-linking/overview.md) | |
| Classify documents into one or more categories. | Unstructured text | [Custom text classification](./custom-text-classification/overview.md) | ✓ |
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/overview.md
Follow these steps to get the most out of your model:
6. **Classify text**: Use your custom model for sentiment analysis tasks.
+## Development options
+
+|Development option |Description |
+|||
+|Language studio | Language Studio is a web-based platform that lets you try sentiment analysis with text examples without an Azure account, and your own data when you sign up. |
+|REST API | Integrate sentiment analysis into your applications programmatically using the REST API. |
+
+For more information, see [sentiment analysis quickstart](./custom/quickstart.md).
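The REST option above amounts to posting a JSON task description to your Language resource's endpoint. Here is a minimal sketch of building that request body; the endpoint and key are placeholders, and the exact schema and API version should be checked against the REST reference:

```python
import json

# Hedged sketch: build the JSON body for a synchronous sentiment analysis
# request to the Language service. ENDPOINT and KEY are placeholders, not
# real values; the schema shown is an assumption to illustrate the shape.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-key>"

def build_sentiment_request(documents, language="en"):
    """Return the request body for a SentimentAnalysis task."""
    return {
        "kind": "SentimentAnalysis",
        "parameters": {"modelVersion": "latest"},
        "analysisInput": {
            "documents": [
                {"id": str(i), "language": language, "text": text}
                for i, text in enumerate(documents, start=1)
            ]
        },
    }

body = build_sentiment_request(["The rooms were great, but check-in was slow."])
print(json.dumps(body, indent=2))
```

You would then POST this body to the resource's analyze-text route with your key in the `Ocp-Apim-Subscription-Key` header; the response carries per-document sentiment labels and confidence scores.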
+ ## Reference documentation As you use Custom sentiment analysis, see the following reference documentation and samples for the Language service:
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/how-to/use-containers.md
The Text Analytics for health container image can be found on the `mcr.microsoft
To use the latest version of the container, you can use the `latest` tag. You can also find a full list of [tags on the MCR](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/healthcare/tags).
-Use the [`docker pull`](https://docs.docker.com/engine/reference/commandline/pull/) command to download this container image from the Microsoft public container registry. You can find the featured tags on the [dockerhub page](https://hub.docker.com/_/microsoft-azure-cognitive-services-textanalytics-healthcare)
+Use the [`docker pull`](https://docs.docker.com/engine/reference/commandline/pull/) command to download this container image from the Microsoft public container registry. You can find the featured tags on the [Microsoft Container Registry](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/healthcare/about)
``` docker pull mcr.microsoft.com/azure-cognitive-services/textanalytics/healthcare:<tag-name>
ai-services Encryption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/metrics-advisor/encryption.md
Metrics Advisor supports CMK and double encryption by using BYOS (bring your own
1. Set 'Allow access to Azure services' as 'Yes'. 2. Add your client IP address to log in to Azure Database for PostgreSQL. -- Get the access-token for your account with resource type 'https://ossrdbms-aad.database.windows.net'. The access token is the password you need to log in to the Azure Database for PostgreSQL by your account. An example using `az` client:
+- Get the access-token for your account with resource type 'https://ossrdbms-aad.database.windows.net'. The access token is the password you need to sign in to the Azure Database for PostgreSQL by your account. An example using `az` client:
``` az login
ai-services Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/models.md
description: Learn about the different model capabilities that are available wit
Previously updated : 07/20/2023 Last updated : 07/21/2023
These models can only be used with the Chat Completion API.
| `gpt-4-32k` <sup>1</sup> (0613) | East US, France Central | N/A | 32,768 | September 2021 | <sup>1</sup> The model is [only available by request](https://aka.ms/oai/get-gpt4).<br>
-<sup>2</sup> Version `0314` of gpt-4 and gpt-4-32k will be retired on January 4, 2024. See [model updates](#model-updates) for model upgrade behavior.
+<sup>2</sup> Version `0314` of gpt-4 and gpt-4-32k will be retired no earlier than July 5, 2024. See [model updates](#model-updates) for model upgrade behavior.
### GPT-3.5 models
GPT-3.5 Turbo is used with the Chat Completion API. GPT-3.5 Turbo (0301) can als
| `gpt-35-turbo` (0613) | East US, France Central, Japan East, North Central US, UK South | N/A | 4,096 | Sep 2021 | | `gpt-35-turbo-16k` (0613) | East US, France Central, Japan East, North Central US, UK South | N/A | 16,384 | Sep 2021 |
-<sup>1</sup> Version `0301` of gpt-35-turbo will be retired on January 4, 2024. See [model updates](#model-updates) for model upgrade behavior.
+<sup>1</sup> Version `0301` of gpt-35-turbo will be retired no earlier than July 5, 2024. See [model updates](#model-updates) for model upgrade behavior.
### Embeddings models
When you select a specific model version for a deployment this version will rema
### GPT-35-Turbo 0301 and GPT-4 0314 retirement
-The `gpt-35-turbo` (`0301`) and both `gpt-4` (`0314`) models will be retired on January 4, 2024. Upon retirement, deployments will automatically be upgraded to the default version at the time of retirement. If you would like your deployment to stop accepting completion requests rather than upgrading, then you will be able to set the model upgrade option to expire through the API. We will publish guidelines on this by September 1.
+The `gpt-35-turbo` (`0301`) and both `gpt-4` (`0314`) models will be retired no earlier than July 5, 2024. Upon retirement, deployments will automatically be upgraded to the default version at the time of retirement. If you would like your deployment to stop accepting completion requests rather than upgrading, then you will be able to set the model upgrade option to expire through the API. We will publish guidelines on this by September 1.
### Viewing deprecation dates
ai-services Function Calling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/function-calling.md
Last updated 07/20/2023
-# How to use function calling with Azure OpenAI Service
+# How to use function calling with Azure OpenAI Service (Preview)
The latest versions of gpt-35-turbo and gpt-4 have been fine-tuned to work with functions and are able to both determine when and how a function should be called. If one or more functions are included in your request, the model will then determine if any of the functions should be called based on the context of the prompt. When the model determines that a function should be called, it will then respond with a JSON object including the arguments for the function.
if response_message.get("function_call"):
available_functions = { "search_hotels": search_hotels, }
+ function_to_call = available_functions[function_name]
+ function_args = json.loads(response_message["function_call"]["arguments"])
- function_response = fuction_to_call(**function_args)
+ function_response = function_to_call(**function_args)
# Add the assistant response and function response to the messages
messages.append( # adding assistant response to messages
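The dispatch pattern shown in the diff above can be sketched end-to-end in plain Python. This is a minimal, self-contained illustration of the function-calling flow; the `search_hotels` implementation and the stubbed `response_message` shape are assumptions for demonstration, not the service's actual return value:

```python
import json

# Hypothetical local function the model may ask us to call.
def search_hotels(location, max_price=None):
    # A real app would query a hotel API here; this returns a stub result.
    return {"location": location, "max_price": max_price, "results": ["Hotel A", "Hotel B"]}

# Registry of callable functions, keyed by the name the model returns.
available_functions = {"search_hotels": search_hotels}

# Simulated assistant message containing a function_call, following the
# Chat Completions function-calling convention (arguments arrive as a JSON string).
response_message = {
    "role": "assistant",
    "function_call": {
        "name": "search_hotels",
        "arguments": '{"location": "Seattle", "max_price": 200}',
    },
}

if response_message.get("function_call"):
    function_name = response_message["function_call"]["name"]
    function_to_call = available_functions[function_name]
    # Parse the JSON argument string before invoking the local function.
    function_args = json.loads(response_message["function_call"]["arguments"])
    function_response = function_to_call(**function_args)
    print(function_response["results"])
```

The key detail the diff fixes (the `fuction_to_call` typo) sits on the dispatch line: the function object is looked up by name in `available_functions`, then called with the parsed keyword arguments.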
To learn more about our recommendations on how to use Azure OpenAI models respon
## Next steps * [Learn more about Azure OpenAI](../overview.md).
-* For more examples on working with functions, check out the [Azure OpenAI Samples GitHub repository](https://aka.ms/oai/function-samples)
-* Get started with the GPT-35-Turbo model with [the GPT-35-Turbo quickstart](../chatgpt-quickstart.md).
+* For more examples on working with functions, check out the [Azure OpenAI Samples GitHub repository](https://aka.ms/oai/functions-samples)
+* Get started with the GPT-35-Turbo model with [the GPT-35-Turbo quickstart](../chatgpt-quickstart.md).
ai-services Use Your Data Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/use-your-data-quickstart.md
Title: 'Use your own data with Azure OpenAI service'
+ Title: 'Use your own data with Azure OpenAI Service'
description: Use this article to import and use your data in Azure OpenAI.
ai-services Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure AI services description: Lists Azure Policy Regulatory Compliance controls available for Azure AI services. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/language-support.md
The table in this section summarizes the locales and voices supported for Text t
Additional remarks for Text to speech locales are included in the [Voice styles and roles](#voice-styles-and-roles), [Prebuilt neural voices](#prebuilt-neural-voices), and [Custom Neural Voice](#custom-neural-voice) sections below. > [!TIP]
-> Check the the [Voice Gallery](https://speech.microsoft.com/portal/voicegallery) and determine the right voice for your business needs.
+> Check the [Voice Gallery](https://speech.microsoft.com/portal/voicegallery) and determine the right voice for your business needs.
[!INCLUDE [Language support include](includes/language-support/tts.md)]
ai-services Text Translation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/text-translation-overview.md
Add Text Translation to your projects and applications using the following resou
> > * To use the Translator container you must complete and submit the [**Azure AI services Application for Gated Services**](https://aka.ms/csgate-translator) online request form and have it approved to acquire access to the container. >
- > * The [**Translator container image**](https://hub.docker.com/_/microsoft-azure-cognitive-services-translator-text-translation) supports limited features compared to cloud offerings.
+ > * The [**Translator container image**](https://mcr.microsoft.com/product/azure-cognitive-services/translator/text-translation/about) supports limited features compared to cloud offerings.
> ## Get started with Text Translation
aks Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/availability-zones.md
Title: Use availability zones in Azure Kubernetes Service (AKS) description: Learn how to create a cluster that distributes nodes across availability zones in Azure Kubernetes Service (AKS)-+ Last updated 02/22/2023- # Create an Azure Kubernetes Service (AKS) cluster that uses availability zones
If a single zone becomes unavailable, your applications continue to run on clust
## Create an AKS cluster across availability zones
-When you create a cluster using the [az aks create][az-aks-create] command, the `--zones` parameter specifies the zones to deploy agent nodes into. The control plane components such as etcd or the API spread across the available zones in the region during cluster deployment. The specific zones that the control plane components spread across, are independent of what explicit zones you select for the initial node pool.
-
-If you don't specify any zones for the default agent pool when you create an AKS cluster, the control plane components aren't present in availability zones. You can add more node pools using the [az aks nodepool add][az-aks-nodepool-add] command and specify `--zones` for new nodes. The command converts the AKS control plane to spread across availability zones.
+When you create a cluster using the [az aks create][az-aks-create] command, the `--zones` parameter specifies the availability zones to deploy agent nodes into. The availability zones that the managed control plane components are deployed into are **not** controlled by this parameter. They are automatically spread across all availability zones (if present) in the region during cluster deployment.
-The following example creates an AKS cluster named *myAKSCluster* in the resource group named *myResourceGroup* with a total of three nodes. One agent in zone *1*, one in *2*, and then one in *3*.
+The following example creates an AKS cluster named *myAKSCluster* in the resource group named *myResourceGroup* with a total of three nodes. One agent node in zone *1*, one in *2*, and then one in *3*.
```azurecli-interactive az group create --name myResourceGroup --location eastus2
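Continuing that example, a cluster whose agent nodes span three availability zones can be created with a command like the following sketch (resource names are illustrative, taken from the example above):

```azurecli-interactive
az aks create \
    --resource-group myResourceGroup \
    --name myAKSCluster \
    --generate-ssh-keys \
    --node-count 3 \
    --zones 1 2 3
```

As the updated text notes, `--zones` only places the agent nodes; the managed control plane components are spread across the region's availability zones automatically.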
aks Azure Blob Csi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-blob-csi.md
Title: Use Container Storage Interface (CSI) driver for Azure Blob storage on Azure Kubernetes Service (AKS) description: Learn how to use the Container Storage Interface (CSI) driver for Azure Blob storage in an Azure Kubernetes Service (AKS) cluster. + Last updated 04/13/2023- # Use Azure Blob storage Container Storage Interface (CSI) driver
aks Azure Csi Blob Storage Provision https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-csi-blob-storage-provision.md
Title: Create a persistent volume with Azure Blob storage in Azure Kubernetes Se
description: Learn how to create a static or dynamic persistent volume with Azure Blob storage for use with multiple concurrent pods in Azure Kubernetes Service (AKS) + Last updated 05/17/2023- # Create and use a volume with Azure Blob storage in Azure Kubernetes Service (AKS)
The following YAML creates a pod that uses the persistent volume or persistent v
[enable-blob-csi-driver]: azure-blob-csi.md#before-you-begin [az-tags]: ../azure-resource-manager/management/tag-resources.md [sas-tokens]: ../storage/common/storage-sas-overview.md
-[azure-datalake-storage-account]: ../storage/blobs/upgrade-to-data-lake-storage-gen2-how-to.md
+[azure-datalake-storage-account]: ../storage/blobs/upgrade-to-data-lake-storage-gen2-how-to.md
aks Azure Csi Disk Storage Provision https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-csi-disk-storage-provision.md
Title: Create a persistent volume with Azure Disks in Azure Kubernetes Service (
description: Learn how to create a static or dynamic persistent volume with Azure Disks for use with multiple concurrent pods in Azure Kubernetes Service (AKS) -+ Last updated 04/11/2023
aks Azure Csi Files Storage Provision https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-csi-files-storage-provision.md
Title: Create a persistent volume with Azure Files in Azure Kubernetes Service (
description: Learn how to create a static or dynamic persistent volume with Azure Files for use with multiple concurrent pods in Azure Kubernetes Service (AKS) -+ Last updated 05/17/2023
aks Azure Disk Csi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-disk-csi.md
Title: Use Container Storage Interface (CSI) driver for Azure Disk on Azure Kubernetes Service (AKS) description: Learn how to use the Container Storage Interface (CSI) driver for Azure Disk in an Azure Kubernetes Service (AKS) cluster. + Last updated 04/19/2023
aks Azure Disk Customer Managed Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-disk-customer-managed-keys.md
Title: Use a customer-managed key to encrypt Azure disks in Azure Kubernetes Service (AKS) description: Bring your own keys (BYOK) to encrypt AKS OS and Data disks. -+ Last updated 07/10/2023
Review [best practices for AKS cluster security][best-practices-security]
[byok-azure-portal]: ../storage/common/customer-managed-keys-configure-key-vault.md [customer-managed-keys-windows]: ../virtual-machines/disk-encryption.md#customer-managed-keys [customer-managed-keys-linux]: ../virtual-machines/disk-encryption.md#customer-managed-keys
-[key-vault-generate]: ../key-vault/general/manage-with-cli2.md
+[key-vault-generate]: ../key-vault/general/manage-with-cli2.md
aks Azure Files Csi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-files-csi.md
Title: Use Container Storage Interface (CSI) driver for Azure Files on Azure Kubernetes Service (AKS) description: Learn how to use the Container Storage Interface (CSI) driver for Azure Files in an Azure Kubernetes Service (AKS) cluster. + Last updated 04/19/2023
aks Azure Netapp Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-netapp-files.md
Title: Configure Azure NetApp Files for Azure Kubernetes Service description: Learn how to configure Azure NetApp Files for an Azure Kubernetes Service cluster. -+ Last updated 05/08/2023
aks Cis Ubuntu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cis-ubuntu.md
Title: Azure Kubernetes Service (AKS) Ubuntu image alignment with Center for Internet Security (CIS) benchmark description: Learn how AKS applies the CIS benchmark + Last updated 04/19/2023
aks Configure Kube Proxy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/configure-kube-proxy.md
Title: Configure kube-proxy (iptables/IPVS) (preview)
description: Learn how to configure kube-proxy to utilize different load balancing configurations with Azure Kubernetes Service (AKS). -+ Last updated 10/25/2022
aks Configure Kubenet Dual Stack https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/configure-kubenet-dual-stack.md
description: Learn how to configure dual-stack kubenet networking in Azure Kuber
-+ Last updated 06/27/2023
aks Csi Secrets Store Driver https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-driver.md
Last updated 02/10/2023-+ # Use the Azure Key Vault Provider for Secrets Store CSI Driver in an AKS cluster
A container using subPath volume mount won't receive secret updates when it's ro
```bash kubectl get pods -n kube-system -l 'app in (secrets-store-csi-driver,secrets-store-provider-azure)'
+ ```
+ ```output
NAME READY STATUS RESTARTS AGE aks-secrets-store-csi-driver-4vpkj 3/3 Running 2 4m25s aks-secrets-store-csi-driver-ctjq6 3/3 Running 2 4m21s
After the pod starts, the mounted content at the volume path that you specified
* Use the following commands to validate your secrets and print a test secret.
+To show secrets held in the secrets store:
```bash
- ## show secrets held in secrets-store
kubectl exec busybox-secrets-store-inline -- ls /mnt/secrets-store/
- ## print a test secret 'ExampleSecret' held in secrets-store
- kubectl exec busybox-secrets-store-inline -- cat /mnt/secrets-store/ExampleSecret
```
+To display a secret held in the store, for example the test secret `ExampleSecret`:
+
+```bash
+kubectl exec busybox-secrets-store-inline -- cat /mnt/secrets-store/ExampleSecret
+```
+ ## Obtain certificates and keys The Azure Key Vault design makes sharp distinctions between keys, secrets, and certificates. The Key Vault service's certificates features were designed to make use of its key and secret capabilities. When a key vault certificate is created, an addressable key and secret are also created with the same name. The key allows key operations, and the secret allows the retrieval of the certificate value as a secret.
A key vault certificate also contains public x509 certificate metadata. The key
* To disable autorotation, first disable the addon. Then, re-enable the addon without the `enable-secret-rotation` parameter.
- ```azurecli-interactive
- # disable the addon
- az aks addon disable -g myResourceGroup -n myAKSCluster2 -a azure-keyvault-secrets-provider
+Disable the secrets provider addon:
- # re-enable the addon without the `enable-secret-rotation` parameter
- az aks addon enable -g myResourceGroup -n myAKSCluster2 -a azure-keyvault-secrets-provider
- ```
+```azurecli-interactive
+az aks addon disable -g myResourceGroup -n myAKSCluster2 -a azure-keyvault-secrets-provider
+```
+
+Re-enable the secrets provider addon, but without the `enable-secret-rotation` parameter:
+
+```bash
+az aks addon enable -g myResourceGroup -n myAKSCluster2 -a azure-keyvault-secrets-provider
+```
### Sync mounted content with a Kubernetes secret
In this article, you learned how to use the Azure Key Vault Provider for Secrets
<!-- LINKS INTERNAL --> [az-aks-create]: /cli/azure/aks#az-aks-create+ [az-aks-enable-addons]: /cli/azure/aks#az-aks-enable-addons+ [az-aks-disable-addons]: /cli/azure/aks#az-aks-disable-addons+ [csi-storage-drivers]: ./csi-storage-drivers.md+ [identity-access-methods]: ./csi-secrets-store-identity-access.md+ [aad-pod-identity]: ./use-azure-ad-pod-identity.md+ [aad-workload-identity]: workload-identity-overview.md+ [az-keyvault-create]: /cli/azure/keyvault#az-keyvault-create.md+ [az-keyvault-secret-set]: /cli/azure/keyvault#az-keyvault-secret-set.md+ [az-aks-addon-update]: /cli/azure/aks#addon-update.md <!-- LINKS EXTERNAL --> [kube-csi]: https://kubernetes-csi.github.io/docs/+ [reloader]: https://github.com/stakater/Reloader+ [kubernetes-version-support]: ./supported-kubernetes-versions.md?tabs=azure-cli#kubernetes-version-support-policy++
aks Csi Secrets Store Identity Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-identity-access.md
Last updated 02/27/2023-+ # Provide an identity to access the Azure Key Vault Provider for Secrets Store CSI Driver
aks Csi Storage Drivers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-storage-drivers.md
The Container Storage Interface (CSI) is a standard for exposing arbitrary block
The CSI storage driver support on AKS allows you to natively use: -- [**Azure Disks**](azure-disk-csi.md) can be used to create a Kubernetes *DataDisk* resource. Disks can use Azure Premium Storage, backed by high-performance SSDs, or Azure Standard Storage, backed by regular HDDs or Standard SSDs. For most production and development workloads, use Premium Storage. Azure Disks are mounted as *ReadWriteOnce* and are only available to one node in AKS. For storage volumes that can be accessed by multiple pods simultaneously, use Azure Files.
+- [**Azure Disks**](azure-disk-csi.md) can be used to create a Kubernetes *DataDisk* resource. Disks can use Azure Premium Storage, backed by high-performance SSDs, or Azure Standard Storage, backed by regular HDDs or Standard SSDs. For most production and development workloads, use Premium Storage. Azure Disks are mounted as *ReadWriteOnce* and are only available to one node in AKS. For storage volumes that can be accessed by multiple nodes simultaneously, use Azure Files.
- [**Azure Files**](azure-files-csi.md) can be used to mount an SMB 3.0/3.1 share backed by an Azure storage account to pods. With Azure Files, you can share data across multiple nodes and pods. Azure Files can use Azure Standard storage backed by regular HDDs or Azure Premium storage backed by high-performance SSDs. - [**Azure Blob storage**](azure-blob-csi.md) can be used to mount Blob storage (or object storage) as a file system into a container or pod. Using Blob storage enables your cluster to support applications that work with large unstructured datasets like log file data, images or documents, HPC, and others. Additionally, if you ingest data into [Azure Data Lake storage](../storage/blobs/data-lake-storage-introduction.md), you can directly mount and use it in AKS without configuring another interim filesystem.
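The access-mode difference called out in the list above shows up directly in a PersistentVolumeClaim. The following sketch uses the built-in AKS storage classes `managed-csi` (disk-backed) and `azurefile-csi` (file-share-backed); the claim names and sizes are illustrative:

```yaml
# Disk-backed volume: attachable to one node at a time (ReadWriteOnce).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: example-disk-pvc    # illustrative name
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: managed-csi
  resources:
    requests:
      storage: 10Gi
---
# File-share-backed volume: shareable across nodes and pods (ReadWriteMany).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: example-files-pvc   # illustrative name
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: azurefile-csi
  resources:
    requests:
      storage: 100Gi
```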
aks Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/faq.md
Title: Frequently asked questions for Azure Kubernetes Service (AKS) description: Find answers to some of the common questions about Azure Kubernetes Service (AKS). Previously updated : 06/17/2022-- Last updated : 07/20/2022+ # Frequently asked questions about Azure Kubernetes Service (AKS)
Moving or renaming your AKS cluster and its associated resources isn't supported
Most clusters are deleted upon user request. In some cases, especially cases where you bring your own Resource Group or perform cross-RG tasks, deletion can take more time or even fail. If you have an issue with deletes, double-check that you don't have locks on the RG, that any resources outside of the RG are disassociated from the RG, and so on.
+## Why is my cluster create/update taking so long?
+If you have issues with create and update cluster operations, make sure you don't have any assigned policies or service constraints that may block your AKS cluster from managing resources like VMs, load balancers, tags, etc.
+ ## Can I restore my cluster after deleting it? No, you're unable to restore your cluster after deleting it. When you delete your cluster, the associated resource group and all its resources are deleted. If you want to keep any of your resources, move them to another resource group before deleting your cluster. If you have the **Owner** or **User Access Administrator** built-in role, you can lock Azure resources to protect them from accidental deletions and modifications. For more information, see [Lock your resources to protect your infrastructure][lock-azure-resources].
aks Howto Deploy Java Liberty App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/howto-deploy-java-liberty-app.md
description: Deploy a Java application with Open Liberty/WebSphere Liberty on an
Last updated 12/21/2022 keywords: java, jakartaee, javaee, microprofile, open-liberty, websphere-liberty, aks, kubernetes-+ # Deploy a Java application with Open Liberty or WebSphere Liberty on an Azure Kubernetes Service (AKS) cluster
aks Howto Deploy Java Wls App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/howto-deploy-java-wls-app.md
Use the following steps to create a storage account and container. Some of these
The steps in this section direct you to deploy WLS on AKS in the simplest possible way. WLS on AKS offers a broad and deep selection of Azure integrations. For more information, see [What are solutions for running Oracle WebLogic Server on the Azure Kubernetes Service?](/azure/virtual-machines/workloads/oracle/weblogic-aks) - The following steps show you how to find the WLS on AKS offer and fill out the **Basics** pane. 1. In the search bar at the top of the Azure portal, enter *weblogic*. In the auto-suggested search results, in the **Marketplace** section, select **Oracle WebLogic Server on Azure Kubernetes Service**.
- :::image type="content" source="media/howto-deploy-java-wls-app/marketplace-search-results.png" alt-text="Screenshot of Azure portal showing WLS in search results." lightbox="media/howto-deploy-java-wls-app/marketplace-search-results.png":::
+ :::image type="content" source="media/howto-deploy-java-wls-app/marketplace-search-results.png" alt-text="Screenshot of the Azure portal showing WLS in search results." lightbox="media/howto-deploy-java-wls-app/marketplace-search-results.png":::
You can also go directly to the [Oracle WebLogic Server on Azure Kubernetes Service](https://aka.ms/wlsaks) offer. 1. On the offer page, select **Create**. 1. On the **Basics** pane, ensure the value shown in the **Subscription** field is the same one that has the roles listed in the prerequisites section.
-1. You must deploy the offer in an empty resource group. In the **Resource group** field, select **Create new** and fill in a value for the resource group. Because resource groups must be unique within a subscription, pick a unique name. An easy way to have unique names is to use a combination of your initials, today's date, and some identifier. For example, `ejb0723wls`.
+
+ :::image type="content" source="media/howto-deploy-java-wls-app/portal-start-experience.png" alt-text="Screenshot of the Azure portal showing WebLogic Server on AKS." lightbox="media/howto-deploy-java-wls-app/portal-start-experience.png":::
+
+1. You must deploy the offer in an empty resource group. In the **Resource group** field, select **Create new** and then fill in a value for the resource group. Because resource groups must be unique within a subscription, pick a unique name. An easy way to have unique names is to use a combination of your initials, today's date, and some identifier, for example `ejb0723wls`.
1. Under **Instance details**, select the region for the deployment. For a list of Azure regions where AKS is available, see [AKS region availability](https://azure.microsoft.com/global-infrastructure/services/?products=kubernetes-service). 1. Under **Credentials for WebLogic**, leave the default value for **Username for WebLogic Administrator**. 1. Fill in `wlsAksCluster2022` for the **Password for WebLogic Administrator**. Use the same value for the confirmation and **Password for WebLogic Model encryption** fields. 1. Scroll to the bottom of the **Basics** pane and notice the helpful links for documentation, community support, and how to report problems.
-1. Select **Next: Configure AKS cluster**.
+1. Select **Next: AKS**.
The following steps show you how to start the deployment process. 1. Scroll to the section labeled **Provide an Oracle Single Sign-On (SSO) account**. Fill in your Oracle SSO credentials from the preconditions.
-1. Accurately answer the question **Is the specified SSO account associated with an active Oracle support contract?** by selecting **Yes** or **No** accordingly. If you answer this question incorrectly, the steps in this quickstart won't work. If in doubt, select **No**.
-1. In the section **Java EE Application**, next to **Deploy your application package**, select **Yes**.
+
+ :::image type="content" source="media/howto-deploy-java-wls-app/configure-single-sign-on.png" alt-text="Screenshot of the Azure portal showing the configure sso pane." lightbox="media/howto-deploy-java-wls-app/configure-single-sign-on.png":::
+
+1. In the **Application** section, next to **Deploy an application?**, select **Yes**.
+
+ :::image type="content" source="media/howto-deploy-java-wls-app/configure-application.png" alt-text="Screenshot of the Azure portal showing the configure applications pane." lightbox="media/howto-deploy-java-wls-app/configure-application.png":::
+ 1. Next to **Application package (.war,.ear,.jar)**, select **Browse**. 1. Start typing the name of the storage account from the preceding section. When the desired storage account appears, select it. 1. Select the storage container from the preceding section.
The following steps show you how to start the deployment process.
The following steps make it so the WLS admin console and the sample app are exposed to the public Internet with a built-in Kubernetes `LoadBalancer` service. For a more secure and scalable way to expose functionality to the public Internet, see [Tutorial: Migrate a WebLogic Server cluster to Azure with Azure Application Gateway as a load balancer](/azure/developer/java/migration/migrate-weblogic-with-app-gateway).
-1. Select the **Networking** pane.
-1. Next to the question **Create Standard Load Balancer services for Oracle WebLogic Server?**, select **Yes**.
+
+1. Select the **Load balancing** pane.
+1. Next to **Load Balancing Options**, select **Standard Load Balancer Service**.
1. In the table that appears, under **Service name prefix**, fill in the values as shown in the following table. The port values of *7001* for the admin server and *8001* for the cluster must be filled in exactly as shown. | Service name prefix | Target | Port |
The following steps make it so the WLS admin console and the sample app are expo
| console | admin-server | 7001 |
| app | cluster-1 | 8001 |
- :::image type="content" source="media/howto-deploy-java-wls-app/load-balancer-minimal-config.png" alt-text="Screenshot of Azure portal showing the simplest possible load balancer configuration on the Create Oracle WebLogic Server on Azure Kubernetes Service page." lightbox="media/howto-deploy-java-wls-app/load-balancer-minimal-config.png":::
- 1. Select **Review + create**. Ensure the green **Validation Passed** message appears at the top. If it doesn't, fix any validation problems, then select **Review + create** again. 1. Select **Create**. 1. Track the progress of the deployment on the **Deployment is in progress** page.
If you navigated away from the **Deployment is in progress** page, the following
1. In the left navigation pane, in the **Settings** section, select **Deployments**. You'll see an ordered list of the deployments to this resource group, with the most recent one first. 1. Scroll to the oldest entry in this list. This entry corresponds to the deployment you started in the preceding section. Select the oldest deployment, as shown in the following screenshot.
- :::image type="content" source="media/howto-deploy-java-wls-app/resource-group-deployments.png" alt-text="Screenshot of Azure portal showing the resource group deployments list." lightbox="media/howto-deploy-java-wls-app/resource-group-deployments.png":::
+ :::image type="content" source="media/howto-deploy-java-wls-app/resource-group-deployments.png" alt-text="Screenshot of the Azure portal showing the resource group deployments list." lightbox="media/howto-deploy-java-wls-app/resource-group-deployments.png":::
1. In the left panel, select **Outputs**. This list shows the output values from the deployment. Useful information is included in the outputs. 1. The **adminConsoleExternalUrl** value is the fully qualified, public Internet visible link to the WLS admin console for this AKS cluster. Select the copy icon next to the field value to copy the link to your clipboard. Save this value aside for later.
aks Http Application Routing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/http-application-routing.md
Title: HTTP application routing add-on for Azure Kubernetes Service (AKS) description: Use the HTTP application routing add-on to access applications deployed on Azure Kubernetes Service (AKS). -+ Last updated 04/05/2023
aks Ingress Tls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/ingress-tls.md
Title: Use TLS with an ingress controller on Azure Kubernetes Service (AKS)
description: Learn how to install and configure an ingress controller that uses TLS in an Azure Kubernetes Service (AKS) cluster. -+
aks Quick Kubernetes Deploy Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-kubernetes-deploy-bicep.md
Title: Quickstart - Create an Azure Kubernetes Service (AKS) cluster by using Bi
description: Learn how to quickly create a Kubernetes cluster using a Bicep file and deploy an application in Azure Kubernetes Service (AKS) Last updated 11/01/2022-+ #Customer intent: As a developer or cluster operator, I want to quickly create an AKS cluster and deploy an application so that I can see how to run applications using the managed Kubernetes service in Azure.
aks Quick Kubernetes Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-kubernetes-deploy-cli.md
Title: 'Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using Azure
description: Learn how to create a Kubernetes cluster, deploy an application, and monitor performance in Azure Kubernetes Service (AKS) using Azure CLI. Last updated 05/04/2023-+ #Customer intent: As a developer or cluster operator, I want to create an AKS cluster and deploy an application so I can see how to run and monitor applications using the managed Kubernetes service in Azure.
aks Quick Kubernetes Deploy Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-kubernetes-deploy-portal.md
description: Learn how to quickly create a Kubernetes cluster, deploy an application, and monitor performance in Azure Kubernetes Service (AKS) using the Azure portal. Last updated 11/01/2022-+ #Customer intent: As a developer or cluster operator, I want to quickly create an AKS cluster and deploy an application so that I can see how to run and monitor applications using the managed Kubernetes service in Azure.
aks Quick Kubernetes Deploy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-kubernetes-deploy-powershell.md
Title: 'Quickstart: Deploy an AKS cluster by using PowerShell'
description: Learn how to quickly create a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) using PowerShell. Last updated 11/01/2022-+ #Customer intent: As a developer or cluster operator, I want to quickly create an AKS cluster and deploy an application so that I can see how to run applications using the managed Kubernetes service in Azure.
aks Quick Windows Container Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-windows-container-deploy-cli.md
Title: Create a Windows Server container on an AKS cluster by using Azure CLI description: Learn how to quickly create a Kubernetes cluster, deploy an application in a Windows Server container in Azure Kubernetes Service (AKS) using the Azure CLI. -+ Last updated 11/01/2022 #Customer intent: As a developer or cluster operator, I want to quickly create an AKS cluster and deploy a Windows Server container so that I can see how to run applications running on a Windows Server container using the managed Kubernetes service in Azure.
aks Quick Windows Container Deploy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-windows-container-deploy-powershell.md
Title: Create a Windows Server container on an AKS cluster by using PowerShell
description: Learn how to quickly create a Kubernetes cluster, deploy an application in a Windows Server container in Azure Kubernetes Service (AKS) using PowerShell. Last updated 11/01/2022---+ #Customer intent: As a developer or cluster operator, I want to quickly create an AKS cluster and deploy a Windows Server container so that I can see how to run applications running on a Windows Server container using the managed Kubernetes service in Azure.
aks Tutorial Kubernetes Workload Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/tutorial-kubernetes-workload-identity.md
Title: Tutorial - Use a workload identity with an application on Azure Kubernetes Service (AKS) description: In this Azure Kubernetes Service (AKS) tutorial, you deploy an Azure Kubernetes Service cluster and configure an application to use a workload identity. -+ Last updated 05/24/2023
aks Limit Egress Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/limit-egress-traffic.md
Title: Control egress traffic using Azure Firewall in Azure Kubernetes Service (AKS) description: Learn how to control egress traffic using Azure Firewall in Azure Kubernetes Service (AKS) -+ Last updated 03/10/2023
aks Node Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-access.md
description: Learn how to connect to Azure Kubernetes Service (AKS) cluster node
Last updated 04/26/2023 --+ #Customer intent: As a cluster operator, I want to learn how to connect to virtual machines in an AKS cluster to perform maintenance or troubleshoot a problem.
If you need more troubleshooting data, you can [view the kubelet logs][view-kube
[ssh-linux-kubectl-debug]: #create-an-interactive-shell-connection-to-a-linux-node [az-aks-update]: /cli/azure/aks#az-aks-update [how-to-install-azure-extensions]: /cli/azure/azure-cli-extensions-overview#how-to-install-extensions--
aks Node Image Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-image-upgrade.md
Title: Upgrade Azure Kubernetes Service (AKS) node images description: Learn how to upgrade the images on AKS cluster nodes and node pools. -+ Last updated 03/28/2023
aks Node Updates Kured https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-updates-kured.md
Title: Handle Linux node reboots with kured
description: Learn how to update Linux nodes and automatically reboot them with kured in Azure Kubernetes Service (AKS) -+ Last updated 04/19/2023 #Customer intent: As a cluster administrator, I want to know how to automatically apply Linux updates and reboot nodes in AKS for security and/or compliance
aks Open Ai Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/open-ai-quickstart.md
Now that you've seen how to add OpenAI functionality to an AKS application, lear
[az-aks-get-credentials]: /cli/azure/aks#az-aks-get-credentials
-[aoai-get-started]: ../cognitive-services/openai/quickstart.md
+[aoai-get-started]: ../ai-services/openai/quickstart.md
-[managed-identity]: /azure/cognitive-services/openai/how-to/managed-identity#authorize-access-to-managed-identities
+[managed-identity]: /azure/ai-services/openai/how-to/managed-identity#authorize-access-to-managed-identities
[key-vault]: csi-secrets-store-driver.md
-[aoai]: ../cognitive-services/openai/index.yml
+[aoai]: ../ai-services/openai/index.yml
[learn-aoai]: /training/modules/explore-azure-openai
aks Quickstart Dapr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/quickstart-dapr.md
Last updated 06/22/2023-+ # Quickstart: Deploy an application using the Dapr cluster extension for Azure Kubernetes Service (AKS) or Arc-enabled Kubernetes
aks Rdp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/rdp.md
Title: RDP to AKS Windows Server nodes
description: Learn how to create an RDP connection with Azure Kubernetes Service (AKS) cluster Windows Server nodes for troubleshooting and maintenance tasks. -+ Last updated 04/26/2023 #Customer intent: As a cluster operator, I want to learn how to use RDP to connect to nodes in an AKS cluster to perform maintenance or troubleshoot a problem.
aks Resize Node Pool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/resize-node-pool.md
Title: Resize node pools in Azure Kubernetes Service (AKS) description: Learn how to resize node pools for a cluster in Azure Kubernetes Service (AKS) by cordoning and draining. + Last updated 02/08/2023 #Customer intent: As a cluster operator, I want to resize my node pools so that I can run more or larger workloads.
aks Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Kubernetes Service (AKS) description: Lists Azure Policy Regulatory Compliance controls available for Azure Kubernetes Service (AKS). These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
aks Upgrade Windows 2019 2022 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/upgrade-windows-2019-2022.md
Title: Upgrade Kubernetes workloads from Windows Server 2019 to 2022 description: Learn how to upgrade the OS version for Windows workloads on AKS + Last updated 8/18/2022 - # Upgrade Kubernetes workloads from Windows Server 2019 to 2022
aks Use Multiple Node Pools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-multiple-node-pools.md
Title: Use multiple node pools in Azure Kubernetes Service (AKS) description: Learn how to create and manage multiple node pools for a cluster in Azure Kubernetes Service (AKS) -+ Last updated 06/27/2023
aks Use Pod Sandboxing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-pod-sandboxing.md
Title: Pod Sandboxing (preview) with Azure Kubernetes Service (AKS) description: Learn about and deploy Pod Sandboxing (preview), also referred to as Kernel Isolation, on an Azure Kubernetes Service (AKS) cluster. -+ Last updated 06/07/2023
aks Use Wasi Node Pools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-wasi-node-pools.md
Title: Create WebAssembly System Interface (WASI) node pools in Azure Kubernetes Service (AKS) to run your WebAssembly (WASM) workload (preview) description: Learn how to create a WebAssembly System Interface (WASI) node pool in Azure Kubernetes Service (AKS) to run your WebAssembly (WASM) workload on Kubernetes. -+ Last updated 05/17/2023
aks Vertical Pod Autoscaler https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/vertical-pod-autoscaler.md
Title: Vertical Pod Autoscaling (preview) in Azure Kubernetes Service (AKS) description: Learn how to vertically autoscale your pod on an Azure Kubernetes Service (AKS) cluster. -+ Last updated 03/17/2023
aks Workload Identity Deploy Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/workload-identity-deploy-cluster.md
Title: Deploy and configure an Azure Kubernetes Service (AKS) cluster with workload identity description: In this Azure Kubernetes Service (AKS) article, you deploy an Azure Kubernetes Service cluster and configure it with an Azure AD workload identity. -+ Last updated 05/24/2023
aks Workload Identity Migrate From Pod Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/workload-identity-migrate-from-pod-identity.md
Title: Migrate your Azure Kubernetes Service (AKS) pod to use workload identity description: In this Azure Kubernetes Service (AKS) article, you learn how to configure your Azure Kubernetes Service pod to authenticate with workload identity. -+ Last updated 05/23/2023
api-management Authorizations How To Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/authorizations-how-to-azure-ad.md
You learn how to:
Create an Azure AD application for the API and give it the appropriate permissions for the requests that you want to call.
-1. Sign into the [Azure portal](https://portal.azure.com/) with an account with sufficient permissions in the tenant.
+1. Sign in to the [Azure portal](https://portal.azure.com) with an account with sufficient permissions in the tenant.
1. Under **Azure Services**, search for **Azure Active Directory**. 1. On the left menu, select **App registrations**, and then select **+ New registration**. :::image type="content" source="media/authorizations-how-to-azure-ad/create-registration.png" alt-text="Screenshot of creating an Azure AD app registration in the portal.":::
api-management Configure Graphql Resolver https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/configure-graphql-resolver.md
-+ Last updated 06/08/2023
api-management Enable Cors Power Platform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/enable-cors-power-platform.md
If you've exported an API from API Management as a [custom connector](export-api
Follow these steps to configure the CORS policy in API Management.
-1. Sign into [Azure portal](https://portal.azure.com) and go to your API Management instance.
+1. Sign in to the [Azure portal](https://portal.azure.com) and go to your API Management instance.
1. In the left menu, select **APIs** and select the API that you exported as a custom connector. If you want to, select only an API operation to apply the policy to. 1. In the **Policies** section, in the **Inbound processing** section, select **+ Add policy**. 1. Select **Allow cross-origin resource sharing (CORS)**.
api-management Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure API Management description: Lists Azure Policy Regulatory Compliance controls available for Azure API Management. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
api-management Self Hosted Gateway Settings Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/self-hosted-gateway-settings-reference.md
Here is an overview of all configuration options:
| config.service.auth.azureAd.clientSecret | Secret of the Azure AD app to authenticate with. | Yes, when using Azure AD authentication (unless certificate is specified) | N/A | v2.3+ | | config.service.auth.azureAd.certificatePath | Path to certificate to authenticate with for the Azure AD app. | Yes, when using Azure AD authentication (unless secret is specified) | N/A | v2.3+ | | config.service.auth.azureAd.authority | Authority URL of Azure AD. | No | `https://login.microsoftonline.com` | v2.3+ |
+| config.service.endpoint.disableCertificateValidation | Defines whether the self-hosted gateway should validate the server-side certificate of the Configuration API. Keeping certificate validation enabled is recommended; disable it only for testing purposes and with caution, because doing so can introduce a security risk. | No | `false` | v2.0+ |
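As a sketch of how such a setting is typically supplied to a locally run gateway container (a hedged example; the endpoint and token values are placeholders, and certificate validation should stay enabled outside isolated test setups):

```shell
# Hypothetical local test run of the self-hosted gateway container.
# Disabling certificate validation is for isolated test environments only.
docker run -d \
  -e config.service.endpoint="<configuration-endpoint>" \
  -e config.service.auth="<gateway-access-token>" \
  -e config.service.endpoint.disableCertificateValidation=true \
  -p 8080:8080 \
  mcr.microsoft.com/azure-api-management/gateway:v2
```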
The self-hosted gateway provides support for a few authentication options to integrate with the Configuration API which can be defined by using `config.service.auth`.
app-service Deploy Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/deploy-best-practices.md
There are examples below for common automation frameworks.
### Use Azure DevOps
-App Service has [built-in continuous delivery](deploy-continuous-deployment.md) for containers through the Deployment Center. Navigate to your app in the [Azure portal](https://portal.azure.com/) and select **Deployment Center** under **Deployment**. Follow the instructions to select your repository and branch. This will configure a DevOps build and release pipeline to automatically build, tag, and deploy your container when new commits are pushed to your selected branch.
+App Service has [built-in continuous delivery](deploy-continuous-deployment.md) for containers through the Deployment Center. Navigate to your app in the [Azure portal](https://portal.azure.com) and select **Deployment Center** under **Deployment**. Follow the instructions to select your repository and branch. This will configure a DevOps build and release pipeline to automatically build, tag, and deploy your container when new commits are pushed to your selected branch.
### Use GitHub Actions
az ad sp create-for-rbac --name "myServicePrincipal" --role contributor \
In your script, log in using `az login --service-principal`, providing the principal's information. You can then use `az webapp config container set` to set the container name, tag, registry URL, and registry password. Below are some helpful links for you to construct your container CI process. -- [How to log into the Azure CLI on Circle CI](https://circleci.com/orbs/registry/orb/circleci/azure-cli)
+- [How to sign in to the Azure CLI on Circle CI](https://circleci.com/orbs/registry/orb/circleci/azure-cli)
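The two CI steps described above can be sketched as follows. Every value is a placeholder, and the commands are only printed (not executed) so the sequence can be reviewed before wiring it into a pipeline:

```shell
# Sketch of the container CI steps: service-principal sign-in, then container config.
# All identifiers below are placeholders for your own app, registry, and credentials.
app_id="<service-principal-app-id>"
sp_password="<service-principal-password>"
tenant="<tenant-id>"

login_cmd="az login --service-principal -u $app_id -p $sp_password --tenant $tenant"
config_cmd="az webapp config container set --name <app-name> --resource-group <group> \
  --docker-custom-image-name <registry>.azurecr.io/<image>:<tag> \
  --docker-registry-server-url https://<registry>.azurecr.io \
  --docker-registry-server-password <registry-password>"

echo "$login_cmd"
echo "$config_cmd"
```

In a real pipeline you would run these commands directly, sourcing the credentials from your CI system's secret store rather than inlining them.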
## Language-Specific Considerations
app-service Migration Alternatives https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migration-alternatives.md
You can select a custom backup and restore it to an App Service in your App Serv
This solution is recommended for users that are using Windows App Service and can't migrate using the [migration feature](migrate.md). You need to set up your new App Service Environment v3 before cloning any apps. Cloning an app can take up to 30 minutes to complete. Cloning can be done using PowerShell as described in the [documentation](../app-service-web-app-cloning.md#cloning-an-existing-app-to-an-app-service-environment) or using the Azure portal.
-To clone an app using the [Azure portal](https://www.portal.azure.com), navigate to your existing App Service and select **Clone App** under **Development Tools**. Fill in the required fields using the details for your new App Service Environment v3.
+To clone an app using the [Azure portal](https://portal.azure.com), navigate to your existing App Service and select **Clone App** under **Development Tools**. Fill in the required fields using the details for your new App Service Environment v3.
1. Select an existing or create a new **Resource Group**. 1. Give your app a **Name**. This name can be the same as the old app, but note that the site's default URL in the new environment will be different. You need to update any custom DNS or connected resources to point to the new URL.
app-service Networking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/networking.md
Title: App Service Environment networking
description: App Service Environment networking details Previously updated : 02/06/2023 Last updated : 07/21/2023
You must delegate the subnet to `Microsoft.Web/hostingEnvironments`, and the sub
The size of the subnet can affect the scaling limits of the App Service plan instances within the App Service Environment. It's a good idea to use a `/24` address space (256 addresses) for your subnet, to ensure enough addresses to support production scale. >[!NOTE]
-> Windows Containers uses an additional IP address per app for each App Service plan instance, and you need to size the subnet accordingly. If your App Service Environment has for example 2 Windows Container App Service plans each with 25 instances and each with 5 apps running, you will need 300 IP addresses and additional addresses to support horizontal (up/down) scale.
+> Windows Containers uses an additional IP address per app for each App Service plan instance, and you need to size the subnet accordingly. If your App Service Environment has for example 2 Windows Container App Service plans each with 25 instances and each with 5 apps running, you will need 300 IP addresses and additional addresses to support horizontal (in/out) scale.
+>
+> Sample calculation:
+>
+> For each App Service plan instance, you need:
+> 5 Windows Container apps = 5 IP addresses
+> 1 IP address per App Service plan instance
+> 5 + 1 = 6 IP addresses
+>
+> For 25 instances:
+> 6 x 25 = 150 IP addresses per App Service plan
+>
+> Since you have 2 App Service plans, 2 x 150 = 300 IP addresses.
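The arithmetic in the note above can be sketched as a quick shell calculation (the counts are the note's example values; substitute your own plan, instance, and app counts):

```shell
# Subnet sizing sketch for Windows Container App Service plans.
apps_per_instance=5    # Windows Container apps on each App Service plan instance
extra_per_instance=1   # one additional IP address per plan instance
instances=25           # instances per App Service plan
plans=2                # Windows Container App Service plans

total=$(( (apps_per_instance + extra_per_instance) * instances * plans ))
echo "$total IP addresses needed, plus headroom for scale-out"
```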
If you use a smaller subnet, be aware of the following limitations:
app-service Overview Vnet Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/overview-vnet-integration.md
Title: Integrate your app with an Azure virtual network
description: Integrate your app in Azure App Service with Azure virtual networks. Previously updated : 05/24/2023 Last updated : 07/21/2023
When you scale up/down in size or in/out in number of instances, the required ad
Because subnet size can't be changed after assignment, use a subnet that's large enough to accommodate whatever scale your app might reach. You should also reserve IP addresses for platform upgrades. To avoid any issues with subnet capacity, use a `/26` with 64 addresses. When you're creating subnets in Azure portal as part of integrating with the virtual network, a minimum size of /27 is required. If the subnet already exists before integrating through the portal, you can use a /28 subnet. >[!NOTE]
-> Windows Containers uses an additional IP address per app for each App Service plan instance, and you need to size the subnet accordingly. If you have for example 10 Windows Container App Service plan instances with 4 apps running, you will need 50 IP addresses and additional addresses to support horizontal (up/down) scale.
+> Windows Containers uses an additional IP address per app for each App Service plan instance, and you need to size the subnet accordingly. If you have for example 10 Windows Container App Service plan instances with 4 apps running, you will need 50 IP addresses and additional addresses to support horizontal (in/out) scale.
+>
+> Sample calculation:
+>
+> For each App Service plan instance, you need:
+> 4 Windows Container apps = 4 IP addresses
+> 1 IP address per App Service plan instance
+> 4 + 1 = 5 IP addresses
+>
+> For 10 instances:
+> 5 x 10 = 50 IP addresses per App Service plan
+>
+> Since you have 1 App Service plan, 1 x 50 = 50 IP addresses.
When you want your apps in your plan to reach a virtual network that's already connected to by apps in another plan, select a different subnet than the one being used by the pre-existing virtual network integration.
app-service Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/overview.md
description: Learn how Azure App Service helps you develop and host web applicat
ms.assetid: 94af2caf-a2ec-4415-a097-f60694b860b3 Previously updated : 06/14/2023 Last updated : 07/19/2023
app-service Quickstart Php https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-php.md
Browse to the deployed application in your web browser at the URL `http://<app-n
1. From the left navigation, select **Deployment Center**.
- ![Screenshot of the App Service in the Azure Portal. The Deployment Center option in the Deployment section of the left navigation is highlighted.](media/quickstart-php/azure-portal-configure-app-service-deployment-center.png)
+ ![Screenshot of the App Service in the Azure portal. The Deployment Center option in the Deployment section of the left navigation is highlighted.](media/quickstart-php/azure-portal-configure-app-service-deployment-center.png)
1. Under **Settings**, select a **Source**. For this quickstart, select *GitHub*.
app-service Quickstart Ruby https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-ruby.md
ms.assetid: 6d00c73c-13cb-446f-8926-923db4101afa
Last updated 04/27/2021 ms.devlang: ruby-+ # Create a Ruby on Rails App in App Service
http://<app-name>.azurewebsites.net
1. From the left navigation, select **Deployment Center**.
- ![Screenshot of the App Service in the Azure Portal. The Deployment Center option in the Deployment section of the left navigation is highlighted.](media/quickstart-ruby/azure-portal-configure-app-service-deployment-center.png)
+ ![Screenshot of the App Service in the Azure portal. The Deployment Center option in the Deployment section of the left navigation is highlighted.](media/quickstart-ruby/azure-portal-configure-app-service-deployment-center.png)
1. Under **Settings**, select a **Source**. For this quickstart, select *GitHub*.
app-service Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure App Service description: Lists Azure Policy Regulatory Compliance controls available for Azure App Service. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
app-service Tutorial Multi Container App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/tutorial-multi-container-app.md
Last updated 11/18/2022 -+ # Tutorial: Create a multi-container (preview) app in Web App for Containers
app-spaces Deploy App Spaces Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-spaces/deploy-app-spaces-template.md
To use a sample app for Azure App Spaces, you must have the following items:
Do the following steps to deploy a sample app to App Spaces.
-1. Sign in to the [Azure portal](https://ms.portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Enter `App Spaces` in the search box, and then select **App Spaces**. 3. Select a sample app. For this example, we selected the **Static Web App with Node.js API - Mongo DB** template.
app-spaces Quickstart Deploy Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-spaces/quickstart-deploy-web-app.md
To deploy your repository to App Spaces, you must have the following items:
Do the following steps to deploy an existing repository from GitHub.
-1. Sign in to the [Azure portal](https://ms.portal.azure.com/#home).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Enter `App Spaces` in the search box, and then select **App Spaces**. 3. Choose **Start deploying**.
application-gateway Ingress Controller Autoscale Pods https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-autoscale-pods.md
description: This article provides instructions on how to scale your AKS backend
+ Last updated 04/27/2023
ab -n10000 http://<application-gateway-ip-address>/
``` ## Next steps-- [**Troubleshoot Ingress Controller issues**](ingress-controller-troubleshoot.md): Troubleshoot any issues with the Ingress Controller.
+- [**Troubleshoot Ingress Controller issues**](ingress-controller-troubleshoot.md): Troubleshoot any issues with the Ingress Controller.
application-gateway Ingress Controller Expose Service Over Http Https https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-expose-service-over-http-https.md
description: This article provides information on how to expose an AKS service o
+ Last updated 04/27/2023
application-gateway Ingress Controller Install Existing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-install-existing.md
description: This article provides information on how to deploy an Application G
-+ Last updated 05/25/2023
application-gateway Ingress Controller Install New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-install-new.md
description: This article provides information on how to deploy an Application G
+ Last updated 04/27/2023
application-gateway Ingress Controller Letsencrypt Certificate Application Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-letsencrypt-certificate-application-gateway.md
description: This article provides information on how to obtain a certificate fr
+ Last updated 04/27/2023
application-gateway Ingress Controller Private Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-private-ip.md
description: This article provides information on how to use private IPs for int
+ Last updated 04/27/2023
This makes the ingress controller filter the IP address configurations for a Pri
AGIC can panic and crash if `usePrivateIP: true` and no Private IP is assigned. > [!NOTE]
-> Application Gateway v2 SKU requires a Public IP. Should you require Application Gateway to be private, Attach a [`Network Security Group`](../virtual-network/network-security-groups-overview.md) to the Application Gateway's subnet to restrict traffic.
+> Application Gateway v2 SKU requires a Public IP. Should you require Application Gateway to be private, attach a [`Network Security Group`](../virtual-network/network-security-groups-overview.md) to the Application Gateway's subnet to restrict traffic.
application-gateway Ingress Controller Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-troubleshoot.md
description: This article provides documentation on how to troubleshoot common q
+ Last updated 04/27/2023
application-gateway Ingress Controller Update Ingress Controller https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ingress-controller-update-ingress-controller.md
description: This article provides information on how to upgrade an Application
+ Last updated 04/27/2023
application-gateway Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/quick-create-cli.md
Last updated 06/07/2023 -+ # Quickstart: Direct web traffic with Azure Application Gateway - Azure CLI
application-gateway Redirect Http To Https Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-http-to-https-cli.md
description: Learn how to create an HTTP to HTTPS redirection and add a certific
-+ Last updated 04/27/2023
application-gateway Redirect Internal Site Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-internal-site-cli.md
description: Learn how to create an application gateway that redirects internal
-+ Last updated 04/27/2023
application-gateway Self Signed Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/self-signed-certificates.md
Last updated 04/27/2023 -+ # Generate an Azure Application Gateway self-signed certificate with a custom root CA
application-gateway Tutorial Manage Web Traffic Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-manage-web-traffic-cli.md
Last updated 04/27/2023 -+ # Manage web traffic with an application gateway using the Azure CLI
application-gateway Tutorial Multiple Sites Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-multiple-sites-cli.md
Last updated 04/27/2023 -+ #Customer intent: As an IT administrator, I want to use Azure CLI to configure Application Gateway to host multiple web sites , so I can ensure my customers can access the web information they need.
application-gateway Tutorial Ssl Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-ssl-cli.md
Last updated 04/27/2023 -+ # Create an application gateway with TLS termination using the Azure CLI
application-gateway Tutorial Url Redirect Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-redirect-cli.md
Last updated 04/27/2023 -+ #Customer intent: As an IT administrator, I want to use Azure CLI to set up URL path redirection of web traffic to specific pools of servers so I can ensure my customers have access to the information they need.
application-gateway Tutorial Url Route Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-route-cli.md
Last updated 04/27/2023 -+ #Customer intent: As an IT administrator, I want to use Azure CLI to set up routing of web traffic to specific pools of servers based on the URL that the customer uses, so I can ensure my customers have the most efficient route to the information they need.
attestation Private Endpoint Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/attestation/private-endpoint-powershell.md
New-AzPrivateDnsZoneGroup -ResourceGroupName $rg -PrivateEndpointName "myPrivate
In this section, you'll use the virtual machine you created in the previous step to connect to the SQL server across the private endpoint.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Resource groups** in the left-hand navigation pane.
automation Automation Hrw Run Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-hrw-run-runbooks.md
Last updated 03/27/2023 -+ # Run Automation runbooks on a Hybrid Runbook Worker
automation Automation Linux Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-linux-hrw-install.md
Title: Deploy an agent-based Linux Hybrid Runbook Worker in Automation
description: This article tells how to install an agent-based Hybrid Runbook Worker to run runbooks on Linux-based machines in your local datacenter or cloud environment. + Last updated 04/12/2023
automation Automation Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-role-based-access-control.md
The following section shows you how to configure Azure RBAC on your Automation a
### Configure Azure RBAC using the Azure portal
-1. Sign in to the [Azure portal](https://portal.azure.com/) and open your Automation account from the **Automation Accounts** page.
+1. Sign in to the [Azure portal](https://portal.azure.com) and open your Automation account from the **Automation Accounts** page.
1. Select **Access control (IAM)** and select a role from the list of available roles. You can choose any of the available built-in roles that an Automation account supports or any custom role you might have defined. Assign the role to a user to which you want to give permissions.
New-AzRoleAssignment -ObjectId $userId -RoleDefinitionName "Automation Job Opera
New-AzRoleAssignment -ObjectId $userId -RoleDefinitionName "Automation Runbook Operator" -Scope $rb.ResourceId ```
-Once the script has run, have the user log in to the Azure portal and select **All Resources**. In the list, the user can see the runbook for which he/she has been added as an Automation Runbook Operator.
+Once the script has run, have the user sign in to the Azure portal and select **All Resources**. In the list, the user can see the runbook to which they've been added as an Automation Runbook Operator.
![Runbook Azure RBAC in the portal](./media/automation-role-based-access-control/runbook-rbac.png)
When a user assigned to the Automation Operator role on the Runbook scope views
* To learn about security guidelines, see [Security best practices in Azure Automation](automation-security-guidelines.md). * To find out more about Azure RBAC using PowerShell, see [Add or remove Azure role assignments using Azure PowerShell](../role-based-access-control/role-assignments-powershell.md). * For details of the types of runbooks, see [Azure Automation runbook types](automation-runbook-types.md).
-* To start a runbook, see [Start a runbook in Azure Automation](start-runbooks.md).
+* To start a runbook, see [Start a runbook in Azure Automation](start-runbooks.md).
automation Manage Change Tracking Monitoring Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/change-tracking/manage-change-tracking-monitoring-agent.md
Title: Manage change tracking and inventory in Azure Automation using Azure Moni
description: This article tells how to use change tracking and inventory to track software and Microsoft service changes in your environment using Azure Monitoring Agent (Preview) Previously updated : 12/28/2022 Last updated : 07/17/2023
This article describes how to manage change tracking, and includes the procedure
To manage tracking and inventory, ensure that you enable Change tracking with AMA on your VM.
-1. In the Azure portal, select the virtual machine.
+1. In the [Azure portal](https://portal.azure.com), select the virtual machine.
1. Select a specific VM for which you would like to configure the Change tracking settings.
1. Under **Operations**, select **Change tracking**.
1. Select **Settings** to view the **Data Collection Rule Configuration** (DCR) page. Here, you can do the following actions:
To manage tracking and inventory, ensure that you enable Change tracking with AM
You can now view the virtual machines configured to the DCR. +
+### Configure file content changes
+
+To configure file content changes, follow these steps:
+
+1. In your virtual machine, under **Operations**, select **Change tracking** > **Settings**.
+1. In the **Data Collection Rule Configuration (Preview)** page, select **File Content** > **Link** to link the storage account.
+
+ :::image type="content" source="media/manage-change-tracking-monitoring-agent/file-content-inline.png" alt-text="Screenshot of selecting the link option to connect with the Storage account." lightbox="media/manage-change-tracking-monitoring-agent/file-content-expanded.png":::
+
+1. In the **Content Location for Change Tracking** screen, select your **Subscription** and **Storage** account, and confirm whether you're using a **System Assigned Managed Identity**.
+1. Select **Upload file content for all settings**, and then select **Save**. This ensures that file content changes are tracked for all the files residing in this DCR.
+
+#### [System Assigned Managed Identity](#tab/sa-mi)
+
+When the storage account is linked using the system assigned managed identity, a blob is created.
+
+1. From [Azure portal](https://portal.azure.com), go to **Storage accounts**, and select the storage account.
+1. In the storage account page, under **Data storage**, select **Containers** > **Changetracking blob** > **Access Control (IAM)**.
+1. In the **Changetrackingblob | Access Control (IAM)** page, select **Add** and then select **Add role assignment**.
+
+ :::image type="content" source="media/manage-change-tracking-monitoring-agent/blob-add-role-inline.png" alt-text="Screenshot of selecting to add role." lightbox="media/manage-change-tracking-monitoring-agent/blob-add-role-expanded.png":::
+
+1. In the **Add role assignment** page, search for **Storage Blob Data Contributor** to assign the role for the specific VM. This permission provides access to read, write, and delete storage blob containers and data.
+
+   :::image type="content" source="media/manage-change-tracking-monitoring-agent/blob-contributor-inline.png" alt-text="Screenshot of selecting the contributor role for storage blob." lightbox="media/manage-change-tracking-monitoring-agent/blob-contributor-expanded.png":::
+
+1. Select the role and assign it to your virtual machine.
+
+ :::image type="content" source="media/manage-change-tracking-monitoring-agent/blob-add-role-vm-inline.png" alt-text="Screenshot of assigning the role to VM." lightbox="media/manage-change-tracking-monitoring-agent/blob-add-role-vm-expanded.png":::
+
+#### [User Assigned Managed Identity](#tab/ua-mi)
+
+For user-assigned managed identity, follow these steps to assign the user assigned managed identity to the VM and provide the permission.
+
+1. In the storage account page, under **Data storage**, select **Containers** > **Changetracking blob** > **Access Control (IAM)**.
+1. In **Changetrackingblob | Access Control (IAM)** page, select **Add** and then select **Add role assignment**.
+1. Search for **Storage Blob Data Contributor**, select the role and assign it to your user-assigned managed identity.
+
+ :::image type="content" source="media/manage-change-tracking-monitoring-agent/user-assigned-add-role-inline.png" alt-text="Screenshot of adding the role to user-assigned managed identity." lightbox="media/manage-change-tracking-monitoring-agent/user-assigned-add-role-expanded.png":::
+
+1. Go to your virtual machine. Under **Settings**, select **Identity**, and on the **User assigned** tab, select **+Add**.
+
+1. In the **Add user assigned managed identity**, select the **Subscription** and add the user-assigned managed identity.
+ :::image type="content" source="media/manage-change-tracking-monitoring-agent/user-assigned-assign-role-inline.png" alt-text="Screenshot of assigning the role to user-assigned managed identity." lightbox="media/manage-change-tracking-monitoring-agent/user-assigned-assign-role-expanded.png":::
++
+#### Upgrade the extension version
+
+> [!NOTE]
+> Ensure that the ChangeTracking-Linux or ChangeTracking-Windows extension version is upgraded to 2.13.
+
+Use the following command to upgrade the extension version:
+
+```azurecli-interactive
+az vm extension set -n {ExtensionName} --publisher Microsoft.Azure.ChangeTrackingAndInventory --ids {VirtualMachineResourceId}
+```
+The extension for Windows is `Vms - ChangeTracking-Windows` and for Linux is `Vms - ChangeTracking-Linux`.
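As a purely illustrative sketch (not part of the product or the CLI), the per-OS choice described above can be expressed as a small helper that assembles the `az vm extension set` invocation. The publisher and extension names are taken from the text; the helper itself is hypothetical:

```python
# Hypothetical helper; publisher and extension names come from the docs above,
# everything else is illustrative.
PUBLISHER = "Microsoft.Azure.ChangeTrackingAndInventory"

EXTENSION_NAMES = {
    "windows": "ChangeTracking-Windows",
    "linux": "ChangeTracking-Linux",
}

def build_upgrade_command(os_type: str, vm_resource_id: str) -> str:
    """Assemble the az CLI command that upgrades the Change Tracking extension."""
    name = EXTENSION_NAMES[os_type.lower()]
    return (
        f"az vm extension set -n {name} "
        f"--publisher {PUBLISHER} --ids {vm_resource_id}"
    )
```

For example, `build_upgrade_command("linux", "<vm-resource-id>")` yields the Linux variant of the command shown above.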
+
### Configure using wildcards

To configure the monitoring of files and folders using wildcards, do the following:
automation Overview Monitoring Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/change-tracking/overview-monitoring-agent.md
Title: Azure Automation Change Tracking and Inventory overview using Azure Monit
description: This article describes the Change Tracking and Inventory feature using Azure monitoring agent (Preview), which helps you identify software and Microsoft service changes in your environment. Previously updated : 06/15/2023 Last updated : 07/17/2023
You can enable Change Tracking and Inventory in the following ways:
For tracking changes in files on both Windows and Linux, Change Tracking and Inventory uses SHA256 hashes of the files. The feature uses the hashes to detect if changes have been made since the last inventory.
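To make the hash-based detection concrete, here's a minimal Python sketch (an illustration of the idea, not the agent's actual code) that flags a file as changed when its SHA256 digest differs from the one recorded in the last inventory:

```python
import hashlib

def sha256_of(path):
    """Return the SHA256 hex digest of a file's contents, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def detect_changes(paths, previous_inventory):
    """Return (changed_paths, new_inventory) given a {path: digest} inventory."""
    current = {path: sha256_of(path) for path in paths}
    changed = [p for p in paths if previous_inventory.get(p) != current[p]]
    return changed, current
```

A file shows up in `changed` both when it's new (no recorded digest) and when its contents differ from the recorded digest.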
+## Tracking file content changes
+
+Change Tracking and Inventory allows you to view the contents of a Windows or Linux file. For each change to a file, Change Tracking and Inventory stores the contents of the file in an [Azure Storage account](../../storage/common/storage-account-create.md). When you're tracking a file, you can view its contents before or after a change. The file content can be viewed either inline or side by side. [Learn more](manage-change-tracking-monitoring-agent.md#configure-file-content-changes).
+
+![Screenshot of viewing changes in a Windows or Linux file.](./media/overview/view-file-changes.png)
++ ## Tracking of registry keys
-Change Tracking and Inventory allows monitoring of changes to Windows registry keys. Monitoring allows you to pinpoint extensibility points where third-party code and malware can activate. The following table lists preconfigured (but not enabled) registry keys. To track these keys, you must enable each one.
+Change Tracking and Inventory allows monitoring of changes to Windows registry keys. Monitoring allows you to pinpoint extensibility points where third-party code and malware can activate. The following table lists pre-configured (but not enabled) registry keys. To track these keys, you must enable each one.
> [!div class="mx-tdBreakAll"]
> |Registry Key | Purpose |
automation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/change-tracking/overview.md
Title: Azure Automation Change Tracking and Inventory overview
description: This article describes the Change Tracking and Inventory feature, which helps you identify software and Microsoft service changes in your environment. + Last updated 02/27/2023
automation Dsc Linux Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/dsc-linux-powershell.md
description: This article tells you how to configure a Linux virtual machine to
-+ Last updated 08/31/2021
automation Enable Managed Identity For Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/enable-managed-identity-for-automation.md
New-AzRoleAssignment `
To verify a role to a system-assigned managed identity of the Automation account, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to your Automation account.
1. Under **Account Settings**, select **Identity**.
automation Migrate Existing Agent Based Hybrid Worker To Extension Based Workers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md
description: This article provides information on how to migrate an existing age
Last updated 04/11/2023-+ #Customer intent: As a developer, I want to learn about extension so that I can efficiently migrate agent based hybrid workers to extension based workers.
automation Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Automation description: Lists Azure Policy Regulatory Compliance controls available for Azure Automation. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
automation Remove Node And Configuration Package https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/state-configuration/remove-node-and-configuration-package.md
description: This article explains how to remove an Azure Automation State Confi
+ Last updated 04/16/2021
automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/troubleshoot/hybrid-runbook-worker.md
description: This article tells how to troubleshoot and resolve issues that aris
Last updated 04/26/2023 + # Troubleshoot agent-based Hybrid Runbook Worker issues in Automation
automation Update Agent Issues Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/troubleshoot/update-agent-issues-linux.md
Last updated 11/01/2021 + # Troubleshoot Linux update agent issues
automation View Update Assessments https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/update-management/view-update-assessments.md
In Update Management, you can view information about your machines, missing upda
To view update assessment from an Azure VM:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Virtual Machines** and select your virtual machine from the list. From the left menu, under **Operations**, select **Updates**, and select **Go to Update Management**. In Update Management, you can view information about your machine, missing updates, update deployments, manage multiple machines, scheduled update deployments and so on.
In Update Management, you can view information about your machine, missing updat
To view update assessment from an Azure Arc-enabled server:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Servers - Azure Arc** and select your server from the list. From the left menu, under **Operation**, select **Guest + host updates** and select **Go to Updates using Update management center**. In Update Management, you can view information about your Azure Arc-enabled machine, total updates, assess updates, scheduled update deployments, and so on.
In Update Management, you can view information about your Azure Arc-enabled mach
To view update assessment across all machines, including Azure Arc-enabled servers from your Automation account:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Automation accounts** and select your Automation account with Update Management enabled from the list. In your Automation account, select **Update management** from the left menu. The updates for your environment are listed on the **Update management** page. If any updates are identified as missing, a list of them appears in the **Missing updates** tab.
azure-app-configuration Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure App Configuration description: Lists Azure Policy Regulatory Compliance controls available for Azure App Configuration. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-arc Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/connectivity.md
Previously updated : 03/30/2022 Last updated : 07/19/2023 # Connectivity modes and requirements
+This article describes the connectivity modes available for Azure Arc-enabled data services, and their respective requirements.
## Connectivity modes
azure-arc Delete Azure Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/delete-azure-resources.md
Title: Delete resources from Azure
-description: Delete resources from Azure
+ Title: Delete resources from Azure Arc-enabled data services
+description: Describes how to delete resources from Azure Arc-enabled data services
Previously updated : 11/03/2021 Last updated : 07/19/2023
This article describes how to delete Azure Arc-enabled data service resources fr
> [!WARNING] > When you delete resources as described in this article, these actions are irreversible.
+The information in this article applies to resources in Azure Arc-enabled data services. To delete resources in Azure, review the information at [Azure Resource Manager resource group and resource deletion](../../azure-resource-manager/management/delete-resource-group.md).
+
## Before

Before you delete a resource such as Azure Arc SQL managed instance or Azure Arc data controller, you need to export and upload the usage information to Azure for accurate billing calculation by following the instructions described in [Upload billing data to Azure - Indirectly connected mode](view-billing-data-in-azure.md#upload-billing-data-to-azureindirectly-connected-mode).
From Azure portal:
3. Optionally delete the Custom Location that the Azure Arc data controller is deployed to.
4. Optionally, you can also delete the namespace on your Kubernetes cluster if there are no other resources created in the namespace.

-- See [Manage Azure resources by using the Azure portal](../../azure-resource-manager/management/manage-resources-portal.md).

## Indirect connectivity mode
azure-arc Managed Instance Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/managed-instance-overview.md
Previously updated : 07/30/2021 Last updated : 07/19/2023
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/overview.md
Title: What are Azure Arc-enabled data services
-description: Introduces Azure Arc-enabled data services
+ Title: Introducing Azure Arc-enabled data services
+description: Describes Azure Arc-enabled data services
Previously updated : 07/30/2021 Last updated : 07/19/2023 # Customer intent: As a data professional, I want to understand why my solutions would benefit from running with Azure Arc-enabled data services so that I can leverage the capability of the feature.
azure-arc Plan Azure Arc Data Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/plan-azure-arc-data-services.md
Previously updated : 11/03/2021 Last updated : 07/19/2023 # Plan an Azure Arc-enabled data services deployment
In order to experience Azure Arc-enabled data services, you'll need to complete
1. [Install client tools](install-client-tools.md). 1. Register the Microsoft.AzureArcData provider for the subscription where the Azure Arc-enabled data services will be deployed, as follows:
- ```console
+
+ ```azurecli
az provider register --namespace Microsoft.AzureArcData
```
Verify that:
- The other [client tools](install-client-tools.md) are installed.
- You have access to the Kubernetes cluster.
- Your *kubeconfig* file is configured. It should point to the Kubernetes cluster that you want to deploy to. To verify the current context of the cluster, run the following command:

  ```console
  kubectl cluster-info
  ```
azure-arc Resize Persistent Volume Claim https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/resize-persistent-volume-claim.md
Previously updated : 08/29/2022 Last updated : 07/19/2023
This article explains how to resize an existing persistent volume to increase it
> [!NOTE]
> Resizing PVCs using this method only works if your `StorageClass` supports `AllowVolumeExpansion=True`.
-When you deploy an Azure Arc enabled SQL managed instance, you can configure the size of the persistent volume (PV) for `data`, `logs`, `datalogs`, and `backups`. The deployment creates these volumes based on the values set by parameters `--volume-size-data`, `--volume-size-logs`, `--volume-size-datalogs`, and `--volume-size-backups`. When these volumes become full, you will need to resize the `PersistentVolumes`. Azure Arc enabled SQL Managed Instance is deployed as part of a `StatefulSet` for both General Purpose or Business Critical service tiers. Kubernetes supports automatic resizing for persistent volumes but not for volumes attached to `StatefulSet`.
+When you deploy an Azure Arc-enabled SQL managed instance, you can configure the size of the persistent volume (PV) for `data`, `logs`, `datalogs`, and `backups`. The deployment creates these volumes based on the values set by parameters `--volume-size-data`, `--volume-size-logs`, `--volume-size-datalogs`, and `--volume-size-backups`. When these volumes become full, you will need to resize the `PersistentVolumes`. Azure Arc-enabled SQL Managed Instance is deployed as part of a `StatefulSet` for both General Purpose or Business Critical service tiers. Kubernetes supports automatic resizing for persistent volumes but not for volumes attached to `StatefulSet`.
Following are the steps to resize persistent volumes attached to `StatefulSet`:
For example, the following command sets the `StatefulSet` replicas to 3:

```console
kubectl scale statefulsets sqlmi1 --namespace arc --replicas=3
```
-Ensure the Arc enabled SQL managed instance is back to ready status by running:
+Ensure the Arc-enabled SQL managed instance is back to ready status by running:
```console
kubectl get sqlmi -A
```
azure-arc Service Tiers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/service-tiers.md
Previously updated : 03/01/2022- Last updated : 07/19/2023+ # Azure Arc-enabled SQL Managed Instance service tiers
azure-arc What Is Azure Arc Enabled Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/what-is-azure-arc-enabled-postgresql.md
Previously updated : 11/03/2021- Last updated : 07/19/2023+

# What is Azure Arc-enabled PostgreSQL server

[!INCLUDE [azure-arc-data-preview](../../../includes/azure-arc-data-preview.md)]

-- **Azure Arc-enabled PostgreSQL server** is one of the database engines available as part of Azure Arc-enabled data services.

## Compare PostgreSQL solutions provided by Microsoft in Azure
azure-arc Private Link Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/private-link-security.md
Title: Use Azure Private Link to securely connect servers to Azure Arc description: Learn how to use Azure Private Link to securely connect networks to Azure Arc. + Last updated 06/20/2023
azure-arc Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Arc-enabled servers (preview) description: Lists Azure Policy Regulatory Compliance controls available for Azure Arc-enabled servers (preview). These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-cache-for-redis Cache Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-managed-identity.md
Managed identity for storage isn't supported on caches that have a dependency on
## Create a new cache with managed identity using the portal
-1. Sign into the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Create a new Azure Cache for Redis resource with a **Cache type** of any of the premium tiers. Complete **Basics** tab with all the required information.
+1. Create a new Azure Cache for Redis resource with a **Cache type** of any of the premium tiers. Complete **Basics** tab with all the required information.
:::image type="content" source="media/cache-managed-identity/basics.png" alt-text="Screenshot of showing how to create a premium cache.":::
azure-cache-for-redis Cache Monitor Diagnostic Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-monitor-diagnostic-settings.md
For more pricing information, [Azure Monitor pricing](https://azure.microsoft.co
### [Portal with Basic, Standard, and Premium tiers](#tab/basic-standard-premium)
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cache for Redis account. Open the **Diagnostic settings** pane under the **Monitoring section** on the left. Then, select **Add diagnostic setting**.
For more pricing information, [Azure Monitor pricing](https://azure.microsoft.co
### [Portal with Enterprise and Enterprise Flash tiers (preview)](#tab/enterprise-enterprise-flash)
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cache for Redis account. Open the **Diagnostic Settings - Auditing** pane under the **Monitoring** section on the left. Then, select **Add diagnostic setting**. :::image type="content" source="media/cache-monitor-diagnostic-settings/cache-enterprise-auditing.png" alt-text="Screenshot of Diagnostic settings - Auditing selected in the Resource menu.":::
azure-cache-for-redis Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Cache for Redis description: Lists Azure Policy Regulatory Compliance controls available for Azure Cache for Redis. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-functions Analyze Telemetry Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/analyze-telemetry-data.md
Azure Functions integrates with Application Insights to better enable you to mon
By default, the data collected from your function app is stored in Application Insights. In the [Azure portal](https://portal.azure.com), Application Insights provides an extensive set of visualizations of your telemetry data. You can drill into error logs and query events and metrics. This article provides basic examples of how to view and query your collected data. To learn more about exploring your function app data in Application Insights, see [What is Application Insights?](../azure-monitor/app/app-insights-overview.md).
-To be able to view Application Insights data from a function app, you must have at least Contributor role permissions on the function app. You also need to have the the [Monitoring Reader permission](../azure-monitor/roles-permissions-security.md#monitoring-reader) on the Application Insights instance. You have these permissions by default for any function app and Application Insights instance that you create.
+To be able to view Application Insights data from a function app, you must have at least Contributor role permissions on the function app. You also need to have the [Monitoring Reader permission](../azure-monitor/roles-permissions-security.md#monitoring-reader) on the Application Insights instance. You have these permissions by default for any function app and Application Insights instance that you create.
To learn more about data retention and potential storage costs, see [Data collection, retention, and storage in Application Insights](../azure-monitor/app/data-retention-privacy.md).
azure-functions Functions How To Use Azure Function App Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-how-to-use-azure-function-app-settings.md
Connection strings, environment variables, and other application settings are de
## Get started in the Azure portal
-1. To begin, go to the [Azure portal] and sign in to your Azure account. In the search bar at the top of the portal, enter the name of your function app and select it from the list.
+1. To begin, sign in to the [Azure portal] using your Azure account. In the search bar at the top of the portal, enter the name of your function app and select it from the list.
2. Under **Settings** in the left pane, select **Configuration**.
azure-large-instances Configure Azure Service Health Alerts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-large-instances/configure-azure-service-health-alerts.md
downtime that affects your infrastructure.
Follow these steps to configure Service Health alerts:
-1. Go to the [Microsoft Azure portal](https://portal.Azure.Com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for "service health" in the search bar and select **Service Health** from the results.

   :::image type="content" source="media/health-alerts-step-2.png" alt-text="Screenshot of the health alert dashboard.":::
Follow these steps to configure Service Health alerts:
1. Click **OK** to add the Action Group.
1. Verify you see your newly created Action Group. You will now receive alerts when there are health issues or maintenance actions on your systems.
azure-maps Set Map Style Ios Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-maps/set-map-style-ios-sdk.md
Title: Set a map style in iOS maps | Microsoft Azure Maps
description: Learn two ways of setting the style of a map. See how to use the Azure Maps iOS SDK in either the layout file or the activity class to adjust the style. Previously updated : 10/22/2021 Last updated : 07/22/2023
# Set map style in the iOS SDK (Preview)
-This article shows you two ways to set map styles using the Azure Maps iOS SDK. Azure Maps has six different maps styles to choose from. For more information about supported map styles, see [supported map styles in Azure Maps](supported-map-styles.md).
+This article shows you two ways to set map styles using the Azure Maps iOS SDK. Azure Maps has six different maps styles to choose from. For more information about supported map styles, see [supported map styles in Azure Maps].
## Prerequisites
-Be sure to complete the steps in the [Quickstart: Create an iOS app](quick-ios-app.md) document.
-
-> [!IMPORTANT]
-> The procedure in this section requires an Azure Maps account in Gen 1 or Gen 2 pricing tier. For more information on pricing tiers, see [Choose the right pricing tier in Azure Maps](choose-pricing-tier.md).
+- Complete the [Create an iOS app] quickstart.
+- An [Azure Maps account].
## Set map style in the map control init
map.setCameraOptions([
]) ```
-Often it is desirable to focus the map over a set of data. A bounding box can be calculated from features using the `BoundingBox.fromData(_:)` method and can be passed into the `bounds` option of the map camera. When setting a map view based on a bounding box, it's often useful to specify a `padding` value to account for the point size of data points being rendered as bubbles or symbols. The following code shows how to set all optional camera options when using a bounding box to set the position of the camera.
+Often it's desirable to focus the map over a set of data. A bounding box can be calculated from features using the `BoundingBox.fromData(_:)` method and can be passed into the `bounds` option of the map camera. When setting a map view based on a bounding box, it's often useful to specify a `padding` value to account for the point size of data points being rendered as bubbles or symbols. The following code shows how to set all optional camera options when using a bounding box to set the position of the camera.
```swift //Set the camera of the map using a bounding box.
map.setCameraBoundsOptions([
]) ```
-The aspect ratio of a bounding box may not be the same as the aspect ratio of the map, as such the map will often show the full bounding box area, but will often only be tight vertically or horizontally.
+The aspect ratio of a bounding box may not be the same as the aspect ratio of the map. As such, the map often shows the full bounding box area, but is often only tight vertically or horizontally.
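The fit-to-bounds behavior described above can be sketched in a language-agnostic way (Python here rather than Swift, and not the SDK's implementation): compute a bounding box from the data points, then pad it before handing it to the camera:

```python
def bounding_box(points):
    """Return (min_lon, min_lat, max_lon, max_lat) for (lon, lat) pairs."""
    lons = [lon for lon, _ in points]
    lats = [lat for _, lat in points]
    return (min(lons), min(lats), max(lons), max(lats))

def pad_box(box, fraction):
    """Expand the box by a fraction of its width/height on every side,
    roughly analogous to the camera's padding option."""
    min_lon, min_lat, max_lon, max_lat = box
    dx = (max_lon - min_lon) * fraction
    dy = (max_lat - min_lat) * fraction
    return (min_lon - dx, min_lat - dy, max_lon + dx, max_lat + dy)
```

Note that in the SDK the `padding` value is specified in screen points, so this geographic padding is only an approximation of the effect.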
### Animate map view
When setting the camera options of the map, animation options can also be used t
| Option | Description | |--|-|
-| `animationDuration(_ duration: Double)` | Specifies how long the camera will animate between the views in milliseconds (ms). |
-| `animationType(_ animationType: AnimationType)` | Specifies the type of animation transition to perform.<br/><br/> - `.jump` - an immediate change.<br/> - `.ease` - gradual change of the camera's settings.<br/> - `.fly` - gradual change of the camera's settings following an arc resembling flight. |
+| `animationDuration(_ duration: Double)` | Specifies how long the camera animates between the views in milliseconds (ms). |
+| `animationType(_ animationType: AnimationType)` | Specifies the type of animation transition to perform.<br><br> - `.jump` - an immediate change.<br> - `.ease` - gradual change of the camera's settings.<br> - `.fly` - gradual change of the camera's settings following an arc resembling flight. |
The following code shows how to animate the map view using a `.fly` animation over a duration of three seconds.
map.setCameraOptions([
]) ```
-The following demonstrates the above code animating the map view from New York to Seattle.
+The following animation demonstrates the above code animating the map view from New York to Seattle.
:::image type="content" source="./media/ios-sdk/set-map-style-ios/ios-animate-camera.gif" alt-text="Map animating the camera from New York to Seattle.":::
The following demonstrates the above code animating the map view from New York t
See the following articles for more code samples to add to your maps:

-- [Add a symbol layer](add-symbol-layer-ios.md)
-- [Add a bubble layer](add-bubble-layer-map-ios.md)
+- [Add a symbol layer]
+- [Add a bubble layer]
+
+[Add a bubble layer]: add-bubble-layer-map-ios.md
+[Add a symbol layer]: add-symbol-layer-ios.md
+[Azure Maps account]: https://azure.microsoft.com/services/azure-maps
+[Create an iOS app]: quick-ios-app.md
+[supported map styles in Azure Maps]: supported-map-styles.md
azure-maps Spatial Io Add Ogc Map Layer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-maps/spatial-io-add-ogc-map-layer.md
# Add a map layer from the Open Geospatial Consortium (OGC)
-The `atlas.layer.OgcMapLayer` class can overlay Web Map Services (WMS) imagery and Web Map Tile Services (WMTS) imagery on the map. WMS is a standard protocol developed by OGC for serving georeferenced map images over the internet. Image georeferencing is the processes of associating an image to a geographical location. WMTS is also a standard protocol developed by OGC. It's designed for serving pre-rendered and georeferenced map tiles.
+The `atlas.layer.OgcMapLayer` class can overlay Web Map Services (WMS) imagery and Web Map Tile Services (WMTS) imagery on the map. WMS is a standard protocol developed by OGC for serving georeferenced map images over the internet. Image georeferencing is the process of associating an image with a geographical location. WMTS is also a standard protocol developed by OGC. It's designed for serving prerendered and georeferenced map tiles.
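As background for the WMS support described above, a client typically starts by issuing a `GetCapabilities` request to discover the layers a service exposes. The following is a small illustrative JavaScript helper (the function name and default version are assumptions, not part of the Azure Maps SDK); the `service`, `request`, and `version` query parameters are the standard OGC WMS ones.

```javascript
// Hypothetical helper: build a WMS GetCapabilities request URL.
// The service/request/version query parameters are standard OGC WMS.
function getCapabilitiesUrl(serviceUrl, version = "1.3.0") {
  const u = new URL(serviceUrl);
  u.searchParams.set("service", "WMS");
  u.searchParams.set("request", "GetCapabilities");
  u.searchParams.set("version", version);
  return u.toString();
}
```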
-The following sections outline the web map service features that are supported by the `OgcMapLayer` class.
+The following sections outline the web map service features supported by the `OgcMapLayer` class.
**Web Map Service (WMS)**
The following sections outline the web map service features that are supported b
- Supported versions: `1.0.0` - Tiles must be square, such that `TileWidth == TileHeight`.-- CRS supported: `EPSG:3857` or `GoogleMapsCompatible` -- TileMatrix identifier must be an integer value that corresponds to a zoom level on the map. On an azure map, the zoom level is a value between `"0"` and `"22"`. So, `"0"` is supported, but `"00"` isn't supported.
+- CRS supported: `EPSG:3857` or `GoogleMapsCompatible`
+- TileMatrix identifier must be an integer value that corresponds to a zoom level on the map. In Azure Maps, the zoom level is a value between `"0"` and `"22"`. So, `"0"` is supported, but `"00"` isn't supported.
- Supported operations: | Operation | Description |
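The TileMatrix identifier rule above can be made concrete with a small sketch. The helper below is purely illustrative (not part of the SDK): a valid identifier is the canonical decimal form of an integer zoom level between 0 and 22, so `"0"` passes but `"00"` does not.

```javascript
// Hypothetical validator for the TileMatrix identifier rule:
// must be an integer 0-22 in canonical decimal form ("0" ok, "00" not).
function isValidTileMatrixId(id) {
  const n = Number(id);
  return Number.isInteger(n) && n >= 0 && n <= 22 && String(n) === id;
}
```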
The [OGC map layer options] sample demonstrates the different OGC map layer opti
The [OGC Web Map Service explorer] sample overlays imagery from the Web Map Services (WMS) and Web Map Tile Services (WMTS) as layers. You may select which layers in the service are rendered on the map. You may also view the associated legends for these layers. For the source code for this sample, see [OGC Web Map Service explorer source code]. <!- <iframe height='750' scrolling='no' title='OGC Web Map Service explorer' src='//codepen.io/azuremaps/embed/YzXxYdX/?height=750&theme-id=0&default-tab=result&embed-version=2&editable=true' frameborder='no' allowtransparency='true' allowfullscreen='true'>See the Pen <a href='https://codepen.io/azuremaps/pen/YzXxYdX/'>OGC Web Map Service explorer</a> by Azure Maps (<a href='https://codepen.io/azuremaps'>@azuremaps</a>) on <a href='https://codepen.io'>CodePen</a>.</iframe>
You may also specify the map settings to use a proxy service. The proxy service
Learn more about the classes and methods used in this article: > [!div class="nextstepaction"]
-> [OgcMapLayer](/javascript/api/azure-maps-spatial-io/atlas.layer.ogcmaplayer)
+> [OgcMapLayer]
> [!div class="nextstepaction"]
-> [OgcMapLayerOptions](/javascript/api/azure-maps-spatial-io/atlas.ogcmaplayeroptions)
+> [OgcMapLayerOptions]
See the following articles, which contain code samples you could add to your maps: > [!div class="nextstepaction"]
-> [Connect to a WFS service](spatial-io-connect-wfs-service.md)
+> [Connect to a WFS service]
> [!div class="nextstepaction"]
-> [Leverage core operations](spatial-io-core-operations.md)
+> [Leverage core operations]
> [!div class="nextstepaction"]
-> [Supported data format details](spatial-io-supported-data-format-details.md)
+> [Supported data format details]
-[OGC map layer]: https://samples.azuremaps.com/spatial-io-module/ogc-map-layer-example
+[Connect to a WFS service]: spatial-io-connect-wfs-service.md
+[Leverage core operations]: spatial-io-core-operations.md
+[OGC map layer options source code]: https://github.com/Azure-Samples/AzureMapsCodeSamples/blob/main/Samples/Spatial%20IO%20Module/OGC%20map%20layer%20options/OGC%20map%20layer%20options.html
[OGC map layer options]: https://samples.azuremaps.com/spatial-io-module/ogc-map-layer-options
-[OGC Web Map Service explorer]: https://samples.azuremaps.com/spatial-io-module/ogc-web-map-service-explorer
- [OGC map layer source code]: https://github.com/Azure-Samples/AzureMapsCodeSamples/blob/main/Samples/Spatial%20IO%20Module/OGC%20map%20layer%20example/OGC%20map%20layer%20example.html
-[OGC map layer options source code]: https://github.com/Azure-Samples/AzureMapsCodeSamples/blob/main/Samples/Spatial%20IO%20Module/OGC%20map%20layer%20options/OGC%20map%20layer%20options.html
+[OGC map layer]: https://samples.azuremaps.com/spatial-io-module/ogc-map-layer-example
[OGC Web Map Service explorer source code]: https://github.com/Azure-Samples/AzureMapsCodeSamples/blob/main/Samples/Spatial%20IO%20Module/OGC%20Web%20Map%20Service%20explorer/OGC%20Web%20Map%20Service%20explorer.html
+[OGC Web Map Service explorer]: https://samples.azuremaps.com/spatial-io-module/ogc-web-map-service-explorer
+[OgcMapLayer]: /javascript/api/azure-maps-spatial-io/atlas.layer.ogcmaplayer
+[OgcMapLayerOptions]: /javascript/api/azure-maps-spatial-io/atlas.ogcmaplayeroptions
+[Supported data format details]: spatial-io-supported-data-format-details.md
azure-monitor Agent Linux Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/agent-linux-troubleshoot.md
Title: Troubleshoot Azure Log Analytics Linux Agent | Microsoft Docs description: Describe the symptoms, causes, and resolution for the most common issues with the Log Analytics agent for Linux in Azure Monitor. + Last updated 04/25/2023 - # Troubleshoot issues with the Log Analytics agent for Linux
azure-monitor Agent Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/agent-linux.md
Title: Install Log Analytics agent on Linux computers description: This article describes how to connect Linux computers hosted in other clouds or on-premises to Azure Monitor with the Log Analytics agent for Linux. + Last updated 06/01/2023 - # Install the Log Analytics agent on Linux computers
azure-monitor Azure Monitor Agent Extension Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-extension-versions.md
We strongly recommend always updating to the latest version, or opting in to the
## Version details | Release Date | Release notes | Windows | Linux | |:|:|:|:|
+| July 2023| **Windows** <ul><li>Fix crash when Event Log subscription callback throws errors.</li><li>MetricExtension updated to 2.2023.609.2051</li></ul> |1.18.0| Coming soon|
| June 2023| **Windows** <ul><li>Add new file path column to custom logs table</li><li>Config setting to disable custom IMDS endpoint in Tenant.json file</li><li>FluentBit binaries signed with Microsoft customer Code Sign cert</li><li>Minimize number of retries on calls to refresh tokens</li><li>Don't overwrite resource ID with empty string</li><li>AzSecPack updated to version 4.27</li><li>AzureProfiler and AzurePerfCollector updated to version 1.0.0.990</li><li>MetricsExtension updated to version 2.2023.513.10</li><li>Troubleshooter updated to version 1.5.0</li></ul>**Linux** <ul><li>Add new column CollectorHostName to syslog table to identify forwarder/collector machine</li><li>Link OpenSSL dynamically</li><li>Support Arc-Enabled Servers proxy configuration file</li><li>**Fixes**<ul><li>Allow uploads soon after AMA start up</li><li>Run LocalSink GC on a dedicated thread to avoid thread pool scheduling issues</li><li>Fix upgrade restart of disabled services</li><li>Handle Linux Hardening where sudo on root is blocked</li><li>CEF processing fixes for noncompliant RFC 5424 logs</li><li>ASA tenant can fail to start up due to config-cache directory permissions</li><li>Fix auth proxy in AMA</li><li>Fix to remove null characters in agentlauncher.log after log rotation</li></ul></li></ul>|1.17.0 |1.27.2|
-| May 2023 | **Windows** <ul><li>Enable Large Event support for all regions.</li><li>Update to TroubleShooter 1.4.0.</li><li>Fixed issue when Event Log subscription become invalid an would not resubscribe.</li><li>AMA: Fixed issue with Large Event sending too large data. Also affecting Custom Log.</li></ul> **Linux** <ul><li>Support for CIS and SELinux [hardening](./agents-overview.md)</li><li>Include Ubuntu 22.04 (Jammy) in azure-mdsd package publishing</li><li>Move storage SDK patch to build container</li><li>Add system Telegraf counters to AMA</li><li>Drop msgpack and syslog data if not configured in active configuration</li><li>Limit the events sent to Public ingestion pipeline</li><li>**Fixes** <ul><li>Fix mdsd crash in init when in persistent mode </li><li>Remove FdClosers from ProtocolListeners to avoid a race condition</li><li>Fix sed regex special character escaping issue in rpm macro for Centos 7.3.Maipo</li><li>Fix latency and future timestamp issue</li><li>Install AMA syslog configs only if customer is opted in for syslog in DCR</li><li>Fix heartbeat time check</li><li>Skip unnecessary cleanup in fatal signal handler</li><li>Fix case where fast-forwarding may cause intervals to be skipped</li><li>Fix comma separated custom log paths with fluent</li><li>Fix to prevent events folder growing too large and filling the disk</li></ul></li><ul> | 1.16.0.0 | 1.26.2 |
+| May 2023 | **Windows** <ul><li>Enable Large Event support for all regions.</li><li>Update to TroubleShooter 1.4.0.</li><li>Fixed issue when Event Log subscription became invalid and would not resubscribe.</li><li>AMA: Fixed issue with Large Event sending too large data. Also affecting Custom Log.</li></ul> **Linux** <ul><li>Support for CIS and SELinux [hardening](./agents-overview.md)</li><li>Include Ubuntu 22.04 (Jammy) in azure-mdsd package publishing</li><li>Move storage SDK patch to build container</li><li>Add system Telegraf counters to AMA</li><li>Drop msgpack and syslog data if not configured in active configuration</li><li>Limit the events sent to Public ingestion pipeline</li><li>**Fixes** <ul><li>Fix mdsd crash in init when in persistent mode</li><li>Remove FdClosers from ProtocolListeners to avoid a race condition</li><li>Fix sed regex special character escaping issue in rpm macro for Centos 7.3.Maipo</li><li>Fix latency and future timestamp issue</li><li>Install AMA syslog configs only if customer is opted in for syslog in DCR</li><li>Fix heartbeat time check</li><li>Skip unnecessary cleanup in fatal signal handler</li><li>Fix case where fast-forwarding may cause intervals to be skipped</li><li>Fix comma separated custom log paths with fluent</li><li>Fix to prevent events folder growing too large and filling the disk</li><li>Hotfix (1.26.3) for Syslog</li></ul></li></ul> | 1.16.0.0 | 1.26.2, 1.26.3<sup>Hotfix</sup>|
| Apr 2023 | **Windows** <ul><li>AMA: Enable Large Event support based on Region.</li><li>AMA: Upgrade to FluentBit version 2.0.9</li><li>Update Troubleshooter to 1.3.1</li><li>Update ME version to 2.2023.331.1521</li><li>Updating package version for AzSecPack 4.26 release</li></ul>|1.15.0| Coming soon| | Mar 2023 | **Windows** <ul><li>Text file collection improvements to handle high rate logging and continuous tailing of longer lines</li><li>VM Insights fixes for collecting metrics from non-English OS</li></ul> | 1.14.0.0 | Coming soon | | Feb 2023 | <ul><li>**Linux (hotfix)** Resolved potential data loss due to "Bad file descriptor" errors seen in the mdsd error log with previous version. Upgrade to hotfix version</li><li>**Windows** Reliability improvements in Fluentbit buffering to handle larger text files</li></ul> | 1.13.1 | 1.25.2<sup>Hotfix</sup> |
azure-monitor Azure Monitor Agent Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-manage.md
The following prerequisites must be met prior to installing Azure Monitor Agent.
{ "authentication": { "managedIdentity": {
- "identifier-name": "msi_res_id" or "object_id" or "client_id",
+ "identifier-name": "mi_res_id" or "object_id" or "client_id",
"identifier-value": "<resource-id-of-uai>" or "<guid-object-or-client-id>" } } } ```
- We recommend that you use `msi_res_id` as the `identifier-name`. The following sample commands only show usage with `mi_res_id` for the sake of brevity. For more information on `msi_res_id`, `object_id`, and `client_id`, see the [Managed identity documentation](../../active-directory/managed-identities-azure-resources/how-to-use-vm-token.md#get-a-token-using-http).
+ We recommend that you use `mi_res_id` as the `identifier-name`. The following sample commands only show usage with `mi_res_id` for the sake of brevity. For more information on `mi_res_id`, `object_id`, and `client_id`, see the [Managed identity documentation](../../active-directory/managed-identities-azure-resources/how-to-use-vm-token.md#get-a-token-using-http).
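To make the accepted identifier names concrete, the following is a small illustrative JavaScript sketch of building the settings object shown above (the helper name is hypothetical; only the `mi_res_id`, `object_id`, and `client_id` values and the JSON shape come from the documentation).

```javascript
// Hypothetical builder for the AMA managed-identity authentication settings.
// "mi_res_id" identifies a user-assigned identity by its full resource ID.
function buildAmaAuthSettings(identifierName, identifierValue) {
  const allowed = ["mi_res_id", "object_id", "client_id"];
  if (!allowed.includes(identifierName)) {
    throw new Error("identifier-name must be one of: " + allowed.join(", "));
  }
  return {
    authentication: {
      managedIdentity: {
        "identifier-name": identifierName,
        "identifier-value": identifierValue
      }
    }
  };
}
```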
- **System-assigned**: This managed identity is suited for initial testing or small deployments. When used at scale, for example, for all VMs in a subscription, it results in a substantial number of identities created (and deleted) in Azure Active Directory. To avoid this churn of identities, use user-assigned managed identities instead. *For Azure Arc-enabled servers, system-assigned managed identity is enabled automatically* as soon as you install the Azure Arc agent. It's the only supported type for Azure Arc-enabled servers. - **Not required for Azure Arc-enabled servers**: The system identity is enabled automatically when you [create a data collection rule in the Azure portal](data-collection-rule-azure-monitor-agent.md#create-a-data-collection-rule). - **Networking**: If you use network firewalls, the [Azure Resource Manager service tag](../../virtual-network/service-tags-overview.md) must be enabled on the virtual network for the virtual machine. The virtual machine must also have access to the following HTTPS endpoints:
azure-monitor Azure Monitor Agent Troubleshoot Linux Vm Rsyslog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-troubleshoot-linux-vm-rsyslog.md
Last updated 5/31/2023-+ # Syslog troubleshooting guide for Azure Monitor Agent for Linux
azure-monitor Application Insights Asp Net Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/application-insights-asp-net-agent.md
Enable the instrumentation engine if:
#### Examples ```powershell
-PS C:\> Enable-InstrumentationEngine
+Enable-InstrumentationEngine
``` #### Parameters
After you enable monitoring, we recommend that you use [Live Metrics](live-strea
In this example, all apps on the current computer are assigned a single instrumentation key. ```powershell
-PS C:\> Enable-ApplicationInsightsMonitoring -InstrumentationKey xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+Enable-ApplicationInsightsMonitoring -InstrumentationKey xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
``` ##### Example with an instrumentation key map
In this example:
- Spaces are added for readability. ```powershell
-PS C:\> Enable-ApplicationInsightsMonitoring -InstrumentationKeyMap
- @(@{MachineFilter='.*';AppFilter='WebAppExclude'},
- @{MachineFilter='.*';AppFilter='WebAppOne';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx1'}},
- @{MachineFilter='.*';AppFilter='WebAppTwo';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx2'}},
- @{MachineFilter='.*';AppFilter='.*';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxdefault'}})
-
+Enable-ApplicationInsightsMonitoring -InstrumentationKeyMap `
+    @(@{MachineFilter='.*';AppFilter='WebAppExclude'},
+    @{MachineFilter='.*';AppFilter='WebAppOne';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx1'}},
+    @{MachineFilter='.*';AppFilter='WebAppTwo';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx2'}},
+    @{MachineFilter='.*';AppFilter='.*';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxdefault'}})
``` > [!NOTE]
Restart IIS for the changes to take effect.
#### Examples ```powershell
-PS C:\> Disable-InstrumentationEngine
+Disable-InstrumentationEngine
``` #### Parameters
This cmdlet will remove edits to the IIS applicationHost.config and remove regis
#### Examples ```powershell
-PS C:\> Disable-ApplicationInsightsMonitoring
+Disable-ApplicationInsightsMonitoring
``` #### Parameters
Gets the config file and prints the values to the console.
#### Examples ```powershell
-PS C:\> Get-ApplicationInsightsMonitoringConfig
+Get-ApplicationInsightsMonitoringConfig
``` #### Parameters
This cmdlet will report version information and information about key files requ
Run the command `Get-ApplicationInsightsMonitoringStatus` to display the monitoring status of web sites. ```powershell-
-PS C:\Windows\system32> Get-ApplicationInsightsMonitoringStatus
+Get-ApplicationInsightsMonitoringStatus
IIS Websites:
In this example;
Run the command `Get-ApplicationInsightsMonitoringStatus -PowerShellModule` to display information about the current module: ```powershell-
-PS C:\> Get-ApplicationInsightsMonitoringStatus -PowerShellModule
+Get-ApplicationInsightsMonitoringStatus -PowerShellModule
PowerShell Module version: 0.4.0-alpha
Run the command `Get-ApplicationInsightsMonitoringStatus -InspectProcess`:
```
-PS C:\> Get-ApplicationInsightsMonitoringStatus -InspectProcess
+Get-ApplicationInsightsMonitoringStatus -InspectProcess
iisreset.exe /status Status for IIS Admin Service ( IISADMIN ) : Running
Restart IIS for your changes to take effect.
In this example, all apps on the current computer will be assigned a single instrumentation key. ```powershell
-PS C:\> Enable-ApplicationInsightsMonitoring -InstrumentationKey xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+Enable-ApplicationInsightsMonitoring -InstrumentationKey xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
``` ##### Example with an instrumentation key map
In this example:
```powershell Enable-ApplicationInsightsMonitoring -InstrumentationKeyMap `
- @(@{MachineFilter='.*';AppFilter='WebAppExclude'},
- @{MachineFilter='.*';AppFilter='WebAppOne';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx1'}},
- @{MachineFilter='.*';AppFilter='WebAppTwo';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx2'}},
- @{MachineFilter='.*';AppFilter='.*';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxdefault'}})
+    @(@{MachineFilter='.*';AppFilter='WebAppExclude'},
+    @{MachineFilter='.*';AppFilter='WebAppOne';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx1'}},
+    @{MachineFilter='.*';AppFilter='WebAppTwo';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx2'}},
+    @{MachineFilter='.*';AppFilter='.*';InstrumentationSettings=@{InstrumentationKey='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxdefault'}})
``` #### Parameters
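The instrumentation key map semantics above can be sketched in JavaScript for illustration. This assumes first-match-wins evaluation of the filter entries (an assumption, not confirmed by the docs excerpt); an entry without `InstrumentationSettings`, like `WebAppExclude`, means the matched app gets no key.

```javascript
// Hypothetical resolver for an instrumentation key map, assuming entries
// are evaluated in order and the first entry whose MachineFilter and
// AppFilter regexes both match wins. No InstrumentationSettings => excluded.
function resolveInstrumentationKey(keyMap, machine, app) {
  for (const entry of keyMap) {
    if (new RegExp(entry.MachineFilter).test(machine) &&
        new RegExp(entry.AppFilter).test(app)) {
      return entry.InstrumentationSettings
        ? entry.InstrumentationSettings.InstrumentationKey
        : null;
    }
  }
  return null;
}
```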
The full path will be displayed during script execution.
##### Example of application startup logs ```powershell
-PS C:\Windows\system32> Start-ApplicationInsightsMonitoringTrace -CollectRedfieldEvents
+Start-ApplicationInsightsMonitoringTrace -CollectRedfieldEvents
Starting... Log File: C:\Program Files\WindowsPowerShell\Modules\Az.ApplicationMonitor\content\logs\20190627_144217_ApplicationInsights_ETW_Trace.etl Tracing enabled, waiting for events.
azure-monitor Eventcounters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/eventcounters.md
Title: Event counters in Application Insights | Microsoft Docs description: Monitor system and custom .NET/.NET Core EventCounters in Application Insights. Previously updated : 09/20/2019 Last updated : 07/21/2023 -+ # EventCounters introduction [`EventCounter`](/dotnet/core/diagnostics/event-counters) is a .NET/.NET Core mechanism to publish and consume counters or statistics. EventCounters are supported on all OS platforms - Windows, Linux, and macOS. It can be thought of as a cross-platform equivalent of the [PerformanceCounters](/dotnet/api/system.diagnostics.performancecounter) that are only supported on Windows systems.
-While users can publish any custom `EventCounters` to meet their needs, [.NET](/dotnet/fundamentals/) publishes a set of these counters by default. This document will walk through the steps required to collect and view `EventCounters` (system defined or user defined) in Azure Application Insights.
+While users can publish any custom `EventCounters` to meet their needs, [.NET](/dotnet/fundamentals/) publishes a set of these counters by default. This document walks through the steps required to collect and view `EventCounters` (system defined or user defined) in Azure Application Insights.
## Using Application Insights to collect EventCounters
-Application Insights supports collecting `EventCounters` with its `EventCounterCollectionModule`, which is part of the newly released NuGet package [Microsoft.ApplicationInsights.EventCounterCollector](https://www.nuget.org/packages/Microsoft.ApplicationInsights.EventCounterCollector). `EventCounterCollectionModule` is automatically enabled when using either [AspNetCore](asp-net-core.md) or [WorkerService](worker-service.md). `EventCounterCollectionModule` collects counters with a non-configurable collection frequency of 60 seconds. There are no special permissions required to collect EventCounters.
+Application Insights supports collecting `EventCounters` with its `EventCounterCollectionModule`, which is part of the newly released NuGet package [Microsoft.ApplicationInsights.EventCounterCollector](https://www.nuget.org/packages/Microsoft.ApplicationInsights.EventCounterCollector). `EventCounterCollectionModule` is automatically enabled when using either [AspNetCore](asp-net-core.md) or [WorkerService](worker-service.md). `EventCounterCollectionModule` collects counters with a nonconfigurable collection frequency of 60 seconds. There are no special permissions required to collect EventCounters. For ASP.NET Core applications, you also want to add the [Microsoft.ApplicationInsights.AspNetCore](https://www.nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore) package.
+
+```dotnetcli
+dotnet add package Microsoft.ApplicationInsights.EventCounterCollector
+dotnet add package Microsoft.ApplicationInsights.AspNetCore
+```
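The fixed-interval collection described above (a nonconfigurable 60 seconds for `EventCounterCollectionModule`) can be illustrated with a sketch: values recorded during an interval are aggregated into one snapshot when the interval elapses. This JavaScript class is purely illustrative, not the SDK's actual implementation.

```javascript
// Hypothetical per-interval aggregator: record() is called as values arrive,
// flush() is called once per collection interval (60 s in the real module)
// and returns one aggregated snapshot, then resets for the next interval.
class IntervalCounter {
  constructor() { this.values = []; }
  record(value) { this.values.push(value); }
  flush() {
    const count = this.values.length;
    const sum = this.values.reduce((a, b) => a + b, 0);
    this.values = [];
    return { count, mean: count ? sum / count : 0 };
  }
}
```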
## Default counters collected
-Starting with 2.15.0 version of either [AspNetCore SDK](asp-net-core.md) or [WorkerService SDK](worker-service.md), no counters are collected by default. The module itself is enabled, so users can simply add the desired counters to
+Starting with 2.15.0 version of either [AspNetCore SDK](asp-net-core.md) or [WorkerService SDK](worker-service.md), no counters are collected by default. The module itself is enabled, so users can add the desired counters to
collect them. To get a list of well known counters published by the .NET Runtime, see [Available Counters](/dotnet/core/diagnostics/event-counters#available-counters) document. ## Customizing counters to be collected
-The following example shows how to add/remove counters. This customization would be done in the `ConfigureServices` method of your application after Application Insights telemetry collection is enabled using either `AddApplicationInsightsTelemetry()` or `AddApplicationInsightsWorkerService()`. Following is an example code from an ASP.NET Core application. For other type of applications, refer to [this](worker-service.md#configure-or-remove-default-telemetry-modules) document.
+The following example shows how to add/remove counters. This customization would be done as part of your application service configuration after Application Insights telemetry collection is enabled using either `AddApplicationInsightsTelemetry()` or `AddApplicationInsightsWorkerService()`. The following is example code from an ASP.NET Core application. For other types of applications, see [this](worker-service.md#configure-or-remove-default-telemetry-modules) document.
++
+# [ASP.NET Core 6.0+](#tab/dotnet6)
+
+```csharp
+using Microsoft.ApplicationInsights.Extensibility.EventCounterCollector;
+using Microsoft.Extensions.DependencyInjection;
+
+builder.Services.ConfigureTelemetryModule<EventCounterCollectionModule>(
+ (module, o) =>
+ {
+ // Removes all default counters, if any.
+ module.Counters.Clear();
+
+ // Adds a user defined counter "MyCounter" from EventSource named "MyEventSource"
+ module.Counters.Add(
+ new EventCounterCollectionRequest("MyEventSource", "MyCounter"));
+
+ // Adds the system counter "gen-0-size" from "System.Runtime"
+ module.Counters.Add(
+ new EventCounterCollectionRequest("System.Runtime", "gen-0-size"));
+ }
+ );
+```
+
+# [ASP.NET Core 3.1](#tab/dotnet31)
```csharp
- using Microsoft.ApplicationInsights.Extensibility.EventCounterCollector;
- using Microsoft.Extensions.DependencyInjection;
-
- public void ConfigureServices(IServiceCollection services)
- {
- //... other code...
-
- // The following code shows how to configure the module to collect
- // additional counters.
- services.ConfigureTelemetryModule<EventCounterCollectionModule>(
- (module, o) =>
- {
- // This removes all default counters, if any.
- module.Counters.Clear();
-
- // This adds a user defined counter "MyCounter" from EventSource named "MyEventSource"
- module.Counters.Add(new EventCounterCollectionRequest("MyEventSource", "MyCounter"));
-
- // This adds the system counter "gen-0-size" from "System.Runtime"
- module.Counters.Add(new EventCounterCollectionRequest("System.Runtime", "gen-0-size"));
- }
- );
- }
+using Microsoft.ApplicationInsights.Extensibility.EventCounterCollector;
+using Microsoft.Extensions.DependencyInjection;
+
+public void ConfigureServices(IServiceCollection services)
+{
+ //... other code...
+
+ // The following code shows how to configure the module to collect
+ // additional counters.
+ services.ConfigureTelemetryModule<EventCounterCollectionModule>(
+ (module, o) =>
+ {
+ // Removes all default counters, if any.
+ module.Counters.Clear();
+
+ // Adds a user defined counter "MyCounter" from EventSource named "MyEventSource"
+ module.Counters.Add(
+ new EventCounterCollectionRequest("MyEventSource", "MyCounter"));
+
+ // Adds the system counter "gen-0-size" from "System.Runtime"
+ module.Counters.Add(
+ new EventCounterCollectionRequest("System.Runtime", "gen-0-size"));
+ }
+ );
+}
``` ++ ## Disabling EventCounter collection module
-`EventCounterCollectionModule` can be disabled by using `ApplicationInsightsServiceOptions`. An
-example when using ASP.NET Core SDK is shown below.
+`EventCounterCollectionModule` can be disabled by using `ApplicationInsightsServiceOptions`.
+
+The following example uses the ASP.NET Core SDK.
+
+# [ASP.NET Core 6.0+](#tab/dotnet6)
+
+```csharp
+using Microsoft.ApplicationInsights.AspNetCore.Extensions;
+using Microsoft.Extensions.DependencyInjection;
+
+var applicationInsightsServiceOptions = new ApplicationInsightsServiceOptions();
+applicationInsightsServiceOptions.EnableEventCounterCollectionModule = false;
+builder.Services.AddApplicationInsightsTelemetry(applicationInsightsServiceOptions);
+```
+
+# [ASP.NET Core 3.1](#tab/dotnet31)
```csharp
- using Microsoft.ApplicationInsights.AspNetCore.Extensions;
- using Microsoft.Extensions.DependencyInjection;
+using Microsoft.ApplicationInsights.AspNetCore.Extensions;
+using Microsoft.Extensions.DependencyInjection;
- public void ConfigureServices(IServiceCollection services)
- {
- //... other code...
+public void ConfigureServices(IServiceCollection services)
+{
+ //... other code...
- var applicationInsightsServiceOptions = new ApplicationInsightsServiceOptions();
- applicationInsightsServiceOptions.EnableEventCounterCollectionModule = false;
- services.AddApplicationInsightsTelemetry(applicationInsightsServiceOptions);
- }
+ var applicationInsightsServiceOptions = new ApplicationInsightsServiceOptions();
+ applicationInsightsServiceOptions.EnableEventCounterCollectionModule = false;
+ services.AddApplicationInsightsTelemetry(applicationInsightsServiceOptions);
+}
```
-A similar approach can be used for the WorkerService SDK as well, but the namespace must be
-changed as shown in the example below.
++
+A similar approach can be used for the WorkerService SDK as well, but the namespace must be changed as shown in the following example.
+
+# [ASP.NET Core 6.0+](#tab/dotnet6)
```csharp
- using Microsoft.ApplicationInsights.WorkerService;
- using Microsoft.Extensions.DependencyInjection;
+using Microsoft.ApplicationInsights.WorkerService;
+using Microsoft.Extensions.DependencyInjection;
+
+var applicationInsightsServiceOptions = new ApplicationInsightsServiceOptions();
+applicationInsightsServiceOptions.EnableEventCounterCollectionModule = false;
+builder.Services.AddApplicationInsightsTelemetryWorkerService(applicationInsightsServiceOptions);
+```
+
+# [ASP.NET Core 3.1](#tab/dotnet31)
+
+```csharp
+using Microsoft.ApplicationInsights.WorkerService;
+using Microsoft.Extensions.DependencyInjection;
+
+public void ConfigureServices(IServiceCollection services)
+{
+ //... other code...
var applicationInsightsServiceOptions = new ApplicationInsightsServiceOptions(); applicationInsightsServiceOptions.EnableEventCounterCollectionModule = false; services.AddApplicationInsightsTelemetryWorkerService(applicationInsightsServiceOptions);
+}
``` ++ ## Event counters in Metric Explorer To view EventCounter metrics in [Metric Explorer](../essentials/metrics-charts.md), select your Application Insights resource, and choose Log-based metrics as the metric namespace. EventCounter metrics are then displayed under the Custom category.
customMetrics
Like other telemetry, **customMetrics** also has a column `cloud_RoleInstance` that indicates the identity of the host server instance on which your app is running. The above query shows the counter value per instance, and can be used to compare performance of different server instances. ## Alerts
-Like other metrics, you can [set an alert](../alerts/alerts-log.md) to warn you if an event counter goes outside a limit you specify. Open the Alerts pane and click Add Alert.
+Like other metrics, you can [set an alert](../alerts/alerts-log.md) to warn you if an event counter goes outside a limit you specify. Open the Alerts pane and select Add Alert.
## Frequently asked questions ### Can I see EventCounters in Live Metrics?
-Live Metrics do not show EventCounters as of today. Use Metric Explorer or Analytics to see the telemetry.
+Live Metrics don't show EventCounters as of today. Use Metric Explorer or Analytics to see the telemetry.
-### I have enabled Application Insights from Azure Web App Portal. But I can't see EventCounters.?
+### I have enabled Application Insights from Azure Web App Portal. Why can't I see EventCounters?
- [Application Insights extension](./azure-web-apps.md) for ASP.NET Core doesn't yet support this feature. This document will be updated when this feature is supported.
+ The [Application Insights extension](./azure-web-apps.md) for ASP.NET Core doesn't yet support this feature.
## <a name="next"></a>Next steps
-* [Dependency tracking](./asp-net-dependencies.md)
+* [Dependency tracking](./asp-net-dependencies.md)
azure-monitor Sdk Connection String https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/sdk-connection-string.md
A connection string consists of a list of settings represented as key-value pair
#### Syntax - `InstrumentationKey` (for example, 00000000-0000-0000-0000-000000000000).
- The connection string is a *required* field.
+ This is a *required* field.
- `Authorization` (for example, ikey). This setting is optional because today we only support ikey authorization. - `EndpointSuffix` (for example, applicationinsights.azure.cn). Setting the endpoint suffix will instruct the SDK on which Azure cloud to connect to. The SDK will assemble the rest of the endpoint for individual services.
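The semicolon-delimited key-value syntax described above can be sketched with a small parser. This JavaScript helper is illustrative only (the function name is an assumption); it enforces the one documented rule that `InstrumentationKey` is required.

```javascript
// Hypothetical parser for a connection string of the documented form:
// "InstrumentationKey=...;EndpointSuffix=applicationinsights.azure.cn"
function parseConnectionString(cs) {
  const settings = {};
  for (const part of cs.split(";")) {
    const i = part.indexOf("=");
    if (i < 0) continue; // skip empty or malformed segments
    settings[part.slice(0, i)] = part.slice(i + 1);
  }
  if (!settings.InstrumentationKey) {
    throw new Error("InstrumentationKey is a required field");
  }
  return settings;
}
```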
azure-monitor Tutorial Asp Net Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/tutorial-asp-net-core.md
The sample application makes calls to multiple Azure resources, including Azure
Application Insights introspects the incoming telemetry data and is able to generate a visual map of the system integrations it detects.
-1. Access and log into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Open the resource group for the sample application, which is `application-insights-azure-cafe`.
For the latest updates and bug fixes, see the [release notes](./release-notes.md
* [Logging in ASP.NET Core](/aspnet/core/fundamentals/logging) * [.NET trace logs in Application Insights](./asp-net-trace-logs.md) * [Autoinstrumentation for Application Insights](./codeless-overview.md)-
azure-monitor Collect Custom Metrics Linux Telegraf https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/collect-custom-metrics-linux-telegraf.md
description: Instructions on how to deploy the InfluxData Telegraf agent on a Li
+ Last updated 06/16/2022 # Collect custom metrics for a Linux VM with the InfluxData Telegraf agent
azure-monitor Data Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/data-security.md
The retention period of collected data stored in the database depends on the sel
Data in database storage cannot be altered once ingested but can be deleted via [*purge* API path](personal-data-mgmt.md#delete). Although data cannot be altered, some certifications require that data is kept immutable and cannot be changed or deleted in storage. Data immutability can be achieved using [data export](logs-data-export.md) to a storage account that is configured as [immutable storage](../../storage/blobs/immutable-policy-configure-version-scope.md). ### 4. Use Azure Monitor to access the data
-To access your Log Analytics workspace, you sign into the Azure portal using the organizational account or Microsoft account that you set up previously. All traffic between the portal and Azure Monitor service is sent over a secure HTTPS channel. When using the portal, a session ID is generated on the user client (web browser) and data is stored in a local cache until the session is terminated. When terminated, the cache is deleted. Client-side cookies, which do not contain personally identifiable information, are not automatically removed. Session cookies are marked HTTPOnly and are secured. After a pre-determined idle period, the Azure portal session is terminated.
+To access your Log Analytics workspace, you sign in to the Azure portal using the organizational account or Microsoft account that you set up previously. All traffic between the portal and Azure Monitor service is sent over a secure HTTPS channel. When using the portal, a session ID is generated on the user client (web browser) and data is stored in a local cache until the session is terminated. When terminated, the cache is deleted. Client-side cookies, which do not contain personally identifiable information, are not automatically removed. Session cookies are marked HTTPOnly and are secured. After a pre-determined idle period, the Azure portal session is terminated.
## Additional security features
azure-monitor Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Monitor description: Lists Azure Policy Regulatory Compliance controls available for Azure Monitor. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-netapp-files Performance Linux Nfs Read Ahead https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/performance-linux-nfs-read-ahead.md
ms.assetid:
na+ Last updated 09/29/2022
azure-netapp-files Troubleshoot Volumes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/troubleshoot-volumes.md
ms.assetid:
na+ Last updated 02/21/2023
azure-netapp-files Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/whats-new.md
ms.assetid:
na+ Last updated 06/26/2023
azure-relay Private Link Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-relay/private-link-service.md
Your private endpoint uses a private IP address in your virtual network.
The following procedure provides step-by-step instructions for disabling public access to a Relay namespace and then adding a private endpoint to the namespace.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the search bar, type in **Relays**. 3. Select the **namespace** from the list to which you want to add a private endpoint. 4. On the left menu, select the **Networking** tab under **Settings**.
azure-resource-manager Control Plane Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/control-plane-metrics.md
Title: Control plane metrics in Azure Monitor description: Azure Resource Manager metrics in Azure Monitor | Traffic and latency observability for subscription-level control plane requests -+ Last updated 04/26/2023
azure-resource-manager Manage Resource Groups Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/manage-resource-groups-cli.md
Learn how to use Azure CLI with [Azure Resource Manager](overview.md) to manage
* Azure CLI. For more information, see [How to install the Azure CLI](/cli/azure/install-azure-cli).
-* After installing, sign in for the first time. For more information, see [How to sign into the Azure CLI](/cli/azure/get-started-with-azure-cli#how-to-sign-into-the-azure-cli).
+* After installing, sign in for the first time. For more information, see [How to sign in to the Azure CLI](/cli/azure/get-started-with-azure-cli#how-to-sign-into-the-azure-cli).
## What is a resource group
azure-resource-manager Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Resource Manager description: Lists Azure Policy Regulatory Compliance controls available for Azure Resource Manager. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-signalr Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure SignalR description: Lists Azure Policy Regulatory Compliance controls available for Azure SignalR. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
azure-sql-edge Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/configure.md
Last updated 09/22/2020 +
To remove a data volume container, use the `docker volume rm` command.
## Next steps - [Connect to Azure SQL Edge](connect.md)-- [Build an end-to-end IoT solution with SQL Edge](tutorial-deploy-azure-resources.md)
+- [Build an end-to-end IoT solution with SQL Edge](tutorial-deploy-azure-resources.md)
azure-video-indexer Create Account Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/create-account-portal.md
Search for **Microsoft.Media** and **Microsoft.EventGrid**. If not in the regist
## Use the Azure portal to create an Azure AI Video Indexer account
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
Alternatively, you can start creating the **unlimited** account from the [videoindexer.ai](https://www.videoindexer.ai) website. 1. Using the search bar at the top, enter **"Video Indexer"**.
azure-video-indexer Switch Tenants Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/switch-tenants-portal.md
This article shows two options to solve the same problem - how to switch tenants
> [!div class="mx-imgBorder"] > ![Screenshot of a tenant list.](./media/switch-directory/tenants.png)
- Once clicked, the logged-in credentials will be used to relog-in to the Azure AI Video Indexer website with the new directory.
+ Once clicked, the authenticated credentials will be used to sign in again to the Azure AI Video Indexer website with the new directory.
## Switch tenants from outside the Azure AI Video Indexer website
This section shows how to get the domain name from the Azure portal. You can the
### Get the domain name
-1. In the [Azure portal](https://portal.azure.com/), sign in with the same subscription tenant in which your Azure AI Video Indexer Azure Resource Manager (ARM) account was created.
+1. Sign in to the [Azure portal](https://portal.azure.com) using the same subscription tenant in which your Azure AI Video Indexer Azure Resource Manager (ARM) account was created.
1. Hover over your account name (in the right-top corner). > [!div class="mx-imgBorder"]
azure-vmware Concepts Run Command https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/concepts-run-command.md
Azure VMware Solution supports the following operations:
You can view the status of any executed Run Command, including the output, errors, warnings, and information logs of the cmdlets.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
>[!NOTE] >If you need access to the Azure US Gov portal, go to https://portal.azure.us/ - 1. Select **Run command** > **Run execution status**. You can sort by the various columns by selecting the column.
azure-vmware Tutorial Access Private Cloud https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/tutorial-access-private-cloud.md
Azure VMware Solution doesn't allow you to manage your private cloud with your on-premises vCenter Server. Instead, you'll need to connect to the Azure VMware Solution vCenter Server instance through a jump box.
-In this tutorial, you'll create a jump box in the resource group you created in the [previous tutorial](tutorial-configure-networking.md) and sign into the Azure VMware Solution vCenter Server. This jump box is a Windows virtual machine (VM) on the same virtual network you created. It provides access to both vCenter Server and the NSX Manager.
+In this tutorial, you'll create a jump box in the resource group you created in the [previous tutorial](tutorial-configure-networking.md) and sign in to the Azure VMware Solution vCenter Server. This jump box is a Windows virtual machine (VM) on the same virtual network you created. It provides access to both vCenter Server and the NSX Manager.
In this tutorial, you learn how to: > [!div class="checklist"] > * Create a Windows VM to access the Azure VMware Solution vCenter
-> * Sign into vCenter Server from this VM
+> * Sign in to vCenter Server from this VM
## Create a new Windows virtual machine
azure-vmware Tutorial Configure Networking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/tutorial-configure-networking.md
The vNet with the provided address range and GatewaySubnet is created in your su
### Create a vNet manually
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
>[!NOTE] >If you need access to the Azure US Gov portal, go to https://portal.azure.us/
azure-web-pubsub Reference Client Sdk Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/reference-client-sdk-csharp.md
+
+ Title: Reference - C# Client-side SDK for Azure Web PubSub
+description: This reference describes the C# client-side SDK for Azure Web PubSub service.
+++++ Last updated : 07/17/2023++
+# Azure Web PubSub client library for .NET
+
+> [!NOTE]
+> Details about the terms used here are described in the [key concepts](./key-concepts.md) article.
+
+The client-side SDK aims to speed up a developer's workflow; more specifically, it:
+- simplifies managing client connections
+- simplifies sending messages among clients
+- automatically retries after unintended drops of the client connection
+- **reliably** delivers messages, in number and in order, after recovering from connection drops
+
+As shown in the diagram, your clients establish WebSocket connections with your Web PubSub resource.
+
+## Getting started
+
+### Install the package
+
+Install the client library from [NuGet](https://www.nuget.org/):
+
+```dotnetcli
+dotnet add package Azure.Messaging.WebPubSub.Client --prerelease
+```
+
+### Prerequisites
+
+- An Azure subscription
+- An existing Web PubSub instance
+
+### Authenticate the client
+
+A client uses a `Client Access URL` to connect and authenticate with the service. The `Client Access URL` follows the pattern `wss://<service_name>.webpubsub.azure.com/client/hubs/<hub_name>?access_token=<token>`. There are multiple ways to get a `Client Access URL`. As a quick start, you can copy and paste one from the Azure portal; for production, you usually need a negotiation server to generate the `Client Access URL`. [See details.](#use-negotiation-server-to-generate-client-access-url)
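The `access_token` part of that URL is a JWT. For illustration only, its payload can be inspected without signature verification; the token below is fabricated and the claim names are assumptions, not a statement of what the service actually issues:

```javascript
// Illustrative only: decode a JWT payload (no signature verification).
// The token here is hypothetical and built on the spot.
function decodeJwtPayload(token) {
  const payloadB64 = token.split(".")[1];
  return JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
}

const fakeToken =
  "eyJhbGciOiJIUzI1NiJ9." +
  Buffer.from(
    JSON.stringify({ aud: "wss://example/client/hubs/chat", role: ["webpubsub.joinLeaveGroup"] })
  ).toString("base64url") +
  ".fakesig";

console.log(decodeJwtPayload(fakeToken).aud);
```

Never rely on client-side decoding for security decisions; it only helps when debugging what a generated URL carries.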
+
+#### Use Client Access URL from Azure portal
+
+As a quick start, you can go to the Azure portal and copy the **Client Access URL** from the **Keys** blade.
++
+As shown in the diagram, the client is granted permission to send messages to specific groups and to join specific groups. To learn more about client permissions, see [permissions.](./reference-json-reliable-webpubsub-subprotocol.md#permissions)
+
+```C# Snippet:WebPubSubClient_Construct
+var client = new WebPubSubClient(new Uri("<client-access-uri>"));
+```
+
+#### Use negotiation server to generate `Client Access URL`
+
+In production, a client usually fetches the `Client Access URL` from a negotiation server. The server holds the `connection string` and generates the `Client Access URL` through `WebPubSubServiceClient`. For simplicity, the following code snippet demonstrates how to generate the `Client Access URL` inside a single process.
+
+```C# Snippet:WebPubSubClient_Construct2
+var client = new WebPubSubClient(new WebPubSubClientCredential(token =>
+{
+    // In common practice, you'd have a negotiation server that generates the token; the client fetches the token from it.
+ return FetchClientAccessTokenFromServerAsync(token);
+}));
+```
+
+```C# Snippet:WebPubSubClient_GenerateClientAccessUri
+public async ValueTask<Uri> FetchClientAccessTokenFromServerAsync(CancellationToken token)
+{
+ var serviceClient = new WebPubSubServiceClient("<< Connection String >>", "hub");
+ return await serviceClient.GetClientAccessUriAsync();
+}
+```
+
+The following table differentiates `WebPubSubClient` and `WebPubSubServiceClient`:
+
+|Class Name|WebPubSubClient|WebPubSubServiceClient|
+|---|---|---|
+|NuGet Package Name|Azure.Messaging.WebPubSub.Client |Azure.Messaging.WebPubSub|
+|Features|Used on the client side. Publish messages and subscribe to messages.|Used on the server side. Generate the Client Access URI and manage clients.|
+
+## Examples
+
+### Consume messages from the server and groups
+
+A client can add callbacks to consume messages from the server and from groups. Note that a client can only receive messages from groups it has joined.
+
+```C# Snippet:WebPubSubClient_Subscribe_ServerMessage
+client.ServerMessageReceived += eventArgs =>
+{
+ Console.WriteLine($"Receive message: {eventArgs.Message.Data}");
+ return Task.CompletedTask;
+};
+```
+
+```C# Snippet:WebPubSubClient_Subscribe_GroupMessage
+client.GroupMessageReceived += eventArgs =>
+{
+ Console.WriteLine($"Receive group message from {eventArgs.Message.Group}: {eventArgs.Message.Data}");
+ return Task.CompletedTask;
+};
+```
+
+### Add callbacks for `connected`, `disconnected`, and `stopped` events
+
+When a client connection is connected to the service, the `connected` event is triggered once the client receives the connected message from the service.
+
+```C# Snippet:WebPubSubClient_Subscribe_Connected
+client.Connected += eventArgs =>
+{
+ Console.WriteLine($"Connection {eventArgs.ConnectionId} is connected");
+ return Task.CompletedTask;
+};
+```
+
+When a client connection is disconnected and fails to recover, the `disconnected` event is triggered.
+
+```C# Snippet:WebPubSubClient_Subscribe_Disconnected
+client.Disconnected += eventArgs =>
+{
+ Console.WriteLine($"Connection is disconnected");
+ return Task.CompletedTask;
+};
+```
+
+When a client is stopped, which means the client connection is disconnected and the client stops trying to reconnect, the `stopped` event is triggered. This usually happens after `client.StopAsync()` is called, or when `AutoReconnect` is disabled. If you want to restart the client, you can call `client.StartAsync()` in the `Stopped` event handler.
+
+```C# Snippet:WebPubSubClient_Subscribe_Stopped
+client.Stopped += eventArgs =>
+{
+ Console.WriteLine($"Client is stopped");
+ return Task.CompletedTask;
+};
+```
+
+### Auto rejoin groups and handle rejoin failure
+
+When a client connection has dropped and fails to recover, all group contexts are cleaned up on the service side. That means when the client reconnects, it needs to rejoin groups. By default, the client has the `AutoRejoinGroups` option enabled. However, this feature has limitations. The client can only rejoin groups that it originally joined **by the client** rather than **by the server side**. And rejoin operations may fail for various reasons; for example, the client doesn't have permission to join the groups. In such cases, you need to add a callback to handle the failure.
+
+```C# Snippet:WebPubSubClient_Subscribe_RestoreFailed
+client.RejoinGroupFailed += eventArgs =>
+{
+ Console.WriteLine($"Restore group failed");
+ return Task.CompletedTask;
+};
+```
+
+### Operation and retry
+
+By default, operations such as `client.JoinGroupAsync()`, `client.LeaveGroupAsync()`, `client.SendToGroupAsync()`, and `client.SendEventAsync()` have three retries. You can use `WebPubSubClientOptions.MessageRetryOptions` to change this behavior. If all retries fail, an error is thrown. You can keep retrying by passing in the same `ackId` as the previous attempts, so the service can deduplicate the operation by its `ackId`.
+
+```C# Snippet:WebPubSubClient_JoinGroupAndRetry
+// Join the group "testGroup"
+try
+{
+ await client.JoinGroupAsync("testGroup");
+}
+catch (SendMessageFailedException ex)
+{
+ if (ex.AckId != null)
+ {
+ await client.JoinGroupAsync("testGroup", ackId: ex.AckId);
+ }
+}
+```
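The deduplication idea behind reusing the same `ackId` can be sketched in a few lines. This is an illustrative model of the concept, not the service's actual implementation:

```javascript
// Illustrative model of ack-ID deduplication (not the actual service code):
// the service remembers which ackIds it has processed, so a retried
// operation carrying the same ackId is acknowledged but not re-applied.
const processedAckIds = new Set();

function handleOperation(ackId, apply) {
  if (processedAckIds.has(ackId)) {
    return "Duplicate"; // acknowledged, but not applied a second time
  }
  processedAckIds.add(ackId);
  apply();
  return "Success";
}

let joinCount = 0;
console.log(handleOperation(1, () => joinCount++)); // Success
console.log(handleOperation(1, () => joinCount++)); // Duplicate (retry with same ackId)
console.log(joinCount); // 1
```

This is why retrying with the `ackId` from the thrown exception is safe: even if the original operation actually succeeded, the retry can't apply it twice.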
+
+## Troubleshooting
+### Enable logs
+You can set the following environment variable to get the debug logs when using this library.
+
+```bash
+export AZURE_LOG_LEVEL=verbose
+```
+
+For more detailed instructions on how to enable logs, you can look at the [@azure/logger package docs](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/core/logger).
+
+### Live Trace
+Use [Live Trace tool](./howto-troubleshoot-resource-logs.md#capture-resource-logs-by-using-the-live-trace-tool) from Azure portal to inspect live message traffic through your Web PubSub resource.
azure-web-pubsub Reference Client Sdk Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/reference-client-sdk-java.md
+
+ Title: Reference - Java Client-side SDK for Azure Web PubSub
+description: This reference describes the Java client-side SDK for Azure Web PubSub service.
+++++ Last updated : 07/17/2023++
+# Azure WebPubSub client library for Java
+
+> [!NOTE]
+> Details about the terms used here are described in the [key concepts](./key-concepts.md) article.
+
+The client-side SDK aims to speed up a developer's workflow; more specifically, it:
+- simplifies managing client connections
+- simplifies sending messages among clients
+- automatically retries after unintended drops of the client connection
+- **reliably** delivers messages, in number and in order, after recovering from connection drops
+
+As shown in the diagram, your clients establish WebSocket connections with your Web PubSub resource.
++
+## Getting started
+
+### Prerequisites
+
+- Java Development Kit (JDK) with version 8 or above
+- Azure subscription
+- An existing Web PubSub instance
+
+### Adding the package to your product
+
+[//]: # ({x-version-update-start;com.azure:azure-messaging-webpubsub-client;current})
+```xml
+<dependency>
+ <groupId>com.azure</groupId>
+ <artifactId>azure-messaging-webpubsub-client</artifactId>
+ <version>1.0.0-beta.1</version>
+</dependency>
+```
+[//]: # ({x-version-update-end})
+
+### Authenticate the client
+
+A client uses a `Client Access URL` to connect and authenticate with the service. The URL follows the pattern `wss://<service_name>.webpubsub.azure.com/client/hubs/<hub_name>?access_token=<token>`. There are multiple ways to get a `Client Access URL`. As a quick start, you can copy and paste one from the Azure portal; for production, you usually need a negotiation server to generate the URL. [See details.](#use-negotiation-server-to-generate-client-access-url)
+
+#### Use `Client Access URL` from Azure portal
+
+As a quick start, you can go to the Azure portal and copy the **Client Access URL** from the **Keys** blade.
++
+As shown in the diagram, the client is granted permission to send messages to specific groups and to join specific groups. To learn more about client permissions, see [permissions.](./reference-json-reliable-webpubsub-subprotocol.md#permissions)
+
+```java readme-sample-createClientFromUrl
+WebPubSubClient client = new WebPubSubClientBuilder()
+ .clientAccessUrl("<client-access-url>")
+ .buildClient();
+```
+
+#### Use negotiation server to generate `Client Access URL`
+
+In production, a client usually fetches the `Client Access URL` from a negotiation server. The server holds the `connection string` and generates the `Client Access URL` through `WebPubSubServiceClient`. For simplicity, the following code snippet demonstrates how to generate the `Client Access URL` inside a single process.
+
+```java readme-sample-createClientFromCredential
+// WebPubSubServiceAsyncClient is from com.azure:azure-messaging-webpubsub
+// create WebPubSub service client
+WebPubSubServiceAsyncClient serverClient = new WebPubSubServiceClientBuilder()
+ .connectionString("<connection-string>")
+    .hub("<hub>")
+ .buildAsyncClient();
+
+// wrap WebPubSubServiceAsyncClient.getClientAccessToken as WebPubSubClientCredential
+WebPubSubClientCredential clientCredential = new WebPubSubClientCredential(Mono.defer(() ->
+ serverClient.getClientAccessToken(new GetClientAccessTokenOptions()
+ .setUserId("<user-name>")
+ .addRole("webpubsub.joinLeaveGroup")
+ .addRole("webpubsub.sendToGroup"))
+ .map(WebPubSubClientAccessToken::getUrl)));
+
+// create WebPubSub client
+WebPubSubClient client = new WebPubSubClientBuilder()
+ .credential(clientCredential)
+ .buildClient();
+```
+
+The following table differentiates `WebPubSubClient` and `WebPubSubServiceClient`:
+
+|Class Name|WebPubSubClient|WebPubSubServiceClient|
+|---|---|---|
+|Package Name|azure-messaging-webpubsub-client|azure-messaging-webpubsub|
+|Features|Used on the client side. Publish messages and subscribe to messages.|Used on the server side. Generate the `Client Access URL` and manage clients.|
+
+## Examples
+
+### Consume messages from the server and groups
+
+A client can add callbacks to consume messages from the server and from groups. Note that a client can only receive messages from groups it has joined.
+
+```java readme-sample-listenMessages
+client.addOnGroupMessageEventHandler(event -> {
+ System.out.println("Received group message from " + event.getFromUserId() + ": "
+ + event.getData().toString());
+});
+client.addOnServerMessageEventHandler(event -> {
+ System.out.println("Received server message: "
+ + event.getData().toString());
+});
+```
+
+### Add callbacks for `connected`, `disconnected`, and `stopped` events
+
+When a client connection is connected to the service, the `connected` event is triggered.
+
+When a client connection is disconnected and fails to recover, the `disconnected` event is triggered.
+
+When a client is stopped, which means the client connection is disconnected and the client stops trying to reconnect, the `stopped` event is triggered. This usually happens after `client.stop()` is called, or when auto-reconnect is disabled. If you want to restart the client, you can call `client.start()` in the `stopped` event handler.
+
+```java readme-sample-listenEvent
+client.addOnConnectedEventHandler(event -> {
+ System.out.println("Connection is connected: " + event.getConnectionId());
+});
+client.addOnDisconnectedEventHandler(event -> {
+ System.out.println("Connection is disconnected");
+});
+client.addOnStoppedEventHandler(event -> {
+ System.out.println("Client is stopped");
+});
+```
+
+### Operation and retry
+
+By default, operations such as `client.joinGroup()`, `client.leaveGroup()`, `client.sendToGroup()`, and `client.sendEvent()` have three retries. You can use `WebPubSubClientBuilder.retryOptions()` to change this behavior. If all retries fail, an error is thrown. You can keep retrying by passing in the same `ackId` as the previous attempts, so the service can deduplicate the operation by its `ackId`.
+
+```java readme-sample-sendAndRetry
+try {
+ client.joinGroup("testGroup");
+} catch (SendMessageFailedException e) {
+ if (e.getAckId() != null) {
+ client.joinGroup("testGroup", e.getAckId());
+ }
+}
+```
+
+## Troubleshooting
+### Enable logs
+You can set the following environment variable to get the debug logs when using this library.
+
+```bash
+export AZURE_LOG_LEVEL=verbose
+```
+
+For more detailed instructions on how to enable logs, you can look at the [@azure/logger package docs](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/core/logger).
+
+### Live Trace
+Use [Live Trace tool](./howto-troubleshoot-resource-logs.md#capture-resource-logs-by-using-the-live-trace-tool) from Azure portal to inspect live message traffic through your Web PubSub resource.
azure-web-pubsub Reference Client Sdk Javascript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/reference-client-sdk-javascript.md
+
+ Title: Reference - JavaScript Client-side SDK for Azure Web PubSub
+description: This reference describes the JavaScript client-side SDK for Azure Web PubSub service.
+++++ Last updated : 07/17/2023+
+# Web PubSub client-side SDK for JavaScript
+
+> [!NOTE]
+> Details about the terms used here are described in the [key concepts](./key-concepts.md) article.
+
+The client-side SDK aims to speed up a developer's workflow; more specifically, it:
+- simplifies managing client connections
+- simplifies sending messages among clients
+- automatically retries after unintended drops of the client connection
+- **reliably** delivers messages, in number and in order, after recovering from connection drops
+
+As shown in the diagram, your clients establish WebSocket connections with your Web PubSub resource.
++
+## Getting started
+
+### Prerequisites
+- [LTS versions of Node.js](https://nodejs.org/about/releases/)
+- An Azure subscription
+- A Web PubSub resource
+
+### 1. Install the `@azure/web-pubsub-client` package
+```bash
+npm install @azure/web-pubsub-client
+```
+
+### 2. Connect with your Web PubSub resource
+A client uses a `Client Access URL` to connect and authenticate with the service. The URL follows the pattern `wss://<service_name>.webpubsub.azure.com/client/hubs/<hub_name>?access_token=<token>`. A client can obtain a `Client Access URL` in a few ways. For this quick guide, you can copy and paste one from the Azure portal, as shown. (For production, your clients usually get the `Client Access URL` generated on your application server. [See details](#use-an-application-server-to-generate-client-access-url-programatically) )
++
+As shown in the diagram, the client has the permissions to send messages to and join a specific group named **`group1`**.
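For illustration, the parts of a `Client Access URL` can be pulled apart with the standard WHATWG `URL` API. The URL below is fabricated; only the pattern matters:

```javascript
// Illustrative only: dissecting a (fake) Client Access URL into its parts.
const clientAccessUrl = new URL(
  "wss://myservice.webpubsub.azure.com/client/hubs/chat?access_token=abc123"
);

const hubName = clientAccessUrl.pathname.split("/").pop(); // "chat"
const accessToken = clientAccessUrl.searchParams.get("access_token"); // "abc123"
console.log(hubName, accessToken);
```

This can be handy when debugging which hub a copied URL actually targets.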
+
+```js
+// Imports the client library
+const { WebPubSubClient } = require("@azure/web-pubsub-client");
+
+// Instantiates the client object
+const client = new WebPubSubClient("<client-access-url>");
+
+// Starts the client connection with your Web PubSub resource
+await client.start();
+
+// ...
+// The client can join/leave groups, send/receive messages to and from those groups all in real-time
+```
+
+### 3. Join groups
+A client can only receive messages from groups that it has joined. You can add a callback to specify the logic of what to do when receiving messages.
+
+```js
+// ...continues the code snippet from above
+
+// Specifies the group to join
+let groupName = "group1";
+
+// Registers a listener for the event 'group-message' early before joining a group to not miss messages
+client.on("group-message", (e) => {
+ console.log(`Received message: ${e.message.data}`);
+});
+
+// A client needs to join the group it wishes to receive messages from
+await client.joinGroup(groupName);
+```
+
+### 4. Send messages to a group
+```js
+// ...continues the code snippet from above
+
+// Send a message to a joined group
+await client.sendToGroup(groupName, "hello world", "text");
+
+// In the Console tab of your developer tools found in your browser, you should see the message printed there.
+```
+
+## Examples
+### Handle `connected`, `disconnected` and `stopped` events
+Azure Web PubSub fires system events like `connected`, `disconnected` and `stopped`. You can register event handlers to decide what the program should do when the events are fired.
+
+1. When a client is successfully connected to your Web PubSub resource, the `connected` event is triggered. This snippet simply prints out the [connection ID](./key-concepts.md).
+```js
+client.on("connected", (e) => {
+ console.log(`Connection ${e.connectionId} is connected.`);
+});
+```
+
+2. When a client is disconnected and fails to recover the connection, the `disconnected` event is triggered. This snippet simply prints out the message.
+```js
+client.on("disconnected", (e) => {
+ console.log(`Connection disconnected: ${e.message}`);
+});
+```
+
+3. The `stopped` event is triggered when the client is disconnected **and** the client stops trying to reconnect. This usually happens after `client.stop()` is called, or `autoReconnect` is disabled, or a specified limit of reconnect attempts has been reached. If you want to restart the client, you can call `client.start()` in the `stopped` event handler.
+
+```js
+// Registers an event handler for the "stopped" event
+client.on("stopped", () => {
+ console.log(`Client has stopped`);
+});
+```
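The paragraph above mentions a limit on reconnect attempts. If you implement your own retry loop around `stopped`, a common strategy between attempts is capped exponential backoff. This generic sketch is illustrative and not part of the SDK:

```javascript
// Generic capped exponential backoff (illustrative; not part of the SDK):
// attempt 0 waits baseMs, attempt 1 waits 2*baseMs, ... capped at capMs.
function backoffDelayMs(attempt, baseMs = 1000, capMs = 30000) {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

console.log(backoffDelayMs(0));  // 1000
console.log(backoffDelayMs(3));  // 8000
console.log(backoffDelayMs(10)); // 30000 (capped)
```

Waiting `backoffDelayMs(attempt)` milliseconds before calling `client.start()` again avoids hammering the service during an outage.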
+
+### Use an application server to generate `Client Access URL` programatically
+In production, clients usually fetch `Client Access URL` from an application server. The server holds the `connection string` to your Web PubSub resource and generates the `Client Access URL` with help from the server-side library `@azure/web-pubsub`.
+
+#### 1. Application server
+The following code snippet is an example of an application server that exposes a `/negotiate` endpoint and returns the `Client Access URL`.
+
+```js
+// This code snippet uses the popular Express framework
+const express = require('express');
+const app = express();
+const port = 8080;
+
+// Imports the server library, which is different from the client library
+const { WebPubSubServiceClient } = require('@azure/web-pubsub');
+const hubName = 'sample_chat';
+
+const serviceClient = new WebPubSubServiceClient("<web-pubsub-connectionstring>", hubName);
+
+// Note that the token allows the client to join and send messages to any groups. It is specified with the "roles" option.
+app.get('/negotiate', async (req, res) => {
+ let token = await serviceClient.getClientAccessToken({roles: ["webpubsub.joinLeaveGroup", "webpubsub.sendToGroup"] });
+ res.json({
+ url: token.url
+ });
+});
+
+app.listen(port, () => console.log(`Application server listening at http://localhost:${port}/negotiate`));
+```
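For reference, the URL the server hands back follows the Client Access URL pattern `wss://<service_name>.webpubsub.azure.com/client/hubs/<hub_name>?access_token=<token>`. A minimal sketch of that shape (the `buildClientAccessUrl` helper is illustrative only; in production the URL and its embedded token come from `serviceClient.getClientAccessToken()`, not from manual string assembly):

```javascript
// Illustrative only: shows the shape of a Client Access URL.
// Not part of the SDK -- the real URL is produced by getClientAccessToken().
function buildClientAccessUrl(serviceName, hubName, token) {
  return `wss://${serviceName}.webpubsub.azure.com/client/hubs/${hubName}?access_token=${encodeURIComponent(token)}`;
}

console.log(buildClientAccessUrl("contoso", "sample_chat", "abc123"));
// wss://contoso.webpubsub.azure.com/client/hubs/sample_chat?access_token=abc123
```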
+
+#### 2. Client side
+```js
+const { WebPubSubClient } = require("@azure/web-pubsub-client")
+
+const client = new WebPubSubClient({
+ getClientAccessUrl: async () => {
+ let value = await (await fetch(`/negotiate`)).json();
+ return value.url;
+ }
+});
+
+await client.start();
+```
+
+> [!NOTE]
+> To see the full code of this sample, please refer to [samples-browser](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/web-pubsub/web-pubsub-client/samples-browser).
+### A client consumes messages from the application server or joined groups
+A client can add callbacks to consume messages from an application server or groups.
+
+```js
+// Registers a listener for the "server-message" event. The callback is invoked when your application server sends a message to the connection ID of this client, or broadcasts to all connections.
+client.on("server-message", (e) => {
+ console.log(`Received message ${e.message.data}`);
+});
+
+// Registers a listener for the "group-message". The callback is invoked when the client receives a message from the groups it has joined.
+client.on("group-message", (e) => {
+ console.log(`Received message from ${e.message.group}: ${e.message.data}`);
+});
+```
+
+> [!NOTE]
+> For `group-message` event, the client can **only** receive messages from the groups that it has joined.
+
+### Handle rejoin failure
+When a client is disconnected and fails to recover, all group contexts are cleaned up in your Web PubSub resource. This means that when the client reconnects, it needs to rejoin groups. By default, the client has the `autoRejoinGroups` option enabled.
+
+However, you should be aware of `autoRejoinGroup`'s limitations.
+- The client can only rejoin groups that it originally joined through client code, _not_ groups it was joined to by server-side code.
+- "Rejoin group" operations may fail for various reasons; for example, the client doesn't have permission to join the group. In such cases, you need to add a callback to handle this failure.
+
+```js
+// By default autoRejoinGroups=true. You can disable it by setting to false.
+const client = new WebPubSubClient("<client-access-url>", { autoRejoinGroups: true });
+
+// Registers a listener to handle "rejoin-group-failed" event
+client.on("rejoin-group-failed", e => {
+ console.log(`Rejoin group ${e.group} failed: ${e.error}`);
+})
+```
+
+### Retry
+By default, operations such as `client.joinGroup()`, `client.leaveGroup()`, `client.sendToGroup()`, and `client.sendEvent()` have three retries. You can configure the retry behavior through the `messageRetryOptions` option. If all retries fail, an error is thrown. You can keep retrying by passing in the same `ackId` as the previous attempts so that the Web PubSub service can deduplicate the operation.
+
+```js
+// SendMessageError is exported by the client library
+const { SendMessageError } = require("@azure/web-pubsub-client");
+
+try {
+  await client.joinGroup(groupName);
+} catch (err) {
+  let id = null;
+  if (err instanceof SendMessageError) {
+    id = err.ackId;
+  }
+  // Reuse the same ackId so the service can deduplicate the operation
+  await client.joinGroup(groupName, { ackId: id });
+}
+```
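The deduplication that makes reusing an `ackId` safe can be pictured with a small sketch. The `FakeDedupService` class below is purely illustrative, not part of the SDK: it mimics a service that applies a message only the first time a given `ackId` is seen.

```javascript
// Illustrative sketch of ackId-based deduplication (not the real service).
class FakeDedupService {
  constructor() {
    this.seenAckIds = new Set();
    this.delivered = [];
  }
  // Returns true if the message was newly applied, false if deduplicated.
  send(ackId, message) {
    if (this.seenAckIds.has(ackId)) {
      return false; // same ackId seen before: dropped as a duplicate
    }
    this.seenAckIds.add(ackId);
    this.delivered.push(message);
    return true;
  }
}

const svc = new FakeDedupService();
svc.send(1, "hello");             // applied
const dup = svc.send(1, "hello"); // retry with the same ackId: deduplicated
console.log(dup, svc.delivered.length); // false 1
```

Because a retry with the same `ackId` is a no-op on the service side, a client can safely resend after a timeout without risking double delivery.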
+
+## JavaScript Bundle
+To use this client library in the browser, you need to use a bundler. For details on how to create a bundle, refer to our [bundling documentation](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Bundling.md).
+
+## Troubleshooting
+### Enable logs
+You can set the following environment variable to get the debug logs when using this library.
+
+```bash
+export AZURE_LOG_LEVEL=verbose
+```
+
+For more detailed instructions on how to enable logs, you can look at the [@azure/logger package docs](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/core/logger).
+
+### Live Trace
+Use [Live Trace tool](./howto-troubleshoot-resource-logs.md#capture-resource-logs-by-using-the-live-trace-tool) from Azure portal to inspect live message traffic through your Web PubSub resource.
azure-web-pubsub Reference Client Sdk Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/reference-client-sdk-python.md
+
+ Title: Reference - Python Client-side SDK for Azure Web PubSub
+description: This reference describes the Python client-side SDK for Azure Web PubSub service.
+++++ Last updated : 07/17/2023++
+# Azure Web PubSub client library for Python
+
+> [!NOTE]
+> Details about the terms used here are described in [key concepts](./key-concepts.md) article.
+
+The client-side SDK aims to speed up the developer's workflow; more specifically, it:
+- simplifies managing client connections
+- simplifies sending messages among clients
+- automatically retries after unintended drops of the client connection
+- **reliably** delivers messages, in full and in order, after recovering from connection drops
+
+As shown in the diagram, your clients establish WebSocket connections with your Web PubSub resource.
+## Getting started
+
+### Prerequisites
+- [Python 3.7+](https://www.python.org/downloads/)
+- An Azure subscription
+- A Web PubSub resource
+
+### 1. Install the `azure-messaging-webpubsubclient` package
+
+```bash
+pip install azure-messaging-webpubsubclient
+```
+
+### 2. Connect with your Web PubSub resource
+
+A client uses a `Client Access URL` to connect and authenticate with the service, which follows the pattern `wss://<service_name>.webpubsub.azure.com/client/hubs/<hub_name>?access_token=<token>`. A client has a few ways to obtain the `Client Access URL`. For this quickstart, you can copy and paste one from the Azure portal.
+As shown in the diagram, the client has the permissions to send messages to and join a specific group named **`group1`**.
+
+```python
+from azure.messaging.webpubsubclient import WebPubSubClient
+
+client = WebPubSubClient("<<client-access-url>>")
+with client:
+ # The client can join/leave groups, send/receive messages to and from those groups all in real-time
+ ...
+```
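The `with client:` block relies on Python's context-manager protocol: the connection is opened on entry and closed on exit, even if the body raises an exception. A minimal sketch of that pattern (the `SketchClient` class is hypothetical and stands in for the SDK client; `__enter__`/`__exit__` stand in for `open()`/`close()`):

```python
# Illustrative sketch (not the real SDK): why "with client:" is convenient.
# A context manager guarantees the connection is closed even if the body raises.
class SketchClient:
    def __init__(self):
        self.connected = False

    def __enter__(self):
        self.connected = True   # stands in for opening the WebSocket connection
        return self

    def __exit__(self, exc_type, exc, tb):
        self.connected = False  # stands in for closing it, runs even on error
        return False

client = SketchClient()
with client:
    print(client.connected)  # True
print(client.connected)      # False
```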
+
+### 3. Join groups
+
+A client can only receive messages from groups that it has joined, and you need to add a callback to specify the logic for handling received messages.
+
+```python
+# ...continues the code snippet from above
+
+# Registers a listener for the event 'group-message' early, before joining a group, so that no messages are missed
+group_name = "group1"
+client.on("group-message", lambda e: print(f"Received message: {e.data}"))
+
+# A client needs to join a group to receive messages from it
+client.join_group(group_name)
+```
+
+### 4. Send messages to a group
+
+```python
+# ...continues the code snippet from above
+
+# Send a message to a joined group
+client.send_to_group(group_name, "hello world", "text")
+
+# Clients that have joined the group and registered a 'group-message' callback receive the message
+```
+
+## Examples
+### Add callbacks for `connected`, `disconnected` and `stopped` events
+1. When a client is successfully connected to your Web PubSub resource, the `connected` event is triggered.
+
+ ```python
+ client.on("connected", lambda e: print(f"Connection {e.connection_id} is connected"))
+ ```
+
+2. When a client is disconnected and fails to recover the connection, the `disconnected` event is triggered.
+
+ ```python
+ client.on("disconnected", lambda e: print(f"Connection disconnected: {e.message}"))
+ ```
+
+3. The `stopped` event is triggered when the client is disconnected **and** the client stops trying to reconnect. This usually happens after `client.stop()` is called, when `auto_reconnect` is disabled, or when a specified limit on reconnect attempts has been reached. If you want to restart the client, you can call `client.start()` in the `stopped` event handler.
+
+ ```python
+ client.on("stopped", lambda : print("Client has stopped"))
+ ```
+
+### A client consumes messages from the application server or joined groups
+
+A client can add callbacks to consume messages from your application server or groups. Note that for the `group-message` event, the client can _only_ receive messages from groups that it has joined.
+
+ ```python
+ # Registers a listener for the "server-message" event. The callback is invoked when your application server sends a message to the connection ID of this client, or broadcasts to all connections.
+ client.on("server-message", lambda e: print(f"Received message {e.data}"))
+
+ # Registers a listener for the "group-message" event. The callback is invoked when the client receives a message from a group it has joined.
+ client.on("group-message", lambda e: print(f"Received message from {e.group}: {e.data}"))
+ ```
+
+### Handle rejoin failure
+When a client is disconnected and fails to recover, all group contexts are cleaned up in your Web PubSub resource. This means that when the client reconnects, it needs to rejoin groups. By default, the client has the `auto_rejoin_groups` option enabled.
+
+However, you should be aware of `auto_rejoin_groups`'s limitations.
+- The client can only rejoin groups that it originally joined **through client code, _not_ server-side code**.
+- "Rejoin group" operations may fail for various reasons; for example, the client doesn't have permission to join the group. In such cases, you need to add a callback to handle this failure.
+
+```python
+# By default auto_rejoin_groups=True. You can disable it by setting to False.
+client = WebPubSubClient("<client-access-url>", auto_rejoin_groups=True)
+
+# Registers a listener to handle "rejoin-group-failed" event
+client.on("rejoin-group-failed", lambda e: print(f"Rejoin group {e.group} failed: {e.error}"))
+```
+
+### Operation and retry
+
+By default, operations such as `client.join_group()`, `client.leave_group()`, `client.send_to_group()`, and `client.send_event()` have three retries. You can configure the retry behavior through keyword arguments. If all retries fail, an error is thrown. You can keep retrying by passing in the same `ack_id` as the previous attempts so that the Web PubSub service can deduplicate the operation.
+
+```python
+try:
+ client.join_group(group_name)
+except SendMessageError as e:
+ client.join_group(group_name, ack_id=e.ack_id)
+```
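The effect of reusing the same `ack_id` can be sketched without the SDK. Both `FlakyService` and `join_with_retry` below are hypothetical, not part of the library; they only illustrate why a retry carrying a stable `ack_id` is safe to repeat: the service applies the operation once and treats later requests with that `ack_id` as duplicates.

```python
# Illustrative only: a fake service that fails transiently and deduplicates by ack_id.
class FlakyService:
    def __init__(self, fail_first_n: int):
        self._fail_first_n = fail_first_n
        self._calls = 0
        self._seen_ack_ids = set()
        self.joined = []

    def join_group(self, group: str, ack_id: int) -> None:
        self._calls += 1
        if self._calls <= self._fail_first_n:
            raise ConnectionError("transient failure")
        if ack_id in self._seen_ack_ids:
            return  # duplicate request: deduplicated, not applied twice
        self._seen_ack_ids.add(ack_id)
        self.joined.append(group)


def join_with_retry(service: FlakyService, group: str, ack_id: int, attempts: int = 3) -> None:
    for attempt in range(attempts):
        try:
            service.join_group(group, ack_id=ack_id)  # same ack_id every attempt
            return
        except ConnectionError:
            if attempt == attempts - 1:
                raise


service = FlakyService(fail_first_n=1)
join_with_retry(service, "group1", ack_id=42)
print(service.joined)  # ['group1']
```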
+
+## Troubleshooting
+### Enable logs
+You can set the following environment variable to get the debug logs when using this library.
+
+```bash
+export AZURE_LOG_LEVEL=verbose
+```
+
+For more detailed instructions on how to enable logs, you can look at the [@azure/logger package docs](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/core/logger).
+
+### Live Trace
+Use [Live Trace tool](./howto-troubleshoot-resource-logs.md#capture-resource-logs-by-using-the-live-trace-tool) from Azure portal to inspect live message traffic through your Web PubSub resource.
backup Backup Azure Mabs Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-mabs-troubleshoot.md
Reg query "HKLM\SOFTWARE\Microsoft\Microsoft Data Protection Manager\Setup"
| Operation | Error details | Workaround | | | | | | Change passphrase |The security PIN that was entered is incorrect. Provide the correct security PIN to complete this operation. |**Cause:**<br/> This error occurs when you enter an invalid or expired security PIN while you're performing a critical operation (such as changing a passphrase). <br/>**Recommended action:**<br/> To complete the operation, you must enter a valid security PIN. To get the PIN, sign in to the Azure portal and go to the Recovery Services vault. Then go to **Settings** > **Properties** > **Generate Security PIN**. Use this PIN to change the passphrase. |
-| Change passphrase |Operation failed. ID: 120002 |**Cause:**<br/>This error occurs when security settings are enabled, or when you try to change the passphrase when you're using an unsupported version.<br/>**Recommended action:**<br/> To change the passphrase, you must first update the backup agent to the minimum version, which is 2.0.9052. You also need to update Azure Backup Server to the minimum of update 1, and then enter a valid security PIN. To get the PIN, sign into the Azure portal and go to the Recovery Services vault. Then go to **Settings** > **Properties** > **Generate Security PIN**. Use this PIN to change the passphrase. |
+| Change passphrase |Operation failed. ID: 120002 |**Cause:**<br/>This error occurs when security settings are enabled, or when you try to change the passphrase when you're using an unsupported version.<br/>**Recommended action:**<br/> To change the passphrase, you must first update the backup agent to the minimum version, which is 2.0.9052. You also need to update Azure Backup Server to the minimum of update 1, and then enter a valid security PIN. To get the PIN, sign in to the Azure portal and go to the Recovery Services vault. Then go to **Settings** > **Properties** > **Generate Security PIN**. Use this PIN to change the passphrase. |
## Configure email notifications
backup Backup Azure Restore Files From Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-restore-files-from-vm.md
Title: Recover files and folders from Azure VM backup
description: In this article, learn how to recover files and folders from an Azure virtual machine recovery point. Last updated 06/30/2023-+
backup Backup Azure Troubleshoot Vm Backup Fails Snapshot Timeout https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-troubleshoot-vm-backup-fails-snapshot-timeout.md
description: Symptoms, causes, and resolutions of Azure Backup failures related
Last updated 05/05/2022 +
backup Backup Azure Vm File Recovery Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-vm-file-recovery-troubleshoot.md
Title: Troubleshoot Azure VM file recovery description: Troubleshoot issues when recovering files and folders from an Azure VM backup. + Last updated 07/12/2020
backup Backup Create Recovery Services Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-create-recovery-services-vault.md
Title: Create and configure Recovery Services vaults description: Learn how to create and configure Recovery Services vaults, and how to restore in a secondary region by using Cross Region Restore. Previously updated : 04/06/2023 Last updated : 07/21/2023
Before you begin, consider the following information:
- After you opt in, it might take up to 48 hours for the backup items to be available in secondary regions. - Cross Region Restore currently can't be reverted to GRS or LRS after the protection starts for the first time. - Currently, secondary region RPO is 36 hours. This is because the RPO in the primary region is 24 hours and can take up to 12 hours to replicate the backup data from the primary to the secondary region.
+- Review the [permissions required to use Cross Region Restore](backup-rbac-rs-vault.md#minimum-role-requirements-for-azure-vm-backup).
A vault created with GRS redundancy includes the option to configure the Cross Region Restore feature. Every GRS vault has a banner that links to the documentation.
backup Multi User Authorization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/multi-user-authorization.md
The **Security admin** can use PIM to create an eligible assignment for the Back
To create an eligible assignment, follow the steps: -
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to security tenant of Resource Guard, and in the search, enter **Privileged Identity Management**. 1. In the left pane, select **Manage and go to Azure Resources**. 1. Select the resource (the Resource Guard or the containing subscription/RG) to which you want to assign the Contributor role.
backup Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Backup description: Lists Azure Policy Regulatory Compliance controls available for Azure Backup. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
batch Automatic Certificate Rotation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/automatic-certificate-rotation.md
Title: Enable automatic certificate rotation in a Batch pool description: You can create a Batch pool with a managed identity and a certificate that will automatically be renewed. + Last updated 05/24/2023- # Enable automatic certificate rotation in a Batch pool
batch Batch Account Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-account-create-portal.md
Title: Create a Batch account in the Azure portal
description: Learn how to use the Azure portal to create and manage an Azure Batch account for running large-scale parallel workloads in the cloud. Last updated 04/03/2023--+ # Create a Batch account in the Azure portal
batch Batch Automatic Scaling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-automatic-scaling.md
Title: Autoscale compute nodes in an Azure Batch pool
description: Enable automatic scaling on an Azure Batch cloud pool to dynamically adjust the number of compute nodes in the pool. Last updated 05/26/2023--+ # Create a formula to automatically scale compute nodes in a Batch pool
batch Batch Ci Cd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-ci-cd.md
Title: Use Azure Pipelines to build and deploy an HPC solution
description: Use Azure Pipelines CI/CD build and release pipelines to deploy Azure Resource Manager templates for an Azure Batch high performance computing (HPC) solution. Last updated 04/12/2023 -+ # Use Azure Pipelines to build and deploy an HPC solution
batch Batch Cli Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-cli-templates.md
Title: Run jobs end-to-end using templates
description: With only CLI commands, you can create a pool, upload input data, create jobs and associated tasks, and download the resulting output data. Last updated 05/26/2023-+ # Use Azure Batch CLI templates and file transfer
batch Batch Docker Container Workloads https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-docker-container-workloads.md
description: Learn how to run and scale apps from container images on Azure Batc
Last updated 07/14/2023 ms.devlang: csharp, python-+ # Use Azure Batch to run container workloads
batch Batch Js Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-js-get-started.md
description: Learn the basic concepts of Azure Batch and build a simple solution
Last updated 05/16/2023 ms.devlang: javascript-+ # Get started with Batch SDK for JavaScript
batch Batch Linux Nodes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-linux-nodes.md
description: Learn how to process parallel compute workloads on pools of Linux v
Last updated 05/18/2023 ms.devlang: csharp, python-+ zone_pivot_groups: programming-languages-batch-linux-nodes # Provision Linux compute nodes in Batch pools
batch Batch Parallel Node Tasks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-parallel-node-tasks.md
Title: Run tasks concurrently to maximize usage of Batch compute nodes description: Learn how to increase efficiency and lower costs by using fewer compute nodes and parallelism in an Azure Batch pool. -+ Last updated 05/24/2023 ms.devlang: csharp
batch Batch Pool Compute Intensive Sizes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-pool-compute-intensive-sizes.md
Title: Use compute-intensive Azure VMs with Batch description: How to take advantage of HPC and GPU virtual machine sizes in Azure Batch pools. Learn about OS dependencies and see several scenario examples. + Last updated 05/01/2023 # Use RDMA or GPU instances in Batch pools
batch Batch Pool No Public Ip Address https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-pool-no-public-ip-address.md
Title: Create an Azure Batch pool without public IP addresses (preview)
description: Learn how to create an Azure Batch pool without public IP addresses. Last updated 05/30/2023-+ # Create a Batch pool without public IP addresses (preview)
batch Batch Powershell Cmdlets Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-powershell-cmdlets-get-started.md
Title: Get started with PowerShell
description: A quick introduction to the Azure PowerShell cmdlets you can use to manage Batch resources. Last updated 05/24/2023-+ # Manage Batch resources with PowerShell cmdlets
batch Batch Sig Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-sig-images.md
description: Custom image pools are an efficient way to configure compute nodes
Last updated 05/12/2023 ms.devlang: csharp, python-+ # Use the Azure Compute Gallery to create a custom image pool
batch Batch Spot Vms https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-spot-vms.md
Title: Run Batch workloads on cost-effective Spot VMs
description: Learn how to provision Spot VMs to reduce the cost of Azure Batch workloads. Last updated 04/11/2023-+ # Use Spot VMs with Batch workloads
batch Batch User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/batch-user-accounts.md
Title: Run tasks under user accounts
description: Learn the types of user accounts and how to configure them. Last updated 05/16/2023-+ ms.devlang: csharp, java, python # Run tasks under user accounts in Batch
batch Create Pool Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/create-pool-availability-zones.md
description: Learn how to create a Batch pool with zonal policy to help protect
Last updated 05/25/2023 ms.devlang: csharp+ # Create an Azure Batch pool across Availability Zones
batch Create Pool Extensions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/create-pool-extensions.md
Title: Use extensions with Batch pools description: Extensions are small applications that facilitate post-provisioning configuration and setup on Batch compute nodes. + Last updated 05/26/2023
batch Create Pool Public Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/create-pool-public-ip.md
Title: Create a Batch pool with specified public IP addresses description: Learn how to create an Azure Batch pool that uses your own static public IP addresses. + Last updated 05/26/2023
batch Managed Identity Pools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/managed-identity-pools.md
description: Learn how to enable user-assigned managed identities on Batch pools
Last updated 04/03/2023 ms.devlang: csharp+ # Configure managed identities in Batch pools
batch Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/quick-create-cli.md
Title: 'Quickstart: Use the Azure CLI to create a Batch account and run a job'
description: Follow this quickstart to use the Azure CLI to create a Batch account, a pool of compute nodes, and a job that runs basic tasks on the pool. Last updated 04/12/2023-+ # Quickstart: Use the Azure CLI to create a Batch account and run a job
batch Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Batch description: Lists Azure Policy Regulatory Compliance controls available for Azure Batch. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
batch Simplified Node Communication Pool No Public Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/simplified-node-communication-pool-no-public-ip.md
Title: Create a simplified node communication pool without public IP addresses
description: Learn how to create an Azure Batch simplified node communication pool without public IP addresses. Last updated 12/16/2022-+ # Create a simplified node communication pool without public IP addresses
batch Tutorial Batch Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/tutorial-batch-functions.md
description: Learn how to apply OCR to scanned documents as they're added to a s
ms.devlang: csharp Last updated 04/21/2023-+ # Tutorial: Trigger a Batch job using Azure Functions
batch Tutorial Parallel Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/tutorial-parallel-python.md
description: Learn how to process media files in parallel using ffmpeg in Azure
ms.devlang: python Last updated 05/25/2023-+ # Tutorial: Run a parallel workload with Azure Batch using the Python API
batch Virtual File Mount https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/virtual-file-mount.md
Title: Mount a virtual file system on a pool
description: Learn how to mount different kinds of virtual file systems on Batch pool nodes, and how to troubleshoot mounting issues. ms.devlang: csharp-+ Last updated 04/28/2023
cdn Cdn Azure Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-azure-diagnostic-logs.md
An Azure CDN profile is required for the following steps. Refer to [create an Az
Follow these steps to enable logging for your Azure CDN endpoint:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
-2. In the Azure portal, navigate to **All resources** -> **your-cdn-profile**
+2. In the Azure portal, navigate to **All resources** > **your-cdn-profile**.
2. Select the CDN endpoint for which you want to enable diagnostics logs:
cdn Cdn Custom Ssl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-custom-ssl.md
Last updated 06/06/2022 -+ #Customer intent: As a website owner, I want to enable HTTPS on the custom domain of my CDN endpoint so that my users can use my custom domain to access my content securely.
chaos-studio Chaos Studio Permissions Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/chaos-studio/chaos-studio-permissions-security.md
Title: Permissions and security for Azure Chaos Studio Preview description: Understand how permissions work in Azure Chaos Studio Preview and how you can secure resources from accidental fault injection.-+ Previously updated : 11/01/2021 Last updated : 06/30/2023
Chaos Studio has three levels of security to help you control how and when fault
When you attempt to control the ability to inject faults against a resource, the most important operation to restrict is `Microsoft.Chaos/experiments/start/action`. This operation starts a chaos experiment that injects faults.
-* Second, a chaos experiment has a [system-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md) that executes faults on a resource. When you create an experiment, the system-assigned managed identity is created in your Azure Active Directory tenant with no permissions.
+* Second, a chaos experiment has a [system-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md) or a [user-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md) that executes faults on a resource. If you choose to use a system-assigned managed identity for your experiment, the identity is created at experiment creation time in your Azure Active Directory tenant. User-assigned managed identities may be used across any number of experiments.
- Before you run your chaos experiment, you must grant its identity [appropriate permissions](chaos-studio-fault-providers.md) to all target resources. If the experiment identity doesn't have appropriate permission to a resource, it can't execute a fault against that resource.
+ Within a chaos experiment, you can choose to enable custom role assignment for either your system-assigned or user-assigned managed identity. Enabling this functionality allows Chaos Studio to create and assign a custom role containing any necessary experiment action capabilities that your selected identity doesn't already have. If a chaos experiment uses a user-assigned managed identity, any custom roles assigned to the experiment identity by Chaos Studio persist after the experiment is deleted.
+
+ If you choose to grant your experiment permissions manually, you must grant its identity [appropriate permissions](chaos-studio-fault-providers.md) to all target resources. If the experiment identity doesn't have appropriate permission to a resource, it can't execute a fault against that resource.
* Third, each resource must be onboarded to Chaos Studio as [a target with corresponding capabilities enabled](chaos-studio-targets-capabilities.md). If a target or the capability for the fault being executed doesn't exist, the experiment fails without affecting the resource.
+## User-assigned Managed Identity
+
+A chaos experiment can utilize a [user-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md) to obtain sufficient permissions to inject faults on the experiment's target resources. Additionally, user-assigned managed identities may be used across any number of experiments in Chaos Studio. To utilize this functionality, you must:
+* First, create a user-assigned managed identity within the [Managed Identities](../active-directory/managed-identities-azure-resources/overview.md) service. At this point, you may assign your user-assigned managed identity the permissions required to run your chaos experiments.
+* Second, when creating your chaos experiment, select a user-assigned managed identity from your subscription. You can choose to enable custom role assignment at this step. Enabling this functionality grants your selected identity any permissions it requires based on the faults contained in your experiment.
+* Third, after you've added all of your faults to your chaos experiment, review whether your identity configuration contains all the necessary actions for your chaos experiment to run successfully. If it doesn't, contact your system administrator for access, or edit your experiment's fault selections.
+ ## Agent authentication When you run agent-based faults, you must install the Chaos Studio agent on your virtual machine (VM) or virtual machine scale set. The agent uses a [user-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md) to authenticate to Chaos Studio and an *agent profile* to establish a relationship to a specific VM resource.
All user interactions with Chaos Studio happen through Azure Resource Manager. I
* Currently, Chaos Studio can't execute Chaos Mesh faults if the AKS cluster has [local accounts disabled](../aks/manage-local-accounts-managed-azure-ad.md). * **Agent-based faults**: To use agent-based faults, the agent needs access to the Chaos Studio agent service. A VM or virtual machine scale set must have outbound access to the agent service endpoint for the agent to connect successfully. The agent service endpoint is `https://acs-prod-<region>.chaosagent.trafficmanager.net`. You must replace the `<region>` placeholder with the region where your VM is deployed. An example is `https://acs-prod-eastus.chaosagent.trafficmanager.net` for a VM in East US.
-Chaos Studio doesn't support Azure Private Link for agent-based scenarios.
+Chaos Studio doesn't support Azure Private Link for agent-based scenarios.
## Service tags A [service tag](../virtual-network/service-tags-overview.md) is a group of IP address prefixes that can be assigned to inbound and outbound rules for network security groups. It automatically handles updates to the group of IP address prefixes without any intervention.
cloud-services-extended-support Certificates And Key Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services-extended-support/certificates-and-key-vault.md
# Use certificates with Azure Cloud Services (extended support)
-Key Vault is used to store certificates that are associated to Cloud Services (extended support). Key Vaults can be created through [Azure portal](../key-vault/general/quick-create-portal.md) and [PowerShell](../key-vault/general/quick-create-powershell.md). Add the certificates to Key Vault, then reference the certificate thumbprints in Service Configuration file. You also need to enable Key Vault for appropriate permissions so that Cloud Services (extended support) resource can retrieve certificate stored as secrets from Key Vault.
+Key Vault is used to store certificates that are associated with Cloud Services (extended support). Key vaults can be created through the [Azure portal](../key-vault/general/quick-create-portal.md) and [PowerShell](../key-vault/general/quick-create-powershell.md). Add the certificates to Key Vault, then reference the certificate thumbprints in the Service Configuration file. You also need to grant Key Vault the appropriate permissions so that the Cloud Services (extended support) resource can retrieve certificates stored as secrets from Key Vault.
## Upload a certificate to Key Vault
-1. Sign in to the Azure portal and navigate to the Key Vault. If you do not have a Key Vault set up, you can opt to create one in this same window.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to the Key Vault. If you do not have a Key Vault set up, you can opt to create one in this same window.
2. Select **Access Configuration**
cloud-services-extended-support Configure Scaling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services-extended-support/configure-scaling.md
Consider the following information when configuring scaling of your Cloud Servic
## Configure and manage scaling
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select the Cloud Service (extended support) deployment you want to configure scaling on. 3. Select the **Scale** blade.
cloud-services-extended-support Deploy Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services-extended-support/deploy-portal.md
This article explains how to use the Azure portal to create a Cloud Service (ext
Review the [deployment prerequisites](deploy-prerequisite.md) for Cloud Services (extended support) and create the associated resources. ## Deploy a Cloud Services (extended support)
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Using the search bar located at the top of the Azure portal, search for and select **Cloud Services (extended support)**.
cloud-services-extended-support Enable Alerts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services-extended-support/enable-alerts.md
This article explains how to enable alerts on existing Cloud Service (extended support) deployments. ## Add monitoring rules
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select the Cloud Service (extended support) deployment you want to enable alerts for. 3. Select the **Alerts** blade.
cloud-shell Msi Authorization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-shell/msi-authorization.md
ms.contributor: jahelmic
Last updated 11/14/2022 tags: azure-resource-manager+ Title: Acquiring a user token in Azure Cloud Shell # Acquire a token in Azure Cloud Shell
cloud-shell Persisting Shell Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-shell/persisting-shell-storage.md
ms.contributor: jahelmic
Last updated 04/25/2023 tags: azure-resource-manager+ Title: Persist files in Azure Cloud Shell
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Autosuggest/language-support.md
The following lists the languages supported by Bing Autosuggest API.
## See also -- [Azure Cognitive Services Documentation page](../index.yml)
+- [Azure AI services documentation page](../../ai-services/index.yml)
- [Azure Cognitive Services Product page](https://azure.microsoft.com/services/cognitive-services/)
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/csharp.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Try an online interactive demo](https://azure.microsoft.com/services/cognitive-services/bing-image-search-api/) * [Pricing details for the Bing Search APIs](https://azure.microsoft.com/pricing/details/cognitive-services/search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/java.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Try an online interactive demo](https://azure.microsoft.com/services/cognitive-services/bing-image-search-api/) * [Pricing details for the Bing Search APIs](https://azure.microsoft.com/pricing/details/cognitive-services/search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/nodejs.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Try an online interactive demo](https://azure.microsoft.com/services/cognitive-services/bing-image-search-api/) * [Pricing details for the Bing Search APIs](https://azure.microsoft.com/pricing/details/cognitive-services/search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/php.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Try an online interactive demo](https://azure.microsoft.com/services/cognitive-services/bing-image-search-api/) * [Pricing details for the Bing Search APIs](https://azure.microsoft.com/pricing/details/cognitive-services/search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/python.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Pricing details for the Bing Search APIs](https://azure.microsoft.com/pricing/details/cognitive-services/search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Image-Search/quickstarts/ruby.md
Responses from the Bing Image Search API are returned as JSON. This sample respo
* [What is the Bing Image Search API?](../overview.md) * [Try an online interactive demo](https://azure.microsoft.com/services/cognitive-services/bing-image-search-api/)
-* [Azure Cognitive Services documentation](../../index.yml)
+* [Azure AI services documentation](../../../ai-services/index.yml)
* [Bing Image Search API reference](/rest/api/cognitiveservices-bingsearch/bing-images-api-v7-reference)
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Spell-Check/language-support.md
Please note that to work with any other language than `en-US`, the `mkt` should
## See also -- [Cognitive Services Documentation page](../index.yml)
+- [Cognitive Services Documentation page](../../ai-services/index.yml)
- [Cognitive Services Product page](https://azure.microsoft.com/services/cognitive-services/)
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Bing-Web-Search/overview.md
The Bing Web Search API is a RESTful service that provides instant answers to user queries. Search results are easily configured to include web pages, images, videos, news, translations, and more. Bing Web Search provides the results as JSON based on search relevance and your Bing Web Search subscriptions.
-This API is optimal for applications that need access to all content that is relevant to a user's search query. If you're building an application that requires only a specific type of result, consider using the [Bing Image Search API](../bing-image-search/overview.md), [Bing Video Search API](../bing-video-search/overview.md), or [Bing News Search API](../bing-news-search/search-the-web.md). See [Cognitive Services APIs](../index.yml) for a complete list of Bing Search APIs.
+This API is optimal for applications that need access to all content that is relevant to a user's search query. If you're building an application that requires only a specific type of result, consider using the [Bing Image Search API](../bing-image-search/overview.md), [Bing Video Search API](../bing-video-search/overview.md), or [Bing News Search API](../bing-news-search/search-the-web.md). See [Cognitive Services APIs](../../ai-services/index.yml) for a complete list of Bing Search APIs.
Want to see how it works? Try our [Bing Web Search API demo](https://azure.microsoft.com/services/cognitive-services/bing-web-search-api/).
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/router/concepts.md
An exception policy controls the behavior of a Job based on a trigger and execut
### Next steps -- Let's get started with Job Router, check out the [Job Router Quickstart](../../quickstarts/router/get-started-router.md)
+> [!div class="nextstepaction"]
+> [Get started with Job Router](../../quickstarts/router/get-started-router.md)
#### Learn more about these key Job Router concepts
communication-services Recognize Action https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/call-automation/recognize-action.md
This guide will help you get started with recognizing DTMF input provided by par
|RecognizeCompleted|200|8531|Action completed, max digits received.| |RecognizeCompleted|200|8514|Action completed as stop tone was detected.| |RecognizeCompleted|400|8508|Action failed, the operation was canceled.|
-|RecognizeFailed|400|8510|Action failed, initial silence timeout reached|
-|RecognizeFailed|400|8532|Action failed, inter-digit silence timeout reached.|
+|RecognizeCompleted|400|8532|Action failed, inter-digit silence timeout reached.|
+|RecognizeFailed|400|8510|Action failed, initial silence timeout reached.|
|RecognizeFailed|500|8511|Action failed, encountered failure while trying to play the prompt.| |RecognizeFailed|500|8512|Unknown internal server error.| |RecognizeCanceled|400|8508|Action failed, the operation was canceled.|
communication-services Recognize Ai Action https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/call-automation/recognize-ai-action.md
This guide helps you get started recognizing user input in the forms of DTMF or
|RecognizeCompleted|200|8545|Action completed, speech option matched.| |RecognizeCompleted|200|8514|Action completed as stop tone was detected.| |RecognizeCompleted|200|8569|Action completed, speech was recognized.|
-|RecognizeCompleted|400|8508|Action failed, the operation was canceled.|
+|RecognizeCompleted|400|8532|Action failed, inter-digit silence time out reached.|
|RecognizeFailed|400|8563|Action failed, speech could not be recognized.| |RecognizeFailed|408|8570|Action failed, speech recognition timed out.|
-|RecognizeFailed|400|8510|Action failed, initial silence time out reached|
+|RecognizeFailed|400|8510|Action failed, initial silence time out reached.|
|RecognizeFailed|500|8511|Action failed, encountered failure while trying to play the prompt.|
-|RecognizeFailed|400|8532|Action failed, inter-digit silence time out reached.|
-|RecognizeFailed|400|8547|Action failed, speech option not matched.|
+|RecognizeFailed|400|8547|Action failed, recognized phrase does not match a valid option.|
|RecognizeFailed|500|8534|Action failed, incorrect tone entered.| |RecognizeFailed|500|9999|Unspecified error.| |RecognizeCanceled|400|8508|Action failed, the operation was canceled.|
communication-services Get Started Router https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/router/get-started-router.md
zone_pivot_groups: acs-js-csharp-java-python
Get started with Azure Communication Services Job Router by setting up your client, then configuring core functionality such as queues, policies, workers, and Jobs. To learn more about Job Router concepts, visit [Job Router conceptual documentation](../../concepts/router/concepts.md) ::: zone pivot="programming-language-csharp"-
-Get started with Azure Communication Services Job Router by setting up your client, then configuring core functionality such as queues, policies, workers, and Jobs. To learn more about Job Router concepts, visit [Job Router conceptual documentation](../../concepts/router/concepts.md)
[!INCLUDE [Use Job Router with .NET SDK](./includes/router-quickstart-net.md)] ::: zone-end ::: zone pivot="programming-language-javascript"-
-Get started with Azure Communication Services Job Router by setting up your client, then configuring core functionality such as queues, policies, workers, and Jobs. To learn more about Job Router concepts, visit [Job Router conceptual documentation](../../concepts/router/concepts.md)
[!INCLUDE [Use Job Router with JavaScript SDK](./includes/router-quickstart-javascript.md)] ::: zone-end ::: zone pivot="programming-language-python"-
-Get started with Azure Communication Services Job Router by setting up your client, then configuring core functionality such as queues, policies, workers, and Jobs. To learn more about Job Router concepts, visit [Job Router conceptual documentation](../../concepts/router/concepts.md)
[!INCLUDE [Use Job Router with Python SDK](./includes/router-quickstart-python.md)] ::: zone-end ::: zone pivot="programming-language-java"-
-Get started with Azure Communication Services Job Router by setting up your client, then configuring core functionality such as queues, policies, workers, and Jobs. To learn more about Job Router concepts, visit [Job Router conceptual documentation](../../concepts/router/concepts.md)
[!INCLUDE [Use Job Router with Java SDK](./includes/router-quickstart-java.md)] ::: zone-end
+## Next Steps
+Explore the Job Router [how-to guides](https://learn.microsoft.com/azure/communication-services/concepts/router/concepts#check-out-our-how-to-guides)
+ <!-- LINKS --> [subscribe_events]: ../../how-tos/router-sdk/subscribe-events.md [worker_registered_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterworkerregistered
confidential-computing Confidential Enclave Nodes Aks Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/confidential-enclave-nodes-aks-get-started.md
Last updated 04/11/2023 -+ # Quickstart: Deploy an AKS cluster with confidential computing Intel SGX agent nodes by using the Azure CLI
confidential-computing Guest Attestation Example https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/guest-attestation-example.md
Last updated 04/11/2023-+ # Use sample application for guest attestation
confidential-computing Quick Create Confidential Vm Arm Amd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/quick-create-confidential-vm-arm-amd.md
Last updated 04/12/2023 -+ ms.devlang: azurecli
This is an example parameter file for a Windows Server 2022 Gen 2 confidential V
> [!div class="nextstepaction"] > [Quickstart: Create a confidential VM on AMD in the Azure portal](quick-create-confidential-vm-portal-amd.md)-
confidential-computing Quick Create Confidential Vm Azure Cli Amd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/quick-create-confidential-vm-azure-cli-amd.md
Last updated 11/29/2022 -+ # Quickstart: Create an AMD-based confidential VM with the Azure CLI
confidential-computing Quick Create Confidential Vm Portal Amd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/quick-create-confidential-vm-portal-amd.md
Last updated 3/27/2022 -+ # Quickstart: Create confidential VM on AMD in the Azure portal
confidential-ledger Create Client Certificate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-ledger/create-client-certificate.md
description: Creating a Client Certificate with Microsoft Azure confidential led
+ Last updated 04/11/2023 - # Creating a Client Certificate
container-apps Azure Arc Enable Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/azure-arc-enable-cluster.md
description: 'Tutorial: learn how to set up Azure Container Apps in your Azure A
-+ Last updated 3/24/2023
container-apps Background Processing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/background-processing.md
Last updated 11/02/2021 -+ # Tutorial: Deploy a background processing application with Azure Container Apps
container-apps Dapr Github Actions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/dapr-github-actions.md
Last updated 07/10/2023-+ # Tutorial: Deploy a Dapr application with GitHub Actions for Azure Container Apps
container-apps Deploy Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/deploy-visual-studio-code.md
Last updated 09/01/2022-+ # Quickstart: Deploy to Azure Container Apps using Visual Studio Code
container-apps Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/disaster-recovery.md
-+ Last updated 1/18/2023
container-apps Get Started Existing Container Image https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/get-started-existing-container-image.md
description: Deploy an existing container image to Azure Container Apps with the
-+ Last updated 08/31/2022
container-apps Github Actions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/github-actions.md
description: Learn to automatically create new revisions in Azure Container Apps
-+ Last updated 11/09/2022
container-apps Manage Secrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/manage-secrets.md
Last updated 05/10/2023 -+ # Manage secrets in Azure Container Apps
container-apps Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/managed-identity.md
User-assigned identities are ideal for workloads that:
Using managed identities in scale rules isn't supported. You'll still need to include the connection string or key in the `secretRef` of the scaling rule.
+[Init containers](containers.md#init-containers) can't access managed identities.
+ ## Configure managed identities You can configure your managed identities through:
container-apps Microservices Dapr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/microservices-dapr.md
Last updated 09/29/2022 -+ ms.devlang: azurecli
container-apps Scale App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/scale-app.md
description: Learn how applications scale in and out in Azure Container Apps.
-+ Last updated 12/08/2022
container-apps Tutorial Java Quarkus Connect Managed Identity Postgresql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/tutorial-java-quarkus-connect-managed-identity-postgresql-database.md
Last updated 09/26/2022-+ # Tutorial: Connect to PostgreSQL Database from a Java Quarkus Container App without secrets using a managed identity
container-apps Vnet Custom Internal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/vnet-custom-internal.md
description: Learn how to integrate a VNET to an internal Azure Container Apps e
-+ Last updated 08/31/2022
container-apps Vnet Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/vnet-custom.md
description: Learn how to integrate a VNET with an external Azure Container Apps
-+ Last updated 08/31/2022
container-instances Container Instances Custom Dns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-custom-dns.md
description: Configure a public or private DNS configuration for a container gro
-+ Last updated 05/25/2022
container-instances Container Instances Github Action https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-github-action.md
Last updated 12/09/2022-+ # Configure a GitHub Action to create a container instance
container-instances Container Instances Gpu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-gpu.md
+ Last updated 06/17/2022
container-instances Container Instances Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-managed-identity.md
-+ Last updated 06/17/2022
az container exec \
--exec-command "/bin/bash" ```
-Run the following commands in the bash shell in the container. First log in to the Azure CLI using the managed identity:
+Run the following commands in the bash shell in the container. First, sign in to the Azure CLI using the managed identity:
```azurecli-interactive az login --identity
container-instances Container Instances Readiness Probe https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-readiness-probe.md
+ Last updated 06/17/2022
container-instances Container Instances Restart Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-restart-policy.md
+ Last updated 06/17/2022
container-instances Container Instances Start Command https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-start-command.md
-+ Last updated 06/17/2022
container-instances Container Instances Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-troubleshooting.md
Last updated 06/17/2022-+ # Troubleshoot common issues in Azure Container Instances
container-instances Container Instances Tutorial Azure Function Trigger https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-tutorial-azure-function-trigger.md
Last updated 06/17/2022-+ # Tutorial: Use an HTTP-triggered Azure function to create a container group
container-instances Container Instances Tutorial Deploy Confidential Containers Cce Arm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-tutorial-deploy-confidential-containers-cce-arm.md
Last updated 05/23/2023-+ # Tutorial: Create an ARM template for a confidential container deployment with custom confidential computing enforcement policy
container-instances Container Instances Tutorial Prepare Acr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-tutorial-prepare-acr.md
Last updated 06/17/2022-+ # Tutorial: Create an Azure container registry and push a container image
container-instances Container Instances Tutorial Prepare App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-tutorial-prepare-app.md
Last updated 06/17/2022-+ # Tutorial: Create a container image for deployment to Azure Container Instances
container-instances Container Instances Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-vnet.md
Last updated 06/17/2022-+ # Deploy container instances into an Azure virtual network
container-registry Container Registry Auth Service Principal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-auth-service-principal.md
Once logged in, Docker caches the credentials.
### Use with certificate
-If you've added a certificate to your service principal, you can sign into the Azure CLI with certificate-based authentication, and then use the [az acr login][az-acr-login] command to access a registry. Using a certificate as a secret instead of a password provides additional security when you use the CLI.
+If you've added a certificate to your service principal, you can sign in to the Azure CLI with certificate-based authentication, and then use the [az acr login][az-acr-login] command to access a registry. Using a certificate as a secret instead of a password provides additional security when you use the CLI.
A self-signed certificate can be created when you [create a service principal](/cli/azure/create-an-azure-service-principal-azure-cli). Or, add one or more certificates to an existing service principal. For example, if you use one of the scripts in this article to create or update a service principal with rights to pull or push images from a registry, add a certificate using the [az ad sp credential reset][az-ad-sp-credential-reset] command.
-To use the service principal with certificate to [sign into the Azure CLI](/cli/azure/authenticate-azure-cli#sign-in-with-a-service-principal), the certificate must be in PEM format and include the private key. If your certificate isn't in the required format, use a tool such as `openssl` to convert it. When you run [az login][az-login] to sign into the CLI using the service principal, also provide the service principal's application ID and the Active Directory tenant ID. The following example shows these values as environment variables:
+To use the service principal with certificate to [sign in to the Azure CLI](/cli/azure/authenticate-azure-cli#sign-in-with-a-service-principal), the certificate must be in PEM format and include the private key. If your certificate isn't in the required format, use a tool such as `openssl` to convert it. When you run [az login][az-login] to sign in to the CLI using the service principal, also provide the service principal's application ID and the Active Directory tenant ID. The following example shows these values as environment variables:
```azurecli az login --service-principal --username $SP_APP_ID --tenant $SP_TENANT_ID --password /path/to/cert/pem/file
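The PEM-with-private-key requirement mentioned above can be sketched with `openssl`. This is an illustrative example only (filenames and subject are placeholders; for a real service principal, convert the certificate you actually registered):

```shell
# Sketch: generate a certificate and key, then combine them into a single PEM file
# that includes the private key, as the CLI sign-in requires. Names are illustrative.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=example-sp" \
  -keyout sp-key.pem -out sp-cert.pem 2>/dev/null
cat sp-key.pem sp-cert.pem > sp-combined.pem
# sp-combined.pem now contains both the private key and the certificate.
```

A file produced this way contains both `BEGIN PRIVATE KEY` and `BEGIN CERTIFICATE` sections, which is the format `az login` expects for the certificate path.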
container-registry Container Registry Authentication Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-authentication-managed-identity.md
Title: Authenticate with managed identity description: Provide access to images in your private container registry by using a user-assigned or system-assigned managed Azure identity. -+ Last updated 10/11/2022
container-registry Container Registry Content Trust https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-content-trust.md
Azure Container Registry implements Docker's [content trust][docker-content-trus
> [!NOTE] > Content trust is a feature of the [Premium service tier](container-registry-skus.md) of Azure Container Registry.
+## Limitations
+- Tokens with repository-scoped permissions don't currently support docker push and pull of signed images.
+ ## How content trust works Important to any distributed system designed with security in mind is verifying both the *source* and the *integrity* of data entering the system. Consumers of the data need to be able to verify both the publisher (source) of the data, as well as ensure it's not been modified after it was published (integrity).
container-registry Container Registry Enable Conditional Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-enable-conditional-access-policy.md
The following steps will help create a Conditional Access policy for Azure Conta
## Prerequisites >* [Install or upgrade Azure CLI](/cli/azure/install-azure-cli) version 2.40.0 or later. To find the version, run `az --version`.
->* Sign into [Azure portal.](https://portal.azure.com).
+>* Sign in to the [Azure portal](https://portal.azure.com).
## Disable authentication-as-arm in ACR - Azure CLI
Disabling `authentication-as-arm` property by assigning a built-in policy will a
You can disable authentication-as-arm in the ACR, by following below steps:
- 1. Sign in to the [Azure portal](https://portal.azure.com).
+ 1. Sign in to the [Azure portal](https://portal.azure.com).
2. Refer to the ACR's built-in policy definitions in the [azure-container-registry-built-in-policy definition's](policy-reference.md). 3. Assign a built-in policy to disable authentication-as-arm definition - Azure portal.
container-registry Container Registry Get Started Docker Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-get-started-docker-cli.md
There are [several ways to authenticate](container-registry-authentication.md) t
### [Azure CLI](#tab/azure-cli)
-The recommended method when working in a command line is with the Azure CLI command [az acr login](/cli/azure/acr#az-acr-login). For example, to log in to a registry named *myregistry*, log into the Azure CLI and then authenticate to your registry:
+The recommended method when working in a command line is with the Azure CLI command [az acr login](/cli/azure/acr#az-acr-login). For example, to access a registry named `myregistry`, sign in to the Azure CLI and then authenticate to your registry:
```azurecli az login
container-registry Container Registry Repositories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-repositories.md
docker push myregistry.azurecr.io/samples/nginx
To view a repository:
-1. Sign in to the [Azure portal](https://portal.azure.com)
-1. Select the **Azure Container Registry** to which you pushed the Nginx image
-1. Select **Repositories** to see a list of the repositories that contain the images in the registry
-1. Select a repository to see the image tags within that repository
+1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Select the **Azure Container Registry** to which you pushed the Nginx image.
+1. Select **Repositories** to see a list of the repositories that contain the images in the registry.
+1. Select a repository to see the image tags within that repository.
For example, if you pushed the Nginx image as instructed in [Push and pull an image](container-registry-get-started-docker-cli.md), you should see something similar to:
container-registry Container Registry Troubleshoot Login https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-troubleshoot-login.md
When using `docker login`, provide the full login server name of the registry, s
docker login myregistry.azurecr.io ```
-When using [az acr login](/cli/azure/acr#az-acr-login) with an Azure Active Directory identity, first [sign into the Azure CLI](/cli/azure/authenticate-azure-cli), and then specify the Azure resource name of the registry. The resource name is the name provided when the registry was created, such as *myregistry* (without a domain suffix). Example:
+When using [az acr login](/cli/azure/acr#az-acr-login) with an Azure Active Directory identity, first [sign in to the Azure CLI](/cli/azure/authenticate-azure-cli), and then specify the Azure resource name of the registry. The resource name is the name provided when the registry was created, such as *myregistry* (without a domain suffix). Example:
```azurecli az acr login --name myregistry
container-registry Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/security-controls-policy.md
description: Lists Azure Policy Regulatory Compliance controls available for Azu
Previously updated : 07/06/2023 Last updated : 07/20/2023
cosmos-db Analytical Store Private Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/analytical-store-private-endpoints.md
The following access restrictions are applicable when data-exfiltration protecti
### Add a managed private endpoint for Azure Cosmos DB analytical store
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. From the Azure portal, navigate to your Synapse Analytics workspace and open the **Overview** pane.
cosmos-db Audit Control Plane Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/audit-control-plane-logs.md
You can enable diagnostic logs for control plane operations by using the Azure p
Use the following steps to enable logging on control plane operations:
-1. Sign into [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. Open the **Diagnostic settings** pane, provide a **Name** for the logs to create.
You can also store the logs in a storage account or stream to an event hub. This
After you turn on logging, use the following steps to track down operations for a specific account:
-1. Sign into [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Open the **Monitor** tab from the left-hand navigation and then select the **Logs** pane. It opens a UI where you can easily run queries with that specific account in scope. Run the following query to view control plane logs:
cosmos-db Migrate Data Striim https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cassandra/migrate-data-striim.md
This article shows how to use Striim to migrate data from an **Oracle database**
## Deploy the Striim marketplace solution
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Create a resource** and search for **Striim** in the Azure marketplace. Select the first option and **Create**.
In this section, you will configure the Azure Cosmos DB for Apache Cassandra acc
:::image type="content" source="media/migrate-data-striim/get-ssh-url.png" alt-text="Get the SSH URL":::
-1. Open a new terminal window and run the SSH command you copied from the Azure portal. This article uses terminal in a MacOS, you can follow the similar instructions using an SSH client on a Windows machine. When prompted, type **yes** to continue and enter the **password** you have set for the virtual machine in the previous step.
+1. Open a new terminal window and run the SSH command you copied from the Azure portal. This article uses the terminal on macOS; you can follow similar instructions using an SSH client on a Windows machine. When prompted, type **yes** to continue and enter the **password** you set for the virtual machine in the previous step.
:::image type="content" source="media/migrate-data-striim/striim-vm-connect.png" alt-text="Connect to Striim VM":::
cosmos-db Configure Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/configure-synapse-link.md
The first step to use Synapse Link is to enable it for your Azure Cosmos DB data
### Azure portal
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. [Create a new Azure account](create-sql-api-dotnet.md#create-account), or select an existing Azure Cosmos DB account.
Please note the following details when enabling Azure Synapse Link on your exist
### Azure portal #### New container
-1. Sign in to the [Azure portal](https://portal.azure.com/) or the [Azure Cosmos DB Explorer](https://cosmos.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com) or the [Azure Cosmos DB Explorer](https://cosmos.azure.com).
1. Navigate to your Azure Cosmos DB account and open the **Data Explorer** tab.
Please note the following details when enabling Azure Synapse Link on your exist
#### Existing container
-1. Sign in to the [Azure portal](https://portal.azure.com/) or the [Azure Cosmos DB Explorer](https://cosmos.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com) or the [Azure Cosmos DB Explorer](https://cosmos.azure.com).
1. Navigate to your Azure Cosmos DB account and open the **Azure Synapse Link** tab.
cosmos-db Continuous Backup Restore Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/continuous-backup-restore-permissions.md
Scope is a set of resources that have access, to learn more on scopes, see the [
To perform a restore, a user or a principal needs the permission to restore (that is, *restore/action* permission), and permission to provision a new account (that is, *write* permission). To grant these permissions, the owner of the subscription can assign the `CosmosRestoreOperator` and `Cosmos DB Operator` built-in roles to a principal.
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to your subscription. The `CosmosRestoreOperator` role is available at subscription level.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your subscription. The `CosmosRestoreOperator` role is available at subscription level.
1. Select **Access control (IAM)**.
az role definition create --role-definition <JSON_Role_Definition_Path>
## Next steps
-* Provision continuous backup using [Azure portal](provision-account-continuous-backup.md#provision-portal), [PowerShell](provision-account-continuous-backup.md#provision-powershell), [CLI](provision-account-continuous-backup.md#provision-cli), or [Azure Resource Manager](provision-account-continuous-backup.md#provision-arm-template).
+* Provision continuous backup using the [Azure portal](provision-account-continuous-backup.md#provision-portal), [PowerShell](provision-account-continuous-backup.md#provision-powershell), [CLI](provision-account-continuous-backup.md#provision-cli), or [Azure Resource Manager](provision-account-continuous-backup.md#provision-arm-template).
* [Get the latest restorable timestamp](get-latest-restore-timestamp.md) for SQL and MongoDB accounts.
-* Restore an account using [Azure portal](restore-account-continuous-backup.md#restore-account-portal), [PowerShell](restore-account-continuous-backup.md#restore-account-powershell), [CLI](restore-account-continuous-backup.md#restore-account-cli), or [Azure Resource Manager](restore-account-continuous-backup.md#restore-arm-template).
+* Restore an account using the [Azure portal](restore-account-continuous-backup.md#restore-account-portal), [PowerShell](restore-account-continuous-backup.md#restore-account-powershell), [CLI](restore-account-continuous-backup.md#restore-account-cli), or [Azure Resource Manager](restore-account-continuous-backup.md#restore-arm-template).
* [Migrate to an account from periodic backup to continuous backup](migrate-continuous-backup.md). * [Resource model of continuous backup mode](continuous-backup-restore-resource-model.md)
cosmos-db Database Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/database-security.md
The process of key rotation and regeneration is simple. First, make sure that **
After you rotate or regenerate a key, you can track its status from the Activity log. Use the following steps to track the status:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. Open the **Activity log** pane and set the following filters:
cosmos-db Integrated Power Bi Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/integrated-power-bi-synapse-link.md
Synapse Link enables you to build Power BI dashboards with no performance or cos
Use the following steps to build a Power BI report from Azure Cosmos DB data in DirectQuery mode:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. From the **Integrations** section, open the **Power BI** pane and select **Get started**.
cosmos-db Migrate Continuous Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/migrate-continuous-backup.md
[!INCLUDE[NoSQL, MongoDB, Gremlin, Table](includes/appliesto-nosql-mongodb-gremlin-table.md)]
-Azure Cosmos DB accounts with periodic mode backup policy can be migrated to continuous mode using [Azure portal](#portal), [CLI](#cli), [PowerShell](#powershell), or [Resource Manager templates](#ARM-template). Migration from periodic to continuous mode is a one-way migration and it's not reversible. After migrating from periodic to continuous mode, you can apply the benefits of continuous mode.
+Azure Cosmos DB accounts with periodic mode backup policy can be migrated to continuous mode using the [Azure portal](#portal), [CLI](#cli), [PowerShell](#powershell), or [Resource Manager templates](#ARM-template). Migration from periodic to continuous mode is a one-way migration and it's not reversible. After migrating from periodic to continuous mode, you can apply the benefits of continuous mode.
The following are the key reasons to migrate into continuous mode:
After you migrate your account to continuous backup mode, the cost can change wh
Use the following steps to migrate your account from periodic backup to continuous backup mode:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Navigate to your Azure Cosmos DB account and open the **Backup & Restore** pane. Select the **Backup Policies** tab and select **change**. Once you choose the target continuous mode, select **Save**.
To learn more about continuous backup mode, see the following articles:
* [Continuous backup mode resource model.](continuous-backup-restore-resource-model.md)
-* Restore an account using [Azure portal](restore-account-continuous-backup.md#restore-account-portal), [PowerShell](restore-account-continuous-backup.md#restore-account-powershell), [CLI](restore-account-continuous-backup.md#restore-account-cli), or [Azure Resource Manager](restore-account-continuous-backup.md#restore-arm-template).
+* Restore an account using the [Azure portal](restore-account-continuous-backup.md#restore-account-portal), [PowerShell](restore-account-continuous-backup.md#restore-account-powershell), [CLI](restore-account-continuous-backup.md#restore-account-cli), or [Azure Resource Manager](restore-account-continuous-backup.md#restore-arm-template).
cosmos-db Post Migration Optimization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/post-migration-optimization.md
Most users leave their consistency level at the default session consistency sett
The processing of cutting-over or connecting your application allows you to switch your application to use Azure Cosmos DB once migration is finished. Follow the steps below:
-1. In a new window sign into the [Azure portal](https://www.portal.azure.com/)
+1. In a new window, sign in to the [Azure portal](https://www.portal.azure.com/).
2. From the [Azure portal](https://www.portal.azure.com/), in the left pane open the **All resources** menu and find the Azure Cosmos DB account to which you have migrated your data.
3. Open the **Connection String** blade. The right pane contains all the information that you need to successfully connect to your account.
4. Use the connection information in your application's configuration (or other relevant places) to reflect the Azure Cosmos DB's API for MongoDB connection in your app.
cosmos-db Monitor Account Key Updates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/monitor-account-key-updates.md
Azure Monitor for Azure Cosmos DB provides metrics, alerts, and logs to monitor
## Monitor key updates with metrics
-1. Sign into the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Monitor** from the left navigation bar and select **Metrics**.
cosmos-db Monitor Resource Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/monitor-resource-logs.md
Here, we walk through the process of creating diagnostic settings for your accou
### [Azure portal](#tab/azure-portal)
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cosmos DB account. Open the **Diagnostic settings** pane under the **Monitoring section**, and then select **Add diagnostic setting** option.
cosmos-db Certificate Based Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/certificate-based-authentication.md
Certificate-based authentication enables your client application to be authentic
In this step, you will register a sample web application in your Azure AD account. This application is later used to read the keys from your Azure Cosmos DB account. Use the following steps to register an application:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Open the Azure **Active Directory** pane, go to **App registrations** pane, and select **New registration**.
The above command results in the output similar to the screenshot below:
## Configure your Azure Cosmos DB account to use the new identity
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cosmos DB account.
You can associate the certificate-based credential with the client application i
In the Azure app registration for the client application:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Open the Azure **Active Directory** pane, go to the **App registrations** pane, and open the sample app you created in the previous step.
cosmos-db Kafka Connector Sink https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/kafka-connector-sink.md
curl -H "Content-Type: application/json" -X POST -d @<path-to-JSON-config-file>
## Confirm data written to Azure Cosmos DB
-Sign into the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account. Check that the three records from the "hotels" topic are created in your account.
+Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account. Check that the three records from the "hotels" topic are created in your account.
## Cleanup
You can learn more about change feed in Azure Cosmos DB with the following docs:
* [Reading from change feed](read-change-feed.md) You can learn more about bulk operations in V4 Java SDK with the following docs:
-* [Perform bulk operations on Azure Cosmos DB data](./bulk-executor-java.md)
+* [Perform bulk operations on Azure Cosmos DB data](./bulk-executor-java.md)
cosmos-db Kafka Connector Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/kafka-connector-source.md
curl -H "Content-Type: application/json" -X POST -d @<path-to-JSON-config-file>
## Insert document into Azure Cosmos DB
-1. Sign into the [Azure portal](https://portal.azure.com/learn.docs.microsoft.com) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com/learn.docs.microsoft.com) and navigate to your Azure Cosmos DB account.
1. Open the **Data Explorer** tab and select **Databases**.
1. Open the "kafkaconnect" database and "kafka" container you created earlier.
1. To create a new JSON document, in the API for NoSQL pane, expand the "kafka" container, select **Items**, then select **New Item** in the toolbar.
cosmos-db Migrate Data Striim https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/migrate-data-striim.md
This article shows how to use Striim to migrate data from an **Oracle database**
## Deploy the Striim marketplace solution
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Create a resource** and search for **Striim** in the Azure marketplace. Select the first option and **Create**.
cosmos-db Powerbi Visualize https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/powerbi-visualize.md
You can enable Azure Synapse Link on your existing Azure Cosmos DB containers an
To build a Power BI report/dashboard:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. From the **Integrations** section, open the **Power BI** pane and select **Get started**.
cosmos-db Iif https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/iif.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns an expression, which could be of any type.
This first example evaluates a static boolean expression and returns one of two potential expressions.
-```sql
-SELECT VALUE {
- evalTrue: IIF(true, 123, 456),
- evalFalse: IIF(false, 123, 456),
- evalNumberNotTrue: IIF(123, 123, 456),
- evalStringNotTrue: IIF("ABC", 123, 456),
- evalArrayNotTrue: IIF([1,2,3], 123, 456),
- evalObjectNotTrue: IIF({"name": "Alice", "age": 20}, 123, 456)
-}
-```
-```json
-[
- {
- "evalTrue": 123,
- "evalFalse": 456,
- "evalNumberNotTrue": 456,
- "evalStringNotTrue": 456,
- "evalArrayNotTrue": 456,
- "evalObjectNotTrue": 456
- }
-]
-```
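The first example's behavior — only a boolean `true` selects the first branch, while numbers, strings, arrays, and objects all fall through to the second — can be sketched in Python (`iif` here is a hypothetical helper mirroring the documented semantics, not an SDK API):

```python
def iif(condition, if_true, if_false):
    # Only a boolean true selects the first branch; any non-boolean value
    # (numbers, strings, arrays, objects) selects the second.
    return if_true if condition is True else if_false

print(iif(True, 123, 456))   # boolean true: first branch
print(iif("ABC", 123, 456))  # non-boolean: second branch
```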
This example evaluates one of two potential expressions on multiple items in a container based on an expression that evaluates a boolean property.
-```json
-[
- {
- "id": "68719519221",
- "name": "Estrel Set Cutlery",
- "onSale": true,
- "pricing": {
- "msrp": 55.95,
- "sale": 30.85
- }
- },
- {
- "id": "68719520367",
- "name": "Willagno Spork",
- "onSale": false,
- "pricing": {
- "msrp": 20.15,
- "sale": 12.55
- }
- }
-]
-```
-```sql
-SELECT
- p.name,
- IIF(p.onSale, p.pricing.sale, p.pricing.msrp) AS price
-FROM
- products p
-```
+The query uses fields in the original items.
-```json
-[
- {
- "name": "Estrel Set Cutlery",
- "price": 30.85
- },
- {
- "name": "Willagno Spork",
- "price": 20.15
- }
-]
-```
+ ## Remarks
FROM
## See also - [System functions](system-functions.yml)-- [Equality and comparison operators](equality-comparison-operators.md)
+- [`ToString`](tostring.md)
cosmos-db Index Of https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/index-of.md
Title: INDEX_OF in Azure Cosmos DB query language
-description: Learn about SQL system function INDEX_OF in Azure Cosmos DB.
-
+ Title: INDEX_OF
+
+description: An Azure Cosmos DB for NoSQL system function that returns the index of the first occurrence of a string.
Previously updated : 08/30/2022 Last updated : 07/20/2023
-# INDEX_OF (Azure Cosmos DB)
+# INDEX_OF (NoSQL query)
[!INCLUDE[NoSQL](../../includes/appliesto-nosql.md)]
-Returns the starting position of the first occurrence of the second string expression within the first specified string expression, or `-1` if the string isn't found.
+Returns the starting index of the first occurrence of a substring expression within a specified string expression.
## Syntax ```sql
-INDEX_OF(<str_expr1>, <str_expr2> [, <numeric_expr>])
+INDEX_OF(<string_expr_1>, <string_expr_2> [, <numeric_expr>])
``` ## Arguments
-*str_expr1*
- Is the string expression to be searched.
-
-*str_expr2*
- Is the string expression to search for.
-
-*numeric_expr*
- Optional numeric expression that sets the position the search will start. The first position in *str_expr1* is 0.
+| | Description |
+| | |
+| **`string_expr_1`** | A string expression that is the target of the search. |
+| **`string_expr_2`** | A string expression with the substring that is the source of the search (or to search for). |
+| **`numeric_expr` *(Optional)*** | An optional numeric expression that indicates where, in `string_expr_1`, to start the search. If not specified, the default value is `0`. |
## Return types
Returns a numeric expression.
## Examples
-The following example returns the index of various substrings inside "abc".
-
-```sql
-SELECT
- INDEX_OF("abc", "ab") AS index_of_prefix,
- INDEX_OF("abc", "b") AS index_of_middle,
- INDEX_OF("abc", "c") AS index_of_last,
- INDEX_OF("abc", "d") AS index_of_missing
-```
+The following example returns the index of various substrings inside the larger string **"AdventureWorks"**.
-Here's the result set.
-```json
-[
- {
- "index_of_prefix": 0,
- "index_of_middle": 1,
- "index_of_last": 2,
- "index_of_missing": -1
- }
-]
-```
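As a quick sanity check of the semantics described above (zero-based index, `-1` when the substring is absent, optional start position), here's a Python sketch — `index_of` is a hypothetical helper, not part of any SDK:

```python
def index_of(target, search, start=0):
    # Zero-based index of the first occurrence at or after `start`,
    # or -1 if the substring is not found.
    return target.find(search, start)

print(index_of("AdventureWorks", "Works"))  # 9
```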
## Next steps - [System functions Azure Cosmos DB](system-functions.yml)-- [Introduction to Azure Cosmos DB](../../introduction.md)
+- [`SUBSTRING`](substring.md)
cosmos-db Left https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/left.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns a string expression.
## Examples The following example returns the left part of the string `Microsoft` for various length values.
-
-```sql
-SELECT VALUE {
- firstZero: LEFT("AdventureWorks", 0),
- firstOne: LEFT("AdventureWorks", 1),
- firstFive: LEFT("AdventureWorks", 5),
- fullLength: LEFT("AdventureWorks", LENGTH("AdventureWorks")),
- beyondMaxLength: LEFT("AdventureWorks", 100)
-}
-```
-
-```json
-[
- {
- "firstZero": "",
- "firstOne": "A",
- "firstFive": "Adven",
- "fullLength": "AdventureWorks",
- "beyondMaxLength": "AdventureWorks"
- }
-]
-```
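The behavior in the example — the first `length` characters, with a length past the end of the string returning the whole string — can be sketched in Python (`left` is a hypothetical helper):

```python
def left(string, length):
    # First `length` characters; a length beyond the string's end
    # returns the whole string, and zero returns an empty string.
    return string[:max(length, 0)]

print(left("AdventureWorks", 5))  # "Adven"
```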
++ ## Remarks -- This system function benefits from a [range index](../../index-policy.md#includeexclude-strategy).
+- This function benefits from a [range index](../../index-policy.md#includeexclude-strategy).
## Next steps
cosmos-db Length https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/length.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns a numeric expression.
## Examples The following example returns the length of a static string.
-
-```sql
-SELECT VALUE {
- stringValue: LENGTH("AdventureWorks"),
- emptyString: LENGTH(""),
- nullValue: LENGTH(null),
- numberValue: LENGTH(0),
- arrayValue: LENGTH(["Adventure", "Works"])
-}
-```
-
-```json
-[
- {
- "stringValue": 14,
- "emptyString": 0
- }
-]
-```
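The example's output shows values only for the string inputs — `null`, numbers, and arrays yield undefined. A Python sketch of that behavior (using `None` for undefined; `length` is a hypothetical helper):

```python
def length(value):
    # Defined only for strings; non-string inputs yield undefined
    # (modeled here as None).
    return len(value) if isinstance(value, str) else None

print(length("AdventureWorks"))  # 14
```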
++ ## Remarks
cosmos-db Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/log.md
Title: LOG in Azure Cosmos DB query language
-description: Learn about the LOG SQL system function in Azure Cosmos DB to return the natural logarithm of the specified numeric expression
-
+ Title: LOG
+description: An Azure Cosmos DB for NoSQL system function that returns the natural logarithm of the specified numeric expression
Previously updated : 09/13/2019 Last updated : 07/20/2023
-# LOG (Azure Cosmos DB)
+
+# LOG (NoSQL query)
+ [!INCLUDE[NoSQL](../../includes/appliesto-nosql.md)]
- Returns the natural logarithm of the specified numeric expression.
-
+Returns the natural logarithm of the specified numeric expression.
+ ## Syntax
-
+ ```sql
-LOG (<numeric_expr> [, <base>])
+LOG(<numeric_expr> [, <numeric_base>])
```
-
+ ## Arguments
-
-*numeric_expr*
- Is a numeric expression.
-
-*base*
- Optional numeric argument that sets the base for the logarithm.
-
+
+| | Description |
+| | |
+| **`numeric_expr`** | A numeric expression. |
+| **`numeric_base` *(Optional)*** | An optional numeric value that sets the base for the logarithm. If not set, the default base is *e*, approximately equal to `2.718281828`. |
+ ## Return types
-
- Returns a numeric expression.
-
-## Remarks
-
- By default, `LOG()` returns the natural logarithm. You can change the base of the logarithm to another value by using the optional base parameter.
-
- The natural logarithm is the logarithm to the base **e**, where **e** is an irrational constant approximately equal to *2.718281828*.
-
- The natural logarithm of the exponential of a number is the number itself: `LOG( EXP( n ) ) = n`. And the exponential of the natural logarithm of a number is the number itself: `EXP( LOG( n ) ) = n`.
-
- This system function won't utilize the index.
-
+
+Returns a numeric expression.
+ ## Examples
-
- The following example declares a variable and returns the logarithm value of the specified variable (10).
-
-```sql
-SELECT LOG(10) AS log
-```
-
- Here's the result set.
-
-```json
-[{log: 2.3025850929940459}]
-```
-
- The following example calculates the `LOG` for the exponent of a number.
-
-```sql
-SELECT EXP(LOG(10)) AS expLog
-```
-
- Here's the result set.
-
-```json
-[{expLog: 10.000000000000002}]
-```
+
+The following example returns the logarithm value of various values.
+++
+## Remarks
+
+- This function doesn't use the index.
+- The natural logarithm of the exponential of a number is the number itself: `LOG( EXP( n ) ) = n`. And the exponential of the natural logarithm of a number is the number itself: `EXP( LOG( n ) ) = n`.
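The natural-log default, optional base, and the `LOG(EXP(n)) = n` identity noted in the remarks can be checked with a small Python sketch (`log` is a hypothetical helper mirroring the documented signature):

```python
import math

def log(numeric_expr, base=math.e):
    # Natural logarithm by default; an optional second argument
    # changes the base.
    return math.log(numeric_expr, base)

print(log(math.exp(5)))  # the identity LOG(EXP(n)) = n
```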
## Next steps - [System functions Azure Cosmos DB](system-functions.yml)-- [Introduction to Azure Cosmos DB](../../introduction.md)
+- [`LOG10`](log10.md)
cosmos-db Log10 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/log10.md
Title: LOG10 in Azure Cosmos DB query language
-description: Learn about the LOG10 SQL system function in Azure Cosmos DB to return the base-10 logarithm of the specified numeric expression
-
+ Title: LOG10
+description: An Azure Cosmos DB for NoSQL system function that returns the base-10 logarithm of the specified numeric expression
Previously updated : 09/13/2019 Last updated : 07/20/2023
-# LOG10 (Azure Cosmos DB)
+
+# LOG10 (NoSQL query)
+ [!INCLUDE[NoSQL](../../includes/appliesto-nosql.md)]
- Returns the base-10 logarithm of the specified numeric expression.
+Returns the base-10 logarithm of the specified numeric expression.
## Syntax
-
+ ```sql
-LOG10 (<numeric_expr>)
+LOG10(<numeric_expr>)
```
-
+ ## Arguments
-
-*numeric_expression*
- Is a numeric expression.
-
+
+| | Description |
+| | |
+| **`numeric_expr`** | A numeric expression. |
+ ## Return types
-
- Returns a numeric expression.
-
-## Remarks
-
- The LOG10 and POWER functions are inversely related to one another. For example, 10 ^ LOG10(n) = n. This system function will not utilize the index.
-
+
+Returns a numeric expression.
+ ## Examples
-
- The following example declares a variable and returns the LOG10 value of the specified variable (100).
-
-```sql
-SELECT LOG10(100) AS log10
-```
-
- Here is the result set.
-
-```json
-[{log10: 2}]
-```
+
+The following example returns the logarithm value of various values.
+++
+## Remarks
+
+- This function doesn't use the index.
+- The `LOG10` and `POWER` functions are inversely related to one another.
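The inverse relationship between `LOG10` and `POWER` noted above can be checked with a short Python sketch (`log10` is a hypothetical helper):

```python
import math

def log10(numeric_expr):
    # Base-10 logarithm; inverse of raising 10 to a power,
    # so 10 ** log10(n) recovers n.
    return math.log10(numeric_expr)

print(log10(100))  # 2.0
```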
## Next steps - [System functions Azure Cosmos DB](system-functions.yml)-- [Introduction to Azure Cosmos DB](../../introduction.md)
+- [`LOG`](log.md)
cosmos-db Lower https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/lower.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns a string expression.
The following example shows how to use the function to modify various strings.
-```sql
-SELECT VALUE {
- lowercase: LOWER("adventureworks"),
- uppercase: LOWER("ADVENTUREWORKS"),
- camelCase: LOWER("adventureWorks"),
- pascalCase: LOWER("AdventureWorks"),
- upperSnakeCase: LOWER("ADVENTURE_WORKS")
-}
-```
-
-```json
-[
- {
- "lowercase": "adventureworks",
- "uppercase": "adventureworks",
- "camelCase": "adventureworks",
- "pascalCase": "adventureworks",
- "upperSnakeCase": "adventure_works"
- }
-]
-```
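The example's behavior — every casing variant collapses to lowercase — maps directly onto a one-line Python sketch (`lower` is a hypothetical helper):

```python
def lower(string_expr):
    # Lowercases every character; handy for case-insensitive comparison,
    # though the doc recommends normalizing casing at ingest instead.
    return string_expr.lower()

print(lower("AdventureWorks"))  # "adventureworks"
```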
+ ## Remarks -- This system function doesn't use the index.
+- This function doesn't use the index.
- If you plan to do frequent case insensitive comparisons, this function may consume a significant number of RUs. Consider normalizing the casing of strings when ingesting your data. Then a query like `SELECT * FROM c WHERE LOWER(c.name) = 'USERNAME'` is simplified to `SELECT * FROM c WHERE c.name = 'USERNAME'`. ## Next steps
cosmos-db Ltrim https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/ltrim.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns a string expression.
The following example shows how to use this function with various parameters inside a query.
-```sql
-SELECT VALUE {
- whitespaceStart: LTRIM(" AdventureWorks"),
- whitespaceStartEnd: LTRIM(" AdventureWorks "),
- whitespaceEnd: LTRIM("AdventureWorks "),
- noWhitespace: LTRIM("AdventureWorks"),
- trimSuffix: LTRIM("AdventureWorks", "Works"),
- trimPrefix: LTRIM("AdventureWorks", "Adventure"),
- trimEntireTerm: LTRIM("AdventureWorks", "AdventureWorks"),
- trimEmptyString: LTRIM("AdventureWorks", "")
-}
-```
-
-```json
-[
- {
- "whitespaceStart": "AdventureWorks",
- "whitespaceStartEnd": "AdventureWorks ",
- "whitespaceEnd": "AdventureWorks ",
- "noWhitespace": "AdventureWorks",
- "trimSuffix": "AdventureWorks",
- "trimPrefix": "Works",
- "trimEntireTerm": "",
- "trimEmptyString": "AdventureWorks"
- }
-]
-```
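The example shows two modes: with no second argument, leading whitespace is trimmed; with one, a leading occurrence of that substring is removed, and an empty trim string is a no-op. A Python sketch of that behavior (`ltrim` is a hypothetical helper; whether repeated leading occurrences are also removed isn't shown in the sample, so this removes a single one):

```python
def ltrim(string_expr, trim_expr=None):
    # No second argument: trim leading whitespace. Otherwise remove one
    # leading occurrence of trim_expr; an empty trim string is a no-op.
    if trim_expr is None:
        return string_expr.lstrip()
    if trim_expr and string_expr.startswith(trim_expr):
        return string_expr[len(trim_expr):]
    return string_expr

print(ltrim("AdventureWorks", "Adventure"))  # "Works"
```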
+ ## Remarks -- This system function doesn't use the index.
+- This function doesn't use the index.
## Next steps
cosmos-db Numberbin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/numberbin.md
Previously updated : 07/01/2023 Last updated : 07/20/2023
Returns a numeric value.
This first example bins a single static number with various bin sizes.
-```sql
-SELECT VALUE {
- roundToNegativeHundreds: NumberBin(37.752, -100),
- roundToTens: NumberBin(37.752, 10),
- roundToOnes: NumberBin(37.752, 1),
- roundToZeroes: NumberBin(37.752, 0),
- roundToOneTenths: NumberBin(37.752, 0.1),
- roundToOneHundreds: NumberBin(37.752, 0.01)
-}
-```
-```json
-[
- {
- "roundToNegativeHundreds": 100,
- "roundToTens": 30,
- "roundToOnes": 37,
- "roundToOneTenths": 37.7,
- "roundToOneHundreds": 37.75
- }
-]
-```
-This next example uses a value from an existing item and rounds that value using the function.
+This next example uses a field from an existing item.
-```json
-{
- "name": "Ignis Cooking System",
- "price": 155.23478
-}
-```
-```sql
-SELECT
- p.name,
- NumberBin(p.price, 0.01) AS price
-FROM
- products p
-```
+This query rounds the previous field using the function.
-```json
-[
- {
- "name": "Ignis Cooking System",
- "price": 155.23
- }
-]
-```
+ ## Remarks -- This function returns **undefined** if the specified bin size is `0`.
+- This function returns `undefined` if the specified bin size is `0`.
- The default bin size is `1`. This bin size effectively returns a numeric value rounded to the next integer. ## See also
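The bin sizes in the examples above behave like flooring to the nearest multiple of the bin size; a minimal Python approximation of that behavior (a sketch, not the engine's implementation):

```python
import math

def number_bin(value: float, bin_size: float = 1):
    """Approximate NumberBin: floor value to the nearest multiple of bin_size.
    Returns None (standing in for undefined) when bin_size is 0."""
    if bin_size == 0:
        return None
    return math.floor(value / bin_size) * bin_size

print(number_bin(37.752, 10))    # 30
print(number_bin(37.752, -100))  # 100
print(number_bin(37.752, 0))     # None (undefined)
```

Note how a negative bin size still produces a multiple of the bin, which matches the `roundToNegativeHundreds` result in the example output.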
cosmos-db Right https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/query/right.md
Title: RIGHT in Azure Cosmos DB query language
-description: Learn about SQL system function RIGHT in Azure Cosmos DB.
-
+ Title: RIGHT
+
+description: An Azure Cosmos DB for NoSQL system function that returns a substring from the right side of a string.
+++ - Previously updated : 03/03/2020--+ Last updated : 07/20/2023+
-# RIGHT (Azure Cosmos DB)
+
+# RIGHT (NoSQL query)
+ [!INCLUDE[NoSQL](../../includes/appliesto-nosql.md)]
- Returns the right part of a string with the specified number of characters.
+Returns the right part of a string up to the specified number of characters.
## Syntax ```sql
-RIGHT(<str_expr>, <num_expr>)
+RIGHT(<string_expr>, <numeric_expr>)
``` ## Arguments
-*str_expr*
- Is the string expression to extract characters from.
-
-*num_expr*
- Is a numeric expression which specifies the number of characters.
+| | Description |
+| | |
+| **`string_expr`** | A string expression. |
+| **`numeric_expr`** | A numeric expression specifying the number of characters to extract from `string_expr`. |
## Return types
- Returns a string expression.
+Returns a string expression.
## Examples
- The following example returns the right part of "abc" for various length values.
-
-```sql
-SELECT RIGHT("abc", 1) AS r1, RIGHT("abc", 2) AS r2
-```
-
- Here is the result set.
-
-```json
-[{"r1": "c", "r2": "bc"}]
-```
+The following example returns the right part of the string `Microsoft` for various length values.
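The behavior described can be sketched outside the database; a minimal Python approximation of `RIGHT` (an illustration under the stated semantics, not the engine's implementation):

```python
def right(string_expr: str, numeric_expr: int) -> str:
    """Approximate RIGHT: return up to numeric_expr characters from the end of the string."""
    if numeric_expr <= 0:
        return ""
    return string_expr[-numeric_expr:]

print(right("Microsoft", 4))   # soft
print(right("Microsoft", 20))  # Microsoft
```

When the requested length exceeds the string's length, the whole string is returned.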
++ ## Remarks
-This system function will not utilize the index.
+- This function benefits from a [range index](../../index-policy.md#includeexclude-strategy).
## Next steps - [System functions Azure Cosmos DB](system-functions.yml)-- [Introduction to Azure Cosmos DB](../../introduction.md)
+- [`LEFT`](left.md)
cosmos-db Sdk Java V4 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/nosql/sdk-java-v4.md
Release history is maintained in the azure-sdk-for-java repo, for detailed list
## Recommended version
-It's strongly recommended to use version 4.37.1 and above.
+It's strongly recommended to use version 4.48.0 and above.
## FAQ [!INCLUDE [cosmos-db-sdk-faq](../includes/cosmos-db-sdk-faq.md)]
cosmos-db Periodic Backup Modify Interval Retention https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/periodic-backup-modify-interval-retention.md
Use the following steps to change the default backup options for an existing Azu
### [Azure portal](#tab/azure-portal)
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cosmos DB account and open the **Backup & Restore** pane. Update the backup interval and the backup retention period as required.
cosmos-db Periodic Backup Request Data Restore https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/periodic-backup-request-data-restore.md
After the restore operation completes, you may want to know the source account d
Use the following steps to get the restore details from the Azure portal:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to the restored account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to the restored account.
1. Open the **Tags** page.
cosmos-db Periodic Backup Update Storage Redundancy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/periodic-backup-update-storage-redundancy.md
Use the following steps to update backup storage redundancy.
### [Azure portal](#tab/azure-portal)
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. Open the **Backup & Restore** pane, update the backup storage redundancy, and select **Submit**. It takes a few minutes for the operation to complete.
cosmos-db Plan Manage Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/plan-manage-costs.md
When you use cost analysis, you can view the Azure Cosmos DB costs in graphs and
To view Azure Cosmos DB costs in cost analysis:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Open the scope in the Azure portal and select **Cost analysis** in the menu. For example, go to **Subscriptions**, select a subscription from the list, and then select **Cost analysis** in the menu. Select **Scope** to switch to a different scope in cost analysis.
cosmos-db Reserved Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/reserved-capacity.md
The size of the reserved capacity purchase should be based on the total amount o
We calculate purchase recommendations based on your hourly usage pattern. Usage over the last 7, 30, and 60 days is analyzed, and the reserved capacity purchase that maximizes your savings is recommended. You can view recommended reservation sizes in the Azure portal using the following steps:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **All services** > **Reservations** > **Add**.
This recommendation to purchase a 30,000 RU/s reservation indicates that, among
## Buy Azure Cosmos DB reserved capacity
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **All services** > **Reservations** > **Add**.
You can cancel, exchange, or refund reservations with certain limitations. For m
## Exceeding reserved capacity
-When you reserve capacity for your Azure Cosmos DB resources, you are reserving [provisioned thorughput](set-throughput.md). If the provisioned throughput is exceeded, requests beyond that provisioning will be billed using pay-as-you go rates. For more information on reservations, see the [Azure reservations](../cost-management-billing/reservations/save-compute-costs-reservations.md) article. For more information on provisioned throughput, see [provisioned throughput types](how-to-choose-offer.md#overview-of-provisioned-throughput-types).
+When you reserve capacity for your Azure Cosmos DB resources, you are reserving [provisioned throughput](set-throughput.md). If the provisioned throughput is exceeded, requests beyond that provisioning will be billed using pay-as-you-go rates. For more information on reservations, see the [Azure reservations](../cost-management-billing/reservations/save-compute-costs-reservations.md) article. For more information on provisioned throughput, see [provisioned throughput types](how-to-choose-offer.md#overview-of-provisioned-throughput-types).
## Next steps
cosmos-db Restore Account Continuous Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/restore-account-continuous-backup.md
Azure Cosmos DB's point-in-time restore feature helps you to recover from an accidental change within a container, to restore a deleted account, database, or container, or to restore into any region (where backups existed). The continuous backup mode allows you to restore to any point in time within the last 30 days.
-This article describes how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows restore the account using [Azure portal](#restore-account-portal), [PowerShell](#restore-account-powershell), [CLI](#restore-account-cli), or an [Azure Resource Manager template](#restore-arm-template).
+This article describes how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows how to restore the account using the [Azure portal](#restore-account-portal), [PowerShell](#restore-account-powershell), [CLI](#restore-account-cli), or an [Azure Resource Manager template](#restore-arm-template).
This article describes how to identify the restore time and restore a live or de
You can use the Azure portal to restore an entire live account or selected databases and containers under it. Use the following steps to restore your data:
-1. Sign into the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Cosmos DB account and open the **Point In Time Restore** blade. > [!NOTE]
For example, if you want to restore to the point before a certain container was
You can use the Azure portal to completely restore a deleted account within 30 days of its deletion. Use the following steps to restore a deleted account:
-1. Sign into the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search for *Azure Cosmos DB* resources in the global search bar. It lists all your existing accounts. 1. Next select the **Restore** button. The Restore blade displays a list of deleted accounts that can be restored within the retention period, which is 30 days from deletion time. 1. Choose the account that you want to restore.
After the restore operation completes, you may want to know the source account d
Use the following steps to get the restore details from the Azure portal:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to the restored account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to the restored account.
1. Navigate to the **Export template** pane. It opens a JSON template, corresponding to the restored account.
az deployment group create -g <ResourceGroup> --template-file <RestoreTemplateFi
## Next steps
-* Provision continuous backup using [Azure portal](provision-account-continuous-backup.md#provision-portal), [PowerShell](provision-account-continuous-backup.md#provision-powershell), [CLI](provision-account-continuous-backup.md#provision-cli), or [Azure Resource Manager](provision-account-continuous-backup.md#provision-arm-template).
+* Provision continuous backup using the [Azure portal](provision-account-continuous-backup.md#provision-portal), [PowerShell](provision-account-continuous-backup.md#provision-powershell), [CLI](provision-account-continuous-backup.md#provision-cli), or [Azure Resource Manager](provision-account-continuous-backup.md#provision-arm-template).
* [How to migrate to an account from periodic backup to continuous backup](migrate-continuous-backup.md). * [Continuous backup mode resource model.](continuous-backup-restore-resource-model.md) * [Manage permissions](continuous-backup-restore-permissions.md) required to restore data with continuous backup mode.--
cosmos-db Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Cosmos DB description: Lists Azure Policy Regulatory Compliance controls available for Azure Cosmos DB. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
cosmos-db Use Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/use-metrics.md
This article walks through common use cases and how Azure Cosmos DB insights can
## View insights from Azure portal
-1. Sign into [Azure portal](https://portal.azure.com/) and navigate to your Azure Cosmos DB account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to your Azure Cosmos DB account.
1. You can view your account metrics either from the **Metrics** pane or the **Insights** pane.
The Metadata Request by Status Code graph above aggregates requests at increasin
You might want to learn more about improving database performance by reading the following articles: * [Measure Azure Cosmos DB for NoSQL performance with a benchmarking framework](performance-testing.md)
-* [Performance tips for Azure Cosmos DB and .NET SDK v2](performance-tips.md)
+* [Performance tips for Azure Cosmos DB and .NET SDK v2](performance-tips.md)
cost-management-billing Direct Ea Administration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/direct-ea-administration.md
Title: EA Billing administration on the Azure portal
description: This article explains the common tasks that an enterprise administrator accomplishes in the Azure portal. Previously updated : 07/07/2023 Last updated : 07/21/2023
This article explains the common tasks that an Enterprise Agreement (EA) adminis
> [!NOTE] > We recommend that both direct and indirect EA Azure customers use Cost Management + Billing in the Azure portal to manage their enrollment and billing instead of using the EA portal. For more information about enrollment management in the Azure portal, see [Get started with EA billing in the Azure portal](ea-direct-portal-get-started.md). >
-> As of February 20, 2023 indirect EA customers won't be able to manage their billing account in the EA portal. Instead, they must use the Azure portal.
+> As of February 20, 2023 indirect EA customers no longer manage their billing account in the EA portal. Instead, they use the Azure portal.
>
-> This change doesn't affect Azure Government EA enrollments. They continue using the EA portal to manage their enrollment.
->
-> As of August 14, 2023 EA customers won't be able to manage their Azure Government EA enrollments from [Azure portal](https://portal.azure.com) instead they can manage it from [Azure Government portal](https://portal.azure.us).
+> Until August 14, 2023, this change doesn't affect customers with Azure Government EA enrollments. They continue using the EA portal to manage their enrollment until then. However, after August 14, 2023, EA customers won't be able to manage their Azure Government EA enrollments from the [Azure portal](https://portal.azure.com). Instead, they can manage it from the Azure Government portal at [https://portal.azure.us](https://portal.azure.us). The functionality described in this article is the same in the Azure Government portal.
## Manage your enrollment
After account ownership is confirmed, you can create subscriptions and purchase
If you're a new EA account owner with a .onmicrosoft.com account, you might not have a forwarding email address by default. In that situation, you might not receive the activation email. If this situation applies to you, use the following steps to activate your account ownership.
-1. Sign into the [Azure portal](https://portal.azure.com/#blade/Microsoft_Azure_GTM/ModernBillingMenuBlade/AllBillingScopes).
+1. Sign in to the [Azure portal](https://portal.azure.com/#blade/Microsoft_Azure_GTM/ModernBillingMenuBlade/AllBillingScopes).
1. Navigate to **Cost Management + Billing** and select a billing scope. 1. Select your account. 1. In the left menu under **Settings**, select **Activate Account**. 1. On the Activate Account page, select **Yes, I wish to continue** and the select **Activate this account**. :::image type="content" source="./media/direct-ea-administration/activate-account.png" alt-text="Screenshot showing the Activate Account page for onmicrosoft.com accounts." lightbox="./media/direct-ea-administration/activate-account.png" ::: 1. After the activation process completes, copy and paste the following link to your browser. The page opens and creates a subscription that's associated with your enrollment.
- `https://signup.azure.com/signup?offer=MS-AZR-0017P&appId=IbizaCatalogBlade`
+ - For Azure global, the URL is `https://signup.azure.com/signup?offer=MS-AZR-0017P&appId=IbizaCatalogBlade`.
+ - For Azure Government, the URL is `https://signup.azure.us/signup?offer=MS-AZR-0017P&appId=IbizaCatalogBlade`.
## Change Azure subscription or account ownership
cost-management-billing Link Partner Id Power Apps Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/link-partner-id-power-apps-accounts.md
Perform PAL Association on this Service Account.
To use the Azure portal to link to a new partner ID:
-1. Go to [Link to a partner ID](https://portal.azure.com/#blade/Microsoft_Azure_Billing/managementpartnerblade) in the Azure portal.
-2. Sign in to the Azure portal
-3. Enter the [Microsoft Cloud Partner Program](https://partner.microsoft.com/) ID for your organization. Be sure to use the **Associated Partner ID** shown on your partner center profile. It's typically known as your [partner location ID](/partner-center/account-structure).
+1. Go to [Link to a partner ID](https://portal.azure.com/#blade/Microsoft_Azure_Billing/managementpartnerblade) in the Azure portal and sign in.
+1. Enter the [Microsoft Cloud Partner Program](https://partner.microsoft.com/) ID for your organization. Be sure to use the **Associated Partner ID** shown on your partner center profile. It's typically known as your [partner location ID](/partner-center/account-structure).
:::image type="content" source="./media/link-partner-id-power-apps-accounts/link-partner-id.png" alt-text="Screenshot showing the Link to a partner ID window." lightbox="./media/link-partner-id-power-apps-accounts/link-partner-id.png" ::: > [!NOTE]
cost-management-billing Open Banking Strong Customer Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/open-banking-strong-customer-authentication.md
The following sections describe how to complete multi-factor authentication in t
You can change the active payment method of your Azure account by following these steps:
-1. Sign into the [Azure portal](https://portal.azure.com) as the Account Administrator and navigate to **Cost Management + Billing**.
+1. Sign in to the [Azure portal](https://portal.azure.com) as the Account Administrator and navigate to **Cost Management + Billing**.
2. In the **Overview** page, select the corresponding subscription from the **My subscriptions** grid. 3. Under 'Billing', select **Payment methods**. You can add a new credit card or set an existing card as the active payment method for the subscription. If your bank requires multi-factor authentication, you're prompted to complete an authentication challenge during the process.
For more details, see [Add, update, or remove a credit card for Azure](change-cr
If your bank rejects the charges, your Azure account status will change to **Past due** in the Azure portal. You can check the status of your account by following these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com/) as the Account Administrator.
+1. Sign in to the [Azure portal](https://portal.azure.com) as the Account Administrator.
2. Search on **Cost Management + Billing.** 3. On the **Cost Management + Billing** **Overview** page, review the status column in the **My subscriptions** grid. 4. If your subscription is labeled **Past due**, select **Settle balance**. You're prompted to complete multi-factor authentication during the process.
If your bank rejects the charges, your Azure account status will change to **Pas
Marketplace and reservation purchases are billed separately from Azure services. If your bank rejects the Marketplace or reservation charges, your invoice will become past due and you'll see the option to **Pay now** in the Azure portal. You can pay for past due Marketplace and reservation invoices by following these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com/) as the Account Administrator.
+1. Sign in to the [Azure portal](https://portal.azure.com) as the Account Administrator.
2. Search on **Cost Management + Billing.** 3. Under 'Billing', select **Invoices**. 5. In the subscription drop-down filter, select the subscription associated with your Marketplace or reservation purchase.
cost-management-billing Troubleshoot Sign In Issue https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/troubleshoot-sign-in-issue.md
If your internet browser page hangs, try each of the following steps until you c
- Use a different internet browser. - Use the private browsing mode for your browser:
- - **Edge:** Open **Settings** (the three dots by your profile picture), select **New InPrivate window**, and then browse and sign in to the [Azure portal](https://portal.azure.com/).
+ - **Edge:** Open **Settings** (the three dots by your profile picture), select **New InPrivate window**, and then browse and sign in to the [Azure portal](https://portal.azure.com).
- **Chrome:** Choose **Incognito** mode. - **Safari:** Choose **File**, then **New Private Window**.
To resolve the issue, try one of the following methods:
- **Chrome:** Choose **Settings** and select **Clear browsing data** under **Privacy and Security**. - Reset your browser settings to defaults. - Use the private browsing mode for your browser.
- - **Edge:** Open **Settings** (the three dots by your profile picture), select **New InPrivate window**, and then browse and sign in to the [Azure portal](https://portal.azure.com/).
+ - **Edge:** Open **Settings** (the three dots by your profile picture), select **New InPrivate window**, and then browse and sign in to the [Azure portal](https://portal.azure.com).
- **Chrome:** Choose **Incognito** mode. - **Safari:** Choose **File**, then **New Private Window**.
To resolve the issue, try one of the following methods:
This problem occurs if you selected the wrong directory, or if your account doesn't have sufficient permissions.
-**Scenario:** You receive the error signing into the [Azure portal](https://portal.azure.com/)
+**Scenario:** You receive the error signing into the [Azure portal](https://portal.azure.com).
To fix this issue:
Other troubleshooting articles for Azure Billing and Subscriptions
## Contact us for help
-If you have questions or need help but can't sign into the Azure portal, [create a support request](https://support.microsoft.com/oas/?prid=15470).
+If you have questions or need help but can't sign in to the Azure portal, [create a support request](https://support.microsoft.com/oas/?prid=15470).
cost-management-billing Understand Ea Roles https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/understand-ea-roles.md
The following administrative user roles are part of your enterprise enrollment:
Use the Cost Management blade in the [Azure portal](https://portal.azure.com) to manage Azure Enterprise Agreement roles.
-Direct EA customers can complete all administrative tasks in the Azure portal. You can use the [Azure Portal](https://portal.azure.com) to manage billing, costs, and Azure services.
+Direct EA customers can complete all administrative tasks in the Azure portal. You can use the [Azure portal](https://portal.azure.com) to manage billing, costs, and Azure services.
User roles are associated with a user account. To validate user authenticity, each user must have a valid work, school, or Microsoft account. Ensure that each account is associated with an email address that's actively monitored. Enrollment notifications are sent to the email address.
The following sections describe the limitations and capabilities of each role.
## Add a new enterprise administrator
-Enterprise administrators have the most privileges when managing an Azure EA enrollment. The initial Azure EA admin was created when the EA agreement was set up. However, you can add or remove new admins at any time. New admins are only added by existing admins. For more information about adding additional enterprise admins, see [Create another enterprise admin](ea-portal-administration.md#create-another-enterprise-administrator). Direct EA customers can use the Azure portal to add EA admins, see [Create another enterprise admin on Azure Portal](direct-ea-administration.md#add-another-enterprise-administrator). For more information about billing profile roles and tasks, see [Billing profile roles and tasks](understand-mca-roles.md#billing-profile-roles-and-tasks).
+Enterprise administrators have the most privileges when managing an Azure EA enrollment. The initial Azure EA admin was created when the EA agreement was set up. However, you can add or remove new admins at any time. New admins are only added by existing admins. For more information about adding additional enterprise admins, see [Create another enterprise admin](ea-portal-administration.md#create-another-enterprise-administrator). Direct EA customers can use the Azure portal to add EA admins, see [Create another enterprise admin on Azure portal](direct-ea-administration.md#add-another-enterprise-administrator). For more information about billing profile roles and tasks, see [Billing profile roles and tasks](understand-mca-roles.md#billing-profile-roles-and-tasks).
## Update account owner state from pending to active
Enterprise administrators have the most privileges when managing an Azure EA enr
When new Account Owners (AO) are added to an Azure EA enrollment for the first time, their status appears as _pending_. When a new account owner receives the activation welcome email, they can sign in to activate their account. > [!NOTE]
-> If the Account Owner is a service account and doesn't have an email, use an In-Private session to log into the Azure portal and navigate to Cost Management to be prompted to accept the activation welcome email.
+> If the Account Owner is a service account and doesn't have an email, use an In-Private session to sign in to the Azure portal and navigate to Cost Management to be prompted to accept the activation welcome email.
Once they activate their account, the account status is updated from _pending_ to _active_. The account owner needs to read the 'Warning' message and select **Continue**. New users might get prompted to enter their first and last name to create a Commerce Account. If so, they must add the required information to continue and then the account is activated.
cost-management-billing View Payment History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/view-payment-history.md
To view the payment history for your billing account, you must have at least the
To view your payment history, you can navigate to the Payment history page under a billing account or a specific billing profile. To view payment history at the billing account level:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **Cost Management + Billing** and select it. 3. Select a Billing scope, if necessary. 4. In the left menu under **Billing**, select **Payment history**. To view payment history at a billing profile level:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **Cost Management + Billing** and select it. 3. Select a Billing scope, if necessary. 4. In the left menu under **Billing**, select **Billing profiles**.
cost-management-billing Manage Reserved Vm Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/manage-reserved-vm-instance.md
To allow other people to manage reservations, you have two options:
If you're a billing administrator, use the following steps to view and manage all reservations and reservation transactions.
-1. Sign into the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
- If you're an EA admin, in the left menu, select **Billing scopes** and then in the list of billing scopes, select one. - If you're a Microsoft Customer Agreement billing profile owner, in the left menu, select **Billing profiles**. In the list of billing profiles, select one. 2. In the left menu, select **Products + services** > **Reservations**.
Azure reservation savings only result from sustained resource use. When you make
One way of viewing reservation usage is in the Azure portal.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **All services** > [**Reservations**](https://portal.azure.com/#blade/Microsoft_Azure_Reservations/ReservationsBrowseBlade) and note the **Utilization (%)** for a reservation. ![Image showing the list of reservations](./media/manage-reserved-vm-instance/reservation-list.png) 3. Select a reservation.
cost-management-billing Reservation Discount Databricks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-discount-databricks.md
Title: How an Azure Databricks pre-purchase discount is applied
-description: Learn how an Azure Databricks pre-purchase discount applies to your usage. You can use these Databricks at any time during the purchase term.
+ Title: How an Azure Databricks prepurchase discount is applied
+description: Learn how an Azure Databricks prepurchase discount applies to your usage. You can use these Databricks at any time during the purchase term.
Last updated 12/06/2022
-# How Azure Databricks pre-purchase discount is applied
+# How Azure Databricks prepurchase discount is applied
-You can use pre-purchased Azure Databricks commit units (DBCU) at any time during the purchase term. Any Azure Databricks usage is deducted from the pre-purchased DBCUs automatically.
+You can use prepurchased Azure Databricks commit units (DBCU) at any time during the purchase term. Any Azure Databricks usage is deducted from the prepurchased DBCUs automatically.
-Unlike VMs, pre-purchased units don't expire on an hourly basis. You can use them at any time during the term of the purchase. To get the pre-purchase discounts, you don't need to redeploy or assign a pre-purchased plan to your Azure Databricks workspaces for the usage.
+Unlike VMs, prepurchased units don't expire on an hourly basis. You can use them at any time during the term of the purchase. To get the prepurchase discounts, you don't need to redeploy or assign a prepurchased plan to your Azure Databricks workspaces for the usage.
-The pre-purchase discount applies only to Azure Databricks unit (DBU) usage. Other charges such as compute, storage, and networking are charged separately.
+The prepurchase discount applies only to Azure Databricks unit (DBU) usage. Other charges such as compute, storage, and networking are charged separately.
-## Pre-purchase discount application
+## Prepurchase discount application
-Databricks pre-purchase applies to all Databricks workloads and tiers. You can think of the pre-purchase as a pool of pre-paid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier. Usage is deducted in the following ratio:
+Databricks prepurchase applies to all Databricks workloads and tiers. You can think of the prepurchase as a pool of prepaid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier. Usage is deducted in the following ratio:
| **Workload** | **DBU application ratio - Standard tier** | **DBU application ratio - Premium tier** | | | | |
Databricks pre-purchase applies to all Databricks workloads and tiers. You can t
| Jobs Light Compute | 0.07 | 0.22 | | SQL Compute | NA | 0.22 | | Delta Live Tables | NA | 0.30 (core), 0.38 (pro), 0.54 (advanced) |
+| All Purpose Photon | NA | 0.55 |
-For example, when a quantity of Data Analytics – Standard tier is consumed, the pre-purchased Databricks commit units is deducted by 0.4 units. When a quantity of Data Engineering Light – Standard tier is used, the pre-purchased Databricks commit unit is deducted by 0.07 units.
+For example, when a quantity of Data Analytics – Standard tier is consumed, the prepurchased Databricks commit units are deducted by 0.4 units. When a quantity of Data Engineering Light – Standard tier is used, the prepurchased Databricks commit unit is deducted by 0.07 units.
Note: enabling Photon will increase the DBU count.
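The drawdown arithmetic described above can be sketched in a few lines. This is an illustrative example only: the ratio values come from the table in this article, and the function and dictionary names are hypothetical.

```python
# Illustrative sketch of how prepurchased DBCUs are drawn down, using the
# application ratios from the table above (ratios vary by workload and tier).
RATIOS = {
    ("Data Analytics", "Standard"): 0.4,
    ("Data Engineering Light", "Standard"): 0.07,
    ("Jobs Light Compute", "Premium"): 0.22,
}

def dbcu_deduction(workload: str, tier: str, dbu_quantity: float) -> float:
    """Return how many DBCUs the given DBU usage deducts from the pool."""
    return RATIOS[(workload, tier)] * dbu_quantity

# 100 DBUs of Data Analytics (Standard tier) draws about 40 DBCUs from the pool.
remaining = 1000 - dbcu_deduction("Data Analytics", "Standard", 100)
print(f"DBCUs remaining: {remaining:.0f}")
```

Because usage is deducted from a single pool, the same function applies regardless of workload or tier; only the ratio changes.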
To determine your DBCU plan use, go to the Azure portal > **Reservations** and s
## How discount application shows in usage data
-When the pre-purchase discount applies to your Databricks usage, on-demand charges appear as zero in the usage data. For more information about reservation costs and usage, see [Get Enterprise Agreement reservation costs and usage](understand-reserved-instance-usage-ea.md).
+When the prepurchase discount applies to your Databricks usage, on-demand charges appear as zero in the usage data. For more information about reservation costs and usage, see [Get Enterprise Agreement reservation costs and usage](understand-reserved-instance-usage-ea.md).
## Need help? Contact us.
If you have questions or need help, [create a support request](https://portal.az
## Next steps

- To learn how to manage a reservation, see [Manage Azure Reservations](manage-reserved-vm-instance.md).
-- To learn more about pre-purchasing Azure Databricks to save money, see [Optimize Azure Databricks costs with a pre-purchase](prepay-databricks-reserved-capacity.md).
+- To learn more about prepurchasing Azure Databricks to save money, see [Optimize Azure Databricks costs with a pre-purchase](prepay-databricks-reserved-capacity.md).
- To learn more about Azure Reservations, see the following articles:
  - [What are Azure Reservations?](save-compute-costs-reservations.md)
  - [Manage Reservations in Azure](manage-reserved-vm-instance.md)
cost-management-billing Troubleshoot Reservation Transfers Between Tenants https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-reservation-transfers-between-tenants.md
When you change a reservation order's directory, all reservations under the orde
Use the following steps to change a reservation order's directory and its dependent reservations to another tenant.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. If you're not a billing administrator but you're a reservation owner, navigate to **Reservations**, and then skip to step 5.
1. Navigate to **Cost Management + Billing**.
   - If you're an EA admin, in the left menu, select **Billing scopes** and then in the list of billing scopes, select one.
Assume that the reservation is set to a management group scope. After you change
## Next steps

-- For more information about reservations, see [What are Azure Reservations?](save-compute-costs-reservations.md).
+- For more information about reservations, see [What are Azure Reservations?](save-compute-costs-reservations.md).
cost-management-billing View Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/view-reservations.md
The reservation lifecycle is independent of an Azure subscription, so the reserv
If you're a billing administrator, use the following steps to view and manage all reservations and reservation transactions in the Azure portal.
-1. Sign into the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
   - If you're an EA admin, in the left menu, select **Billing scopes** and then in the list of billing scopes, select one.
   - If you're a Microsoft Customer Agreement billing profile owner, in the left menu, select **Billing profiles**. In the list of billing profiles, select one.
1. In the left menu, select **Products + services** > **Reservations**.
When you use the PowerShell script to assign the ownership role and it runs succ
[User Access Administrator](../../role-based-access-control/built-in-roles.md#user-access-administrator) rights are required before you can grant users or groups the Reservations Administrator and Reservations Reader roles at the tenant level. To get User Access Administrator rights at the tenant level, follow the [Elevate access](../../role-based-access-control/elevate-access-global-admin.md) steps.

### Add a Reservations Administrator role or Reservations Reader role at the tenant level
-You can assign these roles from [Azure portal](https://portal.azure.com).
+You can assign these roles from the [Azure portal](https://portal.azure.com).
1. Sign in to the Azure portal and navigate to **Reservations**.
1. Select a reservation that you have access to.
cost-management-billing Manage Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/manage-savings-plan.md
Changing a savings plan's scope doesn't affect its term.
To update a savings plan scope:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **Cost Management + Billing** > **Savings plans**.
3. Select the savings plan.
4. Select **Settings** > **Configuration**.
For more information, see [Permissions to view and manage Azure savings plans](p
If you're a billing administrator, you don't need to be an owner on the subscription. Use the following steps to view and manage all savings plans and their transactions.
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to **Cost Management + Billing**.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
   - If you're an EA admin, in the left menu, select **Billing scopes** and then in the list of billing scopes, select one.
   - If you're a Microsoft Customer Agreement billing profile owner, in the left menu, select **Billing profiles**. In the list of billing profiles, select one.
2. In the left menu, select **Products + services** > **Savings plan**.
Although you can't cancel, exchange, or refund a savings plan, you can transfer
Billing administrators can view savings plan usage in Cost Management + Billing.
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Cost Management + Billing** > **Savings plans** and note the **Utilization (%)** for a savings plan.
1. Select a savings plan.
1. Review the savings plan use trend over time.
cost-management-billing Permission View Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/permission-view-manage.md
To allow other people to manage savings plans, you have two options:
If you're a billing administrator, use the following steps to view and manage all savings plans and savings plan transactions in the Azure portal:
-1. Sign into the [Azure portal](https://portal.azure.com/) and navigate to **Cost Management + Billing**.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to **Cost Management + Billing**.
   - If you're an EA admin, in the left menu, select **Billing scopes** and then in the list of billing scopes, select one.
   - If you're a Microsoft Customer Agreement billing profile owner, in the left menu, select **Billing profiles**. In the list of billing profiles, select one.
1. In the left menu, select **Products + services** > **Savings plans**.
Add a user as billing administrator to an Enterprise Agreement or a Microsoft Cu
If you purchased the savings plan or you're added to a savings plan, use the following steps to view and manage savings plans in the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **All Services** > **Savings plans** to list savings plans that you have access to.

## Manage subscriptions and management groups with elevated access
After you have elevated access:
## Next steps

-- [Manage Azure savings plans](manage-savings-plan.md).
+- [Manage Azure savings plans](manage-savings-plan.md).
cost-management-billing Pay Bill https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/understand/pay-bill.md
On 8 June 2022, the Reserve Bank of India (RBI) increased the limit of e-mandate
On 30 September 2022, Microsoft and other online merchants will no longer store credit card information. To comply with this regulation, Microsoft will remove all stored card details from Microsoft Azure. To avoid service interruption, you need to add and verify your payment method to make a payment in the Azure portal for all invoices.
-[Learn about the Reserve Bank of India directive; Restriction on storage of actual card data ](https://rbidocs.rbi.org.in/rdocs/notification/PDFs/DPSSC09B09841EF3746A0A7DC4783AC90C8F3.PDF)
+[Learn about the Reserve Bank of India directive; Restriction on storage of actual card data](https://rbidocs.rbi.org.in/rdocs/notification/PDFs/DPSSC09B09841EF3746A0A7DC4783AC90C8F3.PDF)
### UPI and NetBanking payment options
If your default payment method is wire transfer, check your invoice for payment
To pay invoices in the Azure portal, you must have the correct [MCA permissions](../manage/understand-mca-roles.md) or be the Billing Account admin. The Billing Account admin is the user who originally signed up for the MCA account.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search on **Cost Management + Billing**.
1. In the left menu, select **Invoices** under **Billing**.
1. If any of your eligible invoices are due or past due, you'll see a blue **Pay now** link for that invoice. Select **Pay now**.
data-factory Join Azure Ssis Integration Runtime Virtual Network Ui https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/join-azure-ssis-integration-runtime-virtual-network-ui.md
Use Azure portal to configure an Azure Resource Manager virtual network before y
1. Start Microsoft Edge or Google Chrome. Currently, only these web browsers support ADF UI.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **More services**. Filter for and select **Virtual networks**.
Use Azure portal to configure a classic virtual network before you try to join y
1. Start Microsoft Edge or Google Chrome. Currently, only these web browsers support ADF UI.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **More services**. Filter for and select **Virtual networks (classic)**.
data-factory Monitor Managed Virtual Network Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/monitor-managed-virtual-network-integration-runtime.md
Title: Monitor managed virtual network integration runtime in Azure Data Factory
-description: Learn how to monitor managed virtual network integration runtime in Azure Data Factory.
+ Title: Monitor an integration runtime within a managed virtual network
+description: Learn how to monitor an integration runtime within an Azure Data Factory managed virtual network.
-# Enhanced monitoring with Managed Virtual Network Integration Runtime
+# Monitor an integration runtime within a managed virtual network
+ [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-Azure Data Factory Managed Virtual Network is a feature that allows you to securely connect your data sources to a virtual network managed by Azure Data Factory service. By using this capability, you can establish a private and isolated environment for your data integration and orchestration processes. By using Azure Data Factory Managed Virtual Network, you can combine the power of Azure Data Factory's data integration and orchestration capabilities with the security and flexibility provided by Azure virtual networks. It empowers you to build robust, scalable, and secure data integration pipelines that seamlessly connect to your network resources, whether they're on-premises or in the cloud.
-One common pain point of managed compute is the lack of visibility into the performance and health especially within a managed virtual network environment. Without proper monitoring, identifying and resolving issues becomes challenging, leading to potential delays, errors, and performance degradation.
-By using our new enhanced monitoring feature, users can gain valuable insights into their data integration processes, leading to improved efficiency, better resource utilization, and enhanced overall performance. With proactive monitoring and timely alerts, users can proactively address issues, optimize workflows, and ensure the smooth execution of their data integration pipelines within the managed virtual network environment.
+
+You can use an Azure Data Factory managed virtual network to securely connect your data sources to a virtual network that the Data Factory service manages. By using this capability, you can establish a private and isolated environment for your data integration and orchestration processes.
+
+When you use a managed virtual network, you combine the data integration and orchestration capabilities in Data Factory with the security and flexibility of Azure virtual networks. It empowers you to build robust, scalable, and secure data integration pipelines that seamlessly connect to your network resources, whether they're on-premises or in the cloud.
+
+One common problem of managed compute is the lack of visibility into performance and health, especially within a managed virtual network environment. Without proper monitoring, identifying and resolving problems becomes challenging and can lead to potential delays, errors, and performance degradation.
+
+By using enhanced monitoring in Data Factory, you can gain valuable insights into your data integration processes. These insights can lead to improved efficiency, better resource utilization, and enhanced overall performance. With proactive monitoring and timely alerts, you can address issues, optimize workflows, and ensure the smooth execution of your data integration pipelines within the managed virtual network environment.
## New metrics
-The introduction of the new metrics in the Managed Virtual Network Integration Runtime feature significantly enhances the visibility and monitoring capabilities within virtual network environments. These new metrics have been designed to address the pain point of limited monitoring, providing users with valuable insights into the performance and health of their data integration workflows.
-![NOTE]
-> These metrics are only valid when enabling Time-To-Live in managed virtual network integration runtime.
-Azure Data Factory provides three distinct types of compute pools, each tailored to handle specific activity execution requirements. These compute pools offer flexibility and scalability to accommodate diverse workloads and ensure optimal resource allocation:
+The introduction of new metrics enhances the visibility and monitoring capabilities within managed virtual network environments.
+
+Azure Data Factory provides three distinct types of compute pools:
-To ensure consistent and comprehensive monitoring across all compute pools, we have implemented the same sets of monitoring metrics.
+- Compute for a copy activity
+- Compute for a pipeline activity, such as a lookup
+- Compute for an external activity, such as an Azure Databricks notebook
-Regardless of the type of compute pool being used, users can access and analyze a standardized set of metrics to gain insights into the performance and health of their data integration activities.
+These compute pools offer flexibility and scalability to accommodate diverse workloads and allocate resources optimally. Each is tailored to handle specific activity execution requirements.
+
+To help ensure consistent and comprehensive monitoring across all compute pools, we've implemented the same sets of monitoring metrics:
+
+- Capacity utilization
+- Available capacity percentage
+- Waiting queue length
+
+Regardless of the type of compute pool that you're using, you can access and analyze a standardized set of metrics to gain insights into the performance and health of your data integration activities.
+
+> [!NOTE]
+> These metrics are valid only when you're enabling time-to-live (TTL) in an integration runtime within a managed virtual network.
|Metric|Unit|Description|
|---|---|---|
-|Copy capacity utilization of MVNet integration runtime|Percent|The maximum percentage of DIU utilization for managed vNet Integration runtime time-to-live copy activities within 1-minute window.|
-|Copy available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for managed vNet Integration runtime time-to-live copy activities within 1-minute window.|
-|Copy waiting queue length of MVNet integration runtime|Count|The waiting queue length of managed vNet Integration runtime time-to-live copy activities within 1-minute window.|
-|Pipeline capacity utilization of MVNet integration runtime|Percent|The maximum percentage of DIU utilization for managed vNet Integration runtime pipeline activities within 1-minute window.|
-|Pipeline available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for managed vNet Integration runtime pipeline activities within 1-minute window.|
-|Pipeline waiting queue length of MVNet integration runtime|Count|The waiting queue length of managed vNet Integration runtime pipeline activities within 1-minute window.|
-|External capacity utilization of MVNet integration runtime|Percent|The maximum percentage of DIU utilization for managed vNet Integration runtime external activities within 1-minute window.|
-|External available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for managed vNet Integration runtime external activities within 1-minute window.|
-|External waiting queue length of MVNet integration runtime|Count|The waiting queue length of managed vNet Integration runtime external activities within 1-minute window.|
+|Copy capacity utilization of MVNet integration runtime|Percent|The maximum percentage of Data Integration Unit (DIU) utilization for TTL copy activities in a managed virtual network's integration runtime within a 1-minute window.|
+|Copy available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for TTL copy activities in a managed virtual network's integration runtime within a 1-minute window.|
+|Copy waiting queue length of MVNet integration runtime|Count|The waiting queue length of TTL copy activities in a managed virtual network's integration runtime within a 1-minute window.|
+|Pipeline capacity utilization of MVNet integration runtime|Percent|The maximum percentage of DIU utilization for pipeline activities in a managed virtual network's integration runtime within a 1-minute window.|
+|Pipeline available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for pipeline activities in a managed virtual network's integration runtime within a 1-minute window.|
+|Pipeline waiting queue length of MVNet integration runtime|Count|The waiting queue length of pipeline activities in a managed virtual network's integration runtime within a 1-minute window.|
+|External capacity utilization of MVNet integration runtime|Percent|The maximum percentage of DIU utilization for external activities in a managed virtual network's integration runtime within a 1-minute window.|
+|External available capacity percentage of MVNet integration runtime|Percent|The maximum percentage of available DIU for external activities in a managed virtual network's integration runtime within a 1-minute window.|
+|External waiting queue length of MVNet integration runtime|Count|The waiting queue length of external activities in a managed virtual network's integration runtime within a 1-minute window.|
## Using metrics for performance optimization
-By using these metrics, you can seamlessly track and assess the performance and robustness of your integration runtime within a managed virtual network. Moreover, you can uncover potential areas for continuous improvement by optimizing the compute settings and workflow to maximize efficiency.
-To provide further clarity on the practical application of these metrics, here are a few example scenarios:
+By using the metrics, you can seamlessly track and assess the performance and robustness of your integration runtime within a managed virtual network. You can also uncover potential areas for continuous improvement by optimizing the compute settings and workflow to maximize efficiency.
+
+To provide more clarity on the practical application of these metrics, here are a few example scenarios.
### Balanced
-If you observe that the Capacity Utilization is below 100% and the Available Capacity Percentage is high, it indicates that the compute resources you have reserved are being efficiently utilized. Additionally, if the Waiting Queue Length remains consistently low or experiences occasional short spikes, it's advisable to queue other activities until the Capacity Utilization reaches 100%. This ensures optimal utilization of resources and helps maintain a smooth workflow with minimal delays.
+If you observe that capacity utilization is below 100 percent and the available capacity percentage is high, the compute resources that you reserved are being efficiently utilized.
-### Performance-oriented
-If you observe that the Capacity Utilization is consistently low, and the Waiting Queue Length remains consistently low or experiences occasional short spikes, it indicates that the compute resources you have reserved are higher than the actual demand for activities. In such cases, regardless of whether the Available Capacity Percentage is high or low, it's recommended to reduce the allocated compute resources to lower your costs. By rightsizing the compute to match the actual workload requirements, you can optimize your resource utilization and achieve cost savings without compromising the efficiency of your operations.
+If the waiting queue length remains consistently low or experiences occasional short spikes, we advise you to queue other activities until the capacity utilization reaches 100 percent. This approach helps ensure optimal utilization of resources and helps maintain a smooth workflow with minimal delays.
-### Cost-oriented
-If you notice that all metrics, including Capacity Utilization, Available Capacity Percentage, and Waiting Queue Length, are high, it suggests that the compute resources you have reserved are insufficient for your activities. In this scenario, it's recommended to increase the allocated compute resources to reduce queue time. By adding more compute capacity, you can ensure that your activities have sufficient resources to execute efficiently, minimizing any delays caused by a crowded queue.
+### Performance oriented
+If you observe that capacity utilization is consistently low, and the waiting queue length remains consistently low or experiences occasional short spikes, the compute resources that you reserved are higher than the demand for activities.
+
+In such cases, regardless of whether the available capacity percentage is high or low, we recommend that you reduce the allocated compute resources to lower your costs. By rightsizing the compute to match the workload requirements, you can optimize your resource utilization and save costs without compromising the efficiency of your operations.
++
+### Cost oriented
+
+If you notice that all metrics (including capacity utilization, available capacity percentage, and waiting queue length) are high, the compute resources that you reserved are likely insufficient for your activities.
+
+In this scenario, we recommend that you increase the allocated compute resources to reduce queue time. Adding more compute capacity helps ensure that your activities have sufficient resources to run efficiently, which minimizes any delays that a crowded queue causes.
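As a rough illustration, the balanced, performance-oriented, and cost-oriented scenarios can be expressed as a decision heuristic over the three metrics. The function name and the threshold values below are assumptions chosen for the example, not limits documented by Azure Data Factory.

```python
# Hypothetical rightsizing heuristic based on the three metrics discussed
# above. Threshold values are illustrative assumptions, not documented limits.
def rightsizing_advice(utilization_pct: float,
                       available_pct: float,
                       queue_length: int) -> str:
    if utilization_pct < 30 and queue_length <= 1:
        # Performance oriented: reserved compute exceeds actual demand.
        return "reduce allocated compute to lower costs"
    if utilization_pct > 90 and available_pct < 10 and queue_length > 5:
        # Cost oriented: all metrics high, reserved compute is insufficient.
        return "increase allocated compute to reduce queue time"
    if utilization_pct < 100 and available_pct > 50:
        # Balanced: resources are used efficiently; keep the queue fed.
        return "queue more activities until utilization reaches 100%"
    return "keep observing the metrics"

print(rightsizing_advice(95, 5, 10))
```

In practice you would feed this from the Azure Monitor metrics listed earlier, aggregated over a representative time window rather than a single 1-minute sample.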
+ ### Intermittent activity execution
-If you notice that the Available Capacity Percentage fluctuates between low and high within a specific time period, it's likely due to the intermittent execution of your activities, where the Time-To-Live (TTL) period you have configured is shorter than the interval between your activities. This can have a significant impact on the performance of your workflow and can increase costs, as we charge for the warm-up time of the compute for up to 2 minutes.
-To address this issue, there are two possible solutions. First, you can queue more activities to maintain a consistent workload and utilize the available compute resources more effectively. By keeping the compute continuously engaged, you can avoid the warm-up time and achieve better performance.
-Alternatively, you can consider enlarging the TTL period to align with the interval between your activities. This ensures that the compute resources remain available for a longer duration, reducing the frequency of warm-up periods and optimizing cost-efficiency.
+
+If you notice that the available capacity percentage fluctuates between low and high within a specific time period, it's likely due to the intermittent execution of your activities. That is, the TTL period that you configured is shorter than the interval between your activities. This problem can have a significant impact on the performance of your workflow and can increase costs, because we charge for the warm-up time of the compute for up to 2 minutes.
+
+To address this problem, there are two possible solutions:
+
+- Queue more activities to maintain a consistent workload and utilize the available compute resources more effectively. By keeping the compute continuously engaged, you can avoid the warm-up time and achieve better performance.
+- Consider enlarging the TTL period to align with the interval between your activities. This approach keeps the compute resources available for a longer duration, which reduces the frequency of warm-up periods and optimizes cost efficiency.
+ By implementing either of these solutions, you can enhance the performance of your workflow, minimize cost implications, and ensure a smoother execution of your intermittent activities.

## Next steps
-Advance to the following tutorial to learn about Managed Virtual Network: [Managed virtual network and managed private endpoints](managed-virtual-network-private-endpoint.md).
+
+Advance to the following article to learn about managed virtual networks and managed private endpoints: [Azure Data Factory managed virtual network](managed-virtual-network-private-endpoint.md).
data-factory Data Factory Copy Activity Tutorial Using Visual Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/v1/data-factory-copy-activity-tutorial-using-visual-studio.md
Note the following points:
## Monitor pipeline

Navigate to the home page for your data factory:
-1. Log in to [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Click **More services** on the left menu, and click **Data factories**.

   :::image type="content" source="media/data-factory-copy-activity-tutorial-using-visual-studio/browse-data-factories.png" alt-text="Browse data factories":::
data-lake-analytics Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-lake-analytics/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Data Lake Analytics
description: Lists Azure Policy Regulatory Compliance controls available for Azure Data Lake Analytics. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.
Previously updated : 07/06/2023
Last updated : 07/20/2023
data-lake-store Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-lake-store/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Data Lake Storage Gen1
description: Lists Azure Policy Regulatory Compliance controls available for Azure Data Lake Storage Gen1. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.
Previously updated : 07/06/2023
Last updated : 07/20/2023
databox Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Data Box
description: Lists Azure Policy Regulatory Compliance controls available for Azure Data Box. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.
Previously updated : 07/06/2023
Last updated : 07/20/2023
ddos-protection Ddos View Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ddos-protection/ddos-view-diagnostic-logs.md
The following table lists the field names and descriptions:
| **AttackVectors** | Degradation of attack types. The keys include `TCP SYN flood`, `TCP flood`, `UDP flood`, `UDP reflection`, and `Other packet flood`. |
| **TrafficOverview** | Degradation of attack traffic. The keys include `Total packets`, `Total packets dropped`, `Total TCP packets`, `Total TCP packets dropped`, `Total UDP packets`, `Total UDP packets dropped`, `Total Other packets`, and `Total Other packets dropped`. |
| **Protocols** | Breakdown of protocols included. The keys include `TCP`, `UDP`, and `Other`. |
-| **DropReasons** | Analysis of causes of dropped packets. The keys include `Protocol violation invalid TCP`. `syn Protocol violation invalid TCP`, `Protocol violation invalid UDP`, `UDP reflection`, `TCP rate limit exceeded`, `UDP rate limit exceeded`, `Destination limit exceeded`, `Other packet flood Rate limit exceeded`, and `Packet was forwarded to service`. |
+| **DropReasons** | Analysis of causes of dropped packets. The keys include `Protocol violation invalid TCP`, `syn Protocol violation invalid TCP`, `Protocol violation invalid UDP`, `UDP reflection`, `TCP rate limit exceeded`, `UDP rate limit exceeded`, `Destination limit exceeded`, `Other packet flood Rate limit exceeded`, and `Packet was forwarded to service`. Protocol violation invalid drop reasons refer to malformed packets. |
| **TopSourceCountries** | Breakdown of the top 10 source countries into inbound traffic. |
| **TopSourceCountriesForDroppedPackets** | Analysis of the top 10 source countries for attack traffic that have been throttled. |
| **TopSourceASNs** | Analysis of the top 10 sources of autonomous system numbers (ASNs) of incoming traffic. |
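To make the field descriptions concrete, here is a small sketch of analyzing one diagnostic log entry. The field names follow the table above, but the exact payload shape and values used here are assumptions for illustration only.

```python
import json

# Hypothetical mitigation-report entry; field names follow the table above,
# but the payload shape and counts are invented for this example.
raw = """
{
  "DropReasons": {
    "Protocol violation invalid TCP": 1200,
    "UDP reflection": 300,
    "TCP rate limit exceeded": 4500
  },
  "TrafficOverview": {
    "Total packets": 100000,
    "Total packets dropped": 6000
  }
}
"""
entry = json.loads(raw)

# Rank the drop reasons to see what drove the mitigation.
for reason, count in sorted(entry["DropReasons"].items(),
                            key=lambda kv: kv[1], reverse=True):
    print(f"{reason}: {count}")

overview = entry["TrafficOverview"]
drop_rate = overview["Total packets dropped"] / overview["Total packets"]
print(f"Drop rate: {drop_rate:.1%}")  # Drop rate: 6.0%
```

The same approach extends to the other breakdown fields, such as `TopSourceCountries` or `TopSourceASNs`, which are also key/count maps in the log data.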
defender-for-cloud Connect Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/connect-azure-subscription.md
Defender for Cloud helps you find and fix security vulnerabilities. Defender for
> [!TIP]
> To enable Defender for Cloud on all subscriptions within a management group, see [Enable Defender for Cloud on multiple Azure subscriptions](onboard-management-group.md).
-1. Sign into the [Azure portal](https://azure.microsoft.com/features/azure-portal/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search for and select **Microsoft Defender for Cloud**.
defender-for-cloud Data Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/data-security.md
description: Learn how data is managed and safeguarded in Microsoft Defender for
Previously updated : 11/09/2021 Last updated : 07/18/2023 # Microsoft Defender for Cloud data security
To help customers prevent, detect, and respond to threats, Microsoft Defender fo
This article explains how data is managed and safeguarded in Defender for Cloud. ## Data sources+ Defender for Cloud analyzes data from the following sources to provide visibility into your security state, identify vulnerabilities and recommend mitigations, and detect active threats: - **Azure services**: Uses information about the configuration of Azure services you have deployed by communicating with that service's resource provider.
Defender for Cloud analyzes data from the following sources to provide visibilit
- **Partner solutions**: Uses security alerts from integrated partner solutions, such as firewalls and antimalware solutions. - **Your machines**: Uses configuration details and information about security events, such as Windows event and audit logs, and syslog messages from your machines.
+## Data sharing
+
+When you enable Defender for Storage Malware Scanning, it may share metadata, including metadata classified as customer data (e.g. SHA-256 hash), with Microsoft Defender for Endpoint.
+ ## Data protection ### Data segregation
-Data is kept logically separate on each component throughout the service. All data is tagged per organization. This tagging persists throughout the data lifecycle, and it is enforced at each layer of the service.
+Data is kept logically separate on each component throughout the service. All data is tagged per organization. This tagging persists throughout the data lifecycle, and it's enforced at each layer of the service.
### Data access To provide security recommendations and investigate potential security threats, Microsoft personnel may access information collected or analyzed by Azure services, including process creation events, and other artifacts, which may unintentionally include customer data or personal data from your machines.
-We adhere to the [Microsoft Online Services Data Protection Addendum](https://www.microsoftvolumelicensing.com/Downloader.aspx?DocumentId=17880), which states that Microsoft will not use Customer Data or derive information from it for any advertising or similar commercial purposes. We only use Customer Data as needed to provide you with Azure services, including purposes compatible with providing those services. You retain all rights to Customer Data.
+We adhere to the [Microsoft Online Services Data Protection Addendum](https://www.microsoftvolumelicensing.com/Downloader.aspx?DocumentId=17880), which states that Microsoft won't use customer data or derive information from it for any advertising or similar commercial purposes. We only use customer data as needed to provide you with Azure services, including purposes compatible with providing those services. You retain all rights to customer data.
### Data use Microsoft uses patterns and threat intelligence seen across multiple tenants to enhance our prevention and detection capabilities; we do so in accordance with the privacy commitments described in our [Privacy Statement](https://privacy.microsoft.com/privacystatement).
Microsoft uses patterns and threat intelligence seen across multiple tenants to
## Manage data collection from machines When you enable Defender for Cloud in Azure, data collection is turned on for each of your Azure subscriptions. You can also enable data collection for your subscriptions in Defender for Cloud. When data collection is enabled, Defender for Cloud provisions the Log Analytics agent on all existing supported Azure virtual machines and any new ones that are created.
-The Log Analytics agent scans for various security-related configurations and events them into [Event Tracing for Windows](/windows/win32/etw/event-tracing-portal) (ETW) traces. In addition, the operating system will raise event log events during the course of running the machine. Examples of such data are: operating system type and version, operating system logs (Windows event logs), running processes, machine name, IP addresses, logged in user, and tenant ID. The Log Analytics agent reads event log entries and ETW traces and copies them to your workspace(s) for analysis. The Log Analytics agent also enables process creation events and command line auditing.
+The Log Analytics agent scans for various security-related configurations and events them into [Event Tracing for Windows](/windows/win32/etw/event-tracing-portal) (ETW) traces. In addition, the operating system raises event log events during the course of running the machine. Examples of such data are: operating system type and version, operating system logs (Windows event logs), running processes, machine name, IP addresses, logged in user, and tenant ID. The Log Analytics agent reads event log entries and ETW traces and copies them to your workspace(s) for analysis. The Log Analytics agent also enables process creation events and command line auditing.
If you aren't using Microsoft Defender for Cloud's enhanced security features, you can also disable data collection from virtual machines in the Security Policy. Data Collection is required for subscriptions that are protected by enhanced security features. VM disk snapshots and artifact collection will still be enabled even if data collection has been disabled.
You can specify the workspace and region where data collected from your machines
||-| | United States, Brazil, South Africa | United States | | Canada | Canada |
-| Europe (Excluding United Kingdom) | Europe |
+| Europe (excluding United Kingdom) | Europe |
| United Kingdom | United Kingdom |
-| Asia (Excluding India, Japan, Korea, China) | Asia Pacific |
+| Asia (excluding India, Japan, Korea, China) | Asia Pacific |
| Korea | Asia Pacific | | India | India | | Japan | Japan |
Customers can access Defender for Cloud related data from the following data str
In this document, you learned how data is managed and safeguarded in Microsoft Defender for Cloud.
-To learn more about Microsoft Defender for Cloud, see [What is Microsoft Defender for Cloud?](defender-for-cloud-introduction.md)
+To learn more about Microsoft Defender for Cloud, see [What is Microsoft Defender for Cloud?](defender-for-cloud-introduction.md).
defender-for-cloud Data Sensitivity Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/data-sensitivity-settings.md
Import as follows (Import only once):
To customize data sensitivity settings that appear in Defender for Cloud, review the [prerequisites](concept-data-security-posture-prepare.md#configuring-data-sensitivity-settings), and then do the following.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Microsoft Defender for Cloud** > **Environment settings**. 1. Select **Data sensitivity**. 1. Select the info type category that you want to customize:
If you're using Microsoft Purview sensitivity labels, make sure that:
- the label scope is set to "Items", under which you should configure [auto labeling for files and emails](/microsoft-365/compliance/apply-sensitivity-label-automatically#how-to-configure-auto-labeling-for-office-apps) - labels must be [published](/microsoft-365/compliance/create-sensitivity-labels#publish-sensitivity-labels-by-creating-a-label-policy) with a label policy that is in effect.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Microsoft Defender for Cloud** > **Environment settings**. 1. Select **Data sensitivity**. The current minimum sensitivity threshold is shown.
defender-for-cloud Defender For Apis Prepare https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/defender-for-apis-prepare.md
Onboarding requirements for Defender for APIs are as follows.
**Requirement** | **Details**
API Management instance | At least one API Management instance in an Azure subscription. Defender for APIs is enabled at the level of a subscription.<br/><br/> One or more supported APIs must be imported to the API Management instance.
-Azure account | You need an Azure account to sign into the Azure portal.
+Azure account | You need an Azure account to sign in to the Azure portal.
Onboarding permissions | To enable and onboard Defender for APIs, you need the Owner or Contributor role on the Azure subscriptions, resource groups, or Azure API Management instance that you want to secure. If you don't have the Contributor role, you need to enable these roles:<br/><br/> - Security Admin role for full access in Defender for Cloud.<br/> - Security Reader role to view inventory and recommendations in Defender for Cloud. Onboarding location | You can [enable Defender for APIs in the Defender for Cloud portal](defender-for-apis-deploy.md), or in the [Azure API Management portal](../api-management/protect-with-defender-for-apis.md).
defender-for-cloud Defender For Sql Scan Results https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/defender-for-sql-scan-results.md
This article describes several ways to consume and export your scan results.
**To query and export your findings with ARG with Defender for Cloud**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Microsoft Defender for Cloud** > **Recommendations**.
These queries are editable and can be customized to a specific resource, set of
**To query and export your findings with ARG**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Resource Graph Explorer**.
This query is editable and can be customized to a specific resource, set of find
**To open a query from your SQL database**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to `Your SQL database` > **Microsoft Defender for Cloud**.
defender-for-cloud Management Groups Roles https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/management-groups-roles.md
For a detailed overview of management groups, see the [Organize your resources w
### View and create management groups in the Azure portal
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search for and select **Management Groups**.
For a detailed overview of management groups, see the [Organize your resources w
You can add subscriptions to the management group that you created.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Search for and select **Management Groups**
+1. Search for and select **Management Groups**.
1. Select the management group for your subscription.
You can add subscriptions to the management group that you created.
### Assign Azure roles to users through the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Search for and select **Management Groups**
+1. Search for and select **Management Groups**.
1. Select the relevant management group.
defender-for-cloud Quickstart Onboard Aws https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/quickstart-onboard-aws.md
Title: Connect your AWS account description: Defend your AWS resources by using Microsoft Defender for Cloud. + Last updated 06/28/2023
defender-for-cloud Regulatory Compliance Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/regulatory-compliance-dashboard.md
The regulatory compliance dashboard shows your selected compliance standards wit
Use the regulatory compliance dashboard to help focus your attention on the gaps in compliance with your chosen standards and regulations. This focused view also enables you to continuously monitor your compliance over time within dynamic cloud and hybrid environments.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Defender for Cloud** > **Regulatory compliance**.
You can use the information in the regulatory compliance dashboard to investigat
**To investigate your compliance issues**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Defender for Cloud** > **Regulatory compliance**.
The regulatory compliance has both automated and manual assessments that may nee
**To remediate an automated assessment**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Defender for Cloud** > **Regulatory compliance**.
The regulatory compliance has automated and manual assessments that may need to
**To remediate a manual assessment**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Defender for Cloud** > **Regulatory compliance**.
Transparency provided by the compliance offerings (currently in preview) allows
**To check the compliance offerings status**:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Defender for Cloud** > **Regulatory compliance**.
defender-for-iot Respond Ot Alert https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/respond-ot-alert.md
Triage alerts on a regular basis to prevent alert fatigue in your network and en
**To triage alerts**:
-1. In [Defender for IoT](https://ms.portal.azure.com/#view/Microsoft_Azure_IoT_Defender/IoTDefenderDashboard/~/Getting_started) in the Azure portal, go to the **Alerts** page. By default, alerts are sorted by the **Last detection** column, from most recent to oldest alert, so that you can first see the latest alerts in your network.
+1. In [Defender for IoT](https://portal.azure.com/#view/Microsoft_Azure_IoT_Defender/IoTDefenderDashboard/~/Getting_started) in the Azure portal, go to the **Alerts** page. By default, alerts are sorted by the **Last detection** column, from most recent to oldest alert, so that you can first see the latest alerts in your network.
1. Use other filters, such as **Sensor** or **Severity** to find specific alerts.
For high severity alerts, you may want to take action immediately.
## Next steps > [!div class="nextstepaction"]
-> [Enhance security posture with security recommendations](recommendations.md)
+> [Enhance security posture with security recommendations](recommendations.md)
dev-box How To Configure Dev Box Azure Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dev-box/how-to-configure-dev-box-azure-diagnostic-logs.md
A dev center is required for the following step.
Follow these steps to enable logging for your Azure DevCenter resource:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the Azure portal, navigate to **All resources** > **your-devcenter**.
devtest Quickstart Individual Credit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest/offer/quickstart-individual-credit.md
Remember, the account you sign in with will dictate what tenant your directory w
For more details, go to my.visualstudio.com, or follow this link: [Use Microsoft Azure in Visual Studio subscriptions](/visualstudio/subscriptions/vs-azure)
-### Sign in through Azure - [portal.azure.com](https://portal.azure.com)
+### Sign in via the [Azure portal](https://portal.azure.com)
-1. Choose or enter the email address to authenticate.
+1. Choose or enter the email address to authenticate.
- ![A screenshot of the Microsoft Azure pick an account screen.](media/quickstart-individual-credit/pick-an-account.png "Select an account to log into the Azure Portal.")
-2. Once you're logged in, go to Subscriptions under Azure Services.
+ ![A screenshot of the Microsoft Azure pick an account screen.](media/quickstart-individual-credit/pick-an-account.png "Select an account to sign in to the Azure portal.")
- ![A screenshot of the available Azure Portal Services](media/quickstart-individual-credit/azure-services.png "Select Subscriptions under Azure Services.")
-3. Select 'add'.
+2. Once you're signed in, go to Subscriptions under Azure Services.
+
+ ![A screenshot of services available in the Azure portal](media/quickstart-individual-credit/azure-services.png "Select Subscriptions under Azure Services.")
+3. Select **+ Add**.
![A screenshot of a pop up window for adding a subscription](media/quickstart-individual-credit/click-add.png "Click the add button.")
-4. This action takes you to a page where you can find the eligible offers
-5. Select the correct subscription offer to associate with your account
+4. This action takes you to a page where you can find the eligible offers.
+5. Select the correct subscription offer to associate with your account.
> [!NOTE]
-> This method uses the login credentials you used when signing in through your Azure Portal. This way of signing in has a higher probability of associating your subscription with your organizationΓÇÖs directory through your corporate Microsoft Account.
+> This method uses the login credentials you used when signing in to the Azure portal. This way of signing in has a higher probability of associating your subscription with your organization's directory through your corporate Microsoft Account.
<a name="maintain-a-subscription-to-use-monthly-credits"></a> ## Troubleshoot removed/expired subscriptions
digital-twins Concepts Azure Digital Twins Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/concepts-azure-digital-twins-explorer.md
Developers may find this tool especially useful in the following scenarios:
The explorer's main purpose is to help you visualize and understand your graph, and update your graph as needed. For large-scale solutions and for work that should be repeated or automated, consider using the [APIs and SDKs](./concepts-apis-sdks.md) to interact with your instance through code instead. - ## How to access The main way to access Azure Digital Twins Explorer is through the [Azure portal](https://portal.azure.com).
To open Azure Digital Twins Explorer for an Azure Digital Twins instance, first
Azure Digital Twins Explorer is organized into panels, each with a different set of capabilities for exploring and managing your models, twins, and relationships. - The sections of the explorer are as follows: * **Query Explorer**: Run queries against the twin graph and see the visual results in the **Twin Graph** panel. * **Models**: View a list of your models and perform model actions such as add, remove, and view model details.
The sections of the explorer are as follows:
For detailed instructions on how to use each feature, see [Use Azure Digital Twins Explorer](how-to-use-azure-digital-twins-explorer.md). + ## How to contribute Azure Digital Twins Explorer is an open-source tool that welcomes contributions to the code and documentation. The hosted application is deployed regularly from a source code repository in GitHub.
digital-twins How To Use Azure Digital Twins Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-use-azure-digital-twins-explorer.md
The Twin Graph panel also provides several abilities to customize your graph vie
* [Show and hide twin graph elements](#show-and-hide-twin-graph-elements) * [Filter and highlight twin graph elements](#filter-and-highlight-twin-graph-elements) - ### Explore twin data Run a query using the [Query Explorer](#query-your-digital-twin-graph) to see the twins and relationships in the query result displayed in the **Twin Graph** panel.
This section describes how to perform the following management activities:
For information about the viewing experience for twins and relationships, see [Explore twins and the Twin Graph](#explore-the-twin-graph). - ### View flat list of twins and relationships The **Twins** panel shows a flat list of your twins and their associated relationships. You can search for twins by name, and expand them for details about their incoming and outgoing relationships.
The **Twins** panel shows a flat list of your twins and their associated relatio
You can create a new digital twin from its model definition in the **Models** panel. - To create a twin from a model, find that model in the list and choose the menu dots next to the model name. Then, select **Create a Twin**. You'll be asked to enter a **name** for the new twin, which must be unique. Then save the twin, which will add it to your graph. :::image type="content" source="media/how-to-use-azure-digital-twins-explorer/models-panel-create-a-twin.png" alt-text="Screenshot of Azure Digital Twins Explorer Models panel. The menu dots for a single model are highlighted, and the menu option to Create a Twin is also highlighted." lightbox="media/how-to-use-azure-digital-twins-explorer/models-panel-create-a-twin-large.png":::
You can also choose to delete all of the twins in your instance at the same time
## Explore models and the Model Graph - Models can be viewed both in the **Models** panel on the left side of the Azure Digital Twins Explorer screen, and in the **Model Graph** panel in the middle of the screen. The **Models** panel:
The **Models** panel:
The **Model Graph** panel: :::image type="content" source="media/how-to-use-azure-digital-twins-explorer/model-graph-panel.png" alt-text="Screenshot of Azure Digital Twins Explorer. The Model Graph panel is highlighted." lightbox="media/how-to-use-azure-digital-twins-explorer/model-graph-panel.png"::: + You can use these panels to [view your models](#view-models). The Model Graph panel also provides several abilities to customize your graph viewing experience:
You can view a flat list of the models in your instance in the **Models** panel.
You can use the **Model Graph** panel to view a graphical representation of the models in your instance, along with the relationships, inheritance, and components that connect them to each other. + #### View model definition To see the full definition of a model, find that model in the **Models** pane and select the menu dots next to the model name. Then, select **View Model**. Doing so will display a **Model Information** modal showing the raw DTDL definition of the model.
Then, to upload the images at the same time, use the **Upload Model Images** ico
## Manage models - You can use the **Models** panel on the left side of the Azure Digital Twins Explorer screen to perform management activities on the entire set of models, or on individual models. :::image type="content" source="media/how-to-use-azure-digital-twins-explorer/models-panel.png" alt-text="Screenshot of Azure Digital Twins Explorer. The Models panel is highlighted." lightbox="media/how-to-use-azure-digital-twins-explorer/models-panel.png":::
For information about the viewing experience for models, see [Explore models and
You can upload models from your machine by selecting model files individually, or by uploading an entire folder of model files at once. If you're uploading one JSON file that contains the code for many models, be sure to review the [bulk model upload limitations](#limitations-of-bulk-model-upload). + To upload one or more models that are individually selected, select the **Upload a model** icon showing an upwards arrow. :::image type="content" source="media/how-to-use-azure-digital-twins-explorer/models-panel-upload.png" alt-text="Screenshot of Azure Digital Twins Explorer Models panel. The Upload a model icon is highlighted.":::
dns Dns Private Resolver Get Started Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/dns-private-resolver-get-started-template.md
New-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -TemplateUri
## Validate the deployment
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Resource groups** from the left pane.
dns Dns Reverse Dns Hosting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/dns-reverse-dns-hosting.md
Last updated 04/27/2023 -+ ms.devlang: azurecli
dns Dns Reverse Dns Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/dns-reverse-dns-overview.md
na+ Last updated 04/27/2023
education-hub It Admin Allocate Credit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/education-hub/it-admin-allocate-credit.md
After you have created credits, they will be shown as rows in the "Credits" tab.
3. You can also modify which Educators have access to the Credit. To do this, navigate to Cost Management and add or remove Educators from the billing profile associated with the credit.
-The chosen Educators should now receive an email inviting them to visit the Education Hub to begin using these Credits. Ensure the Educators log into the Azure portal with the account associated with the credit's billing profile.
+The chosen Educators should now receive an email inviting them to visit the Education Hub to begin using these Credits. Ensure the Educators sign in to the Azure portal with the account associated with the credit's billing profile.
## Next steps
event-grid Communication Services Email Events https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/communication-services-email-events.md
This section contains an example of what that data would look like for each even
``` > [!NOTE]
-> Possible values for `Status` are `Delivered`, `Expanded`, `Bounced`, `Suppressed`, `FilteredSpam` and `Failed`.
+> Possible values for `Status` are:
+> - `Delivered`: The message was successfully handed over to the intended destination (recipient Mail Transfer Agent).
+> - `Suppressed`: The recipient email had hard bounced previously, and all subsequent emails to this recipient are being temporarily suppressed as a result.
+> - `Bounced`: The email hard bounced, which may have happened because the email address does not exist or the domain is invalid.
+> - `Quarantined`: The message was quarantined (as spam, bulk mail, or phishing).
+> - `FilteredSpam`: The message was identified as spam, and was rejected or blocked (not quarantined).
+> - `Expanded`: A distribution group recipient was expanded before delivery to the individual members of the group.
+> - `Failed`: The message wasn't delivered.
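Event handlers typically branch on these statuses; here's a minimal Python sketch, where the payload shape and field names are assumptions for illustration rather than the exact Event Grid envelope:

```python
# Hypothetical handler logic for an email delivery report event.
# Statuses that indicate the message did not reach the recipient.
TERMINAL_FAILURES = {"Suppressed", "Bounced", "Quarantined", "FilteredSpam", "Failed"}

def classify_delivery(event: dict) -> str:
    """Map a delivery report's status to a coarse outcome."""
    status = event.get("data", {}).get("status")
    if status == "Delivered":
        return "delivered"
    if status == "Expanded":
        # Distribution-group expansion is informational, not a failure.
        return "expanded"
    if status in TERMINAL_FAILURES:
        return "failed"
    return "unknown"
```

Grouping the failure statuses up front keeps the handler easy to extend if new statuses are added later.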
### Microsoft.Communication.EmailEngagementTrackingReportReceived event
This section contains an example of what that data would look like for each even
``` > [!NOTE]
-> Possible values for `engagementType` are `View`, and `Click`. When the `engagementType` is `Click`, `engagementContext` is the link in the Email sent which was clicked.
+> Possible values for `engagementType` are `View` and `Click`. When the `engagementType` is `Click`, `engagementContext` is the link in the Email sent which was clicked.
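The two engagement types can be handled in a similar way; a minimal Python sketch, with an assumed payload shape (not the exact Event Grid envelope):

```python
def summarize_engagement(event: dict) -> str:
    """Summarize an engagement tracking report (assumed payload shape)."""
    data = event.get("data", {})
    engagement = data.get("engagementType")
    if engagement == "Click":
        # For clicks, engagementContext carries the link that was clicked.
        return f"clicked {data.get('engagementContext')}"
    if engagement == "View":
        return "viewed"
    return "unknown"
```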
## Tutorial For a tutorial that shows how to subscribe for email events using web hooks, see [Quickstart: Handle email events](../communication-services/quickstarts/email/handle-email-events.md).
event-grid Concepts Pull Delivery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/concepts-pull-delivery.md
This article describes the main concepts related to the new resource model that
> [!NOTE] > For Event Grid concepts related to push delivery exclusively used in custom, system, partner, and domain topics, see this [concepts](concepts.md) article. + ## Events An event is the smallest amount of information that fully describes something that happened in a system. Every event has common information like `source` of the event, `time` the event took place, and a unique identifier. Every event also has specific information that is only relevant to the specific type of event. For example, an event about a new file being created in Azure Storage has details about the file, such as the `lastTimeModified` value. An Event Hubs event has the `URL` of the Capture file. An event about a new order in your Orders microservice may have an `orderId` attribute and a `URL` attribute to the order's state representation.
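The common-plus-specific split described above can be pictured with two illustrative payloads; every value below is invented for the example, and only the overall shape follows the paragraph:

```python
# Two illustrative events: a shared envelope (source, time, id, type)
# plus type-specific information under "data". Values are made up.
blob_created = {
    "source": "/storage/contosoaccount",
    "time": "2023-07-18T12:00:00Z",
    "id": "9aeb0fdf-c01e-0131-0922-9eb54906e209",
    "type": "Microsoft.Storage.BlobCreated",
    "data": {"lastTimeModified": "2023-07-18T11:59:58Z"},
}
order_created = {
    "source": "/orders-service",
    "time": "2023-07-18T12:01:00Z",
    "id": "1f0c3a52-7e2d-4b8e-9a11-2f5d0c6b7a01",
    "type": "Contoso.Orders.OrderCreated",
    "data": {"orderId": "12345", "url": "https://orders.contoso.example/12345"},
}

# Every event shares the same envelope; only "data" varies by event type.
COMMON_FIELDS = {"source", "time", "id", "type", "data"}
```

Consumers can rely on the envelope for routing and deduplication, and only need type-specific parsing for the `data` payload.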
event-grid Create View Manage Event Subscriptions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/create-view-manage-event-subscriptions.md
Last updated 05/24/2023
# Create, view, and manage event subscriptions in namespace topics
+This article shows you how to create, view, and manage event subscriptions to namespace topics in Azure Event Grid.
+ ## Create an event subscription
event-grid Create View Manage Namespace Topics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/create-view-manage-namespace-topics.md
Last updated 05/23/2023
# Create, view, and manage namespace topics
+This article shows you how to create, view, and manage namespace topics in Azure Event Grid.
+ ## Create a namespace topic
event-grid Create View Manage Namespaces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/create-view-manage-namespaces.md
Last updated 05/23/2023
A namespace in Azure Event Grid is a logical container for one or more topics, clients, client groups, topic spaces, and permission bindings. It provides a unique namespace, allowing you to have multiple resources in the same Azure region. With an Azure Event Grid namespace, you can now group together related resources and manage them as a single unit in your Azure subscription. + This article shows you how to use the Azure portal to create, view, and manage an Azure Event Grid namespace. ## Create a namespace
event-grid Custom Disaster Recovery Client Side https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/custom-disaster-recovery-client-side.md
The following table illustrates the client-side failover and geo disaster recove
| Partner Namespaces | Supported | Not supported | | Namespaces | Supported | Not supported | ++ ## Client-side failover considerations 1. Create and configure your **primary** Event Grid resource.
event-grid Event Hubs Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/event-hubs-integration.md
In this step, you deploy the required infrastructure with a [Resource Manager te
### Use Azure CLI to deploy the infrastructure
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Cloud Shell** button at the top. :::image type="content" source="media/event-hubs-functions-synapse-analytics/azure-portal.png" alt-text="Screenshot of Azure portal showing the selection of Cloud Shell button.":::
After publishing the function, you're ready to subscribe to the event.
## Subscribe to the event
-1. In a new tab or new window of a web browser, navigate to the [Azure portal](https://portal.azure.com).
+1. In a new tab or new window of a web browser, sign in to the [Azure portal](https://portal.azure.com).
2. In the Azure portal, select **Resource groups** on the left menu. 3. Filter the list of resource groups by entering the name of your resource group in the search box. 4. Select your resource group in the list.
event-grid Monitor Mqtt Delivery Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/monitor-mqtt-delivery-reference.md
Last updated 05/23/2023
# Monitor data reference for Azure Event Grid's MQTT delivery This article provides a reference of log and metric data collected to analyze the performance and availability of Azure Event Grid's MQTT delivery. + ## Metrics | Metric | Display name | Unit | Aggregation | Description | Dimensions |
event-grid Monitor Pull Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/monitor-pull-reference.md
Last updated 04/28/2023
# Monitor data reference for Azure Event Grid's pull event delivery This article provides a reference of log and metric data collected to analyze the performance and availability of Azure Event Grid's pull delivery. + ## Metrics ### Microsoft.EventGrid/namespaces
event-grid Mqtt Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-access-control.md
# Access control for MQTT clients
-Access control enables you to manage the authorization of clients to publish or subscribe to topics, using a role-based access control model. Given the enormous scale of IoT environments, assigning permission for each client to each topic is incredibly tedious. Event Grid's flexible access control tackles this scale challenge through grouping clients and topics into client groups and topic spaces. The main components of the access control model are:
+Access control enables you to manage the authorization of clients to publish or subscribe to topics, using a role-based access control model. Given the enormous scale of IoT environments, assigning permission for each client to each topic is incredibly tedious. Event Grid's flexible access control tackles this scale challenge through grouping clients and topics into client groups and topic spaces.
++
+The main components of the access control model are:
A **[client](mqtt-clients.md)** represents the device or application that needs to publish and/or subscribe to MQTT topics.
A **permission binding** grants access to a specific client group to publish or
:::image type="content" source="media/mqtt-overview/access-control-high-res.png" alt-text="Diagram of the access control model." border="false"::: ## Examples: The following examples detail how to configure the access control model based on the following requirements.
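The components above can be sketched as plain data: a permission binding grants a client group either Publisher or Subscriber access to a topic space. The group and topic-space names here are illustrative, and this mirrors the model only, not the service's API:

```python
# Illustrative model of permission bindings: client group -> topic space
# with a Publisher or Subscriber permission. Names are made up.
bindings = [
    {"clientGroup": "machines", "topicSpace": "telemetry", "permission": "Publisher"},
    {"clientGroup": "apps", "topicSpace": "telemetry", "permission": "Subscriber"},
]

def is_allowed(client_groups, topic_space, permission):
    """True if any binding grants the permission to one of the client's groups."""
    return any(
        b["clientGroup"] in client_groups
        and b["topicSpace"] == topic_space
        and b["permission"] == permission
        for b in bindings
    )

allowed = is_allowed({"machines"}, "telemetry", "Publisher")
denied = is_allowed({"machines"}, "telemetry", "Subscriber")
```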
event-grid Mqtt Automotive Connectivity And Data Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-automotive-connectivity-and-data-solution.md
This reference architecture is designed to support automotive OEMs and Mobility Providers in the development of advanced connected vehicle applications and digital services. Its goal is to provide reliable and efficient messaging, data and analytics infrastructure. The architecture includes message processing, command processing, and state storage capabilities to facilitate the integration of various services through managed APIs. It also describes a data and analytics solution that ensures the storage and accessibility of data in a scalable and secure manner for digital engineering and data sharing with the wider mobility ecosystem. + ## Architecture :::image type="content" source="media/mqtt-automotive-connectivity-and-data-solution/high-level-architecture.png" alt-text="Diagram of the high-level architecture." border="false" lightbox="media/mqtt-automotive-connectivity-and-data-solution/high-level-architecture.png":::
event-grid Mqtt Certificate Chain Client Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-certificate-chain-client-authentication.md
# Client authentication using CA certificate chain Use CA certificate chain in Azure Event Grid to authenticate clients while connecting to the service. + In this guide, you perform the following tasks: 1. Upload a CA certificate, the immediate parent certificate of the client certificate, to the namespace. 2. Configure client authentication settings.
event-grid Mqtt Client Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-client-authentication.md
We support authentication of clients using X.509 certificates. X.509 certificate provides the credentials to associate a particular client with the tenant. In this model, authentication generally happens once during session establishment. Then, all future operations using the same session are assumed to come from that identity. + ## Supported authentication modes - Certificates issued by a Certificate Authority (CA)
event-grid Mqtt Client Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-client-groups.md
# Client groups Client groups allow you to group a set of clients together based on commonalities. The main purpose of client groups is to make configuring authorization easy. You can authorize a client group to publish or subscribe to a topic space. All the clients in the client group are authorized to perform the publish or subscribe action on the topic space. In a namespace, we provide a default client group named "$all". The client group includes all the clients in the namespace. For ease of testing, you can use $all to configure permissions. > [!NOTE]
event-grid Mqtt Client Life Cycle Events https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-client-life-cycle-events.md
Client Life Cycle events allow applications to react to client connection or disconnection events. For example, you can build an application that updates a database, creates a ticket, and delivers an email notification every time a client is disconnected for mitigating action. + ## Event types The Event Grid namespace publishes the following event types:
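A handler for these notifications might simply dispatch on the event type. The event type strings below are assumptions based on Event Grid's MQTT naming; verify them against the published event list for your API version:

```python
# Sketch of reacting to client life cycle events. The
# "...MQTTClientSessionConnected/Disconnected" suffixes are assumed
# names; check the event schema for your API version.

def handle_lifecycle_event(event):
    client = event["data"].get("clientName", "<unknown>")
    if event["eventType"].endswith("MQTTClientSessionConnected"):
        return f"{client} connected"
    if event["eventType"].endswith("MQTTClientSessionDisconnected"):
        # e.g. update a database, create a ticket, or send an email here
        return f"{client} disconnected"
    return "ignored"

msg = handle_lifecycle_event({
    "eventType": "Microsoft.EventGrid.MQTTClientSessionDisconnected",
    "data": {"clientName": "device1"},
})
```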
event-grid Mqtt Clients https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-clients.md
# MQTT clients In this article, you learn about configuring MQTT clients and client groups. + ## Clients Clients can be devices or applications, such as devices or vehicles that send/receive MQTT messages.
event-grid Mqtt Establishing Multiple Sessions Per Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-establishing-multiple-sessions-per-client.md
In this guide, you learn how to establish multiple sessions for a single client to an Event Grid namespace. + ## Prerequisites - You have an Event Grid namespace created. Refer to this [Quickstart - Publish and subscribe on a MQTT topic](mqtt-publish-and-subscribe-portal.md) to create the namespace, subresources, and to publish/subscribe on a topic.
event-grid Mqtt Event Grid Namespace Terminology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-event-grid-namespace-terminology.md
# Terminology Key terms relevant for Event Grid namespace and MQTT resources are explained. + ## Namespace An Event Grid namespace is a declarative space that provides a scope to all the nested resources or subresources such as topics, certificates, clients, client groups, topic spaces, permission bindings.
event-grid Mqtt Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-overview.md
Title: 'Overview of the MQTT Support in Azure Event Grid'
+ Title: 'Overview of MQTT Support in Azure Event Grid (preview)'
description: 'Describes the main concepts for the MQTT Support in Azure Event Grid.' Last updated 05/23/2023
# Overview of the MQTT Support in Azure Event Grid (Preview)
-Azure Event Grid enables your MQTT clients to communicate with each other and with Azure services, to support your Internet of Things (IoT) solutions. Event Grid's MQTT support enables you to accomplish the following scenarios:
+Azure Event Grid enables your MQTT clients to communicate with each other and with Azure services, to support your Internet of Things (IoT) solutions.
++
+Event Grid's MQTT support enables you to accomplish the following scenarios:
+ - Ingest telemetry using a many-to-one messaging pattern. This pattern enables the application to offload the burden of managing the high number of connections with devices to Event Grid. - Control your MQTT clients using the request-response (one-to-one) messaging pattern. This pattern enables any client to communicate with any other client without restrictions, regardless of the clients' roles. - Broadcast alerts to a fleet of clients using the one-to-many messaging pattern. This pattern enables the application to publish only one message that the service replicates for every interested client.
The MQTT support in Event Grid is ideal for the implementation of automotive and
:::image type="content" source="media/overview/mqtt-messaging-high-res.png" alt-text="High-level diagram of Event Grid that shows bidirectional MQTT communication with publisher and subscriber clients." border="false":::
-> [!NOTE]
-> This feature is currently in preview. It's provided without a service level agreement, and is not recommended for production workloads. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+ ## Key concepts: The following is a list of key concepts involved in MQTT messaging on Event Grid.
event-grid Mqtt Publish And Subscribe Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-publish-and-subscribe-cli.md
Azure Event Grid supports messaging using the MQTT protocol. Clients (both devices and cloud applications) can publish and subscribe MQTT messages over flexible hierarchical topics for scenarios such as high scale broadcast, and command & control. + In this article, you use the Azure CLI to do the following tasks: 1. Create an Event Grid namespace and enable MQTT 2. Create subresources such as clients, client groups, and topic spaces
event-grid Mqtt Publish And Subscribe Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-publish-and-subscribe-portal.md
In this article, you use the Azure portal to do the following tasks:
3. Grant clients access to publish and subscribe to topic spaces 4. Publish and receive messages between clients + ## Prerequisites - If you don't have an Azure subscription, create an Azure free account before you begin.
event-grid Mqtt Routing Enrichment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing-enrichment.md
# Enrichments for MQTT Routed Messages-- The enrichments support enables you to add up to 20 custom key-value properties to your messages before they're sent to the Event Grid custom topic. These enrichments enable you to: - Add contextual data to your messages. For example, enriching the message with the client's name or the namespace name could provide endpoints with information about the source of the message. - Reduce computing load on endpoints. For example, enriching the message with the MQTT publish request's payload format indicator or the content type informs endpoints how to process the message's payload without trying multiple parsers first. - Filter your routed messages through Event Grid event subscriptions based on the added data. For example, enriching a client attribute enables you to filter the messages to be routed to the endpoint based on the different attribute's values. + ## Configuration ### Enrichment Key: The enrichment key is a string that needs to comply with these requirements: - Include only lower-case alphanumerics: only (a-z) and (0-9)-- Must not be "specversion", "id", "time", "type", "source", "subject", "datacontenttype", "dataschema", "data", or "data_base64".-- Must not start with "azsp".
+- Must not be `specversion`, `id`, `time`, `type`, `source`, `subject`, `datacontenttype`, `dataschema`, `data`, or `data_base64`.
+- Must not start with `azsp`.
- Must not be duplicated. - Must not be more than 20 characters.
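The key rules above can be sanity-checked client-side with a small validator. This is a sketch for local checks only; the service performs its own authoritative validation:

```python
import re

# Reserved CloudEvents attribute names that an enrichment key must not use.
RESERVED = {"specversion", "id", "time", "type", "source", "subject",
            "datacontenttype", "dataschema", "data", "data_base64"}

def validate_enrichment_keys(keys):
    """Return a list of rule violations for the proposed enrichment keys."""
    errors = []
    seen = set()
    for key in keys:
        if not re.fullmatch(r"[a-z0-9]+", key):
            errors.append(f"{key!r}: only lower-case a-z and 0-9 allowed")
        if key in RESERVED:
            errors.append(f"{key!r}: reserved CloudEvents attribute")
        if key.startswith("azsp"):
            errors.append(f"{key!r}: must not start with 'azsp'")
        if len(key) > 20:
            errors.append(f"{key!r}: longer than 20 characters")
        if key in seen:
            errors.append(f"{key!r}: duplicate key")
        seen.add(key)
    return errors

ok = validate_enrichment_keys(["clientname", "region1"])   # no violations
bad = validate_enrichment_keys(["data", "azspfoo", "Name"])  # three violations
```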
The following CloudEvent is a sample output of a MQTTv5 message with PFI=0 after
### Handling special cases: -- Unspecified client attributes/user properties: if a dynamic enrichment pointed to a client attribute/user property that doesn't exist, the enrichment will include the specified key with an empty string for a value. For example, "emptyproperty": "".-- Arrays: Arrays in client attributes and duplicate user properties are transformed to a comma-separated string. For example: if the enriched client attribute is set to be "array": "value1", "value2", "value3", the resulting enriched property will be "array": "value1,value2,value3". Another example: if the same MQTT publish request has the following user properties > "userproperty1": "value1", "userproperty1": "value2", resulting enriched property will be "userproperty1": "value1,value2".
+- Unspecified client attributes/user properties: if a dynamic enrichment pointed to a client attribute/user property that doesn't exist, the enrichment will include the specified key with an empty string for a value. For example, `emptyproperty`: "".
+- Arrays: Arrays in client attributes and duplicate user properties are transformed to a comma-separated string. For example: if the enriched client attribute is set to be `array`: `value1`, `value2`, `value3`, the resulting enriched property will be `array`: `value1,value2,value3`. Another example: if the same MQTT publish request has the following user properties > "userproperty1": "value1", "userproperty1": "value2", the resulting enriched property will be `userproperty1`: `value1,value2`.
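The flattening described above can be sketched as:

```python
# Sketch: array-valued client attributes and duplicate MQTT user
# properties become one comma-separated string in the enriched property.

def flatten(values):
    return ",".join(values)

array_attr = flatten(["value1", "value2", "value3"])

# Duplicate user properties arrive as repeated (name, value) pairs and
# are merged into a single comma-separated value per name.
props = [("userproperty1", "value1"), ("userproperty1", "value2")]
merged = {}
for name, value in props:
    merged[name] = f"{merged[name]},{value}" if name in merged else value
```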
## Next steps:
event-grid Mqtt Routing Event Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing-event-schema.md
# Event Schema for MQTT Routed Messages MQTT Messages are routed to an Event Grid topic as CloudEvents according to the following logic: For MQTT v3 messages or MQTT v5 messages with a payload format indicator of 0, the payload will be forwarded in the data_base64 object and encoded as a base64 string according to the following schema sample.
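Assuming an illustrative event with a small binary payload, the base64 handling looks like this. Field values other than `data_base64` are placeholders rather than the exact routed schema:

```python
import base64

# Sketch of how a binary MQTT payload (v3, or v5 with payload format
# indicator = 0) would appear in the routed CloudEvent: base64-encoded
# under data_base64. The type/source/subject values are illustrative.

payload = b'{"temperature": 23}'
cloud_event = {
    "specversion": "1.0",
    "type": "MQTT.EventPublished",           # illustrative type name
    "source": "testnamespace",               # illustrative source
    "subject": "devices/device1/telemetry",  # the originating MQTT topic
    "data_base64": base64.b64encode(payload).decode("ascii"),
}

# A consumer recovers the original bytes by decoding data_base64.
round_trip = base64.b64decode(cloud_event["data_base64"])
```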
event-grid Mqtt Routing Filtering https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing-filtering.md
# Filtering of MQTT Routed Messages You can use the Event Grid Subscription's filtering capability to filter the routed MQTT messages. ## Topic filtering You can filter on the messages' MQTT topics through filtering on the "subject" property in the Cloud Event schema. Event Grid Subscriptions support free simple subject filtering by specifying a starting or ending value for the subject. For example,
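A minimal sketch of that begins-with/ends-with matching on the subject:

```python
# Sketch of simple subject filtering: an event subscription matches the
# routed MQTT topic (carried in the CloudEvent "subject") by a
# begins-with or ends-with value.

def subject_matches(subject, begins_with=None, ends_with=None):
    if begins_with is not None and not subject.startswith(begins_with):
        return False
    if ends_with is not None and not subject.endswith(ends_with):
        return False
    return True

m1 = subject_matches("vehicles/vehicle1/telemetry", begins_with="vehicles/")
m2 = subject_matches("vehicles/vehicle1/gps", ends_with="telemetry")
```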
event-grid Mqtt Routing To Event Hubs Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing-to-event-hubs-cli.md
Use message routing in Azure Event Grid to send data from your MQTT clients to Azure services such as storage queues, and Event Hubs. + In this article, you perform the following tasks: - Create Event Subscription in your Event Grid topic - Configure routing in your Event Grid Namespace
event-grid Mqtt Routing To Event Hubs Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing-to-event-hubs-portal.md
In this tutorial, you perform the following tasks:
- Configure routing in your Event Grid Namespace. - View the MQTT messages in the Event Hubs using Azure Stream Analytics. + ## Prerequisites - If you don't have an [Azure subscription](/azure/guides/developer/azure-developer-guide#understanding-accounts-subscriptions-and-billing), create an [Azure free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
event-grid Mqtt Routing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-routing.md
Event Grid allows you to route your MQTT messages to Azure services or webhooks for further processing. Accordingly, you can build end-to-end solutions by leveraging your IoT data for data analysis, storage, and visualizations, among other use cases. :::image type="content" source="media/mqtt-overview/routing-high-res.png" alt-text="Diagram of the MQTT message routing." border="false"::: ## How can I use the routing feature?
The routing configuration enables you to send all your messages from your client
The Event Grid custom topic that is used for routing needs to fulfill the following requirements: - It needs to be set to use the Cloud Event Schema v1.0 - It needs to be in the same region as the namespace.-- You need to assign "EventGrid Data Sender" role to yourself on the Event Grid custom topic.
+- You need to assign "Event Grid Data Sender" role to yourself on the Event Grid custom topic.
- In the portal, go to the created Event Grid topic resource. - In the "Access control (IAM)" menu item, select "Add a role assignment".
- - In the "Role" tab, select "EventGrid Data Sender", then select "Next".
+ - In the "Role" tab, select "Event Grid Data Sender", then select "Next".
- In the "Members" tab, select +Select members, then type your AD user name in the "Select" box that will appear (for example, [user@contoso.com](mailto:user@contoso.com)). - Select your AD user name, then select "Review + assign"
event-grid Mqtt Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-support.md
# MQTT features support in Azure Event Grid MQTT is a publish-subscribe messaging transport protocol that was designed for constrained environments. It's efficient, scalable, and reliable, which made it the gold standard for communication in IoT scenarios. Event Grid supports clients that publish and subscribe to messages over MQTT v3.1.1, MQTT v3.1.1 over WebSockets, MQTT v5, and MQTT v5 over WebSockets. Event Grid also supports cross MQTT version (MQTT 3.1.1 and MQTT 5) communication. MQTT v5 has introduced many improvements over MQTT v3.1.1 to deliver more seamless, transparent, and efficient communication. It added: - Better error reporting. - More transparent communication clients through features like user properties and content type.
event-grid Mqtt Topic Spaces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-topic-spaces.md
A topic space represents multiple topics through a set of topic templates. Topic templates are an extension of MQTT filters that support variables, along with the MQTT wildcards. Each topic space represents the MQTT topics that the same set of clients need to use to communicate. + Topic spaces are used to simplify access control management by enabling you to grant publish or subscribe access to a group of topics at once instead of managing access for each individual topic. To publish or subscribe to any MQTT topic, you need to: 1. Create a **client** resource for each client that needs to communicate over MQTT.
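As a sketch, a topic template with a per-client variable expands like this. The `${client.authenticationName}` syntax is taken from topic-space examples elsewhere in this documentation set and should be verified against your API version:

```python
# Sketch: a topic template combines MQTT wildcards (#) with a per-client
# variable, so one template covers a distinct topic per client.
# ${client.authenticationName} is an assumed variable name.

def expand_template(template, client_name):
    """Substitute the client variable to see which topics a client gets."""
    return template.replace("${client.authenticationName}", client_name)

template = "vehicles/${client.authenticationName}/telemetry/#"
expanded = expand_template(template, "vehicle1")
```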
event-grid Mqtt Troubleshoot Errors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-troubleshoot-errors.md
This guide provides you with information on what you can do to troubleshoot things before you submit a support ticket. + ## Unable to connect an MQTT client to your Event Grid namespace If your client doesn't connect to your Event Grid namespace, a few things to verify:
event-grid Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/overview.md
Azure Event Grid is a generally available service deployed across availability z
>[!NOTE] >The following features have been released with our 2023-06-01-preview API: >
->- MQTT v3.1.1 and v5.0 support
->- Pull-style event consumption (HTTP)
+>- MQTT v3.1.1 and v5.0 support (preview)
+>- Pull-style event consumption using HTTP (preview)
>
->The initial regions where these features are available are:
->
->- East US
->- Central US
->- South Central US
->- West US 2
->- East Asia
->- Southeast Asia
->- North Europe
->- West Europe
->- UAE North
+>The initial regions where these features are available are: East US, Central US, South Central US, West US 2, East Asia, Southeast Asia, North Europe, West Europe, UAE North
+ ## Overview Azure Event Grid is used at different stages of data pipelines to achieve a diverse set of integration goals.
-**MQTT messaging**. IoT devices and applications can communicate with each other over MQTT. Event Grid can also be used to route MQTT messages to Azure services or custom endpoints for further data analysis, visualization, or storage. This integration with Azure services enables you to build data pipelines that start with data ingestion from your IoT devices.
+**MQTT messaging (preview)**. IoT devices and applications can communicate with each other over MQTT. Event Grid can also be used to route MQTT messages to Azure services or custom endpoints for further data analysis, visualization, or storage. This integration with Azure services enables you to build data pipelines that start with data ingestion from your IoT devices.
-**Data distribution using push and pull delivery modes**. At any point in a data pipeline, HTTP applications can consume messages using push or pull APIs. The source of the data may include MQTT clients' data, but also includes the following data sources that send their events over HTTP:
+**Data distribution using push and pull delivery (preview) modes**. At any point in a data pipeline, HTTP applications can consume messages using push or pull APIs. The source of the data may include MQTT clients' data, but also includes the following data sources that send their events over HTTP:
- Azure services - Your custom applications
When configuring Event Grid for push delivery, Event Grid can send data to [dest
Event Grid offers a rich mixture of features. These features include:
-### MQTT messaging
+### MQTT messaging (preview)
- **[MQTT v3.1.1 and MQTT v5.0](mqtt-publish-and-subscribe-portal.md)** support - use any open source MQTT client library to communicate with the service. - **Custom topics with wildcards support** - leverage your own topic structure.
Event Grid offers a rich mixture of features. These features include:
### Event messaging (HTTP) -- **Flexible event consumption model** - when using HTTP, consume events using pull or push delivery mode.
+- **Flexible event consumption model** - when using HTTP, consume events using pull (preview) or push delivery mode.
- **System events** - Get up and running quickly with built-in Azure service events. - **Your own application events** - Use Event Grid to route, filter, and reliably deliver custom events from your app. - **Partner events** - Subscribe to your partner SaaS provider events and process them on Azure.
One or more clients can connect to Azure Event Grid to read messages at their ow
You can configure **private links** to connect to Azure Event Grid to **publish and read** CloudEvents through a [private endpoint](../private-link/private-endpoint-overview.md) in your virtual network. Traffic between your virtual network and Event Grid travels the Microsoft backbone network. >[!Important]
-> Private links are available with pull delivery, not with push delivery. This is not a gap. Private links “…[enables you to access Azure PaaS Services](../private-link/private-link-overview.md)…” That is, private links were designed to be used used when you connect to Event Grid for publishing events or receiving events, not when Event Grid is connecting (sending events) to your webhook or Azure Service.
+> Private links are available with pull delivery, not with push delivery. This is not a gap. Private links “…[enables you to access Azure PaaS Services](../private-link/private-link-overview.md)…” That is, private links were designed to be used when you connect to Event Grid for publishing events or receiving events, not when Event Grid is connecting (sending events) to your webhook or Azure Service.
## How much does Event Grid cost?
-Azure Event Grid uses a pay-per-event pricing model. You only pay for what you use. For the push-style delivery that is generally available, the first 100,000 operations per month are free. Examples of operations include event publication, event delivery, delivery attempts, event filter evaluations that refer to event data properties (sometimes referred as Advanced Filters), and events sent to a dead letter location. For details, see the [pricing page](https://azure.microsoft.com/pricing/details/event-grid/).
+Azure Event Grid uses a pay-per-event pricing model. You only pay for what you use. For the push-style delivery that is generally available, the first 100,000 operations per month are free. Examples of operations include event publication, event delivery, delivery attempts, event filter evaluations that refer to event data properties (sometimes referred to as Advanced Filters), and events sent to a dead letter location. For details, see the [pricing page](https://azure.microsoft.com/pricing/details/event-grid/).
Event Grid operations involving Namespaces and its resources, including MQTT and pull HTTP delivery operations, are in public preview and are available at no charge today.
event-grid Publish Events Using Namespace Topics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/publish-events-using-namespace-topics.md
Last updated 05/24/2023
This article describes the steps to publish and consume events using the [CloudEvents](https://github.com/cloudevents/spec) [JSON format](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/formats/json-format.md) with namespace topics and event subscriptions.
-Follow the steps in this article if you need to send application events to Event Grid so that they're received by consumer clients. Consumers connect to Event Grid to read the events ([pull delivery](pull-delivery-overview.md)).
-
->[!Important]
-> Namespaces, namespace topics, and event subscriptions associated to namespace topics are iniatially available in the following regions:
->
->- East US
->- Central US
->- South Central US
->- West US 2
->- East Asia
->- Southeast Asia
->- North Europe
->- West Europe
->- UAE North
+Follow the steps in this article if you need to send application events to Event Grid so that they're received by consumer clients. Consumers connect to Event Grid to read the events ([pull delivery](pull-delivery-overview.md)).
->[!Important]
-> The Azure [CLI Event Grid extension](/cli/azure/eventgrid) does not yet support namespaces and any of the resources it contains. We will use [Azure CLI resource](/cli/azure/resource) to create Event Grid resources.
-
->[!Important]
-> Azure Event Grid namespaces currently supports Shared Access Signatures (SAS) token and access keys authentication.
+>[!NOTE]
+> - Namespaces, namespace topics, and event subscriptions associated to namespace topics are initially available in the following regions: East US, Central US, South Central US, West US 2, East Asia, Southeast Asia, North Europe, West Europe, UAE North
+> - The Azure [CLI Event Grid extension](/cli/azure/eventgrid) does not yet support namespaces and any of the resources it contains. We will use [Azure CLI resource](/cli/azure/resource) to create Event Grid resources.
+> - Azure Event Grid namespaces currently support Shared Access Signatures (SAS) token and access keys authentication.
[!INCLUDE [quickstarts-free-trial-note.md](../../includes/quickstarts-free-trial-note.md)]
key=$(az resource invoke-action --action listKeys --ids $namespace_resource_id -
``` ### Publish an event
-Retrieve the namespace hostname. You use it to compose the namespace HTTP endpoint to which events are sent. Please note that the following operations were first available with API version `2023-06-01-preview`.
+Retrieve the namespace hostname. You use it to compose the namespace HTTP endpoint to which events are sent. Note that the following operations were first available with API version `2023-06-01-preview`.
```azurecli-interactive publish_operation_uri="https://"$(az resource show --resource-group $resource_group --namespace Microsoft.EventGrid --resource-type namespaces --name $namespace --query "properties.topicsConfiguration.hostname" --output tsv)"/topics/"$topic:publish?api-version=2023-06-01-preview
Finally, submit a request to acknowledge the event received:
curl -X POST -H "Content-Type: application/json" -H "Authorization:SharedAccessKey $key" -d "$acknowledge_request_payload" $acknowledge_operation_uri ```
-If the acknowledge operation is executed before the lock token expires (300 seconds as set when we created the event subscription), you should see a response like the following:
+If the acknowledge operation is executed before the lock token expires (300 seconds as set when we created the event subscription), you should see a response like the following example:
```json {"succeededLockTokens":["CiYKJDQ4NjY5MDEyLTk1OTAtNDdENS1BODdCLUYyMDczNTYxNjcyMxISChDZae43pMpE8J8ovYMSQBZS"],"failedLockTokens":[]}
event-grid Publish Iot Hub Events To Logic Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/publish-iot-hub-events-to-logic-apps.md
This article walks through a sample configuration that uses IoT Hub and Event Gr
You can quickly create a new IoT hub using the Azure Cloud Shell terminal in the portal.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. On the upper right of the page, select the Cloud Shell button.
event-grid Pull Delivery Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/pull-delivery-overview.md
# Pull delivery with HTTP (Preview) This article builds on [What is Azure Event Grid?](overview.md) to provide essential information before you start using Event GridΓÇÖs pull delivery over HTTP. It covers fundamental concepts, resource models, and message delivery modes supported. At the end of this document, you find useful links to articles that guide you on how to use Event Grid and to articles that offer in-depth conceptual information.
->[!Important]
+>[!NOTE]
> This document helps you get started with Event Grid capabilities that use the HTTP protocol. This article is suitable for users who need to integrate applications on the cloud. If you require to communicate IoT device data, see [Overview of the MQTT Support in Azure Event Grid](mqtt-overview.md).
-> [!NOTE]
-> This feature is currently in preview. It's provided without a service level agreement, and is not recommended for production workloads. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
## Core concepts
event-grid Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Event Grid description: Lists Azure Policy Regulatory Compliance controls available for Azure Event Grid. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
event-grid Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/whats-new.md
Azure Event Grid receives improvements on an ongoing basis. To stay up to date w
The following features have been released as public preview features in May 2023: -- Pull delivery (HTTP)
+- Pull-style event consumption using HTTP
- MQTT v3.1.1 and v5.0 support ++ Here are the articles that we recommend you read through to learn about these features.
-### Pull delivery (HTTP)
+### Pull delivery using HTTP (preview)
- [Introduction to pull delivery of events](pull-delivery-overview.md#pull-delivery-1) - [Publish and subscribe using namespace topics](publish-events-using-namespace-topics.md)
Here are the articles that we recommend you read through to learn about these fe
- [Create, view, and manage namespace topics](create-view-manage-namespace-topics.md) - [Create, view, and manage event subscriptions](create-view-manage-event-subscriptions.md)
-### MQTT messaging
+### MQTT messaging (preview)
- [Introduction to MQTT messaging in Azure Event Grid](mqtt-overview.md) - Publish and subscribe to MQTT messages on Event Grid namespace - [Azure portal](mqtt-publish-and-subscribe-portal.md), [CLI](mqtt-publish-and-subscribe-cli.md)
event-hubs Process Data Azure Stream Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/process-data-azure-stream-analytics.md
Here are the key benefits of Azure Event Hubs and Azure Stream Analytics integra
> [!IMPORTANT] > If you aren't a member of [owner](../role-based-access-control/built-in-roles.md#owner) or [contributor](../role-based-access-control/built-in-roles.md#contributor) roles at the Azure subscription level, you must be a member of the [Stream Analytics Query Tester](../role-based-access-control/built-in-roles.md#stream-analytics-query-tester) role at the Azure subscription level to successfully complete steps in this section. This role allows you to perform testing queries without creating a stream analytics job first. For instructions on assigning a role to a user, see [Assign AD roles to users](../active-directory/roles/manage-roles-portal.md).
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your **Event Hubs namespace** and then navigate to the **event hub**, which has the incoming data. 1. Select **Process Data** on the event hub page or select **Process data** on the left menu.
event-hubs Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Event Hubs description: Lists Azure Policy Regulatory Compliance controls available for Azure Event Hubs. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
expressroute Expressroute Faqs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-faqs.md
See [here](./designing-for-high-availability-with-expressroute.md) for designing
You can achieve high availability by connecting up to 4 ExpressRoute circuits in the same peering location to your virtual network. You can also connect up to 16 ExpressRoute circuits in different peering locations to your virtual network. For example, Singapore and Singapore2. If one ExpressRoute circuit disconnects, connectivity fails over to another ExpressRoute circuit. By default, traffic leaving your virtual network is routed based on Equal Cost Multi-path Routing (ECMP). You can use **connection weight** to prefer one circuit to another. For more information, see [Optimizing ExpressRoute Routing](expressroute-optimize-routing.md).
+> [!NOTE]
+> Although it is possible to connect up to 16 circuits to your virtual network, the outgoing traffic from your virtual network will be load-balanced using Equal-Cost Multipath (ECMP) across a maximum of 4 circuits.
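The ECMP behavior the note describes can be illustrated with a small sketch (my own illustration, not Azure's actual hashing implementation): each flow's 5-tuple is hashed, and the result selects one of at most four equal-cost circuits, so a given flow stays on one circuit while the aggregate traffic is spread across them.

```python
import hashlib

def pick_circuit(src_ip, dst_ip, src_port, dst_port, circuits):
    """Illustrative ECMP: hash the flow 5-tuple to choose one of the
    equal-cost ExpressRoute circuits. Azure load-balances outgoing
    traffic across a maximum of 4 circuits, so only the first 4 are used."""
    active = circuits[:4]
    key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|tcp".encode()
    digest = hashlib.sha256(key).digest()
    index = int.from_bytes(digest[:4], "big") % len(active)
    return active[index]
```

Because the hash is deterministic, every packet of the same flow lands on the same circuit; a **connection weight** override would bias this selection toward a preferred circuit instead of splitting evenly.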
+ ### How do I ensure that my traffic destined for Azure Public services like Azure Storage and Azure SQL on Microsoft peering or public peering is preferred on the ExpressRoute path? You must implement the *Local Preference* attribute on your router(s) to ensure that the path from on-premises to Azure is always preferred on your ExpressRoute circuit(s).
expressroute Expressroute Howto Macsec https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-howto-macsec.md
To start the configuration, sign in to your Azure account and select the subscri
> CAK length depends on cipher suite specified: > * For GcmAes128 and GcmAesXpn128, the CAK must be an even-length string with 32 hexadecimal digits (0-9, A-F). > * For GcmAes256 and GcmAesXpn256, the CAK must be an even-length string with 64 hexadecimal digits (0-9, A-F).++
+ > [!NOTE]
+ > ExpressRoute is a Trusted Service within Azure that supports Network Security policies within the Azure Key Vault. For more information, see [Configure Azure Key Vault Firewall and Virtual Networks](https://learn.microsoft.com/azure/key-vault/general/network-security).
> 1. Assign the GET permission to the user identity.
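The CAK length rules quoted above can be captured in a small validation helper (a sketch of my own; the function name and structure aren't part of any Azure SDK): 32 hexadecimal digits for the 128-bit cipher suites, 64 for the 256-bit ones.

```python
def validate_cak(cak: str, cipher_suite: str) -> bool:
    """Check a MACsec CAK against the length rule for the chosen cipher suite:
    GcmAes128/GcmAesXpn128 need 32 hex digits, GcmAes256/GcmAesXpn256 need 64."""
    expected = {"GcmAes128": 32, "GcmAesXpn128": 32,
                "GcmAes256": 64, "GcmAesXpn256": 64}
    length = expected.get(cipher_suite)
    if length is None:
        raise ValueError(f"unknown cipher suite: {cipher_suite}")
    # Every digit must be hexadecimal (0-9, A-F) and the length must match.
    return len(cak) == length and all(c in "0123456789ABCDEF" for c in cak.upper())
```

Running such a check before storing the CAK in Key Vault avoids a failed MACsec configuration later.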
firewall Ftp Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/ftp-support.md
The following table shows the configuration required to support various FTP scen
> [!TIP] > Remember that it may also be necessary to configure firewall rules on the client side to support the connection.
+> [!NOTE]
+> By default, Passive FTP is enabled, and Active FTP requires additional configuration on Azure Firewall. For instructions, see the next section.
+>
+> Most FTP servers do not accept data and control channels from different source IP addresses for security reasons. FTP sessions through Azure Firewall must therefore present a single client IP. This means east-west (E-W) FTP traffic should never be SNAT'ed to the Azure Firewall private IP; use the client IP for FTP flows instead. Likewise, for internet FTP traffic, it is recommended to provision Azure Firewall with a single public IP for FTP connectivity, and to use NAT Gateway to avoid SNAT exhaustion.
+ |Firewall Scenario |Active FTP mode |Passive FTP mode | |||| |VNet-VNet |Network Rules to configure:<br>- Allow From Source VNet to Dest IP port 21<br>- Allow From Dest IP port 20 to Source VNet |Network Rules to configure:<br>- Allow From Source VNet to Dest IP port 21<br>- Allow From Source VNet to Dest IP \<Range of Data Ports>|
-|Outbound VNet - Internet<br><br>(FTP client in VNet, server on Internet) |Not supported *|**Pre-Condition**: Configure FTP server to accept data and control channels from different source IP addresses. Alternatively, configure Azure Firewall with single Public IP address.<br><br>Network Rules to configure:<br>- Allow From Source VNet to Dest IP port 21<br>- Allow From Source VNet to Dest IP \<Range of Data Ports> |
-|Inbound DNAT<br><br>(FTP client on Internet, server in VNet) |DNAT rule to configure:<br>- DNAT From Internet Source to VNet IP port 21<br><br>Network rule to configure:<br>- Allow **from** FTP server VNet IP **to** client Internet destination IP at destination client configured active ftp client port ranges |**Pre-Condition**:<br>Configure FTP server to accept data and control channels from different source IP addresses.<br><br>Tip: Azure Firewall supports limited number of DNAT rules. It's important to configure the FTP server to use a small port range on the Data channel.<br><br>DNAT Rules to configure:<br>- DNAT From Internet Source to VNet IP port 21<br>- DNAT From Internet Source to VNet IP \<Range of Data Ports> |
+|Outbound VNet - Internet<br><br>(FTP client in VNet, server on Internet) |Not supported *|Network Rules to configure:<br>- Allow From Source VNet to Dest IP port 21<br>- Allow From Source VNet to Dest IP \<Range of Data Ports> |
+|Inbound DNAT<br><br>(FTP client on Internet, FTP server in VNet) |DNAT rule to configure:<br>- DNAT From Internet Source to VNet IP port 21<br><br>Network rule to configure:<br>- Allow **traffic from** FTP server IP **to** the internet client IP on the active FTP port ranges. |Tip: Azure Firewall supports a limited number of DNAT rules, so it's important to configure the FTP server to use a small port range on the data channel.<br><br>DNAT Rules to configure:<br>- DNAT From Internet Source to VNet IP port 21<br>- DNAT From Internet Source to VNet IP \<Range of Data Ports> |
\* Active FTP doesn't work when the FTP client must reach an FTP server on the Internet. Active FTP uses a PORT command from the FTP client that tells the FTP server what IP address and port to use for the data channel. The PORT command uses the private IP address of the client, which can't be changed. Client-side traffic traversing the Azure Firewall is NATed for Internet-based communications, so the PORT command is seen as invalid by the FTP server. This is a general limitation of Active FTP when used with a client-side NAT.
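The asterisked limitation follows from how Active FTP encodes the client's own address in the PORT command. A small sketch (my own illustration, based on the RFC 959 encoding: `PORT h1,h2,h3,h4,p1,p2` where the port is `p1*256 + p2`) shows why a NATed private address breaks the server's connect-back:

```python
def build_port_command(client_ip: str, data_port: int) -> str:
    """Encode the client's IP and data port as an FTP PORT command (RFC 959)."""
    h = client_ip.split(".")
    return f"PORT {h[0]},{h[1]},{h[2]},{h[3]},{data_port // 256},{data_port % 256}"

def parse_port_command(cmd: str) -> tuple[str, int]:
    """Recover the (ip, port) the server will try to connect back to."""
    parts = cmd.split(" ", 1)[1].split(",")
    return ".".join(parts[:4]), int(parts[4]) * 256 + int(parts[5])
```

A client behind the firewall advertises its private address (for example, `10.0.0.4`), but after SNAT the server sees a public source IP. The server therefore either rejects the mismatched PORT argument or attempts to connect to an unreachable private address, which is the client-side NAT limitation the footnote describes.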
firewall Protect Azure Kubernetes Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/protect-azure-kubernetes-service.md
Title: Use Azure Firewall to protect Azure Kubernetes Service (AKS) clusters
description: Learn how to use Azure Firewall to protect Azure Kubernetes Service (AKS) clusters -+ Last updated 10/27/2022
frontdoor Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/frontdoor/scripts/custom-domain.md
Title: "Azure CLI example: Deploy custom domain in Azure Front Door"
description: Use this Azure CLI example script to deploy a Custom Domain name and TLS certificate on an Azure Front Door front-end. -+ ms.devlang: azurecli
governance NZ_ISM_Restricted_V3_5 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/NZ_ISM_Restricted_v3_5.md
Title: Regulatory Compliance details for NZ ISM Restricted v3.5 description: Details of the NZ ISM Restricted v3.5 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance RBI_ITF_Banks_V2016 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/RBI_ITF_Banks_v2016.md
Title: Regulatory Compliance details for Reserve Bank of India IT Framework for Banks v2016 description: Details of the Reserve Bank of India IT Framework for Banks v2016 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Australia Ism https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/australia-ism.md
Title: Regulatory Compliance details for Australian Government ISM PROTECTED description: Details of the Australian Government ISM PROTECTED Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Azure Security Benchmark https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/azure-security-benchmark.md
Title: Regulatory Compliance details for Microsoft cloud security benchmark description: Details of the Microsoft cloud security benchmark Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
initiative definition.
||||| |[API Management minimum API version should be set to 2019-12-01 or higher](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F549814b6-3212-4203-bdc8-1548d342fb67) |To prevent service secrets from being shared with read-only users, the minimum API version should be set to 2019-12-01 or higher. |Audit, Deny, Disabled |[1.0.1](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/API%20Management/ApiManagement_MinimumApiVersion_AuditDeny.json) | |[API Management secret named values should be stored in Azure Key Vault](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2Ff1cc7827-022c-473e-836e-5a51cae0b249) |Named values are a collection of name and value pairs in each API Management service. Secret values can be stored either as encrypted text in API Management (custom secrets) or by referencing secrets in Azure Key Vault. To improve security of API Management and secrets, reference secret named values from Azure Key Vault. Azure Key Vault supports granular access management and secret rotation policies. |Audit, Disabled, Deny |[1.0.2](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/API%20Management/ApiManagement_NamedValueSecretsInKV_AuditDeny.json) |
-|[Machines should have secret findings resolved](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F3ac7c827-eea2-4bde-acc7-9568cd320efa) |Audits virtual machines to detect whether they contain secret findings from the secret scanning solutions on your virtual machines. |AuditIfNotExists, Disabled |[1.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Security%20Center/ASC_ServerSecretAssessment_Audit.json) |
+|[Machines should have secret findings resolved](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F3ac7c827-eea2-4bde-acc7-9568cd320efa) |Audits virtual machines to detect whether they contain secret findings from the secret scanning solutions on your virtual machines. |AuditIfNotExists, Disabled |[1.0.1](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Security%20Center/ASC_ServerSecretAssessment_Audit.json) |
## Privileged Access
initiative definition.
|Name<br /><sub>(Azure portal)</sub> |Description |Effect(s) |Version<br /><sub>(GitHub)</sub> | ||||| |[A vulnerability assessment solution should be enabled on your virtual machines](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F501541f7-f7e7-4cd6-868c-4190fdad3ac9) |Audits virtual machines to detect whether they are running a supported vulnerability assessment solution. A core component of every cyber risk and security program is the identification and analysis of vulnerabilities. Azure Security Center's standard pricing tier includes vulnerability scanning for your virtual machines at no extra cost. Additionally, Security Center can automatically deploy this tool for you. |AuditIfNotExists, Disabled |[3.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Security%20Center/ASC_ServerVulnerabilityAssessment_Audit.json) |
-|[Machines should have secret findings resolved](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F3ac7c827-eea2-4bde-acc7-9568cd320efa) |Audits virtual machines to detect whether they contain secret findings from the secret scanning solutions on your virtual machines. |AuditIfNotExists, Disabled |[1.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Security%20Center/ASC_ServerSecretAssessment_Audit.json) |
+|[Machines should have secret findings resolved](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F3ac7c827-eea2-4bde-acc7-9568cd320efa) |Audits virtual machines to detect whether they contain secret findings from the secret scanning solutions on your virtual machines. |AuditIfNotExists, Disabled |[1.0.1](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Security%20Center/ASC_ServerSecretAssessment_Audit.json) |
|[Vulnerability assessment should be enabled on SQL Managed Instance](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F1b7aa243-30e4-4c9e-bca8-d0d3022b634a) |Audit each SQL Managed Instance which doesn't have recurring vulnerability assessment scans enabled. Vulnerability assessment can discover, track, and help you remediate potential database vulnerabilities. |AuditIfNotExists, Disabled |[1.0.1](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/SQL/VulnerabilityAssessmentOnManagedInstance_Audit.json) | |[Vulnerability assessment should be enabled on your SQL servers](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2Fef2a8f2a-b3d9-49cd-a8a8-9a3aaaf647d9) |Audit Azure SQL servers which do not have vulnerability assessment properly configured. Vulnerability assessment can discover, track, and help you remediate potential database vulnerabilities. |AuditIfNotExists, Disabled |[3.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/SQL/VulnerabilityAssessmentOnServer_Audit.json) |
governance Azure Security Benchmarkv1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/azure-security-benchmarkv1.md
Title: Regulatory Compliance details for Azure Security Benchmark v1 description: Details of the Azure Security Benchmark v1 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Canada Federal Pbmm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/canada-federal-pbmm.md
Title: Regulatory Compliance details for Canada Federal PBMM description: Details of the Canada Federal PBMM Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Cis Azure 1 1 0 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/cis-azure-1-1-0.md
Title: Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.1.0 description: Details of the CIS Microsoft Azure Foundations Benchmark 1.1.0 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Cis Azure 1 3 0 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/cis-azure-1-3-0.md
Title: Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.3.0 description: Details of the CIS Microsoft Azure Foundations Benchmark 1.3.0 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Cis Azure 1 4 0 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/cis-azure-1-4-0.md
Title: Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.4.0 description: Details of the CIS Microsoft Azure Foundations Benchmark 1.4.0 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Cmmc L3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/cmmc-l3.md
Title: Regulatory Compliance details for CMMC Level 3 description: Details of the CMMC Level 3 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Fedramp High https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/fedramp-high.md
Title: Regulatory Compliance details for FedRAMP High description: Details of the FedRAMP High Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Fedramp Moderate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/fedramp-moderate.md
Title: Regulatory Compliance details for FedRAMP Moderate description: Details of the FedRAMP Moderate Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Azure Security Benchmark https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-azure-security-benchmark.md
Title: Regulatory Compliance details for Microsoft cloud security benchmark (Azure Government) description: Details of the Microsoft cloud security benchmark (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Cis Azure 1 1 0 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-cis-azure-1-1-0.md
Title: Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.1.0 (Azure Government) description: Details of the CIS Microsoft Azure Foundations Benchmark 1.1.0 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Cis Azure 1 3 0 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-cis-azure-1-3-0.md
Title: Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.3.0 (Azure Government) description: Details of the CIS Microsoft Azure Foundations Benchmark 1.3.0 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Cmmc L3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-cmmc-l3.md
Title: Regulatory Compliance details for CMMC Level 3 (Azure Government) description: Details of the CMMC Level 3 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Fedramp High https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-fedramp-high.md
Title: Regulatory Compliance details for FedRAMP High (Azure Government) description: Details of the FedRAMP High (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Fedramp Moderate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-fedramp-moderate.md
Title: Regulatory Compliance details for FedRAMP Moderate (Azure Government) description: Details of the FedRAMP Moderate (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Irs 1075 Sept2016 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-irs-1075-sept2016.md
Title: Regulatory Compliance details for IRS 1075 September 2016 (Azure Government) description: Details of the IRS 1075 September 2016 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Iso 27001 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-iso-27001.md
Title: Regulatory Compliance details for ISO 27001:2013 (Azure Government) description: Details of the ISO 27001:2013 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Gov Nist Sp 800 53 R5 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/gov-nist-sp-800-53-r5.md
Title: Regulatory Compliance details for NIST SP 800-53 Rev. 5 (Azure Government) description: Details of the NIST SP 800-53 Rev. 5 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Hipaa Hitrust 9 2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/hipaa-hitrust-9-2.md
Title: Regulatory Compliance details for HIPAA HITRUST 9.2 description: Details of the HIPAA HITRUST 9.2 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Irs 1075 Sept2016 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/irs-1075-sept2016.md
Title: Regulatory Compliance details for IRS 1075 September 2016 description: Details of the IRS 1075 September 2016 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Iso 27001 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/iso-27001.md
Title: Regulatory Compliance details for ISO 27001:2013 description: Details of the ISO 27001:2013 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance New Zealand Ism https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/new-zealand-ism.md
Title: Regulatory Compliance details for New Zealand ISM Restricted description: Details of the New Zealand ISM Restricted Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to **compliance domains** and **controls** in New Zealand ISM Restricted. For more information about this compliance standard, see
-[New Zealand ISM Restricted](https://www.nzism.gcsb.govt.nz/). To understand
+[New Zealand ISM Restricted](https://www.nzism.gcsb.govt.nz/ism-document). To understand
_Ownership_, see [Azure Policy policy definition](../concepts/definition-structure.md#type) and [Shared responsibility in the cloud](../../../security/fundamentals/shared-responsibility.md).
governance Nist Sp 800 53 R5 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/nist-sp-800-53-r5.md
Title: Regulatory Compliance details for NIST SP 800-53 Rev. 5 description: Details of the NIST SP 800-53 Rev. 5 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Rbi_Itf_Nbfc_V2017 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/rbi_itf_nbfc_v2017.md
Title: Regulatory Compliance details for Reserve Bank of India - IT Framework for NBFC description: Details of the Reserve Bank of India - IT Framework for NBFC Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Rmit Malaysia https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/rmit-malaysia.md
Title: Regulatory Compliance details for RMIT Malaysia description: Details of the RMIT Malaysia Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
governance Ukofficial Uknhs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/ukofficial-uknhs.md
Title: Regulatory Compliance details for UK OFFICIAL and UK NHS description: Details of the UK OFFICIAL and UK NHS Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 07/06/2023 Last updated : 07/20/2023
hdinsight Manage Clusters Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/manage-clusters-runbooks.md
If you don't have an Azure subscription, create a [free account](https://azure
## Install HDInsight modules
-1. Sign in to the the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select your Azure Automation Accounts. 1. Select **Modules gallery** under **Shared Resources**. 1. Type **AzureRM.Profile** in the box and hit enter to search. Select the available search result.
hdinsight Apache Spark Intellij Tool Plugin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/spark/apache-spark-intellij-tool-plugin.md
Steps to install the Scala plugin:
## Connect to your HDInsight cluster
-User can either [sign in to Azure subscription](#sign-in-to-your-azure-subscription), or [link a HDInsight cluster](#link-a-cluster). Use the Ambari username/password or domain joined credential to connect to your HDInsight cluster.
+Users can either [sign in to your Azure subscription](#sign-in-to-your-azure-subscription) or [link an HDInsight cluster](#link-a-cluster). Use the Ambari username/password or domain joined credential to connect to your HDInsight cluster.
### Sign in to your Azure subscription
You can convert the existing Spark Scala applications that you created in Intell
If you're not going to continue to use this application, delete the cluster that you created with the following steps:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In the **Search** box at the top, type **HDInsight**.
healthcare-apis Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure API for FHIR description: Lists Azure Policy Regulatory Compliance controls available for Azure API for FHIR. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
healthcare-apis Dicomweb Standard Apis C Sharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/dicom/dicomweb-standard-apis-c-sharp.md
To use the DICOMweb&trade; Standard APIs, you must have an instance of the DICOM
After you've deployed an instance of the DICOM service, retrieve the URL for your App service:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search **Recent resources** and select your DICOM service instance. 1. Copy the **Service URL** of your DICOM service. Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
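Because the version is part of the URL path, requests are built by joining the copied service URL with the version segment. A minimal sketch (the service URL below is a hypothetical placeholder, and `v1` is an assumed target version; check the versioning documentation for the versions your instance supports):

```python
# Hypothetical Service URL copied from the portal; "v1" is an assumed version.
SERVICE_URL = "https://contoso-dicom-service.dicom.azurehealthcareapis.com"
API_VERSION = "v1"

# DICOMweb Studies endpoint with the version included in the URL path.
studies_url = f"{SERVICE_URL}/{API_VERSION}/studies"
print(studies_url)
```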
healthcare-apis Dicomweb Standard Apis Curl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/dicom/dicomweb-standard-apis-curl.md
To use the DICOMWeb&trade; Standard APIs, you must have an instance of the DICOM
Once you've deployed an instance of the DICOM service, retrieve the URL for your App service:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search **Recent resources** and select your DICOM service instance. 3. Copy the **Service URL** of your DICOM service. 4. If you haven't already obtained a token, see [Get access token for the DICOM service using Azure CLI](dicom-get-access-token-azure-cli.md).
healthcare-apis Dicomweb Standard Apis Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/dicom/dicomweb-standard-apis-python.md
To use the DICOMWeb&trade; Standard APIs, you must have an instance of the DICOM
After you've deployed an instance of the DICOM service, retrieve the URL for your App service:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Search **Recent resources** and select your DICOM service instance. 1. Copy the **Service URL** of your DICOM service. 2. If you haven't already obtained a token, see [Get access token for the DICOM service using Azure CLI](dicom-get-access-token-azure-cli.md).
healthcare-apis Device Messages Through Iot Hub https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/device-messages-through-iot-hub.md
Previously updated : 07/05/2023 Last updated : 07/21/2023
> [!NOTE] > [Fast Healthcare Interoperability Resources (FHIR&#174;)](https://www.hl7.org/fhir/) is an open healthcare specification.
-For enhanced workflows and ease of use, you can use the MedTech service to receive messages from devices you create and manage through an IoT hub in [Azure IoT Hub](../../iot-hub/iot-concepts-and-iot-hub.md). This tutorial uses an Azure Resource Manager template (ARM template) and a **Deploy to Azure** button to deploy a MedTech service. The template deploys an IoT hub to create and manage devices, and then routes the device messages to an event hub for the MedTech service to read and process.
+The MedTech service can receive messages from devices you create and manage through an IoT hub in [Azure IoT Hub](../../iot-hub/iot-concepts-and-iot-hub.md). This tutorial uses an Azure Resource Manager template (ARM template) and a **Deploy to Azure** button to deploy a MedTech service. The template also deploys an IoT hub to create and manage devices, and a message route that sends device messages to an event hub for the MedTech service to read and process. After the device data is processed, the resulting FHIR resources are persisted into a FHIR service, which is also included in the template.
:::image type="content" source="media\device-messages-through-iot-hub\data-flow-diagram.png" border="false" alt-text="Diagram of the IoT device message flow through an IoT hub and event hub, and then into the MedTech service." lightbox="media\device-messages-through-iot-hub\data-flow-diagram.png"::: > [!TIP]
-> To learn how the MedTech service transforms and persists device data into the FHIR service as FHIR Observations, see [Overview of the MedTech service device data processing stages](overview-of-device-data-processing-stages.md).
+> To learn how the MedTech service transforms and persists device data into the FHIR service as FHIR resources, see [Overview of the MedTech service device data processing stages](overview-of-device-data-processing-stages.md).
In this tutorial, learn how to:
To begin your deployment and complete the tutorial, you must have the following
- **Owner** or **Contributor and User Access Administrator** role assignments in the Azure subscription. For more information, see [What is Azure role-based access control (Azure RBAC)?](../../role-based-access-control/overview.md) -- The Microsoft.HealthcareApis, Microsoft.EventHub, and Microsoft.Devices resource providers registered with your Azure subscription. To learn more, see [Azure resource providers and types](../../azure-resource-manager/management/resource-providers-and-types.md).
+- Microsoft.HealthcareApis, Microsoft.EventHub, and Microsoft.Devices resource providers registered with your Azure subscription. To learn more, see [Azure resource providers and types](../../azure-resource-manager/management/resource-providers-and-types.md).
- [Visual Studio Code](https://code.visualstudio.com/Download) installed locally.
To begin deployment in the Azure portal, select the **Deploy to Azure** button:
- **Location**: A supported Azure region for Azure Health Data Services (the value can be the same as or different from the region your resource group is in). For a list of Azure regions where Health Data Services is available, see [Products available by regions](https://azure.microsoft.com/explore/global-infrastructure/products-by-region/?products=health-data-services).
- - **Fhir Contributor Principle Id** (optional): An Azure Active Directory (Azure AD) user object ID to provide read/write permissions in the FHIR service.
+ - **Fhir Contributor Principle Id** (optional): An Azure Active Directory (Azure AD) user object ID to provide FHIR service read/write permissions.
- You can use this account to give access to the FHIR service to view the device messages that are generated in this tutorial. We recommend that you use your own Azure AD user object ID, so you can access the messages in the FHIR service. If you choose not to use the **Fhir Contributor Principle Id** option, clear the text box.
+ You can use this account to give access to the FHIR service to view the FHIR Observations that are generated in this tutorial. We recommend that you use your own Azure AD user object ID so you can access the messages in the FHIR service. If you choose not to use the **Fhir Contributor Principle Id** option, clear the text box.
To learn how to get an Azure AD user object ID, see [Find the user object ID](/partner-center/find-ids-and-domain-names#find-the-user-object-id). The user object ID that's used in this tutorial is only an example. If you use this option, use your own user object ID or the object ID of another person who you want to be able to access the FHIR service.
To begin deployment in the Azure portal, select the **Deploy to Azure** button:
:::image type="content" source="media\device-messages-through-iot-hub\review-and-create-button.png" alt-text="Screenshot that shows the Review + create button selected in the Azure portal.":::
-3. In **Review + create**, check the template validation status. If validation is successful, the template displays **Validation Passed**. If validation fails, fix the detail that's indicated in the error message, and then select **Review + create** again.
+3. In **Review + create**, check the template validation status. If validation is successful, the template displays **Validation Passed**. If validation fails, fix the issue that's indicated in the error message, and then select **Review + create** again.
:::image type="content" source="media\device-messages-through-iot-hub\validation-complete.png" alt-text="Screenshot that shows the Review + create pane displaying the Validation Passed message.":::
To begin deployment in the Azure portal, select the **Deploy to Azure** button:
## Review deployed resources and access permissions
-When deployment is completed, the following resources and access roles are created in the template deployment:
+When the deployment completes, the following resources and access roles are created:
* Event Hubs namespace and event hub. In this deployment, the event hub is named *devicedata*.
You complete the steps by using Visual Studio Code with the Azure IoT Hub extens
1. Open Visual Studio Code with Azure IoT Tools installed.
-2. In Explorer, in **Azure IoT Hub**, select **…** and choose **Select IoT Hub**.
+2. In Explorer, under **Azure IoT Hub**, select **…** and choose **Select IoT Hub**.
:::image type="content" source="media\device-messages-through-iot-hub\select-iot-hub.png" alt-text="Screenshot of Visual Studio Code with the Azure IoT Hub extension with the deployed IoT hub selected." lightbox="media\device-messages-through-iot-hub\select-iot-hub.png":::
You complete the steps by using Visual Studio Code with the Azure IoT Hub extens
* **Message**: **Plain Text**.
- * **Edit**: Clear any existing text, and then paste the following JSON.
+ * **Edit**: Clear any existing text, and then copy and paste the following test message JSON.
> [!TIP]
- > You can use the **Copy** option in in the right corner of the below test message, and then paste it within the **Edit** option.
+ > You can use the **Copy** option in the right corner of the test message below, and then paste it within the **Edit** window.
```json {
- "HeartRate": 78,
- "RespiratoryRate": 12,
- "HeartRateVariability": 30,
- "BodyTemperature": 98.6,
- "BloodPressure": {
- "Systolic": 120,
- "Diastolic": 80
- }
- }
+ "HeartRate": 78,
+ "RespiratoryRate": 12,
+ "HeartRateVariability": 30,
+ "BodyTemperature": 98.6,
+ "BloodPressure": {
+ "Systolic": 120,
+ "Diastolic": 80
+ }
+ }
``` 8. To begin the process of sending a test message to your IoT hub, select **Send**.
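The same test payload can also be assembled programmatically. A minimal sketch in plain Python (no SDK required) that produces the JSON string you would paste into the **Edit** window:

```python
import json

# Vital-signs test message used in this tutorial.
payload = {
    "HeartRate": 78,
    "RespiratoryRate": 12,
    "HeartRateVariability": 30,
    "BodyTemperature": 98.6,
    "BloodPressure": {"Systolic": 120, "Diastolic": 80},
}

# Serialize to the JSON body of the device-to-cloud message.
message_body = json.dumps(payload)
print(message_body)
```

If you send from code instead of Visual Studio Code, this string becomes the body of the device-to-cloud message (for example, via a device SDK's send-message call); that call isn't shown here because it requires a live device connection string.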
You complete the steps by using Visual Studio Code with the Azure IoT Hub extens
## Review metrics from the test message
-Now that you have successfully sent a test message to your IoT hub, review your MedTech service metrics. You review metrics to verify that your MedTech service received, grouped, transformed, and persisted the test message to your FHIR service. To learn more, see [How to display the MedTech service monitoring tab metrics](how-to-use-monitoring-tab.md).
+Now that you have successfully sent a test message to your IoT hub, you can review your MedTech service metrics. Review metrics to verify that your MedTech service received, grouped, transformed, and persisted the test message into your FHIR service. To learn more, see [How to use the MedTech service monitoring and health checks tabs](how-to-use-monitoring-and-health-checks-tabs.md#use-the-medtech-service-monitoring-tab).
For your MedTech service metrics, you can see that your MedTech service completed the following steps for the test message:
-* **Number of Incoming Messages**: Received the incoming test message from the device message event hub.
+* **Number of Incoming Messages**: Received the incoming test message from the event hub.
* **Number of Normalized Messages**: Created five normalized messages. * **Number of Measurements**: Created five measurements.
-* **Number of FHIR resources**: Created five FHIR resources that are persisted in your FHIR service.
+* **Number of FHIR resources**: Created five FHIR resources that are persisted into your FHIR service.
:::image type="content" source="media\device-messages-through-iot-hub\metrics-tile-one.png" alt-text="Screenshot that shows a MedTech service metrics tile and test data metrics." lightbox="media\device-messages-through-iot-hub\metrics-tile-one.png":::
To learn how to get an Azure AD access token and view FHIR resources in your FHI
In this tutorial, you deployed an ARM template in the Azure portal, connected to your IoT hub, created a device, sent a test message, and reviewed your MedTech service metrics.
-To learn about other methods of deploying the MedTech service, see
+To learn about methods of deploying the MedTech service, see
> [!div class="nextstepaction"] > [Choose a deployment method for the MedTech service](deploy-new-choose.md)
healthcare-apis Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Health Data Services FHIR service description: Lists Azure Policy Regulatory Compliance controls available. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
iot-central Howto Manage Dashboards With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-dashboards-with-rest-api.md
The response to this request looks like the following example:
PATCH https://{your app subdomain}.azureiotcentral.com/api/dashboards/{dashboardId}?api-version=2022-10-31-preview ```
-The following example shows a request body that updates the display name of a dashboard and size of the tile:
+The following example shows a request body that updates the display name of a dashboard and adds the dashboard to the list of favorites:
```json { "displayName": "New Dashboard Name",
- "tiles": [
- {
- "displayName": "lineChart",
- "configuration": {
- "type": "lineChart",
- "capabilities": [
- {
- "capability": "AvailableMemory",
- "aggregateFunction": "avg"
- }
- ],
- "devices": [
- "1cfqhp3tue3",
- "mcoi4i2qh3"
- ],
- "group": "da48c8fe-bac7-42bc-81c0-d8158551f066",
- "format": {
- "xAxisEnabled": true,
- "yAxisEnabled": true,
- "legendEnabled": true
- },
- "queryRange": {
- "type": "time",
- "duration": "PT30M",
- "resolution": "PT1M"
- }
- },
- "x": 5,
- "y": 0,
- "width": 5,
- "height": 5
- }
- ],
- "favorite": false
+ "favorite": true
} ```
The response to this request looks like the following example:
"height": 5 } ],
- "favorite": false
+ "favorite": true
} ```
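A sketch of assembling this PATCH call in code. The app subdomain, dashboard ID, and bearer token are placeholders, not values from this article:

```python
DASHBOARD_API_VERSION = "2022-10-31-preview"

def build_dashboard_patch(app_subdomain: str, dashboard_id: str):
    """Return the PATCH URL and body that rename a dashboard and favorite it."""
    url = (
        f"https://{app_subdomain}.azureiotcentral.com"
        f"/api/dashboards/{dashboard_id}?api-version={DASHBOARD_API_VERSION}"
    )
    body = {"displayName": "New Dashboard Name", "favorite": True}
    return url, body

# Hypothetical subdomain and dashboard ID for illustration.
url, body = build_dashboard_patch("my-app", "dashboard1")
# Send with the HTTP client of your choice, for example:
# requests.patch(url, json=body, headers={"Authorization": f"Bearer {token}"})
```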
iot-central Howto Manage Data Export With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-data-export-with-rest-api.md
The response to this request looks like the following example:
```json { "id": "8dbcdb53-c6a7-498a-a976-a824b694c150",
- "displayName": "Blob Storage Destination",
+ "displayName": "Blob Storage",
"type": "blobstorage@v1",
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=********;EndpointSuffix=core.windows.net",
- "containerName": "central-data",
+ "authorization": {
+ "type": "connectionString",
+ "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;EndpointSuffix=core.windows.net",
+ "containerName": "central-data"
+ },
"status": "waiting" } ```
The response to this request looks like the following example:
```json { "id": "8dbcdb53-c6a7-498a-a976-a824b694c150",
- "displayName": "Blob Storage Destination",
+ "displayName": "Blob Storage",
"type": "blobstorage@v1",
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=********;EndpointSuffix=core.windows.net",
- "containerName": "central-data",
+ "authorization": {
+ "type": "connectionString",
+ "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;EndpointSuffix=core.windows.net",
+ "containerName": "central-data"
+ },
"status": "waiting" } ```
The response to this request looks like the following example:
PATCH https://{your app subdomain}/api/dataExport/destinations/{destinationId}?api-version=2022-10-31-preview ```
-You can use this call to perform an incremental update to an export. The sample request body looks like the following example that updates the `displayName` to a destination:
+You can use this call to perform an incremental update to an export. The sample request body looks like the following example that updates the `connectionString` of a destination:
```json {
- "displayName": "Blob Storage",
- "type": "blobstorage@v1",
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=********;EndpointSuffix=core.windows.net",
- "containerName": "central-data"
+ "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=********;EndpointSuffix=core.windows.net"
} ```
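The PATCH is incremental: fields omitted from the request body keep their current values, so sending only `connectionString` leaves the display name and the rest of the destination untouched. A sketch of that merge behavior (illustrative only, not the service implementation; real JSON merge patch also recurses into nested objects, while this shallow version just shows the omitted-fields-are-preserved idea):

```python
# Current destination as returned by GET (fields abbreviated).
current = {
    "id": "8dbcdb53-c6a7-498a-a976-a824b694c150",
    "displayName": "Blob Storage",
    "type": "blobstorage@v1",
}

# Incremental PATCH body: only the field being changed.
patch = {"displayName": "Blob Storage Destination"}

# Shallow merge: patched fields win, all other fields are preserved.
updated = {**current, **patch}
print(updated["displayName"], updated["type"])
```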
The response to this request looks like the following example:
"id": "8dbcdb53-c6a7-498a-a976-a824b694c150", "displayName": "Blob Storage", "type": "blobstorage@v1",
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=********;EndpointSuffix=core.windows.net",
- "containerName": "central-data",
+ "authorization": {
+ "type": "connectionString",
+ "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;EndpointSuffix=core.windows.net",
+ "containerName": "central-data"
+ },
"status": "waiting"
-}
+}
``` ### Delete a destination
You can use this call to perform an incremental update to an export. The sample
```json {
- "displayName": "Enriched Export",
- "enabled": true,
- "source": "telemetry",
"enrichments": { "Custom data": { "value": "My value 2" }
- },
- "destinations": [
- {
- "id": "9742a8d9-c3ca-4d8d-8bc7-357bdc7f39d9"
- }
- ]
+ }
} ```
The response to this request looks like the following example:
"source": "telemetry", "enrichments": { "Custom data": {
- "value": "My"
+ "value": "My value 2"
} }, "destinations": [
iot-central Howto Manage Deployment Manifests With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-deployment-manifests-with-rest-api.md
The response to this request looks like the following example:
PATCH https://{your app subdomain}/api/deploymentManifests/{deploymentManifestId}?api-version=2022-10-31-preview ```
-The following sample request body updates the deployment manifest but leaves the display name unchanged:
+The following sample request body updates the `SendInterval` desired property setting for the `SimulatedTemperatureSensor` module:
```json { "data": { "modulesContent": {
- "$edgeAgent": {
- "properties.desired": {
- "schemaVersion": "1.0",
- "runtime": {
- "type": "docker",
- "settings": {
- "minDockerVersion": "v1.25",
- "loggingOptions": "",
- "registryCredentials": {}
- }
- },
- "systemModules": {
- "edgeAgent": {
- "type": "docker",
- "settings": {
- "image": "mcr.microsoft.com/azureiotedge-agent:1.4",
- "createOptions": "{}"
- }
- },
- "edgeHub": {
- "type": "docker",
- "status": "running",
- "restartPolicy": "always",
- "settings": {
- "image": "mcr.microsoft.com/azureiotedge-hub:1.4",
- "createOptions": "{}"
- }
- }
- },
- "modules": {
- "SimulatedTemperatureSensor": {
- "version": "1.0",
- "type": "docker",
- "status": "running",
- "restartPolicy": "always",
- "settings": {
- "image": "mcr.microsoft.com/azureiotedge-simulated-temperature-sensor:1.2",
- "createOptions": "{}"
- }
- }
- }
- }
- },
- "$edgeHub": {
- "properties.desired": {
- "schemaVersion": "1.0",
- "routes": {
- "route": "FROM /* INTO $upstream"
- },
- "storeAndForwardConfiguration": {
- "timeToLiveSecs": 7200
- }
- }
- },
"SimulatedTemperatureSensor": { "properties.desired": {
- "SendData": true,
- "SendInterval": 10
+ "SendInterval": 30
} } }
The response to this request looks like the following example:
"SimulatedTemperatureSensor": { "properties.desired": { "SendData": true,
- "SendInterval": 10
+ "SendInterval": 30
} } }
iot-central Howto Manage Device Templates With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-device-templates-with-rest-api.md
The response to this request looks like the following example:
PATCH https://{your app subdomain}/api/deviceTemplates/{deviceTemplateId}?api-version=2022-07-31 ```
->[!NOTE]
->`{deviceTemplateId}` should be the same as the `@id` in the payload.
-
-The sample request body looks like the following example that adds a `LastMaintenanceDate` cloud property to the device template:
+The sample request body looks like the following example that adds a `LastMaintenanceDate` cloud property to the `capabilityModel` in the device template:
```json {
- "displayName": "Thermostat",
-
- "@id": "dtmi:contoso:mythermostattemplate",
- "@type": [
- "ModelDefinition",
- "DeviceModel"
- ],
- "@context": [
- "dtmi:iotcentral:context;2",
- "dtmi:dtdl:context;2"
- ],
"capabilityModel": {
- "@id": "dtmi:contoso:Thermostat;1",
- "@type": "Interface",
"contents": [ { "@type": [
iot-central Howto Manage Devices With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-devices-with-rest-api.md
The response to this request looks like the following example:
PATCH https://{your app subdomain}/api/devices/{deviceId}?api-version=2022-07-31 ```
->[!NOTE]
->`{deviceTemplateId}` should be the same as the `@id` in the payload.
-
-The sample request body looks like the following example that updates the `displayName` to the device:
+The following sample request body changes the `enabled` field to `false`:
```json {
- "displayName": "CheckoutThermostat5",
- "template": "dtmi:contoso:Thermostat;1",
- "simulated": true,
- "enabled": true
+ "enabled": false
} ```
The response to this request looks like the following example:
{ "id": "thermostat1", "etag": "eyJoZWFkZXIiOiJcIjI0MDAwYTdkLTAwMDAtMDMwMC0wMDAwLTYxYjgxZDIwMDAwMFwiIiwiZGF0YSI6IlwiMzMwMDQ1M2EtMDAwMC0wMzAwLTAwMDAtNjFiODFkMjAwMDAwXCIifQ",
- "displayName": "CheckoutThermostat5",
+ "displayName": "CheckoutThermostat",
"simulated": true, "provisioned": false, "template": "dtmi:contoso:Thermostat;1",
- "enabled": true
+ "enabled": false
} ```
iot-central Howto Manage Organizations With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-organizations-with-rest-api.md
Use the following request to update details of an organization in your applicati
PATCH https://{your app subdomain}.azureiotcentral.com/api/organizations/{organizationId}?api-version=2022-07-31 ```
-The following example shows a request body that updates an organization.
+The following example shows a request body that updates the parent of the organization:
```json {
- "id": "seattle",
- "displayName": "Seattle Sales",
"parent": "washington" } ```
iot-central Howto Upload File Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-upload-file-rest-api.md
The response to this request looks like the following example:
## Update the file upload storage account configuration
-Use the following request to update a file upload blob storage account configuration in your IoT Central application:
+Use the following request to update a file upload blob storage account connection string in your IoT Central application:
```http PATCH https://{your-app-subdomain}.azureiotcentral.com/api/fileUploads?api-version=2022-07-31
PATCH https://{your-app-subdomain}.azureiotcentral.com/api/fileUploads?api-versi
```json {
- "account": "yourAccountName",
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;BlobEndpoint=https://yourAccountName.blob.core.windows.net/",
- "container": "yourContainerName2",
- "sasTtl": "PT1H"
+ "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;BlobEndpoint=https://yourAccountName.blob.core.windows.net/"
} ```
The response to this request looks like the following example:
{ "account": "yourAccountName", "connectionString": "DefaultEndpointsProtocol=https;AccountName=yourAccountName;AccountKey=*****;BlobEndpoint=https://yourAccountName.blob.core.windows.net/",
- "container": "yourContainerName2",
+ "container": "yourContainerName",
"sasTtl": "PT1H",
- "state": "succeeded",
- "etag": "\"7502ac89-0000-0300-0000-627eaf100000\""
+ "state": "succeeded",
+ "etag": "\"7502ac89-0000-0300-0000-627eaf100000\""
} ```
iot-dps Iot Dps Https Sym Key Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/iot-dps-https-sym-key-support.md
There are different paths through this article depending on the type of enrollme
* If you're running in Windows, install the latest version of [Git](https://git-scm.com/download/). Make sure that Git is added to the environment variables accessible to the command window. See [Software Freedom Conservancy's Git client tools](https://git-scm.com/download/) for the latest version of `git` tools to install, which includes *Git Bash*, the command-line app that you can use to interact with your local Git repository. On Windows, you'll enter all commands on your local system in a GitBash prompt. * Azure CLI. You have two options for running Azure CLI commands in this article:
- * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, log into the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
+ * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, sign in to the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
* Optionally, run Azure CLI on your local machine. If Azure CLI is already installed, run `az upgrade` to upgrade the CLI and extensions to the current version. To install Azure CLI, see [Install Azure CLI]( /cli/azure/install-azure-cli). * If you're running in a Linux or a WSL environment, open a Bash prompt to run commands locally. If you're running in a Windows environment, open a GitBash prompt.
iot-dps Iot Dps Https X509 Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/iot-dps-https-x509-support.md
There are multiple paths through this article depending on the type of enrollmen
* If you're running in Windows, install the latest version of [Git](https://git-scm.com/download/). Make sure that Git is added to the environment variables accessible to the command window. See [Software Freedom Conservancy's Git client tools](https://git-scm.com/download/) for the latest version of `git` tools to install, which includes *Git Bash*, the command-line app that you can use to interact with your local Git repository. On Windows, you'll enter all commands on your local system in a GitBash prompt. * Azure CLI. You have two options for running Azure CLI commands in this article:
- * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, log into the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
+ * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, sign in to the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
* Optionally, run Azure CLI on your local machine. If Azure CLI is already installed, run `az upgrade` to upgrade the CLI and extensions to the current version. To install Azure CLI, see [Install Azure CLI]( /cli/azure/install-azure-cli). * If you're running in a Linux or a WSL environment, open a Bash prompt to run commands locally. If you're running in a Windows environment, open a GitBash prompt.
iot-edge How To Provision Single Device Linux Symmetric https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/how-to-provision-single-device-linux-symmetric.md
Title: Create and provision an IoT Edge device on Linux using symmetric keys - A
description: Create and provision a single IoT Edge device in IoT Hub for manual provisioning with symmetric keys + Last updated 04/25/2023
iot-hub Horizontal Arm Route Messages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/horizontal-arm-route-messages.md
This section provides the steps to deploy the template, create a virtual device,
[![Deploy To Azure](https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/1-CONTRIBUTION-GUIDE/images/deploytoazure.svg?sanitize=true)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.devices%2Fiothub-auto-route-messages%2Fazuredeploy.json)
-1. Open a command window and go to the folder where you unzipped the IoT C# SDK. Find the folder with the arm-read-write.csproj file. You create the environment variables in this command window. Log into the [Azure portal](https://portal.azure.com) to get the keys. Select **Resource Groups** then select the resource group used for this quickstart.
+1. Open a command window and go to the folder where you unzipped the IoT C# SDK. Find the folder with the arm-read-write.csproj file. You create the environment variables in this command window. Sign in to the [Azure portal](https://portal.azure.com) to get the keys. Select **Resource Groups** then select the resource group used for this quickstart.
![Select the resource group](./media/horizontal-arm-route-messages/01-select-resource-group.png)
This section provides the steps to deploy the template, create a virtual device,
## Review deployed resources
-1. Log in to the [Azure portal](https://portal.azure.com) and select the Resource Group, then select the storage account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and select the Resource Group, then select the storage account.
1. Drill down into the storage account until you find files.
You have deployed an ARM template to create an IoT hub and a storage account, an
## Clean up resources
-To remove the resources added during this quickstart, log into the [Azure portal](https://portal.azure.com). Select **Resource Groups**, then find the resource group you used for this quickstart. Select the resource group and then select *Delete*. It will delete all of the resources in the group.
+To remove the resources added during this quickstart, sign in to the [Azure portal](https://portal.azure.com). Select **Resource Groups**, then find the resource group you used for this quickstart. Select the resource group and then select *Delete*. It will delete all of the resources in the group.
## Next steps
iot-hub Iot Hub Ha Dr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-ha-dr.md
Previously updated : 12/08/2022 Last updated : 07/20/2023
The IoT Hub service provides intra-region HA by implementing redundancies in alm
## Availability zones
-IoT Hub supports [Azure availability zones](../availability-zones/az-overview.md). An availability zone is a high-availability offering that protects your applications and data from datacenter failures. A region with availability zone support comprises three zones supporting that region. Each zone provides one or more datacenters, each in a unique physical location with independent power, cooling, and networking. This configuration provides replication and redundancy within the region. Availability zone support for IoT Hub is enabled automatically for new IoT Hub resources created in the following Azure regions:
-
-* Australia East
-* Brazil South
-* Canada Central
-* Central US
-* France Central
-* Germany West Central
-* Japan East
-* North Europe
-* Southeast Asia
-* UK South
-* West US 2
+IoT Hub supports [Azure availability zones](../availability-zones/az-overview.md). An availability zone is a high-availability offering that protects your applications and data from datacenter failures. A region with availability zone support comprises three zones supporting that region. Each zone provides one or more datacenters, each in a unique physical location with independent power, cooling, and networking. This configuration provides replication and redundancy within the region.
+
+Availability zones provide two advantages: data resiliency and smoother deployments.
+
+*Data resiliency* comes from replacing the underlying storage services with availability-zones-supported storage. Data resilience is important for IoT solutions because these solutions often operate in complex, dynamic, and uncertain environments where failures or disruptions can have significant consequences. Whether an IoT solution supports a manufacturing floor, retail or restaurant environments, healthcare systems, or infrastructure, the availability and quality of data is necessary to recover from failures and to provide reliable and consistent services.
+
+*Smoother deployments* come from replacing the underlying data center hardware with newer hardware that supports availability zones. These hardware improvements minimize customer impact from device disconnects and reconnects as well as other deployment-related downtime. The IoT Hub engineering team deploys multiple updates to each IoT hub every month, both for security reasons and to provide feature improvements. Availability-zones-supported hardware is split into 15 update domains so that each update rolls out smoothly, with minimal impact to your workflows. For more information about update domains, see [Availability sets](../virtual-machines/availability-set-overview.md).
+
+Availability zone support for IoT Hub is enabled automatically for new IoT Hub resources created in the following Azure regions:
+
+| Region | Data resiliency | Smoother deployments |
+| | | |
+| Australia East | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Brazil South | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Canada Central | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Central India | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Central US | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| East US | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| France Central | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Germany West Central | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Japan East | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Korea Central | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| North Europe | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Norway East | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Qatar Central | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| South Central US | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| Southeast Asia | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| UK South | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| West Europe | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| West US 2 | :::image type="icon" source="./media/icons/yes-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
+| West US 3 | :::image type="icon" source="./media/icons/no-icon.png"::: | :::image type="icon" source="./media/icons/yes-icon.png"::: |
## Cross region DR
iot-hub Migrate Hub Arm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/migrate-hub-arm.md
This section provides specific instructions for migrating the hub.
### Export the original hub to a resource template
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to the IoT hub that you want to move.
If you moved the routing resources as well, update the name, ID, and resource gr
Create the new hub using the edited template. If you have routing resources that are going to move, the resources should be set up in the new location and the references in the template updated to match. If you aren't moving the routing resources, they should be in the template with the updated keys.
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Create a resource**.
iot-hub Quickstart Bicep Route Messages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/quickstart-bicep-route-messages.md
This section provides the steps to deploy the Bicep file, create a virtual devic
1. Download and unzip the [IoT C# SDK](https://github.com/Azure/azure-iot-sdk-csharp/archive/main.zip).
-1. Open a command window and go to the folder where you unzipped the IoT C# SDK. Find the folder with the arm-read-write.csproj file. You create the environment variables in this command window. Log into the [Azure portal](https://portal.azure.com) to get the keys. Select **Resource Groups** then select the resource group used for this quickstart.
+1. Open a command window and go to the folder where you unzipped the IoT C# SDK. Find the folder with the arm-read-write.csproj file. You create the environment variables in this command window. Sign in to the [Azure portal](https://portal.azure.com) to get the keys. Select **Resource Groups** then select the resource group used for this quickstart.
![Select the resource group](./media/horizontal-arm-route-messages/01-select-resource-group.png)
This section provides the steps to deploy the Bicep file, create a virtual devic
## Review deployed resources
-1. Log in to the [Azure portal](https://portal.azure.com) and select the Resource Group, then select the storage account.
+1. Sign in to the [Azure portal](https://portal.azure.com) and select the Resource Group, then select the storage account.
1. Drill down into the storage account until you find files.
iot-hub Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure IoT Hub description: Lists Azure Policy Regulatory Compliance controls available for Azure IoT Hub. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
key-vault Network Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/general/network-security.md
To allow an IP Address or range of an Azure resource, such as a Web App or Logic
1. Sign in to the Azure portal. 1. Select the resource (specific instance of the service).
-1. Select on the 'Properties' blade under 'Settings'.
-1. Look for the "IP Address" field.
+1. Select the **Properties** blade under **Settings**.
+1. Look for the **IP Address** field.
1. Copy this value or range and enter it into the key vault firewall allowlist. To allow an entire Azure service, through the Key Vault firewall, use the list of publicly documented data center IP addresses for Azure [here](https://www.microsoft.com/download/details.aspx?id=56519). Find the IP addresses associated with the service you would like in the region you want and add those IP addresses to the key vault firewall.
If you are trying to allow an Azure resource such as a virtual machine through k
In this case, you should create the resource within a virtual network, and then allow traffic from the specific virtual network and subnet to access your key vault.
-1. Sign in to the Azure portal
-1. Select the key vault you wish to configure
-1. Select the 'Networking' blade
-1. Select '+ Add existing virtual network'
+1. Sign in to the Azure portal.
+1. Select the key vault you wish to configure.
+1. Select the **Networking** blade.
+1. Select **+ Add existing virtual network**.
1. Select the virtual network and subnet you would like to allow through the key vault firewall. ### Key Vault Firewall Enabled (Private Link)
key-vault Key Vault Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/key-vault-insights-overview.md
From Azure Monitor, you can view request, latency, and failure details from mult
To view the utilization and operations of your key vaults across all your subscriptions, perform the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Monitor** from the left-hand pane in the Azure portal, and under the Insights section, select **Key Vaults**.
key-vault Managed Hsm Technical Details https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/managed-hsm/managed-hsm-technical-details.md
+
+ Title: How Managed HSM implements key sovereignty, availability, performance, and scalability without tradeoffs
+description: A technical description of how Customer Key control is implemented cryptographically by managed HSM
+++++ Last updated : 07/20/2023++
+# Azure Managed HSM: key sovereignty, availability, performance, and scalability
+
+Cryptographic keys are the root of trust for securing modern computer systems, be they in the cloud or on-premises. As such, controlling who has authority over those keys is critical to building secure and compliant applications. In Azure, our vision of how key management should be done in the cloud is expressed in the concept of **key sovereignty**: a customer's organization has full and exclusive control over which people can access keys and change key management policies, and over which Azure services consume these keys. Once the customer makes these decisions, Microsoft personnel are prevented through technical means from changing them. The key management service code executes the customer's decisions until the customer tells it to do otherwise, and Microsoft personnel cannot intervene.
+
+At the same time, it is our belief that every service in the cloud must be fully managed: it must deliver on the availability, resiliency, and security promises fundamental to the cloud, backed by service level agreements (SLAs). Delivering a managed service requires Microsoft to patch key management servers, upgrade HSM firmware, heal failing hardware, perform failovers, and carry out other high-privilege operations. As most security professionals know, denying someone with high privilege or physical access to a system access to the data within that system is a difficult problem. This article explains how we solved this problem in the Managed HSM service, giving customers both full key sovereignty and fully managed service SLAs, by using confidential computing technology paired with Hardware Security Modules (HSMs).
+
+## The Managed HSM hardware environment
+
+A customer's Managed HSM pool in any given Azure region is housed in a [secure Azure datacenter](../../security/fundamentals/physical-security.md), with three instances spread over several servers, each deployed in a different rack to ensure redundancy. Each server has a [FIPS 140-2 Level 3](https://csrc.nist.gov/publications/detail/fips/140/2/final) validated Marvell Liquid Security HSM Adapter with multiple cryptographic cores used to create fully isolated HSM partitions including fully isolated credentials, data storage, access control, etc.
+
+The physical separation of the instances inside the datacenter is critical to ensuring that the loss of a single component (top-of-rack switch, power management unit in a rack, etc.) can't affect all the instances of a pool. These servers are dedicated to the Azure Security HSM team, and are not shared with other Azure teams, and no customer workloads are deployed to these servers. Physical access controls, including locked racks, are used to prevent unauthorized access to the servers. These controls meet FedRAMP-High, PCI, SOC 1/2/3, ISO 270x, and other security and privacy standards, and are regularly independently verified as part of [Azure's compliance program](https://www.microsoft.com/trust-center/compliance/compliance-overview?rtc=1). The HSMs have enhanced physical security, validated to meet FIPS 140-2 Level 3 and the entire Managed HSM service is built on top of the standard [secure Azure platform](../../security/fundamentals/platform.md) including [Trusted Launch](../../virtual-machines/trusted-launch.md), which protects against advanced persistent threats (APTs).
+
+The HSM adapters can support dozens of isolated HSM partitions. Running on each server is a control process, called Node Service (NS), that takes ownership of each adapter and installs the credentials for the adapter owner, in this case Microsoft. The HSM is designed so that ownership of the adapter does not provide Microsoft with access to data stored in customer partitions. It only allows Microsoft to create, resize, and delete customer partitions, and it supports taking blind backups of any partition for the customer. A blind backup is one wrapped by a customer-provided key; it can be restored by the service code only inside an HSM instance owned by the customer, and its contents are not readable by Microsoft.
+
+### Architecture of a Managed HSM Pool
+
+Figure 1 below shows the architecture of an HSM pool, which consists of three Linux VMs, each running on an HSM server in its own datacenter rack to support availability. The important components are:
+- The HSM fabric controller (HFC) is the control plane for the service, which drives automated patching and repairs for the pool.
+- A FIPS 140-2 Level 3 compliant cryptographic boundary, exclusive for each customer, including three Intel SGX confidential enclaves, each connected to an HSM instance. The root keys for this boundary are generated and stored in the three HSMs. As we will describe, no Microsoft person has access to the data within this boundary; only service code running in the SGX enclave (including the Node Service agent), acting on behalf of the customer, has access.
+
+![Architectural diagram of an MHSM pool showing the TEEs inside the customer cryptographic boundary and health maintenance operations outside of the boundary.](../../media/mhsm-technical-details/mhsm-architecture.png)
+
+### The trusted execution environment (TEE)
+
+A Managed HSM pool consists of three service instances, each implemented as a Trusted Execution Environment (TEE) that uses [Intel Software Guard Extensions (SGX)](https://www.intel.com/content/www/us/en/architecture-and-technology/software-guard-extensions.html) capabilities and the [Open Enclave SDK](https://openenclave.io/sdk/). Execution within a TEE ensures that no person on either the VM hosting the service or the VM's host server has access to customer secrets, data, or the HSM partition. Each TEE is dedicated to a specific customer, and it runs TLS management, request handling, and access control to the HSM partition. No credentials or customer-specific data encryption keys exist in the clear outside this TEE, except as part of the Security Domain package. That package is encrypted to a customer-provided key and downloaded when the customer's pool is first created.
+
+The TEEs communicate among themselves using [attested TLS](https://arxiv.org/pdf/1801.05863.pdf), which combines the remote attestation capabilities of the SGX platform with TLS 1.2. This allows MHSM code in the TEE to limit its communication to only other code signed by the same MHSM service code signing key, preventing man-in-the-middle attacks. The MHSM service's code signing key is stored in Microsoft's Product Release and Security Service (which is also used to store, for example, the Windows code signing key), and is controlled by the Managed HSM team. As part of our regulatory and compliance obligations for change management, this key cannot be used by any other Microsoft team to sign their code.
+
+The TLS certificates used for TEE to TEE communication are self-issued by the service code inside the TEE, and contain what is called a platform report generated by the Intel SGX enclave on the server. The platform report is signed with keys derived from keys fused by Intel into the CPU when it's manufactured. The report identifies the code that is loaded into the SGX enclave by its code signing key and binary hash. Given this platform report, service instances can determine that a peer is also signed by the MHSM service code signing key and, with some crypto entanglement via the platform report, can also determine that the self-issued certificate signing key must also have been generated inside the TEE, preventing external impersonation.
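As a toy illustration of that binding (plain Python with no real SGX; the `mrsigner` field, report layout, and key values are simplified assumptions, not the actual platform report format), an enclave can place a hash of its self-issued TLS key into the attestation report's user data, so a peer that trusts the report also trusts the certificate key:

```python
import hashlib

def make_report(code_signer: str, tls_pubkey: bytes) -> dict:
    # A real platform report is signed by CPU-fused keys and verified against
    # Intel's infrastructure; here it's just a dict standing in for that.
    return {"mrsigner": code_signer,
            "report_data": hashlib.sha256(tls_pubkey).digest()}

def peer_accepts(report: dict, presented_pubkey: bytes, expected_signer: str) -> bool:
    # Accept only if the report comes from the expected code signer AND the
    # presented TLS key matches the hash baked into the report.
    return (report["mrsigner"] == expected_signer
            and report["report_data"] == hashlib.sha256(presented_pubkey).digest())

pubkey = b"-----fake tls public key-----"
report = make_report("mhsm-code-signing-key", pubkey)
assert peer_accepts(report, pubkey, "mhsm-code-signing-key")
assert not peer_accepts(report, b"attacker key", "mhsm-code-signing-key")
```

Because the report binds the key hash to attested code, an attacker cannot substitute their own TLS key without producing a report from differently-signed code.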
+
+## Delivering availability SLAs with full Customer Key control
+
+### Managed HSM pool creation
+
+The high-availability properties of Managed HSM pools come from the automatically managed triple redundant HSM instances that are always kept in sync (or if using [multi-region replication](multi-region-replication.md), from keeping all six instances in sync). Pool creation is managed by the HSM Fabric Controller (HFC) service that allocates pools across the available hardware in the Azure region chosen by the customer. When a new pool is requested, HFC selects three servers across several racks with available space on their HSM adapters, and starts creating the pool:
+1. HFC instructs Node Service on each of the three TEEs to launch a new instance of the service code with a set of parameters that identify the customer's Azure Active Directory tenant, the internal VNET IP addresses of all three instances, and some other service configuration. One partition is randomly assigned as Primary.
+1. The three instances start. Each instance connects to a partition on its local HSM adapter, then zeroizes and initializes the partition using randomly generated usernames and credentials (to ensure the partition cannot be accessed by a human operator or other TEE instance).
+1. The primary instance creates a partition owner root certificate with the private key generated in the HSM, and establishes ownership of the pool by signing a partition level certificate for the HSM partition with this root certificate. The primary also generates a data encryption key, which is used to protect all customer data at rest inside the service (for key material, a double wrapping is used as the HSM also protects the key material itself).
+1. Next, this ownership data is synchronized to the two secondary instances. Each secondary contacts the primary using attested TLS. The primary shares the partition owner root certificate with private key, and the data encryption key. The secondaries now use the partition root certificate to issue a partition certificate to their own HSM partitions. Once this is done, we have HSM partitions on three separate servers owned by the same partition root certificate.
+1. Still over the attested TLS link, the primary's HSM partition shares with the secondaries its generated data wrapping key (used to encrypt messages between the three HSMs), using a secure API provided by the HSM vendor. During this exchange, the HSMs confirm that they have the same partition owner certificate, and then use a Diffie-Hellman scheme to encrypt the messages such that Microsoft service code cannot read them; all the service code can do is transport opaque blobs between the HSMs.
+
+ At this point, all three instances are ready to be exposed as a pool on the customer's VNET: They share the same partition owner certificate and private key, the same data encryption key, and a common data wrapping key. However, each instance has unique credentials to their HSM partitions. Now the final steps are completed:
+
+1. Each instance generates an RSA key pair and a Certificate Signing Request (CSR) for its public facing TLS certificate. The CSR is signed by the Microsoft PKI system using a Microsoft public root, and the resultant TLS certificate is returned to the instance.
+1. All three instances obtain their own SGX sealing key from their local CPU: this key is generated using the CPU's own unique key, and the TEE's code signing key.
+1. The pool derives a unique pool key from the SGX sealing keys, encrypts all its secrets with this pool key, and then writes the encrypted blobs to disk. These blobs can only be decrypted by code that can derive the same SGX sealing key, which means code signed with the same code signing key and running on the same physical CPU. The secrets are thus bound to that particular instance.
+
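The derivation in the last step can be sketched with an HKDF-style construction (the labels, output length, and the stand-in sealing key below are assumptions for illustration; the service's actual KDF is not documented here):

```python
import hmac, hashlib

def hkdf(key_material: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 extract-and-expand with SHA-256 and a zero salt."""
    prk = hmac.new(b"\x00" * 32, key_material, hashlib.sha256).digest()  # extract
    out, block, counter = b"", b"", 1
    while len(out) < length:                                             # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

# Stand-in for the SGX sealing key, which hardware derives from the CPU's
# unique key and the enclave's code signing identity.
sealing_key = hashlib.sha256(b"cpu-unique-key||tee-code-signing-identity").digest()
pool_key = hkdf(sealing_key, info=b"mhsm-pool-key-v1")
assert len(pool_key) == 32
```

Because the sealing key never leaves the CPU that produced it, blobs encrypted under `pool_key` are recoverable only by the same signed code on the same physical processor.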
+The secure bootstrap process is now complete. This process has allowed for both the creation of a triple redundant HSM pool, and creation of a cryptographic guarantee of the sovereignty of customer data.
+
+### Maintaining availability SLAs at runtime using confidential service healing
+
+The pool creation process above explains how the Managed HSM service delivers its high availability SLAs by securely managing the servers that underlie the service. Imagine that a server, an HSM adapter, or even the power supply to the rack fails. The goal of the MHSM service is to heal the pool back to three healthy instances, without any customer intervention and without the possibility of secrets being exposed in clear text outside the TEE. This is achieved through confidential service healing.
+
+Healing starts with the HFC knowing which pools had instances on the failed server. HFC finds new, healthy servers within the pool's region to deploy the replacement instances to. It launches new instances, which are then treated exactly as a secondary during the initial provisioning step: initialize the HSM, find its primary, securely exchange secrets over attested TLS, sign the HSM into the ownership hierarchy, and then seal its service data to its new CPU. The service is now healed, fully automatically and fully confidentially.
+
+### Recovering from disaster using the security domain
+
+The Security Domain (SD) is a secured blob that contains all the credentials needed to rebuild the HSM partition from scratch: the partition owner key, the partition credentials, the data wrapping key, plus an initial backup of the HSM. Before the service becomes live, the customer must download the SD by providing a set of RSA encryption keys to secure it. The SD data originates in the TEEs and is protected by a generated symmetric key and an implementation of [Shamir's Secret Sharing algorithm](https://en.wikipedia.org/wiki/Shamir%27s_Secret_Sharing), which splits the key shares across the customer-provided RSA public keys according to customer-selected quorum parameters. During this process, none of the service keys or credentials are ever exposed in plaintext outside the service code running in the TEEs; only the customer, by presenting a quorum of their RSA keys to the TEE, can decrypt the SD during a recovery scenario.
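A minimal sketch of the quorum idea behind Shamir's Secret Sharing (the field prime, share format, and parameters are illustrative only, not the Managed HSM wire format): a symmetric key becomes the constant term of a random polynomial, and any `threshold` evaluations of that polynomial recover it.

```python
import secrets

PRIME = 2**521 - 1  # a Mersenne prime comfortably larger than a 256-bit key

def split_secret(secret: int, num_shares: int, threshold: int):
    """Split `secret` into shares; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_shares + 1):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation of the polynomial at x
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbits(256)
shares = split_secret(key, num_shares=5, threshold=3)
assert recover_secret(shares[:3]) == key   # any quorum of 3 shares works
assert recover_secret(shares[2:]) == key
```

Fewer than `threshold` shares reveal nothing about the key, which is what lets the customer distribute SD shares across several key holders while requiring a quorum to recover.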
+
+The SD is needed only in cases where, due to some catastrophe, an entire Azure region is lost and Microsoft loses all three instances of the pool simultaneously. If only one or even two instances are lost, then confidential service healing will quietly recover to three healthy instances with no customer intervention. If the entire region is lost, then because SGX sealing keys are unique to each CPU, Microsoft has no way to recover the HSM credentials and partition owner keys; they exist only within the context of the instances. In the extremely unlikely event that this catastrophe happens, the customer can recover their previous pool state and data by creating a new blank pool, injecting the SD into it, and then presenting their RSA key quorum to prove ownership of the SD. For the case where a customer has enabled multi-region replication, the even more unlikely catastrophe of both regions experiencing a simultaneous, complete failure would have to happen before customer intervention would be needed to recover the pool from the SD.
+
+### Controlling access to the service
+
+As we have described, our service code in the TEE is the only entity with access to the HSM itself, as the necessary credentials are not given to the customer or anyone else. Instead, the customer's pool is bound to their Azure Active Directory instance, and this is used for authentication and authorization. At initial provisioning, the customer can choose an initial set of Administrators for the pool, and these individuals, as well as the customer's Azure Active Directory tenant Global Administrator, can then set access control policies within the pool. All access control policies are stored by the service in the same database as the masked keys, also encrypted. Only the service code in the TEE has access to these access control policies.
+
+## Conclusion
+
+Managed HSM removes the need for customers to make tradeoffs between availability and control over cryptographic keys by using cutting-edge, hardware-backed confidential enclave technology. As discussed in this article, the implementation is such that no Microsoft personnel can access Customer Key material or related secrets, even with physical access to the Managed HSM host machines and HSMs. This has allowed our customers in the financial services, manufacturing, public sector, defense, and other verticals to accelerate their migration to the cloud with full confidence.
+
+## What's next
+
+Further reading:
+- [Azure Key Vault Managed HSM: Control your data in the cloud](mhsm-control-data.md)
+- [About the Managed HSM security domain](security-domain.md)
+- [Managed HSM access control](access-control.md)
+- [Local RBAC built in roles](built-in-roles.md)
+- [Managing compliance in the cloud](https://www.microsoft.com/trust-center/compliance/compliance-overview?rtc=1)
key-vault Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/managed-hsm/overview.md
The term "Managed HSM instance" is synonymous with "Managed HSM pool". To avoid
## Next steps - [Key management in Azure](../../security/fundamentals/key-management.md)
+- For technical details, see [How Managed HSM implements key sovereignty, availability, performance, and scalability without tradeoffs](managed-hsm-technical-details.md)
- See [Quickstart: Provision and activate a managed HSM using Azure CLI](quick-create-cli.md) to create and activate a managed HSM - [Azure Managed HSM security baseline](/security/benchmark/azure/baselines/key-vault-managed-hsm-security-baseline) - See [Best Practices using Azure Key Vault Managed HSM](best-practices.md)
key-vault Quick Create Go https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/secrets/quick-create-go.md
Get started with the [azsecrets](https://aka.ms/azsdk/go/keyvault-secrets/docs)
For purposes of this quickstart, you use the [azidentity](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/azidentity) package to authenticate to Azure by using the Azure CLI. To learn about the various authentication methods, see [Azure authentication with the Azure SDK for Go](/azure/developer/go/azure-sdk-authentication).
-### Sign in to the Azure portal
+### Sign in to the Azure portal
1. In the Azure CLI, run the following command:
key-vault Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Key Vault description: Lists Azure Policy Regulatory Compliance controls available for Azure Key Vault. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
load-balancer Load Balancer Ipv6 For Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/load-balancer-ipv6-for-linux.md
Last updated 04/21/2023 -+ # Configure DHCPv6 for Linux VMs
load-balancer Load Balancer Multiple Virtual Machine Scale Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/load-balancer-multiple-virtual-machine-scale-set.md
In this section, you'll learn how to attach your Virtual Machine Scale Sets be
# [Azure portal](#tab/azureportal)
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In the search box at the top of the portal, enter **Load balancer**. Select **Load balancers** in the search results. 1. Select your balancer from the list. 1. In your load balancer's page, select **Backend pools** under **Settings**.
load-balancer Monitor Load Balancer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/monitor-load-balancer.md
When you create a diagnostic setting, you specify which categories of logs to co
### Portal
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the search box at the top of the portal, enter **Load balancer**.
load-balancer Skus https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/skus.md
Previously updated : 04/20/2023 Last updated : 07/10/2023
To compare and understand the differences between Basic and Standard SKU, see th
| **Global VNet Peering Support** | Standard Internal Load Balancer is supported via Global VNet Peering | Not supported | | **[NAT Gateway Support](../virtual-network/nat-gateway/nat-overview.md)** | Both Standard Internal Load Balancer and Standard Public Load Balancer are supported via Nat Gateway | Not supported | | **[Private Link Support](../private-link/private-link-overview.md)** | Standard Internal Load Balancer is supported via Private Link | Not supported |
-| **[Global tier (Preview)](./cross-region-overview.md)** | Standard Load Balancer supports the Global tier for Public Load Balancers enabling cross-region load balancing | Not supported |
+| **[Global tier](./cross-region-overview.md)** | Standard Load Balancer supports the Global tier for Public Load Balancers enabling cross-region load balancing | Not supported |
For more information, see [Load balancer limits](../azure-resource-manager/management/azure-subscription-service-limits.md#load-balancer). For Standard Load Balancer details, see [overview](./load-balancer-overview.md), [pricing](https://aka.ms/lbpricing), and [SLA](https://aka.ms/lbsla). For information on Gateway SKU - catered for third-party network virtual appliances (NVAs), see [Gateway Load Balancer overview](gateway-overview.md)
load-testing How To Test Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-testing/how-to-test-private-endpoint.md
If you plan to further restrict access to your virtual network with a network se
To configure outbound access for Azure Load Testing:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to your network security group.
logic-apps Create Run Custom Code Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-run-custom-code-functions.md
ms.suite: integration
Previously updated : 05/22/2023 Last updated : 07/21/2023 # Customer intent: As a logic app workflow developer, I want to write and run my own .NET Framework code to perform custom integration tasks.
The latest Azure Logic Apps (Standard) extension for Visual Studio Code includes
1. Open Visual Studio Code. On the Activity Bar, select the **Azure** icon. (Keyboard: Shift+Alt+A)
-1. In the **Azure** window that opens, on the **Workspace** toolbar, select **Create new logic app workspace**.
+1. In the **Azure** window that opens, on the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Create new logic app workspace**.
- :::image type="content" source="media/create-run-custom-code-functions/create-workspace.png" alt-text="Screenshot shows Visual Studio Code, Azure window, and selected option for Create new logic app workspace.":::
+ :::image type="content" source="media/create-run-custom-code-functions/create-workspace.png" alt-text="Screenshot shows Visual Studio Code, Azure window, Workspace section toolbar, and selected option for Create new logic app workspace.":::
1. In the **Create new logic app workspace** prompt that appears, find and select the local folder that you created for your project.
logic-apps Create Single Tenant Workflows Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-single-tenant-workflows-visual-studio-code.md
ms.suite: integration Previously updated : 05/23/2023 Last updated : 07/21/2023 # Customer intent: As a logic apps developer, I want to create a Standard logic app workflow that runs in single-tenant Azure Logic Apps using Visual Studio Code.
Before you can create your logic app, create a local project so that you can man
1. In Visual Studio Code, close all open folders.
-1. In the Azure window, on the **Workspace** section toolbar, select **Create New Project** (folder icon).
+1. In the **Azure** window, on the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Create New Project**.
- ![Screenshot shows Azure window and Workspace toolbar with Create New Project selected.](./media/create-single-tenant-workflows-visual-studio-code/create-new-project-folder.png)
+ ![Screenshot shows Azure window, Workspace toolbar, and Azure Logic Apps menu with Create New Project selected.](./media/create-single-tenant-workflows-visual-studio-code/create-new-project-folder.png)
1. If Windows Defender Firewall prompts you to grant network access for `Code.exe`, which is Visual Studio Code, and for `func.exe`, which is the Azure Functions Core Tools, select **Private networks, such as my home or work network** **>** **Allow access**.
Deployment for the Standard logic app resource requires a hosting plan and prici
1. On the Visual Studio Code Activity Bar, select the Azure icon to open the Azure window.
-1. In the **Workspace** section, on the toolbar, select **Deploy** > **Deploy to Logic App**.
+1. In the **Azure** window, on the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Deploy to Logic App**.
- ![Screenshot shows Azure window with Workspace toolbar and Deploy shortcut menu with Deploy to Logic App selected.](./media/create-single-tenant-workflows-visual-studio-code/deploy-to-logic-app.png)
+ ![Screenshot shows Azure window with Workspace toolbar and Azure Logic Apps shortcut menu with Deploy to Logic App selected.](./media/create-single-tenant-workflows-visual-studio-code/deploy-to-logic-app.png)
1. If prompted, select the Azure subscription to use for your logic app deployment.
Deployment for the Standard logic app resource requires a hosting plan and prici
1. For optimal performance, select the same resource group as your project for the deployment. > [!NOTE]
+ >
> Although you can create or use a different resource group, doing so might affect performance. > If you create or choose a different resource group, but cancel after the confirmation prompt appears, > your deployment is also canceled.
Deployment for the Standard logic app resource requires a hosting plan and prici
``` > [!TIP]
+ >
> You can check whether the trigger and action names correctly appear in your Application Insights instance. > > 1. In the Azure portal, go to your Application Insights resource.
You can have multiple workflows in your logic app project. To add a blank workfl
1. On the Visual Studio Code Activity Bar, select the Azure icon.
-1. In the Azure window, in the **Workspace** section, on the toolbar, select **Create Workflow** (Azure Logic Apps icon).
+1. In the **Azure** window, on the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Create workflow**.
1. Select the workflow type that you want to add: **Stateful** or **Stateless**
logic-apps Export From Consumption To Standard Logic App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/export-from-consumption-to-standard-logic-app.md
ms.suite: integration Previously updated : 10/28/2022 Last updated : 07/21/2023 #Customer intent: As a developer, I want to export one or more Consumption workflows to a Standard workflow.
Consider the following recommendations when you select logic apps for export:
1. In Visual Studio Code, sign in to Azure, if you haven't already.
-1. In the left navigation bar, select **Azure** to open the **Azure** window (Shift + Alt + A), and expand the **Logic Apps (Standard)** extension view.
+1. On the Visual Studio Code Activity Bar, select **Azure** to open the **Azure** window (Shift + Alt + A).
- :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-azure-view.png" alt-text="Screenshot showing Visual Studio Code with 'Azure' view selected.":::
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-azure-view.png" alt-text="Screenshot showing Visual Studio Code Activity Bar with Azure icon selected.":::
-1. On the extension toolbar, select **Export Logic App...**.
+1. On the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Export Logic App**.
- :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-export-logic-app.png" alt-text="Screenshot showing Visual Studio Code and 'Logic Apps (Standard)' extension toolbar with 'Export Logic App' selected.":::
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-export-logic-app.png" alt-text="Screenshot showing Azure window, Workspace section toolbar, and Export Logic App selected.":::
1. After the **Export** tab opens, select your Azure subscription and region, and then select **Next**.
- :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-subscription-consumption.png" alt-text="Screenshot showing 'Export' tab and 'Select logic app instance' section with Azure subscription and region selected.":::
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-subscription-consumption.png" alt-text="Screenshot showing Export tab with Azure subscription and region selected.":::
1. Select the logic apps to export. Each selected logic app appears on the **Selected logic apps** list to the side. When you're done, select **Next**.
logic-apps Export From Ise To Standard Logic App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/export-from-ise-to-standard-logic-app.md
ms.suite: integration Previously updated : 10/28/2022 Last updated : 07/21/2023 #Customer intent: As a developer, I want to export one or more ISE workflows to a Standard workflow.
Consider the following recommendations when you select logic apps for export:
1. In Visual Studio Code, sign in to Azure, if you haven't already.
-1. In the left navigation bar, select **Azure** to open the **Azure** window (Shift + Alt + A), and expand the **Logic Apps (Standard)** extension view.
+1. On the Visual Studio Code Activity Bar, select **Azure** to open the **Azure** window (Shift + Alt + A).
- ![Screenshot showing Visual Studio Code with 'Azure' view selected.](media/export-from-ise-to-standard-logic-app/select-azure-view.png)
+ ![Screenshot showing Visual Studio Code Activity Bar with Azure icon selected.](media/export-from-ise-to-standard-logic-app/select-azure-view.png)
-1. On the extension toolbar, select **Export Logic App...**.
+1. On the **Workspace** section toolbar, from the **Azure Logic Apps** menu, select **Export Logic App**.
- ![Screenshot showing Visual Studio Code and 'Logic Apps (Standard)' extension toolbar with 'Export Logic App' selected.](media/export-from-ise-to-standard-logic-app/select-export-logic-app.png)
+ ![Screenshot showing Azure window, Workspace section toolbar, and Export Logic App selected.](media/export-from-ise-to-standard-logic-app/select-export-logic-app.png)
1. After the **Export** tab opens, select your Azure subscription and ISE instance, and then select **Next**.
- ![Screenshot showing 'Export' tab and 'Select logic app instance' section with Azure subscription and ISE instance selected.](media/export-from-ise-to-standard-logic-app/select-subscription-ise.png)
+ ![Screenshot showing Export tab with Azure subscription and ISE instance selected.](media/export-from-ise-to-standard-logic-app/select-subscription-ise.png)
1. Select the logic apps to export. Each selected logic app appears on the **Selected logic apps** list to the side. When you're done, select **Next**.
logic-apps Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Logic Apps description: Lists Azure Policy Regulatory Compliance controls available for Azure Logic Apps. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
machine-learning Concept Designer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-designer.md
As shown in below GIF, you can build a pipeline visually by dragging and droppin
The building blocks of pipeline are called assets in Azure Machine Learning, which includes: - [Data](./concept-data.md)
+ - [Model](https://learn.microsoft.com/azure/machine-learning/how-to-manage-models?view=azureml-api-2&tabs=cli%2Cuse-local)
- [Component](./concept-component.md) Designer has an asset library on the left side, where you can access all the assets you need to create your pipeline. It shows both the assets you created in your workspace, and the assets shared in [registry](./how-to-share-models-pipelines-across-workspaces-with-registries.md) that you have permission to access.
machine-learning Concept Sourcing Human Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-sourcing-human-data.md
# What is "human data" and why is it important to source responsibly? Human data is data collected directly from, or about, people. Human data may include personal data such as names, age, images, or voice clips and sensitive data such as genetic data, biometric data, gender identity, religious beliefs, or political affiliations.
machine-learning Dsvm Ubuntu Intro https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/data-science-virtual-machine/dsvm-ubuntu-intro.md
Last updated 04/18/2023-+ #Customer intent: As a data scientist, I want to learn how to provision the Linux DSVM so that I can move my existing workflow to the cloud.
machine-learning Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/data-science-virtual-machine/overview.md
description: Overview of Azure Data Science Virtual Machine - An easy to use vir
keywords: data science tools, data science virtual machine, tools for data science, linux data science + Last updated 06/23/2022- # What is the Azure Data Science Virtual Machine for Linux and Windows?
machine-learning Reference Ubuntu Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/data-science-virtual-machine/reference-ubuntu-vm.md
description: Details on tools included in the Ubuntu Data Science Virtual Machin
-+ Last updated 04/18/2023 -- # Reference: Ubuntu (Linux) Data Science Virtual Machine
machine-learning Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/data-science-virtual-machine/release-notes.md
description: Release notes for the Azure Data Science Virtual Machine + Last updated 04/18/2023
machine-learning Ubuntu Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/data-science-virtual-machine/ubuntu-upgrade.md
description: Learn how to upgrade from CentOS and Ubuntu 18.04 to the latest Ubu
keywords: deep learning, AI, data science tools, data science virtual machine, team data science process +
machine-learning How To Administrate Data Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-administrate-data-authentication.md
Learn how to manage data access and how to authenticate in Azure Machine Learning > [!IMPORTANT] > The information in this article is intended for Azure administrators who are creating the infrastructure required for an Azure Machine Learning solution.
machine-learning How To Auto Train Image Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-auto-train-image-models.md
Previously updated : 07/13/2022 Last updated : 07/16/2023 #Customer intent: I'm a data scientist with ML knowledge in the computer vision space, looking to build ML models using image data in Azure Machine Learning with full control of the model architecture, hyperparameters, and training and deployment environments.
Last updated 07/13/2022
[!INCLUDE [dev v2](includes/machine-learning-dev-v2.md)]
-In this article, you learn how to train computer vision models on image data with automated ML with the Azure Machine Learning CLI extension v2 or the Azure Machine Learning Python SDK v2.
+In this article, you learn how to train computer vision models on image data with automated ML. You can train models using the Azure Machine Learning CLI extension v2 or the Azure Machine Learning Python SDK v2.
Automated ML supports model training for computer vision tasks like image classification, object detection, and instance segmentation. Authoring AutoML models for computer vision tasks is currently supported via the Azure Machine Learning Python SDK. The resulting experimentation trials, models, and outputs are accessible from the Azure Machine Learning studio UI. [Learn more about automated ml for computer vision tasks on image data](concept-automated-ml.md).
Automated ML supports model training for computer vision tasks like image classi
To install the SDK, you can either: * Create a compute instance, which automatically installs the SDK and is pre-configured for ML workflows. For more information, see [Create an Azure Machine Learning compute instance](how-to-create-compute-instance.md).
- * Use the following commands to install Azure Machine Learning Python SDK v2:
+ * Use the following commands to install the Azure Machine Learning Python SDK v2:
* Uninstall previous preview version: ```python pip uninstall azure-ai-ml
Field| Description
`image_details`|Image metadata information consists of height, width, and format. This field is optional and hence may or may not exist. `label`| A json representation of the image label, based on the task type.
-The following is a sample JSONL file for image classification:
+The following example shows a sample JSONL file for image classification:
```json {
transformations:
column_type: stream_info ```
-Automated ML doesn't impose any constraints on training or validation data size for computer vision tasks. Maximum dataset size is only limited by the storage layer behind the dataset (i.e. blob store). There's no minimum number of images or labels. However, we recommend starting with a minimum of 10-15 samples per label to ensure the output model is sufficiently trained. The higher the total number of labels/classes, the more samples you need per label.
+Automated ML doesn't impose any constraints on training or validation data size for computer vision tasks. Maximum dataset size is only limited by the storage layer behind the dataset (for example, blob store). There's no minimum number of images or labels. However, we recommend starting with a minimum of 10-15 samples per label to ensure the output model is sufficiently trained. The higher the total number of labels/classes, the more samples you need per label.
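The 10-15 samples-per-label guidance can be checked before submitting a job. The following is a minimal sketch, assuming classification-style JSONL annotations as described earlier (one JSON object per line with a `label` field); the function names are illustrative only:

```python
import json
from collections import Counter

MIN_SAMPLES_PER_LABEL = 10  # recommended floor from the guidance above

def count_labels(jsonl_lines):
    """Count samples per label across JSONL annotation lines."""
    counts = Counter()
    for line in jsonl_lines:
        record = json.loads(line)
        counts[record["label"]] += 1
    return counts

def underrepresented(counts, minimum=MIN_SAMPLES_PER_LABEL):
    """Return labels that fall below the recommended sample count."""
    return sorted(label for label, n in counts.items() if n < minimum)
```

For multi-label tasks, where `label` is a list, you would extend `count_labels` to iterate over each label in the list.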
# [Azure CLI](#tab/cli) [!INCLUDE [cli v2](includes/machine-learning-cli-v2.md)]
-Training data is a required parameter and is passed in using the `training_data` key. You can optionally specify another MLtable as a validation data with the `validation_data` key. If no validation data is specified, 20% of your training data will be used for validation by default, unless you pass `validation_data_size` argument with a different value.
+Training data is a required parameter and is passed in using the `training_data` key. You can optionally specify another MLtable as a validation data with the `validation_data` key. If no validation data is specified, 20% of your training data is used for validation by default, unless you pass `validation_data_size` argument with a different value.
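The default split behavior described above can be expressed as simple arithmetic. This is a sketch only; the service performs the actual split:

```python
def validation_split(n_training_samples, validation_data_size=None, has_validation_data=False):
    """Compute (train, validation) sample counts under the documented defaults.

    If explicit validation data is supplied, no split is taken from training data.
    Otherwise the held-out fraction is validation_data_size, defaulting to 20%.
    """
    if has_validation_data:
        return n_training_samples, 0
    fraction = 0.2 if validation_data_size is None else validation_data_size
    n_validation = int(n_training_samples * fraction)
    return n_training_samples - n_validation, n_validation
```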
Target column name is a required parameter and used as target for supervised ML task. It's passed in using the `target_column_name` key. For example,
You can create data inputs from training and validation MLTable from your local
[!Notebook-python[] (~/azureml-examples-main/sdk/python/jobs/automl-standalone-jobs/automl-image-object-detection-task-fridge-items/automl-image-object-detection-task-fridge-items.ipynb?name=data-load)]
-Training data is a required parameter and is passed in using the `training_data` parameter of the task specific `automl` type function. You can optionally specify another MLTable as a validation data with the `validation_data` parameter. If no validation data is specified, 20% of your training data will be used for validation by default, unless you pass `validation_data_size` argument with a different value.
+Training data is a required parameter and is passed in using the `training_data` parameter of the task specific `automl` type function. You can optionally specify another MLTable as a validation data with the `validation_data` parameter. If no validation data is specified, 20% of your training data is used for validation by default, unless you pass `validation_data_size` argument with a different value.
Target column name is a required parameter and used as target for supervised ML task. It's passed in using the `target_column_name` parameter of the task specific `automl` function. For example,
image_object_detection_job = automl.image_object_detection(
## Compute to run experiment
-Provide a [compute target](concept-azure-machine-learning-architecture.md#compute-targets) for automated ML to conduct model training. Automated ML models for computer vision tasks require GPU SKUs and support NC and ND families. We recommend the NCsv3-series (with v100 GPUs) for faster training. A compute target with a multi-GPU VM SKU leverages multiple GPUs to also speed up training. Additionally, when you set up a compute target with multiple nodes you can conduct faster model training through parallelism when tuning hyperparameters for your model.
+Provide a [compute target](concept-azure-machine-learning-architecture.md#compute-targets) for automated ML to conduct model training. Automated ML models for computer vision tasks require GPU SKUs and support NC and ND families. We recommend the NCsv3-series (with v100 GPUs) for faster training. A compute target with a multi-GPU VM SKU uses multiple GPUs to also speed up training. Additionally, when you set up a compute target with multiple nodes you can conduct faster model training through parallelism when tuning hyperparameters for your model.
> [!NOTE] > If you are using a [compute instance](concept-compute-instance.md) as your compute target, please make sure that multiple AutoML jobs are not run at the same time. Also, please make sure that `max_concurrent_trials` is set to 1 in your [job limits](#job-limits).
image_object_detection_job = automl.image_object_detection(
For computer vision tasks, you can launch either [individual trials](#individual-trials), [manual sweeps](#manually-sweeping-model-hyperparameters) or [automatic sweeps](#automatically-sweeping-model-hyperparameters-automode). We recommend starting with an automatic sweep to get a first baseline model. Then, you can try out individual trials with certain models and hyperparameter configurations. Finally, with manual sweeps you can explore multiple hyperparameter values near the more promising models and hyperparameter configurations. This three step workflow (automatic sweep, individual trials, manual sweeps) avoids searching the entirety of the hyperparameter space, which grows exponentially in the number of hyperparameters.
-Automatic sweeps can yield competitive results for many datasets. Additionally, they do not require advanced knowledge of model architectures, they take into account hyperparameter correlations and they work seamlessly across different hardware setups. All these reasons make them a strong option for the early stage of your experimentation process.
+Automatic sweeps can yield competitive results for many datasets. Additionally, they don't require advanced knowledge of model architectures, they take into account hyperparameter correlations and they work seamlessly across different hardware setups. All these reasons make them a strong option for the early stage of your experimentation process.
### Primary metric
limits:
> [!IMPORTANT] > This feature is currently in public preview. This preview version is provided without a service-level agreement. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-It is generally hard to predict the best model architecture and hyperparameters for a dataset. Also, in some cases the human time allocated to tuning hyperparameters may be limited. For computer vision tasks, you can specify a number of trials and the system will automatically determine the region of the hyperparameter space to sweep. You do not have to define a hyperparameter search space, a sampling method or an early termination policy.
+It's hard to predict the best model architecture and hyperparameters for a dataset. Also, in some cases the human time allocated to tuning hyperparameters may be limited. For computer vision tasks, you can specify a number of trials and the system automatically determines the region of the hyperparameter space to sweep. You don't have to define a hyperparameter search space, a sampling method, or an early termination policy.
#### Triggering AutoMode
-You can run automatic sweeps by setting `max_trials` to a value greater than 1 in `limits` and by not specifying the search space, sampling method and termination policy. We call this functionality AutoMode; please see an example below.
+You can run automatic sweeps by setting `max_trials` to a value greater than 1 in `limits` and by not specifying the search space, sampling method, and termination policy. We call this functionality AutoMode; see the following example.
# [Azure CLI](#tab/cli)
image_object_detection_job.set_limits(max_trials=10, max_concurrent_trials=2)
```
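For the CLI tab, the equivalent AutoMode configuration is just a `limits` block with no `search_space`, `sampling_algorithm`, or early termination policy. A minimal sketch (values are illustrative):

```yaml
limits:
  max_trials: 10
  max_concurrent_trials: 2
```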
-A number of trials between 10 and 20 will likely work well on many datasets. The [time budget](#job-limits) for the AutoML job can still be set, but we recommend doing this only if each trial may take a long time.
+A number of trials between 10 and 20 likely works well on many datasets. The [time budget](#job-limits) for the AutoML job can still be set, but we recommend doing this only if each trial may take a long time.
> [!Warning] > Launching automatic sweeps via the UI is not supported at this time.
Learn more about [how to configure the early termination policy for your hyperpa
> For a complete sweep configuration sample, please refer to this [tutorial](tutorial-auto-train-image-models.md#manual-hyperparameter-sweeping-for-image-tasks).
-You can configure all the sweep related parameters as shown in the example below.
+You can configure all the sweep related parameters as shown in the following example.
# [Azure CLI](#tab/cli)
sweep:
#### Fixed settings
-You can pass fixed settings or parameters that don't change during the parameter space sweep as shown below.
+You can pass fixed settings or parameters that don't change during the parameter space sweep as shown in the following example.
# [Azure CLI](#tab/cli)
training_parameters:
## Data augmentation
-In general, deep learning model performance can often improve with more data. Data augmentation is a practical technique to amplify the data size and variability of a dataset which helps to prevent overfitting and improve the model's generalization ability on unseen data. Automated ML applies different data augmentation techniques based on the computer vision task, before feeding input images to the model. Currently, there's no exposed hyperparameter to control data augmentations.
+In general, deep learning model performance can often improve with more data. Data augmentation is a practical technique to amplify the data size and variability of a dataset, which helps to prevent overfitting and improve the model's generalization ability on unseen data. Automated ML applies different data augmentation techniques based on the computer vision task, before feeding input images to the model. Currently, there's no exposed hyperparameter to control data augmentations.
|Task | Impacted dataset | Data augmentation technique(s) applied | |-|-||
In general, deep learning model performance can often improve with more data. Da
|Object detection using yolov5| Training <br><br> Validation & Test |Mosaic, random affine (rotation, translation, scale, shear), horizontal flip <br><br><br> Letterbox resizing| Currently the augmentations defined above are applied by default for an Automated ML for image job. To provide control over augmentations, Automated ML for images exposes the following two flags to turn off certain augmentations. Currently, these flags are only supported for object detection and instance segmentation tasks.
- 1. **apply_mosaic_for_yolo:** This flag is only specific to Yolo model. Setting it to False turns off the mosaic data augmentation which is applied at the training time.
+ 1. **apply_mosaic_for_yolo:** This flag is only specific to Yolo model. Setting it to False turns off the mosaic data augmentation, which is applied at the training time.
2. **apply_automl_train_augmentations:** Setting this flag to false turns off the augmentation applied during training time for the object detection and instance segmentation models. For augmentations, see the details in the table above.
- - For non-yolo object detection model and instance segmentation models, this flag turns off only the first three augmentations i.e., *Random crop around bounding boxes, expand, horizontal flip*. The normalization and resize augmentations are still applied regardless of this flag.
+ - For non-yolo object detection and instance segmentation models, this flag turns off only the first three augmentations: *random crop around bounding boxes, expand, horizontal flip*. The normalization and resize augmentations are still applied regardless of this flag.
- For Yolo model, this flag turns off the random affine and horizontal flip augmentations. These two flags are supported via *advanced_settings* under *training_parameters* and can be controlled in the following way.
training_parameters:
advanced_settings: > {"apply_automl_train_augmentations": false} ```
- Please note that these two flags are independent of each other and can also be used in combination using the following settings.
+ Note that these two flags are independent of each other and can also be used in combination using the following settings.
```yaml training_parameters: advanced_settings: >
In our experiments, we found that these augmentations help the model to generali
## Incremental training (optional)
-Once the training job is done, you have the option to further train the model by loading the trained model checkpoint. You can either use the same dataset or a different one for incremental training.
+Once the training job is done, you can further train the model by loading the trained model checkpoint. You can either use the same dataset or a different one for incremental training. If you're satisfied with the model, you can stop training and use the current model.
### Pass the checkpoint via job ID
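As a sketch only, passing a previous run's checkpoint in the CLI YAML might look like the following; the `checkpoint_run_id` property name and placeholder value are assumptions, since the excerpt elides this section:

```yaml
training_parameters:
  checkpoint_run_id: "<job-id-of-previous-training-run>"
```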
When you've configured your AutoML Job to the desired settings, you can submit t
## Outputs and evaluation metrics
-The automated ML training jobs generates output model files, evaluation metrics, logs and deployment artifacts like the scoring file and the environment file which can be viewed from the outputs and logs and metrics tab of the child jobs.
+The automated ML training job generates output model files, evaluation metrics, logs, and deployment artifacts like the scoring file and the environment file. You can view these files and metrics from the outputs, logs, and metrics tabs of the child jobs.
> [!TIP] > Check how to navigate to the job results from the [View job results](how-to-understand-automated-ml.md#view-job-results) section.
For definitions and examples of the performance charts and metrics provided for
## Register and deploy model
-Once the job completes, you can register the model that was created from the best trial (configuration that resulted in the best primary metric). You can either register the model after downloading or by specifying the azureml path with corresponding jobid. Note: If you want to change the inference settings that are described below you need to download the model and change settings.json and register using the updated model folder.
+Once the job completes, you can register the model that was created from the best trial (the configuration that resulted in the best primary metric). You can either register the model after downloading it or by specifying the azureml path with the corresponding jobid. Note: If you want to change the inference settings described below, you need to download the model, change settings.json, and register the model using the updated model folder.
### Get the best trial
auth_mode: key
### Create the endpoint
-Using the `MLClient` created earlier, we'll now create the Endpoint in the workspace. This command will start the endpoint creation and return a confirmation response while the endpoint creation continues.
+Using the `MLClient` created earlier, we now create the endpoint in the workspace. This command starts the endpoint creation and returns a confirmation response while the endpoint creation continues.
# [Azure CLI](#tab/cli)
Each of the tasks (and some models) has a set of parameters. By default, we use
|Object detection using `yolov5`| `img_size`<br>`model_size`<br>`box_score_thresh`<br>`nms_iou_thresh` | 640<br>medium<br>0.1<br>0.5 | |Instance segmentation| `min_size`<br>`max_size`<br>`box_score_thresh`<br>`nms_iou_thresh`<br>`box_detections_per_img`<br>`mask_pixel_score_threshold`<br>`max_number_of_polygon_points`<br>`export_as_image`<br>`image_type` | 600<br>1333<br>0.3<br>0.5<br>100<br>0.5<br>100<br>False<br>JPG|
-For a detailed description on task specific hyperparameters, please refer to [Hyperparameters for computer vision tasks in automated machine learning](./reference-automl-images-hyperparameters.md).
+For a detailed description of task-specific hyperparameters, see [Hyperparameters for computer vision tasks in automated machine learning](./reference-automl-images-hyperparameters.md).
-If you want to use tiling, and want to control tiling behavior, the following parameters are available: `tile_grid_size`, `tile_overlap_ratio` and `tile_predictions_nms_thresh`. For more details on these parameters please check [Train a small object detection model using AutoML](./how-to-use-automl-small-object-detect.md).
+If you want to use tiling and control tiling behavior, the following parameters are available: `tile_grid_size`, `tile_overlap_ratio`, and `tile_predictions_nms_thresh`. For more details on these parameters, see [Train a small object detection model using AutoML](./how-to-use-automl-small-object-detect.md).
### Test the deployment
-Please check this [Test the deployment](./tutorial-auto-train-image-models.md#test-the-deployment) section to test the deployment and visualize the detections from the model.
+Check this [Test the deployment](./tutorial-auto-train-image-models.md#test-the-deployment) section to test the deployment and visualize the detections from the model.
## Generate explanations for predictions
Some of the advantages of using Explainable AI (XAI) with AutoML for images:
- Helps in discovering the bias

### Explanations
-Explanations are **feature attributions** or weights given to each pixel in the input image based on its contribution to model's prediction. Each weight can be negative (negatively correlated with the prediction) or positive (positively correlated with the prediction). These attributions are calculated against the predicted class. For multi-class classification, exactly one attribution matrix of size `[3, valid_crop_size, valid_crop_size]` will be generated per sample, whereas for multi-label classification, attribution matrix of size `[3, valid_crop_size, valid_crop_size]` will be generated for each predicted label/class for each sample.
+Explanations are **feature attributions** or weights given to each pixel in the input image based on its contribution to the model's prediction. Each weight can be negative (negatively correlated with the prediction) or positive (positively correlated with the prediction). These attributions are calculated against the predicted class. For multi-class classification, exactly one attribution matrix of size `[3, valid_crop_size, valid_crop_size]` is generated per sample, whereas for multi-label classification, an attribution matrix of size `[3, valid_crop_size, valid_crop_size]` is generated for each predicted label/class for each sample.
-Using Explainable AI in AutoML for Images on the deployed endpoint, users can get **visualizations** of explanations (attributions overlaid on an input image) and/or **attributions** (multi-dimensional array of size `[3, valid_crop_size, valid_crop_size]`) for each image. Apart from visualizations, users can also get attribution matrices to gain more control over the explanations (like generating custom visualizations using attributions or scrutinizing segments of attributions). All the explanation algorithms will use cropped square images with size `valid_crop_size` for generating attributions.
+Using Explainable AI in AutoML for Images on the deployed endpoint, users can get **visualizations** of explanations (attributions overlaid on an input image) and/or **attributions** (multi-dimensional array of size `[3, valid_crop_size, valid_crop_size]`) for each image. Apart from visualizations, users can also get attribution matrices to gain more control over the explanations (like generating custom visualizations using attributions or scrutinizing segments of attributions). All the explanation algorithms use cropped square images with size `valid_crop_size` for generating attributions.
-Explanations can be generated either from **online endpoint** or **batch endpoint**. Once the deployment is done, this endpoint can be utilized to generate the explanations for predictions. In case of online deployment, make sure to pass `request_settings = OnlineRequestSettings(request_timeout_ms=90000)` parameter to `ManagedOnlineDeployment` and set `request_timeout_ms` to its maximum value to avoid **timeout issues** while generating explanations (refer to [register and deploy model section](#register-and-deploy-model)). Some of the explainability (XAI) methods like `xrai` consume more time (specially for multi-label classification as we need to generate attributions and/or visualizations against each predicted label). So, we recommend any GPU instance for faster explanations. For more information on input and output schema for generating explanations, see the [schema docs](reference-automl-images-schema.md#data-format-for-online-scoring-and-explainability-xai).
+Explanations can be generated either from an **online endpoint** or a **batch endpoint**. Once the deployment is done, this endpoint can be utilized to generate the explanations for predictions. In online deployments, make sure to pass the `request_settings = OnlineRequestSettings(request_timeout_ms=90000)` parameter to `ManagedOnlineDeployment` and set `request_timeout_ms` to its maximum value to avoid **timeout issues** while generating explanations (refer to the [register and deploy model section](#register-and-deploy-model)). Some of the explainability (XAI) methods like `xrai` consume more time (especially for multi-label classification, as we need to generate attributions and/or visualizations against each predicted label). So, we recommend any GPU instance for faster explanations. For more information on input and output schema for generating explanations, see the [schema docs](reference-automl-images-schema.md#data-format-for-online-scoring-and-explainability-xai).
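For example, a deployment sketch that raises the request timeout for explanation generation (assumes `azure-ai-ml` is installed; the deployment name, model, environment, and GPU instance type are placeholders):

```python
MAX_REQUEST_TIMEOUT_MS = 90_000  # maximum request timeout for managed online deployments

def xai_deployment(endpoint_name, model, environment, instance_type="Standard_NC6s_v3"):
    from azure.ai.ml.entities import ManagedOnlineDeployment, OnlineRequestSettings

    return ManagedOnlineDeployment(
        name="xai-deployment",           # placeholder name
        endpoint_name=endpoint_name,
        model=model,
        environment=environment,
        instance_type=instance_type,     # GPU instance recommended for faster explanations
        instance_count=1,
        request_settings=OnlineRequestSettings(request_timeout_ms=MAX_REQUEST_TIMEOUT_MS),
    )
```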
We support the following state-of-the-art explainability algorithms in AutoML for images:
Internally XRAI algorithm uses integrated gradients. So, `n_steps` parameter is
We recommend using XRAI > Guided GradCAM > Integrated Gradients > Guided BackPropagation algorithms for better explanations, whereas Guided BackPropagation > Guided GradCAM > Integrated Gradients > XRAI are recommended for faster explanations in the specified order.
-A sample request to the online endpoint looks like the following. This request generates explanations when `model_explainability` is set to `True`. Following request will generate visualizations and attributions using faster version of XRAI algorithm with 50 steps.
+A sample request to the online endpoint looks like the following. This request generates explanations when `model_explainability` is set to `True`. The following request generates visualizations and attributions using a faster version of the XRAI algorithm with 50 steps.
```python
import base64
from io import BytesIO

from PIL import Image

img_bytes = base64.b64decode(base64_str)  # base64_str: visualization string from the response
image = Image.open(BytesIO(img_bytes))
```
The following picture shows the visualization of explanations for a sample input image. ![Screenshot of visualizations generated by XAI for AutoML for images.](./media/how-to-auto-train-image-models/xai-visualization.jpg)
-Decoded base64 figure will have four image sections within a 2 x 2 grid.
+The decoded base64 figure has four image sections within a 2 x 2 grid.
- Image at top-left corner (0, 0) is the cropped input image.
- Image at top-right corner (0, 1) is the heatmap of attributions on a color scale bgyw (blue green yellow white), where the contribution of white pixels on the predicted class is the highest and blue pixels is the lowest.
image_object_detection_job = automl.image_object_detection(
### Streaming image files from storage
-By default, all image files are downloaded to disk prior to model training. If the size of the image files is greater than available disk space, the job will fail. Instead of downloading all images to disk, you can select to stream image files from Azure storage as they're needed during training. Image files are streamed from Azure storage directly to system memory, bypassing disk. At the same time, as many files as possible from storage are cached on disk to minimize the number of requests to storage.
+By default, all image files are downloaded to disk prior to model training. If the size of the image files is greater than available disk space, the job fails. Instead of downloading all images to disk, you can select to stream image files from Azure storage as they're needed during training. Image files are streamed from Azure storage directly to system memory, bypassing disk. At the same time, as many files as possible from storage are cached on disk to minimize the number of requests to storage.
> [!NOTE]
> If streaming is enabled, ensure the Azure storage account is located in the same region as compute to minimize cost and latency.
image_object_detection_job.set_training_parameters(
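The truncated call above is where such settings go. As a sketch, streaming could be enabled through the training parameters' advanced settings; both the `advanced_settings` parameter and the `stream_image_files` key name are assumptions here — verify them against the current reference before relying on them:

```python
import json

# JSON string of advanced settings to pass to set_training_parameters
# (the "stream_image_files" key name is an assumption).
STREAMING_SETTINGS = json.dumps({"stream_image_files": True})

# image_object_detection_job.set_training_parameters(
#     advanced_settings=STREAMING_SETTINGS,
# )
```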
## Example notebooks
-Review detailed code examples and use cases in the [GitHub notebook repository for automated machine learning samples](https://github.com/Azure/azureml-examples/tree/main/sdk/python/jobs/automl-standalone-jobs). Please check the folders with 'automl-image-' prefix for samples specific to building computer vision models.
+Review detailed code examples and use cases in the [GitHub notebook repository for automated machine learning samples](https://github.com/Azure/azureml-examples/tree/main/sdk/python/jobs/automl-standalone-jobs). Check the folders with 'automl-image-' prefix for samples specific to building computer vision models.
## Code examples
Review detailed code examples and use cases in the [GitHub notebook repository f
## Next steps

* [Tutorial: Train an object detection model with AutoML and Python](tutorial-auto-train-image-models.md).
-* [Troubleshoot automated ML experiments (SDK v1)](./v1/how-to-troubleshoot-auto-ml.md?view=azureml-api-1&preserve-view=true).
machine-learning How To Autoscale Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-autoscale-endpoints.md
Last updated 02/07/2023
# Autoscale an online endpoint

Autoscale automatically runs the right amount of resources to handle the load on your application. [Online endpoints](concept-endpoints.md) support autoscaling through integration with the Azure Monitor autoscale feature. Azure Monitor autoscaling supports a rich set of rules. You can configure metrics-based scaling (for instance, CPU utilization >70%), schedule-based scaling (for example, scaling rules for peak business hours), or a combination. For more information, see [Overview of autoscale in Microsoft Azure](../azure-monitor/autoscale/autoscale-overview.md).
machine-learning How To Create Component Pipeline Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-create-component-pipeline-python.md
Now that you've created and loaded all components and input data to build the pi
> To use [serverless compute (preview)](how-to-use-serverless-compute.md), add `from azure.ai.ml.entities import ResourceConfiguration` to the top.
> Then replace:
> * `default_compute=cpu_compute_target,` with `default_compute="serverless",`
-> * `train_node.compute = gpu_compute_target` with `train_node.resources = "ResourceConfiguration(instance_type="Standard_NC6",instance_count=2)`
+> * `train_node.compute = gpu_compute_target` with `train_node.resources = ResourceConfiguration(instance_type="Standard_NC6s_v3", instance_count=2)`
[!notebook-python[] (~/azureml-examples-main/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet/image_classification_keras_minist_convnet.ipynb?name=build-pipeline)]
Reference for more available credentials if it doesn't work for you: [configure
#### Get a handle to a workspace with compute
-Create a `MLClient` object to manage Azure Machine Learning services.
+Create an `MLClient` object to manage Azure Machine Learning services. If you use [serverless compute (preview)](how-to-use-serverless-compute.md), you don't need to create these compute targets.
[!notebook-python[] (~/azureml-examples-main/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet/image_classification_keras_minist_convnet.ipynb?name=workspace)]
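The referenced notebook cell boils down to something like the following sketch (assumes `azure-ai-ml` and `azure-identity` are installed; the subscription, resource group, and workspace identifiers are placeholders you must replace):

```python
def workspace_args(subscription_id: str, resource_group: str, workspace: str) -> dict:
    """Collect the identifiers MLClient needs to locate the workspace."""
    return {
        "subscription_id": subscription_id,
        "resource_group_name": resource_group,
        "workspace_name": workspace,
    }

def get_ml_client(subscription_id: str, resource_group: str, workspace: str):
    # Imports are deferred so this sketch stays importable without Azure packages.
    from azure.ai.ml import MLClient
    from azure.identity import DefaultAzureCredential

    return MLClient(
        credential=DefaultAzureCredential(),
        **workspace_args(subscription_id, resource_group, workspace),
    )
```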
machine-learning How To Create Vector Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-create-vector-index.md
When a Vector Index is created, Azure Machine Learning will chunk the data, crea
* Test data for your data source.
-* A sample prompt flow, which uses the Vector Index you created. The sample prompt flow, which gets created has several key features like: a. Automatically generated prompt variants. b. Evaluation of each of these variations using the test data generated<TBD - link to eval blog>. c. Metrics against each of the variants to help you choose the best variant to run. You can use this sample to continue developing your prompt.
+* A sample prompt flow that uses the Vector Index you created. The sample prompt flow has several key features: automatically generated prompt variants, evaluation of each of these variations using the [test data generated](https://aka.ms/prompt_flow_blog), and metrics against each of the variants to help you choose the best variant to run. You can use this sample to continue developing your prompt.
[!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]
machine-learning How To Deploy Custom Container https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-deploy-custom-container.md
ms.devlang: azurecli
# Use a custom container to deploy a model to an online endpoint

Learn how to use a custom container for deploying a model to an online endpoint in Azure Machine Learning.
machine-learning How To Interactive Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-interactive-jobs.md
Previously updated : 06/15/2023
Last updated : 07/15/2023

#Customer intent: I'm a data scientist with ML knowledge in the machine learning space, looking to build ML models using data in Azure Machine Learning with full control of the model training including debugging and monitoring of live jobs.

# Debug jobs and monitor training progress
-Machine learning model training is usually an iterative process and requires significant experimentation. With the Azure Machine Learning interactive job experience, data scientists can use the Azure Machine Learning Python SDKv2, Azure Machine Learning CLIv2 or the Azure Studio to access the container where their job is running. Once the job container is accessed, users can iterate on training scripts, monitor training progress or debug the job remotely like they typically do on their local machines. Jobs can be interacted with via different training applications including **JupyterLab, TensorBoard, VS Code** or by connecting to the job container directly via **SSH**.
+Machine learning model training is an iterative process and requires significant experimentation. With the Azure Machine Learning interactive job experience, data scientists can use the Azure Machine Learning Python SDK, Azure Machine Learning CLI, or Azure Machine Learning studio to access the container where their job is running. Once the job container is accessed, users can iterate on training scripts, monitor training progress, or debug the job remotely, like they typically do on their local machines. Jobs can be interacted with via different training applications including JupyterLab, TensorBoard, VS Code or by connecting to the job container directly via SSH.
Interactive training is supported on **Azure Machine Learning Compute Clusters** and **Azure Arc-enabled Kubernetes Cluster**.

## Prerequisites

- Review [getting started with training on Azure Machine Learning](./how-to-train-model.md).
-- To use **VS Code**, [follow this guide](how-to-setup-vs-code.md) to set up the Azure Machine Learning extension.
+- To set up the Azure Machine Learning extension for VS Code, see [this guide](how-to-setup-vs-code.md).
- Make sure your job environment has the `openssh-server` and `ipykernel ~=6.0` packages installed (all Azure Machine Learning curated training environments have these packages installed by default).
-- Interactive applications can't be enabled on distributed training runs where the distribution type is anything other than Pytorch, Tensorflow or MPI. Custom distributed training setup (configuring multi-node training without using the above distribution frameworks) is not currently supported.
+- Interactive applications can't be enabled on distributed training runs where the distribution type is anything other than Pytorch, Tensorflow or MPI. Custom distributed training setup (configuring multi-node training without using the above distribution frameworks) isn't currently supported.
- To use SSH, you need an SSH key pair. You can use the `ssh-keygen -f "<filepath>"` command to generate a public and private key pair.

## Interact with your job container
By specifying interactive applications at job creation, you can connect directly
1. Create a new job from the left navigation pane in the studio portal.
-2. Choose `Compute cluster` or `Attached compute` (Kubernetes) as the compute type, choose the compute target, and specify how many nodes you need in `Instance count`.
+2. Choose **Compute cluster** or **Attached compute** (Kubernetes) as the compute type, choose the compute target, and specify how many nodes you need in `Instance count`.
:::image type="content" source="./media/interactive-jobs/select-compute.png" alt-text="Screenshot of selecting a compute location for a job."::: 3. Follow the wizard to choose the environment you want to start the job.
-4. In `Job settings` step, add your training code (and input/output data) and reference it in your command to make sure it's mounted to your job.
+4. In the **Job settings** step, add your training code (and input/output data) and reference it in your command to make sure it's mounted to your job.
:::image type="content" source="./media/interactive-jobs/sleep-command.png" alt-text="Screenshot of reviewing a drafted job and completing the creation.":::
By specifying interactive applications at job creation, you can connect directly
> [!NOTE] > If you use `sleep infinity`, you will need to manually [cancel the job](./how-to-interactive-jobs.md#end-job) to let go of the compute resource (and stop billing).
-5. Select at least one training application you want to use to interact with the job. If you do not select an application, the debug feature will not be available.
+5. Select at least one training application you want to use to interact with the job. If you don't select an application, the debug feature won't be available.
:::image type="content" source="./media/interactive-jobs/select-training-apps.png" alt-text="Screenshot of selecting a training application for the user to use for a job.":::
By specifying interactive applications at job creation, you can connect directly
# [Python SDK](#tab/python) 1. Define the interactive services you want to use for your job. Make sure to replace `your compute name` with your own value. If you want to use your own custom environment, follow the examples in [this tutorial](how-to-manage-environments-v2.md) to create a custom environment.
- Note that you have to import the `JobService` class from the `azure.ai.ml.entities` package to configure interactive services via the SDKv2.
+ You have to import the `JobService` class from the `azure.ai.ml.entities` package to configure interactive services via the SDK.
```python command_job = command(...
By specifying interactive applications at job creation, you can connect directly
> [!NOTE] > If you use `sleep infinity`, you will need to manually [cancel the job](./how-to-interactive-jobs.md#end-job) to let go of the compute resource (and stop billing).
-2. Submit your training job. For more details on how to train with the Python SDKv2, check out this [article](./how-to-train-model.md).
+2. Submit your training job. For more details on how to train with the Python SDK, check out this [article](./how-to-train-model.md).
# [Azure CLI](#tab/azurecli)
-1. Create a job yaml `job.yaml` with below sample content. Make sure to replace `your compute name` with your own value. If you want to use custom environment, follow the examples in [this tutorial](how-to-manage-environments-v2.md) to create a custom environment.
+1. Create a job yaml `job.yaml` with the following sample content. Make sure to replace `your compute name` with your own value. If you want to use a custom environment, follow the examples in [this tutorial](how-to-manage-environments-v2.md) to create a custom environment.
```yml
code: src
command:
By specifying interactive applications at job creation, you can connect directly
> [!NOTE] > If you use `sleep infinity`, you will need to manually [cancel the job](./how-to-interactive-jobs.md#end-job) to let go of the compute resource (and stop billing).
-2. Run command `az ml job create --file <path to your job yaml file> --workspace-name <your workspace name> --resource-group <your resource group name> --subscription <sub-id> `to submit your training job. For more details on running a job via CLIv2, check out this [article](./how-to-train-model.md).
+2. Run the command `az ml job create --file <path to your job yaml file> --workspace-name <your workspace name> --resource-group <your resource group name> --subscription <sub-id>` to submit your training job. For more details on running a job via the CLI, check out this [article](./how-to-train-model.md).
### Connect to endpoints # [Azure Machine Learning studio](#tab/ui)
-To interact with your running job, click the button **Debug and monitor** on the job details page.
+To interact with your running job, select the button **Debug and monitor** on the job details page.
:::image type="content" source="media/interactive-jobs/debug-and-monitor.png" alt-text="Screenshot of interactive jobs debug and monitor panel location.":::
To interact with your running job, click the button **Debug and monitor** on the
Clicking the applications in the panel opens a new tab for the applications. You can access the applications only when they are in **Running** status and only the **job owner** is authorized to access the applications. If you're training on multiple nodes, you can pick the specific node you would like to interact with. It might take a few minutes to start the job and the training applications specified during job creation. # [Python SDK](#tab/python) - Once the job is submitted, you can use `ml_client.jobs.show_services("<job name>", <compute node index>)` to view the interactive service endpoints. -- To connect via SSH to the container where the job is running, run the command `az ml job connect-ssh --name <job-name> --node-index <compute node index> --private-key-file-path <path to private key>`. To set up the Azure Machine Learning CLIv2, follow this [guide](./how-to-configure-cli.md).
+- To connect via SSH to the container where the job is running, run the command `az ml job connect-ssh --name <job-name> --node-index <compute node index> --private-key-file-path <path to private key>`. To set up the Azure Machine Learning CLI, follow this [guide](./how-to-configure-cli.md).
-You can find the reference documentation for the SDKv2 [here](./index.yml).
+You can find the reference documentation for the SDK [here](./index.yml).
You can access the applications only when they are in **Running** status and only the **job owner** is authorized to access the applications. If you're training on multiple nodes, you can pick the specific node you would like to interact with by passing in the node index. # [Azure CLI](#tab/azurecli)-- When the job is **running**, Run the command `az ml job show-services --name <job name> --node-index <compute node index>` to get the URL to the applications. The endpoint URL will show under `services` in the output. Note that for VS Code, you must copy and paste the provided URL in your browser.
+- When the job is **running**, run the command `az ml job show-services --name <job name> --node-index <compute node index>` to get the URL to the applications. The endpoint URL appears under `services` in the output. For VS Code, you must copy and paste the provided URL in your browser.
- To connect via SSH to the container where the job is running, run the command `az ml job connect-ssh --name <job-name> --node-index <compute node index> --private-key-file-path <path to private key>`.
You can access the applications only when they are in **Running** status and onl
### Interact with the applications
-When you click on the endpoints to interact when your job, you're taken to the user container under your working directory, where you can access your code, inputs, outputs, and logs. If you run into any issues while connecting to the applications, the interactive capability and applications logs can be found from **system_logs->interactive_capability** under **Outputs + logs** tab.
+When you select the endpoints to interact with your job, you're taken to the user container under your working directory, where you can access your code, inputs, outputs, and logs. If you run into any issues while connecting to the applications, you can find the interactive capability and application logs under **system_logs->interactive_capability** in the **Outputs + logs** tab.
:::image type="content" source="./media/interactive-jobs/interactive-logs.png" alt-text="Screenshot of interactive jobs interactive logs panel location.":::
When you click on the endpoints to interact when your job, you're taken to the u
- If you have logged tensorflow events for your job, you can use TensorBoard to monitor the metrics when your job is running.
- :::image type="content" source="./media/interactive-jobs/tensorboard-open.png" alt-text="Screenshot of interactive jobs tensorboard panel when first opened. This information will vary depending upon customer data":::
+ :::image type="content" source="./media/interactive-jobs/tensorboard-open.png" alt-text="Screenshot of interactive jobs tensorboard panel when first opened. This information varies depending upon customer data":::
### End job
-Once you're done with the interactive training, you can also go to the job details page to cancel the job which will release the compute resource. Alternatively, use `az ml job cancel -n <your job name>` in the CLI or `ml_client.job.cancel("<job name>")` in the SDK.
+Once you're done with the interactive training, you can also go to the job details page to cancel the job, which will release the compute resource. Alternatively, use `az ml job cancel -n <your job name>` in the CLI or `ml_client.job.cancel("<job name>")` in the SDK.
:::image type="content" source="./media/interactive-jobs/cancel-job.png" alt-text="Screenshot of interactive jobs cancel job option and its location for user selection"::: ## Attach a debugger to a job
-To submit a job with a debugger attached and the execution paused, you can use debugpy and VS Code (`debugpy` must be installed in your job environment).
+To submit a job with a debugger attached and the execution paused, you can use debugpy and VS Code (`debugpy` must be installed in your job environment).
-1. During job submission (either through the UI, the CLIv2 or the SDKv2) use the debugpy command to run your python script. For example, the below screenshot shows a sample command that uses debugpy to attach the debugger for a tensorflow script (`tfevents.py` can be replaced with the name of your training script).
+1. During job submission (either through the UI, the CLI, or the SDK), use the debugpy command to run your Python script. For example, the following screenshot shows a sample command that uses debugpy to attach the debugger for a tensorflow script (`tfevents.py` can be replaced with the name of your training script).
:::image type="content" source="./media/interactive-jobs/use-debugpy.png" alt-text="Screenshot of interactive jobs configuration of debugpy":::
-2. Once the job has been submitted, [connect to the VS Code](./how-to-interactive-jobs.md#connect-to-endpoints), and click on the in-built debugger.
+2. Once the job has been submitted, [connect to VS Code](./how-to-interactive-jobs.md#connect-to-endpoints), and select the built-in debugger.
:::image type="content" source="./media/interactive-jobs/open-debugger.png" alt-text="Screenshot of interactive jobs location of open debugger on the left side panel":::
machine-learning How To Manage Quotas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-manage-quotas.md
Azure Machine Learning managed online endpoints have limits described in the fol
<sup>2</sup> We reserve 20% extra compute resources for performing upgrades. For example, if you request 10 instances in a deployment, you must have a quota for 12. Otherwise, you receive an error.
-<sup>3</sup> If you request a limit increase, be sure to calculate related limit increases you might need. For example, if you request a limit increase for requests per second, you might also want to compute the required connections and bandwidth limits and include these limit increases in the same request.
+<sup>3</sup> The default limit for some subscriptions may be different. For example, when you request a limit increase it may show 100 instead. If you request a limit increase, be sure to calculate related limit increases you might need. For example, if you request a limit increase for requests per second, you might also want to compute the required connections and bandwidth limits and include these limit increases in the same request.
To determine the current usage for an endpoint, [view the metrics](how-to-monitor-online-endpoints.md#metrics).
machine-learning How To Manage Synapse Spark Pool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-manage-synapse-spark-pool.md
# Attach and manage a Synapse Spark pool in Azure Machine Learning

In this article, you'll learn how to attach a [Synapse Spark Pool](../synapse-analytics/spark/apache-spark-concepts.md#spark-pools) in Azure Machine Learning. You can attach a Synapse Spark Pool in Azure Machine Learning in one of these ways:

- Using Azure Machine Learning studio UI
machine-learning How To Mltable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-mltable.md
# Working with tables in Azure Machine Learning

Azure Machine Learning supports a Table type (`mltable`). This allows for the creation of a *blueprint* that defines how to load data files into memory as a Pandas or Spark data frame. In this article you learn:
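A minimal `MLTable` file sketch of such a blueprint (the path pattern and delimiter are placeholders; the field names follow the MLTable schema, so verify them against the current reference):

```yml
type: mltable
paths:
  - pattern: ./*.csv
transformations:
  - read_delimited:
      delimiter: ","
      header: all_files_same_headers
```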
machine-learning How To Monitor Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-monitor-online-endpoints.md
To access the metrics pages through links available in the studio:
To access metrics directly from the Azure portal:
-1. Go to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to the online endpoint or deployment resource. Online endpoints and deployments are Azure Resource Manager (ARM) resources that can be found by going to their owning resource group. Look for the resource types **Machine Learning online endpoint** and **Machine Learning online deployment**.
machine-learning How To Network Security Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-network-security-overview.md
monikerRange: 'azureml-api-2 || azureml-api-1'
# Secure Azure Machine Learning workspace resources using virtual networks (VNets)

Secure Azure Machine Learning workspace resources and compute environments using Azure Virtual Networks (VNets). This article uses an example scenario to show you how to configure a complete virtual network.
machine-learning How To Safely Rollout Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-safely-rollout-online-endpoints.md
# Perform safe rollout of new deployments for real-time inference - In this article, you'll learn how to deploy a new version of a machine learning model in production without causing any disruption. You'll use a blue-green deployment strategy (also known as a safe rollout strategy) to introduce a new version of a web service to production. This strategy will allow you to roll out your new version of the web service to a small subset of users or requests before rolling it out completely.
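The traffic-splitting idea behind a blue-green (safe) rollout — sending a small, controlled fraction of requests to the new version — can be sketched as below. The hash-based router is an assumption for illustration, not the service's actual routing mechanism.

```python
import hashlib

def route(request_id: str, green_percent: int) -> str:
    """Deterministically send ~green_percent of requests to 'green' (the
    new deployment); the rest go to 'blue' (the current one). Hashing the
    request id keeps routing stable per request while the aggregate split
    tracks the configured percentage."""
    bucket = int(hashlib.sha256(request_id.encode()).hexdigest(), 16) % 100
    return "green" if bucket < green_percent else "blue"

# Ramp green from 10% of traffic toward 100% as confidence grows.
sample = [route(f"req-{i}", green_percent=10) for i in range(1000)]
print(sample.count("green"))  # roughly 100 of 1000
```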
machine-learning How To Submit Spark Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-submit-spark-jobs.md
# Submit Spark jobs in Azure Machine Learning + Azure Machine Learning supports submission of standalone machine learning jobs and creation of [machine learning pipelines](./concept-ml-pipelines.md) that involve multiple machine learning workflow steps. Azure Machine Learning handles both standalone Spark job creation, and creation of reusable Spark components that Azure Machine Learning pipelines can use. In this article, you'll learn how to submit Spark jobs using: - Azure Machine Learning studio UI - Azure Machine Learning CLI
machine-learning How To Troubleshoot Environments https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-troubleshoot-environments.md
monikerRange: 'azureml-api-1 || azureml-api-2'
# Troubleshooting environment issues + In this article, learn how to troubleshoot common problems you may encounter with environment image builds and learn about AzureML environment vulnerabilities. We are actively seeking your feedback! If you navigated to this page via your Environment Definition or Build Failure Analysis logs, we'd like to know if the feature was helpful to you, or if you'd like to report a failure scenario that isn't yet covered by our analysis. You can also leave feedback on this documentation. Leave your thoughts [here](https://aka.ms/azureml/environment/log-analysis-feedback).
machine-learning How To Tune Hyperparameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-tune-hyperparameters.md
# Hyperparameter tuning a model (v2) Automate efficient hyperparameter tuning using Azure Machine Learning SDK v2 and CLI v2 by way of the SweepJob type.
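The sweep concept — sampling hyperparameter combinations from a search space and keeping the best trial — can be sketched as a minimal random search. This is a plain-Python illustration of the idea, not the SweepJob API; the function names and space format are assumptions.

```python
import random

def random_sweep(objective, space: dict, trials: int, seed: int = 0):
    """Minimal random-search sweep: sample each hyperparameter from its
    listed choices, score each trial, and keep the best one."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(trials):
        params = {name: rng.choice(choices) for name, choices in space.items()}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective that peaks at lr=0.1, batch=32.
def objective(p):
    return -abs(p["lr"] - 0.1) - abs(p["batch"] - 32) / 100

space = {"lr": [0.001, 0.01, 0.1, 1.0], "batch": [16, 32, 64]}
best, score = random_sweep(objective, space, trials=50)
print(best)
```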
machine-learning How To Use Automl Small Object Detect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-automl-small-object-detect.md
# Train a small object detection model with AutoML In this article, you'll learn how to train an object detection model to detect small objects in high-resolution images with [automated ML](concept-automated-ml.md) in Azure Machine Learning.
machine-learning How To Use Foundation Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-foundation-models.md
You can filter the list of models in the model catalog by Task, or by license. S
You can quickly test out any pre-trained model using the Sample Inference widget on the model card, providing your own sample input to test the result. Additionally, the model card for each model includes a brief description of the model and links to samples for code-based inferencing, finetuning, and evaluation of the model.
-> [!NOTE]
->If you are using a private workspace, your virtual network needs to allow outbound access in order to use foundation models in Azure Machine Learning
+> [!IMPORTANT]
+> Deploying foundational models to a managed online endpoint is currently supported with __public workspaces__ (and their public associated resources) only.
+>
+> * When `egress_public_network_access` is set to `disabled`, the deployment can only access the workspace-associated resources secured in the virtual network.
+> * When `egress_public_network_access` is set to `enabled` for a managed online endpoint deployment, the deployment can only access resources with public access, which means it cannot access resources secured in the virtual network.
+>
+> For more information, see [Outbound resource access for managed online endpoints](how-to-secure-online-endpoint.md#outbound-resource-access).
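The two `egress_public_network_access` settings described in the note amount to a simple access rule, sketched here (the function name and boolean resource model are illustrative, not the service API):

```python
def can_access(egress_public_network_access: str, resource_is_public: bool) -> bool:
    """Decide whether a managed online deployment can reach a resource,
    per the note above: 'disabled' means only VNet-secured (private)
    resources are reachable; 'enabled' means only publicly accessible
    resources are reachable."""
    if egress_public_network_access == "disabled":
        return not resource_is_public
    if egress_public_network_access == "enabled":
        return resource_is_public
    raise ValueError("expected 'enabled' or 'disabled'")

print(can_access("enabled", resource_is_public=True))    # True
print(can_access("disabled", resource_is_public=True))   # False
```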
## How to evaluate foundation models using your own test data
To enable users to get started with model evaluation, we have published samples
## How to finetune foundation models using your own training data In order to improve model performance in your workload, you might want to finetune a foundation model using your own training data. You can easily finetune these foundation models by using either the finetune settings in the studio or by using the code-based samples linked from the model card.
-
+
### Finetune using the studio You can invoke the finetune settings form by selecting on the **Finetune** button on the model card for any foundation model.
machine-learning How To Use Serverless Compute https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-serverless-compute.md
resources:
``` -
+View more examples of training with serverless compute at:
+* [Quick Start](https://github.com/Azure/azureml-examples/blob/main/tutorials/get-started-notebooks/quickstart.ipynb)
+* [Train Model](https://github.com/Azure/azureml-examples/blob/main/tutorials/get-started-notebooks/train-model.ipynb)
+
## AutoML job There's no need to specify compute for AutoML jobs. Resources can be optionally specified. If instance count isn't specified, then it's defaulted based on max_concurrent_trials and max_nodes parameters. If you submit an AutoML image classification or NLP task with no instance type, we will automatically select a GPU VM size. It is possible to submit AutoML jobs through the CLI, SDK, or Studio. To submit AutoML jobs with serverless compute in studio, first enable the *Guided experience for submitting training jobs with serverless compute* feature in the preview panel and then [submit a training job in studio (preview)](how-to-train-with-ui.md).
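The text above says the default instance count is derived from `max_concurrent_trials` and `max_nodes` without giving the formula. One plausible defaulting rule is sketched below — this is an illustrative guess, not the documented behavior:

```python
def default_instance_count(max_concurrent_trials: int, max_nodes: int) -> int:
    """Illustrative guess at a defaulting rule (not the documented formula):
    never provision more nodes than the cap, and no more than the number
    of trials that can run concurrently."""
    return min(max_concurrent_trials, max_nodes)

print(default_instance_count(4, 10))  # 4
```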
You can also set serverless compute as the default compute in Designer.
## Next steps
-View more examples of training with serverless compute at [azureml-examples-main](https://github.com/Azure/azureml-examples-main)
+View more examples of training with serverless compute at:
+* [Quick Start](https://github.com/Azure/azureml-examples/blob/main/tutorials/get-started-notebooks/quickstart.ipynb)
+* [Train Model](https://github.com/Azure/azureml-examples/blob/main/tutorials/get-started-notebooks/train-model.ipynb)
machine-learning Azure Translator Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/tools-reference/azure-translator-tool.md
Last updated 06/30/2023
# Azure Translator tool (preview)
-Azure Cognitive Services Translator is a cloud-based machine translation service you can use to translate text in with a simple REST API call. See the [Azure Translator API](../../../cognitive-services/translator/index.yml) for more information.
+Azure AI Translator is a cloud-based machine translation service you can use to translate text with a simple REST API call. See the [Azure Translator API](../../../ai-services/translator/index.yml) for more information.
> [!IMPORTANT] > Prompt flow is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
Azure Cognitive Services Translator is a cloud-based machine translation service
## Prerequisites -- [Create a Translator resource](../../../cognitive-services/translator/create-translator-resource.md).
+- [Create a Translator resource](../../../ai-services/translator/create-translator-resource.md).
## Inputs
machine-learning Quickstart Spark Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/quickstart-spark-jobs.md
Last updated 05/22/2023
# Configure Apache Spark jobs in Azure Machine Learning + The Azure Machine Learning integration, with Azure Synapse Analytics, provides easy access to distributed computing capability - backed by Azure Synapse - for scaling Apache Spark jobs on Azure Machine Learning. In this article, you learn how to submit a Spark job using Azure Machine Learning serverless Spark compute, Azure Data Lake Storage (ADLS) Gen 2 storage account, and user identity passthrough in a few simple steps.
machine-learning Reference Yaml Job Command https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/reference-yaml-job-command.md
The source JSON schema can be found at https://azuremlschemas.azureedge.net/late
| | - | -- | -- | - | | `type` | string | The type of job input. Specify `uri_file` for input data that points to a single file source, or `uri_folder` for input data that points to a folder source. | `uri_file`, `uri_folder`, `mlflow_model`, `custom_model`| `uri_folder` | | `path` | string | The path to the data to use as input. This can be specified in a few ways: <br><br> - A local path to the data source file or folder, e.g. `path: ./iris.csv`. The data will get uploaded during job submission. <br><br> - A URI of a cloud path to the file or folder to use as the input. Supported URI types are `azureml`, `https`, `wasbs`, `abfss`, `adl`. See [Core yaml syntax](reference-yaml-core-syntax.md) for more information on how to use the `azureml://` URI format. <br><br> - An existing registered Azure Machine Learning data asset to use as the input. To reference a registered data asset use the `azureml:<data_name>:<data_version>` syntax or `azureml:<data_name>@latest` (to reference the latest version of that data asset), e.g. `path: azureml:cifar10-data:1` or `path: azureml:cifar10-data@latest`. | | |
-| `mode` | string | Mode of how the data should be delivered to the compute target. <br><br> For read-only mount (`ro_mount`), the data will be consumed as a mount path. A folder will be mounted as a folder and a file will be mounted as a file. Azure Machine Learning will resolve the input to the mount path. <br><br> For `download` mode the data will be downloaded to the compute target. Azure Machine Learning will resolve the input to the downloaded path. <br><br> If you only want the URL of the storage location of the data artifact(s) rather than mounting or downloading the data itself, you can use the `direct` mode. This will pass in the URL of the storage location as the job input. Note that in this case you are fully responsible for handling credentials to access the storage. | `ro_mount`, `download`, `direct` | `ro_mount` |
+| `mode` | string | Mode of how the data should be delivered to the compute target. <br><br> For read-only mount (`ro_mount`), the data will be consumed as a mount path. A folder will be mounted as a folder and a file will be mounted as a file. Azure Machine Learning will resolve the input to the mount path. <br><br> For `download` mode the data will be downloaded to the compute target. Azure Machine Learning will resolve the input to the downloaded path. <br><br> If you only want the URL of the storage location of the data artifact(s) rather than mounting or downloading the data itself, you can use the `direct` mode. This will pass in the URL of the storage location as the job input. Note that in this case you are fully responsible for handling credentials to access the storage. <br><br> The `eval_mount` and `eval_download` modes are unique to MLTable, and either mounts the data as a path or downloads the data to the compute target. <br><br> For more information on modes, see [Access data in a job](how-to-read-write-data-v2.md?tabs=cli#modes) | `ro_mount`, `download`, `direct`, `eval_download`, `eval_mount` | `ro_mount` |
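The data-asset reference forms documented above (`azureml:<data_name>:<data_version>` and `azureml:<data_name>@latest`) and the valid mode set can be sketched as a small parser/validator. The helper name is illustrative and not part of the Azure ML SDK:

```python
def parse_data_asset_ref(path: str) -> dict:
    """Parse the documented data-asset reference forms:
    'azureml:<name>:<version>' or 'azureml:<name>@latest'.
    (Illustrative helper; not an Azure ML SDK function.)"""
    if not path.startswith("azureml:"):
        raise ValueError("not an azureml: reference")
    ref = path[len("azureml:"):]
    if ref.endswith("@latest"):
        return {"name": ref[: -len("@latest")], "version": "latest"}
    name, _, version = ref.rpartition(":")
    if not name:
        raise ValueError("missing version; use name:version or name@latest")
    return {"name": name, "version": version}

# The mode values listed in the table row above.
VALID_MODES = {"ro_mount", "download", "direct", "eval_download", "eval_mount"}

print(parse_data_asset_ref("azureml:cifar10-data:1"))
print(parse_data_asset_ref("azureml:cifar10-data@latest"))
```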
### Job outputs
machine-learning Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Machine Learning description: Lists Azure Policy Regulatory Compliance controls available for Azure Machine Learning. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
machine-learning Tutorial Pipeline Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/tutorial-pipeline-python-sdk.md
In the future, you can fetch the same dataset from the workspace using `credit_d
## Create a compute resource to run your pipeline > [!NOTE]
-> To try [serverless compute (preview)](./how-to-use-serverless-compute.md), skip this step and proceed to [create a job environment](#create-a-job-environment-for-pipeline-steps).
+> To use [serverless compute (preview)](./how-to-use-serverless-compute.md) to run this pipeline, you can skip this compute creation step and proceed directly to [create a job environment](#create-a-job-environment-for-pipeline-steps).
Each step of an Azure Machine Learning pipeline can use a different compute resource for running the specific job of that step. It can be single or multi-node machines with Linux or Windows OS, or a specific compute fabric like Spark.
managed-instance-apache-cassandra Monitor Clusters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-instance-apache-cassandra/monitor-clusters.md
Platform metrics and the Activity logs are collected automatically, whereas you
## <a id="create-setting-portal"></a> Create diagnostic settings via the Azure portal
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Managed Instance for Apache Cassandra cluster resource.
mariadb Howto Configure Ssl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-configure-ssl.md
Last updated 04/19/2023 ms.devlang: csharp, golang, java, php, python, ruby-+ # Configure SSL connectivity in your application to securely connect to Azure Database for MariaDB
mariadb Howto Data In Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-data-in-replication.md
Title: Configure data-in Replication - Azure Database for MariaDB description: This article describes how to set up Data-in Replication in Azure Database for MariaDB. +
mariadb Howto Migrate Dump Restore https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-migrate-dump-restore.md
+ Last updated 04/19/2023
mariadb Howto Move Regions Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-move-regions-portal.md
You can use an Azure Database for MariaDB [cross-region read replica](concepts-r
To create a cross-region read replica server in the target region using the Azure portal, use the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select the existing Azure Database for MariaDB server that you want to use as the source server. This action opens the **Overview** page. 1. Select **Replication** from the menu, under **SETTINGS**. 1. Select **Add Replica**.
mariadb Howto Read Replicas Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-read-replicas-portal.md
In this article, you will learn how to create and manage read replicas in the Az
A read replica server can be created using the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select the existing Azure Database for MariaDB server that you want to use as a master. This action opens the **Overview** page.
To delete a source server from the Azure portal, use the following steps:
## Monitor replication
-1. In the [Azure portal](https://portal.azure.com/), select the replica Azure Database for MariaDB server you want to monitor.
+1. In the [Azure portal](https://portal.azure.com), select the replica Azure Database for MariaDB server you want to monitor.
2. Under the **Monitoring** section of the sidebar, select **Metrics**:
mariadb Howto Redirection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-redirection.md
Title: Connect with redirection - Azure Database for MariaDB description: This article describes how you can configure your application to connect to Azure Database for MariaDB with redirection. +
mariadb Howto Restore Server Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/howto-restore-server-portal.md
While creating a server through the Azure portal, the **Pricing Tier** window is
For more information about setting these values during create, see the [Azure Database for MariaDB server quickstart](quickstart-create-mariadb-server-database-using-azure-portal.md). The backup retention period can be changed on a server through the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select your Azure Database for MariaDB server. This action opens the **Overview** page.
mariadb Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mariadb/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Database for MariaDB description: Lists Azure Policy Regulatory Compliance controls available for Azure Database for MariaDB. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
migrate How To Set Up Appliance Vmware https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/how-to-set-up-appliance-vmware.md
ms.
Last updated 01/31/2023-+ # Set up an appliance for servers in a VMware environment
To start vCenter Server discovery, in **Step 3: Provide server credentials to pe
## Next steps
-Review the [tutorials for VMware assessment](./tutorial-assess-vmware-azure-vm.md) and [tutorials for agentless migration](tutorial-migrate-vmware.md).
+Review the [tutorials for VMware assessment](./tutorial-assess-vmware-azure-vm.md) and [tutorials for agentless migration](tutorial-migrate-vmware.md).
migrate Prepare For Agentless Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/prepare-for-agentless-migration.md
Last updated 12/12/2022-+ # Prepare for VMware agentless migration
mysql Connect Azure Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/connect-azure-cli.md
-+ Last updated 05/03/2023
mysql Connect Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/connect-csharp.md
ms.devlang: csharp-+ Last updated 05/03/2023
mysql Connect Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/connect-java.md
-+ ms.devlang: java Last updated 05/03/2023
mysql Connect Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/connect-nodejs.md
Last updated 06/19/2023
-
- - mvc
- - seo-javascript-september2019
- - seo-javascript-october2019
- - devx-track-js
- - mode-api
+ ms.devlang: javascript
mysql Connect Php https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/connect-php.md
-+ Last updated 05/03/2023
mysql How To Data In Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/how-to-data-in-replication.md
Last updated 12/30/2022 +
mysql Concepts Migrate Mydumper Myloader https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/migrate/concepts-migrate-mydumper-myloader.md
+ Last updated 05/03/2023
mysql How To Migrate Single Flexible Minimum Downtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/migrate/how-to-migrate-single-flexible-minimum-downtime.md
Last updated 05/03/2023 +
mysql Concepts Migrate Dump Restore https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/concepts-migrate-dump-restore.md
Title: Migrate using dump and restore - Azure Database for MySQL
description: This article explains two common ways to back up and restore databases in your Azure Database for MySQL, using tools such as mysqldump, MySQL Workbench, and PHPMyAdmin. +
mysql Connect Go https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/connect-go.md
-+ ms.devlang: golang Last updated 05/03/2023
mysql Connect Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/connect-java.md
ms.devlang: java -+ Last updated 05/03/2023
mysql Connect Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/connect-nodejs.md
ms.devlang: javascript -+ Last updated 05/03/2023
mysql Connect Ruby https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/connect-ruby.md
ms.devlang: ruby -+ Last updated 05/03/2023
mysql How To Connect With Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-connect-with-managed-identity.md
-+ Last updated 05/03/2023
mysql How To Data In Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-data-in-replication.md
Title: Configure Data-in Replication - Azure Database for MySQL
description: This article describes how to set up Data-in Replication for Azure Database for MySQL. +
mysql How To Move Regions Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-move-regions-portal.md
You can use an Azure Database for MySQL [cross-region read replica](concepts-rea
To create a cross-region read replica server in the target region using the Azure portal, use the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select the existing Azure Database for MySQL server that you want to use as the source server. This action opens the **Overview** page. 1. Select **Replication** from the menu, under **SETTINGS**. 1. Select **Add Replica**.
In this tutorial, you moved an Azure Database for MySQL server from one region t
- Learn more about [read replicas](concepts-read-replicas.md) - Learn more about [managing read replicas in the Azure portal](how-to-read-replicas-portal.md)-- Learn more about [business continuity](concepts-business-continuity.md) options
+- Learn more about [business continuity](concepts-business-continuity.md) options
mysql How To Read Replicas Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-read-replicas-portal.md
In this article, you will learn how to create and manage read replicas in the Az
A read replica server can be created using the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select the existing Azure Database for MySQL server that you want to use as a master. This action opens the **Overview** page.
To delete a source server from the Azure portal, use the following steps:
## Monitor replication
-1. In the [Azure portal](https://portal.azure.com/), select the replica Azure Database for MySQL server you want to monitor.
+1. In the [Azure portal](https://portal.azure.com), select the replica Azure Database for MySQL server you want to monitor.
2. Under the **Monitoring** section of the sidebar, select **Metrics**:
To delete a source server from the Azure portal, use the following steps:
## Next steps -- Learn more about [read replicas](concepts-read-replicas.md)
+- Learn more about [read replicas](concepts-read-replicas.md)
mysql How To Redirection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-redirection.md
Last updated 05/03/2023 +
mysql How To Restore Server Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/how-to-restore-server-portal.md
While creating a server through the Azure portal, the **Pricing Tier** window is
For more information about setting these values during create, see the [Azure Database for MySQL server quickstart](quickstart-create-mysql-server-database-using-azure-portal.md). The backup retention period can be changed on a server through the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select your Azure Database for MySQL server. This action opens the **Overview** page. 3. Select **Pricing Tier** from the menu, under **SETTINGS**. Using the slider you can change the **Backup Retention Period** to your preference between 7 and 35 days. In the screenshot below it has been increased to 34 days.
The new server created during a restore does not have the VNet service endpoints
## Next steps - Learn more about the service's [backups](concepts-backup.md) - Learn about [replicas](concepts-read-replicas.md)-- Learn more about [business continuity](concepts-business-continuity.md) options
+- Learn more about [business continuity](concepts-business-continuity.md) options
mysql Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/single-server/security-controls-policy.md
Previously updated : 07/06/2023 Last updated : 07/20/2023 # Azure Policy Regulatory Compliance controls for Azure Database for MySQL
nat-gateway Quickstart Create Nat Gateway Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/nat-gateway/quickstart-create-nat-gateway-bicep.md
Previously updated : 04/24/2023 Last updated : 07/21/2023 -+ # Customer intent: I want to create a NAT gateway using Bicep so that I can provide outbound connectivity for my virtual machines.
Get started with Azure NAT Gateway using Bicep. This Bicep file deploys a virtual network, a NAT gateway resource, and an Ubuntu virtual machine. The Ubuntu virtual machine is deployed to a subnet that is associated with the NAT gateway resource. + [!INCLUDE [About Bicep](../../includes/resource-manager-quickstart-bicep-introduction.md)] ## Prerequisites
nat-gateway Quickstart Create Nat Gateway Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/nat-gateway/quickstart-create-nat-gateway-template.md
description: This quickstart shows how to create a NAT gateway by using the Azur
Previously updated : 04/24/2023 Last updated : 07/21/2023 -+ # Customer intent: I want to create a NAT gateway by using an Azure Resource Manager template so that I can provide outbound connectivity for my virtual machines.
Get started with Azure NAT Gateway by using an Azure Resource Manager template (ARM template). This template deploys a virtual network, a NAT gateway resource, and an Ubuntu virtual machine. The Ubuntu virtual machine is deployed to a subnet that is associated with the NAT gateway resource. + [!INCLUDE [About Azure Resource Manager](../../includes/resource-manager-quickstart-introduction.md)] If your environment meets the prerequisites and you're familiar with using ARM templates, select the **Deploy to Azure** button. The template opens in the Azure portal.
nat-gateway Resource Health https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/nat-gateway/resource-health.md
The health of your NAT gateway resource is displayed as one of the following sta
| Resource health status | Description | ||| | Available | Your NAT gateway resource is healthy and available. |
-| Degraded | Your NAT gateway resource has platform or user initiated events impacting the health of your NAT gateway. The metric for the data-path availability has reported less than 80% but greater than 25% health for the last fifteen minutes. |
-| Unavailable | Your NAT gateway resource is not healthy. The metric for the data-path availability has reported less than 25% for the past 15 minutes. You may experience unavailability of your NAT gateway resource for outbound connectivity. |
+| Degraded | Your NAT gateway resource has platform or user-initiated events impacting the health of your NAT gateway. The metric for the data-path availability has reported less than 80% but greater than 25% health for the last fifteen minutes. You'll experience moderate to severe performance impact. |
+| Unavailable | Your NAT gateway resource isn't healthy. The metric for the data-path availability has reported less than 25% for the past 15 minutes. You'll experience significant performance impact or unavailability of your NAT gateway resource for outbound connectivity. There may be user or platform events causing unavailability. |
| Unknown | Health status for your NAT gateway resource hasn't been updated or hasn't received information for data-path availability for more than 5 minutes. This state should be transient and will reflect the correct status as soon as data is received. | For more information about Azure Resource Health, see [Resource Health overview](../service-health/resource-health-overview.md).
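The status table above maps cleanly to thresholds on the 15-minute data-path availability metric. A minimal sketch follows; treating `None` as "no data received" and the handling of exactly 25% or 80% are assumptions, since the table states strict inequalities only:

```python
from typing import Optional

def health_status(datapath_availability: Optional[float]) -> str:
    """Map the 15-minute data-path availability metric (percent) to the
    resource health statuses in the table above. None models 'no data
    received', which the table labels Unknown."""
    if datapath_availability is None:
        return "Unknown"
    if datapath_availability < 25:
        return "Unavailable"
    if datapath_availability < 80:
        return "Degraded"
    return "Available"

print(health_status(99.9))  # Available
print(health_status(50))    # Degraded
print(health_status(10))    # Unavailable
```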
nat-gateway Troubleshoot Nat Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/nat-gateway/troubleshoot-nat-connectivity.md
Title: Troubleshoot Azure NAT Gateway connectivity
description: Troubleshoot connectivity issues with a NAT gateway. -+ Last updated 04/24/2023
nat-gateway Tutorial Hub Spoke Nat Firewall https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/nat-gateway/tutorial-hub-spoke-nat-firewall.md
In this tutorial, you learn how to:
The hub virtual network contains the firewall subnet that is associated with the Azure Firewall and NAT gateway. Use the following example to create the hub virtual network.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the search box at the top of the portal, enter **Virtual network**. Select **Virtual networks** in the search results.
network-watcher Network Insights Topology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-insights-topology.md
The following are the resource types supported by Topology:
To view a topology, follow these steps:
-1. Log into the [Azure portal](https://portal.azure.com) with an account that has the necessary [permissions](required-rbac-permissions.md).
+1. Sign in to the [Azure portal](https://portal.azure.com) with an account that has the necessary [permissions](required-rbac-permissions.md).
2. Select **More services**. 3. In the **All services** screen, enter **Monitor** in the **Filter services** search box and select it from the search result. 4. Under **Insights**, select **Networks**.
Follow these steps to find the next hop.
## Next steps
-[Learn more](./connection-monitor-overview.md) about connectivity related metrics.
+[Learn more](./connection-monitor-overview.md) about connectivity related metrics.
network-watcher Network Watcher Analyze Nsg Flow Logs Graylog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-analyze-nsg-flow-logs-graylog.md
Last updated 05/03/2023 -+ # Manage and analyze network security group flow logs in Azure using Network Watcher and Graylog
network-watcher Network Watcher Nsg Grafana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-nsg-grafana.md
Last updated 05/03/2023 -+ # Manage and analyze network security group flow logs using Network Watcher and Grafana
network-watcher Network Watcher Visualize Nsg Flow Logs Open Source Tools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-visualize-nsg-flow-logs-open-source-tools.md
Last updated 05/03/2023 -+ # Visualize Azure Network Watcher NSG flow logs using open source tools
networking Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/networking/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure networking services description: Lists Azure Policy Regulatory Compliance controls available for Azure networking services. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
open-datasets How To Create Azure Machine Learning Dataset From Open Dataset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/how-to-create-azure-machine-learning-dataset-from-open-dataset.md
Title: Create datasets with Azure Open Datasets
description: Learn how to create an Azure Machine Learning dataset from Azure Open Datasets. --++ Last updated 08/05/2020 #Customer intent: As an experienced Python developer, I want to use Azure Open Datasets in my ML workflows for improved model accuracy.
open-datasets Overview What Are Open Datasets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/overview-what-are-open-datasets.md
Title: What are open datasets? Curated public datasets
description: Learn about Azure Open Datasets, curated datasets from the public domain such as weather, census, holidays, and location to enrich predictive solutions. --++ Last updated 05/06/2020
open-datasets Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/samples.md
Title: Example Jupyter notebooks using NOAA data
description: Use example Jupyter notebooks for Azure Open Datasets to learn how to load open datasets and use them to enrich demo data. Techniques include use of Spark and Pandas to process data. --++ Last updated 05/06/2020
openshift Howto Custom Dns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/openshift/howto-custom-dns.md
The output:
machineconfig.machineconfiguration.openshift.io "25-machineconfig-worker-reboot" deleted ```
-Wait for all of the worker nodes to reboot. This is similar to the the [reboot of worker nodes](#reboot-worker-nodes) above.
+Wait for all of the worker nodes to reboot. This is similar to the [reboot of worker nodes](#reboot-worker-nodes) above.
Now we'll reboot the master nodes.
postgresql How To Move Regions Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/single-server/how-to-move-regions-portal.md
You can use an Azure Database for PostgreSQL [cross-region read replica](concept
To prepare the source server for replication using the Azure portal, use the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select the existing Azure Database for PostgreSQL server that you want to use as the source server. This action opens the **Overview** page. 1. From the server's menu, select **Replication**. If Azure replication support is set to at least **Replica**, you can create read replicas. 1. If Azure replication support is not set to at least **Replica**, set it. Select **Save**.
postgresql How To Restore Server Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/single-server/how-to-restore-server-portal.md
While creating a server through the Azure portal, the **Pricing Tier** window is
For more information about setting these values during create, see the [Azure Database for PostgreSQL server quickstart](quickstart-create-server-database-portal.md). The backup retention period of a server can be changed through the following steps:
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select your Azure Database for PostgreSQL server. This action opens the **Overview** page. 3. Select **Pricing Tier** from the menu, under **SETTINGS**. Using the slider you can change the **Backup Retention Period** to your preference between 7 and 35 days. In the screenshot below it has been increased to 34 days.
postgresql Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/single-server/security-controls-policy.md
Previously updated : 07/06/2023 Last updated : 07/20/2023 # Azure Policy Regulatory Compliance controls for Azure Database for PostgreSQL
private-link Inspect Traffic With Azure Firewall https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/inspect-traffic-with-azure-firewall.md
Last updated 04/27/2023 -+ # Use Azure Firewall to inspect traffic destined to a private endpoint
private-link Tutorial Private Endpoint Sql Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/tutorial-private-endpoint-sql-cli.md
az network private-endpoint dns-zone-group create \
In this section, you'll use the virtual machine you created in the previous step to connect to the SQL server across the private endpoint.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Resource groups** in the left-hand navigation pane.
private-link Tutorial Private Endpoint Sql Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/tutorial-private-endpoint-sql-powershell.md
New-AzPrivateDnsZoneGroup @parameters4
In this section, you'll use the virtual machine you created in the previous step to connect to the SQL server across the private endpoint.
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select **Resource groups** in the left-hand navigation pane.
purview Abap Functions Deployment Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/abap-functions-deployment-guide.md
- Title: SAP ABAP function module deployment guide - Microsoft Purview
-description: This article outlines the steps to deploy the ABAP function module in your SAP server.
----- Previously updated : 08/03/2022--
-# SAP ABAP function module deployment guide
-
-When you scan [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), and [SAP BW](register-scan-sap-bw.md) sources in Microsoft Purview, you need to create the dependent ABAP function module in your SAP server. Microsoft Purview invokes this function module to extract the metadata from your SAP system during scan.
-
-This article describes the steps required to deploy this module.
-
-> [!Note]
-> The following instructions were compiled based on the SAP GUI v. 7.2.
-
-## Prerequisites
-
-Download the SAP ABAP function module source code from the Microsoft Purview governance portal. After you register a source for [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), or [SAP BW](register-scan-sap-bw.md), you can find a download link on top as shown in the following image. You can also see the link when you create a new scan or edit a scan.
--
-## Deploy the module
-
-Follow the instructions to deploy a module.
-
-### Create a package
-
-This step is optional, and an existing package can be used.
-
-1. Sign in to the SAP server and open **Object Navigator** (SE80 transaction).
-
-1. Select **Package** from the list and enter a name for the new package. For example, use **Z\_MITI**. Then select **Display**.
-
-1. In the **Create Package** window, select **Yes**. In the **Package Builder: Create Package** window, enter a value in the **Short Description** box. Select the **Continue** icon.
-
-1. In the **Prompt for local Workbench request** window, select **Own Requests**. Select the **development** request.
-
-### Create a function group
-
-1. In **Object Navigator**, select **Function Group** from the list and enter a name in the input box. For example, use **Z\_MITI\_FGROUP**. Select the **View** icon.
-
-1. In the **Create Object** window, select **yes** to create a new function group.
-
-1. Enter a description in the **Short Text** box and select **Save**.
-
-1. Select a package that was prepared in the **Create a Package** step, and select **Save**.
-
-1. Confirm a request by selecting **Continue**.
-
-1. Activate the function group.
-
-### Create the ABAP function module
-
-1. After the function group is created, select it.
-
-1. Select and hold (or right-click) the function group name in the repository browser. Select **Create** and then select **Function Module**.
-
-1. In the **Function module** box, enter **Z_MITI_DOWNLOAD** in case of SAP ECC or S/4HANA and **Z_MITI_BW_DOWNLOAD** in case of SAP BW. Enter a description in the **Short Text** box.
-
-After the module is created, specify the following information:
-
-1. Go to the **Attributes** tab.
-
-1. Under **Processing Type**, select **Remote-Enabled Module**.
-
- :::image type="content" source="media/abap-functions-deployment-guide/processing-type.png" alt-text="Screenshot that shows registering the sources option as Remote-Enabled Module." border="true":::
-
-1. Go to the **Source code** tab. There are two ways to deploy code for the function:
-
- 1. On the main menu, upload the text file you downloaded from the Microsoft Purview governance portal as described in [Prerequisites](#prerequisites). To do so, select **Utilities** > **More Utilities** > **Upload/Download** > **Upload**.
-
- 1. Alternatively, open the file and copy and paste the contents in the **Source code** area.
-
-1. Go to the **Import** tab and create the following parameters:
-
- 1. *P\_AREA TYPE DD02L-TABNAME* (**Optional** = True)
-
- 1. *P\_LOCAL\_PATH TYPE STRING* (**Optional** = True)
-
- 1. *P\_LANGUAGE TYPE L001TAB-DATA DEFAULT \'E\'*
-
- 1. *ROWSKIPS TYPE SO\_INT DEFAULT 0*
-
- 1. *ROWCOUNT TYPE SO\_INT DEFAULT 0*
-
- > [!Note]
- > Select the **Pass Value** checkbox for all the parameters.
-
- :::image type="content" source="media/abap-functions-deployment-guide/import.png" alt-text="Screenshot that shows registering the sources option as Import parameters." border="true":::
-
-1. Go to the **Tables** tab and define *EXPORT_TABLE LIKE TAB512*.
-
- :::image type="content" source="media/abap-functions-deployment-guide/export-table.png" alt-text="Screenshot that shows the Tables tab." border="true":::
-
-1. Go to the **Exceptions** tab and define the exception *E_EXP_GUI_DOWNLOADFAILED*.
-
- :::image type="content" source="media/abap-functions-deployment-guide/exceptions.png" alt-text="Screenshot that shows the Exceptions tab." border="true":::
-
-1. Save the function by selecting **Ctrl+S**. Or select **Function module** and then select **Save** on the main menu.
-
-1. Select the **Activate** icon on the toolbar and then select **Continue**. You can also select **Ctrl+F3**. If prompted, select the generated includes to be activated along with the main function module.
-
-### Test the function
-
-After you finish the previous steps, test the function:
-
-1. Open the **Z_MITI_DOWNLOAD** or **Z_MITI_BW_DOWNLOAD** function module you created.
-
-1. On the main menu, select **Function Module** > **Test** > **Test Function Module**. You can also select **F8**.
-
-1. Enter a path to the folder on the local file system in the parameter *P\_LOCAL\_PATH*. Then select the **Execute** icon on the toolbar. You can also select **F8**.
-
-1. Enter the name of the area of interest in the **P\_AREA** field if a file with metadata must be downloaded or updated. After the function finishes working, the folder indicated in the *P\_LOCAL\_PATH* parameter must contain several files with metadata inside. The names of files mimic areas that can be specified in the **P\_AREA** field.
-
-The function finishes its execution and metadata is downloaded much faster if it's launched on the machine that has a high-speed network connection with the SAP server.
-
-## Next steps
--- [Connect to and manage SAP ECC source](register-scan-sapecc-source.md)-- [Connect to and manage SAP S/4HANA source](register-scan-saps4hana-source.md)-- [Connect to and manage SAP Business Warehouse (BW) source](register-scan-sap-bw.md)
purview Account Upgrades https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/account-upgrades.md
- Title: Get ready for the next enhancement in Microsoft Purview
-description: A new experience is coming to Microsoft Purview. This article provides the steps you need to take to try it out!
---- Previously updated : 06/16/2023---
-# Get ready for the next enhancement in Microsoft Purview
-
->[!NOTE]
->This new experience will be available within the next few weeks. Account owners and root collection admins will receive an email when it has been made available for your organization.
-
-You spoke, we listened! Get ready for your next Microsoft Purview enhancement. We're rolling this new experience out to our customers to try. When it's available for your organization, your account owners and root collection admins will receive an email, and can follow these steps to enable the new experience!
-
-The new experience is an enhancement to the current Microsoft Purview, and doesn't impact your information already stored in Microsoft Purview or your ability to use our APIs.
-
-## How can you use our new experience?
-
-The new experience is automatically available once your organization has been enabled, but getting started with it depends on your organization's current structure.
-Follow one of these guides below:
--- [You only have one Microsoft Purview account](#one-microsoft-purview-account)
- - [Your account region maps to your tenant region](#your-region-maps-to-an-available-region)
- - [Your account region doesn't match your tenant region](#your-region-doesnt-map-to-an-available-region)
-- [You have multiple Microsoft Purview accounts](#multiple-microsoft-purview-accounts)-
-### One Microsoft Purview account
-
-#### Your region maps to an available region
-
-If your Microsoft Purview account's region matches with a currently available region, you're automatically able to use the new experience! No upgrade required.
-
-To access it, you can either:
--- Select the note at the top of the Microsoft Purview accounts page.-- Select the toggle in the Microsoft Purview governance portal.-
- :::image type="content" source="./media/account-upgrades/switch.png" alt-text="Screenshot of the toggle to switch to the new experience.":::
-
-#### Your region doesn't map to an available region
-
-1. Have an account owner or root collection admin select the **Upgrade** button in the Azure portal, Microsoft Purview portal, or upgrade email.
-
-1. If your account is in a different region than your tenant, an admin needs to confirm set up.
-
- :::image type="content" source="./media/account-upgrades/different-region.png" alt-text="Screenshot showing the different region confirmation pop-up window.":::
-
- >[!NOTE]
- >If you want to use a different region, you will either need to cancel and wait for upcoming region related features, or cancel and create a new Microsoft Purview account in one of the [available regions](#regions).
-1. After confirmation, the new portal will launch.
-1. You can take the tour and begin exploring the new Microsoft Purview experience!
-
->[!TIP]
->You can switch between the classic and new experiences using the toggle at the top of the portal.
->
->:::image type="content" source="./media/account-upgrades/switch.png" alt-text="Screenshot of the toggle to switch between the new and classic experiences after upgrading.":::
-
-### Multiple Microsoft Purview accounts
-
-1. Have an account owner or root collection admin select the **Upgrade** button in the Azure portal, Microsoft Purview portal, or upgrade email.
-1. The new experience requires a tenant-level/organization-wide account that is the primary account for your organization. Select an existing account to upgrade it to be your organization-wide account.
- >[!NOTE]
- >This does not affect your other Microsoft Purview accounts, or their data. In the future, their information will also be made available in the primary account.
-
- :::image type="content" source="./media/account-upgrades/selected-account.png" alt-text="Screenshot of the pop up window for selecting an organization wide primary account.":::
-
-1. If the account you select is in a different region than your tenant, an admin needs to confirm set up.
-
- :::image type="content" source="./media/account-upgrades/selected-account-different-region.png" alt-text="Screenshot of confirmation for selecting an account in a region that's different from your tenant region.":::
-
- >[!NOTE]
- >If you want to use a different [region](#regions), you will either need to cancel and select or create a different Microsoft Purview account, or cancel and wait for upcoming region related features.
-1. After confirmation, the new portal will launch.
-1. You can take the tour and begin exploring the new Microsoft Purview experience!
-
->[!TIP]
->You can switch between the classic and new experiences using the toggle at the top of the portal.
->
-> :::image type="content" source="./media/account-upgrades/switch.png" alt-text="Screenshot of the toggle to switch between the new and classic experiences.":::
-
-## Regions
-
-For information regarding all available Microsoft Purview regions, see [the Azure product by regions page.](https://azure.microsoft.com/explore/global-infrastructure/products-by-region/?products=purview&regions=all)
-
-These are the regions that are currently available for the new experience:
-
-| Purview Account Location | Mapped Azure AD Country or Region | Tenant Location Code |
-| | -- | -- |
-| Australia East |  Australia | AU |
-| |  Fiji | FJ |
-| |  New Zealand | NZ |
-| Brazil South |  Argentina | AR |
-| |  Bolivia (Plurinational State of) | BO |
-| |  Bonaire Sint Eustatius and Saba | BQ |
-| |  Brazil | BR |
-| |  Chile | CL |
-| |  Colombia | CO |
-| |  Curaçao | CW |
-| |  Ecuador | EC |
-| |  Falkland Islands (the) | FK |
-| |  French Guiana | GF |
-| |  Guyana | GY |
-| |  Peru | PE |
-| |  Paraguay | PY |
-| |  Suri | SR |
-| |  Sint Maarten (Dutch part) | SX |
-| |  Uruguay | UY |
-| |  Venezuela (Bolivarian Republic of) | VE |
-| Canada Central | Canada | CA |
-| Central India |  India | IN |
-| |  Sri Lanka | LK |
-| |  Nepal | NP |
-| Central US |  United States of America (the) | US |
-| East US | Antigua and Barbuda | AG |
-| |  Anguilla | AI |
-| |  Aruba | AW |
-| |  Barbados | BB |
-| |  Bermuda | BM |
-| |  Bahamas (the) | BS |
-| |  Cuba | CU |
-| |  Dominica | DM |
-| |  Dominican Republic (the) | DO |
-| |  Grenada | GD |
-| |  Guadeloupe | GP |
-| |  Haiti | HT |
-| |  Jamaica | JM |
-| |  Saint Kitts and Nevis | KN |
-| |  Cayman Islands (the) | KY |
-| |  Saint Lucia | LC |
-| |  Martinique | MQ |
-| |  Panama | PA |
-| |  Puerto Rico | PR |
-| |  Trinidad and Tobago | TT |
-| |  Saint Vincent and the Grenadines | VC |
-| |  Virgin Islands (British) | VG |
-| |  Virgin Islands (U.S.) | VI |
-| |  United States of America (the) | US |
-| East US 2 |  United States of America (the) | US |
-| France Central | France | FR |
-| Germany West Central | Germany | DE |
-| Japan East | Japan | JP |
-| Korea Central |  Korea (the Republic of) | KR |
-| North Europe |  Finland | FI |
-| |  Faroe Islands (the) | FO |
-| |  Ireland | IE |
-| |  Iceland | IS |
-| |  Moldova (the Republic of) | MD |
-| |  Sweden | SE |
-| Qatar Central | Qatar | QA |
-| South Africa North |  Angola | AO |
-| |  Burkina Faso | BF |
-| |  Burundi | BI |
-| |  Benin | BJ |
-| |  Botswana | BW |
-| |  Congo (the Democratic Republic of the) | CD |
-| |  Central African Republic (the) | CF |
-| |  Congo (the) | CG |
-| |  Côte d'Ivoire | CI |
-| |  Cameroon | CM |
-| |  Cabo Verde | CV |
-| |  Djibouti | DJ |
-| |  Algeria | DZ |
-| |  Egypt | EG |
-| |  Eritrea | ER |
-| |  Ethiopia | ET |
-| |  Gabon | GA |
-| |  Ghana | GH |
-| |  Gambia (the) | GM |
-| |  Guinea | GN |
-| |  Equatorial Guinea | GQ |
-| |  Guinea-Bissau | GW |
-| |  Kenya | KE |
-| |  Comoros (the) | KM |
-| |  Liberia | LR |
-| |  Lesotho | LS |
-| |  Libya | LY |
-| |  Morocco | MA |
-| |  Madagascar | MG |
-| |  Mali | ML |
-| |  Mauritania | MR |
-| |  Mauritius | MU |
-| |  Malawi | MW |
-| |  Mozambique | MZ |
-| |  Namibia | NA |
-| |  Niger (the) | NE |
-| |  Nigeria | NG |
-| |  Réunion | RE |
-| |  Rwanda | RW |
-| |  Seychelles | SC |
-| |  Sudan (the) | SD |
-| |  Saint Helena Ascension and Tristan da Cunha | SH |
-| |  Sierra Leone | SL |
-| |  Senegal | SN |
-| |  Somalia | SO |
-| |  South Sudan | SS |
-| |  São Tomé and Príncipe | ST |
-| |  Eswatini | SZ |
-| |  Chad | TD |
-| |  Togo | TG |
-| |  Tunisia | TN |
-| |  Tanzania United Republic of | TZ |
-| |  Uganda | UG |
-| |  Mayotte | YT |
-| |  South Africa | ZA |
-| |  Zambia | ZM |
-| |  Zimbabwe | ZW |
-| Southeast Asia |  Armenia | AM |
-| |  American Samoa | AS |
-| |  Brunei Darussalam | BN |
-| |  Cocos (Keeling) Islands (the) | CC |
-| |  Cook Islands (the) | CK |
-| |  China | CN |
-| |  Christmas Island | CX |
-| |  Micronesia (Federated States of) | FM |
-| |  Guam | GU |
-| |  Hong Kong SAR | HK |
-| |  Indonesia | ID |
-| |  Cambodia | KH |
-| |  Kiribati | KI |
-| |  Lao People's Democratic Republic (the) | LA |
-| |  Marshall Islands (the) | MH |
-| |  Myanmar | MM |
-| |  Macao SAR | MO |
-| |  Northern Mariana Islands (the) | MP |
-| |  Malaysia | MY |
-| |  New Caledonia | NC |
-| |  Norfolk Island | NF |
-| |  Nauru | NR |
-| |  Niue | NU |
-| |  French Polynesia | PF |
-| |  Papua New Guinea | PG |
-| |  Philippines (the) | PH |
-| |  Pitcairn | PN |
-| |  Palau | PW |
-| |  Solomon Islands | SB |
-| |  Singapore | SG |
-| |  Thailand | TH |
-| |  Tokelau | TK |
-| |  Tonga | TO |
-| |  Tuvalu | TV |
-| |  Taiwan (Province of China) | TW |
-| |  United States Minor Outlying Islands (the) | UM |
-| |  Viet Nam | VN |
-| |  Vanuatu | VU |
-| |  Wallis and Futuna | WF |
-| |  Samoa | WS |
-| South Central US |  United States of America (the) | US |
-| Switzerland North | Switzerland | CH |
-| UAE North | United Arab Emirates (the) | AE |
-| UK South |  United Kingdom of Great Britain and Northern Ireland (the) | GB |
-| West Central US |  United States of America (the) | US |
-| West Europe |  Andorra | AD |
-| |  Albania | AL |
-| |  Austria | AT |
-| |  Åland Islands | AX |
-| |  Azerbaijan | AZ |
-| |  Bosnia and Herzegovina | BA |
-| |  Belgium | BE |
-| |  Bulgaria | BG |
-| |  Bahrain | BH |
-| |  Belarus | BY |
-| |  Cyprus | CY |
-| |  Czechia | CZ |
-| |  Denmark | DK |
-| |  Estonia | EE |
-| |  Spain | ES |
-| |  Georgia | GE |
-| |  Guernsey | GG |
-| |  Gibraltar | GI |
-| |  Greece | GR |
-| |  Croatia | HR |
-| |  Hungary | HU |
-| |  Israel | IL |
-| |  Isle of Man | IM |
-| |  Italy | IT |
-| |  Jersey | JE |
-| |  Jordan | JO |
-| |  Kuwait | KW |
-| |  Kazakhstan | KZ |
-| |  Lebanon | LB |
-| |  Liechtenstein | LI |
-| |  Lithuania | LT |
-| |  Luxembourg | LU |
-| |  Latvia | LV |
-| |  Monaco | MC |
-| |  Montenegro | ME |
-| |  Republic of North Macedonia | MK |
-| |  Malta | MT |
-| |  Netherlands (the) | NL |
-| |  Norway | NO |
-| |  Oman | OM |
-| |  Pakistan | PK |
-| |  Poland | PL |
-| |  Portugal | PT |
-| |  Romania | RO |
-| |  Serbia | RS |
-| |  Russian Federation (the) | RU |
-| |  Saudi Arabia | SA |
-| |  Slovenia | SI |
-| |  Svalbard and Jan Mayen | SJ |
-| |  Slovakia | SK |
-| |  San Marino | SM |
-| |  Türkiye | TR |
-| |  Ukraine | UA |
-| |  Holy See (the) | VA |
-| |  Yemen | YE |
-| West US |  United States of America (the) | US |
-| West US 2 |  United States of America (the) | US |
-| West US 3 |  Costa Rica | CR |
-| |  Guatemala | GT |
-| |  Honduras | HN |
-| |  Mexico | MX |
-| |  Nicaragua | NI |
-| |  El Salvador | SV |
-| |  New Mexico | NM |
-| |  United States of America (the) | US |
-
-## Next steps
-
-Stay tuned for updates! Until then, make use of Microsoft Purview's other features:
--- [Create an account](create-microsoft-purview-portal.md)-- [Share and receive data](quickstart-data-share.md)-- [Data lineage](concept-data-lineage.md)-- [Data owner policies](concept-policies-data-owner.md)-- [Classification](concept-classification.md)
purview Apply Classifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/apply-classifications.md
- Title: Automatically apply classifications on assets
-description: This document describes how to automatically apply classifications on assets.
----- Previously updated : 12/30/2022-
-# Automatically apply classifications on assets in Microsoft Purview
-
-After data sources are [registered](manage-data-sources.md#register-a-new-source) in the Microsoft Purview Data Map, the next step is to [scan](concept-scans-and-ingestion.md) the data sources. The scanning process establishes a connection to the data source, captures technical metadata, and can automatically classify data using either the [supported system classifications](supported-classifications.md) or [rules for your custom classifications](create-a-custom-classification-and-classification-rule.md#custom-classification-rules). For example, if you have a file named *multiple.docx* and it has a National ID number in its content, during the scanning process Microsoft Purview adds the classification **EU National Identification Number** to the file asset's detail page.
-
-These [classifications](concept-classification.md) help you and your team identify the kinds of data you have across your data estate. For example: if files or tables contain credit card numbers, or addresses. Then you can more easily search for certain kinds of information, like customer IDs, or prioritize security for sensitive data types.
-
-Classifications can be automatically applied on file and column assets during scanning.
-
-In this article we'll discuss:
--- [How Microsoft Purview classifies data](#how-microsoft-purview-classifies-assets)-- [The steps to automatically apply classifications](#automatically-apply-classifications)-
-## How Microsoft Purview classifies assets
-
-When a data source is scanned, Microsoft Purview compares data in the asset to a list of possible classifications called a [scan rule set](create-a-scan-rule-set.md).
-
-There are [system scan rule sets](create-a-scan-rule-set.md#system-scan-rule-sets) already available for each data source that contain every currently available system classification for that data source. Or, you can [create a custom scan rule set](create-a-scan-rule-set.md) to make a list of classifications tailored to your data set.
-
-Making a custom rule set for your data can be a good idea if your data is limited to specific kinds of information, or regions, as comparing your data to fewer classification types will speed up the scanning process. For example, if your dataset only contains European data, you could create a custom scan rule set that excludes identification for other regions.
-
-You might also make a custom rule set if you've created [custom classifications](create-a-custom-classification-and-classification-rule.md#steps-to-create-a-custom-classification) and [classification rules](create-a-custom-classification-and-classification-rule.md#custom-classification-rules), so that your custom classifications can be automatically applied during scanning.
-
-For more information about the available system classifications and how your data is classified, see the [system classifications page.](supported-classifications.md)
-
-## Automatically apply classifications
-
->[!NOTE]
->Table assets are not automatically assigned classifications, because the classifications are assigned to their columns, but you can [manually apply classifications to table assets](manually-apply-classifications.md#manually-apply-classification-to-a-table-asset).
-
-After data sources are [registered](manage-data-sources.md#register-a-new-source), you can automatically classify data in that source's data assets by running a [scan](concept-scans-and-ingestion.md).
-
-1. Check the **Scan** section of the [source article](microsoft-purview-connector-overview.md) for your data source to confirm any prerequisites or authentication are set up and ready for a scan.
-
-1. Search the Microsoft Purview Data Map for the registered source that has the data assets (files and columns) you want to classify.
-
-1. Select the **New Scan** icon under the resource.
-
- :::image type="content" source="./media/apply-classifications/new-scan.png" alt-text="Screenshot of the Microsoft Purview Data Map, with the new scan button selected under a registered source.":::
-
- >[!TIP]
- >If you don't see the New Scan button, you may not have correct permissions. To run a scan, you'll need at least [data source administrator permissions](catalog-permissions.md) on the collection where the source is registered.
-
-1. Select your credential and authenticate with your source. (For more information about authenticating with your source, see the **Prerequisites** and **Scan** sections of your specific [source article](microsoft-purview-connector-overview.md).) Select **Continue**.
-
-1. If necessary, select the assets in the source you want to scan. You can scan all assets, or a subset of folders, files, or tables depending on the source.
-
-1. Select your scan rule set. You'll see a list of available scan rule sets and can select one, or you can choose to create a new scan rule set using the **New scan rule set** button at the top. The scan rule set will determine which classifications will be compared and applied to your data. For more information, see [how Microsoft Purview classifies assets](#how-microsoft-purview-classifies-assets).
-
- :::image type="content" source="./media/apply-classifications/select-scan-rule-set.png" alt-text="Screenshot of the scan rule set page of the scan menu, with the new scan rule set and existing scan rule set buttons highlighted.":::
-
- >[!TIP]
- >For more information about the options available when creating a scan rule set, start at step 4 of these [steps to create a scan rule set](create-a-scan-rule-set.md#steps-to-create-a-scan-rule-set).
-
-1. Schedule your scan.
-
-1. Save and run your scan. Applicable classifications in your scan rule set will be automatically applied to the assets you scan. You'll be able to view and manage them once the scan is complete.
--
-## Next steps
--- To learn how to create a custom classification, see [create a custom classification](create-a-custom-classification-and-classification-rule.md).-- To learn about how to manually apply classifications, see [manually apply classifications](manually-apply-classifications.md).
purview Asset Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/asset-insights.md
- Title: Asset insights on your data in Microsoft Purview
-description: This how-to guide describes how to view and use Microsoft Purview Data Estate Insights asset reporting on your data.
------ Previously updated : 05/16/2022--
-# Asset insights on your data in Microsoft Purview
-
-This guide describes how to access, view, and filter Microsoft Purview asset insight reports for your data.
-
-In this guide, you'll learn how to:
-
-> [!div class="checklist"]
-> * View data estate insights from your Microsoft Purview account.
-> * Get a bird's eye view of your data.
-> * Drill down for more asset count details.
-
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights, make sure that you've completed the following steps:
-
-* Set up a storage resource and populated the account with data.
-
-* Set up and completed a scan of your storage source.
-
-For more information to create and complete a scan, see [the manage data sources in Microsoft Purview article](manage-data-sources.md).
-
-## Understand your asset inventory in Data Estate Insights
-
-In Microsoft Purview Data Estate Insights, you can get an overview of the assets that have been scanned into the Data Map and view key gaps that can be closed by governance stakeholders, for better governance of the data estate.
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting your Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
- :::image type="content" source="./media/asset-insights/portal-access.png" alt-text="Screenshot of Microsoft Purview account in Azure portal with the Microsoft Purview governance portal button highlighted.":::
-
-1. On the Microsoft Purview **Home** page, select **Data Estate Insights** on the left menu.
-
- :::image type="content" source="./media/asset-insights/view-insights.png" alt-text="Screenshot of the Microsoft Purview governance portal with the Data Estate Insights button highlighted in the left menu.":::
-
-1. In the **Data Estate Insights** area, look for **Assets** in the **Inventory and Ownership** section.
-
- :::image type="content" source="./media/asset-insights/asset-insights-table-of-contents.png" alt-text="Screenshot of the Microsoft Purview governance portal Insights menu with Assets highlighted.":::
--
-### View asset summary
-
-1. The **Assets Summary** report provides several high-level KPIs, with these graphs:
-
- * **Unclassified assets**: Assets with no system or custom classification on the entity or its columns.
- * **Unassigned data owner**: Assets that have the owner attribute within "Contacts" tab as blank.
- * **Net new assets in last 30 days**: Assets that were added to the Purview account, via data scan or Atlas API pushes.
- * **Deleted assets in last 30 days**: Assets that were deleted from the Purview account, as a result of deletion from data sources.
-
- :::image type="content" source="./media/asset-insights/asset-insights-summary-report-small.png" alt-text="Screenshot of the insights assets summary graphs, showing the four main KPI charts." lightbox="media/asset-insights/asset-insights-summary-report.png":::
-
-1. Below these KPIs, you can also view your data asset distribution by collection.
-
- :::image type="content" source="./media/asset-insights/assets-by-collection-small.png" alt-text="Screenshot of the insights assets by collection section, showing a graphic that summarizes the number of assets by collection." lightbox="media/asset-insights/assets-by-collection.png":::
-
-1. Using filters you can drill down to assets within a specific collection or classification category.
-
- :::image type="content" source="./media/asset-insights/filter.png" alt-text="Screenshot of the insights assets by collection section, with the filter at the top selected, showing available collections.":::
-
- > [!NOTE]
- > ***Each classification filter has some common values:***
- > * **Applied**: Any filter value is applied
- > * **Not Applied**: No filter value is applied. For example, if you pick a classification filter with value as "Not Applied", the graph will show all assets with no classification.
- > * **All**: Filter values are cleared, meaning the graph will show all assets, with or without classification.
- > * **Specific**: You have picked a specific classification from the filter, and only that classification will be shown.
-
-1. To learn more about which specific assets are shown in the graph, select **View details**.
-
- :::image type="content" source="./media/asset-insights/view-details.png" alt-text="Screenshot of the insights assets by collection section, with the view-details button at the bottom highlighted.":::
-
- :::image type="content" source="./media/asset-insights/details-view.png" alt-text="Screenshot of the asset details view screen, which is still within the Data Estate Insights application.":::
-
-1. You can select any collection to view the collection's asset list.
-
- :::image type="content" source="./media/asset-insights/select-collection.png" alt-text="Screenshot of the asset details view screen, with one of the collections highlighted.":::
-
- :::image type="content" source="./media/asset-insights/asset-list.png" alt-text="Screenshot of the asset list screen, showing all assets within the selected collection.":::
-
-1. You can also select an asset to edit without leaving the Data Estate Insights App.
-
- :::image type="content" source="./media/asset-insights/edit-asset.png" alt-text="Screenshot of the asset list screen, with an asset selected for editing and the asset edit screen open within the Data Estate Insights application.":::
-
-
-### File-based source types
-
-The next graphs in asset insights show a distribution of file-based source types. The first graph, called **Size trend (GB) of file type within source types**, shows top file type size trends over the last 30 days.
-
-1. Pick your source type to view the file type within the source.
-
-1. Select **View details** to see the current data size, change in size, current asset count and change in asset count.
-
- > [!NOTE]
- > If the scan has run only once in the last 30 days, or a catalog change (such as a classification being added or removed) happened only once in the last 30 days, then the change information above appears blank.
-
-1. When you select a source type, see the top folders with the largest changes in asset counts.
-
-1. Select the path to see the asset list.
-
-The second graph in file-based source types is **Files not associated with a resource set**. If you expect that all files should roll up into a resource set, this graph can help you understand which assets haven't been rolled up. Missing assets can be an indication of the wrong file-pattern in the folder. You can select **View details** below the graph for more information.
-
- :::image type="content" source="./media/asset-insights/file-based-assets-inline.png" alt-text="Screenshot of the file-based assets view." lightbox="./media/asset-insights/file-based-assets.png":::
-
-## Next steps
-
-Learn how to use Data Estate Insights with resources below:
-
-* [Learn how to use data stewardship insights](data-stewardship.md)
-* [Learn how to use classification insights](classification-insights.md)
-* [Learn how to use glossary insights](glossary-insights.md)
-* [Learn how to use label insights](sensitivity-insights.md)
purview Available Metadata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/available-metadata.md
- Title: Available metadata for Power BI in the Microsoft Purview governance portal
-description: This reference article provides a list of metadata that is available for a Power BI tenant in the Microsoft Purview governance portal.
- Previously updated: 01/31/2023
-# Available metadata for Power BI in the Microsoft Purview Data Catalog
-
-This article has a list of the metadata that is available for a Power BI tenant in the Microsoft Purview governance portal.
-
-## Power BI
-
-| Metadata | Population method | Source of truth | Asset type | Editable | Upstream metadata |
-| | -- | -- | | -- | - |
-| Classification | Manual | Microsoft Purview | All types | Yes | N/A |
-| Sensitivity Labels | Automatic | Microsoft Purview | All types | No | |
-| Glossary terms | Manual | Microsoft Purview | All types | Yes | N/A |
-| Collection | Automatic | Microsoft Purview | All types | Yes | N/A |
-| Hierarchy | Automatic | Microsoft Purview | All types | No | N/A |
-| qualifiedName | Automatic | Microsoft Purview | All types | No | N/A |
-| Asset Description | Automatic/Manual* | Microsoft Purview | All types | Yes | N/A |
-| Contacts - Expert | Manual | Microsoft Purview | All types | Yes | N/A |
-| Contacts - Owner | Manual | Microsoft Purview | All types | Yes | N/A |
-| name | Automatic | Power BI | Power BI Dashboard | Yes | dashboard.DisplayName |
-| isReadOnly | Automatic | Power BI | Power BI Dashboard | No | dashboard.IsReadOnly |
-| embedUrl | Automatic | Power BI | Power BI Dashboard | No | dashboard.EmbedUrl |
-| tileNames | Automatic | Power BI | Power BI Dashboard | No | TileTitles |
-| users | Automatic | Power BI | Power BI Dashboard | No | dashboard.Users |
-| Lineage | Automatic | Power BI | Power BI Dashboard | No | N/A |
-| name | Automatic | Power BI | Power BI Dataflow | Yes | dataflow.Name |
-| description | Automatic | Power BI | Power BI Dataflow | Yes | dataflow.Description |
-| configuredBy | Automatic | Power BI | Power BI Dataflow | No | dataflow.ConfiguredBy |
-| modifiedBy | Automatic | Power BI | Power BI Dataflow | No | dataflow.ModifiedBy |
-| modifiedDateTime | Automatic | Power BI | Power BI Dataflow | No | dataflow.ModifiedDateTime |
-| Endorsement | Automatic | Power BI | Power BI Dataflow | No | dataflow.EndorsementDetails |
-| users | Automatic | Power BI | Power BI Dataflow | No | dataflow.Users |
-| Lineage | Automatic | Microsoft Purview | Power BI Dataflow | No | |
-| name | Automatic | Power BI | Power BI Datamart | Yes | datamart.Name |
-| description | Automatic | Power BI | Power BI Datamart | Yes | datamart.Description |
-| configuredBy | Automatic | Power BI | Power BI Datamart | No | datamart.ConfiguredBy |
-| modifiedBy | Automatic | Power BI | Power BI Datamart | No | datamart.ModifiedBy |
-| modifiedDateTime | Automatic | Power BI | Power BI Datamart | No | datamart.ModifiedDateTime |
-| Endorsement | Automatic | Power BI | Power BI Datamart | No | datamart.EndorsementDetails |
-| users | Automatic | Power BI | Power BI Datamart | No | datamart.Users |
-| Lineage | Automatic | Microsoft Purview | Power BI Datamart | No | |
-| name | Automatic | Power BI | Power BI Dataset | Yes | dataset.Name |
-| description | Automatic | Power BI | Power BI Dataset | Yes | dataset.Description |
-| isRefreshable | Automatic | Power BI | Power BI Dataset | No | dataset.IsRefreshable |
-| configuredBy | Automatic | Power BI | Power BI Dataset | No | dataset.ConfiguredBy |
-| contentProviderType | Automatic | Power BI | Power BI Dataset | No | dataset.ContentProviderType |
-| createdDate | Automatic | Power BI | Power BI Dataset | No | dataset.CreatedDateTime |
-| targetStorageMode | Automatic | Power BI | Power BI Dataset | No | dataset.TargetStorageMode |
-| Schema | Automatic/Manual | Power BI | Power BI Dataset | | tables & columns |
-| Endorsement | Automatic | Power BI | Power BI Dataset | No | dataset.EndorsementDetails |
-| users | Automatic | Power BI | Power BI Dataset | No | dataset.Users |
-| Lineage | Automatic | Microsoft Purview | Power BI Dataset | No | |
-| name | Automatic | Power BI | Power BI Report | Yes | report.Name |
-| description | Automatic | Power BI | Power BI Report | Yes | report.Description |
-| createdDateTime | Automatic | Power BI | Power BI Report | No | report.CreatedDateTime |
-| webUrl | Automatic | Power BI | Power BI Report | No | report.WebUrl |
-| embedUrl | Automatic | Power BI | Power BI Report | No | report.EmbedUrl |
-| PBIDatasetId | Automatic | Power BI | Power BI Report | No | report.DatasetId |
-| modifiedBy | Automatic | Power BI | Power BI Report | No | report.ModifiedBy |
-| modifiedDateTime | Automatic | Power BI | Power BI Report | No | report.ModifiedDateTime |
-| reportType | Automatic | Power BI | Power BI Report | No | report.ReportType |
-| Endorsement | Automatic | Power BI | Power BI Report | No | report.EndorsementDetails |
-| users | Automatic | Power BI | Power BI Report | No | report.Users |
-| Lineage | Automatic | Microsoft Purview | Power BI Report | No | N/A |
-| name | Automatic | Power BI | Power BI Workspace | Yes | workspace.Name |
-| description | Automatic | Power BI | Power BI Workspace | Yes | workspace.Description |
-| state | Automatic | Power BI | Power BI Workspace | No | workspace.State |
-| type | Automatic | Power BI | Power BI Workspace | No | ResourceType.Workspace |
-| IsReadOnly | Automatic | Power BI | Power BI Workspace | No | workspace.IsReadOnly |
-| IsOnDedicatedCapacity | Automatic | Power BI | Power BI Workspace | No | workspace.IsOnDedicatedCapacity |
-| users | Automatic | Power BI | Power BI Workspace | No | workspace.Users |
-
-## Next steps
-- [Connect to and manage a Power BI tenant](register-scan-power-bi-tenant.md)
-- [Connect to and manage Power BI across tenants](register-scan-power-bi-tenant-cross-tenant.md)
-- [Connect to and manage Power BI troubleshooting](register-scan-power-bi-tenant-troubleshoot.md)
purview Catalog Adoption Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-adoption-insights.md
- Title: Catalog Adoption Insights in Microsoft Purview
-description: This article describes the catalog adoption dashboards in Microsoft Purview, and how they can be used to govern and manage your data estate.
- Previously updated: 05/10/2023
-# Get insights into catalog adoption from Microsoft Purview
-
-As described in the [insights concepts](concept-insights.md), the catalog adoption report is part of the "Health" section of the Data Estate Insights App. This report offers a one-stop shop experience for administrators to determine if and how the Microsoft Purview Data Catalog is being used. It helps answer questions like:
-- What are my users searching for?
-- How many people used the data catalog last month?
-- What are the most used data assets?
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights, make sure that you've completed the following steps:
-
-* Set up a storage resource and populated the account with data.
-
-* Set up and completed a scan of your storage source.
-
-* [Enable and schedule your data estate insights reports](how-to-schedule-data-estate-insights.md).
-
-For more information on creating and completing a scan, see [the manage data sources in Microsoft Purview article](manage-data-sources.md).
-
-## Understand your data estate and catalog health in Data Estate Insights
-
-In Microsoft Purview Data Estate Insights, you can get an overview of all assets inventoried in the Data Map, and any key gaps that can be closed by governance stakeholders, for better governance of the data estate.
-
-1. Access the [Microsoft Purview Governance Portal](https://web.purview.azure.com/) and open your Microsoft Purview account.
-
-1. On the Microsoft Purview **Home** page, select **Data Estate Insights** on the left menu.
-
- :::image type="content" source="./media/catalog-adoption-insights/view-insights.png" alt-text="Screenshot of the Microsoft Purview governance portal with the Data Estate Insights button highlighted in the left menu.":::
-
-1. In the **Data Estate Insights** area, look for **Catalog Adoption** in the **Health** section.
-
- :::image type="content" source="./media/catalog-adoption-insights/select-catalog-adoption.png" alt-text="Screenshot of the Microsoft Purview governance portal Data Estate Insights menu with Catalog Adoption highlighted under the Health section.":::
-
-## View catalog adoption dashboard
-
-The catalog adoption dashboard has several curated tiles and charts to identify:
-- How many active users your data catalog had in the last month
-- How many total searches were performed in the last month
-- Which data catalog features are being used
-- Most viewed assets
-- Top searched keywords in your data catalog
-### Monthly active users
-
-The monthly active users tile provides a count of users who have taken at least one action in the Microsoft Purview Data Catalog in the last 30 days. Actions in the data catalog include searching for a term, browsing the data catalog, and updating an asset.
-
-This tile also includes an indicator of a percentage increase or decrease in users from the previous month.
-
-### Total searches
-
-The total searches tile provides a count of all searches performed in the Microsoft Purview Data Catalog over the last 30 days. At the bottom it also includes an indicator of a percentage increase or decrease in the number of searches from the previous month.
-
-### Active users by feature category
-
-The active users by feature category chart allows you to monitor user activity trends in your data catalog.
--
-At the top of the chart, you can select your date range to view user activity on a daily, weekly, or monthly basis.
-
-The table then shows the number of users in the date range performing one of these three categories of actions:
-- Search and browse - the count of users who performed any search or browse action in the Microsoft Purview Data Catalog
-- Asset curation - the count of users who performed at least one of these actions on an asset:
- - Added, removed, or updated a rating
- - Added or removed a tag
- - Added or removed a glossary term
- - Added or removed a classification
- - Edited an asset's name or description
- - Added or removed a certification
- - Added or removed a column level classification, glossary term, description, name, or data type
- - Added or removed manual lineage
- - Added or removed a contact
-- All - the count of users who performed both search and browse and asset curation actions within the date range
-### Most viewed assets
-
-The most viewed assets table shows the top assets in your data catalog, by sum of views in the last 30 days.
--
-This table allows you to select through to your most viewed assets for more information, but the table also provides these other details:
-- Curation status - There are three possible statuses: "Fully curated", "Partially curated", and "Not curated", based on whether certain asset attributes are present. An asset is "Fully curated" if it has at least one classification tag, an assigned Data Owner, and a description. If some, but not all, of these attributes are missing, the asset is categorized as "Partially curated"; if all of them are missing, it's "Not curated". For more information about the curation status of your data assets, see the [data stewardship dashboard](data-stewardship.md).
-- Views - the number of views the asset received in the last 30 days.
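The three-tier status logic above can be sketched as a small helper. This is a hedged illustration only: the field names `classifications`, `owner`, and `description` are assumptions for the example, not the actual Data Estate Insights schema.

```python
def curation_status(asset: dict) -> str:
    """Classify an asset per the rules above: fully curated when it has
    at least one classification, an assigned owner, and a description;
    partially curated when some but not all of those are present."""
    checks = [
        bool(asset.get("classifications")),  # at least one classification tag
        bool(asset.get("owner")),            # an assigned Data Owner
        bool(asset.get("description")),      # a non-empty description
    ]
    if all(checks):
        return "Fully curated"
    if any(checks):
        return "Partially curated"
    return "Not curated"
```

Running the helper on an asset with only an owner, for example, returns "Partially curated".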
-### Top searched keywords
-
-The top searched keywords table shows your top keywords both for searches that produced results, and searches that didn't. That way you can know what users are finding with the data catalog, and what they're still looking for.
--
-At the top of the table, you can select one of the two radio buttons to select whether to show keywords for searches with results, or searches without results.
-
-You can select any of the keywords to run the search in the data catalog and see the results for yourself as well.
-
-The table also provides search volume, which is the number of times that keyword was searched in the last 30 days.
-
-## Next steps
-
-Learn more about Microsoft Purview Data Estate Insights through:
-* [Data Estate Insights Concepts](concept-insights.md)
-* [Data stewardship insights](data-stewardship.md)
purview Catalog Asset Details https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-asset-details.md
- Title: Asset management in the Microsoft Purview Data Catalog
-description: View relevant information and take action on assets in the Microsoft Purview Data Catalog.
- Previously updated: 05/26/2023
-# Asset management in the Microsoft Purview Data Catalog
-
-This article discusses how assets are displayed in the Microsoft Purview Data Catalog, and all the features and details available to them. It describes how you can view relevant information or take action on assets in your catalog.
-
-## Prerequisites
-- Set up your data sources and scan the assets into your catalog.
-- *Or* use the Microsoft Purview Atlas APIs to ingest assets into the catalog.
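If you take the Atlas API route, ingestion comes down to POSTing an Atlas v2 entity payload to your account's catalog endpoint. The sketch below only builds the payload; the account name, type name, and attribute values are hypothetical, and authentication and the actual HTTP call are omitted.

```python
import json

# Hypothetical account and asset details -- substitute your own.
ACCOUNT = "contoso-purview"
ENDPOINT = f"https://{ACCOUNT}.purview.azure.com/catalog/api/atlas/v2/entity"

def build_entity_payload(type_name: str, qualified_name: str, name: str) -> dict:
    """Build a minimal Atlas v2 entity payload; POST it to ENDPOINT
    with a bearer token to ingest the asset into the catalog."""
    return {
        "entity": {
            "typeName": type_name,
            "attributes": {
                # qualifiedName must be unique across the catalog
                "qualifiedName": qualified_name,
                "name": name,
            },
        }
    }

payload = build_entity_payload(
    "azure_blob_path",
    "https://contosostore.blob.core.windows.net/data/sales.csv",
    "sales.csv",
)
print(json.dumps(payload, indent=2))
```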
-## Discover assets
-
-You can discover your assets in the Microsoft Purview Data Catalog by either:
-- [Browsing the data catalog](how-to-browse-catalog.md)
-- [Searching the data catalog](how-to-search-catalog.md)
-Once you've discovered an asset, select it to view the asset details page, which contains all the asset details, and more actions you can take.
-
-## Editing assets
-
-To edit an asset, other than adding a rating, you will need the [data curator](catalog-permissions.md) role on the collection where the asset is housed.
-
-To edit assets you can either:
-- Select multiple assets in the catalog to [bulk edit assets](how-to-bulk-edit-assets.md)
-- Select a single asset and select the **Edit** button at the top of the [asset details page](#asset-details-page)
-#### Scan behavior after editing assets
-
- Microsoft Purview works to reflect the truth of the source system whenever possible. For example, if you edit a column that's later deleted from the source table, a scan will remove the column metadata from the asset in Microsoft Purview.
-
-Both column-level and asset-level updates, such as adding a description, glossary term, or classification, don't impact scan updates. Scans will continue to update new columns and classifications regardless of these changes.
-
-If you update the **name** or **data type** of a column, subsequent scans **won't** update the asset schema. New columns and classifications **won't** be detected.
-
-## Delete asset
-
-If you're a data curator on the collection containing an asset, you can delete an asset by selecting the delete icon under the name of the asset.
-
-> [!IMPORTANT]
-> You cannot delete an asset that has child assets.
->
-> Currently, Microsoft Purview doesn't support cascaded deletes. For example, if you attempt to delete a storage account asset in your catalog, the containers, folders, and files within them will still exist in the data map, and the storage account asset will still exist in relation to them.
-
-Any asset you delete using the delete button is permanently deleted in Microsoft Purview. However, if you run a **full scan** on the source from which the asset was ingested into the catalog, then the asset is reingested and you can discover it using the Microsoft Purview Data Catalog.
-
-If you have a scheduled scan (weekly or monthly) on the source, the **deleted asset won't get re-ingested** into the catalog unless the asset is modified by an end user since the previous run of the scan. For example, say you manually delete a SQL table from the Microsoft Purview Data Map. Later, a data engineer adds a new column to the source table. When Microsoft Purview scans the database, the table will be reingested into the data map and be discoverable in the data catalog.
-
-## Asset details page
-
-At the top of the asset details page there are several tabs:
-- **Overview** - An asset's basic details like description, classification, hierarchy, and glossary terms.
-- **Properties** - The technical metadata and relationships discovered in the data source.
-- **Schema** - The schema of the asset, including column names, data types, column-level classifications, terms, and descriptions.
-- **Lineage** - This tab contains lineage graph details for assets where it's available.
-- **Contacts** - Every asset can have an assigned owner and expert that can be viewed and managed from the contacts tab.
-- **Related** - This tab lets you navigate through the technical hierarchy of assets that are related to the current asset you're viewing.
-## Asset overview
-
-The overview section of the asset details gives a summarized view of an asset. The sections that follow explain the different parts of the overview page.
--
-### Asset description
-
-An asset description gives a synopsis of what the asset represents. You can add or update an asset description by [editing the asset](#editing-assets).
-
-#### Adding rich text to a description
-
-Microsoft Purview enables users to add rich formatting to asset descriptions, such as bolding, underlining, or italicizing text. Users can also create tables, bulleted lists, or hyperlinks to external resources.
--
-Below are the rich text formatting options:
-
-| Name | Description | Shortcut key |
-| - | -- | |
-| Bold | Make your text bold. Adding the '*' character around text will also bold it. | Ctrl+B |
-| Italic | Italicize your text. Adding the '_' character around text will also italicize it. | Ctrl+I |
-| Underline | Underline your text. | Ctrl+U |
-| Bullets | Create a bulleted list. Adding the '-' character before text will also create a bulleted list. | |
| Numbering | Create a numbered list. Adding the '1' character before text will also create a numbered list. | |
-| Heading | Add a formatted heading | |
-| Font size | Change the size of your text. The default size is 12. | |
-| Decrease indent | Move your paragraph closer to the margin. | |
-| Increase indent | Move your paragraph farther away from the margin. | |
-| Add hyperlink | Create a link in your document for quick access to web pages and files. | |
-| Remove hyperlink | Change a link to plain text. | |
-| Quote | Add quote text | |
-| Add table | Add a table to your content. | |
-| Edit table | Insert or delete a column or row from a table | |
-| Clear formatting | Remove all formatting from a selection of text, leaving only the normal, unformatted text. | |
-| Undo | Undo changes you made to the content. | Ctrl+Z |
-| Redo | Redo changes you made to the content. | Ctrl+Y |
-
-> [!NOTE]
-> Updating a description with the rich text editor updates the `userDescription` field of an entity. If you have already added an asset description before the release of this feature, that description is stored in the `description` field. When overwriting a plain text description with rich text, the entity model will persist both `userDescription` and `description`. The asset details overview page will only show `userDescription`. The `description` field can't be edited in the Microsoft Purview studio user experience.
-
-### Classifications
-
-Classifications identify the kind of data being represented by an asset or column, such as "ABA routing number", "Email Address", or "U.S. Passport number". These attributes can be assigned during scans or added manually. For a full list of classifications, see the [supported classifications in Microsoft Purview](supported-classifications.md). You can see classifications assigned both to the asset and to columns in the schema from the overview page.
-
-### Glossary terms
-
-Glossary terms are a managed vocabulary for business terms that can be used to categorize and relate assets across your organization. For more information, see the [business glossary page](concept-business-glossary.md). You can view the assigned glossary terms for an asset in the overview section. If you're a data curator on the asset, you can add or remove a glossary term on an asset by [editing the asset](#editing-assets).
-
-### Collection hierarchy
-
-In Microsoft Purview, collections organize assets and data sources. They also manage access across the Microsoft Purview governance portal. You can view an asset's containing collection under the **Collection path** section.
-
-### Asset hierarchy
-
-You can view the full asset hierarchy within the overview tab. As an example: if you navigate to a SQL table, then you can see the schema, database, and the server the table belongs to.
-
-## Asset actions
-
-Below is a list of actions you can take from an asset details page. The actions available to you vary depending on your permissions and the type of asset you're viewing. Available actions are generally found on the global actions bar.
--
-### Request access to data
-
-If a [self-service data access workflow](how-to-workflow-self-service-data-access-hybrid.md) has been created, you can request access to a desired asset directly from the asset details page. To learn more about Microsoft Purview's data policy applications, see [how to enable data use management](how-to-enable-data-use-management.md).
-
-### Open in Power BI
-
-Microsoft Purview makes it easy to work with useful data you find in the data catalog. You can open certain assets in Power BI Desktop from the asset details page. Power BI Desktop integration is supported for the following sources.
-- Azure Blob Storage
-- Azure Cosmos DB
-- Azure Data Lake Storage Gen2
-- Azure Dedicated SQL pool (formerly SQL DW)
-- Azure SQL Database
-- Azure SQL Managed Instance
-- Azure Synapse Analytics
-- Azure Database for MySQL
-- Azure Database for PostgreSQL
-- Oracle DB
-- SQL Server
-- Teradata
-## Ratings
-
-Assets can be rated by all users with read access, or better, to that asset in Microsoft Purview.
-Ratings allow users to give an asset a rating from 1 to 5 stars, and leave a comment about the asset.
-
-These ratings can be seen by all users with read access, and rating can be [added as a facet](how-to-search-catalog.md#use-the-facets) when [searching the data catalog](how-to-search-catalog.md) so users can find assets with a certain rating.
-
-### View ratings
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) for your asset in Microsoft Purview and select it.
-1. In the header of the asset you can see a rating, which will show an aggregate star rating of the asset, and the number of reviews.
- :::image type="content" source="media/catalog-asset-details/view-rating-aggregate.png" alt-text="Screenshot of a rating in the header of an asset.":::
-1. To see a percentage breakdown of the ratings, select the rating.
-1. To see specific ratings and their comments, or to add your own rating, select **Open ratings**.
- :::image type="content" source="media/catalog-asset-details/open-rating-details.png" alt-text="Screenshot of a selected rating in the header of an asset showing the percentage breakdown.":::
-
-### Add a rating to an asset
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) for your asset in Microsoft Purview and select it.
-1. Select the ratings button in the asset's header.
-1. Select the **Open ratings** button.
- :::image type="content" source="media/catalog-asset-details/open-ratings.png" alt-text="Screenshot that shows the ratings button selected, and the open ratings button highlighted.":::
-1. Choose a star rating, add a comment, and select **Submit**.
   :::image type="content" source="media/catalog-asset-details/rate-asset.png" alt-text="Screenshot of a rating, showing five stars selected and a comment about the quality of the data.":::
-
-### Edit or delete your rating
-
-1. Select the ratings button in the asset's header.
-1. Select the **Open ratings** button.
-1. Under **My rating** select the ellipsis button in your rating.
- :::image type="content" source="media/catalog-asset-details/edit-rating.png" alt-text="Screenshot of the user's rating, shown under the My rating menu, with the ellipsis button selected.":::
-1. To delete your rating, select **Delete rating**.
-1. To edit your rating, select **Edit rating**, then update your score and comment and select **Submit**.
-
-## Tags
-
-Assets can be tagged by users with data curator permissions or better, and any user with reader permissions on these assets in Microsoft Purview can see these tags.
-Users can add tags [as a filter](how-to-search-catalog.md#refine-results) when [searching the data catalog](how-to-search-catalog.md) to see all assets with certain tags.
-
->[!NOTE]
->Tag limitations:
->
-> - An asset can only have up to 50 tags
-> - Tags can only be 50 characters
-> - Allowed characters: numbers, letters, -, and _
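Those three limits can be checked client-side before attempting to add a tag. This is a minimal sketch: the regex is one reading of "numbers, letters, -, and _", not an official validation rule.

```python
import re

MAX_TAGS_PER_ASSET = 50
# Letters, numbers, hyphens, and underscores; at most 50 characters.
TAG_PATTERN = re.compile(r"^[A-Za-z0-9_-]{1,50}$")

def can_add_tag(existing_tags: list, new_tag: str) -> bool:
    """Return True only if adding new_tag stays within the documented
    limits: at most 50 tags per asset, each tag at most 50 characters,
    using only the allowed characters."""
    if len(existing_tags) >= MAX_TAGS_PER_ASSET:
        return False
    return TAG_PATTERN.fullmatch(new_tag) is not None
```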
-
-### Add a tag to an asset
-
-If you have [data curator](catalog-permissions.md) permissions in Microsoft Purview, you can add a tag to an asset:
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) for your asset in Microsoft Purview and select it.
-1. Select the **+ Add Tag** button under the asset's name.
- :::image type="content" source="media/catalog-asset-details/add-new-tag.png" alt-text="Screenshot that shows the new tag button highlighted on an asset detail page.":::
-1. Select an existing available tag, or input a new tag.
-
-### Remove a tag from an asset
-
-If you have [data curator](catalog-permissions.md) permissions in Microsoft Purview, you can remove a tag from an asset by following these steps:
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) for your asset in Microsoft Purview and select it.
-1. Select the **X** button next to an existing tag under the asset's name.
- :::image type="content" source="media/catalog-asset-details/remove-tag.png" alt-text="Screenshot that shows the remove tag button highlighted next to an existing page.":::
-1. Confirm the removal of the tag.
-
-## Duplicate assets
-
-If you notice duplicate assets in your Microsoft Purview Data Catalog, review the [asset normalization](concept-asset-normalization.md) documentation. Microsoft Purview normalizes assets to prevent duplication, but not all possible scenarios are covered.
-
-Compare the fully qualified asset names for duplicate assets, and update ingestion points to resolve capitalization or character differences. Then, [delete the duplicated asset](#delete-asset) in your catalog.
-
-## Next steps
-- [Browse the Microsoft Purview Data Catalog](how-to-browse-catalog.md)
-- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
-- [Asset normalization](concept-asset-normalization.md)
purview Catalog Conditional Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-conditional-access.md
- Title: Configure Azure AD Conditional Access for Microsoft Purview
-description: This article describes steps how to configure Azure AD Conditional Access for Microsoft Purview
----- Previously updated : 03/23/2023
-# Customer intent: As an identity and security admin, I want to set up Azure Active Directory Conditional Access for Microsoft Purview, for secure access.
--
-# Conditional Access with Microsoft Purview
-
-[Microsoft Purview](./overview.md) supports Microsoft Conditional Access.
-
-The following steps show how to configure Microsoft Purview to enforce a Conditional Access policy.
-
-## Prerequisites
-- When multi-factor authentication is enabled, you must perform multi-factor authentication to sign in to the Microsoft Purview governance portal.
-
-## Configure conditional access
-
-1. Sign in to the Azure portal, select **Azure Active Directory**, and then select **Conditional Access**. For more information, see [Azure Active Directory Conditional Access technical reference](../active-directory/conditional-access/concept-conditional-access-conditions.md).
-
- :::image type="content" source="media/catalog-conditional-access/conditional-access-blade.png" alt-text="Screenshot that shows Conditional Access blade." lightbox="media/catalog-conditional-access/conditional-access-blade.png":::
-
-1. In the **Conditional Access-Policies** menu, select **New policy**, provide a name, and then select **Configure rules**.
-1. Under **Assignments**, select **Users and groups**, check **Select users and groups**, and then select the user or group for Conditional Access. Select **Select**, and then select **Done** to accept your selection.
-
- :::image type="content" source="media/catalog-conditional-access/select-users-and-groups.png" alt-text="Screenshot that shows User and Group selection." lightbox="media/catalog-conditional-access/select-users-and-groups.png":::
-
-1. Select **Cloud apps**, and then select **Select apps**. You'll see all apps available for Conditional Access. Select **Microsoft Purview**, select **Select** at the bottom, and then select **Done**.
-
- :::image type="content" source="media/catalog-conditional-access/select-azure-purview.png" alt-text="Screenshot that shows Applications selection." lightbox="media/catalog-conditional-access/select-azure-purview.png":::
-
-1. Select **Access controls**, select **Grant**, and then check the policy you want to apply. For this example, we select **Require multi-factor authentication**.
-
- :::image type="content" source="media/catalog-conditional-access/grant-access.png" alt-text="Screenshot that shows Grant access tab." lightbox="media/catalog-conditional-access/grant-access.png":::
-
-1. Set **Enable policy** to **On** and select **Create**.
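The portal steps above correspond to a Conditional Access policy object that can also be created through the Microsoft Graph API (`POST /identity/conditionalAccess/policies`, which requires the `Policy.ReadWrite.ConditionalAccess` permission). A minimal sketch of the request body; the group ID and app ID below are placeholders, and sending the request with an authenticated Graph client is omitted:

```python
import json

def build_mfa_policy(display_name, group_id, app_id):
    """Build a Conditional Access policy body matching the portal steps:
    target a group, target one cloud app, and require MFA."""
    return {
        "displayName": display_name,
        "state": "enabled",  # corresponds to "Enable policy: On"
        "conditions": {
            "users": {"includeGroups": [group_id]},
            "applications": {"includeApplications": [app_id]},
        },
        # "Require multi-factor authentication" grant control
        "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
    }

# Placeholder IDs for illustration only.
policy = build_mfa_policy(
    "Require MFA for Microsoft Purview",
    "00000000-0000-0000-0000-000000000001",  # your Azure AD group's object ID
    "00000000-0000-0000-0000-000000000002",  # the target app's ID
)
print(json.dumps(policy, indent=2))
```

The payload mirrors the portal flow one-to-one: assignments map to `conditions`, and the access control maps to `grantControls`.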
-
-## Next steps
--- [Use the Microsoft Purview governance portal](./use-purview-studio.md)
purview Catalog Firewall https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-firewall.md
- Title: Configure Microsoft Purview firewall
-description: This article describes how to configure firewall settings for your Microsoft Purview account
----- Previously updated : 01/13/2023
-# Customer intent: As a Microsoft Purview admin, I want to set firewall settings for my Microsoft Purview account.
--
-# Configure firewall settings for your Microsoft Purview account
-
-This article describes how to configure firewall settings for Microsoft Purview.
-
-## Prerequisites
-
-To configure Microsoft Purview account firewall settings, ensure you meet the following prerequisites:
-
-1. An Azure account with an active subscription. [Create an account for free.](https://azure.microsoft.com/free/?WT.mc_id=A261C142F)
-<br>
-2. An existing Microsoft Purview account.
-<br>
-
-## Microsoft Purview firewall deployment scenarios
-
-To configure the Microsoft Purview firewall, follow these steps:
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-
-2. Navigate to your Microsoft Purview account in the portal.
-
-3. Under **Settings**, choose **Networking**.
-
-4. In the **Firewall** tab, under **Public network access**, change the firewall settings to the option that suits your scenario:
-
-- **Enabled from all networks**
-
- :::image type="content" source="media/catalog-private-link/purview-firewall-public.png" alt-text="Screenshot showing the purview account firewall page, selecting public network in the Azure portal.":::
-
- By choosing this option:
-
- - All public network access into your Microsoft Purview account is allowed.
- - Public network access is set to _Enabled from all networks_ on your Microsoft Purview account's Managed storage account.
- - Public network access is set to _All networks_ on your Microsoft Purview account's Managed Event Hubs, if it's used.
-
- > [!NOTE]
- > Even though network access is enabled through the public internet, to gain access to the Microsoft Purview governance portal, [users must first be authenticated and authorized](catalog-permissions.md).
-
-- **Disabled for ingestion only (Preview)**
-
- :::image type="content" source="media/catalog-private-link/purview-firewall-ingestion.png" alt-text="Screenshot showing the purview account firewall page, selecting ingestion only in the Azure portal.":::
-
- > [!NOTE]
- > Currently, this option is available in public preview.
-
- By choosing this option:
- - Public network access to your Microsoft Purview account through API and Microsoft Purview governance portal is allowed.
- - All public network traffic for ingestion is disabled. In this case, you must configure a private endpoint for ingestion before setting up any scans. For more information, see [Use private endpoints for your Microsoft Purview account](catalog-private-link.md).
- - Public network access is set to _Disabled_ on your Microsoft Purview account's Managed storage account.
- - Public network access is set to _Disabled_ on your Microsoft Purview account's Managed Event Hubs, if it's used.
-
-- **Disabled from all networks**
-
- :::image type="content" source="media/catalog-private-link/purview-firewall-private.png" alt-text="Screenshot showing the purview account firewall page, selecting private network in the Azure portal.":::
-
- By choosing this option:
-
- - All public network access into your Microsoft Purview account is disabled.
- - All network access to your Microsoft Purview account through APIs or Microsoft Purview governance portal including traffic to run scans is allowed only through private network using private endpoints. For more information, see [Connect to your Microsoft Purview and scan data sources privately and securely](catalog-private-link-end-to-end.md).
- - Public network access is set to _Disabled_ on your Microsoft Purview account's Managed storage account.
- - Public network access is set to _Disabled_ on your Microsoft Purview account's Managed Event Hubs, if it's used.
-
-5. Select **Save**.
-
- :::image type="content" source="media/catalog-private-link/purview-firewall-save.png" alt-text="Screenshot showing the purview account firewall page, selecting save in the Azure portal.":::
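The same setting can be applied outside the portal with an ARM request against the account resource. This sketch only builds the request URL and body; the api-version shown is an assumption to verify against the current Microsoft.Purview REST reference, and actually sending the PATCH (with a bearer token) is omitted:

```python
MGMT = "https://management.azure.com"
API_VERSION = "2021-07-01"  # assumption: check the current Microsoft.Purview api-version

def firewall_patch(subscription_id, resource_group, account_name, public_access):
    """Build the ARM PATCH URL and body that toggle public network access.

    public_access is 'Enabled' or 'Disabled', mirroring the portal options.
    """
    assert public_access in ("Enabled", "Disabled")
    url = (
        f"{MGMT}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Purview/accounts/{account_name}"
        f"?api-version={API_VERSION}"
    )
    body = {"properties": {"publicNetworkAccess": public_access}}
    return url, body

# Placeholder names for illustration only.
url, body = firewall_patch("0000-sub", "my-rg", "my-purview", "Disabled")
```

Choosing `"Disabled"` here corresponds to the **Disabled from all networks** portal option.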
-
-## Next steps
-- [Deploy end to end private networking](./catalog-private-link-end-to-end.md)
-- [Deploy private networking for the Microsoft Purview governance portal](./catalog-private-link-account-portal.md)
purview Catalog Lineage User Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-lineage-user-guide.md
- Title: Data Catalog lineage user guide
-description: This article provides an overview of the catalog lineage feature of Microsoft Purview.
----- Previously updated : 06/01/2023-
-# Microsoft Purview Data Catalog lineage user guide
-
-This article provides an overview of the data lineage features in Microsoft Purview Data Catalog.
-
-## Background
-
-One of the platform features of Microsoft Purview is the ability to show the lineage between datasets created by data processes. Systems like Data Factory, Data Share, and Power BI capture the lineage of data as it moves. Custom lineage reporting is also supported via Atlas hooks and REST API.
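Custom lineage through the Atlas REST API mentioned above is typically reported as a `Process` entity whose `inputs` and `outputs` reference existing assets by qualified name. A sketch of such a payload, following the Apache Atlas v2 entity model; the type names and qualified names below are placeholders for illustration:

```python
def lineage_process(name, qualified_name, inputs, outputs):
    """Build an Atlas v2 'Process' entity linking input assets to output assets.

    `inputs` and `outputs` are lists of (typeName, qualifiedName) pairs that
    refer to assets already present in the catalog.
    """
    def ref(type_name, qn):
        # Atlas object reference by unique attribute (the qualified name).
        return {"typeName": type_name, "uniqueAttributes": {"qualifiedName": qn}}

    return {
        "entity": {
            "typeName": "Process",
            "attributes": {
                "qualifiedName": qualified_name,
                "name": name,
                "inputs": [ref(t, q) for t, q in inputs],
                "outputs": [ref(t, q) for t, q in outputs],
            },
        }
    }

# Placeholder qualified names for illustration only.
payload = lineage_process(
    "nightly_copy",
    "custom://jobs/nightly_copy",
    inputs=[("azure_blob_path", "https://src.blob.core.windows.net/raw/")],
    outputs=[("azure_blob_path", "https://dst.blob.core.windows.net/curated/")],
)
```

A payload like this would be POSTed to the Atlas v2 entity endpoint of the catalog; the process then appears as an edge between the referenced assets.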
-
-## Lineage collection
-
- Metadata collected in Microsoft Purview from enterprise data systems is stitched together to show end-to-end data lineage. Data systems that collect lineage into Microsoft Purview are broadly categorized into the following three types:
--
-Each system supports a different level of lineage scope. Check the sections below, or your system's individual lineage article, to confirm the scope of lineage currently available.
-
-### Known limitations
-
-* Database views used as the source of a process activity (Azure Data Factory, Synapse pipelines, Azure SQL Database, Azure Data Share) are currently captured as database table objects in Microsoft Purview. If the database is also scanned, the view assets are discovered separately in Microsoft Purview. In this scenario, two assets with the same name are captured in Microsoft Purview: one as a table with data lineage and another as a view.
-* If a stored procedure contains drop or create statements, they are not currently captured in lineage.
-
-### Data processing systems
-Data integration and ETL tools can push lineage into Microsoft Purview at execution time. Tools such as Data Factory, Data Share, Synapse, Azure Databricks, and so on, belong to this category of data processing systems. Data processing systems reference datasets as sources from different databases and storage solutions to create target datasets. The data processing systems currently integrated with Microsoft Purview for lineage are listed in the following table.
-
-| Data processing system | Supported scope |
-| - | |
-| Azure Data Factory | [Copy activity](how-to-link-azure-data-factory.md#copy-activity-support) <br> [Data flow activity](how-to-link-azure-data-factory.md#data-flow-support) <br> [Execute SSIS package activity](how-to-link-azure-data-factory.md#execute-ssis-package-support) |
-| Azure Synapse Analytics | [Copy activity](how-to-lineage-azure-synapse-analytics.md#copy-activity-support) <br> [Data flow activity](how-to-lineage-azure-synapse-analytics.md#data-flow-support) |
-| Azure SQL Database (Preview) | [Lineage extraction](register-scan-azure-sql-database.md?tabs=sql-authentication#lineagepreview) |
-| Azure Data Share | [Share snapshot](how-to-link-azure-data-share.md) |
-
-### Data storage systems
-Databases and storage solutions such as Oracle, Teradata, and SAP have query engines to transform data using a scripting language. Data lineage information from views, stored procedures, and so on is collected into Microsoft Purview and stitched with lineage from other systems. Lineage is supported for the following data sources via a Microsoft Purview data scan. Learn more about the supported lineage scenarios in the respective articles.
-
-|**Category**| **Data source** |
-|||
-|Azure| [Azure Databricks](register-scan-azure-databricks.md)
-|Database| [Cassandra](register-scan-cassandra-source.md)|
-|| [Db2](register-scan-db2.md) |
-|| [Google BigQuery](register-scan-google-bigquery-source.md)|
-|| [Hive Metastore Database](register-scan-hive-metastore-source.md) |
-|| [MySQL](register-scan-mysql.md) |
-|| [Oracle](register-scan-oracle-source.md) |
-|| [PostgreSQL](register-scan-postgresql.md) |
-|| [Snowflake](register-scan-snowflake.md) |
-|| [Teradata](register-scan-teradata-source.md)|
-|Services and apps| [Erwin](register-scan-erwin-source.md)|
-|| [Looker](register-scan-looker-source.md)|
-|| [SAP ECC](register-scan-sapecc-source.md)|
-|| [SAP S/4HANA](register-scan-saps4hana-source.md) |
-
-### Data analytics and reporting systems
-Data analytics and reporting systems like Azure Machine Learning and Power BI report lineage into Microsoft Purview. These systems use datasets from storage systems and process them through their metamodel to create BI dashboards, ML experiments, and so on.
-
-| Data analytics & reporting system | Supported scope |
-| - | |
-| Power BI | [Datasets, Dataflows, Reports & Dashboards](register-scan-power-bi-tenant.md)
-
-## Get started with lineage
-
-> [!VIDEO https://www.microsoft.com/videoplayer/embed/RWxTAK]
-
-Lineage in Microsoft Purview includes datasets and processes. Datasets are also referred to as nodes, while processes are also called edges:
-
-* **Dataset (Node)**: A dataset (structured or unstructured) provided as an input to a process. For example, a SQL Table, Azure blob, and files (such as .csv and .xml), are all considered datasets. In the lineage section of Microsoft Purview, datasets are represented by rectangular boxes.
-
-* **Process (Edge)**: An activity or transformation performed on a dataset is called a process. For example, ADF Copy activity, Data Share snapshot and so on. In the lineage section of Microsoft Purview, processes are represented by round-edged boxes.
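The node/edge model above can be pictured as a small graph structure. This sketch is purely illustrative (the class and field names are not Purview API types):

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """Node: shown as a rectangular box in the lineage canvas."""
    name: str

@dataclass
class Process:
    """Edge: shown as a round-edged box connecting datasets."""
    name: str
    inputs: list = field(default_factory=list)   # upstream Dataset nodes
    outputs: list = field(default_factory=list)  # downstream Dataset nodes

# A copy activity reading one dataset and producing another.
blob = Dataset("raw.csv")
table = Dataset("dbo.Sales")
copy = Process("ADF Copy activity", inputs=[blob], outputs=[table])

# Downstream lineage of `blob` flows through the process to `table`.
downstream = [d.name for d in copy.outputs if blob in copy.inputs]
```

In this model, walking from a dataset through its connected processes reproduces the upstream/downstream traversal the lineage canvas renders.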
-
-To access lineage information for an asset in Microsoft Purview, follow these steps:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
 - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. On the Microsoft Purview governance portal **Home** page, search for a dataset name or a process name, such as ADF Copy or Data Flow activity, and then press Enter.
-
-1. From the search results, select the asset and select its **Lineage** tab.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/select-lineage-from-asset.png" alt-text="Screenshot showing how to select the Lineage tab." border="true":::
-
-## Asset-level lineage
-
-Microsoft Purview supports asset-level lineage for datasets and processes. To see the asset-level lineage, go to the **Lineage** tab of the current asset in the catalog and select the current dataset asset node. By default, the list of columns belonging to the dataset appears in the left pane.
-
 :::image type="content" source="./media/catalog-lineage-user-guide/view-columns-from-lineage-inline.png" alt-text="Screenshot showing how to select View columns in the lineage page." lightbox="./media/catalog-lineage-user-guide/view-columns-from-lineage.png" border="true":::
-
-## Manual lineage
-
-Data lineage in Microsoft Purview is [automated](#lineage-collection) for many assets in on-premises, multicloud, and SaaS environments. While we continue to add more automated sources, manual lineage allows you to document lineage metadata for sources where automation isn't yet supported, without using any code.
-
-To add manual lineage for any of your assets, follow these steps:
-
-1. [Search for your asset in the data catalog](how-to-search-catalog.md) and select it to view details.
-1. Select **Edit**, navigate to the **Lineage** tab, and select **Add manual lineage** in the bottom panel.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/add-manual-lineage.png" alt-text="Screenshot of editing an asset and adding manual lineage.":::
-
-1. To configure the asset lineage:
-
   1. Select the asset dropdown to find the asset from the suggested list, or select **View more** to search the full catalog. Select the asset you'd like to link.
- 1. Select the swap icon to configure the relationship direction as **Produces** (for downstream lineage) or **Consumes** (for upstream lineage).
- 1. If you want to delete a lineage, select the trash can icon.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/select-asset-dropdown.png" alt-text="Screenshot of a data asset lineage page, with the asset dropdown highlighted.":::
-
-1. When you add lineage between two data assets, you can additionally configure the column-level lineage. Select the expand icon at the beginning of the row, then select the upstream and downstream columns from the corresponding dropdown lists to configure the column mapping. Select the plus icon to add more column lineage; select the trash bin icon to delete existing ones.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/add-column-lineage.png" alt-text="Screenshot of configuring column level lineage.":::
-
-1. You can add more asset level lineage by selecting the **Add manual lineage** button again. When you're finished, select the **Save** button to save your lineage and exit edit mode.
-
-### Known limitations of manual lineage
-
-* The current asset picker experience allows selecting only one asset at a time.
-* Column-level manual lineage is currently supported between two data assets, but not when a process asset is involved in between.
-* Data curator access is required for both source and target assets.
-* These asset types don't currently allow manual lineage because they support automated lineage:
- * Azure Data Factory
- * Synapse pipelines
- * Power BI datasets
- * Teradata stored procedure
- * Azure SQL stored procedure
-
-## Dataset column lineage
-
-To see column-level lineage of a dataset, go to the **Lineage** tab of the current asset in the catalog and follow these steps:
-
-1. Once you are in the lineage tab, in the left pane, select the check box next to each column you want to display in the data lineage.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/select-columns-to-show-in-lineage-inline.png" alt-text="Screenshot showing how to select columns to display in the lineage page." lightbox="./media/catalog-lineage-user-guide/select-columns-to-show-in-lineage.png":::
-
-1. Hover over a selected column on the left pane or in the dataset of the lineage canvas to see the column mapping. All the column instances are highlighted.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/show-column-flow-in-lineage-inline.png" alt-text="Screenshot showing how to hover over a column name to highlight the column flow in a data lineage path." lightbox="./media/catalog-lineage-user-guide/show-column-flow-in-lineage.png":::
-
-1. If the number of columns is larger than what can be displayed in the left pane, use the filter option to select a specific column by name. Alternatively, you can use your mouse to scroll through the list.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/filter-columns-by-name.png" alt-text="Screenshot showing how to filter columns by column name on the lineage page." lightbox="./media/catalog-lineage-user-guide/filter-columns-by-name.png":::
-
-1. If the lineage canvas contains more nodes and edges, use the filter to select data asset or process nodes by name. Alternatively, you can use your mouse to pan around the lineage window.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/filter-assets-by-name.png" alt-text="Screenshot showing data asset nodes by name on the lineage page." lightbox="./media/catalog-lineage-user-guide/filter-assets-by-name.png":::
-
-1. Use the toggle in the left pane to highlight the list of datasets in the lineage canvas. If you turn off the toggle, any asset that contains at least one of the selected columns is displayed. If you turn on the toggle, only datasets that contain all of the columns are displayed.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/use-toggle-to-filter-nodes.png" alt-text="Screenshot showing how to use the toggle to filter the list of nodes on the lineage page." lightbox="./media/catalog-lineage-user-guide/use-toggle-to-filter-nodes.png":::
-
-## Process column lineage
-
-You can also view data processes, like copy activities, in the data catalog. For example, in this lineage flow, select the copy activity:
--
-The copy activity will expand, and then you can select the **Switch to asset** button, which will give you more details about the process itself.
--
-A data process can take one or more input datasets to produce one or more outputs. In Microsoft Purview, column-level lineage is available for process nodes.
-
-1. Switch between input and output datasets from a drop-down in the columns panel.
-1. Select columns from one or more tables to see the lineage flowing from input dataset to corresponding output dataset.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/process-column-lineage-inline.png" alt-text="Screenshot showing columns lineage of a process node." lightbox="./media/catalog-lineage-user-guide/process-column-lineage.png":::
-
-## Browse assets in lineage
-
-1. Select **Switch to asset** on any asset to view its corresponding metadata from the lineage view. Doing so is an effective way to browse to another asset in the catalog from the lineage view.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/select-switch-to-asset-inline.png" alt-text="Screenshot how to select Switch to asset in a lineage data asset." lightbox="./media/catalog-lineage-user-guide/select-switch-to-asset.png":::
-
-1. The lineage canvas could become complex for popular datasets. To avoid clutter, the default view will only show five levels of lineage for the asset in focus. The rest of the lineage can be expanded by selecting the bubbles in the lineage canvas. Data consumers can also hide the assets in the canvas that are of no interest. To further reduce the clutter, turn off the toggle **More Lineage** at the top of lineage canvas. This action will hide all the bubbles in lineage canvas.
-
- :::image type="content" source="./media/catalog-lineage-user-guide/use-toggle-to-hide-bubbles-inline.png" alt-text="Screenshot showing how to toggle More lineage." lightbox="./media/catalog-lineage-user-guide/use-toggle-to-hide-bubbles.png":::
-
-1. Use the smart buttons in the lineage canvas to get an optimal view of the lineage:
- 1. Full screen
- 1. Zoom to fit
- 1. Zoom in/out
- 1. Auto align
- 1. Zoom preview
- 1. And more options:
- 1. Center the current asset
- 1. Reset to default view
-
- :::image type="content" source="./media/catalog-lineage-user-guide/use-lineage-smart-buttons-inline.png" alt-text="Screenshot showing how to select the lineage smart buttons." lightbox="./media/catalog-lineage-user-guide/use-lineage-smart-buttons.png":::
-
-## Next steps
-
-* [Link to Azure Data Factory for lineage](how-to-link-azure-data-factory.md)
-* [Link to Azure Data Share for lineage](how-to-link-azure-data-share.md)
purview Catalog Managed Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-managed-vnet.md
- Title: Managed Virtual Network and managed private endpoints
-description: This article describes Managed Virtual Network and managed private endpoints in Microsoft Purview.
----- Previously updated : 07/18/2023-
-# Customer intent: As a Microsoft Purview admin, I want to set up Managed Virtual Network and managed private endpoints for my Microsoft Purview account.
--
-# Use a Managed VNet with your Microsoft Purview account
-
-> [!IMPORTANT]
-> Currently, Managed Virtual Network and managed private endpoints are available for Microsoft Purview accounts that are deployed in the following regions:
-> - Australia East
-> - Canada Central
-> - East US
-> - East US 2
-> - North Europe
-> - West Europe
-
-## Conceptual overview
-
-This article describes how to configure Managed Virtual Network and managed private endpoints for Microsoft Purview.
-
-### Supported regions
-
-Currently, Managed Virtual Network and managed private endpoints are available for Microsoft Purview accounts that are deployed in the following regions:
-> - Australia East
-> - Canada Central
-> - East US
-> - East US 2
-> - North Europe
-> - West Europe
-
-### Supported data sources
-
-Currently, the following data sources are supported to have a managed private endpoint and can be scanned using Managed VNet Runtime in Microsoft Purview:
-
-- Azure Blob Storage
-- Azure Cosmos DB
-- Azure Data Lake Storage Gen 2
-- Azure Database for MySQL
-- Azure Database for PostgreSQL
-- Azure Dedicated SQL pool (formerly SQL DW)
-- Azure Files
-- Azure SQL Database
-- Azure SQL Managed Instance
-- Azure Synapse Analytics
-
-Additionally, you can deploy managed private endpoints for your Azure Key Vault resources if you need to run scans using any authentication options rather than Managed Identities, such as SQL Authentication or Account Key.
-
-### Managed Virtual Network
-
-A Managed Virtual Network in Microsoft Purview is a virtual network that is deployed and managed by Azure in the same region as the Microsoft Purview account, to allow scanning Azure data sources inside a managed network without the customer having to deploy and manage any self-hosted integration runtime virtual machines in Azure.
--
-You can deploy an Azure Managed Integration Runtime within a Microsoft Purview Managed Virtual Network. From there, the Managed VNet Runtime will use private endpoints to securely connect to and scan supported data sources.
-
-Creating a Managed VNet Runtime within Managed Virtual Network ensures that data integration process is isolated and secure.
-
-Benefits of using Managed Virtual Network:
-
-- With a Managed Virtual Network, you can offload the burden of managing the virtual network to Microsoft Purview. You don't need to create and manage VNets or subnets for the Azure integration runtime to use for scanning Azure data sources.
-- It doesn't require deep Azure networking knowledge to do data integrations securely. Using a Managed Virtual Network is much simpler for data engineers.
-- A Managed Virtual Network along with managed private endpoints protects against data exfiltration.
-
-> [!IMPORTANT]
-> Currently, the Managed Virtual Network is only supported in the same region as the Microsoft Purview account.
-
-> [!Note]
-> You cannot switch a global Azure integration runtime or self-hosted integration runtime to a Managed VNet Runtime and vice versa.
-
-A Managed VNet is created for your Microsoft Purview account when you create a Managed VNet Runtime for the first time in your Microsoft Purview account. You can't view or manage the Managed VNets.
-
-### Managed private endpoints
-
-Managed private endpoints are private endpoints created in the Microsoft Purview Managed Virtual Network establishing a private link to Microsoft Purview and Azure resources. Microsoft Purview manages these private endpoints on your behalf.
--
-Microsoft Purview supports private links. Private Link enables you to access Azure PaaS services such as Azure Storage, Azure Cosmos DB, and Azure Synapse Analytics.
-
-When you use a private link, traffic between your data sources and Managed Virtual Network traverses entirely over the Microsoft backbone network. Private Link protects against data exfiltration risks. You establish a private link to a resource by creating a private endpoint.
-
-Private endpoint uses a private IP address in the Managed Virtual Network to effectively bring the service into it. Private endpoints are mapped to a specific resource in Azure and not the entire service. Customers can limit connectivity to a specific resource approved by their organization. Learn more about [private links and private endpoints](../private-link/index.yml).
-
-> [!NOTE]
-> To reduce administrative overhead, it's recommended that you create managed private endpoints to scan all supported Azure data sources.
-
-> [!WARNING]
-> If an Azure PaaS data store (Blob, Azure Data Lake Storage Gen2, Azure Synapse Analytics) has a private endpoint already created against it, and even if it allows access from all networks, Microsoft Purview would only be able to access it using a managed private endpoint. If a private endpoint does not already exist, you must create one in such scenarios.
-
-A private endpoint connection is created in a "Pending" state when you create a managed private endpoint in Microsoft Purview. An approval workflow is initiated. The private link resource owner is responsible for approving or rejecting the connection.
--
-If the owner approves the connection, the private link is established. Otherwise, the private link won't be established. In either case, the Managed private endpoint will be updated with the status of the connection.
--
-Only a Managed private endpoint in an approved state can send traffic to a given private link resource.
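The approval step above sets the connection state on the target resource's private endpoint connection. A sketch of the state body used in such an ARM update; the `status` values mirror the Pending/Approved/Rejected states described above, and the description text is illustrative:

```python
def approval_body(approve, reason):
    """Build the privateLinkServiceConnectionState body a resource owner
    submits to approve or reject a pending private endpoint connection."""
    return {
        "properties": {
            "privateLinkServiceConnectionState": {
                "status": "Approved" if approve else "Rejected",
                "description": reason,  # free-text justification
            }
        }
    }

body = approval_body(True, "Approved for Purview Managed VNet scanning")
```

Only after a connection reaches the `Approved` state can the managed private endpoint carry traffic to the private link resource.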
-
-### Interactive authoring
-
-Interactive authoring capabilities are used for functionalities like test connection, browsing folder and table lists, getting schema, and previewing data. You can enable interactive authoring when creating or editing an Azure integration runtime that is in a Purview Managed Virtual Network. The backend service pre-allocates compute for interactive authoring functionalities; otherwise, compute is allocated every time an interactive operation is performed, which takes more time. The Time To Live (TTL) for interactive authoring is 60 minutes, which means it's automatically disabled 60 minutes after the last interactive authoring operation.
--
-## Deployment Steps
-
-### Prerequisites
-
-Before deploying a Managed VNet and Managed VNet Runtime for a Microsoft Purview account, ensure you meet the following prerequisites:
-
-1. A Microsoft Purview account deployed in one of the [supported regions](#supported-regions).
-2. From Microsoft Purview roles, you must be a data curator at the root collection level in your Microsoft Purview account.
-3. From Azure RBAC roles, you must be a Contributor on the Microsoft Purview account and the data source to approve private links.
-
-### Deploy Managed VNet Runtimes
-
-> [!NOTE]
-> The following guide shows how to register and scan an Azure Data Lake Storage Gen 2 using Managed VNet Runtime.
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
 - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-azure-portal.png" alt-text="Screenshot that shows the Microsoft Purview account":::
-
-2. Navigate to the **Data Map --> Integration runtimes**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-vnet.png" alt-text="Screenshot that shows Microsoft Purview Data Map menus":::
-
-3. From the **Integration runtimes** page, select the **+ New** icon to create a new runtime. Select **Azure**, and then select **Continue**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-ir-create.png" alt-text="Screenshot that shows how to create new Azure runtime":::
-
-4. Provide a name for your Managed VNet Runtime, select the region and configure interactive authoring. Select **Create**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-ir-region.png" alt-text="Screenshot that shows to create a Managed VNet Runtime":::
-
-5. Deploying the Managed VNet Runtime for the first time triggers multiple workflows in the Microsoft Purview governance portal to create managed private endpoints for Microsoft Purview and its managed storage account. Select each workflow to approve the private endpoint for the corresponding Azure resource.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-ir-workflows.png" alt-text="Screenshot that shows deployment of a Managed VNet Runtime":::
-
-6. In the Azure portal, from your Microsoft Purview account resource page, approve the managed private endpoint. From the managed storage account page, approve the managed private endpoints for blob and queue.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Microsoft Purview":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview-approved.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Microsoft Purview - approved":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-managed-storage.png" alt-text="Screenshot that shows how to approve a managed private endpoint for managed storage account":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-managed-storage-approved.png" alt-text="Screenshot that shows how to approve a managed private endpoint for managed storage account - approved":::
-
-7. From **Management**, select **Managed private endpoints** to validate that all managed private endpoints are successfully deployed and approved. All private endpoints must be approved.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list.png" alt-text="Screenshot that shows managed private endpoints in Microsoft Purview":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-approved.png" alt-text="Screenshot that shows managed private endpoints in Microsoft Purview - approved ":::
-
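If you prefer to script the approvals in step 6, pending private endpoint connections on the Microsoft Purview account can also be listed and approved with the Azure CLI. This is a sketch only; the account, resource group, and connection identifiers below are placeholders you'd replace with your own:

```shell
# List pending private endpoint connections on the Purview account
# (account and resource group names are placeholders)
az network private-endpoint-connection list \
  --name <purview-account-name> \
  --resource-group <resource-group> \
  --type Microsoft.Purview/accounts \
  --query "[].{name:name, status:properties.privateLinkServiceConnectionState.status}" -o table

# Approve a specific pending connection by its resource ID
az network private-endpoint-connection approve \
  --id <private-endpoint-connection-resource-id> \
  --description "Approved for Managed VNet Runtime"
```

Repeat the same approve call for the blob and queue connections on the managed storage account, using their connection resource IDs.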
-### Deploy managed private endpoints for data sources
-
-You can use managed private endpoints to connect your data sources to ensure data security during transmission. If your data source allows public access and you want to connect over the public network, you can skip this step. A scan run can execute as long as the integration runtime can connect to your data source.
-
-To deploy and approve a managed private endpoint for a data source, follow these steps, selecting the data source of your choice from the list:
-
-1. Navigate to **Management**, and select **Managed private endpoints**.
-
-2. Select **+ New**.
-
-3. From the list of supported data sources, select the type that corresponds to the data source you're planning to scan using Managed VNet Runtime.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source.png" alt-text="Screenshot that shows how to create a managed private endpoint for data sources":::
-
-4. Provide a name for the managed private endpoint, and select the Azure subscription and the data source from the drop-down lists. Select **Create**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-pe.png" alt-text="Screenshot that shows how to select data source for setting managed private endpoint":::
-
-5. From the list of managed private endpoints, select the newly created managed private endpoint for your data source, and then select **Manage approvals in the Azure portal** to approve the private endpoint in the Azure portal.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-approval.png" alt-text="Screenshot that shows the approval for managed private endpoint for data sources":::
-
-6. Selecting the link redirects you to the Azure portal. Under **Private endpoint connections**, select the newly created private endpoint, and then select **Approve**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-pe-azure.png" alt-text="Screenshot that shows how to approve a private endpoint for data sources in Azure portal":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-pe-azure-approved.png" alt-text="Screenshot that shows approved private endpoint for data sources in Azure portal":::
-
-7. Inside the Microsoft Purview governance portal, the managed private endpoint should now show as approved as well.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-2.png" alt-text="Screenshot that shows managed private endpoints including data sources' in Purview governance portal":::
-
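The portal approval in step 6 can also be done from the data source side with the Azure CLI. The sketch below assumes an ADLS Gen2 storage account as the source; the account, resource group, and connection names are placeholders:

```shell
# Approve the pending private endpoint connection on the storage account
# (all names below are placeholders for your own resources)
az network private-endpoint-connection approve \
  --resource-name <storage-account-name> \
  --resource-group <resource-group> \
  --type Microsoft.Storage/storageAccounts \
  --name <connection-name> \
  --description "Approved for Purview Managed VNet scans"
```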
-### Register and scan a data source using Managed VNet Runtime
-
-#### Register data source
-It's important to register the data source in Microsoft Purview prior to setting up a scan for it. Follow these steps to register your data source if you haven't already registered it.
-
-1. Go to your Microsoft Purview account.
-1. Select **Data Map** on the left menu.
-1. Select **Register**.
-2. On **Register sources**, select your data source
-3. Select **Continue**.
-4. On the **Register sources** screen, do the following:
-
- 1. In the **Name** box, enter a name that the data source will be listed with in the catalog.
- 2. In the **Subscription** dropdown list box, select a subscription.
- 3. In the **Select a collection** box, select a collection.
- 4. Select **Register** to register the data sources.
-
-For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md).
-
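Registration can also be scripted against the Purview scanning data-plane REST API, for example via `az rest`. The sketch below is illustrative only: the account, source, storage, and collection names are placeholders, and the payload shape and `api-version` should be checked against the current Purview Scanning REST reference before use.

```shell
# Register an ADLS Gen2 source with the Purview scanning data-plane API
# (all names and the api-version are illustrative placeholders)
az rest --method put \
  --resource "https://purview.azure.net" \
  --url "https://<purview-account-name>.purview.azure.com/scan/datasources/<source-name>?api-version=2022-02-01-preview" \
  --body '{
    "kind": "AdlsGen2",
    "properties": {
      "endpoint": "https://<storage-account-name>.dfs.core.windows.net/",
      "collection": { "referenceName": "<collection-name>", "type": "CollectionReference" }
    }
  }'
```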
-#### Scan data source
-
-You can use any of the following options to scan data sources using Microsoft Purview Managed VNet Runtime:
-
-- [Using Managed Identity](#scan-using-managed-identity) (Recommended) - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in the Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform the scans.
-
-- [Using other authentication options](#scan-using-other-authentication-options):
-
  - Account key or SQL authentication - You can create secrets inside an Azure Key Vault to store credentials, so that Microsoft Purview can access and scan data sources securely using the secrets. A secret can be a storage account key or a SQL login password.
-
  - Service Principal - In this method, you can create a new service principal or use an existing one in your Azure Active Directory tenant.
-
-##### Scan using Managed Identity
-
-To scan a data source using a Managed VNet Runtime and the Microsoft Purview managed identity, follow these steps:
-
-1. Select the **Data Map** tab on the left pane in the Microsoft Purview governance portal.
-
-1. Select the data source that you registered.
-
-1. Select **View details** > **+ New scan**, or use the **Scan** quick-action icon on the source tile.
-
-1. Provide a **Name** for the scan.
-
-1. Under **Connect via integration runtime**, select the newly created Managed VNet Runtime.
-
-1. For **Credential**, select the managed identity, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-scan.png" alt-text="Screenshot that shows how to create a new scan using Managed VNet":::
-
-1. Follow the steps to select the appropriate scan rule and scope for your scan.
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-scan-run.png" alt-text="review scan":::
-
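For the managed identity scan to succeed, the Purview SAMI also needs an RBAC role on the source; for an ADLS Gen2 account that is the **Storage Blob Data Reader** role. A minimal sketch with the Azure CLI (requires the `purview` CLI extension; every resource name below is a placeholder):

```shell
# Object ID of the Purview account's system-assigned managed identity
PURVIEW_MI=$(az purview account show \
  --name <purview-account-name> \
  --resource-group <resource-group> \
  --query "identity.principalId" -o tsv)

# Grant the SAMI read access to the ADLS Gen2 account being scanned
az role assignment create \
  --assignee-object-id "$PURVIEW_MI" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
```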
-##### Scan using other authentication options
-
-You can also use other supported options to scan data sources using Microsoft Purview Managed Runtime. This requires setting up a private connection to Azure Key Vault where the secret is stored.
-
-To set up a scan using Account Key or SQL Authentication follow these steps:
-
-1. [Grant Microsoft Purview access to your Azure Key Vault](manage-credentials.md#grant-microsoft-purview-access-to-your-azure-key-vault).
-
-2. [Create a new credential in Microsoft Purview](manage-credentials.md#create-a-new-credential).
-
-3. Navigate to **Management**, and select **Managed private endpoints**.
-
-4. Select **+ New**.
-
-5. From the list of supported data sources, select **Key Vault**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault.png" alt-text="Screenshot that shows how to create a managed private endpoint for Azure Key Vault":::
-
-6. Provide a name for the managed private endpoint, and select the Azure subscription and the Azure Key Vault from the drop-down lists. Select **Create**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-create.png" alt-text="Screenshot that shows how to create a managed private endpoint for Azure Key Vault in the Microsoft Purview governance portal":::
-
-7. From the list of managed private endpoints, select the newly created managed private endpoint for your Azure Key Vault, and then select **Manage approvals in the Azure portal** to approve the private endpoint in the Azure portal.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-approve.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Azure Key Vault":::
-
-8. Selecting the link redirects you to the Azure portal. Under **Private endpoint connections**, select the newly created private endpoint for your Azure Key Vault, and then select **Approve**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-az-approve.png" alt-text="Screenshot that shows how to approve a private endpoint for an Azure Key Vault in Azure portal":::
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-az-approved.png" alt-text="Screenshot that shows approved private endpoint for Azure Key Vault in Azure portal":::
-
-9. Inside the Microsoft Purview governance portal, the managed private endpoint should now show as approved as well.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-3.png" alt-text="Screenshot that shows managed private endpoints including Azure Key Vault in Purview governance portal":::
-
-10. Select the **Data Map** tab on the left pane in the Microsoft Purview governance portal.
-
-11. Select the data source that you registered.
-
-12. Select **View details** > **+ New scan**, or use the **Scan** quick-action icon on the source tile.
-
-13. Provide a **Name** for the scan.
-
-14. Under **Connect via integration runtime**, select the newly created Managed VNet Runtime.
-
-15. For **Credential**, select the credential you registered earlier, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-scan.png" alt-text="Screenshot that shows how to create a new scan using Managed VNet and a SPN":::
-
-16. Follow the steps to select the appropriate scan rule and scope for your scan.
-
-17. Choose your scan trigger. You can set up a schedule or run the scan once.
-
-18. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-scan-spn-run.png" alt-text="review scan using SPN":::
-
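Steps 1 and 2 (granting Key Vault access and creating the credential's secret) can also be scripted. A sketch with the Azure CLI, assuming a vault using access policies; the vault, secret, and identity names are placeholders:

```shell
# Store the storage account key (or SQL password) as a Key Vault secret
az keyvault secret set \
  --vault-name <key-vault-name> \
  --name <secret-name> \
  --value "<storage-account-key-or-password>"

# Allow the Purview managed identity to read secrets from the vault
az keyvault set-policy \
  --name <key-vault-name> \
  --object-id <purview-managed-identity-object-id> \
  --secret-permissions get list
```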
-## Next steps
-
-- [Manage data sources in Microsoft Purview](manage-data-sources.md)
purview Catalog Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-permissions.md
- Title: Understand access and permissions in the Microsoft Purview governance portal
-description: This article gives an overview of permissions, access control, and collections in the Microsoft Purview governance portal. Role-based access control is managed within the Microsoft Purview Data Map in the governance portal itself, so this guide will cover the basics to secure your information.
- Previously updated: 12/19/2022
-# Access control in the Microsoft Purview governance portal
-
-The Microsoft Purview governance portal uses **Collections** in the Microsoft Purview Data Map to organize and manage access across its sources, assets, and other artifacts. This article describes collections and access management for your account in the Microsoft Purview governance portal.
-
-> [!IMPORTANT]
-> This article refers to permissions required for the Microsoft Purview governance portal, and applications like the Microsoft Purview Data Map, Data Catalog, Data Estate Insights, etc. If you are looking for permissions information for the Microsoft Purview compliance center, follow [the article for permissions in the Microsoft Purview compliance portal](/microsoft-365/compliance/microsoft-365-compliance-center-permissions).
-
-## Permissions to access the Microsoft Purview governance portal
-
-There are two main ways to access the Microsoft Purview governance portal, and you'll need specific permissions for either:
-
-- To access your Microsoft Purview governance portal directly at [https://web.purview.azure.com](https://web.purview.azure.com), you'll need at least a [reader role](#roles) on a collection in your Microsoft Purview Data Map.
-- To access your Microsoft Purview governance portal through the [Azure portal](https://portal.azure.com) by searching for your Microsoft Purview account, opening it, and selecting **Open Microsoft Purview governance portal**, you'll need at least a **Reader** role under **Access Control (IAM)**.
-> [!NOTE]
-> If you created your account using a service principal, to be able to access the Microsoft Purview governance portal you will need to [grant a user collection admin permissions on the root collection](#administrator-change).
-
-## Collections
-
-A collection is a tool that the Microsoft Purview Data Map uses to group assets, sources, and other artifacts into a hierarchy for discoverability and to manage access control. All accesses to the Microsoft Purview governance portal's resources are managed from collections in the Microsoft Purview Data Map.
-
-## Roles
-
-The Microsoft Purview governance portal uses a set of predefined roles to control who can access what within the account. These roles are currently:
--- **Collection administrator** - a role for users that will need to assign roles to other users in the Microsoft Purview governance portal or manage collections. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
- A collection administrator on the [root collection](reference-azure-purview-glossary.md#root-collection) also automatically has permission to the Microsoft Purview governance portal. If your **root collection administrator** ever needs to be changed, you can [follow the steps in the section below](#administrator-change).
-- **Data curators** - a role that provides access to the data catalog to manage assets, configure custom classifications, set up glossary terms, and view data estate insights. Data curators can create, read, modify, move, and delete assets. They can also apply annotations to assets.
-- **Data readers** - a role that provides read-only access to data assets, classifications, classification rules, collections, and glossary terms.
-- **Data source administrator** - a role that allows a user to manage data sources and scans. If a user is granted only the **Data source admin** role on a given data source, they can run new scans using an existing scan rule. To create new scan rules, the user must also be granted either the **Data reader** or **Data curator** role.
-- **Insights reader** - a role that provides read-only access to insights reports for collections where the insights reader also has at least the **Data reader** role. For more information, see [insights permissions](insights-permissions.md).
-- **Policy author (Preview)** - a role that allows a user to view, update, and delete Microsoft Purview policies through the policy management app within Microsoft Purview.
-- **Workflow administrator** - a role that allows a user to access the workflow authoring page in the Microsoft Purview governance portal, and publish workflows on collections where they have access permissions. Workflow administrators only have access to authoring, and so will need at least Data reader permission on a collection to be able to access the Purview governance portal.
-> [!NOTE]
-> At this time, Microsoft Purview policy author role is not sufficient to create policies. The Microsoft Purview data source admin role is also required.
-
-## Who should be assigned to what role?
-
-|User Scenario|Appropriate Role(s)|
-|-|--|
-|I just need to find assets, I don't want to edit anything|Data reader|
-|I need to edit and manage information about assets|Data curator|
-|I want to create custom classifications | Data curator **or** data source administrator |
-|I need to edit the business glossary |Data curator|
-|I need to view Data Estate Insights to understand the governance posture of my data estate|Data curator|
-|My application's Service Principal needs to push data to the Microsoft Purview Data Map|Data curator|
-|I need to set up scans via the Microsoft Purview governance portal|Data curator on the collection **or** data curator **and** data source administrator where the source is registered.|
-|I need to enable a Service Principal or group to set up and monitor scans in the Microsoft Purview Data Map without allowing them to access the catalog's information |Data source administrator|
-|I need to put users into roles in the Microsoft Purview governance portal| Collection administrator |
-|I need to create and publish access policies | Data source administrator and policy author |
-|I need to create workflows for my Microsoft Purview account in the governance portal| Workflow administrator |
-|I need to share data from sources registered in Microsoft Purview | Data reader |
-|I need to receive shared data in Microsoft Purview | Data reader |
-|I need to view insights for collections I'm a part of | Insights reader **or** data curator |
-|I need to create or manage our [self-hosted integration runtime (SHIR)](manage-integration-runtimes.md) | Data source administrator |
-|I need to create managed private endpoints | Data source administrator |
-
->[!NOTE]
-> **Data curator** - Data curators can read insights only if they are assigned the data curator role at the root collection level.
-> **Data source administrator permissions on policies** - Data source administrators are also able to publish data policies.
-
-## Understand how to use the Microsoft Purview governance portal's roles and collections
-
-All access control is managed through collections in the Microsoft Purview Data Map. The collections can be found in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/). Open your account in the [Azure portal](https://portal.azure.com) and select the Microsoft Purview governance portal tile on the Overview page. From there, navigate to the data map on the left menu, and then select the 'Collections' tab.
-
-When a Microsoft Purview (formerly Azure Purview) account is created, it starts with a root collection that has the same name as the account itself. The creator of the account is automatically added as a Collection Admin, Data Source Admin, Data Curator, and Data Reader on this root collection, and can edit and manage this collection.
-
-Sources, assets, and objects can be added directly to this root collection, but so can other collections. Adding collections will give you more control over who has access to data across your account.
-
-All other users can only access information within the Microsoft Purview governance portal if they, or a group they're in, are given one of the above roles. This means, when you create an account, no one but the creator can access or use its APIs until they're [added to one or more of the above roles in a collection](how-to-create-and-manage-collections.md#add-role-assignments).
-
-Users can only be added to a collection by a collection admin, or through permissions inheritance. The permissions of a parent collection are automatically inherited by its subcollections. However, you can choose to [restrict permission inheritance](how-to-create-and-manage-collections.md#restrict-inheritance) on any collection. If you do this, its subcollections will no longer inherit permissions from the parent and will need to be added directly, though collection admins that are automatically inherited from a parent collection can't be removed.
-
-You can assign roles to users, security groups, and service principals from your Azure Active Directory that is associated with your subscription.
-
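Role assignments target Azure AD principals, so you often need the object IDs of the users, groups, or service principals involved. They can be looked up with the Azure CLI; the identities below are hypothetical examples:

```shell
# Object ID of a user (hypothetical UPN)
az ad user show --id jane@contoso.com --query id -o tsv

# Object ID of a security group (hypothetical display name)
az ad group show --group "Purview Data Readers" --query id -o tsv

# Object ID of a service principal, looked up by its application (client) ID
az ad sp show --id <application-client-id> --query id -o tsv
```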
-## Assign permissions to your users
-
-After creating a Microsoft Purview (formerly Azure Purview) account, the first thing to do is create collections and assign users to roles within those collections.
-
-> [!NOTE]
-> If you created your account using a service principal, to be able to access the Microsoft Purview governance portal and assign permissions to users, you will need to [grant a user collection admin permissions on the root collection](#administrator-change).
-
-### Create collections
-
-Collections can be customized to the structure of the sources in your Microsoft Purview Data Map, and can act like organized storage bins for these resources. When you're thinking about the collections you might need, consider how your users will access or discover information. Are your sources broken up by departments? Are there specialized groups within those departments that will only need to discover some assets? Are there some sources that should be discoverable by all your users?
-
-This will inform the collections and subcollections you may need to most effectively organize your data map.
-
-New collections can be added directly to the data map, where you can choose their parent collection from a drop-down, or they can be added from the parent as a subcollection. In the data map view, you can see all your sources and assets ordered by collection, and in the list view each source's collection is listed.
-
-For more instructions and information, you can follow our [guide for creating and managing collections](how-to-create-and-manage-collections.md).
-
-#### A collections example
-
-Now that we have a base understanding of collections, permissions, and how they work, let's look at an example.
--
-This is one way an organization might structure their data: Starting with their root collection (Contoso, in this example), collections are organized into regions, and then into departments and subdepartments. Data sources and assets can be added to any one of these collections to organize data resources by these regions and departments, and to manage access control along those lines. There's one subdepartment, Revenue, that has strict access guidelines, so permissions will need to be tightly managed.
-
-The [data reader role](#roles) can access information within the catalog, but not manage or edit it. So for our example above, adding the Data Reader permission to a group on the root collection and allowing inheritance will give all users in that group reader permissions on sources and assets in the Microsoft Purview Data Map. This makes these resources discoverable, but not editable, by everyone in that group. [Restricting inheritance](how-to-create-and-manage-collections.md#restrict-inheritance) on the Revenue group will control access to those assets. Users who need access to revenue information can be added separately to the Revenue collection.
Similarly with the Data Curator and Data Source Admin roles, permissions for those groups will start at the collection where they're assigned and trickle down to subcollections that haven't restricted inheritance. Below we have assigned permissions for several groups at collection levels in the Americas subcollection.
--
-### Add users to roles
-
-Role assignment is managed through the collections. Only a user with the [collection admin role](#roles) can grant permissions to other users on that collection. When new permissions need to be added, a collection admin will access the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), navigate to data map, then the collections tab, and select the collection where a user needs to be added. From the Role Assignments tab, they'll be able to add and manage users who need permissions.
-
-For full instructions, see our [how-to guide for adding role assignments](how-to-create-and-manage-collections.md#add-role-assignments).
-
-## Administrator change
-
-There may be a time when your [root collection admin](#roles) needs to change, or an admin needs to be added after an account is created by an application. By default, the user who creates the account is automatically assigned collection admin to the root collection. To update the root collection admin, there are four options:
--- You can manage root collection administrators in the [Azure portal](https://portal.azure.com/):
- 1. Sign in to the Azure portal and search for your Microsoft Purview account.
- 1. Select **Root collection permission** from the left-side menu on your Microsoft Purview account page.
- 1. Select **Add root collection admin** to add an administrator.
- :::image type="content" source="./media/catalog-permissions/root-collection-admin.png" alt-text="Screenshot of a Microsoft Purview account page in the Azure portal with the Root collection permission page selected and the Add root collection admin option highlighted." border="true":::
- 1. You can also select **View all root collection admins** to be taken to the root collection in the Microsoft Purview governance portal.
-
-- You can [assign permissions through the Microsoft Purview governance portal](how-to-create-and-manage-collections.md#add-role-assignments) as you have for any other role.
-
-- You can use the REST API to add a collection administrator. Instructions to use the REST API to add a collection admin can be found in our [REST API for collections documentation](tutorial-metadata-policy-collections-apis.md#add-the-root-collection-administrator-role). For additional information, you can see our [REST API reference](/rest/api/purview/accounts/add-root-collection-admin).
-
-- You can also use [the below Azure CLI command](/cli/azure/purview/account#az-purview-account-add-root-collection-admin). The object-id is optional. For more information and an example, see the [CLI command reference page](/cli/azure/purview/account#az-purview-account-add-root-collection-admin).
- ```azurecli
- az purview account add-root-collection-admin --account-name [Microsoft Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
- ```
-
-## Next steps
-
-Now that you have a base understanding of collections, and access control, follow the guides below to create and manage those collections, or get started with registering sources into your Microsoft Purview Data Map.
-
-- [How to create and manage collections](how-to-create-and-manage-collections.md)
-- [Supported data sources in the Microsoft Purview Data Map](azure-purview-connector-overview.md)
purview Catalog Private Link Account Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-account-portal.md
- Title: Connect privately and securely to your Microsoft Purview account
-description: This article describes how you can set up a private endpoint to connect to your Microsoft Purview account from restricted network.
- Previously updated: 12/02/2022
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account for secure access.
--
-# Connect privately and securely to your Microsoft Purview account
-In this guide, you'll learn how to deploy private endpoints for your Microsoft Purview account so that it can be reached only from VNets and private networks. To achieve this goal, you need to deploy _account_ and _portal_ private endpoints for your Microsoft Purview account.
-
-The Microsoft Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-
-The Microsoft Purview _portal_ private endpoint is required to enable connectivity to [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/) using a private network.
-
-> [!NOTE]
-> If you only create _account_ and _portal_ private endpoints, you won't be able to run any scans. To enable scanning on a private network, you will also need to [create an ingestion private endpoint](catalog-private-link-end-to-end.md).
-
- :::image type="content" source="media/catalog-private-link/purview-private-link-account-portal.png" alt-text="Diagram that shows Microsoft Purview and Private Link architecture.":::
-
-For more information about the Azure Private Link service, see [private links and private endpoints](../private-link/private-endpoint-overview.md).
-
-## Deployment checklist
-Using one of the deployment options from this guide, you can deploy a new Microsoft Purview account with _account_ and _portal_ private endpoints or you can choose to deploy these private endpoints for an existing Microsoft Purview account:
-
-1. Choose an appropriate Azure virtual network and a subnet to deploy Microsoft Purview private endpoints. Select one of the following options:
- - Deploy a [new virtual network](../virtual-network/quick-create-portal.md) in your Azure subscription.
- - Locate an existing Azure virtual network and a subnet in your Azure subscription.
-
-2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so Microsoft Purview account and web portal can be accessible through private IP addresses. You can use any of the following options:
- - Deploy new Azure DNS zones using the steps explained further in this guide.
- - Add required DNS records to existing Azure DNS zones using the steps explained further in this guide.
- - After completing the steps in this guide, add required DNS A records in your existing DNS servers manually.
-3. Deploy a [new Microsoft Purview account](#option-1deploy-a-new-microsoft-purview-account-with-account-and-portal-private-endpoints) with account and portal private endpoints, or deploy account and portal private endpoints for an [existing Microsoft Purview account](#option-2enable-account-and-portal-private-endpoint-on-existing-microsoft-purview-accounts).
-4. [Enable access to Azure Active Directory](#enable-access-to-azure-active-directory) if your private network has network security group rules set to deny for all public internet traffic.
-5. After completing this guide, adjust DNS configurations if needed.
-6. Validate your network and name resolution from management machine to Microsoft Purview.
-
-## Option 1 - Deploy a new Microsoft Purview account with _account_ and _portal_ private endpoints
-
-1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Microsoft Purview accounts** page. Select **+ Create** to create a new Microsoft Purview account.
-
-2. Fill in the basic information, and on the **Networking** tab, set the connectivity method to **Private endpoint**. Set **Enable private endpoint** to **Account and Portal only**.
-
-3. Under **Account and portal** select **+ Add** to add a private endpoint for your Microsoft Purview account.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-account-portal.png" alt-text="Screenshot that shows create private endpoint for account and portal page selections.":::
-
-4. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-account.png" alt-text="Screenshot that shows create account private endpoint page.":::
-
- > [!NOTE]
- > You can also use your existing Azure Private DNS Zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-5. Select **OK**.
-
-6. In the **Create Microsoft Purview account** wizard, select **+ Add** again to add the _portal_ private endpoint.
-
-7. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for the _portal_ private endpoint and select **portal**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-portal.png" alt-text="Screenshot that shows create portal private endpoint page.":::
-
- > [!NOTE]
- > You can also use your existing Azure Private DNS Zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-8. Select **OK**.
-
-9. Select **Review + Create**. On the **Review + Create** page, Azure validates your configuration.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-account-portal-2.png" alt-text="Screenshot that shows create private endpoint review page.":::
--
-10. When you see the "Validation passed" message, select **Create**.
-
-## Option 2 - Enable _account_ and _portal_ private endpoint on existing Microsoft Purview accounts
-
-There are two ways you can add Microsoft Purview _account_ and _portal_ private endpoints for an existing Microsoft Purview account:
-
-- Use the Azure portal (Microsoft Purview account).
-- Use the Private Link Center.
-
-### Use the Azure portal (Microsoft Purview account)
-
-1. Go to the [Azure portal](https://portal.azure.com), select your Microsoft Purview account, and under **Settings**, select **Networking** > **Private endpoint connections**.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-add-to-existing.png" alt-text="Screenshot that shows creating an account private endpoint.":::
-
-2. Select **+ Private endpoint** to create a new private endpoint.
-
-3. Fill in the basic information.
-
-4. On the **Resource** tab, for **Resource type**, select **Microsoft.Purview/accounts**.
-
-5. For **Resource**, select the Microsoft Purview account, and for **Target sub-resource**, select **account**.
-
-6. On the **Configuration** tab, select the virtual network and optionally, select Azure Private DNS zone to create a new Azure DNS Zone.
-
- > [!NOTE]
- > For DNS configuration, you can also use your existing Azure Private DNS Zones from the dropdown list or add the required DNS records to your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-7. Go to the summary page, and select **Create** to create the portal private endpoint.
-
-8. Follow the same steps when you select **portal** for **Target sub-resource**.
-
-### Use the Private Link Center
-
-1. Go to the [Azure portal](https://portal.azure.com).
-
-1. In the search bar at the top of the page, search for **private link** and go to the **Private Link** pane by selecting the first option.
-
-1. Select **+ Add**, and fill in the basic details.
-
- :::image type="content" source="media/catalog-private-link/private-link-center.png" alt-text="Screenshot that shows creating private endpoints from the Private Link Center.":::
-
-1. For **Resource**, select the already created Microsoft Purview account. For **Target sub-resource**, select **account**.
-
-1. On the **Configuration** tab, select the virtual network and private DNS zone. Go to the summary page, and select **Create** to create the account private endpoint.
-
-> [!NOTE]
-> Follow the same steps when you select **portal** for **Target sub-resource**.
-
-## Enable access to Azure Active Directory
-
-> [!NOTE]
-> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Microsoft Purview portal and the Microsoft Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
-
-These instructions are provided for accessing Microsoft Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
-
-1. Go to your VM in the Azure portal, and under **Settings**, select **Networking**. Then select **Outbound port rules**, **Add outbound port rule**.
-
- :::image type="content" source="media/catalog-private-link/outbound-rule-add.png" alt-text="Screenshot that shows adding an outbound rule.":::
-
-1. On the **Add outbound security rule** pane:
-
- 1. Under **Destination**, select **Service Tag**.
- 1. Under **Destination service tag**, select **AzureActiveDirectory**.
- 1. Under **Destination port ranges**, enter `*`.
- 1. Under **Action**, select **Allow**.
- 1. Under **Priority**, enter a value lower than the priority of the rule that denies all internet traffic (rules with lower numbers are evaluated first, so the allow rule takes precedence).
-
- Create the rule.
-
- :::image type="content" source="media/catalog-private-link/outbound-rule-details.png" alt-text="Screenshot that shows adding outbound rule details.":::
-
-1. Follow the same steps to create another rule to allow the **AzureResourceManager** service tag. If you need to access the Azure portal, you can also add a rule for the **AzurePortal** service tag.
-
-1. Connect to the VM and open the browser. Go to the browser console by selecting Ctrl+Shift+J, and switch to the network tab to monitor network requests. Enter web.purview.azure.com in the URL box, and try to sign in by using your Azure AD credentials. Sign-in will probably fail, and on the **Network** tab on the console, you can see Azure AD trying to access aadcdn.msauth.net but getting blocked.
-
- :::image type="content" source="media/catalog-private-link/login-fail.png" alt-text="Screenshot that shows sign-in fail details.":::
-
-1. In this case, open a command prompt on the VM, ping aadcdn.msauth.net, get its IP, and then add an outbound port rule for the IP in the VM's network security rules. Set the **Destination** to **IP Addresses** and set **Destination IP addresses** to the aadcdn IP. Because of Azure Load Balancer and Azure Traffic Manager, the Azure AD Content Delivery Network IP might be dynamic. After you get its IP, it's better to add it to the VM's hosts file to force the browser to visit that IP for the Azure AD Content Delivery Network.
-
- :::image type="content" source="media/catalog-private-link/ping.png" alt-text="Screenshot that shows the test ping.":::
-
- :::image type="content" source="media/catalog-private-link/aadcdn-rule.png" alt-text="Screenshot that shows the Azure A D Content Delivery Network rule.":::
-
-1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Microsoft Purview portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
-
- Check the networking requests on the browser **Networking** tab to see which domain's requests are getting blocked, redo the previous step to get its IP, and add outbound port rules in the network security group to allow requests for that IP. If possible, add the URL and IP to the VM's host file to fix the DNS resolution. If you know the exact sign-in domain's IP ranges, you can also directly add them into networking rules.
-
-1. Now your Azure AD sign-in should be successful. The Microsoft Purview portal will load successfully, but listing all the Microsoft Purview accounts won't work because it can only access a specific Microsoft Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Microsoft Purview account that you successfully set up a private endpoint for.
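The hosts-file workaround from the earlier step can be scripted. This sketch is illustrative (the helpers are not from any Microsoft tool), and the lookup must run on the VM itself; because the Azure AD CDN IP is dynamic, the pinned entry may go stale and need refreshing:

```python
import socket

def format_hosts_entry(ip: str, hostname: str) -> str:
    """Format a hosts-file line that pins a hostname to a known IP."""
    return f"{ip} {hostname}"

def hosts_entry(hostname: str) -> str:
    """Resolve a hostname and return a hosts-file line pinning it to that IP."""
    return format_hosts_entry(socket.gethostbyname(hostname), hostname)

# On the VM, append the result to /etc/hosts (Linux) or
# C:\Windows\System32\drivers\etc\hosts (Windows), for example:
# print(hosts_entry("aadcdn.msauth.net"))
```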
-
-## Next steps
-
-- [Verify resolution for private endpoints](./catalog-private-link-name-resolution.md)
-- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
-- [Troubleshooting private endpoint configuration for your Microsoft Purview account](./catalog-private-link-troubleshoot.md)
purview Catalog Private Link End To End https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-end-to-end.md
- Title: Connect to your Microsoft Purview and scan data sources privately and securely
-description: This article describes how you can set up a private endpoint to connect to your Microsoft Purview account and scan data sources from a restricted network for end-to-end isolation
- Previously updated: 01/13/2023
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account to access purview account and scan data sources from restricted network.
--
-# Connect to your Microsoft Purview and scan data sources privately and securely
-
-In this guide, you learn how to deploy _account_, _portal_, and _ingestion_ private endpoints for your Microsoft Purview account so that you can access the account and scan data sources through a self-hosted integration runtime securely and privately, enabling end-to-end network isolation.
-
-The Microsoft Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-
-The Microsoft Purview _portal_ private endpoint is required to enable connectivity to [Microsoft Purview governance portal](https://web.purview.azure.com/resource/) using a private network.
-
-Microsoft Purview can scan data sources in Azure or an on-premises environment by using _ingestion_ private endpoints. Three private endpoint resources must be deployed and linked to Microsoft Purview managed or configured resources when the ingestion private endpoint is deployed:
-
-- **Blob** is linked to a Microsoft Purview managed storage account.
-- **Queue** is linked to a Microsoft Purview managed storage account.
-- **Namespace** is linked to a Microsoft Purview configured event hub namespace.
-
- :::image type="content" source="media/catalog-private-link/purview-private-link-architecture.png" alt-text="Diagram that shows Microsoft Purview and Private Link architecture.":::
-
-## Deployment checklist
-Using one of the deployment options explained further in this guide, you can deploy a new Microsoft Purview account with _account_, _portal_ and _ingestion_ private endpoints or you can choose to deploy these private endpoints for an existing Microsoft Purview account:
-
-1. Choose an appropriate Azure virtual network and a subnet to deploy Microsoft Purview private endpoints. Select one of the following options:
- - Deploy a [new virtual network](../virtual-network/quick-create-portal.md) in your Azure subscription.
- - Locate an existing Azure virtual network and a subnet in your Azure subscription.
-
-2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so you can access the Microsoft Purview account and scan data sources over a private network. You can use any of the following options:
- - Deploy new Azure DNS zones using the steps explained further in this guide.
- - Add required DNS records to existing Azure DNS zones using the steps explained further in this guide.
- - After completing the steps in this guide, add required DNS A records in your existing DNS servers manually.
-3. Deploy a [new Microsoft Purview account](#option-1deploy-a-new-microsoft-purview-account-with-account-portal-and-ingestion-private-endpoints) with account, portal and ingestion private endpoints, or deploy private endpoints for an [existing Microsoft Purview account](#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-microsoft-purview-accounts).
-4. [Enable access to Azure Active Directory](#enable-access-to-azure-active-directory) if your private network has network security group rules set to deny for all public internet traffic.
-5. Deploy and register [Self-hosted integration runtime](#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources) inside the same VNet or a peered VNet where Microsoft Purview account and ingestion private endpoints are deployed.
-6. After completing this guide, adjust DNS configurations if needed.
-7. Validate your network and name resolution between management machine, self-hosted IR VM and data sources to Microsoft Purview.
-
- > [!NOTE]
- > If you enable a managed event hub after deploying your ingestion private endpoint, you'll need to redeploy the ingestion private endpoint.
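Step 7 of the checklist (validating connectivity from the management machine and the self-hosted IR VM) can be sketched as a plain TCP reachability check on port 443, the port the Purview and ingestion endpoints listen on. The helper and the endpoint names in the comments are illustrative, not part of the Purview tooling:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example checks from the self-hosted IR VM (replace with your endpoints):
# can_connect("contoso-purview.purview.azure.com", 443)
# can_connect("contosopurviewstorage.blob.core.windows.net", 443)
```

A `False` result points at a network security group, routing, or DNS issue between that machine and the private endpoint.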
-
-## Option 1 - Deploy a new Microsoft Purview account with _account_, _portal_ and _ingestion_ private endpoints
-
-1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Microsoft Purview accounts** page. Select **+ Create** to create a new Microsoft Purview account.
-
-2. Fill in the basic information, and on the **Networking** tab, set the connectivity method to **Private endpoint**. Set **Enable private endpoint** to **Account, Portal and ingestion**.
-
-3. Under **Account and portal** select **+ Add** to add a private endpoint for your Microsoft Purview account.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-end-to-end.png" alt-text="Screenshot that shows create private endpoint end-to-end page selections.":::
-
-4. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-account.png" alt-text="Screenshot that shows create account private endpoint page.":::
-
- > [!NOTE]
- > You can also use your existing Azure Private DNS Zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-5. Select **OK**.
-
-6. Under **Account and portal**, select **+ Add** again to add the _portal_ private endpoint.
-
-7. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for the _portal_ private endpoint and select **portal**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-portal.png" alt-text="Screenshot that shows create portal private endpoint page.":::
-
- > [!NOTE]
- > You can also use your existing Azure Private DNS Zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-8. Select **OK**.
-
-9. Under **Ingestion**, set up your ingestion private endpoints by providing details for **Subscription**, **Virtual network**, and **Subnet** that you want to pair with your private endpoint.
-
-10. Optionally, select **Private DNS integration** to use Azure Private DNS Zones.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-deploy-ingestion.png" alt-text="Screenshot that shows create private endpoint overview page.":::
-
- > [!IMPORTANT]
- > It's important to select the correct Azure Private DNS zones to allow correct name resolution between Microsoft Purview and data sources. You can also use your existing Azure Private DNS zones or create DNS records in your DNS servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md).
-
-11. Select **Review + Create**. On the **Review + Create** page, Azure validates your configuration.
-
-12. When you see the "Validation passed" message, select **Create**.
-
-
-## Option 2 - Enable _account_, _portal_ and _ingestion_ private endpoint on existing Microsoft Purview accounts
-
-1. Go to the [Azure portal](https://portal.azure.com), select your Microsoft Purview account, and under **Settings**, select **Networking** > **Private endpoint connections**.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-add-to-existing.png" alt-text="Screenshot that shows creating an account private endpoint.":::
-
-2. Select **+ Private endpoint** to create a new private endpoint.
-
-3. Fill in the basic information.
-
-4. On the **Resource** tab, for **Resource type**, select **Microsoft.Purview/accounts**.
-
-5. For **Resource**, select the Microsoft Purview account, and for **Target sub-resource**, select **account**.
-
-6. On the **Configuration** tab, select the virtual network and optionally, select Azure Private DNS zone to create a new Azure DNS Zone.
-
- > [!NOTE]
- > For DNS configuration, you can also use your existing Azure Private DNS Zones from the dropdown list or add the required DNS records to your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-7. Go to the summary page, and select **Create** to create the account private endpoint.
-
-8. Repeat steps 2 through 7 to create the portal private endpoint. Make sure you select **portal** for **Target sub-resource**.
-
-9. From your Microsoft Purview account, under **Settings** select **Networking**, and then select **Ingestion private endpoint connections**.
-
-10. Under Ingestion private endpoint connections, select **+ New** to create a new ingestion private endpoint.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-add-ingestion-to-existing.png" alt-text="Screenshot that shows add private endpoint to existing account.":::
-
-11. Fill in the basic information, selecting your existing virtual network and subnet. Optionally, select **Private DNS integration** to use Azure Private DNS zones, and select the correct zone from each list.
-
- > [!NOTE]
- > You can also use your existing Azure Private DNS zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md)
-
-12. Select **Create** to finish the setup.
-
-## Enable access to Azure Active Directory
-
-> [!NOTE]
-> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Microsoft Purview governance portal and the Microsoft Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
-
-These instructions are provided for accessing Microsoft Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
-
-1. Go to your VM in the Azure portal, and under **Settings**, select **Networking**. Then select **Outbound port rules** > **Add outbound port rule**.
-
- :::image type="content" source="media/catalog-private-link/outbound-rule-add.png" alt-text="Screenshot that shows adding an outbound rule.":::
-
-1. On the **Add outbound security rule** pane:
-
- 1. Under **Destination**, select **Service Tag**.
- 1. Under **Destination service tag**, select **AzureActiveDirectory**.
- 1. Under **Destination port ranges**, enter `*`.
- 1. Under **Action**, select **Allow**.
- 1. Under **Priority**, enter a value lower than the priority of the rule that denies all internet traffic (rules with lower numbers are evaluated first, so the allow rule takes precedence).
-
- Create the rule.
-
- :::image type="content" source="media/catalog-private-link/outbound-rule-details.png" alt-text="Screenshot that shows adding outbound rule details.":::
-
-1. Follow the same steps to create another rule to allow the **AzureResourceManager** service tag. If you need to access the Azure portal, you can also add a rule for the **AzurePortal** service tag.
-
-1. Connect to the VM and open the browser. Go to the browser console by selecting Ctrl+Shift+J, and switch to the network tab to monitor network requests. Enter web.purview.azure.com in the URL box, and try to sign in by using your Azure AD credentials. Sign-in will probably fail, and on the **Network** tab on the console, you can see Azure AD trying to access aadcdn.msauth.net but getting blocked.
-
- :::image type="content" source="media/catalog-private-link/login-fail.png" alt-text="Screenshot that shows sign-in fail details.":::
-
-1. In this case, open a command prompt on the VM, ping aadcdn.msauth.net, get its IP, and then add an outbound port rule for the IP in the VM's network security rules. Set the **Destination** to **IP Addresses** and set **Destination IP addresses** to the aadcdn IP. Because of Azure Load Balancer and Azure Traffic Manager, the Azure AD Content Delivery Network IP might be dynamic. After you get its IP, it's better to add it to the VM's hosts file to force the browser to visit that IP for the Azure AD Content Delivery Network.
-
- :::image type="content" source="media/catalog-private-link/ping.png" alt-text="Screenshot that shows the test ping.":::
-
- :::image type="content" source="media/catalog-private-link/aadcdn-rule.png" alt-text="Screenshot that shows the Azure A D Content Delivery Network rule.":::
-
-1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Microsoft Purview governance portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
-
- Check the networking requests on the browser **Networking** tab to see which domain's requests are getting blocked, redo the previous step to get its IP, and add outbound port rules in the network security group to allow requests for that IP. If possible, add the URL and IP to the VM's host file to fix the DNS resolution. If you know the exact sign-in domain's IP ranges, you can also directly add them into networking rules.
-
-1. Now your Azure AD sign-in should be successful. The Microsoft Purview governance portal will load successfully, but listing all the Microsoft Purview accounts won't work because it can only access a specific Microsoft Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Microsoft Purview account that you successfully set up a private endpoint for.
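Because listing all accounts doesn't work in this setup, the direct URL from the last step has to be built per account. A trivial sketch (the function name is illustrative, not part of any SDK) that interpolates the account name into the documented path:

```python
def direct_portal_url(account_name: str) -> str:
    """Build the direct governance portal URL for a specific Purview account."""
    return f"https://web.purview.azure.com/resource/{account_name}"

# direct_portal_url("contoso-purview")
```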
-
-## Deploy self-hosted integration runtime (IR) and scan your data sources
-Once you deploy ingestion private endpoints for your Microsoft Purview account, you need to set up and register at least one self-hosted integration runtime (IR):
-
-- All on-premises source types like Microsoft SQL Server, Oracle, SAP, and others are currently supported only via self-hosted IR-based scans. The self-hosted IR must run within your private network and then be peered with your virtual network in Azure.
-
-- For all Azure source types like Azure Blob Storage and Azure SQL Database, you must explicitly choose to run the scan by using a self-hosted integration runtime deployed in the same VNet or a peered VNet where the Microsoft Purview account and ingestion private endpoints are deployed.
-
-Follow the steps in [Create and manage a self-hosted integration runtime](manage-integration-runtimes.md) to set up a self-hosted IR. Then set up your scan on the Azure source by choosing that self-hosted IR in the **Connect via integration runtime** dropdown list to ensure network isolation.
-
- :::image type="content" source="media/catalog-private-link/shir-for-azure.png" alt-text="Screenshot that shows running an Azure scan by using self-hosted IR.":::
-
-> [!IMPORTANT]
-> Make sure you download and install the latest version of self-hosted integration runtime from [Microsoft download center](https://www.microsoft.com/download/details.aspx?id=39717).
-
-## Firewalls to restrict public access
-
-To cut off access to the Microsoft Purview account completely from the public internet, follow these steps. This setting applies to both private endpoint and ingestion private endpoint connections.
-
-1. From the [Azure portal](https://portal.azure.com), go to the Microsoft Purview account, and under **Settings**, select **Networking**.
-
-1. Go to the **Firewall** tab, and ensure that the toggle is set to **Disable from all networks**.
-
- :::image type="content" source="media/catalog-private-link/purview-firewall-private.png" alt-text="Screenshot that shows private endpoint firewall settings.":::
-
-## Next steps
-
-- [Verify resolution for private endpoints](./catalog-private-link-name-resolution.md)
-- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
-- [Troubleshooting private endpoint configuration for your Microsoft Purview account](./catalog-private-link-troubleshoot.md)
purview Catalog Private Link Faqs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-faqs.md
- Title: Microsoft Purview private endpoints and Managed VNets frequently asked questions (FAQ)
-description: This article answers frequently asked questions about Microsoft Purview private endpoints and Managed VNets.
- Previously updated: 06/23/2023
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints and managed vnets for my Microsoft Purview account for secure access or ingestion.
-
-# FAQ about Microsoft Purview private endpoints and Managed VNets
-
-This article answers common questions that customers and field teams often ask about Microsoft Purview network configurations by using [Azure Private Link](../private-link/private-link-overview.md) or [Microsoft Purview Managed VNets](./catalog-managed-vnet.md). It's intended to clarify questions about Microsoft Purview firewall settings, private endpoints, DNS configuration, and related configurations.
-
-To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
-To configure Managed VNets for a Microsoft Purview account, see [Use a Managed VNet with your Microsoft Purview account](./catalog-managed-vnet.md).
-
-## Common questions
-
-Check out the answers to the following common questions.
-
-### When should I use a self-hosted integration runtime or a Managed IR?
-
-Use a Managed IR if:
-- Your Microsoft Purview account is deployed in one of the [supported regions for Managed VNets](catalog-managed-vnet.md#supported-regions).
-- You are planning to scan any of the [supported data sources](catalog-managed-vnet.md#supported-data-sources) by Managed IR.
-
-Use a self-hosted integration runtime if:
-- You are planning to scan data sources in Azure IaaS or SaaS services behind a private network, or in your on-premises network.
-- Managed VNet is not available in the region where your Microsoft Purview account is deployed.
-- You are planning to scan any sources that are not listed under [Managed VNet IR supported sources](catalog-managed-vnet.md#supported-data-sources).
-
-### Can I use both self-hosted integration runtime and Managed IR inside a Microsoft Purview account?
-
-Yes. You can use one or all of the runtime options in a single Microsoft Purview account: Azure IR, Managed IR and self-hosted integration runtime. You can use only one runtime option in a single scan.
-
-### What's the purpose of deploying the Microsoft Purview account private endpoint?
-
-The Microsoft Purview account private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the account. This private endpoint is also a prerequisite for the portal private endpoint.
-
-### What's the purpose of deploying the Microsoft Purview portal private endpoint?
-
-The Microsoft Purview portal private endpoint provides private connectivity to the Microsoft Purview governance portal.
-
-### What's the purpose of deploying the Microsoft Purview ingestion private endpoints?
-
-Microsoft Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints. Three other private endpoint resources are deployed and linked to Microsoft Purview managed or configured resources when ingestion private endpoints are created:
-
-- **Blob** is linked to a Microsoft Purview managed storage account.
-- **Queue** is linked to a Microsoft Purview managed storage account.
-- **Namespace** is linked to a Microsoft Purview configured event hub namespace.
-
-### Can I scan a data source through a public endpoint if a private endpoint is enabled on my Microsoft Purview account?
-
-Yes. Data sources that aren't connected through a private endpoint can be scanned by using a public endpoint while Microsoft Purview is configured to use a private endpoint.
-
-### Can I scan a data source through a service endpoint if a private endpoint is enabled?
-
-Yes. Data sources that aren't connected through a private endpoint can be scanned by using a service endpoint while Microsoft Purview is configured to use a private endpoint.
-
-Make sure you enable **Allow trusted Microsoft services** to access the resources inside the service endpoint configuration of the data source resource in Azure. For example, if you're going to scan Azure Blob Storage in which the firewalls and virtual networks settings are set to **selected networks**, make sure the **Allow trusted Microsoft services to access this storage account** checkbox is selected as an exception.
-
-### Can I scan a data source through a public endpoint using Managed IR?
-
-Yes, if the data source is supported by Managed VNet. As a prerequisite, you need to deploy a managed private endpoint for the data source.
-
-### Can I scan a data source through a service endpoint using Managed IR?
-
-Yes, if the data source is supported by Managed VNet. As a prerequisite, you need to deploy a managed private endpoint for the data source.
-
-### Can I access the Microsoft Purview governance portal from a public network if Public network access is set to Deny in Microsoft Purview account networking?
-
-No. Connecting to Microsoft Purview from a public endpoint where **Public network access** is set to **Deny** results in the following error message:
-
-"Not authorized to access this Microsoft Purview account. This Microsoft Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint."
-
-In this case, to open the Microsoft Purview governance portal, either use a machine that's deployed in the same virtual network as the Microsoft Purview portal private endpoint or use a VM that's connected to your CorpNet in which hybrid connectivity is allowed.
-
-### Is it possible to restrict access to the Microsoft Purview managed storage account and event hub namespace (for private endpoint ingestion only) but keep portal access enabled for users across the web?
-
-Yes. You can configure the Microsoft Purview firewall setting to **Disabled for ingestion only (Preview)**. With this option, public network access to your Microsoft Purview account through the API and the Microsoft Purview governance portal is allowed, but public network access is disabled on your Microsoft Purview account's managed storage account and event hub.
-
-### If public network access is set to Allow, does it mean the managed storage account and event hub namespace are accessible by anyone?
-
-No. As protected resources, access to the Microsoft Purview managed storage account and event hub namespace is restricted to Microsoft Purview only, using RBAC authentication schemes. These resources are deployed with a deny assignment to all principals, which prevents any applications, users, or groups from gaining access to them.
-
-To read more about Azure deny assignment, see [Understand Azure deny assignments](../role-based-access-control/deny-assignments.md).
-
-### What are the supported authentication types when I use a private endpoint?
-
-It depends on the authentication types supported by the data source, such as SQL authentication, Windows authentication, basic authentication, or a service principal, with credentials stored in Azure Key Vault. A managed identity (MSI) can't be used.
-
-### What are the supported authentication types when I use Managed IR?
-It depends on the authentication types supported by the data source, such as SQL authentication, Windows authentication, basic authentication, or a service principal, with credentials stored in Azure Key Vault, or a managed identity (MSI).
-
-### What private DNS zones are required for Microsoft Purview for a private endpoint?
-
-For Microsoft Purview _account_ and _portal_ private endpoints:
-
-- `privatelink.purview.azure.com`
-
-For Microsoft Purview _ingestion_ private endpoints:
-
-- `privatelink.blob.core.windows.net`
-- `privatelink.queue.core.windows.net`
-- `privatelink.servicebus.windows.net`
-
-### Do I have to use a dedicated virtual network and dedicated subnet when I deploy Microsoft Purview private endpoints?
-
-No. However, `PrivateEndpointNetworkPolicies` must be disabled in the destination subnet before you deploy the private endpoints. Consider deploying Microsoft Purview into a virtual network that has network connectivity to data source virtual networks through VNet Peering and access to an on-premises network if you plan to scan data sources cross-premises.
-
-Read more about [Disable network policies for private endpoints](../private-link/disable-private-endpoint-network-policy.md).
-
-### Can I deploy Microsoft Purview private endpoints and use existing private DNS zones in my subscription to register the A records?
-
-Yes. Your private endpoint DNS zones can be centralized in a hub or data management subscription for all internal DNS zones required for Microsoft Purview and all data source records. We recommend this method to allow Microsoft Purview to resolve data sources by using their private endpoint internal IP addresses.
-
-You're also required to set up a [virtual network link](../dns/private-dns-virtual-network-links.md) between your virtual networks and the existing private DNS zone.
-
-### Can I use Azure integration runtime to scan data sources through a private endpoint?
-
-No. You have to deploy and register a self-hosted integration runtime to scan data by using private connectivity. Azure Key Vault or a service principal must be used as the authentication method for data sources.
-
-### Can I use Managed IR to scan data sources through a private endpoint?
-
-If you are planning to use Managed IR to scan any of the supported data sources, the data source requires a managed private endpoint created inside Microsoft Purview Managed VNet. For more information, see [Microsoft Purview Managed VNets](./catalog-managed-vnet.md).
-
-### What are the outbound ports and firewall requirements for virtual machines with self-hosted integration runtime for Microsoft Purview when you use a private endpoint?
-
-The VMs in which self-hosted integration runtime is deployed must have outbound access to Azure endpoints and a Microsoft Purview private IP address through port 443.
-
-### Do I need to enable outbound internet access from the virtual machine running self-hosted integration runtime if a private endpoint is enabled?
-
-No. However, it's expected that the virtual machine running self-hosted integration runtime can connect to your instance of Microsoft Purview through an internal IP address by using port 443. Use common troubleshooting tools for name resolution and connectivity testing, such as nslookup.exe and Test-NetConnection.
-
-### Do I still need to deploy private endpoints for my Microsoft Purview account if I am using Managed VNet?
-
-At least account and portal private endpoints are required if public access on the Microsoft Purview account is set to **deny**.
-Account, portal, and ingestion private endpoints are required if public access on the Microsoft Purview account is set to **deny** and you plan to scan additional data sources using a self-hosted integration runtime.
-
-### What inbound and outbound communications are allowed through public endpoint for Microsoft Purview Managed VNets?
-
-No inbound communication is allowed into a Managed VNet from the public network.
-All ports are open for outbound communications.
-In Microsoft Purview, a Managed VNet can be used to privately connect to Azure data sources to extract metadata during scans.
-
-### Why do I receive the following error message when I try to launch Microsoft Purview governance portal from my machine?
-
-"This Microsoft Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint."
-
-It's likely your Microsoft Purview account is deployed by using Private Link and public access is disabled on your Microsoft Purview account. As a result, you have to browse the Microsoft Purview governance portal from a virtual machine that has internal network connectivity to Microsoft Purview.
-
-If you're connecting from a VM behind a hybrid network or using a jump machine connected to your virtual network, use common troubleshooting tools for name resolution and connectivity testing, such as nslookup.exe and Test-NetConnection.
-
-1. Validate that you can resolve the following addresses through your Microsoft Purview account's private IP addresses.
-
- - `Web.Purview.Azure.com`
- - `<YourPurviewAccountName>.Purview.Azure.com`
-
-1. Verify network connectivity to your Microsoft Purview account by using the following PowerShell command:
-
- ```powershell
- Test-NetConnection -ComputerName <YourPurviewAccountName>.Purview.Azure.com -Port 443
- ```
-
-1. Verify your cross-premises DNS configuration if you use your own DNS resolution infrastructure.
-
-For more information about DNS settings for private endpoints, see [Azure private endpoint DNS configuration](../private-link/private-endpoint-dns.md).
-
-### Can I move private endpoints associated with Microsoft Purview account or its managed resources to another Azure subscription or resource group?
-
-No. Move operations for account, portal, or ingestion private endpoints aren't supported. For more information, see [Move networking resources to new resource group or subscription](../azure-resource-manager/management/move-limitations/networking-move-limitations.md#private-endpoints).
-
-## Next steps
-
-To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
purview Catalog Private Link Name Resolution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-name-resolution.md
- Title: Configure DNS Name Resolution for private endpoints
-description: This article describes how to configure and verify DNS name resolution for private endpoints for your Microsoft Purview account
----- Previously updated : 03/29/2023
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
--
-# Configure and verify DNS Name Resolution for Microsoft Purview private endpoints
-
-## Conceptual overview
-Accurate name resolution is a critical requirement when setting up private endpoints for your Microsoft Purview accounts.
-
-Depending on the scenarios you're deploying, you may need to enable internal name resolution in your DNS settings so that the fully qualified domain names (FQDNs) resolve to the private endpoint IP addresses, from data sources and your management machine to the Microsoft Purview account and self-hosted integration runtime.
-
-The following example shows Microsoft Purview DNS name resolution from outside the virtual network or when an Azure private endpoint is not configured.
-
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-external.png" alt-text="Screenshot that shows Microsoft Purview name resolution from outside CorpNet.":::
-
-The following example shows Microsoft Purview DNS name resolution from inside the virtual network.
-
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-private-link.png" alt-text="Screenshot that shows Microsoft Purview name resolution from inside CorpNet.":::
-
-## Deployment options
-
-Use any of the following options to set up internal name resolution when using private endpoints for your Microsoft Purview account:
-
-- [Deploy new Azure Private DNS Zones](#option-1deploy-new-azure-private-dns-zones) in your Azure environment as part of the private endpoint deployment. (Default option)
-- [Use existing Azure Private DNS Zones](#option-2use-existing-azure-private-dns-zones). Use this option if you're using a private endpoint in a hub-and-spoke model from a different subscription or even within the same subscription.
-- [Use your own DNS servers](#option-3use-your-own-dns-servers) if you don't use DNS forwarders and instead manage A records directly in your on-premises DNS servers.
-
-## Option 1 - Deploy new Azure Private DNS Zones
-
-### Deploy new Azure Private DNS Zones
-To enable internal name resolution, you can deploy the required Azure DNS Zones inside your Azure subscription where Microsoft Purview account is deployed.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-dns-zones.png" alt-text="Screenshot that shows DNS Zones.":::
-
-When you create the account, portal, and ingestion private endpoints, the DNS CNAME resource records for Microsoft Purview are automatically updated to aliases in a few subdomains with the prefix `privatelink`:
-
-- By default, during the deployment of the _account_ private endpoint for your Microsoft Purview account, we also create a [private DNS zone](../dns/private-dns-overview.md) that corresponds to the `privatelink` subdomain for Microsoft Purview as `privatelink.purview.azure.com`, including DNS A resource records for the private endpoints.
-
-- During the deployment of the _portal_ private endpoint for your Microsoft Purview account, we also create a new private DNS zone that corresponds to the `privatelink` subdomain for Microsoft Purview as `privatelink.purviewstudio.azure.com`, including DNS A resource records for _Web_.
-
-- If you enable ingestion private endpoints, additional DNS zones are required for managed or configured resources.
-
-The following table shows an example of Azure Private DNS zones and DNS A Records that are deployed as part of configuration of private endpoint for a Microsoft Purview account if you enable _Private DNS integration_ during the deployment:
-
-|Private endpoint |Private endpoint associated to |DNS Zone (new) |A Record (example) |
-|||||
-|Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
-|Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
-|Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Microsoft Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
-
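The zone mapping in the table above can also be captured in a small lookup for deployment or validation scripts. This is a minimal sketch; the `zone_for` function name is illustrative and not part of any Azure tooling:

```shell
#!/usr/bin/env bash
# Map a Microsoft Purview private endpoint type to the private DNS zone it needs.
# The zone names come from the table above; zone_for is a hypothetical helper.
zone_for() {
  case "$1" in
    account)   echo "privatelink.purview.azure.com" ;;
    portal)    echo "privatelink.purviewstudio.azure.com" ;;
    blob)      echo "privatelink.blob.core.windows.net" ;;
    queue)     echo "privatelink.queue.core.windows.net" ;;
    namespace) echo "privatelink.servicebus.windows.net" ;;
    *)         echo "unknown endpoint type: $1" >&2; return 1 ;;
  esac
}

zone_for portal   # prints privatelink.purviewstudio.azure.com
```

A script like this keeps the endpoint-to-zone mapping in one place, so the same list drives both zone deployment and later verification.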
-### Validate virtual network links on Azure Private DNS Zones
-
-Once the private endpoint deployment is completed, make sure there is a [Virtual network link](../dns/private-dns-virtual-network-links.md) on all corresponding Azure Private DNS zones to Azure virtual network where private endpoint was deployed.
-
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-link.png" alt-text="Screenshot that shows virtual network links on DNS Zone.":::
-
-For more information, see [Azure private endpoint DNS configuration](../private-link/private-endpoint-dns.md).
-
-### Verify internal name resolution
-
-When you resolve the Microsoft Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Microsoft Purview. When resolved from the virtual network hosting the private endpoint, the Microsoft Purview endpoint URL resolves to the private endpoint's IP address.
-
-As an example, if a Microsoft Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
-
-| Name | Type | Value |
-| - | -- | |
-| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Microsoft Purview public endpoint\> |
-| \<Microsoft Purview public endpoint\> | A | \<Microsoft Purview public IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Microsoft Purview governance portal public endpoint\> |
-
-The DNS resource records for Contoso-Purview, when resolved in the virtual network hosting the private endpoint, will be:
-
-| Name | Type | Value |
-| - | -- | |
-| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Microsoft Purview account private endpoint IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Microsoft Purview portal private endpoint IP address\> |
-
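A quick sanity check when verifying resolution from inside the virtual network: the answer must be an RFC 1918 private address, not a public one. The following is a hedged sketch for a Linux jump box; the `is_private_ip` helper is illustrative, and you would feed it the address returned by `nslookup` or `Resolve-DnsName`:

```shell
#!/usr/bin/env bash
# Classify an IPv4 address as RFC 1918 private or public.
# Inside the VNet, Contoso-Purview.purview.azure.com should resolve to a
# private address; outside, to a public one.
is_private_ip() {
  case "$1" in
    10.*|192.168.*)                          echo "private" ;;
    172.1[6-9].*|172.2[0-9].*|172.3[0-1].*)  echo "private" ;;
    *)                                       echo "public" ;;
  esac
}

is_private_ip 10.9.1.7    # prints private
is_private_ip 20.38.0.1   # prints public
```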
-## Option 2 - Use existing Azure Private DNS Zones
-
-### Use existing Azure Private DNS Zones
-
-During the deployment of Microsoft Purview private endpoints, you can choose _Private DNS integration_ using existing Azure Private DNS zones. This is a common case for organizations where private endpoints are used for other services in Azure. In this case, during the deployment of private endpoints, make sure you select the existing DNS zones instead of creating new ones.
-
-This scenario also applies if your organization uses a central or hub subscription for all Azure Private DNS Zones.
-
-The following list shows the required Azure DNS zones and A records for Microsoft Purview private endpoints:
-
-> [!NOTE]
-> Replace `Contoso-Purview`, `scaneastusabcd1234`, and `atlas-12345678-1234-1234-abcd-123456789abc` with the corresponding Azure resource names in your environment. For example, instead of `scaneastusabcd1234`, use the name of your Microsoft Purview managed storage account.
-
-|Private endpoint |Private endpoint associated to |DNS Zone (existing) |A Record (example) |
-|||||
-|Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
-|Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
-|Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Microsoft Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
-
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-diagram.png" alt-text="Diagram that shows Microsoft Purview name resolution." lightbox="media/catalog-private-link/purview-name-resolution-diagram.png":::
-
-For more information, see [Virtual network workloads without custom DNS server](../private-link/private-endpoint-dns.md#virtual-network-workloads-without-custom-dns-server) and [On-premises workloads using a DNS forwarder](../private-link/private-endpoint-dns.md#on-premises-workloads-using-a-dns-forwarder) scenarios in [Azure Private Endpoint DNS configuration](../private-link/private-endpoint-dns.md).
-
-### Verify virtual network links on Azure Private DNS Zones
-
-Once the private endpoint deployment is completed, make sure there is a [Virtual network link](../dns/private-dns-virtual-network-links.md) on all corresponding Azure Private DNS zones to Azure virtual network where private endpoint was deployed.
-
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-link.png" alt-text="Screenshot that shows virtual network links on DNS Zone.":::
-
-For more information, see [Azure private endpoint DNS configuration](../private-link/private-endpoint-dns.md).
-
-### Configure DNS Forwarders if custom DNS is used
-
-Additionally, validate your DNS configuration on the Azure virtual network where the self-hosted integration runtime VM or management PC is located.
-
- :::image type="content" source="media/catalog-private-link/purview-pe-custom-dns.png" alt-text="Diagram that shows Azure virtual network custom DNS":::
-
-- If it's configured to _Default_, no further action is required in this step.
-
-- If a custom DNS server is used, add the corresponding DNS forwarders inside your DNS servers for the following zones:
-
- - Purview.azure.com
- - purviewstudio.azure.com
- - Blob.core.windows.net
- - Queue.core.windows.net
- - Servicebus.windows.net
-
-### Verify internal name resolution
-
-When you resolve the Microsoft Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Microsoft Purview. When resolved from the virtual network hosting the private endpoint, the Microsoft Purview endpoint URL resolves to the private endpoint's IP address.
-
-As an example, if a Microsoft Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
-
-| Name | Type | Value |
-| - | -- | |
-| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Microsoft Purview public endpoint\> |
-| \<Microsoft Purview public endpoint\> | A | \<Microsoft Purview public IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Microsoft Purview governance portal public endpoint\> |
-
-The DNS resource records for Contoso-Purview, when resolved in the virtual network hosting the private endpoint, will be:
-
-| Name | Type | Value |
-| - | -- | |
-| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Microsoft Purview account private endpoint IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Microsoft Purview portal private endpoint IP address\> |
-
-## Option 3 - Use your own DNS Servers
-
-If you do not use DNS forwarders and instead you manage A records directly in your on-premises DNS servers to resolve the endpoints through their private IP addresses, you might need to create the following A records in your DNS servers.
-
-> [!NOTE]
-> Replace `Contoso-Purview`, `scaneastusabcd1234`, and `atlas-12345678-1234-1234-abcd-123456789abc` with the corresponding Azure resource names in your environment. For example, instead of `scaneastusabcd1234`, use the name of your Microsoft Purview managed storage account.
-
-| Name | Type | Value |
-| - | -- | |
-| `web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview> |
-| `scaneastusabcd1234.blob.core.windows.net` | A | \<blob-ingestion private endpoint IP address of Microsoft Purview> |
-| `scaneastusabcd1234.queue.core.windows.net` | A | \<queue-ingestion private endpoint IP address of Microsoft Purview> |
-| `atlas-12345678-1234-1234-abcd-123456789abc.servicebus.windows.net`| A | \<namespace-ingestion private endpoint IP address of Microsoft Purview> |
-| `Contoso-Purview.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview> |
-| `Contoso-Purview.scan.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview> |
-| `Contoso-Purview.catalog.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
-| `Contoso-Purview.proxy.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
-| `Contoso-Purview.guardian.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
-| `gateway.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
-| `insight.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `manifest.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `cdn.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `hub.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `catalog.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `cseo.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `datascan.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `datashare.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `datasource.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `policy.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `sensitivity.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `web.privatelink.purviewstudio.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-| `workflow.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
-
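Before your DNS servers are updated, temporary entries in the hosts file of a single test machine can stand in for a subset of these records to validate connectivity. This is a sketch for testing only; the IP addresses shown are placeholders for your actual private endpoint IPs, and proper DNS is still required for the full set of portal records:

```
# /etc/hosts (Linux) or C:\Windows\System32\drivers\etc\hosts (Windows)
# Placeholder IPs - replace with your private endpoint IP addresses.
10.9.1.7   Contoso-Purview.purview.azure.com
10.9.1.8   web.purview.azure.com
```

Remove these entries once your DNS servers host the real A records, or the machine will keep using stale addresses.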
-## Verify and test DNS name resolution and connectivity
-
-1. If you're using Azure Private DNS zones, make sure the following DNS zones and the corresponding A records are created in your Azure subscription:
-
-    |Private endpoint |Private endpoint associated to |DNS Zone |A Record (example) |
- |||||
- |Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
- |Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
- |Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
- |Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
- |Ingestion |Microsoft Purview configured Event Hubs - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
-
-2. Create [Virtual network links](../dns/private-dns-virtual-network-links.md) in your Azure Private DNS Zones for your Azure Virtual Networks to allow internal name resolution.
-
-3. From your management PC and self-hosted integration runtime VM, test name resolution and network connectivity to your Microsoft Purview account using tools such as nslookup.exe and PowerShell.
-
-To test name resolution, you need to resolve the following FQDNs through their private IP addresses.
-(Instead of `Contoso-Purview`, `scaneastusabcd1234`, or `atlas-12345678-1234-1234-abcd-123456789abc`, use the hostnames associated with your Microsoft Purview account and its managed or configured resources.)
-
-- `Contoso-Purview.purview.azure.com`
-- `web.purview.azure.com`
-- `scaneastusabcd1234.blob.core.windows.net`
-- `scaneastusabcd1234.queue.core.windows.net`
-- `atlas-12345678-1234-1234-abcd-123456789abc.servicebus.windows.net`
-
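The checklist of FQDNs can also be generated programmatically so nothing is missed. This is a sketch; `purview_fqdns` is an illustrative helper, and its three arguments are your account name, managed storage account name, and Event Hubs namespace name:

```shell
#!/usr/bin/env bash
# Print the five FQDNs that must resolve to private IPs for a Purview account.
# Arguments: <account-name> <managed-storage-name> <event-hubs-namespace>
purview_fqdns() {
  echo "$1.purview.azure.com"
  echo "web.purview.azure.com"
  echo "$2.blob.core.windows.net"
  echo "$2.queue.core.windows.net"
  echo "$3.servicebus.windows.net"
}

# On the jump box or self-hosted integration runtime VM, resolve each one
# (uncomment to run against real DNS):
# purview_fqdns Contoso-Purview scaneastusabcd1234 atlas-12345678 | while read -r f; do
#   nslookup "$f"
# done
purview_fqdns Contoso-Purview scaneastusabcd1234 atlas-12345678
```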
-To test network connectivity, launch a PowerShell console from the self-hosted integration runtime VM and test connectivity using `Test-NetConnection`.
-Each endpoint must resolve to its private endpoint IP address and return `TcpTestSucceeded` as True. (Instead of `Contoso-Purview`, `scaneastusabcd1234`, or `atlas-12345678-1234-1234-abcd-123456789abc`, use the hostnames associated with your Microsoft Purview account and its managed or configured resources.)
-
-- `Test-NetConnection -ComputerName Contoso-Purview.purview.azure.com -Port 443`
-- `Test-NetConnection -ComputerName web.purview.azure.com -Port 443`
-- `Test-NetConnection -ComputerName scaneastusabcd1234.blob.core.windows.net -Port 443`
-- `Test-NetConnection -ComputerName scaneastusabcd1234.queue.core.windows.net -Port 443`
-- `Test-NetConnection -ComputerName atlas-12345678-1234-1234-abcd-123456789abc.servicebus.windows.net -Port 443`
-
-## Next steps
-
-- [Troubleshooting private endpoint configuration for your Microsoft Purview account](catalog-private-link-troubleshoot.md)
-- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
purview Catalog Private Link Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-troubleshoot.md
- Title: Troubleshooting private endpoint configuration for Microsoft Purview accounts
-description: This article describes how to troubleshoot problems with your Microsoft Purview account related to private endpoint configuration
----- Previously updated : 12/09/2022
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
--
-# Troubleshooting private endpoint configuration for Microsoft Purview accounts
-
-This guide summarizes known limitations related to using private endpoints for Microsoft Purview and provides a list of steps and solutions for troubleshooting some of the most common relevant issues.
-
-## Known limitations
-
-- We currently don't support ingestion private endpoints that work with your AWS sources.
-- Scanning Azure Multiple Sources using a self-hosted integration runtime isn't supported.
-- Using the Azure integration runtime to scan data sources behind a private endpoint isn't supported.
-- The ingestion private endpoints can be created via the Microsoft Purview governance portal experience described in the steps [here](catalog-private-link-end-to-end.md#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-microsoft-purview-accounts). They can't be created from the Private Link Center.
-- Creating a DNS record for ingestion private endpoints inside existing Azure DNS zones, while the Azure Private DNS zones are located in a different subscription than the private endpoints, isn't supported via the Microsoft Purview governance portal experience. A records can be added manually in the destination DNS zones in the other subscription.
-- If you enable a managed event hub after deploying an ingestion private endpoint, you'll need to redeploy the ingestion private endpoint.
-- The self-hosted integration runtime machine must be deployed in the same VNet or a peered VNet where the Microsoft Purview account and ingestion private endpoints are deployed.
-- We currently don't support scanning a cross-tenant Power BI tenant that has a private endpoint configured with public access blocked.
-- For limitations related to the Private Link service, see [Azure Private Link limits](../azure-resource-manager/management/azure-subscription-service-limits.md#private-link-limits).
-
-## Recommended troubleshooting steps
-
-1. Once you deploy private endpoints for your Microsoft Purview account, review your Azure environment to make sure private endpoint resources are deployed successfully. Depending on your scenario, one or more of the following Azure private endpoints must be deployed in your Azure subscription:
-
- |Private endpoint |Private endpoint assigned to | Example|
- ||||
- |Account |Microsoft Purview Account |mypurview-private-account |
- |Portal |Microsoft Purview Account |mypurview-private-portal |
- |Ingestion |Managed Storage Account (Blob) |mypurview-ingestion-blob |
- |Ingestion |Managed Storage Account (Queue) |mypurview-ingestion-queue |
- |Ingestion |Event Hubs Namespace* |mypurview-ingestion-namespace |
-
- >[!NOTE]
- > *Event Hubs Namespace is only needed if it has been configured on your Microsoft Purview account. You can check in **Kafka configuration** under settings on your Microsoft Purview account page in the Azure Portal.
-
-2. If the portal private endpoint is deployed, make sure you also deploy the account private endpoint.
-
-3. If the portal private endpoint is deployed and public network access is set to deny in your Microsoft Purview account, make sure you launch [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/) from an internal network.
- <br>
-    - To verify correct name resolution, you can use the **nslookup.exe** command-line tool to query `web.purview.azure.com`. The result must return a private IP address that belongs to the portal private endpoint.
-    - To verify network connectivity, you can use any network test tool to test outbound connectivity to the `web.purview.azure.com` endpoint on port **443**. The connection must be successful.
-
-4. If Azure Private DNS zones are used, make sure the required Azure DNS zones are deployed and there's a DNS A record for each private endpoint.
-
-5. Test network connectivity and name resolution from the management machine to the Microsoft Purview endpoint and the Purview web URL. If account and portal private endpoints are deployed, the endpoints must resolve through private IP addresses.
--
- ```powershell
- Test-NetConnection -ComputerName web.purview.azure.com -Port 443
- ```
-
- Example of successful outbound connection through private IP address:
-
- ```
- ComputerName : web.purview.azure.com
- RemoteAddress : 10.9.1.7
- RemotePort : 443
- InterfaceAlias : Ethernet 2
- SourceAddress : 10.9.0.10
- TcpTestSucceeded : True
- ```
-
- ```powershell
- Test-NetConnection -ComputerName purview-test01.purview.azure.com -Port 443
- ```
-
- Example of successful outbound connection through private IP address:
-
- ```
- ComputerName : purview-test01.purview.azure.com
- RemoteAddress : 10.9.1.8
- RemotePort : 443
- InterfaceAlias : Ethernet 2
- SourceAddress : 10.9.0.10
- TcpTestSucceeded : True
- ```
-
-6. If you created your Microsoft Purview account after 18 August 2021, make sure you download and install the latest version of the self-hosted integration runtime from the [Microsoft Download Center](https://www.microsoft.com/download/details.aspx?id=39717).
-
-7. From the self-hosted integration runtime VM, test network connectivity and name resolution to the Microsoft Purview endpoint.
-
-8. From the self-hosted integration runtime, test network connectivity and name resolution to Microsoft Purview managed resources, such as the managed blob and queue storage, and secondary resources like Event Hubs, through port 443 and private IP addresses. (Replace the managed storage account and Event Hubs namespace with the corresponding resource names.)
-
- ```powershell
 Test-NetConnection -ComputerName scansoutdeastasiaocvseab.blob.core.windows.net -Port 443
- ```
- Example of successful outbound connection to managed blob storage through private IP address:
-
- ```
- ComputerName : scansoutdeastasiaocvseab.blob.core.windows.net
- RemoteAddress : 10.15.1.6
- RemotePort : 443
- InterfaceAlias : Ethernet 2
- SourceAddress : 10.15.0.4
- TcpTestSucceeded : True
- ```
-
- ```powershell
 Test-NetConnection -ComputerName scansoutdeastasiaocvseab.queue.core.windows.net -Port 443
- ```
- Example of successful outbound connection to managed queue storage through private IP address:
-
- ```
 ComputerName : scansoutdeastasiaocvseab.queue.core.windows.net
- RemoteAddress : 10.15.1.5
- RemotePort : 443
- InterfaceAlias : Ethernet 2
- SourceAddress : 10.15.0.4
- TcpTestSucceeded : True
- ```
-
- ```powershell
 Test-NetConnection -ComputerName Atlas-1225cae9-d651-4039-86a0-b43231a17a4b.servicebus.windows.net -Port 443
- ```
- Example of successful outbound connection to Event Hubs namespace through private IP address:
-
- ```
- ComputerName : Atlas-1225cae9-d651-4039-86a0-b43231a17a4b.servicebus.windows.net
- RemoteAddress : 10.15.1.4
- RemotePort : 443
- InterfaceAlias : Ethernet 2
- SourceAddress : 10.15.0.4
- TcpTestSucceeded : True
- ```
-
-9. From the network where the data source is located, test network connectivity and name resolution to the Microsoft Purview endpoint and managed or configured resource endpoints.
-
-10. If data sources are located in an on-premises network, review your DNS forwarder configuration. Test name resolution from within the same network where the data sources are located to the self-hosted integration runtime, Microsoft Purview endpoints, and managed or configured resources. A DNS query for each endpoint should return a valid private IP address.
-
- For more information, see [Virtual network workloads without custom DNS server](../private-link/private-endpoint-dns.md#virtual-network-workloads-without-custom-dns-server) and [On-premises workloads using a DNS forwarder](../private-link/private-endpoint-dns.md#on-premises-workloads-using-a-dns-forwarder) scenarios in [Azure Private Endpoint DNS configuration](../private-link/private-endpoint-dns.md).
-
-11. If the management machine and self-hosted integration runtime VMs are deployed in an on-premises network and you have set up a DNS forwarder in your environment, verify the DNS and network settings in your environment.
-
-12. If the ingestion private endpoint is used, make sure the self-hosted integration runtime is registered successfully inside the Microsoft Purview account and shows as running, both inside the self-hosted integration runtime VM and in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-## Common errors and messages
-
-### Issue
-You may receive the following error message when running a scan:
-
-`Internal system error. Please contact support with correlationId:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx System Error, contact support.`
-
-### Cause
-This can indicate issues related to connectivity or name resolution between the VM running the self-hosted integration runtime and the Microsoft Purview managed storage account or configured Event Hubs.
-
-### Resolution
-Validate whether name resolution is successful between the VM running the self-hosted integration runtime and the Microsoft Purview managed blob and queue storage, or configured Event Hubs, through port 443 and private IP addresses (step 8 above).
--
-### Issue
-You may receive the following error message when running a new scan:
-
- `message: Unable to setup config overrides for this scan. Exception:'Type=Microsoft.WindowsAzure.Storage.StorageException,Message=The remote server returned an error: (404) Not Found.,Source=Microsoft.WindowsAzure.Storage,StackTrace= at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndExecuteAsync[T](IAsyncResult result)`
-
-### Cause
-This can indicate that an older version of the self-hosted integration runtime is running. You need to use self-hosted integration runtime version 5.9.7885.3 or greater.
-
-### Resolution
-Upgrade the self-hosted integration runtime to version 5.9.7885.3 or later.
--
-### Issue
-Deployment of a Microsoft Purview account with a private endpoint failed with an Azure Policy validation error.
-
-### Cause
-This error suggests that an existing Azure Policy assignment on your Azure subscription may be preventing the deployment of one or more of the required Azure resources.
-
-### Resolution
-Review your existing Azure Policy assignments and make sure deployment of the following Azure resources is allowed in your Azure subscription.
-
-> [!NOTE]
-> Depending on your scenario, you may need to deploy one or more of the following Azure resource types:
-> - Microsoft Purview (Microsoft.Purview/Accounts)
-> - Private Endpoint (Microsoft.Network/privateEndpoints)
-> - Private DNS Zones (Microsoft.Network/privateDnsZones)
> - Event Hubs namespace (Microsoft.EventHub/namespaces)
-> - Storage Account (Microsoft.Storage/storageAccounts)
--
-### Issue
-Not authorized to access this Microsoft Purview account. This Microsoft Purview account is behind a private endpoint. Access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint.
-
-### Cause
-The user is trying to connect to Microsoft Purview from a public endpoint, or is using Microsoft Purview public endpoints, while **Public network access** is set to **Deny**.
-
-### Resolution
-In this case, to open the Microsoft Purview governance portal, either use a machine that is deployed in the same virtual network as the Microsoft Purview governance portal private endpoint or use a VM that is connected to your CorpNet in which hybrid connectivity is allowed.
-
-### Issue
-You may receive the following error message when scanning a SQL server, using a self-hosted integration runtime:
-
- `Message=This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms`
-
-### Cause
-The self-hosted integration runtime machine has FIPS mode enabled.
-Federal Information Processing Standards (FIPS) defines a set of cryptographic algorithms that are allowed to be used. When FIPS mode is enabled on the machine, some cryptographic classes that the invoked processes depend on are blocked in some scenarios.
-
-### Resolution
-Disable FIPS mode on the self-hosted integration runtime machine.
-
-## Next steps
-
-If your problem isn't listed in this article or you can't resolve it, get support by visiting one of the following channels:
-
-- Get answers from experts through [Microsoft Q&A](/answers/topics/azure-purview.html).
-- Connect with [@AzureSupport](https://twitter.com/azuresupport). This official Microsoft Azure resource on Twitter helps improve the customer experience by connecting the Azure community to the right answers, support, and experts.
-- If you still need help, go to the [Azure support site](https://azure.microsoft.com/support/options/) and select **Submit a support request**.
purview Catalog Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link.md
- Title: Use private endpoints for secure access to Microsoft Purview
-description: This article provides a high-level overview of how you can use a private endpoint for your Microsoft Purview account.
-Previously updated: 02/16/2023
-# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
--
-# Use private endpoints for your Microsoft Purview account
-
-This article describes how to configure private endpoints for Microsoft Purview.
-
-## Conceptual overview
-You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Microsoft Purview accounts to allow users on a virtual network (VNet) to securely access the catalog over a Private Link. A private endpoint uses an IP address from the VNet address space for your Microsoft Purview account. Network traffic between the clients on the VNet and the Microsoft Purview account traverses over the VNet and a private link on the Microsoft backbone network.
-
-You can deploy a Microsoft Purview _account_ private endpoint to allow only client calls to Microsoft Purview that originate from within the private network.
-
-To connect to the Microsoft Purview governance portal using private network connectivity, you can deploy a _portal_ private endpoint.
-
-You can deploy _ingestion_ private endpoints if you need to scan Azure IaaS and PaaS data sources inside Azure virtual networks and on-premises data sources through a private connection. This method ensures network isolation for your metadata flowing from the data sources to Microsoft Purview Data Map.
--
-## Prerequisites
-
-Before deploying private endpoints for your Microsoft Purview account, ensure that you meet the following prerequisites:
-
-1. An Azure account with an active subscription. [Create an account for free.](https://azure.microsoft.com/free/?WT.mc_id=A261C142F)
-<br>
-2. An existing Azure Virtual network. Deploy a new [Azure virtual network](../virtual-network/quick-create-portal.md) if you do not have one.
-<br>
-
-## Microsoft Purview private endpoint deployment scenarios
-
-Use the following recommended checklist to deploy a Microsoft Purview account with private endpoints:
-
-|Scenario |Objectives |
-|||
|**Scenario 1** - [Connect to your Microsoft Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md) |You need to restrict access to your Microsoft Purview account to a private endpoint only, including access to the Microsoft Purview governance portal and Atlas APIs, and scan data sources located on-premises or inside Azure virtual networks by using a self-hosted integration runtime, ensuring end-to-end network isolation. (Deploy _account_, _portal_ and _ingestion_ private endpoints.) |
-|**Scenario 2** - [Connect privately and securely to your Microsoft Purview account](./catalog-private-link-account-portal.md) | You need to enable access to your Microsoft Purview account, including access to _the Microsoft Purview governance portal_ and Atlas API through private endpoints. (Deploy _account_ and _portal_ private endpoints). |
-|**Scenario 3** - [Scan data source securely using Managed Virtual Network](./catalog-managed-vnet.md) | You need to scan Azure data sources securely, without having to manage a virtual network or a self-hosted integration runtime VM. (Deploy managed private endpoint for Microsoft Purview, managed storage account and Azure data sources). |
--
-## Support matrix for scanning data sources through an _ingestion_ private endpoint
-
-For scenarios where an _ingestion_ private endpoint is used in your Microsoft Purview account, and public access on your data sources is disabled, Microsoft Purview can scan the following data sources that are behind a private endpoint:
-
-|Data source behind a private endpoint |Integration runtime type |Credential type |
-||||
-|Azure Blob Storage | Self-Hosted IR | Service Principal|
-|Azure Blob Storage | Self-Hosted IR | Account Key|
-|Azure Data Lake Storage Gen 2 | Self-Hosted IR| Service Principal|
-|Azure Data Lake Storage Gen 2 | Self-Hosted IR| Account Key|
-|Azure SQL Database | Self-Hosted IR| SQL Authentication|
-|Azure SQL Database | Self-Hosted IR| Service Principal|
-|Azure SQL Managed Instance | Self-Hosted IR| SQL Authentication|
-|Azure Cosmos DB| Self-Hosted IR| Account Key|
-|SQL Server | Self-Hosted IR| SQL Authentication|
-|Azure Synapse Analytics | Self-Hosted IR| Service Principal|
-|Azure Synapse Analytics | Self-Hosted IR| SQL Authentication|
-|Power BI tenant (Same tenant) |Self-Hosted IR| Delegated Auth|
-
-## Frequently Asked Questions
-
-For FAQs related to private endpoint deployments in Microsoft Purview, see [FAQ about Microsoft Purview private endpoints](./catalog-private-link-faqs.md).
-
-## Troubleshooting guide
-For troubleshooting private endpoint configuration for Microsoft Purview accounts, see [Troubleshooting private endpoint configuration for Microsoft Purview accounts](./catalog-private-link-troubleshoot.md).
-
-## Known limitations
-To view the list of current limitations related to Microsoft Purview private endpoints, see [Microsoft Purview private endpoints known limitations](./catalog-private-link-troubleshoot.md#known-limitations).
-
-## Next steps
-- [Deploy end-to-end private networking](./catalog-private-link-end-to-end.md)
-- [Deploy private networking for the Microsoft Purview governance portal](./catalog-private-link-account-portal.md)
purview Classification Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/classification-insights.md
- Title: Classification reporting on your data in Microsoft Purview using Microsoft Purview Data Estate Insights
-description: This how-to guide describes how to view and use Microsoft Purview classification reporting on your data.
-Previously updated: 05/16/2022
-# Customer intent: As a security officer, I need to understand how to use Microsoft Purview Data Estate Insights to learn about sensitive data identified, classified, and labeled during scanning.
--
-# Classification insights about your data in Microsoft Purview
-
-This guide describes how to access, view, and filter Microsoft Purview Classification insight reports for your data.
-
-In this guide, you'll learn how to:
-
-> [!div class="checklist"]
-> - Launch your Microsoft Purview account from Azure
-> - View classification insights on your data
-> - Drill down for more classification details on your data
-
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights, make sure that you've completed the following steps:
-
-* Set up a storage resource and populated the account with data.
-
-* Set up and completed a scan on the data in each data source. For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
-
-* Signed in to Microsoft Purview with an account that has a [Data Curator or Insights Reader role](catalog-permissions.md#roles).
--
-## Use Microsoft Purview Data Estate Insights for classifications
-
-In Microsoft Purview, classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
-
-Microsoft Purview uses the same sensitive information types as Microsoft 365, allowing you to extend your existing security policies and protection across your entire data estate.
-
-> [!NOTE]
-> After you have scanned your source types, give **classification insights** a couple of hours to reflect the new assets.
-
-**To view classification insights:**
-
-1. Go to the **Microsoft Purview** [instance screen in the Azure portal](https://aka.ms/purviewportal) and select your Microsoft Purview account.
-
-1. On the **Overview** page, in the **Get Started** section, select the **Microsoft Purview governance portal** tile.
-
-1. In Microsoft Purview, select the **Data Estate Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Data Estate Insights** area.
-
-1. In the **Data Estate Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Classifications** to display the Microsoft Purview **Classification insights** report.
-
- :::image type="content" source="./media/insights/select-classification-labeling.png" alt-text="Screenshot of the classification insights report." lightbox="media/insights/select-classification-labeling.png":::
-
- The main **classification insights** page displays the following areas:
-
- |Area |Description |
- |||
- |**Overview of sources with classifications** |Displays tiles that provide: <br>- The number of subscriptions found in your data <br>- The number of unique classifications found in your data <br>- The number of classified sources found <br>- The number of classified files found <br>- The number of classified tables found |
- |**Top sources with classified data (last 30 days)** |Shows the trend, over the past 30 days, of the number of sources found with classified data. |
- |**Top classification categories by sources** |Shows the number of sources found by classification category, such as **Financial** or **Government**. |
- |**Top classifications for files** |Shows the top classifications applied to files in your data, such as credit card numbers or national/regional identification numbers. |
- |**Top classifications for tables** | Shows the top classifications applied to tables in your data, such as personal identifying information. |
- | **Classification activity** <br>(files and tables) | Displays separate graphs for files and tables, each showing the number of files or tables classified over the selected timeframe. <br>**Default**: 30 days<br>Select the **Time** filter above the graphs to select a different time frame to display. |
- | | |
-
-## Classification insights drilldown
-
-In any of the following **Classification insights** graphs, select the **View details** link to drill down for more details:
-- **Top classification categories by sources**
-- **Top classifications for files**
-- **Top classifications for tables**
-- **Classification activity > Classification data**
-For example:
--
-Do any of the following to learn more:
-
-|Option |Description |
-|||
-|**Filter your data** | Use the filters above the grid to filter the data shown, including the classification name, subscription name, or source type. <br><br>If you're not sure of the exact classification name, you can enter part or all of the name in the **Filter by keyword** box. |
-|**Sort the grid** |Select a column header to sort the grid by that column. |
-|**Edit columns** | To display more or fewer columns in your grid, select **Edit Columns** :::image type="icon" source="media/insights/ico-columns.png" border="false":::, and then select the columns you want to view or change the order. |
-|**Drill down further** | To drill down to a specific classification, select a name in the **Classification** column to view the **Classification by source** report. <br><br>This report displays data for the selected classification, including the source name, source type, subscription ID, and the numbers of classified files and tables. |
-|**Browse assets** | To browse through the assets found with a specific classification or source, select a classification or source, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
-| | |
-
-## Next steps
-
-Learn how to use Data Estate Insights with resources below:
-
-* [Learn how to use Asset insights](asset-insights.md)
-* [Learn how to use Data Stewardship](data-stewardship.md)
-* [Learn how to use Classification insights](classification-insights.md)
-* [Learn how to use Glossary insights](glossary-insights.md)
-* [Learn how to use Label insights](sensitivity-insights.md)
purview Concept Asset Normalization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-asset-normalization.md
- Title: Asset normalization
-description: Learn how Microsoft Purview prevents duplicating assets in your data map through asset normalization.
-Previously updated: 05/26/2023
-# Asset normalization
-
-When ingesting assets into the Microsoft Purview data map, different sources updating the same data asset may send similar, but slightly different qualified names. While these qualified names represent the same asset, slight differences such as an extra character may cause these assets on the surface to appear different and cause duplicate entries in Microsoft Purview. To avoid storing duplicate entries and causing confusion when consuming the data catalog, Microsoft Purview applies normalization during ingestion to ensure all fully qualified names of the same entity type are in the same format.
-
-For example, you scan in an Azure Blob with the qualified name `https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet`. This blob is also consumed by an Azure Data Factory pipeline that will then add lineage information to the asset. The ADF pipeline may be configured to read the file as `https://myAccount.file.core.windows.net//myshare/folderA/folderB/my-file.parquet`. While the qualified name is different, this ADF pipeline is consuming the same piece of data. Normalization ensures that all the metadata from both Azure Blob Storage and Azure Data Factory is visible on a single asset, `https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet`.
-
->[!IMPORTANT]
->The rules listed below are the only kinds of potential duplication Microsoft Purview currently recognizes. If you're experiencing accidental asset duplication, compare the assets' fully qualified names to check for capitalization differences or additional characters. Update any ingestion points, for example your ADF pipelines, so that the qualified names match.
-
-## Normalization rules
-
-Below are the normalization rules applied by Microsoft Purview.
-
-### Encode curly brackets
-Applies to: All Assets
-
-Before: `https://myaccount.file.core.windows.net/myshare/{folderA}/folder{B/`
-
-After: `https://myaccount.file.core.windows.net/myshare/%7BfolderA%7D/folder%7BB/`
-
-### Trim section spaces
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure Data Factory, Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool, Azure Cosmos DB, Azure Cognitive Search, Azure Data Explorer, Azure Data Share, Amazon S3
-
-Before: `https://myaccount.file.core.windows.net/myshare/ folder A/folderB /`
-
-After: `https://myaccount.file.core.windows.net/myshare/folder A/folderB/`
-
-### Remove hostname spaces
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool, Azure Cosmos DB, Azure Cognitive Search, Azure Data Explorer, Azure Data Share, Amazon S3
-
-Before: `https://myaccount .file. core.win dows. net/myshare/folderA/folderB/`
-
-After: `https://myaccount.file.core.windows.net/myshare/folderA/folderB/`
-
-### Remove square brackets
-Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool
-
-Before: `mssql://foo.database.windows.net/[bar]/dbo/[foo bar]`
-
-After: `mssql://foo.database.windows.net/bar/dbo/foo%20bar`
-
-> [!NOTE]
-> Spaces between two square brackets will be encoded
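As an illustration of this rule, the bracket stripping and space encoding might be sketched like so in Python (a hypothetical helper for illustration, not the service's actual code):

```python
from urllib.parse import quote

def normalize_sql_name_part(part: str) -> str:
    """Strip surrounding [brackets] and percent-encode any spaces inside them."""
    if part.startswith("[") and part.endswith("]"):
        part = part[1:-1]
    return quote(part)

# The Before/After example from this section:
parts = ["[bar]", "dbo", "[foo bar]"]
print("mssql://foo.database.windows.net/" + "/".join(normalize_sql_name_part(p) for p in parts))
# mssql://foo.database.windows.net/bar/dbo/foo%20bar
```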
-
-### Lowercase scheme
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool, Azure Cosmos DB, Azure Cognitive Search, Azure Data Explorer, Amazon S3
-
-Before: `HTTPS://myaccount.file.core.windows.net/myshare/folderA/folderB/`
-
-After: `https://myaccount.file.core.windows.net/myshare/folderA/folderB/`
-
-### Lowercase hostname
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool, Azure Cosmos DB, Azure Cognitive Search, Azure Data Explorer, Amazon S3
-
-Before: `https://myAccount.file.Core.Windows.net/myshare/folderA/folderB/`
-
-After: `https://myaccount.file.core.windows.net/myshare/folderA/folderB/`
-
-### Lowercase file extension
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Amazon S3
-
-Before: `https://myAccount.file.core.windows.net/myshare/folderA/data.TXT`
-
-After: `https://myaccount.file.core.windows.net/myshare/folderA/data.txt`
-
-### Remove duplicate slash
-Applies to: Azure Blob, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure Data Factory, Azure SQL Database, Azure SQL Managed Instance, Azure SQL pool, Azure Cosmos DB, Azure Cognitive Search, Azure Data Explorer, Azure Data Share, Amazon S3
-
-Before: `https://myAccount.file.core.windows.net//myshare/folderA////folderB/`
-
-After: `https://myaccount.file.core.windows.net/myshare/folderA/folderB/`
-
-### Convert to ADL scheme
-Applies to: Azure Data Lake Storage Gen1
-
-Before: `https://mystore.azuredatalakestore.net/folderA/folderB/abc.csv`
-
-After: `adl://mystore.azuredatalakestore.net/folderA/folderB/abc.csv`
-
-### Remove Trailing Slash
-Remove the trailing slash from higher-level assets for Azure Blob, ADLS Gen1, and ADLS Gen2.
-
-Applies to: Azure Blob, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2
-
-Asset types: "azure_blob_container", "azure_blob_service", "azure_storage_account", "azure_datalake_gen2_service", "azure_datalake_gen2_filesystem", "azure_datalake_gen1_account".
-
-Before: `https://myaccount.core.windows.net/`
-
-After: `https://myaccount.core.windows.net`
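Several of the rules above can be combined into a single normalization pass. Here's a minimal Python sketch for illustration only; the service's actual implementation and rule ordering aren't documented here, and this sketch applies the trailing-slash rule unconditionally rather than only to container-level assets:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_qualified_name(name: str) -> str:
    """Approximate a few Purview normalization rules (illustrative only)."""
    scheme, netloc, path, query, frag = urlsplit(name)
    scheme = scheme.lower()                   # "Lowercase scheme"
    netloc = netloc.replace(" ", "").lower()  # "Remove hostname spaces", "Lowercase hostname"
    while "//" in path:                       # "Remove duplicate slash"
        path = path.replace("//", "/")
    # "Trim section spaces": strip leading/trailing spaces from each path segment.
    path = "/".join(seg.strip() for seg in path.split("/"))
    # "Encode curly brackets".
    path = path.replace("{", "%7B").replace("}", "%7D")
    # "Remove trailing slash" (simplified: applied to every asset here).
    path = path.rstrip("/")
    return urlunsplit((scheme, netloc, path, query, frag))

print(normalize_qualified_name("https://myAccount.file.core.windows.net//myshare/folderA/folderB/my-file.parquet"))
# https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet
```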
-## Next steps
-
-[Scan in an Azure Blob Storage](register-scan-azure-blob-storage-source.md) account into the Microsoft Purview data map.
purview Concept Best Practices Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-accounts.md
- Title: Microsoft Purview (formerly Azure Purview) accounts architecture and best practices
-description: This article provides examples of account architectures and describes best practices for deploying Microsoft Purview (formerly Azure Purview).
-Previously updated: 06/07/2023
-# Microsoft Purview accounts architectures and best practices
-
-To enable [Microsoft Purview governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions), like Microsoft Purview Data Map and Data Catalog, in your environment, [you'll deploy a Microsoft Purview (formerly Azure Purview) account in the Azure portal](create-microsoft-purview-portal.md). You'll use this account to centrally manage data governance across your data estate, spanning both cloud and on-premises environments. To use Microsoft Purview as your centralized data governance solution, you may need to deploy one or more Microsoft Purview accounts inside your Azure subscription. We recommend keeping the number of Microsoft Purview instances to a minimum; however, in some cases more Microsoft Purview instances are needed to fulfill business security and compliance requirements.
-
-## Single Microsoft Purview account
-
-Consider deploying the minimum number of Microsoft Purview (formerly Azure Purview) accounts for the entire organization. This approach takes maximum advantage of the "network effects", where the value of the platform increases exponentially as a function of the data that resides inside the platform.
-
-Use [Microsoft Purview Data Map collections hierarchy](./concept-best-practices-collections.md) to lay out your organization's data management structure inside a single Microsoft Purview account. In this scenario, one account is deployed in an Azure subscription. Data sources from one or more Azure subscriptions can be registered and scanned inside Microsoft Purview. You can also register and scan data sources from your on-premises or multicloud environments.
--
-## Multiple Microsoft Purview accounts
-
-Some organizations may require setting up multiple Microsoft Purview accounts. Review the following scenarios as a few examples to consider when defining your Microsoft Purview accounts architecture.
-
-### Tag your accounts
-
-When you use or create multiple Microsoft Purview accounts in your environment, use the tagging system in Azure to define them. You can add a tag when you create the resource under the Tags tab, or you can [add a tag later in the Azure portal using the Tags page in the resource](/azure/azure-resource-manager/management/tag-resources-portal).
-
-Add a tag called **Purview environment**, and give it one of the below values:
-
-|Value |Meaning |
-|-|--|
-|Production|This account is being used or will be used in the future to support all my cataloging and governance requirements in production.|
-|Pre-Production|This account is being used or will be used in the future to validate cataloging and governance requirements before making it available to my users in production.|
-|Test|This account is being used or will be used in the future to test out capabilities in Microsoft Purview Governance. |
-|Dev|This account is being used or will be used in the future to test out capabilities or develop custom code, scripts etc. in Microsoft Purview Governance.|
-|Proof of Concept|This account is being used or will be used in the future to test out capabilities or develop custom code, scripts etc. in Microsoft Purview Governance.|
-
-### Testing new features
-
-It's recommended to create a new account when testing scan configurations or classifications in isolated environments. Some areas of the platform, such as the glossary, have a "versioning" feature; even so, it's easier to have a "disposable" instance of Microsoft Purview to freely test expected functionality and then plan to roll out the feature into the production instance.
-
-Additionally, consider using a test Microsoft Purview account when you can't perform a rollback. For example, currently you can't remove a glossary term attribute from a Microsoft Purview instance once it's added to your Microsoft Purview account. In this case, it's recommended to use a test Microsoft Purview account first.
-
-### Isolating Production and nonproduction environments
-
-Consider deploying separate instances of Microsoft Purview accounts for development, testing, and production environments, especially when you have separate instances of data for each environment.
-
-In this scenario, production and nonproduction data sources can be registered and scanned inside their corresponding Microsoft Purview instances.
-
-Optionally, you can register a data source in more than one Microsoft Purview instance, if needed.
--
-### Fulfilling compliance requirements
-
-When you scan data sources in the Microsoft Purview Data Map, information related to your metadata is ingested and stored inside your data map in the Azure region where your Microsoft Purview account is deployed. Consider deploying separate instances of Microsoft Purview if you have specific regulatory and compliance requirements that include even having metadata in a specific geographical location.
-
-If your organization has data in multiple geographies and you must keep metadata in the same region as the actual data, you'll have to deploy multiple Microsoft Purview instances, one for each geography. In this case, data sources from each region should be registered and scanned in the Microsoft Purview account that corresponds to the data source region or geography.
--
-### Having Data sources distributed across multiple tenants
-
-Currently, Microsoft Purview doesn't support multi-tenancy. If you have Azure data sources distributed across multiple Azure subscriptions under different Azure Active Directory tenants, it's recommended to deploy separate Microsoft Purview accounts under each tenant.
-
-An exception applies to VM-based data sources and Power BI tenants. For more information about how to scan and register a cross-tenant Power BI in a single Microsoft Purview account, see [Register and scan a cross-tenant Power BI](./register-scan-power-bi-tenant-cross-tenant.md).
--
-## Default Microsoft Purview account
-
-Having multiple Microsoft Purview accounts in a tenant poses the challenge of deciding which Microsoft Purview account all other services, such as a Power BI tenant or Azure Synapse, should connect to.
-
-This is where the default Microsoft Purview account helps. An Azure global administrator (or tenant admin) can designate a Microsoft Purview account as the **default** Microsoft Purview account at the tenant level. At any point in time, a tenant can have at most one default account. Once the default is set, any user in your organization knows that this account is the "right" one to connect to.
-
-### Manage default account for tenant
-
-* You can set the default flag to 'Yes' only after the account is created.
-
-* Setting the wrong default account can have security implications, so only an Azure global administrator at the tenant level (tenant admin) can set the default account flag to 'Yes'.
-
-* Changing the default account is a two-step process. First, set the flag to 'No' on the current default Microsoft Purview account, and then set the flag to 'Yes' on the new Microsoft Purview account.
-
-* Setting the default account is a control plane operation, so the Microsoft Purview governance portal doesn't change when an account is defined as default. However, in the portal you can see that the account name is appended with "(default)" for the default Microsoft Purview account.
-
-## Billing model
-
-Review the [Microsoft Purview pricing model](https://azure.microsoft.com/pricing/details/azure-purview) when defining your budgeting model and designing an architecture for your organization. One bill is generated for each Microsoft Purview account, in the subscription where the account is deployed. This model also applies to other Microsoft Purview costs, such as scanning and classifying metadata inside the Microsoft Purview Data Map.
-
-Some organizations have many business units (BUs) that operate separately and, in some cases, don't even share billing with each other. In those cases, the organization ends up creating a Microsoft Purview instance for each BU.
-
-For more information about cloud computing cost models, including chargeback and showback, see [What is cloud accounting?](/azure/cloud-adoption-framework/strategy/cloud-accounting).
-
-## Selecting an Azure region
-
-Microsoft Purview is an Azure platform as a service solution. You can deploy a Microsoft Purview account inside your Azure subscription in any
-[supported Azure regions](https://azure.microsoft.com/explore/global-infrastructure/products-by-region/?products=purview&regions=all).
-
-If Microsoft Purview isn't available in your primary Azure region, consider the following factors when choosing a secondary region to deploy your Microsoft Purview account:
-
-- Review the latency between your primary Azure region, where data sources are deployed, and your secondary Azure region, where your Microsoft Purview account will be deployed. For more information, see [Azure network round-trip latency statistics](../networking/azure-network-latency.md).
-
-- Review your data residency requirements. When you scan data sources in the Microsoft Purview Data Map, information related to your metadata is ingested and stored inside your data map in the Azure region where your Microsoft Purview account is deployed. For more information, see [Where is metadata stored](concept-best-practices-security.md#where-is-metadata-stored).
-
-- Review your network and security requirements if private network connectivity for user access or metadata ingestion is required. For more information, see [If Microsoft Purview isn't available in your primary region](concept-best-practices-network.md#if-microsoft-purview-isnt-available-in-your-primary-region).
-
-## Other considerations and recommendations
-
-- Keep the number of Microsoft Purview accounts low for simplified administrative overhead. If you plan to build multiple Microsoft Purview accounts, you may need to create and manage extra scans, access control models, credentials, and runtimes across your Microsoft Purview accounts. Additionally, you may need to manage classifications and glossary terms for each Microsoft Purview account.
-
-- Review your budgeting and financial requirements. If possible, use a chargeback or showback model when using Azure services, and divide the cost of Microsoft Purview across the organization to keep the number of Microsoft Purview accounts to a minimum.
-
-- Use [collections](concept-best-practices-collections.md) to define metadata access control inside the Microsoft Purview Data Map for your organization's business users, data management, and governance teams. For more information, see [Access control in Microsoft Purview](./catalog-permissions.md).
-
-- Review [Microsoft Purview limits](./how-to-manage-quotas.md#microsoft-purview-limits) before deploying any new Microsoft Purview accounts. Currently, the default limit of Microsoft Purview accounts per region, per tenant (all subscriptions combined) is 3. You may need to contact Microsoft support to increase this limit in your subscription or tenant before deploying extra instances of Microsoft Purview.
-
-- Review [Microsoft Purview prerequisites](./create-catalog-portal.md#prerequisites) before deploying any new Microsoft Purview accounts in your environment.
-
-## Next steps
-- [Create a Microsoft Purview account](./create-catalog-portal.md)
purview Concept Best Practices Annotating Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-annotating-data.md
- Title: Best practices for describing data in Microsoft Purview
-description: Microsoft Purview provides a variety of ways to annotate and organize your data. This article covers best practices for using tags, terms, managed attributes, and business assets.
- Previously updated: 04/24/2023
-# Describing metadata in Microsoft Purview
-
-Microsoft Purview provides a variety of ways to annotate and organize your data. You can use tags, terms, managed attributes, and business assets.
-
-But it may not always be obvious when to use which feature. If you want to show that a data set is published by your accounting team, should you tag it? Assign a managed attribute called account team? What about using a term called accounting? Or maybe you should create a relationship to a department asset called accounting?
-
-There's no one right way to add context to your data, but here are some best practices for tags, managed attributes, business terms, and business assets.
-
-## Best practices for using tags
-
-Use tags when you want to quickly label your data assets without the need for consistency or control. Tags are simple keywords or phrases that can be applied to data assets to provide quick, informal metadata. They're useful for categorizing data assets, making them easier to discover and understand. They're also a great way to see how your data consumers describe your data so you can incorporate this language into your business glossary over time.
-
-In the following example, I've tagged a few assets with Q4 Revenue so I can easily find the data assets I plan to use for a new report with this information. Searching the keyword returns all data with that tag applied:
--
-## Best practices for using managed attributes
-Use managed attributes to extend the fields available for an asset in Purview. Managed attributes are key-value pairs that add structured metadata to your data catalog. When Purview scans data, it adds technical information about the data, like data type and classification. If you want to add more fields, you'll need to define managed attributes.
-
-In the following example, I add a managed attribute that lets me tag tables with the department that publishes them. I use a managed attribute because I want to make sure assets are always tagged in exactly the same way with this information. I also want to filter by the publisher field when I search for data.
--
-The managed attribute in this example helps people quickly find all data published by the supply chain team, but doesn't help someone understand the definition of a publisher or what it means if supply chain is the publisher of the data. For any information that needs a business explanation, we use terms.
-
-## Best practices for using business terms
-
-Use business terms to define a shared vocabulary for your organization. By creating terms, identifying their synonyms, acronyms, related terms, and more, you can create a flexible controlled taxonomy organized in a hierarchical way. Glossaries of terms help bridge the communication gap between various departments in your company by providing consistent definitions for concepts, metrics, and other important elements across the organization.
-
-I assign the term order to this table, because it contains order information.
--
-I use a term so that anyone who finds this data can go to the term to explore the business definition of an order:
--
-## Best practice for using business assets
-
-Finally, you can extend Purview's metamodel by creating additional asset types that describe real-world things in your organization, such as departments, projects, products, and lines of business. When you look at your data estate, it's often helpful to understand how your data fits into your business. Use business assets whenever you want to associate data assets with specific organizational structures, business processes, or anything else that could be convincingly modeled as an entity.
-
-In the example below, I describe more business context for the SalesOrderDetail table by showing that the Supply chain department (a business asset) manages the order fulfillment business process (a business asset), which uses the SalesOrderDetail table. Visualizing business context in this way can help others identify the "official" dataset that's used for a particular business purpose and understand whether data is being used compliantly.
--
-## Next steps
-
-Learn more about organizing your data in Microsoft Purview: [Govern your domains with Microsoft Purview: best practices for using collections, glossary, and business context](/azure/purview/concept-best-practices-governing-domains)
--
purview Concept Best Practices Asset Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-asset-lifecycle.md
- Title: Microsoft Purview asset management processes
-description: This article provides process and best practice guidance to effectively manage the lifecycle of assets in the Microsoft Purview Data Catalog.
- Previously updated: 05/26/2023
-# Business processes for managing data effectively
-
-As data and content have a lifecycle that requires active management (for example, acquisition - processing - disposal), assets in the Microsoft Purview Data Catalog need active management in a similar way. "Assets" in the catalog include the technical metadata that describes collection, lineage, and scan information. Metadata describing the business structure of data, such as glossary terms, classifications, and ownership, also needs to be managed.
-
-To manage data assets, responsible people in the organization must understand how and when to apply data governance processes and manage workflows.
-
-## Why do you need business processes for managing assets in Microsoft Purview data governance?
-
-An organization employing [Microsoft Purview data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions) should define processes and a people structure to manage the lifecycle of assets and ensure data is valuable to users of the catalog. Metadata in the catalog must be maintained so that data can be managed at scale for discovery, quality, security, and privacy.
-
-### Benefits
-- Agreed definition and structure of data. This is required for the Microsoft Purview Data Catalog to provide effective data search and protection functionality at scale across your organization's data estate.
-
-- A well-defined process for asset lifecycle management. This is key to maintaining accurate asset metadata, which improves the usability of the catalog and the ability to protect relevant data.
-
-- Business users looking for data are more likely to use the catalog to search for data when it's maintained using data governance processes.
-
-### Best practice processes to consider when starting the data governance journey with Microsoft Purview
-
-- **Capture and maintain assets** - Understand how to initially structure and record assets in the catalog for management.
-- **Glossary and classification management** - Understand how to effectively manage the catalog metadata applied to ingested assets, and how to maintain a business glossary and custom classifications.
-- **Moving and deleting assets** - Understand how to manage collections and assets by moving assets from one collection to another or deleting asset metadata from Microsoft Purview.
-
-## Data curator organizational personas
-
-The [Data Curator](catalog-permissions.md) role in Microsoft Purview controls read/write permission to assets within a collection group. To support the data governance processes, grant the Data Curator role to the separate data governance personas in the organization:
-
-> [!Note]
-> The four **personas** listed are suggested read/write users, and would all be assigned the Data Curator role in Microsoft Purview.
-- Data Owner or Data Expert:
-
- - A Data Owner is typically a senior business stakeholder with authority and budget who is accountable for overseeing the quality and protection of a data subject area. This person is accountable for making decisions on who has the right to access data and how it's used.
-
- - A Data Expert is an individual who is an authority in the business process, data manufacturing process or data consumption patterns.
-- Data Steward or Data Custodian:
-
- - A Data Steward is typically a business professional responsible for overseeing the definition, quality and management of a data subject area or data entity. They're typically experts in the data domain and work with other data stewards to make decisions on how to apply all aspects of data management.
-
- - A Data Custodian is an individual responsible for performing one or more data controls.
-
-## 1. Capture and maintain assets
-
-This process describes the high-level steps and suggested roles to capture and maintain assets in the Microsoft Purview Data Catalog.
--
-### Process Guidance
-
-| Process Step | Guidance |
-| | -- |
-| 1 | [Microsoft Purview collections architecture and best practices](concept-best-practices-collections.md) |
-| 2 | [How to create and manage collections](how-to-create-and-manage-collections.md)
-| 3 & 4 | [Understand Microsoft Purview access and permissions](catalog-permissions.md)
-| 5 | [Microsoft Purview supported sources](purview-connector-overview.md) <br> [Microsoft Purview private endpoint networking](catalog-private-link.md) |
-| 6 | [How to manage data sources in Microsoft Purview](manage-data-sources.md)
-| 7 | [Best practices for scanning data sources in Microsoft Purview](concept-best-practices-scanning.md)
-| 8, 9 & 10 | [Search the data catalog](how-to-search-catalog.md) <br> [Browse the data catalog](how-to-browse-catalog.md)
-
-## 2. Glossary and classification maintenance
-
-This process describes the high-level steps and roles to manage and define the business glossary and classifications metadata to enrich the Microsoft Purview Data Catalog.
--
-### Process Guidance
-
-| Process Step | Guidance |
-| | -- |
-| 1 & 2 | [Understand Microsoft Purview access and permissions](catalog-permissions.md) |
-| 3 | [Create custom classifications and classification rules](create-a-custom-classification-and-classification-rule.md)
-| 4 | [Create a scan rule set](create-a-scan-rule-set.md)
-| 5 & 6 | [Apply classifications to assets](apply-classifications.md)
-| 7 & 8 | [Understand business glossary features](concept-business-glossary.md)
-| 9 & 10 | [Create and manage glossary terms](how-to-create-manage-glossary-term.md)
-| 11 | [Search the Data Catalog](how-to-search-catalog.md)
-| 12 & 13 | [Browse the Data Catalog](how-to-browse-catalog.md)
-
-> [!Note]
-> It is not currently possible to edit glossary term attributes (for example, Status) in bulk using the Microsoft Purview UI, but it is possible to export the glossary in bulk, edit in Excel and re-import with amendments.
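
As a sketch of that export-edit-reimport flow, the snippet below bulk-edits the Status column of an exported glossary file before re-import. The "Status" column name and a CSV layout are assumptions for illustration; match them to the header row of your actual export.

```python
import csv

def bulk_update_status(in_path: str, out_path: str, new_status: str = "Approved") -> int:
    """Rewrite the Status column of an exported glossary CSV.

    The "Status" column name is an assumption; check the header row of
    your actual glossary export before running this.
    """
    with open(in_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        # Overwrite the attribute for every exported term.
        row["Status"] = new_status
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

You would then re-import the edited file through the glossary's bulk import.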
-
-## 3. Moving assets between collections
-
-This process describes the high-level steps and roles to move assets between collections using the Microsoft Purview governance portal.
--
-### Process Guidance
-
-| Process Step | Guidance |
-| | -- |
-| 1 & 2 | [Microsoft Purview collections architecture and best practice](concept-best-practices-collections.md) |
-| 3 | [Create a collection](quickstart-create-collection.md)
-| 4 | [Understand access and permissions](catalog-permissions.md)
-| 5 | [How to manage collections](how-to-create-and-manage-collections.md#add-assets-to-collections)
-| 6 | [Check collection permissions](how-to-create-and-manage-collections.md#prerequisites)
-| 7 | [Browse the Microsoft Purview Catalog](how-to-browse-catalog.md)
-
-> [!Note]
-> It is not currently possible to bulk move assets from one collection to another using the Microsoft Purview governance portal.
-
-## 4. Deleting asset metadata
-
-This process describes the high-level steps and roles to delete asset metadata from the data catalog using the Microsoft Purview governance portal.
-
-Asset metadata may need to be deleted manually for many reasons:
-- To remove asset metadata where the data is deleted (if a full rescan isn't performed)
-- To remove asset metadata where the data is purged according to its retention period
-- To reduce or manage the size of the data map
-
-> [!Note]
-> Before deleting assets, please refer to the how-to guide to review considerations: [How to delete assets](catalog-asset-details.md#delete-asset)
--
-### Process Guidance
-
-| Process Step | Guidance |
-| | -- |
-| 1 & 2 | Manual steps |
-| 3 | [Data catalog lineage user guide](catalog-lineage-user-guide.md)
-| 4 | Manual step
-| 5 | [How to view, edit and delete assets](catalog-asset-details.md#delete-asset)
-| 6 | [Scanning best practices](concept-best-practices-scanning.md)
-
-> [!Note]
-> - Deleting a collection, registered source or scan from Microsoft Purview does not delete all associated asset metadata.
-> - It is not possible to bulk delete asset metadata using the Microsoft Purview governance portal.
-> - Deleting the asset metadata does not delete all associated lineage or other relationship data (for example, glossary or classification assignments) about the asset from the data map. The asset information and relationships will no longer be visible in the portal.
-
-## Next steps
-- [Microsoft Purview accounts architectures and best practices](concept-best-practices-accounts.md)
-- [Microsoft Purview collections architectures and best practices](concept-best-practices-collections.md)
-- [Microsoft Purview glossary best practices](concept-best-practices-glossary.md)
-- [Microsoft Purview classifications best practices](concept-best-practices-classification.md)
purview Concept Best Practices Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-automation.md
- Title: Microsoft Purview automation best practices
-description: This article provides an overview of Microsoft Purview automation tools and guidance on what to use when.
- Previously updated: 12/09/2022
-# Microsoft Purview automation best practices
-
-While [Microsoft Purview governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions) provide an out of the box user experience with the Microsoft Purview governance portal, not all tasks are suited to the point-and-click nature of the graphical user experience.
-
-For example:
-* Triggering a scan to run as part of an automated process.
-* Monitoring for metadata changes in real time.
-* Building your own custom user experience.
-
-Microsoft Purview provides several tools that we can use to interact with the underlying platform in an automated and programmatic fashion. Because of the open nature of the Microsoft Purview service, we can automate different aspects, from the control plane, made accessible via Azure Resource Manager, to Microsoft Purview's multiple data planes (catalog, scanning, administration, and more).
-
-This article provides a summary of the options available, and guidance on what to use when.
-
-## Tools
-
-| Tool Type | Tool | Scenario | Management | Catalog | Scanning | Logs |
-| | | | | | | |
**Resource Management** | <ul><li><a href="/azure/templates/microsoft.purview/accounts" target="_blank">ARM Templates</a></li><li><a href="https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/purview_account" target="_blank">Terraform</a></li></ul> | Infrastructure as Code | ✓ | | | |
**Command Line** | <ul><li><a href="/cli/azure/service-page/azure%20purview" target="_blank">Azure CLI</a></li></ul> | Interactive | ✓ | | | |
**Command Line** | <ul><li><a href="/powershell/module/az.purview" target="_blank">Azure PowerShell</a></li></ul> | Interactive | ✓ | ✓ | | |
**API** | <ul><li><a href="/rest/api/purview/" target="_blank">REST API</a></li></ul> | On-Demand | ✓ | ✓ | ✓ | |
**Streaming** (Apache Atlas) | <ul><li><a href="/azure/purview/manage-kafka-dotnet" target="_blank">Event Hubs</a></li></ul> | Real-Time | | ✓ | | |
**Monitoring** | <ul><li><a href="/azure/azure-monitor/essentials/diagnostic-settings?tabs=CMD#destinations" target="_blank">Azure Monitor</a></li></ul> | Monitoring | | | | ✓ |
**SDK** | <ul><li><a href="/dotnet/api/overview/azure/purview" target="_blank">.NET</a></li><li><a href="/java/api/overview/azure/purview" target="_blank">Java</a></li><li><a href="/javascript/api/overview/azure/purview" target="_blank">JavaScript</a></li><li><a href="/python/api/overview/azure/purview" target="_blank">Python</a></li></ul> | Custom Development | ✓ | ✓ | ✓ | |
-
-## Resource Management
-[Azure Resource Manager](../azure-resource-manager/management/overview.md) is a deployment and management service that enables customers to create, update, and delete resources in Azure. When deploying Azure resources repeatedly, ARM templates can be used to ensure consistency; this approach is referred to as infrastructure as code.
-
-To implement infrastructure as code, we can build [ARM templates](../azure-resource-manager/templates/overview.md) using JSON or [Bicep](../azure-resource-manager/bicep/overview.md), or open-source alternatives such as [Terraform](/azure/developer/terraform/overview).
-
-When to use?
-* Scenarios that require repeated Microsoft Purview deployments; templates ensure Microsoft Purview, along with any other dependent resources, is deployed in a consistent manner.
-* When coupled with [deployment scripts](../azure-resource-manager/templates/deployment-script-template.md), templated solutions can traverse the control and data planes, enabling the deployment of end-to-end solutions. For example, create a Microsoft Purview account, register sources, trigger scans.
-
-## Command Line
-Azure CLI and Azure PowerShell are command-line tools that enable you to manage Azure resources such as Microsoft Purview. While the list of commands will grow over time, only a subset of Microsoft Purview control plane operations is currently available. For an up-to-date list of commands currently available, check out the documentation ([Azure CLI](/cli/azure/purview) | [Azure PowerShell](/powershell/module/az.purview)).
-
-* **Azure CLI** - A cross-platform tool that allows the execution of commands through a terminal using interactive command-line prompts or a script. Azure CLI has a **purview extension** that allows for the management of Microsoft Purview accounts. For example, `az purview account`.
-* **Azure PowerShell** - A cross-platform task automation program, consisting of a set of cmdlets for managing Azure resources. Azure PowerShell has a module called **Az.Purview** that allows for the management of Microsoft Purview accounts. For example, `Get-AzPurviewAccount`.
-
-When to use?
-* Best suited for ad-hoc tasks and quick exploratory operations.
-
-## API
-REST APIs are HTTP endpoints that surface different methods (`POST`, `GET`, `PUT`, `DELETE`), triggering actions such as create, read, update, or delete (CRUD). Microsoft Purview exposes a large portion of the Microsoft Purview platform via multiple [service endpoints](/rest/api/purview/).
-
-When to use?
-* Required operations not available via Azure CLI, Azure PowerShell, or native client libraries.
-* Custom application development or process automation.
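
As a minimal sketch of calling a data-plane endpoint, the helper below builds an authenticated request using only the standard library. The search path, api-version, and request-body fields are illustrative assumptions; verify them against the published REST API reference before use.

```python
import json
import urllib.request

def build_search_request(account_name: str, keywords: str, token: str) -> urllib.request.Request:
    """Build a POST request against the catalog search endpoint.

    The path and api-version below are assumptions for illustration;
    check the current REST API reference for your account.
    """
    url = (f"https://{account_name}.purview.azure.com"
           "/catalog/api/search/query?api-version=2022-08-01-preview")
    body = json.dumps({"keywords": keywords, "limit": 10}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # Azure AD access token
            "Content-Type": "application/json",
        },
    )

# To execute (requires a valid token and network access):
# with urllib.request.urlopen(build_search_request("contoso", "orders", token)) as resp:
#     results = json.load(resp)
```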
-
-## Streaming (Apache Atlas)
-
-Each Microsoft Purview account can configure Event Hubs that are accessible via its Atlas Kafka endpoint.
-
-[You can follow these steps to configure the Event Hubs namespaces.](configure-event-hubs-for-kafka.md)
-
->[!NOTE]
->Enabling this Event Hubs namespace does incur a cost for the namespace. For specific details, see [the pricing page](https://azure.microsoft.com/pricing/details/event-hubs/).
-
-Once the namespace is enabled, Microsoft Purview events can be monitored by consuming messages from the event hub. External systems can also use the event hub to publish events to Microsoft Purview as they occur.
-* **Consume Events** - Microsoft Purview will send notifications about metadata changes to Kafka topic **ATLAS_ENTITIES**. Applications interested in metadata changes can monitor for these notifications. Supported operations include: `ENTITY_CREATE`, `ENTITY_UPDATE`, `ENTITY_DELETE`, `CLASSIFICATION_ADD`, `CLASSIFICATION_UPDATE`, `CLASSIFICATION_DELETE`.
-* **Publish Events** - Microsoft Purview can be notified of metadata changes via notifications to Kafka topic **ATLAS_HOOK**. Supported operations include: `ENTITY_CREATE_V2`, `ENTITY_PARTIAL_UPDATE_V2`, `ENTITY_FULL_UPDATE_V2`, `ENTITY_DELETE_V2`.
-
-When to use?
-* Applications or processes that need to publish or consume Apache Atlas events in real time.
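
A consumer of the **ATLAS_ENTITIES** topic typically filters messages by operation type. Here's a hedged sketch of that filtering logic; the envelope shape (a "message" object holding "operationType" and "entity") is an assumption based on Apache Atlas notification payloads, so inspect a real event before relying on it. The Kafka/Event Hubs client itself is omitted.

```python
import json

# Operation types listed above for the ATLAS_ENTITIES topic.
ENTITY_OPERATIONS = {
    "ENTITY_CREATE", "ENTITY_UPDATE", "ENTITY_DELETE",
    "CLASSIFICATION_ADD", "CLASSIFICATION_UPDATE", "CLASSIFICATION_DELETE",
}

def handle_notification(raw_event: str):
    """Parse one Atlas notification; return (operation, entity) or None.

    The payload shape is an assumption - adjust it to the events you
    actually receive from your Event Hubs namespace.
    """
    payload = json.loads(raw_event)
    message = payload.get("message", payload)
    operation = message.get("operationType")
    if operation in ENTITY_OPERATIONS:
        return operation, message.get("entity")
    return None  # ignore operations we don't track
```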
-
-## Monitoring
-
-Microsoft Purview can send platform logs and metrics via "Diagnostic settings" to one or more destinations (Log Analytics Workspace, Storage Account, or Azure Event Hubs). [Available metrics](./how-to-monitor-with-azure-monitor.md#available-metrics) include `Data Map Capacity Units`, `Data Map Storage Size`, `Scan Canceled`, `Scan Completed`, `Scan Failed`, and `Scan Time Taken`.
-
-Once configured, Microsoft Purview automatically sends these events to the destination as a JSON payload. From there, application subscribers that need to consume and act on these events can do so with the option of orchestrating downstream logic.
-
-When to use?
-* Applications or processes that need to consume diagnostic events.
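
For instance, a subscriber might pull failed scans out of an exported payload. The sketch below assumes the standard Azure Monitor "records" envelope; the per-record field names (such as "status") are assumptions you should verify against your own exported logs.

```python
import json

def failed_scans(diagnostic_payload: str) -> list:
    """Return records that describe failed scans.

    The "records" array is the usual Azure Monitor export envelope; the
    "status" field checked here is an assumption - inspect a real
    exported log record for the exact schema.
    """
    payload = json.loads(diagnostic_payload)
    return [
        record for record in payload.get("records", [])
        if record.get("status", "").lower() == "failed"
    ]
```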
-
-## SDK
-Microsoft provides Azure SDKs to programmatically manage and interact with Azure services. Microsoft Purview client libraries are available in several languages (.NET, Java, JavaScript, and Python), designed to be consistent, approachable, and idiomatic.
-
-* [.NET](/dotnet/api/overview/azure/purview)
-* [Java](/java/api/overview/azure/purview)
-* [JavaScript](/javascript/api/overview/azure/purview)
-* [Python](/python/api/overview/azure/purview)
-
-When to use?
-* Recommended over the REST API, as the native client libraries (where available) follow the standard conventions of the target language and feel natural to the developer.
-
-## Next steps
-* [Microsoft Purview REST API](/rest/api/purview)
purview Concept Best Practices Classification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-classification.md
- Title: Classification best practices for the Microsoft Purview governance portal
-description: This article provides best practices for classification in the Microsoft Purview governance portal so you can effectively identify sensitive data across your environment.
- Previously updated: 02/15/2023
-# Classification best practices in the Microsoft Purview governance portal
-
-Data classification in the Microsoft Purview governance portal is a way of categorizing data assets by assigning unique logical labels or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on. To learn more about classification itself, see our [classification article](concept-classification.md).
-
-This article describes best practices to adopt when you're classifying data assets, so that your scans will be more effective and you have the most complete information possible about your entire data estate.
-
-## Scan rule set
-
-By using a *scan rule set*, you can configure the relevant classifications that should be applied to the particular scan for the data source. Select the relevant system classifications, or select custom classifications if you've created one for the data you're scanning.
-
-For example, in the following image, only the specific selected system and custom classifications will be applied for the data source you're scanning (for example, financial data).
-
-
-## Annotation management
-
-While you're deciding on which classifications to apply, we recommend that you:
-
- * Go to **Data Map** > **Annotation management** > **Classifications** pane.
-
- * Review the available system classifications to be applied on the data assets you're scanning. The formal names of system classifications have a *MICROSOFT* prefix.
-
- :::image type="content" source="./media/concept-best-practices/classification-classification-example-4.png" alt-text="Screenshot that shows a list of system classifications on the 'Classifications' pane." lightbox="./media/concept-best-practices/classification-classification-example-4.png":::
-
- * Create a custom classification name, if necessary. Start on this pane, and then go to **Data Map** > **Annotation management** > **Classification rules**. Here, you can create the classification rule for the custom classification name that you created in the preceding step.
-
- :::image type="content" source="./media/concept-best-practices/classification-classification-rules-example-2.png" alt-text="Screenshot that shows the 'Classification rules' pane." lightbox="./media/concept-best-practices/classification-classification-rules-example-2.png":::
-
-## Custom classifications
-
-Create custom classifications only if the available system classifications don't meet your needs.
-
-For the *name* of the custom classification, it's a good practice to use a namespace convention (for example, *\<company name>.\<business unit>.\<custom classification name>*).
-
-As an example, for the custom EMPLOYEE_ID classification for fictitious company Contoso, the name of your custom classification would be CONTOSO.HR.EMPLOYEE_ID, and the friendly name is stored in the system as HR.EMPLOYEE ID.
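
To keep names consistent, you could validate the convention programmatically. This hypothetical helper mirrors the CONTOSO.HR.EMPLOYEE_ID example above; the pattern and prefix-stripping rule are a local convention for illustration, not a Purview API.

```python
import re

# <company>.<business unit>.<custom classification name>, all segments required.
NAME_PATTERN = re.compile(r"^[A-Z0-9_]+\.[A-Z0-9_]+\.[A-Z0-9_]+$")

def friendly_name(full_name: str) -> str:
    """Drop the company prefix, e.g. CONTOSO.HR.EMPLOYEE_ID -> HR.EMPLOYEE_ID.

    The validation pattern is a local naming convention, not something
    Purview enforces.
    """
    if not NAME_PATTERN.match(full_name):
        raise ValueError(f"'{full_name}' does not follow <company>.<unit>.<name>")
    return full_name.split(".", 1)[1]
```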
-
-
-When you create and configure the classification rules for a custom classification, do the following:
-
-* Select the appropriate classification name for which the classification rule is to be created.
-
-* The Microsoft Purview governance portal supports the following two methods for creating custom classification rules:
- * Use the **Regular expression** (regex) method if you can consistently express the data element by using a regular expression pattern or you can generate the pattern by using a data file. Ensure that the sample data reflects the population.
- * Use the **Dictionary** method only if the list of values in the dictionary file represents all possible values of data to be classified and is expected to conform to a given set of data (considering future values as well).
-
- :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-example-6.png" alt-text="Screenshot that shows the 'Regular expression' and 'Dictionary' options for creating custom classification rules." lightbox="./media/concept-best-practices/classification-custom-classification-rule-example-6.png":::
-
* Using the **Regular expression** method:

  * Configure the regex pattern for the data to be classified. Ensure that the regex pattern is generic enough to cover the data being classified.

  * Microsoft Purview also provides a feature to generate a suggested regex pattern. After you upload a sample data file, select one of the suggested patterns, and then select **Add to patterns** to use the suggested data and column patterns. You can modify the suggested patterns, or you can type your own patterns without having to upload a file.

  * You can also configure a column name pattern for the column to be classified, to minimize false positives.

  * Configure the *Minimum match threshold* parameter, which is the percentage of data values that must match the data pattern before the classification is applied. Threshold values range from 1% through 100%. We suggest a value of at least 60% as the threshold to avoid false positives, but you can configure the threshold as necessary for your specific classification scenarios. For example, your threshold might be as low as 1% if you want to detect and apply a classification when any value in the data matches the pattern.

    :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-regular-expressions-example-7.png" alt-text="Screenshot that shows the regex method for creating a custom classification rule." lightbox="./media/concept-best-practices/classification-custom-classification-rule-regular-expressions-example-7.png":::

  * The option to set a minimum match rule is automatically disabled if more than one data pattern is added to the classification rule.

  * Use *Test classification rule* with sample data to verify that the classification rule works as expected. Ensure that the sample data (for example, a .csv file) contains at least three columns, including the column to which the classification is to be applied. If the test is successful, you should see the classification label on the column, as shown in the following image:

    :::image type="content" source="./media/concept-best-practices/classification-test-classification-rule-example-8.png" alt-text="Screenshot that shows classification when the test classification is successful." lightbox="./media/concept-best-practices/classification-test-classification-rule-example-8.png":::
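Conceptually, the regex method combined with the minimum match threshold behaves like the following sketch. This is a simplified illustration, not the scanner's actual implementation; the helper name and sample values are hypothetical.

```python
import re

def classify_column(values, data_pattern, threshold_pct):
    """Return True when enough non-null values fully match the data pattern."""
    non_null = [v for v in values if v]
    if not non_null:
        return False
    matches = sum(1 for v in non_null if re.fullmatch(data_pattern, v))
    return (matches / len(non_null)) * 100 >= threshold_pct

# Hypothetical sample column; 3 of 4 values (75%) match the pattern,
# so a 60% threshold applies the classification.
employee_ids = ["N123AN", "N456AN", "N789AN", "X999ZZ"]
print(classify_column(employee_ids, r"N[0-9]{3}AN", 60))  # True
```

With the same data and an 80% threshold, the column wouldn't be classified, which is why tuning the threshold against representative sample data matters.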
-
* Using the **Dictionary** method:

  * You can use the Dictionary method for enumeration data, or when the dictionary list of possible values is available.

  * This method supports .csv and .tsv files, with a file size limit of 30 megabytes (MB).
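The dictionary method amounts to set membership rather than pattern matching. The following sketch illustrates the idea; the helper, the threshold default, and the department codes are hypothetical, not Purview's internal logic.

```python
def classify_by_dictionary(values, dictionary, threshold_pct=60):
    """Classify a column when enough of its non-null values appear in the dictionary."""
    non_null = [v for v in values if v]
    if not non_null:
        return False
    hits = sum(1 for v in non_null if v in dictionary)
    return (hits / len(non_null)) * 100 >= threshold_pct

# Hypothetical dictionary of department codes, as might be loaded from a .csv file.
departments = {"HR", "FIN", "MKT", "OPS"}
print(classify_by_dictionary(["HR", "FIN", "OPS", "HR"], departments))  # True
```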
-
## Custom classification archetypes

### How the "threshold" parameter works in the regular expression

* Consider the sample source data in the following image. There are five columns, and the custom classification rule should be applied to columns **Sample_col1**, **Sample_col2**, and **Sample_col3** for the data pattern *N{Digit}{Digit}{Digit}AN*.

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-example-source-data-9.png" alt-text="Screenshot that shows example source data." lightbox="./media/concept-best-practices/classification-custom-classification-rule-example-source-data-9.png":::

* The custom classification is named NDDDAN.

* The classification rule (regex for the data pattern) is ^N[0-9]{3}AN$.

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-ndddan-10.png" alt-text="Screenshot that shows a custom classification rule." lightbox="./media/concept-best-practices/classification-custom-classification-ndddan-10.png":::

* The threshold is computed for the "^N[0-9]{3}AN$" pattern, as shown in the following image:

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-threshold-11.png" alt-text="Screenshot that shows thresholds of a custom classification rule." lightbox="./media/concept-best-practices/classification-custom-classification-rule-threshold-11.png":::

  If you have a threshold of 55%, only columns **Sample_col1** and **Sample_col2** will be classified. **Sample_col3** won't be classified, because it doesn't meet the 55% threshold criterion.

  :::image type="content" source="./media/concept-best-practices/classification-test-custom-classification-rule-12.png" alt-text="Screenshot that shows the result of a high-threshold criterion." lightbox="./media/concept-best-practices/classification-test-custom-classification-rule-12.png":::
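The per-column threshold calculation above can be sketched as follows. The column contents are invented stand-ins for the screenshot data, chosen only to reproduce match rates above and below the 55% threshold.

```python
import re

PATTERN = re.compile(r"N[0-9]{3}AN")

def match_percentage(column_values):
    """Percentage of non-null values that fully match the NDDDAN data pattern."""
    non_null = [v for v in column_values if v]
    matches = sum(1 for v in non_null if PATTERN.fullmatch(v))
    return 100 * matches / len(non_null)

# Hypothetical columns loosely modeled on the example source data.
columns = {
    "Sample_col1": ["N123AN", "N456AN", "N789AN"],          # 100% match
    "Sample_col2": ["N123AN", "N456AN", "N789AN", "oops"],  # 75% match
    "Sample_col3": ["N123AN", "x", "y", "z"],               # 25% match
}
threshold = 55
classified = [name for name, vals in columns.items()
              if match_percentage(vals) >= threshold]
print(classified)  # ['Sample_col1', 'Sample_col2']
```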
-
### How to use both data and column patterns

* For the given sample data, where both column **B** and column **C** have similar data patterns, you can classify column **B** based on the data pattern "^P[0-9]{3}[A-Z]{2}$".

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-sample-data-13.png" alt-text="Screenshot that shows sample data." lightbox="./media/concept-best-practices/classification-custom-classification-sample-data-13.png":::

* Use the column pattern along with the data pattern to ensure that only the **Product ID** column is classified.

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-14.png" alt-text="Screenshot that shows a classification rule." lightbox="./media/concept-best-practices/classification-custom-classification-rule-14.png":::

  > [!NOTE]
  > The column pattern is verified as an AND condition with the data pattern.

* Use *Test classification rule* with sample data to verify that the classification rule works as expected.

  :::image type="content" source="./media/concept-best-practices/classification-custom-classification-rule-column-pattern-15.png" alt-text="Screenshot that shows a column pattern." lightbox="./media/concept-best-practices/classification-custom-classification-rule-column-pattern-15.png":::
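The AND condition between column pattern and data pattern can be sketched like this. The helper and column names are illustrative only; the point is that a column whose values match the data pattern is still skipped when its name fails the column pattern.

```python
import re

def column_is_classified(column_name, values, data_pattern,
                         column_pattern, threshold_pct=60):
    """Both the column name pattern AND the data pattern must match."""
    if not re.fullmatch(column_pattern, column_name):
        return False  # column pattern failed; data pattern is never consulted
    non_null = [v for v in values if v]
    matches = sum(1 for v in non_null if re.fullmatch(data_pattern, v))
    return 100 * matches / len(non_null) >= threshold_pct

data_pattern = r"P[0-9]{3}[A-Z]{2}"
column_pattern = r"Product ID"

# A differently named column with similar-looking data isn't classified.
print(column_is_classified("Product ID", ["P123AB", "P456CD"],
                           data_pattern, column_pattern))  # True
print(column_is_classified("Promo Code", ["P123AB", "P456CD"],
                           data_pattern, column_pattern))  # False
```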
-
### How to use multiple column patterns

If there are multiple column name patterns to be matched by the same classification rule, use pipe (|) characters to separate the column names. For example, for columns **Product ID**, **Product_ID**, **ProductID**, and so on, write the column pattern as `Product ID|Product_ID|ProductID`.

For more information, see [regex alternation constructs](/dotnet/standard/base-types/regular-expression-language-quick-reference#alternation-constructs).
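A quick way to sanity-check an alternation column pattern before putting it in a rule:

```python
import re

# Alternation lets one column pattern cover several naming variants.
column_pattern = re.compile(r"Product ID|Product_ID|ProductID")

candidates = ["Product ID", "Product_ID", "ProductID", "Product Name"]
matched = [c for c in candidates if column_pattern.fullmatch(c)]
print(matched)  # ['Product ID', 'Product_ID', 'ProductID']
```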
-
## Classification considerations

Here are some considerations to bear in mind as you're defining classifications:

* To decide what classifications should be applied to the assets prior to scanning, consider how your classifications are to be used. Unnecessary classification labels can look noisy, or even misleading, to data consumers. You can use classifications to:
  * Describe the nature of the data that exists in the data asset or schema being scanned. In other words, classifications should enable customers to identify the content of a data asset or schema from the classification labels as they search the catalog.
  * Set priorities and develop a plan to achieve the security and compliance needs of your organization.
  * Describe the phases in the data preparation processes (raw zone, landing zone, and so on) and assign the classifications to specific assets to mark the phase in the process.

* You can assign classifications at the asset or column level automatically by including relevant classifications in the scan rule, or you can assign them manually after you ingest the metadata into the Microsoft Purview Data Map.
* For automatic assignment, see [supported data stores in the Microsoft Purview governance portal](./azure-purview-connector-overview.md).
* Before you scan your data sources in the Microsoft Purview Data Map, it's important to understand your data and configure the appropriate scan rule set for it (for example, by selecting relevant system classifications, custom classifications, or a combination of both), because the selection can affect your scan performance. For more information, see [supported classifications in the Microsoft Purview governance portal](./supported-classifications.md).
* The Microsoft Purview scanner applies data sampling rules for deep scans (subject to classification) for both system and custom classifications. The sampling rule is based on the type of data source. For more information, see the "Sampling within a file" section in [Supported data sources and file types in Microsoft Purview](./sources-and-scans.md#sampling-within-a-file).

  > [!NOTE]
  > **Distinct data threshold**: This is the total number of distinct data values that need to be found in a column before the scanner runs the data pattern on it. The distinct data threshold has nothing to do with pattern matching, but it's a prerequisite for pattern matching. System classification rules require at least 8 distinct values in each column before subjecting the column to classification. The system requires this value to make sure that the column contains enough data for the scanner to classify it accurately. For example, a column whose rows all contain the value 1 won't be classified. Columns that contain one row with a value, with null values in the rest of the rows, also won't be classified. If you specify multiple patterns, this value applies to each of them.

* The sampling rules apply to resource sets as well. For more information, see the "Resource set file sampling" section in [supported data sources and file types in the Microsoft Purview governance portal](./sources-and-scans.md#resource-set-file-sampling).
* Custom classifications can't be applied to document type assets by using custom classification rules. Classifications for such types can be applied manually only.
* Custom classifications aren't included in any default scan rules. Therefore, if you expect automatic assignment of custom classifications, you must deploy and use a custom scan rule that includes the custom classification when you run the scan.
* If you apply classifications manually from the Microsoft Purview governance portal, those classifications are retained in subsequent scans.
* Subsequent scans won't remove any classifications from assets if they were detected previously, even if the classification rules no longer apply.
* For *encrypted source* data assets, Microsoft Purview picks up only file names, fully qualified names, and schema details for structured file types and database tables. For classification to work, decrypt the encrypted data before you run scans.
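The distinct data threshold described in the note above is a simple prerequisite check that you can reason about like this. The helper is a hypothetical illustration of the rule, not Purview's implementation.

```python
def meets_distinct_threshold(values, minimum_distinct=8):
    """Check the distinct-value prerequisite before any data pattern is run."""
    distinct = {v for v in values if v is not None}
    return len(distinct) >= minimum_distinct

print(meets_distinct_threshold([1] * 20))         # False: only one distinct value
print(meets_distinct_threshold(list(range(10))))  # True: 10 distinct values
```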
## Next steps

- [Automatically apply classifications](./apply-classifications.md)
- [Manually apply classifications](./manually-apply-classifications.md)
- [Create custom classification](./create-a-custom-classification-and-classification-rule.md)
purview Concept Best Practices Collections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-collections.md
Title: Microsoft Purview collections architecture and best practices
description: This article provides examples of Microsoft Purview collections architectures and describes best practices.
Previously updated: 09/13/2022
# Microsoft Purview collections architectures and best practices

At the core of [Microsoft Purview unified data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions), the data map is a platform as a service (PaaS) component that keeps an up-to-date map of assets and their metadata across your data estate. To hydrate the data map, you need to register and scan your data sources. In an organization, there might be thousands of sources of data that are managed and governed by either centralized or decentralized teams.

[Collections](./how-to-create-and-manage-collections.md) in Microsoft Purview support organizational mapping of metadata. By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on how your organization plans to use Microsoft Purview to govern your landscape.

A collection also provides a security boundary for your metadata in the data map. Access to collections, data sources, and metadata is set up and maintained based on the collections hierarchy in Microsoft Purview, following a least-privilege model:

- Users have the minimum amount of access they need to do their jobs.
- Users don't have access to sensitive data that they don't need.
## Why do you need to define collections and an authorization model for your Microsoft Purview account?

Consider deploying collections in Microsoft Purview to fulfill the following requirements:

- Organize data sources, distribute assets, and run scans based on your business requirements, geographical distribution of data, and data management teams, departments, or business functions.
- Delegate ownership of data sources and assets to the corresponding teams by assigning roles to the corresponding collections.
- Search and filter assets by collection.
## Define a collection hierarchy

### Design recommendations

- Review the [Microsoft Purview account best practices](./deployment-best-practices.md#determine-the-number-of-microsoft-purview-instances) and define the adequate number of Microsoft Purview accounts required in your organization before you plan the collection structure.

- We recommend that you design your collection architecture based on the security requirements and the data management and governance structure of your organization. Review the recommended [collections archetypes](#collections-archetypes) in this article.

- For future scalability, we recommend that you create a top-level collection for your organization below the root collection. Assign relevant roles at the top-level collection instead of at the root collection.

- Consider security and access management as part of your design decision-making process when you build collections in Microsoft Purview.

- Each collection has a name attribute and a friendly name attribute. If you use [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/) to deploy a collection, the system automatically assigns a random six-letter name to the collection to avoid duplication. To reduce complexity, avoid using duplicated friendly names across your collections, especially at the same level.

- Currently, a collection name can contain up to 36 characters, and a collection friendly name can have up to 100 characters.

- When you can, avoid duplicating your organizational structure into a deeply nested collection hierarchy. If you can't avoid doing so, be sure to use different names for every collection in the hierarchy to make the collections easy to distinguish.

- Automate the deployment of collections by using the API if you're planning to deploy collections and role assignments in bulk.

- Use a dedicated service principal name (SPN) to run operations on collections and role assignments by using the API. Using an SPN reduces the number of users who have elevated rights and follows least-privilege guidelines.
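Bulk deployment via the API can be sketched as follows. The account name, collection names, and hierarchy here are hypothetical, and the `api-version` value may change over time, so verify the request shape against the current Purview collections REST API reference before using it; sending the request also requires an Azure AD bearer token obtained with your SPN.

```python
import json

# Hypothetical account; check the collections REST API reference for the
# current api-version and request schema.
ACCOUNT = "contoso-purview"
API_VERSION = "2019-11-01-preview"

def build_create_collection_request(name, friendly_name, parent_ref):
    """Build the PUT URL and body for creating one collection under a parent."""
    url = (f"https://{ACCOUNT}.purview.azure.com/collections/{name}"
           f"?api-version={API_VERSION}")
    body = {"friendlyName": friendly_name,
            "parentCollection": {"referenceName": parent_ref}}
    return url, json.dumps(body)

# Deploy a small hierarchy: a top-level collection under the root
# (the root collection's reference name equals the account name),
# then two department collections.
hierarchy = [("contoso", "Contoso", "contoso-purview"),
             ("finance", "Finance", "contoso"),
             ("hr", "HR", "contoso")]
for name, friendly, parent in hierarchy:
    url, body = build_create_collection_request(name, friendly, parent)
    print("PUT", url)  # send with an Azure AD bearer token, e.g. via requests.put
```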
### Design considerations

- Each Microsoft Purview account is created with a default _root collection_. The root collection name is the same as your Microsoft Purview account name. The root collection can't be removed. To change the root collection's friendly name, you can change the friendly name of your Microsoft Purview account from the Microsoft Purview Management center.

- Collections can hold data sources, scans, assets, and role assignments.

- A collection can have as many child collections as needed, but each collection can have only one parent collection. You can't deploy collections above the root collection.

- Data sources, scans, and assets can belong to only one collection.

- A collections hierarchy in a Microsoft Purview account can support as many as 256 collections, with a maximum of eight levels of depth. This doesn't include the root collection.

- By design, you can't register data sources multiple times in a single Microsoft Purview account. This architecture helps to avoid the risk of assigning different levels of access control to a single data source. If multiple teams consume the metadata of a single data source, you can register and manage the data source in a parent collection. You can then create corresponding scans under each subcollection so that relevant assets appear under each child collection.

- Lineage connections and artifacts are attached to the root collection even if the data sources are registered at lower-level collections.

- When you run a new scan, by default, the scan is deployed in the same collection as the data source. You can optionally select a different subcollection to run the scan. As a result, the assets will belong under the subcollection.

- Moving data sources across collections is allowed if the user is granted the Data Source Admin role for the source and destination collections.

- Moving assets across collections is allowed if the user is granted the Data Curator role for the source and destination collections.

- Currently, certain operations, like moving and renaming a collection, aren't allowed.

- You can delete a collection if it doesn't have any assets, associated scans, data sources, or child collections.

- Data sources, scans, and assets must belong to a collection if they exist in the Microsoft Purview data map.
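When planning a hierarchy in bulk, it can help to validate it against the documented limits (256 collections, eight levels of depth below the root) before creating anything. This is a hypothetical helper, not a Purview API:

```python
def validate_hierarchy(parents):
    """parents maps collection name -> parent name; the root's parent is None."""
    if len([c for c, p in parents.items() if p is not None]) > 256:
        raise ValueError("more than 256 collections")

    def depth(name):
        d = 0
        while parents[name] is not None:
            name = parents[name]
            d += 1
        return d

    worst = max(depth(c) for c in parents)
    if worst > 8:
        raise ValueError(f"hierarchy is {worst} levels deep; the maximum is 8")
    return worst

plan = {"root": None, "contoso": "root", "finance": "contoso", "payroll": "finance"}
print(validate_hierarchy(plan))  # 3
```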
## Define an authorization model

Microsoft Purview data-plane roles are managed in Microsoft Purview. After you deploy a Microsoft Purview account, the creator of the account is automatically assigned the following roles at the root collection. You can use [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/) or a programmatic method to directly assign and manage roles in Microsoft Purview.

- **Collection Admins** can edit Microsoft Purview collections and their details and add subcollections. They can also add users to other Microsoft Purview roles on collections where they're admins.
- **Data Source Admins** can manage data sources and data scans.
- **Data Curators** can create, read, modify, and delete catalog data assets and establish relationships between assets.
- **Data Readers** can access but not modify catalog data assets.
### Design recommendations

- Consider implementing [emergency access](/azure/active-directory/users-groups-roles/directory-emergency-access) or a break-glass strategy for the Collection Admin role at your Microsoft Purview root collection level to avoid account-level lockouts. Document the process for using emergency accounts.

  > [!NOTE]
  > In certain scenarios, you might need to use an emergency account to sign in to Microsoft Purview. You might need this type of account to fix organization-level access problems when nobody else can sign in to Microsoft Purview, or when other admins can't accomplish certain operations because of corporate authentication problems. We strongly recommend that you follow Microsoft best practices around implementing [emergency access accounts](/azure/active-directory/users-groups-roles/directory-emergency-access) by using cloud-only users.
  >
  > Follow the instructions in [this article](./concept-account-upgrade.md#what-happens-when-your-upgraded-account-doesnt-have-a-collection-admin) to recover access to your Microsoft Purview root collection if your previous Collection Admin is unavailable.

- Minimize the number of root Collection Admins. Assign a maximum of three Collection Admin users at the root collection, including the SPN and your break-glass accounts. Assign your Collection Admin roles to the top-level collection or to subcollections instead.

- Assign roles to groups instead of individual users to reduce administrative overhead and errors in managing individual roles.

- Assign the service principal at the root collection for automation purposes.

- To increase security, enable Azure AD Conditional Access with multifactor authentication for at least Collection Admins, Data Source Admins, and Data Curators. Make sure emergency accounts are excluded from the Conditional Access policy.
### Design considerations

- Microsoft Purview access management has moved into the data plane. Azure Resource Manager roles aren't used anymore, so you should use Microsoft Purview to assign roles.

- In Microsoft Purview, you can assign roles to users, security groups, and service principals (including managed identities) from Azure Active Directory (Azure AD) on the same Azure AD tenant where the Microsoft Purview account is deployed.

- You must first add guest accounts to your Azure AD tenant as B2B users before you can assign Microsoft Purview roles to external users.

- By default, Collection Admins don't have access to read or modify assets. But they can elevate their access and add themselves to more roles.

- By default, all role assignments are automatically inherited by all child collections. You can enable **Restrict inherited permissions** on any collection except the root collection. **Restrict inherited permissions** removes the inherited roles from all parent collections, except for the Collection Admin role.

- To connect to Azure Data Factory, you have to be a Collection Admin for the root collection.

- If you need to connect to Azure Data Factory for lineage, grant the Data Curator role to the data factory's managed identity at your Microsoft Purview root collection level. When you connect Data Factory to Microsoft Purview in the authoring UI, Data Factory tries to add these role assignments automatically. If you have the Collection Admin role on the Microsoft Purview root collection, this operation will work.
## Collections archetypes

You can deploy your Microsoft Purview collections based on centralized, decentralized, or hybrid data management and governance models. Base this decision on your business and security requirements.

### Example 1: Single-region organization

This structure is suitable for organizations that:

- Are mainly based in a single geographic location.
- Have a centralized data management and governance team, where the next level of data management falls to departments, teams, or projects.

The collection hierarchy consists of these verticals:

- Root collection (default)
- Contoso (top-level collection)
- Departments (a delegated collection for each department)
- Teams or projects (further segregation based on projects)

Each data source is registered and scanned in its corresponding collection. So assets also appear in the same collection.

Organization-level shared data sources are registered and scanned in the Hub-Shared collection.

The department-level shared data sources are registered and scanned in the department collections.
### Example 2: Multiregion organization

This scenario is useful for organizations:

- That have a presence in multiple regions.
- Where the data governance team is centralized or decentralized in each region.
- Where data management teams are distributed in each geographic location.

The collection hierarchy consists of these verticals:

- Root collection (default)
- FourthCoffee (top-level collection)
- Geographic locations (mid-level collections based on geographic locations where data sources and data owners are located)
- Departments (a delegated collection for each department)
- Teams or projects (further segregation based on teams or projects)

In this scenario, each region has a subcollection of its own under the top-level collection in the Microsoft Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets also appear in the subcollection hierarchy for the region.

If you have centralized data management and governance teams, you can grant them access from the top-level collection. When you do, they gain oversight of the entire data estate in the data map. Optionally, the centralized team can register and scan any shared data sources.

Region-based data management and governance teams can obtain access from their corresponding collections at a lower level.

The department-level shared data sources are registered and scanned in the department collections.
### Example 3: Multiregion, data transformation

This scenario can be useful if you want to distribute metadata-access management based on geographic locations and data transformation states. Data scientists and data engineers who can transform data to make it more meaningful manage the Raw and Refine zones. They can then move the data into the Produce or Curated zones.

The collection hierarchy consists of these verticals:

- Root collection (default)
- Fabrikam (top-level collection)
- Geographic locations (mid-level collections based on geographic locations where data sources and data owners are located)
- Data transformation stages (Raw, Refine, Produce/Curated)

Data scientists and data engineers can have the Data Curator role on their corresponding zones so that they can curate metadata. Data Reader access to the Curated zone can be granted to entire data personas and business users.
### Example 4: Multiregion, business functions

This option can be used by organizations that need to organize metadata and access management based on business functions.

The collection hierarchy consists of these verticals:

- Root collection (default)
- AdventureWorks (top-level collection)
- Geographic locations (mid-level collections based on geographic locations where data sources and data owners are located)
- Major business functions or clients (further segregation based on functions or clients)

Each region has a subcollection of its own under the top-level collection in the Microsoft Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets are added to the subcollection hierarchy for the region.

If you have centralized data management and governance teams, you can grant them access from the top-level collection. When you do, they gain oversight of the entire data estate in the data map. Optionally, the centralized team can register and scan any shared data sources.

Region-based data management and governance teams can obtain access from their corresponding collections at a lower level. Each business unit has its own subcollection.
## Access management options

If you want to implement data democratization across an entire organization, assign the Data Reader role at the top-level collection to data management, governance, and business users. Assign Data Source Admin and Data Curator roles at the subcollection levels to the corresponding data management and governance teams.

If you need to restrict access to metadata search and discovery in your organization, assign Data Reader and Data Curator roles at the specific collection level. For example, you could restrict US employees so that they can read data only at the US collection level and not in the LATAM collection.

You can apply a combination of these two scenarios in your Microsoft Purview data map if total data democratization is required with a few exceptions for some collections. You can assign Microsoft Purview roles at the top-level collection and restrict inheritance for the specific child collections.

Assign the Collection Admin role to the centralized data security and management team at the top-level collection. Delegate further collection management of lower-level collections to the corresponding teams.

## Next steps

- [Create a collection and assign permissions in Microsoft Purview](./quickstart-create-collection.md)
- [Create and manage collections in Microsoft Purview](./how-to-create-and-manage-collections.md)
- [Access control in Microsoft Purview](./catalog-permissions.md)
purview Concept Best Practices Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-glossary.md
Title: Microsoft Purview glossary best practices
description: This article provides examples of Microsoft Purview glossary best practices.
Previously updated: 02/15/2023
# Microsoft Purview glossary best practices

The business glossary is a collection of terms, specific to a domain of knowledge, that is commonly used, communicated, and shared in organizations as they conduct business.

A common business glossary (that is, a shared business language) is critical to improving an organization's overall productivity and performance. In most organizations, business language is codified through business dealings associated with:

- Business meetings
- Stand-ups, projects, and systems (ERP, CRM, SharePoint, and so on)
- Business plans and business processes
- Presentations
- Reporting and business rules
- Learnings (knowledge acquisition)
- Business models
- Policies and procedures

It's important that your organization's business language and discourse align to a common business glossary, so that your organization's data assets are properly applied in conducting business at speed and with agility as business needs change rapidly.

This article provides guidance on how to establish and govern your organization's business glossary of commonly used terms, so that you can focus on promoting a shared understanding of business data governance ownership. Adopting these considerations and recommendations will help your organization achieve success with Microsoft Purview. Adoption of the business glossary depends on how consistently you promote its use, so that your organization can better understand the meanings of the business terms it encounters daily while running the business.
-
## Why is a common business glossary needed?

Without a common business glossary, an organization's performance, culture, operations, and strategy often suffer. In this situation, cultural differences arise that are grounded in an inconsistent business language. These inconsistencies are propagated between team members and prevent them from using their relevant data assets as a competitive advantage.

When there are language barriers, organizations also tend to spend more time on nonproductive and noncollaborative activities, because they need to rely on more detailed interactions to reach the same meaning and understanding of their data assets.
-
-## Recommendations for implementing new glossary terms
-
-Creating terms is necessary to build the business vocabulary and apply it to assets within Microsoft Purview. When a new Microsoft Purview account is created, by default, there are no built-in terms in the account.
-
-This creation process should follow strict naming standards to ensure that the glossary doesn't contain duplicate or competing terms.
-
-- Establish a strict hierarchy for all business terms across your organization.
-- The hierarchy could consist of a business domain such as: Finance, Marketing, Sales, HR, etc.
-- Implement naming standards for all glossary terms that include case sensitivity. In Microsoft Purview, terms are case-sensitive.
-- Always use the search glossary terms feature before adding a new term. This helps you avoid adding duplicate terms to the glossary.
-- Avoid deploying terms with duplicated names. In Microsoft Purview, terms with the same name can exist under different parent terms. This can lead to confusion and should be well thought out before building your business glossary, to avoid duplicated terms.
-
-Glossary terms in Microsoft Purview are case sensitive and allow white space. The following shows a poorly executed example of implementing glossary terms and demonstrates the confusion caused:
-
- :::image type="content" source="media/concept-best-practices/glossary-duplicated-term-search.png" alt-text="Screenshot that shows searching duplicated glossary terms.":::
-
-As a best practice, it's always best to plan, search, and strictly follow standards. The following shows a better thought-out approach that greatly reduces confusion between glossary terms:
-
- :::image type="content" source="media/concept-best-practices/glossary-duplicated-term-2.png" alt-text="Screenshot that shows a well-organized set of glossary terms.":::
-
-## Recommendations for deploying glossary term templates
-
-When building new term templates in Microsoft Purview, review the following considerations:
-
-- Term templates are used to add custom attributes to glossary terms.
-- By default, Microsoft Purview offers several [out-of-the-box term attributes](./concept-business-glossary.md#custom-attributes), such as Name, Nick Name, Status, Definition, Acronym, Resources, Related terms, Synonyms, Stewards, Experts, and Parent term, which are found in the "System Default" template.
-- Default attributes can't be edited or deleted.
-- Custom attributes extend beyond the default attributes, allowing data curators to add more descriptive details to each term to completely describe the term in the organization.
-- As a reminder, Microsoft Purview stores only metadata. Attributes should describe the metadata, not the data itself.
-- Keep definitions simple. If there are custom metrics or formulas, these can easily be added as attributes.
-- A term must include at least the default attributes. When you build new glossary terms with a custom template, the other attributes included in the custom template are also expected for the given term.
-
-## Recommendations for importing glossary terms from term templates
-
-- Terms may be imported with the "System default" or custom templates.
-- When importing terms, use the sample .CSV file to guide you. This can save hours of frustration.
-- When importing terms from a .CSV file, be sure that any terms already existing in Microsoft Purview are intended to be updated. When you use the import feature, Microsoft Purview overwrites existing terms.
-- Before importing terms, test the import in a lab environment to ensure that no unexpected results occur, such as duplicate terms.
-- The email address for Stewards and Experts should be the primary address of the user from the Azure Active Directory group. Alternate email, user principal name, and non-Azure Active Directory emails aren't yet supported.
-- Glossary terms provide four statuses: Draft, Approved, Expired, and Alert. Draft means the term isn't yet official, Approved means it's official and approved for production, Expired means it should no longer be used, and Alert means it needs more attention.
-For more information, see [Import and export glossary terms](./how-to-import-export-glossary.md).
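Because imports overwrite existing terms, a quick pre-flight check of the .CSV file can catch duplicates and unknown statuses before they reach Microsoft Purview. The sketch below is a minimal, hypothetical validator; the `Name` and `Status` column headers are assumptions, so check them against the sample .CSV exported from your own account.

```python
import csv
import io

# Hypothetical column names -- verify against the sample .CSV file
# exported from your Microsoft Purview account.
NAME_COLUMN = "Name"
STATUS_COLUMN = "Status"
VALID_STATUSES = {"Draft", "Approved", "Expired", "Alert"}

def validate_glossary_csv(csv_text):
    """Return a list of problems found in a glossary-term CSV before import."""
    problems = []
    seen = set()
    for row in csv.DictReader(io.StringIO(csv_text)):
        name = (row.get(NAME_COLUMN) or "").strip()
        status = (row.get(STATUS_COLUMN) or "").strip()
        if not name:
            problems.append("row with empty term name")
        elif name in seen:  # glossary terms are case-sensitive, compare exactly
            problems.append(f"duplicate term name: {name}")
        seen.add(name)
        if status and status not in VALID_STATUSES:
            problems.append(f"unknown status '{status}' for term '{name}'")
    return problems
```

Running this against the file in a lab environment first, as recommended above, keeps a bad import from silently overwriting curated terms.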
-
-## Recommendations for exporting glossary terms
-
-Exporting terms may be useful in account-to-account migration, backup, or disaster recovery scenarios.
-
-## Recommendations for multi-glossary
-
-Microsoft Purview supports having multiple business glossaries managed within Microsoft Purview. Consider using multiple glossaries when:
-
-- The same term has different meanings across regions/departments/organizations/teams/etc.
-- You want to give management of the glossary to experts in their separate regions/departments/organizations/teams/etc.
-- Glossary terms and needs are different and have no overlap between regions/departments/organizations/teams/etc.
-
-## Glossary management
-
-- Related terms are bi-directional. For example, if you add term B as a related term to term A, then term B appears under term A's related terms, and term A appears under term B's related terms.
-
-### Recommendations for assigning terms to assets
-
-- While classifications and sensitivity labels are applied to assets automatically by the system based on classification rules, glossary terms aren't applied automatically.
-- Similar to classifications, glossary terms can be mapped to assets at the asset level or the schema level.
-- In Microsoft Purview, terms can be added to assets in different ways:
- - Manually, using the Microsoft Purview governance portal.
- - Using Bulk Edit mode to update up to 25 assets, using the Microsoft Purview governance portal.
- - Curated Code using the Atlas API.
-- Use Bulk Edit Mode when assigning terms manually. This feature allows a curator to assign glossary terms, owners, experts, classifications, and certified status in bulk, based on selected items from a search result. Multiple searches can be chained by selecting objects in the results; the bulk edit applies to all selected objects. Be sure to clear the selections after the bulk edit has been performed.
-- Other bulk edit operations can be performed by using the Atlas API. An example would be using the API to programmatically add descriptions or other custom properties to assets in bulk.
-
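As a hedged sketch of the Atlas approach, the snippet below only builds the URL and payload for the Atlas glossary `assignedEntities` endpoint, which associates one glossary term with a batch of entities by GUID. The account URL shape and endpoint path are assumptions to verify against the Atlas/Purview REST reference for your environment; the HTTP call itself is left commented out.

```python
import json

# Assumed endpoint shape for assigning a term to many assets at once;
# verify the path against the Atlas v2 REST reference for your account.
def build_term_assignment(account_name, term_guid, entity_guids):
    url = (f"https://{account_name}.purview.azure.com"
           f"/catalog/api/atlas/v2/glossary/terms/{term_guid}/assignedEntities")
    # The request body is a list of related-object references, one per asset GUID.
    body = [{"guid": guid} for guid in entity_guids]
    return url, json.dumps(body)

url, body = build_term_assignment("contoso-purview", "term-guid-123",
                                  ["asset-guid-1", "asset-guid-2"])
# To send it (requires an Azure AD bearer token and the requests library):
# requests.post(url, data=body, headers={"Authorization": f"Bearer {token}",
#                                        "Content-Type": "application/json"})
```

Separating payload construction from the HTTP call makes the batch logic easy to test before pointing it at a real account.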
-## Next steps
-
-- [Import and export glossary terms](./how-to-import-export-glossary.md)
-- [Best practices for describing data with terms, tags, managed attributes, and business assets](concept-best-practices-annotating-data.md)
-- [Create and manage glossaries](how-to-create-manage-glossary.md)
-- [Create and manage terms](how-to-create-manage-glossary-term.md)
-- [Manage term templates](how-to-manage-term-templates.md)
-- [Browse the data catalog in Microsoft Purview](how-to-browse-catalog.md)
purview Concept Best Practices Governing Domains https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-governing-domains.md
- Title: Govern your domains with Microsoft Purview. Best practices for using collections, glossary, and business context in your data catalog
-description: Understanding your domains is critical for effective data governance. In this article, we'll explore how to analyze a business area, define responsibilities, and implement a domain-driven governance approach in Purview.
---- Previously updated : 04/24/2023---
-# Govern your domains with Microsoft Purview: best practices for using collections, glossary, and business context
-
-Understanding your domains is critical for effective data governance. In this article, we explore how to analyze a business area, define responsibilities, and implement a domain-driven governance approach in Purview. By understanding your domains, you can identify which data is critical to your business and which data requires special governance, quality, or compliance considerations.
-
-We also review how to apply several Purview features to data governance. We show how to:
-
-- Use collections for creating a domain structure, segregating governance roles and responsibilities, and managing access to metadata.
-- Use the glossary to define key terms and data elements, and cover when it might be helpful to separate glossaries for different business areas.
-- Use business assets to model the real-world concepts and objectives within a domain.
-
-Let's dive into the world of domains and discover how Purview can help improve your data governance practices.
-
->[!NOTE]
->The goal of this article is to help you understand Purview's capabilities, so we've described a simple domain where boundaries are clear, and knowledge, applications, processes, data, and people are well-aligned. This often isn't the case, especially if you have large, legacy data platforms. For example, a complex CRM system is often shared between multiple business departments. For deeper guidance on deconstructing your domains and what to do when boundaries overlap, see the [Domain modeling recommendations from the Microsoft Cloud Adoption Framework](/azure/cloud-adoption-framework/scenarios/cloud-scale-analytics/architectures/data-domains#domain-modeling-recommendations).
-
-## Understand your domains
-Domains are problem spaces you want to address. They're areas where knowledge, behavior, laws, and activities come together. People within a domain collaborate on shared business objectives, so a shared vocabulary of terms and concepts ensures teams can communicate and work efficiently.
-
-For example, a 'marketing domain' includes:
-- Marketing data.
-- Knowledge from subject matter experts regarding how that data is collected and used to support marketing objectives such as ad personalization or campaign segmentation.
-- Definitions of key terms and data elements, such as a prospect versus an active customer.
-- The people responsible for ensuring marketing data is fit for purpose and used responsibly (privacy and compliance experts as well as stewards).
-- Sources of truth for marketing metrics. For instance, which Power BI report holds the official count of active customers that is shared with your Chief Marketing Officer during a monthly business review?
-
-Scanning and categorizing data is just the beginning of data governance. Governing a domain of data means describing the data, its meaning, its use in real-world business activities, and how it flows through your technology systems. This helps you identify which data is most critical to your business and which data requires special governance, quality, or compliance considerations.
-
-In the next sections, we'll walk through how to:
-- Analyze a business domain.
-- Define responsibilities for the domain.
-- Implement a domain-driven governance approach in Purview.
-
-## Analyze a domain
-When you establish an approach to data governance, a domain model can help create a shared understanding of the domain between technical and business experts. A domain model is a description of the real-world concepts and activities inside the business domain you want to govern. You'll start by sketching an abstract view of your problem space, then refine it as you work toward the solution you implement in Purview. Don't be afraid of change. You'll learn and improve your approach as you tackle new domains.
-
-Begin by describing business activities and their connections. This should be a collaborative effort that involves data and business stakeholders. This doesn't have to be formal: use a whiteboard to create a picture that makes sense to everyone.
-
-As you work together, identify which systems are used to support each business capability. Note key data as well, even if you don't know exactly how this data is sourced or stored yet. This will help you identify important business terms as well as the areas of responsibility for data governance. If you get stuck, remember that governance should be designed around business objectives and capabilities, so don't be afraid to use a business org chart as a starting point. Just be ready to adjust as your understanding of your domains matures.
-
-### Example: How Contoso uses order data to determine active accounts in different business contexts
-
-In a recent monthly business review with Contoso's chief financial officer, the supply chain and finance teams presented reports showing different rates of growth in active accounts over the previous 30 days. When Contoso's CFO asked why the numbers were different, reporting leads from each team began to debate the meaning of an active account and quickly realized they were using different definitions and sources of truth for their reports. The supply chain team counts any account with an order in the past 30 days as active, while the finance organization looks back further, counting every customer with an order in the past 6 months as active.
-
-Both teams agree they'd like to get better at governing the data used for critical reports and that they need to disambiguate how their data and reports are used in different contexts. Their approach to governing the data within this problem area will become the blueprint for expanding and adopting a governance strategy in other business domains.
-
-To begin their work, they sketch a rough view of how order management works at Contoso.
--
-Their sketch includes:
-- Business objectives: The business activities in this problem area include order placement, order fulfillment, order shipping, customer invoicing, payments, and ledger management.
-- Business terms: Both invoicing and shipping activities use account data, so it will be important to have a definition of the term account, and of any critical data elements included in an account. Order data is also critical for this business area.
-- Data: Lots of physical data supports order management, but for now, the team focuses on finding the physical data that represents accounts and orders, because this is most critical for the domain.
-- Responsibilities: They notice two broad but closely related areas of responsibility in this diagram. These are two unique domains.
- - The supply chain department manages order placement, order fulfillment, and shipments.
- - The finance department manages invoicing, payments, and ledger management. The finance team is also primarily responsible for managing account data, but supply chain consumes this information, so they are important stakeholders of the data.
-- Systems: Finally, there are three key systems that support this domain: the billing system, the order management system, and a data warehouse that makes supply chain and finance data available for reporting.
-
-## Purview implementation
-In the next section, we'll walk through an implementation in Purview. We will:
-
-1. Use collections to establish a domain structure, segregate governance roles and responsibilities, and manage access to metadata.
-1. Use the glossary to define key terms and data elements, as well as when it might be helpful to separate glossaries for different business areas.
-1. Use business assets to model the real-world concepts and activities within these domains.
-1. Scan metadata into our collections so it can be managed by the right people.
-1. Contextualize your data by linking business and technical information in Purview.
-
-### Step 1: Establish a domain structure and assign responsibilities using collections
-In the previous diagram, the team identified two areas of responsibility: supply chain and finance. So we'll create separate collections for the supply chain and finance departments:
--
-Next, we'll grant our finance curators data curator access to the finance collection. This will ensure they can annotate and manage metadata for finance assets. Our supply chain analysts are consumers of finance data and information, so we'll add them as data readers. They won't be able to curate finance assets, but they'll be able to discover them in Purview, and suggest edits via workflows. We'll keep permissions manageable by assigning groups to these roles instead of individual users.
--
-### Step 2: Define key terms in the business glossary
-
-Now we can define key terms for these domains. It's important for everyone at Contoso to have a shared understanding of core terms like order and account. If we were working with specialized finance or supply chain terms, we could create individual glossaries for these domains, but for now we'll centralize terms in an enterprise glossary.
--
-Let's make sure to assign stewards and experts for these terms as well. Experts are people who understand the full context of an asset or glossary term and can help answer questions, provide context, and disambiguate differences between terms when used in different contexts. Stewards are responsible for governance: ensuring a term is reviewed, standardized, and approved for use.
--
-### Step 3: Use assets and asset types to model your domain
-
-When we sketched our domain model, we captured the activities and technology systems that operate in the supply chain and finance domains as well. Let's take a second look at our sketch and add more context by defining the activities and systems in our domain as well as the relationships between them.
--
-We'll add this context to Purview in two steps:
-1. First, we'll define asset types and their relations.
-1. Then we'll create the individual assets and relationships between them.
-
-### Step 3.1: Define asset types and their relations
-
-In this step, we create a blueprint for the context we want to show in Purview. Purview provides asset types for business processes and systems by default, so we don't need to add these asset types, but it looks like we need two new relationship definitions. Let's add these now.
-
-- Business process uses dataset.
-- System supports business process.
-
-Adding these relationship definitions means we'll now be able to link datasets to business process assets in Purview. This will help us contextualize data for both business and technology stakeholders.
-
-### Step 3.2: Add business assets
-
-Now let's add the business processes (ledger management, order placement, etc.) and two systems (Contoso billing, General ledger) from our diagram.
--
-We'll create data domain assets for 'account' and 'order' as well, because these are not only important terms but key entity concepts in these business areas. This will help us show business context for key data entities.
--
-### Step 4: Scan data from your domains
-
-Finally, we're ready to register and scan the Azure SQL instance that supports both our Finance and Supply Chain domains.
-
-Analytical data sources often centralize data that's shared by multiple departments. For example, a single Azure SQL Server may have a database that supports Supply Chain, and another database that supports the Finance team. To make sure we can sort assets from this same source into different collections, we'll register the source at the root collection and create two scoped scans: one for the finance collection and one for the supply chain collection.
--
-Then we'll set up scoped scans to divide the data between areas of responsibility. When you set up a [scoped scan](concept-scans-and-ingestion.md#scope-your-scan), you can choose specific folders or tables that apply to each area of responsibility so you can scan the right data into a collection.
--
-### Step 5: Contextualize your data
-
-Now that we've created our building blocks, we're ready to link the information together in Purview.
-
-We'll go to our logical data domain for account, assign the business term account, link account to the physical table called customer account, link its source system, Contoso billing, and establish relationships with the five business processes that use account information.
--
-The result helps us visualize how physical data is related to its business use and context.
--
-Now, when people search the catalog for account data, they not only find the data but also its business context, including where it originated, which source system provides it, its business meaning, and how it's used to drive value in critical business domains.
-
-## Next steps
-
-For data mesh-specific guidance on domains, domain-driven design, and data products, see Microsoft's Cloud Adoption Framework:
-- [What is data mesh?](/azure/cloud-adoption-framework/scenarios/cloud-scale-analytics/architectures/what-is-data-mesh)
-- [Domain-Driven Design](/azure/cloud-adoption-framework/scenarios/cloud-scale-analytics/architectures/data-domains#domain-driven-design)
-- [Data products](/azure/cloud-adoption-framework/scenarios/cloud-scale-analytics/architectures/what-is-data-product)
-
purview Concept Best Practices Lineage Azure Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-lineage-azure-data-factory.md
- Title: Microsoft Purview Data Lineage best practices
-description: This article provides best practices for data lineage from various data sources in Microsoft Purview.
----- Previously updated : 07/11/2023---
-# Microsoft Purview Data Lineage best practices
-
-Data lineage is broadly understood as the lifecycle that spans the data's origin, and where it moves over time across the data estate. Microsoft Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
-* Raw data staged from various platforms
-* Transformed and prepared data
-* Data used by visualization platforms
-
-## Why do you need to adopt lineage?
-
-Data lineage is the process of describing what data exists, where it's stored and how it flows between systems. There are many reasons why data lineage is important, but at a high level these can all be boiled down to three categories that we'll explore here:
-* Track data in reports
-* Impact analysis
-* Capture changes and track where the data has resided throughout the data life cycle
-
-## Azure Data Factory lineage best practices and considerations
-
-### Azure Data Factory instance
-
-* Data lineage won't be reported to the catalog automatically until the Data Factory connection status turns to Connected. Lineage can't be captured while the status is Disconnected or CannotAccess.
-
- :::image type="content" source="./media/how-to-link-azure-data-factory/data-factory-connection.png" alt-text="Screen shot showing a data factory connection list." lightbox="./media/how-to-link-azure-data-factory/data-factory-connection.png":::
-
-* Each Data Factory instance can connect to only one Microsoft Purview account. You can establish a new connection from another Microsoft Purview account, but doing so sets the existing connection to Disconnected.
-
- :::image type="content" source="./media/how-to-link-azure-data-factory/warning-for-disconnect-factory.png" alt-text="Screenshot showing warning to disconnect Azure Data Factory.":::
-
-* The data factory's managed identity is used to authenticate lineage push operations to the Microsoft Purview account. The managed identity needs the Data Curator role on the Microsoft Purview root collection.
-
-* Currently, only 10 data factories can be connected at a time through the wizard. If you want to connect more than 10 data factories, add them 10 at a time using the wizard, or use the API to connect more than 10 data factories in one operation.
--
-### Azure Data Factory activities
-
-* Microsoft Purview captures runtime lineage from the following Azure Data Factory activities:
- * [Copy activity ](../data-factory/copy-activity-overview.md)
- * [Data Flow activity](../data-factory/concepts-data-flow-overview.md)
- * [Execute SSIS Package activity](../data-factory/how-to-invoke-ssis-package-ssis-activity.md)
-
-* Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
- * Supported data sources in copy activity are listed in the **Copy activity support** section of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
- * Supported data sources in data flow activity are listed in the **Data Flow support** section of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
- * Supported data sources in SSIS are listed in the **SSIS execute package activity support** section of [Lineage from SQL Server Integration Services](how-to-lineage-sql-server-integration-services.md)
-
-* Microsoft Purview can't capture lineage if the Azure Data Factory copy activity uses the features listed in the **Limitations on copy activity lineage** section of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
-
-* For the lineage of Data Flow activity, Microsoft Purview supports only source and sink. Lineage for Data Flow transformations isn't supported yet.
-
-* Data flow lineage doesn't integrate with Microsoft Purview resource sets.
- **Resource set example:**
- Qualified name: `https://myblob.blob.core.windows.net/sample-data/data{N}.csv`
- Display name: "data"
-
-* For the lineage of Execute SSIS Package activity, only source and destination are supported. Lineage for transformations isn't supported yet.
-
- :::image type="content" source="./media/concept-best-practices-lineage/ssis-lineage.png" alt-text="Screenshot of the Execute SSIS lineage in Microsoft Purview." lightbox="./media/concept-best-practices-lineage/ssis-lineage.png":::
-
-* Refer to the following step-by-step guide to [push Azure Data Factory lineage to Microsoft Purview](../data-factory/tutorial-push-lineage-to-purview.md).
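To illustrate the resource set behavior mentioned above, here's a small, hypothetical sketch of pattern-based grouping: many partition files with names like `data1.csv`, `data2.csv` roll up into a single logical asset with display name `data`. The regex and rollup logic here are illustrative only, not Purview's actual implementation.

```python
import re
from collections import defaultdict

# Illustrative only: group partitioned file names like "data1.csv" under one
# logical "resource set" name, roughly as Purview's resource sets do.
PARTITION_PATTERN = re.compile(r"^(?P<stem>\D+?)\d+(?P<ext>\.\w+)$")

def group_into_resource_sets(file_names):
    groups = defaultdict(list)
    for name in file_names:
        m = PARTITION_PATTERN.match(name)
        key = m.group("stem") if m else name  # non-matching files stay as-is
        groups[key].append(name)
    return dict(groups)
```

This is why the example above shows a single display name, "data", standing in for every `data{N}.csv` file in the container.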
-
-## Next steps
-- [Manage data sources](./manage-data-sources.md)
purview Concept Best Practices Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-migration.md
- Title: Microsoft Purview migration best practices
-description: This article provides backup and recovery steps and migration best practices for Microsoft Purview.
----- Previously updated : 02/15/2023--
-# Microsoft Purview backup and recovery for migration best practices
-
-This article provides guidance on backup and recovery strategy when your organization has [Microsoft Purview unified data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions) in production deployment. You can also use this general guideline to implement account migration. The scope of this article covers [manual BCDR methods](disaster-recovery.md) that you could automate using APIs. There's some key information to consider upfront:
-- It isn't advisable to back up "scanned" assets' details. You should only back up curated data, such as the mapping of classifications and glossary terms on assets. The only case when you need to back up assets' details is when you have custom assets via a custom `typeDef`.
-
-- The backed-up asset count should be fewer than 100,000 assets. The main driver is that you have to use the search query API to get the assets, which has a limit of 100,000 assets returned. However, if you're able to segment the search query to return a smaller number of assets per API call, it's possible to back up more than 100,000 assets.
-
-- The goal is to perform a one-time migration. If you wish to continuously "sync" assets between two accounts, there are other steps that won't be covered in detail by this article. You have to use [Microsoft Purview's Event Hubs to subscribe and create entities in another account](manage-kafka-dotnet.md). However, Event Hubs only has Atlas information, and Microsoft Purview has added other capabilities, such as **glossaries** and **contacts**, which won't be available via Event Hubs.
-
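The 100,000-asset limit above can be worked around by segmenting the search and paging each segment. The sketch below is a generic, hypothetical pagination loop: the query fields (`keywords`, `limit`, `offset`) mirror the search query API's shape, but the `fetch_page` callable is injected so you can plug in your own authenticated HTTP call; verify the payload fields against the discovery API reference for your account.

```python
# A minimal sketch of a segmented, paginated asset export. One search query
# per keyword segment keeps each query under the 100,000-asset limit.
def export_assets(fetch_page, keyword_segments, page_size=1000,
                  max_per_query=100_000):
    """Yield assets across segments; fetch_page takes a query dict and
    returns one page of asset records (an empty list when exhausted)."""
    for keywords in keyword_segments:
        offset = 0
        while offset < max_per_query:
            page = fetch_page({"keywords": keywords,
                               "limit": page_size,
                               "offset": offset})
            if not page:
                break
            yield from page
            if len(page) < page_size:
                break  # short page means the segment is exhausted
            offset += page_size
```

Segments can be keyword prefixes, collections, or classifications, as long as each segment's result set stays under the limit.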
-## Identify key requirements
-Most enterprise organizations have critical requirements for Microsoft Purview capabilities such as Backup, Business Continuity, and Disaster Recovery (BCDR). To get into more detail on these requirements, you need to differentiate between Backup, High Availability (HA), and Disaster Recovery (DR).
-
-While they're similar, HA will keep the service operational if there's a hardware fault, for example, but it wouldn't protect you if someone accidentally or deliberately deleted all the records in your database. For that, you may need to restore the service from a backup.
-
-### Backup
-You may need to create regular backups from a Microsoft Purview account and use a backup in case a piece of data or configuration is accidentally or deliberately deleted from the Microsoft Purview account by the users.
-
-The backup should allow saving a point in time copy of the following configurations from the Microsoft Purview account:
-
-* Account information (for example, Friendly name)
-* Collection structure and role assignments
-* Custom Scan rule sets, classifications, and classification rules
-* Registered data sources
-* Scan information
-* Key Vault connections
-* Credentials and their relations with current scans
-* Registered SHIRs
-* Glossary term templates
-* Glossary terms
-* Manual asset updates (including classification and glossary assignments)
-* ADF and Synapse connections and lineage
-
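The checklist above can be tracked as a simple backup manifest. The structure below is just one hypothetical way to record what was captured in a point-in-time backup and when; the key names are illustrative, not a Purview format.

```python
import json
from datetime import datetime, timezone

# Hypothetical manifest for tracking a point-in-time Purview backup,
# keyed by the configuration areas listed above.
def new_backup_manifest(account_name):
    return {
        "account": account_name,
        "takenAtUtc": datetime.now(timezone.utc).isoformat(),
        "items": {key: [] for key in (
            "accountInfo", "collections", "scanRuleSets", "classifications",
            "dataSources", "scans", "keyVaultConnections", "credentials",
            "integrationRuntimes", "termTemplates", "glossaryTerms",
            "assetUpdates", "lineageConnections")},
    }

manifest = new_backup_manifest("contoso-purview")
manifest["items"]["glossaryTerms"].append({"name": "Account",
                                           "status": "Approved"})
print(json.dumps(manifest, indent=2))
```

Storing one manifest per backup run makes it easy to verify, during a restore drill, that every configuration area was actually captured.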
-Backup strategy is determined by restore strategy, or more specifically how long it will take to restore things when a disaster occurs. To answer that, you may need to engage with the affected stakeholders (the business owners) and understand what the required recovery objectives are.
-
-There are three main requirements to take into consideration:
-* **Recovery Time Objective (RTO)** - Defines the maximum allowable downtime following a disaster, after which the system should ideally be back operational.
-* **Recovery Point Objective (RPO)** - Defines the acceptable amount of data loss following a disaster. Normally RPO is expressed as a timeframe in hours or minutes.
-* **Recovery Level Objective (RLO)** - Defines the granularity of the data being restored. It could be a SQL server, a set of databases, tables, records, etc.
-
-### High availability
-In computing, the term availability is used to describe the period of time when a service is available, and the time required by a system to respond to a request made by a user. For Microsoft Purview, high availability means ensuring that Microsoft Purview instances are available if there's a problem local to a data center or a single cloud region.
-
-#### Measuring availability
-Availability is often expressed as a percentage indicating how much uptime is expected from a particular system or component in a given period of time, where a value of 100% would indicate that the system never fails.
-
-These values are calculated based on several factors, including both scheduled and unscheduled maintenance periods, and the time to recover from a possible system failure.
-
-> [!NOTE]
-> Azure data center outages are rare, but can last anywhere from a few minutes to hours.
-> Information about Microsoft Purview availability is available at [SLA for Microsoft Purview](https://azure.microsoft.com/support/legal/sla/purview/v1_0/).
-> Microsoft Purview ensures no data loss but a recovery from outages may require you to restart your workflows such as scans.
-
-### Resiliency
-Resiliency is the ability of a system to recover from failures and continue to function. It's not about avoiding failures, but responding to failures in a way that avoids downtime or data loss.
-
-### Business continuity
-Business continuity means continuing your business in the event of a disaster, planning for recovery, and ensuring that your data map is highly available.
-
-### Disaster recovery
-Organizations need a failover mechanism for their Microsoft Purview instances, so when the primary region experiences a catastrophic event like an earthquake or flood, the business must be prepared to have its systems come online elsewhere.
-
-> [!Note]
-> Not all Azure mirrored regions support deploying Microsoft Purview accounts. For example, for a DR scenario, you cannot choose to deploy a new Microsoft Purview account in Canada East if the primary region is Canada Central. Even with customer-managed DR, not all customers may be able to trigger a DR.
-
-## Implementation steps
-This section provides high-level guidance on the tasks required to copy assets, glossaries, classifications, and relationships across regions or subscriptions, either using the Microsoft Purview governance portal or the REST APIs. The approach is to perform the tasks programmatically, at scale, as much as possible.
-
-### High-level task outline
-1. Create the new account
-1. Migrate configuration items
-1. Run scans
-1. Migrate custom typedefs and custom assets
-1. Migrate relationships
-1. Migrate glossary terms
-1. Assign classifications to assets
-1. Assign contacts to assets
-
-### Create the new account
-You'll need to create a new Microsoft Purview account by following these instructions:
-* [Quickstart: Create a Microsoft Purview account in the Azure portal](create-catalog-portal.md)
-
-It's crucial to plan ahead for configuration items that you can't change later:
-* Account name
-* Region
-* Subscription
-* Managed resource group name
-
-### Migrate configuration items
-The following steps refer to the [Microsoft Purview API documentation](/rest/api/purview/) so that you can stand up the backup account programmatically:
-
-|Task|Description|
-|-|--|
-|**Account information**|Maintain Account information by granting access for the admin and/or service principal to the account at root level|
-|**Collections**|Create and maintain Collections along with the association of sources with the Collections. You can call [List Collections API](/rest/api/purview/accountdataplane/collections/list-collections) and then get specific details of each collection via [Get Collection API](/rest/api/purview/accountdataplane/collections/get-collection)|
-|**Scan rule set**|Create and maintain custom scan rule sets. You need to call [List all custom scan rule sets API](/rest/api/purview/scanningdataplane/scan-rulesets/list-all) and get details by calling [Get scan rule set API](/rest/api/purview/scanningdataplane/scan-rulesets/get)|
-|**Manual classifications**|Get a list of all manual classifications by calling get classifications APIs and get details of each classification|
-|**Resource set rule**|Create and maintain resource set rule. You can call the [Get resource set rule API](/rest/api/purview/accountdataplane/resource-set-rules/get-resource-set-rule) to get the rule details|
-|**Data sources**|Call the [Get all data sources API](/rest/api/purview/scanningdataplane/scans/list-by-data-source) to list data sources with details. You also have to get the triggers by calling [Get trigger API](/rest/api/purview/scanningdataplane/triggers/get-trigger). There's also [Create data sources API](/rest/api/purview/scanningdataplane/data-sources/create-or-update) if you need to re-create the sources in bulk in the new account.|
-|**Credentials**|Create and maintain credentials used while scanning. There's no API to extract credentials, so this must be redone in the new account.|
-|**Self-hosted integration runtime (SHIR)**|Get a list of SHIR and get updated keys from the new account then update the SHIRs. This must be done [manually inside the SHIRs' hosts](manage-integration-runtimes.md#create-a-self-hosted-integration-runtime).|
-|**ADF connections**|Currently, an Azure Data Factory (ADF) instance can be connected to only one Microsoft Purview account at a time. You must disconnect ADF from the failed Microsoft Purview account and then reconnect it to the new account.|
--
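Many of the tasks in the table can be scripted against the REST APIs. Here's a minimal sketch of calling the List Collections API with Python's standard library; the account name, token acquisition, and `api-version` value are assumptions for illustration, so verify them against the API reference:

```python
import json
import urllib.request

# The api-version value is an assumption for this sketch; verify it
# against the List Collections API reference.
API_VERSION = "2019-11-01-preview"

def collections_url(account_name: str) -> str:
    """Build the List Collections endpoint for a Microsoft Purview account."""
    return (f"https://{account_name}.purview.azure.com"
            f"/account/collections?api-version={API_VERSION}")

def list_collections(account_name: str, access_token: str) -> list:
    """Call the List Collections API and return the collection objects."""
    req = urllib.request.Request(
        collections_url(account_name),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])
```

Each returned collection can then be passed to the Get Collection API to capture its full definition before re-creating it in the new account.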
-### Run scans
-Running the scans populates all assets with the default `typedef`. There are several reasons to rerun the scans rather than exporting the existing assets and importing them into the new account:
-
-* There's a limit of 100,000 assets returned from the search query to export assets.
-
-* It's cumbersome to export assets with relationships.
-
-* When you rerun the scans, you'll get all relationships and asset details up to date.
-
-* Microsoft Purview releases new features regularly, so you can benefit from them when running new scans.
-
-Running the scans is the most effective way to get all assets of data sources that Microsoft Purview is already supporting.
-
-### Migrate custom typedefs and custom assets
-
-#### Custom typedefs
-To identify all custom `typedef` entries, use the [get all type definitions API](/rest/api/purview/catalogdataplane/types/get-all-type-definitions). It returns every type definition; identify the custom types by looking for entries in the format `"serviceType": "<custom_typedef>"`.
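As a sketch, you can filter the API response client-side. The `contoso_custom` serviceType and the shape of the mock response below are assumptions for illustration:

```python
def find_custom_typedefs(all_typedefs: dict, custom_service_types: set) -> list:
    """Return entity type definitions whose serviceType marks them as custom.

    all_typedefs is the JSON body returned by the get-all-type-definitions
    API; custom_service_types is the set of serviceType values your
    organization uses for custom types (an assumption for this sketch).
    """
    return [
        typedef
        for typedef in all_typedefs.get("entityDefs", [])
        if typedef.get("serviceType") in custom_service_types
    ]

# Minimal illustration with a mock response:
sample = {
    "entityDefs": [
        {"name": "azure_sql_table", "serviceType": "Azure SQL Database"},
        {"name": "my_custom_asset", "serviceType": "contoso_custom"},
    ]
}
custom = find_custom_typedefs(sample, {"contoso_custom"})
```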
-
-#### Custom assets
-To export custom assets, you can search for those custom assets and pass the proper custom `typedef` via the [discovery API](/rest/api/purview/catalogdataplane/discovery/query).
-
-> [!Note]
-> There is a 100,000 return limit per search result.
-> You might have to break up the search query so that it won't return more than 100,000 records.
-
-There are several ways to scope down the search query to get a subset of assets:
-* **Using `Keyword`**: Pass the parent FQN such as `Keyword: "<Parent String>/*"`
-
-* **Using `Filter`**: Include `assetType` with the specific custom `typedef` in your search such as `"assetType": "<custom_typedef>"`
-
-Here's an example of a search payload by customizing the `keywords` so that only assets in specific storage account (`exampleaccount`) are returned:
-
-```json
-{
- "keywords": "adl://exampleaccount.azuredatalakestore.net/*",
- "filter": {
- "and": [
- {
- "not": {
- "or": [
- {
- "attributeName": "size",
- "operator": "eq",
- "attributeValue": 0
- },
- {
- "attributeName": "fileSize",
- "operator": "eq",
- "attributeValue": 0
- }
- ]
- }
- },
- {
- "not": {
- "classification": "MICROSOFT.SYSTEM.TEMP_FILE"
- }
- },
- {
- "not": {
- "or": [
- {
- "entityType": "AtlasGlossaryTerm"
- },
- {
- "entityType": "AtlasGlossary"
- }
- ]
- }
- }
- ]
- },
- "limit": 10,
- "offset": 0,
- "facets": [
- {
- "facet": "assetType",
- "count": 0,
- "sort": {
- "count": "desc"
- }
- },
- {
- "facet": "classification",
- "count": 10,
- "sort": {
- "count": "desc"
- }
- },
- {
- "facet": "contactId",
- "count": 10,
- "sort": {
- "count": "desc"
- }
- },
- {
- "facet": "label",
- "count": 10,
- "sort": {
- "count": "desc"
- }
- },
- {
- "facet": "term",
- "count": 10,
- "sort": {
- "count": "desc"
- }
- }
- ]
-}
-```
-
-The returned assets contain key/value pairs from which you can extract details:
-
-```json
-{
- "referredEntities": {},
- "entity": {
- "typeName": "column",
- "attributes": {
- "owner": null,
- "qualifiedName": "adl://exampleaccount.azuredatalakestore.net/123/1/DP_TFS/CBT/Extensions/DTTP.targets#:xml/Project/Target/XmlPeek/@XmlInputPath",
- "name": "~XmlInputPath",
- "description": null,
- "type": "string"
- },
- "guid": "5cf8a9e5-c9fd-abe0-2e8c-d40024263dcb",
- "status": "ACTIVE",
- "createdBy": "ExampleCreator",
- "updatedBy": "ExampleUpdator",
- "createTime": 1553072455110,
- "updateTime": 1553072455110,
- "version": 0,
- "relationshipAttributes": {
- "schema": [],
- "inputToProcesses": [],
- "composeSchema": {
- "guid": "cc6652ae-dc6d-90c9-1899-252eabc0e929",
- "typeName": "tabular_schema",
- "displayText": "tabular_schema",
- "relationshipGuid": "5a4510d4-57d0-467c-888f-4b61df42702b",
- "relationshipStatus": "ACTIVE",
- "relationshipAttributes": {
- "typeName": "tabular_schema_columns"
- }
- },
- "meanings": [],
- "outputFromProcesses": [],
- "tabular_schema": null
- },
- "classifications": [
- {
- "typeName": "MICROSOFT.PERSONAL.EMAIL",
- "lastModifiedTS": "1",
- "entityGuid": "f6095442-f289-44cf-ae56-47f6f6f6000c",
- "entityStatus": "ACTIVE"
- }
- ],
- "contacts": {
- "Expert": [
- {
- "id": "30435ff9-9b96-44af-a5a9-e05c8b1ae2df",
- "info": "Example Expert Info"
- }
- ],
- "Owner": [
- {
- "id": "30435ff9-9b96-44af-a5a9-e05c8b1ae2df",
- "info": "Example Owner Info"
- }
- ]
- }
- }
-}
-```
-
-> [!Note]
-> You need to migrate the term templates from `typedef` output as well.
-
-When you re-create the custom entities, you may need to prepare the payload before sending it to the API:
-
-> [!Note]
-> The initial goal is to migrate all entities without any relationships or mappings. This will avoid potential errors.
-
-* All `timestamp` values, such as `createTime`, `updateTime`, and `lastModifiedTS`, must be null.
-
-* The `guid` can't be regenerated exactly as before, so you have to pass in a negative integer, such as `-5000`, to avoid errors.
-
-* The content of `relationshipAttributes` shouldn't be a part of the payload to avoid errors since it's possible that the `guids` aren't the same or haven't been created yet. You have to turn `relationshipAttributes` into an empty array prior to submitting the payload.
-
- * `meanings` contains all glossary mappings, which will be updated in bulk after the entities are created.
-
-* Similarly, `classifications` needs to be an empty array as well when you submit the payload to create entities since you have to create classification mapping to bulk entities later using a different API.
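The preparation rules above can be sketched as a small helper that sanitizes an exported entity before re-creation. The `-5000` placeholder guid follows the guidance above, and the field names match the sample entity shown earlier:

```python
def prepare_entity_payload(entity: dict, placeholder_guid: str = "-5000") -> dict:
    """Sanitize an exported entity for re-creation in the new account."""
    cleaned = dict(entity)
    # Null out all timestamp values.
    for ts_field in ("createTime", "updateTime", "lastModifiedTS"):
        if ts_field in cleaned:
            cleaned[ts_field] = None
    # The old guid can't be reused; pass a negative placeholder instead.
    cleaned["guid"] = placeholder_guid
    # Relationships and classifications are remapped later via other APIs.
    cleaned["relationshipAttributes"] = {}
    cleaned["classifications"] = []
    return cleaned
```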
-
-### Migrate relationships
-
-To complete the asset migration, you must remap the relationships. There are three tasks:
-
-1. Call the [relationship API](/rest/api/purview/catalogdataplane/relationship/get) to get relationship information between entities by its `guid`
-
-1. Prepare the relationship payload so that there's no hard reference to old `guids` in the old Microsoft Purview accounts. You need to update those `guids` to the new account's `guids`.
-
-1. Finally, [Create a new relationship between entities](/rest/api/purview/catalogdataplane/relationship/create)
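A sketch of step 2, assuming you've built a `guid_map` from old-account guids to new-account guids (for example, by matching `qualifiedName` between accounts); the `end1`/`end2` field names follow the Atlas relationship model:

```python
def remap_relationship(relationship: dict, guid_map: dict) -> dict:
    """Rewrite end1/end2 guids from the old account to the new account.

    guid_map maps old-account guids to new-account guids; building it is
    up to you (for example, by matching qualifiedName between accounts).
    """
    remapped = dict(relationship)
    for end in ("end1", "end2"):
        old_end = dict(remapped.get(end, {}))
        # A KeyError here means the referenced entity wasn't migrated yet.
        old_end["guid"] = guid_map[old_end["guid"]]
        remapped[end] = old_end
    remapped.pop("guid", None)  # let the new account assign a relationship guid
    return remapped
```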
-
-### Migrate glossary terms
-
-> [!Note]
-> Before migrating terms, you need to migrate the term templates. This step should be already covered in the custom `typedef` migration.
-
-#### Using the Microsoft Purview governance portal
-The quickest way to migrate glossary terms is to [export terms to a .csv file](how-to-import-export-glossary.md). You can do this using the Microsoft Purview governance portal.
-
-#### Using Microsoft Purview API
-To automate glossary migration, you first need to get the glossary `guid` (`glossaryGuid`) via [List Glossaries API](/rest/api/purview/catalogdataplane/glossary/list-glossaries). The `glossaryGuid` is the top/root level glossary `guid`.
-
-The following sample response provides the `guid` to use in subsequent API calls:
-
-```json
-"guid": "c018ddaf-7c21-4b37-a838-dae5f110c3d8"
-```
-
-Once you have the `glossaryGuid`, you can start to migrate the terms via two steps:
-
-1. [Export Glossary Terms As .csv](/rest/api/purview/catalogdataplane/glossary/export-glossary-terms-as-csv)
-
-1. [Import Glossary Terms Via .csv](/rest/api/purview/catalogdataplane/glossary/import-glossary-terms-via-csv)
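If you script this, the `glossaryGuid` can be pulled straight from the List Glossaries response. A minimal sketch, assuming the response is a list of glossary objects with the root glossary first:

```python
def get_glossary_guid(list_glossaries_response: list) -> str:
    """Extract the root glossary guid from a List Glossaries response."""
    return list_glossaries_response[0]["guid"]

# Using the sample response shown earlier:
glossary_guid = get_glossary_guid(
    [{"guid": "c018ddaf-7c21-4b37-a838-dae5f110c3d8"}]
)
```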
-
-### Assign classifications to assets
-
-> [!Note]
-> The prerequisite for this step is to have all classifications available in the new account from the [Migrate configuration items](#migrate-configuration-items) step.
-
-You must call the [discovery API](/rest/api/purview/catalogdataplane/discovery/query) to get the classification assignments to assets. This is applicable to all assets. If you've migrated the custom assets, the information about classification assignments is already available in `classifications` property. Another way to get classifications is to [list classification per `guid`](/rest/api/purview/catalogdataplane/entity/get-classifications) in the old account.
-
-To assign classifications to assets, you need to [associate a classification to multiple entities in bulk](/rest/api/purview/catalogdataplane/entity/add-classification) via the API.
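A sketch of the request body for the bulk association call. The shape follows the Atlas `ClassificationAssociateRequest` model, but verify it against the API reference before use; the guids shown are hypothetical:

```python
def classification_associate_payload(type_name: str, entity_guids: list) -> dict:
    """Build the request body to associate one classification with many entities.

    The shape follows the Atlas ClassificationAssociateRequest model; verify
    it against the bulk classification API reference before use.
    """
    return {
        "classification": {"typeName": type_name},
        "entityGuids": entity_guids,
    }

# Hypothetical guids for illustration:
payload = classification_associate_payload(
    "MICROSOFT.PERSONAL.EMAIL",
    ["new-guid-1", "new-guid-2"],
)
```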
-
-### Assign contacts to assets
-
-If you have extracted asset information from previous steps, the contact details are available from the [discovery API](/rest/api/purview/catalogdataplane/discovery/query).
-
-To assign contacts to assets, you need a list of `guids` and the `objectid` of each contact. You can automate this process by iterating through all assets and reassigning contacts using the [Create Or Update Entities API](/rest/api/purview/catalogdataplane/entity/create-or-update-entities).
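A sketch of the reassignment loop, assuming contacts were captured in the shape returned by the discovery API (see the sample entity earlier) and that the bulk body wraps entities in an `entities` array per the Atlas bulk model:

```python
def attach_contacts(entity: dict, contacts: dict) -> dict:
    """Return a copy of an entity with its contacts reattached."""
    updated = dict(entity)
    updated["contacts"] = contacts  # e.g. {"Expert": [{"id": "<objectid>"}], ...}
    return updated

def bulk_entities_payload(entities: list) -> dict:
    """Wrap entities for the Create Or Update Entities (bulk) API body."""
    return {"entities": entities}

# Hypothetical example: reattach an Owner contact to one migrated entity.
migrated = attach_contacts(
    {"guid": "new-guid-1", "typeName": "column"},
    {"Owner": [{"id": "30435ff9-9b96-44af-a5a9-e05c8b1ae2df"}]},
)
payload = bulk_entities_payload([migrated])
```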
-
-## Next steps
-- [Create a Microsoft Purview account](./create-catalog-portal.md)
purview Concept Best Practices Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-network.md
- Title: Microsoft Purview network architecture and best practices
-description: This article provides examples of Microsoft Purview network architecture options and describes best practices.
-Previously updated: 03/24/2023
-# Microsoft Purview network architecture and best practices
-
-Microsoft Purview data governance solutions are platform as a service (PaaS) solutions for data governance. Microsoft Purview accounts have public endpoints that are accessible through the internet to connect to the service. However, all endpoints are secured through Azure Active Directory (Azure AD) logins and role-based access control (RBAC).
-
->[!NOTE]
->These best practices cover the network architecture for [Microsoft Purview unified data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions). For more information about Microsoft Purview risk and compliance solutions, see the [Microsoft Purview compliance documentation](/microsoft-365/compliance/). For more information about Microsoft Purview in general, see the [Microsoft Purview overview](/purview/purview).
-
-For an added layer of security, you can create private endpoints for your Microsoft Purview account. You'll get a private IP address from your virtual network in Azure to the Microsoft Purview account and its managed resources. This address will restrict all traffic between your virtual network and the Microsoft Purview account to a private link for user interaction with the APIs and Microsoft Purview governance portal, or for scanning and ingestion.
-
-Currently, the Microsoft Purview firewall provides access control for the public endpoint of your Microsoft Purview account. You can use the firewall to allow all access or to block all access through the public endpoint when using private endpoints. For more information, see [Microsoft Purview firewall options](catalog-firewall.md).
-
-Based on your network, connectivity, and security requirements, you can set up and maintain Microsoft Purview accounts to access underlying services or ingestion. Use this best practices guide to define and prepare your network environment so you can access Microsoft Purview and scan data sources from your network or cloud.
-
-This guide covers the following network options:
-- Use [Azure public endpoints](#option-1-use-public-endpoints).
-- Use [private endpoints](#option-2-use-private-endpoints).
-- Use [private endpoints and allow public access on the same Microsoft Purview account](#option-3-use-both-private-and-public-endpoints).
-- Use Azure [public endpoints to access Microsoft Purview governance portal and private endpoints for ingestion](#option-4-use-private-endpoints-for-ingestion-only).
-This guide describes a few of the most common network architecture scenarios for Microsoft Purview. Though you're not limited to those scenarios, keep in mind the [limitations](#current-limitations) of the service when you're planning networking for your Microsoft Purview accounts.
-
-## Prerequisites
-
-To understand which network option is best for your environment, we suggest that you perform the following actions first:
-- Review your network topology and security requirements before registering and scanning any data sources in Microsoft Purview. For more information, see [Define an Azure network topology](/azure/cloud-adoption-framework/ready/azure-best-practices/define-an-azure-network-topology).
-- Define your [network connectivity model for PaaS services](/azure/cloud-adoption-framework/ready/azure-best-practices/connectivity-to-azure-paas-services).
-## Option 1: Use public endpoints
-
-By default, you can use Microsoft Purview accounts through the public endpoints accessible over the internet. Allow public networks in your Microsoft Purview account if you have the following requirements:
-- No private connectivity is required when scanning or connecting to Microsoft Purview endpoints.
-- All data sources are software-as-a-service (SaaS) applications only.
-- All data sources have a public endpoint that's accessible through the internet.
-- Business users require access to a Microsoft Purview account and the Microsoft Purview governance portal through the internet.
-### Integration runtime options
-
-To scan data sources while the Microsoft Purview account firewall is set to allow public access, you can use both the Azure integration runtime and a [self-hosted integration runtime](./manage-integration-runtimes.md). How you use them depends on the [supportability of your data sources](manage-data-sources.md).
-
-Here are some best practices:
--- You can use the Azure integration runtime or a self-hosted integration runtime to scan Azure data sources such as Azure SQL Database or Azure Blob Storage, but we recommend that you use the Azure integration runtime to scan Azure data sources when possible, to reduce cost and administrative overhead.
-
-- To scan multiple Azure data sources, use a public network and the Azure integration runtime. The following steps show the communication flow at a high level when you're using the Azure integration runtime to scan a data source in Azure:-
- :::image type="content" source="media/concept-best-practices/network-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, the Azure runtime, and data sources."lightbox="media/concept-best-practices/network-azure-runtime.png":::
-
- 1. A manual or automatic scan is initiated from the Microsoft Purview Data Map through the Azure integration runtime.
-
- 1. The Azure integration runtime connects to the data source to extract metadata.
-
- 1. Metadata is queued in Microsoft Purview managed storage and stored in Azure Blob Storage.
-
- 1. Metadata is sent to the Microsoft Purview Data Map.
--- Scanning on-premises and VM-based data sources always requires using a self-hosted integration runtime. The Azure integration runtime isn't supported for these data sources. The following steps show the communication flow at a high level when you're using a self-hosted integration runtime to scan a data source. The first diagram shows a scenario where resources are within Azure or on a VM in Azure. The second diagram shows a scenario with on-premises resources. The steps between the two are the same from Microsoft Purview's perspective:-
- :::image type="content" source="media/concept-best-practices/network-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, a self-hosted runtime, and data sources."lightbox="media/concept-best-practices/network-self-hosted-runtime.png":::
-
- :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime-on-premises.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, an on-premises self-hosted runtime, and data sources in on-premises network."lightbox="media/concept-best-practices/security-self-hosted-runtime-on-premises.png":::
-
- 1. A manual or automatic scan is triggered. Microsoft Purview connects to Azure Key Vault to retrieve the credential to access a data source.
-
- 1. The scan is initiated from the Microsoft Purview Data Map through a self-hosted integration runtime.
-
- 1. The self-hosted integration runtime service from the VM or on-premises machine connects to the data source to extract metadata.
-
- 1. Metadata is processed in the machine's memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage. Actual data never leaves the boundary of your network.
-
- 1. Metadata is sent to the Microsoft Purview Data Map.
-
-### Authentication options
-
-When you're scanning a data source in Microsoft Purview, you need to provide a credential. Microsoft Purview can then read the metadata of the assets by using the Azure integration runtime in the destination data source. When you're using a public network, authentication options and requirements vary based on the following factors:
-- **Data source type**. For example, if the data source is Azure SQL Database, you need to use a login with `db_datareader` access to each database. This can be a user-assigned managed identity or a Microsoft Purview managed identity. Or it can be a service principal in Azure Active Directory added to SQL Database as `db_datareader`.
- If the data source is Azure Blob Storage, you can use a Microsoft Purview managed identity, or a service principal in Azure Active Directory assigned the Storage Blob Data Reader role on the Azure storage account. Or use the storage account's key.
--- **Authentication type**. We recommend that you use a Microsoft Purview managed identity to scan Azure data sources when possible, to reduce administrative overhead. For any other authentication types, you need to [set up credentials for source authentication inside Microsoft Purview](manage-credentials.md):-
- 1. Generate a secret inside an Azure key vault.
- 1. Register the key vault inside Microsoft Purview.
- 1. Inside Microsoft Purview, create a new credential by using the secret saved in the key vault.
--- **Runtime type that's used in the scan**. Currently, you can't use a Microsoft Purview managed identity with a self-hosted integration runtime.-
-### Other considerations
-- If you choose to scan data sources using public endpoints, your self-hosted integration runtime VMs must have outbound access to data sources and Azure endpoints.
-- Your self-hosted integration runtime VMs must have [outbound connectivity to Azure endpoints](manage-integration-runtimes.md#networking-requirements).
-- Your Azure data sources must allow public access. If a service endpoint is enabled on the data source, make sure you _allow Azure services on the trusted services list_ to access your Azure data sources. The service endpoint routes traffic from the virtual network through an optimal path to Azure.
-## Option 2: Use private endpoints
-
-Similar to other PaaS solutions, Microsoft Purview doesn't support deploying directly into a virtual network. So you can't use certain networking features with the offering's resources, such as network security groups, route tables, or other network-dependent appliances such as Azure Firewall. Instead, you can use private endpoints that can be enabled on your virtual network. You can then disable public internet access to securely connect to Microsoft Purview.
-
-You must use private endpoints for your Microsoft Purview account if you have any of the following requirements:
-- You need to have end-to-end network isolation for Microsoft Purview accounts and data sources.
-- You need to [block public access](./catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to your Microsoft Purview accounts.
-- Your platform-as-a-service (PaaS) data sources are deployed with private endpoints, and you've blocked all access through the public endpoint.
-- Your on-premises or infrastructure-as-a-service (IaaS) data sources can't reach public endpoints.
-### Design considerations
-- To connect to your Microsoft Purview account privately and securely, you need to deploy an account and a portal private endpoint. For example, this deployment is necessary if you intend to connect to Microsoft Purview through the API or use the Microsoft Purview governance portal.
-- If you need to connect to the Microsoft Purview governance portal by using private endpoints, you have to deploy both account and portal private endpoints.
-- To scan data sources through private connectivity, you need to configure at least one account and one ingestion private endpoint for Microsoft Purview. You must configure scans by using a self-hosted integration runtime through an authentication method other than a Microsoft Purview managed identity.
-- Review [Support matrix for scanning data sources through an ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint) before you set up any scans.
-- Review [DNS requirements](catalog-private-link-name-resolution.md#deployment-options). If you're using a custom DNS server on your network, clients must be able to resolve the fully qualified domain name (FQDN) for the Microsoft Purview account endpoints to the private endpoint's IP address.
-- To scan Azure data sources through private connectivity, use [Managed VNet Runtime](catalog-managed-vnet.md). View [supported regions](catalog-managed-vnet.md#supported-regions). This option can reduce the administrative overhead of deploying and managing self-hosted integration runtime machines.
-### Integration runtime options
--- If your data sources are in Azure, you can choose any of the following runtime options:
-
- - Managed VNet runtime. Use this option if your Microsoft Purview account is deployed in any of the [supported regions](catalog-managed-vnet.md#supported-regions) and you are planning to scan any of the [supported data sources](catalog-managed-vnet.md#supported-data-sources).
-
- - Self-hosted integration runtime.
-
- - If using self-hosted integration runtime, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same or a peered virtual network where Microsoft Purview ingestion private endpoints are deployed. The Azure integration runtime won't work with ingestion private endpoints.
-
- - To scan on-premises data sources, you can also install a self-hosted integration runtime either on an on-premises Windows machine or on a VM inside an Azure virtual network.
-
- - When you're using private endpoints with Microsoft Purview, you need to allow network connectivity from data sources to the self-hosted integration VM on the Azure virtual network where Microsoft Purview private endpoints are deployed.
-
- - We recommend allowing automatic upgrade of the self-hosted integration runtime. Make sure you open required outbound rules in your Azure virtual network or on your corporate firewall to allow automatic upgrade. For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
-
-### Authentication options
-- You can't use a Microsoft Purview managed identity to scan data sources through ingestion private endpoints. Use a service principal, an account key, or SQL authentication, based on data source type.
-- Make sure that your credentials are stored in an Azure key vault and registered inside Microsoft Purview.
-- You must create a credential in Microsoft Purview based on each secret that you create in the Azure key vault. You need to assign, at minimum, _get_ and _list_ access for secrets for Microsoft Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Microsoft Purview account.
-### Current limitations
-- Scanning multiple Azure sources by using an entire subscription or resource group isn't supported when you're using ingestion private endpoints and a self-hosted integration runtime. Instead, register and scan data sources individually.
-- For limitations related to Microsoft Purview private endpoints, see [Known limitations](catalog-private-link-troubleshoot.md#known-limitations).
-- For limitations related to the Private Link service, see [Azure Private Link limits](../azure-resource-manager/management/azure-subscription-service-limits.md#private-link-limits).
-### Private endpoint scenarios
-
-#### Single virtual network, single region
-
-In this scenario, all Azure data sources, self-hosted integration runtime VMs, and Microsoft Purview private endpoints are deployed in the same virtual network in an Azure subscription.
-
-If on-premises data sources exist, connectivity is provided through a site-to-site VPN or Azure ExpressRoute connectivity to an Azure virtual network where Microsoft Purview private endpoints are deployed.
-
-This architecture is suitable mainly for small organizations or for development, testing, and proof-of-concept scenarios.
--
-#### Single region, multiple virtual networks
-
-To connect two or more virtual networks in Azure together, you can use [virtual network peering](../virtual-network/virtual-network-peering-overview.md). Network traffic between peered virtual networks is private and is kept on the Azure backbone network.
-
-Many customers build their network infrastructure in Azure by using the hub-and-spoke network architecture, where:
-- Networking shared services (such as network virtual appliances, ExpressRoute/VPN gateways, or DNS servers) are deployed in the hub virtual network.
-- Spoke virtual networks consume those shared services via virtual network peering.
-In hub-and-spoke network architectures, your organization's data governance team can be provided with an Azure subscription that includes a virtual network (hub). All data services can be located in a few other subscriptions connected to the hub virtual network through virtual network peering or a site-to-site VPN connection.
-
-In a hub-and-spoke architecture, you can deploy Microsoft Purview and one or more self-hosted integration runtime VMs in the hub subscription and virtual network. You can register and scan data sources from other virtual networks from multiple subscriptions in the same region.
-
-The self-hosted integration runtime VMs can be deployed inside the same Azure virtual network or a peered virtual network where the account and ingestion private endpoints are deployed.
--
-You can optionally deploy another self-hosted integration runtime in the spoke virtual networks.
-
-#### Multiple regions, multiple virtual networks
-
-If your data sources are distributed across multiple Azure regions in one or more Azure subscriptions, you can use this scenario.
-
-For performance and cost optimization, we highly recommend deploying one or more self-hosted integration runtime VMs in each region where data sources are located.
--
-#### Scan using Managed VNet Runtime
-
-You can use the Managed VNet Runtime to scan data sources in a private network if your Microsoft Purview account is deployed in any of the [supported regions](catalog-managed-vnet.md#supported-regions) and you're planning to scan any of the supported [Azure data sources](catalog-managed-vnet.md#supported-data-sources).
-
-Using Managed VNet Runtime helps to minimize the administrative overhead of managing the runtime and reduce overall scan duration.
-
-To scan any Azure data sources using Managed VNet Runtime, a managed private endpoint must be deployed within Microsoft Purview Managed Virtual Network, even if the data source already has a private network in your Azure subscription.
--
-If you need to scan on-premises data sources or additional data sources in Azure that are not supported by Managed VNet Runtime, you can deploy both Managed VNet Runtime and Self-hosted integration runtime.
--
-### If Microsoft Purview isn't available in your primary region
-
-> [!NOTE]
-> Follow recommendations in this section if Microsoft Purview isn't supported in your primary Azure region. For more information, see: [Selecting an Azure region](concept-best-practices-accounts.md#selecting-an-azure-region).
-
-If Microsoft Purview isn't available in your primary Azure region, and secure connectivity for metadata ingestion or user access is required to access Microsoft Purview governance portal, you can use the options below.
-
-For example, if most or all of your Azure data services are deployed in Australia Southeast, where Microsoft Purview isn't currently available, you could choose Australia East region to deploy your Microsoft Purview account and use the options below to enable private network connectivity and portal access.
-
-**Option 1: Deploy your Microsoft Purview account in a secondary region and deploy all private endpoints in the primary region, where your Azure data sources are located.**
-For this scenario:
-- This is the recommended option if Australia Southeast is the primary region for all your data sources and you have all network resources deployed in your primary region.
-- Deploy a Microsoft Purview account in your secondary region (for example, Australia East).
-- Deploy all Microsoft Purview private endpoints, including account, portal, and ingestion, in your primary region (for example, Australia Southeast).
-- Deploy all [Microsoft Purview self-hosted integration runtime](manage-integration-runtimes.md) VMs in your primary region (for example, Australia Southeast). This helps to reduce cross-region traffic, as the Data Map scans happen in the local region where data sources are located and only metadata is ingested into your secondary region where your Microsoft Purview account is deployed.
-- If you use [Microsoft Purview Managed VNets](catalog-managed-vnet.md) for metadata ingestion, the Managed VNet Runtime and all managed private endpoints are automatically deployed in the region where your Microsoft Purview account is deployed (for example, Australia East).
-**Option 2: Deploy your Microsoft Purview account in a secondary region and deploy private endpoints in the primary and secondary regions.**
-For this scenario:
- This option is recommended if you have data sources in both primary and secondary regions and users connect through the primary region.
- Deploy a Microsoft Purview account in your secondary region (for example, Australia East).
- Deploy a Microsoft Purview governance portal private endpoint in the primary region (for example, Australia Southeast) for user access to the Microsoft Purview governance portal.
- Deploy Microsoft Purview account and ingestion private endpoints in your primary region (for example, Australia Southeast) to scan data sources locally in the primary region.
- Deploy Microsoft Purview account and ingestion private endpoints in your secondary region (for example, Australia East) to scan data sources locally in the secondary region.
- Deploy [Microsoft Purview self-hosted integration runtime](manage-integration-runtimes.md) VMs in both primary and secondary regions. This keeps Data Map scan traffic in the local region and sends only metadata to the Microsoft Purview Data Map, which is configured in your secondary region (for example, Australia East).
- If you use [Microsoft Purview Managed VNets](catalog-managed-vnet.md) for metadata ingestion, the Managed VNet Runtime and all managed private endpoints are automatically deployed in the region where your Microsoft Purview account is deployed (for example, Australia East).
-### DNS configuration with private endpoints
-
-#### Name resolution for multiple Microsoft Purview accounts
-
Follow these recommendations if your organization needs to deploy and maintain multiple Microsoft Purview accounts using private endpoints:
-
-1. Deploy at least one _account_ private endpoint for each Microsoft Purview account.
-1. Deploy at least one set of _ingestion_ private endpoints for each Microsoft Purview account.
1. Deploy one _portal_ private endpoint for one of the Microsoft Purview accounts in your Azure environments. Create one DNS A record for the _portal_ private endpoint to resolve `web.purview.azure.com`. The _portal_ private endpoint can be used by all Microsoft Purview accounts in the same Azure virtual network, or in virtual networks connected through VNet peering.

This scenario also applies if multiple Microsoft Purview accounts are deployed across multiple subscriptions and multiple VNets that are connected through VNet peering. The _portal_ private endpoint mainly renders static assets related to the Microsoft Purview governance portal and is independent of any specific Microsoft Purview account. Therefore, if the VNets are connected, only one _portal_ private endpoint is needed to access all Microsoft Purview accounts in the Azure environment.
--
> [!NOTE]
> You may need to deploy separate _portal_ private endpoints for each Microsoft Purview account in scenarios where the Microsoft Purview accounts are deployed in isolated network segments.
> The Microsoft Purview _portal_ serves static content for all customers and contains no customer information. Optionally, you can use the public network (without a portal private endpoint) to open `web.purview.azure.com` if your end users are allowed to access the internet.
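As an illustrative sketch of the DNS setup described in this section, the A record for the shared _portal_ private endpoint could be created with the Azure CLI. The resource group, zone name, and IP address below are placeholders; verify the private DNS zone name used for your portal endpoint, as it can vary.

```azurecli
# Create an A record named "web" in the private DNS zone linked to your VNets,
# pointing at the portal private endpoint's private IP address (placeholder values).
az network private-dns record-set a add-record \
  --resource-group my-network-rg \
  --zone-name privatelink.purview.azure.com \
  --record-set-name web \
  --ipv4-address 10.1.0.5
```

With the zone linked to all peered VNets, clients in any of those networks resolve `web.purview.azure.com` to the single portal private endpoint.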
-
-## Option 3: Use both private and public endpoints
-
-You might choose an option in which a subset of your data sources uses private endpoints, and at the same time, you need to scan either of the following:
- Other data sources that are configured with a [service endpoint](../virtual-network/virtual-network-service-endpoints-overview.md)
- Data sources that have a public endpoint that's accessible through the internet
-If you need to scan some data sources by using an ingestion private endpoint and some data sources by using public endpoints or a service endpoint, you can:
-
-1. Use private endpoints for your Microsoft Purview account.
-1. Set **Public network access** to **Enabled from all networks** on your Microsoft Purview account.
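If you manage the account with the Azure CLI, the second step might look like the following sketch. It assumes the `purview` CLI extension is available, and the account and resource group names are placeholders.

```azurecli
# Allow public network access on the Microsoft Purview account so that sources
# behind service endpoints or public endpoints can still be scanned.
az extension add --name purview
az purview account update \
  --name my-purview-account \
  --resource-group my-purview-rg \
  --public-network-access Enabled
```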
-
-### Integration runtime options
- To scan an Azure data source that's configured with a private endpoint, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same virtual network, or a peered virtual network, where the Microsoft Purview account and ingestion private endpoints are deployed.
- When you're using a private endpoint with Microsoft Purview, you need to allow network connectivity from data sources to a self-hosted integration VM on the Azure virtual network where Microsoft Purview private endpoints are deployed.
- To scan an Azure data source that's configured to allow a public endpoint, you can use the Azure integration runtime.
- To scan on-premises data sources, you can also install a self-hosted integration runtime on either an on-premises Windows machine or a VM inside an Azure virtual network.
- We recommend allowing automatic upgrade for a self-hosted integration runtime. Make sure you open the required outbound rules in your Azure virtual network or on your corporate firewall to allow automatic upgrade. For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
-### Authentication options
- To scan an Azure data source that's configured to allow a public endpoint, you can use any authentication option, based on the data source type.
-
- If you use an ingestion private endpoint to scan an Azure data source that's configured with a private endpoint:
- - You can't use a Microsoft Purview managed identity. Instead, use a service principal, an account key, or SQL authentication, based on the data source type.
-
- - Make sure that your credentials are stored in an Azure key vault and registered inside Microsoft Purview.
-
- - You must create a credential in Microsoft Purview based on each secret that you create in Azure Key Vault. At minimum, assign _get_ and _list_ access for secrets for Microsoft Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Microsoft Purview account.
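For the Key Vault access described above, granting _get_ and _list_ secret permissions to the Microsoft Purview identity can be scripted. The vault name and principal object ID below are placeholders, and this sketch assumes the vault uses access policies rather than Azure RBAC.

```azurecli
# Grant the identity used by Microsoft Purview (a service principal here,
# since the managed identity can't be used with ingestion private endpoints)
# permission to read the secrets that back Microsoft Purview credentials.
az keyvault set-policy \
  --name my-credentials-kv \
  --object-id 00000000-0000-0000-0000-000000000000 \
  --secret-permissions get list
```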
-
-## Option 4: Use private endpoints for ingestion only
-
-You might choose this option if you need to:
- Scan all data sources by using an ingestion private endpoint.
- Configure managed resources to disable public network access.
- Enable access to the Microsoft Purview governance portal through the public network.
-To enable this option:
-
1. Configure an ingestion private endpoint for your Microsoft Purview account.
-1. Set **Public network access** to **Disabled for ingestion only (Preview)** on your [Microsoft Purview account](catalog-firewall.md).
-
-### Integration runtime options
-
Follow the recommendations for option 2.
-
-### Authentication options
-
Follow the recommendations for option 2.
-
-## Self-hosted integration runtime network and proxy recommendations
-
-For scanning data sources across your on-premises and Azure networks, you may need to deploy and use one or multiple [self-hosted integration runtime virtual machines](manage-integration-runtimes.md) inside an Azure VNet or an on-premises network, for any of the scenarios mentioned earlier in this document.
- To simplify management, when possible, use the Azure runtime and the [Microsoft Purview Managed runtime](catalog-managed-vnet.md) to scan Azure data sources.
- The self-hosted integration runtime service can communicate with Microsoft Purview through a public or private network over port 443. For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
- One self-hosted integration runtime VM can be used to scan one or multiple data sources in Microsoft Purview. However, the self-hosted integration runtime must be registered only for Microsoft Purview and can't be used for Azure Data Factory or Azure Synapse at the same time.
- You can register and use one or multiple self-hosted integration runtimes in one Microsoft Purview account. It's recommended to place at least one self-hosted integration runtime VM in each region or on-premises network where your data sources reside.
- It's recommended to define a baseline for the required capacity of each self-hosted integration runtime VM and scale the VM capacity based on demand.
- It's recommended to set up the network connection between self-hosted integration runtime VMs and Microsoft Purview and its managed resources through a private network, when possible.
- Allow outbound connectivity to download.microsoft.com if auto-update is enabled.
- The self-hosted integration runtime service doesn't require outbound internet connectivity if the self-hosted integration runtime VMs are deployed in an Azure VNet or in an on-premises network that's connected to Azure through an ExpressRoute or site-to-site VPN connection. In this case, the scan and metadata ingestion process can be done through the private network.
- The self-hosted integration runtime can communicate with Microsoft Purview and its managed resources directly or through [a proxy server](manage-integration-runtimes.md#proxy-server-considerations). Avoid using proxy settings if the self-hosted integration runtime VM is inside an Azure VNet or connected through an ExpressRoute or site-to-site VPN connection.
- Review the supported scenarios if you need to use the self-hosted integration runtime with a [proxy setting](manage-integration-runtimes.md#proxy-server-considerations).
-## Next steps
- [Use private endpoints for secure access to Microsoft Purview](./catalog-private-link.md)
purview Concept Best Practices Scanning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-scanning.md
Title: Best practices for scanning data sources in Microsoft Purview
description: This article provides best practices for registering and scanning various data sources in Microsoft Purview.
Previously updated: 02/23/2023
-# Microsoft Purview scanning best practices
-
-[Microsoft Purview governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions) support automated scanning of on-premises, multicloud, and software as a service (SaaS) data sources.
-
-Running a *scan* invokes the process to ingest metadata from the registered data sources. The metadata curated at the end of the scan and curation process includes technical metadata. This metadata can include data asset names such as table names or file names, file size, columns, and data lineage. Schema details are also captured for structured data sources. A relational database management system is an example of this type of source.
-
-The curation process applies automated classification labels on the schema attributes based on the scan rule set configured. Sensitivity labels are applied if your Microsoft Purview account is connected to the Microsoft Purview compliance portal.
-
> [!IMPORTANT]
> If you have any [Azure Policies](../governance/policy/overview.md) preventing **updates to Storage accounts**, they'll cause errors for Microsoft Purview's scanning process. Follow the [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) to create an exception for Microsoft Purview accounts.
-
-## Why do you need best practices to manage data sources?
-
-Best practices enable you to:
- Optimize cost.
- Build operational excellence.
- Improve security compliance.
- Gain performance efficiency.
-## Register a source and establish a connection
-
-The following design considerations and recommendations help you register a source and establish a connection.
-
-### Design considerations
- Use collections to create the hierarchy that aligns with the organization's strategy, like geographical, business function, or source of data. The hierarchy defines the data sources to be registered and scanned.
- By design, you can't register data sources multiple times in the same Microsoft Purview account. This architecture helps to avoid the risk of assigning different access control to the same data source.
-### Design recommendations
- If the metadata of the same data source is consumed by multiple teams, you can register and manage the data source at a parent collection. Then you can create corresponding scans under each subcollection. In this way, relevant assets appear under each child collection. Sources without parents are grouped in a dotted box in the map view. No arrows link them to parents.
- :::image type="content" source="media/concept-best-practices/scanning-parent-child.png" alt-text="Screenshot that shows Microsoft Purview with data source registered at parent collection.":::
- Use the **Azure Multiple** option if you need to register multiple sources, such as Azure subscriptions or resource groups, in the cloud. For more information, see the following documentation:
- * [Scan multiple sources in Microsoft Purview](./register-scan-azure-multiple-sources.md)
- * [Check data source readiness at scale](./tutorial-data-sources-readiness.md)
- * [Configure access to data sources for Microsoft Purview MSI at scale](./tutorial-msi-configuration.md)
- After a data source is registered, you might scan the same source multiple times, in case the same source is being used differently by various teams or business units.
-For more information on how to define a hierarchy for registering data sources, see [Best practices on collections architecture](./concept-best-practices-collections.md).
-
-## Scanning
-
-The following design considerations and recommendations are organized based on the key steps involved in the scanning process.
-
-### Design considerations
- After the data source is registered, set up a scan to manage automated and secure metadata scanning and curation.
- Scan setup includes configuring the name of the scan, the scope of the scan, the integration runtime, the scan trigger frequency, the scan rule set, and the resource set, uniquely for each data source per scan frequency.
- Before you create any credentials, consider your data source types and networking requirements. This information helps you decide which authentication method and integration runtime you need for your scenario.
-### Design recommendations
-
-After you register your source in the relevant [collection](./how-to-create-and-manage-collections.md), plan and follow the order shown here when you set up the scan. This process order helps you to avoid unexpected costs and rework.
--
1. Identify your classification requirements from the system's built-in classification rules. Or you can create specific custom classification rules, as necessary, based on specific industry, business, or regional requirements that aren't available out of the box:
- - See the [classification best practices](./concept-best-practices-classification.md).
- - See how to [create a custom classification and classification rule](./create-a-custom-classification-and-classification-rule.md).
-
-1. Create scan rule sets before you configure the scan.
-
- :::image type="content" source="media/concept-best-practices/scanning-create-scan-rule-set.png" alt-text="Screenshot that shows the Scan rule sets under Data map.":::
-
- When you create the scan rule set, ensure the following points:
- - Verify if the system default scan rule set is sufficient for the data source being scanned. Otherwise, define your custom scan rule set.
- - The custom scan rule set can include both system default and custom, so clear those options not relevant for the data assets being scanned.
- - Where necessary, create a custom rule set to exclude unwanted classification labels. For example, the system rule set contains generic government code patterns for the planet, not just the United States. Your data might match the pattern of some other type, such as "Belgium Driver's License Number."
- - Limit custom classification rules to *most important* and *relevant* labels to avoid clutter. You don't want to have too many labels tagged to the asset.
- - If you modify the custom classification or scan rule set, a full scan is triggered. Configure the classification and scan rule set appropriately to avoid rework and costly full scans.
-
- :::image type="content" source="media/concept-best-practices/scanning-create-custom-scan-rule-set.png" alt-text="Screenshot that shows the option to select relevant classification rules when you create the custom scan rule set.":::
-
- > [!NOTE]
- > When you scan a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets forms a resource set. You can use resource set pattern rules to customize or override how Microsoft Purview detects which assets are grouped as resource sets. The rules also determine how the assets are displayed within the catalog.
- > For more information, see [Create resource set pattern rules](./how-to-resource-set-pattern-rules.md).
- > This feature has cost considerations. For information, see the [pricing page](https://azure.microsoft.com/pricing/details/azure-purview/).
-
-1. Set up a scan for the registered data sources.
- - **Scan name**: By default, Microsoft Purview uses the naming convention **SCAN-[A-Z][a-z][a-z]**, which isn't helpful when you're trying to identify a scan that you've run. Be sure to use a meaningful naming convention. For instance, you could name the scan _environment-source-frequency-time_ as DEVODS-Daily-0200. This name represents a daily scan at 0200 hours.
-
- - **Authentication**: Microsoft Purview offers various authentication methods for scanning data sources, depending on the type of source. It could be Azure cloud or on-premises or third-party sources. Follow the least-privilege principle for the authentication method in this order of preference:
- - Microsoft Purview MSI - Managed Service Identity (for example, for Azure Data Lake Storage Gen2 sources)
- - User-assigned managed identity
- - Service principal
- - SQL authentication (for example, for on-premises or Azure SQL sources)
- - Account key or basic authentication (for example, for SAP S/4HANA sources)
-
- For more information, see the how-to guide to [manage credentials](./manage-credentials.md).
-
- > [!Note]
- > If you have a firewall enabled for the storage account, you must use the managed identity authentication method when you set up a scan.
- > When you set up a new credential, the credential name can only contain _letters, numbers, underscores, and hyphens_.
-
- - **Integration runtime**
- - For more information, see [Network architecture best practices](./concept-best-practices-network.md#integration-runtime-options).
- - If self-hosted integration runtime (SHIR) is deleted, any ongoing scans that rely on it will fail.
- - When you use SHIR, make sure that the memory is sufficient for the data source being scanned. For example, when you use SHIR for scanning an SAP source, if you see "out of memory error":
- - Ensure the SHIR machine has enough memory. The recommended amount is 128 GB.
- - In the scan setting, set the maximum memory available as some appropriate value, for example, 100.
 - For more information, see the prerequisites in [Connect to and manage SAP ECC in Microsoft Purview](./register-scan-sapecc-source.md#create-and-run-scan).
-
- - **Scope scan**
- - When you set up the scope for the scan, select only the assets that are relevant at a granular level or parent level. This practice ensures that the scan cost is optimal and performance is efficient. All future assets under a certain parent will be automatically selected if the parent is fully or partially checked.
- - Some examples for some data sources:
- - For Azure SQL Database or Data Lake Storage Gen2, you can scope your scan to specific parts of the data source. Select the appropriate items in the list, such as folders, subfolders, collections, or schemas.
- - For Oracle, Hive Metastore Database, and Teradata sources, a specific list of schemas to be exported can be specified through semicolon-separated values or schema name patterns by using SQL LIKE expressions.
 - For Google BigQuery, a specific list of datasets to be exported can be specified through semicolon-separated values.
- - When you create a scan for an entire AWS account, you can select specific buckets to scan. When you create a scan for a specific AWS S3 bucket, you can select specific folders to scan.
- - For Erwin, you can scope your scan by providing a semicolon-separated list of Erwin model locator strings.
- - For Cassandra, a specific list of key spaces to be exported can be specified through semicolon-separated values or through key spaces name patterns by using SQL LIKE expressions.
- - For Looker, you can scope your scan by providing a semicolon-separated list of Looker projects.
 - For a Power BI tenant, you can only specify whether to include or exclude personal workspaces.
-
- :::image type="content" source="media/concept-best-practices/scanning-scope-scan.png" alt-text="Screenshot that shows the option to scope a scan while configuring the scan.":::
-
 - In general, use "ignore patterns," where they're supported, based on wildcards (for example, for data lakes), to exclude temp and config files, RDBMS system tables, and backup or staging (STG) tables.
- - When you scan documents or unstructured data, avoid scanning a huge number of such documents. The scan processes the first 20 MB of such documents and might result in longer scan duration.
-
- - **Scan rule set**
- - When you select the scan rule set, make sure to configure the relevant system or custom scan rule set that was created earlier.
 - You can create custom file types and fill in the details accordingly. Currently, Microsoft Purview supports only a one-character custom delimiter. If you use custom delimiters, such as ~, in your actual data, you need to create a new scan rule set.
-
- :::image type="content" source="media/concept-best-practices/scanning-scan-rule-set.png" alt-text="Screenshot that shows the scan rule set selection while configuring the scan.":::
-
- - **Scan type and schedule**
- - The scan process can be configured to run full or incremental scans.
- - Run the scans during non-business or off-peak hours to avoid any processing overload on the source.
 - **Start recurrence at** must be at least one minute earlier than the **schedule scan time**; otherwise, the scan is triggered in the next recurrence.
- - Initial scan is a full scan, and every subsequent scan is incremental. Subsequent scans can be scheduled as periodic incremental scans.
- - The frequency of scans should align with the change management schedule of the data source or business requirements. For example:
- - If the source structure could potentially change weekly, the scan frequency should be in sync. Changes include new assets or fields within an asset that are added, modified, or deleted.
- - If the classification or sensitivity labels are expected to be up to date on a weekly basis, perhaps for regulatory reasons, the scan frequency should be weekly.
 For example, if partition files are added every week to a source data lake, you might schedule monthly scans. You don't need to schedule weekly scans, because there's no change in metadata. This suggestion assumes there are no new classification scenarios.
- - When you schedule a scan to run on the same day it's created, the start time must be before the scan time by at least one minute.
- - The maximum duration that the scan can run is seven days, possibly because of memory issues. This time period excludes the ingestion process. If progress hasn't been updated after seven days, the scan is marked as failed. The ingestion (into catalog) process currently doesn't have any such limitation.
-
- - **Canceling scans**
- - Currently, scans can only be canceled or paused if the status of the scan has transitioned into an "In Progress" state from "Queued" after you trigger the scan.
- - Canceling an individual child scan isn't supported.
-
-### Points to note
- If a field, column, table, or file is removed from the source system after the scan was executed, it will be reflected (removed) in Microsoft Purview only after the next scheduled full or incremental scan.
- An asset can be deleted from a Microsoft Purview catalog by using the **Delete** icon under the name of the asset. This action doesn't remove the object in the source. If you run a full scan on the same source, it would get reingested in the catalog. If you've scheduled a weekly or monthly (incremental) scan instead, the deleted asset won't be picked up unless the object is modified at the source, for example, if a column is added to or removed from the table.
- To understand the behavior of subsequent scans after *manually* editing a data asset or an underlying schema through the Microsoft Purview governance portal, see [Catalog asset details](./catalog-asset-details.md#editing-assets).
- For more information, see the tutorial on [how to view, edit, and delete assets](./catalog-asset-details.md).
-## Next steps
-
-[Manage data sources](./manage-data-sources.md)
purview Concept Best Practices Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-security.md
Title: Microsoft Purview security best practices
description: This article provides Microsoft Purview best practices.
Previously updated: 12/09/2022
-# Microsoft Purview security best practices
-
-This article provides best practices for common security requirements for Microsoft Purview governance solutions. The security strategy described follows the layered defense-in-depth approach.
-
>[!NOTE]
>These best practices cover security for [Microsoft Purview unified data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions). For more information about Microsoft Purview risk and compliance solutions, see the [compliance documentation](/microsoft-365/compliance/). For more information about Microsoft Purview in general, see the [Microsoft Purview overview](/purview/purview).
--
Before applying these recommendations to your environment, consult your security team, because some recommendations may not be applicable to your security requirements.
-
-## Network security
-
-Microsoft Purview is a Platform as a Service (PaaS) solution in Azure. You can enable the following network security capabilities for your Microsoft Purview accounts:
- Enable [end-to-end network isolation](catalog-private-link-end-to-end.md) by using Private Link Service.
- Use the [Microsoft Purview firewall](catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to disable public access.
- Deploy [network security group (NSG) rules](#use-network-security-groups) for the subnets where Azure data source private endpoints, Microsoft Purview private endpoints, and self-hosted integration runtime VMs are deployed.
- Implement Microsoft Purview with private endpoints managed by a network virtual appliance, such as [Azure Firewall](../firewall/overview.md), for network inspection and network filtering.
-For more information, see [Best practices related to connectivity to Azure PaaS Services](/azure/cloud-adoption-framework/ready/azure-best-practices/connectivity-to-azure-paas-services).
-
-### Deploy private endpoints for Microsoft Purview accounts
-
If you need to use Microsoft Purview from inside your private network, it's recommended to use Azure Private Link Service with your Microsoft Purview accounts, for partial or [end-to-end isolation](catalog-private-link-end-to-end.md), to connect to the Microsoft Purview governance portal, access Microsoft Purview endpoints, and scan data sources.
-
-The Microsoft Purview _account_ private endpoint is used to add another layer of security, so only client calls that are originated from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-
-The Microsoft Purview _portal_ private endpoint is required to enable connectivity to Microsoft Purview governance portal using a private network.
-
-Microsoft Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints.
- For scanning Azure _platform as a service_ data sources, review the [support matrix for scanning data sources through an ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint).
- If you're deploying Microsoft Purview with end-to-end network isolation, the Azure data sources you scan must also be configured with private endpoints.
- Review the [known limitations](catalog-private-link-troubleshoot.md).
-For more information, see [Microsoft Purview network architecture and best practices](concept-best-practices-network.md).
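As an illustrative sketch, the _account_ and _portal_ private endpoints described above could be created with the Azure CLI. All resource names and the subscription ID segment are placeholders, and the `--group-id` values (`account`, `portal`) are the Private Link sub-resources exposed by Microsoft Purview accounts.

```azurecli
# Create the account private endpoint (a prerequisite for the portal endpoint).
az network private-endpoint create \
  --name pe-purview-account \
  --resource-group my-network-rg \
  --vnet-name my-vnet --subnet my-subnet \
  --private-connection-resource-id "/subscriptions/<sub-id>/resourceGroups/my-purview-rg/providers/Microsoft.Purview/accounts/my-purview-account" \
  --group-id account \
  --connection-name purview-account-connection

# Create the portal private endpoint for governance portal access.
az network private-endpoint create \
  --name pe-purview-portal \
  --resource-group my-network-rg \
  --vnet-name my-vnet --subnet my-subnet \
  --private-connection-resource-id "/subscriptions/<sub-id>/resourceGroups/my-purview-rg/providers/Microsoft.Purview/accounts/my-purview-account" \
  --group-id portal \
  --connection-name purview-portal-connection
```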
-
-### Block public access using Microsoft Purview firewall
-
You can disable Microsoft Purview public access to completely cut off access to the Microsoft Purview account from the public internet. In this case, consider the following requirements:
- Microsoft Purview must be deployed based on the [end-to-end network isolation scenario](catalog-private-link-end-to-end.md).
- To access the Microsoft Purview governance portal and Microsoft Purview endpoints, you need to use a management machine that's connected to the private network.
- Review the [known limitations](catalog-private-link-troubleshoot.md).
- To scan Azure platform as a service data sources, review the [support matrix for scanning data sources through an ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint).
- Azure data sources must also be configured with private endpoints.
- To scan data sources, you must use a self-hosted integration runtime.
-For more information, see [Firewalls to restrict public access](catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access).
-
-### Use Network Security Groups
-
-You can use an Azure network security group to filter network traffic to and from Azure resources in an Azure virtual network. A network security group contains [security rules](../virtual-network/network-security-groups-overview.md#security-rules) that allow or deny inbound network traffic to, or outbound network traffic from, several types of Azure resources. For each rule, you can specify source and destination, port, and protocol.
-
Network security groups can be applied to network interfaces or to Azure virtual network subnets where Microsoft Purview private endpoints, self-hosted integration runtime VMs, and Azure data sources are deployed.
-
-For more information, see [apply NSG rules for private endpoints](../private-link/disable-private-endpoint-network-policy.md).
-
-The following NSG rules are required on **data sources** for Microsoft Purview scanning:
-
-|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
-||||||||
-|Inbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Data Sources private IP addresses or Subnets | 443 | Any | Allow |
--
The following NSG rules are required on the **management machines** to access the Microsoft Purview governance portal:
-
-|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
-||||||||
-|Outbound | Management machines' private IP addresses or subnets | * | Microsoft Purview account and portal private endpoint IP addresses or subnets | 443 | Any | Allow |
-|Outbound | Management machines' private IP addresses or subnets | * | Service tag: `AzureCloud` | 443 | Any | Allow |
--
-The following NSG rules are required on **self-hosted integration runtime VMs** for Microsoft Purview scanning and metadata ingestion:
-
- > [!IMPORTANT]
- > Consider adding additional rules with relevant Service Tags, based on your data source types.
-
-|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
-||||||||
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Data Sources private IP addresses or subnets | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or Subnets | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `Servicebus` | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `Storage` | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `AzureActiveDirectory` | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `DataFactory` | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `KeyVault` | 443 | Any | Allow |
-
-The following NSG rules are required for the **Microsoft Purview account, portal, and ingestion private endpoints**:
-
-|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
-||||||||
-|Inbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
-|Inbound | Management machines' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
-
-For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
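As a sketch, a rule like the outbound Service Bus row in the self-hosted integration runtime table above can be created with the Azure CLI. The resource group, NSG name, and subnet prefix below are hypothetical values; substitute your own:

```shell
# Hypothetical names: rg-purview, nsg-shir, and 10.1.0.0/24 (the SHIR subnet).
# Allows the self-hosted integration runtime subnet to reach Azure Service Bus
# over port 443, matching the outbound rule in the table above.
az network nsg rule create \
  --resource-group rg-purview \
  --nsg-name nsg-shir \
  --name Allow-SHIR-to-ServiceBus \
  --priority 200 \
  --direction Outbound \
  --access Allow \
  --protocol '*' \
  --source-address-prefixes 10.1.0.0/24 \
  --source-port-ranges '*' \
  --destination-address-prefixes ServiceBus \
  --destination-port-ranges 443
```

The same pattern applies to the other service tags in the table (`Storage`, `AzureActiveDirectory`, and so on), with one rule per tag.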
-
-## Access management
-
-Identity and access management provides the basis for a large part of security assurance. It enables access based on identity authentication and authorization controls in cloud services. These controls protect data and resources and decide which requests should be permitted.
-
-Related to roles and access management in Microsoft Purview, you can apply the following security best practices:
-
-- Define roles and responsibilities to manage Microsoft Purview in the control plane and data plane:
- - Define roles and tasks required to deploy and manage Microsoft Purview inside an Azure subscription.
- - Define roles and tasks needed to perform data management and governance by using Microsoft Purview.
-- Assign roles to Azure Active Directory groups instead of assigning roles to individual users.
-- Use Azure [Active Directory entitlement management](../active-directory/governance/entitlement-management-overview.md) to map user access to Azure AD groups by using access packages.
-- Enforce multi-factor authentication for Microsoft Purview users, especially for users with privileged roles such as collection admins, data source admins, or data curators.
-
-### Manage a Microsoft Purview account in control plane and data plane
-
-Control plane refers to all operations related to Azure deployment and management of Microsoft Purview inside Azure Resource Manager.
-
-Data plane refers to all operations related to interacting with Microsoft Purview inside the Data Map and Data Catalog.
-
-You can assign control plane and data plane roles to users, security groups, and service principals from the Azure Active Directory tenant that is associated with your Microsoft Purview instance's Azure subscription.
-
-Examples of control plane operations and data plane operations:
-
-|Task |Scope |Recommended role |What roles to use? |
-|||||
-|Deploy a Microsoft Purview account | Control plane | Azure subscription owner or contributor | Azure RBAC roles |
-|Set up a private endpoint for Microsoft Purview | Control plane | Contributor | Azure RBAC roles |
-|Delete a Microsoft Purview account | Control plane | Contributor | Azure RBAC roles |
-|Add or manage a [self-hosted integration runtime (SHIR)](manage-integration-runtimes.md) | Control plane | Data source administrator |Microsoft Purview roles |
-|View Microsoft Purview metrics to get current capacity units | Control plane | Reader | Azure RBAC roles |
-|Create a collection | Data plane | Collection Admin | Microsoft Purview roles |
-|Register a data source | Data plane | Collection Admin | Microsoft Purview roles |
-|Scan a SQL Server | Data plane | Data source admin and data reader or data curator | Microsoft Purview roles |
-|Search inside Microsoft Purview Data Catalog | Data plane | Data source admin and data reader or data curator | Microsoft Purview roles |
-
-Microsoft Purview data plane roles are defined and managed inside the Microsoft Purview instance, in Microsoft Purview collections. For more information, see [Access control in Microsoft Purview](catalog-permissions.md#roles).
-
-Follow [Azure role-based access recommendations](../role-based-access-control/best-practices.md) for Azure control plane tasks.
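For control-plane tasks like those in the table, Azure RBAC roles are assigned at a scope. A minimal Azure CLI sketch, using a hypothetical Azure AD group object ID and resource group name:

```shell
# Hypothetical values: the object ID of an Azure AD group and rg-purview.
# Grants Contributor at resource-group scope, which covers control-plane tasks
# such as setting up a private endpoint for Microsoft Purview.
az role assignment create \
  --assignee 00000000-0000-0000-0000-000000000000 \
  --role "Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/rg-purview"
```

Assigning to a group rather than an individual user follows the group-based access recommendation earlier in this section.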
-
-### Authentication and authorization
-
-To gain access to Microsoft Purview, users must be authenticated and authorized. Authentication is the process of proving the user is who they claim to be. Authorization refers to controlling access inside Microsoft Purview, which is assigned on collections.
-
-Azure Active Directory provides the authentication and authorization mechanisms for Microsoft Purview inside collections. You can assign Microsoft Purview roles to the following security principals from the Azure Active Directory tenant that is associated with the Azure subscription where your Microsoft Purview instance is hosted:
-- Users and guest users (if they're already added to your Azure AD tenant)
-- Security groups
-- Managed identities
-- Service principals
-
-Microsoft Purview fine-grained roles can be assigned to a flexible Collections hierarchy inside the Microsoft Purview instance.
-
-### Define Least Privilege model
-
-As a general rule, restricting access based on the [need to know](https://en.wikipedia.org/wiki/Need_to_know) and [least privilege](https://en.wikipedia.org/wiki/Principle_of_least_privilege) security principles is imperative for organizations that want to enforce security policies for data access.
-
-In Microsoft Purview, data sources, assets, and scans can be organized by using [Microsoft Purview collections](quickstart-create-collection.md). Collections are a hierarchical grouping of metadata in Microsoft Purview, but at the same time they provide a mechanism to manage access across Microsoft Purview. Roles in Microsoft Purview can be assigned to a collection based on your collection hierarchy.
-
-Use [Microsoft Purview collections](concept-best-practices-collections.md#define-a-collection-hierarchy) to implement your organization's metadata hierarchy for centralized or delegated management and governance, based on the least privilege model.
-
-Follow a least privilege access model when assigning roles inside Microsoft Purview collections: segregate duties within your team and grant users only the access they need to perform their jobs.
-
-For more information about how to apply a least privilege access model in Microsoft Purview, based on the Microsoft Purview collection hierarchy, see [Access control in Microsoft Purview](catalog-permissions.md#assign-permissions-to-your-users).
-
-### Lower exposure of privileged accounts
-
-Securing privileged access is a critical first step to protecting business assets. Minimizing the number of people who have access to secure information or resources reduces the chance of a malicious user getting access, or an authorized user inadvertently affecting a sensitive resource.
-
-Reduce the number of users with write access inside your Microsoft Purview instance. Keep the number of collection admin and data curator role assignments at the root collection to a minimum.
-
-### Use multi-factor authentication and conditional access
-
-[Azure Active Directory Multi-Factor Authentication](../active-directory/authentication/concept-mfa-howitworks.md) provides another layer of security and authentication. For more security, we recommend enforcing [conditional access policies](../active-directory/conditional-access/overview.md) for all privileged accounts.
-
-By using Azure Active Directory Conditional Access policies, apply Azure AD Multi-Factor Authentication at sign-in for all individual users who are assigned Microsoft Purview roles with modify access inside your Microsoft Purview instances: Collection Admin, Data Source Admin, and Data Curator.
-
-Enable multi-factor authentication for your admin accounts and ensure that admin account users have registered for MFA.
-
-You can define your Conditional Access policies by selecting Microsoft Purview as a Cloud App.
-
-### Prevent accidental deletion of Microsoft Purview accounts
-
-In Azure, you can apply [resource locks](../azure-resource-manager/management/lock-resources.md) to an Azure subscription, a resource group, or a resource to prevent accidental deletion or modification for critical resources.
-
-Enable Azure resource lock for your Microsoft Purview accounts to prevent accidental deletion of Microsoft Purview instances in your Azure subscriptions.
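A resource lock can be added with the Azure CLI; the lock, account, and resource-group names below are hypothetical:

```shell
# Adds a CanNotDelete lock to a Microsoft Purview account (hypothetical names).
# The account can't be deleted until the lock itself is removed.
az lock create \
  --name purview-no-delete \
  --lock-type CanNotDelete \
  --resource-group rg-purview \
  --resource contoso-purview \
  --resource-type Microsoft.Purview/accounts
```

A `ReadOnly` lock can be used instead of `CanNotDelete` if you also want to block control-plane modifications.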
-
-Adding a `CanNotDelete` or `ReadOnly` lock to a Microsoft Purview account doesn't prevent deletion or modification operations inside the Microsoft Purview data plane. However, it prevents any operations in the control plane, such as deleting the Microsoft Purview account, deploying a private endpoint, or configuring diagnostic settings.
-
-For more information, see [Understand scope of locks](../azure-resource-manager/management/lock-resources.md#understand-scope-of-locks).
-
-Resource locks can be assigned to Microsoft Purview resource groups or resources. However, you can't assign an Azure resource lock to Microsoft Purview managed resources or the managed resource group.
-
-### Implement a break glass strategy
-Plan for a break glass strategy for your Azure Active Directory tenant, Azure subscription and Microsoft Purview accounts to prevent tenant-wide account lockout.
-
-For more information about Azure AD and Azure emergency access planning, see [Manage emergency access accounts in Azure AD](../active-directory/roles/security-emergency-access.md).
-
-For more information about Microsoft Purview break glass strategy, see [Microsoft Purview collections best practices and design recommendations](concept-best-practices-collections.md#design-recommendations).
-
-## Threat protection and preventing data exfiltration
-
-Microsoft Purview provides rich insights into the sensitivity of your data, which makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. To address this challenge, we're announcing the integration between Microsoft Defender for Cloud and Microsoft Purview in public preview.
-
-### Integrate with Microsoft 365 and Microsoft Defender for Cloud
-
-Often, one of the biggest challenges for a security organization in a company is to identify and protect assets based on their criticality and sensitivity. Microsoft recently [announced integration between Microsoft Purview and Microsoft Defender for Cloud in public preview](https://techcommunity.microsoft.com/t5/azure-purview-blog/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954) to help overcome these challenges.
-
-If you've extended your Microsoft 365 sensitivity labels to assets and database columns in Microsoft Purview, you can keep track of highly valuable assets by using Microsoft Defender for Cloud, through inventory, alerts, and recommendations based on the assets' detected sensitivity labels.
-
-- For recommendations, we've provided **security controls** to help you understand how important each recommendation is to your overall security posture. Microsoft Defender for Cloud includes a **secure score** value for each control to help you prioritize your security work. Learn more in [Security controls and their recommendations](../defender-for-cloud/secure-score-security-controls.md#security-controls-and-their-recommendations).
-
-- For alerts, we've assigned **severity labels** to each alert to help you prioritize the order in which you attend to each alert. Learn more in [How are alerts classified?](../defender-for-cloud/alerts-overview.md#how-are-alerts-classified).
-
-For more information, see [Integrate Microsoft Purview with Azure security products](how-to-integrate-with-azure-security-products.md).
-
-## Information Protection
-
-### Secure metadata extraction and storage
-
-Microsoft Purview is a data governance solution in the cloud. You can register and scan different data sources from various data systems in your on-premises, Azure, or multicloud environments into Microsoft Purview. While a data source is registered and scanned in Microsoft Purview, the actual data and data sources stay in their original locations; only metadata is extracted and stored in the Microsoft Purview Data Map. This means you don't need to move data out of the region or its original location to extract the metadata into Microsoft Purview.
-
-When a Microsoft Purview account is deployed, a managed resource group is also deployed in your Azure subscription. A managed Azure storage account is deployed inside this resource group and is used to ingest metadata from data sources during scans. Because these resources are consumed by Microsoft Purview, they can't be accessed by any other users or principals, except the Microsoft Purview account. An Azure role-based access control (RBAC) deny assignment is added automatically for all principals to this resource group at the time of Microsoft Purview account deployment, preventing any CRUD operations on these resources that aren't initiated from Microsoft Purview.
-
-### Where is metadata stored?
-
-Microsoft Purview extracts only the metadata from different data source systems into [Microsoft Purview Data Map](concept-elastic-data-map.md) during the scanning process.
-
-You can deploy a Microsoft Purview account inside your Azure subscription in any [supported Azure regions](https://azure.microsoft.com/global-infrastructure/services/?products=purview&regions=all).
-
-All metadata is stored in the Data Map inside your Microsoft Purview instance, which means the metadata is stored in the same region as your Microsoft Purview instance.
-
-### How is metadata extracted from data sources?
-
-Microsoft Purview allows you to use any of the following options to extract metadata from data sources:
-
-- **Azure runtime**. Metadata is extracted and processed inside the same region as your data sources.
-
- :::image type="content" source="media/concept-best-practices/security-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, the Azure runtime, and data sources." lightbox="media/concept-best-practices/security-azure-runtime.png":::
-
- 1. A manual or automatic scan is initiated from the Microsoft Purview Data Map through the Azure integration runtime.
-
- 2. The Azure integration runtime connects to the data source to extract metadata.
-
- 3. Metadata is queued in Microsoft Purview managed storage and stored in Azure Blob Storage.
-
- 4. Metadata is sent to the Microsoft Purview Data Map.
-
-- **Self-hosted integration runtime**. Metadata is extracted and processed by the self-hosted integration runtime inside the self-hosted integration runtime VMs' memory before it's sent to the Microsoft Purview Data Map. In this case, customers have to deploy and manage one or more Windows-based self-hosted integration runtime virtual machines inside their Azure subscriptions or on-premises environments. Scanning on-premises and VM-based data sources always requires using a self-hosted integration runtime. The Azure integration runtime isn't supported for these data sources. The following steps show the communication flow at a high level when you're using a self-hosted integration runtime to scan a data source.
-
- :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, a self-hosted runtime, and data sources." lightbox="media/concept-best-practices/security-self-hosted-runtime.png":::
-
- 1. A manual or automatic scan is triggered. Microsoft Purview connects to Azure Key Vault to retrieve the credential to access a data source.
-
- 2. The scan is initiated from the Microsoft Purview Data Map through a self-hosted integration runtime.
-
- 3. The self-hosted integration runtime service from the VM connects to the data source to extract metadata.
-
- 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage.
-
- 5. Metadata is sent to the Microsoft Purview Data Map.
-
- If you need to extract metadata from data sources with sensitive data that can't leave the boundary of your on-premises network, we highly recommend deploying the self-hosted integration runtime VM inside your corporate network, where the data sources are located, to extract and process metadata on-premises and send only the metadata to Microsoft Purview.
-
- :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime-on-premises.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, an on-premises self-hosted runtime, and data sources in an on-premises network." lightbox="media/concept-best-practices/security-self-hosted-runtime-on-premises.png":::
-
- 1. A manual or automatic scan is triggered. Microsoft Purview connects to Azure Key Vault to retrieve the credential to access a data source.
-
- 2. The scan is initiated through the on-premises self-hosted integration runtime.
-
- 3. The self-hosted integration runtime service from the VM connects to the data source to extract metadata.
-
- 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage. Actual data never leaves the boundary of your network.
-
- 5. Metadata is sent to the Microsoft Purview Data Map.
-
-### Information protection and encryption
-
-Azure offers many mechanisms for keeping data private at rest and as it moves from one location to another. For Microsoft Purview, data is encrypted at rest by using Microsoft-managed keys, and in transit by using Transport Layer Security (TLS) v1.2 or greater.
-
-#### Transport Layer Security (Encryption-in-transit)
-
-Data in transit (also known as data in motion) is always encrypted in Microsoft Purview.
-
-To add another layer of security in addition to access controls, Microsoft Purview secures customer data by encrypting data in motion with Transport Layer Security (TLS) and protects data in transit against 'out of band' attacks (such as traffic capture). It uses encryption to make sure attackers can't easily read or modify the data.
-
-Microsoft Purview supports data encryption in transit with Transport Layer Security (TLS) v1.2 or greater.
-
-For more information, see [Encrypt sensitive information in transit](/security/benchmark/azure/baselines/purview-security-baseline#dp-4-encrypt-sensitive-information-in-transit).
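One way to spot-check the TLS floor of an endpoint is an `openssl` handshake. The account name below is hypothetical, and the check requires network access to the endpoint:

```shell
# Forcing TLS 1.2 should complete the handshake against an endpoint that
# supports TLS 1.2 or greater (hypothetical account name).
openssl s_client -connect contoso-purview.purview.azure.com:443 -tls1_2 </dev/null

# Forcing the deprecated TLS 1.1 should be rejected by an endpoint that
# enforces a TLS 1.2 minimum.
openssl s_client -connect contoso-purview.purview.azure.com:443 -tls1_1 </dev/null
```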
-
-#### Transparent data encryption (Encryption-at-rest)
-
-Data at rest includes information that resides in persistent storage on physical media, in any digital format. The media can include files on magnetic or optical media, archived data, and data backups inside Azure regions.
-
-To add another layer of security in addition to access controls, Microsoft Purview encrypts data at rest to protect against 'out of band' attacks (such as accessing underlying storage).
-It uses encryption with Microsoft-managed keys. This practice helps make sure attackers can't easily read or modify the data.
-
-For more information, see [Encrypt sensitive data at rest](/security/benchmark/azure/baselines/purview-security-baseline#dp-5-encrypt-sensitive-data-at-rest).
-
-### Optional Event Hubs namespace configuration
-
-Each Microsoft Purview account can configure an Event Hubs namespace that is accessible via its Atlas Kafka endpoint. This can be enabled at creation under *Configuration*, or from the Azure portal under *Kafka configuration*. We recommend enabling the optional managed Event Hubs namespace only if it's used to distribute events into or out of the Microsoft Purview account's Data Map. To remove this information distribution point, either don't configure these endpoints, or remove them.
-
-To remove configured Event Hubs namespaces, you can follow these steps:
-1. Search for and open your Microsoft Purview account in the [Azure portal](https://portal.azure.com).
-1. Select **Kafka configuration** under settings on your Microsoft Purview account page in the Azure portal.
-1. Select the Event Hubs you want to disable. (Hook hubs send messages to Microsoft Purview. Notification hubs receive notifications.)
-1. Select **Remove** to save the choice and begin the disablement process. This can take several minutes to complete.
- :::image type="content" source="media/concept-best-practices/select-remove.png" alt-text="Screenshot showing the Kafka configuration page of the Microsoft Purview account page in the Azure portal with the remove button highlighted.":::
-
-> [!NOTE]
-> If you have an ingestion private endpoint when you disable this Event Hubs namespace, the ingestion private endpoint may show as disconnected after disabling.
-
-For more information about configuring these Event Hubs namespaces, see [Configure Event Hubs for Atlas Kafka topics](configure-event-hubs-for-kafka.md).
-
-## Credential management
-
-To extract metadata from a data source system into the Microsoft Purview Data Map, you must register and scan the data source system in the Microsoft Purview Data Map. To automate this process, we've made [connectors](azure-purview-connector-overview.md) available for different data source systems in Microsoft Purview to simplify the registration and scanning process.
-
-To connect to a data source, Microsoft Purview requires a credential with read-only access to the data source system.
-
-We recommend prioritizing the following credential options for scanning, when possible:
-
-1. Microsoft Purview Managed Identity
-2. User Assigned Managed Identity
-3. Service Principals
-4. Other options such as Account key, SQL Authentication, etc.
-
-If you use any option other than managed identities, all credentials must be stored and protected inside an [Azure key vault](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account). Microsoft Purview requires get and list access to secrets on the Azure Key Vault resource.
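With a vault access policy, the get and list secret permissions can be granted to the Microsoft Purview managed identity via the Azure CLI; the vault name and object ID below are hypothetical:

```shell
# Grants the Microsoft Purview managed identity (identified by its object ID,
# a hypothetical value here) permission to read and enumerate secrets in the
# key vault that stores the scan credentials.
az keyvault set-policy \
  --name kv-purview-credentials \
  --object-id 00000000-0000-0000-0000-000000000000 \
  --secret-permissions get list
```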
-
-As a general rule, you can use the following options to set up integration runtime, and credentials to scan data source systems:
-
-|Scenario |Runtime option |Supported Credentials |
-||||
-|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside public network | Option 1: Azure Runtime | Microsoft Purview Managed Identity, Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
-|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside public network | Option 2: Self-hosted integration runtime | Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
-|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside private network using Azure Private Link Service | Self-hosted integration runtime | Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
-|Data source is inside an Azure IaaS VM such as SQL Server | Self-hosted integration runtime deployed in Azure | SQL Authentication or Basic Authentication (depending on Azure data source type) |
-|Data source is inside an on-premises system such as SQL Server or Oracle | Self-hosted integration runtime deployed in Azure or in the on-premises network | SQL Authentication or Basic Authentication (depending on Azure data source type) |
-|Multicloud | Azure runtime or self-hosted integration runtime based on data source types | Supported credential options vary based on data sources types |
-|Power BI tenant | Azure Runtime | Microsoft Purview Managed Identity |
-
-Use [this guide](azure-purview-connector-overview.md) to read more about each source and their supported authentication options.
-
-## Other recommendations
-
-### Define required number of Microsoft Purview accounts for your organization
-
-As part of security planning for the implementation of Microsoft Purview in your organization, review your business and security requirements to define [how many Microsoft Purview accounts are needed](concept-best-practices-accounts.md) in your organization. Various factors may affect the decision, such as [multi-tenancy](/azure/cloud-adoption-framework/ready/enterprise-scale/enterprise-enrollment-and-azure-ad-tenants#define-azure-ad-tenants), billing, or compliance requirements.
-
-### Apply security best practices for Self-hosted runtime VMs
-
-Consider securing the deployment and management of self-hosted integration runtime VMs in Azure or your on-premises environment if a self-hosted integration runtime is used to scan data sources in Microsoft Purview.
-
-For self-hosted integration runtime VMs deployed as virtual machines in Azure, follow [security best practices recommendations for Windows virtual machines](../virtual-machines/security-recommendations.md).
-
-- Lock down inbound traffic to your VMs by using network security groups and [just-in-time VM access](../defender-for-cloud/just-in-time-access-usage.md).
-- Install antivirus or antimalware software.
-- Deploy Microsoft Defender for Cloud to get insights about any potential anomaly on the VMs.
-- Limit the amount of software on the self-hosted integration runtime VMs. Although a dedicated VM isn't mandatory for a self-hosted runtime for Microsoft Purview, we highly suggest using dedicated VMs, especially for production environments.
-- Monitor the VMs by using [Azure Monitor for VMs](../azure-monitor/vm/vminsights-overview.md). By using the Log Analytics agent, you can capture content such as performance metrics to adjust the required capacity for your VMs.
-- Integrate the virtual machines with Microsoft Defender for Cloud to prevent, detect, and respond to threats.
-- Keep your machines current. You can enable automatic Windows updates or use [Update Management in Azure Automation](../automation/update-management/overview.md) to manage operating system updates.
-- Use multiple machines for greater resilience and availability. You can deploy and register multiple self-hosted integration runtimes to distribute scans across multiple machines, or deploy the self-hosted integration runtime on a virtual machine scale set for higher redundancy and scalability.
-- Optionally, enable Azure Backup on your self-hosted integration runtime VMs to improve the recovery time of a self-hosted integration runtime VM if there's a VM-level disaster.
-
-## Next steps
-
-- [Microsoft Purview accounts architectures and best practices](concept-best-practices-accounts.md)
-- [Microsoft Purview network architecture and best practices](concept-best-practices-network.md)
-- [Credentials for source authentication in Microsoft Purview](manage-credentials.md)
purview Concept Best Practices Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-sensitivity-labels.md
- Title: Best practices for applying sensitivity labels in the Microsoft Purview Data Map
-description: This article provides best practices for applying sensitivity labels in Microsoft Purview Data Map.
-
- Previously updated : 04/21/2022
-
-# Labeling best practices for the data map
-
->[!NOTE]
->These best practices cover applying labels to the data map in [Microsoft Purview unified data governance solutions](how-to-automatically-label-your-content.md). For more information about labeling in Microsoft Purview risk and compliance solutions, [go here](/microsoft-365/compliance/apply-sensitivity-label-automatically). For more information about Microsoft Purview in general, [go here](/purview/purview).
-
-The Microsoft Purview Data Map supports labeling structured and unstructured data stored across various data sources. Labeling data within the data map allows users to easily find data that matches predefined autolabeling rules that were configured in the Microsoft Purview compliance portal. The data map extends the use of sensitivity labels from Microsoft Purview Information Protection to assets stored in infrastructure cloud locations and structured data sources.
-
-## Protect personal data with custom sensitivity labels for Microsoft Purview
-
-Storing and processing personal data is subject to special protection. Labeling personal data is crucial to help you identify sensitive information. You can use the detection and labeling tasks for personal data in different stages of your workflows. Because personal data is ubiquitous and fluid in your organization, you need to define identification rules for building policies that suit your individual situation.
-
-## Why do you need to use labeling within the data map?
-
-With the data map, you can extend your organization's investment in sensitivity labels from Microsoft Purview Information Protection to assets that are stored in files and database columns within Azure, multicloud, and on-premises locations. These locations are defined in [supported data sources](./create-sensitivity-label.md#supported-data-sources).
-When you apply sensitivity labels to your content, you can keep your data secure by stating how sensitive certain data is in your organization. The data map also abstracts the data itself, so you can use labels to track the type of data, without exposing sensitive data on another platform.
-
-## Microsoft Purview Data Map labeling best practices and considerations
-
-The following sections walk you through the process of implementing labeling for your assets.
-
-### Get started
-
-- To enable sensitivity labeling in the data map, follow the steps in [Automatically apply sensitivity labels to your data in the Microsoft Purview Data Map](./how-to-automatically-label-your-content.md).
-- To find information on required licensing and helpful answers to other questions, see [Sensitivity labels in the Microsoft Purview Data Map FAQ](./sensitivity-labels-frequently-asked-questions.yml).
-
-### Label considerations
-- If you already have sensitivity labels from Microsoft Purview Information Protection in use in your environment, continue to use your existing labels. Don't make duplicate or additional labels for the data map. This approach allows you to maximize the investment you've already made in Microsoft Purview. It also ensures consistent labeling across your data estate.
-- If you haven't created sensitivity labels in Microsoft Purview Information Protection, review the documentation to [get started with sensitivity labels](/microsoft-365/compliance/get-started-with-sensitivity-labels). Creating a classification schema is a tenant-wide operation. Discuss it thoroughly before you enable it within your organization.
-
-### Label recommendations
-
-- When you configure sensitivity labels for the Microsoft Purview Data Map, you might define autolabeling rules for files, database columns, or both within the label properties. Microsoft Purview labels files within the Microsoft Purview Data Map. When the autolabeling rule is configured, Microsoft Purview automatically applies the label or recommends that the label is applied.
-
- > [!WARNING]
- > If you haven't configured autolabeling for items on your sensitivity labels, users might be affected within your Office and Microsoft 365 environment. You can test autolabeling on database columns without affecting users.
- If you're defining new autolabeling rules for files when you configure labels for the Microsoft Purview Data Map, make sure that you have the condition for applying the label set appropriately.
- You can set the detection criteria to **All of these** or **Any of these** in the upper right of the autolabeling for items page of the label properties.
- The default setting for detection criteria is **All of these**. This setting means that the asset must contain all the specified sensitive information types for the label to be applied. While the default setting might be valid in some instances, many customers want to use **Any of these**. Then if at least one asset is found, the label is applied.
- :::image type="content" source="media/concept-best-practices/label-detection-criteria.png" alt-text="Screenshot that shows detection criteria for a label.":::
-
- > [!NOTE]
- > Trainable classifiers from Microsoft Purview Information Protection aren't supported by Microsoft Purview Data Map.
- Maintain consistency in labeling across your data estate. If you use autolabeling rules for files, use the same sensitive information types for autolabeling database columns.
- [Define your sensitivity labels via Microsoft Purview Information Protection to identify your personal data at a central place](/microsoft-365/compliance/information-protection).
- [Use policy templates as a starting point to build your rule sets](/microsoft-365/compliance/what-the-dlp-policy-templates-include#general-data-protection-regulation-gdpr).
- [Combine data classifications into an individual rule set](./supported-classifications.md).
- [Force labeling by using autolabel functionality](./how-to-automatically-label-your-content.md).
- Build groups of sensitivity labels and store them as a dedicated sensitivity label policy. For example, store all required sensitivity labels for regulatory rules by using the same sensitivity label policy to publish.
- Capture all test cases for your labels. Test your label policies with all applications you want to secure.
- Promote sensitivity label policies to the Microsoft Purview Data Map.
- Run test scans from the Microsoft Purview Data Map on different data sources like hybrid cloud and on-premises to identify sensitivity labels.
- Gather and consider insights, for example, by using Microsoft Purview Data Estate Insights. Use alerting mechanisms to mitigate potential breaches of regulations.
-By using sensitivity labels with Microsoft Purview Data Map, you can extend information protection beyond the border of your Microsoft data estate to your on-premises, hybrid cloud, multicloud, and software as a service (SaaS) scenarios.
-
-## Next steps
- [Get started with sensitivity labels](/microsoft-365/compliance/get-started-with-sensitivity-labels)
- [How to automatically apply sensitivity labels to your data in the Microsoft Purview Data Map](how-to-automatically-label-your-content.md)
purview Concept Business Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-business-glossary.md
- Title: Understand business glossary features in Microsoft Purview
-description: This article explains what business glossary is in Microsoft Purview.
Previously updated: 12/13/2022
-# Understand business glossary features in Microsoft Purview
-
-This article provides an overview of the business glossary feature in Microsoft Purview.
-
-## Business glossary
-
-A glossary provides vocabulary for business users. It consists of business terms that can be related to each other and categorized so that they can be understood in different contexts. These terms can then be mapped to assets like databases, tables, and columns. This helps abstract the technical jargon associated with the data repositories and allows business users to discover and work with data in a vocabulary that is more familiar to them.
-
-A business glossary is a collection of terms. Each term represents an object in an organization and it's highly likely that there are multiple terms representing the same object. A customer could also be referred to as client, purchaser, or buyer. These multiple terms have a relationship with each other. The relationship between these terms could be:
- Synonyms: different terms with the same definition
- Related: a different name with a similar definition
-The same term can also imply multiple business objects. It's important that each term is well-defined and clearly understood within the organization.
-
-## Custom attributes
-
-Microsoft Purview supports these out-of-the-box attributes for any business glossary term:
- Name (mandatory)
- Nickname
- Status
- Definition
- Stewards
- Experts
- Resources
-These attributes can't be edited or deleted, and only Name is mandatory to create a glossary term.
-However, these attributes aren't sufficient to completely define a term in an organization. To solve this problem, Microsoft Purview provides a feature where you can define custom attributes for your glossary.
-
-## Relationships between terms
-
-Microsoft Purview supports these out-of-the-box relationships for terms:
- Parent/child term
- Acronym
- Synonyms
- Related terms
-**Relationship definitions in the glossary are bi-directional:** Every relationship between terms is a two-way relationship. This means that if term A is related to term B, then term B is also related to term A.
-
-Anytime you populate a relationship in one direction, Purview automatically adds the reverse relationship for you. For example, if you add term A as a synonym for term B, Purview automatically adds term B as a synonym for term A.
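To make that behavior concrete, here's a minimal in-memory sketch (illustrative only, not the Purview API or its data model) of a glossary that mirrors every synonym relationship automatically:

```python
# Illustrative sketch only -- not the Purview API or its data model.
# It mirrors the bi-directional relationship behavior described above:
# populating a relationship in one direction adds the reverse for you.
class Glossary:
    def __init__(self):
        self.synonyms = {}  # term name -> set of synonym term names

    def add_synonym(self, term_a, term_b):
        # Adding A -> B automatically records B -> A as well.
        self.synonyms.setdefault(term_a, set()).add(term_b)
        self.synonyms.setdefault(term_b, set()).add(term_a)

glossary = Glossary()
glossary.add_synonym("Customer", "Client")
# "Client" now lists "Customer" as a synonym without a second call.
```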
-
-## Term templates
-
-Term templates allow the custom attributes of a glossary to be logically grouped together in the catalog. The feature lets you group all the relevant custom attributes together in a template and then apply the template while creating a glossary term. For example, all finance-related custom attributes like cost center, profit center, and accounting code can be grouped in a Finance term template, and that template can be used to create financial glossary terms.
-
-All the standard attributes are grouped in a system default template. Any term template that you create contains these attributes, along with any custom attributes created as part of the template creation process.
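As a rough sketch of this grouping behavior (the attribute and template names below are hypothetical, and this is not the Purview API), a term template can be thought of as the system default attributes plus the custom attributes grouped in at creation time:

```python
# Hypothetical sketch, not the Purview API: a term template bundles the
# system default attributes with the custom attributes grouped into it.
SYSTEM_DEFAULT_ATTRIBUTES = [
    "Name", "Nickname", "Status", "Definition",
    "Stewards", "Experts", "Resources",
]

def make_term_template(name, custom_attributes):
    """Return a template containing the standard attributes plus customs."""
    return {
        "template": name,
        "attributes": SYSTEM_DEFAULT_ATTRIBUTES + list(custom_attributes),
    }

# Group finance-related custom attributes into one reusable template.
finance_template = make_term_template(
    "Finance Template", ["Cost center", "Profit center", "Accounting code"]
)
```

Applying the template when creating a term would then expose all ten attributes on that term.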
-
-## Glossary vs classification vs sensitivity labels
-
-While glossary terms, classifications, and labels are all annotations to a data asset, each one has a different meaning in the context of the catalog.
-
-### Glossary
-
-As stated above, a business glossary term defines the business vocabulary for an organization and helps bridge the gap between various departments in your company.
-
-### Classifications
-
-Classifications are annotations that can be assigned to entities. The flexibility of classifications enables you to use them for multiple scenarios such as:
- understanding the nature of data stored in the data assets
- defining access control policies
-Microsoft Purview has more than 200 system classifiers today, and you can define your own classifiers in the catalog. As part of the scanning process, these classifications are automatically detected and applied to data assets and schemas. However, you can override them at any point in time. Human overrides are never replaced by automated scans.
-
-### Sensitivity labels
-
-Sensitivity labels are a type of annotation that allows you to classify and protect your organization's data, without hindering productivity and collaboration. Sensitivity labels are used to identify the categories of classification types within your organizational data, and group the policies that you wish to apply to each category. Microsoft Purview makes use of the same sensitive information types as Microsoft 365, which allows you to stretch your existing security policies and protection across your entire content and data estate. The same labels can be shared across Microsoft Office products and data assets in Microsoft Purview.
-
-## Next steps
- [Create and manage glossaries](how-to-create-manage-glossary.md)
- [Create and manage terms](how-to-create-manage-glossary-term.md)
- [Manage term templates](how-to-manage-term-templates.md)
- [Best practices for describing data with terms, tags, managed attributes, and business assets](concept-best-practices-annotating-data.md)
- [Browse the data catalog in Microsoft Purview](how-to-browse-catalog.md)
purview Concept Classification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-classification.md
- Title: Understand data classification in the Microsoft Purview governance portal
-description: This article explains the concepts behind data classification in the Microsoft Purview governance portal.
Previously updated: 03/23/2023
-# Data classification in the Microsoft Purview governance portal
-
-Data classification in the Microsoft Purview governance portal is a way of categorizing data assets by assigning unique logical tags or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on.
-
-When you classify data assets, you make them easier to understand, search, and govern. Classifying data assets also helps you understand the risks associated with them. This in turn can help you implement measures to protect sensitive or important data from ungoverned proliferation and unauthorized access across the data estate.
-
-The Microsoft Purview Data Map provides an automated classification capability while you scan your data sources. You get more than 200 built-in system classifications and the ability to create custom classifications for your data. You can classify assets automatically when they're ingested as part of a configured scan, or you can edit them manually in the Microsoft Purview governance portal after they're scanned and ingested.
-
-## Uses of classification
-
-Classification is the process of organizing data into *logical categories* that make the data easy to retrieve, sort, and identify for future use. This can be important for data governance. Among other reasons, classifying data assets is important because it helps you:
-
-* Narrow down the search for data assets that you're interested in.
-* Organize and understand the variety of data classes that are important in your organization and where they're stored.
-* Understand the risks associated with your most important data assets and then take appropriate measures to mitigate them.
-
-The following image shows classification applied during a scan of the *Customer* table in Azure SQL Database.
--
-## Types of classification
-
-The Microsoft Purview governance portal supports both system and custom classifications.
-
-* **System classifications**: More than 200 system classifications are supported out of the box. For the entire list of available system classifications, see [supported classifications in the Microsoft Purview governance portal](./supported-classifications.md).
-
- In the example in the preceding image, *Person's Name* is a system classification. System classifications have a thunderbolt icon along with the classification name. Hovering over the classification provides more details on the type of classification and how it was applied.
-
-* **Custom classifications**: You can create custom classifications when you want to classify assets based on a pattern or a specific column name that's unavailable as a system classification.
-Custom classification rules can be based on a *regular expression* pattern or *dictionary*.
-
- Let's say that the *Employee ID* column follows the EMPLOYEE{GUID} pattern (for example, EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55). You can create your own custom classification by using a regular expression, such as `^Employee[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}$`.
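A quick way to sanity-check such a rule before creating the classification is to try the pattern against sample values. The snippet below uses Python's `re` module; `re.IGNORECASE` is an assumption here, since the example value spells `EMPLOYEE` in uppercase while the pattern spells `Employee`:

```python
import re

# Sanity check for the custom classification pattern above.
# IGNORECASE is an assumption: the sample value uses "EMPLOYEE"
# while the pattern spells "Employee".
pattern = re.compile(
    r"^Employee[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}"
    r"-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}$",
    re.IGNORECASE,
)

assert pattern.match("EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55")
assert not pattern.match("CONTRACTOR9c55c474-9996-420c-a285-0d0fc23f1f55")
```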
-
-> [!NOTE]
-> Sensitivity labels are different from classifications. Sensitivity labels categorize assets in the context of data security and privacy, such as *Highly Confidential*, *Restricted*, *Public*, and so on. To use sensitivity labels in the Microsoft Purview Data Map, you'll need at least one Microsoft 365 license or account within the same Azure Active Directory (Azure AD) tenant as your Microsoft Purview Data Map. For more information about the differences between sensitivity labels and classifications, see [sensitivity labels in the Microsoft Purview governance portal FAQ](sensitivity-labels-frequently-asked-questions.yml#what-is-the-difference-between-classifications-and-sensitivity-labels).
-
-## Next steps
-
-* [Read about classification best practices](concept-best-practices-classification.md)
-* [Create custom classifications](create-a-custom-classification-and-classification-rule.md)
-* [Automatically apply classifications](apply-classifications.md)
-* [Manually apply classifications](manually-apply-classifications.md)
-* [Use the Microsoft Purview governance portal](use-azure-purview-studio.md)
purview Concept Data Lineage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-data-lineage.md
- Title: Data lineage in Microsoft Purview
-description: Describes the concepts for data lineage.
Previously updated: 12/05/2022
-# Data lineage in Microsoft Purview
-
-This article provides an overview of data lineage in Microsoft Purview Data Catalog. It also details how data systems can integrate with the catalog to capture lineage of data. Microsoft Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
- Raw data staged from various platforms
- Transformed and prepared data
- Data used by visualization platforms
-## Use cases
-
-Data lineage is broadly understood as the lifecycle that spans the data's origin and where it moves over time across the data estate. It's used for different kinds of backwards-looking scenarios such as troubleshooting, tracing root cause in data pipelines, and debugging. Lineage is also used for data quality analysis, compliance, and "what if" scenarios often referred to as impact analysis. Lineage is represented visually to show data moving from source to destination, including how the data was transformed. Given the complexity of most enterprise data environments, these views can be hard to understand without doing some consolidation or masking of peripheral data points.
-
-## Lineage experience in Microsoft Purview Data Catalog
-
-The Microsoft Purview Data Catalog connects with other data processing, storage, and analytics systems to extract lineage information. The information is combined to represent a generic, scenario-specific lineage experience in the catalog.
--
-Your data estate may include systems doing data extraction and transformation (ETL/ELT systems), analytics, and visualization. Each of the systems captures rich static and operational metadata that describes the state and quality of the data within the system's boundary. The goal of lineage in a data catalog is to extract the movement, transformation, and operational metadata from each data system at the lowest grain possible.
-
-The following example is a typical use case of data moving across multiple systems, where the Data Catalog would connect to each of the systems for lineage.
- Data Factory copies data from the on-premises/raw zone to a landing zone in the cloud.
- Data processing systems like Synapse and Databricks process and transform data from the landing zone to the curated zone by using notebooks.
- Data is further processed into analytical models for optimal query performance and aggregation.
- Data visualization systems consume the datasets and process them through their meta model to create BI dashboards, ML experiments, and so on.
-## Lineage granularity
-
-The following section covers the granularity at which lineage information is gathered by Microsoft Purview. This granularity can vary based on the data systems supported in Microsoft Purview.
-
-### Entity level lineage: Source(s) > Process > Target(s)
- Lineage is represented as a graph. Typically, it contains source and target entities in data storage systems that are connected by a process invoked by a compute system.
- Data systems connect to the data catalog to generate and report a unique object referencing the physical object of the underlying data system, for example a SQL stored procedure, a notebook, and so on.
- High-fidelity lineage, with other metadata like ownership, is captured to show the lineage in a human-readable format for source and target entities. For example, lineage is shown at a Hive table level instead of at the partition or file level.
-### Column or attribute level lineage
-
-Identify the attribute(s) of a source entity that are used to create or derive attribute(s) in the target entity. The name of the source attribute can be retained or renamed in a target. Systems like ADF can do a one-to-one copy from an on-premises environment to the cloud. For example: `Table1/ColumnA -> Table2/ColumnA`.
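One minimal way to picture column-level lineage (an illustrative data structure, not how Purview stores it; the `Report`/`FullName` columns are hypothetical) is a map from each target column to the source column(s) it derives from:

```python
# Illustrative only -- not Purview's internal representation.
# Each target (table, column) maps to the source column(s) it derives from.
column_lineage = {
    ("Table2", "ColumnA"): [("Table1", "ColumnA")],  # one-to-one copy
    ("Report", "FullName"): [
        ("Table1", "FirstName"),  # derived attribute: source columns
        ("Table1", "LastName"),   # combined and renamed in the target
    ],
}

def sources_of(table, column):
    """Return the source columns that feed a given target column."""
    return column_lineage.get((table, column), [])
```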
-
-### Process execution status
-
-To support root cause analysis and data quality scenarios, the execution status of the jobs in data processing systems is captured. The goal isn't to replace the monitoring capabilities of other data processing systems.
-
-## Summary
-
-Lineage is a critical feature of the Microsoft Purview Data Catalog to support quality, trust, and audit scenarios. The goal of a data catalog is to build a robust framework where all the data systems within your environment can naturally connect and report lineage. Once the metadata is available, the data catalog can bring together the metadata provided by data systems to power data governance use cases.
-
-## Next steps
-
-* [Quickstart: Create a Microsoft Purview account in the Azure portal](create-catalog-portal.md)
-* [Quickstart: Create a Microsoft Purview account using Azure PowerShell/Azure CLI](create-catalog-powershell.md)
-* [Use the Microsoft Purview governance portal](use-azure-purview-studio.md)
purview Concept Data Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-data-share.md
- Title: Azure Storage in-place data sharing with Microsoft Purview (preview)
-description: This article describes Microsoft Purview Data Sharing and its features.
Previously updated: 02/16/2023
-# Azure Storage in-place data sharing with Microsoft Purview (preview)
--
-Traditionally, organizations have shared data with internal teams or external partners by generating data feeds, requiring investment in data copy and refresh pipelines. The result is higher cost for data storage and movement, data proliferation (that is, multiple copies of data), and delay in access to time-sensitive data.
-
-With Microsoft Purview Data Sharing, data providers can now share data **in-place** from Azure Data Lake Storage Gen2 and Azure Storage accounts, both within and across organizations. Share data directly with users and partners without data duplication and centrally manage your sharing activities from within Microsoft Purview.
-
-With Microsoft Purview Data Sharing, data consumers can now have near real-time access to shared data. Storage data access and transactions are charged to the data consumers based on what they use, at no additional cost to the data providers.
-
-## How in-place data sharing works
-
-Microsoft Purview enables sharing of files and folders in-place from ADLS Gen2 and Blob storage accounts.
--
-A data provider creates a share by selecting a data source that is registered in Microsoft Purview, choosing which files and folders to share, and who to share them with. Microsoft Purview then sends an invitation to each data consumer.
-
-When a consumer accepts the invitation, they specify a target storage account in their own Azure subscription that they'll use to access the shared data. This establishes a sharing relationship between the provider and consumer storage accounts. This sharing relationship gives the data consumer read-only access to the shared data through the consumer's storage account. Any changes to the data in the provider's source storage account are reflected in near real time in the consumer's storage account.
-
-The data provider pays for data storage and their own data access, while the data consumer pays for their own data access transactions.
-
-Data providers can revoke access to the shared data at any time, or set a share expiration time for time-bound access to data. Data consumers can also terminate access to the share at any time.
-
-## Where data is stored
-
-Microsoft Purview Data Sharing only stores metadata about your share. It doesn't store a copy of the shared data itself. The data is stored in the underlying source storage account that is being shared. You can have your storage accounts in a different Azure region than your Microsoft Purview account.
-
-## Key capabilities
-
-* Share data within the organization or with partners and customers outside of the organization (within the same Azure tenant or across different Azure tenants).
-* Share data from ADLS Gen2 or Blob storage in-place without data duplication.
-* Share data with multiple recipients.
-* Access shared data in near real time.
-* Manage sharing relationships and keep track of who the data is shared with/from, for each ADLS Gen2 or Blob Storage account.
-* Terminate share access at any time.
-* Flexible experience through Microsoft Purview governance portal or via REST APIs.
-
-## Get started
-
-Get started with Microsoft Purview in-place data sharing for Azure Storage by reviewing the [next steps](#next-steps) or following the [Data Sharing Quickstart](quickstart-data-share.md).
-
-## Data sharing scenarios
-
-Microsoft Purview Data Sharing can help with various data sharing scenarios, including:
-
-* Collaborate with external business partners while maintaining data security in your own environment.
-* Outsource data transformation and processing to third party ISVs or data aggregators by sharing raw data and receiving normalized data and analytics results back.
-* Automate sharing of big data (for example: IoT data, scientific data, satellite and surveillance images or videos, financial market data) in near real time and without data duplication.
-* Share data between different departments within the organization.
-
-## Next steps
-
-* [Quickstart: share data](quickstart-data-share.md)
-* [FAQ for data sharing](how-to-data-share-faq.md)
-* [How to share data](how-to-share-data.md)
-* [How to receive a data share](how-to-receive-share.md)
purview Concept Elastic Data Map https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-elastic-data-map.md
- Title: Elastic data map
-description: This article explains the concepts of the Elastic Data Map in Microsoft Purview
Previously updated: 03/23/2023
-# Elastic data map in Microsoft Purview
-
-The Microsoft Purview Data Map provides the foundation for data discovery and data governance. It captures metadata about data present in analytics, software-as-a-service (SaaS), and operational systems in hybrid, on-premises, and multicloud environments. The data map stays up to date with its built-in scanning and classification system.
-
-All Microsoft Purview accounts have a data map that starts at one capacity unit and can elastically grow. The data map scales up and down based on request load and the metadata stored within it.
-
-## Data map capacity unit
-
-The elastic data map has two components: metadata storage and operations throughput, represented as a capacity unit (CU). All Microsoft Purview accounts, by default, start with one capacity unit and elastically grow based on usage. Each Data Map capacity unit includes a throughput of 25 operations per second and a 10-GB metadata storage limit.
-
-### Operations
-
-Operations are the throughput measure of the Microsoft Purview Data Map. They include any Create, Read, Write, Update, and Delete operations on metadata stored in the Data Map. Some examples of operations are:
- Create an asset in the Data Map
- Add a relationship to an asset, such as owner, steward, parent, or lineage
- Edit an asset to add business metadata, such as a description or glossary term
- Keyword search returning results to the search results page
-### Storage
-
-Storage is the second component of Data Map and includes the storage of technical, business, operational, and semantic metadata.
-
-The technical metadata includes schema, data types, columns, and so on, discovered during Microsoft Purview [scanning](concept-scans-and-ingestion.md). The business metadata includes automated tagging (for example, promoted from Power BI datasets, or descriptions from SQL tables) and manual tagging of descriptions, glossary terms, and so on. Examples of semantic metadata include the collection mapping to data sources, or classifications. The operational metadata includes Data Factory copy and data flow activity run statuses and run times.
-
-## Work with elastic data map
- **Elastic Data Map with auto-scale**: You start with a Data Map as low as one capacity unit that can autoscale based on load. For most organizations, this feature leads to increased savings and a lower price point for starting data governance projects. This feature impacts pricing.

- **Enhanced scanning & ingestion**: You can track and control the population of the data assets, classification, and lineage across both the scanning and ingestion processes. This feature impacts pricing.
-## Scenario
-
-Claudia is an Azure admin at Contoso who wants to create a new Microsoft Purview account from the Azure portal. She doesn't know the required size of the Microsoft Purview Data Map to support the future state of the platform. However, she knows that the Microsoft Purview Data Map is billed using capacity units, which are affected by storage and operations throughput. She wants to create the smallest Data Map to keep the cost low and grow the Data Map size elastically based on consumption.
-
-Claudia can create a Microsoft Purview account with the default Data Map size of one capacity unit that can automatically scale up and down. The autoscaling feature also allows for capacity to be tuned based on intermittent or planned data bursts during specific periods. Claudia follows the next steps in the creation experience to set up network configuration and completes the creation.
-
-In the Azure portal, in the metrics tab for the Microsoft Purview account, Claudia can see the consumption of the Data Map storage and operations throughput. She can further set up an alert when the storage or operations throughput reaches a certain limit to monitor the consumption and billing of the new Microsoft Purview account.
-
-## Data map billing
-
-Customers are billed for one capacity unit (25 ops/sec and 10 GB), and extra billing is based on the consumption of each extra capacity unit, rolled up to the hour. Data Map operations scale in increments of 25 operations per second, and metadata storage scales in increments of 10 GB. The Microsoft Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). However, to get the next level of elasticity window, a support ticket needs to be created.
-
-Data Map capacity units come with a cap on operations throughput and storage. If storage exceeds the current capacity unit, customers are charged for the next capacity unit even if the operations throughput isn't used. The following table shows the Data Map capacity unit ranges. Contact support if the Data Map capacity goes beyond 100 capacity units.
-
-|Data Map Capacity Unit |Operations/Sec throughput |Storage capacity in GB|
-|-|--||
-|1 |25 |10 |
-|2 |50 |20 |
-|3 |75 |30 |
-|4 |100 |40 |
-|5 |125 |50 |
-|6 |150 |60 |
-|7 |175 |70 |
-|8 |200 |80 |
-|9 |225 |90 |
-|10 |250 |100 |
-|.. |.. |.. |
-|100 |2500 |1000 |
-
-### Billing examples
- The Microsoft Purview Data Map's operation throughput for the given hour is less than or equal to 25 ops/sec and the storage size is 1 GB. Customers are billed for one capacity unit.

- The Microsoft Purview Data Map's operation throughput for the given hour is less than or equal to 25 ops/sec and the storage size is 15 GB. Customers are billed for two capacity units.

- The Microsoft Purview Data Map's operation throughput for the given hour is 50 ops/sec and the storage size is 15 GB. Customers are billed for two capacity units.

- The Microsoft Purview Data Map's operation throughput for the given hour is 50 ops/sec and the storage size is 25 GB. Customers are billed for three capacity units.

- The Microsoft Purview Data Map's operation throughput for the given hour is 250 ops/sec and the storage size is 15 GB. Customers are billed for 10 capacity units.
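The examples above follow one rule: for each hour, the bill is the larger of the operations-based and storage-based capacity-unit counts, with a minimum of one unit. Here's a sketch of that calculation, assuming the stated 25 ops/sec and 10 GB per capacity unit:

```python
import math

def billed_capacity_units(ops_per_sec, storage_gb):
    """Capacity units billed for one hour: each unit covers 25 ops/sec
    of throughput and 10 GB of metadata storage, and whichever dimension
    needs more units drives the bill (minimum of one unit)."""
    ops_units = math.ceil(ops_per_sec / 25)
    storage_units = math.ceil(storage_gb / 10)
    return max(1, ops_units, storage_units)

# The billing examples above:
assert billed_capacity_units(25, 1) == 1
assert billed_capacity_units(25, 15) == 2
assert billed_capacity_units(50, 15) == 2
assert billed_capacity_units(50, 25) == 3
assert billed_capacity_units(250, 15) == 10
```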
-### Detailed billing example
-
-The Data Map billing example shows a Data Map with growing metadata storage and variable operations per second over a six-hour window from 12 PM to 6 PM. The red line in the graph is operations per second consumption, and the blue dotted line is metadata storage consumption over this six-hour window:
--
-Each Data Map capacity unit supports 25 operations/second and 10 GB of metadata storage. The Data Map is billed hourly. It's billed for the maximum Data Map capacity units needed within the hour, with a minimum of one capacity unit. At times, you may need more operations/second within the hour, and more operations increase the number of capacity units needed within that hour. At other times, your operations/second usage may be low, but you may still need a large volume of metadata storage. The metadata storage is what determines how many capacity units you need within the hour.
-
-The table shows the maximum number of operations/second and metadata storage used per hour for this billing example:
--
-Based on the Data Map operations/second and metadata storage consumption in this period, this Data Map would be billed for 22 capacity-unit hours over this six-hour period (1 + 3 + 4 + 5 + 6 + 3):
--
->[!Important]
->Microsoft Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). To get the next level of the elasticity window, a support ticket needs to be created.
-
-## Increase operations throughput limit
-
-The default limit for maximum operations per second is 10 capacity units. If you're working with a large Microsoft Purview environment and require a higher throughput, you can request a larger capacity of elasticity window by [creating a quota request](how-to-manage-quotas.md#request-quota-increase). Select "Data map capacity unit" as the quota type. Provide as much relevant information as you can about your environment and the extra capacity you would like to request.
-
-> [!IMPORTANT]
-> There's no default limit for metadata storage. As you add more metadata to your data map, it will elastically increase.
-
-Increasing the operations throughput limit also increases the minimum number of capacity units. If you increase the throughput limit to 20, the minimum number of capacity units you're charged is 2 CUs. The following table illustrates the possible throughput options. The number you enter in the quota request is the minimum number of capacity units on the account.
--
-| Minimum capacity units | Operations throughput limit |
-|-|--|
-| 1 |10 (Default) |
-| 2 |20 |
-| 3 |30 |
-| 4 |40 |
-| 5 |50 |
-| 6 |60 |
-| 7 |70 |
-| 8 |80 |
-| 9 |90 |
-| 10 |100 |
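The table's relationship is linear: the operations throughput limit is always ten times the minimum number of capacity units. A minimal sketch, assuming only the values listed above are supported:

```python
def minimum_capacity_units(throughput_limit: int) -> int:
    """Minimum billed capacity units implied by a requested operations
    throughput limit (each row of the table above: limit = 10 x minimum CUs)."""
    if throughput_limit % 10 != 0 or not 10 <= throughput_limit <= 100:
        raise ValueError("supported throughput limits are 10, 20, ..., 100")
    return throughput_limit // 10

print(minimum_capacity_units(20))  # 2 CUs: the minimum hourly charge after raising the limit to 20
```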
-
-## Monitoring the elastic data map
-
-You can monitor the _data map capacity units_ and _data map storage size_ metrics to understand your data estate size and billing.
-
-1. Go to the [Azure portal](https://portal.azure.com), navigate to the **Microsoft Purview accounts** page, and select your _Purview account_.
-
-1. Select **Overview** and scroll down to the **Monitoring** section to observe the _Data Map Capacity Units_ and _Data Map Storage Size_ metrics over different time periods.
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-metrics.png" alt-text="Screenshot of the menu showing the elastic data map metrics overview page.":::
-
-1. For other settings, navigate to **Monitoring** > **Metrics** to observe the **Data Map Capacity Units** and **Data Map Storage Size** metrics.
-
- :::image type="content" source="./media/concept-elastic-data-map/elastic-data-map-metrics.png" alt-text="Screenshot of the menu showing the metrics.":::
-
-1. Select **Data Map Capacity Units** to view the data map capacity unit usage over the last 24 hours. Hover over the line graph to see the data map capacity units consumed at a particular time on a particular day.
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-capacity-default.png" alt-text="Screenshot of the menu showing the data map capacity units consumed over 24 hours.":::
-
-1. Select **Local Time: Last 24 hours (Automatic - 1 hour)** at the top right of the screen to modify the time range displayed for the graph.
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-capacity-custom.png" alt-text="Screenshot of the menu showing the data map capacity units consumed over a custom time range.":::
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-capacity-time-range.png" alt-text="Screenshot of the menu showing the data map capacity units consumed over a three day time range.":::
-
-1. Customize the graph type by selecting the option:
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-capacity-graph-type.png" alt-text="Screenshot of the menu showing the options to modify the graph type.":::
-
-1. Select **New chart** to add a chart for the Data Map Storage Size metric.
-
- :::image type="content" source="./media/concept-elastic-data-map/data-map-storage-size.png" alt-text="Screenshot of the menu showing the data map storage size used.":::
-
-## Summary
-
-With the elastic Data Map, Microsoft Purview provides a low-cost entry point for customers to start their data governance journey.
-The Microsoft Purview Data Map can grow elastically with a pay-as-you-go model, starting from as small as one capacity unit.
-Customers don't need to worry about choosing the correct Data Map size for their data estate at creation time.
-
-## Next Steps
-
-- [Create a Microsoft Purview account](create-catalog-portal.md)
-- [Microsoft Purview Pricing](https://azure.microsoft.com/pricing/details/azure-purview/)
purview Concept Guidelines Pricing Data Estate Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-guidelines-pricing-data-estate-insights.md
- Title: Pricing guidelines for Data Estate Insights
-description: This article provides a guideline to understand and strategize pricing for the Data Estate Insights components of Microsoft Purview (formerly Azure Purview).
-Previously updated: 01/24/2023
-# Pricing for Data Estate Insights
-
-This guide covers pricing guidelines for Data Estate Insights.
-
-For full pricing guideline details for Microsoft Purview (formerly Azure Purview), see the [pricing guideline overview.](concept-guidelines-pricing.md)
-
-For specific price details, see the [Microsoft Purview (formerly Azure Purview) pricing page](https://azure.microsoft.com/pricing/details/purview/). This article guides you through the features and factors that affect pricing for Data Estate Insights.
-
-## Guidelines
-
-Data Estate Insights is billed on two dimensions:
-
-- **Report generation** - This incorporates the job that checks for any changes in your environment and the jobs that aggregate metrics about your Microsoft Purview account that will appear in specific reports.
- - If you have [set your reports to refresh on a schedule](how-to-schedule-data-estate-insights.md), at the time of refresh there will be a job to check if any updates have been made to your environment. You'll always be billed a small amount for this check.
- - If updates have been made to your environment, you'll be billed for the jobs that aggregate metrics and generate your report.
-
- > [!NOTE]
- > On the [pricing page](https://azure.microsoft.com/pricing/details/purview/), you can find details for report generation pricing under Data Map Enrichment.
- > :::image type="content" source="media/concept-guidelines-pricing/data-map-enrichment.png" alt-text="Screenshot of the pricing page headers, showing Data Map Enrichment selected." :::
-
-- **Report consumption** - This incorporates access of the report features (currently served through the UX). On the [pricing page](https://azure.microsoft.com/pricing/details/purview/), you can find details for report consumption pricing under Data Estate Insights.
- :::image type="content" source="media/concept-guidelines-pricing/data-estate-insights.png" alt-text="Screenshot of the pricing page headers, showing Data Estate Insights selected." :::
-
-> [!IMPORTANT]
-> The Data Estate Insights application is **on** by default when you create a Microsoft Purview account. This means "State" is on and "Refresh Frequency" is set to automatic*.
->
-> \* At this time automatic refresh is weekly.
-
-If you want to reschedule or reduce how often your reports are refreshed, you can [schedule your Data Estate Insights reports](how-to-schedule-data-estate-insights.md).
-
-If you don't plan on using Data Estate Insights for a while, a **[data curator](catalog-permissions.md#roles) on the [root collection](reference-azure-purview-glossary.md#root-collection)** can disable Data Estate Insights features in one of two ways:
-
-- [Disable the Data Estate Insights application](#disable-the-data-estate-insights-application) - This will stop billing from both report generation and report consumption.
-- [Disable report refresh](#disable-report-refresh) - [Insights readers](catalog-permissions.md#roles) have access to current reports, but reports won't be refreshed. Billing will occur for report consumption but not report generation.
-
-> [!NOTE]
-> The application or report refresh can be enabled again later at any time.
-
-A **[data curator](catalog-permissions.md#roles) on your account's [root collection](reference-azure-purview-glossary.md#root-collection)** can make these changes in the Management section of the Microsoft Purview governance portal in **Overview**, under **Feature options**. For specific steps, see the [disable Data Estate Insights article](disable-data-estate-insights.md).
--
-### Schedule Report Refresh
-
-Your reports can be scheduled to refresh daily, weekly, or monthly depending on your needs, and this will affect billing.
-
-First, on the date/time of your scheduled refresh, insights will run a job to check if any changes have been made in your environment since the last refresh. You'll always be billed for this job.
-
-If changes have been made since the last report refresh, billing is calculated based on the amount of compute power that is used to generate the report, and is billed at the time of report generation.
-For more information about specific pricing, see the [pricing page](https://azure.microsoft.com/pricing/details/purview/) **under Data Map Enrichment.**
-
-For more information about setting your refresh schedule, see [the schedule Data Estate Insights reports article](how-to-schedule-data-estate-insights.md).
-
-### Disable the Data Estate Insights application
-
-Disabling Data Estate Insights will disable the entire application, including these reports:
-
-- Stewardship
-- Asset
-- Glossary
-- Classification
-- Labeling
-
-The application icon will still show in the menu, but insights readers won't have access to reports at all, and report generation jobs will be stopped. The Microsoft Purview account won't receive any bill for Data Estate Insights.
-
-For steps to disable the Data Estate Insights application, see the [disable article.](disable-data-estate-insights.md#disable-the-data-estate-insights-application)
-
-### Disable report refresh
-
-You can choose to disable report refreshes instead of disabling the entire Data Estate Insights application.
-
-When you disable report refreshes, insights readers will be able to view reports, but they'll see a banner on top of each report warning that the report may not be current. The banner also indicates the date the report was last generated.
-
-In this case, graphs showing data from the last 30 days will appear blank after 30 days. Graphs showing a snapshot of the data map will continue to show the graph and details. When an [insights reader](catalog-permissions.md#roles) accesses an insights report, the report consumption meter is triggered, and the Microsoft Purview account is billed.
-
-For steps to disable report refresh, see the [disable article.](disable-data-estate-insights.md#disable-report-refresh)
-
-## Next steps
-
-- [Schedule Data Estate Insights reports](how-to-schedule-data-estate-insights.md)
-- [Disable Data Estate Insights](disable-data-estate-insights.md)
-- [Microsoft Purview, formerly Azure Purview, pricing page](https://azure.microsoft.com/pricing/details/azure-purview/)
-- [Pricing guideline overview](concept-guidelines-pricing.md)
-- [Pricing guideline Data Map](concept-guidelines-pricing-data-map.md)
purview Concept Guidelines Pricing Data Map https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-guidelines-pricing-data-map.md
- Title: Pricing guidelines for the Microsoft Purview elastic data map
-description: This article provides a guideline to understand and strategize pricing for the elastic data map in the Microsoft Purview governance portal.
-Previously updated: 06/27/2022
-# Pricing for the Microsoft Purview Data Map
-
-This guide covers pricing guidelines for the data map in the Microsoft Purview governance portal.
-
-For full pricing guideline details for Microsoft Purview (formerly Azure Purview), see the [pricing guideline overview.](concept-guidelines-pricing.md)
-
-For specific price details, see the [Microsoft Purview (formerly Azure Purview) pricing page](https://azure.microsoft.com/pricing/details/purview/). This article guides you through the features and factors that affect pricing for the Microsoft Purview Data Map.
-
-Direct costs impacting pricing for the Microsoft Purview Data Map are based on the following three dimensions:
-- [**Elastic data map**](#elastic-data-map)
-- [**Automated scanning & classification**](#automated-scanning-classification-and-ingestion)
-- [**Advanced resource sets**](#advanced-resource-sets)
-
-## Elastic data map
-
-- The **Data map** is the foundation of the Microsoft Purview governance portal architecture, and so needs to be up to date with asset information in the data estate at any given point.
-
-- The data map is charged in terms of **Capacity Units** (CU). The data map is provisioned at one CU if the catalog is storing up to 10 GB of metadata storage and serves up to 25 data map operations/sec.
-
-- The data map is always provisioned at one CU when an account is first created.
-
-- However, the data map scales automatically between the minimal and maximal limits of that elasticity window, to cater to changes in the data map with respect to two key factors: **operation throughput** and **metadata storage**.
-
-### Operation throughput
-
-- An event-driven factor based on the Create, Read, Update, and Delete operations performed on the data map.
-- Some examples of data map operations:
- - Creating an asset in Data Map
- - Adding a relationship to an asset such as owner, steward, parent, lineage
- - Editing an asset to add business metadata such as description, glossary term
- - Keyword-search returning results to search result page
- - Importing or exporting information using API
-- If there are multiple queries executed on the Data Map, the number of I/O operations also increases, resulting in the scaling up of the data map.
-- The number of concurrent users is another factor governing the data map capacity units.
-- Other factors to consider are the type of search query, API interaction, workflows, approvals, and so on.
-- Data burst level
- - When there's a need for more operations/second throughput, the Data map can autoscale within the elasticity window to cater to the changed load
- - This constitutes the **burst characteristic** that needs to be estimated and planned for
- - The burst characteristic comprises the **burst level** and the **burst duration** for which the burst exists
- - The **burst level** is a multiplicative index of the expected consistent elasticity under steady state
- - The **burst duration** is the percentage of the month that such bursts (in elasticity) are expected because of growing metadata or higher number of operations on the data map
-
-### Metadata storage
-
-- If the number of assets in the data estate is reduced, and they're then removed from the data map through subsequent incremental scans, the storage component automatically reduces and the data map scales down.
-
-## Automated scanning, classification, and ingestion
-
-There are two major automated processes that can trigger ingestion of metadata into the Microsoft Purview Data Map:
-- Automatic scans using native [connectors](azure-purview-connector-overview.md). This process includes three main steps:
- - Metadata scan
- - Automatic classification
- - Ingestion of metadata into the Microsoft Purview Data Map
-
-- Automated ingestion using Azure Data Factory and/or Azure Synapse pipelines. This process includes:
- - Ingestion of metadata and lineage into the Microsoft Purview Data Map if the account is connected to any Azure Data Factory or Azure Synapse pipelines.
--
-### Automatic scans using native connectors
-
-- A **full scan** processes all assets within a selected scope of a data source, whereas an **incremental scan** detects and processes assets that have been created, modified, or deleted since the previous successful scan.
-
-- All scans (full or incremental) will pick up **updated, modified, or deleted** assets.
-
-- It's important to avoid scenarios where multiple people or groups belonging to different departments set up scans for the same data source, resulting in extra pricing for duplicate scanning.
-
-- Schedule **frequent incremental scans** after the initial full scan, aligned with the changes in the data estate. This ensures the data map is always kept up to date, and incremental scans consume fewer v-core hours than a full scan.
-
-- The **"View Details"** link for a data source enables users to run a full scan. However, consider running incremental scans after a full scan for optimized scanning, except when there's a change to the scan rule set (classifications/file types).
-
-- **Register the data source at a parent collection** and **scope scans at the child collection** with different access controls to ensure there are no duplicate scanning costs.
-
-- Limit the users who are allowed to register data sources for scanning through **fine-grained access control** and the **Data Source Administrator** role using [Collection authorization](./catalog-permissions.md). This ensures only valid data sources can be registered, and scanning v-core hours are controlled, resulting in lower costs for scanning.
-
-- Consider that the **type of data source** and the **number of assets** being scanned affect the scan duration.
-
-- **Create custom scan rule sets** that include only the subset of **file types** available in your data estate and the **classifications** that are relevant to your business requirements, to ensure optimal use of the scanners.
-
-- While creating a new scan for a data source, follow the recommended **order of preparation** before actually running the scan. This includes gathering requirements for **business-specific classifications** and **file types** (for storage accounts) so that appropriate scan rule sets can be defined, avoiding repeated scans and unnecessary costs caused by missed requirements.
-
-- Align your scan schedules with your Self-Hosted Integration Runtime (SHIR) virtual machine (VM) sizes to avoid extra costs linked to virtual machines.
-
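As a rough illustration of why incremental scans cost less than full scans, here is a hypothetical estimator. The 32 v-core default and the round-up-to-whole-v-core-hour assumption are illustrative only; check the pricing page for the actual billing granularity and rates:

```python
import math

def scan_vcore_hours(duration_minutes: float, vcores: int = 32) -> int:
    """Hypothetical estimate of v-core hours consumed by a scan, assuming
    billing is rounded up to the next whole v-core hour. Both the default
    v-core count and the rounding are assumptions for illustration."""
    return math.ceil(duration_minutes / 60 * vcores)

# A short incremental scan consumes far fewer v-core hours than a full scan
print(scan_vcore_hours(120))  # hypothetical full scan, 2 hours -> 64
print(scan_vcore_hours(15))   # hypothetical incremental scan, 15 minutes -> 8
```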
-### Automated ingestion using Azure Data Factory and/or Azure Synapse pipelines
-
-- Metadata and lineage are ingested from Azure Data Factory or Azure Synapse pipelines every time the pipelines run in the source system.
-
-## Advanced resource sets
-
-- The Microsoft Purview Data Map uses **resource sets** to address the challenge of mapping large numbers of data assets to a single logical resource, by providing the ability to scan all the files in the data lake and find patterns (GUIDs, localization patterns, etc.) to group them as a single asset in the data map.
-
-- **Advanced resource sets** is an optional feature that allows customers to get enriched, computed resource set information such as total size, partition count, etc., and enables the customization of resource set grouping via pattern rules. If the advanced resource sets feature isn't enabled, your data catalog will still contain resource set assets, but without the aggregated properties. There will be no "Resource Set" meter billed to the customer in this case.
-
-- Use the basic resource set feature before switching on advanced resource sets in the Microsoft Purview Data Map, to verify whether your requirements are met.
-
-- Consider turning on advanced resource sets if:
- Your data lake's schema is constantly changing, and you're looking for more value beyond the basic resource set feature, to enable the Microsoft Purview Data Map to compute parameters such as the number of partitions, the size of the data estate, etc., as a service
- - There's a need to customize how resource set assets get grouped.
-
-- It's important to note that billing for advanced resource sets is based on the compute used by the offline tier to aggregate resource set information, and is dependent on the size and number of resource sets in your catalog.
-
-## Next steps
-
-- [Microsoft Purview, formerly Azure Purview, pricing page](https://azure.microsoft.com/pricing/details/azure-purview/)
-- [Pricing guideline overview](concept-guidelines-pricing.md)
-- [Pricing guideline Data Estate Insights](concept-guidelines-pricing-data-estate-insights.md)
purview Concept Guidelines Pricing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-guidelines-pricing.md
- Title: Pricing guidelines for Microsoft Purview (formerly Azure Purview)
-description: This article provides a guideline to understand and strategize pricing for the components of Microsoft Purview (formerly Azure Purview).
-Previously updated: 12/09/2022
-# Overview of pricing for Microsoft Purview (formerly Azure Purview)
-
-Microsoft Purview, formerly known as Azure Purview, provides a single pane of glass for managing data governance by enabling automated scanning and classification of data at scale through the Microsoft Purview governance portal.
-
-For specific price details, see the [Microsoft Purview (formerly Azure Purview) pricing page](https://azure.microsoft.com/pricing/details/purview/). This article guides you through the features and factors that affect pricing.
-
-## Why do you need to understand the components of pricing?
-
-- While the pricing for Microsoft Purview (formerly Azure Purview) is on a subscription-based **pay-as-you-go** model, there are various dimensions that you can consider while budgeting.
-- This guideline is intended to help you plan the budget for Microsoft Purview in the governance portal by providing a view of the control factors that impact the budget.
-
-## Factors impacting Azure Pricing
-
-There are **direct** and **indirect** costs that need to be considered while planning budgeting and cost management.
-
-Direct costs impacting Microsoft Purview pricing are based on these applications:
-- [The Microsoft Purview Data Map](concept-guidelines-pricing-data-map.md)
-- [Data Estate Insights](concept-guidelines-pricing-data-estate-insights.md)
-
-## Indirect costs
-
-Indirect costs impacting Microsoft Purview (formerly Azure Purview) pricing to be considered are:
--- [Managed resources](https://azure.microsoft.com/pricing/details/azure-purview/)
- - When an account is provisioned, a storage account is created in the subscription in order to cater to secured scanning, which may be charged separately.
- An Event Hubs namespace can be [configured at creation](create-catalog-portal.md#create-an-account) or enabled in the [Azure portal](https://portal.azure.com) on the Kafka configuration page of the account to enable monitoring with [*Atlas Kafka* topics events](manage-kafka-dotnet.md). The Event Hubs namespace is charged separately.
-
-- [Azure private endpoint](./catalog-private-link.md)
- Azure private endpoints are used with Microsoft Purview (formerly Azure Purview) where users on a virtual network (VNet) need to securely access the catalog over a private link.
- The prerequisites for setting up private endpoints could result in extra costs.
--- [Self-hosted integration runtime related costs](./manage-integration-runtimes.md)
- - Self-hosted integration runtime requires infrastructure, which results in extra costs
- - It's required to deploy and register Self-hosted integration runtime (SHIR) inside the same virtual network where Microsoft Purview ingestion private endpoints are deployed
- - [Other memory requirements for scanning](./register-scan-sapecc-source.md#create-and-run-scan)
- - Certain data sources such as SAP require more memory on the SHIR machine for scanning
-
-- [Virtual Machine Sizing](../virtual-machines/sizes.md)
- - Plan virtual machine sizing in order to distribute the scanning workload across VMs to optimize the v-cores utilized while running scans
--- [Microsoft 365 license](./create-sensitivity-label.md)
- - Microsoft Purview Information Protection sensitivity labels can be automatically applied to your Azure assets in the Microsoft Purview Data Map.
- - Microsoft Purview Information Protection sensitivity labels are created and managed in the Microsoft Purview compliance portal.
- - To create sensitivity labels for use in Microsoft Purview, you must have an active Microsoft 365 license, which offers the benefit of automatic labeling. For the full list of licenses, see the Sensitivity labels in Microsoft Purview FAQ.
--- [Azure Alerts](../azure-monitor/alerts/alerts-overview.md)
- - Azure Alerts can notify customers of issues found with infrastructure or applications using the monitoring data in Azure Monitor
- For Azure Alerts pricing, see the [Azure Monitor pricing page](https://azure.microsoft.com/pricing/details/monitor/)
--- [Cost Management Budgets & Alerts](../cost-management-billing/costs/cost-mgt-alerts-monitor-usage-spending.md)
- - Automatically generated cost alerts are used in Azure to monitor Azure usage and spending based on when Azure resources are consumed
- Azure allows you to create and manage budgets. For details, see the [budgets tutorial](../cost-management-billing/costs/tutorial-acm-create-budgets.md)
--- Multi-cloud egress charges
- Consider the egress charges (minimal charges added as part of the multi-cloud subscription) associated with scanning multi-cloud data sources (for example, AWS or Google) running native services, except the S3 and RDS sources
-
-## Next steps
-
-- [Microsoft Purview, formerly Azure Purview, pricing page](https://azure.microsoft.com/pricing/details/azure-purview/)
-- [Pricing guideline Data Estate Insights](concept-guidelines-pricing-data-estate-insights.md)
-- [Pricing guideline Data Map](concept-guidelines-pricing-data-map.md)
purview Concept Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-insights.md
- Title: Understand Insights reports in Microsoft Purview
-description: This article explains what Insights are in Microsoft Purview.
-Previously updated: 05/16/2022
-# Understand the Microsoft Purview Data Estate Insights application
-
-This article provides an overview of the Data Estate Insights application in Microsoft Purview.
-
-The Data Estate Insights application is purpose-built for governance stakeholders, primarily for roles focused on data management, compliance, and data use, like a Chief Data Officer. The application provides actionable insights into the organization's data estate, catalog usage, adoption, and processes. As organizations scan and populate their Microsoft Purview Data Map, the Data Estate Insights application automatically surfaces valuable governance gaps and highlights them in its top metrics. It also provides a drill-down experience that enables all stakeholders, such as data owners and data stewards, to take appropriate action to close the gaps.
-
-All the reports within the Data Estate Insights application are automatically generated and populated, so governance stakeholders can focus on the information itself, rather than building the reports.
-
-The dashboards and reports available within Microsoft Purview Data Estate Insights are categorized in three sections:
-* [Health](#health)
-* [Inventory and Ownership](#inventory-and-ownership)
-* [Curation and governance](#curation-and-governance)
-
- :::image type="content" source="./media/insights/table-of-contents.png" alt-text="Screenshot of table of contents for Microsoft Purview Data Estate Insights.":::
-
-## Health
-
-Data, governance, and quality focused users like chief data officers and data stewards can start at the health dashboards to understand the current health status of their data estate, current return on investment on their catalog, and begin to address any outstanding issues.
-
- :::image type="content" source="./media/insights/data-stewardship-small.png" alt-text="Screenshot of health insights report dashboard." lightbox="media/insights/data-stewardship-large.png":::
-
-### Data stewardship
-
-The data stewardship dashboard highlights key performing indicators that the governance stakeholders need to focus on, to attain a clean and governance-ready data estate. Information like asset curation rates, data ownership rates, and classification rates are calculated out of the box and trended over time.
-
-Management-focused users, like a Chief Data Officer, can also get a high-level understanding of weekly and monthly active users of their catalog, and information about how the catalog is being used. Is the catalog being adopted across the organization? Better adoption leads to better overall governance penetration in the organization.
-
-For more information about these dashboards, see the [data stewardship documentation.](data-stewardship.md)
-
-### Catalog adoption
-
-The catalog adoption dashboard highlights active users, searches, viewed assets, and top searched keywords. This report helps you understand how and if your data catalog is being used, so you can see the impact it's having on data usage and discoverability.
-
-For more information about these dashboards, see the [catalog adoption documentation.](catalog-adoption-insights.md)
-
-## Inventory and ownership
-
-This area focuses on summarizing data estate inventory for data quality and management focused users, like data stewards and data curators. These dashboards provide key metrics and overviews to give users the ability to find and resolve gaps in their assets, all from within the data estate insights application.
-
- :::image type="content" source="./media/insights/asset-insights-small.png" alt-text="Screenshot of inventory and ownership insights report dashboard." lightbox="media/insights/asset-insights-large.png":::
-
-### Assets
-
-This report provides a summary of your data estate and its distribution by collection and source type. You can also view new assets, deleted assets, updated assets, and stale assets from the last 30 days.
-
-Explore your data by classification, investigate why assets didn't get classified, and see how many assets exist without a data owner assigned. To take action, the report provides a "View Detail" button to view and edit the specific assets that need treatment.
-
-You can also view data asset trends by asset count and data size, as we record this metadata during the data map scanning process.
-
-For more information, see the [asset insights documentation.](asset-insights.md)
-
-## Curation and governance
-
-This area focuses on giving a summary of how curated your assets are across several curation contexts. Currently we focus on showcasing assets with glossary terms, classifications, and sensitivity labels.
-
- :::image type="content" source="./media/insights/curation-and-governance-small.png" alt-text="Screenshot of example curation and governance insights report dashboard." lightbox="media/insights/curation-and-governance-large.png":::
-
-### Glossary
-
-Data, governance, and quality focused users like chief data officers and data stewards can get a status check on their business glossary. Data maintenance and collection focused users like data stewards can view this report to understand the distribution of glossary terms by status, learn how many glossary terms are attached to assets, and how many aren't yet attached to any asset. Business users can also learn about the completeness of their glossary terms.
-
-This report summarizes the top items that users need to focus on to create a complete and usable glossary for their organization. Users can also navigate into the "Glossary" experience from the "Glossary Insights" experience, to make changes to a specific glossary term.
-
-For more information, see the [glossary insights in Microsoft Purview documentation.](glossary-insights.md)
-
-### Classifications
-
-This report provides details about where classified data is located, the classifications found during a scan, and a drill-down to the classified files themselves. It enables data quality and data security focused users like data stewards, data curators, and security administrators to understand the types of information found in their organization's data estate.
-
-In Microsoft Purview, classifications are similar to subject tags, and are used to mark and identify content of a specific type in your data estate.
-
-Use the classification insights report to identify content with specific classifications and understand required actions, such as adding extra security to the repositories, or moving content to a more secure location.
-
-For more information, see the [classification insights about your data from Microsoft Purview documentation.](classification-insights.md)
-
-### Sensitivity Labels
-
-This report provides details about the sensitivity labels found during a scan and a drill-down to the labeled files themselves. It enables security administrators to ensure the security of the data found in their organization's data estate by identifying where sensitive data is stored.
-
-In Microsoft Purview, sensitivity labels are used to identify classification type categories, and the group security policies that you want to apply to each category.
-
-Use the labeling insights report to identify the sensitivity labels found in your content and understand required actions, such as managing access to specific repositories or files.
-
-For more information, see the [sensitivity label insights about your data in Microsoft Purview documentation.](sensitivity-insights.md)
-
-## Next steps
-
-Learn how to use Data Estate Insights with the resources below:
-
-* [Learn how to use Asset insights](asset-insights.md)
-* [Learn how to use Classification insights](classification-insights.md)
-* [Learn how to use Glossary insights](glossary-insights.md)
-* [Learn how to use Label insights](sensitivity-insights.md)
purview Concept Metamodel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-metamodel.md
- Title: Microsoft Purview metamodel
-description: The Microsoft Purview metamodel helps you represent a business perspective of your data, how it's grouped into data domains, used in business processes, organized into systems, and more.
----- Previously updated : 11/10/2022---
-# Microsoft Purview metamodel
--
-Metamodel is a feature in the Microsoft Purview Data Map that helps add rich business context to your data catalog. It tells a story about how your data is grouped in data domains, how it's used in business processes, what projects are impacted by the data, and ultimately how the data fits in the day to day of your business.
-
-The context the metamodel provides is important because business users, the people who are consuming the data, often have non-technical questions about the data. Questions like: What department produces this dataset? Are there any projects that are using this dataset? Where does this report come from?
-
-When you scan data into Microsoft Purview, the technical metadata can tell you what the data looks like, if the data has been classified, or if it has glossary terms assigned, but it can't tell you where and how that data is used. The metamodel gives your users that information. It can tell you what data is mission critical for a product launch or used by a high performing sales team to convert leads to prospects.
-
-So not only does the metamodel help data consumers find the data they're looking for, but it also tells your data managers what data is critical and how healthy that critical data is, so you can better manage and govern it. For example: you may have different privacy obligations if you use personal information for marketing activities vs. analytics that improve a product or service, and metamodel can help you determine where data is being used.
-
-## So what is the metamodel?
-
-What does it look like? The metamodel is built from **assets** and the **relationships** between them.
-
-For example, you might have a sales reporting team (asset) that consumes data (relationship) from some SQL tables (assets).
-
-When you scan data sources into the data map, you already have the technical data assets like SQL tables available for your metamodel. But what about assets like a sales reporting team, or a marketing strategy that represent processes or people instead of a data source? Metamodel provides **asset types** that allow you to describe other important parts of your business.
-
-An **asset type** is a template for important concepts like business processes, departments, lines of business, or even products. They're the building blocks you'll use to describe how data is used in your business. The **asset type** creates a template you can use over and over to describe specific assets. For example, you can define an asset type "department" and then create new department assets for each of your business departments. These new assets are stored in Microsoft Purview like any other data assets that were scanned in the data map, so you can search and browse for them in the data catalog.
-
-Metamodel includes several [predefined asset types](how-to-metamodel.md#predefined-asset-types) to help you get started, but you can also create your own.
-
-Similarly a **relationship definition** is the template for the kinds of interactions between assets that you want to represent. For example, a department *manages* a business process. An organization *has* departments. The business process, organization, and departments are the assets, "manages" and "has" are the relationships between them.
-
-For example, if we want to use Microsoft Purview to show how key data sets are used in our business processes, we can represent that information as a template:
--
-Which we can then use to describe how a specific business process uses a specific data set:
--
-## Metamodel example
-
-For a simple example of a metamodel, let's consider marketing campaign management in a business. This process is an asset that we'll add to our metamodel. It's not a data source we can scan, since it's a set of business processes, but during this process real data will be used and referenced. It's important to show what data is being used and how, so we can properly manage and govern it. We'll create an asset for the marketing campaign management from the asset type "Business Process".
-
-We know there are two tables in SQL that the marketing campaign management team uses. These are assets too, but they were data source assets created when the SQL database was scanned into the Microsoft Purview Data Map.
-
-The relationship between the marketing campaign management asset and the SQL table assets is that campaign management **consumes**, or uses, the SQL tables when developing campaigns. We can record that in our metamodel, so now anyone that looks at those SQL tables can see that they're being used to develop marketing campaigns. We'll also be able to see if there are any other teams that use this data, or maybe which department develops this data as well. So now, with metamodel, we know not only what the data is, but we have a story about how it's being used that helps us understand and manage it.
--
-And now marketing campaign management is an asset in your data catalog like any other asset, so you can find it in a search. A user could search for that process and quickly know what data it uses and produces, without needing to know anything about the data beforehand.
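The example above can be sketched as plain data, to show how assets and relationships fit together. This is an illustrative sketch only: the table names below are hypothetical, and Microsoft Purview stores these objects in the data map rather than in Python structures.

```python
# Illustrative sketch of the metamodel example above. The table names are
# hypothetical; Purview keeps these objects in the data map, not in code.

assets = {
    "Marketing Campaign Management": "Business Process",  # created from an asset type
    "dbo.Campaigns": "SQL Table",                         # scanned data source assets
    "dbo.CampaignResults": "SQL Table",
}

# (source asset, relationship, target asset) triples
relationships = [
    ("Marketing Campaign Management", "consumes", "dbo.Campaigns"),
    ("Marketing Campaign Management", "consumes", "dbo.CampaignResults"),
]

def consumers_of(data_asset: str) -> list:
    """List the business assets that consume a given data asset."""
    return [s for s, rel, t in relationships
            if rel == "consumes" and t == data_asset]
```

With a structure like this, anyone looking at `dbo.Campaigns` can immediately see that it's consumed by the marketing campaign management process.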
-
-## Next steps
-
-If you're ready to get started, follow the [how to create a metamodel](how-to-metamodel.md) article.
purview Concept Policies Data Owner Action Update https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-data-owner-action-update.md
- Title: Role definitions SQL Performance Monitor and SQL Security Auditor supported only from DevOps policies experience
-description: This guide discusses actions that have been retired from Data owner policies experience and are now supported only from the DevOps policies experience.
----- Previously updated : 02/09/2023--
-# Role definitions SQL Performance Monitor and SQL Security Auditor supported only from DevOps policies experience
-
-This guide discusses actions that have been retired from Data owner policies experience and are now supported only from the DevOps policies experience.
-
-## Important considerations
-The following two actions that were previously available in the Microsoft Purview Data owner policies experience are now supported only from the Microsoft Purview DevOps policies, which is a more focused experience.
-- SQL Performance Monitor
-- SQL Security Auditor
-
-If you currently have Data owner policies with these actions, we encourage you to give the DevOps policies experience a try. Creating new policies or editing existing policies that involve these two actions is now unsupported from the Data owner experience.
-
-## Next steps
-Check these concept guides:
-* [DevOps policies](concept-policies-devops.md)
purview Concept Policies Data Owner https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-data-owner.md
- Title: Microsoft Purview data owner policies concepts
-description: Understand Microsoft Purview data owner policies
----- Previously updated : 07/06/2023--
-# Concepts for Microsoft Purview data owner policies (preview)
--
-This article discusses concepts related to managing access to data sources in your data estate from within the Microsoft Purview governance portal.
-
-> [!Note]
-> This capability is different from access control for Microsoft Purview itself, which is described in [Access control in Microsoft Purview](catalog-permissions.md).
-
-## Overview
-
-Access policies in Microsoft Purview enable you to manage access to different data systems across your entire data estate. For example:
-
-A user needs read access to an Azure Storage account that has been registered in Microsoft Purview. You can grant this access directly in Microsoft Purview by creating a data access policy through the **Policy management** app in the Microsoft Purview governance portal.
-
-Data access policies can be enforced through Purview on data systems that have been registered for policy.
-
-## Microsoft Purview policy concepts
-
-### Microsoft Purview policy
-
-A **policy** is a named collection of policy statements. When a policy is published to one or more data systems under Purview's governance, it's then enforced by them. A policy definition includes a policy name, description, and a list of one or more policy statements.
-
-### Policy statement
-
-A **policy statement** is a human-readable instruction that dictates how the data source should handle a specific data operation. The policy statement comprises **Effect**, **Action**, **Data Resource**, and **Subject**.
-
-#### Action
-
-An **action** is the operation being permitted or denied as part of this policy. For example: Read or Modify. These high-level logical actions map to one (or more) data actions in the data system where they are enforced.
-
-#### Effect
-
-The **effect** indicates what the resultant effect of this policy should be. Currently, the only supported value is **Allow**.
-
-#### Data resource
-
-The **data resource** is the fully qualified data asset path to which a policy statement is applicable. It conforms to the following format:
-
-*/subscription/\<subscription-id>/resourcegroups/\<resource-group-name>/providers/\<provider-name>/\<data-asset-path>*
-
-Azure Storage data-asset-path format:
-
-*Microsoft.Storage/storageAccounts/\<account-name>/blobservice/default/containers/\<container-name>*
-
-Azure SQL DB data-asset-path format:
-
-*Microsoft.Sql/servers/\<server-name>*
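As a minimal illustration (the helper names are hypothetical and not part of any official Purview SDK), the path formats above can be composed like this:

```python
# Illustrative sketch only: helper names are hypothetical, not part of any
# official Purview SDK. Composes the data-resource path formats shown above.

def storage_container_path(subscription_id: str, resource_group: str,
                           account: str, container: str) -> str:
    """Fully qualified data-resource path for an Azure Storage container."""
    return (
        f"/subscription/{subscription_id}/resourcegroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
        f"/blobservice/default/containers/{container}"
    )

def sql_server_path(subscription_id: str, resource_group: str, server: str) -> str:
    """Fully qualified data-resource path for an Azure SQL logical server."""
    return (
        f"/subscription/{subscription_id}/resourcegroups/{resource_group}"
        f"/providers/Microsoft.Sql/servers/{server}"
    )
```

For instance, `storage_container_path("finance", "prod", "finDataLake", "FinData")` produces the *FinData* path used in the example later in this article.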
-
-#### Subject
-
-The end-user identity from Azure Active Directory for whom this policy statement is applicable. This identity can be a service principal, an individual user, a group, or a managed service identity (MSI).
-
-### Example
-
-Allow Read on Data Asset:
-*/subscription/finance/resourcegroups/prod/providers/Microsoft.Storage/storageAccounts/finDataLake/blobservice/default/containers/FinData to group Finance-analyst*
-
-In the above policy statement, the effect is *Allow*, the action is *Read*, the data resource is Azure Storage container *FinData*, and the subject is Azure Active Directory group *Finance-analyst*. If any user that belongs to this group attempts to read data from the storage container *FinData*, the request will be allowed.
-
-### Hierarchical enforcement of policies
-
-The data resource specified in a policy statement is hierarchical by default. This means that the policy statement applies to the data object itself and to **all** the child objects contained by the data object. For example, a policy statement on Azure storage container applies to all the blobs contained within it.
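The hierarchical rule can be illustrated with a simple path-prefix check. This is a conceptual sketch, not Purview's actual policy evaluator:

```python
# Conceptual sketch, not Purview's actual evaluator: a policy on a data
# resource also covers every child object under that resource's path.

def policy_covers(policy_resource: str, requested_asset: str) -> bool:
    """True if the asset is the policy's resource or a child of it."""
    policy_resource = policy_resource.rstrip("/")
    return (requested_asset == policy_resource
            or requested_asset.startswith(policy_resource + "/"))
```

A policy on an Azure Storage container therefore covers every blob inside it: `policy_covers(container_path, container_path + "/reports/q1.parquet")` is true, while a sibling container with a similar name is not covered.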
-
-### Policy combining algorithm
-
-Microsoft Purview can have different policy statements that refer to the same data asset. When evaluating a decision for data access, Microsoft Purview combines all the applicable policies and provides a consolidated decision. The combining strategy picks the most restrictive policy.
-For example, let's assume two different policies on an Azure Storage container *FinData* as follows:
-
-Policy 1 - *Allow Read on Data Asset /subscription/…./containers/FinData
-To group Finance-analyst*
-
-Policy 2 - *Deny Read on Data Asset /subscription/…./containers/FinData
-To group Finance-contractors*
-
-Then let's assume that user 'user1', who is part of two groups:
-*Finance-analyst* and *Finance-contractors*, executes a call to the blob read API. Since both policies will be applicable, Microsoft Purview will choose the most restrictive one, which is *Deny* of *Read*. Thus, the access request will be denied.
-
-> [!Note]
-> As mentioned in the Effect section above, currently, the only supported effect is **Allow**. The above example is to explain how multiple policies on a single asset are evaluated.
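A minimal sketch of this combining strategy, assuming decisions are plain strings. Remember that only **Allow** is currently supported in Purview; *Deny* appears here only to mirror the explanatory example above:

```python
# Minimal sketch of the combining strategy: the most restrictive decision
# wins. Only "Allow" is currently supported in Purview; "Deny" is included
# here only to mirror the explanatory example in this article.

def combine_decisions(decisions: list) -> str:
    """Consolidate the decisions of all applicable policy statements."""
    if not decisions:
        return "Deny"  # no applicable policy means no access is granted
    return "Deny" if "Deny" in decisions else "Allow"
```

For 'user1' above, both statements apply, so `combine_decisions(["Allow", "Deny"])` returns `"Deny"` and the request is denied.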
-
-## Policy publishing
-
-A newly created policy exists in a draft state and is visible only in Microsoft Purview. The act of publishing initiates enforcement of a policy in the specified data systems. It's an asynchronous action that can take between 5 minutes and 2 hours to be effective, depending on the enforcement code in the underlying data sources. For more information, consult the tutorials related to each data source.
-
-A policy published to a data source could contain references to an asset belonging to a different data source. Such references will be ignored since the asset in question does not exist in the data source where the policy is applied.
-
-## Next steps
-Check the guides on how to create policies in Microsoft Purview that get enforced in specific data systems. Beyond the UI, you can now also use the data owner APIs:
-* Doc: [Provision access to Azure Storage datasets](how-to-policies-data-owner-storage.md)
-* Doc: [Provision access to all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-* Doc: [Provision access to Azure SQL Database assets](how-to-policies-data-owner-azure-sql-db.md)
-* Doc: [Provision access to SQL Server 2022 (Arc-enabled) assets](how-to-policies-data-owner-arc-sql-server.md)
-* Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
--
purview Concept Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-devops.md
- Title: Microsoft Purview DevOps policies concepts
-description: Understand Microsoft Purview DevOps policies.
----- Previously updated : 07/17/2023--
-# What can I accomplish with Microsoft Purview DevOps policies?
-
-This article describes how to manage access to data sources in your data estate by using the Microsoft Purview governance portal. It focuses on basic concepts of DevOps policies. That is, it provides background information about DevOps policies that you should know before you follow other articles to get configuration steps.
-
-> [!NOTE]
-> This capability is different from the [internal access control in the Microsoft Purview governance portal](./catalog-permissions.md).
-
-Access to system metadata is crucial for IT and DevOps personnel to ensure that critical database systems are healthy, are performing to expectations, and are secure. You can grant and revoke that access efficiently and at scale through Microsoft Purview DevOps policies.
-
-Any user who holds the Policy Author role at the root collection level in Microsoft Purview can create, update, and delete DevOps policies. After DevOps policies are saved, they're published automatically.
-
-## Access policies vs. DevOps policies
-
-Microsoft Purview access policies enable customers to manage access to data systems across their entire data estate, all from a central location in the cloud. You can think of these policies as access grants that can be created through Microsoft Purview Studio, avoiding the need for code. They dictate whether a list of Azure Active Directory (Azure AD) principals, such as users and groups, should be allowed or denied a specific type of access to a data source or an asset within it. Microsoft Purview communicates these policies to the data sources, where they're natively enforced.
-
-DevOps policies are a special type of Microsoft Purview access policies. They grant access to database system metadata instead of user data. They simplify access provisioning for IT operations and security auditing personnel. DevOps policies only grant access. They don't deny access.
-
-## Elements of a DevOps policy
-
-Three elements define a DevOps policy:
-
-- **Subject**
-
- This is a list of Azure AD users, groups, or service principals that are granted access.
-- **Data resource**
-
- This is the scope where the policy is enforced. The data resource path is the composition of subscription > resource group > data source.
-
- Microsoft Purview DevOps policies currently support SQL-type data sources. You can configure them on individual data sources and on entire resource groups and subscriptions. You can create DevOps policies only after you register the data resource in Microsoft Purview with the **Data use management** option turned on.
-- **Role**
-
- A role maps to a set of actions that the policy permits on the data resource. DevOps policies support the SQL Performance Monitor and SQL Security Auditor roles. Both of these roles provide access to SQL system metadata, and more specifically to dynamic management views (DMVs) and dynamic management functions (DMFs). But the set of DMVs and DMFs that these roles grant is different. We provide some popular examples [later in this article](#mapping-of-popular-dmvs-and-dmfs).
-
- The [Create, list, update, and delete Microsoft Purview DevOps policies](how-to-policies-devops-authoring-generic.md) article details the role definition for each data source type. That is, it provides a mapping of roles in Microsoft Purview to the actions that are permitted in that type of data source. For example, the role definition for SQL Performance Monitor and SQL Security Auditor includes *Connect* actions at the server and database level on the data source side.
-
-In essence, the DevOps policy assigns the role's related permissions to the subject and is enforced in the scope of the data resource's path.
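The three elements can be pictured as a simple record. This is an illustrative sketch, not the Purview API; the field names are hypothetical, and only the two supported role definitions are accepted:

```python
# Illustrative sketch, not the Purview API: the three elements of a DevOps
# policy as a simple record, with the two supported roles validated.
from dataclasses import dataclass

SUPPORTED_ROLES = {"SQL Performance Monitor", "SQL Security Auditor"}

@dataclass
class DevOpsPolicy:
    subjects: list      # Azure AD users, groups, or service principals
    data_resource: str  # subscription > resource group > data source path
    role: str           # one of the two supported role definitions

    def __post_init__(self):
        if self.role not in SUPPORTED_ROLES:
            raise ValueError(f"unsupported DevOps role: {self.role}")
```

Constructing a `DevOpsPolicy` with any other role raises an error, reflecting that DevOps policies only support these two role definitions.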
-
-## Hierarchical enforcement of policies
-
-A DevOps policy on a data resource is enforced on the data resource itself and all child resources that it contains. For example, a DevOps policy on an Azure subscription applies to all resource groups, to all policy-enabled data sources within each resource group, and to all databases within each data source.
-
-## Example scenario to demonstrate the concept and the benefits
-
-Bob and Alice are involved with the DevOps process at their company. They need to log in to dozens of SQL Server instances on-premises and Azure SQL logical servers to monitor their performance so that critical DevOps processes don't break. Their manager, Mateo, puts all these SQL data sources into Resource Group 1. He then creates an Azure AD group and includes Alice and Bob. Next, he uses Microsoft Purview DevOps policies (Policy 1 in the following diagram) to grant this Azure AD group access to Resource Group 1, which hosts the logical servers.
-
-![Diagram that shows an example of DevOps policies on a resource group.](./media/concept-policies-devops/devops-policy-on-resource-group.png)
-
-These are the benefits:
-
-- Mateo doesn't have to create local logins in each server.
-- The policies from Microsoft Purview improve security by limiting local privileged access. They support the principle of least privilege. In the scenario, Mateo grants only the minimum access that Bob and Alice need to perform the task of monitoring system health and performance.
-- When new servers are added to the resource group, Mateo doesn't need to update the policy in Microsoft Purview for it to be enforced on the new servers.
-- If Alice or Bob leaves the organization and the job is backfilled, Mateo just updates the Azure AD group. He doesn't have to make any changes to the servers or to the policies that he created in Microsoft Purview.
-- At any point in time, Mateo or the company's auditor can see all the permissions that were granted directly in Microsoft Purview Studio.
-
-| Principle | Benefit |
-|-|-|
-|Simplify |The role definitions SQL Performance Monitor and SQL Security Auditor capture the permissions that typical IT and DevOps personas need to execute their job.|
-| |There's less need for permission expertise on each data source type.|
-|||
-|Reduce effort |A graphical interface lets you move through the data object hierarchy quickly.|
-| |Microsoft Purview supports policies on entire Azure resource groups and subscriptions.|
-|||
-|Enhance security|Access is granted centrally and can be easily reviewed and revoked.|
-| |There's less need for privileged accounts to configure access directly at the data source.|
-| |DevOps policies support the principle of least privilege via data resource scopes and role definitions.|
-|||
-
-## DevOps policies API
-
-Many sophisticated customers prefer to interact with Microsoft Purview via scripts rather than via the UI. Microsoft Purview DevOps policies now support a REST API that offers full create, read, update, and delete (CRUD) capability, including listing, for both SQL Performance Monitor policies and SQL Security Auditor policies. For more information, see the [API specification](/rest/api/purview/devopspolicydataplane/devops-policy).
-
-![Screenshot that shows where to find the DevOps API on the Azure REST API menu.](./media/concept-policies-devops/devops-policy-api.png)
-
-## Mapping of popular DMVs and DMFs
-
-SQL dynamic metadata includes a list of more than 700 DMVs and DMFs. The following table illustrates some of the most popular ones. The table maps the DMVs and DMFs to their role definitions in Microsoft Purview DevOps policies. It also provides links to reference content.
-
-| DevOps role | Category | Example DMV or DMF |
-|-|-|-|
-||||
-| SQL Performance Monitor | Query system parameters to understand your system | [sys.configurations](/sql/relational-databases/system-catalog-views/sys-configurations-transact-sql) |
-| | | [sys.dm_os_sys_info](/sql/relational-databases/system-dynamic-management-views/sys-dm-os-sys-info-transact-sql) |
-| | Identify performance bottlenecks | [sys.dm_os_wait_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-os-wait-stats-transact-sql) |
-| | Analyze currently running queries | [sys.dm_exec_query_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-query-stats-transact-sql) |
-| | Analyze blocking issues | [sys.dm_tran_locks](/sql/relational-databases/system-dynamic-management-views/sys-dm-tran-locks-transact-sql) |
-| | | [sys.dm_exec_requests](/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-requests-transact-sql) |
-| | | [sys.dm_os_waiting_tasks](/sql/relational-databases/system-dynamic-management-views/sys-dm-os-waiting-tasks-transact-sql) |
-| | Analyze memory usage | [sys.dm_os_memory_clerks](/sql/relational-databases/system-dynamic-management-views/sys-dm-os-memory-clerks-transact-sql) |
-| | Analyze file usage and performance| [sys.master_files](/sql/relational-databases/system-catalog-views/sys-master-files-transact-sql) |
-| | | [sys.dm_io_virtual_file_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-io-virtual-file-stats-transact-sql) |
-| | Analyze index usage and fragmentation | [sys.indexes](/sql/relational-databases/system-catalog-views/sys-indexes-transact-sql) |
-| | | [sys.dm_db_index_usage_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-db-index-usage-stats-transact-sql) |
-| | | [sys.dm_db_index_physical_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-db-index-physical-stats-transact-sql) |
-| | Manage active user connections and internal tasks | [sys.dm_exec_sessions](/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-sessions-transact-sql) |
-| | Get procedure execution statistics | [sys.dm_exec_procedure_stats](/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-procedure-stats-transact-sql) |
-| | Use the Query Store | [sys.query_store_plan](/sql/relational-databases/system-catalog-views/sys-query-store-plan-transact-sql) |
-| | | [sys.query_store_query](/sql/relational-databases/system-catalog-views/sys-query-store-query-transact-sql) |
-| | | [sys.query_store_query_text](/sql/relational-databases/system-catalog-views/sys-query-store-query-text-transact-sql) |
-| | Get Error Log (not yet supported)| [sys.sp_readerrorlog](/sql/relational-databases/system-stored-procedures/sp-readerrorlog-transact-sql) |
-||||
-| SQL Security Auditor | Get audit details | [sys.dm_server_audit_status](/sql/relational-databases/system-dynamic-management-views/sys-dm-server-audit-status-transact-sql) |
-||||
-| Both SQL Performance Monitor and SQL Security Auditor| | [sys.dm_audit_actions](/sql/relational-databases/system-dynamic-management-views/sys-dm-audit-actions-transact-sql) |
-|||[sys.dm_audit_class_type_map](/sql/relational-databases/system-dynamic-management-views/sys-dm-audit-class-type-map-transact-sql) |
-||||
-
-For more information on what IT support personnel can do when you grant them access via the Microsoft Purview roles, see the following resources:
-
-- SQL Performance Monitor: [Use Microsoft Purview to provide at-scale access to performance data in Azure SQL and SQL Server](https://techcommunity.microsoft.com/t5/azure-sql-blog/use-microsoft-purview-to-provide-at-scale-access-to-performance/ba-p/3812839)
-- SQL Security Auditor: [Security-related dynamic management views and functions](/sql/relational-databases/system-dynamic-management-views/security-related-dynamic-management-views-and-functions-transact-sql)
-
-## Next steps
-
-To get started with DevOps policies, consult the following resources:
-
-* Try DevOps policies for Azure SQL Database: [Quickstart guide](https://aka.ms/quickstart-DevOps-policies).
-* See [other videos, blogs, and articles](./how-to-policies-devops-authoring-generic.md#next-steps).
purview Concept Policies Purview Account Delete https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-purview-account-delete.md
- Title: Impact of deleting Microsoft Purview account on access policies
-description: This guide discusses the consequences of deleting a Microsoft Purview account on published access policies
----- Previously updated : 02/09/2023--
-# Impact of deleting Microsoft Purview account on access policies
-
-## Important considerations
-Deleting a Microsoft Purview account that has active (that is, published) policies removes those policies. Any access to a data source or a dataset that was previously provisioned from Microsoft Purview gets revoked. This can lead to outages, that is, users or groups in your organization being unable to access critical data. Before proceeding, review the decision to delete the Microsoft Purview account with the people in the Policy Author role at the root collection level. To find out who holds that role in the Microsoft Purview account, review the section on managing role assignments in this [guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
-
-Before deleting the Microsoft Purview account, it's advisable that you provision access for the users in your organization who need access to datasets by using an alternate mechanism or a different Purview account. Then delete or unpublish any active policies in an orderly way:
-* [Deleting DevOps policies](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy) - You need to delete DevOps policies for them to be unpublished.
-* [Unpublishing Data Owner policies](how-to-policies-data-owner-authoring-generic.md#unpublish-a-policy).
-* [Deleting Self-service access policies](how-to-delete-self-service-data-access-policy.md) - You need to delete Self-service access policies for them to be unpublished.
-
-## Next steps
-Check these concept guides:
-* [DevOps policies](concept-policies-devops.md)
-* [Data owner access policies](concept-policies-data-owner.md)
-* [Self-service access policies](concept-self-service-data-access-policy.md)
purview Concept Resource Sets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-resource-sets.md
- Title: Understanding resource sets
-description: This article explains what resource sets are and how Microsoft Purview creates them.
----- Previously updated : 02/17/2023--
-# Understanding resource sets
-
-This article helps you understand how Microsoft Purview uses resource sets to map data assets to logical resources.
-
-## Background info
-
-At-scale data processing systems typically store a single table in storage as multiple files. In the Microsoft Purview Data Catalog, this concept is represented by using resource sets. A resource set is a single object in the catalog that represents a large number of assets in storage.
-
-For example, suppose your Spark cluster has persisted a DataFrame into an Azure Data Lake Storage (ADLS) Gen2 data source. Although in Spark the table looks like a single logical resource, on the disk there are likely thousands of Parquet files, each of which represents a partition of the total DataFrame's contents. IoT data and web log data have the same challenge. Imagine you have a sensor that outputs log files several times a second. It won't take long until you have hundreds of thousands of log files from that single sensor.
-
-## How Microsoft Purview detects resource sets
-
-Microsoft Purview supports detecting resource sets in Azure Blob Storage, ADLS Gen1, ADLS Gen2, Azure Files, and Amazon S3.
-
-Microsoft Purview automatically detects resource sets when scanning. This feature looks at all of the data that's ingested via scanning and compares it to a set of defined patterns.
-
-For example, suppose you scan a data source whose URL is `https://myaccount.blob.core.windows.net/mycontainer/machinesets/23/foo.parquet`. Microsoft Purview looks at the path segments and determines if they match any built-in patterns. It has built-in patterns for GUIDs, numbers, date formats, localization codes (for example, en-us), and so on. In this case, the number pattern matches *23*. Microsoft Purview assumes that this file is part of a resource set named `https://myaccount.blob.core.windows.net/mycontainer/machinesets/{N}/foo.parquet`.
-
-Or, for a URL like `https://myaccount.blob.core.windows.net/mycontainer/weblogs/en_au/23.json`, Microsoft Purview matches both the localization pattern and the number pattern, producing a resource set named `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`.
-
-Using this strategy, Microsoft Purview would map the following resources to the same resource set, `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`:
-
-- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/1004.json`
-- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/234.json`
-- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/de_Ch/23434.json`
-
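The detection strategy can be approximated with a few regular expressions. This is a simplified illustration; the real matcher has many more built-in patterns (date formats, hex numbers, and so on) and more robust path-segment handling:

```python
import re

# Simplified illustration of the resource set detection described above;
# Purview's real matcher supports many more patterns and rules.
PATTERNS = [
    # GUID anywhere in the path
    (re.compile(r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
                r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"), "{GUID}"),
    # localization codes such as en-us, en_ca, de_Ch, bounded by / or .
    (re.compile(r"(?<=/)[a-z]{2}[-_][a-zA-Z]{2}(?=[/.])"), "{LOC}"),
    # a run of digits forming a whole path segment or file-name stem
    (re.compile(r"(?<=/)\d+(?=[/.])"), "{N}"),
]

def resource_set_name(url: str) -> str:
    """Collapse matching path segments so sibling files map to one resource set."""
    for pattern, token in PATTERNS:
        url = pattern.sub(token, url)
    return url
```

Applied to the URLs above, each file normalizes to the same name, `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`, so they're all grouped into one resource set.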
-### File types that Microsoft Purview will not detect as resource sets
-
-Microsoft Purview intentionally doesn't try to classify most document file types, like Word, Excel, or PDF, as resource sets. The exception is the CSV format, since that's a common partitioned file format.
-
-## How Microsoft Purview scans resource sets
-
-When Microsoft Purview detects resources that it thinks are part of a resource set, it switches from a full scan to a sample scan. A sample scan opens only a subset of the files that it thinks are in the resource set. For each file it opens, it uses its schema and runs its classifiers. Microsoft Purview then finds the newest resource among the opened resources and uses that resource's schema and classifications in the entry for the entire resource set in the catalog.
-
-## Advanced resource sets
-
-Microsoft Purview can customize and further enrich your resource set assets through the **Advanced Resource Sets** capability. Advanced resource sets allow Microsoft Purview to understand the underlying partitions of data ingested and enables the creation of [resource set pattern rules](how-to-resource-set-pattern-rules.md) that customize how Microsoft Purview groups resource sets during scanning.
-
-When Advanced Resource Sets are enabled, Microsoft Purview runs extra aggregations to compute the following information about resource set assets:
-
-- A sample path from a file that comprises the resource set.
-- A partition count that shows how many files make up the resource set.
-- The total size of all files that comprise the resource set.
-
-These properties can be found on the asset details page of the resource set.
--
-### Turning on advanced resource sets
-
-Advanced resource sets are off by default in all new Microsoft Purview instances. You can enable them from **Account information** in the management hub. Only users assigned the Data Curator role at the root collection can manage advanced resource set settings.
--
-After you enable advanced resource sets, the additional enrichments occur on all newly ingested assets. The Microsoft Purview team recommends waiting an hour after turning on the feature before scanning new data lake data.
-
-> [!IMPORTANT]
-> Enabling advanced resource sets will impact the refresh rate of asset and classification insights. When advanced resource sets are on, asset and classification insights will only update twice a day.
-
-## Built-in resource set patterns
-
-Microsoft Purview supports the following resource set patterns. These patterns can appear as a name in a directory or as part of a file name.
-### Regex-based patterns
-
-| Pattern Name | Display Name | Description |
-|--|--|-|
-| Guid | {GUID} | A globally unique identifier as defined in [RFC 4122](https://tools.ietf.org/html/rfc4122) |
-| Number | {N} | One or more digits |
-| Date/Time Formats | {Year}{Month}{Day}{N} | Various date/time formats are supported, but all are represented as {Year}[delimiter]{Month}[delimiter]{Day} or as a series of {N}s. |
-| 4ByteHex | {HEX} | A 4-digit HEX number. |
-| Localization | {LOC} | A language tag as defined in [BCP 47](https://tools.ietf.org/html/bcp47), both - and _ names are supported (for example, en_ca and en-ca) |
-
-### Complex patterns
-
-| Pattern Name | Display Name | Description |
-|--|--|-|
-| SparkPath | {SparkPartitions} | Spark partition file identifier |
-| Date(yyyy/mm/dd)InPath | {Year}/{Month}/{Day} | Year/month/day pattern spanning multiple folders |
--
-## How resource sets are displayed in the Microsoft Purview Data Catalog
-
-When Microsoft Purview matches a group of assets into a resource set, it attempts to extract the most useful information to use as a display name in the catalog. Some examples of the default naming convention applied:
-
-### Example 1
-
-Qualified name: `https://myblob.blob.core.windows.net/sample-data/name-of-spark-output/{SparkPartitions}`
-
-Display name: "name of spark output"
-
-### Example 2
-
-Qualified name: `https://myblob.blob.core.windows.net/my-partitioned-data/{Year}-{Month}-{Day}/{N}-{N}-{N}-{N}/{GUID}`
-
-Display name: "my partitioned data"
-
-### Example 3
-
-Qualified name: `https://myblob.blob.core.windows.net/sample-data/data{N}.csv`
-
-Display name: "data"
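Judging from these examples, the default display name appears to come from the last path segment that isn't made up entirely of pattern tokens, with tokens, file extensions, and hyphens removed. A sketch of that guess (an illustration inferred from the examples above, not Microsoft Purview's documented algorithm):

```python
import re

PATTERN_TOKEN = re.compile(r"\{[A-Za-z]+\}")

def display_name(qualified_name: str) -> str:
    """Guess the display name: scan path segments from the right, drop
    pattern tokens and file extensions, and use the first segment that
    still has text, with hyphens turned into spaces."""
    path = qualified_name.split("://", 1)[-1]
    for segment in reversed(path.split("/")):
        segment = segment.rsplit(".", 1)[0]               # drop a file extension
        segment = PATTERN_TOKEN.sub("", segment).strip("-_ ")
        if segment:
            return segment.replace("-", " ")
    return path
```

Running this against the three qualified names above reproduces "name of spark output", "my partitioned data", and "data".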
-
-## Customizing resource set grouping using pattern rules
-
-When scanning a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Microsoft Purview's resource set grouping may not accurately reflect your data estate. These issues can include:
-
-- Incorrectly marking an asset as a resource set
-- Putting an asset into the wrong resource set
-- Incorrectly marking an asset as not being a resource set
-
-To customize or override how Microsoft Purview detects which assets are grouped as resource sets and how they're displayed within the catalog, you can define pattern rules in the management center. For step-by-step instructions and syntax, see [resource set pattern rules](how-to-resource-set-pattern-rules.md).
-
-## Known limitations with resource sets
-
-- By default, resource set assets are only deleted by a scan if [advanced resource sets](#advanced-resource-sets) are enabled. If this capability is off, resource set assets can only be deleted manually or via the API.
-
-## Next steps
-
-To get started with Microsoft Purview, see [Quickstart: Create a Microsoft Purview account](create-catalog-portal.md).
purview Concept Scans And Ingestion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-scans-and-ingestion.md
- Title: Scans and ingestion
-description: This article explains scans and ingestion in Microsoft Purview.
-Previously updated: 04/20/2023
-# Scans and ingestion in Microsoft Purview
-
-This article provides an overview of the Scanning and Ingestion features in Microsoft Purview. These features connect your Microsoft Purview account to your sources to populate the data map and data catalog so you can begin exploring and managing your data through Microsoft Purview.
-
-- [**Scanning**](#scanning) captures metadata from [data sources](microsoft-purview-connector-overview.md) and brings it to Microsoft Purview.
-- [**Ingestion**](#ingestion) processes metadata and stores it in the data catalog from both:
- - Data source scans - scanned metadata is added to the Microsoft Purview Data Map.
- - Lineage connections - transformation resources add metadata about their sources, outputs, and activities to the Microsoft Purview Data Map.
-
-## Scanning
-
-After data sources are [registered](manage-data-sources.md) in your Microsoft Purview account, the next step is to scan the data sources. The scanning process establishes a connection to the data source and captures technical metadata like names, file size, columns, and so on. It also extracts schema for structured data sources, applies classifications on schemas, and [applies sensitivity labels if your Microsoft Purview Data Map is connected to a Microsoft Purview compliance portal](create-sensitivity-label.md). The scanning process can be triggered to run immediately or can be scheduled to run on a periodic basis to keep your Microsoft Purview account up to date.
-
-For each scan, there are customizations you can apply so that you're only scanning information you need, rather than the whole source.
-
-### Choose an authentication method for your scans
-
-Microsoft Purview is secure by default. No passwords or secrets are stored directly in Microsoft Purview, so you'll need to choose an authentication method for your sources. There are several possible ways to authenticate your Microsoft Purview account, but not all methods are supported for each data source.
-
-Whenever possible, a Managed Identity is the preferred authentication method because it eliminates the need for storing and managing credentials for individual data sources. This can greatly reduce the time you and your team spend setting up and troubleshooting authentication for scans. When you enable a managed identity for your Microsoft Purview account, an identity is created in Azure Active Directory and is tied to the lifecycle of your account.
-
-### Scope your scan
-
-When scanning a source, you have a choice to scan the entire data source or choose only specific entities (folders/tables) to scan. Available options depend on the source you're scanning, and can be defined for both one-time and scheduled scans.
-
-For example, when [creating and running a scan for an Azure SQL Database](register-scan-azure-sql-database.md#create-the-scan), you can choose which tables to scan, or select the entire database.
-
-For each entity (folder/table), there are three selection states: fully selected, partially selected, and not selected. In the example below, if you select "Department 1" in the folder hierarchy, "Department 1" is considered fully selected. Its parent entities, like "Company" and "example", are considered partially selected because other entities under the same parent, such as "Department 2", aren't selected. Different icons are used in the UI for entities in different selection states.
--
-After you run the scan, it's likely that new assets have been added in the source system. By default, future assets under a given parent are automatically selected if the parent is fully or partially selected when you run the scan again. In the example above, after you select "Department 1" and run the scan, any new assets under the folder "Department 1", or under "Company" and "example", are included when you run the scan again.
-
-A toggle button lets you control the automatic inclusion of new assets under a partially selected parent. By default, the toggle is turned off and this automatic inclusion behavior is disabled. In the same example with the toggle turned off, any new assets under partially selected parents like "Company" and "example" aren't included when you run the scan again; only new assets under "Department 1" are included in future scans.
--
-If the toggle is turned on, new assets under a given parent are automatically selected if the parent is fully or partially selected when you run the scan again. This matches the inclusion behavior from before the toggle was introduced.
--
-> [!NOTE]
-> * The availability of the toggle button depends on the data source type. Currently it's available in public preview for sources including Azure Blob Storage, Azure Data Lake Storage Gen 1, Azure Data Lake Storage Gen 2, Azure Files, and Azure Dedicated SQL pool (formerly SQL DW).
-> * For any scans created or scheduled before the toggle button was introduced, the toggle state is set to on and can't be changed. For any scans created or scheduled after the toggle button was introduced, the toggle state can't be changed after the scan is saved. You need to create a new scan to change the toggle state.
-> * When the toggle button is turned off, for sources of storage type like Azure Data Lake Storage Gen 2, it may take up to 4 hours after your scan job completes before the [browse by source type](how-to-browse-catalog.md#browse-by-source-type) experience becomes fully available.
-
-#### Known limitations
-When the toggle button is turned off:
-* The file entities under a partially selected parent will not be scanned.
-* If all existing entities under a parent are explicitly selected, the parent will be considered as fully selected and any new assets under the parent will be included when you run the scan again.
-
-### Scan rule set
-
-A scan rule set determines the kinds of information a scan will look for when it's running against one of your sources. Available rules depend on the kind of source you're scanning, but include things like the [file types](sources-and-scans.md#file-types-supported-for-scanning) you should scan, and the kinds of [classifications](supported-classifications.md) you need.
-
-There are [system scan rule sets](create-a-scan-rule-set.md#system-scan-rule-sets) already available for many data source types, but you can also [create your own scan rule sets](create-a-scan-rule-set.md) to tailor your scans to your organization.
-
-### Schedule your scan
-
-Microsoft Purview gives you a choice of scanning weekly or monthly at a specific time you choose. Weekly scans may be appropriate for data sources with structures that are actively under development or frequently change. Monthly scanning is more appropriate for data sources that change infrequently. Best practice is to work with the administrator of the source you want to scan to identify a time when compute demands on the source are low.
-
-### How scans detect deleted assets
-
-A Microsoft Purview catalog is only aware of the state of a data store when it runs a scan. For the catalog to know if a file, table, or container was deleted, it compares the last scan output against the current scan output. For example, suppose that the last time you scanned an Azure Data Lake Storage Gen2 account, it included a folder named *folder1*. When the same account is scanned again, *folder1* is missing. Therefore, the catalog assumes the folder has been deleted.
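Conceptually, this comparison is a set difference, restricted to the locations that both scans actually covered. A rough sketch of the idea (illustrative only, not the service's actual logic):

```python
def detect_deletions(prev_assets, prev_scope, curr_assets, curr_scope):
    """Illustrative sketch: an asset is presumed deleted only if it is
    missing from the current scan AND its folder was covered by both
    scans, so a narrower scan doesn't wrongly flag unscanned assets."""
    checked_folders = prev_scope & curr_scope
    return {
        asset for asset in prev_assets - curr_assets
        if any(asset.startswith(folder + "/") for folder in checked_folders)
    }
```

Restricting the check to the shared scope is what makes the multi-user scenario in the next paragraph work: only a folder scanned in both runs is eligible for deletion detection.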
-
-#### Detecting deleted files
-
-The logic for detecting missing files works for multiple scans by the same user and by different users. For example, suppose a user runs a one-time scan on a Data Lake Storage Gen2 data store on folders A, B, and C. Later, a different user in the same account runs a different one-time scan on folders C, D, and E of the same data store. Because folder C was scanned twice, the catalog checks it for possible deletions. Folders A, B, D, and E, however, were scanned only once, and the catalog won't check them for deleted assets.
-
-To keep deleted files out of your catalog, it's important to run regular scans. The scan interval is important, because the catalog can't detect deleted assets until another scan is run. So, if you run scans once a month on a particular store, the catalog can't detect any deleted data assets in that store until you run the next scan a month later.
-
-When you enumerate large data stores like Data Lake Storage Gen2, there are multiple ways (including enumeration errors and dropped events) to miss information. A particular scan might miss that a file was created or deleted. So, unless the catalog is certain a file was deleted, it won't delete it from the catalog. This strategy means the catalog can still list a file that no longer exists in the scanned data store. In some cases, a data store might need to be scanned two or three times before it catches certain deleted assets.
-
-> [!NOTE]
-> - Assets that are marked for deletion are deleted after a successful scan. Deleted assets might continue to be visible in your catalog for some time before they are processed and removed.
-> - Currently, source deletion detection is not supported for the following sources: Azure Databricks, Cassandra, DB2, Erwin, Google BigQuery, Hive Metastore, Looker, MongoDB, MySQL, Oracle, PostgreSQL, Salesforce, SAP BW, SAP ECC, SAP HANA, SAP S/4HANA, Snowflake, and Teradata. When an object is deleted from the data source, the subsequent scan won't automatically remove the corresponding asset from Microsoft Purview.
-
-## Ingestion
-
-Ingestion is the process that populates the data map with the metadata gathered by scans and lineage connections.
-
-## Ingestion from scans
-
-The technical metadata and classifications identified by the scanning process are then sent to ingestion. Ingestion analyzes the input from the scan, [applies resource set patterns](concept-resource-sets.md#how-microsoft-purview-detects-resource-sets), populates available [lineage](concept-data-lineage.md) information, and then loads the data map automatically. Assets and schemas can be discovered or curated only after ingestion is complete. So, if your scan has completed but you don't yet see your assets in the data map or catalog, wait for the ingestion process to finish.
-
-## Ingestion from lineage connections
-
-Resources like [Azure Data Factory](how-to-link-azure-data-factory.md) and [Azure Synapse](how-to-lineage-azure-synapse-analytics.md) can be connected to Microsoft Purview to bring data source and lineage information into your Microsoft Purview Data Map. For example, when a copy pipeline runs in an Azure Data Factory that has been connected to Microsoft Purview, metadata about the input sources, the activity, and the output sources is ingested into Microsoft Purview and the information is added to the data map.
-
-If a data source has already been added to the data map through a scan, lineage information about the activity will be added to the existing source. If the data source hasn't yet been added to the data map, the lineage ingestion process will add it to the root collection with its lineage information.
-
-For more information about the available lineage connections, see the [lineage user guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-For more information, or for specific instructions for scanning sources, follow the links below.
-
-* To understand resource sets, see our [resource sets article](concept-resource-sets.md).
-* [How to govern an Azure SQL Database](register-scan-azure-sql-database.md#create-the-scan)
-* [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
purview Concept Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-self-service-data-access-policy.md
- Title: Microsoft Purview Self-service access concepts
-description: Understand what self-service access and data discovery are in Microsoft Purview, and explore how users can take advantage of it.
-Previously updated: 11/11/2022
-# Microsoft Purview Self-service data discovery and access (Preview)
-
-This article helps you understand Microsoft Purview Self-service data access policy.
-
-> [!IMPORTANT]
-> Microsoft Purview Self-service data access policy is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
-
-## Important limitations
-
-The self-service data access policy is only supported when the prerequisites mentioned in [Data Use Management](./how-to-enable-data-use-management.md#prerequisites) are satisfied.
-
-## Overview
-
-The Microsoft Purview self-service data access workflow allows a data consumer to request access to data while browsing or searching for it. Once the data access request is approved, a policy is auto-generated to grant access to the requestor, provided the data source is enabled for Data Use Management. Currently, self-service data access policies are supported for storage accounts, containers, folders, and files.
-
-A **workflow admin** needs to map a self-service data access workflow to a collection. A collection is a logical grouping of data sources that are registered within Microsoft Purview. **Only data sources that are registered** for Data Use Management have self-service policies auto-generated.
-
-## Terminology
-
-* **Data consumer** is anyone who uses the data; for example, a data analyst accessing marketing data for customer segmentation. *Data consumer* and *data requestor* are used interchangeably in this document.
-
-* **Collection** is a logical grouping of data sources that are registered within Microsoft Purview.
-
-* **Self-service data access workflow** is the workflow that is initiated when a data consumer requests access to data.
-
-* **Approver** is a security group, or the Azure Active Directory (Azure AD) users or groups, that can approve self-service access requests.
-
-## How to use Microsoft Purview self-service data access policy
-
-Microsoft Purview allows organizations to catalog metadata about all registered data assets. It allows data consumers to search for or browse to the required data asset.
-
-With self-service data access workflow, data consumers can not only find data assets but also request access to the data assets. When the data consumer requests access to a data asset, the associated self-service data access workflow is triggered.
-
-A default self-service data access workflow template is provided with every Microsoft Purview account. The default template can be amended to add more approvers and/or set the approver's email address. For more information, see [Create and enable self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md).
-
-Whenever a data consumer requests access to a dataset, a notification is sent to the workflow approvers. The approvers can view the request and approve it either from the Microsoft Purview governance portal or from within the email notification. When the request is approved, a policy is auto-generated and applied to the respective data source. A self-service data access policy is auto-generated only if the data source is registered for **Data Use Management**, and the prerequisites mentioned in [Data Use Management](./how-to-enable-data-use-management.md#prerequisites) have to be satisfied.
-
-The data consumer can access the requested dataset using tools such as Power BI or an Azure Synapse Analytics workspace.
-
->[!NOTE]
-> Users will not be able to browse to the asset using the Azure portal or Azure Storage Explorer if the only permission granted is read/modify access at the file or folder level of the storage account.
-
-> [!CAUTION]
-> Folder-level permission is required to access data in ADLS Gen2 using Power BI.
-> Additionally, resource sets aren't supported by self-service policies, so folder-level permission needs to be granted to access resource set files such as CSV or Parquet files.
-
-## Next steps
-
-If you would like to preview these features in your environment, follow the links below.
-
-- [Enable Data Use Management](./how-to-enable-data-use-management.md#prerequisites)
-- [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md)
-- [Working with policies at the file level](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-file-level-permission/ba-p/3102166)
-- [Working with policies at the folder level](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-folder-level-permission/ba-p/3109583)
-- [Self-service policies for Azure SQL Database tables and views](./how-to-policies-self-service-azure-sql-db.md)
purview Concept Workflow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-workflow.md
- Title: Workflows in Microsoft Purview
-description: This article describes workflows in Microsoft Purview, the roles they play, and who can create and manage them.
-Previously updated: 10/17/2022
-# Workflows in Microsoft Purview
--
-Workflows are automated, repeatable business processes that users can create within Microsoft Purview to validate and orchestrate CUD (create, update, delete) operations on their data entities. Enabling these processes allows organizations to track changes, enforce policy compliance, and ensure quality data across their data landscape.
-
-Because the workflows are created and managed within Microsoft Purview, manual change monitoring or approval is no longer required to ensure quality updates to the data catalog.
-
-## What are workflows?
-
-Workflows are automated processes that are made up of [connectors](#workflow-connectors) that contain a common set of pre-established actions and are run when specified operations occur in your data catalog.
-
-For example: A user attempts to delete a business glossary term that is bound to a workflow. When the user submits this operation, the workflow runs through its actions instead of, or before, the original delete operation.
-
-Workflow [actions](#workflow-connectors) include things like generating approval requests or sending notifications. These actions allow users to automate validation and notification systems across their organization.
-
-Currently, there are two kinds of workflows:
-
-* **Data governance** - for data policy, access governance, and loss prevention. [Scoped](#workflow-scope) at the collection level.
-* **Data catalog** - to manage approvals for CUD (create, update, delete) operations for glossary terms. [Scoped](#workflow-scope) at the glossary level.
-
-These workflows can be built from pre-established [workflow templates](#workflow-templates) provided in the Microsoft Purview governance portal, but are fully customizable using the available workflow connectors.
--
-## Workflow templates
-
-For each type of user-defined workflow available, Microsoft Purview provides templates to help [workflow administrators](#who-can-manage-workflows) create workflows without building them from scratch. The templates are built into the authoring experience and are automatically populated based on the workflow being created, so there's no need to search for them.
-
-Templates are available to launch the workflow authoring experience. However, a workflow admin can customize the template to meet the requirements in their organization.
-
-## Workflow connectors
-
-Workflow connectors are a common set of actions applicable across some workflows. They can be used in any workflow in Microsoft Purview to create processes customized to your organization. To view the list of existing workflow connectors in Microsoft Purview, see [Workflow connectors](how-to-use-workflow-connectors.md).
-
-## Workflow scope
-
-Once a workflow is created and enabled, it can be bound to a particular scope. This gives you the flexibility to run different workflows for different areas/departments in your organization.
-
-Data governance workflows are scoped to collections, and can be bound to the root collection to govern the whole Microsoft Purview catalog, or any subcollection.
-
-Data catalog workflows are scoped to the glossary and can be bound to the entire glossary, any single term, or any parent term to manage child-terms.
-
-If there's no workflow directly associated with a scope, the workflow engine traverses upward in the scope hierarchy to determine the closest bound workflow, and runs that workflow for the operation.
-
-For example, the AdatumCorp Purview account has the following collection hierarchy:
-
-Root Collection > Sales | Finance | Marketing
-
-- **Root collection** has the workflow _Self-Service data access default workflow_ defined and bound.
-- **Sales** has _Self-Service data access for sales collection_ defined and bound.
-- **Finance** has _Self-Service data access for finance collection_ defined and bound.
-- **Marketing** has no workflows directly bound.
-
-In the above setup, when an access request is made for a data asset in Finance collection, the _Self-Service data access for finance collection_ workflow is run.
-
-However, when an access request is made for a data asset in the Marketing collection, the _Self-Service data access default workflow_ is triggered. Because no workflows are bound at the Marketing scope, the workflow engine traverses to the next level in the scope hierarchy: Marketing's parent, the root collection. The workflow bound at the root collection scope is run.
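The traversal can be sketched as a walk up the parent chain. The collection names come from the example above; the lookup code itself is only an illustration of the behavior, not the workflow engine's implementation:

```python
def resolve_workflow(scope, bindings, parents):
    """Return the workflow bound at the given scope, or at the nearest
    ancestor that has one (illustrative sketch of the traversal)."""
    while scope is not None:
        if scope in bindings:
            return bindings[scope]
        scope = parents.get(scope)  # parent collection, or None past the root
    return None

parents = {"Sales": "Root", "Finance": "Root", "Marketing": "Root"}
bindings = {
    "Root": "Self-Service data access default workflow",
    "Sales": "Self-Service data access for sales collection",
    "Finance": "Self-Service data access for finance collection",
}
```

Here `resolve_workflow("Finance", bindings, parents)` returns the Finance-specific workflow, while `resolve_workflow("Marketing", bindings, parents)` falls through to the root collection's default workflow.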
-
-## Who can manage workflows?
-
-A new role, **Workflow Admin**, is introduced with workflow functionality.
-
-A Workflow admin defined for a collection can create self-service workflows and bind these workflows to the collections they have access to.
-
-A Workflow admin defined for any collection can create approval workflows for the business glossary. To bind glossary workflows to a term, you need at least [Data reader permissions](catalog-permissions.md).
-
-## Next steps
-
-Now that you understand what workflows are, you can follow these guides to use them in your Microsoft Purview account:
-
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
-
purview Configure Event Hubs For Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/configure-event-hubs-for-kafka.md
- Title: Configure Event Hubs with Microsoft Purview for Atlas Kafka topics
-description: Configure Event Hubs to send/receive events to/from Microsoft Purview's Apache Atlas Kafka topics.
-Previously updated: 12/13/2022
-# Configure Event Hubs with Microsoft Purview to send and receive Atlas Kafka topics messages
-
-This article shows you how to configure Microsoft Purview to send and receive *Atlas Kafka* topic events with Azure Event Hubs.
-
-If you have already configured your environment, you can follow [our guide to get started with the **Azure.Messaging.EventHubs** .NET library to send and receive messages.](manage-kafka-dotnet.md)
-
-## Prerequisites
-
-To configure your environment, you need certain prerequisites in place:
-
-- **A Microsoft Azure subscription**. To use Azure services, including Event Hubs, you need an Azure subscription. If you don't have an Azure account, you can sign up for a [free trial](https://azure.microsoft.com/free/) or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- An [Azure Event Hubs](../event-hubs/event-hubs-create.md) namespace with an event hub.
-
-## Configure Event Hubs
-
-To send or receive Atlas Kafka topics messages, you'll need to configure at least one Event Hubs namespace.
-
->[!NOTE]
-> If your Microsoft Purview account was created before December 15th, 2022, you may have a managed Event Hubs resource already associated with your account.
-> You can check in **Managed Resources** under settings on your Microsoft Purview account page in the [Azure portal](https://portal.azure.com).
-> :::image type="content" source="media/configure-event-hubs-for-kafka/enable-disable-event-hubs.png" alt-text="Screenshot showing the Event Hubs namespace toggle highlighted on the Managed resources page of the Microsoft Purview account page in the Azure portal.":::
->
-> - If you do not see this resource, or it is **disabled**, follow the steps below to configure your Event Hubs.
->
-> - If it is **enabled**, you can continue to use this managed Event Hubs namespace if you prefer. (There is an associated cost; see [the pricing page](https://azure.microsoft.com/pricing/details/purview/).) If you want to manage your own Event Hubs namespace, you must first **disable** this feature and follow the steps below.
-> **If you disable the managed Event Hubs resource, you won't be able to re-enable it. You will only be able to configure your own Event Hubs.**
-
-## Event Hubs permissions
-
-To authenticate with your Event Hubs, you can either use:
-
-- Microsoft Purview managed identity
-- [User-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) - only available when configuring namespaces after account creation.
-
-One of these identities will need **at least contributor permissions on your Event Hubs** to be able to configure them to use with Microsoft Purview.
-
-## Configure Event Hubs to publish messages to Microsoft Purview
-
-1. Navigate to **Kafka configuration** under settings on your Microsoft Purview account page in the [Azure portal](https://portal.azure.com).
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/select-kafka-configuration.png" alt-text="Screenshot showing the Kafka configuration option in the Microsoft Purview menu in the Azure portal.":::
-
-1. Select **Add configuration** and **Hook configuration**.
- >[!NOTE]
- > You can add as many hook configurations as you need.
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/add-hook-configuration.png" alt-text="Screenshot showing the Kafka configuration page with add configuration and hook configuration highlighted.":::
-
-1. Give a name to your hook configuration, select your subscription, an existing Event Hubs namespace, an existing Event Hubs to send the notifications to, the consumer group you want to use, and the kind of authentication you would like to use.
-
- >[!TIP]
- > You can use the same Event Hubs namespace more than once, but each configuration will need its own Event Hubs.
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/configure-hook-event-hub.png" alt-text="Screenshot showing the hook configuration page, with all values filled in.":::
-
-1. Select **Save**. It will take a couple minutes for your configuration to complete.
-
-1. Once configuration is complete, you can begin the steps to [publish messages to Microsoft Purview](manage-kafka-dotnet.md#publish-messages-to-microsoft-purview).
-
-## Configure Event Hubs to receive messages from Microsoft Purview
-
-1. Navigate to **Kafka configuration** under settings on your Microsoft Purview account page in the [Azure portal](https://portal.azure.com).
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/select-kafka-configuration.png" alt-text="Screenshot showing the Kafka configuration option in the Microsoft Purview menu in the Azure portal.":::
-
-1. If there's a configuration already listed as type **Notification**, Event Hubs is already configured, and you can begin the steps to [receive Microsoft Purview messages](manage-kafka-dotnet.md#receive-microsoft-purview-messages).
- >[!NOTE]
- > Only one *Notification* Event Hubs can be configured at a time.
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/type-notification.png" alt-text="Screenshot showing the Kafka configuration option with a notification type configuration ready.":::
-
-1. If there isn't a **Notification** configuration already listed, select **Add configuration** and **Notification configuration**.
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/add-notification-configuration.png" alt-text="Screenshot showing the Kafka configuration page with add configuration and notification configuration highlighted.":::
-
-1. Give your notification configuration a name, then select your subscription, an existing Event Hubs namespace, an existing event hub to send the notifications to, the partitions you want to use, and the kind of authentication you'd like to use.
-
- >[!TIP]
- > You can use the same Event Hubs namespace more than once, but each configuration will need its own Event Hubs.
-
- :::image type="content" source="media/configure-event-hubs-for-kafka/configure-notification-event-hub.png" alt-text="Screenshot showing the notification hub configuration page, with all values filled in.":::
-
-1. Select **Save**. It will take a couple of minutes for your configuration to complete.
-
-1. Once configuration is complete, you can begin the steps to [receive Microsoft Purview messages](manage-kafka-dotnet.md#receive-microsoft-purview-messages).
-
-## Remove configured Event Hubs
-
-To remove configured Event Hubs namespaces, you can follow these steps:
-
-1. Search for and open your Microsoft Purview account in the [Azure portal](https://portal.azure.com).
-1. Select **Kafka configuration** under settings on your Microsoft Purview account page in the Azure portal.
-1. Select the Event Hubs you want to disable. (Hook hubs send messages to Microsoft Purview. Notification hubs receive notifications.)
-1. Select **Remove** to save the choice and begin the disablement process. This can take several minutes to complete.
- :::image type="content" source="media/configure-event-hubs-for-kafka/select-remove.png" alt-text="Screenshot showing the Kafka configuration page of the Microsoft Purview account page in the Azure portal with the remove button highlighted.":::
-
-## Next steps
-
-- [Publish and process Atlas Kafka topics messages using your Event Hubs](manage-kafka-dotnet.md)
-- [Event Hubs samples in GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/eventhub/Azure.Messaging.EventHubs/samples)
-- [Event processor samples in GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/eventhub/Azure.Messaging.EventHubs.Processor/samples)
-- [An introduction to Atlas notifications](https://atlas.apache.org/2.0.0/Notifications.html)
purview Create A Custom Classification And Classification Rule https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-a-custom-classification-and-classification-rule.md
- Title: Create a custom classification and classification rule
-description: Learn how to create custom classifications to define data types in your data estate that are unique to your organization in Microsoft Purview.
-
-Previously updated: 04/26/2023
-
-# Custom classifications in Microsoft Purview
-
-This article describes how you can create custom classifications to define data types in your data estate that are unique to your organization. It also describes the creation of custom classification rules that let you find specified data throughout your data estate.
-
->[!IMPORTANT]
->To create a custom classification you need either **data curator** or **data source administrator** permission on a collection. Permissions at any collection level are sufficient.
->For more information about permissions, see: [Microsoft Purview permissions](catalog-permissions.md).
-
-## Default system classifications
-
-The Microsoft Purview Data Catalog provides a large set of default system classifications that represent typical personal data types that you might have in your data estate. For the entire list of available system classifications, see [Supported classifications in Microsoft Purview](supported-classifications.md).
--
-You also have the ability to create custom classifications, if any of the default classifications don't meet your needs.
-
-> [!Note]
-> Our [data sampling rules](sources-and-scans.md#sampling-within-a-file) are applied to both system and custom classifications.
-
-> [!NOTE]
-> Microsoft Purview custom classifications are applied only to structured data sources like SQL and CosmosDB, and to structured file types like CSV, JSON, and Parquet. Custom classification isn't applied to unstructured data file types like DOC, PDF, and XLSX.
-
-## Steps to create a custom classification
-
-To create a custom classification, follow these steps:
-
-1. You'll need [**data curator** or **data source administrator** permissions on any collection](catalog-permissions.md) to be able to create a custom classification.
-
-1. From your catalog, select **Data Map** from the left menu.
-
-1. Select **Classifications** under **Annotation management**.
-
-1. Select **+ New**.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/new-classification.png" alt-text="New classification" border="true":::
-
-The **Add new classification** pane opens, where you can give your
-classification a name and a description. It's good practice to use a name-spacing convention, such as `your company name.classification name`.
-
-The Microsoft system classifications are grouped under the reserved `MICROSOFT.` namespace. An example is **MICROSOFT.GOVERNMENT.US.SOCIAL\_SECURITY\_NUMBER**.
-
-The name of your classification must start with a letter followed by a sequence of letters, numbers, and period (.) or underscore characters. As you type, the UX automatically generates a friendly name. This friendly name is what users see when you apply it to an asset in the catalog.
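-The naming rule above can be sketched as a regular expression. This pattern is our own illustration of the documented rule, not an official Purview validation API:

```python
import re

# Illustrative pattern for the documented rule: a letter, followed by
# any sequence of letters, numbers, periods, or underscores.
NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z0-9._]*$")

def is_valid_classification_name(name: str) -> bool:
    """Return True if the name follows the documented naming rule."""
    return bool(NAME_PATTERN.fullmatch(name))
```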
-
-To keep the name short, the system creates the friendly name based on
-the following logic:
-
-- All but the last two segments of the namespace are trimmed.
-- The casing is adjusted so that the first letter of each word is capitalized.
-- All underscores (\_) are replaced with spaces.
-
-As an example, if you named your classification **contoso.hr.employee\_ID**, the friendly name is stored
-in the system as **Hr.Employee ID**.
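-The friendly-name logic above can be sketched in Python. This is only an illustration of the three documented steps, not Purview's actual implementation:

```python
def friendly_name(classification_name: str) -> str:
    """Illustrate the documented friendly-name logic."""
    # Trim all but the last two segments of the namespace.
    segments = classification_name.split(".")[-2:]

    def capitalize_words(segment: str) -> str:
        # Replace underscores with spaces, then capitalize the first
        # letter of each word.
        words = segment.replace("_", " ").split(" ")
        return " ".join(w[:1].upper() + w[1:] for w in words)

    return ".".join(capitalize_words(s) for s in segments)

print(friendly_name("contoso.hr.employee_ID"))  # Hr.Employee ID
```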
--
-Select **OK**, and your new classification is added to your
-classification list.
--
-Selecting the classification in the list opens the classification
-details page. Here, you find all the details about the classification.
-
-These details include the count of how many instances there are, the formal name, associated classification rules (if any), and the owner name.
--
-## Custom classification rules
-
-The catalog service provides a set of default classification rules, which are used by the scanner to automatically detect certain data types. You can also add your own custom classification rules to detect other types of data that you might be interested in finding across your data estate. This capability can be powerful when you're trying to find data within your data estate.
-
->[!NOTE]
->Custom classification rules are only supported in the English language.
-
-As an example, let's say that a company named Contoso has employee IDs that are standardized throughout the company as the word "Employee" followed by a GUID, in the format EMPLOYEE{GUID}. For example, one instance of an employee ID looks like `EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55`.
-
-Contoso can configure the scanning system to find instances of these IDs by creating a custom classification rule. They can supply a regular expression that matches the data pattern, in this case, `^Employee[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}$`. Optionally, if the data usually is in a column whose name they know, such as Employee\_ID or EmployeeID, they can add a column pattern regular expression to make the scan even more accurate. An example regex is `Employee_ID|EmployeeID`.
-
-The scanning system can then use this rule to examine the actual data in the column and the column name to try to identify every instance of where the employee ID pattern is found.
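-The Contoso patterns above can be checked locally with Python's `re` module. This is a sketch to validate the patterns, not the scanner itself; the `IGNORECASE` flag mirrors the case insensitivity of custom classification regexes:

```python
import re

# Data pattern from the example above.
DATA_PATTERN = re.compile(
    r"^Employee[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}"
    r"-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}$",
    re.IGNORECASE,
)

# Optional column pattern from the example above.
COLUMN_PATTERN = re.compile(r"Employee_ID|EmployeeID", re.IGNORECASE)

sample = "EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55"
print(bool(DATA_PATTERN.match(sample)))            # True
print(bool(COLUMN_PATTERN.search("employee_id")))  # True
```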
-
-## Steps to create a custom classification rule
-
-To create a custom classification rule:
-
-1. Create a custom classification by following the instructions in the previous section. You'll add this custom classification in the classification rule configuration so that the system applies it when it finds a match in the column.
-
-2. Select the **Data Map** icon.
-
-3. Select the **Classifications rules** section.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/classification-rules.png" alt-text="Classification rules tile" border="true":::
-
-4. Select **New**.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/new-classification-rule.png" alt-text="Add new classification rule" border="true":::
-
-5. The **New classification rule** dialog box opens. Fill in the fields and decide whether to create a **regular expression rule** or a **dictionary rule**.
-
- |Field |Description |
- |||
- |Name | Required. The maximum is 100 characters. |
- |Description |Optional. The maximum is 256 characters. |
- |Classification Name | Required. Select the name of the classification from the drop-down list to tell the scanner to apply it if a match is found. |
- |State | Required. The options are enabled or disabled. Enabled is the default. |
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/create-new-classification-rule.png" alt-text="Create new classification rule" border="true":::
-
-### Creating a Regular Expression Rule
-
->[!IMPORTANT]
->Regular expressions in custom classifications are case insensitive.
-
-1. If creating a regular expression rule, you'll see the following screen. You may optionally upload a file that will be used to **generate suggested regex patterns** for your rule. Only English language rules are supported.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/create-new-regex-rule.png" alt-text="Create new regex rule" border="true":::
-
-1. If you decide to generate a suggested regex pattern, after uploading a file, select one of the suggested patterns and select **Add to Patterns** to use the suggested data and column patterns. You can tweak the suggested patterns or you may also type your own patterns without uploading a file.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/suggested-regex.png" alt-text="Generate suggested regex" border="true":::
-
- |Field |Description |
- |||
- |Data Pattern |Optional. A regular expression that represents the data that's stored in the data field. The limit is large. In the previous example, the data patterns test for an employee ID that's literally the word `Employee{GUID}`. |
- |Column Pattern |Optional. A regular expression that represents the column names that you want to match. The limit is large. |
-
-1. Under **Data Pattern** you can use the **Minimum match threshold** to set the minimum percentage of the distinct data value matches in a column that must be found by the scanner for the classification to be applied. The suggested value is 60%. If you specify multiple data patterns, this setting is disabled and the value is fixed at 60%.
-
- > [!Note]
- > The Minimum match threshold must be at least 1%.
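-The threshold check can be sketched as follows. This only illustrates the distinct-value percentage idea; the scanner's real sampling and matching logic isn't public:

```python
import re

def classification_applies(column_values, data_pattern, threshold=0.60):
    """Return True if enough distinct values match the data pattern."""
    distinct = set(column_values)
    if not distinct:
        return False
    matches = sum(1 for value in distinct if re.fullmatch(data_pattern, value))
    return matches / len(distinct) >= threshold

# Two of three distinct values match, so 66% >= 60%.
print(classification_applies(["A-1", "A-2", "other"], r"A-\d"))  # True
```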
-
-1. You can now verify your rule and **create** it.
-1. Test the classification rule before completing the creation process to validate that it will apply tags to your assets. The classifications in the rule will be applied to the sample data you upload, just as they would be in a scan. This means all of the system classifications and your custom classification will be matched to the data in your file.
-
- Input files may include delimited files (CSV, PSV, SSV, TSV), JSON, or XML content. The content will be parsed based on the file extension of the input file. Delimited data may have a file extension that matches any of the mentioned types. For example, TSV data can exist in a file named MySampleData.csv. Delimited content must also have a minimum of three columns.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/test-rule-screen.png" alt-text="Test rule before creating" border="true":::
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/test-rule-uploaded-file-result-screen.png" alt-text="View applied classifications after uploading a test file" border="true":::
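-A quick way to sanity-check a delimited test file before uploading is to sniff its delimiter and column count locally. `csv.Sniffer` here is our own stand-in; the service's actual parser isn't public:

```python
import csv
import io

def check_test_file(sample_text: str):
    """Detect the delimiter (comma, pipe, semicolon, or tab) and parse rows."""
    dialect = csv.Sniffer().sniff(sample_text, delimiters=",|;\t")
    rows = list(csv.reader(io.StringIO(sample_text), dialect))
    # The test upload requires delimited content with at least three columns.
    if any(len(row) < 3 for row in rows):
        raise ValueError("Delimited test content needs at least three columns.")
    return rows

# TSV content can live in a file named MySampleData.csv; the delimiter,
# not the extension, is what this local check cares about.
rows = check_test_file("id\tname\temail\n1\tAda\tada@example.com\n")
print(rows[0])  # ['id', 'name', 'email']
```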
-
-### Creating a Dictionary Rule
-
-1. If creating a dictionary rule, you'll see the following screen. Upload a file that contains all possible values for the classification you're creating in a single column. Only English language rules are supported.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/dictionary-rule.png" alt-text="Create dictionary rule" border="true":::
-
-1. After the dictionary is generated, you can adjust the minimum match threshold and submit the rule.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/dictionary-generated.png" alt-text="Create dictionary rule, with Dictionary-Generated checkmark." border="true":::
-
-## Edit or delete a custom classification
-
-To update or edit a custom classification, follow these steps:
-
-1. In your Microsoft Purview account, select the **Data map**, and then **Classifications**.
-1. Select the **Custom** tab.
-1. Select the classification you want to edit, then select the **Edit** button.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/select-edit.png" alt-text="Screenshot of the custom classification page, showing a classification selected and the edit button highlighted." border="true":::
-
-1. Now you can edit the description of this custom classification. Select the **Ok** button when you're finished to save your changes.
-
-To delete a custom classification:
-
-1. After opening the **Data map**, and then **Classifications**, select the **Custom** tab.
-1. Select the classification you want to delete, or multiple classifications you want to delete, and then select the **Delete** button.
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/select-delete.png" alt-text="Screenshot of the custom classification page, showing a classification selected and the delete button highlighted." border="true":::
-
-You can also edit or delete a classification from inside the classification itself. Just select your classification, then select the **Edit** or **Delete** buttons in the top menu.
--
-## Enable or disable classification rules
-
-1. In your Microsoft Purview account, select the **Data map**, and then **Classification rules**.
-1. Select the **Custom** tab.
-1. You can check the current status of a classification rule by looking at the **Status** column in the table.
-1. Select the classification rule, or multiple classification rules, that you want to enable or disable.
-1. Select either the **Enable** or **Disable** buttons in the top menu.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/enable-or-disable.png" alt-text="Screenshot of the custom classification rule page, showing a classification rule selected and the enable and disable buttons highlighted." border="true":::
-
-You can also update the status of a rule when editing the rule.
-
-## Edit or delete a classification rule
-
-To update or edit a custom classification rule, follow these steps:
-
-1. In your Microsoft Purview account, select the **Data map**, and then **Classification rules**.
-1. Select the **Custom** tab.
-1. Select the classification rule you want to edit, then select the **Edit** button.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/select-edit-rule.png" alt-text="Screenshot of the custom classification rule page, showing a classification rule selected and the edit button highlighted." border="true":::
-
-1. Now you can edit the state, the description, and the associated classification rule.
-1. Select the **Continue** button.
-1. You can upload a new file for your regular expression or dictionary rule to match against, and update your match threshold and column pattern match.
-1. Select **Apply** to save your changes. Scans will need to be rerun with the new rule to apply changes across your assets.
-
-To delete a custom classification rule:
-
-1. After opening the **Data map**, and then **Classification rules**, select the **Custom** tab.
-1. Select the classification rule you want to delete, and then select the **Delete** button.
-
- :::image type="content" source="media/create-a-custom-classification-and-classification-rule/select-delete-rule.png" alt-text="Screenshot of the custom classification rule page, showing a classification rule selected and the delete button highlighted." border="true":::
-
-## Next steps
-
-Now that you've created your classification rule, it's ready to be added to a scan rule set so that your scan uses the rule when scanning. For more information, see [Create a scan rule set](create-a-scan-rule-set.md).
purview Create A Scan Rule Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-a-scan-rule-set.md
- Title: Create a scan rule set
-description: Create a scan rule set in Microsoft Purview to quickly scan data sources in your organization.
-
-Previously updated: 11/01/2022
-
-# Create a scan rule set
-
-In a Microsoft Purview catalog, you can create scan rule sets to enable you to quickly scan data sources in your organization.
-
-A scan rule set is a container for grouping a set of scan rules together so that you can easily associate them with a scan. For example, you might create a default scan rule set for each of your data source types, and then use these scan rule sets by default for all scans within your company. You might also want users with the right permissions to create other scan rule sets with different configurations based on business need.
-
-## Steps to create a scan rule set
-
-To create a scan rule set:
-
-1. From the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), select **Data Map**.
-
-1. Select **Scan rule sets** from the left pane, and then select **New**.
-
-1. From the **New scan rule set** page, select the data sources that the catalog scanner supports from the **Source Type** drop-down list. You can create a scan rule set for each type of data source you intend to scan.
-
-1. Give your scan rule set a **Name**. The maximum length is 63 characters, with no spaces allowed. Optionally, enter a **Description**. The maximum length is 256 characters.
-
- :::image type="content" source="./media/create-a-scan-rule-set/purview-home-page.png" alt-text="Screenshot showing the Scan rule set page." border="true":::
-
-1. Select **Continue**.
-
- The **Select file types** page appears. Notice that the file type options on this page vary based on the data source type that you chose on the previous page. All the file types are enabled by default.
-
- :::image type="content" source="./media/create-a-scan-rule-set/select-file-types-page.png" alt-text="Screenshot showing the Select file types page.":::
-
-    The **Document file types** selection on this page allows you to include or exclude the following Office file types: .doc, .docm, .docx, .dot, .odp, .ods, .odt, .pdf, .pot, .pps, .ppsx, .ppt, .pptm, .pptx, .xlc, .xls, .xlsb, .xlsm, .xlsx, and .xlt.
-
-1. Enable or disable a file type tile by selecting or clearing its check box. If you choose a Data Lake type data source (for example, Azure Data Lake Storage Gen2 or Azure Blob), enable the file types for which you want to have schema extracted and classified.
-
-1. For certain data source types, you can also [Create a custom file type](#create-a-custom-file-type).
-
-1. Select **Continue**.
-
-    The **Select classification rules** page appears. This page displays the selected **System rules** and **Custom rules**, and the total number of classification rules selected. By default, all the **System rules** check boxes are selected.
-
-1. For the rules you want to include or exclude, you can select or clear the **System rules** classification rule check boxes globally by category.
-
- :::image type="content" source="./media/create-a-scan-rule-set/select-classification-rules.png" alt-text="Screenshot showing the Select classification rules page.":::
-
-1. You can expand the category node and select or clear individual check boxes. For example, if the rule for **Argentina.DNI Number** has high false positives, you can clear that specific check box.
-
- :::image type="content" source="./media/create-a-scan-rule-set/select-system-rules.png" alt-text="Screenshot showing how to select system rules.":::
-
-1. Select **Create** to finish creating the scan rule set.
-
-## Create a custom file type
-
-Microsoft Purview supports adding a custom extension and defining a custom column delimiter in a scan rule set.
-
-To create a custom file type:
-
-1. Follow steps 1–5 in [Steps to create a scan rule set](#steps-to-create-a-scan-rule-set) or edit an existing scan rule set.
-
-1. On the **Select file types** page, select **New file type** to create a new custom file type.
-
- :::image type="content" source="./media/create-a-scan-rule-set/select-new-file-type.png" alt-text="Screenshot showing how to select New file type from the Select file types page.":::
-
-1. Enter a **File Extension** and an optional **Description**.
-
- :::image type="content" source="./media/create-a-scan-rule-set/new-custom-file-type-page.png" alt-text="Screenshot showing the New custom file type page." border="true":::
-
-1. Make one of the following selections for **File contents within** to specify the type of file contents within your file:
-
- - Select **Custom Delimiter** and enter your own **Custom delimiter** (single character only).
-
- - Select **System File Type** and choose a system file type (for example XML) from the **System file type** drop-down list.
-
-1. Select **Create** to save the custom file.
-
- The system returns to the **Select file types** page and inserts the new custom file type as a new tile.
-
- :::image type="content" source="./media/create-a-scan-rule-set/new-custom-file-type-tile.png" alt-text="Screenshot showing the new custom file type tile on the Select file types page.":::
-
-1. Select **Edit** in the new file type tile if you want to change or delete it.
-
-1. Select **Continue** to finish configuring the scan rule set.
-
-## Ignore patterns
-
-Microsoft Purview supports defining regular expressions (regex) to exclude assets during scanning. During scanning, Microsoft Purview compares each asset's URL against these regular expressions, and any asset that matches one of them is ignored.
-
-The **Ignore patterns** blade pre-populates one regex for Spark transaction files. You can remove the pre-existing pattern if you don't need it. You can define up to 10 ignore patterns.
--
-In the above example:
-
-- Regexes 2 and 3 ignore all files ending with .txt and .csv during scanning.
-- Regex 4 ignores /folderB/ and all its contents during scanning.
-
-Here are some more tips for working with ignore patterns:
-
-- While processing the regex, Microsoft Purview will add $ to the regex by default.
-- A good way to understand what URL the scanning agent will compare with your regular expression is to browse through the Microsoft Purview data catalog, find the asset you want to ignore in the future, and see its fully qualified name (FQN) on the **Overview** tab.
-
- :::image type="content" source="./media/create-a-scan-rule-set/fully-qualified-name.png" alt-text="Screenshot showing the fully qualified name on an asset's overview tab.":::
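-The matching behavior can be sketched as follows. The patterns here are illustrative (not the pre-populated Spark pattern), and the trailing `$` mirrors the default anchoring noted above:

```python
import re

# Illustrative ignore patterns, in the spirit of the examples above.
IGNORE_PATTERNS = [
    r".*\.txt",
    r".*\.csv",
    r".*/folderB/.*",
]

def is_ignored(asset_url: str) -> bool:
    """Compare an asset's fully qualified name against every pattern.

    Microsoft Purview appends $ to each regex by default, so the whole
    URL has to match, not just a prefix.
    """
    return any(re.match(pattern + "$", asset_url) for pattern in IGNORE_PATTERNS)

print(is_ignored("https://account.dfs.core.windows.net/data/notes.txt"))        # True
print(is_ignored("https://account.dfs.core.windows.net/folderB/part.parquet"))  # True
print(is_ignored("https://account.dfs.core.windows.net/data/part.parquet"))     # False
```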
-
-## System scan rule sets
-
-System scan rule sets are Microsoft-defined scan rule sets that are automatically created for each Microsoft Purview catalog. Each system scan rule set is associated with a specific data source type. When you create a scan, you can associate it with a system scan rule set. Every time Microsoft makes an update to these system rule sets, you can update them in your catalog, and apply the update to all the associated scans.
-
-1. To view the list of system scan rule sets, select **Scan rule sets** in the **Management Center** and choose the **System** tab.
-
- :::image type="content" source="./media/create-a-scan-rule-set/system-scan-rule-sets.jpg" alt-text="Screenshot showing the list of system scan rule sets." lightbox="./media/create-a-scan-rule-set/system-scan-rule-sets.jpg":::
-
-1. Each system scan rule set has a **Name**, **Source type**, and a **Version**. If you select the version number of a scan rule set in the **Version** column, you see the rules associated with the current version and the previous versions (if any).
-
- :::image type="content" source="./media/create-a-scan-rule-set/system-scan-rule-set-page.jpg" alt-text="Screenshot showing a system scan rule set page." lightbox="./media/create-a-scan-rule-set/system-scan-rule-set-page.jpg":::
-
-1. If an update is available for a system scan rule set, you can select **Update** in the **Version** column. In the system scan rule page, choose a version from the **Select a new version to update** drop-down list. The page provides a list of the system classification rules associated with the new and current versions.
-
- :::image type="content" source="./media/create-a-scan-rule-set/system-scan-rule-set-version.jpg" alt-text="Screenshot showing how to change the version of a system scan rule set." lightbox="./media/create-a-scan-rule-set/system-scan-rule-set-version.jpg":::
-
-### Associate a scan with a system scan rule set
-
-When you [create a scan](tutorial-scan-data.md#scan-data-into-the-catalog), you can choose to associate it with a system scan rule set as follows:
-
-1. On the **Select a scan rule set** page, select the system scan rule set.
-
- :::image type="content" source="./media/create-a-scan-rule-set/set-system-scan-rule-set.jpg" alt-text="Screenshot showing how to select a system scan rule set for a scan." lightbox="./media/create-a-scan-rule-set/set-system-scan-rule-set.jpg":::
-
-1. Select **Continue**, and then select **Save and Run**.
purview Create Microsoft Purview Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-microsoft-purview-dotnet.md
- Title: 'Quickstart: Create Microsoft Purview (formerly Azure Purview) account using .NET SDK'
-description: This article will guide you through creating a Microsoft Purview (formerly Azure Purview) account using .NET SDK.
-
-Previously updated: 06/17/2022
-
-# Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using .NET SDK
-
-In this quickstart, you'll use the [.NET SDK](/dotnet/api/overview/azure/purview) to create a Microsoft Purview (formerly Azure Purview) account.
-
-The Microsoft Purview governance portal surfaces tools like the Microsoft Purview Data Map and Microsoft Purview Data Catalog that help you manage and govern your data landscape. By connecting to data across your on-premises, multicloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
--
-### Visual Studio
-
-The walkthrough in this article uses Visual Studio 2019. The procedures for Visual Studio 2013, 2015, or 2017 may differ slightly.
-
-### Azure .NET SDK
-
-Download and install [Azure .NET SDK](https://azure.microsoft.com/downloads/) on your machine.
-
-## Create an application in Azure Active Directory
-
-1. In [Create an Azure Active Directory application](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal), create an application that represents the .NET application you're creating in this tutorial. For the sign-on URL, you can provide a dummy URL as shown in the article (`https://contoso.org/exampleapp`).
-1. In [Get values for signing in](../active-directory/develop/howto-create-service-principal-portal.md#sign-in-to-the-application), get the **application ID** and **tenant ID**, and note down these values that you use later in this tutorial.
-1. In [Certificates and secrets](../active-directory/develop/howto-create-service-principal-portal.md#set-up-authentication), get the **authentication key**, and note down this value that you use later in this tutorial.
-1. In [Assign the application to a role](../active-directory/develop/howto-create-service-principal-portal.md#assign-a-role-to-the-application), assign the application to the **Contributor** role at the subscription level so that the application can create Microsoft Purview accounts in the subscription.
-
-## Create a Visual Studio project
-
-Next, create a C# .NET console application in Visual Studio:
-
-1. Launch **Visual Studio**.
-2. In the Start window, select **Create a new project** > **Console App (.NET Framework)**. .NET version 4.5.2 or above is required.
-3. In **Project name**, enter **PurviewQuickStart**.
-4. Select **Create** to create the project.
-
-## Install NuGet packages
-
-1. Select **Tools** > **NuGet Package Manager** > **Package Manager Console**.
-2. In the **Package Manager Console** pane, run the following commands to install packages. For more information, see the [Microsoft.Azure.Management.Purview NuGet package](https://www.nuget.org/packages/Microsoft.Azure.Management.Purview/).
-
- ```powershell
- Install-Package Microsoft.Azure.Management.Purview
- Install-Package Microsoft.Azure.Management.ResourceManager -IncludePrerelease
- Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory
- ```
->[!TIP]
-> If you are getting an error that reads **Package \<package name> is not found in the following primary source(s):** and it lists a local folder, you need to update your package sources in Visual Studio to include the NuGet site as an online source.
-> 1. Go to **Tools**
-> 1. Select **NuGet Package Manager**
-> 1. Select **Package Manager Settings**
-> 1. Select **Package Sources**
-> 1. Add https://nuget.org/api/v2/ as a source.
-
-## Create a Microsoft Purview client
-
-1. Open **Program.cs**, include the following statements to add references to namespaces.
-
- ```csharp
- using System;
- using System.Collections.Generic;
- using System.Linq;
- using Microsoft.Rest;
- using Microsoft.Rest.Serialization;
- using Microsoft.Azure.Management.ResourceManager;
- using Microsoft.Azure.Management.Purview;
- using Microsoft.Azure.Management.Purview.Models;
- using Microsoft.IdentityModel.Clients.ActiveDirectory;
- ```
-
-2. Add the following code to the **Main** method that sets the variables. Replace the placeholders with your own values. For a list of Azure regions in which Microsoft Purview is currently available, search on **Microsoft Purview** and select the regions that interest you on the following page: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/).
-
- ```csharp
- // Set variables
- string tenantID = "<your tenant ID>";
- string applicationId = "<your application ID>";
- string authenticationKey = "<your authentication key for the application>";
- string subscriptionId = "<your subscription ID where the Microsoft Purview account will be created>";
- string resourceGroup = "<your resource group where the Microsoft Purview account will be created>";
- string region = "<the location of your resource group>";
- string purviewAccountName =
- "<specify the name of purview account to create. It must be globally unique.>";
- ```
-
-3. Add the following code to the **Main** method that creates an instance of **PurviewManagementClient** class. You use this object to create a Microsoft Purview Account.
-
- ```csharp
- // Authenticate and create a purview management client
- var context = new AuthenticationContext("https://login.windows.net/" + tenantID);
- ClientCredential cc = new ClientCredential(applicationId, authenticationKey);
- AuthenticationResult result = context.AcquireTokenAsync(
- "https://management.azure.com/", cc).Result;
- ServiceClientCredentials cred = new TokenCredentials(result.AccessToken);
- var client = new PurviewManagementClient(cred)
- {
- SubscriptionId = subscriptionId
- };
- ```
-
-## Create an account
-
-Add the following code to the **Main** method that creates the **Microsoft Purview account**.
-
-```csharp
-// Create a purview Account
-Console.WriteLine("Creating Microsoft Purview Account " + purviewAccountName + "...");
-Account account = new Account()
-{
-    Location = region,
-    Identity = new Identity(type: "SystemAssigned"),
-    Sku = new AccountSku(name: "Standard", capacity: 4)
-};
-try
-{
-    client.Accounts.CreateOrUpdate(resourceGroup, purviewAccountName, account);
-    Console.WriteLine(client.Accounts.Get(resourceGroup, purviewAccountName).ProvisioningState);
-}
-catch (ErrorResponseModelException purviewException)
-{
-    Console.WriteLine(purviewException.StackTrace);
-}
-Console.WriteLine(
-    SafeJsonConvert.SerializeObject(account, client.SerializationSettings));
-while (client.Accounts.Get(resourceGroup, purviewAccountName).ProvisioningState == "PendingCreation")
-{
-    System.Threading.Thread.Sleep(1000);
-}
-Console.WriteLine("\nPress any key to exit...");
-Console.ReadKey();
-```
-
-## Run the code
-
-Build and start the application, then verify the execution.
-
-The console prints the progress of creating the Microsoft Purview account.
-
-### Sample output
-
-```console
-Creating Microsoft Purview Account testpurview...
-Succeeded
-{
- "sku": {
- "capacity": 4,
- "name": "Standard"
- },
- "identity": {
- "type": "SystemAssigned"
- },
- "location": "southcentralus"
-}
-
-Press any key to exit...
-```
-
-## Verify the output
-
-Go to the **Microsoft Purview accounts** page in the [Azure portal](https://portal.azure.com) and verify the account created using the above code.
-
-## Delete Microsoft Purview account
-
-To programmatically delete a Microsoft Purview account, add the following lines of code to the program:
-
-```csharp
-Console.WriteLine("Deleting the Microsoft Purview Account");
-client.Accounts.Delete(resourceGroup, purviewAccountName);
-```
-
-## Check if Microsoft Purview account name is available
-
-To check the availability of a Microsoft Purview account name, use the following code:
-
-```csharp
-CheckNameAvailabilityRequest checkNameAvailabilityRequest = new CheckNameAvailabilityRequest()
-{
- Name = purviewAccountName,
- Type = "Microsoft.Purview/accounts"
-};
-Console.WriteLine("Check Microsoft Purview account name");
-Console.WriteLine(client.Accounts.CheckNameAvailability(checkNameAvailabilityRequest).NameAvailable);
-```
-
-The above code will print 'True' if the name is available and 'False' if it isn't.
-
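Before calling the service, a client-side pre-check can catch obviously invalid names. The Python sketch below is illustrative only: the allowed character set and the 3-63 character range are assumptions for the example, and `CheckNameAvailability` remains the authoritative check (it's the only way to verify global uniqueness).

```python
import re

def looks_like_valid_purview_name(name: str) -> bool:
    # Illustrative pre-check only: the assumed rule here is letters,
    # numbers, and hyphens, 3-63 characters. The service-side
    # CheckNameAvailability call remains the source of truth.
    return bool(re.fullmatch(r"[A-Za-z0-9-]{3,63}", name))

print(looks_like_valid_purview_name("testpurview"))   # True
print(looks_like_valid_purview_name("my purview!"))   # False
```

A check like this only saves a round trip for clearly bad input; a `True` result doesn't mean the name is actually available.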
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account, delete the account, and check for name availability. You can now download the .NET SDK and learn about other resource provider actions you can perform for a Microsoft Purview account.
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to the Microsoft Purview governance portal.
-
-* [How to use the Microsoft Purview governance portal](use-azure-purview-studio.md)
-* [Grant users permissions to the governance portal](catalog-permissions.md)
-* [Create a collection](quickstart-create-collection.md)
purview Create Microsoft Purview Portal Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-microsoft-purview-portal-faq.md
- Title: Create an exception to deploy Microsoft Purview
-description: This article describes how to create an exception to deploy Microsoft Purview while leaving existing Azure policies in place to maintain security.
- Previously updated: 06/02/2023
-# Create an exception to deploy Microsoft Purview
-
-Many subscriptions have [Azure Policies](../governance/policy/overview.md) in place that restrict the creation or update of some resources, to maintain subscription security and cleanliness. However, a Microsoft Purview account deploys an Azure Storage account when it's created. The storage account is managed by Azure, so you don't need to maintain it, but it's necessary for Microsoft Purview to run correctly. Existing policies may block this deployment, and you may receive an error when attempting to create a Microsoft Purview account.
-
-Microsoft Purview also regularly updates its Azure Storage account after creation, so any policies blocking updates to this storage account will cause errors during scanning.
-
-To maintain your policies in your subscription, but still allow the creation and updates to these managed resources, you can create an exception.
-
-## Create an Azure policy exception for Microsoft Purview
-
-1. Navigate to the [Azure portal](https://portal.azure.com) and search for **Policy**
-
- :::image type="content" source="media/create-purview-portal-faq/search-for-policy.png" alt-text="Screenshot showing the Azure portal search bar, searching for Policy keyword.":::
-
-1. Follow [Create a custom policy definition](../governance/policy/tutorials/create-custom-policy-definition.md) or modify existing policy to add two exceptions with `not` operator and `resourceBypass` tag:
-
- ```json
- {
- "mode": "All",
- "policyRule": {
- "if": {
- "anyOf": [
- {
- "allOf": [
- {
- "field": "type",
- "equals": "Microsoft.Storage/storageAccounts"
- },
- {
- "not": {
- "field": "tags['<resourceBypass>']",
- "exists": true
- }
- }]
- },
- {
- "allOf": [
- {
- "field": "type",
- "equals": "Microsoft.EventHub/namespaces"
- },
- {
- "not": {
- "field": "tags['<resourceBypass>']",
- "exists": true
- }
- }]
- }]
- },
- "then": {
- "effect": "deny"
- }
- },
- "parameters": {}
- }
- ```
-
- > [!Note]
-> The tag can be named anything besides `resourceBypass`. You'll create the tag either when creating the Microsoft Purview account, or you can add it to a previously created account on the **Tags** page for the resource in the Azure portal. The name can be whatever you choose, but when you define the tag, the value needs to be 'allowed'.
-
- :::image type="content" source="media/create-catalog-portal/policy-definition.png" alt-text="Screenshot showing how to create policy definition.":::
-
-1. [Create a policy assignment](../governance/policy/assign-policy-portal.md) using the custom policy created.
-
- :::image type="content" source="media/create-catalog-portal/policy-assignment.png" alt-text="Screenshot showing how to create policy assignment" lightbox="./media/create-catalog-portal/policy-assignment.png":::
-
-> [!Note]
-> If you have **Azure Policy** and need to add an exception as in **Prerequisites**, you need to add the correct tag. For example, you can add a `resourceBypass` tag. The name can be whatever you like, as long as the tag name matches the name in the policy, but the value must be "allowed".
-> :::image type="content" source="media/create-catalog-portal/add-purview-tag.png" alt-text="Add tag to Microsoft Purview account.":::
-
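To reason about what the custom definition above allows and denies, here's a small Python sketch (illustrative only, not part of Azure Policy) that mirrors its logic: a storage account or Event Hubs namespace is denied unless the bypass tag exists on the resource.

```python
# Resource types guarded by the custom policy definition above.
GUARDED_TYPES = {
    "Microsoft.Storage/storageAccounts",
    "Microsoft.EventHub/namespaces",
}

def policy_denies(resource_type: str, tags: dict, bypass_tag: str = "resourceBypass") -> bool:
    # Mirrors the policyRule: deny a guarded type when the bypass tag
    # does not exist on the resource. (The JSON rule only tests tag
    # existence; the 'allowed' value is part of the deployment guidance.)
    return resource_type in GUARDED_TYPES and bypass_tag not in tags

# A storage account without the tag is blocked...
print(policy_denies("Microsoft.Storage/storageAccounts", {}))  # True
# ...but the managed storage account carrying the tag is deployed.
print(policy_denies("Microsoft.Storage/storageAccounts", {"resourceBypass": "allowed"}))  # False
# Other resource types are untouched by this rule.
print(policy_denies("Microsoft.Compute/virtualMachines", {}))  # False
```

This is only a mental model of the rule; the real evaluation is done by the Azure Policy engine against the definition JSON.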
-## Next steps
-
-To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
purview Create Microsoft Purview Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-microsoft-purview-portal.md
- Title: 'Quickstart: Create a Microsoft Purview (formerly Azure Purview) account'
-description: This Quickstart describes how to create a Microsoft Purview (formerly Azure Purview) account and configure permissions to begin using it.
- Previously updated: 06/07/2023
-# Quickstart: Create an account in the Microsoft Purview governance portal
-
-This quickstart describes the steps to create a Microsoft Purview (formerly Azure Purview) account through the Azure portal. Then we'll get started on the process of classifying, securing, and discovering your data in the Microsoft Purview Data Map!
-
-The Microsoft Purview governance portal surfaces tools like the Microsoft Purview Data Map and Microsoft Purview Data Catalog that help you manage and govern your data landscape. By connecting to data across your on-premises, multicloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your data estate. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview governance services across your organization, [see our deployment best practices](deployment-best-practices.md).
--
-## Create an account
-
-> [!IMPORTANT]
-> If you have any [Azure Policies](../governance/policy/overview.md) preventing creation of **Storage accounts** or **Event Hub namespaces**, or preventing **updates to Storage accounts**, first follow the [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) to create an exception for Microsoft Purview accounts. Otherwise, you won't be able to deploy Microsoft Purview.
-
-1. Search for **Microsoft Purview** in the [Azure portal](https://portal.azure.com).
-
- :::image type="content" source="media/create-catalog-portal/purview-accounts-page.png" alt-text="Screenshot showing the purview accounts page in the Azure portal":::
-
-1. Select **Create** to create a new Microsoft Purview account.
-
- :::image type="content" source="media/create-catalog-portal/select-create.png" alt-text="Screenshot of the Microsoft Purview accounts page with the create button highlighted in the Azure portal.":::
-
- Or instead, you can go to the marketplace, search for **Microsoft Purview**, and select **Create**.
-
- :::image type="content" source="media/create-catalog-portal/search-marketplace.png" alt-text="Screenshot showing Microsoft Purview in the Azure Marketplace, with the create button highlighted.":::
-
-1. On the new Create Microsoft Purview account page under the **Basics** tab, select the Azure subscription where you want to create your account.
-
-1. Select an existing **resource group** or create a new one to hold your account.
-
- To learn more about resource groups, see our article on [using resource groups to manage your Azure resources](../azure-resource-manager/management/manage-resource-groups-portal.md#what-is-a-resource-group).
-
-1. Enter a **Microsoft Purview account name**. Spaces and symbols aren't allowed.
- The name of the Microsoft Purview account must be globally unique. If you see the following error, change the name of the Microsoft Purview account and try again.
-
- :::image type="content" source="media/create-catalog-portal/name-error.png" alt-text="Screenshot showing the Create Microsoft Purview account screen with an account name that is already in use, and the error message highlighted.":::
-
-1. Choose a **location**.
- The list shows only locations that support the Microsoft Purview governance portal. The location you choose will be the region where your Microsoft Purview account and metadata are stored. Sources can be housed in other regions.
-
- > [!Note]
- > Microsoft Purview, formerly Azure Purview, doesn't support moving accounts across regions, so be sure to deploy to the correct region. You can find more information in [move operation support for resources](../azure-resource-manager/management/move-support-resources.md).
-
-1. You can choose a name for your managed resource group. Microsoft Purview will create a managed storage account in this group that it will use during its processes.
-
-1. On the **Networking** tab you can choose to connect to all networks, or to use private endpoints. For more information and configuration options, see [Configure firewall settings for your Microsoft Purview account](catalog-firewall.md) and [private endpoints for Microsoft Purview](catalog-private-link.md).
-
-1. On the **Configuration** tab you can choose to configure Event Hubs namespaces to programmatically monitor your Microsoft Purview account using Event Hubs and Atlas Kafka.
- - [Steps to configure Event Hubs namespaces](configure-event-hubs-for-kafka.md)
- - [Send and receive Atlas Kafka topics messages](manage-kafka-dotnet.md)
-
- :::image type="content" source="media/create-catalog-portal/configure-kafka-event-hubs.png" alt-text="Screenshot showing the Event Hubs configuration page in the Create Microsoft Purview account window.":::
-
- >[!NOTE]
- > [These options can be configured after you have created your account](configure-event-hubs-for-kafka.md) in **Kafka configuration** under settings on your Microsoft Purview account page in the Azure portal.
- >
- > :::image type="content" source="media/create-catalog-portal/select-kafka-configuration.png" alt-text="Screenshot showing Kafka configuration option on the Microsoft Purview account page in the Azure Portal.":::
-
-1. On the **Tags** tab, add a tag called **Purview environment**, and give it one of the following values:
-
- |Value |Meaning |
- |-|--|
- |Production|This account is being used or will be used in the future to support all my cataloging and governance requirements in production.|
- |Pre-Production|This account is being used or will be used in the future to validate cataloging and governance requirements before making it available to my users in production.|
- |Test|This account is being used or will be used in the future to test out capabilities in Microsoft Purview Governance. |
- |Dev|This account is being used or will be used in the future to test out capabilities or develop custom code, scripts etc. in Microsoft Purview Governance.|
- |Proof of Concept|This account is being used or will be used in the future to test out capabilities or develop custom code, scripts etc. in Microsoft Purview Governance.|
-
- :::image type="content" source="media/create-catalog-portal/tag-resource.png" alt-text="Screenshot showing the tags tab, with a Purview environment tag added with the value Production.":::
-
-1. Select **Review & Create**, and then select **Create**. It takes a few minutes to complete the creation. The newly created account will appear in the list on your **Microsoft Purview accounts** page.
-
- :::image type="content" source="media/create-catalog-portal/create-resource.png" alt-text="Screenshot showing the Create Microsoft Purview account screen with the Review + Create button highlighted":::
-
-## Open the Microsoft Purview governance portal
-
-After your account is created, you'll use the Microsoft Purview governance portal to access and manage it. There are two ways to open the Microsoft Purview governance portal:
-
-- You can browse directly to [https://web.purview.azure.com](https://web.purview.azure.com), select your Microsoft Purview account name, and sign in to your workspace.
-- Alternatively, open your Microsoft Purview account in the [Azure portal](https://portal.azure.com). Select the **Open Microsoft Purview governance portal** tile on the overview page.
- :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Microsoft Purview account overview page, with the Microsoft Purview governance portal tile highlighted.":::
-
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account, and how to access it.
-
-Next, you can create a user-assigned managed identity (UAMI) that will enable your new Microsoft Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
-
-To create a UAMI, follow our [guide to create a user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to the Microsoft Purview Data Map:
-
-* [Using the Microsoft Purview governance portal](use-azure-purview-studio.md)
-* [Create a collection](quickstart-create-collection.md)
-* [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Create Microsoft Purview Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-microsoft-purview-powershell.md
- Title: 'Quickstart: Create a Microsoft Purview (formerly Azure Purview) account with PowerShell/Azure CLI'
-description: This Quickstart describes how to create a Microsoft Purview (formerly Azure Purview) account using Azure PowerShell/Azure CLI.
- Previously updated: 06/17/2022
-#Customer intent: As a data steward, I want create a new Microsoft Purview Account so that I can scan and classify my data.
-
-# Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using Azure PowerShell/Azure CLI
-
-In this Quickstart, you'll create a Microsoft Purview account using Azure PowerShell/Azure CLI. [PowerShell reference for Microsoft Purview](/powershell/module/az.purview/) is available, but this article will take you through all the steps needed to create an account with PowerShell.
-
-The Microsoft Purview governance portal surfaces tools like the Microsoft Purview Data Map and Microsoft Purview Data Catalog that help you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview governance services across your organization, [see our deployment best practices](deployment-best-practices.md).
--
-### Install PowerShell
-
- Install either Azure PowerShell or Azure CLI in your client machine to deploy the template: [Command-line deployment](../azure-resource-manager/templates/template-tutorial-create-first-template.md?tabs=azure-cli#command-line-deployment)
-
-## Create an account
-
-1. Sign in with your Azure credentials:
-
- # [PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- Connect-AzAccount
- ```
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az login
- ```
-
-
-
-1. If you have multiple Azure subscriptions, select the subscription you want to use:
-
- # [PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- Set-AzContext [SubscriptionID/SubscriptionName]
- ```
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az account set --subscription [SubscriptionID/SubscriptionName]
- ```
-
-
-
-1. Create a resource group for your account. You can skip this step if you already have one:
-
- # [PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzResourceGroup -Name myResourceGroup -Location 'East US'
- ```
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az group create \
- --name myResourceGroup \
- --location "East US"
- ```
-
-
-
-1. Create or deploy the account:
-
- # [PowerShell](#tab/azure-powershell)
-
- Use the [New-AzPurviewAccount](/powershell/module/az.purview/new-azpurviewaccount) cmdlet to create the Microsoft Purview account:
-
- ```azurepowershell
- New-AzPurviewAccount -Name yourPurviewAccountName -ResourceGroupName myResourceGroup -Location eastus -IdentityType SystemAssigned -SkuCapacity 4 -SkuName Standard -PublicNetworkAccess Enabled
- ```
-
- # [Azure CLI](#tab/azure-cli)
-
- 1. Create a Microsoft Purview template file such as `purviewtemplate.json`. You can update `name`, `location`, and `capacity` (`4` or `16`):
-
- ```json
- {
- "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "resources": [
- {
- "name": "<yourPurviewAccountName>",
- "type": "Microsoft.Purview/accounts",
- "apiVersion": "2020-12-01-preview",
- "location": "EastUs",
- "identity": {
- "type": "SystemAssigned"
- },
- "properties": {
- "networkAcls": {
- "defaultAction": "Allow"
- }
- },
- "dependsOn": [],
- "sku": {
- "name": "Standard",
- "capacity": "4"
- },
- "tags": {}
- }
- ],
- "outputs": {}
- }
- ```
-
- 1. Deploy Microsoft Purview template
-
- To run this deployment command, you must have the [latest version](/cli/azure/install-azure-cli) of Azure CLI.
-
- ```azurecli
- az deployment group create --resource-group "<myResourceGroup>" --template-file "<PATH TO purviewtemplate.json>"
- ```
-
-
-
-1. The deployment command returns results. Look for `ProvisioningState` to see whether the deployment succeeded.
-
-1. If you deployed the account using a service principal, instead of a user account, you'll also need to run the below command in the Azure CLI:
-
- ```azurecli
- az purview account add-root-collection-admin --account-name [Microsoft Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
- ```
-
- This command will grant the user account [collection admin](catalog-permissions.md#roles) permissions on the root collection in your Microsoft Purview account. This allows the user to access the Microsoft Purview governance portal and add permission for other users. For more information about permissions in Microsoft Purview, see our [permissions guide](catalog-permissions.md). For more information about collections, see our [manage collections article](how-to-create-and-manage-collections.md).
-
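Before running the deployment command, you can sanity-check the template locally. The sketch below is an illustrative helper (not an official validator, and `purview_capacities_ok` is a hypothetical name for this example): it only verifies that each `Microsoft.Purview/accounts` resource in the template declares a supported `sku.capacity` of 4 or 16, per the guidance above.

```python
import json

def purview_capacities_ok(template_text: str) -> bool:
    # Returns True when every Microsoft.Purview/accounts resource in the
    # ARM template declares a sku capacity of 4 or 16.
    template = json.loads(template_text)
    for resource in template.get("resources", []):
        if resource.get("type") == "Microsoft.Purview/accounts":
            capacity = int(resource.get("sku", {}).get("capacity", 0))
            if capacity not in (4, 16):
                return False
    return True

# Minimal template fragment mirroring purviewtemplate.json above:
template = '{"resources": [{"type": "Microsoft.Purview/accounts", "sku": {"name": "Standard", "capacity": "4"}}]}'
print(purview_capacities_ok(template))  # True
```

This catches a mistyped capacity before `az deployment group create` does; full template validation is still done by Azure Resource Manager at deployment time.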
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account.
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to the Microsoft Purview governance portal.
-
-* [How to use the Microsoft Purview governance portal](use-azure-purview-studio.md)
-* [Grant users permissions to the governance portal](catalog-permissions.md)
-* [Create a collection](quickstart-create-collection.md)
purview Create Microsoft Purview Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-microsoft-purview-python.md
- Title: 'Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using Python'
-description: This article will guide you through creating a Microsoft Purview (formerly Azure Purview) account using Python.
- Previously updated: 06/17/2022
-# Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using Python
-
-In this quickstart, you'll create a Microsoft Purview (formerly Azure Purview) account programmatically using Python. [The Python reference for Microsoft Purview](/python/api/azure-mgmt-purview/) is available, but this article will take you through all the steps needed to create an account with Python.
-
-The Microsoft Purview governance portal surfaces tools like the Microsoft Purview Data Map and Microsoft Purview Data Catalog that help you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
--
-## Install the Python package
-
-1. Open a terminal or command prompt with administrator privileges.
-2. First, install the Python package for Azure management resources:
-
- ```console
- pip install azure-mgmt-resource
- ```
-
-3. To install the Python package for Microsoft Purview, run the following command:
-
- ```console
- pip install azure-mgmt-purview
- ```
-
- The [Python SDK for Microsoft Purview](https://github.com/Azure/azure-sdk-for-python) supports Python 2.7, 3.3, 3.4, 3.5, 3.6 and 3.7.
-
-4. To install the Python package for Azure Identity authentication, run the following command:
-
- ```console
- pip install azure-identity
- ```
-
- > [!NOTE]
- > The "azure-identity" package might have conflicts with "azure-cli" on some common dependencies. If you encounter any authentication issues, remove "azure-cli" and its dependencies, or use a clean machine without the "azure-cli" package installed.
-
-## Create a purview client
-
-1. Create a file named **purview.py**. Add the following statements to add references to namespaces.
-
- ```python
- from azure.identity import ClientSecretCredential
- from azure.mgmt.resource import ResourceManagementClient
- from azure.mgmt.purview import PurviewManagementClient
- from azure.mgmt.purview.models import *
- from datetime import datetime, timedelta
- import time
- ```
-
-2. Add the following code to the **main** method that creates an instance of the PurviewManagementClient class. You'll use this object to create a purview account, delete purview accounts, check name availability, and perform other resource provider operations.
-
- ```python
- def main():
-
- # Azure subscription ID
- subscription_id = '<subscription ID>'
-
- # This program creates this resource group. If it's an existing resource group, comment out the code that creates the resource group
- rg_name = '<resource group>'
-
- # The purview name. It must be globally unique.
- purview_name = '<purview account name>'
-
- # Location name, where Microsoft Purview account must be created.
- location = '<location name>'
-
- # Specify your Active Directory client ID, client secret, and tenant ID
- credentials = ClientSecretCredential(client_id='<service principal ID>', client_secret='<service principal key>', tenant_id='<tenant ID>')
- resource_client = ResourceManagementClient(credentials, subscription_id)
- purview_client = PurviewManagementClient(credentials, subscription_id)
- ```
-
-## Create a purview account
-
-1. Add the following code to the **Main** method that creates a **purview account**. If your resource group already exists, comment out the first `create_or_update` statement.
-
- ```python
- # create the resource group
- # comment out if the resource group already exists
- resource_client.resource_groups.create_or_update(rg_name, {'location': location})
-
- # Create a purview account
- identity = Identity(type="SystemAssigned")
- sku = AccountSku(name='Standard', capacity=4)
- purview_resource = Account(identity=identity, sku=sku, location=location)
-
- try:
-     pa = purview_client.accounts.begin_create_or_update(rg_name, purview_name, purview_resource).result()
-     print("location:", pa.location, " Microsoft Purview Account Name:", pa.name, " Id:", pa.id, " tags:", pa.tags)
- except Exception:
-     print("Error in submitting the request to create the account")
-     raise
-
- while getattr(pa, 'provisioning_state') != "Succeeded":
-     pa = purview_client.accounts.get(rg_name, purview_name)
-     print(getattr(pa, 'provisioning_state'))
-     if getattr(pa, 'provisioning_state') == "Failed":
-         print("Error in creating Microsoft Purview account")
-         break
-     time.sleep(30)
- ```
-
-2. Now, add the following statement to invoke the **main** method when the program is run:
-
- ```python
- # Start the main method
- main()
- ```
-
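The polling loop above can be factored into a reusable helper. This is an illustrative sketch, not part of the Purview SDK: `get_state` is any zero-argument callable returning the current provisioning state string (for example, a lambda wrapping `purview_client.accounts.get(rg_name, purview_name).provisioning_state`).

```python
import time

def wait_for_provisioning(get_state, timeout_s=1800, interval_s=30):
    # Poll until the account reaches a terminal state ("Succeeded" or
    # "Failed"); raise TimeoutError if neither is seen within timeout_s.
    deadline = time.monotonic() + timeout_s
    while True:
        state = get_state()
        if state in ("Succeeded", "Failed"):
            return state
        if time.monotonic() >= deadline:
            raise TimeoutError("provisioning did not reach a terminal state")
        time.sleep(interval_s)

# Example with a canned state sequence standing in for the real client:
states = iter(["Creating", "Creating", "Succeeded"])
print(wait_for_provisioning(lambda: next(states), interval_s=0))  # Succeeded
```

Adding a timeout keeps a script from spinning forever if an account gets stuck in a non-terminal state.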
-## Full script
-
-Here's the full Python code:
-
-```python
-
-from azure.identity import ClientSecretCredential
-from azure.mgmt.resource import ResourceManagementClient
-from azure.mgmt.purview import PurviewManagementClient
-from azure.mgmt.purview.models import *
-import time
-
-# Azure subscription ID
-subscription_id = '<subscription ID>'
-
-# This program creates this resource group. If it's an existing resource group, comment out the code that creates the resource group
-rg_name = '<resource group>'
-
-# The purview name. It must be globally unique.
-purview_name = '<purview account name>'
-
-# Location where the Microsoft Purview account must be created.
-location = 'southcentralus'
-
-# Specify your Active Directory client ID, client secret, and tenant ID
-credentials = ClientSecretCredential(client_id='<service principal ID>', client_secret='<service principal key>', tenant_id='<tenant ID>')
-resource_client = ResourceManagementClient(credentials, subscription_id)
-purview_client = PurviewManagementClient(credentials, subscription_id)
-
-# create the resource group
-# comment out if the resource group already exists
-resource_client.resource_groups.create_or_update(rg_name, {'location': location})
-
-# Create a purview account
-identity = Identity(type="SystemAssigned")
-sku = AccountSku(name='Standard', capacity=4)
-purview_resource = Account(identity=identity, sku=sku, location=location)
-
-try:
-    pa = purview_client.accounts.begin_create_or_update(rg_name, purview_name, purview_resource).result()
-    print("location:", pa.location, " Microsoft Purview Account Name:", purview_name, " Id:", pa.id, " tags:", pa.tags)
-except Exception:
-    print("Error in submitting job to create account")
-    raise
-
-while getattr(pa, 'provisioning_state') != "Succeeded":
-    pa = purview_client.accounts.get(rg_name, purview_name)
-    print(getattr(pa, 'provisioning_state'))
-    if getattr(pa, 'provisioning_state') == "Failed":
-        print("Error in creating Microsoft Purview account")
-        break
-    time.sleep(30)
-```
-
-## Run the code
-
-Build and start the application. The console prints the progress of Microsoft Purview account creation. Wait until it's completed.
-Here's the sample output:
-
-```console
-location: southcentralus Microsoft Purview Account Name: purviewpython7 Id: /subscriptions/8c2c7b23-848d-40fe-b817-690d79ad9dfd/resourceGroups/Demo_Catalog/providers/Microsoft.Purview/accounts/purviewpython7 tags: None
-Creating
-Creating
-Succeeded
-```
-
-## Verify the output
-
-Go to the **Microsoft Purview accounts** page in the Azure portal and verify the account that you created with the preceding code.
-
-## Delete Microsoft Purview account
-
-To delete the Microsoft Purview account, add the following code to the program, and then run it:
-
-```python
-pa = purview_client.accounts.begin_delete(rg_name, purview_name).result()
-```
-
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account, delete the account, and check for name availability. You can now download the Python SDK and learn about other resource provider actions you can perform for a Microsoft Purview account.
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to the Microsoft Purview governance portal.
-
-* [How to use the Microsoft Purview governance portal](use-azure-purview-studio.md)
-* [Grant users permissions to the governance portal](catalog-permissions.md)
-* [Create a collection](quickstart-create-collection.md)
purview Create Sensitivity Label https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-sensitivity-label.md
- Title: Labeling in the Microsoft Purview Data Map
-description: Start utilizing sensitivity labels and classifications to enhance your Microsoft Purview assets
------ Previously updated : 04/22/2022--
-# Labeling in the Microsoft Purview Data Map
-
-> [!IMPORTANT]
-> Labeling in the Microsoft Purview Data Map is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
->
-
-To get work done, people in your organization collaborate with others both inside and outside the organization. Data doesn't always stay in your cloud, and often roams everywhere, across devices, apps, and services. When your data roams, you still want it to be secure in a way that meets your organization's business and compliance policies.
-
-Applying sensitivity labels to your content enables you to keep your data secure by stating how sensitive certain data is in your organization. It also abstracts the data itself, so you use labels to track the type of data, without exposing sensitive data on another platform.
-
-For example, applying a sensitivity label 'highly confidential' to a document that contains social security numbers and credit card numbers helps you identify the sensitivity of the document without knowing the actual data in the document.
-
-## Benefits of labeling in Microsoft Purview
-
-Microsoft Purview allows you to apply sensitivity labels to assets, enabling you to classify and protect your data.
-
-* **Label travels with the data:** The sensitivity labels created in Microsoft Purview Information Protection can also be extended to the Microsoft Purview Data Map, SharePoint, Teams, Power BI, and SQL. When you apply a label to an Office document and then scan it into the Microsoft Purview Data Map, the label is applied to the data asset. While the label is applied to the actual file in Microsoft Purview Information Protection, it's only added as metadata in the Microsoft Purview Data Map. While there are differences in how a label is applied to an asset across various services and applications, labels travel with the data and are recognized by all the services you extend them to.
-* **Overview of your data estate:** Microsoft Purview provides insights into your data through pre-canned reports. When you scan data into the Microsoft Purview Data Map, the reports are populated with information on what assets you have, scan history, classifications found in your data, labels applied, glossary terms, and more.
-* **Automatic labeling:** Labels can be applied automatically based on sensitivity of the data. When an asset is scanned for sensitive data, autolabeling rules are used to decide which sensitivity label to apply. You can create autolabeling rules for each sensitivity label, defining which classification/sensitive information type constitutes a label.
-* **Apply labels to files and database columns:** Labels can be applied to files in storage such as Azure Data Lake or Azure Files as well as to schematized data such as columns in Azure SQL Database.
-
-Sensitivity labels are tags that you can apply on assets to classify and protect your data. Learn more about [sensitivity labels here](/microsoft-365/compliance/create-sensitivity-labels).
-
-## How to apply labels to assets in the Microsoft Purview Data Map
-
-To apply labels to your assets in the data map, follow these steps:
-
-1. [Create new or apply existing sensitivity labels](how-to-automatically-label-your-content.md) in the Microsoft Purview compliance portal. Creating sensitivity labels includes defining autolabeling rules that tell Microsoft Purview which label should be applied based on the classifications found in your data.
-1. [Register and scan your asset](how-to-automatically-label-your-content.md#scan-your-data-to-apply-sensitivity-labels-automatically) in the Microsoft Purview Data Map.
-1. Microsoft Purview applies **classifications**: When you schedule a scan on an asset, Microsoft Purview scans the type of data in your asset and applies classifications to it in the data map. Classifications are applied automatically by Microsoft Purview; no action is required from you.
-1. Microsoft Purview applies **labels**: Once classifications are found on an asset, Microsoft Purview applies labels to the assets depending on autolabeling rules. Labels are applied automatically by Microsoft Purview; no action is required from you, as long as you have created labels with autolabeling rules in step 1.
-
-> [!NOTE]
-> Autolabeling rules are conditions that you specify, stating when a particular label should be applied. When these conditions are met, the label is automatically assigned to the data. When you create your labels, make sure to define autolabeling rules for both files and database columns to apply your labels automatically with each scan.
->
-
-## Supported data sources
-
-Sensitivity labels are supported in the Microsoft Purview Data Map for the following data sources:
-
-|Data type |Sources |
-|||
-|Automatic labeling for files | - Azure Blob Storage</br>- Azure Files</br>- Azure Data Lake Storage Gen 1 and Gen 2</br>- Amazon S3|
-|Automatic labeling for data assets | - SQL Server</br>- Azure SQL Database</br>- Azure SQL Managed Instance</br>- Azure Synapse Analytics workspaces</br>- Azure Cosmos DB for NoSQL</br> - Azure Database for MySQL</br> - Azure Database for PostgreSQL</br> - Azure Data Explorer</br> |
-
-## Labeling for SQL databases
-
-In addition to the Microsoft Purview Data Map's labeling for schematized data assets, Microsoft also supports labeling for SQL database columns using the SQL data classification in [SQL Server Management Studio (SSMS)](/sql/ssms/sql-server-management-studio-ssms). While Microsoft Purview uses the global [sensitivity labels](/microsoft-365/compliance/sensitivity-labels), SSMS only uses labels defined locally.
-
-Labeling in Microsoft Purview and labeling in SSMS are separate processes that don't currently interact with each other. Therefore, **labels applied in SSMS are not shown in Microsoft Purview, and vice versa**. We recommend Microsoft Purview for labeling SQL databases, because the labels can be applied globally, across multiple platforms.
-
-For more information, see the [SQL data discovery and classification documentation](/sql/relational-databases/security/sql-data-discovery-and-classification). </br></br>
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [How to automatically label your content](./how-to-automatically-label-your-content.md)
-
-> [!div class="nextstepaction"]
-> [Sensitivity label insights](sensitivity-insights.md)
-
-> [!div class="nextstepaction"]
-> [Labeling Frequently Asked Questions](sensitivity-labels-frequently-asked-questions.yml)
purview Create Service Principal Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-service-principal-azure.md
- Title: Create a service principal in Azure.
-description: This article describes how you can create a service principal in Azure for use with Microsoft Purview.
----- Previously updated : 03/24/2023
-# Customer intent: As an Azure AD Global Administrator or other roles such as Application Administrator, I need to create a new service principal, in order to register an application in the Azure AD tenant.
---
-# Creating a service principal for use with Microsoft Purview
-
-You can create a new service principal, or use an existing one, in your Azure Active Directory tenant to authenticate with other services.
-
-## App registration
-
-1. Navigate to the [Azure portal](https://portal.azure.com).
-1. Select **Azure Active Directory** from the left-hand side menu.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-aad.png" alt-text="Screenshot that shows the link to the Azure Active Directory.":::
-
-1. Select **App registrations** and **+ New registration**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-new-reg.png" alt-text="Screenshot that shows the link to New registration.":::
-
-1. Enter a name for the **application** (the service principal name).
-
-1. Select **Accounts in this organizational directory only**.
-
-1. For **Redirect URI**, select **Web** and enter any URL you want. If your organization has an authentication endpoint you want to use, enter it here. Otherwise, `https://example.com/auth` will do.
-
-1. Then select **Register**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-register.png" alt-text="Screenshot that shows the details for the new app registration.":::
-
-1. Copy the **Application (client) ID** value. We'll use this later to create a credential in Microsoft Purview.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-new-app.png" alt-text="Screenshot that shows the newly created application.":::
-
-## Adding a secret to the client credentials
-
-1. Select the app from the **App registrations**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-app-select.png" alt-text="Screenshot that shows the app for registration.":::
-
-1. Select **Add a certificate or secret**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-add-secret.png" alt-text="Screenshot that shows the app.":::
-
-1. Select **+ New client secret** under **Client secrets**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-new-client-secret.png" alt-text="Screenshot that shows the client secret menu.":::
-
-1. Provide a **Description** and set the **Expires** for the secret.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-secret-desc.png" alt-text="Screenshot that shows the client secret details.":::
-
-1. Copy the value of the **Secret value**. We'll use this later to create a secret in Azure Key Vault.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-client-secret.png" alt-text="Screenshot that shows the client secret.":::
-
-## Adding the secret to your Azure Key Vault
-
-To allow Microsoft Purview to use this service principal to authenticate with other services, you'll need to store this credential in Azure Key Vault.
-
-* If you need an Azure Key vault, you can [follow these steps to create one.](../key-vault/general/quick-create-portal.md)
-* To grant your Microsoft Purview account access to the Azure Key Vault, you can [follow these steps](manage-credentials.md#microsoft-purview-permissions-on-the-azure-key-vault).
-
-1. Navigate to your **Key vault**.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-key-vault.png" alt-text="Screenshot that shows the Key vault.":::
-
-1. Select **Settings** --> **Secrets** --> **+ Generate/Import**
-
-    :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-generate-secret.png" alt-text="Screenshot that shows the options in the Key vault.":::
-
-1. Enter a **Name** of your choice, and note it; you'll use it later to create a credential in Microsoft Purview.
-
-1. Enter the **Value** as the **Secret value** from your Service Principal.
-
- :::image type="content" source="media/create-service-principal-azure/create-service-principal-azure-sp-secret.png" alt-text="Screenshot that shows the Key vault to create a secret.":::
-
-1. Select **Create** to complete.
-
-## Create a credential for your secret in Microsoft Purview
-
-To enable Microsoft Purview to use this service principal to authenticate with other services, you'll need to follow these three steps.
-
-1. [Connect your Azure Key Vault to Microsoft Purview](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. [Grant your service principal authentication on your source](microsoft-purview-connector-overview.md) - Follow instructions on each source page to grant appropriate authentication.
-1. [Create a new credential in Microsoft Purview](manage-credentials.md#create-a-new-credential) - You'll use the service principal's application (client) ID and the name of the secret you created in your Azure Key Vault.
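To see how a client eventually uses the application (client) ID, client secret, and tenant ID, here's a minimal, standard-library-only sketch of the OAuth 2.0 client-credentials token request that SDKs such as `azure-identity` perform for you. The helper name and the Purview scope URL are illustrative assumptions; in practice you'd normally use `ClientSecretCredential` from `azure-identity` rather than hand-building the request:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret,
                        scope="https://purview.azure.net/.default"):
    """Prepare the URL and form body for an Azure AD client-credentials token request."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })
    return url, body

# POST the body to the URL (for example with urllib.request); the JSON
# response contains an access_token to send as a Bearer token.
```

Keeping the request construction separate from the network call makes the flow easy to inspect before any secret leaves your machine.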
purview Data Stewardship https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/data-stewardship.md
- Title: Data Stewardship Insights in Microsoft Purview
-description: This article describes the data stewardship dashboards in Microsoft Purview, and how they can be used to govern and manage your data estate.
------ Previously updated : 04/25/2023--
-# Get insights into data stewardship from Microsoft Purview
-
-As described in the [insights concepts](concept-insights.md), the data stewardship report is part of the "Health" section of the Data Estate Insights App. This report offers a one-stop-shop experience for data, governance, and quality focused users like chief data officers and data stewards to get actionable insights into key gaps in their data estate.
-
-In this guide, you'll learn how to:
-
-> [!div class="checklist"]
-> * Navigate to and view the data stewardship report from your Microsoft Purview account.
-> * Drill down for more asset count details.
-
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights, make sure that you've completed the following steps:
-
-* Set up a storage resource and populated the account with data.
-
-* Set up and completed a scan of your storage source.
-
-* [Enable and schedule your data estate insights reports](how-to-schedule-data-estate-insights.md).
-
-For more information about creating and completing a scan, see [the manage data sources in Microsoft Purview article](manage-data-sources.md).
-
-## Understand your data estate and catalog health in Data Estate Insights
-
-In Microsoft Purview Data Estate Insights, you can get an overview of all assets inventoried in the Data Map, and any key gaps that can be closed by governance stakeholders, for better governance of the data estate.
-
-1. Access the [Microsoft Purview Governance Portal](https://web.purview.azure.com/) and open your Microsoft Purview account.
-
-1. On the Microsoft Purview **Home** page, select **Data Estate Insights** on the left menu.
-
- :::image type="content" source="./media/data-stewardship/view-insights.png" alt-text="Screenshot of the Microsoft Purview governance portal with the Data Estate Insights button highlighted in the left menu.":::
-
-1. In the **Data Estate Insights** area, look for **Data Stewardship** in the **Health** section.
-
- :::image type="content" source="./media/data-stewardship/data-stewardship-table-of-contents.png" alt-text="Screenshot of the Microsoft Purview governance portal Data Estate Insights menu with Data Stewardship highlighted under the Health section.":::
-
-## View data stewardship dashboard
-
-The dashboard is purpose-built for governance and quality focused users, like data stewards and chief data officers, to understand the data estate health of their organization. The dashboard shows high-level KPIs that help reduce governance risks:
-
-* **Asset curation**: All data assets are categorized into three buckets - "Fully curated", "Partially curated" and "Not curated", based on certain attributes of assets being present. An asset is "Fully curated" if it has at least one classification tag, an assigned Data Owner and a description. If any of these attributes is missing, but not all, then the asset is categorized as "Partially curated" and if all of them are missing, then it's "Not curated".
-* **Asset data ownership**: Assets whose owner attribute within the "Contacts" tab is blank are categorized as "No owner"; otherwise they're categorized as "Owner assigned".
-
- :::image type="content" source="./media/data-stewardship/kpis-small.png" alt-text="Screenshot of the data stewardship insights summary graphs, showing the three main KPI charts." lightbox="media/data-stewardship/data-stewardship-kpis-large.png":::
-
-### Data estate health
-
-Data estate health is a scorecard view that helps management and governance focused users, like chief data officers, understand critical governance metrics that can be viewed by collection hierarchy.
-
-You can view the following metrics:
-* **Assets**: Count of assets by collection drill-down
-* **With sensitive classifications**: Count of assets with any system classification applied
-* **Fully curated assets**: Count of assets that have a data owner, at least one classification and a description.
-* **Owner assigned**: Count of assets with data owner assigned on them
-* **No classifications**: Count of assets with no classification tag
-* **Out of date**: Percentage of assets that have not been updated in over 365 days.
-* **New**: Count of new assets pushed in the Data Map in the last 30 days
-* **Updated**: Count of assets updated in the Data Map in the last 30 days
-* **Deleted**: Count of deleted assets from the Data Map in the last 30 days
-
-You can also drill down by collection paths. As you hover over each column name, it provides a description of the column, provides recommended percentage ranges, and takes you to the detailed graph for further drill-down.
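As an illustration of how the **Out of date** metric is defined (assets not updated in over 365 days), here's a small sketch in Python; the function and its inputs are hypothetical, not a Purview API:

```python
from datetime import datetime, timedelta

def out_of_date_percentage(last_updated, now=None, threshold_days=365):
    """Percentage of assets whose last update is older than the threshold."""
    if not last_updated:
        return 0.0
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=threshold_days)
    # Count assets whose last-updated timestamp predates the cutoff.
    stale = sum(1 for ts in last_updated if ts < cutoff)
    return 100.0 * stale / len(last_updated)
```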
---
-### Asset curation
-
-All data assets are categorized into three buckets - ***Fully curated***, ***Partially curated*** and ***Not curated***, based on whether assets have been given certain attributes.
--
-An asset is ***Fully curated*** if it has at least one classification tag, an assigned data owner, and a description.
-
-If any of these attributes is missing, but not all, then the asset is categorized as ***Partially curated***. If all of them are missing, then it's listed as ***Not curated***.
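The bucketing rule above can be stated compactly in code. This is an illustrative sketch of the rule only, not a Purview API:

```python
def curation_level(has_classification, has_owner, has_description):
    """Bucket an asset the way the Asset curation chart does."""
    # Count how many of the three curation attributes are present.
    present = sum(map(bool, (has_classification, has_owner, has_description)))
    if present == 3:
        return "Fully curated"
    if present == 0:
        return "Not curated"
    return "Partially curated"
```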
-
-You can drill down by collection hierarchy.
--
-For further information about which assets aren't fully curated, you can select the **View details** link, which takes you to a deeper view.
--
-In the **View details** page, if you select a specific collection, it lists all assets, with the attribute values (or blanks) that determine whether each asset is ***fully curated***.
--
-The detail page shows two key pieces of information:
-
-First, it tells you the ***classification source***, if the asset is classified. The source is **Manual** if a curator or data owner applied the classification manually. It's **Automatic** if it was classified during scanning. This page only provides the last applied classification state.
-
-Second, if an asset is unclassified, the ***Reasons for unclassified*** column tells you why it isn't classified.
-Currently, Data Estate Insights can report one of the following reasons:
-
-* No match found
-* Low confidence score
-* Not applicable
--
-You can select any asset and add missing attributes, without leaving the **Data estate insights** app.
--
-### Trends and gap analysis
-
-This graph shows how the assets and key metrics have been trending over the following periods:
-
-* Last 30 days: The graph takes the last run of the day, or the recording of the last run across days, as a data point.
-* Last six weeks: The graph takes the last run of the week, where the week ends on Sunday. If there was no run on Sunday, it takes the last recorded run.
-* Last 12 months: The graph takes the last run of the month.
-* Last four quarters: The graph takes the last run of the calendar quarter.
--
-## Next steps
-
-Learn more about Microsoft Purview Data Estate Insights through:
-* [Data Estate Insights Concepts](concept-insights.md)
-* [Catalog adoption insights](catalog-adoption-insights.md)
purview Deployment Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/deployment-best-practices.md
- Title: 'Deployment best practices for Microsoft Purview (formerly Azure Purview)'
-description: This article provides best practices for deploying Microsoft Purview (formerly Azure Purview) in your data estate. The Microsoft Purview Data Map and governance portal enable any user to register, discover, understand, and consume data sources.
---- Previously updated : 03/21/2023-
-# Microsoft Purview (formerly Azure Purview) deployment best practices
-
-This article is a guide to successfully deploying Microsoft Purview (formerly Azure Purview) into production in your data estate. It's intended to help you strategize and phase your deployment from research to hardening your production environment, and is best used in tandem with our [deployment checklist](tutorial-azure-purview-checklist.md).
-
->[!NOTE]
->These best practices cover the deployment for [Microsoft Purview unified data governance solutions](/purview/purview#microsoft-purview-unified-data-governance-solutions). For more information about Microsoft Purview risk and compliance solutions, [go here](/microsoft-365/compliance/). For more information about Microsoft Purview in general, [go here](/purview/purview).
-
-If you're looking for a strictly technical deployment guide, use the [deployment checklist](tutorial-azure-purview-checklist.md).
-
-If you're creating a plan to deploy Microsoft Purview and want to consider best practices as you develop your deployment strategy, then follow the article below. This guide outlines tasks that can be completed in phases over the course of a month or more to develop your deployment process for Microsoft Purview. Even organizations that have already deployed Microsoft Purview can use this guide to ensure they're getting the most out of their investment.
-
-A well-planned deployment of your data governance platform can give the following benefits:
-
-- Better data discovery
-- Improved analytic collaboration
-- Maximized return on investment
-
-This guide provides insight on a full deployment lifecycle, from initial planning to a mature environment by following these stages:
-
-| Stage | Description |
-|-|-|
-|[Identify objectives and goals](#identify-objectives-and-goals)|Consider what your entire organization wants and needs from data governance.|
-|[Gathering questions](#gathering-questions)|What questions might you and your team have as you get started, and where can you look to begin addressing them?|
-|[Deployment models](#deployment-models)|Customize your Microsoft Purview deployment to your data estate.|
-|[Create a process to move to production](#create-a-process-to-move-to-production)|Create a phased deployment strategy tailored to your organization.|
-|[Platform hardening](#platform-hardening)|Continue to grow your deployment to maturity.|
-
-Many of Microsoft Purview's applications and features have their own individual best practices pages as well. They're referenced often throughout this deployment guide, but you can find all of them in the table of contents under **Concepts** and then **Best practices and guidelines**.
-
-## Identify objectives and goals
-
-Many organizations have started their data governance journey by developing individual solutions that cater to specific requirements of isolated groups and data domains across the organization. Although experiences may vary depending on the industry, product, and culture, most organizations find it difficult to maintain consistent controls and policies for these types of solutions.
-
-Some of the common data governance objectives that you might want to identify in the early phases to create a comprehensive data governance experience include:
-
-* Maximizing the business value of your data
-* Enabling a data culture where data consumers can easily find, interpret, and trust data
-* Increasing collaboration amongst various business units to provide a consistent data experience
-* Fostering innovation by accelerating data analytics to reap the benefits of the cloud
-* Decreasing time to discover data through self-service options for various skill groups
-* Reducing time-to-market for the delivery of analytics solutions that improve service to their customers
-* Reducing the operational risks that are due to the use of domain-specific tools and unsupported technology
-
-The general approach is to break down those overarching objectives into various categories and goals. Some examples are:
-
-|Category|Goal|
-|||
-|Discovery|Admin users should be able to scan Azure and non-Azure data sources (including on-premises sources) to gather information about the data assets automatically.|
-|Classification|The platform should automatically classify data based on a sampling of the data and allow manual override using custom classifications.|
-|Consumption|The business users should be able to find information about each asset for both business and technical metadata.|
-|Lineage|Each asset must show a graphical view of underlying datasets so that the users understand the original sources and what changes have been made.|
-|Collaboration|The platform must allow users to collaborate by providing additional information about each data asset.|
-|Reporting|The users must be able to view reporting on the data estate including sensitive data and data that needs extra enrichment.|
-|Data governance|The platform must allow the admin to define policies for access control and automatically enforce the data access based on each user.|
-|Workflow|The platform must have the ability to create and modify workflows so that it's easy to scale out and automate various tasks within the platform.|
-|Integration|Other third-party technologies such as ticketing or orchestration must be able to integrate into the platform via script or REST APIs.|
-
-### Identify key scenarios
-
-Microsoft Purview governance services can be used to centrally manage data governance across an organization's data estate spanning cloud and on-premises environments. To have a successful implementation, you must identify key scenarios that are critical to the business. These scenarios can cross business unit boundaries or affect multiple user personas either upstream or downstream.
-
-These scenarios can be written up in various ways, but you should include at least these five dimensions:
-
-1. Persona - Who are the users?
-2. Source system - What are the data sources, such as Azure Data Lake Storage Gen2 or Azure SQL Database?
-3. Impact area - What is the category of this scenario?
-4. Detail scenarios - How do the users use Microsoft Purview to solve problems?
-5. Expected outcome - What are the success criteria?
-
-The scenarios must be specific, actionable, and executable with measurable results. Some example scenarios that you can use:
-
-|Scenario|Detail|Persona|
-||||
-|Catalog business-critical assets|I need to have information about each data set to have a good understanding of what it is. This scenario includes both business and technical metadata about the data set in the catalog. The data sources include Azure Data Lake Storage Gen2, Azure Synapse DW, and/or Power BI. This scenario also includes on-premises resources such as SQL Server.|Business Analyst, Data Scientist, Data Engineer|
-|Discover business-critical assets|I need to have a search engine that can search through all metadata in the catalog. I should be able to search using technical terms or business terms, with either simple or complex searches that use wildcards.|Business Analyst, Data Scientist, Data Engineer, Data Admin|
-|Track data to understand its origin and troubleshoot data issues|I need to have data lineage to track data in reports, predictions, or models back to its original source. I also need to understand the changes made to the data, and where the data has resided throughout the data life cycle. This scenario needs to support prioritized data pipelines such as Azure Data Factory and Databricks.|Data Engineer, Data Scientist|
-|Enrich metadata on critical data assets|I need to enrich the data set in the catalog with technical metadata that is generated automatically. Classification and labeling are some examples.|Data Engineer, Domain/Business Owner|
-|Govern data assets with friendly user experience|I need to have a Business glossary for business-specific metadata. The business users can use Microsoft Purview for self-service scenarios to annotate their data and enable the data to be discovered easily via search.|Domain/Business Owner, Business Analyst, Data Scientist, Data Engineer|
-
-### Integration points with Microsoft Purview
-
-It's likely that a mature organization already has an existing data catalog. The key question is whether to continue to use the existing technology and sync with the Microsoft Purview Data Map and Data Catalog or not. To handle syncing with existing products in an organization, [Microsoft Purview provides Atlas REST APIs](tutorial-using-rest-apis.md). Atlas APIs provide a powerful and flexible mechanism for handling both push and pull scenarios. Information can be published to Microsoft Purview using Atlas APIs for bootstrapping, or to push the latest updates from another system into Microsoft Purview. The information available in Microsoft Purview can also be read using Atlas APIs and then synced back to existing products.
-
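As a concrete illustration of the pull side, a client can read an entity from the Data Map through the Atlas V2 surface. The sketch below only builds the request URL; the endpoint shape follows the commonly documented `{account}.purview.azure.com/catalog` pattern, so verify paths and authentication against the current REST reference before relying on them:

```python
def atlas_entity_url(purview_account, guid):
    """Build the Atlas V2 'get entity by GUID' URL for a Microsoft Purview account."""
    endpoint = f"https://{purview_account}.purview.azure.com"
    return f"{endpoint}/catalog/api/atlas/v2/entity/guid/{guid}"

# The request must carry a Bearer token issued for the Purview resource.
```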
-For other integration scenarios such as ticketing, custom user interface, and orchestration you can use Atlas APIs and Kafka endpoints. In general, there are four integration points with Microsoft Purview:
-
-* **Data Asset** - This enables Microsoft Purview to scan a store's assets in order to enumerate what those assets are and collect any readily available metadata about them. So for SQL this could be a list of DBs, tables, stored procedures, views, and config data about them kept in places like `sys.tables`. For something like Azure Data Factory (ADF) this could be enumerating all the pipelines and getting data on when they were created, when they last ran, and their current state.
-* **Lineage** - This enables Microsoft Purview to collect information from an analysis/data mutation system on how data is moving around. For something like Spark this could be gathering information from the execution of a notebook to see what data the notebook ingested, how it transformed it, and where it output it. For something like SQL, it could be analyzing query logs to reverse engineer what mutation operations were executed and what they did. We support both push and pull based lineage depending on the needs.
-* **Classification** - This enables Microsoft Purview to take physical samples from data sources and run them through our classification system. The classification system figures out the semantics of a piece of data. For example, we may know that a file is a Parquet file and has three columns and the third one is a string. But the classifiers we run on the samples will tell us that the string is a name, address, or phone number. Lighting up this integration point means that we've defined how Microsoft Purview can open up objects like notebooks, pipelines, Parquet files, tables, and containers.
-* **Embedded Experience** - Products that have a "studio" like experience (such as ADF, Synapse, SQL Studio, PBI, and Dynamics) usually want to enable users to discover data they want to interact with and also find places to output data. Microsoft Purview's catalog can help to accelerate these experiences by providing an embedded experience. This experience can occur at the API or the UX level at the partner's option. By embedding a call to Microsoft Purview, the organization can take advantage of Microsoft Purview's map of the data estate to find data assets, see lineage, check schemas, look at ratings, contacts, etc.
-
-## Gathering questions
-
-Once your organization agrees on the high-level objectives and goals, there will be many questions from multiple groups. It's crucial to gather these questions in order to craft a plan to address all of the concerns. Make sure to [include relevant groups](#include-the-right-stakeholders) as you gather these questions. You can use our documentation to start answering them.
-
-Some example questions that you may run into during the initial phase:
-
-- What are the main data sources and data systems in our organization?
-- [What data sources are supported?](microsoft-purview-connector-overview.md)
-- For data sources that aren't supported yet by Microsoft Purview, what are my options?
-- [How should we budget for Microsoft Purview?](concept-guidelines-pricing.md)
-- [How many Microsoft Purview instances do we need?](concept-best-practices-accounts.md)
-- [Who will use Microsoft Purview, and what roles will they have?](catalog-permissions.md)
-- [Who can scan new data sources?](catalog-permissions.md)
-- [Who can modify content inside of Microsoft Purview?](catalog-permissions.md)
-- What processes can we use to improve the data quality in Microsoft Purview?
-- How do we bootstrap the platform with existing critical assets, [glossary terms](concept-best-practices-glossary.md), and contacts?
-- How do we integrate with existing systems?
-- [How can we secure Microsoft Purview?](concept-best-practices-security.md)
-- How can we gather feedback and build a sustainable process?
-- [What can we do in a disaster situation?](concept-best-practices-migration.md)
-- [We're already using Azure Data Catalog. Can we migrate to Microsoft Purview?](../data-catalog/data-catalog-migration-to-azure-purview.md)
-
-Even if you might not have the answer to most of these questions right away, gathering questions can help your organization to frame this project and ensure all "must-have" requirements can be met.
-
-### Include the right stakeholders
-
-To ensure the success of implementing Microsoft Purview for your entire organization, it's important to involve the right stakeholders. Only a few people are involved in the initial phase. However, as the scope expands, you'll require more personas to contribute to the project and provide feedback.
-
-Some key stakeholders that you may want to include:
-
-|Persona|Roles|
-|||
-|**Chief Data Officer**|The CDO oversees a range of functions that may include data management, data quality, master data management, data science, business intelligence, and creating data strategy. They can be the sponsor of the Microsoft Purview implementation project.|
-|**Domain/Business Owner**|A business person who influences usage of tools and has budget control|
-|**Data Analyst**|Able to frame a business problem and analyze data to help leaders make business decisions|
-|**Data Architect**|Design databases for mission-critical line-of-business apps along with designing and implementing data security|
-|**Data Engineer**|Operate and maintain the data stack, pull data from different sources, integrate and prepare data, set up data pipelines|
-|**Data Scientist**|Build analytical models and set up data products to be accessed by APIs|
-|**DB Admin**|Own, track, and resolve database-related incidents and requests within service-level agreements (SLAs); May set up data pipelines|
-|**DevOps**|Line-of-Business application development and implementation; may include writing scripts and orchestration capabilities|
-|**Data Security Specialist**|Assess overall network and data security, which involves data coming in and out of Microsoft Purview|
-
-## Deployment models
-
-If you have only one small group using Microsoft Purview with basic consumption use cases, the approach could be as simple as having one Microsoft Purview instance to service the entire group. However, you may also wonder whether your organization needs more than one Microsoft Purview instance, and if you use multiple instances, how employees can promote assets from one stage to another.
-
-### Determine the number of Microsoft Purview instances
-
-In most cases, there should only be one Microsoft Purview (formerly Azure Purview) account for the entire organization. This approach takes maximum advantage of the "network effects" where the value of the platform increases exponentially as a function of the data that resides inside the platform.
-
-However, there are exceptions to this pattern:
-
-1. **Testing new configurations** – Organizations may want to create multiple instances for testing out scan configurations or classifications in isolated environments. Although there's a "versioning" feature in some areas of the platform, such as the glossary, it's easier to have a "disposable" instance to test freely.
-2. **Separating Test, Pre-production, and Production** – Organizations may want to create different platforms for different kinds of data stored in different environments. This isn't recommended, because those kinds of data are different content types; instead, you could use a glossary term at the top hierarchy level, or a category, to segregate content types.
-3. **Conglomerates and federated model** – Conglomerates often have many business units (BUs) that operate separately and, in some cases, won't even share billing with each other. In those cases, the organization ends up creating a Microsoft Purview instance for each BU. This model isn't ideal, but may be necessary, especially because BUs are often not willing to share billing.
-4. **Compliance** – Some strict compliance regimes treat even metadata as sensitive and require it to be kept in a specific geography. If a company has multiple geographies, the only solution is to have multiple Microsoft Purview instances, one for each geography.
-
-For more information, see our [accounts architecture best practices guide](concept-best-practices-accounts.md) and our [default account guide](concept-default-purview-account.md).
-
-## Create a process to move to production
-
-Some organizations may decide to keep things simple by working with a single production version of Microsoft Purview. They probably don't need to go beyond discovery, search, and browse scenarios. However, most organizations that want to deploy Microsoft Purview across various business units will want some form of process and control.
-
-Below is a potential four-phase deployment plan that includes tasks, helpful links, and acceptance criteria for each phase:
-
-1. [Phase 1: Pilot](#phase-1-pilot)
-1. [Phase 2: Minimum viable product](#phase-2-minimum-viable-product)
-1. [Phase 3: Pre-production](#phase-3-pre-production)
-1. [Phase 4: Production](#phase-4-production)
-
-### Phase 1: Pilot
-
-In this phase, Microsoft Purview must be created and configured for a small set of users. Usually, it's just a group of 2-3 people working together to run through end-to-end scenarios. They're considered the advocates of Microsoft Purview in their organization. The main goal of this phase is to ensure key functionalities can be met and the right stakeholders are aware of the project.
-
-#### Tasks to complete
-
-|Task|Detail|Duration|
-||||
-|Gather & agree on requirements|Discussion with all stakeholders to gather a full set of requirements. Different personas must participate to agree on a subset of requirements to complete for each phase of the project.|One Week|
-|[Navigating the Microsoft Purview governance portal](use-azure-purview-studio.md)|Understand how to use Microsoft Purview from the home page.|One Day|
-|[Configure ADF for lineage](how-to-link-azure-data-factory.md)|Identify key pipelines and data assets. Gather all information required to connect to an internal ADF account.|One Day|
-|Scan a data source such as [Azure Data Lake Storage Gen2](register-scan-adls-gen2.md) or a [SQL Server](tutorial-register-scan-on-premises-sql-server.md)|Add the data source and set up a scan. Ensure the scan successfully detects all assets.|Two Days|
-|[Search](how-to-search-catalog.md) and [browse](how-to-browse-catalog.md)|Allow end users to access Microsoft Purview and perform end-to-end search and browse scenarios.|One Day|
-
-#### Other helpful links
-
-- [Create a Microsoft Purview account](create-catalog-portal.md)
-- [Create a collection](quickstart-create-collection.md)
-- [Concept: Permissions and access](catalog-permissions.md)
-- [Microsoft Purview product glossary](reference-azure-purview-glossary.md)
-
-#### Acceptance criteria
-
-* The Microsoft Purview account is created successfully in the organization's subscription under the organization's tenant.
-* A small group of users with multiple roles can access Microsoft Purview.
-* Microsoft Purview is configured to scan at least one data source.
-* Users should be able to extract key values of Microsoft Purview such as:
- * Search and browse
- * Lineage
-* Users should be able to assign asset ownership in the asset page.
-* Presentation and demo to raise awareness to key stakeholders.
-* Buy-in from management to approve more resources for MVP phase.
-
-### Phase 2: Minimum viable product
-
-Once you have agreed on requirements and identified the business units that will onboard to Microsoft Purview, the next step is to work on a Minimum Viable Product (MVP) release. In this phase, you'll expand the usage of Microsoft Purview to more users who will have more needs, both horizontally and vertically. There will be key scenarios that must be met horizontally for all users, such as glossary terms, search, and browse. There will also be in-depth vertical requirements for each business unit or group to cover specific end-to-end scenarios, such as lineage from Azure Data Lake Storage to Azure Synapse DW to Power BI.
-
-#### Tasks to complete
-
-|Task|Detail|Duration|
-||||
-|[Scan Azure Synapse Analytics](register-scan-azure-synapse-analytics.md)|Start to onboard your database sources and scan them to populate key assets|Two Days|
-|[Create custom classifications and rules](create-a-custom-classification-and-classification-rule.md)|Once your assets are scanned, your users may realize that there are use cases for more classifications besides the default classifications from Microsoft Purview.|2-4 Weeks|
-|[Scan Power BI](register-scan-power-bi-tenant.md)|If your organization uses Power BI, you can scan Power BI in order to gather all data assets being used by data scientists or data analysts that have requirements to include lineage from the storage layer.|1-2 Weeks|
-|[Import glossary terms](how-to-import-export-glossary.md)|In most cases, your organization may have already developed a collection of glossary terms and term assignments to assets. These require an import process into Microsoft Purview via a .csv file.|One Week|
-|Add contacts to assets|For top assets, you may want to establish a process to either allow other personas to assign contacts or import via REST APIs.|One Week|
-|[Add sensitive labels and scan](how-to-automatically-label-your-content.md)|This might be optional for some organizations, depending on the usage of Labeling from Microsoft 365.|1-2 Weeks|
-|[Get classification and sensitive insights](concept-insights.md)|For reporting and insight in Microsoft Purview, you can access this functionality to get various reports and provide presentation to management.|One Day|
-|Onboard more users using Microsoft Purview managed users|This step will require the Microsoft Purview Admin to work with the Azure Active Directory Admin to establish new Security Groups to grant access to Microsoft Purview.|One Week|
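
To sketch the glossary import step above: the import runs off a .csv file, which a script can generate from an existing term list. The column names and terms below are illustrative; export a template from your own account to confirm the exact header your import expects.

```python
import csv
import io

# Illustrative glossary terms. The header row mimics the shape of a
# glossary import template, but verify the exact columns by exporting
# a template from your own Microsoft Purview account first.
terms = [
    {"Name": "Customer", "Definition": "A person or company that buys goods.", "Status": "Approved"},
    {"Name": "Order", "Definition": "A request to purchase one or more products.", "Status": "Draft"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["Name", "Definition", "Status"])
writer.writeheader()
writer.writerows(terms)

glossary_csv = buffer.getvalue()
print(glossary_csv)
```

The resulting file would then be uploaded through the glossary import experience in the governance portal.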
-
-#### Other helpful links
-
-- [Collections architecture best practices](concept-best-practices-collections.md)
-- [Classification best practices](concept-best-practices-classification.md)
-- [Glossary best practices](concept-best-practices-glossary.md)
-- [Labeling best practices](concept-best-practices-sensitivity-labels.md)
-- [Asset lifecycle best practices](concept-best-practices-asset-lifecycle.md)
-
-#### Acceptance criteria
-
-* Successfully onboard a larger group of users to Microsoft Purview (50+)
-* Scan business critical data sources
-* Import and assign all critical glossary terms
-* Successfully test important labeling on key assets
-* Successfully meet minimum scenarios for participating business units' users
-
-### Phase 3: Pre-production
-
-Once the MVP phase has passed, it's time to plan for the pre-production milestone. Your organization may decide to have a separate instance of Microsoft Purview for pre-production and production, or to keep the same instance but restrict access. Also in this phase, you may want to include scanning of on-premises data sources such as SQL Server. If there's a gap in data sources not supported by Microsoft Purview, explore the Atlas API to understand other options.
-
-#### Tasks to complete
-
-|Task|Detail|Duration|
-||||
-|[Refine your scan using scan rule sets](create-a-scan-rule-set.md)|Your organization will have many data sources for pre-production. It's important to pre-define key criteria for scanning so that classifications and file extensions can be applied consistently across the board.|1-2 Days|
-|[Assess region availability for scan for each of your sources by checking source pages](microsoft-purview-connector-overview.md)|Depending on the region of the data sources and organizational requirements on compliance and security, you may want to consider what regions must be available for scanning.|One Day|
-|[Understand firewall concept when scanning](concept-best-practices-security.md#network-security)|This step requires some exploration of how the organization configures its firewall and how Microsoft Purview can authenticate itself to access the data sources for scanning.|One Day|
-|[Understand Private Link concept when scanning](catalog-private-link.md)|If your organization uses Private Link, you must lay out the foundation of network security to include Private Link as a part of the requirements.|One Day|
-|[Scan on-premises SQL Server](register-scan-on-premises-sql-server.md)|This is optional if you have on-premises SQL Server. The scan will require setting up [Self-hosted Integration Runtime](manage-integration-runtimes.md) and adding SQL Server as a data source.|1-2 Weeks|
-|[Use Microsoft Purview REST API for integration scenarios](tutorial-using-rest-apis.md)|If you have requirements to integrate Microsoft Purview with other third party technologies such as orchestration or ticketing system, you may want to explore REST API area.|1-4 Weeks|
-|[Understand Microsoft Purview pricing](concept-guidelines-pricing.md)|This step provides the organization with important financial information for decision making.|1-5 Days|
-
-#### Other helpful links
-
-- [Labeling best practices](concept-best-practices-sensitivity-labels.md)
-- [Network architecture best practices](concept-best-practices-network.md)
-
-#### Acceptance criteria
-
-* Successfully onboard at least one business unit with all of its users
-* Scan on-premises data source such as SQL Server
-* Complete a proof of concept for at least one integration scenario using the REST API
-* Complete a plan to go to production, which should include key areas on infrastructure and security
-
-### Phase 4: Production
-
-The above phases should be followed to create effective data lifecycle management, which is the foundation for better governance programs. Data governance will help your organization prepare for growing trends such as AI, Hadoop, IoT, and blockchain. It's just the start of the data and analytics journey, and there's plenty more that can be discussed. The outcome of this solution would deliver:
-
-* **Business Focused** - A solution that is aligned to business requirements and scenarios over technical requirements.
-* **Future Ready** - A solution that maximizes default features of the platform and uses standardized industry practices for configuration or scripting activities to support the advancement and evolution of the platform.
-
-#### Tasks to complete
-
-|Task|Detail|Duration|
-||||
-|Scan production data sources with the firewall enabled|This is optional when a firewall is in place, but it's important to explore options for hardening your infrastructure.|1-5 Days|
-|[Enable Private Link](catalog-private-link.md)|This is a must-have when Private Link is used; otherwise, you can skip this step.|1-5 Days|
-|[Create automated workflow](concept-workflow.md)|Workflows are important to automate processes such as approval, escalation, review, and issue management.|2-3 Weeks|
-|Create operation documentation|Data governance isn't a one-time project. It's an ongoing program to fuel data-driven decision making and create opportunities for business. It's critical to document key procedures and business standards.|One Week|
-
-#### Other helpful links
-
-- [Manage workflow runs](how-to-workflow-manage-runs.md)
-- [Workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
-- [Manage integration runtimes](manage-integration-runtimes.md)
-- [Request access to a data asset](how-to-request-access.md)
-
-#### Acceptance criteria
-
-* Successfully onboard all business units and their users
-* Successfully meet infrastructure and security requirements for production
-* Successfully meet all use cases required by the users
-
-## Platform hardening
-
-More hardening steps can be taken:
-
-* Increase security posture by enabling scanning of firewall-protected resources or by using Private Link
-* Fine-tune scan scope to improve scan performance
-* [Use REST APIs](tutorial-atlas-2-2-apis.md) to export critical metadata and properties for backup and recovery
-* [Use workflow](how-to-workflow-business-terms-approval.md) to automate ticketing and eventing to avoid human errors
-* [Use policies](concept-policies-data-owner.md) to manage access to data assets through the Microsoft Purview governance portal.
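
To sketch the metadata-backup idea from the list above: exporting critical metadata typically means paging through a catalog search or listing API and persisting each batch. The `fetch_page` function below is a stand-in for a real catalog call (such as an Atlas discovery query), so the paging logic can be shown without network access; the page size and asset shape are illustrative.

```python
def export_all_assets(fetch_page, page_size=100):
    """Page through an asset listing and collect every record.

    fetch_page(offset, limit) stands in for a real catalog search call;
    it returns a list of asset records for that window.
    """
    assets, offset = [], 0
    while True:
        batch = fetch_page(offset, page_size)
        assets.extend(batch)
        if len(batch) < page_size:
            break  # short page means we've reached the end
        offset += page_size
    return assets

# Fake data source standing in for the catalog: 250 assets.
catalog = [{"id": i, "name": f"asset-{i}"} for i in range(250)]

def fake_fetch(offset, limit):
    return catalog[offset:offset + limit]

backup = export_all_assets(fake_fetch)
print(len(backup))  # 250
```

In a real backup, each batch would be written to durable storage so the metadata can be replayed into another account during recovery.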
-
-## Lifecycle considerations
-
-Another important aspect to include in your production process is how classifications and labels can be migrated. Microsoft Purview has over 90 system classifiers. You can apply system or custom classifications on file, table, or column assets. Classifications are like subject tags: they're used to mark and identify content of a specific type found within your data estate during scanning. Sensitivity labels are used to identify the categories of classification types within your organizational data, and then group the policies you wish to apply to each category. Labeling makes use of the same sensitive information types as Microsoft 365, allowing you to extend your existing security policies and protection across your entire content and data estate, and it can scan and automatically classify documents. For example, if you have a file named multiple.docx that contains a national ID number, Microsoft Purview adds a classification such as EU National Identification Number on the asset detail page.
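
To make the classification idea concrete, here's a minimal sketch of how a pattern-based classifier works in principle. The patterns and labels below are simplified illustrations, not Microsoft Purview's actual classifiers, which combine richer rules, keywords, and confidence thresholds.

```python
import re

# Simplified stand-in patterns; real classifiers also use keywords
# and confidence scoring, not bare regexes.
CLASSIFIERS = {
    "Email Address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "U.S. Phone Number": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
}

def classify(sample_text):
    """Return the classification labels whose pattern matches the sample."""
    return [name for name, pattern in CLASSIFIERS.items()
            if pattern.search(sample_text)]

labels = classify("Contact Ana at ana@contoso.com or 425-555-0100.")
print(labels)
```

In Microsoft Purview, this matching happens on physical samples taken during scanning, and the resulting classifications appear on the asset detail page.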
-
-In the Microsoft Purview Data Map, there are several areas where the Catalog Administrators need to ensure consistency and maintenance best practices over its life cycle:
-
-* **Data assets** – Data sources will need to be rescanned across environments. It's not recommended to scan only in development and then regenerate the assets using APIs in production. The main reason is that the Microsoft Purview scanners do a lot more "wiring" behind the scenes on the data assets, which makes them complex to move to a different Microsoft Purview instance. It's much easier to add the same data source in production and scan the sources again. The general best practice is to document all scans, connections, and authentication mechanisms being used.
-* **Scan rule sets** – This is your collection of rules assigned to a specific scan, such as the file types and classifications to detect. If you don't have many scan rule sets, it's possible to just re-create them manually in production. This requires an internal process and good documentation. However, if your rule sets change on a daily or weekly basis, consider the REST API route instead.
-* **Custom classifications** – Your classifications may also not change regularly. During the initial phase of deployment, it may take some time to understand the various requirements and come up with custom classifications. However, once settled, these require little change. So the recommendation here is to manually migrate any custom classifications over, or to use the REST API.
-* **Glossary** – It's possible to export and import glossary terms via the UX. For automation scenarios, you can also use the REST API.
-* **Resource set pattern policies** – This functionality is advanced for most organizations to apply. In some cases, your Azure Data Lake Storage has folder naming conventions and a specific structure that may cause problems for Microsoft Purview when generating resource sets. Your business unit may also want to change the resource set construction with more customizations to fit business needs. For this scenario, it's best to track all changes via the REST API and document them through an external versioning platform.
-* **Role assignment** – This is where you control who has access to Microsoft Purview and which permissions they have. Microsoft Purview also has a REST API to support export and import of users and roles, but it isn't Atlas API-compatible. The recommendation is to assign an Azure security group and manage the group membership instead.
-
-## Moving tenants
-
-If your Azure Subscription moves tenants while you have a Microsoft Purview account, you will need to create a new Microsoft Purview account and re-register and scan your sources.
-
-Moving tenants is not currently supported for Microsoft Purview.
-
-## Next steps
-
-- [Collections best practices](concept-best-practices-collections.md)
-- [Navigate the home page and search for an asset](tutorial-asset-search.md)
purview Disable Data Estate Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/disable-data-estate-insights.md
- Title: Disable Data Estate Insights
-description: This article provides the steps to disable or enable Data Estate Insights in the Microsoft Purview governance portal.
- Previously updated: 12/08/2022
-# Disable Data Estate Insights or report refresh
-
-Microsoft Purview Data Estate Insights automatically aggregates metrics and creates reports about your Microsoft Purview account and your data estate. When you scan registered sources and populate your Microsoft Purview Data Map, the Data Estate Insights application automatically identifies governance gaps and highlights them in its top metrics. It also provides drill-down experience that enables all stakeholders, such as data owners and data stewards, to take appropriate action to close the gaps.
-
-These features are optional and can be enabled or disabled at any time. This article provides the specific steps required to enable or disable Microsoft Purview Data Estate Insights features.
-
-To reschedule without disabling, see [the article to schedule Data Estate Insights reports](how-to-schedule-data-estate-insights.md).
-
-> [!IMPORTANT]
-> The Data Estate Insights application is **on** by default when you create a Microsoft Purview account.
->
-> **State** is set to On.
->
-> **Refresh Schedule** is set to a weekly refresh that begins 7 days after the account is created.
->
-> As the Data Map is populated and curated, the Insights app shows data in the reports. The reports are ready for consumption by anyone with the Insights Reader role.
-
-If you don't plan on using Data Estate Insights for a time, a **[data curator](catalog-permissions.md#roles) on the [root collection](reference-azure-purview-glossary.md#root-collection)** can disable Data Estate Insights in one of two ways:
-
-- [Disable the Data Estate Insights application](#disable-the-data-estate-insights-application) - this will stop billing from both report generation and report consumption.
-- [Disable report refresh](#disable-report-refresh) - Insights readers have access to current reports, but reports won't be refreshed. Billing will occur for report consumption but not report generation.
-
-Steps for both methods, and for re-enablement, are below.
-
-For more information about billing for Data Estate Insights, see our [pricing guidelines](concept-guidelines-pricing-data-estate-insights.md).
-
-## Disable the Data Estate Insights application
-
-> [!NOTE]
-> To be able to disable this application, you will need to have the [data curator role](catalog-permissions.md#roles) on your account's [root collection.](reference-azure-purview-glossary.md#root-collection)
-
-Disabling Data Estate Insights will disable the entire application, including these reports:
-
-- Stewardship
-- Asset
-- Glossary
-- Classification
-- Labeling
-
-The application icon will still show in the menu, but insights readers won't have access to reports at all, and report generation jobs will be stopped. The Microsoft Purview account won't receive any bill for Data Estate Insights.
-
-To disable the Data Estate Insights application, a user with the [data curator role](catalog-permissions.md#roles) at the [root collection](reference-azure-purview-glossary.md#root-collection) can follow these steps:
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), go to the **Management** section.
-
- :::image type="content" source="media/disable-data-estate-insights/locate-management.png" alt-text="Screenshot of the Microsoft Purview governance portal left menu, with the Management section highlighted and overview shown selected in the next menu." :::
-
-1. Then select **Overview**.
-1. In the **Feature options** menu, locate Data Estate Insights, and select the **State** toggle to change it to **Off**.
-
- :::image type="content" source="media/disable-data-estate-insights/disable-option.png" alt-text="Screenshot of the Overview window in the Management section of the Microsoft Purview governance portal with the State toggle highlighted for Data Estate Insights feature options." :::
-
-Once you have disabled Data Estate Insights, the icon will still appear in the left hand menu, but users will receive a warning stating that the application has been disabled when attempting to access it.
--
-## Disable report refresh
-
-> [!NOTE]
-> To be able to disable or edit report refresh, you will need to have the [data curator role](catalog-permissions.md#roles) on your account's [root collection.](reference-azure-purview-glossary.md#root-collection)
-
-You can choose to disable report refreshes instead of disabling the entire Data Estate Insights application. When you disable report refreshes, users with the [insights reader role](catalog-permissions.md#roles) will still be able to view reports, but they'll see a warning at the top of each report indicating that the data may not be current, along with the date of the last refresh.
-
-Graphs that show data from the last 30 days will appear blank after 30 days, while graphs showing a snapshot of the data map will continue to show graphs and details.
--
-To disable the Data Estate Insights report refresh, a user with the [data curator role](catalog-permissions.md#roles) at the [root collection](reference-azure-purview-glossary.md#root-collection) can follow these steps:
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), go to the **Management** section.
-
- :::image type="content" source="media/disable-data-estate-insights/locate-management.png" alt-text="Screenshot of the Microsoft Purview governance portal left menu, with the Management section highlighted." :::
-
-1. Then select **Overview**.
-1. In the **Insights refresh** menu, locate Data Estate Insights and select the **Edit** pencil.
-
- :::image type="content" source="media/disable-data-estate-insights/disable-data-estate-insights.png" alt-text="Screenshot of the Overview window in the Management section of the Microsoft Purview governance portal with the edit pencil in the Data Estate Insights row highlighted." :::
-
-1. Select the **Off** radio button and select **Continue**. If prompted, review your edits and select **Save.**
-
- :::image type="content" source="media/disable-data-estate-insights/disable-recurrance.png" alt-text="Screenshot of the Data Estate Insights edit page, with the Recurring radio button highlighted and set to Off." :::
-
-1. Under the **Schedule** column, you can now see that the refresh schedule reads **Disabled**.
-
-## Re-enable Data Estate Insights and report refresh
-
-> [!NOTE]
-> To enable Data Estate Insights, enable report refresh, or edit report refresh, you will need to have the [data curator role](catalog-permissions.md#roles) on your account's [root collection.](reference-azure-purview-glossary.md#root-collection)
-
-If Data Estate Insights or report refresh has been disabled in your Microsoft Purview governance portal environment, a user with the [data curator role](catalog-permissions.md#roles) at the [root collection](reference-azure-purview-glossary.md#root-collection) can re-enable either at any time by following these steps:
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), go to the **Management** section.
-
- :::image type="content" source="media/disable-data-estate-insights/locate-management.png" alt-text="Screenshot of the Microsoft Purview governance portal Management section highlighted.":::
-
-1. Then select **Overview**.
-1. In the **Insights refresh** menu, locate Data Estate Insights, and select the **Edit** pencil.
-
- :::image type="content" source="media/disable-data-estate-insights/refresh-frequency.png" alt-text="Screenshot of the Overview window in the Management section of the Microsoft Purview governance portal with the refresh frequency dropdown highlighted for Data Estate Insights feature options." :::
-
-1. In the **Edit** menu, select **Recurring**.
-
-1. Then select your time zone, set your recurrence to **Month(s)** or **Week(s)**, select your day and time to run, specify a start time, and optionally specify an end time. For more information about the available options, see [the article to schedule Data Estate Insights reports](how-to-schedule-data-estate-insights.md).
-
- :::image type="content" source="media/disable-data-estate-insights/set-recurrance.png" alt-text="Screenshot of the Data Estate Insights edit page, with the Recurring radio button highlighted and set to Recurring." :::
-
-1. Select **Continue**. If prompted, review the updates and select **Save**.
-1. Now you can see your schedule is set. Selecting **More info** in the schedule columns will give you the recurrence details.
-
- :::image type="content" source="media/disable-data-estate-insights/schedule-set.png" alt-text="Screenshot of the management page, with the Data Estate Insights information row highlighted." :::
-
-## Next steps
-
-- [Learn how to use Asset insights](asset-insights.md)
-- [Learn how to use Classification insights](classification-insights.md)
-- [Learn how to use Glossary insights](glossary-insights.md)
-- [Learn how to use Label insights](sensitivity-insights.md)
-- [Schedule Data Estate Insights reports](how-to-schedule-data-estate-insights.md)
purview Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/disaster-recovery.md
- Title: Disaster recovery for Microsoft Purview
-description: Learn how to configure a disaster recovery environment for Microsoft Purview.
- Previously updated: 06/03/2022
-# Disaster recovery for Microsoft Purview
-
-This article explains how to configure a disaster recovery environment for Microsoft Purview. Azure data center outages are rare, but can last anywhere from a few minutes to hours. Data center outages can cause disruption to environments that are relied on for data governance. By following the steps detailed in this article, you can continue to govern your data in the event of a data center outage for the primary region of your Microsoft Purview account.
-
-## Achieve business continuity for Microsoft Purview
-
-Business continuity and disaster recovery (BCDR) in a Microsoft Purview instance refers to the mechanisms, policies, and procedures that enable your business to protect against data loss and continue operating in the face of disruption, particularly to its scanning, catalog, and insights tiers. This page explains how to configure a disaster recovery environment for Microsoft Purview.
-
-Today, Microsoft Purview does not support automated BCDR. Until that support is added, you are responsible for backup and restore activities. You can manually create a secondary Microsoft Purview account as a warm standby instance in another region.
-
-The following steps show how you can achieve disaster recovery manually:
-
-1. Once the primary Microsoft Purview account is created in a certain region, you must provision one or more secondary Microsoft Purview accounts in separate regions from the Azure portal.
-
-2. All activities performed on the primary Microsoft Purview account must be carried out on the secondary Microsoft Purview accounts as well. This includes:
-
- - Maintain Account information
- - Create and maintain custom Scan rule sets, Classifications, and Classification rules
- - Register and scan sources
- - Create and maintain Collections along with the association of sources with the Collections
- - Create and maintain Credentials used while scanning.
- - Curate data assets
- - Create and maintain Glossary terms
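As one hedged illustration of step 2, the sketch below compares glossary terms between the two accounts to find what still needs mirroring. The endpoint path built by `glossary_url` follows the Atlas v2 glossary API that Purview exposes, but treat the path, and the name-based matching, as assumptions to verify against the Purview REST reference:

```python
# Sketch: find glossary terms present in the primary Microsoft Purview account
# but missing from the secondary. Endpoint path and term shape are assumptions.
def glossary_url(account_name: str) -> str:
    """Build the (assumed) Atlas v2 glossary endpoint for a Purview account."""
    return f"https://{account_name}.purview.azure.com/catalog/api/atlas/v2/glossary"

def terms_to_mirror(primary_terms, secondary_terms):
    """Terms in the primary but not the secondary, matched by display name
    (a simplification for this sketch)."""
    secondary_names = {t["name"] for t in secondary_terms}
    return [t for t in primary_terms if t["name"] not in secondary_names]

primary = [{"name": "Customer"}, {"name": "Revenue"}]
secondary = [{"name": "Customer"}]
print([t["name"] for t in terms_to_mirror(primary, secondary)])  # ['Revenue']
```

An actual mirror would then POST each missing term to the secondary account's glossary endpoint with an Azure AD bearer token; that call shape is likewise an assumption to confirm before use.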
-
-As you plan your manual BCDR strategy, keep the following points in mind:
-
-- You will be charged for both the primary and secondary Microsoft Purview accounts.
-- The primary and secondary Microsoft Purview accounts cannot be configured to the same Azure Data Factory, Azure Data Share, and Synapse Analytics accounts, if applicable. As a result, the lineage from Azure Data Factory and Azure Data Share cannot be seen in the secondary Microsoft Purview accounts. Also, the Synapse Analytics workspace associated with the primary Microsoft Purview account cannot be associated with secondary Microsoft Purview accounts. This is a limitation today and will be addressed when automated BCDR is supported.
-- The integration runtimes are specific to a Microsoft Purview account. Hence, if scans must run in primary and secondary Microsoft Purview accounts in parallel, multiple self-hosted integration runtimes must be maintained. This limitation will also be addressed when automated BCDR is supported.
-- Parallel execution of scans from both primary and secondary Microsoft Purview accounts on the same source can affect the performance of the source, so scan durations can vary across the Microsoft Purview accounts.
-## Related information
-
-- [Business Continuity and Disaster Recovery](../availability-zones/cross-region-replication-azure.md)
-- [Build high availability into your BCDR strategy](/azure/architecture/solution-ideas/articles/build-high-availability-into-your-bcdr-strategy)
-- [Azure status](https://azure.status.microsoft/status)
-## Next steps
-
-To get started with Microsoft Purview, see [Create a Microsoft Purview account](create-catalog-portal.md).
purview Glossary Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/glossary-insights.md
- Title: Glossary report on your data using Microsoft Purview Data Estate Insights
-description: This guide describes how to view and use Microsoft Purview Data Estate Insights glossary reporting on your data.
- Previously updated: 05/16/2022
-# Insights for your business glossary in Microsoft Purview
-
-This guide describes how to access, view, and filter Microsoft Purview glossary insight reports for your data.
-
-In this how-to guide, you'll learn how to:
-
-> [!div class="checklist"]
-> - Find Data Estate Insights from your Microsoft Purview account
-> - Get a bird's eye view of your data.
-
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights glossary insights, make sure that you've completed the following steps:
-
-* Set up a storage resource and populate the account with data.
-
-* [Set up and complete a scan of your storage source](manage-data-sources.md).
-
-* Set up at least one [business glossary term](how-to-create-manage-glossary-term.md) and attach it to an asset.
-
-## Use Microsoft Purview glossary insights
-
-In Microsoft Purview, you can [create glossary terms and attach them to assets](how-to-create-manage-glossary-term.md). As you make use of these terms in your data map, you can view the glossary distribution in glossary insights. These insights will give you the state of your glossary based on:
-* Number of terms attached to assets
-* Status of terms
-* Distribution of roles by users
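These distributions are, in effect, simple aggregations over term records. The sketch below computes the high-level glossary KPIs this report surfaces from a hypothetical, simplified term shape (not the Purview API schema):

```python
# Sketch: the high-level glossary KPIs, computed from a list of term records.
# The record shape ({"status": ..., "assets": n}) is a hypothetical
# simplification, not the Purview API schema.
def glossary_kpis(terms):
    return {
        "total_terms": len(terms),
        "approved_without_assets": sum(
            1 for t in terms if t["status"] == "Approved" and t["assets"] == 0),
        "expired_with_assets": sum(
            1 for t in terms if t["status"] == "Expired" and t["assets"] > 0),
    }

terms = [
    {"status": "Approved", "assets": 0},
    {"status": "Approved", "assets": 3},
    {"status": "Expired", "assets": 1},
    {"status": "Draft", "assets": 0},
]
print(glossary_kpis(terms))
# {'total_terms': 4, 'approved_without_assets': 1, 'expired_with_assets': 1}
```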
-
-**To view Glossary Insights:**
-
-1. Go to the **Microsoft Purview** [account screen in the Azure portal](https://aka.ms/purviewportal) and select your Microsoft Purview account.
-
-1. On the **Overview** page, in the **Get Started** section, select the **Open Microsoft Purview governance portal** tile.
-
- :::image type="content" source="./media/glossary-insights/portal-access.png" alt-text="Screenshot showing the Open Microsoft Purview governance portal button on the account page.":::
-
-1. On the Microsoft Purview **Home** page, select **Data Estate Insights** on the left menu.
-
- :::image type="content" source="./media/glossary-insights/view-insights.png" alt-text="Screenshot showing Data Estate Insights in left menu of the Microsoft Purview governance portal.":::
-
-1. In the **Data Estate Insights** area, select **Glossary** to display the Microsoft Purview **glossary insights** report.
-
-1. The report starts with **High-level KPIs** that show ***Total terms*** in your Microsoft Purview account, ***Approved terms without assets***, and ***Expired terms with assets***. Each of these values helps you understand the current health of your glossary.
-
- :::image type="content" source="./media/glossary-insights/glossary-kpi.png" alt-text="Screenshot showing glossary KPI charts.":::
--
-1. The **Snapshot of terms** section (displayed above) shows the term status as ***Draft***, ***Approved***, ***Alert***, and ***Expired*** for terms with assets and terms without assets.
-
-1. Select **View details** to see the term names with their various statuses and more details about ***Stewards*** and ***Experts***.
-
- :::image type="content" source="./media/glossary-insights/glossary-view-more.png" alt-text="Screenshot of terms with and without assets.":::
-
-1. When you select "View more" for ***Approved terms with assets***, Data Estate Insights allows you to navigate to the **Glossary** term detail page, from where you can further navigate to the list of assets with the attached terms.
-
- :::image type="content" source="./media/glossary-insights/navigate-to-glossary-detail.png" alt-text="Screenshot of Data Estate Insights to glossary.":::
-
-1. On the Glossary insights page, view a distribution of **Incomplete terms** by the type of information missing. The graph shows the count of terms with ***Missing definition***, ***Missing expert***, ***Missing steward***, and ***Missing multiple*** fields.
-
-1. Select ***View more*** from **Incomplete terms** to view the terms that have missing information. You can navigate to the glossary term detail page to enter the missing information and ensure the glossary term is complete.
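The "incomplete terms" grouping above amounts to checking three fields per term. A minimal sketch, with illustrative field names rather than the Purview schema:

```python
# Sketch: classify an incomplete glossary term the way the chart groups them.
# Field names ("definition", "expert", "steward") are illustrative.
def incomplete_category(term):
    missing = [f for f in ("definition", "expert", "steward") if not term.get(f)]
    if not missing:
        return None                     # term is complete
    if len(missing) > 1:
        return "Missing multiple"
    return f"Missing {missing[0]}"

print(incomplete_category({"definition": "", "expert": "a@contoso.com",
                           "steward": "b@contoso.com"}))   # Missing definition
print(incomplete_category({"definition": "", "expert": None,
                           "steward": "b@contoso.com"}))   # Missing multiple
```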
-
-## Next steps
-
-Learn more about how to create a glossary term through the [glossary documentation.](./how-to-create-manage-glossary-term.md)
purview How To Automatically Label Your Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-automatically-label-your-content.md
- Title: How to automatically apply sensitivity labels to your data in Microsoft Purview Data Map
-description: Learn how to create sensitivity labels and automatically apply them to your data during a scan.
- Previously updated: 04/26/2023
-# How to automatically apply sensitivity labels to your data in the Microsoft Purview Data Map
-
-## Create new or apply existing sensitivity labels in the data map
-
-> [!IMPORTANT]
-> Labeling in the Microsoft Purview Data Map is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
->
-
-If you don't already have sensitivity labels, you'll need to create them and make them available for the Microsoft Purview Data Map. Existing sensitivity labels from Microsoft Purview Information Protection can also be modified to make them available to the data map.
-
-### Step 1: Licensing requirements
-
-Sensitivity labels are created and managed in the Microsoft Purview compliance portal. To create sensitivity labels for use through Microsoft Purview, you must have an active Microsoft 365 license that offers the benefit of automatically applying sensitivity labels.
-
-For the full list of licenses, see the [Sensitivity labels in Microsoft Purview FAQ](sensitivity-labels-frequently-asked-questions.yml). If you don't already have the required license, you can sign up for a trial of [Microsoft 365 E5](https://www.microsoft.com/microsoft-365/business/compliance-solutions#midpagectaregion).
-
-### Step 2: Consent to use sensitivity labels in the Microsoft Purview Data Map
-
-The following steps extend your existing sensitivity labels and enable them to be available for use in the data map, where you can apply sensitivity labels to files and database columns.
-
-1. In the Microsoft Purview compliance portal, navigate to the **Information Protection** menu and the **Labels** page.</br>
- If you've recently provisioned your subscription for Information Protection, it may take a few hours for the **Information Protection** page to display.
-1. In the **Extend labeling to assets in the Microsoft Purview Data Map** area, select the **Turn on** button, and then select **Yes** in the confirmation dialog that appears.
-
-For example:
---
-> [!TIP]
->If you don't see the button, and you're not sure if consent has been granted to extend labeling to assets in the Microsoft Purview Data Map, see [this FAQ](sensitivity-labels-frequently-asked-questions.yml#how-can-i-determine-if-consent-has-been-granted-to-extend-labeling-to-the-microsoft-purview-data-map) item on how to determine the status.
->
-
-After you've extended labeling to assets in the Microsoft Purview Data Map, all published sensitivity labels are available for use in the data map.
-
-### Step 3: Create or modify existing label to automatically label content
-
-**To create new sensitivity labels or modify existing labels**:
-
-1. Open the [Microsoft Purview compliance portal](https://compliance.microsoft.com/).
-
-1. Under **Solutions**, select **Information protection**, **Labels**, then select **Create a label**.
-
- :::image type="content" source="media/how-to-automatically-label-your-content/create-sensitivity-label-full-small.png" alt-text="Create sensitivity labels in the Microsoft Purview compliance center" lightbox="media/how-to-automatically-label-your-content/create-sensitivity-label-full.png":::
-
-1. Name the label. Then, under **Define the scope for this label**:
-
- - In all cases, select **Schematized data assets**.
- - To label files, also select **Items**. This option isn't required to label schematized data assets only.
-
- :::image type="content" source="media/how-to-automatically-label-your-content/create-label-scope-small.png" alt-text="Automatically label in the Microsoft Purview compliance center" lightbox="media/how-to-automatically-label-your-content/create-label-scope.png":::
-
-1. Follow the rest of the prompts to configure the label settings.
-
- Specifically, define autolabeling rules for files and schematized data assets:
-
- - [Define autolabeling rules for files](#autolabeling-for-files)
- - [Define autolabeling rules for schematized data assets](#autolabeling-for-schematized-data-assets)
-
- For more information about configuration options, see [What sensitivity labels can do](/microsoft-365/compliance/sensitivity-labels#what-sensitivity-labels-can-do) in the Microsoft 365 documentation.
-
-1. Repeat the steps listed above to create more labels.
-
- To create a sublabel, select the parent label > **...** > **More actions** > **Add sub label**.
-
-1. To modify existing labels, browse to **Information Protection** > **Labels**, and select your label.
-
- Then select **Edit label** to open the **Edit sensitivity label** configuration again, with all of the settings you'd defined when you created the label.
-
- :::image type="content" source="media/how-to-automatically-label-your-content/edit-sensitivity-label-full-small.png" alt-text="Edit an existing sensitivity label" lightbox="media/how-to-automatically-label-your-content/edit-sensitivity-label-full.png":::
-
-1. When you're done creating all of your labels, make sure to view your label order, and reorder them as needed.
-
- To change the order of a label, select **...** **> More actions** > **Move up** or **Move down.**
-
- For more information, see the documentation for [label priority (order matters)](/microsoft-365/compliance/sensitivity-labels#label-priority-order-matters).
-
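To make the ordering concrete: when more than one label's conditions match, this sketch assumes the highest-priority label (the one furthest down the ordered list) wins, which is the convention the Microsoft 365 label-priority documentation describes. The data shapes are illustrative:

```python
# Sketch of "label priority (order matters)": among the labels whose rules
# matched, pick the one latest in the ordered list (highest priority).
# Assumption: highest-priority match wins; data shapes are illustrative.
def effective_label(ordered_labels, matched):
    """ordered_labels: names from lowest to highest priority.
    matched: set of label names whose autolabeling rules matched."""
    candidates = [name for name in ordered_labels if name in matched]
    return candidates[-1] if candidates else None

order = ["Public", "General", "Confidential", "Highly Confidential"]
print(effective_label(order, {"General", "Confidential"}))  # Confidential
```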
-#### Autolabeling for files
-
-Define autolabeling rules for files when you create or edit your label.
-
-On the **Auto-labeling for Office apps** page, enable **Auto-labeling for Office apps,** and then define the conditions where you want your label to be automatically applied to your data.
-
-For example:
--
-For more information, see the documentation to [apply a sensitivity label to data automatically](/microsoft-365/compliance/apply-sensitivity-label-automatically#how-to-configure-auto-labeling-for-office-apps).
-
-#### Autolabeling for schematized data assets
-
-Define autolabeling rules for schematized data assets when you create or edit your label.
-
-At the **Schematized data assets** option:
-
-1. Select the **Auto-labeling for schematized data assets** slider.
-
-1. Select **Check sensitive info types** to choose the sensitive info types you want to apply to your label.
-
-For example:
--
-### Step 4: Publish labels
-
-If the Sensitivity label has been published previously, then no further action is needed.
-
-If this is a new sensitivity label that hasn't been published before, then the label must be published for the changes to take effect. Follow [these steps to publish the label](/microsoft-365/compliance/create-sensitivity-labels#publish-sensitivity-labels-by-creating-a-label-policy).
-
-Once you create a label, you'll need to scan your data in the Microsoft Purview Data Map to automatically apply the labels you've created, based on the autolabeling rules you've defined.
-
-## Scan your data to apply sensitivity labels automatically
-
-Scan your data in the data map to automatically apply the labels you've created, based on the autolabeling rules you've defined. Allow up to 24 hours for sensitivity label changes to reflect in the data map.
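Scans can also be triggered programmatically. The sketch below only builds the run-scan URL; the path and `api-version` are assumptions to check against the Microsoft Purview scanning REST reference before use:

```python
# Sketch: build the (assumed) run-scan URL for the Purview scanning API.
# Verify the path and api-version against the official REST reference.
def run_scan_url(account, datasource, scan_name, run_id,
                 api_version="2022-02-01-preview"):
    return (f"https://{account}.purview.azure.com/scan/datasources/{datasource}"
            f"/scans/{scan_name}/runs/{run_id}?api-version={api_version}")

url = run_scan_url("contoso", "AzureBlobSource", "WeeklyScan",
                   "00000000-0000-0000-0000-000000000001")
print(url)
```

A real call would PUT to this URL with an Azure AD bearer token, then poll the run status until the scan completes.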
-
-For more information on how to set up scans on various assets in the Microsoft Purview Data Map, see:
-
-|Source |Reference |
-|||
-|**Files within storage** | [Register and scan Azure Blob Storage](register-scan-azure-blob-storage-source.md) </br> [Register and scan Azure Files](register-scan-azure-files-storage-source.md) </br> [Register and scan Azure Data Lake Storage Gen1](register-scan-adls-gen1.md) </br> [Register and scan Azure Data Lake Storage Gen2](register-scan-adls-gen2.md) </br> [Register and scan Amazon S3](register-scan-amazon-s3.md) |
-|**Database columns** | [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md) </br> [Register and scan an Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md) </br> [Register and scan Dedicated SQL pools](register-scan-azure-synapse-analytics.md) </br> [Register and scan Azure Synapse Analytics workspaces](register-scan-azure-synapse-analytics.md) </br> [Register and scan Azure Cosmos DB for NoSQL database](register-scan-azure-cosmos-database.md) </br> [Register and scan an Azure MySQL database](register-scan-azure-mysql-database.md) </br> [Register and scan an Azure database for PostgreSQL](register-scan-azure-postgresql.md) |
-
-## View labels on assets in the catalog
-
-Once you've defined autolabeling rules for your labels in the Microsoft Purview compliance portal and scanned your data in the data map, labels are automatically applied to your assets in the data map.
-
-**To view the labels applied to your assets in the Microsoft Purview catalog:**
-
-In the Microsoft Purview catalog, use the **Label** filtering options to show assets with specific labels only. For example:
--
-To view details of an asset including classifications found and label applied, select the asset in the results.
-
-For example:
--
-## View Insight reports for the classifications and sensitivity labels
-
-Find insights on your classified and labeled data in the Microsoft Purview Data Map by using the **Classification** and **Sensitivity labeling** reports.
-
-> [!div class="nextstepaction"]
-> [Classification insights](./classification-insights.md)
-
-> [!div class="nextstepaction"]
-> [Sensitivity label insights](sensitivity-insights.md)
-
-> [!div class="nextstepaction"]
-> [Overview of Labeling in Microsoft Purview](create-sensitivity-label.md)
-
-> [!div class="nextstepaction"]
-> [Labeling Frequently Asked Questions](sensitivity-labels-frequently-asked-questions.yml)
purview How To Browse Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-browse-catalog.md
- Title: 'How to: browse the Data Catalog'
-description: This article gives an overview of how to browse the Microsoft Purview Data Catalog by asset type
- Previously updated: 12/06/2022
-# Browse the Microsoft Purview Data Catalog
-
-Searching a data catalog is a great tool for data discovery if a data consumer knows what they're looking for, but often users don't know exactly how their data estate is structured. The Microsoft Purview Data Catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog.
-
-To access the browse experience, select "Browse assets" from the data catalog home page.
--
-## Browse by collection
-
-Browse by collection allows you to explore the different collections you're a data reader or curator for.
-
-> [!NOTE]
-> You will only see collections you have access to. For more information, see [create and manage Collections](how-to-create-and-manage-collections.md).
--
-Once a collection is selected, you'll get a list of assets in that collection with the facets and filters available in search. As a collection can have thousands of assets, browse uses the Microsoft Purview search relevance engine to boost the most important assets to the top.
-
-You can also refine the list of assets using facets and filters:
-
-- [Use the facets](how-to-search-catalog.md#use-the-facets) on the left-hand side to narrow results by business metadata like glossary terms or classifications.
-- [Use the filters](how-to-search-catalog.md#use-the-filters) at the top to narrow results by source type, [managed attributes](how-to-managed-attributes.md), or activity.
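The same narrowing can be expressed as a payload for the catalog's discovery query endpoint. The keys below are a simplification of that payload; verify them, and the api-version, against the Purview search REST reference:

```python
# Sketch: a simplified discovery-query payload that scopes results to one
# collection, the programmatic analogue of browsing by collection. Payload
# keys are a simplification of the documented search request body.
def browse_query(collection_id, keywords=None, limit=25):
    return {
        "keywords": keywords,   # None browses everything in the collection
        "limit": limit,
        "filter": {"collectionId": collection_id},
    }

payload = browse_query("contoso-sales")
print(payload["filter"])  # {'collectionId': 'contoso-sales'}
```

A real request would POST this payload to the account's search endpoint with an Azure AD bearer token; the relevance engine then orders the results, just as in the UI.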
-If the selected collection doesn't contain the data you're looking for, you can easily navigate to related collections, or go back and view the entire collections tree.
--
-Once you find the asset you're looking for, you can select it to view more details such as schema, lineage, and a detailed classification list. To learn more about the asset details page, see [Manage catalog assets](catalog-asset-details.md).
--
-## Browse by source type
-
-Browse by source type allows data consumers to explore the hierarchies of data sources using an explorer view. Select a source type to see the list of scanned sources.
-
-For example, you can easily find a dataset called *DateDimension* under a folder called *Dimensions* in Azure Data Lake Storage Gen2. You can use the 'Browse by source type' experience to navigate to the ADLS Gen2 storage account, then browse the service > container > folder(s) to reach the specific *Dimensions* folder and then see the *DateDimension* table.
-
-A native browsing experience with hierarchical namespace is provided for each corresponding data source.
-
-> [!NOTE]
-> After a successful scoped scan, there may be delay before newly scanned assets appear in the browse experience.
-
-1. On the **Browse by source types** page, tiles are categorized by data sources. To further explore assets in each data source, select the corresponding tile.
-
- :::image type="content" source="media/how-to-browse-catalog/browse-asset-types.png" alt-text="Browse asset types page" border="true":::
-
- > [!TIP]
- > Certain tiles are groupings of a collection of data sources. For example, the Azure Storage Account tile contains all Azure Blob Storage and Azure Data Lake Storage Gen2 accounts. The Azure SQL Server tile will display the Azure SQL Server assets that contain Azure SQL Database and Azure Dedicated SQL Pool instances ingested into the catalog.
-
-1. On the next page, top-level assets under your chosen data type are listed. Pick one of the assets to further explore its contents. For example, after selecting "Azure SQL Database", you'll see a list of databases with assets in the data catalog.
-
- :::image type="content" source="media/how-to-browse-catalog/asset-type-specific-browse.png" alt-text="Azure SQL Database browse page" border="true":::
-
-1. The explorer view will open. Start browsing by selecting the asset on the left panel. Child assets will be listed on the right panel of the page.
-
- :::image type="content" source="media/how-to-browse-catalog/explorer-view.png" alt-text="Explorer view" border="true":::
-
-1. To view the details of an asset, select the name or the ellipses button on the far right.
-
- :::image type="content" source="media/how-to-browse-catalog/view-asset-detail-click-ellipses-inline.png" alt-text="Select the ellipses button to see asset details page" lightbox="media/how-to-browse-catalog/view-asset-detail-click-ellipses-expanded.png" border="true":::
-
-## Next steps
-
-- [How to create and manage glossary terms](how-to-create-manage-glossary-term.md)
-- [How to import and export glossary terms](how-to-import-export-glossary.md)
-- [How to manage term templates for business glossary](how-to-manage-term-templates.md)
-- [How to search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview How To Bulk Edit Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-bulk-edit-assets.md
- Title: How to bulk edit assets
-description: Learn bulk edit assets in Microsoft Purview to add classifications, glossary terms, and modify contacts on multiple assets at once.
- Previously updated: 03/23/2023
-# How to bulk edit assets
-
-This article describes how you can update assets in bulk to add glossary terms, classifications, owners, and experts on multiple assets at once.
-
->[!NOTE]
->Bulk editing as described here is for data assets. Other assets like classifications and glossary terms cannot be added to the bulk edit list.
-
-## Select assets to bulk edit
-
-1. Use Microsoft Purview search or browse to discover assets you wish to edit.
-
-1. In the search results, each data asset has a checkbox you can select to add the asset to the selection list.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/asset-checkbox.png" alt-text="Screenshot of the bulk edit checkbox.":::
-
-1. You can also add an asset to the bulk edit list from the asset detail page. Select **Select for bulk edit** to add the asset to the bulk edit list.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/asset-list.png" alt-text="Screenshot of the asset page with the bulk edit box highlighted.":::
-
-1. Select the checkbox to add it to the bulk edit list. You can see the selected assets by selecting the **View selected** button.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/selected-list.png" alt-text="Screenshot of the asset list with the View Selected button highlighted.":::
-
-## Bulk edit assets
-
-1. When all assets have been chosen, select **View selected** to pull up the selected assets.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/view-list.png" alt-text="Screenshot of the view.":::
-
-1. Review the list and select **Deselect** if you want to remove any assets from the list.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/remove-list.png" alt-text="Screenshot with the Deselect button highlighted.":::
-
-1. Select **Bulk edit** to add, remove or replace an annotation for all the selected assets. You can edit the glossary terms, classifications, owners or experts of an asset.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/bulk-edit.png" alt-text="Screenshot with the bulk edit button highlighted.":::
-
-1. For each attribute selected, you can choose which edit operation to apply:
- 1. **Add** will append a new annotation to the selected data assets.
- 1. **Replace with** will replace all of the annotations for the selected data assets with the annotation selected.
- 1. **Remove** will remove all annotations for selected data assets.
-
- You can edit multiple assets at once by selecting **Select a new attribute**.
-
- :::image type="content" source="media/how-to-bulk-edit-assets/add-list.png" alt-text="Screenshot of the add.":::
-
-1. When you have made all your updates, select **Apply**.
-
-1. Once complete, close the bulk edit blade by selecting **Close** or **Remove all and close**. **Close** won't remove the selected assets, whereas **Remove all and close** will remove all the selected assets.
- :::image type="content" source="media/how-to-bulk-edit-assets/close-list.png" alt-text="Screenshot of the close.":::
-
-> [!IMPORTANT]
-> In the UI you can currently only select up to 25 assets.
-> The **View Selected** box will be visible only if there is at least one asset selected.
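The add/replace/remove semantics above can be sketched on plain data. The shapes here are illustrative, not the Purview API:

```python
# Sketch: the three bulk-edit operations applied to one annotation list
# (for example, glossary terms) across several assets. Shapes are illustrative.
def bulk_edit(assets, attribute, operation, values=None):
    for asset in assets:
        if operation == "add":          # append without duplicating
            asset[attribute] = sorted(set(asset.get(attribute, [])) | set(values))
        elif operation == "replace":    # overwrite all existing annotations
            asset[attribute] = list(values)
        elif operation == "remove":     # clear all annotations
            asset[attribute] = []
    return assets

assets = [{"terms": ["PII"]}, {"terms": []}]
bulk_edit(assets, "terms", "add", ["Customer"])
print(assets)  # [{'terms': ['Customer', 'PII']}, {'terms': ['Customer']}]
```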
-
-## Next steps
-
-- [How to create and manage glossary terms](how-to-create-manage-glossary-term.md)
-- [How to import and export glossary terms](how-to-import-export-glossary.md)
purview How To Certify Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-certify-assets.md
- Title: Asset certification in the Microsoft Purview data catalog
-description: How to certify assets in the Microsoft Purview data catalog
- Previously updated: 03/01/2023
-# Asset certification in the Microsoft Purview data catalog
-
-As a Microsoft Purview data catalog grows in size, it becomes important for data consumers to understand what assets they can trust. Data consumers must know whether an asset meets their organization's quality standards and can be regarded as reliable. Microsoft Purview allows data stewards to manually endorse assets to indicate that they're ready to use across an organization or business unit. This article describes how data stewards can certify assets and data consumers can view certification labels.
-
-## How to certify an asset
-
-To certify an asset, you must be a **data curator** for the collection containing the asset.
-
-1. Navigate to the [asset details](catalog-asset-details.md) of the desired asset. Select **Edit**.
-
- :::image type="content" source="media/how-to-certify-assets/edit-asset.png" alt-text="Edit an asset from the asset details page" border="true":::
-
-1. Toggle the **Certified** field to **Yes**.
-
- :::image type="content" source="media/how-to-certify-assets/toggle-certification-on.png" alt-text="Toggle an asset to be certified" border="true":::
-
-2. Save your changes. The asset has a "Certified" label next to the asset name.
-
- :::image type="content" source="media/how-to-certify-assets/view-certified-asset.png" alt-text="An asset with a certified label" border="true":::
-
-> [!NOTE]
-> PowerBI assets can only be [certified in a PowerBI workspace](/power-bi/collaborate-share/service-endorse-content). PowerBI endorsement labels are displayed in Microsoft Purview's search and browse experiences.
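Because certification is an attribute on the asset entity, it can in principle also be set through the Atlas v2 partial-update endpoint that Purview exposes. The sketch only builds the URL; both the path and the attribute name `certified` are assumptions to verify against the Purview REST reference before use:

```python
# Sketch: build an (assumed) Atlas v2 partial-update URL for flipping a
# certification attribute on an asset entity. Path and attribute name are
# assumptions; verify against the official REST reference.
def certify_url(account, asset_guid, attr="certified"):
    return (f"https://{account}.purview.azure.com/catalog/api/atlas/v2"
            f"/entity/guid/{asset_guid}?name={attr}")

print(certify_url("contoso", "1234-abcd"))
# https://contoso.purview.azure.com/catalog/api/atlas/v2/entity/guid/1234-abcd?name=certified
```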
-
-### Certify assets in bulk
-
-You can use the Microsoft Purview [bulk edit experience](how-to-bulk-edit-assets.md) to certify multiple assets at once.
-
-1. After searching or browsing the data catalog, select the checkbox next to the assets you wish to certify.
-
- :::image type="content" source="media/how-to-certify-assets/bulk-edit-select.png" alt-text="Select assets to bulk certify" border="true":::
-
-1. Select **View selected**.
-1. Select **Bulk edit**.
-
- :::image type="content" source="media/how-to-certify-assets/bulk-edit-open.png" alt-text="Open the bulk edit experience" border="true":::
-
-1. Choose attribute **Certified**, operation **Replace with**, and new value **Yes**.
-
- :::image type="content" source="media/how-to-certify-assets/bulk-edit-certify.png" alt-text="Apply certification labels to all selected assets" border="true":::
-
-1. Select **Apply**
-
-All assets selected have the "Certified" label.
-
-## Viewing certification labels in Search
-
-When searching or browsing the data catalog, you see a certification label on any asset that is certified. Certified assets are boosted in search results to help data consumers discover them easily.
---
-## Next steps
-
-Discover your assets in the Microsoft Purview Data Catalog by either:
-- [Browsing the data catalog](how-to-browse-catalog.md)
-- [Searching the data catalog](how-to-search-catalog.md)
purview How To Create And Manage Collections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-create-and-manage-collections.md
- Title: How to create and manage collections
-description: This article explains how to create and manage collections within the Microsoft Purview Data Map.
- Previously updated: 02/01/2023
-# Create and manage collections in the Microsoft Purview Data Map
-
-Collections in the Microsoft Purview Data Map can be used to organize assets and sources by your business's flow. They're also the tool used to manage access across the Microsoft Purview governance portal. This guide takes you through creating and managing these collections, and covers how to register sources and add assets to your collections.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* Your own [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md).
-
-* An active [Microsoft Purview (formerly Azure Purview) account](create-catalog-portal.md).
-
-### Check permissions
-
-In order to create and manage collections in the Microsoft Purview Data Map, you'll need to be a **Collection Admin** within the Microsoft Purview governance portal. We can check these permissions in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/). You can find the Microsoft Purview governance portal by:
-
-- Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-- Opening the [Azure portal](https://portal.azure.com), searching for and selecting your Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select Data Map > Collections from the left pane to open the collection management page.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Microsoft Purview governance portal window, opened to the Data Map, with the Collections tab selected." border="true":::
-
-1. Select your root collection. This is the top collection in your collection list and will have the same name as your account. In the following example, it's called Contoso Microsoft Purview. Alternatively, if collections already exist you can select any collection where you want to create a subcollection.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-root-collection.png" alt-text="Screenshot of Microsoft Purview governance portal window, opened to the Data Map, with the root collection highlighted." border="true":::
-
-1. Select **Role assignments** in the collection window.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/role-assignments.png" alt-text="Screenshot of Microsoft Purview governance portal window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-
-1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact a collection admin to grant you permission.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/collection-admins.png" alt-text="Screenshot of Microsoft Purview governance portal window, opened to the Data Map, with the collection admin section highlighted." border="true":::
-
-## Collection management
-
-### Create a collection
-
-You'll need to be a collection admin in order to create a collection. If you aren't sure, follow the [guide above](#check-permissions) to check permissions.
-
-1. Select Data Map > Collections from the left pane to open the collection management page.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Microsoft Purview governance portal window, opened to the Data Map, with the Collections tab selected and open." border="true":::
-
-1. Select **+ Add a collection**. Again, note that only [collection admins](#check-permissions) can manage collections.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-add-a-collection.png" alt-text="Screenshot of Microsoft Purview governance portal window, showing the new collection window, with the 'Add a collection' button highlighted." border="true":::
-
-1. In the right panel, enter the collection name and description. If needed, you can also add users or groups as collection admins to the new collection.
-1. Select **Create**.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/create-collection.png" alt-text="Screenshot of Microsoft Purview governance portal window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
-
-1. The new collection's information appears on the page.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/created-collection.png" alt-text="Screenshot of Microsoft Purview governance portal window, showing the newly created collection window." border="true":::
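-If you manage many collections, you can also script their creation. The following sketch only builds the request for the Microsoft Purview account data plane REST API; the endpoint path, `api-version`, and property names are assumptions based on the preview collections API, so verify them against the current REST reference before use.

```python
# Sketch (assumed preview API): build the PUT request that creates a
# collection under a parent collection. Nothing here is sent over the wire;
# pair the result with your preferred HTTP client and an Azure AD token.

def build_create_collection_request(account_name, collection_name,
                                    friendly_name, parent_collection):
    """Return (url, payload) for creating or updating a collection."""
    url = (f"https://{account_name}.purview.azure.com"
           f"/account/collections/{collection_name}"
           "?api-version=2019-11-01-preview")  # assumed api-version
    payload = {
        "friendlyName": friendly_name,
        # The parent is referenced by name; the root collection has the
        # same name as the account.
        "parentCollection": {"referenceName": parent_collection},
    }
    return url, payload

url, payload = build_create_collection_request(
    "contoso-purview", "finance", "Finance", "contoso-purview")
print(url)
```

-The caller must hold the collection admin role on the parent collection, mirroring the portal requirement above.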
-
-### Edit a collection
-
-1. Select **Edit** either from the collection detail page, or from the collection's dropdown menu.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-collection.png" alt-text="Screenshot of Microsoft Purview governance portal window, open to collection window, with the 'edit' button highlighted both in the selected collection window, and under the ellipsis button next to the name of the collection." border="true":::
-
-1. Currently, the collection description and collection admins can be edited. Make any changes, then select **Save** to save your changes.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-description.png" alt-text="Screenshot of Microsoft Purview governance portal window with the edit collection window open, a description added to the collection, and the save button highlighted." border="true":::
-
-### View collections
-
-1. Select the triangle icon beside the collection's name to expand or collapse the collection hierarchy. Select the collection names to navigate.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/subcollection-menu.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the button next to the collection name highlighted." border="true":::
-
-1. Type in the filter box at the top of the list to filter collections.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/filter-collections.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the filter above the collections highlighted." border="true":::
-
-1. Select **Refresh** in the root collection's contextual menu to reload the collection list.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-collections.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the button next to the Resource name selected, and the refresh button highlighted." border="true":::
-
-1. Select **Refresh** on the collection detail page to reload the single collection.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-single-collection.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the refresh button under the collection window highlighted." border="true":::
-
-### Delete a collection
-
-You'll need to be a collection admin in order to delete a collection. If you aren't sure, follow the guide above to check permissions. A collection can be deleted only if no child collections, assets, data sources, or scans are associated with it.
-
-1. Select **Delete** from the collection detail page.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collections.png" alt-text="Screenshot of Microsoft Purview governance portal window to delete a collection" border="true":::
-
-1. Select **Confirm** when prompted, **Are you sure you want to delete this collection?**
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collection-confirmation.png" alt-text="Screenshot of Microsoft Purview governance portal window showing confirmation message to delete a collection" border="true":::
-
-1. Verify deletion of the collection from your Microsoft Purview Data Map.
-
-### Move registered sources between collections
-
-You can move registered sources from one collection to another you have access to. For steps, see the [manage data sources article](manage-data-sources.md#move-sources-between-collections).
-
-## Add roles and restrict access through collections
-
-Since permissions are managed through collections in the Microsoft Purview Data Map, it's important to understand the roles and what permissions they'll give your users. A user granted permissions on a collection has access to sources and assets associated with that collection, and those permissions are inherited by its subcollections. Inheritance [can be restricted](#restrict-inheritance), but is allowed by default.
-
-The following guide will discuss the roles, how to manage them, and permissions inheritance.
-
-### Roles
-
-All assigned roles apply to sources, assets, and other objects within the collection where the role is applied.
-A few of the main roles are:
-
-- **Collection administrator** - a role for users that need to assign roles to other users in the Microsoft Purview governance portal or manage collections. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
-- **Data curators** - a role that provides access to the data catalog to manage assets, configure custom classifications, set up glossary terms, and view data estate insights. Data curators can create, read, modify, move, and delete assets. They can also apply annotations to assets.
-- **Data readers** - a role that provides read-only access to data assets, classifications, classification rules, collections, and glossary terms.
-- **Data source administrator** - a role that allows a user to manage data sources and scans. If a user is granted only the **Data source admin** role on a given data source, they can run new scans using an existing scan rule. To create new scan rules, the user must also be granted either the **Data reader** or **Data curator** role.
-
-> [!IMPORTANT]
-> For a list of all available roles, and more information about roles, see the [permissions documentation](catalog-permissions.md#roles).
-
-### Add role assignments
-
-1. Select the **Role assignments** tab to see all the roles in a collection. Only a collection admin can manage role assignments.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-role-assignments.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the role assignments tab highlighted." border="true":::
-
-1. Select **Edit role assignments** or the person icon to edit each role member.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-role-assignments.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the edit role assignments dropdown list selected." border="true":::
-
-1. Type in the textbox to search for users you want to add as role members. Select **X** to remove members you don't want to add.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-user-permissions.png" alt-text="Screenshot of Microsoft Purview governance portal collection admin window with the search bar highlighted." border="true":::
-
-1. Select **OK** to save your changes, and you'll see the new users reflected in the role assignments list.
-
-### Remove role assignments
-
-1. Select the **X** button next to a user's name to remove a role assignment.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/remove-role-assignment.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the role assignments tab selected, and the x button beside one of the names highlighted." border="true":::
-
-1. Select **Confirm** if you're sure you want to remove the user.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/confirm-remove.png" alt-text="Screenshot of a confirmation pop-up, with the confirm button highlighted." border="true":::
-
-### Restrict inheritance
-
-Collection permissions are inherited automatically from the parent collection. For example, any permissions on the root collection (the collection at the top of the list that has the same name as your account), will be inherited by all collections below it. You can restrict inheritance from a parent collection at any time, using the restrict inherited permissions option.
-
-Once you restrict inheritance, you'll need to add users directly to the restricted collection to grant them access.
-
-1. Navigate to the collection where you want to restrict inheritance and select the **Role assignments** tab.
-1. Select **Restrict inherited permissions** and select **Restrict access** in the popup dialog to remove inherited permissions from this collection and any subcollections. Note that collection admin permissions won't be affected.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/restrict-access-inheritance.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the role assignments tab selected, and the restrict inherited permissions slide button highlighted." border="true":::
-
-1. After restriction, inherited members are removed from the roles, except for collection admin.
-1. Select the **Restrict inherited permissions** toggle button again to revert.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/remove-restriction.png" alt-text="Screenshot of Microsoft Purview governance portal collection window, with the role assignments tab selected, and the unrestrict inherited permissions slide button highlighted." border="true":::
-
-## Register source to a collection
-
-1. Select **Register**, or the register icon on the collection node, to register a data source. Only a data source admin can register sources.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/register-by-collection.png" alt-text="Screenshot of the data map Microsoft Purview governance portal window with the register button highlighted both at the top of the page and under a collection."border="true":::
-
-1. Fill in the data source name and other source information. The bottom of the form lists all the collections where you have scan permission; you can select one collection. All assets under this source will belong to the collection you select.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/register-source.png" alt-text="Screenshot of the source registration window."border="true":::
-
-1. The created data source is placed under the selected collection. Select **View details** to see the data source.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/see-registered-source.png" alt-text="Screenshot of the data map Microsoft Purview governance portal window with the newly added source card highlighted."border="true":::
-
-1. Select **New scan** to create a scan under the data source.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/new-scan.png" alt-text="Screenshot of a source Microsoft Purview governance portal window with the new scan button highlighted."border="true":::
-
-1. Similarly, at the bottom of the form, you can select a collection, and all assets scanned will be included in the collection.
-The collections listed here are restricted to subcollections of the data source collection.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/scan-under-collection.png" alt-text="Screenshot of a new scan window with the collection dropdown highlighted."border="true":::
-
-1. Back in the collection window, you'll see the data sources linked to the collection on the sources card.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/source-under-collection.png" alt-text="Screenshot of the data map Microsoft Purview governance portal window with the newly added source card highlighted in the map."border="true":::
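-Source registration can also be scripted. The sketch below shapes a request body in the style of the Purview Scanning REST API (`PUT /scan/datasources/{name}`); the `kind` values and the `collection.referenceName` property are assumptions drawn from the preview scanning API, so check the current reference before relying on them.

```python
# Sketch (assumed preview scanning API): body for registering a data source
# under a collection. The "kind" values and collection reference shape are
# assumptions -- verify against the current API version.

def build_register_source_body(kind, resource_id, collection_name):
    """Return the JSON body for registering a data source."""
    return {
        "kind": kind,  # e.g. "AdlsGen2" or "AzureSqlDatabase" (assumed values)
        "properties": {
            "resourceId": resource_id,
            # Assets discovered under this source land in this collection.
            "collection": {
                "referenceName": collection_name,
                "type": "CollectionReference",
            },
        },
    }

body = build_register_source_body(
    "AdlsGen2",
    "/subscriptions/<sub-id>/resourceGroups/rg/providers/"
    "Microsoft.Storage/storageAccounts/contosolake",
    "finance")
print(body["properties"]["collection"]["referenceName"])
```

-As in the portal flow, the collection you reference determines which collection the scanned assets belong to.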
-
-## Add assets to collections
-
-Assets and sources are also associated with collections. If a scan was associated with a collection, the assets it discovers are automatically added to that collection. Assets can also be manually moved to any subcollection.
-
-1. Check the collection information in asset details. You can find collection information in the **Collection path** section in the top-right corner of the asset details page.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/collection-path.png" alt-text="Screenshot of Microsoft Purview governance portal asset window, with the collection path highlighted." border="true":::
-
-1. Permissions in asset details page:
- 1. Check the collection-based permission model by following the [add roles and restricting access on collections guide above](#add-roles-and-restrict-access-through-collections).
- 1. If you don't have read permission on a collection, the assets under that collection won't be listed in search results. If you get the direct URL of an asset and open it, you'll see a no-access page. Contact your collection admin to grant you access. You can select the **Refresh** button to check the permission again.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/no-access.png" alt-text="Screenshot of Microsoft Purview governance portal asset window where the user has no permissions, and has no access to information or options." border="true":::
-
- 1. If you have read permission on a collection but don't have write permission, you can browse the asset details page, but the following operations are disabled:
- * Edit the asset. The **Edit** button will be disabled.
- * Delete the asset. The **Delete** button will be disabled.
- * Move the asset to another collection. The ellipsis button in the top-right corner of the **Collection path** section will be hidden.
- 1. The assets in **Hierarchy** section are also affected by permissions. Assets without read permission will be grayed.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/hierarchy-permissions.png" alt-text="Screenshot of Microsoft Purview governance portal hierarchy window where the user has only read permissions, and has no access to options." border="true":::
-
-### Move asset to another collection
-
-1. Select the ellipsis button in the top-right corner of the **Collection path** section.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/move-asset.png" alt-text="Screenshot of Microsoft Purview governance portal asset window with the collection path highlighted and the ellipsis button next to collection path selected." border="true":::
-
-1. Select the **Move to another collection** button.
-1. In the right side panel, choose the target collection you want to move the asset to. You can only see the collections where you have write permissions. The asset can also only be added to the subcollections of the data source collection.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/move-select-collection.png" alt-text="Screenshot of Microsoft Purview governance portal pop-up window with the select a collection dropdown menu highlighted." border="true":::
-
-1. Select the **Move** button at the bottom of the window to move the asset.
-
-## Search and browse by collections
-
-### Search by collection
-
-1. In the Microsoft Purview governance portal, the search bar is located at the top of the portal window.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/purview-search-bar.png" alt-text="Screenshot showing the location of the Microsoft Purview governance portal search bar." border="true":::
-
-1. When you select the search bar, you can see your recent search history and recently accessed assets. Select **View all** to see all of the recently viewed assets.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-no-keywords.png" alt-text="Screenshot showing the search bar before any keywords have been entered" border="true":::
-
-1. Enter keywords that help identify your asset, such as its name, data type, classifications, and glossary terms. As you enter keywords relating to your desired asset, the Microsoft Purview governance portal displays suggestions on what to search and potential asset matches. To complete your search, select **View search results** or press **Enter**.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-keywords.png" alt-text="Screenshot showing the search bar as a user enters in keywords" border="true":::
-
-1. The search results page shows a list of assets that match the keywords provided in order of relevance. There are various factors that can affect the relevance score of an asset. You can filter down the list more by selecting specific collections, data stores, classifications, contacts, labels, and glossary terms that apply to the asset you're looking for.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-results.png" alt-text="Screenshot showing the results of a search" border="true":::
-
-1. Select your desired asset to view the asset details page where you can view properties including schema, lineage, and asset owners.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-by-collection.png" alt-text="Screenshot showing search results with collections." border="true":::
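-The same collection-scoped search is available programmatically. This sketch builds a query body in the general shape accepted by the Purview search query API (`POST /catalog/api/search/query`); the `collectionId` filter key is an assumption from the preview API, so confirm it against the current search reference.

```python
# Sketch (assumed preview search API): query body that scopes results to a
# single collection. The "collectionId" filter key is an assumption.

def build_collection_search_query(keywords, collection_name, limit=25):
    """Return a search body limited to one collection."""
    return {
        "keywords": keywords,
        "limit": limit,
        # Results are restricted to assets in the named collection;
        # assets in collections you can't read are never returned.
        "filter": {"collectionId": collection_name},
    }

query = build_collection_search_query("customer", "finance")
print(query["filter"])
```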
-
-### Browse by collection
-
-1. You can browse data assets by selecting **Browse assets** on the homepage.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/browse-by-collection.png" alt-text="Screenshot of the catalog Microsoft Purview governance portal window with the browse assets button highlighted." border="true":::
-
-1. On the **Browse assets** page, select the **By collection** pivot. Collections are listed in a hierarchical table view. To further explore assets in each collection, select the corresponding collection name.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/by-collection-view.png" alt-text="Screenshot of the asset Microsoft Purview governance portal window with the by collection tab selected."border="true":::
-
-1. On the next page, the assets under the selected collection are shown as search results. You can narrow the results by selecting the facet filters, or see the assets under other collections by selecting the sub- or related-collection names.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-results-by-collection.png" alt-text="Screenshot of the catalog Microsoft Purview governance portal window with the by collection tab selected."border="true":::
-
-1. To view the details of an asset, select the asset name in the search result. Or you can check the assets and bulk edit them.
-
- :::image type="content" source="./media/how-to-create-and-manage-collections/view-asset-details.png" alt-text="Screenshot of the catalog Microsoft Purview governance portal window with the by collection tab selected and asset check boxes highlighted."border="true":::
-
-## Next steps
-
-Now that you have a collection, you can follow the guides below to add resources and scan.
-
-* [Manage data sources](manage-data-sources.md)
-
-* [Supported data sources](azure-purview-connector-overview.md)
-
-* [Scan and ingestion](concept-scans-and-ingestion.md)
purview How To Create Manage Glossary Term https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-create-manage-glossary-term.md
- Title: Create and manage glossary terms
-description: Learn how to create and manage business glossary terms in Microsoft Purview.
- Previously updated: 11/14/2022
-# Create and manage glossary terms
-
-This article describes how to work with the business glossary in Microsoft Purview. It provides steps to create a business glossary term in the Microsoft Purview Data Catalog. It also shows you how to import and export glossary terms by using .CSV files, and how to delete terms that you no longer need.
-
-## Create a term
-
-To create a glossary term, follow these steps:
-
-1. On the home page, select **Data catalog** on the left pane, and then select the **Manage glossary** button in the center of the page.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/find-glossary.png" alt-text="Screenshot of the data catalog with the button for managing a glossary highlighted." border="true":::
-
-1. On the **Business glossary** page, select the glossary you would like to create the new term for, then select **+ New term**. A term can only be added to one glossary at a time.
-
- > [!NOTE]
- > Each glossary supports a maximum of 100,000 terms. For more information about creating and managing glossaries, see the [manage glossaries page](how-to-create-manage-glossary.md).
-
- A pane opens with the **System default** template selected. Choose the template, or templates, that you want to use to create a glossary term, and then select **Continue**.
- Selecting multiple templates will allow you to use the custom attributes from those templates.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/new-term-with-default-template.png" alt-text="Screenshot of the button and pane for creating a new term." border="true":::
-
-1. If you selected multiple templates, you can select and deselect templates from the **Term template** dropdown at the top of the page.
-
-1. Give your new term a name, which must be unique in the catalog.
-
- > [!NOTE]
- > Term names are case-sensitive. For example, **Sample** and **sample** could both exist in the same glossary.
-
-1. For **Definition**, add a definition for the term.
-
- Microsoft Purview enables you to add rich formatting to term definitions. For example, you can add bold, underline, or italic formatting to text. You can also create tables, bulleted lists, or hyperlinks to external resources.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/rich-text-editor.png" alt-text="Screenshot that shows the rich text editor.":::
-
- Here are the options for rich text formatting:
-
- | Name | Description | Keyboard shortcut |
- | - | -- | |
- | Bold | Make your text bold. Adding the asterisk (*) character around text will also make it bold. | Ctrl+B |
- | Italic | Make your text italic. Adding the underscore (_) character around text will also make it italic. | Ctrl+I |
- | Underline | Underline your text. | Ctrl+U |
- | Bullets | Create a bulleted list. Adding the hyphen (-) character before text will also create a bulleted list. | |
- | Numbering | Create a numbered list. Adding the 1 character before text will also create a numbered list. | |
- | Heading | Add a formatted heading. | |
- | Font size | Change the size of your text. The default size is 12. | |
- | Decrease indent | Move your paragraph closer to the margin. | |
- | Increase indent | Move your paragraph farther away from the margin. | |
- | Add hyperlink | Create a link for quick access to webpages and files. | |
- | Remove hyperlink | Change a link to plain text. | |
- | Quote | Add quote text. | |
- | Add table | Add a table to your content. | |
- | Edit table | Insert or delete a column or row from a table. | |
- | Clear formatting | Remove all formatting from a selection of text. | |
- | Undo | Undo changes that you made to the content. | Ctrl+Z |
- | Redo | Redo changes that you made to the content. | Ctrl+Y |
-
- > [!NOTE]
- > Updating a definition with the rich text editor adds the attribute `"microsoft_isDescriptionRichText": "true"` in the term payload. This attribute isn't visible in the user experience and is automatically populated when you take any rich text action. The rich text definition is populated in the following snippet of a term's JSON message:
- >
- >```json
- > {
- > "additionalAttributes": {
- > "microsoft_isDescriptionRichText": "true"
- > }
- > }
- >```
-
-1. For **Status**, select the status for the term. New terms default to **Draft**.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/overview-tab.png" alt-text="Screenshot of the status choices.":::
-
- Status markers are metadata associated with the term. Currently, you can set the following status on each term:
-
- - **Draft**: This term isn't yet officially implemented.
- - **Approved**: This term is officially approved.
- - **Expired**: This term should no longer be used.
- - **Alert**: This term needs attention.
-
- > [!Important]
- > If an approval workflow is enabled on the term hierarchy, a new term will go through the approval process when it's created. The term is stored in the catalog only when it's approved. To learn about how to manage approval workflows for a business glossary, see [Approval workflow for business terms](how-to-workflow-business-terms-approval.md).
-
-1. Add **Resources** and **Acronym** information. If the term is part of a hierarchy, you can add parent terms at **Parent** on the **Overview** tab.
-
-1. To establish relationships with other terms, add **Synonyms** and **Related terms** information on the **Related** tab, and then select **Apply**.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/related-tab.png" alt-text="Screenshot of tab for related terms and the box for adding synonyms." border="true":::
--
-1. Optionally, select the **Contacts** tab to add experts and stewards to your term.
-
-1. Select **Create** to create your term.
-
- > [!Important]
- > If an approval workflow is enabled on the term's hierarchy path, you'll see **Submit for approval** instead of the **Create** button. Selecting **Submit for approval** will trigger the approval workflow for this term.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/submit-for-approval.png" alt-text="Screenshot of the button to submit a term for approval." border="true":::
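-Terms can also be created through the Atlas-style catalog API (`POST /catalog/api/atlas/v2/glossary/term`). The field names in the sketch below follow the Apache Atlas glossary model; the `glossaryGuid` is a placeholder you must look up first, and the exact shape should be confirmed against the current API reference.

```python
# Sketch (Atlas-style glossary API, shape assumed): minimal body for
# creating a glossary term. Term names must be unique in the catalog and
# are case-sensitive, as noted above.

def build_glossary_term(name, definition, glossary_guid, status="Draft"):
    """Return a minimal create-term body anchored to a glossary."""
    return {
        "name": name,
        "longDescription": definition,
        "status": status,  # Draft | Approved | Expired | Alert
        # Every term is anchored to exactly one glossary by GUID.
        "anchor": {"glossaryGuid": glossary_guid},
    }

term = build_glossary_term(
    "Revenue", "Income generated from normal business operations.",
    "<glossary-guid>")
print(term["status"])
```

-If an approval workflow is enabled on the glossary, API-created terms are subject to the same approval process as terms created in the portal.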
-
-## Delete terms
-
-1. On the home page, select **Data catalog** on the left pane, and then select the **Manage glossary** button in the center of the page.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/find-glossary.png" alt-text="Screenshot of the data catalog and the button for managing a glossary." border="true":::
-
-1. Select the glossary that has the terms you want to delete, and select the **Terms** tab.
-
-1. Select checkboxes for the terms that you want to delete. You can select a single term or multiple terms for deletion.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/select-terms.png" alt-text="Screenshot of the glossary with a few terms selected." border="true":::
-
-1. Select the **Delete** button on the top menu.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/select-delete.png" alt-text="Screenshot of the glossary with the Delete button highlighted on the top menu." border="true":::
-
-1. A new window shows all the terms selected for deletion. In the following example, the list of terms to be deleted includes the parent term **Revenue** and its two child terms.
-
- > [!NOTE]
- > If a parent is selected for deletion, all the children for that parent are automatically selected for deletion.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/delete-window.png" alt-text="Screenshot of the window for deleting glossary terms, with a list of all terms to be deleted." border="true":::
-
- Review the list. You can remove the terms that you don't want to delete by selecting **Remove**.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/select-remove.png" alt-text="Screenshot of the window for deleting glossary terms, with the column for removing items from the list of terms to be deleted." border="true":::
-
-1. The **Approval needed** column shows which terms require an approval process. If the value is **Yes**, the term will go through an approval workflow before deletion. If the value is **No**, the term will be deleted without any approvals.
-
- > [!NOTE]
- > If a parent has an associated approval process but its child doesn't, the workflow for deleting the parent term will be triggered. This is because the selection is done on the parent, and you're acknowledging the deletion of the child terms along with the parent.
-
- If at least one term needs to be approved, **Submit for approval** and **Cancel** buttons appear. Selecting **Submit for approval** will delete all the terms where approval isn't needed and will trigger approval workflows for terms that require it.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/yes-approval-needed.png" alt-text="Screenshot of the window for deleting glossary terms, which shows terms that need approval and includes the button for submitting them for approval." border="true":::
-
- If no terms need to be approved, **Delete** and **Cancel** buttons appear. Selecting **Delete** will delete all the selected terms.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/no-approval-needed.png" alt-text="Screenshot of the window for deleting glossary terms, which shows terms that don't need approval and the button for deleting them." border="true":::
-
-## Business terms with approval workflow enabled
-
-If [workflows](concept-workflow.md) are enabled on a term, then any create, update, or delete actions for the term will go through an approval before they're saved in the data catalog.
-
-- **New terms**: When a create approval workflow is enabled on a parent term, you see **Submit for approval** instead of **Create** after you enter all the details in the creation process. Selecting **Submit for approval** triggers the workflow. You'll get a notification when your request is approved or rejected.
-
-- **Updates to existing terms**: When an update approval workflow is enabled on a parent term, you see **Submit for approval** instead of **Save** when you're updating the term. Selecting **Submit for approval** triggers the workflow. The changes won't be saved in the catalog until all the approvals are met.
-
-- **Deletion**: When a delete approval workflow is enabled on the parent term, you see **Submit for approval** instead of **Delete** when you're deleting the term. Selecting **Submit for approval** triggers the workflow. However, the term won't be deleted from the catalog until all the approvals are met.
-
-- **Importing terms**: When an import approval workflow is enabled for the Microsoft Purview glossary, you see **Submit for approval** instead of **OK** in the **Import** window when you're importing terms via a .CSV file. Selecting **Submit for approval** triggers the workflow. However, the terms in the file won't be updated in the catalog until all the approvals are met.
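For reference, a minimal glossary import file might look like the following sketch. The column names used here (`Name`, `Definition`, `Status`, `Parent Term Name`) are assumptions for illustration only; export the CSV template from your own Microsoft Purview glossary to get the exact headers the import expects.

```python
import csv
import io

# Hypothetical column headers -- check the template exported from your
# own Microsoft Purview instance for the authoritative list.
FIELDS = ["Name", "Definition", "Status", "Parent Term Name"]

rows = [
    {"Name": "Revenue", "Definition": "Total income from sales.",
     "Status": "Draft", "Parent Term Name": ""},
    {"Name": "Net Revenue", "Definition": "Revenue minus returns.",
     "Status": "Draft", "Parent Term Name": "Revenue"},
]

# Build the CSV in memory.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
csv_text = buffer.getvalue()

# Reading it back shows the parent/child relationship the import preserves.
parsed = list(csv.DictReader(io.StringIO(csv_text)))
children = [r["Name"] for r in parsed if r["Parent Term Name"] == "Revenue"]
print(children)
```

If an import approval workflow is enabled, submitting a file like this triggers the workflow described above instead of updating the catalog directly.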
-## Next steps
-
-* For more information about glossary terms, see the [glossary reference](reference-azure-purview-glossary.md).
-* For more information about approval workflows of the business glossary, see [Approval workflow for business terms](how-to-workflow-business-terms-approval.md).
purview How To Create Manage Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-create-manage-glossary.md
- Title: Create and manage glossaries in Microsoft Purview
-description: Learn how to create and manage business glossaries in Microsoft Purview.
-Previously updated: 11/14/2022
-# Create and manage business glossaries
-
-In Microsoft Purview, you can create multiple business glossaries to support separate glossaries for any context in your business. This article provides the steps to create and manage business glossaries in the Microsoft Purview Data Catalog.
-
-## Create a new glossary
-
-To create a business glossary, follow these steps:
-
-1. On the home page, select **Data catalog** on the left pane, and then select the **Manage glossary** button in the center of the page.
-
- :::image type="content" source="media/how-to-create-manage-glossary/find-glossary.png" alt-text="Screenshot of the data catalog with the button for managing a glossary highlighted." border="true":::
-
-1. On the **Business glossary** page, select **+ New glossary**.
-
- :::image type="content" source="media/how-to-create-manage-glossary/select-new-glossary.png" alt-text="Screenshot of the button and pane for creating a new glossary." border="true":::
-
-1. Give your glossary a **Name** and a **Description**.
-
-1. You'll need to select at least one **Steward**, an Azure Active Directory user or group who will manage the glossary.
-
-1. You'll also need to select at least one **Expert**, an Azure Active Directory user or group who can be contacted for more information about the glossary.
-
-1. You can give additional information about the stewards or experts here, and when you're finished, select **Create**.
-
- :::image type="content" source="media/how-to-create-manage-glossary/create-new-glossary.png" alt-text="Screenshot that shows the new glossary template with all sections filled out.":::
-
-1. Once your new glossary is created, you'll be able to see it in the list of glossaries. You can switch between glossaries by selecting their names.
-
- :::image type="content" source="media/how-to-create-manage-glossary/glossary-created.png" alt-text="Screenshot showing the business glossary page, with the new glossary highlighted and selected.":::
-
-## Manage or delete a glossary
-
-1. To edit or delete a glossary, hover over the glossary and select the ellipsis button next to the glossary's name.
-
- :::image type="content" source="media/how-to-create-manage-glossary/edit-or-delete-glossary.png" alt-text="Screenshot showing the business glossary page, with a glossary highlighted, showing the ellipsis button selected and the edit and delete options pop up available.":::
-
-1. If you select **Edit glossary**, you can edit the description and the steward or the expert, but at this time you can't change the glossary name. Select **Save** to save the changes.
-
- :::image type="content" source="media/how-to-create-manage-glossary/edit-glossary.png" alt-text="Screenshot of edit glossary page, with all values filled and the save button highlighted.":::
-
-1. If you select **Delete glossary**, you'll be asked to confirm the deletion. All terms associated with the glossary will be deleted if you delete the glossary. Select **Delete** again to delete the glossary.
-
- :::image type="content" source="media/how-to-create-manage-glossary/delete-glossary.png" alt-text="Screenshot of the delete glossary window.":::
-
-## Next steps
-
-* For more information about creating and managing glossary terms, see the [glossary terms article](how-to-create-import-export-glossary.md).
purview How To Data Share Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-data-share-faq.md
- Title: Microsoft Purview Data Sharing FAQ
-description: Microsoft Purview Data Sharing frequently asked questions (FAQ) and answers.
-Previously updated: 02/16/2023
-# FAQ: Azure Storage in-place data share with Microsoft Purview Data Sharing (preview)
-
-Here are some frequently asked questions for Microsoft Purview Data Sharing.
-
-## What are the key terms related to data sharing?
-
-* **Data Provider** - Organization that shares data.
-* **Data Consumer** - Organization that receives shared data from a data provider.
-* **Asset** - For storage in-place sharing, an asset is a storage account, and the list of files and folders you want to share from the storage account.
-* **Share** - A share is a set of data that can be shared from provider to consumer. It's a set of assets. You can have one asset with files/folders from one storage account, and another asset with files/folders from a different storage account.
-* **Collection** - A [collection](catalog-permissions.md) is a tool Microsoft Purview uses to group assets, sources, shares, and other artifacts into a hierarchy for discoverability and to manage access control. A root collection is created automatically when you create your Microsoft Purview account and you're granted all the roles to the root collection. You can use the root collection (default) or create child collections for data sharing.
-* **Recipient** - A recipient is a user or service principal to which the share is sent.
-
-## Can I use the API or SDK for storage in-place sharing?
-
-Yes, you can use the [REST API](/rest/api/purview/) or the [.NET SDK](/dotnet/api/overview/azure/purview) to share data programmatically.
-
-We have a [guide for getting started with the .NET SDK](quickstart-data-share-dotnet.md).
-
-## What are the roles and permissions required to share data or receive shares?
-
-| **Operations** | **Roles and Permissions** |
-|||
-|**Data provider**: create share, add asset and recipients, revoke access | **Microsoft Purview collection role**: minimum of Data Reader to use the Microsoft Purview compliance portal experience, none to use API or SDK |
-| |**Storage account role** checked when adding and updating asset: Owner or Storage Blob Data Owner |
-| |**Storage account permissions** checked when adding and updating asset: Microsoft.Authorization/roleAssignments/write OR Microsoft.Storage/storageAccounts/blobServices/containers/blobs/modifyPermissions/|
-|**Data consumer**: Receive share, attach share, delete share |**Microsoft Purview collection role**: minimum of Data Reader to use the Microsoft Purview compliance portal experience, none to use API or SDK |
-| |**Storage account role** checked when attaching share: Contributor OR Owner OR Storage Blob Data Contributor OR Storage Blob Data Owner |
-| |**Storage account permissions** checked when attaching share: Microsoft.Storage/storageAccounts/write OR Microsoft.Storage/storageAccounts/blobServices/containers/write|
-|**Data consumer**: Access shared data| No share-specific role required. You can access shared data with regular storage account permission just like any other data. Data consumer's ability to apply ACLs for shared data is currently not supported.|
-
-## How can I share data from containers?
-
-When adding assets, you can select the container(s) that you would like to share.
-
-## Can I share data in-place with a storage account in a different Azure region?
-
-Cross-region in-place data sharing isn't currently supported for storage accounts. The data provider's and the data consumer's storage accounts need to be in the same Azure region.
-
-## Is there support for read-write shares?
-
-Storage in-place sharing supports read-only shares. The data consumer can't write to the shared data.
-
-To share data back to the data provider, the data consumer can create a share and share it with the data provider.
-
-## Can I access shared data from analytics tools like Azure Synapse?
-
-You can access shared data from storage clients like Azure Synapse Analytics Spark and Databricks. You won't be able to access shared data using Azure Data Factory, Power BI, or AzCopy.
-
-## Does the recipient of the share need to be a user's email address or can I share data with an application?
-
-Through the UI, you can share data with a recipient's Azure sign-in email or by using a service principal's object ID and tenant ID.
-
-Through the API and SDK, you can also [send an invitation to the object ID of a user principal or service principal](quickstart-data-share-dotnet.md#send-invitation-to-a-service). You can also optionally specify a tenant ID that you want the share to be received into.
-
-## Is the recipient accepting the share only for themselves?
-
-When the recipient attaches the share to a target storage account, any user or application that has access to the target storage account will be able to access shared data.
-
-## If the recipient leaves the organization, what happens to the received share?
-
-Once the received share is accepted and attached to a target storage account, any users with appropriate permissions to the target storage account can continue to access the shared data even after the recipient has left the organization.
-
-Once the received share is accepted, any user with data reader permission to the Microsoft Purview collection that the share is received into can view and update the received share.
-
-## How do I request an increase in limits for the number of shares?
-
-A data provider's source storage account can support up to 20 targets, and a data consumer's target storage account can support up to 100 sources. To request a limit increase, [contact support](https://azure.microsoft.com/support/create-ticket/).
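As a rough sketch, the fan-out limits above could be checked ahead of time in client code. The constants below simply restate the limits documented in this FAQ; the service itself enforces them, and they may change, so treat this as illustration rather than a contract.

```python
# Documented limits from this FAQ (subject to change; the service enforces
# the real values): 20 targets per source storage account, 100 sources per
# target storage account.
MAX_TARGETS_PER_SOURCE = 20
MAX_SOURCES_PER_TARGET = 100

def within_share_limits(target_count: int, source_count: int) -> bool:
    """Return True if the planned share fan-out stays under both limits."""
    return (target_count <= MAX_TARGETS_PER_SOURCE
            and source_count <= MAX_SOURCES_PER_TARGET)

print(within_share_limits(5, 10))   # a small fan-out, within both limits
print(within_share_limits(25, 10))  # too many targets for one source account
```

If a planned deployment exceeds either number, that's the point at which to contact support for a limit increase.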
-
-## How do I troubleshoot data sharing issues?
-
-To troubleshoot issues with sharing data, refer to the [troubleshooting section of the how to share data article](how-to-share-data.md#troubleshoot). To troubleshoot issues with receiving share, refer to the [troubleshooting section of the how to receive share article](how-to-receive-share.md#troubleshoot).
-
-## Is there support for private endpoints, VNets, and IP restrictions?
-
-Private endpoints, VNets, and IP restrictions are supported for data sharing for storage. Choose Blob as the target subresource when you create a private endpoint for storage accounts.
-
-## Next steps
-
-* [Data sharing quickstart](quickstart-data-share.md)
-* [How to Share data](how-to-share-data.md)
-* [How to receive a share](how-to-receive-share.md)
-* [REST API reference](/rest/api/purview/)
purview How To Delete Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-delete-self-service-data-access-policy.md
- Title: Delete self-service policies
-description: This article describes how to delete auto-generated self-service policies.
-Previously updated: 02/27/2023
-# How to delete self-service data access policies
-
-In a Microsoft Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [Data Use Management enabled](how-to-enable-data-use-management.md), a self-service policy is generated when a data access request is approved.
-
-This article describes how to delete self-service data access policies that have been auto-generated by approved access requests.
-
-## Prerequisites
-
-> [!IMPORTANT]
-> To delete self-service policies, make sure the following prerequisites are completed.
-
-Self-service policies must exist to be deleted. To enable and create self-service policies, follow these articles:
-
-1. [Enable Data Use Management](how-to-enable-data-use-management.md) - this will allow Microsoft Purview to create policies for your sources.
-1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Microsoft Purview](how-to-request-access.md).
-1. [Approve a self-service data access request](how-to-workflow-manage-requests-approvals.md#approvals) - after approving a request, if your workflow from the previous step includes the ability to create a self-service data policy, your policy will be created and will be viewable.
-
-## Permission
-
-Only users with **Policy Admin** privilege can delete self-service data access policies.
-
-## Steps to delete self-service data access policies
-
-The Microsoft Purview governance portal can be launched from the Azure portal or by using the [URL directly](https://web.purview.azure.com/resource/).
-1. Select the policy management tab to open the self-service access policies.
-
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Microsoft Purview governance portal with the leftmost menu open, and the Data policy page option highlighted.":::
-
-1. Open the self-service access policies tab.
-
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Microsoft Purview governance portal open to the Data policy page with self-service access policies highlighted.":::
-
-1. Here you'll see all your policies. Select the policies that need to be deleted. The policies can be sorted and filtered by any of the displayed columns to improve your search.
-
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-selecting-policy-pic-4.png" alt-text="Screenshot showing the self-service access policies page with one policy selected.":::
-
-1. Select the **Delete** button to delete all selected policies.
-
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-press-delete-pic-5.png" alt-text="Screenshot showing a self-service access policy selected, with the Delete button at the top of the page highlighted.":::
-
-1. Select **OK** on the confirmation dialog box to delete the policy. Refresh the screen to confirm that the policies have been deleted.
-
-## Next steps
-
-- [Self-service data access policy](./concept-self-service-data-access-policy.md)
purview How To Deploy Profisee Purview Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-deploy-profisee-purview-integration.md
- Title: Deploy Microsoft Purview - Profisee integration for master data management (MDM)
-description: This guide describes how to deploy the Microsoft Purview-Profisee better together integration as a master data management (MDM) solution for your data estate governance.
-Previously updated: 08/18/2022
-# Microsoft Purview - Profisee MDM Integration
-
-Master data management (MDM) is a key pillar of any unified data governance solution. Microsoft Purview supports master data management with our partner [Profisee](https://profisee.com/solutions/microsoft-enterprise/azure/). This tutorial compiles reference and integration deployment materials in one place: first, to put Microsoft Purview unified data governance and MDM in the context of an Azure data estate, and second, to get you started on your MDM journey with Microsoft Purview through our integration with Profisee.
-
-## Why are data governance and master data management (MDM) essential to the modern data estate?
-
-All organizations have multiple data sources, and the larger the organization the greater the number of data sources. Typically, there will be ERPs, CRMs, Legacy applications, regional versions of each of these, external data feeds and so on. Most of these businesses move massive amounts of data between applications, storage systems, analytics systems, and across departments within their organization. During these movements, and over time, data can get duplicated or become fragmented, and become stale or out of date. Hence, accuracy becomes a concern when using this data to drive insights into your business.
-
-Inevitably, data that was created in different ‘silos’ with different (or no) governance standards to meet the needs of their respective applications will always have issues. When you look at the data drawn from each of these applications, you'll see that it's inconsistent in terms of standardization. Often, there are numerous inconsistencies in the values themselves, and individual records are frequently incomplete. In fact, it would be surprising if these inconsistencies weren't the case – but they do present a problem. What is needed is data that is complete, consistent, and accurate.
-
-To protect the quality of data within an organization, master data management (MDM) arose as a discipline that creates a source of truth for enterprise data so that an organization can check and validate their key assets. These key assets, or master data assets, are critical records that provide context for a business. For example, master data might include information on specific products, employees, customers, financial structures, suppliers, or locations. Master data management ensures data quality across an entire organization by maintaining an authoritative consolidated de-duplicated set of the master data records, and ensuring data remains consistent across your organization's complete data estate.
-
-As an example, it can be difficult for a company to have a clear, single view of their customers. Customer data may differ between systems, there may be duplicated records due to incorrect entry, or shipping and customer service systems may vary due to name, address, or other attributes. Master data management consolidates and standardizes all this differing information about the customer. This standardization process may involve automatic or user-defined rules, validations and checks. It's the job of the MDM system to ensure your data remains consistent within the framework of these rules over time. Not only does this improve quality of data by eliminating mismatched data across departments, but it ensures that data analyzed for business intelligence (BI) and other applications is trustworthy and up to date, reduces data load by removing duplicate records across the organization, and streamlines communications between business systems.
-
-The ability to consolidate data from multiple disparate systems is key if we want to use the data to drive business insights and operational efficiencies – or any form of ‘digital transformation’. What we need in that case is high-quality, trusted data that is ready to use, whether it's being consumed in basic enterprise metrics or advanced AI algorithms. Bridging this gap is the job of data governance and MDM, and in the Azure world that means [Microsoft Purview](https://azure.microsoft.com/services/purview/) and [Profisee MDM](https://profisee.com/platform).
-While governance systems can *define* data standards, MDM is where they're *enforced*. Data from different systems can be matched and merged, validated against data quality and governance standards, and remediated where required. Then the new corrected and validated ‘master’ data can be shared to downstream analytics systems and then back into source systems to drive operational improvements. By properly creating and maintaining enterprise master data, we ensure that data is no longer a liability and cause for concern, but an asset of the business that enables improved operation and innovation.
-
-For more details, see [Profisee MDM](https://profisee.com/master-data-management-what-why-how-who/) and [Profisee-Purview MDM Concepts and Azure Architecture](/azure/architecture/reference-architectures/data/profisee-master-data-management-purview).
-
-## Microsoft Purview & Profisee MDM - Better Together!
-
-Microsoft Purview and Profisee MDM are often discussed as being a ‘Better Together’ value proposition due to the complementary nature of the solutions. Microsoft Purview excels at cataloging data sources and defining data standards, while Profisee MDM enforces those standards across master data drawn from multiple siloed sources. It's clear not only that either system has independent value to offer, but also that each reinforces the other for a natural ‘Better Together’ synergy that goes deeper than the independent offerings.
 - Common technical foundation – Profisee was born out of Microsoft technologies using common tools, databases, and infrastructure, so any ‘Microsoft shop’ will find the Profisee solution familiar. In fact, for many years Profisee MDM was built on Microsoft Master Data Services (MDS), and now that MDS is nearing end of life, Profisee is the premier upgrade/replacement solution for MDS.
 - Developer collaboration and joint development – Profisee and Microsoft Purview developers have collaborated extensively to ensure a good complementary fit between their respective solutions to deliver a seamless integration that meets the needs of their customers.
 - Joint sales and deployments – Profisee has more MDM deployments on Azure, and jointly with Microsoft Purview, than any other MDM vendor, and can be purchased through Azure Marketplace. In FY2023, Profisee is the only MDM vendor with a Top Tier Microsoft partner certification available as an IaaS/CaaS or SaaS offering through Azure Marketplace.
 - Rapid and reliable deployment – Rapid and reliable deployment is critical for any enterprise software, and Gartner points out that Profisee has more implementations taking under 90 days than any other MDM vendor.
 - Inherently multi-domain – Profisee offers a multi-domain approach to MDM where there are no limitations to the number or specificity of master data domains. This design aligns well with customers looking to modernize their data estate who may start with a limited number of domains but ultimately will benefit from maximizing domain coverage (matched to their data governance coverage) across their whole data estate.
 - Engineered for Azure – Profisee has been engineered to be cloud-native, with options for both SaaS and managed IaaS/CaaS deployments on Azure (see the next section).
-
- >[!TIP]
 > **Leverage Profisee's [MDS Migration Utility](https://profisee.com/solutions/microsoft-enterprise/master-data-services/) to upgrade from Microsoft MDS (Master Data Services) in a single click**
-
-## Profisee MDM: Deployment Flexibility – Turnkey SaaS Experience or IaaS/CaaS Flexibility
-Profisee MDM has been engineered for a cloud-native experience and may be deployed on Azure in two ways – SaaS and an Azure IaaS/CaaS/Kubernetes cluster.
-
-### Turnkey SaaS Experience
-A fully managed instance of Profisee MDM hosted by Profisee in the Azure cloud. Full turn-key service for the easiest and fastest MDM deployment. Profisee MDM SaaS can be purchased on [Azure Marketplace Profisee MDM - SaaS](https://portal.azure.com/#view/Microsoft_Azure_Marketplace/GalleryItemDetailsBladeNopdl/id/profisee.profisee_saas_private/product~/).
-- **Platform and Management in one** – Use a true, end-to-end SaaS platform with one agreement and no third parties.
-- **Industry-leading Cloud service** – Hosted on Azure for industry-leading scalability and availability.
-- **The fastest path to Trusted Data** – Deploy in minutes with minimal technical knowledge, and leave the networking, firewalls, and storage to us.
-
-### Ultimate IaaS/CaaS Flexibility
-Complete deployment flexibility and control, using the most efficient and low-maintenance option on the [Microsoft Azure](https://azure.microsoft.com/) Kubernetes Service, functioning as a customer-hosted, fully managed IaaS/CaaS (container-as-a-service) deployment. The section below on "Microsoft Purview - Profisee integration deployment on Azure Kubernetes Service (AKS)" describes this deployment route in detail.
-- **Modern Cloud Architecture** - Platform available as a containerized Kubernetes service.
-- **Complete Flexibility & Autonomy** - Available in Azure, AWS, Google Cloud, or on-premises.
-- **Fast to Deploy, Easy to Maintain** - 100% containerized configuration streamlines patches and upgrades.
-
-For more details, see [Profisee MDM Benefits On Modern Cloud Architecture](https://profisee.com/our-technology/modern-cloud-architecture/) and [Profisee MDM on Azure](https://profisee.com/solutions/microsoft-enterprise/azure/), and why it fits best with [Microsoft Azure](https://azure.microsoft.com/) deployments.
-
-## Microsoft Purview - Profisee Reference Architecture
-
-The reference architecture shows how both Microsoft Purview and Profisee MDM work together to provide a foundation of high-quality, trusted data for the Azure data estate. It's also available as a short video walk-through.
-
-**Video: [Profisee Reference Architecture: MDM and Governance for Azure](https://profisee.wistia.com/medias/k72zte2wbr)**
-1. Scan and classify metadata from LOB systems – uses pre-built Microsoft Purview connectors to scan data sources and populate the Microsoft Purview Data Catalog.
-2. Publish master data model to Microsoft Purview – any master data entities created in Profisee MDM are seamlessly published into Microsoft Purview to further populate the Microsoft Purview Data Catalog and ensure Microsoft Purview is ‘aware’ of this critical source of data.
-3. Enrich master data model with governance details – governance Data Stewards can enrich master data entity definitions with data dictionary and glossary information, as well as ownership and sensitive data classifications, in Microsoft Purview.
-4. Apply enriched governance data for data stewardship – any definitions and metadata available in Microsoft Purview are visible in real time in Profisee as guidance for the MDM Data Stewards.
-5. Load source data from business applications – Azure Data Factory extracts data from source systems with 100+ pre-built connectors and/or a REST gateway.
-6. Transactional and unstructured data is loaded to a downstream analytics solution – all ‘raw’ source data can be loaded to an analytics database such as Synapse (Synapse is generally the preferred analytic database, but others such as Snowflake are also common). Analysis on this raw information without proper master (‘golden’) data will be subject to inaccuracy, as data overlaps, mismatches, and conflicts won't yet have been resolved.
-7. Master data from source systems is loaded to the Profisee MDM application – multiple streams of ‘master’ data are loaded to Profisee MDM. Master data is the data that defines a domain entity such as customer, product, asset, location, vendor, patient, household, menu item, ingredient, and so on. This data is typically present in multiple systems, and resolving differing definitions and matching and merging this data across systems is critical to the ability to use any cross-system data in a meaningful way.
-8. Master data is standardized, matched, merged, enriched, and validated according to governance rules – although data quality and governance rules may be defined in other systems (such as Microsoft Purview), Profisee MDM is where they're enforced. Source records are matched and merged both within and across source systems to create the most complete and correct record possible. Data quality rules check each record for compliance with business and technical requirements.
-9. Extra data stewardship to review and confirm matches, data quality, and data validation issues, as required – any record failing validation or matching with only a low probability score is subject to remediation. To remediate failed validations, a workflow process assigns records requiring review to Data Stewards who are experts in their business data domain. Once records have been verified or corrected, they're ready to use as a ‘golden record’ master.
-10. Direct access to curated master data, including secure data access for reporting in Power BI – Power BI users may report directly on master data through a dedicated Power BI Connector that recognizes and enforces role-based security and hides various system fields for simplicity.
-11. High-quality, curated master data published to a downstream analytics solution – verified master data can be published out to any target system using Azure Data Factory. Master data, including the parent-child lineage of merged records, is published into Azure Synapse (or wherever the ‘raw’ source transactional data was loaded). With this combination of properly curated master data plus transactional data, we have a solid foundation of trusted data for further analysis.
-12. Visualization and analytics with high-quality master data eliminates common data quality issues and delivers improved insights – irrespective of the tools used for analysis, including machine learning and visualization, well-curated master data forms a better and more reliable data foundation. The alternative is to use whatever information you can get – and risk misleading results that can damage the business.
-
-### Reference architecture guides/reference documents
-
-- [Data Governance with Profisee and Microsoft Purview](/azure/architecture/reference-architectures/data/profisee-master-data-management-purview)
-- [Operationalize Profisee with Azure Data Factory, Azure Synapse Analytics, and Power BI](/azure/architecture/reference-architectures/data/profisee-master-data-management-data-factory)
-- [MDM on Azure Overview](/azure/cloud-adoption-framework/scenarios/cloud-scale-analytics/govern-master-data)
-## Microsoft Purview - Profisee integration deployment on Azure Kubernetes Service (AKS)
-
-1. Get the license file from Profisee by raising a support ticket at [https://support.profisee.com/](https://support.profisee.com/). The only prerequisite for this step is that you pre-determine the DNS-resolved URL of your Profisee setup on Azure. In other words, keep the DNS host name of the load balancer used in the deployment. It will be something like "[profisee_name].[region].cloudapp.azure.com".
-For example, DNSHOSTNAME="purviewprofisee.southcentralus.cloudapp.azure.com". Supply this DNSHOSTNAME to Profisee support when you raise the support ticket, and Profisee will respond with the license file. You'll need to supply this file during the configuration steps below.
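As a quick sanity check, the host name you plan to send to Profisee support can be assembled from its parts. This is an illustrative Python sketch; the helper name and values are hypothetical, not part of any Profisee or Azure tooling:

```python
def profisee_dns_host(profisee_name: str, region: str) -> str:
    """Build the load balancer DNS host name in the documented form:
    [profisee_name].[region].cloudapp.azure.com"""
    return f"{profisee_name}.{region}.cloudapp.azure.com"

# Reproduces the example DNSHOSTNAME above.
print(profisee_dns_host("purviewprofisee", "southcentralus"))
# purviewprofisee.southcentralus.cloudapp.azure.com
```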
-
-1. [Create a user-assigned managed identity in Azure](../active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md#create-a-user-assigned-managed-identity). You must have a managed identity created to run the deployment. After the deployment is done, the managed identity can be deleted. Based on your ARM template choices, you'll need some or all of the following roles and permissions assigned to your managed identity:
- - Contributor role to the resource group where AKS will be deployed. It can either be assigned directly to the resource group **OR** at the subscription level and down.
- - DNS Zone Contributor role to the particular DNS zone where the entry will be created **OR** Contributor role to the DNS Zone resource group. This DNS role is needed only if updating DNS hosted in Azure.
- Application Administrator role in Azure Active Directory, so the permissions required for the application registration can be assigned.
- Managed Identity Contributor and User Access Administrator at the subscription level. Required so that the ARM template's managed identity can create a Key Vault-specific managed identity that Profisee uses to pull the values stored in the Key Vault.
-
- :::image type="content" alt-text="Screenshot of Profisee Managed Identity Azure Role Assignments." source="./media/how-to-deploy-profisee-purview/profisee-managed-identity-azure-role-assignments.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-managed-identity-azure-role-assignments.png":::
-
-1. [Create an application registration](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal) that will act as the login identity once Profisee is installed. It must be part of the Azure Active Directory tenant that will be used to sign in to Profisee. Save the **Application (client) ID** for later use.
- - Set authentication to match the settings below:
- - Support ID tokens (used for implicit and hybrid flows)
- - Set the redirect URL to: https://\<your-deployment-url>/profisee/auth/signin-microsoft
- - Your deployment URL is the URL you'll have provided Profisee in step 1
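The redirect URL is simply the deployment host name from step 1 with a fixed path appended. A small Python sketch to illustrate (the helper name is hypothetical, not part of any SDK):

```python
def profisee_redirect_url(deployment_host: str) -> str:
    """Redirect URL for the app registration, in the documented form:
    https://<your-deployment-url>/profisee/auth/signin-microsoft"""
    return f"https://{deployment_host}/profisee/auth/signin-microsoft"

print(profisee_redirect_url("purviewprofisee.southcentralus.cloudapp.azure.com"))
# https://purviewprofisee.southcentralus.cloudapp.azure.com/profisee/auth/signin-microsoft
```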
-
-1. [Create a service principal](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal) that Microsoft Purview will use to take some actions on itself during this Profisee deployment. To create a service principal, create an application like you did in the previous step, then [create an application secret](../active-directory/develop/howto-create-service-principal-portal.md#option-3-create-a-new-application-secret). Save the **Object ID** for the application, and the **Value** of the secret you created for later use.
- - Give this service principal (using the name or Object ID to locate it) **Data Curator** permissions on the root collection of your Microsoft Purview account.
-
-1. Go to [https://github.com/Profisee/kubernetes](https://github.com/Profisee/kubernetes) and select Microsoft Purview [**Azure ARM**](https://github.com/profisee/kubernetes/blob/master/Azure-ARM/README.md#deploy-profisee-platform-on-to-aks-using-arm-template).
- - The ARM template will deploy Profisee on a load balanced AKS (Azure Kubernetes Service) infrastructure using an ingress controller.
- - The readme includes troubleshooting steps.
- - Read all the steps and troubleshooting wiki page carefully.
-
-1. Select "Deploy to Azure"
-
- [![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2Fprofisee%2Fkubernetes%2Fmaster%2FAzure-ARM%2Fazuredeploy.json/createUIDefinitionUri/https%3A%2F%2Fraw.githubusercontent.com%2Fprofisee%2Fkubernetes%2Fmaster%2FAzure-ARM%2FcreateUIDefinition.json)
- - The configurator wizard will ask for the inputs as described here - [Deploying the AKS Cluster using the ARM Template](https://support.profisee.com/wikis/2022_r1_support/deploying_the_AKS_cluster_with_the_arm_template)
- Make sure to use the exact same resource group in the deployment as the one you gave the managed identity permissions to in step 1.
-
-1. Once deployment completes, select **Go to resource group** and open the Profisee AKS cluster.
-
-### Stages of a typical Microsoft Purview - Profisee deployment run
-
-1. On the basics page, select the [user-assigned managed identity you created earlier](#microsoft-purviewprofisee-integration-deployment-on-azure-kubernetes-service-aks) to deploy the resources.
-
-1. For your Profisee configuration, you can have your information stored in Key Vault or supply the details during deployment.
- 1. Choose your Profisee version, and provide your admin user account and license.
- 1. Select to configure using Microsoft Purview.
- 1. For the Application Registration Client ID, provide the [**application (client) ID**](../active-directory/develop/howto-create-service-principal-portal.md#sign-in-to-the-application) for the [application registration you created earlier](#microsoft-purviewprofisee-integration-deployment-on-azure-kubernetes-service-aks).
- 1. Select your Microsoft Purview account.
- 1. Add the **object ID** for the [service principal you created earlier](#microsoft-purviewprofisee-integration-deployment-on-azure-kubernetes-service-aks).
- 1. Add the value for the secret you created for that service principal.
- 1. Give your web application a name.
-
- :::image type="content" alt-text="Screenshot of the Profisee page of the Azure ARM Wizard, with all values filled out." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-a-profisee.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-a-profisee.png":::
-
-1. On the Kubernetes page, you may choose an older version of Kubernetes if needed, but leave the field **blank** to deploy the **latest** version.
-
- :::image type="content" alt-text="Screenshot of the Kubernetes configuration page in the ARM deployment wizard, configured with the smallest standard size and default network settings." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-b-kubernetes.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-b-kubernetes.png":::
-
- >[!TIP]
- > In most cases, leaving the version field blank is sufficient, unless there is a reason you need to deploy using an older version of Kubernetes AKS specifically.
-
-1. On the SQL configuration page you can choose to deploy a new Azure SQL server, or use an existing Azure SQL Server. You'll provide login details and a database name to use for this deployment.
-
- :::image type="content" alt-text="Screenshot of SQL configuration page in the ARM deployment wizard, with Yes, create a new SQL Server selected and details provided." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-c-sqlserver.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-c-sqlserver.png":::
-
-1. On the storage configuration page, you can choose to create a new storage account or use an existing one. You'll need to provide an access key and the name of an existing file share if you choose an existing account.
-
- :::image type="content" alt-text="Screenshot of ARM deployment wizard storage account page, with details provided." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-e-storage.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-e-storage.png":::
-
-1. On the networking configuration page, you'll choose to either use the default Azure DNS or provide your own DNS host name.
-
- >[!TIP]
- > **Yes, use default Azure DNS** is the recommended configuration. If you choose **Yes**, the deployer automatically creates a Let's Encrypt certificate for HTTPS/TLS. If you choose **No**, you'll need to supply various networking configuration parameters and your own HTTPS/TLS certificate.
-
- :::image type="content" alt-text="Screenshot of the ARM deployment Networking page, with Yes use default Azure DNS selected." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-d-azure-dns.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-d-azure-dns.png":::
-
- >[!WARNING]
- > The default Azure DNS URL (for example, URL="https://purviewprofisee.southcentralus.cloudapp.azure.com/profisee") will be picked up by the ARM template deployment wizard from the license file supplied to you by Profisee. If you intend to make changes and not use the default Azure DNS, make sure to communicate the full DNS name and the fully qualified URL of the Profisee deployment to the Profisee support team so that they can regenerate and provide you an updated license file. Failure to do this will result in a malfunctioning installation of Profisee.
-
-1. On the review + create page, review your details to ensure they're correct while the wizard validates your configuration. Once validation passes, select **Create**.
-
- :::image type="content" alt-text="Screenshot of the review plus create page of the ARM deployment wizard, showing all details with a validation passed flag at the top of the page." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-f-final-template-validation.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-step-f-final-template-validation.png":::
-
-1. It will take around 45–50 minutes for the deployment to finish installing Profisee. During the deployment, you'll see the aspects that are in progress, and you can refresh the page to review progress. The deployment shows as complete when everything is finished; completion of the "InstallProfiseePlatform" stage also indicates the deployment is complete.
-
- :::image type="content" alt-text="Screenshot of Profisee Azure ARM Wizard Deployment Progress, showing intermediate progress." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-deployment-progress-mid.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-deployment-progress-mid.png":::
-
- :::image type="content" alt-text="Screenshot of Profisee Azure ARM Wizard Deployment Progress, showing completed progress." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-deployment-progress-final.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-deployment-progress-final.png":::
-
-1. Once deployment is completed, open the resource group where you deployed your integration.
-
- :::image type="content" alt-text="Screenshot of the resource group where the Profisee resources were deployed, with the deployment script highlighted." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-post-deploy-click-open-resource-group.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-post-deploy-click-open-resource-group.png":::
-
-1. Under **Outputs**, fetch the final deployment URL. The final WEBURL is the address to paste into your browser to start using the Profisee-Purview integration. This URL is the same one you supplied to Profisee support when obtaining the license file. Unless you chose to change the URL format, it will look something like "https://[profisee_name].[region].cloudapp.azure.com/profisee".
-
- :::image type="content" alt-text="Screenshot of the outputs of the deployment script, showing the deployment WEB URL highlighted in the output." source="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-click-outputs-get-final-deployment-url.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-azure-arm-wizard-click-outputs-get-final-deployment-url.png":::
-
-1. Populate and hydrate data in the newly installed Profisee environment by installing FastApp. Go to your Profisee deployment URL and select **/Profisee/api/client**. It should look something like "https://[profisee_name].[region].cloudapp.azure.com/profisee/api/client". Select the downloads for the "Profisee FastApp Studio" utility and the "Profisee Platform Tools", and install both tools on your local client machine.
-
- :::image type="content" alt-text="Screenshot of the Profisee Client Tools download, with the download links highlighted." source="./media/how-to-deploy-profisee-purview/profisee-download-fastapp-tools.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-download-fastapp-tools.png":::
-
-1. Log in to FastApp Studio and perform the rest of the MDM administration and configuration management for Profisee. Once you log in with the administrator email address supplied during setup, you should see the administration menu on the left pane of Profisee FastApp Studio. Navigate these menus to perform the rest of your MDM journey using the FastApp tool. Seeing the administration menu as in the image below confirms a successful installation of Profisee on Azure.
-
- :::image type="content" alt-text="Screenshot of the Profisee FastApp Studio once you sign in, showing the Accounts and Teams menu selected, and the FastApps link highlighted." source="./media/how-to-deploy-profisee-purview/profisee-fastapp-studio-home-screen.png" lightbox="./media/how-to-deploy-profisee-purview/profisee-fastapp-studio-home-screen.png":::
-
-1. As a final validation step, to ensure successful installation and to check whether Profisee is connected to your Microsoft Purview instance, go to **/Profisee/api/governance/health**. It should look something like "https://[profisee_name].[region].cloudapp.azure.com/profisee/api/governance/health". The response will indicate **"Status": "Healthy"** on all the Microsoft Purview subsystems.
-
-```json
-{
- "OverallStatus": "Healthy",
- "TotalCheckDuration": "0:XXXXXXX",
- "DependencyHealthChecks": {
- "purview_service_health_check": {
- "Status": "Healthy",
- "Duration": "00:00:NNNN",
- "Description": "Successfully connected to Purview."
- },
- "governance_service_health_check": {
- "Status": "Healthy",
- "Duration": "00:00:NNNN",
- "Description": "Purview cache loaded successfully. Total assets: NNN; Instances: 1; Entities: NNN; Attributes: NNN; Relationships: NNN; Hierarchies: NNN"
- },
- "messaging_db_health_check": {
- "Status": "Healthy",
- "Duration": "00:00:NNNN",
- "Description": null
- },
- "logging_db_health_check": {
- "Status": "Healthy",
- "Duration": "00:00:NNNN",
- "Description": null
- }
- }
-}
-```
-
-An output response similar to the above confirms a successful installation, completes the deployment steps, and validates that Profisee is connected to your Microsoft Purview account and that the two systems can communicate properly.
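If you'd rather verify the health response programmatically than read it by eye, a minimal Python sketch follows. It assumes you've already fetched the JSON body from the health endpoint yourself; the sample payload is abbreviated from the response shown above:

```python
import json

# Abbreviated sample of the health endpoint response shown above.
body = """
{
  "OverallStatus": "Healthy",
  "DependencyHealthChecks": {
    "purview_service_health_check": {"Status": "Healthy"},
    "governance_service_health_check": {"Status": "Healthy"},
    "messaging_db_health_check": {"Status": "Healthy"},
    "logging_db_health_check": {"Status": "Healthy"}
  }
}
"""

def all_subsystems_healthy(payload: str) -> bool:
    """True when the overall status and every dependency check report Healthy."""
    health = json.loads(payload)
    checks = health.get("DependencyHealthChecks", {})
    return health.get("OverallStatus") == "Healthy" and all(
        check.get("Status") == "Healthy" for check in checks.values()
    )

print(all_subsystems_healthy(body))  # True
```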
-
-## Next steps
-
-In this guide, you learned about the importance of MDM in driving and supporting data governance across the Azure data estate, and how to set up and deploy a Microsoft Purview-Profisee integration.
-For more details on using Profisee MDM, register for scheduled trainings, live product demonstrations, and Q&A at [Profisee Academy Tutorials and Demos](https://profisee.com/demo/).
purview How To Enable Data Use Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-enable-data-use-management.md
- Title: Enabling Data use management on your Microsoft Purview sources
-description: Step-by-step guide on how to enable Data use management for your registered sources.
-Previously updated: 10/31/2022
-# Enable Data use management on your Microsoft Purview sources
-
-*Data use management* is an option within the data source registration in Microsoft Purview. This option lets Microsoft Purview manage data access for your resources. The high-level concept is that the data owner makes a data resource available for access policies by enabling *Data use management*.
-
-Currently, a data owner can enable Data use management on a data resource, which enables it for these types of access policies:
-* [DevOps policies](concept-policies-devops.md)
-* [Data owner access policies](concept-policies-data-owner.md)
-* [Self-service access policies](concept-self-service-data-access-policy.md) - access policies automatically generated by Microsoft Purview after a [self-service access request](how-to-request-access.md) is approved.
-
-To be able to create any data policy on a resource, Data use management must first be enabled on that resource. This article will explain how to enable Data use management on your resources in Microsoft Purview.
-
->[!IMPORTANT]
->Because Data use management directly affects access to your data, it directly affects your data security. Review [**additional considerations**](#additional-considerations-related-to-data-use-management) and [**best practices**](#data-use-management-best-practices) below before enabling Data use management in your environment.
-
-## Prerequisites
-
-## Enable Data use management
-
-To enable *Data use management* for a resource, the resource will first need to be registered in Microsoft Purview.
-To register a resource, follow the **Prerequisites** and **Register** sections of the [source pages](azure-purview-connector-overview.md) for your resources.
-
-Once you have your resource registered, follow the rest of the steps to enable an individual resource for *Data use management*.
-
-1. Go to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the **Data map** tab in the left menu.
-
-1. Select the **Sources** tab in the left menu.
-
-1. Select the source where you want to enable *Data use management*.
-
-1. At the top of the source page, select **Edit source**.
-
-1. Set the *Data use management* toggle to **Enabled**, as shown in the image below.
-## Disable Data use management
-
-To disable Data use management for a source, resource group, or subscription, a user needs to be either a resource IAM **Owner** or a Microsoft Purview **Data source admin**. Once you have those permissions, follow these steps:
-
-1. Go to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the **Data map** tab in the left menu.
-
-1. Select the **Sources** tab in the left menu.
-
-1. Select the source you want to disable Data use management for.
-
-1. At the top of the source page, select **Edit source**.
-
-1. Set the **Data use management** toggle to **Disabled**.
-
-## Additional considerations related to Data use management
-- Make sure you write down the **Name** you use when registering in Microsoft Purview. You will need it when you publish a policy. The recommended practice is to make the registered name exactly the same as the endpoint name.
-- To disable a source for *Data use management*, you first have to remove any published policies on that data source.
-- While a user needs both the data source *Owner* and Microsoft Purview *Data source admin* roles to enable a source for *Data use management*, **any** Data source admin for the collection can disable it.
-- Disabling *Data use management* for a subscription also disables it for all assets registered in that subscription.
-> [!WARNING]
-> **Known issues** related to source registration
-> - Moving data sources to a different resource group or subscription is not supported. If you want to do that, de-register the data source in Microsoft Purview before moving it, and then register it again afterward. Note that policies are bound to the data source ARM path. Changing the data source subscription or resource group makes policies ineffective.
-> - Once a subscription gets disabled for *Data use management*, any underlying assets that are enabled for *Data use management* will be disabled, which is the expected behavior. However, policy statements based on those assets will still be allowed after that.
-
-## Data use management best practices
-- We highly encourage registering data sources for *Data use management* and managing all associated access policies in a single Microsoft Purview account.
-- Should you have multiple Microsoft Purview accounts, be aware that **all** data sources belonging to a subscription must be registered for *Data use management* in a single Microsoft Purview account. That Microsoft Purview account can be in any subscription in the tenant. The *Data use management* toggle becomes greyed out when there are invalid configurations. Some examples of valid and invalid configurations follow in the diagram below:
-  - **Case 1** shows a valid configuration where a Storage account is registered in a Microsoft Purview account in the same subscription.
-  - **Case 2** shows a valid configuration where a Storage account is registered in a Microsoft Purview account in a different subscription.
-  - **Case 3** shows an invalid configuration arising because Storage accounts S3SA1 and S3SA2 both belong to Subscription 3, but are registered to different Microsoft Purview accounts. In that case, the *Data use management* toggle only enables in the Microsoft Purview account that registers a data source in that subscription first. The toggle is then greyed out for the other data source.
-- If the *Data use management* toggle is greyed out and cannot be enabled, hover over it to see the name of the Microsoft Purview account that registered the data resource first.
-![Diagram shows valid and invalid configurations when using multiple Microsoft Purview accounts to manage policies.](./media/how-to-policies-data-owner-authoring-generic/valid-and-invalid-configurations.png)
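The single-account rule behind these cases can be expressed as a small validation. This is an illustrative Python sketch with made-up registration data; Microsoft Purview doesn't expose an API in this form:

```python
from collections import defaultdict

# Hypothetical registrations: data source -> (subscription, Purview account).
registrations = {
    "S1SA":  ("Subscription 1", "purview-account-a"),
    "S2SA":  ("Subscription 2", "purview-account-a"),
    "S3SA1": ("Subscription 3", "purview-account-a"),
    "S3SA2": ("Subscription 3", "purview-account-b"),  # conflicts with S3SA1
}

def conflicting_subscriptions(regs: dict) -> set:
    """Subscriptions whose sources are registered across multiple Purview
    accounts, which greys out the Data use management toggle for the account
    that didn't register a source in that subscription first."""
    accounts_per_sub = defaultdict(set)
    for subscription, account in regs.values():
        accounts_per_sub[subscription].add(account)
    return {sub for sub, accounts in accounts_per_sub.items() if len(accounts) > 1}

print(conflicting_subscriptions(registrations))  # {'Subscription 3'}
```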
-
-## Next steps
-- [Create data owner policies for your resources](how-to-policies-data-owner-authoring-generic.md)
-- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md)
purview How To Import Export Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-import-export-glossary.md
- Title: Import and export glossary terms
-description: Learn how to import and export business glossary terms in Microsoft Purview.
-Previously updated: 11/14/2022
-# Import and export glossary terms
-
-This article describes how to work with the business glossary in Microsoft Purview. It provides steps to create a business glossary term in the Microsoft Purview Data Catalog. It also shows you how to import and export glossary terms by using .CSV files, and how to delete terms that you no longer need.
-## Import terms into the glossary
-
-The Microsoft Purview Data Catalog provides a template .CSV file for importing terms into your glossary. Term names are case-sensitive, so duplicate detection considers both spelling and capitalization.
-
-1. On the **Business glossary** page, select the glossary you want to import terms to, then select **Import terms**.
-
-1. Select the template, or templates, for the terms you want to import, and then select **Continue**.
-
- You can select multiple templates and import terms for different templates from a single .csv file.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/select-term-template-for-import.png" alt-text="Screenshot of the template list for importing a term, with the system default template highlighted.":::
-
-1. Download the .csv template and use it to enter the terms that you want to add.
-
-   Give your template file a name that starts with a letter and includes only letters, numbers, spaces, underscores (_), or other non-ASCII Unicode characters. Special characters in the file name will cause an error.
-
- > [!Important]
- > The system supports only importing columns that are available in the template. The **System default** template will have all the default attributes.
- >
-   > Custom term templates define out-of-the-box attributes and additional custom attributes. Therefore, the .CSV file differs in the total number of columns and the column names, depending on the term template that you select. You can also review the file for problems after upload.
- >
-   > If you want to upload a file with a rich text definition, be sure to enter the definition with markup tags and set the `IsDefinitionRichText` column to `true` in the .CSV file.
-
- :::image type="content" source="media/how-to-create-import-export-glossary/select-file-for-import.png" alt-text="Screenshot of the button for downloading a sample template file.":::
-
-1. After you finish filling out your .CSV file, select your file to import, and then select **OK**.
-
-The system will upload the file and add all the terms to your selected glossary.
-
-> [!Important]
-> The email address for an expert or steward should be the primary address of the user from the Azure Active Directory (Azure AD) group. Alternate emails, user principal names, and non-Azure AD emails are not yet supported.
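The template file-name rule above (start with a letter; then only letters, numbers, spaces, or underscores, with non-ASCII letters allowed) can be sketched as a quick pre-upload check. This is a hypothetical helper for illustration, not part of Microsoft Purview:

```python
def is_valid_template_filename(stem: str) -> bool:
    """Heuristic check of a file name (without extension) against the
    documented rule: starts with a letter; remaining characters are
    letters, digits, spaces, or underscores."""
    if not stem or not stem[0].isalpha():
        return False
    return all(ch.isalnum() or ch in (" ", "_") for ch in stem)

print(is_valid_template_filename("Glossary Terms_2022"))  # True
print(is_valid_template_filename("terms(final)"))         # False
```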
-
-## Export terms from the glossary
-
-When you're in any of your glossaries, the **Export terms** button is disabled by default. After you select the terms that you want to export, the **Export terms** button is enabled.
-
-> [!NOTE]
-> Selected terms **don't** need to be from the same term template to be able to export them.
-
-Select **Export terms** to download the selected terms.
-> [!Important]
-> The import process currently doesn't support updating the parent of a term.
-
-## Business terms with approval workflow enabled
-
-If [workflows](concept-workflow.md) are enabled on a term, then any import actions for the term will go through an approval before they're saved in the data catalog.
-
-When an import approval workflow is enabled for the Microsoft Purview glossary, you'll see **Submit for approval** instead of **OK** in the **Import** window when you're importing terms via .CSV file. Selecting **Submit for approval** triggers the workflow. However, the terms in the file won't be updated in the catalog until all the approvals are met.
-## Next steps
-
-* For more information about glossary terms, see the [glossary reference](reference-azure-purview-glossary.md).
-* For more information about approval workflows of the business glossary, see [Approval workflow for business terms](how-to-workflow-business-terms-approval.md).
purview How To Integrate With Azure Security Products https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-integrate-with-azure-security-products.md
- Title: Integrate with Azure security products
-description: This article describes how to connect Azure security services and Microsoft Purview to get enriched security experiences.
-Previously updated: 03/23/2023
-# Integrate Microsoft Purview with Azure security products
-
-This document explains the steps required for connecting a Microsoft Purview account with various Azure security products to enrich security experiences with data classifications and sensitivity labels.
-
-## Microsoft Defender for Cloud
-
-Microsoft Purview provides rich insights into the sensitivity of your data. This makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. The integration with Microsoft Purview expands visibility into the data layer, enabling security teams to prioritize resources that contain sensitive data.
-
-Classifications and labels applied to data resources in Microsoft Purview are ingested into Microsoft Defender for Cloud, which provides valuable context for protecting resources. Microsoft Defender for Cloud uses the resource classifications and labels to identify potential [attack paths](../defender-for-cloud/how-to-manage-attack-path.md) and [security risks](../defender-for-cloud/how-to-manage-cloud-security-explorer.md) related to sensitive data. The resources in the Defender for Cloud's Inventory and Alerts pages are also enriched with the classifications and labels discovered by Microsoft Purview, so your security teams can filter and focus to prioritize protecting your most sensitive assets.
-
-To take advantage of this [enrichment in Microsoft Defender for Cloud](../security-center/information-protection.md), no additional steps are needed in Microsoft Purview. Start exploring the security enrichments on Microsoft Defender for Cloud's [Inventory page](https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/25), where you can see the list of data sources with classifications and sensitivity labels.
-
-### Supported data sources
-
-The integration supports data sources in Azure and AWS; sensitive data discovered in these resources is shared with Microsoft Defender for Cloud:
-- [Azure Blob Storage](./register-scan-azure-blob-storage-source.md)
-- [Azure Cosmos DB](./register-scan-azure-cosmos-database.md)
-- [Azure Data Explorer](./register-scan-azure-data-explorer.md)
-- [Azure Data Lake Storage Gen1](./register-scan-adls-gen1.md)
-- [Azure Data Lake Storage Gen2](./register-scan-adls-gen2.md)
-- [Azure Files](./register-scan-azure-files-storage-source.md)
-- [Azure Database for MySQL](./register-scan-azure-mysql-database.md)
-- [Azure Database for PostgreSQL](./register-scan-azure-postgresql.md)
-- [Azure SQL Managed Instance](./register-scan-azure-sql-managed-instance.md)
-- [Azure Dedicated SQL pool (formerly SQL DW)](./register-scan-azure-synapse-analytics.md)
-- [Azure SQL Database](./register-scan-azure-sql-database.md)
-- [Azure Synapse Analytics (Workspace)](./register-scan-synapse-workspace.md)
-- [Amazon S3](./register-scan-amazon-s3.md)
-### Known issues
-- Data sensitivity information is currently not shared for sources hosted inside virtual machines, like SAP, Erwin, and Teradata.
-- Data sensitivity information is currently not shared for Amazon RDS.
-- Data sensitivity information is currently not shared for Azure PaaS data sources registered using a connection string.
-- Unregistering the data source in Microsoft Purview doesn't remove the data sensitivity enrichment in Microsoft Defender for Cloud.
-- Deleting the Microsoft Purview account will persist the data sensitivity enrichment for 30 days in Microsoft Defender for Cloud.
-- Custom classifications defined in the Microsoft Purview compliance portal or Microsoft Purview governance portal aren't shared with Microsoft Defender for Cloud.
-### FAQ
-
-#### **Why don't I see the AWS data source I have scanned with Microsoft Purview in Microsoft Defender for Cloud?**
-
-Data sources must be onboarded to Microsoft Defender for Cloud as well. Learn more about how to [connect your AWS accounts](../security-center/quickstart-onboard-aws.md) and see your AWS data sources in Microsoft Defender for Cloud.
-
-#### **Why don't I see sensitivity labels in Microsoft Defender for Cloud?**
-
-Assets must first be labeled in Microsoft Purview Data Map, before the labels are shown in Microsoft Defender for Cloud. Check if you have the necessary [prerequisites for sensitivity labels](./how-to-automatically-label-your-content.md) in place. After you've scanned the data, the labels will show up in Microsoft Purview Data Map and then automatically in Microsoft Defender for Cloud.
-
-## Microsoft Sentinel
-
-Microsoft Sentinel is a scalable, cloud-native solution for both security information and event management (SIEM) and security orchestration, automation, and response (SOAR). Microsoft Sentinel delivers intelligent security analytics and threat intelligence across the enterprise, providing a single solution for attack detection, threat visibility, proactive hunting, and threat response.
-
-Integrate Microsoft Purview with Microsoft Sentinel to gain visibility into where on your network sensitive information is stored, in a way that helps you prioritize at-risk data for protection, and understand the most critical incidents and threats to investigate in Microsoft Sentinel.
-
-1. Start by ingesting your Microsoft Purview logs into Microsoft Sentinel through a data source.
-1. Then use a Microsoft Sentinel workbook to view data such as assets scanned, classifications found, and labels applied by Microsoft Purview.
-1. Use analytics rules to create alerts for changes within data sensitivity.
-
-Customize the Microsoft Purview workbook and analytics rules to best suit the needs of your organization, and combine Microsoft Purview logs with data ingested from other sources to create enriched insights within Microsoft Sentinel.
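The three-step flow above can be sketched as a tiny analytics-rule-style check over ingested Purview events. This is a hedged illustration only: the record fields (`assetName`, `classificationsAdded`, `classificationsRemoved`) are hypothetical placeholders, not actual Sentinel table columns.

```python
# Sketch: flag classification changes in Purview events ingested into a SIEM.
# Field names below are assumptions for illustration, not real log columns.

def find_sensitivity_changes(events):
    """Return events that added or removed classifications on an asset."""
    return [
        e for e in events
        if e.get("classificationsAdded") or e.get("classificationsRemoved")
    ]

events = [
    {"assetName": "customers.csv", "classificationsAdded": ["Credit Card Number"]},
    {"assetName": "sales.parquet", "classificationsAdded": []},
]
flagged = find_sensitivity_changes(events)
print([e["assetName"] for e in flagged])  # ['customers.csv']
```

In a real deployment the equivalent filter would live in a Sentinel analytics rule query rather than application code.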
-
-For more information, see [Tutorial: Integrate Microsoft Sentinel and Microsoft Purview](../sentinel/purview-solution.md).
-
-## Next steps
--- [Experiences in Microsoft Defender for Cloud enriched using sensitivity from Microsoft Purview](../security-center/information-protection.md)
purview How To Lineage Azure Synapse Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-azure-synapse-analytics.md
- Title: Metadata and lineage from Azure Synapse Analytics
-description: This article describes how to connect Azure Synapse Analytics and Microsoft Purview to track data lineage.
----- Previously updated : 03/13/2023-
-# How to get lineage from Azure Synapse Analytics into Microsoft Purview
-
-This document explains the steps required for connecting an Azure Synapse workspace with a Microsoft Purview account to track [data lineage](concept-data-lineage.md) and [ingest data sources](concept-scans-and-ingestion.md#ingestion). The document also gets into the details of the activity coverage scope and supported lineage capabilities.
-
-When you connect Azure Synapse Analytics to Microsoft Purview, whenever a [supported pipeline activity](#supported-azure-synapse-capabilities) is run, metadata about the activity's source data, output data, and the activity will be automatically [ingested](concept-scans-and-ingestion.md#ingestion) into the Microsoft Purview Data Map.
-
-If a data source has already been scanned and exists in the data map, the ingestion process will add the lineage information from Azure Synapse Analytics to that existing source. If the source or output doesn't exist in the data map and is [supported by Azure Synapse Analytics lineage](#supported-azure-synapse-capabilities), Microsoft Purview will automatically add its metadata from Synapse Analytics into the data map under the root collection.
-
-This can be an excellent way to monitor your data estate as users move and transform information using Azure Synapse Analytics.
-
-## Supported Azure Synapse capabilities
-
-Currently, Microsoft Purview captures runtime lineage from the following Azure Synapse pipeline activities:
--- [Copy Data](../data-factory/copy-activity-overview.md?context=/azure/synapse-analytics/context/context)-- [Data Flow](../data-factory/concepts-data-flow-overview.md?context=/azure/synapse-analytics/context/context)-
-> [!IMPORTANT]
-> Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
--
-## Access secured Microsoft Purview account
-
-If your Microsoft Purview account is protected by firewall, learn how to let Azure Synapse [access a secured Microsoft Purview account](../synapse-analytics/catalog-and-governance/how-to-access-secured-purview-account.md) through Microsoft Purview private endpoints.
-
-## Bring Azure Synapse lineage into Microsoft Purview
-
-### Step 1: Connect Azure Synapse workspace to your Microsoft Purview account
-
-You can connect an Azure Synapse workspace to Microsoft Purview, and the connection enables Azure Synapse to push lineage information to Microsoft Purview. Follow the steps in [Connect Synapse workspace to Microsoft Purview](../synapse-analytics/catalog-and-governance/quickstart-connect-azure-purview.md). Multiple Azure Synapse workspaces can connect to a single Microsoft Purview account for holistic lineage tracking.
-
-### Step 2: Run pipeline in Azure Synapse workspace
-
-You can create pipelines with the Copy activity in an Azure Synapse workspace. No other configuration is needed for lineage data capture; lineage data is automatically captured during activity execution.
-
-### Step 3: Monitor lineage reporting status
-
-After you run the Azure Synapse pipeline, in the Synapse pipeline monitoring view, you can check the lineage reporting status by selecting the **Lineage status** button. The same information is also available in the activity output JSON, under the `reportLineageToPurview` section.
--
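As a rough illustration, the lineage reporting status can also be read programmatically from an activity-run output. The `reportLineageToPurview` key is the section the article points to; the surrounding JSON shape here is an assumption for the sketch, not the exact activity output schema.

```python
import json

# Sketch: read the lineage reporting status out of a Synapse activity-run
# output. The surrounding JSON shape is assumed for illustration only.
activity_output = json.loads("""
{
  "rowsCopied": 120,
  "reportLineageToPurview": { "status": "Succeeded" }
}
""")

status = activity_output.get("reportLineageToPurview", {}).get("status", "Unknown")
print(status)  # Succeeded
```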
-### Step 4: View lineage information in your Microsoft Purview account
-
-In your Microsoft Purview account, you can browse assets and choose type "Azure Synapse Analytics". You can also search the Data Catalog using keywords.
--
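A keyword search like the one described can be expressed as a request body for the Purview search (discovery) REST API. This sketch only builds the payload locally; treat the exact property names and API version as assumptions to verify against the REST reference before use.

```python
# Sketch: a search request body filtered to Azure Synapse Analytics assets.
# Property names follow the Atlas-style search API but are unverified here.

def build_search_payload(keywords, entity_type, limit=25):
    return {
        "keywords": keywords,
        "limit": limit,
        "filter": {"entityType": entity_type},
    }

payload = build_search_payload("pipeline", "Azure Synapse Analytics")
print(payload["filter"])  # {'entityType': 'Azure Synapse Analytics'}
```

The payload would be POSTed to the account's search endpoint; authentication and the endpoint URL are omitted from this sketch.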
-Select the Synapse account -> pipeline -> activity to view the lineage information.
--
-## Monitor the Azure Synapse Analytics links
-
-In Microsoft Purview governance portal, you can [monitor the Azure Synapse Analytics links](how-to-monitor-data-map-population.md#monitor-links).
-
-## Next steps
-
-[Catalog lineage user guide](catalog-lineage-user-guide.md)
-
-[Link to Azure Data Share for lineage](how-to-link-azure-data-share.md)
purview How To Lineage Powerbi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-powerbi.md
- Title: Metadata and Lineage from Power BI
-description: This article describes the data lineage extraction from Power BI source.
----- Previously updated : 06/12/2023-
-# How to get lineage from Power BI into Microsoft Purview
-
-This article elaborates on the data lineage for Power BI sources in Microsoft Purview.
-
-## Prerequisites
-
-To see data lineage in Microsoft Purview for Power BI, you must first [register and scan your Power BI source.](../purview/register-scan-power-bi-tenant.md)
-
-## Common scenarios
-
-After a Power BI source has been scanned, lineage information for your current data assets, and data assets referenced by Power BI, will automatically be added in the Microsoft Purview Data Catalog.
-
-1. Data consumers can perform root cause analysis of a report or dashboard from Microsoft Purview. For any data discrepancy in a report, users can easily identify the upstream datasets and contact their owners if necessary.
-
-1. Data producers can see the downstream reports or dashboards consuming their dataset. Before making any changes to their datasets, the data owners can make informed decisions.
-
-1. Users can search by name, endorsement status, sensitivity label, owner, description, and other business facets to return the relevant Power BI artifacts.
-
-## Power BI artifacts in Microsoft Purview
-
-Once the [scan of your Power BI](../purview/register-scan-power-bi-tenant.md) is complete, the following Power BI artifacts will be inventoried in Microsoft Purview:
-
-* Workspaces
-* Dashboards
-* Reports
-* Datasets
-* Dataflows
-* Datamarts
--
-## Lineage of Power BI artifacts in Microsoft Purview
-
-Users can search for a Power BI artifact by name, description, or other details to see relevant results. Under the asset overview and properties tabs, basic details such as the description and classification are shown. Under the lineage tab, asset relationships are shown with their upstream and downstream dependencies.
-
-Microsoft Purview captures lineage among Power BI artifacts (for example: Dataflow -> Dataset -> Report -> Dashboard) and external data assets.
-
->[!NOTE]
-> For lineage between Power BI artifacts and external data assets, currently the supported source types are:
->* Azure SQL Database
->* Azure Blob Storage
->* Azure Data Lake Store Gen1
->* Azure Data Lake Store Gen2
--
-In addition, column-level lineage (Power BI subartifact lineage) and transformations inside Power BI datasets are captured when using Azure SQL Database as the source. For measures, you can further select the column -> Properties -> expression to see the transformation details.
-
->[!NOTE]
-> Column level lineage and transformation are supported when using Azure SQL Database as source. Other sources are currently not supported.
--
-## Known limitations
-
-* Microsoft Purview leverages the scanner API to retrieve the metadata and lineage. Learn about some API limitations from [Metadata scanning - Considerations and limitations](/power-bi/enterprise/service-admin-metadata-scanning#considerations-and-limitations).
-* If a dataset table is connected to another dataset table, and the middle dataset has the "Enable load" option disabled in Power BI Desktop, lineage can't be extracted.
-* For lineage between Power BI artifacts and external data assets:
- * Currently the supported source types are Azure SQL Database, Azure Blob Storage, Azure Data Lake Store Gen1 and Azure Data Lake Store Gen2.
- * Column level lineage and transformation are only supported when using Azure SQL Database as source. Other sources are currently not supported.
- * Limited information is currently shown for data sources where the Power BI Dataflow is created. For example, for a SQL server source of Power BI dataset, only server/database name is captured.
-    * Because of the following limitations, if you scan both Power BI and the data sources that Power BI artifacts connect to, you may currently see duplicate assets in the catalog.
- * The source object names in assets and fully qualified names follow the case used in Power BI settings/queries, which may not align with the object case in original data source.
- * When Power BI references SQL views, they are currently captured as SQL table assets.
- * When Power BI references Azure Dedicated SQL pools (formerly SQL DW) source, it's currently captured as Azure SQL Database assets.
-* For Power BI subartifact lineage:
- * Some measures aren't shown in the subartifact lineage, for example, `COUNTROWS`.
- * In the lineage graph, when selecting a measure that is derived by columns using the COUNT function, the underlying column isn't selected automatically. Check the measure expression in the column properties tab to identify the underlying column.
- * If you scanned your Power BI source before subartifact lineage was supported, you may see a database asset along with the new table assets in the lineage graph, which isn't removed.
-
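Given the casing caveat in the limitations above, a quick case-insensitive grouping of fully qualified names can flag likely duplicate catalog entries. The names below are made up purely for illustration.

```python
from collections import defaultdict

# Sketch: group fully qualified names case-insensitively to spot entries
# that likely describe the same object with different Power BI casing.
fqns = [
    "mssql://server1/SalesDB/dbo/Customers",
    "mssql://server1/salesdb/dbo/customers",   # same object, different casing
    "mssql://server1/SalesDB/dbo/Orders",
]

groups = defaultdict(list)
for fqn in fqns:
    groups[fqn.lower()].append(fqn)

duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(len(duplicates))  # 1
```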
-## Next steps
--- [Learn about Data lineage in Microsoft Purview](catalog-lineage-user-guide.md)-- [Link Azure Data Factory to push automated lineage](how-to-link-azure-data-factory.md)
purview How To Lineage Purview Data Sharing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-purview-data-sharing.md
- Title: Microsoft Purview Data Sharing Lineage
-description: This article describes how to view lineage of shared datasets, shared using Microsoft Purview Data Sharing.
----- Previously updated : 02/16/2023-
-# How to view Microsoft Purview Data Sharing lineage
-
-This article discusses how to view the lineage of datasets shared using [Microsoft Purview Data Sharing](concept-data-share.md). With these tools, users can discover and track lineage of data across boundaries like organizations, departments, and even data centers.
-
->[!IMPORTANT]
-> This article is about lineage for [Microsoft Purview Data Sharing](concept-data-share.md). If you are using Azure Data Share, use this article for lineage: [Azure Data Share lineage](how-to-link-azure-data-share.md).
-
-## Common scenarios
-
-Data Sharing lineage aims to provide detailed information for root cause analysis and impact analysis.
-
-Some common scenarios include:
--- [Full view of datasets shared in and out of your organization](#full-view-of-datasets-shared-in-and-out-of-your-organization)-- [Root cause analysis for upstream dataset dependencies](#root-cause-analysis-for-upstream-dataset-dependencies)-- [Impact analysis for shared datasets](#impact-analysis-for-shared-datasets)-
-### Full view of datasets shared in and out of your organization
-
-Data officers can see a list of all bi-directionally shared datasets with their partner organizations. They can search and discover the datasets by organization name and see a complete view of all outgoing and incoming shares.
-
-See the [Azure Active Directory share lineage](#azure-active-directory-share-lineage) section for a full view of outgoing and incoming shares.
-
-### Root cause analysis for upstream dataset dependencies
-
-A report has incorrect information because of upstream data issues from an external Microsoft Purview Data Sharing activity. Data engineers can investigate upstream failures, understand possible causes, and contact the owner of the share to fix the issues causing the data discrepancy.
-
-See the [lineage of a share](#lineage-of-a-share) or [Azure Active Directory share lineage](#azure-active-directory-share-lineage) sections to see the upstream dependencies for your assets.
-
-### Impact analysis for shared datasets
-
-Data producers want to know who will be impacted when they make a change to their datasets. With a lineage map, a data producer can easily understand the impact of the downstream internal or external partners who are consuming data using Microsoft Purview Data Sharing.
-
-See the [lineage of a share](#lineage-of-a-share) or [Azure Active Directory share lineage](#azure-active-directory-share-lineage) sections to see downstream dependencies of your shared assets.
-
-## Lineage of a share
-
-To see the data sharing lineage for your sent share or received share asset, you only need to have [data reader permissions](catalog-permissions.md) on the collection where the asset is housed.
-
-1. Discover your sent share or received share asset in the Catalog [search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) for the share in the data catalog, and narrow results to only data shares.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/search-for-share.png" alt-text="Screenshot of the data catalog search, showing the data share filter selected and a share highlighted." border="true":::
-
-1. Select your data share asset.
-
-1. Select the lineage tab to see a graph with upstream and downstream dependencies.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/select-lineage-tab.png" alt-text="Screenshot of the data share asset lineage page, showing the lineage tab highlighted and a lineage graph." border="true":::
-
-1. You can select the files or folders, or the Azure Active Directory asset in the lineage canvas and see the data sharing lineage for those assets.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/select-lineage-asset.png" alt-text="Screenshot of the data share asset lineage page, showing the data asset selected and the switch to asset button highlighted." border="true":::
-
-## Azure Active Directory share lineage
-
-1. You can search for your Azure Active Directory tenant asset in the Catalog using the search bar, or filtering by **Source type** and selecting **Azure Active Directory**.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/search-active-directory.png" alt-text="Screenshot of the data catalog search, with the filter set to Azure Active Directory." border="true":::
-
-1. Select your Azure Active Directory, then select the **Lineage** tab to see a lineage graph with either upstream or downstream dependencies.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/active-directory-lineage.png" alt-text="Screenshot of the Azure Active Directory asset with the lineage tab selected, showing the lineage map." border="true":::
-
-1. If you're seeing all sent shares, or all received shares, you can select the button to switch the view to see the other.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/switch-shares.png" alt-text="Screenshot of the Azure Active Directory shares lineage map, showing the Switch to received shares button highlighted." border="true":::
-
- >[!NOTE]
- >If you do not see this button it may be because your Azure Active Directory does not have any sent/received shares, or that you do not have permission on the collection where the sent/received shares are housed.
-
-1. You can also select any file or folder in the lineage canvas and see the data sharing lineage for those assets.
-
- :::image type="content" source="./media/how-to-lineage-purview-data-sharing/switch-to-asset.png" alt-text="Screenshot of the Azure Active Directory shares lineage map, showing an asset selected and the Switch to asset button highlighted." border="true":::
-
->[!Important]
->For Data Share assets to show in Microsoft Purview, the ADLS Gen2 or Blob Storage account that the shares (sent share or received share) belong to should be registered with Microsoft Purview and the user needs Reader permission on the collection where they are housed.
-
-## Troubleshoot
-
-Here are some common issues when viewing data sharing lineage and their possible resolutions.
-
-### Can't see the recipient Azure Active Directory tenant on the sent share lineage
-
-If you're unable to see the recipient Azure Active Directory tenant on the sent share lineage, it means the recipient hasn't attached the share yet.
-
-### Can't find a sent share or received share asset in the Catalog
-
-Sent or received share assets are housed in the same collection as their registered storage account assets.
-
-* If the storage accounts the share assets belong to aren't registered, the share assets won't be discoverable.
-* If you don't have a minimum of Data Reader permission to the collection where the share asset is housed, the share assets aren't discoverable.
-
-## Next steps
-
-* [Data sharing quickstart](quickstart-data-share.md)
-* [How to Share data](how-to-share-data.md)
-* [How to receive a share](how-to-receive-share.md)
-* [REST API reference](/rest/api/purview/)
-* [Data Sharing FAQ](how-to-data-share-faq.md)
purview How To Lineage Sql Server Integration Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-sql-server-integration-services.md
- Title: Lineage from SQL Server Integration Services
-description: This article describes the data lineage extraction from SQL Server Integration Services.
----- Previously updated : 08/11/2022-
-# How to get lineage from SQL Server Integration Services (SSIS) into Microsoft Purview
-
-This article elaborates on the data lineage aspects of SQL Server Integration Services (SSIS) in Microsoft Purview.
-
-## Prerequisites
--- [Lift and shift SQL Server Integration Services workloads to the cloud](/sql/integration-services/lift-shift/ssis-azure-lift-shift-ssis-packages-overview)-
-## Supported scenarios
-
-The current scope of support includes the lineage extraction from SSIS packages executed by Azure Data Factory SSIS integration runtime.
-
-On-premises SSIS lineage extraction isn't supported yet.
-
-Only source and destination are supported for Microsoft Purview SSIS lineage running from Data Factory's SSIS Execute Package activity. Transformations under SSIS are not yet supported.
-
-### Supported data stores
-
-| Data store | Supported |
-| - | - |
-| Azure Blob Storage | Yes |
-| Azure Data Lake Storage Gen1 | Yes |
-| Azure Data Lake Storage Gen2 | Yes |
-| Azure Files | Yes |
-| Azure SQL Database \* | Yes |
-| Azure SQL Managed Instance \*| Yes |
-| Azure Synapse Analytics \* | Yes |
-| SQL Server \* | Yes |
-
-*\* Microsoft Purview currently doesn't support query or stored procedure for lineage or scanning. Lineage is limited to table and view sources only.*
--
-## How to bring SSIS lineage into Microsoft Purview
-
-### Step 1. [Connect a Data Factory to Microsoft Purview](how-to-link-azure-data-factory.md)
-
-### Step 2. Trigger SSIS activity execution in Azure Data Factory
-
-You can [run SSIS package with Execute SSIS Package activity](../data-factory/how-to-invoke-ssis-package-ssis-activity.md) or [run SSIS package with Transact-SQL in ADF SSIS Integration Runtime](../data-factory/how-to-invoke-ssis-package-stored-procedure-activity.md).
-
-Once Execute SSIS Package activity finishes the execution, you can check lineage report status from the activity output in [Data Factory activity monitor](../data-factory/monitor-visually.md#monitor-activity-runs).
-
-### Step 3. Browse lineage Information in your Microsoft Purview account
--- You can browse the Data Catalog by choosing asset type "SQL Server Integration Services".---- You can also search the Data Catalog using keywords.---- You can view lineage information for an SSIS Execute Package activity and open in Data Factory to view/edit the activity settings.---- You can choose one data source to drill into how the columns in the source are mapped to the columns in the destination.--
-## Next steps
--- [Lift and shift SQL Server Integration Services workloads to the cloud](/sql/integration-services/lift-shift/ssis-azure-lift-shift-ssis-packages-overview)-- [Learn about Data lineage in Microsoft Purview](catalog-lineage-user-guide.md)-- [Link Azure Data Factory to push automated lineage](how-to-link-azure-data-factory.md)
purview How To Link Azure Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-link-azure-data-factory.md
- Title: Connect to Azure Data Factory
-description: This article describes how to connect Azure Data Factory and Microsoft Purview to track data lineage.
----- Previously updated : 03/13/2023-
-# How to connect Azure Data Factory and Microsoft Purview
-
-This document explains the steps required for connecting an Azure Data Factory account with a Microsoft Purview account to track [data lineage](concept-data-lineage.md) and [ingest data sources](concept-scans-and-ingestion.md#ingestion). The document also gets into the details of the activity coverage scope and supported lineage patterns.
-
-When you connect an Azure Data Factory to Microsoft Purview, whenever a [supported Azure Data Factory activity](#supported-azure-data-factory-activities) is run, metadata about the activity's source data, output data, and the activity will be automatically [ingested](concept-scans-and-ingestion.md#ingestion) into the Microsoft Purview Data Map.
-
-If a data source has already been scanned and exists in the data map, the ingestion process will add the lineage information from Azure Data Factory to that existing source. If the source or output doesn't exist in the data map and is [supported by Azure Data Factory lineage](#supported-azure-data-factory-activities), Microsoft Purview will automatically add its metadata from Azure Data Factory into the data map under the root collection.
-
-This can be an excellent way to monitor your data estate as users move and transform information using Azure Data Factory.
-
-## View existing Data Factory connections
-
-Multiple Azure Data Factories can connect to a single Microsoft Purview account to push lineage information. The current limit allows you to connect up to 10 Data Factory accounts at a time from the Microsoft Purview management center. To show the list of Data Factory accounts connected to your Microsoft Purview account, do the following:
-
-1. Select **Management** on the left navigation pane.
-2. Under **Lineage connections**, select **Data Factory**.
-3. The Data Factory connection list appears.
-
- :::image type="content" source="./media/how-to-link-azure-data-factory/data-factory-connection.png" alt-text="Screen shot showing a data factory connection list." lightbox="./media/how-to-link-azure-data-factory/data-factory-connection.png":::
-
-4. Notice the various values for connection **Status**:
-
- - **Connected**: The data factory is connected to the Microsoft Purview account.
- - **Disconnected**: The data factory has access to the catalog, but it's connected to another catalog. As a result, data lineage won't be reported to the catalog automatically.
- - **CannotAccess**: The current user doesn't have access to the data factory, so the connection status is unknown.
-
->[!Note]
->To view the Data Factory connections, you need the **Collection admins** role on the root collection. Role inheritance from management groups isn't supported.
-
-## Create new Data Factory connection
-
->[!Note]
->To add or remove Data Factory connections, you need the **Collection admins** role on the root collection. Role inheritance from management groups isn't supported.
->
-> Also, the user must be the data factory's **Owner** or **Contributor**.
->
-> Your data factory needs to have system assigned managed identity enabled.
-
-Follow the steps below to connect an existing data factory to your Microsoft Purview account. You can also [connect Data Factory to Microsoft Purview account from ADF](../data-factory/connect-data-factory-to-azure-purview.md).
-
-1. Select **Management** on the left navigation pane.
-2. Under **Lineage connections**, select **Data Factory**.
-3. On the **Data Factory connection** page, select **New**.
-
-4. Select your Data Factory account from the list and select **OK**. You can also filter by subscription name to limit your list.
-
- Some Data Factory instances might be disabled if the data factory is already connected to the current Microsoft Purview account, or the data factory doesn't have a managed identity.
-
-   A warning message is displayed if any of the selected Data Factories are already connected to another Microsoft Purview account. When you select OK, the Data Factory connection with the other Microsoft Purview account is disconnected. No other confirmations are required.
-
- :::image type="content" source="./media/how-to-link-azure-data-factory/warning-for-disconnect-factory.png" alt-text="Screenshot showing warning to disconnect Azure Data Factory.":::
-
->[!Note]
->We support adding up to 10 Azure Data Factory accounts at once. If you want to add more than 10 data factory accounts, do so in multiple batches.
-
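Because at most 10 data factory accounts can be added per operation, a larger fleet has to be connected in batches. A minimal sketch of the chunking (the factory names are hypothetical):

```python
# Sketch: split a list of data factory names into batches of 10, matching
# the documented per-operation limit when connecting factories to Purview.

def batches(items, size=10):
    return [items[i:i + size] for i in range(0, len(items), size)]

factories = [f"adf-{n:02d}" for n in range(23)]  # 23 hypothetical factories
groups = batches(factories)
print([len(g) for g in groups])  # [10, 10, 3]
```

The actual connection step is still performed in the portal or via automation for each batch; this only illustrates the grouping.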
-### How authentication works
-
-The data factory's managed identity is used to authenticate lineage push operations from the data factory to Microsoft Purview. When you connect your data factory to Microsoft Purview in the UI, the role assignment is added automatically.
-
-Grant the data factory's managed identity **Data Curator** role on Microsoft Purview **root collection**. Learn more about [Access control in Microsoft Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
-
-### Remove Data Factory connections
-
-To remove a data factory connection, do the following:
-
-1. On the **Data Factory connection** page, select the **Remove** button next to one or more data factory connections.
-2. Select **Confirm** in the popup to delete the selected data factory connections.
-
- :::image type="content" source="./media/how-to-link-azure-data-factory/remove-data-factory-connection.png" alt-text="Screenshot showing how to select data factories to remove connection." lightbox="./media/how-to-link-azure-data-factory/remove-data-factory-connection.png":::
-
-## Monitor the Data Factory links
-
-In Microsoft Purview governance portal, you can [monitor the Data Factory links](how-to-monitor-data-map-population.md#monitor-links).
-
-## Supported Azure Data Factory activities
-
-Microsoft Purview captures runtime lineage from the following Azure Data Factory activities:
--- [Copy Data](../data-factory/copy-activity-overview.md)-- [Data Flow](../data-factory/concepts-data-flow-overview.md)-- [Execute SSIS Package](../data-factory/how-to-invoke-ssis-package-ssis-activity.md)-
-> [!IMPORTANT]
-> Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
-
-The integration between Data Factory and Microsoft Purview supports only a subset of the data systems that Data Factory supports, as described in the following sections.
--
-### Execute SSIS Package support
-
-Refer to [supported data stores](how-to-lineage-sql-server-integration-services.md#supported-data-stores).
-
-## Access secured Microsoft Purview account
-
-If your Microsoft Purview account is protected by firewall, learn how to let Data Factory [access a secured Microsoft Purview account](../data-factory/how-to-access-secured-purview-account.md) through Microsoft Purview private endpoints.
-
-## Bring Data Factory lineage into Microsoft Purview
-
-For an end-to-end walkthrough, follow the [Tutorial: Push Data Factory lineage data to Microsoft Purview](../data-factory/turorial-push-lineage-to-purview.md).
-
-## Supported lineage patterns
-
-There are several patterns of lineage that Microsoft Purview supports. The generated lineage data is based on the type of source and sink used in the Data Factory activities. Although Data Factory supports over 80 source and sinks, Microsoft Purview supports only a subset, as listed in [Supported Azure Data Factory activities](#supported-azure-data-factory-activities).
-
-To configure Data Factory to send lineage information, see [Get started with lineage](catalog-lineage-user-guide.md#get-started-with-lineage).
-
-Some other ways of finding information in the lineage view include the following:
--- In the **Lineage** tab, hover on shapes to preview additional information about the asset in the tooltip.-- Select the node or edge to see the asset type it belongs to, or to switch assets.-- Columns of a dataset are displayed in the left side of the **Lineage** tab. For more information about column-level lineage, see [Dataset column lineage](catalog-lineage-user-guide.md#dataset-column-lineage).-
-### Data lineage for 1:1 operations
-
-The most common pattern for capturing data lineage is moving data from a single input dataset to a single output dataset, with a process in between.
-
-An example of this pattern would be the following:
--- 1 source/input: *Customer* (SQL Table)-- 1 sink/output: *Customer1.csv* (Azure Blob)-- 1 process: *CopyCustomerInfo1\#Customer1.csv* (Data Factory Copy activity)--
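The 1:1 pattern above can be pictured as a minimal lineage triple, in the spirit of an Atlas-style process entity with inputs and outputs. The dictionary shape is illustrative only, not the actual Purview payload.

```python
# Sketch: the 1:1 example above as a lineage triple. Illustrative shape,
# not the real Atlas/Purview entity schema.
lineage = {
    "process": "CopyCustomerInfo1#Customer1.csv",  # Data Factory Copy activity
    "inputs": ["Customer"],                        # SQL table
    "outputs": ["Customer1.csv"],                  # Azure Blob
}

assert len(lineage["inputs"]) == len(lineage["outputs"]) == 1  # 1:1 pattern
print(lineage["process"])
```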
-### Data movement with 1:1 lineage and wildcard support
-
-Another common scenario for capturing lineage is using a wildcard to copy files from a single input dataset to a single output dataset. The wildcard allows the copy activity to match multiple files for copying using a common portion of the file name. Microsoft Purview captures file-level lineage for each individual file copied by the corresponding copy activity.
-
-An example of this pattern would be the following:
-
-* Source/input: *CustomerCall\*.csv* (ADLS Gen2 path)
-* Sink/output: *CustomerCall\*.csv* (Azure blob file)
-* 1 process: *CopyGen2ToBlob\#CustomerCall.csv* (Data Factory Copy activity)  
--
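As a toy illustration of the wildcard match above (the file names here are made up, and Data Factory's wildcard syntax is richer than Python's `fnmatch`), each file the pattern matches gets its own file-level lineage entry:

```python
import fnmatch

# Hypothetical files in the source ADLS Gen2 path
files = ["CustomerCall_Jan.csv", "CustomerCall_Feb.csv", "Sales_Jan.csv"]

# The copy activity's wildcard pattern from the example above
pattern = "CustomerCall*.csv"

# Each matched file gets its own file-level lineage record
matched = [f for f in files if fnmatch.fnmatch(f, pattern)]
print(matched)  # ['CustomerCall_Jan.csv', 'CustomerCall_Feb.csv']
```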
-### Data movement with n:1 lineage
-
-You can use Data Flow activities to perform data operations like merge, join, and so on. More than one source dataset can be used to produce a target dataset. In this example, Microsoft Purview captures file-level lineage for individual input files to a SQL table that is part of a Data Flow activity.
-
-An example of this pattern would be the following:
-
-* 2 sources/inputs: *Customer.csv*, *Sales.parquet* (ADLS Gen2 Path)
-* 1 sink/output: *Company data* (Azure SQL table)
-* 1 process: *DataFlowBlobsToSQL* (Data Factory Data Flow activity)
--
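A minimal sketch of the n:1 shape, using made-up rows rather than the actual Data Flow engine: two inputs are joined into one output, which is exactly the relationship the lineage graph records:

```python
# Hypothetical rows from the two inputs in the example above
customers = {1: "Contoso", 2: "Fabrikam"}       # Customer.csv
sales = [(1, 250.0), (2, 125.5), (1, 80.0)]     # Sales.parquet

# Join both sources into a single output, mirroring the n:1 pattern
company_data = [(customers[cid], amount) for cid, amount in sales]
print(company_data)  # [('Contoso', 250.0), ('Fabrikam', 125.5), ('Contoso', 80.0)]
```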
-### Lineage for resource sets
-
-A resource set is a logical object in the catalog that represents many partition files in the underlying storage. For more information, see [Understanding Resource sets](concept-resource-sets.md). When Microsoft Purview captures lineage from Azure Data Factory, it applies rules to normalize the individual partition files into a single logical object.
-
-In the following example, an Azure Data Lake Gen2 resource set is produced from an Azure Blob:
-
-* 1 source/input: *Employee\_management.csv* (Azure Blob)
-* 1 sink/output: *Employee\_management.csv* (Azure Data Lake Gen 2)
-* 1 process: *CopyBlobToAdlsGen2\_RS* (Data Factory Copy activity)
--
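Microsoft Purview's actual resource set rules are more sophisticated, but the normalization idea can be sketched as follows (the paths and regular expressions here are illustrative only):

```python
import re

# Hypothetical partition files behind one logical dataset
paths = [
    "employee_data/2023/01/part-00000.csv",
    "employee_data/2023/02/part-00001.csv",
]

def normalize(path: str) -> str:
    # Replace date partitions and part numbers with placeholders
    path = re.sub(r"/\d{4}/\d{2}/", "/{yyyy}/{mm}/", path)
    return re.sub(r"part-\d+", "part-{N}", path)

# All partition files collapse into a single logical object
logical = {normalize(p) for p in paths}
print(logical)  # {'employee_data/{yyyy}/{mm}/part-{N}.csv'}
```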
-## Next steps
-
-[Tutorial: Push Data Factory lineage data to Microsoft Purview](../data-factory/turorial-push-lineage-to-purview.md)
-
-[Catalog lineage user guide](catalog-lineage-user-guide.md)
-
-[Link to Azure Data Share for lineage](how-to-link-azure-data-share.md)
purview How To Link Azure Data Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-link-azure-data-share.md
- Title: Connect to Azure Data Share
-description: This article describes how to connect an Azure Data Share account with Microsoft Purview to search assets and track data lineage.
----- Previously updated : 03/07/2023-
-# How to connect Azure Data Share and Microsoft Purview
-
-This article discusses how to connect your [Azure Data Share](../data-share/overview.md) account with Microsoft Purview and govern the shared datasets (both outgoing and incoming) in your data estate. Various data governance personas can discover and track lineage of data across boundaries like organizations, departments and even data centers.
-
->[!IMPORTANT]
-> This article is about lineage for the [Azure Data Share service](../data-share/overview.md). If you are using data sharing in Microsoft Purview, use this article for lineage: [Microsoft Purview Data Sharing lineage](how-to-lineage-purview-data-sharing.md).
-
-## Common scenarios
-
-Data Share lineage aims to provide detailed information for root cause analysis and impact analysis.
-
-### Scenario 1: 360 view of datasets shared in/out for a partner organization or internal department
-
-Data officers can see a list of all datasets that are bi-directionally shared with their partner organizations. They can search and discover the datasets by organization name and see a complete view of all outgoing and incoming shares.
-
-### Scenario 2: Root cause analysis - upstream dependency on datasets coming into organization (consumer view of incoming shares)
-
-A report has incorrect information because of upstream data issues from an external Data Share account. The data engineers can understand upstream failures, be informed about the reasons, and further contact the owner of the share to fix the issues causing their data discrepancy.
-
-### Scenario 3: Impact analysis on datasets going outside organization (provider view of outgoing shares)
-
-Data producers want to know who will be impacted upon making a change to their dataset. Using lineage, a data producer can easily understand the impact of the downstream internal or external partners who are consuming data using Azure Data Share.
-
-## Azure Data Share and Microsoft Purview connected experience
-
-To connect your Azure Data Share and Microsoft Purview account, do the following:
-
-1. Create a Microsoft Purview account. All the Data Share lineage information will be collected by a Microsoft Purview account. You can use an existing one or create a new Microsoft Purview account.
-
-1. Connect your Azure Data Share to your Microsoft Purview account.
-
- 1. In the Microsoft Purview governance portal, you can go to **Management Center** and connect your Azure Data Share under the **External connections** section.
- 1. Select **+ New** on the top bar, find your Azure Data Share in the pop-up side bar, and add the Data Share account. Run a snapshot job after connecting your Data Share to your Microsoft Purview account so that the Data Share assets and lineage information are visible in Microsoft Purview.
-
- :::image type="content" source="media/how-to-link-azure-data-share/connect-to-data-share.png" alt-text="Screenshot of the management center to link Azure Data Share.":::
-
-1. Execute your snapshot in Azure Data Share.
-
- - Once the Azure Data share connection is established with Microsoft Purview, you can execute a snapshot for your existing shares.
- - If you don't have any existing shares, go to the Azure Data Share portal to [share your data](../data-share/share-your-data.md) and [subscribe to a data share](../data-share/subscribe-to-data-share.md).
- - Once the share snapshot is complete, you can view associated Data Share assets and lineage in Microsoft Purview.
-
-1. Discover Data Share accounts and share information in your Microsoft Purview account.
-
- - In the home page of your Microsoft Purview account, select **Browse by asset type** and select the **Azure Data Share** tile. You can search for an account name, share name, share snapshot, or partner organization. Otherwise, apply filters on the search result page for account name or share type (sent vs. received shares).
-
- :::image type="content" source="media/how-to-link-azure-data-share/azure-data-share-search-result-page.png" alt-text="Screenshot of Azure Data share in the search result page.":::
-
- >[!Important]
- >For Data Share assets to show in Microsoft Purview, a snapshot job must be run after you connect your Data Share to Microsoft Purview.
-
-1. Track lineage of datasets shared with Azure Data Share.
-
- - From the Microsoft Purview search result page, choose the Data share snapshot (received/sent) and select the **Lineage** tab, to see a lineage graph with upstream and downstream dependencies.
-
- :::image type="content" source="media/how-to-link-azure-data-share/azure-data-share-lineage.png" alt-text="Screenshot of the lineage of datasets shared using Azure Data Share.":::
-
-## Next steps
-
-- [Catalog lineage user guide](catalog-lineage-user-guide.md)
-- [Link to Azure Data Factory for lineage](how-to-link-azure-data-factory.md)
purview How To Manage Quotas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-manage-quotas.md
- Title: Manage resources and quotas-
-description: Learn about the quotas and limits on resources for Microsoft Purview and how to request quota increases.
---- Previously updated : 02/17/2023-
-
-# Manage and increase quotas for resources with Microsoft Purview
-
-This article highlights the limits that currently exist in the Microsoft Purview service. These limits are also known as quotas.
-
-## Microsoft Purview limits
-
-|**Resource**| **Default Limit** |**Maximum Limit**|
-||||
-|Microsoft Purview accounts per region, per tenant (all subscriptions combined)|3|Contact Support|
-|Data Map throughput^ <br><small>There's no default limit on the data map metadata storage</small>| 10 capacity units <br><small>250 operations per second</small> | 100 capacity units <br><small>2,500 operations per second</small> |
-|vCores available for scanning, per account*|160|160|
-|Concurrent scans per Purview account. The limit is based on the type of data sources scanned*|5 | 10 |
-|Maximum time that a scan can run for|7 days|7 days|
-|Size of assets per account|100M physical assets |Contact Support|
-|Maximum size of an asset in a catalog|2 MB|2 MB|
-|Maximum length of an asset name and classification name|4 KB|4 KB|
-|Maximum length of asset property name and value|32 KB|32 KB|
-|Maximum length of classification attribute name and value|32 KB|32 KB|
-|Maximum number of glossary terms, per glossary|100K|100K|
-|Maximum number of self-service policies, per account|3K|3K|
-
-\* Self-hosted integration runtime scenarios aren't included in the limits defined in the above table.
-
-^ Increasing the data map throughput limit also increases the minimum number of capacity units with no usage. See [Data Map throughput](concept-elastic-data-map.md) for more info.
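From the limits table above, 10 capacity units correspond to 250 operations per second, so each unit buys roughly 25 operations per second. A quick sanity check:

```python
# From the limits table: 10 capacity units = 250 ops/sec, i.e. 25 ops/sec per unit
OPS_PER_CAPACITY_UNIT = 250 // 10

def throughput(capacity_units: int) -> int:
    """Approximate Data Map operations per second for a given capacity."""
    return capacity_units * OPS_PER_CAPACITY_UNIT

print(throughput(10))   # 250 (default limit)
print(throughput(100))  # 2500 (maximum limit)
```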
-
-## Request quota increase
-
-Use the following steps to create a new support request from the Azure portal to increase a quota for Microsoft Purview. You can create a quota request for Microsoft Purview accounts in a subscription, accounts in a tenant, or the data map throughput of a specific account.
-
-1. On the [Azure portal](https://portal.azure.com) menu, select **Help + support**.
-
- :::image type="content" source="./media/how-to-manage-quotas/help-plus-support.png" alt-text="Screenshot showing how to navigate to help and support" border="true":::
-
-1. In **Help + support**, select **New support request**.
-
- :::image type="content" source="./media/how-to-manage-quotas/create-new-support-request.png" alt-text="Screenshot showing how to create new support request" border="true":::
-
-1. For **Issue type**, select **Service and subscription limits (quotas)**.
-
-1. For **Subscription**, select the subscription whose quota you want to increase.
-
-1. For **Quota type**, select **Microsoft Purview**. Then select **Next**.
-
- :::image type="content" source="./media/how-to-manage-quotas/enter-support-details.png" alt-text="Screenshot showing how to enter support information" border="true":::
-
-1. In the **Details** window, select **Enter details** to enter additional information.
-1. Choose your **Quota type**, scope (either location or account), and the new limit you want.
-
- :::image type="content" source="./media/how-to-manage-quotas/enter-quota-amount.png" alt-text="Screenshot showing how to enter quota amount for Microsoft Purview accounts per subscription" border="true":::
-
-1. Enter the rest of the required support information. Review and create the support request.
-
-## Next steps
-
-> [!div class="nextstepaction"]
->[Concept: Elastic Data Map in Microsoft Purview](concept-elastic-data-map.md)
-
-> [!div class="nextstepaction"]
->[Tutorial: Scan data with Microsoft Purview](tutorial-scan-data.md)
-
-> [!div class="nextstepaction"]
->[Tutorial: Navigate the home page and search for an asset](tutorial-asset-search.md)
-
-> [!div class="nextstepaction"]
->[Tutorial: Browse assets and view their lineage](tutorial-browse-and-view-lineage.md)
purview How To Manage Term Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-manage-term-templates.md
- Title: How to manage term templates for business glossary
-description: Learn how to manage term templates for business glossary in a Microsoft Purview Data Catalog.
----- Previously updated : 2/27/2023-
-# How to manage term templates for business glossary
-
-Microsoft Purview allows you to create a glossary of terms that are important for enriching your data. Each new term added to your Microsoft Purview Data Catalog Glossary is based on a term template that determines the fields for the term. This article describes how to create a term template and custom attributes that can be associated to glossary terms.
-
-## Manage term templates and custom attributes
-
-Using term templates, you can create custom attributes, group them together and apply a template while creating terms. Once a term is created, the template associated with the term can't be changed.
-
-1. On the **Glossary terms** page, select **Manage term templates**.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/manage-term-templates.png" alt-text="Screenshot of Glossary terms > Manage term templates button.":::
-
-1. The page shows both system and custom attributes. Select the **Custom** tab to create or edit term templates.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/manage-term-custom.png" alt-text="Screenshot of Glossary terms > Manage term templates page.":::
-
-1. Select **+ New term template** and enter a template name and description.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/new-term-template.png" alt-text="Screenshot of Glossary terms > Manage term templates > New term templates":::
-
-1. Select **+ New attribute** to create a new custom attribute for the term template. Enter an attribute name and description. The custom attribute name must be unique within a term template, but the same name can be reused across templates.
-
- Choose the field type from the list of options **Text**, **Single choice**, **Multi choice**, or **Date**. You can also provide a default value string for Text field types. The attribute can also be marked as **required**.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/new-attribute.png" alt-text="Screenshot of Glossary terms > New attribute page.":::
-
-1. Once all the custom attributes are created, select **Preview** to arrange the sequence of custom attributes. You can drag and drop custom attributes in the desired sequence.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/preview-term-template.png" alt-text="Screenshot of Glossary terms > Preview term template.":::
-
-1. Once all the custom attributes are defined, select **Create** to create a term template with custom attributes.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/create-term-template.png" alt-text="Screenshot of Glossary terms > New term template - Create button.":::
-
-1. Existing custom attributes can be marked as expired by checking **Mark as Expired**. Once expired, the attribute can't be reactivated. The expired attribute is greyed out for existing terms. Future new terms created with this term template will no longer show the attribute that has been marked expired.
-
- :::image type="content" source="./media/how-to-manage-glossary-term-templates/expired-attribute.png" alt-text="Screenshot of Glossary terms > Edit attribute to mark it as expired.":::
-
-1. To delete a term template, open the template and select **Delete**. Note, if the term template is in use, you can't delete it until you remove the glossary terms to which it's assigned.
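The required and expired attribute behavior described above can be sketched as a small validation routine (the template and attribute names are hypothetical, not the Purview data model):

```python
# Hypothetical term template: attribute name -> (required, expired)
template = {
    "Approver": (True, False),
    "Cost center": (False, False),
    "Legacy code": (False, True),  # expired: hidden from new terms
}

def validate_term(values: dict) -> list:
    """Return a list of validation problems for a new term's attribute values."""
    problems = []
    for name, (required, expired) in template.items():
        if expired and name in values:
            problems.append(f"'{name}' is expired and can't be set on new terms")
        elif required and name not in values:
            problems.append(f"required attribute '{name}' is missing")
    return problems

print(validate_term({"Cost center": "CC-42"}))  # ["required attribute 'Approver' is missing"]
```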
-
-## Next steps
-
-Follow the [Tutorial: Create and import glossary terms](tutorial-import-create-glossary-terms.md) to learn more.
purview How To Managed Attributes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-managed-attributes.md
- Title: Managed attributes in the Microsoft Purview Data Catalog
-description: Apply business context to assets using managed attributes
----- Previously updated : 04/11/2023--
-# Managed attributes in the Microsoft Purview Data Catalog
-
-Managed attributes are user-defined attributes that provide a business or organization level context to an asset. When applied, managed attributes enable data consumers using the data catalog to gain context on the role an asset plays in a business.
-
-## Terminology
-
-**Managed attribute:** A set of user-defined attributes that provide a business or organization level context to an asset. A managed attribute has a name and a value. For example, "Department" is an attribute name and "Finance" is its value.
-**Attribute group:** A grouping of managed attributes that allow for easier organization and consumption.
-
-## Create managed attributes in Microsoft Purview Studio
-
-In Microsoft Purview Studio, an organization's managed attributes are managed in the **Annotation management** section of the data map application. Follow the instructions below to create a managed attribute.
-
-1. Open the data map application and navigate to **Managed attributes** in the **Annotation management** section.
-1. Select **New**. Choose whether you wish to start by creating an attribute group or a managed attribute.
- :::image type="content" source="media/how-to-managed-attributes/create-new-managed-attribute.png" alt-text="Screenshot that shows how to create a new managed attribute or attribute group.":::
-1. To create an attribute group, enter a name and a description.
- :::image type="content" source="media/how-to-managed-attributes/create-attribute-group.png" alt-text="Screenshot that shows how to create an attribute group.":::
-1. Managed attributes have a name, attribute group, data type, and associated asset types. They also have a required flag, which can be enabled only when the attribute is created as part of a new attribute group. Associated asset types are the data asset types you can apply the attribute to. For example, if you select "Azure SQL Table" for an attribute, you can apply it to Azure SQL Table assets, but not Azure Synapse Dedicated Table assets.
- :::image type="content" source="media/how-to-managed-attributes/create-managed-attribute.png" alt-text="Screenshot that shows how to create a managed attribute.":::
-1. Select **Create** to save your attribute.
-
-### Required managed attributes
-
-When you create a managed attribute as part of a managed attribute group, you can add the **required** flag. The required flag means that a value must be provided for this managed attribute. When a data asset is edited, the required attribute must be filled out before you can close the editor.
-
->[!NOTE]
-> - You can't add the **required** flag to an existing attribute in editing.
-> - You can't add the **required** flag while creating a new attribute outside of an attribute group.
-> You can add this flag only while creating an attribute group.
-
-1. Open the data map application and navigate to **Managed attributes** in the **Annotation management** section.
-1. Select **New** and select **Attribute group**.
-1. Select **New attribute**.
-1. Fill out your attribute details, and select the **Mark as required** flag.
- :::image type="content" source="media/how-to-managed-attributes/mark-as-required.png" alt-text="Screenshot of the mark as required flag on a new attribute being created as a part of a new attribute group.":::
-1. Select **Apply** and finish adding other attributes to complete your attribute group.
-
-### Expiring managed attributes
-
-In the managed attribute management experience, managed attributes can't be deleted, only expired. Expired attributes can't be applied to any assets and are, by default, hidden in the user experience. By default, expired managed attributes aren't removed from an asset. If an asset has an expired managed attribute applied, it can only be removed, not edited.
-
-Both attribute groups and individual managed attributes can be expired. To mark an attribute group or managed attribute as expired, select the **Edit** icon.
--
-Select **Mark as expired** and confirm your change. Once expired, attribute groups and managed attributes can't be reactivated.
--
-## Apply managed attributes to assets in Microsoft Purview Studio
-
-Managed attributes can be applied in the [asset details page](catalog-asset-details.md) in the data catalog. Follow the instructions below to apply a managed attribute.
-
-1. Navigate to an asset by either searching or browsing the data catalog. Open the asset details page.
-1. Select **Edit** on the asset's action bar.
- :::image type="content" source="media/how-to-managed-attributes/edit-asset.png" alt-text="Screenshot that shows how to edit an asset.":::
-1. In the managed attributes section of the editing experience, select **Add attribute**.
-1. Choose the attribute you wish to apply. Attributes are grouped by their attribute group.
-1. Choose the value or values of the applied attribute.
-1. Continue adding more attributes or select **Save** to apply your changes.
-
-## Create managed attributes using APIs
-
-Managed attributes can be programmatically created and applied using the business metadata APIs in Apache Atlas 2.2. For more information, see the [Use Atlas 2.2 APIs](tutorial-atlas-2-2-apis.md) tutorial.
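For orientation, here's a hedged sketch of the kind of business metadata type definition the Atlas 2.2 APIs accept. The payload shape follows the open-source Apache Atlas `businessMetadataDefs` format; the attribute group and attribute names are illustrative, and actually sending the request (endpoint URL and authentication) is omitted:

```python
import json

# Illustrative Atlas 2.2 business metadata definition; names are examples only
payload = {
    "businessMetadataDefs": [
        {
            "name": "Finance",
            "description": "Finance attribute group",
            "attributeDefs": [
                {
                    "name": "CostCenter",
                    "typeName": "string",
                    "isOptional": True,
                    "options": {"maxStrLength": "50"},
                }
            ],
        }
    ]
}

# In open-source Atlas, this payload would be POSTed to /api/atlas/v2/types/typedefs;
# the account endpoint and bearer token are placeholders and are left out here.
print(json.dumps(payload, indent=2)[:40])
```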
-
-## Searching by managed attributes
-
-Once you have created managed attributes, you can refine your [data catalog searches](how-to-search-catalog.md) using these attributes.
-
-1. In a data catalog search, to refine by a managed attribute, first select **Add filter** at the top of the search.
-
- :::image type="content" source="media/how-to-managed-attributes/add-filter.png" alt-text="Screenshot showing the add filter button highlighted on a search in the Data Catalog.":::
-
-1. Select the drop-down, scroll to your list of managed attributes, and select one.
-
- :::image type="content" source="media/how-to-managed-attributes/select-managed-attributes.png" alt-text="Screenshot showing the filter dropdown with the list of added managed attributes highlighted.":::
-
-1. Select your operator, which will be different based on the kinds of values allowed by the attribute. In this example, we've selected Cost Center, which is a text value, so we can compare Cost Center with the text we'll enter.
-
- :::image type="content" source="media/how-to-managed-attributes/select-operator.png" alt-text="Screenshot showing the filter operator dropdown with the available operators highlighted.":::
-
-1. Enter your values and the search will run with your new filter.
-
-## Known limitations
-
-Below are the known limitations of the managed attribute feature as it currently exists in Microsoft Purview.
-
-- Managed attributes can only be deleted if they have not been applied to any assets.
-- Managed attributes can't be applied via the bulk edit experience.
-- After creating an attribute group, you can't edit the name of the attribute group.
-- After creating a managed attribute, you can't update the attribute name, attribute group, or the field type.
-- A managed attribute can only be marked as required during the creation of an attribute group.
-
-## Next steps
--- After creating managed attributes, apply them to assets in the [asset details page](catalog-asset-details.md).
purview How To Metamodel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-metamodel.md
- Title: Manage assets with metamodel
-description: Manage asset types with Microsoft Purview metamodel
----- Previously updated : 01/26/2023---
-# Manage assets with metamodel
--
-Metamodel is a feature in the Microsoft Purview Data Map that gives the technical data in your data map relationships and reference points, making it easier to navigate and understand day to day. Like adding streets and cities to a map, the metamodel orients users so they know where they are and can discover the information they need.
-
-This article will get you started in building a metamodel for your Microsoft Purview Data Map.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- A new or existing Microsoft Purview account. You can [follow our quick-start guide to create one](create-catalog-portal.md).
-- A new or existing resource group, with new data sources placed under it. [Follow this guide to create a new resource group](../azure-resource-manager/management/manage-resource-groups-portal.md).
-- [Data Curator role](catalog-permissions.md#roles) on the collection where the data asset is housed and/or the root collection, depending on what you need. See the guide on [managing Microsoft Purview role assignments](catalog-permissions.md#assign-permissions-to-your-users).
- - Create and modify asset types, modify assets - Data Curator on the collection where the data asset is housed. An asset will need to be moved to your collection after creation for you to be able to modify it.
- - Create and modify assets - Data curator on the root collection.
-
->[!NOTE]
-> As this feature is in preview, these permissions are not the final permission structure for metamodel. Updates will continue to be made to this structure.
-
-## Current limitations
-
->[!NOTE]
-> Since this feature is in preview, available abilities are regularly updated.
-
-- When a new asset is created, you have to refresh the asset to see its relationships.
-- New assets are created in the root collection, but can be edited afterwards to be moved to a new collection.
-- You can't set relationships between two data assets in the Microsoft Purview governance portal.
-- The **Related** tab only shows a "business lineage" view for business assets, not data assets.
-
-## Create and modify asset types
-
-1. To get started, open the data map and select **Asset types**. You'll see a list of available asset types. [Predefined asset types](#predefined-asset-types) will have unique icons. All custom assets are designated with a puzzle piece icon.
-
-1. To create a new asset type, select **New asset type** and add a name, description, and attributes.
-
- :::image type="content" source="./media/how-to-metamodel/create-and-modify-metamodel-asset-types-inline.png" alt-text="Screenshot of the asset types page in the Microsoft Purview Data Map, with the buttons in steps 1 through 3 highlighted." lightbox="./media/how-to-metamodel/create-and-modify-metamodel-asset-types.png":::
-
-1. To define a relationship between two asset types, select **New relationship type**.
-
-1. Give the relationship a name and define its reverse direction. Assign it to one or more pairs of assets. Select **Create** to save your new relationship type.
-
- :::image type="content" source="./media/how-to-metamodel/create-new-relationship-type.png" alt-text="Screenshot of the new relationship type page with a relationship defined and the create button highlighted." border="true":::
-
-1. As you create more asset types, your canvas may get crowded with asset types. To hide an asset from the canvas, select the eye icon on the asset card.
-
- :::image type="content" source="./media/how-to-metamodel/hide-asset.png" alt-text="Screenshot of an asset card in the asset types canvas, the eye icon in the right corner is highlighted." border="true":::
-
-1. To add an asset type back to the canvas, drag it from the left panel.
-
- :::image type="content" source="./media/how-to-metamodel/add-asset.png" alt-text="Screenshot of the asset list to the left of the asset canvas with one item highlighted." border="true":::
-
-## Create and modify assets
-
-1. When you're ready to begin working with assets, go to the data catalog and select **Business assets**.
-
- :::image type="content" source="./media/how-to-metamodel/metamodel-assets-in-catalog.png" alt-text="Screenshot of left menu in the Microsoft Purview governance portal, the data map and business assets buttons highlighted." border="true":::
-
-1. To create a new asset, select **New asset**, select the asset type from the drop-down menu, give it a name, description, and complete any required attributes. Select **Create** to save your new asset.
-
- :::image type="content" source="./media/how-to-metamodel/select-new-asset.png" alt-text="Screenshot of the business assets page with the new asset button highlighted." border="true":::
-
- :::image type="content" source="./media/how-to-metamodel/create-new-asset.png" alt-text="Screenshot of the new asset page with a name and description added and the create button highlighted." border="true":::
-
-1. To establish a relationship between two assets, go to the asset detail page, select **Edit > Related**, and then select the relationship you'd like to populate.
-
- :::image type="content" source="./media/how-to-metamodel/select-edit.png" alt-text="Screenshot of an asset page with the edit button highlighted." border="true":::
-
- :::image type="content" source="./media/how-to-metamodel/establish-relationships.png" alt-text="Screenshot of the edit asset page with the Related tab open and the relationships highlighted." border="true":::
-
-1. Select the asset or assets you'd like to link from the data catalog and select **OK**.
-
- :::image type="content" source="./media/how-to-metamodel/select-related-assets.png" alt-text="Screenshot of the select assets page with two assets selected and the Ok button highlighted." border="true":::
-
-1. Save your changes. You can see the relationships you established in the asset overview.
-
-1. In the **Related** tab of the asset you can also explore a visual representation of related assets.
-
- :::image type="content" source="./media/how-to-metamodel/visualize-related-assets.png" alt-text="Screenshot of the related tab of a business asset." border="true":::
-
- >[!NOTE]
- >This is the experience provided by default from Atlas.
-
-## Predefined asset types
-
-An asset type is a template for storing a concept that's important to your organization: anything you might want to represent in your data map alongside your physical metadata. You can create your own, but Purview also comes with a prepackaged set of business asset types you can modify to meet your needs.
-
-| Asset Type | Description |
-|||
-| Application service| A well-defined software component, especially one that implements a specific business function such as onboarding a new customer, taking an order, or sending an invoice. |
-| Business process | A set of activities that are performed in coordination in an organizational or technical environment that jointly realizes a business goal. |
-| Data Domain | A category of data that is governed or explicitly managed for master data management. |
-| Department | An organizational subunit that only has full recognition within the context of that organization. A department wouldn't be regarded as a legal entity in its own right. |
-| Line of business | An organization subdivision focused on a single product or family of products. |
-| Organization | A collection of people organized together into a community or other social, commercial or political structure. The group has some common purpose or reason for existence that goes beyond the set of people belonging to it and can act as a unit. Organizations are often decomposable into hierarchical structures. |
-| Product | Any offered product or service. |
-| Project | A specific activity used to control the use of resources and associated costs so they're used appropriately in order to successfully achieve the project's goals, such as building a new capability or improving an existing capability. |
-| System | An IT system including hardware and software. |
-
-## Next steps
-
-For more information about the metamodel, see the metamodel [concept page](concept-metamodel.md).
purview How To Monitor Data Map Population https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-monitor-data-map-population.md
- Title: Monitor data map population in Microsoft Purview
-description: This guide describes how to monitor data map population including the scan runs and links in Microsoft Purview.
----- Previously updated : 12/14/2022--
-# Monitor data map population in Microsoft Purview
-
-In Microsoft Purview, you can scan various types of data sources and view the scan status over time; you can also connect other services with Microsoft Purview and view the trends of the ingested assets/relationship. This article outlines how to monitor and get a bird's eye view of the data map population.
-
-## Monitor scan runs
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Open your Microsoft Purview account and select **Data map** -> **Monitoring**. You need the **Data source admin** role on at least one collection to access this page, and you'll see only the scan runs that belong to collections on which you have data source admin privilege.
-
-1. The high-level KPIs show the total scan runs within a period. The time period defaults to the last 30 days; you can also choose the last seven days. Based on the time filter selected, the graph shows the distribution of successful, failed, canceled, and in-progress scan runs by week or by day.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/monitor-scan-runs.png" alt-text="View scan runs over time" lightbox="./media/how-to-monitor-scan-runs/monitor-scan-runs.png":::
-
-1. At the bottom of the graph, there's a **View more** link for you to explore further. The link opens the **Scan status** page. Here you can see a scan name and the number of times it has succeeded, failed, or been canceled in the time period. You can also filter the list by source types.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/view-scan-status.png" alt-text="View scan status in details" lightbox="./media/how-to-monitor-scan-runs/view-scan-status.png":::
-
-1. You can explore a specific scan further by selecting the **scan name**. It connects you to the scan history page, where you can find the list of run IDs with more execution details.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/view-scan-history.png" alt-text="View scan history for a given scan" lightbox="./media/how-to-monitor-scan-runs/view-scan-history.png":::
-
-1. You can select the **run ID** to learn more about the [scan run details](#scan-run-details).
-
-## Scan run details
-
-You can navigate to scan run history for a given scan from different places:
-
-- Go to **Data map** -> **Monitoring** as described in the [Monitor scan runs](#monitor-scan-runs) section.
-- Go to **Data map** -> **Sources** -> select the desired data source -> see **Scans**, **Recent scans**, or **Recent failed scans**.
-- Go to **Data map** -> **Collections** -> select the desired collection -> **Scans** -> select the name of the scan you want to view.
-
-You can click the **run ID** to check more about the scan run details:
-
-- **Run ID**: The GUID used to identify the given scan run.
-- **Run type**: Full or incremental scan.
-- **Scan**: This section summarizes the metrics for the discovery phase, in which Microsoft Purview connects to the source, extracts the metadata/lineage, and classifies the data.
- - **Scan status**:
-
- | Status | Description |
- | -- | |
- | Completed | The scan phase succeeds. |
- | Failed | The scan phase fails. You can check the error details by clicking the "More info" link next to it. |
| Canceled | The scan run is canceled by the user. |
| In Progress | The scan is in progress. |
| Queued | The scan run is waiting for an available integration runtime resource.<br>If you use a self-hosted integration runtime, note that each node can run a number of concurrent scans depending on your machine specification (CPU and memory). Additional scans remain in Queued status. |
| Throttled | The scan run is being throttled, which means this Microsoft Purview account currently has more ongoing scan runs than the allowed maximum concurrent count. Learn more about the limit [here](how-to-manage-quotas.md). This scan run waits and will be executed once your other ongoing scans finish. |
-
You aren't charged for a scan run while it's in "Throttled" or "Queued" status.
-
- - **Scan type**: Manual or scheduled scan.
- **Assets discovered**: The number of assets enumerated from the source. For both full and incremental scans, this count includes all assets in the configured scope, whether they're existing assets or assets newly created or updated since the last scan run. For an incremental scan, however, detailed metadata is extracted only for the newly created or updated assets.
- **Assets classified**: The number of assets sampled to classify the data, regardless of whether the assets have any matching classification. It's a subset of the discovered assets, based on the [sampling mechanism](microsoft-purview-connector-overview.md#sampling-data-for-classification). For an incremental scan, only newly created or updated assets may be selected for classification.
- - **Duration**: The scan phase duration and the start/end time.
-- **Data ingestion**: This section summarizes the metrics for the ingestion phase, in which Microsoft Purview populates the data map with the identified metadata and relationships.
-
- - **Ingestion status**:
-
- | Status | Description |
- | - | |
- | Completed | All of the assets and relationships are ingested into the data map successfully. |
| Partially completed | Some of the assets and relationships are ingested into the data map successfully, while others fail. |
| Failed | The ingestion phase fails. |
| Canceled | The scan run is canceled by the user, so the ingestion is canceled as well. |
| In Progress | The ingestion is in progress. |
- | Queued | The ingestion is waiting for available service resource or waiting for scan to discover metadata. |
-
- **Assets ingested**: The number of assets ingested into the data map. For an incremental scan, it includes only the newly created or updated assets, so it may be less than the "assets discovered" count. When scanning a file-based source, it's the raw asset count before resource set aggregation.
-
- - **Relationships ingested**: The number of relationships ingested into the data map. It includes lineage and other relationships like foreign key relationships.
-
- - **Duration**: The ingestion duration and the start/end time.
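If you export scan-run records (for example, through diagnostic logs) and post-process them, the status semantics above can be captured in a small helper. This is an illustrative sketch only, not part of any Purview SDK; the status strings come from the tables above:

```python
# Status strings from the scan status table above.
# Illustrative helper only -- not part of any Purview SDK.

TERMINAL_STATUSES = {"Completed", "Failed", "Canceled"}
WAITING_STATUSES = {"Queued", "Throttled"}  # a scan run isn't charged in these states

def is_terminal(status: str) -> bool:
    """True once a scan run has reached a final state."""
    return status in TERMINAL_STATUSES

def is_charged(status: str) -> bool:
    """True while a scan run accrues charges (i.e., it's not just waiting)."""
    return status not in WAITING_STATUSES

print(is_terminal("Throttled"), is_charged("Throttled"))  # False False
```

A helper like this is mainly useful when polling exported scan-run records: stop polling on `is_terminal`, and use `is_charged` to reason about billable time.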
-
-## View the exception log (Preview)
-
-When some assets or relationships fail to ingest into the data map during a scan (for example, when the ingestion status ends up as partially completed), you see a "**Download log**" button in the [scan run details](#scan-run-details) panel. It provides exception log files that capture the details of the failures.
-
-The following table shows the schema of a log file.
-
-| Column | Description |
-| | -- |
-| TimeStamp | The UTC timestamp when the ingestion operation happens.|
-| ErrorCode | Error code of the exception. |
-| OperationItem | Identifier for the failed asset/relationship, usually using the fully qualified name. |
-| Message | More information on which asset/relationship failed to ingest and why. If there's an ingestion failure for a resource set, it may apply to multiple assets matching the same naming pattern, and the message includes the impacted count. |
-
-Currently, the exception log doesn't include failures that happened during the scan phase (metadata discovery). Support for those will be added later.
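Because the log follows a simple tabular schema, it's straightforward to post-process. The sketch below assumes a comma-separated file with the four columns above; the delimiter and the sample rows are assumptions for illustration, since the exact file format isn't documented here:

```python
import csv
import io

# Hypothetical exception-log content following the schema above
# (TimeStamp, ErrorCode, OperationItem, Message). Real files may differ.
sample = """TimeStamp,ErrorCode,OperationItem,Message
2023-01-01T00:00:00Z,4001,mssql://server/db/schema/table1,Ingestion timed out
2023-01-01T00:05:00Z,4002,https://account.blob.core.windows.net/data/part_{N}.csv,Resource set ingestion failed; impacted count: 12
"""

def load_exception_log(text: str):
    """Parse exception-log rows into dicts keyed by the schema columns."""
    return list(csv.DictReader(io.StringIO(text)))

rows = load_exception_log(sample)
print(rows[1]["ErrorCode"])  # 4002
```

Grouping the parsed rows by `ErrorCode` or by `OperationItem` prefix is a quick way to see whether failures cluster on one resource set or one error type.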
-
-## Monitor links
-
-You can connect other services with Microsoft Purview to establish a "link", which makes the metadata and lineage of that service's assets available to Microsoft Purview. Currently, links are supported for [Azure Data Factory](how-to-link-azure-data-factory.md) and [Azure Synapse Analytics](how-to-lineage-azure-synapse-analytics.md).
-
-To monitor the assets and relationship ingested over the links:
-
-1. Go to your Microsoft Purview account -> open the **Microsoft Purview governance portal** -> **Data map** -> **Monitoring** -> **Links**. You need the **Data source admin** role on at least one collection to access the Monitoring tab, and you'll see only the results that belong to collections on which you have data source admin privilege. Permission on the root collection is needed to monitor Azure Data Factory and Azure Synapse Analytics links.
-
-1. You can see the high-level KPIs, including the total number of sources and the number of ingested assets and relationships (lineage), followed by trending charts over time. You can apply additional filters to narrow down the results:
-
- - Source type
- - Source name
- Date range: The default is the last 30 days. You can also choose the last seven days or a custom date range. The retention is 45 days.
-
The metrics are reported up to the date and time shown at the top right corner, and the aggregation happens hourly.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/monitor-links.png" alt-text="Screenshot of view link results." lightbox="./media/how-to-monitor-scan-runs/monitor-links.png":::
-
-1. At the bottom of the graph, there's a **View more** link for you to explore further. In the **Link status** page, you can see a list of source names along with the source type, assets ingested, relationships ingested, and the last run date and time. The filters from the previous page are carried over, and you can further filter the list by source type, source name, and date range.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/monitor-links-drilldown.png" alt-text="Screenshot of view link results by source." lightbox="./media/how-to-monitor-scan-runs/monitor-links-drilldown.png":::
-
-1. You can drill down to each source to see the next level of details by selecting the source name. For example, for Azure Data Factory, it shows how each pipeline activity reports assets and relationships to Microsoft Purview, with the name in the format of `<pipeline_name>/<activity_name>`.
-
- :::image type="content" source="./media/how-to-monitor-scan-runs/monitor-links-drilldown-second-layer.png" alt-text="Screenshot of view link results by source's sub-artifacts." lightbox="./media/how-to-monitor-scan-runs/monitor-links-drilldown-second-layer.png":::
-
-### Known limitations
-
-- For Azure Data Factory and Azure Synapse Analytics, link monitoring currently captures the assets and relationships generated from the copy activity, but not from data flow or SSIS activities.
-- The aggregation and date filter are in UTC time.
-
-## Scans no longer run
-
-If your Microsoft Purview scans used to run successfully but are now failing, check these things:
-
-1. Check the error message first to see the failure details.
-1. Have the credentials for your resource changed or been rotated? If so, update your scan to use the correct credentials.
-1. Is an [Azure Policy](../governance/policy/overview.md) preventing **updates to Storage accounts**? If so, follow the [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) to create an exception for Microsoft Purview accounts.
-1. Are you using a self-hosted integration runtime? Check that it's up to date with the latest software and that it's connected to your network.
-
-## Next steps
-
-* [Microsoft Purview supported data sources and file types](azure-purview-connector-overview.md)
-* [Manage data sources](manage-data-sources.md)
-* [Scan and ingestion](concept-scans-and-ingestion.md)
purview How To Monitor With Azure Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-monitor-with-azure-monitor.md
- Title: How to monitor Microsoft Purview
-description: Learn how to configure Microsoft Purview metrics, alerts, and diagnostic settings by using Azure Monitor.
---- Previously updated : 04/07/2022--
-# Microsoft Purview metrics in Azure Monitor
-
-This article describes how to configure metrics, alerts, and diagnostic settings for Microsoft Purview using Azure Monitor.
-
-## Monitor Microsoft Purview
-
-Microsoft Purview admins can use Azure Monitor to track the operational state of a Microsoft Purview account. Metrics are collected to provide data points for you to track potential problems, troubleshoot, and improve the reliability of the account. The metrics are sent to Azure Monitor for events occurring in Microsoft Purview.
-
-## Aggregated metrics
-
-The metrics can be accessed from the Azure portal for a Microsoft Purview account. Access to the metrics is controlled by the role assignments of the Microsoft Purview account. Users need to be part of the "Monitoring Reader" role in Microsoft Purview to see the metrics. Check out [Monitoring Reader role permissions](../azure-monitor/roles-permissions-security.md#built-in-monitoring-roles) to learn more about the role's access level.
-
-The person who created the Microsoft Purview account automatically gets permissions to view metrics. If anyone else wants to see metrics, add them to the **Monitoring Reader** role, by following these steps:
-
-### Add a user to the Monitoring Reader role
-
-To add a user to the **Monitoring Reader** role, the owner of Microsoft Purview account or the Subscription owner can follow these steps:
-
-1. Go to the [Azure portal](https://portal.azure.com) and search for the Microsoft Purview account name.
-
-1. Select **Access control (IAM)**.
-
-1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
-
-1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
-
- | Setting | Value |
- | | |
- | Role | Monitoring Reader |
- | Assign access to | User, group, or service principal |
- | Members | &lt;Azure AD account user&gt; |
-
- :::image type="content" source="../../includes/role-based-access-control/media/add-role-assignment-page.png" alt-text="Screenshot showing Add role assignment page in Azure portal.":::
-
-## Metrics visualization
-
-Users in the **Monitoring Reader** role can see the aggregated metrics and diagnostic logs sent to Azure Monitor. The metrics are listed in the Azure portal for the corresponding Microsoft Purview account. In the Azure portal, select the Metrics section to see the list of all available metrics.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/purview-metrics.png" alt-text="Screenshot showing available Microsoft Purview metrics section." lightbox="./media/how-to-monitor-with-azure-monitor/purview-metrics.png":::
-
-Microsoft Purview users can also access the metrics page directly from the management center of the Microsoft Purview account. Select Azure Monitor on the main page of the Microsoft Purview management center to launch the Azure portal.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png" alt-text="Screenshot to launch Microsoft Purview metrics from management center." lightbox="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png":::
-
-### Available metrics
-
-Before using the metrics section in the Azure portal, read the following two documents: [Getting started with Metric Explorer](../azure-monitor/essentials/metrics-getting-started.md) and [Advanced features of Metric Explorer](../azure-monitor/essentials/metrics-charts.md).
-
-The following table contains the list of metrics available to explore in the Azure portal:
-
-| Metric Name | Metric Namespace | Aggregation type | Description |
-| - | - | - | -- |
-| Data Map Capacity Units | Elastic data map | Sum <br> Count | Aggregates the elastic data map capacity units over the time period |
-| Data Map Storage Size | Elastic data map | Sum <br> Avg | Aggregates the elastic data map storage size over the time period |
-| Scan Canceled | Automated scan | Sum <br> Count | Aggregates the canceled data source scans over the time period |
-| Scan Completed | Automated scan | Sum <br> Count | Aggregates the completed data source scans over the time period |
-| Scan Failed | Automated scan | Sum <br> Count | Aggregates the failed data source scans over the time period |
-| Scan time taken | Automated scan | Min <br> Max <br> Sum <br> Avg | Aggregates the total time taken by scans over the time period |
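These aggregation types behave as in any Azure Monitor metrics chart. Over a set of sample datapoints (made-up "Scan time taken" values, in seconds, used purely for illustration), they compute as follows:

```python
# Hypothetical "Scan time taken" datapoints, in seconds.
datapoints = [120, 340, 95, 410]

aggregations = {
    "Sum": sum(datapoints),          # total scan time in the period
    "Count": len(datapoints),        # number of datapoints
    "Min": min(datapoints),          # fastest scan
    "Max": max(datapoints),          # slowest scan
    "Avg": sum(datapoints) / len(datapoints),  # mean scan time
}
print(aggregations["Avg"])  # 241.25
```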
-
-## Monitoring alerts
-
-Alerts can be accessed from the Azure portal for a Microsoft Purview account. Access to alerts is controlled by the role assignments of the Microsoft Purview account, just like metrics.
-A user can set up alert rules in their Microsoft Purview account to get notified when important monitoring events happen.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/step-one-alerts-setting.png" alt-text="Screenshot showing creating an alert." lightbox="./media/how-to-monitor-with-azure-monitor/step-one-alerts-setting.png":::
-
-The user can also create specific alert rules and conditions for signals within Microsoft Purview.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/step-two-alerts-setting.png" alt-text="Screenshot showing addition alerts rules and conditions to a signal." lightbox="./media/how-to-monitor-with-azure-monitor/step-two-alerts-setting.png":::
-
-## Sending diagnostic logs
-
-Raw telemetry events are sent to Azure Monitor. Events can be sent to a Log Analytics Workspace, archived to a customer storage account of choice, streamed to an event hub, or sent to a partner solution for further analysis. Exporting of logs is done via the Diagnostic settings for the Microsoft Purview account on the Azure portal.
-
-Follow these steps to create a diagnostic setting for your Microsoft Purview account and send to your preferred destination:
-
-1. Locate your Microsoft Purview account in the [Azure portal](https://portal.azure.com).
-2. In the menu under **Monitoring** select **Diagnostic settings**.
-3. Select **Add diagnostic setting** to create a new diagnostic setting to collect platform logs and metrics. For more information about these settings and logs, see [the Azure Monitor documentation](../azure-monitor/essentials/diagnostic-settings.md).
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/create-diagnostic-setting.png" alt-text="Screenshot showing creating diagnostic log." lightbox="./media/how-to-monitor-with-azure-monitor/create-diagnostic-setting.png":::
-
-4. You can send your logs to:
-
-- [A Log Analytics workspace](#destination---log-analytics-workspace)
-- [A storage account](#destination---storage-account)
-- [An event hub](#destination---event-hub)
-
-### Destination - Log Analytics Workspace
-
-1. In the **Destination details**, select **Send to Log Analytics workspace**.
-2. Create a name for the diagnostic setting, select the applicable log category group, select the right subscription and workspace, and then select **Save**. The workspace doesn't have to be in the same region as the resource being monitored. To create a new workspace, you can follow this article: [Create a New Log Analytics Workspace](../azure-monitor/logs/quick-create-workspace.md).
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/log-analytics-diagnostic-setting.png" alt-text="Screenshot showing assigning log analytics workspace to send event to." lightbox="./media/how-to-monitor-with-azure-monitor/log-analytics-diagnostic-setting.png":::
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/log-analytics-select-workspace-diagnostic-setting.png" alt-text="Screenshot showing saved diagnostic log event to log analytics workspace." lightbox="./media/how-to-monitor-with-azure-monitor/log-analytics-select-workspace-diagnostic-setting.png":::
-
-3. Verify the changes in your Log Analytics Workspace by performing some operations to populate data, for example, creating, updating, or deleting a policy. Then open the **Log Analytics Workspace**, navigate to **Logs**, enter the query filter **"purviewsecuritylogs"**, and select **"Run"** to execute the query.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/log-analytics-view-logs-diagnostic-setting.png" alt-text="Screenshot showing log results in the Log Analytics Workspace after a query was run." lightbox="./media/how-to-monitor-with-azure-monitor/log-analytics-view-logs-diagnostic-setting.png":::
-
-### Destination - Storage account
-
-1. In the **Destination details**, select **Archive to a storage account**.
-2. Create a diagnostic setting name, select the log category, select **Archive to a storage account** as the destination, select the right subscription and storage account, and then select **Save**. A dedicated storage account is recommended for archiving the diagnostic logs. If you need a storage account, you can follow this article: [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal).
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/storage-diagnostic-setting.png" alt-text="Screenshot showing assigning storage account for diagnostic log." lightbox="./media/how-to-monitor-with-azure-monitor/storage-diagnostic-setting.png":::
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/storage-select-diagnostic-setting.png" alt-text="Screenshot showing saved log events to storage account." lightbox="./media/how-to-monitor-with-azure-monitor/storage-select-diagnostic-setting.png":::
-
-3. To see logs in the **Storage Account**, perform a sample action (for example: create/update/delete a policy), then open the **Storage Account**, navigate to **Containers**, and select the container name.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/storage-two-diagnostic-setting.png" alt-text="Screenshot showing container in storage account where the diagnostic logs have been sent to." lightbox="./media/how-to-monitor-with-azure-monitor/storage-two-diagnostic-setting.png":::
-
-4. Navigate to the file and download it to see the logs.
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/storage-navigate-diagnostic-setting.png" alt-text="Screenshot showing folders with details of logs." lightbox="./media/how-to-monitor-with-azure-monitor/storage-navigate-diagnostic-setting.png":::
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/storage-select-logs-diagnostic-setting.png" alt-text="Screenshot showing details of logs." lightbox="./media/how-to-monitor-with-azure-monitor/storage-select-logs-diagnostic-setting.png":::
-
-### Destination - Event hub
-
-1. In the **Destination details**, select **Stream to an event hub**.
-2. Create a diagnostic setting name, select the log category, select **Stream to an event hub** as the destination, select the right subscription, Event Hubs namespace, event hub name, and event hub policy name, and then select **Save**. An Event Hubs namespace is required before you can stream to an event hub. If you need to create one, you can follow this article: [Create an event hub and Event Hubs namespace](../event-hubs/event-hubs-create.md).
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/step-four-diagnostic-setting.png" alt-text="Screenshot showing streaming to an event hub for diagnostic log." lightbox="./media/how-to-monitor-with-azure-monitor/step-four-diagnostic-setting.png":::
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/step-four-one-diagnostic-setting.png" alt-text="Screenshot showing saved log events to event hub." lightbox="./media/how-to-monitor-with-azure-monitor/step-four-one-diagnostic-setting.png":::
-
-3. To see logs in the **Event Hubs namespace**, go to the [Azure portal](https://portal.azure.com), search for the name of the Event Hubs namespace you created earlier, go to the namespace, and select **Overview**. To find out more about capturing and reading captured audit events in the Event Hubs namespace, you can follow this article: [Audit logs & diagnostics](../purview/tutorial-purview-audit-logs-diagnostics.md).
-
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/step-four-one-diagnostic-setting.png" alt-text="Screenshot showing activities in the event hub." lightbox="./media/how-to-monitor-with-azure-monitor/step-four-one-diagnostic-setting.png":::
-
-## Sample Log
-
-Here's a sample log you'd receive from a diagnostic setting.
-
-The event tracks the scan life cycle. A scan operation progresses through a sequence of states: Queued, Running, and finally a terminal state of Succeeded, Failed, or Canceled. An event is logged for each state transition, and the schema of the event has the following properties.
-
-```JSON
-{
- "time": "<The UTC time when the event occurred>",
- "properties": {
- "dataSourceName": "<Registered data source friendly name>",
- "dataSourceType": "<Registered data source type>",
- "scanName": "<Scan instance friendly name>",
- "assetsDiscovered": "<If the resultType is succeeded, count of assets discovered in scan run>",
- "assetsClassified": "<If the resultType is succeeded, count of assets classified in scan run>",
- "scanQueueTimeInSeconds": "<If the resultType is succeeded, total seconds the scan instance in queue>",
- "scanTotalRunTimeInSeconds": "<If the resultType is succeeded, total seconds the scan took to run>",
- "runType": "<How the scan is triggered>",
- "errorDetails": "<Scan failure error>",
- "scanResultId": "<Unique GUID for the scan instance>"
- },
- "resourceId": "<The azure resource identifier>",
- "category": "<The diagnostic log category>",
- "operationName": "<The operation that cause the event Possible values for ScanStatusLogEvent category are:
- |AdhocScanRun
- |TriggeredScanRun
- |StatusChangeNotification>",
"resultType": "Queued - indicates a scan is queued.
Running - indicates a scan entered a running state.
Succeeded - indicates a scan completed successfully.
Failed - indicates a scan failure event.
Cancelled - indicates a scan was cancelled. ",
- "resultSignature": "<Not used for ScanStatusLogEvent category. >",
- "resultDescription": "<This will have an error message if the resultType is Failed. >",
- "durationMs": "<Not used for ScanStatusLogEvent category. >",
- "level": "<The log severity level. Possible values are:
- |Informational
- |Error >",
"location": "<The location of the Microsoft Purview account>"
-}
-```
-
-Here's a sample log for an event instance:
-
-```JSON
-{
- "time": "2020-11-24T20:25:13.022860553Z",
- "properties": {
- "dataSourceName": "AzureDataExplorer-swD",
- "dataSourceType": "AzureDataExplorer",
- "scanName": "Scan-Kzw-shoebox-test",
- "assetsDiscovered": "0",
- "assetsClassified": "0",
- "scanQueueTimeInSeconds": "0",
- "scanTotalRunTimeInSeconds": "0",
- "runType": "Manual",
- "errorDetails": "empty_value",
- "scanResultId": "0dc51a72-4156-40e3-8539-b5728394561f"
- },
- "resourceId": "/SUBSCRIPTIONS/111111111111-111-4EB2/RESOURCEGROUPS/FOOBAR-TEST-RG/PROVIDERS/MICROSOFT.PURVIEW/ACCOUNTS/FOOBAR-HEY-TEST-NEW-MANIFEST-EUS",
- "category": "ScanStatusLogEvent",
- "operationName": "TriggeredScanRun",
- "resultType": "Delayed",
- "resultSignature": "empty_value",
- "resultDescription": "empty_value",
- "durationMs": 0,
- "level": "Informational",
- "location": "eastus",
-}
-```
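Once events land in your chosen destination, they're plain JSON and easy to post-process. This sketch parses an abridged copy of the sample event above and pulls out the values most useful for alerting; the field names match the schema shown earlier:

```python
import json

# Abridged copy of the sample ScanStatusLogEvent shown above.
event_json = """{
  "properties": {
    "dataSourceName": "AzureDataExplorer-swD",
    "scanName": "Scan-Kzw-shoebox-test",
    "runType": "Manual",
    "scanResultId": "0dc51a72-4156-40e3-8539-b5728394561f"
  },
  "category": "ScanStatusLogEvent",
  "operationName": "TriggeredScanRun",
  "resultType": "Delayed",
  "level": "Informational"
}"""

event = json.loads(event_json)
print(event["operationName"], event["resultType"])  # TriggeredScanRun Delayed
```

Filtering on `category == "ScanStatusLogEvent"` and a terminal `resultType` is a typical starting point for downstream alerting logic.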
-
-## Next steps
-
-[Elastic data map in Microsoft Purview](concept-elastic-data-map.md)
purview How To Policies Data Owner Arc Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-arc-sql-server.md
- Title: Provision access by data owner for Azure Arc-enabled SQL Server (preview)
-description: Step-by-step guide on how data owners can configure access to Azure Arc-enabled SQL Servers through Microsoft Purview access policies.
----- Previously updated : 07/06/2023--
-# Provision access by data owner for Azure Arc-enabled SQL Server (preview)
--
-[Data owner policies](concept-policies-data-owner.md) are a type of Microsoft Purview access policies. They allow you to manage access to user data in sources that have been registered for *Data Use Management* in Microsoft Purview. These policies can be authored directly in the Microsoft Purview governance portal, and after publishing, they get enforced by the data source.
-
-This guide covers how a data owner can delegate authoring policies in Microsoft Purview to enable access to Azure Arc-enabled SQL Server. The following actions are currently enabled: *Read*. This action is only supported for policies at server level. *Modify* is not supported at this point.
-
-## Prerequisites
-
-## Microsoft Purview configuration
-
-### Register data sources in Microsoft Purview
-Register each data source with Microsoft Purview to later define access policies.
-
-1. Sign in to Microsoft Purview Studio.
-
-1. Navigate to the **Data map** feature on the left pane, select **Sources**, then select **Register**. Type "Azure Arc" in the search box and select **SQL Server on Azure Arc**. Then select **Continue**.
-![Screenshot shows how to select a source for registration.](./media/how-to-policies-data-owner-sql/select-arc-sql-server-for-registration.png)
-
-1. Enter a **Name** for this registration. It is best practice to make the name of the registration the same as the server name in the next step.
-
-1. Select an **Azure subscription**, **Server name**, and **Server endpoint**.
-
-1. **Select a collection** to put this registration in.
-
-1. Enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-1. Upon enabling Data Use Management, Microsoft Purview automatically captures the **Application ID** of the App Registration related to this Azure Arc-enabled SQL Server, if one has been configured. If the association between the Azure Arc-enabled SQL Server and the App Registration changes in the future, come back to this screen and select the refresh button next to it.
-
-1. Select **Register** or **Apply** at the bottom.
-
-Once your data source has the **Data Use Management** toggle set to *Enabled*, it will look like this picture:
-![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-arc-sql.png)
-
-## Enable policies in Azure Arc-enabled SQL Server
-
-## Create and publish a Data owner policy
-
-Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example:
-
-**Example: Read policy**. This policy assigns the Azure AD principal 'sg-Finance' to the *SQL Data reader* action, in the scope of SQL server *DESKTOP-xxx*. This policy has also been published to that server. Note that policies related to this action are not supported below server level.
-
-![Screenshot shows a sample data owner policy giving Data Reader access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-arc-sql-server-data-reader.png)
-
-> [!Note]
-> - Given that scan is not currently available for this data source, data reader policies can only be created at server level. Use the **Data sources** box instead of the Asset box when authoring the **data resources** part of the policy.
-> - There is a known issue with SQL Server Management Studio that prevents right-clicking on a table and choosing the option "Select Top 1000 rows".
--
->[!Important]
-> - Publish is a background operation. It can take up to **5 minutes** for the changes to be reflected in this data source.
-> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
--
-## Unpublish a data owner policy
-Follow this link for the steps to [unpublish a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#unpublish-a-policy).
-
-## Update or delete a data owner policy
-Follow this link for the steps to [update or delete a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#update-or-delete-a-policy).
-
-## Test the policy
-
-After creating the policy, any of the Azure AD users in the Subject should now be able to connect to the data sources in the scope of the policy. To test, use SSMS or any SQL client and try to query. Attempt access to a SQL table you have provided read access to.
-
-If you require additional troubleshooting, see the [Next steps](#next-steps) section in this guide.
-
-## Role definition detail
-This section contains a reference of how relevant Microsoft Purview data policy roles map to specific actions in SQL data sources.
-
-| **Microsoft Purview policy role definition** | **Data source specific actions** |
-|-|--|
-|||
-| *Read* |Microsoft.Sql/sqlservers/Connect |
-||Microsoft.Sql/sqlservers/databases/Connect |
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows|
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows |
-|||
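The role-to-action mapping in the table above can be sketched as a simple lookup. This is an illustrative model only, not any Purview SDK or API; the function and dictionary names are invented:

```python
# Hypothetical sketch: the Purview "Read" policy role expressed as a lookup
# table of the SQL actions listed in the table above. Names mirror the
# table, not any real SDK.
PURVIEW_SQL_ROLE_ACTIONS = {
    "Read": [
        "Microsoft.Sql/sqlservers/Connect",
        "Microsoft.Sql/sqlservers/databases/Connect",
        "Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows",
        "Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows",
    ],
}

def actions_for(role: str) -> list:
    """Return the data-source actions a Purview policy role maps to."""
    return PURVIEW_SQL_ROLE_ACTIONS.get(role, [])

print(len(actions_for("Read")))  # 4
```

Note that *Modify* is absent here because it is not supported for this source, so a lookup for it returns an empty list.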
--
-## Next steps
-Check the blog, demo, and related how-to guides:
-* Doc: [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
-* Doc: [Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-* Doc: [Microsoft Purview data owner policies on an Azure SQL Database](./how-to-policies-data-owner-azure-sql-db.md)
-* Doc: [Troubleshoot Microsoft Purview policies for SQL data sources](./troubleshoot-policy-sql.md)
-* Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
purview How To Policies Data Owner Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-authoring-generic.md
- Title: Authoring and publishing data owner access policies (preview)
-description: Step-by-step guide on how a data owner can author and publish access policies in Microsoft Purview
- Previously updated: 07/06/2023
-# Authoring and publishing data owner access policies (Preview)
--
-[Data owner policies](concept-policies-data-owner.md) are a type of Microsoft Purview access policy. They let you manage access to user data in sources that have been registered for *Data Use Management* in Microsoft Purview. These policies are authored directly in the Microsoft Purview governance portal and, after publishing, are enforced by the data source.
-
-This guide describes how to create, update, and publish data owner policies in the Microsoft Purview governance portal.
-
-## Prerequisites
-
-### Configuration
-Before authoring policies in the Microsoft Purview policy portal, you'll need to configure Microsoft Purview and the data sources so that they can enforce those policies.
-
-1. Follow any policy-specific prerequisites for your source. Check the [Microsoft Purview supported data sources table](./microsoft-purview-connector-overview.md) and select the link in the **Access Policy** column for sources where access policies are available. Follow any steps listed in the Access policy or Prerequisites sections.
-1. Register the data source in Microsoft Purview. Follow the **Prerequisites** and **Register** sections of the [source pages](./microsoft-purview-connector-overview.md) for your resources.
-1. Enable the Data use management option on the data source registration. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
--
-## Create a new policy
-
-This section describes the steps to create a new policy in Microsoft Purview.
-Ensure you have the *Policy Author* permission as described [here](how-to-enable-data-use-management.md#configure-microsoft-purview-permissions-to-create-update-or-delete-access-policies).
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
-1. Select the **New Policy** button in the policy page.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-1.png" alt-text="Screenshot showing data owner can access the Policy functionality in Microsoft Purview when it wants to create policies.":::
-
-1. The new policy page will appear. Enter the policy **Name** and **Description**.
-
-1. To add policy statements to the new policy, select the **New policy statement** button. This will bring up the policy statement builder.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/create-new-policy.png" alt-text="Screenshot showing data owner can create a new policy statement.":::
-
-1. Select the **Effect** button and choose *Allow* from the drop-down list.
-
-1. Select the **Action** button and choose *Read* or *Modify* from the drop-down list.
-
-1. Select the **Data Resources** button to bring up the window to enter Data resource information, which will open to the right.
-
-1. Under the **Data Resources** Panel do **one of two things** depending on the granularity of the policy:
- - To create a broad policy statement that covers an entire data source, resource group, or subscription that was previously registered, use the **Data sources** box and select its **Type**.
- - To create a fine-grained policy, use the **Assets** box instead. Enter the **Data Source Type** and the **Name** of a previously registered and scanned data source. See example in the image.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-data-source-type.png" alt-text="Screenshot showing data owner can select a Data Resource when editing a policy statement.":::
-
-1. Select the **Continue** button and traverse the hierarchy to select an underlying data object (for example, a folder or file). Select **Recursive** to apply the policy from that point in the hierarchy down to any child data objects. Then select the **Add** button. This will take you back to the policy editor.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-asset.png" alt-text="Screenshot showing data owner can select the asset when creating or editing a policy statement.":::
-
-1. Select the **Subjects** button and enter the subject identity as a principal, group, or MSI. Note that Microsoft 365 groups are supported, but updates to group membership take up to 1 hour to be reflected in Azure AD. Then select the **OK** button. This will take you back to the policy editor.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-subject.png" alt-text="Screenshot showing data owner can select the subject when creating or editing a policy statement.":::
-
-1. Repeat steps 5 to 11 to add any more policy statements.
-
-1. Select the **Save** button to save the policy.
-
-Now that you have created your policy, you will need to publish it for it to become active.
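The statement assembled in steps 5 to 11 above can be sketched as a small data structure. This is a hedged illustration: the field names are invented for clarity and are not the Purview API schema; the subject and server name are taken from the examples elsewhere in this article:

```python
# Hedged sketch of a policy statement (Effect, Action, Data Resources,
# Subjects). Field names are invented for illustration, not the Purview
# API schema; "sg-Finance" and "DESKTOP-xxx" come from the doc's examples.
policy = {
    "name": "finance-read",
    "description": "Read access for the finance group",
    "state": "draft",  # a newly created policy starts as a draft
    "statements": [
        {
            "effect": "Allow",                                    # Effect -> Allow
            "action": "Read",                                     # Action -> Read or Modify
            "dataResources": [{"type": "SQL Server", "name": "DESKTOP-xxx"}],
            "subjects": ["sg-Finance"],                           # principal, group, or MSI
        }
    ],
}

print(policy["state"])  # draft
```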
-
-## Publish a policy
-A newly created policy is in the **draft** state. The process of publishing associates the new policy with one or more data sources under governance. This is called "binding" a policy to a data source.
-
-Ensure you have the *Data Source Admin* permission as described [here](how-to-enable-data-use-management.md#configure-microsoft-purview-permissions-for-publishing-data-owner-policies)
-
-The steps to publish a policy are as follows:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-2.png" alt-text="Screenshot showing data owner can access the Policy functionality in Microsoft Purview when it wants to update a policy by selecting Data policies.":::
-
-1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be published. Select the **Publish** button on the right top corner of the page.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/publish-policy.png" alt-text="Screenshot showing data owner can publish a policy.":::
-
-1. A list of data sources is displayed. You can enter a name to filter the list. Then, select each data source where this policy is to be published and then select the **Publish** button.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-data-sources-publish-policy.png" alt-text="Screenshot showing data owner can select the data source where the policy will be published.":::
-
->[!Note]
-> After making changes to a policy, there is no need to publish it again for it to take effect, as long as the data sources remain the same.
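The publish flow described above can be modeled as binding a draft policy to one or more data sources, with later edits taking effect without a republish. This is a hypothetical sketch; the functions and field names are invented, and `relecloud-sql-srv2` is a made-up server name:

```python
# Hypothetical model of the publish flow: publishing binds a policy to one
# or more registered data sources; editing a published policy keeps its
# bindings and needs no republish. Names are invented for illustration.
def publish(policy: dict, data_sources: list) -> dict:
    policy["state"] = "published"
    policy["boundTo"] = sorted(set(policy.get("boundTo", [])) | set(data_sources))
    return policy

def edit(policy: dict, description: str) -> dict:
    # Changes are picked up with the next pull; bindings are unchanged.
    policy["description"] = description
    return policy

p = {"name": "finance-read", "state": "draft"}
publish(p, ["relecloud-sql-srv2"])
edit(p, "tightened scope")
print(p["state"], p["boundTo"])  # published ['relecloud-sql-srv2']
```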
-
-## Unpublish a policy
-Ensure you have the *Data Source Admin* permission as described [here](how-to-enable-data-use-management.md#configure-microsoft-purview-permissions-for-publishing-data-owner-policies)
-
-The steps to unpublish a policy are as follows:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-2.png" alt-text="Screenshot showing data owner can access the Policy functionality in Microsoft Purview when it wants to update a policy by selecting Data policies.":::
-
-1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be unpublished. Select the trash can icon.
-
-![Screenshot shows how to unpublish a data owner policy.](./media/how-to-policies-data-owner-authoring-generic/unpublish-policy.png)
-
-## Update or delete a policy
-
-Steps to update or delete a policy in Microsoft Purview are as follows.
-Ensure you have the *Policy Author* permission as described [here](how-to-enable-data-use-management.md#configure-microsoft-purview-permissions-to-create-update-or-delete-access-policies)
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-2.png" alt-text="Screenshot showing data owner can access the Policy functionality in Microsoft Purview when it wants to update a policy.":::
-
-1. The Policy portal will present the list of existing policies in Microsoft Purview. Select the policy that needs to be updated.
-
-1. The policy details page will appear, including Edit and Delete options. Select the **Edit** button, which brings up the policy statement builder. Now, any parts of the statements in this policy can be updated. To delete the policy, use the **Delete** button.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/edit-policy.png" alt-text="Screenshot showing data owner can edit or delete a policy statement.":::
-
-## Next steps
-
-For specific guides on creating policies, you can follow these tutorials:
-- Doc: [Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-- Doc: [Microsoft Purview data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md)
-- Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
purview How To Policies Data Owner Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-azure-sql-db.md
- Title: Provision access by data owner for Azure SQL Database (preview)
-description: Step-by-step guide on how data owners can configure access for Azure SQL Database through Microsoft Purview access policies.
- Previously updated: 07/06/2023
-# Provision access by data owner for Azure SQL Database (preview)
--
-[Data owner policies](concept-policies-data-owner.md) are a type of Microsoft Purview access policy. They let you manage access to user data in sources that have been registered for *Data Use Management* in Microsoft Purview. These policies are authored directly in the Microsoft Purview governance portal and, after publishing, are enforced by the data source.
-
-This guide covers how a data owner can delegate policy authoring in Microsoft Purview to enable access to Azure SQL Database. The following action is currently enabled: *Read*. *Modify* is not supported at this point.
-
-## Prerequisites
-
-## Microsoft Purview configuration
-
-### Register the data sources in Microsoft Purview
-The Azure SQL Database data source needs to be registered first with Microsoft Purview before creating access policies. You can follow these guides:
-
-[Register and scan Azure SQL Database](./register-scan-azure-sql-database.md)
-
-After you've registered your resources, you'll need to enable Data Use Management. Data Use Management can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**:
-[How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** toggle *Enabled*, it will look like this screenshot. This will enable the access policies to be used with the given Azure SQL server and all its contained databases.
-![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
--
-## Create and publish a data owner policy
-
-Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example shown.
-
-**Example: Read policy**. This policy assigns the Azure AD principal 'Robert Murphy' to the *SQL Data reader* action, in the scope of SQL server *relecloud-sql-srv2*. This policy has also been published to that server. Note that policies related to this action are supported below server level (for example, at database or table level).
-
-![Screenshot shows a sample data owner policy giving Data Reader access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-azure-sql-db-data-reader.png)
--
->[!Important]
-> - Publish is a background operation. It can take up to **5 minutes** for the changes to be reflected in this data source.
-> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
--
-## Unpublish a data owner policy
-Follow this link for the steps to [unpublish a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#unpublish-a-policy).
-
-## Update or delete a data owner policy
-Follow this link for the steps to [update or delete a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#update-or-delete-a-policy).
-
-## Test the policy
-After creating the policy, any of the Azure AD users listed in the Subject should be able to connect to the data sources in the scope of the policy. To test, use SSMS or any SQL client to query a SQL table you have granted read access to.
-
-If you require additional troubleshooting, see the [Next steps](#next-steps) section in this guide.
-
-## Role definition detail
-This section contains a reference of how relevant Microsoft Purview data policy roles map to specific actions in SQL data sources.
-
-| **Microsoft Purview policy role definition** | **Data source specific actions** |
-|-|--|
-|||
-| *Read* |Microsoft.Sql/sqlservers/Connect |
-||Microsoft.Sql/sqlservers/databases/Connect |
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows|
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows |
-|||
-
-## Next steps
-Check the blog, demo, and related how-to guides:
-* Doc: [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
-* Doc: [Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-* Doc: [Microsoft Purview data owner policies on an Azure Arc-enabled SQL Server](./how-to-policies-data-owner-arc-sql-server.md)
-* Doc: [Troubleshoot Microsoft Purview policies for SQL data sources](./troubleshoot-policy-sql.md)
-* Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
purview How To Policies Data Owner Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-resource-group.md
- Title: Resource group and subscription access provisioning by data owner (preview)
-description: Step-by-step guide showing how a data owner can create access policies to resource groups or subscriptions.
- Previously updated: 07/06/2023
-# Resource group and subscription access provisioning by data owner (Preview)
-
-[Data owner policies](concept-policies-data-owner.md) are a type of Microsoft Purview access policy. They let you manage access to user data in sources that have been registered for *Data Use Management* in Microsoft Purview. These policies are authored directly in the Microsoft Purview governance portal and, after publishing, are enforced by the data source.
-
-In this guide we cover how to register an entire resource group or subscription and then create a single policy that will manage access to **all** data sources in that resource group or subscription. That single policy will cover all existing data sources and any data sources that are created afterwards.
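The "covers all existing and future sources" behavior can be sketched as scope matching on the resource hierarchy. This is a minimal illustration, not Purview's implementation; the resource IDs below are made up:

```python
# Minimal sketch: a policy scoped at a resource group covers every data
# source whose resource ID falls under that group, including sources
# created after the policy was published. All IDs below are made up.
POLICY_SCOPE = "/subscriptions/s1/resourceGroups/finance-rg"

def is_covered(resource_id: str) -> bool:
    return resource_id.startswith(POLICY_SCOPE + "/")

existing = "/subscriptions/s1/resourceGroups/finance-rg/providers/Microsoft.Storage/storageAccounts/lake1"
added_later = "/subscriptions/s1/resourceGroups/finance-rg/providers/Microsoft.Sql/servers/srv9"
other_group = "/subscriptions/s1/resourceGroups/hr-rg/providers/Microsoft.Storage/storageAccounts/hrdata"

print(is_covered(existing), is_covered(added_later), is_covered(other_group))  # True True False
```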
-
-## Prerequisites
-
-**Only these data sources are enabled for access policies on resource group or subscription**. Follow the **Prerequisites** section that is specific to the data source(s) in these guides:
-* [Data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md#prerequisites)
-* [Data owner policies on an Azure SQL Database](./how-to-policies-data-owner-azure-sql-db.md#prerequisites)(*)
-* [Data owner policies on an Azure Arc-enabled SQL Server](./how-to-policies-data-owner-arc-sql-server.md#prerequisites)(*)
-
-(*) The *Modify* action is not currently supported for SQL-type data sources.
-
-## Microsoft Purview configuration
-
-### Register the subscription or resource group for Data Use Management
-The subscription or resource group needs to be registered with Microsoft Purview before you can create access policies. To register your subscription or resource group, follow the **Prerequisites** and **Register** sections of this guide:
-- [Register multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md#prerequisites)
-After you've registered your resources, you'll need to enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-In the end, your resource will have the **Data Use Management** toggle **Enabled**, as shown in the screenshot:
-
-![Screenshot shows how to register a resource group or subscription for policy by toggling the enable tab in the resource editor.](./media/how-to-policies-data-owner-resource-group/register-resource-group-for-policy.png)
-
->[!Important]
-> - If you create a policy on a resource group or subscription and want to have it enforced in Azure Arc-enabled SQL Servers, you will need to also register those servers independently and enable *Data use management* which captures their App ID: [See this document](./how-to-policies-devops-arc-sql-server.md#register-data-sources-in-microsoft-purview).
--
-## Create and publish a data owner policy
-Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example shown in the image: a policy that provides security group *sg-Finance* *modify* access to resource group *finance-rg*. Use the Data source box in the Policy user experience.
-
-![Screenshot shows a sample data owner policy giving access to a resource group.](./media/how-to-policies-data-owner-resource-group/data-owner-policy-example-resource-group.png)
-
->[!Important]
-> - Publish is a background operation. For example, Azure Storage accounts can take up to **2 hours** to reflect the changes.
-> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
-
-## Unpublish a data owner policy
-Follow this link for the steps to [unpublish a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#unpublish-a-policy).
-
-## Update or delete a data owner policy
-Follow this link for the steps to [update or delete a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#update-or-delete-a-policy).
-
-## Additional information
-- Creating a policy at subscription or resource group level will enable the Subjects to access Azure Storage system containers, for example, *$logs*. If this is undesired, first scan the data source and then create finer-grained policies for each (that is, at container or sub-container level).-
-### Limits
-The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
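The rough arithmetic behind the stated limit can be checked directly: 100 MB of policy data per subscription divided across roughly 5000 policies implies an average policy size of about 20 KB. This is only back-of-the-envelope math on the numbers quoted above:

```python
# Back-of-the-envelope check of the stated limit: 100 MB per subscription
# across ~5000 policies implies roughly 20 KB per policy on average.
LIMIT_BYTES = 100 * 1024 * 1024   # 100 MB
APPROX_POLICIES = 5000

avg_policy_kb = LIMIT_BYTES / APPROX_POLICIES / 1024
print(round(avg_policy_kb, 2))  # 20.48
```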
-
-## Next steps
-Check the blog, demo, and related tutorials:
-
-* Doc: [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
-* Blog: [Resource group-level governance can significantly reduce effort](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-resource-group-level-governance-can/ba-p/3096314)
-* Video: [Demo of data owner access policies for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
-* Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
purview How To Policies Data Owner Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-storage.md
- Title: Access provisioning by data owner to Azure Storage datasets (preview)
-description: Step-by-step guide showing how data owners can create access policies to datasets in Azure Storage
- Previously updated: 07/06/2023
-# Access provisioning by data owner to Azure Storage datasets (Preview)
--
-[Data owner policies](concept-policies-data-owner.md) are a type of Microsoft Purview access policy. They let you manage access to user data in sources that have been registered for *Data Use Management* in Microsoft Purview. These policies are authored directly in the Microsoft Purview governance portal and, after publishing, are enforced by the data source.
-
-This guide covers how a data owner can use Microsoft Purview to delegate management of access to Azure Storage datasets. Currently, these two Azure Storage sources are supported:
-- Blob storage
-- Azure Data Lake Storage (ADLS) Gen2
-## Prerequisites
-
-## Microsoft Purview configuration
-
-### Register the data sources in Microsoft Purview for Data Use Management
-The Azure Storage resources need to be registered first with Microsoft Purview to later define access policies.
-
-To register your resources, follow the **Prerequisites** and **Register** sections of these guides:
-- [Register and scan Azure Storage Blob - Microsoft Purview](register-scan-azure-blob-storage-source.md#prerequisites)
-- [Register and scan Azure Data Lake Storage (ADLS) Gen2 - Microsoft Purview](register-scan-adls-gen2.md#prerequisites)
-After you've registered your resources, you'll need to enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** toggle **Enabled**, it will look like this screenshot:
--
-## Create and publish a data owner policy
-Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example shown in the image: a policy that provides group *Contoso Team* *read* access to Storage account *marketinglake1*:
--
->[!Important]
-> - Publish is a background operation. Azure Storage accounts can take up to **2 hours** to reflect the changes.
--
-## Unpublish a data owner policy
-Follow this link for the steps to [unpublish a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#unpublish-a-policy).
-
-## Update or delete a data owner policy
-Follow this link for the steps to [update or delete a data owner policy in Microsoft Purview](how-to-policies-data-owner-authoring-generic.md#update-or-delete-a-policy).
-
-## Data Consumption
-- A data consumer can access the requested dataset using tools such as Power BI or an Azure Synapse Analytics workspace.
-- The Copy and Clone commands in Azure Storage Explorer require additional IAM permissions beyond the Allow Modify policy from Microsoft Purview. Grant the Azure AD principal the Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action permission in IAM.
-- Sub-container access: Policy statements set below container level on a Storage account are supported. However, users will not be able to browse to the data asset using the Azure portal's Storage Browser or the Microsoft Azure Storage Explorer tool if access is granted only at the file or folder level of the Azure Storage account. These apps attempt to crawl down the hierarchy starting at container level, and the request fails because no access has been granted at that level. Instead, the app that requests the data must perform direct access by providing a fully qualified name for the data object. The following documents show examples of how to perform direct access. See also the blogs in the *Next steps* section of this how-to guide.
- - [*abfs* for ADLS Gen2](../hdinsight/hdinsight-hadoop-use-data-lake-storage-gen2.md#access-files-from-the-cluster)
- - [*az storage blob download* for Blob Storage](../storage/blobs/storage-quickstart-blobs-cli.md#download-a-blob)
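The direct-access pattern above boils down to addressing the object by a fully qualified name instead of browsing to it. The following sketch shows how such names are assembled; the account, container, and path are made up for illustration:

```python
# Sketch of the "direct access" pattern: when access is granted only at
# folder or file level, the client must address the object by its fully
# qualified name. Account, container, and path below are made up.
account, container, path = "marketinglake1", "sales", "2023/q1.csv"

# abfs URI for ADLS Gen2 (e.g., from a Spark or HDInsight cluster)
abfs_uri = f"abfs://{container}@{account}.dfs.core.windows.net/{path}"

# HTTPS blob endpoint (e.g., the URL used by `az storage blob download`)
blob_url = f"https://{account}.blob.core.windows.net/{container}/{path}"

print(abfs_uri)
print(blob_url)
```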
-
-## Additional information
-- Creating a policy at Storage account level will enable the Subjects to access system containers, for example *$logs*. If this is undesired, first scan the data source(s) and then create finer-grained policies for each (that is, at container or subcontainer level).
-- The root blob in a container will be accessible to the Azure AD principals in a Microsoft Purview *allow*-type RBAC policy if the scope of that policy is a subscription, resource group, Storage account, or container in a Storage account.
-- The root container in a Storage account will be accessible to the Azure AD principals in a Microsoft Purview *allow*-type RBAC policy if the scope of that policy is a subscription, resource group, or Storage account.
-### Limits
-- The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.-
-### Known issues
-
-**Known issues** related to Policy creation
-- Do not create policy statements based on Microsoft Purview resource sets. Even if displayed in Microsoft Purview policy authoring UI, they are not yet enforced. Learn more about [resource sets](concept-resource-sets.md).-
-### Policy action mapping
-
-This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure Storage.
-
-| **Microsoft Purview policy action** | **Data source specific actions** |
-||--|
-|||
-| *Read* |Microsoft.Storage/storageAccounts/blobServices/containers/read |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read |
-|||
-| *Modify* |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/read |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/write |
-| |Microsoft.Storage/storageAccounts/blobServices/containers/delete |
-|||
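The Read/Modify mapping above can be expressed as a lookup, which also makes one property of the table easy to check: every action granted by *Read* is also granted by *Modify*. This is an illustrative model only, not any Purview SDK:

```python
# The policy-action mapping from the table above as a lookup. Illustrative
# only; the strings mirror the table, not any real SDK.
STORAGE_ACTIONS = {
    "Read": {
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
    },
    "Modify": {
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete",
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/delete",
    },
}

# Every Read action is also granted by Modify.
print(STORAGE_ACTIONS["Read"] <= STORAGE_ACTIONS["Modify"])  # True
```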
--
-## Next steps
-Check the blog, demo, and related tutorials:
-* [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
-* Doc: [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
-* Doc: [Provision access to all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
-* Blog: [What's New in Microsoft Purview at Microsoft Ignite 2021](https://techcommunity.microsoft.com/t5/azure-purview/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954)
-* Blog: [Accessing data when folder level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-folder-level-permission/ba-p/3109583)
-* Blog: [Accessing data when file level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-file-level-permission/ba-p/3102166)
-* Blog: [Grant users access to data assets in your enterprise via API](https://aka.ms/AAlg655)
purview How To Policies Devops Arc Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-arc-sql-server.md
- Title: Manage access to SQL Server 2022 system health and performance using Microsoft Purview DevOps policies, a type of RBAC policies.
-description: Use Microsoft Purview DevOps policies to provision access to SQL Server 2022 system metadata, so IT operations personnel can monitor performance, health and audit security, while limiting the insider threat.
- Previously updated: 03/10/2023
-# Provision access to system metadata in Azure Arc-enabled SQL Server 2022
-
-[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policy. They let you manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly from the Microsoft Purview governance portal and, after they are saved, they are automatically published and then enforced by the data source. Microsoft Purview policies only manage access for Azure AD principals.
-
-This how-to guide covers how to configure SQL Server 2022 to enforce policies created in Microsoft Purview. It covers onboarding with Azure Arc, enabling Azure AD on the SQL Server, and provisioning access to its system metadata (DMVs and DMFs) using the DevOps policies actions *SQL Performance Monitoring* or *SQL Security Auditing*.
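Conceptually, each DevOps policy action unlocks a family of system views rather than user data. The sketch below pairs the two actions with examples of well-known SQL Server DMVs of the kind each action is meant to expose; the exact set of objects each action grants is defined by the product, not by this illustration:

```python
# Illustrative only: the two DevOps policy actions paired with examples of
# the kind of system DMVs they are meant to expose. The exact set granted
# by each action is defined by the product, not this sketch.
DEVOPS_ACTION_EXAMPLES = {
    "SQL Performance Monitoring": [
        "sys.dm_exec_sessions",
        "sys.dm_exec_requests",
    ],
    "SQL Security Auditing": [
        "sys.dm_server_audit_status",
    ],
}

for action, views in DEVOPS_ACTION_EXAMPLES.items():
    print(action, "->", ", ".join(views))
```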
-
-## Prerequisites
-
-## Microsoft Purview configuration
-
-### Register data sources in Microsoft Purview
-The Azure Arc-enabled SQL Server data source must be registered with Microsoft Purview before policies can be created.
-
-1. Sign in to Microsoft Purview Studio.
-
-1. Navigate to the **Data map** feature on the left pane, select **Sources**, and then select **Register**. Enter "Azure Arc" in the search box, select **SQL Server on Azure Arc**, and then select **Continue**.
-![Screenshot shows how to select a source for registration.](./media/how-to-policies-data-owner-sql/select-arc-sql-server-for-registration.png)
-
-1. Enter a **Name** for this registration. It is best practice to make the name of the registration the same as the server name in the next step.
-
-1. Select an **Azure subscription**, **Server name**, and **Server endpoint**.
-
-1. **Select a collection** to put this registration in.
-
-1. Enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-1. When you enable Data Use Management, Microsoft Purview automatically captures the **Application ID** of the App Registration related to this Azure Arc-enabled SQL Server, if one has been configured. If the association between the Azure Arc-enabled SQL Server and the App Registration changes in the future, return to this screen and select the refresh button next to it to update the value.
-
-1. Select **Register** or **Apply** at the bottom.
-
-Once your data source has the **Data Use Management** toggle *Enabled*, it will look like the following screenshot.
-![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-arc-sql.png)
-
-## Enable policies in Azure Arc-enabled SQL Server
-
-## Create a new DevOps policy
-Follow this link for the steps to [create a new DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#create-a-new-devops-policy).
-
-## List DevOps policies
-Follow this link for the steps to [list DevOps policies in Microsoft Purview](how-to-policies-devops-authoring-generic.md#list-devops-policies).
-
-## Update a DevOps policy
-Follow this link for the steps to [update a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#update-a-devops-policy).
-
-## Delete a DevOps policy
-Follow this link for the steps to [delete a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy).
-
->[!Important]
-> DevOps policies are auto-published and changes can take up to **5 minutes** to be enforced by the data source.
-
-## Test the DevOps policy
-See how to [test the policy you created](./how-to-policies-devops-authoring-generic.md#test-the-devops-policy)
-
-## Role definition detail
-See the [mapping of DevOps role to data source actions](./how-to-policies-devops-authoring-generic.md#role-definition-detail)
-
-## Next steps
-See [related videos, blogs and documents](./how-to-policies-devops-authoring-generic.md#next-steps)
---
purview How To Policies Devops Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-authoring-generic.md
- Title: Create, list, update, and delete Microsoft Purview DevOps policies
-description: Use Microsoft Purview DevOps policies to provision access to database system metadata, so IT operations personnel can monitor performance, health, and audit security, while limiting insider threats.
----- Previously updated : 05/30/2023--
-# Create, list, update, and delete Microsoft Purview DevOps policies
-
-[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policies. You can use them to manage access to system metadata on data sources that have been registered for **Data use management** in Microsoft Purview.
-
-You can configure DevOps policies directly from the Microsoft Purview governance portal. After they're saved, they're automatically published and then enforced by the data source. Microsoft Purview policies manage only access for Azure Active Directory (Azure AD) principals.
-
-This guide covers the configuration steps in Microsoft Purview to provision access to database system metadata by using the DevOps policy actions for the SQL Performance Monitor and SQL Security Auditor roles. It shows how to create, list, update, and delete DevOps policies.
-
-## Prerequisites
--
-### Configuration
-
-Before you author policies in the Microsoft Purview policy portal, you need to configure the data sources so that they can enforce those policies:
-
-1. Follow any policy-specific prerequisites for your source. Check the [table of Microsoft Purview supported data sources](./microsoft-purview-connector-overview.md) and select the link in the **Access policy** column for sources where access policies are available. Follow any steps listed in the "Access policy" and "Prerequisites" sections.
-1. Register the data source in Microsoft Purview. Follow the "Prerequisites" and "Register" sections of the [source pages](./microsoft-purview-connector-overview.md) for your resources.
-1. Turn on the **Data use management** toggle in the data source registration. For more information, including additional permissions that you need for this step, see [Enable Data use management on your Microsoft Purview sources](how-to-enable-data-use-management.md).
-
-## Create a new DevOps policy
-
-To create a DevOps policy, first ensure that you have the Microsoft Purview Policy Author role at the root collection level. Check the section on managing Microsoft Purview role assignments in [this guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections). Then, follow these steps:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. On the left pane, select **Data policy**. Then select **DevOps policies**.
-
-1. Select the **New policy** button.
-
- ![Screenshot that shows the button for creating a new SQL DevOps policy.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-create.png)
-
- The policy detail panel opens.
-
-1. For **Data source type**, select a data source. Under **Data source name**, select one of the listed data sources. Then click **Select** to return to the **New Policy** pane.
-
- ![Screenshot that shows the panel for selecting a policy's data source.](./media/how-to-policies-devops-authoring-generic/select-a-data-source.png)
-
-1. Select one of two roles, **Performance monitoring** or **Security auditing**. Then select **Add/remove subjects** to open the **Subject** panel.
-
- In the **Select subjects** box, enter the name of an Azure AD principal (user, group, or service principal). Microsoft 365 groups are supported, but updates to group membership take up to one hour to be reflected in Azure AD. Keep adding or removing subjects until you're satisfied, and then select **Save**.
-
- ![Screenshot that shows the selection of roles and subjects for a policy.](./media/how-to-policies-devops-authoring-generic/select-role-and-subjects.png)
-
-1. Select **Save** to save the policy. The policy is published automatically. Enforcement starts at the data source within five minutes.
-
-## List DevOps policies
-
-To list DevOps policies, first ensure that you have one of the following Microsoft Purview roles at the root collection level: Policy Author, Data Source Admin, Data Curator, or Data Reader. Check the section on managing Microsoft Purview role assignments in [this guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections). Then, follow these steps:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. On the left pane, select **Data policy**. Then select **DevOps policies**.
-
- The **DevOps Policies** pane lists any policies that have been created.
-
- ![Screenshot that shows selections for opening a list of SQL DevOps policies.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-list.png)
-
-## Update a DevOps policy
-
-To update a DevOps policy, first ensure that you have the Microsoft Purview Policy Author role at the root collection level. Check the section on managing Microsoft Purview role assignments in [this guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections). Then, follow these steps:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. On the left pane, select **Data policy**. Then select **DevOps policies**.
-
-1. On the **DevOps Policies** pane, open the policy details for one of the policies by selecting it from its data resource path.
-
- ![Screenshot that shows selections to open SQL DevOps policies.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-update.png)
-
-1. On the pane for policy details, select **Edit**.
-
-1. Make your changes, and then select **Save**.
-
-## Delete a DevOps policy
-
-To delete a DevOps policy, first ensure that you have the Microsoft Purview Policy Author role at the root collection level. Check the section on managing Microsoft Purview role assignments in [this guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections). Then, follow these steps:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. On the left pane, select **Data policy**. Then select **DevOps policies**.
-
-1. Select the checkbox for one of the policies, and then select **Delete**.
-
- ![Screenshot that shows selections for deleting a SQL DevOps policy.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-delete.png)
-
-## Test the DevOps policy
-
-After you create a policy, any of the Azure AD users that you selected as subjects can now connect to the data sources in the scope of the policy. To test, use SQL Server Management Studio (SSMS) or any SQL client and try to query some dynamic management views (DMVs) and dynamic management functions (DMFs). The following sections list a few examples. For more examples, consult the mapping of popular DMVs and DMFs in [What can I accomplish with Microsoft Purview DevOps policies?](./concept-policies-devops.md#mapping-of-popular-dmvs-and-dmfs).
-
-If you require more troubleshooting, see the [Next steps](#next-steps) section in this guide.
-
-### Test SQL Performance Monitor access
-
-If you assigned the policy's subjects the *SQL Performance Monitor* role, those subjects can issue the following commands:
-
-```sql
--- Returns I/O statistics for data and log files
-SELECT * FROM sys.dm_io_virtual_file_stats(DB_ID(N'testdb'), 2)
--- Waits encountered by threads that executed. Used to diagnose performance issues
-SELECT wait_type, wait_time_ms FROM sys.dm_os_wait_stats
-```
-
-![Screenshot that shows a test for SQL Performance Monitor.](./media/how-to-policies-devops-authoring-generic/test-access-sql-performance-monitor.png)
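-
-Beyond the two queries above, the *SQL Performance Monitor* action covers many other server performance DMVs. As a sketch (not an exhaustive list drawn from this article), queries such as the following should also succeed for the policy's subjects:
-
-```sql
--- Currently executing requests; useful to spot long-running queries
-SELECT session_id, status, command, total_elapsed_time FROM sys.dm_exec_requests
--- SQL Server performance counters exposed as a DMV
-SELECT * FROM sys.dm_os_performance_counters
-```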
-
-### Test SQL Security Auditor access
-
-If you assigned the policy's subjects the *SQL Security Auditor* role, those subjects can issue the following commands from SSMS or any SQL client:
-
-```sql
--- Returns the current state of the audit
-SELECT * FROM sys.dm_server_audit_status
--- Returns information about the encryption state of a database and its associated database encryption keys
-SELECT * FROM sys.dm_database_encryption_keys
-```
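-
-The *SQL Security Auditor* action should similarly cover other security-state views. For instance (a sketch, not from this article), the audit metadata view can be queried:
-
-```sql
--- Returns a row for each auditable action and audit action group
-SELECT * FROM sys.dm_audit_actions
-```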
-
-### Ensure no access to user data
-
-Try to access a table in one of the databases by using the following command:
-
-```sql
--- Test access to user data
-SELECT * FROM [databaseName].schemaName.tableName
-```
-
-The Azure AD principal that you're testing with should be denied, which means the data is protected from insider threats.
-
-![Screenshot that shows a test to access user data.](./media/how-to-policies-devops-authoring-generic/test-access-user-data.png)
-
-## Role definition detail
-
-The following table maps Microsoft Purview data policy roles to specific actions in SQL data sources.
-
-| Microsoft Purview policy role | Actions in data sources |
-|-|--|
-| SQL Performance Monitor |Microsoft.Sql/Sqlservers/Connect |
-||Microsoft.Sql/Sqlservers/Databases/Connect |
-||Microsoft.Sql/Sqlservers/Databases/SystemViewsAndFunctions/DatabasePerformanceState/Rows/Select |
-||Microsoft.Sql/Sqlservers/SystemViewsAndFunctions/ServerPerformanceState/Rows/Select |
-||Microsoft.Sql/Sqlservers/Databases/SystemViewsAndFunctions/DatabaseGeneralMetadata/Rows/Select |
-||Microsoft.Sql/Sqlservers/SystemViewsAndFunctions/ServerGeneralMetadata/Rows/Select |
-||Microsoft.Sql/Sqlservers/Databases/DBCCs/ViewDatabasePerformanceState/Execute |
-||Microsoft.Sql/Sqlservers/DBCCs/ViewServerPerformanceState/Execute |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Create |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Options/Alter |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Events/Add |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Events/Drop |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/State/Enable |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/State/Disable |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Drop |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Target/Add |
-||Microsoft.Sql/Sqlservers/Databases/ExtendedEventSessions/Target/Drop |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Create |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Options/Alter |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Events/Add |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Events/Drop |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/State/Enable |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/State/Disable |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Drop |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Target/Add |
-||Microsoft.Sql/Sqlservers/ExtendedEventSessions/Target/Drop |
-|||
-| SQL Security Auditor |Microsoft.Sql/Sqlservers/Connect |
-||Microsoft.Sql/Sqlservers/Databases/Connect |
-||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityState/rows/select |
-||Microsoft.Sql/Sqlservers/Databases/SystemViewsAndFunctions/DatabaseSecurityState/rows/select |
-||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityMetadata/rows/select |
-||Microsoft.Sql/Sqlservers/Databases/SystemViewsAndFunctions/DatabaseSecurityMetadata/rows/select |
-|||
-
-## Next steps
-
-Check the following blogs, videos, and related articles:
-
-* Blog: [Microsoft Purview DevOps policies for Azure SQL Database is now generally available](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/microsoft-purview-devops-policies-for-azure-sql-database-is-now/ba-p/3775885)
-* Blog: [Inexpensive solution for managing access to SQL health, performance, and security information](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/inexpensive-solution-for-managing-access-to-sql-health/ba-p/3750512)
-* Blog: [Microsoft Purview DevOps policies enable at-scale access provisioning for IT operations](https://techcommunity.microsoft.com/t5/microsoft-purview-blog/microsoft-purview-devops-policies-enable-at-scale-access/ba-p/3604725)
-* Blog: [Microsoft Purview DevOps policies API is now public](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/microsoft-purview-devops-policies-api-is-now-public/ba-p/3818931)
-* Video: [Prerequisite for policies: The "Data use management" option](https://youtu.be/v_lOzevLW-Q)
-* Video: [Quick overview of DevOps policies](https://aka.ms/Microsoft-Purview-DevOps-Policies-Video)
-* Video: [Deep dive for DevOps policies](https://youtu.be/UvClpdIb-6g)
-* Article: [Microsoft Purview DevOps policies concept guide](./concept-policies-devops.md)
-* Article: [Microsoft Purview DevOps policies on Azure Arc-enabled SQL Server](./how-to-policies-devops-arc-sql-server.md)
-* Article: [Microsoft Purview DevOps policies on Azure SQL Database](./how-to-policies-devops-azure-sql-db.md)
-* Article: [Microsoft Purview DevOps policies on entire resource groups or subscriptions](./how-to-policies-devops-resource-group.md)
-* Article: [Troubleshoot Microsoft Purview policies for SQL data sources](./troubleshoot-policy-sql.md)
purview How To Policies Devops Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-azure-sql-db.md
- Title: Manage access to Azure SQL Database system health and performance using Microsoft Purview DevOps policies, a type of RBAC policy.
-description: Use Microsoft Purview DevOps policies to provision access to Azure SQL Database system metadata, so IT operations personnel can monitor performance, health and audit security, while limiting the insider threat.
-Previously updated: 04/04/2023
-# Provision access to system metadata in Azure SQL Database
-
-[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policy. They allow you to manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly from the Microsoft Purview governance portal; after they're saved, they're automatically published and then enforced by the data source. Microsoft Purview policies manage access only for Azure AD principals.
-
-This how-to guide covers how to configure Azure SQL Database to enforce policies created in Microsoft Purview. It covers the configuration steps for Azure SQL Database and the ones in Microsoft Purview to provision access to Azure SQL Database system metadata (DMVs and DMFs) using the DevOps policies actions *SQL Performance Monitoring* or *SQL Security Auditing*.
-
-## Prerequisites
-
-## Microsoft Purview Configuration
-
-### Register the data sources in Microsoft Purview
-The Azure SQL Database data source needs to be registered first with Microsoft Purview, before access policies can be created. You can follow these guides:
-
-[Register and scan Azure SQL Database](./register-scan-azure-sql-database.md)
-
-After you've registered your resources, you'll need to enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** toggle *Enabled*, it will look like the following screenshot. This enables access policies to be used with the given data source.
-![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
--
-## Create a new DevOps policy
-Follow this link for the steps to [create a new DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#create-a-new-devops-policy).
-
-## List DevOps policies
-Follow this link for the steps to [list DevOps policies in Microsoft Purview](how-to-policies-devops-authoring-generic.md#list-devops-policies).
-
-## Update a DevOps policy
-Follow this link for the steps to [update a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#update-a-devops-policy).
-
-## Delete a DevOps policy
-Follow this link for the steps to [delete a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy).
-
->[!Important]
-> DevOps policies are auto-published and changes can take up to **5 minutes** to be enforced by the data source.
-
-## Test the DevOps policy
-See how to [test the policy you created](./how-to-policies-devops-authoring-generic.md#test-the-devops-policy)
-
-## Role definition detail
-See the [mapping of DevOps role to data source actions](./how-to-policies-devops-authoring-generic.md#role-definition-detail)
-
-## Next steps
-See [related videos, blogs and documents](./how-to-policies-devops-authoring-generic.md#next-steps)
--
purview How To Policies Devops Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-resource-group.md
- Title: Manage access to entire resource groups or subscriptions for monitoring system health and performance using Microsoft Purview DevOps policies, a type of RBAC policy.
-description: Use Microsoft Purview DevOps policies to provision access to all data sources inside a resource group or subscription, so IT operations personnel can monitor performance, health and audit security, while limiting the insider threat.
-Previously updated: 03/10/2023
-# Provision access to system metadata in resource groups or subscriptions
-
-[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policy. They allow you to manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly in the Microsoft Purview governance portal; after they're saved, they're automatically published and then enforced by the data source. Microsoft Purview policies manage access only for Azure AD principals.
-
-This how-to guide covers how to register an entire resource group or subscription and then create a single policy that provisions access to **all** data sources in that resource group or subscription. That single policy covers all existing data sources and any data sources created afterward, provisioning access to their system metadata (DMVs and DMFs) through the DevOps policy actions *SQL Performance Monitoring* or *SQL Security Auditing*.
---
-## Prerequisites
-
-**Only the following data sources are enabled for access policies at the resource group or subscription level**. Follow the **Prerequisites** section that is specific to the data source(s) in these guides:
-* [DevOps policies on an Azure SQL Database](./how-to-policies-devops-azure-sql-db.md#prerequisites)
-* [DevOps policies on an Azure Arc-enabled SQL Server](./how-to-policies-devops-arc-sql-server.md#prerequisites)
-
-## Microsoft Purview Configuration
-
-### Register the subscription or resource group for Data Use Management
-The subscription or resource group needs to be registered with Microsoft Purview before you can create access policies. To register your subscription or resource group, follow the **Prerequisites** and **Register** sections of this guide:
-- [Register multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md#prerequisites)
-After you've registered your resources, you'll need to enable the Data Use Management option. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-In the end, your resource will have the **Data Use Management** toggle **Enabled**, as shown in the screenshot:
-
-![Screenshot shows how to register a resource group or subscription for policy by toggling the enable tab in the resource editor.](./media/how-to-policies-data-owner-resource-group/register-resource-group-for-policy.png)
-
->[!Important]
-> - If you create a policy on a resource group or subscription and want to have it enforced in Azure Arc-enabled SQL Servers, you will need to also register those servers independently and enable *Data use management* which captures their App ID: [See this document](./how-to-policies-devops-arc-sql-server.md#register-data-sources-in-microsoft-purview).
--
-## Create a new DevOps policy
-Follow this link for the steps to [create a new DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#create-a-new-devops-policy).
-
-## List DevOps policies
-Follow this link for the steps to [list DevOps policies in Microsoft Purview](how-to-policies-devops-authoring-generic.md#list-devops-policies).
-
-## Update a DevOps policy
-Follow this link for the steps to [update a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#update-a-devops-policy).
-
-## Delete a DevOps policy
-Follow this link for the steps to [delete a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy).
-
-## Test the DevOps policy
-See how to [test the policy you created](./how-to-policies-devops-authoring-generic.md#test-the-devops-policy)
-
-## Role definition detail
-See the [mapping of DevOps role to data source actions](./how-to-policies-devops-authoring-generic.md#role-definition-detail)
-
-## Next steps
-See [related videos, blogs and documents](./how-to-policies-devops-authoring-generic.md#next-steps)
purview How To Policies Self Service Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-self-service-azure-sql-db.md
- Title: Self-service policies for Azure SQL Database (preview)
-description: Step-by-step guide on how self-service policy is created for Azure SQL Database through Microsoft Purview access policies.
-Previously updated: 11/11/2022
-# Self-service policies for Azure SQL Database (preview)
--
-[Self-service policies](concept-self-service-data-access-policy.md) allow you to manage access from Microsoft Purview to data sources that have been registered for **Data Use Management**.
-
-This how-to guide describes how self-service policies get created in Microsoft Purview to enable access to Azure SQL Database. The following actions are currently enabled: *Read Tables* and *Read Views*.
-
-> [!CAUTION]
-> *Ownership chaining* must exist for *select* to work on Azure SQL Database *views*.
-
-## Prerequisites
-
-## Microsoft Purview Configuration
-
-### Register the data sources in Microsoft Purview
-The Azure SQL Database resources need to be registered first with Microsoft Purview to later define access policies. You can follow these guides:
-
-[Register and scan Azure SQL DB](./register-scan-azure-sql-database.md)
-
-After you've registered your resources, you'll need to enable data use management. Data use management can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to data use management in this guide**:
-
-[How to enable data use management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data use management** toggle *Enabled*, it will look like the following screenshot. This enables access policies to be used with the given SQL server and all its contained databases.
-![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
--
-## Create a self-service data access request
---
->[!Important]
-> - Publish is a background operation. It can take up to **5 minutes** for the changes to be reflected in this data source.
-> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
--
-## View a self-service Policy
-
-To view the policies you've created, follow the article to [view the self-service policies](how-to-view-self-service-data-access-policy.md).
--
-### Test the policy
-
-The Azure Active Directory account, group, managed identity (MSI), or service principal (SPN) for which the self-service policies were created should now be able to connect to the database on the server and execute a select query against the requested table or view.
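-
-For example, if access was requested and approved for a table, a simple read such as the following should succeed (the schema and table names below are placeholders, not from this article):
-
-```sql
--- Sample read against the requested table; replace the placeholder names
-SELECT TOP (10) * FROM [schemaName].[tableName]
-```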
-
-#### Force policy download
-It's possible to force an immediate download of the latest published policies to the current SQL database by running the following command. The minimum permission required to run it is membership in the ##MS_ServerStateManager## server role.
-
-```sql
--- Force immediate download of latest published policies
-exec sp_external_policy_refresh reload
-```
-
-#### Analyze downloaded policy state from SQL
-The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD accounts. The minimum permission required to run them is VIEW DATABASE SECURITY STATE, or being assigned the *SQL Security Auditor* action group.
-
-```sql
--- Lists generally supported actions
-SELECT * FROM sys.dm_server_external_policy_actions
--- Lists the roles that are part of a policy published to this server
-SELECT * FROM sys.dm_server_external_policy_roles
--- Lists the links between the roles and actions; can be used to join the two
-SELECT * FROM sys.dm_server_external_policy_role_actions
--- Lists all Azure AD principals that were given connect permissions
-SELECT * FROM sys.dm_server_external_policy_principals
--- Lists Azure AD principals assigned to a given role on a given resource scope
-SELECT * FROM sys.dm_server_external_policy_role_members
--- Lists Azure AD principals, joined with roles, joined with their data actions
-SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
-```
-
-## Additional information
-
-### Policy action mapping
-
-This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure SQL Database.
-
-| **Microsoft Purview policy action** | **Data source specific actions** |
-|-|--|
-| *Read* |Microsoft.Sql/sqlservers/Connect |
-||Microsoft.Sql/sqlservers/databases/Connect |
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows|
-||Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows |
-|||
-
-## Next steps
-Check the blog, demo, and related how-to guides:
-- [Self-service policies](concept-self-service-data-access-policy.md)
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
purview How To Policies Self Service Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-self-service-storage.md
- Title: Self-service policies for Azure Storage (preview)
-description: Step-by-step guide on how self-service policy is created for storage through Microsoft Purview access policies.
-Previously updated: 10/24/2022
-# Self-service access provisioning for Azure Storage datasets (Preview)
--
-[Access policies](concept-policies-data-owner.md) allow you to manage access from Microsoft Purview to data sources that have been registered for *Data Use Management*.
-
-This how-to guide describes how self-service policies get created in Microsoft Purview to enable access to Azure storage datasets. Currently, these two Azure Storage sources are supported:
-- Blob storage
-- Azure Data Lake Storage (ADLS) Gen2
-## Prerequisites
--
-## Configuration
-
-### Register the data sources in Microsoft Purview for Data Use Management
-The Azure Storage resources need to be registered first with Microsoft Purview to later define access policies.
-
-To register your resources, follow the **Prerequisites** and **Register** sections of these guides:
-- [Register and scan Azure Storage Blob - Microsoft Purview](register-scan-azure-blob-storage-source.md#prerequisites)
-- [Register and scan Azure Data Lake Storage (ADLS) Gen2 - Microsoft Purview](register-scan-adls-gen2.md#prerequisites)
-After you've registered your resources, you'll need to enable data use management. Data use management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable data use management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** toggle **Enabled**, it will look like this picture:
--
-## Create a self-service data access request
--
->[!Important]
-> - Publish is a background operation. Azure Storage accounts can take up to **2 hours** to reflect the changes.
-
-## View a self-service policy
-
-To view the policies you've created, follow the article to [view the self-service policies](how-to-view-self-service-data-access-policy.md).
-
-## Data consumption
-- The data consumer can access the requested dataset using tools such as Power BI or an Azure Synapse Analytics workspace.
-
->[!NOTE]
-> Users will not be able to browse to the asset using the Azure portal or Azure Storage Explorer if the only permission granted is read/modify access at the file or folder level of the storage account.
-
-> [!CAUTION]
-> Folder-level permission is required to access data in ADLS Gen2 using Power BI.
-> Additionally, resource sets are not supported by self-service policies. Hence, folder-level permission needs to be granted to access resource set files such as CSV or Parquet files.
--
-### Known issues
-
-**Known issues** related to policy creation
-
-- Self-service policies aren't supported for Microsoft Purview resource sets. Even if a policy is displayed in Microsoft Purview, it isn't yet enforced. Learn more about [resource sets](concept-resource-sets.md).
-
-## Next steps
-Check out the blog posts, demo, and related tutorials:
-
-* [Self-service policies concept](./concept-self-service-data-access-policy.md)
-* [Demo of self-service policies for storage](https://www.youtube.com/watch?v=AYKZ6_imorE)
-* [Blog: Accessing data when folder level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-folder-level-permission/ba-p/3109583)
-* [Blog: Accessing data when file level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-file-level-permission/ba-p/3102166)
purview How To Receive Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-receive-share.md
- Title: How to receive shared data
-description: Learn how to receive shared data from Azure Blob Storage and Azure Data Lake Storage using Microsoft Purview Data Sharing.
- Previously updated: 02/16/2023
-# Receive Azure Storage in-place share with Microsoft Purview Data Sharing (preview)
--
-Microsoft Purview Data Sharing supports in-place data sharing from Azure Data Lake Storage (ADLS Gen2) to ADLS Gen2, and Blob storage account to Blob storage account. This article explains how to receive a share and access shared data.
-
->[!NOTE]
->This feature has been updated in February 2023, and permissions needed to view and manage Data Sharing in Microsoft Purview have changed.
->Now only Reader permissions are needed on the collection where the shared data is housed. Refer to [Microsoft Purview permissions](catalog-permissions.md) to learn more about the Microsoft Purview collection and roles.
-
-## Prerequisites to receive shared data
-
-### Microsoft Purview prerequisites
-
-* [A Microsoft Purview account](create-catalog-portal.md).
-* A minimum of Data Reader role on a Microsoft Purview collection to use data sharing user experience in the Microsoft Purview governance portal. Refer to [Microsoft Purview permissions](catalog-permissions.md) to learn more about the Microsoft Purview collection and roles.
-* To use the SDK, no Microsoft Purview permission is needed.
-
-### Azure Storage account prerequisites
-
-* Your Azure subscription must be registered for the **AllowDataSharing** preview feature. Follow the steps below using the Azure portal or PowerShell.
-
- # [Portal](#tab/azure-portal)
-
- 1. In the [Azure portal](https://portal.azure.com), select the Azure subscription that you use to create the source and target storage account.
- 1. From the left menu, select **Preview features** under *Settings*.
 1. Select **AllowDataSharing**, and then select **Register**.
- 1. Refresh the *Preview features* screen to verify the *State* is **Registered**. It could take 15 minutes to 1 hour for registration to complete.
 1. In addition, to use data share for storage accounts in East US, East US 2, North Europe, South Central US, West Central US, West Europe, West US, or West US 2, select **AllowDataSharingInHeroRegion**, and then select **Register**.
-
- For more information, see [the register preview feature article](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
-
- # [PowerShell](#tab/powershell)
- ```azurepowershell
- Set-AzContext -SubscriptionId [Your Azure subscription ID]
- ```
- ```azurepowershell
- Register-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
- ```azurepowershell
- Get-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
 In addition, to use data share for storage accounts in East US, East US 2, North Europe, South Central US, West Central US, West Europe, West US, or West US 2:
-
- ```azurepowershell
- Register-AzProviderFeature -FeatureName "AllowDataSharingInHeroRegion" -ProviderNamespace "Microsoft.Storage"
- ```
- ```azurepowershell
- Get-AzProviderFeature -FeatureName "AllowDataSharingInHeroRegion" -ProviderNamespace "Microsoft.Storage"
- ```
-The *RegistrationState* should be **Registered**. It could take 15 minutes to 1 hour for registration to complete. For more information, see the [register preview feature article](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
--
-* A target storage account **created after** the registration step is completed. **The target storage account must be in the same Azure region as the source storage account.** If you don't know the Azure region of the source storage account, you can find out during the share attaching step later in the process. The target storage account can be in a different Azure region from your Microsoft Purview account.
-
- > [!IMPORTANT]
- > The target storage account must be in the same Azure region as the source storage account.
-
-* You need the **Contributor** or **Owner** or **Storage Blob Data Owner** or **Storage Blob Data Contributor** role on the target storage account. You can find more details on the [ADLS Gen2](register-scan-adls-gen2.md#data-sharing) or [Blob storage](register-scan-azure-blob-storage-source.md#data-sharing) data source pages.
-* If the target storage account is in a different Azure subscription than the Microsoft Purview account, the Microsoft.Purview resource provider needs to be registered in the Azure subscription where the storage account is located. It's registered automatically when the share consumer attaches the share, provided the user has permission to perform the `/register/action` operation, that is, the Contributor or Owner role on the subscription where the storage account is located.
-This registration is only needed the first time you share or receive data into a storage account in the Azure subscription.
-* A storage account needs to be registered in the collection where you'll receive the share. For instructions to register, see the [ADLS Gen2](register-scan-adls-gen2.md) or [Blob storage](register-scan-azure-blob-storage-source.md) data source pages. This step isn't required to use the SDK.
-* The latest version of the storage SDK, PowerShell, CLI, and Azure Storage Explorer. The storage REST API version must be February 2020 or later.
-
-## Receive share
-
-1. You can view your share invitations in any Microsoft Purview account. Open the Microsoft Purview governance portal by:
-
- * Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
 * Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Then select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select the **Data Map** icon from the left navigation. Then select **Share invites**. If you received an email invitation, you can also select the **View share invite** link in the email to select a Microsoft Purview account.
-
- If you're a guest user of a tenant, you'll be asked to verify your email address for the tenant before viewing a share invitation for the first time. [You can see our guide below for steps.](#guest-user-verification) Once verified, it's valid for 12 months.
-
- :::image type="content" source="./media/how-to-receive-share/view-invites.png" alt-text="Screenshot showing the Share invites page in the Microsoft Purview governance portal." border="true":::
-
-1. Alternatively, within the [Microsoft Purview governance portal](https://web.purview.azure.com/), find the Azure Storage or Azure Data Lake Storage (ADLS) Gen2 data asset that you would like to receive the share into by using either the [data catalog search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md). Select the **Data Share** button. You can see all the invitations in the **Share invites** tab.
-
-1. Select the name of the share invite you want to view or configure.
-
-1. If you don't want to accept the invitation, select **Delete**.
-
- :::image type="content" source="./media/how-to-receive-share/select-delete-invitation-inline.png" alt-text="Screenshot showing the share attachment page with the delete button highlighted." border="true" lightbox="./media/how-to-receive-share/select-delete-invitation-large.png":::
-
- >[!NOTE]
 > If you delete an invitation, it will need to be resent if you want to accept the share in the future. To deselect the share without deleting it, select the **Cancel** button instead.
-
-1. You can edit the **Received share name** if you like. Then select a **Storage account name** for a target storage account in the same region as the source. You can choose to **Register a new storage account to attach the share** in the drop-down as well.
-
- >[!IMPORTANT]
- >The target storage account needs to be in the same Azure region as the source storage account.
-
-1. Configure the **Path** (either a new container name, or the name of an existing share container) and the **New folder** (a new folder name for the share within your container).
-
-1. Select **Attach to target**.
-
- :::image type="content" source="./media/how-to-receive-share/attach-shared-data-inline.png" alt-text="Screenshot showing share invitation configuration page, with a share name added, a collection selected, and the accept and configure button highlighted." border="true" lightbox="./media/how-to-receive-share/attach-shared-data-large.png":::
-
-1. On the Manage data shares page, you'll see the new share with the status of **Attaching** until it has completed and is attached.
-
- :::image type="content" source="./media/how-to-receive-share/manage-data-shares-window-creating.png" alt-text="Screenshot showing the attach share window, with the attach button highlighted after you specify a target data store to receive or access shared data." border="true":::
-
-1. You can access shared data from the target storage account through Azure portal, Azure Storage Explorer, Azure Storage SDK, PowerShell or CLI. You can also analyze the shared data by connecting your storage account to Azure Synapse Analytics Spark or Databricks.
-
-When a share is attached, a new asset of type received share is ingested into the Microsoft Purview catalog, in the same collection where the storage account to which you attached the share is registered. You can search or browse for it like any other asset.
-
-Refer to [Microsoft Purview Data Sharing lineage](how-to-lineage-purview-data-sharing.md) to learn more about share assets and data sharing lineage.
-
-> [!NOTE]
 > Shares attached using the SDK without registering the storage account with Microsoft Purview will not be ingested into the catalog. Users can register their storage accounts if desired. If a storage account is unregistered or re-registered to a different collection, share assets of that storage account continue to be in the initial collection.
-
-## Update received share
-
-Once you attach a share, you can edit the received share by [reattaching the share to a new storage account](#reattach-share), or stop the sharing relationship by [deleting the received share](#delete-received-share).
-
-You can find and edit a received share asset in one of two ways:
-
-* Access the blob storage or ADLS Gen2 asset where the data was received in the data catalog and open it, then select **Data Share** and **Manage data shares**. There you're able to see all the shares for that asset. Select the **Received shares** tab, and then select your share.
-
- :::image type="content" source="./media/how-to-receive-share/manage-data-shares-inline.png" alt-text="Screenshot of the blob storage account where the share was received, with Data Share select and Manage data shares highlighted." border="true" lightbox="./media/how-to-receive-share/manage-received-share.png":::
-
- :::image type="content" source="./media/how-to-receive-share/manage-received-share.png" alt-text="Screenshot of the list of received data shares, showing the name of the share highlighted.":::
-
-* [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) the data catalog for data share assets and select your received share. Then select the **Edit** button.
-
- :::image type="content" source="./media/how-to-receive-share/search-for-share.png" alt-text="Screenshot of the data catalog search, showing the data share filter selected and a share highlighted." border="true":::
-
-## Reattach share
-
-After you've selected your data share to edit, you can reattach the share to a new storage account or path in your current storage account by selecting a storage account, providing a path, and providing the folder.
--
-If you're updating the target, select **Attach to target** to save your changes. Attaching can take a couple minutes to complete after the process has been started.
-
-> [!NOTE]
-> While re-attaching, if you selected a storage account that is registered to a collection that you don't have permissions to, or a storage account that is not registered in Microsoft Purview, you will be shown the appropriate message. You will see the shared data in your target data store.
-
-## Delete received share
-
-To delete a received share, [select the share](#update-received-share) and then select **Delete**.
---
-Deleting a received share stops the sharing relationship, and you won't be able to access shared data. Deleting a received share can take a few minutes.
-
-## Guest user verification
-
-If you're a guest user in Azure, to be able to receive shares your account must first be associated with Azure Active Directory. You can start this process from within Microsoft Purview.
-
-1. In the [Microsoft Purview Governance Portal](https://web.purview.azure.com/), select the Data Map and then **Share invites**.
-
-1. You'll see text indicating that you need to be associated with Azure Active Directory. Select the **Get verification code** button.
-
- :::image type="content" source="./media/how-to-receive-share/get-verification-code.png" alt-text="Screenshot of the Share invites page with the Get verification code button highlighted." border="true":::
-
-1. You'll receive an email from Microsoft Azure with the verification code. Copy it.
-
-1. Return to the **Share invites** page in the Microsoft Purview Governance Portal, select **Verify**, and enter the code you received.
-
-## Troubleshoot
-
-Here are some common issues when receiving a share, and how to troubleshoot them.
-
-### Can't create Microsoft Purview account
-
-If you're getting an error related to *quota* when creating a Microsoft Purview account, it means your organization has exceeded the [Microsoft Purview service limit](how-to-manage-quotas.md). If you require an increase in limit, contact support.
-
-### Can't find my Storage account asset in the Catalog
-
-* Data source isn't registered in Microsoft Purview. Refer to the registration steps for [Blob Storage](register-scan-azure-blob-storage-source.md) and [ADLSGen2](register-scan-adls-gen2.md) respectively. Performing a scan isn't necessary.
-* Data source is registered to a Microsoft Purview collection that you don't have a minimum of Data Reader permission to. Refer to [Microsoft Purview catalog permissions](catalog-permissions.md) and reach out to your collection admin for access.
-
-### Can't view list of shares in the storage account asset
-
- * Permission issue to the data store that you want to see shares for. You need a minimum of **Reader** role on the source storage account to see a read-only view of sent shares and received shares. You can find more details on the [ADLS Gen2](register-scan-adls-gen2.md#data-sharing) or [Blob storage](register-scan-azure-blob-storage-source.md#data-sharing) data source page.
-
-### Can't view received share in the storage account asset
-
-* You may have attached the share to a storage account that isn't registered in Microsoft Purview, or that is registered to a collection you don't have permissions to. Refer to the registration steps for [Blob Storage](register-scan-azure-blob-storage-source.md) and [ADLS Gen2](register-scan-adls-gen2.md) respectively. Refer to [Microsoft Purview catalog permissions](catalog-permissions.md) and reach out to your collection admin for access to collections.
-
-### Can't view share invite
-
-If you've been notified that you've received a share, but can't view the share invite in your Microsoft Purview account, it could be due to the following reasons:
-
-* You don't have a minimum of **Data Reader** role to any collections in this Microsoft Purview account. Contact your *Microsoft Purview Collection Admin* to grant you access to **Data Reader** role to view, accept and configure the received share.
-* Share invitation is sent to your email alias or an email distribution group instead of your Azure sign-in email. Contact your data provider and ensure that they've sent the invitation to your Azure sign-in e-mail address.
-* Share has already been accepted. If you've already accepted the share, it will no longer show up in the *Share invites* tab. Select *Received shares* in any storage account that you have permissions to in order to see your active shares.
-* You're a guest user of the tenant. If you're a guest user of a tenant, [you need to verify your email address for the tenant in order to view a share invitation for the first time](#guest-user-verification). Once verified, it's valid for 12 months.
-
-### Can't see target storage account in the list when attaching a share
-
-When you attach a share to a target, if your storage account isn't listed for you to select, it's likely due to the following reasons:
-
-* You don't have required permissions to the storage account. Check [prerequisites](#prerequisites-to-receive-shared-data) for details on required storage permissions.
-
-### Failed to attach share
-
-If you failed to attach a share, it's likely due to the following reasons:
-* Permission issue to the target data store. Check [prerequisites](#prerequisites-to-receive-shared-data) for required data store permissions.
-* The storage account isn't supported. Microsoft Purview Data share only [supports storage accounts with specific configurations](#azure-storage-account-prerequisites).
-* The *Path* you specified includes a container created outside of Microsoft Purview Data Sharing. You can only receive data into containers created while attaching a share.
-* The *New Folder* you specified to receive storage data isn't empty.
-* The source and target storage accounts are the same. Sharing from a source storage account to the same storage account as the target isn't supported.
-* You're exceeding a limit. A source storage account can support up to 20 targets, and a target storage account can support up to 100 sources. If you require an increase in limit, contact support.
-
-### Can't access shared data in the target data store
-
-If you can't access shared data, it's likely due to the following reasons:
-
-* After share attaching is successful, it may take some time for the data to appear in the target data store. Try again in a few minutes. Likewise, after you delete a share, it may take a few minutes for the data to disappear in the target data store.
-* You're accessing shared data using a storage API version prior to February 2020. Only storage API version February 2020 and later are supported for accessing shared data. Ensure you're using the latest version of the storage SDK, PowerShell, CLI and Azure Storage Explorer.
-* You're accessing shared data using an analytics tool that uses a storage API version prior to February 2020. You can access shared data from Azure Synapse Analytics Spark and Databricks. You won't be able to access shared data using Azure Data Factory, Power BI or AzCopy.
-
-## Next steps
-
-* [How to share data](how-to-share-data.md)
-* [FAQ for data sharing](how-to-data-share-faq.md)
-* [REST API reference](/rest/api/purview/)
-* Discover share assets in Catalog
-* See data sharing lineage
purview How To Request Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-request-access.md
- Title: How to request access to a data source in Microsoft Purview.
-description: This article describes how a user can request access to a data source from within Microsoft Purview.
- Previously updated: 03/23/2023
-# How to request access for a data asset
--
-If you discover a data asset in the catalog that you would like to access, you can request access directly through Microsoft Purview. The request triggers a workflow that asks the owners of the data resource to grant you access to that data source.
-
-This article outlines how to make an access request.
-
-> [!NOTE]
-> For this option to be available for a resource, a [self-service access workflow](how-to-workflow-self-service-data-access-hybrid.md) needs to be created and assigned to the collection where the resource is registered. Contact the collection administrator, data source administrator, or workflow administrator of your collection for more information.
-> Or, for information on how to create a self-service access workflow, see our [self-service access workflow documentation](how-to-workflow-self-service-data-access-hybrid.md).
-
-## Request access
--
-## Next steps
-
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Self-service policies](concept-self-service-data-access-policy.md)
purview How To Resource Set Pattern Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-resource-set-pattern-rules.md
- Title: How to create resource set pattern rules
-description: Learn how to create a resource set pattern rule to overwrite how assets get grouped into resource sets
- Previously updated: 12/06/2022
-# Create resource set pattern rules
-
-At-scale data processing systems typically store a single table in storage as multiple files. This concept is represented in Microsoft Purview by using resource sets. A resource set is a single object in the data catalog that represents a large number of assets in storage. To learn more, see [Understanding resource sets](concept-resource-sets.md).
-
-When scanning a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Microsoft Purview's resource set grouping may not accurately reflect your data estate. Resource set pattern rules allow you to customize or override how Microsoft Purview detects which assets are grouped as resource sets and how they're displayed within the catalog.
-
-Pattern rules are currently supported in the following source types:
-- Azure Data Lake Storage Gen2
-- Azure Blob Storage
-- Azure Files
-- Amazon S3
-
-The Advanced resource set feature set must be enabled to create resource set pattern rules. To learn more, see [Understanding Advanced resource sets](concept-resource-sets.md#advanced-resource-sets).
-
-## How to create a resource set pattern rule
-
-Follow the steps below to create a new resource set pattern rule:
-
-1. Go to the data map. Select **Pattern rules** from the menu under the Source management heading. Select **+ New** to create a new rule set.
-
- :::image type="content" source="media/how-to-resource-set-pattern-rules/create-new-scoped-resource-set-rule.png" alt-text="Create new resource set pattern rule" border="true":::
-
-1. Enter the scope of your resource set pattern rule. Select your storage account type and the name of the storage account you wish to create a rule set on. Each set of rules is applied relative to a folder path scope specified in the **Folder path** field.
-
- :::image type="content" source="media/how-to-resource-set-pattern-rules/create-new-scoped-resource-set-scope.png" alt-text="Create resource set pattern rule configurations" border="true":::
-
-1. To enter a rule for a configuration scope, select **+ New Rule**.
-
-1. Enter in the following fields to create a rule:
-
- 1. **Rule name:** The name of the configuration rule. This field has no effect on the assets the rule applies to.
-
- 1. **Qualified name:** A qualified path that uses a combination of text, dynamic replacers, and static replacers to match assets to the configuration rule. This path is relative to the scope of the configuration rule. See the [syntax](#syntax) section below for detailed instructions on how to specify qualified names.
-
- 1. **Display name:** The display name of the asset. This field is optional. Use plain text and static replacers to customize how an asset is displayed in the catalog. For more detailed instructions, see the [syntax](#syntax) section below.
-
 1. **Do not group as resource set:** If enabled, matched resources won't be grouped into a resource set.
-
- :::image type="content" source="media/how-to-resource-set-pattern-rules/scoped-resource-set-rule-example.png" alt-text="Create new configuration rule." border="true":::
-
-1. Save the rule by selecting **Add**.
-
-> [!NOTE]
-> After a pattern rule is created, all new scans will apply the rule during ingestion. Existing assets in the data catalog will be updated via a background process which can take up to a few hours.
-
-## <a name="syntax"></a> Pattern rule syntax
-
-When creating resource set pattern rules, use the following syntax to specify which assets rules apply to.
-
-### Dynamic replacers (single brackets)
-
-Single brackets are used as **dynamic replacers** in a pattern rule. Specify a dynamic replacer in the qualified name using the format `{<replacerName>:<replacerType>}`. If matched, dynamic replacers are used as a grouping condition indicating that the assets should be represented as a resource set. If the assets are grouped into a resource set, the resource set qualified path contains `{replacerName}` where the replacer was specified.
-
-For example, if two assets `folder1/file-1.csv` and `folder2/file-2.csv` matched the rule `{folder:string}/file-{NUM:int}.csv`, the resource set would be a single entity `{folder}/file-{NUM}.csv`.
-
-#### Special case: Dynamic replacers when not grouping into resource set
-
-If *Do not group as resource set* is enabled for a pattern rule, the replacer name is an optional field. `{:<replacerType>}` is valid syntax. For example, `file-{:int}.csv` would successfully match `file-1.csv` and `file-2.csv` and create two different assets instead of a resource set.
-
-### Static replacers (double brackets)
-
-Double brackets are used as **static replacers** in the qualified name of a pattern rule. Specify a static replacer in the qualified name using the format `{{<replacerName>:<replacerType>}}`. If matched, each unique set of static replacer values will create a different resource set grouping.
-
-For example, if two assets `folder1/file-1.csv` and `folder2/file-2.csv` matched the rule `{{folder:string}}/file-{NUM:int}.csv`, two resource sets would be created: `folder1/file-{NUM}.csv` and `folder2/file-{NUM}.csv`.
-
-Static replacers can be used to specify the display name of an asset matching to a pattern rule. Using `{{<replacerName>}}` in the display name of a rule will use the matched value in the asset name.
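Taken together, the two replacer kinds can be sketched in a short Python illustration. This is a hypothetical model for intuition only, not Microsoft Purview's implementation: dynamic replacers collapse all matched values into one resource set, while static replacers produce one resource set per unique matched value.

```python
import re
from collections import defaultdict

# Regex fragments for a couple of replacement types (illustrative subset).
TYPE_PATTERNS = {"string": r"[^/]+", "int": r"[0-9]+"}

def compile_rule(rule):
    """Translate a pattern rule into a compiled regex.

    {{name:type}} (static) becomes a named capture group, so each unique
    matched value yields its own resource set; {name:type} (dynamic) becomes
    an anonymous fragment that is collapsed in the resource set name.
    """
    token = re.compile(r"\{\{(\w+):(\w+)\}\}|\{(\w+):(\w+)\}")
    parts, pos = [], 0
    for m in token.finditer(rule):
        parts.append(re.escape(rule[pos:m.start()]))
        if m.group(1):  # static replacer {{name:type}}
            parts.append(f"(?P<{m.group(1)}>{TYPE_PATTERNS[m.group(2)]})")
        else:           # dynamic replacer {name:type}
            parts.append(TYPE_PATTERNS[m.group(4)])
        pos = m.end()
    parts.append(re.escape(rule[pos:]))
    return re.compile("".join(parts) + "$")

def group_assets(rule, paths):
    """Group matching paths into resource sets keyed by resource set name."""
    pattern = compile_rule(rule)
    groups = defaultdict(list)
    for path in paths:
        m = pattern.match(path)
        if not m:
            continue
        # Static replacers are substituted with their matched value;
        # dynamic replacers stay as {name} in the resource set name.
        name = re.sub(r"\{\{(\w+):\w+\}\}", lambda t: m.group(t.group(1)), rule)
        name = re.sub(r"\{(\w+):\w+\}", r"{\1}", name)
        groups[name].append(path)
    return dict(groups)

files = ["folder1/file-1.csv", "folder2/file-2.csv"]
# Dynamic: both files collapse into the single set "{folder}/file-{NUM}.csv"
print(group_assets("{folder:string}/file-{NUM:int}.csv", files))
# Static: one set per folder value ("folder1/file-{NUM}.csv", "folder2/file-{NUM}.csv")
print(group_assets("{{folder:string}}/file-{NUM:int}.csv", files))
```

Running the sketch against the examples above reproduces the groupings described: the dynamic rule yields one resource set for both files, and the static rule yields one resource set per folder.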
-
-### Available replacement types
-
-Below are the available types that can be used in static and dynamic replacers:
-
-| Type | Structure |
-| ---- | --------- |
-| string | A series of one or more Unicode characters, including delimiters like spaces. |
-| int | A series of one or more 0-9 ASCII characters; it can be 0-prefixed (for example, 0001). |
-| guid | A 32-character, or 8-4-4-4-12 formatted, string representation of a UUID as defined in [RFC 4122](https://tools.ietf.org/html/rfc4122). |
-| date | A series of 6 or 8 0-9 ASCII characters with optional separators: yyyymmdd, yyyy-mm-dd, yymmdd, yy-mm-dd, as specified in [RFC 3339](https://tools.ietf.org/html/rfc3339). |
-| time | A series of 4 or 6 0-9 ASCII characters with optional separators: HHmm, HH:mm, HHmmss, HH:mm:ss, as specified in [RFC 3339](https://tools.ietf.org/html/rfc3339). |
-| timestamp | A series of 12 or 14 0-9 ASCII characters with optional separators: yyyy-mm-ddTHH:mm, yyyymmddHHmm, yyyy-mm-ddTHH:mm:ss, yyyymmddHHmmss, as specified in [RFC 3339](https://tools.ietf.org/html/rfc3339). |
-| boolean | Can contain 'true' or 'false'; case insensitive. |
-| number | A series of zero or more 0-9 ASCII characters, which can be 0-prefixed (for example, 0001), optionally followed by a dot '.' and a series of one or more 0-9 ASCII characters, which can be 0-postfixed (for example, .100). |
-| hex | A series of one or more ASCII characters from the set 0-9 and A-F; the value can be 0-prefixed. |
-| locale | A string that matches the syntax specified in [RFC 5646](https://tools.ietf.org/html/rfc5646). |
-
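The structures in the table can be approximated with regular expressions. The following Python sketch is one interpretation for a few of the types, for illustration only; it is not Microsoft Purview's exact matching logic.

```python
import re

# Approximate regexes for a few replacement types from the table above.
# These are an illustrative interpretation, not Purview's exact matchers.
TYPE_REGEX = {
    "int": r"[0-9]+",
    "guid": (r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
             r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}|[0-9a-fA-F]{32}"),
    "date": r"\d{4}-?\d{2}-?\d{2}|\d{2}-?\d{2}-?\d{2}",
    "boolean": r"(?i:true|false)",
    "hex": r"[0-9A-Fa-f]+",
}

def matches(type_name, value):
    """Return True when the value fits the (approximate) type structure."""
    return re.fullmatch(TYPE_REGEX[type_name], value) is not None

print(matches("int", "0001"))                                   # True
print(matches("guid", "123e4567-e89b-12d3-a456-426614174000"))  # True
print(matches("date", "2020-01-13"))                            # True
print(matches("boolean", "TRUE"))                               # True
```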
-## Order in which resource set pattern rules are applied
-
-Below is the order of operations for applying pattern rules:
-
-1. More specific scopes will take priority if an asset matches two rules. For example, rules in a scope `container/folder` will apply before rules in scope `container`.
-
-1. Order of rules within a specific scope. This can be edited in the UX.
-
-1. If an asset doesn't match to any specified rule, the default resource set heuristics apply.
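The scope-specificity part of this precedence can be sketched as a hypothetical helper (for intuition only; it models only step 1 and the fallback, with `rules` as `(scope, rule_name)` pairs in their configured order):

```python
def pick_rule(asset_path, rules):
    """Pick the pattern rule whose scope applies to an asset.

    rules: list of (scope, rule_name) pairs, in their configured order.
    More specific (longer) scopes win; within equally specific scopes,
    configured order wins; no match means the default resource set
    heuristics apply (modeled here as None).
    """
    candidates = [(scope, rule) for scope, rule in rules
                  if asset_path.startswith(scope)]
    if not candidates:
        return None  # fall back to default resource set heuristics
    # max() returns the first maximal element, preserving configured order
    # among rules whose scopes are equally specific.
    return max(candidates, key=lambda c: len(c[0]))[1]

rules = [("container", "rule A"), ("container/folder", "rule B")]
print(pick_rule("container/folder/file.csv", rules))  # rule B (more specific scope)
print(pick_rule("container/other.csv", rules))        # rule A
```

Note that the real behavior also matches each asset against the rule's qualified name within a scope; this sketch deliberately models only the scope-precedence step.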
-
-## Examples
-
-### Example 1
-
-SAP data extraction into full and delta loads
-
-#### Inputs
-
-Files:
-- `https://myazureblob.blob.core.windows.net/bar/customer/full/2020/01/13/saptable_customer_20200101_20200102_01.txt`
-- `https://myazureblob.blob.core.windows.net/bar/customer/full/2020/01/13/saptable_customer_20200101_20200102_02.txt`
-- `https://myazureblob.blob.core.windows.net/bar/customer/delta/2020/01/15/saptable_customer_20200101_20200102_01.txt`
-- `https://myazureblob.blob.core.windows.net/bar/customer/full/2020/01/17/saptable_customer_20200101_20200102_01.txt`
-- `https://myazureblob.blob.core.windows.net/bar/customer/full/2020/01/17/saptable_customer_20200101_20200102_02.txt`
-
-#### Pattern rule
-
-**Scope:** `https://myazureblob.blob.core.windows.net/bar/`
-
-**Display name:** 'External Customer'
-
-**Qualified Name:** `customer/{extract:string}/{year:int}/{month:int}/{day:int}/saptable_customer_{date_from:date}_{date_to:time}_{sequence:int}.txt`
-
-**Resource Set:** true
-
-#### Output
-
-One resource set asset
-
-**Display Name:** External Customer
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/customer/{extract}/{year}/{month}/{day}/saptable_customer_{date_from}_{date_to}_{sequence}.txt`
-
-### Example 2
-
-IoT data in avro format
-
-#### Inputs
-
-Files:
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-002.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/02-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/01-01-2020/22:33:22-001.avro`
-
-#### Pattern rules
-
-**Scope:** `https://myazureblob.blob.core.windows.net/bar/`
-
-Rule 1
-
-**Display name:** 'machine-89'
-
-**Qualified Name:** `raw/machinename-89/{date:date}/{time:time}-{id:int}.avro`
-
-**Resource Set:** true
-
-Rule 2
-
-**Display name:** 'machine-90'
-
-**Qualified Name:** `raw/machinename-90/{date:date}/{time:time}-{id:int}.avro`
-
-**Resource Set:** true
-
-#### Outputs
-
-Two resource sets
-
-Resource Set 1
-
-**Display Name:** machine-89
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/{date}/{time}-{id}.avro`
-
-Resource Set 2
-
-**Display Name:** machine-90
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/{date}/{time}-{id}.avro`
-
-### Example 3
-
-IoT data in avro format
-
-#### Inputs
-
-Files:
-
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-002.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/02-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/01-01-2020/22:33:22-001.avro`
-
-#### Pattern rule
-
-**Scope:** `https://myazureblob.blob.core.windows.net/bar/`
-
-**Display name:** 'Machine-{{machineid}}'
-
-**Qualified Name:** `raw/machinename-{{machineid:int}}/{date:date}/{time:time}-{id:int}.avro`
-
-**Resource Set:** true
-
-#### Outputs
-
-Resource Set 1
-
-**Display name:** machine-89
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/{date}/{time}-{id}.avro`
-
-Resource Set 2
-
-**Display name:** machine-90
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/{date}/{time}-{id}.avro`
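Example 3's dynamic display name can be sketched like this: the `{{machineid}}` capture from the rule is substituted into the display-name template, so each distinct machine ID yields its own resource set. The regex and template below are illustrative stand-ins for the rule definition, not product code.

```python
import re
from collections import defaultdict

# Sketch of Example 3's behavior (not the product code): the {{machineid}}
# capture in the rule is substituted into the display name, so each distinct
# machine id produces its own resource set.
rule_rx = re.compile(r"raw/machinename-(?P<machineid>\d+)/[^/]+/[^/]+\.avro$")
display_template = "machine-{machineid}"

paths = [
    "raw/machinename-89/01-01-2020/22:33:22-001.avro",
    "raw/machinename-89/01-01-2020/22:33:22-002.avro",
    "raw/machinename-89/02-01-2020/22:33:22-001.avro",
    "raw/machinename-90/01-01-2020/22:33:22-001.avro",
]

resource_sets = defaultdict(list)
for p in paths:
    m = rule_rx.match(p)
    if m:
        resource_sets[display_template.format(**m.groupdict())].append(p)

print(sorted(resource_sets))             # ['machine-89', 'machine-90']
print(len(resource_sets["machine-89"]))  # 3
```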
-
-### Example 4
-
-Don't group into resource sets
-
-#### Inputs
-
-Files:
-
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-002.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/02-01-2020/22:33:22-001.avro`
-- `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/01-01-2020/22:33:22-001.avro`
-
-#### Pattern rule
-
-**Scope:** `https://myazureblob.blob.core.windows.net/bar/`
-
-**Display name:** `Machine-{{machineid}}`
-
-**Qualified Name:** `raw/machinename-{{machineid:int}}/{{:date}}/{{:time}}-{{:int}}.avro`
-
-**Resource Set:** false
-
-#### Outputs
-
-Four individual assets
-
-Asset 1
-
-**Display name:** machine-89
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-001.avro`
-
-Asset 2
-
-**Display name:** machine-89
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/01-01-2020/22:33:22-002.avro`
-
-Asset 3
-
-**Display name:** machine-89
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-89/02-01-2020/22:33:22-001.avro`
-
-Asset 4
-
-**Display name:** machine-90
-
-**Qualified Name:** `https://myazureblob.blob.core.windows.net/bar/raw/machinename-90/01-01-2020/22:33:22-001.avro`
-
-## Next steps
-
-Get started by [registering and scanning an Azure Data Lake Gen2 storage account](register-scan-adls-gen2.md).
purview How To Schedule Data Estate Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-schedule-data-estate-insights.md
- Title: Schedule Data Estate Insights report refresh
-description: This article provides the steps to schedule your Data Estate Insights report refresh rate in the Microsoft Purview governance portal.
-
-Previously updated: 01/24/2023
-
-# Schedule Data Estate Insights report refresh
-
-Microsoft Purview Data Estate Insights automatically creates reports that help you to identify governance gaps in your data estate. However, you can set the schedule for this automatic refresh to fit your reporting patterns and needs. In this article, we'll describe how you can schedule your refresh and what options are available.
-
-If you would like to disable Data Estate Insights for a time, you can follow our article to [disable Data Estate Insights](disable-data-estate-insights.md).
-
-> [!IMPORTANT]
-> The Data Estate Insights application is **on** by default when you create a Microsoft Purview account.
->
-> **Refresh Schedule** is set to a weekly refresh that begins 7 days after the account is created.
->
-> As the Data Map is populated and curated, the Insights App shows data in the reports. The reports are ready for consumption by anyone with the Insights Reader role.
-
-## Permissions needed
-
-To update the report refresh schedule for Data Estate Insights, you'll need **[data curator permissions](catalog-permissions.md#roles) on the [root collection](reference-azure-purview-glossary.md#root-collection)**.
-
-## How to update the refresh schedule
-
->[!IMPORTANT]
-> Report refresh rate [does impact your billing](#does-the-refresh-schedule-affect-billing).
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), go to the **Management** section.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/locate-management.png" alt-text="Screenshot of the Microsoft Purview governance portal Management section highlighted.":::
-
-1. Then select **Overview**.
-1. In the **Insights refresh** menu, locate Data Estate Insights, and select the **Edit** pencil. For more information about the insights refresh menu, see the [insights refresh menu section below](#insights-refresh-menu).
-
- >[!NOTE]
- >If the State is currently set to **Off** you will need to first turn it **On** before you can update the refresh schedule.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/refresh-frequency.png" alt-text="Screenshot of the Overview window in the Management section of the Microsoft Purview governance portal with the refresh frequency dropdown highlighted for Data Estate Insights feature options." :::
-
-1. In the **Edit** menu, select **Recurring**.
-
-1. Then select your time zone, set your recurrence to **Month(s)**, **Week(s)**, or **Day(s)**, select your day and time to run, specify a start time, and optionally specify an end time. For more information about the schedule options, see the [schedule options section below](#schedule-options).
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/set-recurrance.png" alt-text="Screenshot of the Data Estate Insights edit page, with the Recurring radio button highlighted and set to Recurring." :::
-
-1. Select **Continue**. If prompted, review the updates and select **Save**.
-1. Now you can see your schedule is set. Selecting **More info** in the schedule columns will give you the recurrence details.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/schedule-set.png" alt-text="Screenshot of the management page, with the Data Estate Insights information row highlighted." :::
-
-## Does the refresh schedule affect billing?
-
-Yes. You're billed when Microsoft Purview Data Estate Insights checks for updates in your environment and when a report is generated.
-
-1. A check for updates to your Microsoft Purview instance is made at your scheduled report refresh time. You'll always be billed for this check at your report refresh time.
-
-1. If updates have been made, you'll be billed to generate your insights report. If no updates were made since the last report, you'll not be billed to generate a report.
-
- >[!NOTE]
- > An asset is considered to be **updated** if it's updated in Microsoft Purview.
- > An update in Microsoft Purview means a change in the following properties: asset name, description, schema, classifications, experts, owners, etc...
- > New rows in a data asset and a re-scan will not result in an update to your insights reports if the metadata properties have not changed.
-
-For more information about billing for Data Estate Insights, see our [pricing guidelines](concept-guidelines-pricing-data-estate-insights.md).
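The billing behavior above amounts to a small decision, sketched here with a hypothetical helper (not a Purview API): the scheduled update check is always billed, and report generation is billed only when asset metadata actually changed since the last run.

```python
# Hypothetical illustration of the billing behavior described above: the
# scheduled check is always billed; report generation is billed only when
# asset metadata (name, description, schema, classifications, ...) changed.
# Re-scans and new rows alone don't count as metadata changes.

def refresh_events(metadata_changed_since_last_run):
    events = ["billed: update check"]
    if metadata_changed_since_last_run:
        events.append("billed: report generation")
    else:
        events.append("skipped: reports left as-is")
    return events

print(refresh_events(True))
print(refresh_events(False))
```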
-
-## Schedule options
-
-Below are listed all the schedule options currently available when you modify your refresh schedule:
-
-- [Recurring or off](#recurring-or-off)
-- [Time zone](#time-zone)
-- [Recurrence](#recurrence)
-- [Start at](#start-at)
-- [End date (optional)](#specify-an-end-date)
-
-### Recurring or off
-
-- **Recurring** - your Data Estate Insights report will refresh on the recurrence schedule you specify below, based on the time zone you indicate.
-- **Off** - your Data Estate Insights reports will no longer refresh information, but the current reports will remain available. For more information, see our [disable Data Estate Insights article.](disable-data-estate-insights.md#disable-report-refresh)
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/recurring-off.png" alt-text="Screenshot of the Edit refresh page recurrence and off options, showing recurrence selected." :::
-
-### Time zone
-
-Select the time zone you'd like to align your refresh schedule with. If the time zone you select observes daylight savings, your trigger will auto-adjust for the difference.
--
-### Recurrence
-
-You can select a daily, weekly, or monthly refresh recurrence.
-
->[!IMPORTANT]
-> No matter what you set your refresh recurrence to, Data Estate Insights will run a check before refreshing to see if any updates have been made to the assets.
-> If no update has been made, the reports will not refresh.
->
-> An asset is considered to be **updated** only if it is updated in Microsoft Purview.
-> An update in Microsoft Purview means a change in the following properties: asset name, description, schema, classifications, experts, owners, etc...
-> New rows in a data asset and a re-scan will not result in an update to your insights reports if the metadata properties of the asset have not changed.
-
-- **Daily recurrence** - schedule daily recurrence by setting recurrence to every (1-5) day(s), and choosing the time of day you would like to refresh the report. For example, the below report will refresh every other day at 12 AM UTC.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/daily-recurrence.png" alt-text="Screenshot of the Edit refresh page recurrence options, showing a daily recurrence set." :::
-
-- **Weekly recurrence** - schedule weekly recurrence by setting recurrence to every (1-5) week(s), and choosing the day of the week and time of day you would like to refresh the report. For example, the below report will refresh on Monday every two weeks, at 6 AM.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/weekly-recurrence.png" alt-text="Screenshot of the Edit refresh page recurrence options, showing a weekly recurrence set." :::
-
-- **Monthly recurrence** - schedule monthly recurrence by setting recurrence to every (1-5) month(s), and choosing the day of the month and time of day you would like to refresh the report. Choose **Last** to always run on the last day of the month, and **1** to always run on the first day of the month. For example, the below report will refresh on the last day of every month at 5 PM.
-
- :::image type="content" source="media/how-to-schedule-data-estate-insights/monthly-recurrence.png" alt-text="Screenshot of the Edit refresh page recurrence options, showing a monthly recurrence set." :::
-
-Set **Scan at this time** to the time of day you'd like to begin your refresh.
-
->[!TIP]
-> The time you select will be the time that the report begins its refresh. Microsoft Purview will adjust compute for the amount of data being aggregated, but **schedule your recurrence a little before you will need the report** so the report has time to fully refresh and will be ready.
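As a rough sketch of how such a recurrence could be evaluated (illustrative only, not how the service computes it), the snippet below finds the next weekly run in a chosen time zone. The standard-library `zoneinfo` module adjusts for daylight saving time, mirroring the auto-adjust behavior described under Time zone; the simplified `every_n_weeks` handling here ignores the schedule's anchor week.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Illustrative only: compute the next run of an "every N weeks on <weekday>
# at <hour>" schedule in a given time zone. zoneinfo handles daylight saving
# shifts; the every_n_weeks handling is simplified (no anchor week).

def next_weekly_run(now, weekday, hour, every_n_weeks=1, tz="Europe/Amsterdam"):
    now = now.astimezone(ZoneInfo(tz))
    days_ahead = (weekday - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=hour, minute=0, second=0, microsecond=0
    )
    if candidate <= now:
        candidate += timedelta(weeks=every_n_weeks)
    return candidate

now = datetime(2023, 7, 20, 12, 0, tzinfo=ZoneInfo("UTC"))  # a Thursday
run = next_weekly_run(now, weekday=0, hour=6)  # next Monday, 6 AM local
print(run.isoformat())  # 2023-07-24T06:00:00+02:00
```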
-
-### Start at
-
-The **Start at** option allows you to set when your reports will begin their refresh schedule, and is relative to the time zone you selected. Set it for the current date and time to set the schedule immediately.
--
-### Specify an end date
-
-If you want your reports to stop refreshing after a certain amount of time, you can enable this option by selecting the check box and providing an end date.
-For example: a monthly recurrence schedule refreshing on the first of the month with the below end date would refresh on September 1, 2023, and wouldn't refresh again in October.
--
-## Insights refresh menu
-
-In the management menu of the Microsoft Purview governance portal, you'll find the Insights refresh menu that lets you set your Data Estate Insights report availability.
-Here's all the information you can find there:
-
-|Feature|State|Schedule|Configuration date|Edit|
-||||||
-|Lists the feature your settings are for|Either **On** or **Off**. For more information about turning off reports, see our [disable Data Estate Insights article.](disable-data-estate-insights.md#disable-report-refresh)| Select **More info** to see your recurrence schedule, start time, and end time. | The day your recurrence schedule was last set. By default this will be the creation date of your Microsoft Purview account. | Allows you to update your recurrence schedule when reports are turned on. |
--
-## Next steps
-
-- [Learn how to use Asset insights](asset-insights.md)
-- [Learn how to use Classification insights](classification-insights.md)
-- [Learn how to use Glossary insights](glossary-insights.md)
-- [Learn how to use Label insights](sensitivity-insights.md)
-- [Disable Data Estate Insights](disable-data-estate-insights.md)
purview How To Search Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-search-catalog.md
- Title: 'How to: search the Data Catalog'
-description: This article gives an overview of how to search the Microsoft Purview Data Catalog.
-
-Previously updated: 04/11/2022
-
-# Search the Microsoft Purview Data Catalog
-
-After data is scanned and ingested into the Microsoft Purview Data Map, data consumers need to easily find the data needed for their analytics or governance workloads. Data discovery can be time consuming because you may not know where to find the data that you want. Even after finding the data, you may have doubts about whether you can trust the data and take a dependency on it.
-
-The goal of search in Microsoft Purview is to speed up the process of quickly finding the data that matters. This article outlines how to search the Microsoft Purview Data Catalog to quickly find the data you're looking for.
-
-## Searching the catalog
-
-The search bar can be quickly accessed from the top bar of the Microsoft Purview governance portal UX. In the data catalog home page, the search bar is in the center of the screen.
--
-Once you select the search bar, you'll be presented with your search history and the items recently accessed in the data catalog. This allows you to quickly pick up from previous data exploration that was already done.
--
-Enter keywords that help narrow down your search, such as name, data type, classifications, and glossary terms. As you enter search keywords, Microsoft Purview dynamically suggests results and searches that may fit your needs. To complete your search, select "View search results" or press "Enter".
--
-Once you enter your search, Microsoft Purview returns a list of the data assets and glossary terms you have data reader access to that matched the keywords you entered.
-
-Your keyword will be highlighted in the return results, so you can see where the term was found in the asset. In the example below, the search term was 'Sales'.
--
-> [!NOTE]
-> Search will only return items in collections you're a data reader or curator for. For more information, see [create and manage Collections](how-to-create-and-manage-collections.md).
-
-The Microsoft Purview relevance engine sorts through all the matches and ranks them based on what it believes their usefulness is to a user. For example, a data consumer is likely more interested in a table curated by a data steward that matches on multiple keywords than an unannotated folder. Many factors determine an asset's relevance score, and the Microsoft Purview search team is constantly tuning the relevance engine to ensure the top search results have value to you.
-
-## Refine results
-
-If the top results don't include the assets you're looking for, there are two ways you can filter results:
-
-- [Use the facets](#use-the-facets) on the left hand side to narrow results by business metadata like glossary terms or classifications.
-- [Use the filters](#use-the-filters) at the top to narrow results by source type, [managed attributes](how-to-managed-attributes.md), or activity.
-
-### Use the facets
-
-To narrow your results by your business metadata, use the facet pane on the left side of the search results. If it's not visible, you can select the arrow button to open it.
-
-Then select any facet you would like to narrow your results by.
--
-For certain annotations, you can select the ellipses to choose between an AND condition or an OR condition.
--
->[!NOTE]
->The *Filter by keyword* option at the top of the facet menu is to filter facets by keyword, not the search results.
->
-> :::image type="content" source="./media/how-to-search-catalog/filter-facets.png" alt-text="Screenshot showing the facet filter at the top of the menu, with a search parameter entered, and the facets filtered below." border="true":::
-
-#### Available facets
-
-- **Assigned term** - refines your search to assets with the selected terms applied.
-- **Classification** - refines your search to assets with certain classifications.
-- **Collection** - refines your search by assets in a specific collection.
-- **Contact** - refines your search to assets that have selected users listed as a contact.
-- **Data** - refines your search to specific data types. For example: pipelines, data shares, tables, or reports.
-- **Endorsement** - refines your search to assets with specified endorsements, like **Certified** or **Promoted**.
-- **Label** - refines your search to assets with specific security labels.
-- **Metamodel facets** - if you've created a [metamodel](concept-metamodel.md) in your Microsoft Purview Data Map, you can also refine your search to metamodel assets like Business or Organization.
-- **Rating** - refines your search to only data assets with a specified rating.
-
-### Use the filters
-
-To narrow results by asset type, [managed attributes](how-to-managed-attributes.md), or activity you'll use the filters at the top of the page of search results.
-
-To filter by source type, select the **Source type:** button, and select all the source types you want to see results from.
--
-To add another filter for a different attribute, select **Add filter**.
--
-Then, select your attribute, enter your operator and value, and your search results will be filtered after you select outside the filter.
-
-To remove any filters, select the **x** in the filter button, or clear all filters by selecting **Clear all filters**.
--
-#### Available filters
-
-- **Activity** - allows you to refine your search to attributes created or updated within a certain timeframe.
-- **Managed attributes** - refines your search to assets with specified [managed attributes](how-to-managed-attributes.md). Attributes will be listed under their attribute group, and use operators to help search for specific values. For example: Contains any, or Doesn't contain.
-- **Source type** - refines your search to assets from specified source types. For example: Azure Blob Storage or Power BI.
-- **Tags** - refines your search to assets with selected tags.
-
-## View assets
-
-From the search results page, you can select an asset to view details such as schema, lineage, and classifications. To learn more about the asset details page, see [Manage catalog assets](catalog-asset-details.md).
--
-## Searching Microsoft Purview in connected services
-
-Once you register your Microsoft Purview instance to an Azure Data Factory or an Azure Synapse Analytics workspace, you can search the Microsoft Purview Data Catalog directly from those services. To learn more, see [Discover data in ADF using Microsoft Purview](../data-factory/how-to-discover-explore-purview-data.md) and [Discover data in Synapse using Microsoft Purview](../synapse-analytics/catalog-and-governance/how-to-discover-connect-analyze-azure-purview.md).
--
-## Bulk edit search results
-
-If you're looking to make changes to multiple assets returned by search, Microsoft Purview lets you modify glossary terms, classifications, and contacts in bulk. To learn more, see the [bulk edit assets](how-to-bulk-edit-assets.md) guide.
-
-## Browse the data catalog
-
-While searching is great if you know what you're looking for, there are times where data consumers wish to explore the data available to them. The Microsoft Purview Data Catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog. For more information, see [browse the data catalog](how-to-browse-catalog.md).
-
-## Search query syntax
-
-All search queries consist of keywords and operators. A keyword is something that is part of an asset's properties. Potential keywords can be a classification, glossary term, asset description, or an asset name. A keyword can be just a part of the property you're looking to match. Use keywords and operators to ensure Microsoft Purview returns the assets you're looking for.
-
-Certain characters including spaces, dashes, and commas are interpreted as delimiters. Searching a string like `hive-database` is the same as searching two keywords `hive database`.
-
-The following table contains the operators that can be used to compose a search query. Operators can be combined as many times as needed in a single query.
-
-| Operator | Definition | Example |
-| -- | - | - |
-| OR | Specifies that an asset must have at least one of the two keywords. Must be in all caps. A white space is also an OR operator. | The query `hive OR database` returns assets that contain 'hive' or 'database' or both. |
-| AND | Specifies that an asset must have both keywords. Must be in all caps. | The query `hive AND database` returns assets that contain both 'hive' and 'database'. |
-| NOT | Specifies that an asset can't contain the keyword to the right of the NOT clause. Must be in all caps. | The query `hive NOT database` returns assets that contain 'hive', but not 'database'. |
-| () | Groups a set of keywords and operators together. If you combine multiple operators, parentheses specify the order of operations. | The query `hive AND (database OR warehouse)` returns assets that contain 'hive' and either 'database' or 'warehouse', or both. |
-| "" | Specifies exact content in a phrase that the query must match. | The query `"hive database"` returns assets that contain the phrase "hive database" in their properties. |
-| field:keyword | Searches the keyword in a specific attribute of an asset. Field search is case insensitive and is limited to the following fields at this time: <ul><li>name</li><li>description</li><li>entityType</li><li>assetType</li><li>classification</li><li>term</li><li>contact</li></ul> | The query `description: German` returns all assets that contain the word "German" in the description.<br><br>The query `term:Customer` will return all assets with glossary terms that include "Customer" and all glossary terms that match to "Customer". |
-
-> [!TIP]
-> Searching "*" will return all the assets and glossary terms in the catalog.
-
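To make the operator semantics concrete, here's a minimal left-to-right evaluator against a single asset's keyword set. It's an illustrative sketch of the table above, not the service's parser: relevance ranking and `field:keyword` searches are omitted, quoted phrases are treated as one keyword lookup, bare whitespace acts as OR per the table, and dashes and commas are split like whitespace per the delimiter note.

```python
import re

# Minimal sketch (not the service's parser) of the operator table above,
# evaluated left to right against one asset's keyword set. Parentheses set
# the order of operations; dashes/commas/whitespace all delimit keywords.

def matches(query, keywords):
    tokens = re.findall(r'\(|\)|"[^"]*"|[^\s()",-]+', query)

    def atom(pos):
        if tokens[pos] == "(":
            val, pos = expr(pos + 1)
            return val, pos + 1              # skip the closing ")"
        # a quoted phrase is treated as one keyword lookup in this sketch
        return tokens[pos].strip('"').lower() in keywords, pos + 1

    def expr(pos):
        val, pos = atom(pos)
        while pos < len(tokens) and tokens[pos] != ")":
            op = tokens[pos]
            if op in ("AND", "OR", "NOT"):
                pos += 1
            else:
                op = "OR"                    # bare whitespace acts as OR
            rhs, pos = atom(pos)
            if op == "AND":
                val = val and rhs
            elif op == "NOT":
                val = val and not rhs        # left side AND NOT right side
            else:
                val = val or rhs
        return val, pos

    return expr(0)[0]

asset_keywords = {"hive", "database"}
print(matches("hive AND database", asset_keywords))                 # True
print(matches("hive NOT database", asset_keywords))                 # False
print(matches("hive AND (database OR warehouse)", asset_keywords))  # True
print(matches("hive-database", {"database"}))                       # True
```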
-### Known limitations
-
-* Grouping isn't supported within a field search. Customers should use operators to connect field searches. For example,`name:(alice AND bob)` is invalid search syntax, but `name:alice AND name:bob` is supported.
-
-## Next steps
-
-- [How to create and manage glossary terms](how-to-create-manage-glossary-term.md)
-- [How to import and export glossary terms](how-to-import-export-glossary.md)
-- [How to manage term templates for business glossary](how-to-manage-term-templates.md)
purview How To Share Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-share-data.md
- Title: How to share data
-description: Learn how to share data with Microsoft Purview Data Sharing.
-
-Previously updated: 02/16/2023
-
-# Share Azure Storage data in-place with Microsoft Purview Data Sharing (preview)
--
-Microsoft Purview Data Sharing supports in-place data sharing from Azure Data Lake Storage (ADLS Gen2) to ADLS Gen2, and Blob storage account to Blob storage account. This article explains how to share data using Microsoft Purview.
-
->[!NOTE]
->This feature has been updated in February 2023, and permissions needed to view and manage Data Sharing in Microsoft Purview have changed.
->Now only Reader permissions are needed on the collection where the shared data is housed. Refer to [Microsoft Purview permissions](catalog-permissions.md) to learn more about the Microsoft Purview collection and roles.
-
-## Prerequisites to share data
-
-### Microsoft Purview prerequisites
-
-* [A Microsoft Purview account](create-catalog-portal.md).
-* A minimum of Data Reader role is needed on a Microsoft Purview collection to use data sharing in the governance portal. Refer to [Microsoft Purview permissions](catalog-permissions.md) to learn more about the Microsoft Purview collection and roles.
-* To use the SDK, no Microsoft Purview permissions are needed.
-* Your data recipient's Azure sign-in email address, or the object ID and tenant ID of the recipient application, that you'll use to send the invitation to receive a share. The recipient's email alias won't work.
-
-### Azure Storage account prerequisites
-
-* Your Azure subscription must be registered for the **AllowDataSharing** preview feature. Follow the below steps using Azure portal or PowerShell.
-
- # [Portal](#tab/azure-portal)
- 1. In the [Azure portal](https://portal.azure.com), select your Azure subscription.
- 1. From the left menu, select **Preview features** under *Settings*.
- 1. Select **AllowDataSharing** and *Register*.
- 1. Refresh the *Preview features* screen to verify the *State* is **Registered**. It could take 15 minutes to 1 hour for registration to complete.
- 1. In addition, to use data share for storage accounts in East US, East US 2, North Europe, South Central US, West Central US, West Europe, West US, or West US 2, select **AllowDataSharingInHeroRegion** and **Register**.
-
- For more information, see [Register preview feature](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
-
- # [PowerShell](#tab/powershell)
- ```azurepowershell
- Set-AzContext -SubscriptionId [Your Azure subscription ID]
- ```
- ```azurepowershell
- Register-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
- ```azurepowershell
- Get-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
- In addition, to use data share for storage accounts in East US, East US 2, North Europe, South Central US, West Central US, West Europe, West US, and West US 2:
-
- ```azurepowershell
- Register-AzProviderFeature -FeatureName "AllowDataSharingInHeroRegion" -ProviderNamespace "Microsoft.Storage"
- ```
- ```azurepowershell
- Get-AzProviderFeature -FeatureName "AllowDataSharingInHeroRegion" -ProviderNamespace "Microsoft.Storage"
- ```
- The *RegistrationState* should be **Registered**. It could take 15 minutes to 1 hour for registration to complete. For more information, see [Register preview feature](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
--
-* A source storage account **created after the registration step is completed**. Source storage account can be in a different Azure region from your Microsoft Purview account, but needs to follow the available configurations.
-
-* You need the **Owner** or **Storage Blob Data Owner** role on the source storage account to be able to share data. You can find more details on the [ADLS Gen2](register-scan-adls-gen2.md#data-sharing) or [Blob storage](register-scan-azure-blob-storage-source.md#data-sharing) data source page.
-
-* If the source storage account is in a different Azure subscription than the one for your Microsoft Purview account, the Microsoft.Purview resource provider needs to be registered in the Azure subscription where the storage account is located. It's registered automatically when the share provider adds an asset, provided the user has permission to perform the `/register/action` operation, which the Contributor or Owner role on that subscription grants.
-This registration is only needed the first time when sharing or receiving data into a storage account in the Azure subscription.
-
-* A storage account needs to be registered in the collection to create a share using the Microsoft Purview governance portal experience. For instructions to register, see the [ADLS Gen2](register-scan-adls-gen2.md) or [Blob storage](register-scan-azure-blob-storage-source.md) data source pages. This step isn't required to use the SDK.
-
-## Create a share
-
-1. You can create a share by starting from **Data Map**
-
- Open the [Microsoft Purview governance portal](https://web.purview.azure.com/). Select the **Data Map** icon from the left navigation. Then select **Shares**. Select **+New Share**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-datamap-new-share.png" alt-text="Screenshot that shows the Microsoft Purview governance portal Data Map with Data Map, Shares and New Share highlighted." border="true":::
-
- Select the Storage account type and the Storage account you want to share data from. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-datamap-select-type-account.png" alt-text="Screenshot that shows the New Share creation step with Type and Storage account options highlighted." border="true":::
-
-1. You can create a share by starting from **Data Catalog**
-
- Within the [Microsoft Purview governance portal](https://web.purview.azure.com/), find the Azure Storage or Azure Data Lake Storage (ADLS) Gen 2 data asset you would like to share data from using either the [data catalog search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md).
-
- :::image type="content" source="./media/how-to-share-data/search-or-browse.png" alt-text="Screenshot that shows the Microsoft Purview governance portal homepage with the search and browse options highlighted." border="true":::
-
- Once you have found your data asset, select the **Data Share** button.
-
- :::image type="content" source="./media/how-to-share-data/select-data-share-inline.png" alt-text="Screenshot of a data asset in the Microsoft Purview governance portal with the Data Share button highlighted." border="true" lightbox="./media/how-to-share-data/select-data-share-large.png":::
-
- Select **+New Share**.
-
- :::image type="content" source="./media/how-to-share-data/select-new-share-inline.png" alt-text="Screenshot of the Data Share management window with the New Share button highlighted." border="true" lightbox="./media/how-to-share-data/select-new-share-large.png":::
-
-1. Specify a name and a description of share contents (optional). Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-details-inline.png" alt-text="Screenshot showing create share and enter details window, with the Continue button highlighted." border="true" lightbox="./media/how-to-share-data/create-share-details-large.png":::
-
-1. Search for and add all the assets you'd like to share out at the container, folder, and file level, and then select **Continue**.
-
- > [!IMPORTANT]
- > Only containers, files, and folders that belong to the current Blob or ADLSGen2 Storage account can be added to the share.
-
- :::image type="content" source="./media/how-to-share-data/add-asset.png" alt-text="Screenshot showing the add assets window, with a file and a folder selected to share." border="true":::
-
-1. You can edit the display names the shared data will have, if you like. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/provide-display-names.png" alt-text="Screenshot showing the second add assets window with the display names unchanged." border="true":::
-
-1. Select **Add Recipient** and select **User** or **App**.
-
- To share data with a user, select **User**, then enter the Azure sign-in email address of the person you want to share data with. By default, the option to enter a user's email address is shown.
-
- :::image type="content" source="./media/how-to-share-data/create-share-add-user-recipient-inline.png" alt-text="Screenshot showing the add recipients page, with the add recipient button highlighted, default user email option shown." border="true" lightbox="./media/how-to-share-data/create-share-add-user-recipient-large.png":::
-
- To share data with a service principal, select **App**. Enter the object ID and tenant ID of the recipient you want to share data with.
-
- :::image type="content" source="./media/how-to-share-data/create-share-add-app-recipient-inline.png" alt-text="Screenshot showing the add app recipients page, with the add app option and required fields highlighted." border="true" lightbox="./media/how-to-share-data/create-share-add-app-recipient-large.png":::
-
-1. Select **Create and Share**. Optionally, you can specify an **Expiration date** for when to terminate the share. You can share the same data with multiple recipients by selecting **Add Recipient** multiple times.
-
-You've now created your share. The recipients of your share will receive an invitation, which they can view in their Microsoft Purview account.
-
-When a share is created, a new asset of type sent share is ingested into the Microsoft Purview catalog, in the same collection as the storage account from which you created the share. You can search for it like any other asset in the data catalog.
-
-You can also track lineage for data shared using Microsoft Purview. See, [Microsoft Purview Data Sharing lineage](how-to-lineage-purview-data-sharing.md) to learn more about share assets and data sharing lineage.
-
-> [!NOTE]
-> Shares created using the SDK without registering the storage account with Microsoft Purview will not be ingested into the catalog. Users can register their storage accounts if desired. If a storage account is unregistered or re-registered to a different collection, share assets of that storage account remain in the initial collection.
-
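If you work with shares programmatically, the REST API reference listed under Next steps is the authoritative source for endpoints and versions. As a rough sketch only (the account name, request path, and API version below are placeholder assumptions, not confirmed values), a request URL against your account's sharing endpoint could be assembled like this:

```python
account_name = "yourPurviewAccount"  # assumption: your Microsoft Purview account name
endpoint = f"https://{account_name}.purview.azure.com"

# Placeholder path and api-version; verify both against the REST API reference
# before calling the service.
url = f"{endpoint}/share/sentShares?api-version=<api-version>"
print(url)
```

The request would also need an Azure AD bearer token in the `Authorization` header; acquiring one is out of scope for this sketch.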
-## Update a sent share
-
-Once a share is created, you can update description, assets, and recipients.
-
-> [!NOTE]
-> If you only have the **Reader** role on the source storage account, you'll be able to view the list of sent and received shares, but not edit them. You can find more details on the [ADLS Gen2](register-scan-adls-gen2.md#data-sharing) or [Blob storage](register-scan-azure-blob-storage-source.md#data-sharing) data source page.
-
-You can find your sent shares in one of two ways:
-
-* Access the blob storage or ADLS Gen2 asset where the data was shared from in the data catalog. Open it, then select **Data Share**. There you're able to see all the shares for that asset. Select a share, and then select the **Edit** option.
-
- :::image type="content" source="./media/how-to-share-data/select-data-share-inline.png" alt-text="Screenshot of a data asset in the Microsoft Purview governance portal with the data share button highlighted." border="true" lightbox="./media/how-to-share-data/select-data-share-large.png":::
-
- :::image type="content" source="./media/how-to-share-data/select-share-to-edit.png" alt-text="Screenshot of the Manage data shares page with a share selected and the edit button highlighted." border="true":::
-
-* For shares that you sent, you can find them in the **Shares** menu in the Microsoft Purview Data Map. There you're able to see all the shares you have sent. Select a share, and then select the **Edit** option.
-
- :::image type="content" source="./media/how-to-share-data/select-shares-in-data-map-inline.png" alt-text="Screenshot of the Data Shares menu in the Microsoft Purview Data Map." border="true" lightbox="./media/how-to-share-data/select-shares-in-data-map.png":::
-
-From any of these places you can:
-
-- [Edit share details](#edit-details)
-- [Edit shared assets](#edit-assets)
-- [Edit share recipients](#edit-recipients)
-- [Delete your share](#delete-share)
-
-### Edit details
-
-On the **Details** tab of the [edit share page](#update-a-sent-share), you can update the share name and description.
-Save any changes by selecting **Save**.
--
-### Edit assets
-
-On the **Asset** tab of the [edit share page](#update-a-sent-share) you can see all the shared files and folders.
-
-You can **remove** any containers, files, or folders from the share by selecting the delete button in the asset's row. However, you can't remove all the assets of a sent share.
--
-You can **add new assets** by selecting the **Edit** button and then searching for and selecting any other containers, files, and folders in the asset that you would like to add.
--
-Once you've selected your assets, select **Add**, and you'll see your new asset in the Asset tab.
-
-Save all your changes by selecting the **Save** button.
-
-### Edit recipients
-
-On the **Recipients** tab of the [edit share page](#update-a-sent-share) you can see all the users and groups that are receiving your shares, their status, and the expiration date for their share.
-
-Here's what each recipient status means:
-
-| Status | Meaning |
-|||
-|Attached | The share has been accepted and the recipient has access to the shared data. |
-|Detached | The recipient hasn't accepted the invitation or is no longer active. They aren't receiving the share. |
-
-You can **remove or delete recipients** by either selecting the delete button on the recipient's row, or selecting multiple recipients and then selecting the **Delete recipients** button at the top of the page.
--
-You can **add recipients** by selecting the **Add recipients** button.
--
-Select **Add Recipient** again and select **User** or **App**.
-
-To share data with a user, select **User**, then enter the Azure sign-in email address of the person you want to share data with. By default, the option to enter a user's email address is shown.
--
-To share data with a service principal, select **App**. Enter the object ID and tenant ID of the recipient you want to share data with.
--
-Optionally, you can specify an **Expiration date** for when to terminate the share. You can share the same data with multiple recipients by selecting **Add Recipient** multiple times.
-
-When you're finished, select the **Add recipients** confirmation button at the bottom of the page.
-
-Save all your changes by selecting the **Save** button.
-
-### Delete share
-
-To delete your share, on any tab in the [edit share page](#update-a-sent-share), select the **Delete share** button.
--
-Confirm that you would like to delete in the pop-up window and the share will be removed.
-
-## Troubleshoot
-
-Here are some common data-sharing issues and how to troubleshoot them.
-
-### Can't create Microsoft Purview account
-
-If you're getting an error related to *quota* when creating a Microsoft Purview account, it means your organization has exceeded the [Microsoft Purview service limit](how-to-manage-quotas.md). If you require an increase in limit, contact support.
-
-### Can't find my Storage account asset in the Catalog
-
-There are a couple of possible reasons:
-
-* The data source isn't registered in Microsoft Purview. Refer to the registration steps for [Blob Storage](register-scan-azure-blob-storage-source.md) and [ADLSGen2](register-scan-adls-gen2.md) respectively. Performing a scan isn't necessary.
-* The data source is registered to a Microsoft Purview collection that you don't have at least the Data Reader permission on. Refer to [Microsoft Purview catalog permissions](catalog-permissions.md) and reach out to your collection admin for access.
-
-### Can't create shares or edit shares
-
-* You don't have permission to the data store where you want to share data from. Check the [prerequisites](#prerequisites-to-share-data) for required data store permissions.
-
-### Can't view list of shares in the storage account asset
-
 * You don't have enough permissions on the data store that you want to see shares of. You need a minimum of the **Reader** role on the source storage account to see a read-only view of sent and received shares. You can find more details on the [ADLS Gen2](register-scan-adls-gen2.md#data-sharing) or [Blob storage](register-scan-azure-blob-storage-source.md#data-sharing) data source page.
- * Review [storage account prerequisites](#azure-storage-account-prerequisites) and make sure your storage account region, performance, and redundancy options are all supported.
-
-## Next steps
-
-* [How to receive share](how-to-receive-share.md)
-* [FAQ for data sharing](how-to-data-share-faq.md)
-* [REST API reference](/rest/api/purview/)
-* Discover share assets in Catalog
-* See data sharing lineage
purview How To Use Workflow Connectors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-use-workflow-connectors.md
- Title: Workflow connectors and actions
-description: This article describes how to use connectors and actions in Microsoft Purview workflows
- Previously updated: 05/15/2023
-# Workflow connectors and actions
--
-You can use [workflows](concept-workflow.md) to automate some business processes through Microsoft Purview. A connector in a workflow provides a way to connect to different systems and use a set of prebuilt actions and triggers.
-
-## Current workflow connectors and actions
-
-Currently, the following connectors are available for workflows in Microsoft Purview:
-
-|Connector Type |Functionality |Parameters |Customizable |Workflow templates |
-||||||
-|Apply to each |Apply an action or set of actions to all returned values in an output. | -Output to process <br> -Actions|- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow|All workflows templates|
-|Check data source registration for data use governance |Validate if data source has been registered with Data Use Management enabled. |None | - Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |Data access request |
-|Condition |Evaluate a value to true or false. Based on the evaluation, the workflow is redirected to different branches. |- Add row <br> - Title <br> - Add group |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |All workflows templates |
-|Create Glossary Term |Create a new glossary term |None |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |Create glossary term template |
-|Create task and wait for task completion |Creates, assigns, and tracks a task to a user or Azure Active Directory group as part of a workflow. <br> - Reminder settings - You can set reminders to periodically remind the task owner until they complete the task. <br> - Expiry settings - You can set an expiration or deadline for the task activity. You can also set who needs to be notified (user/AAD group) after the expiry. |- Assigned to <br> - Task title <br> - Task body |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |All workflows templates |
-|Delete glossary term |Delete an existing glossary term |None |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |Delete glossary term |
-|Grant access |Create an access policy to grant access to the requested user. |None |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |Data access request |
-|Http |Integrate with external applications through http or https call. <br> For more information, see [Workflows HTTP connector](how-to-use-workflow-http-connector.md) |- Host <br> - Method <br> - Path <br> - Headers <br> - Queries <br> - Body <br> - Authentication |- Renamable: Yes <br> - Deletable: Yes <br> - Settings: Secured Input and Secure outputs (Enabled by default) <br> - Multiple per workflow |All workflows templates |
-|Import glossary terms |Import one or more glossary terms |None |- Renamable: Yes <br> - Deletable: No <br> - Multiple per workflow |Import terms |
-|Parse JSON |Parse an incoming JSON to extract parameters |- Content <br> - Schema <br> |- Renamable: Yes <br> - Deletable: No <br> - Multiple per workflow |All workflows templates |
-|Send email notification |Send email notification to one or more recipients |- Subject <br> - Message body <br> - Recipient |- Renamable: Yes <br> - Deletable: Yes <br> - Settings: Secured Input and Secure outputs (Enabled by default) <br> - Multiple per workflow |All workflows templates |
-|Start and wait for an approval |Generates approval requests and assigns the requests to individual users or Microsoft Azure Active Directory groups. The Microsoft Purview workflow approval connector currently supports two approval types: <br> - First to Respond - The first approver's outcome (Approve/Reject) is considered final. <br> - Everyone must approve - Everyone identified as an approver must approve the request for the request to be considered approved. If one approver rejects the request, regardless of other approvers, the request is rejected. <br> - Reminder settings - You can set reminders to periodically remind the approver until they approve or reject. <br> - Expiry settings - You can set an expiration or deadline for the approval activity. You can also set who needs to be notified (user/AAD group) after the expiry. |- Approval Type <br> - Title <br> - Assigned To |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |All workflows templates |
-|Update glossary term |Update an existing glossary term |None |- Renamable: Yes <br> - Deletable: Yes <br> - Multiple per workflow |Update glossary term |
-|When term creation request is submitted |Triggers a workflow with all term details when a new term request is submitted |None |- Renamable: Yes <br> - Deletable: No <br> - Only one per workflow |Create glossary term template |
-|When term deletion request is submitted |Triggers a workflow with all term details when a request to delete an existing term is submitted |None |- Renamable: Yes <br> - Deletable: No <br> - Only one per workflow |Delete glossary term |
-|When term Import request is submitted |Triggers a workflow with all term details in a csv file, when a request to import terms is submitted |None |- Renamable: Yes <br> - Deletable: No <br> - Only one per workflow |Import terms |
-|When term update request is submitted |Triggers a workflow with all term details when a request to update an existing term is submitted |None |- Renamable: Yes <br> - Deletable: No <br> - Only one per workflow |Update glossary term |
-
-## Next steps
-
-For more information about workflows, see these articles:
-
-- [Workflows in Microsoft Purview](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Use Workflow Dynamic Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-use-workflow-dynamic-content.md
- Title: Workflow dynamic content
-description: This article describes how to use dynamic content to create expressions with built-in variables and functions in Microsoft Purview workflows.
- Previously updated: 03/09/2023
-# Workflow dynamic content
--
-You can use dynamic content inside Microsoft Purview workflows to associate certain variables in the workflow or add other expressions to process these values.
-
-When you add dynamic content to your workflows, you're building expressions from provided building blocks that reference and process information in your workflow so you can get the values you need in real time.
-
-In the dynamic content menu, the currently available options are:
-
-* [Built-in variables](#built-in-variables) - variables that represent values coming into the workflow from the items that triggered it.
-* [Expressions](#expressions) - formulas built from functions and variables that can process values in-workflow.
-
-## Built-in variables
-
-Currently, the following variables are available for a workflow connector in Microsoft Purview:
-
-|Prerequisite connector |Built-in variable |Functionality | Type | Possible Values |
-||||||
-|When data access request is submitted |Workflow.Requestor |The requestor of the workflow |string||
-| |Workflow.Request Recipient |The request recipient of the workflow |string||
-| |Asset.Name |The name of the asset |string||
-| |Asset.Description |The description of the asset |string||
-| |Asset.Type |The type of the asset |string||
-| |Asset.Fully Qualified Name |The fully qualified name of the asset |string||
-| |Asset.Owner |The owner of the asset |array of strings||
-| |Asset.Classification |The display names of classifications of the asset |array of strings||
-| |Asset.Certified |The indicator of whether the asset meets your organization's quality standards and can be regarded as reliable |string|'true' or 'false'|
-|Start and wait for an approval |Approval.Outcome |The outcome of the approval |string|'Approved' or 'Rejected'|
-| |Approval.Assigned To |The IDs of the approvers |array of strings||
-| |Approval.Comments |The IDs of the approvers and their comments |string||
-|Check data source registration for data use governance |Data Use Governance |The result of the data use governance check|string|'true' or 'false'|
-|When term creation request is submitted |Workflow.Requestor |The requestor of the workflow |string||
-| |Term.Name |The name of the term |string||
-| |Term.Formal Name |The formal name of the term |string||
-| |Term.Definition |The definition of the term |string||
-| |Term.Experts |The experts of the term |array of strings||
-| |Term.Stewards |The stewards of the term |array of strings||
-| |Term.Parent.Name |The name of parent term if exists |string||
-| |Term.Parent.Formal Name |The formal name of parent term if exists |string||
-|When term update request is submitted <br> When term deletion request is submitted | Workflow.Requestor |The requestor of the workflow |string||
-| |Term.Name |The name of the term |string||
-| |Term.Formal Name |The formal name of the term |string||
-| |Term.Definition |The definition of the term |string||
-| |Term.Experts |The experts of the term |array of strings||
-| |Term.Stewards |The stewards of the term |array of strings||
-| |Term.Parent.Name |The name of parent term if exists |string||
-| |Term.Parent.Formal Name |The formal name of parent term if exists |string||
-| |Term.Created By |The creator of the term |string||
-| |Term.Last Updated By |The last updater of the term |string||
-|When term import request is submitted |Workflow.Requestor |The requestor of the workflow |string||
-| |Import File.Name |The name of the file to import |string||
-
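As a hypothetical illustration (the variables' display names stand in for the underlying dynamic-content tokens, and the message is made up), a **Send email notification** body could combine several of the variables above using the interpolated expression format described in the next section:

```
Access request from @{Workflow.Requestor} for asset "@{Asset.Name}" (type: @{Asset.Type}).
```

In practice you would insert each variable from the dynamic content menu rather than typing its display name by hand.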
-## Expressions
-
-Workflow definitions in Microsoft Purview allow you to use functions in your expressions to process values in your workflows.
-
-To find functions [based on their general purpose](#ordered-by-purpose), review the following tables. Or, for detailed information about each function, see the [alphabetical list](#alphabetical-list).
-
-When you're building a workflow and want to add a function to your expressions, follow these steps:
-
-1. Select the value you're going to edit.
-1. Select the **Add dynamic content** button that appears underneath the textbox.
-1. Select the **Expressions** tab in the dynamic content window and scroll to select your value.
-1. Update your expression and select **OK** to add it.
--
-### Considerations
-
-* Function parameters are evaluated from left to right.
-
-* Functions that appear inline with plain text require enclosing curly braces ({}) to use the expression's interpolated format instead. This format helps avoid parsing problems. If your function expression doesn't appear inline with plain text, no curly braces are necessary.
-
- The following example shows the correct and incorrect syntax:
-
- **Correct**: `"<text>/@{<function-name>('<parameter-name>')}/<text>"`
-
- **Incorrect**: `"<text>/@<function-name>('<parameter-name>')/<text>"`
-
- **OK**: `"@<function-name>('<parameter-name>')"`
-
-The following sections organize functions based on their [general purpose](#ordered-by-purpose), or you can browse these functions in [alphabetical order](#alphabetical-list).
-
-<a name="ordered-by-purpose"></a>
-<a name="string-functions"></a>
-
-## String functions
-
-To work with strings, you can use these string functions and also some [collection functions](#collection-functions). String functions work only on strings.
-
-| String function | Task |
-| | - |
-| [endsWith](#endswith) | Check whether a string ends with the specified substring. |
-| [startsWith](#startswith) | Check whether a string starts with a specific substring. |
-|||
-
-<a name="collection-functions"></a>
-
-## Collection functions
-
-To work with collections (generally arrays, strings, and sometimes dictionaries), you can use these collection functions.
-
-| Collection function | Task |
-| - | - |
-| [contains](#contains) | Check whether a collection has a specific item. |
-|||
-
-<a name="comparison-functions"></a>
-
-## Logical comparison functions
-
-To work with conditions, compare values and expression results, or evaluate various kinds of logic, you can use these logical comparison functions. For the full reference about each function, see the [alphabetical list](../logic-apps/workflow-definition-language-functions-reference.md#alphabetical-list).
-
-| Logical comparison function | Task |
-| | - |
-| [equals](#equals) | Check whether both values are equivalent. |
-| [greater](#greater) | Check whether the first value is greater than the second value. |
-| [greaterOrEquals](#greaterOrEquals) | Check whether the first value is greater than or equal to the second value. |
-| [less](#less) | Check whether the first value is less than the second value. |
-| [lessOrEquals](#lessOrEquals) | Check whether the first value is less than or equal to the second value. |
-|||
-
-
-<a name="alphabetical-list"></a>
-
-## All functions - alphabetical list
-
-<a name="contains"></a>
-
-### contains
-
-Check whether a collection has a specific item. Return true when the item is found, or return false when not found. This function is case-sensitive.
-
-```
-contains('<collection>', '<value>')
-contains([<collection>], '<value>')
-```
-
-Specifically, this function works on these collection types:
-
-* A *string* to find a *substring*
-* An *array* to find a *value*
-* A *dictionary* to find a *key*
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*collection*> | Yes | String, Array, or Dictionary | The collection to check |
-| <*value*> | Yes | String, Array, or Dictionary, respectively | The item to find |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the item is found. Return false when not found. |
-||||
-
-*Example 1*
-
-This example checks the string "hello world" for
-the substring "world" and returns true:
-
-```
-contains('hello world', 'world')
-```
-
-*Example 2*
-
-This example checks the string "hello world" for
-the substring "universe" and returns false:
-
-```
-contains('hello world', 'universe')
-```
-
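The three collection behaviors above can be sketched in Python. This is an illustrative model only, not the workflow engine itself; Python's `in` operator happens to match all three cases (substring in a string, value in an array, key in a dictionary) and is likewise case-sensitive:

```python
def wf_contains(collection, value):
    """Rough model of the workflow `contains` function (case-sensitive):
    string -> substring, array -> value, dictionary -> key."""
    return value in collection

# Mirrors the documented examples:
print(wf_contains('hello world', 'world'))     # True
print(wf_contains('hello world', 'universe'))  # False
```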
-<a name="endswith"></a>
-
-### endsWith
-
-Check whether a string ends with a specific substring.
-Return true when the substring is found, or return false when not found.
-This function isn't case-sensitive.
-
-```
-endsWith('<text>', '<searchText>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*text*> | Yes | String | The string to check |
-| <*searchText*> | Yes | String | The ending substring to find |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the ending substring is found. Return false when not found. |
-||||
-
-*Example 1*
-
-This example checks whether the "hello world"
-string ends with the "world" string:
-
-```
-endsWith('hello world', 'world')
-```
-
-And returns this result: `true`
-
-*Example 2*
-
-This example checks whether the "hello world"
-string ends with the "universe" string:
-
-```
-endsWith('hello world', 'universe')
-```
-
-And returns this result: `false`
-
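Because the workflow function is case-insensitive, a minimal Python sketch (illustrative only, not the engine implementation) normalizes both sides before comparing:

```python
def wf_ends_with(text, search_text):
    # Case-insensitive, like the workflow endsWith function.
    return text.lower().endswith(search_text.lower())

print(wf_ends_with('hello world', 'WORLD'))     # True
print(wf_ends_with('hello world', 'universe'))  # False
```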
-<a name="equals"></a>
-
-### equals
-
-Check whether both values, expressions, or objects are equivalent.
-Return true when both are equivalent, or return false when they're not equivalent.
-
-```
-equals('<object1>', '<object2>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*object1*>, <*object2*> | Yes | Various | The values, expressions, or objects to compare |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when both are equivalent. Return false when not equivalent. |
-||||
-
-*Example*
-
-These examples check whether the specified inputs are equivalent.
-
-```
-equals(true, 1)
-equals('abc', 'abcd')
-```
-
-And return these results:
-
-* First example: Both values are equivalent, so the function returns `true`.
-* Second example: Both values aren't equivalent, so the function returns `false`.
-
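As a rough Python model of these two examples (Python's `==` also treats `True` and `1` as equal, which happens to match the first example; the engine's coercion rules for other mixed types may differ):

```python
def wf_equals(a, b):
    # Loose equality sketch of the workflow equals function.
    return a == b

print(wf_equals(True, 1))       # True
print(wf_equals('abc', 'abcd')) # False
```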
-### greater
-
-Check whether the first value is greater than the second value.
-Return true when the first value is more,
-or return false when less.
-
-```
-greater(<value>, <compareTo>)
-greater('<value>', '<compareTo>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*value*> | Yes | Integer, Float, or String | The first value to check whether greater than the second value |
-| <*compareTo*> | Yes | Integer, Float, or String, respectively | The comparison value |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the first value is greater than the second value. Return false when the first value is equal to or less than the second value. |
-||||
-
-*Example*
-
-These examples check whether the first value is greater than the second value:
-
-```
-greater(10, 5)
-greater('apple', 'banana')
-```
-
-And return these results:
-
-* First example: `true`
-* Second example: `false`
-
-<a name="greaterOrEquals"></a>
-
-### greaterOrEquals
-
-Check whether the first value is greater than or equal to the second value.
-Return true when the first value is greater or equal,
-or return false when the first value is less.
-
-```
-greaterOrEquals(<value>, <compareTo>)
-greaterOrEquals('<value>', '<compareTo>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*value*> | Yes | Integer, Float, or String | The first value to check whether greater than or equal to the second value |
-| <*compareTo*> | Yes | Integer, Float, or String, respectively | The comparison value |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the first value is greater than or equal to the second value. Return false when the first value is less than the second value. |
-||||
-
-*Example*
-
-These examples check whether the first value is greater than or equal to the second value:
-
-```
-greaterOrEquals(5, 5)
-greaterOrEquals('apple', 'banana')
-```
-
-And return these results:
-
-* First example: `true`
-* Second example: `false`
-
-<a name="less"></a>
-
-### less
-
-Check whether the first value is less than the second value.
-Return true when the first value is less,
-or return false when the first value is more.
-
-```
-less(<value>, <compareTo>)
-less('<value>', '<compareTo>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*value*> | Yes | Integer, Float, or String | The first value to check whether less than the second value |
-| <*compareTo*> | Yes | Integer, Float, or String, respectively | The comparison item |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the first value is less than the second value. Return false when the first value is equal to or greater than the second value. |
-||||
-
-*Example*
-
-These examples check whether the first value is less than the second value.
-
-```
-less(5, 10)
-less('banana', 'apple')
-```
-
-And return these results:
-
-* First example: `true`
-* Second example: `false`
-
-<a name="lessOrEquals"></a>
-
-### lessOrEquals
-
-Check whether the first value is less than or equal to the second value.
-Return true when the first value is less than or equal,
-or return false when the first value is more.
-
-```
-lessOrEquals(<value>, <compareTo>)
-lessOrEquals('<value>', '<compareTo>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*value*> | Yes | Integer, Float, or String | The first value to check whether less than or equal to the second value |
-| <*compareTo*> | Yes | Integer, Float, or String, respectively | The comparison item |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the first value is less than or equal to the second value. Return false when the first value is greater than the second value. |
-||||
-
-*Example*
-
-These examples check whether the first value is less than or equal to the second value.
-
-```
-lessOrEquals(10, 10)
-lessOrEquals('apply', 'apple')
-```
-
-And return these results:
-
-* First example: `true`
-* Second example: `false`
-
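The four comparison functions above behave like ordinary ordered comparisons, with strings compared lexicographically. A rough Python model (illustrative only, not the engine):

```python
import operator

# Map workflow comparison function names to Python's comparison operators.
comparisons = {
    'greater': operator.gt,
    'greaterOrEquals': operator.ge,
    'less': operator.lt,
    'lessOrEquals': operator.le,
}

print(comparisons['greater'](10, 5))                  # True
print(comparisons['greaterOrEquals'](5, 5))           # True
print(comparisons['less']('banana', 'apple'))         # False: 'b' > 'a'
print(comparisons['lessOrEquals']('apply', 'apple'))  # False: 'y' > 'e'
```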
-### startsWith
-
-Check whether a string starts with a specific substring. Return true when the substring is found, or return false when not found. This function isn't case-sensitive.
-
-```
-startsWith('<text>', '<searchText>')
-```
-
-| Parameter | Required | Type | Description |
-| | -- | - | -- |
-| <*text*> | Yes | String | The string to check |
-| <*searchText*> | Yes | String | The starting string to find |
-|||||
-
-| Return value | Type | Description |
-| | - | -- |
-| true or false | Boolean | Return true when the starting substring is found. Return false when not found. |
-||||
-
-*Example 1*
-
-This example checks whether the "hello world" string starts with the "hello" substring:
-
-```
-startsWith('hello world', 'hello')
-```
-
-And returns this result: `true`
-
-*Example 2*
-
-This example checks whether the "hello world" string starts with the "greetings" substring:
-
-```
-startsWith('hello world', 'greetings')
-```
-
-And returns this result: `false`
-
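Like endsWith, this function is case-insensitive, so a minimal Python sketch (illustrative only) normalizes both strings first:

```python
def wf_starts_with(text, search_text):
    # Case-insensitive, like the workflow startsWith function.
    return text.lower().startswith(search_text.lower())

print(wf_starts_with('hello world', 'Hello'))     # True
print(wf_starts_with('hello world', 'greetings')) # False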
-## Next steps
-
-For more information about workflows, see these articles:
-
-- [Workflows in Microsoft Purview](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Use Workflow Http Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-use-workflow-http-connector.md
- Title: Use external APIs in Workflows
-description: Work with external APIs using HTTP connector and Parse JSON in Microsoft Purview workflows.
- Previously updated: 05/16/2023
-# Use external APIs in Workflows
--
-You can use [workflows](concept-workflow.md) to automate some business processes through Microsoft Purview, and the [HTTP connector](#http-connector) and [parse JSON action](#parse-json-action) allow you to integrate your workflows with external applications.
-
-## HTTP connector
-
-HTTP connectors use Representational State Transfer (REST) architecture, which allows Microsoft Purview workflows to interact directly with third-party applications by using web requests.
-
-The HTTP connector is available in all workflow templates.
-
->[!NOTE]
-> To create or edit a workflow, you need the [workflow admin role](catalog-permissions.md) in Microsoft Purview. You can also contact the workflow admin in your collection, or reach out to your collection administrator, for permissions.
-
-1. To add an HTTP connector, select the **+** icon in the template where you want to add it, and then select **HTTP connector**.
-
- :::image type="content" source="./media/how-to-use-workflow-http-connector/add-http-connector.png" alt-text="Screenshot of how to add HTTP connector.":::
-
-1. Once you select HTTP connector, you see the following parameters:
- 1. Host - The request URL you want to call when this connector is executed.
- 1. Method - Select one of the following methods: GET, PUT, PATCH, POST, or DELETE. These methods correspond to create, read, update, and delete operations.
- 1. Path - Optionally, you can enter a request URL path. You can use dynamic content for this parameter.
- 1. Headers - Optionally, you can enter HTTP headers. HTTP headers let the client and the server pass additional information with an HTTP request or response.
- 1. Queries - Optionally, you can pass query parameters.
- 1. Body - Optionally, you can pass an HTTP body while invoking the URL.
- 1. Authentication - The HTTP connector is integrated with Microsoft Purview credentials. Depending on the URL, you may invoke the endpoint with None (no authentication), or you can use credentials to create a basic authentication. To learn more about credentials, see the [Microsoft Purview credentials article](manage-credentials.md).
-
- :::image type="content" source="./media/how-to-use-workflow-http-connector/add-http-properties.png" alt-text="Screenshot of how to add HTTP connector properties.":::
-
-1. By default, secure settings are turned on for HTTP connectors. To turn off secure inputs and outputs, select the ellipsis icon (**...**) to go to settings.
-
- :::image type="content" source="./media/how-to-use-workflow-http-connector/add-http-settings.png" alt-text="Screenshot of how to add HTTP connector settings.":::
-
-1. You're now presented with the settings for the HTTP connector, where you can turn secure inputs and outputs off.
-
- :::image type="content" source="./media/how-to-use-workflow-http-connector/add-http-secure.png" alt-text="Screenshot of how to add HTTP connector secure input and outputs.":::
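For reference, the request an HTTP connector issues can be sketched with ordinary HTTP tooling. This Python sketch assembles a POST from the same parameters the connector exposes; the host, path, query, and credentials are hypothetical stand-ins, not real Purview values:

```python
import base64
import json
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical values standing in for the connector's Host, Path, and Queries parameters.
host = "https://example.com"
path = "/api/notify"
queries = {"assetId": "1234"}

# Body: the JSON payload passed while invoking the URL.
body = json.dumps({"status": "approved"}).encode("utf-8")

# Basic authentication header, as a credential-backed connector would send it.
token = base64.b64encode(b"user:secret").decode("ascii")
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Basic {token}",
}

# Build (but don't send) the request, mirroring Method + Host + Path + Queries.
req = Request(
    f"{host}{path}?{urlencode(queries)}",
    data=body,
    headers=headers,
    method="POST",
)
print(req.method, req.full_url)
```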
-
-## Parse JSON action
-
-The parse JSON action in workflows allows you to take incoming JSON from HTTP (or any other action or connector) and parse it to extract values for use in your workflow.
-
-The parse JSON action is available in all workflows.
--
-The parse JSON action has two parameters:
-- Content - a variable that should contain the JSON you want to parse.
-- Schema - the schema of the incoming JSON, which allows the workflow to parse the incoming information. You can supply your own, or use the **Generate from sample** button. If you generate from a sample, you'll enter a sample JSON payload and a schema will be automatically generated for you.
-Actions and connectors in the workflow after the parse JSON action will be able to use the values extracted from the JSON by selecting **Add dynamic content** for any parameters.
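Conceptually, this is ordinary JSON deserialization. A minimal Python sketch, using a hypothetical payload like one you might paste into **Generate from sample**:

```python
import json

# A hypothetical incoming payload, like one returned by an HTTP connector.
raw = '{"approver": "user@contoso.com", "status": "Approved", "assetCount": 3}'

# Parsing exposes the individual values, which later workflow steps can
# reference as dynamic content.
parsed = json.loads(raw)
print(parsed["approver"], parsed["status"], parsed["assetCount"])
```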
---
-## Next steps
-
-For more information about workflows, see these articles:
-- [Workflows in Microsoft Purview](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To View Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-view-self-service-data-access-policy.md
- Title: View self-service policies
-description: This article describes how to view autogenerated self-service access policies
- Previously updated : 03/23/2023
-# How to view self-service data access policies
-
-In a Microsoft Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [Data Use Management enabled](how-to-enable-data-use-management.md), a self-service policy is generated when a data access request is approved.
-
-This article describes how to view self-service data access policies that have been autogenerated by approved access requests.
-
-## Prerequisites
-
-> [!IMPORTANT]
-> To view self-service policies, make sure that the following prerequisites are completed.
-
-Self-service policies must exist for them to be viewed. To enable and create self-service policies, follow these articles:
-
-1. [Enable Data Use Management](how-to-enable-data-use-management.md) - this will allow Microsoft Purview to create policies for your sources.
-1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Microsoft Purview](how-to-request-access.md).
-1. [Approve a self-service data access request](how-to-workflow-manage-requests-approvals.md#approvals) - after approving a request, if your workflow from the previous step includes the ability to create a self-service data policy, your policy will be created and will be viewable.
-
-## Permission
-
-Only the creator of your Microsoft Purview account, or users with [**Policy Admin**](catalog-permissions.md#roles) permissions, can view self-service data access policies.
-
-If you need to add or request permissions, follow the [Microsoft Purview permissions documentation](catalog-permissions.md#add-users-to-roles).
-
-## Steps to view self-service data access policies
-
-1. Launch the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), either from your Microsoft Purview account as shown below or by using the URL directly.
-
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-launch-pic-1.png" alt-text="Screenshot showing a Microsoft Purview account open in the Azure portal, with the Microsoft Purview governance portal button highlighted.":::
-
-1. Select the policy management tab to open the self-service access policies.
-
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Microsoft Purview governance portal with the leftmost menu open, and the Data policy page option highlighted.":::
-
-1. Open the self-service access policies tab.
-
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Microsoft Purview governance portal open to the Data policy page with self-service access policies highlighted.":::
-
-1. Here you'll see all your policies. The policies can be sorted and filtered by any of the displayed columns to improve your search.
-
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-4.png" alt-text="Screenshot showing the self-service access policies page, with an active filter highlighted next to the keyword filter textbox, and the date created column header selected to sort by that column.":::
-
-## Next steps
-- [Self-service data access policies](./concept-self-service-data-access-policy.md)
-- [How to delete self-service access policies](how-to-delete-self-service-data-access-policy.md)
purview How To Workflow Asset Curation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-asset-curation.md
- Title: Asset curation approval workflow
-description: This article describes how to create and manage workflows to approve data asset curation in Microsoft Purview.
- Previously updated : 01/03/2023
-# Approval workflow for asset curation
--
-This guide will take you through the creation and management of approval workflows for asset curation.
-
-## Create and enable a new approval workflow for asset curation
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/workflow-section.png" alt-text="Screenshot showing the management center left menu with the new workflow section highlighted.":::
-
-1. To create new workflows, select **Authoring** in the workflow section. This will take you to the workflow authoring experience.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/workflow-authoring-experience.png" alt-text="Screenshot showing the authoring workflows page, showing a list of all workflows.":::
-
- >[!NOTE]
- >If the authoring tab is greyed out, you don't have the permissions to be able to author workflows. You'll need the [workflow admin role](catalog-permissions.md).
-
-1. To create a new workflow, select the **+New** button.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/workflow-authoring-select-new.png" alt-text="Screenshot showing the authoring workflows page, with the plus sign New button highlighted.":::
-
-1. To create an approval workflow for asset curation, select **Data Catalog**, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/select-data-catalog.png" alt-text="Screenshot showing the new workflows menu, with Data Catalog selected.":::
-
-1. In the next screen, you'll see all the templates provided by Microsoft Purview to create a workflow. Select the template you want to start your authoring experience from, and then select **Continue**. Each of these templates specifies the kind of action that will trigger the workflow. In the screenshot below, we've selected **Update asset attributes** to create an approval workflow for asset updates.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/update-asset-attributes-continue.png" alt-text="Screenshot showing the new data catalog workflow menu, showing template options, with the Continue button selected." lightbox="./media/how-to-workflow-asset-curation/update-asset-attributes-continue.png":::
-
-1. Next, enter a workflow name and optionally add a description. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/name-and-continue.png" alt-text="Screenshot showing the new data catalog workflow menu with a name entered into the name textbox.":::
-
-1. You'll now be presented with a canvas where the selected template is loaded by default.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/workflow-authoring-canvas-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with the selected template workflow populated in the central workspace." lightbox="./media/how-to-workflow-asset-curation/workflow-authoring-canvas-inline.png":::
-
-1. The default template can be used as it is by populating the approver's email address in the **Start and Wait for approval** connector.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/add-approver-email-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with the start and wait for an approval step opened, and the Assigned to textbox highlighted." lightbox="./media/how-to-workflow-asset-curation/add-approver-email-inline.png":::
-
- The default template has the following steps:
- 1. Trigger when an asset is updated. The update can be made in the overview, schema, or contacts tab.
- 1. Approval connector that specifies a user or group that will be contacted to approve the request.
- 1. Condition to check the approval status:
-    - If approved:
-      1. Update the asset in the Purview data catalog.
-      1. Send an email to the requestor that their request is approved and the asset update operation is successful.
-    - If rejected:
-      1. Send an email to the requestor that their asset update request is denied.
-
-1. You can also modify the template by adding more connectors to suit your organizational needs. Add a new step to the end of the template by selecting the **New step** button. Add a step between existing steps by selecting the arrow icon between them.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/modify-template-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with a plus sign button highlighted on the arrow between the two top steps, and the Next Step button highlighted at the bottom of the workspace." lightbox="./media/how-to-workflow-asset-curation/modify-template-inline.png":::
-
-1. Once you're done defining a workflow, you need to bind the workflow to a collection hierarchy path. The binding implies that this workflow is triggered only for update operations on data assets in that collection. A workflow can be bound to only one hierarchy path. To bind a workflow, or to apply a scope to a workflow, select **Apply workflow**. Select the scopes you want this workflow to be associated with and select **OK**.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/select-apply-workflow.png" alt-text="Screenshot showing the new data catalog workflow menu with the Apply Workflow button highlighted at the top of the workspace." lightbox="./media/how-to-workflow-asset-curation/select-apply-workflow.png":::
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/select-okay.png" alt-text="Screenshot showing the apply workflow window, showing a list of items that the workflow can be applied to. At the bottom of the window, the O K button is selected." lightbox="./media/how-to-workflow-asset-curation/select-okay.png":::
-
- >[!NOTE]
- > The Microsoft Purview workflow engine will always resolve to the closest workflow that the collection hierarchy path is associated with. In case a direct binding is not found, it will traverse up in the tree to find the workflow associated with the closest parent in the collection tree.
-
-1. By default, the workflow will be enabled. To disable, toggle the Enable button in the top menu.
-
-1. Finally, select **Save and close** to create and enable the workflow.
-
- :::image type="content" source="./media/how-to-workflow-asset-curation/workflow-enabled.png" alt-text="Screenshot showing the workflow authoring page, showing the newly created workflow listed among all other workflows." lightbox="./media/how-to-workflow-asset-curation/workflow-enabled.png":::
-
-## Edit an existing workflow
-
-To modify an existing workflow, select the workflow and then select **Edit** in the top menu. You'll then be presented with the canvas containing the workflow definition. Modify the workflow and select **Save** to commit your changes.
--
-## Disable a workflow
-
-To disable a workflow, select the workflow and then select **Disable** in the top menu. You can also disable the workflow by selecting **Edit** and changing the enable toggle in workflow canvas.
--
-## Delete a workflow
-
-To delete a workflow, select the workflow and then select **Delete** in the top menu.
--
-## Limitations for asset curation with approval workflow enabled
-- Lineage updates are stored directly in the Purview data catalog without any approvals.
-## Next steps
-
-For more information about workflows, see these articles:
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Workflow Business Terms Approval https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-business-terms-approval.md
- Title: Business terms approval workflow
-description: This article describes how to create and manage workflows to approve business terms in Microsoft Purview.
- Previously updated : 02/20/2023
-# Approval workflow for business terms
--
-This guide will take you through the creation and management of approval workflows for business terms.
-
-## Create and enable a new approval workflow for business terms
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-section.png" alt-text="Screenshot showing the management center left menu with the new workflow section highlighted.":::
-
-1. To create new workflows, select **Authoring** in the workflow section. This will take you to the workflow authoring experience.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-authoring-experience.png" alt-text="Screenshot showing the authoring workflows page, showing a list of all workflows.":::
-
- >[!NOTE]
- >If the authoring tab is greyed out, you don't have the permissions to be able to author workflows. You'll need the [workflow admin role](catalog-permissions.md).
-
-1. To create a new workflow, select the **+New** button.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-authoring-select-new.png" alt-text="Screenshot showing the authoring workflows page, with the + New button highlighted.":::
-
-1. To create an approval workflow for business terms, select **Data Catalog**, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/select-data-catalog.png" alt-text="Screenshot showing the new workflows menu, with Data Catalog selected.":::
-
-1. In the next screen, you'll see all the templates provided by Microsoft Purview to create a workflow. Select the template you want to start your authoring experience from, and then select **Continue**. Each of these templates specifies the kind of action that will trigger the workflow. In the screenshot below, we've selected **Create glossary term**. The four templates available for the business glossary are:
- * Create glossary term - when a term is created, approval will be requested.
- * Update glossary term - when a term is updated, approval will be requested.
- * Delete glossary term - when a term is deleted, approval will be requested.
- * Import terms - when terms are imported, approval will be requested.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/create-glossary-term-select-continue.png" alt-text="Screenshot showing the new data catalog workflow menu, showing template options, with the Continue button selected.":::
-
-1. Next, enter a workflow name and optionally add a description. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/name-and-continue.png" alt-text="Screenshot showing the new data catalog workflow menu with a name entered into the name textbox.":::
-
-1. You'll now be presented with a canvas where the selected template is loaded by default.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-authoring-canvas-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with the selected template workflow populated in the central workspace." lightbox="./media/how-to-workflow-business-terms-approval/workflow-authoring-canvas-expanded.png":::
-
-1. The default template can be used as it is by populating the approver's email address in the **Start and Wait for approval** connector.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/add-approver-email-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with the start and wait for an approval step opened, and the Assigned to textbox highlighted." lightbox="./media/how-to-workflow-business-terms-approval/add-approver-email-expanded.png":::
-
- The default template has the following steps:
- 1. Trigger when a glossary term is created/updated/deleted/imported, depending on the template selected.
- 1. Approval connector that specifies a user or group that will be contacted to approve the request.
- 1. Condition to check the approval status:
-    - If approved:
-      1. Create/update/delete/import the glossary term.
-      1. Send an email to the requestor that their request is approved and the term CUD (create, update, delete) operation is successful.
-    - If rejected:
-      1. Send an email to the requestor that their request is denied.
-
-1. You can also modify the template by adding more connectors to suit your organizational needs. Add a new step to the end of the template by selecting the **New step** button. Add a step between existing steps by selecting the arrow icon between them.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/modify-template-inline.png" alt-text="Screenshot showing the workflow authoring canvas, with a + button highlighted on the arrow between the two top steps, and the Next Step button highlighted at the bottom of the workspace." lightbox="./media/how-to-workflow-business-terms-approval/modify-template-expanded.png":::
-
-1. Once you're done defining a workflow, you need to bind the workflow to a glossary hierarchy path. The binding implies that this workflow is triggered only for CUD operations within the specified glossary hierarchy path. A workflow can be bound to only one hierarchy path. To bind a workflow, or to apply a scope to a workflow, select **Apply workflow**. Select the scopes you want this workflow to be associated with and select **OK**.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/select-apply-workflow.png" alt-text="Screenshot showing the new data catalog workflow menu with the Apply Workflow button highlighted at the top of the workspace.":::
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/select-okay.png" alt-text="Screenshot showing the apply workflow window, showing a list of items that the workflow can be applied to. At the bottom of the window, the O K button is selected.":::
-
- >[!NOTE]
- > - The Microsoft Purview workflow engine will always resolve to the closest workflow that the term hierarchy path is associated with. In case a direct binding is not found, it will traverse up in the tree to find the workflow associated with the closest parent in the glossary tree.
- > - Import terms can only be bound to the root glossary path, because the .CSV file can contain terms from different hierarchy paths.
-
-1. By default, the workflow will be enabled. To disable, toggle the Enable button in the top menu.
-
-1. Finally, select **Save and close** to create and enable the workflow.
-
- :::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-enabled.png" alt-text="Screenshot showing the workflow authoring page, showing the newly created workflow listed among all other workflows.":::
-
-## Edit an existing workflow
-
-To modify an existing workflow, select the workflow and then select **Edit** in the top menu. You'll then be presented with the canvas containing the workflow definition. Modify the workflow and select **Save** to commit your changes.
--
-## Disable a workflow
-
-To disable a workflow, select the workflow and then select **Disable** in the top menu. You can also disable the workflow by selecting **Edit** and changing the enable toggle in workflow canvas.
--
-## Delete a workflow
-
-To delete a workflow, select the workflow and then select **Delete** in the top menu.
--
-## Limitations for business terms with approval workflow enabled
-
-* Non-approved glossary terms aren't saved in the Purview catalog.
-* The behavior of tagging terms to assets/schemas is the same as before. That is, previously created draft terms can still be tagged to assets/schemas.
-
-## Next steps
-
-For more information about workflows, see these articles:
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Workflow Manage Requests Approvals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-requests-approvals.md
- Title: Manage workflow requests and approvals
-description: This article outlines how to manage requests and approvals generated by a workflow in Microsoft Purview.
- Previously updated : 02/20/2023
-# Manage workflow requests and approvals
--
-This article outlines how to manage requests and approvals that are generated by a [workflow](concept-workflow.md) in Microsoft Purview.
-
-To view requests you've made, or requests for approval that have been sent to you by a workflow instance, navigate to the Management center in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), and select **Requests and Approvals**.
---
-You'll be presented with three tabs:
-
-* [Waiting for a response](#waiting-for-a-response) - This tab shows the requests (tasks) and approvals that are waiting for you to act on.
-* [Pending requests](#pending-requests) - You can view all the approval requests and tasks you've submitted in this tab.
-* [History](#history) - All the completed approvals and tasks are moved to this tab.
-
-## Waiting for a response
-
-This tab shows the requests (tasks) and approvals that are waiting for your action.
--
-Select the request to take action.
-
-### Approvals
-
-1. To approve/reject a request, select the request and you'll be presented with the following window:
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/select-request.png" alt-text="Screenshot showing that the Waiting for a response tab is selected, and an open request has been selected. The Respond page is shown with some details, a space for a response, and a space for commentary.":::
-
-1. An approval activity has the following available responses:
- - **Approved** - An approver can mark the response as **Approved**, indicating their approval of the changes proposed by the requestor.
- - **Rejected** - An approver can mark the response as **Rejected**, indicating that they don't approve the changes proposed by the requestor.
-1. Optionally, select the value to see the details of the request. The screenshot below shows the term details. If the approval is for a data asset, you'll be able to view the asset details, as shown below.
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/select-value.png" alt-text="Screenshot showing the respond page is open, with the Value detail highlighted.":::
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/view-details.png" alt-text="Screenshot showing the request value has been selected, so the request details page is open showing an overview of the request, related information, and contacts.":::
-
-1. If there are updates, you'll also be able to see the current value and the proposed value.
-1. Choose your response, optionally add comments, and select **Confirm**.
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/select-option-and-confirm.png" alt-text="Screenshot showing the Respond page is open, with the response section Highlighted, a response selected, and the confirm button highlighted at the bottom.":::
-
-### Tasks
-
-1. To complete a task, select the task request and you'll be presented with the following window:
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/task-request.png" alt-text="Screenshot showing the task selected and the Respond page is open, with details, a status, and a place for comments.":::
-
-1. A task has the following statuses:
- - Not started - This is the status of the task when it's initially created by a workflow.
- - In Progress - A task owner can mark the task as **In progress** to indicate that they're currently working on it.
- - Complete - Once the task is complete, a task owner can change the status to **Complete**. This marks the completion of the task activity, and the workflow will move to the next step.
-
-1. Select the correct status, add any comments, and select **Confirm**.
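The three statuses behave like a small state machine: only **Complete** lets the workflow advance to its next step. A Python sketch (the names are illustrative, not a Purview API):

```python
from enum import Enum

# Illustrative names mirroring the three task statuses described above.
class TaskStatus(Enum):
    NOT_STARTED = "Not started"
    IN_PROGRESS = "In Progress"
    COMPLETE = "Complete"

def workflow_can_advance(status: TaskStatus) -> bool:
    # The workflow moves to its next step only once the task is complete.
    return status is TaskStatus.COMPLETE

print(workflow_can_advance(TaskStatus.COMPLETE))
```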
-
-### Reassign requests
-
-You can reassign requests, both approvals and tasks, that are assigned to you to a different user.
-
-1. To reassign, select the request or task you're assigned to, and then select **Reassign** in the following window.
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-button.png" alt-text="Screenshot showing the task selected and the Respond page is open, with details, a status, and a place for comments and reassign button.":::
-
-1. You'll be presented with a list of all the users who are assigned to the request. Select the **Assignee** entry where your user name, or the group you're part of, is displayed, and change it to the new user name. Select **Save** to complete the reassignment.
-
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-user.png" alt-text="Screenshot showing the request selected and the reassign user.":::
-
- > [!NOTE]
- > You can only reassign your user ID, or a group you are part of, to another user or group. The other assignees will be greyed out and will not be available for reassignment.
-
-## Pending requests
-
-In this tab you can view all the approval requests and tasks that you've submitted.
-
-Select the request to see the status and the outcomes for each approver/task owner.
--
-### Cancel workflows
-
-You can cancel a submitted request and its underlying workflow by selecting **Cancel request and its underlying workflow run**.
--
- > [!NOTE]
- > You can only cancel workflows that are in progress. When you cancel a request from the **requests and approvals** section, it will cancel the underlying workflow run.
--
-## History
-
-All the completed approvals and tasks are moved to this tab.
--
-Select an approval or task to see details and responses from all approvals or task owners.
-
-## Email notifications
-
-Purview approval and task connectors have built-in email capabilities. Every time an approval or task action is triggered in a workflow, it sends an email to all the users who need to act on it.
--
-Users can respond by selecting the links in the email, or by navigating to the Microsoft Purview governance portal and viewing their pending tasks.
-
-## Next steps
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Manage workflow runs](how-to-workflow-manage-runs.md)
purview How To Workflow Manage Runs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-runs.md
- Title: Manage workflow runs
-description: This article outlines how to manage workflow runs.
- Previously updated : 03/23/2023
-# Manage workflow runs
--
-This article outlines how to manage workflows that are already running.
-
-1. To view workflow runs you triggered, sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), select the Management center, and select **Workflow runs**.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/select-workflow-runs.png" alt-text="Screenshot of the management menu in the Microsoft Purview governance portal. The Workflow runs tab is highlighted.":::
-
-1. You'll be presented with the list of workflow runs and their statuses.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/workflow-runs.png" alt-text="Screenshot of the workflow runs page, showing a list of all workflow runs, their status, and their run IDs.":::
-
-1. You can filter the results by using workflow name, status, or time.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/filters.png" alt-text="Screenshot of the workflow runs page, with the keyword, name, status, and time filters highlighted above the list of workflows.":::
-
-1. Select a workflow name to see the details of the workflow run.
-
-1. This will present a window that shows all the actions that are completed, actions that are in-progress, and the next action for that workflow run.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/workflow-details.png" alt-text="Screenshot of the workflow runs page, with an example workflow name selected, and the workflow details page overlaid, showing workflow run, submission time, run ID, status, and a list of all steps in the request timeline.":::
-
-1. You can select any of the actions in the request timeline to see the specific status and sub steps details.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/select-stages.png" alt-text="Screenshot of the workflow runs page, with the workflow details page overlaid. Some workflow run actions in the request timeline have been expanded to show more information and sub steps.":::
-
-1. You can cancel a running workflow by selecting **Cancel workflow run**.
-
- :::image type="content" source="./media/how-to-workflow-manage-runs/cancel-workflows-inline.png" alt-text="Screenshot of the workflow runs page, with the workflow details page overlaid and cancel button to cancel the workflow run." lightbox="./media/how-to-workflow-manage-runs/cancel-workflows.png":::
-
- > [!NOTE]
- > You can only cancel workflows that are in progress.
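The filter and cancel rules described above can be summarized in a small, hypothetical sketch; the run records, field names, and status strings are illustrative, not a real Purview API:

```python
from datetime import datetime

# Illustrative run records, shaped like the columns shown in the portal.
runs = [
    {"name": "data-access-request", "status": "In progress", "start": datetime(2023, 3, 1)},
    {"name": "term-approval", "status": "Completed", "start": datetime(2023, 3, 5)},
]

def filter_runs(runs, status=None, name_contains=None):
    """Filter the run list the way the portal's status and name filters do."""
    out = runs
    if status is not None:
        out = [r for r in out if r["status"] == status]
    if name_contains is not None:
        out = [r for r in out if name_contains in r["name"]]
    return out

def can_cancel(run):
    """Only workflow runs that are still in progress can be canceled."""
    return run["status"] == "In progress"

print([r["name"] for r in filter_runs(runs, status="Completed")])  # ['term-approval']
print(can_cancel(runs[0]))  # True
```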
--
-## Next steps
-
-- [What are Microsoft Purview workflows](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Workflow Self Service Data Access Hybrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-self-service-data-access-hybrid.md
- Title: Self-service hybrid data access workflows
-description: This article describes how to create and manage hybrid self-service data access workflows in Microsoft Purview.
- Previously updated : 02/20/2023
-# Self-service access workflows for hybrid data estates
--
-You can use [workflows](concept-workflow.md) to automate some business processes through Microsoft Purview. Self-service access workflows allow you to create a process for your users to request access to datasets they've discovered in Microsoft Purview.
-
-Let's say your team has a new data analyst who will do some business reporting. You add that data analyst to your department's collection in Microsoft Purview. From there, they can browse through the data assets and read descriptions about the data that your department has available.
-
-The data analyst notices that one of the Azure Data Lake Storage Gen2 accounts seems to have the exact data that they need to get started. Because a self-service access workflow has been set up for that resource, they can [request access](how-to-request-access.md) to that Azure Data Lake Storage account from within Microsoft Purview.
--
-You can create these workflows for any of your resources across your data estate to automate the access request process. Workflows are assigned at the [collection](reference-azure-purview-glossary.md#collection) level, so they automate business processes along the same organizational lines as your permissions.
-
-This guide shows you how to create and manage self-service access workflows in Microsoft Purview.
-
->[!NOTE]
-> To create or edit a workflow, you need the [workflow admin role](catalog-permissions.md) in Microsoft Purview. You can also contact the workflow admin in your collection, or reach out to your collection administrator, for permissions.
-
-## Create and enable the self-service access workflow
-
-1. Sign in to [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/) and select the management center. Three new icons appear in the table of contents.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-section.png" alt-text="Screenshot that shows the management center menu with the new workflow section highlighted.":::
-
-1. To create new workflows, select **Authoring**. This step takes you to the workflow authoring experience.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-authoring-experience.png" alt-text="Screenshot that shows the page for authoring workflows and a list of all workflows.":::
-
- >[!NOTE]
- >If the **Authoring** tab is unavailable, you don't have the permissions to author workflows. You need the [workflow admin role](catalog-permissions.md).
-
-1. To create a new self-service workflow, select the **+New** button.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-authoring-select-new.png" alt-text="Screenshot that shows the page for authoring workflows, with the New button highlighted.":::
-
-1. You're presented with categories of workflows that you can create in Microsoft Purview. To create an access request workflow, select **Governance**, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/select-governance.png" alt-text="Screenshot that shows the new workflow panel, with the Governance option selected.":::
-
-1. The next screen shows all the templates that Microsoft Purview provides to create a self-service data access workflow. Select the **Data access request** template, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/select-data-access-request.png" alt-text="Screenshot that shows the new workflow panel, with the data access request template selected.":::
-
-1. Enter a workflow name, optionally add a description, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/name-and-continue.png" alt-text="Screenshot that shows the name and description boxes for a new workflow.":::
-
-1. You're presented with a canvas where the selected template is loaded by default.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-canvas-inline.png" alt-text="Screenshot that shows the workflow canvas with the selected template workflow steps displayed." lightbox="./media/how-to-workflow-self-service-data-access-hybrid/workflow-canvas-expanded.png":::
-
- The template has the following steps:
- 1. Trigger when a data access request is made.
- 1. Get an approval connector that specifies a user or group that will be contacted to approve the request.
-
- Assign data owners as approvers. Using the dynamic variable **Asset.Owner** as approvers in the approval connector will send approval requests to the data owners on the entity.
-
- >[!Note]
- > Using the **Asset.Owner** variable might result in errors if an entity doesn't have a data owner.
-
-1. If the condition to check approval status is approved, take the following steps:
-
- * If a data source is registered for [data use management](how-to-enable-data-use-governance.md) with the policy:
- 1. Create a [self-service policy](concept-self-service-data-access-policy.md).
- 1. Send email to the requestor that confirms access.
- * If a data source isn't registered with the policy:
- 1. Use a connector to assign [a task](how-to-workflow-manage-requests-approvals.md#tasks) to a user or an Azure Active Directory (Azure AD) group to manually provide access to the requestor.
- 1. Send an email to requestor to explain that access is provided after the task is marked as complete.
-
- If the condition to check approval status is rejected, send an email to the requestor to say that the data access request is denied.
-
-1. You can use the default template as it is by populating two fields:
- * Add an approver's email address or Azure AD group in the **Start and wait for an approval** connector.
- * Add a user's email address or Azure AD group in the **Create task and wait for task completion** connector to denote who is responsible for manually providing access if the source isn't registered with the policy.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/required-fields-for-template-inline.png" alt-text="Screenshot that shows the workflow canvas with the connector for starting an approval and the connector for creating a task, along with the text boxes for assigning them." lightbox="./media/how-to-workflow-self-service-data-access-hybrid/required-fields-for-template-expanded.png":::
-
- > [!NOTE]
- > Configure the workflow to create self-service policies only for sources that the Microsoft Purview policy supports. To see what the policy supports, check the [documentation about data owner policies](tutorial-data-owner-policies-storage.md).
- >
- > If the Microsoft Purview policy doesn't support your source, use the **Create task and wait for task completion** connector to assign [tasks](how-to-workflow-manage-requests-approvals.md#tasks) to users or groups that can provide access.
-
- You can also modify the template by adding more connectors to suit your organizational needs.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/more-connectors-inline.png" alt-text="Screenshot that shows the workflow authoring canvas, with the button for adding a connector and the button for saving the new conditions." lightbox="./media/how-to-workflow-self-service-data-access-hybrid/more-connectors-expanded.png":::
-
-1. After you define a workflow, you need to bind the workflow to a collection hierarchy path. The binding (or scoping) implies that this workflow is triggered only for data access requests in that collection.
-
- To bind a workflow or to apply a scope to a workflow, select **Apply workflow**. Select the scope that you want to associate with this workflow, and then select **OK**.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/apply-workflow.png" alt-text="Screenshot that shows the workflow workspace with a list of items on the menu for applying a workflow.":::
-
- >[!NOTE]
- > The Microsoft Purview workflow engine will always resolve to the closest workflow that the collection hierarchy path is associated with. If the workflow engine doesn't find a direct binding, it will look for the workflow that's associated with the closest parent in the collection tree.
-
-1. Make sure that the **Enable** toggle is on. The workflow should be enabled by default.
-1. Select **Save and close** to create and enable the workflow.
-
- Your new workflow now appears in the list of workflows.
-
- :::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/completed-workflows.png" alt-text="Screenshot that shows the workflow authoring page with the newly created workflow listed among the other workflows.":::
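The closest-binding rule from the note above can be illustrated with a short, hypothetical sketch; the collection paths and workflow names are made up for illustration and don't reflect a real Purview API:

```python
# If a collection has no workflow bound directly, the engine walks up the
# collection tree to the nearest bound ancestor.

def resolve_workflow(collection_path, bindings):
    """Return the workflow bound to the closest collection on the path."""
    node = list(collection_path)
    while node:
        key = "/".join(node)
        if key in bindings:
            return bindings[key]
        node.pop()  # fall back to the parent collection
    return None  # no workflow bound anywhere on the path

bindings = {
    "root": "default-access-workflow",
    "root/finance": "finance-access-workflow",
}

print(resolve_workflow(["root", "finance", "payroll"], bindings))  # finance-access-workflow
print(resolve_workflow(["root", "marketing"], bindings))           # default-access-workflow
```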
-
-## Edit an existing workflow
-
-To modify an existing workflow, select the workflow, and then select the **Edit** button. You're presented with the canvas that contains the workflow definition. Modify the workflow, and then select **Save** to commit the changes.
--
-## Disable a workflow
-
-To disable a workflow, select the workflow, and then select **Disable**.
--
-Another way is to select the workflow, select **Edit**, turn off the **Enable** toggle in the workflow canvas, and then select **Save and close**.
-
-## Delete a workflow
-
-To delete a workflow, select the workflow, and then select **Delete**.
--
-## Next steps
-
-For more information about workflows, see these articles:
-
-- [Workflows in Microsoft Purview](concept-workflow.md)
-- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
-- [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
-- [Self-service access policies](concept-self-service-data-access-policy.md)
purview Insights Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/insights-permissions.md
- Title: Permissions for Data Estate Insights in Microsoft Purview
-description: This article describes what permissions are needed to access and manage Data Estate Insights in Microsoft Purview.
- Previously updated : 05/16/2022
-# Access control in Data Estate Insights within Microsoft Purview
-
-Like all other permissions in Microsoft Purview, Data Estate Insights access is given through collections. This article describes what permissions are needed to access Data Estate Insights in Microsoft Purview.
-
-## Insights reader role
-
-The insights reader role gives users read permission to the Data Estate Insights application in Microsoft Purview. However, a user with this role will only have access to information for collections that they also have at least data reader access to.
-
-As the Data Estate Insights application gives a bird's-eye view of your data estate and catalog usage from a governance and risk perspective, it's intended for users who need to manage and report on this high-level information, such as a Chief Data Officer. You may not want, or need, all your data readers to have access to the Data Estate Insights dashboards.
--
-## Role assignment
-
-* The **Insights Reader** role can be assigned to any Data Map user by the **Data Curator of the root collection**. Users assigned the Data Curator role on subcollections don't have the privilege of assigning the Insights Reader role.
-
- :::image type="content" source="media/insights-permissions/insights-reader.png" alt-text="Screenshot of root collection, showing the role assignments tab, with the add user button selected next to Insights reader.":::
-
-* A **Data Curator** of any collection also has read permission to the Data Estate Insights application. Their scope of insights is limited to the metadata assigned to their collections. In other words, a Data Curator at a subcollection will only view KPIs and aggregations for collections they have access to. A Data Curator can still view and edit assets from the Data Estate Insights app without any extra permissions.
-
-* A **Data Reader** at any collection node can see the Data Estate Insights app in the left navigation bar. However, when they hover over the icon, they'll receive a message saying they need to contact the Data Curator at the root collection for access. Once a Data Reader has been assigned the Insights Reader role, they can view KPIs and aggregations based on the collections they have Data Reader permission on.
A Data Reader can't edit assets or select **Export to CSV** from the app.
-
-> [!NOTE]
-> All roles other than Data Curator need an explicit **Insights Reader** role assignment to be able to open the Data Estate Insights app.
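The role rules above can be summarized in a hypothetical sketch; the role names follow the article, but the functions are illustrative and not a Purview API:

```python
def can_open_insights(roles):
    """Data Curators get Insights access implicitly; everyone else needs
    an explicit Insights Reader assignment."""
    return "Data Curator" in roles or "Insights Reader" in roles

def insights_scope(roles, data_reader_collections):
    """The visible scope is the set of collections the user can already
    read data from, gated by being able to open the app at all."""
    if not can_open_insights(roles):
        return set()
    return set(data_reader_collections)

print(can_open_insights({"Data Reader"}))  # False
print(insights_scope({"Data Reader", "Insights Reader"}, {"finance", "sales"}))
```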
-
-## Next steps
-
-Learn how to use Data Estate Insights with sources below:
-
-* [Learn how to use Asset insights](asset-insights.md)
-* [Learn how to use Data Stewardship](data-stewardship.md)
-* [Learn how to use Classification insights](classification-insights.md)
-* [Learn how to use Glossary insights](glossary-insights.md)
-* [Learn how to use Label insights](sensitivity-insights.md)
purview Manage Credentials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-credentials.md
- Title: Create and manage credentials for scans
-description: Learn about the steps to create and manage credentials in Microsoft Purview.
- Previously updated : 10/12/2022
-# Credentials for source authentication in Microsoft Purview
-
-This article describes how you can create credentials in Microsoft Purview. These saved credentials let you quickly reuse and apply saved authentication information to your data source scans.
-
-## Prerequisites
-
-- An Azure key vault. To learn how to create one, see [Quickstart: Create a key vault using the Azure portal](../key-vault/general/quick-create-portal.md).
-## Introduction
-
-A credential is authentication information that Microsoft Purview can use to authenticate to your registered data sources. A credential object can be created for various authentication scenarios, such as basic authentication that requires a username and password. Credentials capture the specific information required to authenticate, based on the chosen authentication method. Credentials use your existing Azure Key Vault secrets to retrieve sensitive authentication information during the credential creation process.
-
-In Microsoft Purview, there are a few options you can use as the authentication method to scan data sources, including the following. See each [data source article](azure-purview-connector-overview.md) for its supported authentication methods.
-
-- [Microsoft Purview system-assigned managed identity](#use-microsoft-purview-system-assigned-managed-identity-to-set-up-scans)
-- [User-assigned managed identity](#create-a-user-assigned-managed-identity) (preview)
-- Account Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
-- SQL Authentication (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
-- Service Principal (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
-- Consumer Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
-- And more
-Before creating any credentials, consider your data source types and networking requirements to decide which authentication method you need for your scenario.
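As a hypothetical sketch of the credential model described above: Purview stores only a reference to the Key Vault secret, never the secret value itself. The field names here are illustrative, not the real credential schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    name: str
    auth_method: str           # e.g. "SQL authentication", "Basic authentication"
    user_name: str
    key_vault_connection: str  # a Key Vault already associated with the account
    secret_name: str           # secret holding the password/key, resolved at scan time

cred = Credential(
    name="sql-scan-cred",
    auth_method="SQL authentication",
    user_name="scan_user",
    key_vault_connection="corp-kv",
    secret_name="sql-scan-password",
)
print(cred.secret_name)  # sql-scan-password
```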
-
-## Use Microsoft Purview system-assigned managed identity to set up scans
-
-If you're using the Microsoft Purview system-assigned managed identity (SAMI) to set up scans, you won't need to create a credential and link your key vault to Microsoft Purview to store them. For detailed instructions on adding the Microsoft Purview SAMI to have access to scan your data sources, refer to the data source-specific authentication sections below:
-
-- [Azure Blob Storage](register-scan-azure-blob-storage-source.md#authentication-for-a-scan)
-- [Azure Data Lake Storage Gen1](register-scan-adls-gen1.md#authentication-for-a-scan)
-- [Azure Data Lake Storage Gen2](register-scan-adls-gen2.md#authentication-for-a-scan)
-- [Azure SQL Database](register-scan-azure-sql-database.md)
-- [Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md#authentication-for-registration)
-- [Azure Synapse Workspace](register-scan-synapse-workspace.md#register)
-- [Azure Synapse dedicated SQL pools (formerly SQL DW)](register-scan-azure-synapse-analytics.md#authentication-for-registration)
-## Grant Microsoft Purview access to your Azure Key Vault
-
-To give Microsoft Purview access to your Azure Key Vault, there are two things you'll need to confirm:
-
-- [Firewall access to the Azure Key Vault](#firewall-access-to-azure-key-vault)
-- [Microsoft Purview permissions on the Azure Key Vault](#microsoft-purview-permissions-on-the-azure-key-vault)
-### Firewall access to Azure Key Vault
-
-If your Azure Key Vault has disabled public network access, you have two options to allow access for Microsoft Purview.
-
-- [Trusted Microsoft services](#trusted-microsoft-services)
-- [Private endpoint connections](#private-endpoint-connections)
-#### Trusted Microsoft services
-
-Microsoft Purview is listed as one of [Azure Key Vault's trusted services](../key-vault/general/overview-vnet-service-endpoints.md#trusted-services), so if public network access is disabled on your Azure Key Vault you can enable access only to trusted Microsoft services, and Microsoft Purview will be included.
-
-You can enable this setting in your Azure Key Vault under the **Networking** tab.
-
-At the bottom of the page, under Exception, enable the **Allow trusted Microsoft services to bypass this firewall** feature.
--
-#### Private endpoint connections
-
-To connect to Azure Key Vault with private endpoints, follow [Azure Key Vault's private endpoint documentation](../key-vault/general/private-link-service.md).
-
-> [!NOTE]
-> The private endpoint connection option is supported when using the Azure integration runtime in a [managed virtual network](catalog-managed-vnet.md) to scan data sources. For a self-hosted integration runtime, you need to enable [trusted Microsoft services](#trusted-microsoft-services).
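A hypothetical decision helper mirroring this note; the runtime labels and return values are illustrative:

```python
def key_vault_access_options(runtime, managed_vnet=False):
    """Which firewall-access options work for a firewalled Key Vault,
    depending on the integration runtime used for scanning."""
    if runtime == "azure" and managed_vnet:
        # Azure IR in a managed VNet can use either approach.
        return {"private endpoint", "trusted Microsoft services"}
    # Self-hosted IR (or Azure IR without a managed VNet) relies on the
    # trusted-services firewall bypass.
    return {"trusted Microsoft services"}

print(key_vault_access_options("self-hosted"))  # {'trusted Microsoft services'}
```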
-
-### Microsoft Purview permissions on the Azure Key Vault
-
-Currently Azure Key Vault supports two permission models:
-
-- [Option 1 - Access Policies](#option-1assign-access-using-key-vault-access-policy)
-- [Option 2 - Role-based Access Control](#option-2assign-access-using-key-vault-azure-role-based-access-control)
-Before assigning access to the Microsoft Purview system-assigned managed identity (SAMI), first identify your Azure Key Vault permission model from the Key Vault resource's **Access Policies** menu. Then follow the steps below based on the relevant permission model.
--
-#### Option 1 - Assign access using Key Vault Access Policy
-
-Follow these steps only if permission model in your Azure Key Vault resource is set to **Vault Access Policy**:
-
-1. Navigate to your Azure Key Vault.
-
-2. Select the **Access policies** page.
-
-3. Select **Add Access Policy**.
-
- :::image type="content" source="media/manage-credentials/add-msi-to-akv-2.png" alt-text="Add Microsoft Purview managed identity to AKV":::
-
-4. In the **Secrets permissions** dropdown, select **Get** and **List** permissions.
-
-5. For **Select principal**, choose the Microsoft Purview system managed identity. You can search for the Microsoft Purview SAMI using either the Microsoft Purview instance name **or** the managed identity application ID. We don't currently support compound identities (managed identity name + application ID).
-
- :::image type="content" source="media/manage-credentials/add-access-policy.png" alt-text="Add access policy":::
-
-6. Select **Add**.
-
-7. Select **Save** to save the Access policy.
-
- :::image type="content" source="media/manage-credentials/save-access-policy.png" alt-text="Save access policy":::
-
-#### Option 2 - Assign access using Key Vault Azure role-based access control
-
-Follow these steps only if permission model in your Azure Key Vault resource is set to **Azure role-based access control**:
-
-1. Navigate to your Azure Key Vault.
-
-2. Select **Access Control (IAM)** from the left navigation menu.
-
-3. Select **+ Add**.
-
-4. Set the **Role** to **Key Vault Secrets User** and enter your Microsoft Purview account name in the **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
-
- :::image type="content" source="media/manage-credentials/akv-add-rbac.png" alt-text="Azure Key Vault RBAC":::
-
-## Create Azure Key Vaults connections in your Microsoft Purview account
-
-Before you can create a Credential, first associate one or more of your existing Azure Key Vault instances with your Microsoft Purview account.
-
-1. Open your Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Open the [Azure portal](https://portal.azure.com), search for and select the Microsoft Purview account you want to use, and then open [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-1. Navigate to the **Management Center** in the studio and then navigate to **credentials**.
-
-1. From the **Credentials** page, select **Manage Key Vault connections**.
-
- :::image type="content" source="media/manage-credentials/manage-kv-connections.png" alt-text="Manage Azure Key Vault connections":::
-
-1. Select **+ New** from the Manage Key Vault connections page.
-
-1. Provide the required information, then select **Create**.
-
-1. Confirm that your Key Vault has been successfully associated with your Microsoft Purview account as shown in this example:
-
- :::image type="content" source="media/manage-credentials/view-kv-connections.png" alt-text="View Azure Key Vault connections to confirm.":::
-
-## Create a new credential
-
-These credential types are supported in Microsoft Purview:
-
-- Basic authentication: You add the **password** as a secret in key vault.
-- Service Principal: You add the **service principal key** as a secret in key vault.
-- SQL authentication: You add the **password** as a secret in key vault.
-- Windows authentication: You add the **password** as a secret in key vault.
-- Account Key: You add the **account key** as a secret in key vault.
-- Role ARN: For an Amazon S3 data source, add your **role ARN** in AWS.
-- Consumer Key: For Salesforce data sources, you can add the **password** and the **consumer secret** in key vault.
-- User-assigned managed identity (preview): You can add user-assigned managed identity credentials. For more information, see the [create a user-assigned managed identity section](#create-a-user-assigned-managed-identity) below.
-For more information, see [Add a secret to Key Vault](../key-vault/secrets/quick-create-portal.md#add-a-secret-to-key-vault) and [Create a new AWS role for Microsoft Purview](register-scan-amazon-s3.md#create-a-new-aws-role-for-microsoft-purview).
-
-After storing your secrets in the key vault:
-
-1. In Microsoft Purview, go to the Credentials page.
-
-2. Create your new Credential by selecting **+ New**.
-
-3. Provide the required information. Select the **Authentication method** and a **Key Vault connection** from which to select a secret.
-
-4. Once all the details have been filled in, select **Create**.
-
- :::image type="content" source="media/manage-credentials/new-credential.png" alt-text="New credential":::
-
-5. Verify that your new credential shows up in the list view and is ready to use.
-
- :::image type="content" source="media/manage-credentials/view-credentials.png" alt-text="View credential":::
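A hypothetical sketch of the dependency this flow implies: a credential can only reference a Key Vault connection that's already associated with the account. The names and structures here are illustrative, not a real Purview API:

```python
# Key Vault connections already associated with the Purview account (illustrative).
key_vault_connections = {"corp-kv", "dev-kv"}

def validate_credential(cred):
    """Reject a credential whose Key Vault connection isn't registered."""
    if cred["key_vault_connection"] not in key_vault_connections:
        raise ValueError(f"unknown Key Vault connection: {cred['key_vault_connection']}")
    return True

print(validate_credential({"name": "sql-scan-cred", "key_vault_connection": "corp-kv"}))  # True
```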
-
-## Manage your key vault connections
-
-1. Search/find Key Vault connections by name
-
- :::image type="content" source="media/manage-credentials/search-kv.png" alt-text="Search key vault":::
-
-2. Delete one or more Key Vault connections
-
- :::image type="content" source="media/manage-credentials/delete-kv.png" alt-text="Delete key vault":::
-
-## Manage your credentials
-
-1. Search/find Credentials by name.
-
-2. Select and make updates to an existing Credential.
-
-3. Delete one or more Credentials.
-
-## Create a user-assigned managed identity
-
-User-assigned managed identities (UAMI) enable Azure resources to authenticate directly with other resources using Azure Active Directory (Azure AD) authentication, without the need to manage those credentials. They allow you to authenticate and assign access just like you would with a system-assigned managed identity, Azure AD user, Azure AD group, or service principal. User-assigned managed identities are created as their own resource (rather than being connected to a pre-existing resource). For more information about managed identities, see the [managed identities for Azure resources documentation](../active-directory/managed-identities-azure-resources/overview.md).
-
-The following steps will show you how to create a UAMI for Microsoft Purview to use.
-
-### Supported data sources for UAMI
-
-* [Azure Data Lake Gen 1](register-scan-adls-gen1.md)
-* [Azure Data Lake Gen 2](register-scan-adls-gen2.md)
-* [Azure SQL Database](register-scan-azure-sql-database.md)
-* [Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md)
-* [Azure SQL Dedicated SQL pools](register-scan-azure-synapse-analytics.md)
-* [Azure Blob Storage](register-scan-azure-blob-storage-source.md)
-
-### Create a user-assigned managed identity
-
-1. In the [Azure portal](https://portal.azure.com/) navigate to your Microsoft Purview account.
-
-1. In the **Managed identities** section on the left menu, select the **+ Add** button to add user-assigned managed identities.
-
- :::image type="content" source="media/manage-credentials/create-new-managed-identity.png" alt-text="Screenshot showing managed identity screen in the Azure portal with user-assigned and add highlighted.":::
-
-1. After finishing the setup, go back to your Microsoft Purview account in the Azure portal. If the managed identity is successfully deployed, you'll see the Microsoft Purview account's status as **Succeeded**.
-
- :::image type="content" source="media/manage-credentials/status-successful.png" alt-text="Screenshot the Microsoft Purview account in the Azure portal with Status highlighted under the overview tab and essentials menu.":::
--
-1. Once the managed identity is successfully deployed, navigate to the [Microsoft Purview governance portal](https://web.purview.azure.com/), by selecting the **Open Microsoft Purview governance portal** button.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to the Management Center in the studio and then navigate to the Credentials section.
-
-1. Create a user-assigned managed identity by selecting **+New**.
-1. Select the Managed identity authentication method, and select your user assigned managed identity from the drop-down menu.
-
- :::image type="content" source="media/manage-credentials/new-user-assigned-managed-identity-credential.png" alt-text="Screenshot showing the new managed identity creation tile, with the Learn More link highlighted.":::
-
- >[!NOTE]
- > If the portal was open during creation of your user assigned managed identity, you'll need to refresh the Microsoft Purview web portal to load the settings finished in the Azure portal.
-
-1. After all the information is filled in, select **Create**.
-
-### Delete a user-assigned managed identity
-
-1. In the [Azure portal](https://portal.azure.com/) navigate to your Microsoft Purview account.
-
-1. In the **Managed identities** section on the left menu, select the identity you want to delete.
-
-1. Select the **Remove** button.
-
-1. Once the managed identity is successfully removed, navigate to the [Microsoft Purview governance portal](https://web.purview.azure.com/), by selecting the **Open Microsoft Purview governance portal** button.
-
-1. Navigate to the Management Center in the studio and then navigate to the Credentials section.
-
-1. Select the identity you want to delete, and then select the **Delete** button.
-
->[!NOTE]
->If you have deleted a user-assigned managed identity in the Azure portal, you need to delete the original identity and create a new one in the Microsoft Purview governance portal.
-
-## Next steps
-
-[Create a scan rule set](create-a-scan-rule-set.md)
purview Manage Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-data-sources.md
- Title: How to manage data sources
-description: Learn how to register new data sources, manage collections of data sources, and view sources in Microsoft Purview.
- Previously updated : 02/01/2023
-# Manage data sources in Microsoft Purview
-
-In this article, you learn how to register new data sources, manage collections of data sources, view sources, and move sources between collections in Microsoft Purview.
-
-## Register a new source
-
->[!NOTE]
-> You'll need to be a Data Source Admin and one of the other Purview roles (for example, Data Reader or Data Share Contributor) to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details on roles and adding permissions.
-
-Use the following steps to register a new source:
-
-1. Open [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/), navigate to the **Data Map**, **Sources**, and select **Register**.
-
- :::image type="content" source="media/manage-data-sources/purview-studio.png" alt-text="Screenshot of the Microsoft Purview governance portal.":::
-
-1. Select a source type. This example uses Azure Blob Storage. Select **Continue**.
-
- :::image type="content" source="media/manage-data-sources/select-source-type.png" alt-text="Screenshot showing selecting a data source type in the Register sources page.":::
-
-1. Fill out the form on the **Register sources** page. Select a name for your source and enter the relevant information. If you chose **From Azure subscription** as your account selection method, the sources in your subscription appear in a dropdown list.
-
-1. Select **Register**.
-
->[!IMPORTANT]
->Most data sources have additional information and prerequisites to register and scan them in Microsoft Purview. For a list of all available sources, and links to source-specific instructions for registration and scanning, see our [supported sources article](microsoft-purview-connector-overview.md#microsoft-purview-data-map-available-data-sources).
-
-## View sources
-
-You can view all registered sources on the **Data Map** tab of the Microsoft Purview governance portal.
-There are two view types:
-
-- [The map view](#map-view)
-- [The list view](#table-view)
-
-### Map view
-
-In Map view, you can see all of your sources and collections. In the following image we can see the root collection at the top, called ContosoPurview. Two sources are housed in the root collection: An Azure Data Lake Storage Gen2 source and a Power BI source. There are also five subcollections: Finance, Marketing, Sales, Development, and Outreach.
--
-Each of the subcollections can be opened and managed from the map view by selecting the **+** button.
-You can also register a new source by selecting the register source button, or view details by selecting **View details**.
--
-### Table view
-
-In the table view, you can see a sortable list of sources. Hover over the source for options to edit, begin a new scan, or delete.
--
-## Manage collections
-
-You can group your data sources into collections. To create a new collection, select **+ New collection** on the *Sources* page of the Microsoft Purview governance portal. Give the collection a name and select *None* as the Parent. The new collection appears in the map view.
-
-To add sources to a collection, select the **Edit** pencil on the source and choose a collection from the **Select a collection** drop-down menu.
-
-To create a hierarchy of collections, assign higher-level collections as a parent to lower-level collections. In the following image, *ContosoPurview* is a parent to the *Finance* collection, which contains an Azure SQL Database source and two subcollections: Investment and Revenue. You can collapse or expand collections by selecting the circle attached to the arrow between levels.
--
->[!TIP]
->You can remove sources from a hierarchy by selecting *None* for the parent. Unparented sources are grouped in a dotted box in the map view with no arrows linking them to parents.
-
-## Move sources between collections
-
-After you've registered your source, you can move it to another collection that you have access to.
-
->[!IMPORTANT]
-> When a source moves to a new collection, its scans move with it, but assets will not appear in the new collection until your next scan.
-
-1. Find your source in the data map and select it.
-
-1. Beside the **Collection Path** list, select the ellipsis (...) button and select **Move**.
-
- :::image type="content" source="media/manage-data-sources/choose-to-move.png" alt-text="Screenshot of a source, with the ellipsis and move buttons highlighted.":::
-
-1. In the **Move collection** menu that appears, select your collection from the drop-down and then select **OK**.
-
- :::image type="content" source="media/manage-data-sources/select-collection.png" alt-text="Screenshot of the Move collection window, showing the drop down selection of collections.":::
-
-1. Your data source has been moved. It can take up to an hour for the change to be fully reflected across your Microsoft Purview environment. Your scans move with your resource, but assets remain in their original collection until your next scan, when they'll move to the new collection.
-
->[!NOTE]
->If any of the assets from your source were moved manually to a different collection before the source was migrated, the scan won't take them to the new collection. They will remain in the collection you moved them to.
-
-## Next steps
-
-Learn how to discover and govern various data sources:
-
-* [Azure Data Lake Storage Gen 2](register-scan-adls-gen2.md)
-* [Power BI tenant](register-scan-power-bi-tenant.md)
-* [Azure SQL Database](register-scan-azure-sql-database.md)
purview Manage Integration Runtimes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-integration-runtimes.md
- Title: Create and manage Integration Runtimes
-description: This article explains the steps to create and manage Integration Runtimes in Microsoft Purview.
-Previously updated: 07/18/2023
-# Create and manage a self-hosted integration runtime
-
-The integration runtime (IR) is the compute infrastructure that Microsoft Purview uses to power data scanning across different network environments.
-
-A self-hosted integration runtime (SHIR) can be used to scan data sources in an on-premises network or a virtual network. Installing a self-hosted integration runtime requires an on-premises machine or a virtual machine inside a private network.
-
-This article covers both set up of a self-hosted integration runtime, and troubleshooting and management.
--
-|Topic | Section|
-|-|-|
-|Set up a new self-hosted integration runtime|[Machine requirements](#prerequisites)|
-||[Source-specific machine requirements are listed under prerequisites in each source article](azure-purview-connector-overview.md)|
-||[Set up guide](#setting-up-a-self-hosted-integration-runtime)|
-|Networking|[Networking requirements](#networking-requirements)|
-||[Proxy servers](#proxy-server-considerations)|
-||[Private endpoints](catalog-private-link.md)|
-||[Troubleshoot proxy and firewall](#possible-symptoms-for-issues-related-to-the-firewall-and-proxy-server)|
-||[Troubleshoot connectivity](troubleshoot-connections.md)|
-|Management|[General](#manage-a-self-hosted-integration-runtime)|
-
-> [!NOTE]
-> The Microsoft Purview Integration Runtime cannot be shared with an Azure Synapse Analytics or Azure Data Factory Integration Runtime on the same machine. It needs to be installed on a separate machine.
-
-## Prerequisites
--- The supported versions of Windows are:
- - Windows 8.1
- - Windows 10
- - Windows 11
- - Windows Server 2012
- - Windows Server 2012 R2
- - Windows Server 2016
- - Windows Server 2019
- - Windows Server 2022
-
-Installation of the self-hosted integration runtime on a domain controller isn't supported.
-
-> [!IMPORTANT]
-> Scanning some data sources requires additional setup on the self-hosted integration runtime machine. For example, JDK, Visual C++ Redistributable, or specific driver.
-> For your source, **[refer to each source article for prerequisite details.](azure-purview-connector-overview.md)**
-> Any requirements will be listed in the **Prerequisites** section.
-- To add and manage a SHIR in Microsoft Purview, you'll need [data source administrator permissions](catalog-permissions.md) in Microsoft Purview.
-
-- Self-hosted integration runtime requires a 64-bit operating system with .NET Framework 4.7.2 or above. See [.NET Framework System Requirements](/dotnet/framework/get-started/system-requirements) for details.
-
-- The recommended minimum configuration for the self-hosted integration runtime machine is a 2-GHz processor with 8 cores, 28 GB of RAM, and 80 GB of available hard drive space. Scanning some data sources may require a higher machine specification based on your scenario. Also check the prerequisites in the corresponding [connector article](microsoft-purview-connector-overview.md).
-- If the host machine hibernates, the self-hosted integration runtime doesn't respond to data requests. Configure an appropriate power plan on the computer before you install the self-hosted integration runtime. If the machine is configured to hibernate, the self-hosted integration runtime installer prompts with a message.
-- You must be an administrator on the machine to successfully install and configure the self-hosted integration runtime.
-- Scan runs happen with a specific frequency per the schedule you've set up. Processor and RAM usage on the machine follows the same pattern with peak and idle times. Resource usage also depends heavily on the amount of data that's scanned. When multiple scan jobs are in progress, resource usage goes up during peak times.
-
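-As a quick sanity check, the recommended minimums above can be expressed as a small helper. This is an illustrative Python sketch, not part of Microsoft Purview; the thresholds are the recommended minimums stated in this article:

```python
def meets_recommended_spec(ghz: float, cores: int, ram_gb: int, free_disk_gb: int) -> bool:
    """Check a candidate SHIR machine against the recommended minimums:
    a 2-GHz processor with 8 cores, 28 GB of RAM, and 80 GB of free disk."""
    return ghz >= 2.0 and cores >= 8 and ram_gb >= 28 and free_disk_gb >= 80

# Example: an 8-core, 32 GB machine with 100 GB free passes the check.
print(meets_recommended_spec(2.4, 8, 32, 100))   # True
print(meets_recommended_spec(2.4, 4, 16, 100))   # False
```

-Remember that these are minimums for light workloads; heavy or concurrent scans may need more.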
-> [!IMPORTANT]
-> If you use the Self-Hosted Integration runtime to scan Parquet files, you need to install the **64-bit JRE 8 (Java Runtime Environment) or OpenJDK** on your IR machine. Check our [Java Runtime Environment section at the bottom of the page](#java-runtime-environment-installation) for an installation guide.
-
-### Considerations for using a self-hosted IR
-
-- You can use a single self-hosted integration runtime for scanning multiple data sources.
-- You can install only one instance of the self-hosted integration runtime on any single machine. If you have two Microsoft Purview accounts that need to scan on-premises data sources, install the self-hosted IR on two machines, one for each Microsoft Purview account.
-- The self-hosted integration runtime doesn't need to be on the same machine as the data source, unless specifically called out as a prerequisite in the respective source article. Having the self-hosted integration runtime close to the data source reduces the time it takes for the self-hosted integration runtime to connect to the data source.
-
-## Setting up a self-hosted integration runtime
-
-To create and set up a self-hosted integration runtime, use the following procedures.
-
-### Create a self-hosted integration runtime
-
->[!NOTE]
-> To add or manage a SHIR in Microsoft Purview, you'll need [data source administrator permissions](catalog-permissions.md) in Microsoft Purview.
-
-1. On the home page of the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
-
-2. Under **Sources and scanning** on the left pane, select **Integration runtimes**, and then select **+ New**.
-
- :::image type="content" source="media/manage-integration-runtimes/select-integration-runtimes.png" alt-text="Select on IR.":::
-
-3. On the **Integration runtime setup** page, select **Self-Hosted** to create a self-Hosted IR, and then select **Continue**.
-
- :::image type="content" source="media/manage-integration-runtimes/select-self-hosted-ir.png" alt-text="Create new SHIR.":::
-
-4. Enter a name for your IR, and select **Create**.
-
-5. On the **Integration Runtime settings** page, follow the steps under the **Manual setup** section. You'll have to download the integration runtime from the download site onto a VM or machine where you intend to run it.
-
- :::image type="content" source="media/manage-integration-runtimes/integration-runtime-settings.png" alt-text="get key":::
-
- - Copy and paste the authentication key.
-
- - Download the self-hosted integration runtime from [Microsoft Integration Runtime](https://www.microsoft.com/download/details.aspx?id=39717) on a local Windows machine. Run the installer. Self-hosted integration runtime versions such as 5.4.7803.1 and 5.6.7795.1 are supported.
-
- - On the **Register Integration Runtime (Self-hosted)** page, paste one of the two keys you saved earlier, and select **Register**.
-
- :::image type="content" source="media/manage-integration-runtimes/register-integration-runtime.png" alt-text="input key.":::
-
- - On the **New Integration Runtime (Self-hosted) Node** page, select **Finish**.
-
-6. After the Self-hosted integration runtime is registered successfully, you see the following window:
-
- :::image type="content" source="media/manage-integration-runtimes/successfully-registered.png" alt-text="successfully registered.":::
-
-You can register multiple nodes for a self-hosted integration runtime using the same key. Learn more from [High availability and scalability](#high-availability-and-scalability).
-
-## Manage a self-hosted integration runtime
-
-You can edit a self-hosted integration runtime by navigating to **Integration runtimes** in the Microsoft Purview governance portal, hovering over the IR, and then selecting the **Edit** button.
-
-- In the **Settings** tab, you can update the description, copy the key, or regenerate new keys.
-- In the **Nodes** tab, you can see a list of the registered nodes, along with the status, IP address, and the option to delete a node. Learn more from [High availability and scalability](#high-availability-and-scalability).
-- In the **Version** tab, you can see the IR version status. Learn more from [Self-hosted integration runtime auto-update and expire notification](self-hosted-integration-runtime-version.md).
-
-You can delete a self-hosted integration runtime by navigating to **Integration runtimes**, hovering over the IR, and then selecting the **Delete** button.
-
-### Notification area icons and notifications
-
-If you move your cursor over the icon or message in the notification area, you can see details about the state of the self-hosted integration runtime.
--
-### Service account for Self-hosted integration runtime
-
-The default sign-in service account of self-hosted integration runtime is **NT SERVICE\DIAHostService**. You can see it in **Services -> Integration Runtime Service -> Properties -> Log on**.
--
-Make sure the account has the **Log on as a service** permission. Otherwise, the self-hosted integration runtime can't start successfully. You can check the permission in **Local Security Policy -> Security Settings -> Local Policies -> User Rights Assignment -> Log on as a service**.
---
-## High availability and scalability
-
-You can associate a self-hosted integration runtime with multiple on-premises machines or virtual machines in Azure. These machines are called nodes. You can have up to four nodes associated with a self-hosted integration runtime. The benefits of having multiple nodes are:
-
-- Higher availability of the self-hosted integration runtime, so that it's no longer the single point of failure for scans. This availability helps ensure continuity when you use up to four nodes.
-- More concurrent scans. Each self-hosted integration runtime can run many scans at the same time, automatically determined based on the machine's CPU and memory. You can install more nodes if you need more concurrency.
-- When scanning sources like Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, and Azure Files, each scan run can use all of those nodes to boost scan performance. For other sources, the scan runs on one of the nodes.
-
-You can associate multiple nodes by installing the self-hosted integration runtime software from [Download Center](https://www.microsoft.com/download/details.aspx?id=39717). Then, register it by using the same authentication key.
-
-> [!NOTE]
-> Before you add another node for high availability and scalability, ensure that the **Remote access to intranet** option is enabled on the first node. To do so, select **Microsoft Integration Runtime Configuration Manager** > **Settings** > **Remote access to intranet**.
-
-## Networking requirements
-
-Your self-hosted integration runtime machine needs to connect to several resources to work correctly:
-
-* The Microsoft Purview services used to manage the self-hosted integration runtime.
-* The data sources you want to scan using the self-hosted integration runtime.
-* The managed storage account created by Microsoft Purview. Microsoft Purview uses these resources to ingest the results of the scan, among many other things, so the self-hosted integration runtime needs to be able to connect to these resources.
-
-There are two firewalls to consider:
-
-- The *corporate firewall* that runs on the central router of the organization
-- The *Windows firewall* that is configured as a daemon on the local machine where the self-hosted integration runtime is installed
-
-Here are the domains and outbound ports that you need to allow at both **corporate and Windows/machine firewalls**.
-
-> [!TIP]
-> For domains listed with '\<managed_storage_account>', add the name of the managed resources associated with your Microsoft Purview account. You can find them in the Azure portal, under your Microsoft Purview account's **Managed resources** tab.
-
-| Domain names | Outbound ports | Description |
-| -- | -- | - |
-| `*.frontend.clouddatahub.net` | 443 | Required to connect to the Microsoft Purview service. Currently wildcard is required as there's no dedicated resource. |
-| `*.servicebus.windows.net` | 443 | Required for setting up scan in the Microsoft Purview governance portal. This endpoint is used for interactive authoring from UI, for example, test connection, browse folder list and table list to scope scan. To avoid using wildcard, see [Get URL of Azure Relay](#get-url-of-azure-relay).|
-| `<purview_account>.purview.azure.com` | 443 | Required to connect to Microsoft Purview service. If you use Purview [Private Endpoints](catalog-private-link.md), this endpoint is covered by *account private endpoint*. |
-| `<managed_storage_account>.blob.core.windows.net` | 443 | Required to connect to the Microsoft Purview managed Azure Blob storage account. If you use Purview [Private Endpoints](catalog-private-link.md), this endpoint is covered by *ingestion private endpoint*. |
-| `<managed_storage_account>.queue.core.windows.net` | 443 | Required to connect to the Microsoft Purview managed Azure Queue storage account. If you use Purview [Private Endpoints](catalog-private-link.md), this endpoint is covered by *ingestion private endpoint*. |
-| `download.microsoft.com` | 443 | Required to download the self-hosted integration runtime updates. If you have disabled auto-update, you can skip configuring this domain. |
-| `login.windows.net`<br>`login.microsoftonline.com` | 443 | Required to sign in to the Azure Active Directory. |
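-To make the table above concrete, the sketch below expands it into a per-account allowlist. This is an illustrative Python snippet; the account names are hypothetical placeholders — substitute the values from your Microsoft Purview account's **Managed resources** tab:

```python
def purview_allowlist(purview_account: str, managed_storage_account: str):
    """Return (domain, outbound port) pairs from the endpoint table above.
    All of these endpoints use outbound port 443."""
    domains = [
        "*.frontend.clouddatahub.net",
        "*.servicebus.windows.net",
        f"{purview_account}.purview.azure.com",
        f"{managed_storage_account}.blob.core.windows.net",
        f"{managed_storage_account}.queue.core.windows.net",
        "download.microsoft.com",
        "login.windows.net",
        "login.microsoftonline.com",
    ]
    return [(domain, 443) for domain in domains]

# Hypothetical account names, for illustration only.
for domain, port in purview_allowlist("contosopurview123", "scaneastus4123"):
    print(f"{domain}:{port}")
```

-Feed the resulting pairs into whatever allowlist format your corporate and Windows firewalls expect.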
-
-> [!NOTE]
-> Because Azure Relay doesn't currently support service tags, you have to use the AzureCloud or Internet service tag in NSG rules for communication to Azure Relay.
-
-Depending on the sources you want to scan, you also need to allow other domains and outbound ports for other Azure or external sources. A few examples are provided here:
-
-| Domain names | Outbound ports | Description |
-| -- | -- | - |
-| `<your_storage_account>.dfs.core.windows.net` | 443 | When scanning Azure Data Lake Storage Gen2. |
-| `<your_storage_account>.blob.core.windows.net` | 443 | When scanning Azure Blob storage. |
-| `<your_sql_server>.database.windows.net` | 1433 | When scanning Azure SQL Database. |
-| `*.powerbi.com` and `*.analysis.windows.net` | 443 | When scanning a Power BI tenant. |
-| `<your_ADLS_account>.azuredatalakestore.net` | 443 | When scanning Azure Data Lake Storage Gen1. |
-| Various domains | Dependent | Domains and ports for any other sources the SHIR will scan. |
-
-For some cloud data stores such as Azure SQL Database and Azure Storage, you may need to allow the IP address of the self-hosted integration runtime machine in their firewall configuration, or you can create a private endpoint for the service in your self-hosted integration runtime's network.
-
-> [!IMPORTANT]
-> In most environments, you will also need to make sure that your DNS is correctly configured. To confirm, you can use **nslookup** from your SHIR machine to check connectivity to each of the domains. Each nslookup should return the IP of the resource. If you are using [Private Endpoints](catalog-private-link.md), the private IP should be returned and not the Public IP. If no IP is returned, or if when using Private Endpoints the public IP is returned, you need to address your DNS/VNet association, or your Private Endpoint/VNet peering.
-
-### Get URL of Azure Relay
-
-One required domain and port that you need to put in your firewall's allowlist is for communication to Azure Relay. The self-hosted integration runtime uses it for interactive authoring, such as testing a connection and browsing a folder or table list. If you don't want to allow `*.servicebus.windows.net` and would like more specific URLs, you can view all the FQDNs that are required by your self-hosted integration runtime. Follow these steps:
-
-1. Go to the Microsoft Purview governance portal -> Data map -> Integration runtimes, and edit your self-hosted integration runtime.
-2. On the **Edit** page, select the **Nodes** tab.
-3. Select **View Service URLs** to get all FQDNs.
-
- :::image type="content" source="media/manage-integration-runtimes/get-azure-relay-urls.png" alt-text="Screenshot that shows how to get Azure Relay URLs for an integration runtime.":::
-
-4. You can add these FQDNs to the allowlist of your firewall rules.
-
-> [!NOTE]
-> For the details related to Azure Relay connections protocol, see [Azure Relay Hybrid Connections protocol](../azure-relay/relay-hybrid-connections-protocol.md).
-
-## Proxy server considerations
-
-If your corporate network environment uses a proxy server to access the internet, configure the self-hosted integration runtime to use appropriate proxy settings. You can set the proxy during the initial registration phase or after it's registered.
--
-When configured, the self-hosted integration runtime uses the proxy server to connect to the services that use the HTTP or HTTPS protocol. This is why you select the **Change** link during initial setup.
--
-Microsoft Purview supports three proxy configuration options:
-
-- **Do not use proxy**: The self-hosted integration runtime doesn't explicitly use any proxy to connect to cloud services.
-- **Use system proxy**: The self-hosted integration runtime uses the proxy setting that is configured in the executable's configuration files. If no proxy is specified in these files, the self-hosted integration runtime connects to the services directly without going through a proxy.
-- **Use custom proxy**: Configure the HTTP proxy setting to use for the self-hosted integration runtime, instead of using configurations in diahost.exe.config and diawp.exe.config. **Address** and **Port** values are required. **User Name** and **Password** values are optional, depending on your proxy's authentication setting. All settings are encrypted with Windows DPAPI on the self-hosted integration runtime and stored locally on the machine.
-
-> [!NOTE]
-> Connecting to data sources via proxy is not supported for connectors other than Azure data sources and Power BI.
-
-The integration runtime host service restarts automatically after you save the updated proxy settings.
-
-After you register the self-hosted integration runtime, if you want to view or update proxy settings, use Microsoft Integration Runtime Configuration Manager.
-
-1. Open **Microsoft Integration Runtime Configuration Manager**.
-1. Select the **Settings** tab.
-1. Under **HTTP Proxy**, select the **Change** link to open the **Set HTTP Proxy** dialog box.
-1. Select **Next**. You then see a warning that asks for your permission to save the proxy setting and restart the integration runtime host service.
-
-> [!NOTE]
-> If you set up a proxy server with NTLM authentication, the integration runtime host service runs under the domain account. If you later change the password for the domain account, remember to update the configuration settings for the service and restart the service. Because of this requirement, we suggest that you access the proxy server by using a dedicated domain account that doesn't require you to update the password frequently.
-
-If you use the system proxy, make sure your proxy server allows outbound traffic to the domains and ports listed in the [networking requirements](#networking-requirements).
-
-### Configure proxy server settings
-
-If you select the **Use system proxy** option for the HTTP proxy, the self-hosted integration runtime uses the proxy settings in the following four files under the path C:\Program Files\Microsoft Integration Runtime\5.0\ to perform different operations:
-
-- .\Shared\diahost.exe.config
-- .\Shared\diawp.exe.config
-- .\Gateway\DataScan\Microsoft.DataMap.Agent.exe.config
-- .\Gateway\DataScan\DataTransfer\Microsoft.DataMap.Agent.Connectors.Azure.DataFactory.ServiceHost.exe.config
-
-When no proxy is specified in these files, the self-hosted integration runtime connects to the services directly without going through a proxy.
-
-The following procedure provides instructions for updating the **diahost.exe.config** file.
-
-1. In File Explorer, make a safe copy of C:\Program Files\Microsoft Integration Runtime\5.0\Shared\diahost.exe.config as a backup of the original file.
-
-1. Open Notepad running as administrator.
-
-1. In Notepad, open the text file C:\Program Files\Microsoft Integration Runtime\5.0\Shared\diahost.exe.config.
-
-1. Find the default **system.net** tag as shown in the following code:
-
- ```xml
- <system.net>
- <defaultProxy useDefaultCredentials="true" />
- </system.net>
- ```
-
- You can then add proxy server details as shown in the following example:
-
- ```xml
- <system.net>
- <defaultProxy>
- <proxy bypassonlocal="true" proxyaddress="<your proxy server e.g. http://proxy.domain.org:8888/>" />
- </defaultProxy>
- </system.net>
- ```
- The proxy tag allows other properties to specify required settings like `scriptLocation`. See [\<proxy\> Element (Network Settings)](/dotnet/framework/configure-apps/file-schema/network/proxy-element-network-settings) for syntax.
-
- ```xml
- <proxy autoDetect="true|false|unspecified" bypassonlocal="true|false|unspecified" proxyaddress="uriString" scriptLocation="uriString" usesystemdefault="true|false|unspecified "/>
- ```
-
-1. Save the configuration file in its original location.
--
-Repeat the same procedure to update **diawp.exe.config** and **Microsoft.DataMap.Agent.exe.config** files.
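-If you prefer to script the edit rather than use Notepad, a rough Python sketch of the same change is below. It's illustrative only — the sample content and proxy address are placeholders, and you should back up each .exe.config file before modifying it:

```python
import xml.etree.ElementTree as ET

def set_default_proxy(config_text: str, proxy_address: str) -> str:
    """Rewrite the <defaultProxy> element inside <system.net> to use an
    explicit proxy, mirroring the manual edit described above."""
    root = ET.fromstring(config_text)
    system_net = root.find("system.net")
    # Drop the existing <defaultProxy useDefaultCredentials="true" /> entry.
    for old in system_net.findall("defaultProxy"):
        system_net.remove(old)
    default_proxy = ET.SubElement(system_net, "defaultProxy")
    ET.SubElement(default_proxy, "proxy", {
        "bypassonlocal": "true",
        "proxyaddress": proxy_address,  # placeholder proxy address
    })
    return ET.tostring(root, encoding="unicode")

# Minimal stand-in for diahost.exe.config content.
sample = """<configuration>
  <system.net>
    <defaultProxy useDefaultCredentials="true" />
  </system.net>
</configuration>"""

updated = set_default_proxy(sample, "http://proxy.domain.org:8888/")
```

-Whichever way you edit the files, the resulting XML must match the `<defaultProxy>` shape shown in the steps above.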
-
-Then go to path C:\Program Files\Microsoft Integration Runtime\5.0\Gateway\DataScan\DataTransfer, create a file named "**Microsoft.DataMap.Agent.Connectors.Azure.DataFactory.ServiceHost.exe.config**", and configure the proxy setting as follows. You can also extend the settings as described above.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<configuration>
- <system.net>
- <defaultProxy>
- <proxy bypassonlocal="true" proxyaddress="<your proxy server e.g. http://proxy.domain.org:8888/>" />
- </defaultProxy>
- </system.net>
-</configuration>
-```
-
-Local traffic must be excluded from the proxy, for example if your Microsoft Purview account is behind private endpoints. In such cases, update the following four files under the path C:\Program Files\Microsoft Integration Runtime\5.0\ to include the required bypass list:
-
-- .\Shared\diahost.exe.config
-- .\Shared\diawp.exe.config
-- .\Gateway\DataScan\Microsoft.DataMap.Agent.exe.config
-- .\Gateway\DataScan\DataTransfer\Microsoft.DataMap.Agent.Connectors.Azure.DataFactory.ServiceHost.exe.config
-
-An example for bypass list for scanning an Azure SQL Database and ADLS gen 2 Storage:
-
- ```xml
- <system.net>
- <defaultProxy>
- <bypasslist>
- <add address="scaneastus4123.blob.core.windows.net" />
- <add address="scaneastus4123.queue.core.windows.net" />
- <add address="Atlas-abc12345-1234-abcd-a73c-394243a566fa.servicebus.windows.net" />
- <add address="contosopurview123.purview.azure.com" />
- <add address="contososqlsrv123.database.windows.net" />
- <add address="contosoadls123.dfs.core.windows.net" />
- <add address="contosoakv123.vault.azure.net" />
- </bypasslist>
 <proxy proxyaddress="http://proxy.domain.org:8888" bypassonlocal="True" />
- </defaultProxy>
- </system.net>
- ```
-Restart the self-hosted integration runtime host service, which picks up the changes. To restart the service, use the services applet from Control Panel. Or from Integration Runtime Configuration Manager, select the **Stop Service** button, and then select **Start Service**. If the service doesn't start, you likely added incorrect XML tag syntax in the application configuration file that you edited.
-
-> [!IMPORTANT]
-> Don't forget to update all four files mentioned above.
-
-You also need to make sure that Microsoft Azure is in your company's allowlist. You can download the list of valid Azure IP addresses. IP ranges for each cloud, broken down by region and by the tagged services in that cloud, are available from the Microsoft Download Center:
- - Public: https://www.microsoft.com/download/details.aspx?id=56519
-
-### Possible symptoms for issues related to the firewall and proxy server
-
-If you see error messages like the following ones, the likely reason is improper configuration of the firewall or proxy server. Such configuration prevents the self-hosted integration runtime from connecting to Microsoft Purview services. To ensure that your firewall and proxy server are properly configured, refer to the previous section.
-
-- When you try to register the self-hosted integration runtime, you receive the following error message: "Failed to register this Integration Runtime node! Confirm that the Authentication key is valid and the integration service host service is running on this machine."
-- When you open Integration Runtime Configuration Manager, you see a status of **Disconnected** or **Connecting**. When you view Windows event logs, under **Event Viewer** > **Application and Services Logs** > **Microsoft Integration Runtime**, you see error messages like this one:
-
- ```output
- Unable to connect to the remote server
- A component of Integration Runtime has become unresponsive and restarts automatically. Component name: Integration Runtime (Self-hosted)
- ```
-
-## Java Runtime Environment Installation
-
-If you scan Parquet files using the self-hosted integration runtime with Microsoft Purview, you'll need to install either the Java Runtime Environment or OpenJDK on your self-hosted IR machine.
-
-When scanning Parquet files using the self-hosted IR, the service locates the Java runtime by first checking the registry *`(HKEY_LOCAL_MACHINE\SOFTWARE\JavaSoft\Java Runtime Environment\{Current Version}\JavaHome)`* for a JRE. If it's not found, it then checks the system variable *`JAVA_HOME`* for OpenJDK. You can set JAVA_HOME under **System Settings** > **Environment Variables** on your machine. Create or edit the JAVA_HOME variable to point to the Java JRE on your machine, for example: *`C:\Program Files\Java\jdk1.8\jre`*.
-
-- **To use JRE**: The 64-bit IR requires a 64-bit JRE. You can find it [here](https://go.microsoft.com/fwlink/?LinkId=808605).
-- **To use OpenJDK**: It's supported since IR version 3.13. Package jvm.dll with all other required assemblies of OpenJDK onto the self-hosted IR machine, and set the system environment variable JAVA_HOME accordingly.
-
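-The lookup order described above can be modeled as follows. This is an illustrative Python sketch, not product code; the registry value is passed in as a parameter rather than actually read from the Windows registry:

```python
import os

def locate_java_runtime(registry_java_home=None, env=os.environ):
    """Model the documented lookup order: the JRE registry value
    (HKLM\\SOFTWARE\\JavaSoft\\Java Runtime Environment\\{Current Version}\\JavaHome)
    is checked first, then the JAVA_HOME environment variable."""
    if registry_java_home:
        return registry_java_home
    java_home = env.get("JAVA_HOME")
    if java_home:
        return java_home
    raise RuntimeError("No JRE found in the registry and JAVA_HOME is not set")

# The registry wins when present; otherwise JAVA_HOME is used.
print(locate_java_runtime(r"C:\Program Files\Java\jre1.8"))
print(locate_java_runtime(None, {"JAVA_HOME": r"C:\Program Files\OpenJDK"}))
```

-In other words, setting JAVA_HOME has no effect while a JRE is registered in the registry.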
-## Next steps
-
-- [Microsoft Purview network architecture and best practices](concept-best-practices-network.md)
-
-- [Use private endpoints with Microsoft Purview](catalog-private-link.md)
purview Manage Kafka Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-kafka-dotnet.md
- Title: Publish and process Atlas Kafka topics messages via Event Hubs
-description: Get a walkthrough on how to use Event Hubs and a .NET Core application to send/receive events to/from Microsoft Purview's Apache Atlas Kafka topics. Try the Azure.Messaging.EventHubs package.
-Previously updated: 12/13/2022
-# Use Event Hubs and .NET to send and receive Atlas Kafka topics messages
-
-This quickstart teaches you how to send and receive *Atlas Kafka* topic events. We'll make use of *Azure Event Hubs* and the **Azure.Messaging.EventHubs** .NET library.
-
-## Prerequisites
-
-If you're new to Event Hubs, see [Event Hubs overview](../event-hubs/event-hubs-about.md) before you complete this quickstart.
-
-To follow this quickstart, you need certain prerequisites in place:
-
-- **A Microsoft Azure subscription**. To use Azure services, including Event Hubs, you need an Azure subscription. If you don't have an Azure account, you can sign up for a [free trial](https://azure.microsoft.com/free/) or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com).
-- **Microsoft Visual Studio 2022**. The Event Hubs client library makes use of new features that were introduced in C# 8.0. You can still use the library with earlier C# versions, but the new syntax won't be available. To make use of the full syntax, we recommend that you compile with the [.NET Core SDK](https://dotnet.microsoft.com/download) 3.0 or later and [language version](/dotnet/csharp/language-reference/configure-language-version#override-a-default) set to `latest`. Visual Studio versions earlier than Visual Studio 2019 don't have the tools needed to build C# 8.0 projects. Visual Studio 2022, including the free Community edition, can be downloaded [here](https://visualstudio.microsoft.com/vs/).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- [An event hub configured with your Microsoft Purview account to send and receive messages](configure-event-hubs-for-kafka.md):
-  - Your account might already be configured. You can check your Microsoft Purview account in the [Azure portal](https://portal.azure.com) under **Settings** > **Kafka configuration**. If it isn't already configured, [follow this guide](configure-event-hubs-for-kafka.md).
-
-## Publish messages to Microsoft Purview
-
-Let's create a .NET Core console application that sends events to Microsoft Purview via Event Hubs Kafka topic, **ATLAS_HOOK**.
-
-To publish messages to Microsoft Purview, you need either a [managed event hub](configure-event-hubs-for-kafka.md#configure-event-hubs) or [at least one event hub with a hook configuration](configure-event-hubs-for-kafka.md#configure-event-hubs-to-publish-messages-to-microsoft-purview).
-
-## Create a Visual Studio project
-
-Next create a C# .NET console application in Visual Studio:
-
-1. Launch **Visual Studio**.
-2. In the Start window, select **Create a new project** > **Console App (.NET Framework)**. .NET version 4.5.2 or later is required.
-3. In **Project name**, enter **PurviewKafkaProducer**.
-4. Select **Create** to create the project.
-
-### Create a console application
-
-1. Start Visual Studio 2022.
-1. Select **Create a new project**.
-1. In the **Create a new project** dialog box, do the following steps. (If you don't see this dialog box, select **File** on the menu, select **New**, and then select **Project**.)
- 1. Select **C#** for the programming language.
- 1. Select **Console** for the type of the application.
- 1. Select **Console App (.NET Core)** from the results list.
- 1. Then, select **Next**.
-### Add the Event Hubs NuGet package
-
-1. Select **Tools** > **NuGet Package Manager** > **Package Manager Console** from the menu.
-1. Run the following command to install the **Azure.Messaging.EventHubs** NuGet package and **Azure.Messaging.EventHubs.Producer** NuGet package:
-
- ```cmd
- Install-Package Azure.Messaging.EventHubs
- ```
-
- ```cmd
- Install-Package Azure.Messaging.EventHubs.Producer
- ```
-
-### Write code that sends messages to the event hub
-
-1. Add the following `using` statements to the top of the **Program.cs** file:
-
- ```csharp
- using System;
- using System.Text;
- using System.Threading.Tasks;
- using Azure.Messaging.EventHubs;
- using Azure.Messaging.EventHubs.Producer;
- ```
-
-1. Add constants to the `Program` class for the Event Hubs connection string and Event Hubs name.
-
- ```csharp
- private const string connectionString = "<EVENT HUBS NAMESPACE - CONNECTION STRING>";
- private const string eventHubName = "<EVENT HUB NAME>";
- ```
-
-1. Replace the `Main` method with the following `async Main` method, and add an async `ProduceMessage` method to push messages into Microsoft Purview. See the comments in the code for details.
-
- ```csharp
- static async Task Main()
- {
-     // Create a producer client to send events to the event hub
-     EventHubProducerClient producer = new EventHubProducerClient(connectionString, eventHubName);
-
-     await ProduceMessage(producer);
- }
-
- static async Task ProduceMessage(EventHubProducerClient producer)
- {
-     // Create a batch of events
-     using EventDataBatch eventBatch = await producer.CreateBatchAsync();
-
-     // Add events to the batch. An event is represented by a collection of bytes and metadata.
-     eventBatch.TryAdd(new EventData(Encoding.UTF8.GetBytes("<First event>")));
-     eventBatch.TryAdd(new EventData(Encoding.UTF8.GetBytes("<Second event>")));
-     eventBatch.TryAdd(new EventData(Encoding.UTF8.GetBytes("<Third event>")));
-
-     // Use the producer client to send the batch of events to the event hub
-     await producer.SendAsync(eventBatch);
-     Console.WriteLine("A batch of 3 events has been published.");
- }
- ```
-1. Build the project. Ensure that there are no errors.
-1. Run the program and wait for the confirmation message.
-
- > [!NOTE]
- > For the complete source code with more informational comments, see [this file in GitHub](https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/eventhub/Azure.Messaging.EventHubs/samples/Sample04_PublishingEvents.md)
-
-### Sample Create Entity JSON message that creates a SQL table with two columns
-
-```json
-
- {
- "msgCreatedBy":"nayenama",
- "message":{
- "type":"ENTITY_CREATE_V2",
- "user":"admin",
- "entities":{
- "entities":[
- {
- "typeName":"azure_sql_table",
- "attributes":{
- "owner":"admin",
- "temporary":false,
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable",
- "name":"SalesOrderTable",
- "description":"Sales Order Table added via Kafka"
- },
- "relationshipAttributes":{
- "columns":[
- {
- "guid":"-1102395743156037",
- "typeName":"azure_sql_column",
- "uniqueAttributes":{
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable#OrderID"
- }
- },
- {
- "guid":"-1102395743156038",
- "typeName":"azure_sql_column",
- "uniqueAttributes":{
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable#OrderDate"
- }
- }
- ]
- },
- "guid":"-1102395743156036",
- "version":0
- }
- ],
- "referredEntities":{
- "-1102395743156037":{
- "typeName":"azure_sql_column",
- "attributes":{
- "owner":null,
- "userTypeId":61,
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable#OrderID",
- "precision":23,
- "length":8,
- "description":"Sales Order ID",
- "scale":3,
- "name":"OrderID",
- "data_type":"int"
- },
- "relationshipAttributes":{
- "table":{
- "guid":"-1102395743156036",
- "typeName":"azure_sql_table",
- "entityStatus":"ACTIVE",
- "displayText":"SalesOrderTable",
- "uniqueAttributes":{
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable"
- }
- }
- },
- "guid":"-1102395743156037",
- "version":2
- },
- "-1102395743156038":{
- "typeName":"azure_sql_column",
- "attributes":{
- "owner":null,
- "userTypeId":61,
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable#OrderDate",
- "description":"Sales Order Date",
- "scale":3,
- "name":"OrderDate",
- "data_type":"datetime"
- },
- "relationshipAttributes":{
- "table":{
- "guid":"-1102395743156036",
- "typeName":"azure_sql_table",
- "entityStatus":"ACTIVE",
- "displayText":"SalesOrderTable",
- "uniqueAttributes":{
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable"
- }
- }
- },
- "guid":"-1102395743156038",
- "status":"ACTIVE",
- "createdBy":"ServiceAdmin",
- "version":0
- }
- }
- }
- },
- "version":{
- "version":"1.0.0"
- },
- "msgCompressionKind":"NONE",
- "msgSplitIdx":1,
- "msgSplitCount":1
-}
-```
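-Putting the pieces together, the sample message above could be published to the **ATLAS_HOOK** event hub with the producer client from earlier in this quickstart. This is an illustrative sketch rather than part of the original walkthrough: the file name `create-entity.json` is an assumption, and the connection string placeholder must be replaced with your own value.
-
-```csharp
-// Sketch: publish an Atlas Create Entity JSON message to the ATLAS_HOOK event hub.
-// The connection string and payload file name are placeholders for illustration.
-using System;
-using System.IO;
-using System.Text;
-using System.Threading.Tasks;
-using Azure.Messaging.EventHubs;
-using Azure.Messaging.EventHubs.Producer;
-
-class AtlasHookSample
-{
-    private const string connectionString = "<EVENT HUBS NAMESPACE - CONNECTION STRING>";
-    private const string eventHubName = "ATLAS_HOOK";
-
-    static async Task Main()
-    {
-        // Load the Create Entity JSON message (for example, the sample above saved to a file)
-        string payload = File.ReadAllText("create-entity.json");
-
-        await using EventHubProducerClient producer =
-            new EventHubProducerClient(connectionString, eventHubName);
-
-        // Send the JSON message as a single event
-        using EventDataBatch batch = await producer.CreateBatchAsync();
-        if (!batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(payload))))
-        {
-            throw new InvalidOperationException("The message is too large for a single batch.");
-        }
-
-        await producer.SendAsync(batch);
-        Console.WriteLine("Create Entity message published to ATLAS_HOOK.");
-    }
-}
-```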
-
-## Receive Microsoft Purview messages
-
-Next, learn how to write a .NET Core console application that receives messages from Event Hubs by using an event processor. The event processor manages persistent checkpoints and parallel receives from Event Hubs, which simplifies the process of receiving events. You must use the **ATLAS_ENTITIES** event hub to receive messages from Microsoft Purview.
-
-To receive messages from Microsoft Purview, you need either a [managed event hub](configure-event-hubs-for-kafka.md#configure-event-hubs) or [an event hub notification configuration](configure-event-hubs-for-kafka.md#configure-event-hubs-to-receive-messages-from-microsoft-purview).
-
-> [!WARNING]
-> Event Hubs SDK uses the most recent version of Storage API available. That version may not necessarily be available on your Stack Hub platform. If you run this code on Azure Stack Hub, you will experience runtime errors unless you target the specific version you are using. If you're using Azure Blob Storage as a checkpoint store, review the [supported Azure Storage API version for your Azure Stack Hub build](/azure-stack/user/azure-stack-acs-differences?#api-version) and in your code, target that version.
->
-> The highest available version of the Storage service is version 2019-02-02. By default, the Event Hubs SDK client library uses the highest available version on Azure (2019-07-07 at the time of the release of the SDK). If you are using Azure Stack Hub version 2005, in addition to following the steps in this section, you will also need to add code that targets the Storage service API version 2019-02-02. To learn how to target a specific Storage API version, see [this sample in GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/eventhub/Azure.Messaging.EventHubs.Processor/samples/).
-
-### Create an Azure Storage and a blob container
-
-We'll use Azure Storage as the checkpoint store. Use the following steps to create an Azure Storage account.
-
-1. [Create an Azure Storage account](../storage/common/storage-account-create.md?tabs=azure-portal)
-2. [Create a blob container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container)
-3. [Get the connection string for the storage account](../storage/common/storage-configure-connection-string.md)
-
- Make note of the connection string and the container name. You'll use them in the receive code.
-
-### Create a Visual Studio project for the receiver
-
-1. In the Solution Explorer window, select and hold (or right-click) the **EventHubQuickStart** solution, point to **Add**, and select **New Project**.
-1. Select **Console App (.NET Core)**, and select **Next**.
-1. Enter **PurviewKafkaConsumer** for the **Project name**, and select **Create**.
-
-### Add the Event Hubs NuGet package
-
-1. Select **Tools** > **NuGet Package Manager** > **Package Manager Console** from the menu.
-1. Run the following command to install the **Azure.Messaging.EventHubs** NuGet package:
-
- ```cmd
- Install-Package Azure.Messaging.EventHubs
- ```
-1. Run the following command to install the **Azure.Messaging.EventHubs.Processor** NuGet package:
-
- ```cmd
- Install-Package Azure.Messaging.EventHubs.Processor
- ```
-
-### Update the Main method
-
-1. Add the following `using` statements at the top of the **Program.cs** file.
-
- ```csharp
- using System;
- using System.Text;
- using System.Threading.Tasks;
- using Azure.Storage.Blobs;
- using Azure.Messaging.EventHubs;
- using Azure.Messaging.EventHubs.Consumer;
- using Azure.Messaging.EventHubs.Processor;
- ```
-
-1. Add constants to the `Program` class for the Event Hubs connection string and the event hub name. Replace placeholders in brackets with the real values that you got when you created the event hub and the storage account (access keys - primary connection string). Make sure that the `{Event Hubs namespace connection string}` is the namespace-level connection string, and not the event hub string.
-
- ```csharp
- private const string ehubNamespaceConnectionString = "<EVENT HUBS NAMESPACE - CONNECTION STRING>";
- private const string eventHubName = "<EVENT HUB NAME>";
- private const string blobStorageConnectionString = "<AZURE STORAGE CONNECTION STRING>";
- private const string blobContainerName = "<BLOB CONTAINER NAME>";
- ```
-
-    Use **ATLAS_ENTITIES** as the event hub name when receiving messages from Microsoft Purview.
-
-1. Replace the `Main` method with the following `async Main` method. See the comments in the code for details.
-
- ```csharp
- static async Task Main()
- {
- // Read from the default consumer group: $Default
- string consumerGroup = EventHubConsumerClient.DefaultConsumerGroupName;
-
- // Create a blob container client that the event processor will use
- BlobContainerClient storageClient = new BlobContainerClient(blobStorageConnectionString, blobContainerName);
-
- // Create an event processor client to process events in the event hub
- EventProcessorClient processor = new EventProcessorClient(storageClient, consumerGroup, ehubNamespaceConnectionString, eventHubName);
-
- // Register handlers for processing events and handling errors
- processor.ProcessEventAsync += ProcessEventHandler;
- processor.ProcessErrorAsync += ProcessErrorHandler;
-
- // Start the processing
- await processor.StartProcessingAsync();
-
- // Wait for 10 seconds for the events to be processed
- await Task.Delay(TimeSpan.FromSeconds(10));
-
- // Stop the processing
- await processor.StopProcessingAsync();
- }
- ```
-
-1. Now add the following event and error handler methods to the class.
-
- ```csharp
- static async Task ProcessEventHandler(ProcessEventArgs eventArgs)
- {
- // Write the body of the event to the console window
- Console.WriteLine("\tReceived event: {0}", Encoding.UTF8.GetString(eventArgs.Data.Body.ToArray()));
-
- // Update checkpoint in the blob storage so that the app receives only new events the next time it's run
- await eventArgs.UpdateCheckpointAsync(eventArgs.CancellationToken);
- }
-
- static Task ProcessErrorHandler(ProcessErrorEventArgs eventArgs)
- {
- // Write details about the error to the console window
- Console.WriteLine($"\tPartition '{ eventArgs.PartitionId}': an unhandled exception was encountered. This was not expected to happen.");
- Console.WriteLine(eventArgs.Exception.Message);
- return Task.CompletedTask;
- }
- ```
-
-1. Build the project. Ensure that there are no errors.
-
- > [!NOTE]
-    > For the complete source code with more informational comments, see [this file on GitHub](https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/eventhub/Azure.Messaging.EventHubs.Processor/samples/Sample01_HelloWorld.md).
-
-1. Run the receiver application.
-
-### An example of a message received from Microsoft Purview
-
-```json
-{
-  "version":{
-    "version":"1.0.0",
-    "versionParts":[1]
-  },
-  "msgCompressionKind":"NONE",
-  "msgSplitIdx":1,
-  "msgSplitCount":1,
-  "msgSourceIP":"10.244.155.5",
-  "msgCreatedBy":"",
-  "msgCreationTime":1618588940869,
- "message":{
- "type":"ENTITY_NOTIFICATION_V2",
- "entity":{
- "typeName":"azure_sql_table",
- "attributes":{
- "owner":"admin",
- "createTime":0,
- "qualifiedName":"mssql://nayenamakafka.eventhub.sql.net/salespool/dbo/SalesOrderTable",
- "name":"SalesOrderTable",
- "description":"Sales Order Table"
- },
- "guid":"ead5abc7-00a4-4d81-8432-d5f6f6f60000",
- "status":"ACTIVE",
- "displayText":"SalesOrderTable"
- },
- "operationType":"ENTITY_UPDATE",
- "eventTime":1618588940567
- }
-}
-```
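-If you want to act on these notifications rather than only print them, a minimal sketch using `System.Text.Json` could pull the operation type and entity name out of the message body. The property names follow the sample message above; adjust them if your messages differ.
-
-```csharp
-// Sketch: extract fields from an Atlas ENTITY_NOTIFICATION_V2 message body.
-// The JSON shape mirrors the sample message shown above.
-using System;
-using System.Text.Json;
-
-class AtlasMessageParser
-{
-    public static (string OperationType, string EntityName) Parse(string body)
-    {
-        using JsonDocument doc = JsonDocument.Parse(body);
-        JsonElement message = doc.RootElement.GetProperty("message");
-
-        // For example, "ENTITY_UPDATE"
-        string operationType = message.GetProperty("operationType").GetString();
-
-        // For example, "SalesOrderTable"
-        string entityName = message
-            .GetProperty("entity")
-            .GetProperty("attributes")
-            .GetProperty("name")
-            .GetString();
-
-        return (operationType, entityName);
-    }
-}
-```
-
-You could call `AtlasMessageParser.Parse` from the `ProcessEventHandler` shown earlier, passing in the UTF-8 decoded event body.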
-
-## Next steps
-
-Check out more examples in GitHub.
-
-- [Event Hubs samples in GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/eventhub/Azure.Messaging.EventHubs/samples)
-- [Event processor samples in GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/eventhub/Azure.Messaging.EventHubs.Processor/samples)
-- [An introduction to Atlas notifications](https://atlas.apache.org/2.0.0/Notifications.html)
purview Manually Apply Classifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manually-apply-classifications.md
- Title: Manually apply classifications on assets
-description: This document describes how to manually apply classifications on assets.
-- Previously updated: 12/30/2022
-# Manually apply classifications on assets in Microsoft Purview
-
-This article discusses how to manually apply classifications on assets in the Microsoft Purview Governance Portal.
-
-[Classifications](concept-classification.md) are logical labels that help you and your team identify the kinds of data you have across your data estate: for example, whether files or tables contain credit card numbers or addresses.
-
-Microsoft Purview [automatically applies classifications to some assets during the scanning process](apply-classifications.md), but there are some scenarios where you might want to manually apply more classifications. For example, Microsoft Purview doesn't automatically apply classifications to table assets (only to their columns), you might want to apply custom classifications, or you might want to add classifications to assets grouped in a [resource set](concept-resource-sets.md).
-
->[!NOTE]
->Some custom classifications can be [automatically applied](apply-classifications.md) after setting up a [custom classification rule.](create-a-custom-classification-and-classification-rule.md#custom-classification-rules)
-
-Follow the steps in this article to manually apply classifications to [file](#manually-apply-classification-to-a-file-asset), [table](#manually-apply-classification-to-a-table-asset), and [column](#manually-add-classification-to-a-column-asset) assets.
-
-## Manually apply classification to a file asset
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) the Microsoft Purview Data Catalog for the file you're interested in and navigate to the asset detail page.
-
- :::image type="content" source="./media/apply-classifications/asset-detail-page.png" alt-text="Screenshot showing the asset detail page." lightbox="./media/apply-classifications/asset-detail-page.png":::
-
-1. On the **Overview** tab, view the **Classifications** section to see if there are any existing classifications. Select **Edit**.
-
-1. From the **Classifications** drop-down list, select the specific classifications you're interested in. In our example, we're adding **Credit Card Number**, which is a system classification and **CustomerAccountID**, which is a custom classification.
-
- :::image type="content" source="./media/apply-classifications/select-classifications.png" alt-text="Screenshot showing how to select classifications to add to an asset." lightbox="./media/apply-classifications/select-classifications.png":::
-
-1. Select **Save**.
-
-1. On the **Overview** tab, confirm that the classifications you selected appear under the **Classifications** section.
-
- :::image type="content" source="./media/apply-classifications/confirm-classifications.png" alt-text="Screenshot showing how to confirm classifications were added to an asset." lightbox="./media/apply-classifications/confirm-classifications.png":::
-
-## Manually apply classification to a table asset
-
-When Microsoft Purview scans your data sources, it doesn't automatically assign classifications to table assets (only to their columns). For a table asset to have classifications, you must add them manually.
-
-To add a classification to a table asset:
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) the data catalog for the table asset that you're interested in. For example, **Customer** table.
-
-1. Confirm that no classifications are assigned to the table. Select **Edit**.
-
- :::image type="content" source="./media/apply-classifications/select-edit-from-table-asset.png" alt-text="Screenshot showing how to view and edit the classifications of a table asset." lightbox="./media/apply-classifications/select-edit-from-table-asset.png":::
-
-1. From the **Classifications** drop-down list, select one or more classifications. This example uses a custom classification named **CustomerInfo**, but you can select any classifications for this step.
-
- :::image type="content" source="./media/apply-classifications/select-classifications-in-table.png" alt-text="Screenshot showing how to select classifications to add to a table asset." lightbox="./media/apply-classifications/select-classifications-in-table.png":::
-
-1. Select **Save** to save the classifications.
-
-1. On the **Overview** page, verify that Microsoft Purview added your new classifications.
-
- :::image type="content" source="./media/apply-classifications/verify-classifications-added-to-table.png" alt-text="Screenshot showing how to verify that classifications were added to a table asset." lightbox="./media/apply-classifications/verify-classifications-added-to-table.png":::
-
-## Manually add classification to a column asset
-
-Microsoft Purview automatically scans and adds classifications to all column assets. However, if you want to change the classification, you can do so at the column level:
-
-1. [Search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) the data catalog for the table asset that contains the column you want to update.
-
-1. Select **Edit** from the **Overview** tab.
-
-1. Select the **Schema** tab.
-
- :::image type="content" source="./media/apply-classifications/edit-column-schema.png" alt-text="Screenshot showing how to edit the schema of a column." lightbox="./media/apply-classifications/edit-column-schema.png":::
-
-1. Identify the columns you're interested in and select **Add a classification**. This example adds a **Common Passwords** classification to the **PasswordHash** column.
-
- :::image type="content" source="./media/apply-classifications/add-classification-to-column.png" alt-text="Screenshot showing how to add a classification to a column." lightbox="./media/apply-classifications/add-classification-to-column.png":::
-
-1. Select **Save**.
-
-1. Select the **Schema** tab and confirm that the classification has been added to the column.
-
- :::image type="content" source="./media/apply-classifications/confirm-classification-added.png" alt-text="Screenshot showing how to confirm that a classification was added to a column schema." lightbox="./media/apply-classifications/confirm-classification-added.png":::
-## Next steps
-
-- To learn how to create a custom classification, see [create a custom classification](create-a-custom-classification-and-classification-rule.md).
-- To learn about how to automatically apply classifications, see [automatically apply classifications](apply-classifications.md).
purview Microsoft Purview Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/microsoft-purview-connector-overview.md
- Title: Microsoft Purview Data Map supported data sources and file types
-description: This article provides details about supported data sources, file types, and functionalities in the Microsoft Purview Data Map.
-- Previously updated: 04/04/2023
-# Supported data sources and file types
-
-This article discusses currently supported data sources, file types, and scanning concepts in the Microsoft Purview Data Map.
-
-## Microsoft Purview Data Map available data sources
-
-The following table shows the supported capabilities for each data source. Select a data source or feature to learn more.
-
-|**Category**| **Data Store** |**Technical metadata** |**Classification** |**Lineage** | **Labeling** |**Access Policy** | **Data Sharing** |
-|||||||||
-| Azure |[Multiple sources](register-scan-azure-multiple-sources.md)| [Yes](register-scan-azure-multiple-sources.md#register) | [Yes](register-scan-azure-multiple-sources.md#scan) | No |[Source Dependent](create-sensitivity-label.md)|[Yes](register-scan-azure-multiple-sources.md#access-policy) | No |
-||[Azure Blob Storage](register-scan-azure-blob-storage-source.md)| [Yes](register-scan-azure-blob-storage-source.md#register) | [Yes](register-scan-azure-blob-storage-source.md#scan)| Limited* | [Yes](create-sensitivity-label.md)|[Yes](register-scan-azure-blob-storage-source.md#access-policy) (Preview) | [Yes](register-scan-azure-blob-storage-source.md#data-sharing)|
-|| [Azure Cosmos DB (API for NoSQL)](register-scan-azure-cosmos-database.md)| [Yes](register-scan-azure-cosmos-database.md#register) | [Yes](register-scan-azure-cosmos-database.md#scan)|No*|[Yes](create-sensitivity-label.md)|No| No|
-|| [Azure Data Explorer](register-scan-azure-data-explorer.md)| [Yes](register-scan-azure-data-explorer.md#register) | [Yes](register-scan-azure-data-explorer.md#scan)| No* | [Yes](create-sensitivity-label.md)|No | No|
-|| [Azure Data Factory](how-to-link-azure-data-factory.md) | [Yes](how-to-link-azure-data-factory.md) | No | [Yes](how-to-link-azure-data-factory.md) | No | No | No|
-|| [Azure Data Lake Storage Gen1](register-scan-adls-gen1.md)| [Yes](register-scan-adls-gen1.md#register) | [Yes](register-scan-adls-gen1.md#scan)| Limited* | [Yes](create-sensitivity-label.md)|No | No|
-|| [Azure Data Lake Storage Gen2](register-scan-adls-gen2.md)| [Yes](register-scan-adls-gen2.md#register) | [Yes](register-scan-adls-gen2.md#scan)| Limited* | [Yes](create-sensitivity-label.md)|[Yes](register-scan-adls-gen2.md#access-policy) (Preview) | [Yes](register-scan-adls-gen2.md#data-sharing) |
-|| [Azure Data Share](how-to-link-azure-data-share.md) | [Yes](how-to-link-azure-data-share.md) | No | [Yes](how-to-link-azure-data-share.md) | No| No | No|
-|| [Azure Database for MySQL](register-scan-azure-mysql-database.md) | [Yes](register-scan-azure-mysql-database.md#register) | [Yes](register-scan-azure-mysql-database.md#scan) | No* | [Yes](create-sensitivity-label.md)| No | No |
-|| [Azure Database for PostgreSQL](register-scan-azure-postgresql.md) | [Yes](register-scan-azure-postgresql.md#register) | [Yes](register-scan-azure-postgresql.md#scan) | No* | [Yes](create-sensitivity-label.md)| No | No |
-|| [Azure Databricks](register-scan-azure-databricks.md) | [Yes](register-scan-azure-databricks.md#register) | No | [Yes](register-scan-azure-databricks.md#lineage) | No | No | No |
-|| [Azure Dedicated SQL pool (formerly SQL DW)](register-scan-azure-synapse-analytics.md)| [Yes](register-scan-azure-synapse-analytics.md#register) | [Yes](register-scan-azure-synapse-analytics.md#scan)| No* | No | No | No |
-|| [Azure Files](register-scan-azure-files-storage-source.md)|[Yes](register-scan-azure-files-storage-source.md#register) | [Yes](register-scan-azure-files-storage-source.md#scan) | Limited* | [Yes](create-sensitivity-label.md)|No | No |
-|| [Azure SQL Database](register-scan-azure-sql-database.md)| [Yes](register-scan-azure-sql-database.md#register-the-data-source) |[Yes](register-scan-azure-sql-database.md#scope-and-run-the-scan)| [Yes (Preview)](register-scan-azure-sql-database.md#extract-lineage-preview) | [Yes](create-sensitivity-label.md)| [Yes](register-scan-azure-sql-database.md#set-up-access-policies) | No |
-|| [Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md)| [Yes](register-scan-azure-sql-managed-instance.md#scan) | [Yes](register-scan-azure-sql-managed-instance.md#scan) | No* | [Yes](create-sensitivity-label.md)| No | No |
-|| [Azure Synapse Analytics (Workspace)](register-scan-synapse-workspace.md)| [Yes](register-scan-synapse-workspace.md#register) | [Yes](register-scan-synapse-workspace.md#scan)| [Yes - Synapse pipelines](how-to-lineage-azure-synapse-analytics.md)| [Yes](create-sensitivity-label.md)|No| No |
-|Database| [Amazon RDS](register-scan-amazon-rds.md) | [Yes](register-scan-amazon-rds.md#register-an-amazon-rds-data-source) | [Yes](register-scan-amazon-rds.md#scan-an-amazon-rds-database) | No | No | No | No|
-|| [Cassandra](register-scan-cassandra-source.md)|[Yes](register-scan-cassandra-source.md#register) | No | [Yes](register-scan-cassandra-source.md#lineage)| No| No |No|
-|| [Db2](register-scan-db2.md) | [Yes](register-scan-db2.md#register) | No | [Yes](register-scan-db2.md#lineage) | No | No | No|
-|| [Google BigQuery](register-scan-google-bigquery-source.md)| [Yes](register-scan-google-bigquery-source.md#register)| No | [Yes](register-scan-google-bigquery-source.md#lineage)| No| No | No|
-|| [Hive Metastore Database](register-scan-hive-metastore-source.md) | [Yes](register-scan-hive-metastore-source.md#register) | No | [Yes*](register-scan-hive-metastore-source.md#lineage) | No| No |No|
-|| [MongoDB](register-scan-mongodb.md) | [Yes](register-scan-mongodb.md#register) | No | No | No | No | No|
-|| [MySQL](register-scan-mysql.md) | [Yes](register-scan-mysql.md#register) | No | [Yes](register-scan-mysql.md#lineage) | No | No | No|
-|| [Oracle](register-scan-oracle-source.md) | [Yes](register-scan-oracle-source.md#register)| [Yes](register-scan-oracle-source.md#scan) | [Yes*](register-scan-oracle-source.md#lineage) | No| No | No|
-|| [PostgreSQL](register-scan-postgresql.md) | [Yes](register-scan-postgresql.md#register) | No | [Yes](register-scan-postgresql.md#lineage) | No | No | No|
-|| [SAP Business Warehouse](register-scan-sap-bw.md) | [Yes](register-scan-sap-bw.md#register) | No | No | No | No | No|
-|| [SAP HANA](register-scan-sap-hana.md) | [Yes](register-scan-sap-hana.md#register) | No | No | No | No | No|
-|| [Snowflake](register-scan-snowflake.md) | [Yes](register-scan-snowflake.md#register) | [Yes](register-scan-snowflake.md#scan) | [Yes](register-scan-snowflake.md#lineage) | No | No | No|
-|| [SQL Server](register-scan-on-premises-sql-server.md)| [Yes](register-scan-on-premises-sql-server.md#register) |[Yes](register-scan-on-premises-sql-server.md#scan) | No* | [Yes](create-sensitivity-label.md)|No| No |
-|| [SQL Server on Azure-Arc](register-scan-azure-arc-enabled-sql-server.md)| [Yes](register-scan-azure-arc-enabled-sql-server.md#register) | [Yes](register-scan-azure-arc-enabled-sql-server.md#scan) | No* |No|[Yes](register-scan-azure-arc-enabled-sql-server.md#access-policy) | No |
-|| [Teradata](register-scan-teradata-source.md)| [Yes](register-scan-teradata-source.md#register)| [Yes](register-scan-teradata-source.md#scan)| [Yes*](register-scan-teradata-source.md#lineage) | No|No| No |
-|File|[Amazon S3](register-scan-amazon-s3.md)|[Yes](register-scan-amazon-s3.md)| [Yes](register-scan-amazon-s3.md)| Limited* | [Yes](create-sensitivity-label.md)|No| No |
-||[HDFS](register-scan-hdfs.md)|[Yes](register-scan-hdfs.md)| [Yes](register-scan-hdfs.md)| No | No| No |No|
-|Services and apps| [Erwin](register-scan-erwin-source.md)| [Yes](register-scan-erwin-source.md#register)| No | [Yes](register-scan-erwin-source.md#lineage)| No| No |No|
-|| [Looker](register-scan-looker-source.md)| [Yes](register-scan-looker-source.md#register)| No | [Yes](register-scan-looker-source.md#lineage)| No| No |No|
-|| [Power BI](register-scan-power-bi-tenant.md)| [Yes](register-scan-power-bi-tenant.md)| No | [Yes](how-to-lineage-powerbi.md)| No| No |No|
-|| [Salesforce](register-scan-salesforce.md) | [Yes](register-scan-salesforce.md#register) | No | No | No | No |No|
-|| [SAP ECC](register-scan-sapecc-source.md)| [Yes](register-scan-sapecc-source.md#register) | No | [Yes*](register-scan-sapecc-source.md#lineage) | No| No | No|
-|| [SAP S/4HANA](register-scan-saps4hana-source.md) | [Yes](register-scan-saps4hana-source.md#register)| No | [Yes*](register-scan-saps4hana-source.md#lineage) | No| No |No|
-
-\* Besides the lineage on assets within the data source, lineage is also supported if the dataset is used as a source or sink in [Data Factory](how-to-link-azure-data-factory.md) or a [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).
-
-> [!NOTE]
-> Currently, the Microsoft Purview Data Map can't scan an asset that has `/`, `\`, or `#` in its name. To scope your scan and avoid scanning assets that have those characters in the asset name, use the example in [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md#create-the-scan).
-
-> [!IMPORTANT]
-> If you plan on using a self-hosted integration runtime, scanning some data sources requires additional setup on the self-hosted integration runtime machine, such as a JDK, the Visual C++ Redistributable, or a specific driver.
-> For your source, **[refer to each source article for prerequisite details.](azure-purview-connector-overview.md)**
-> Any requirements will be listed in the **Prerequisites** section.
-
-## Scan regions
-The following is a list of all the Azure data source (data center) regions where the Microsoft Purview Data Map scanner runs. If your Azure data source is in a region outside of this list, the scanner will run in the region of your Microsoft Purview instance.
-
-### Microsoft Purview Data Map scanner regions
-
-- Australia East
-- Australia Southeast
-- Brazil South
-- Canada Central
-- Central India
-- Central US
-- East Asia
-- East US
-- East US 2
-- France Central
-- Japan East
-- Korea Central
-- North Central US
-- North Europe
-- South Africa North
-- South Central US
-- Southeast Asia
-- UAE North
-- UK South
-- West Central US
-- West Europe
-- West US
-- West US 2
-- West US 3
-
-## File types supported for scanning
-
-The following file types are supported for scanning, for schema extraction, and classification where applicable:
-
-- Structured file formats supported by extension: AVRO, ORC, PARQUET, CSV, JSON, PSV, SSV, TSV, TXT, XML, GZIP
-- Document file formats supported by extension: DOC, DOCM, DOCX, DOT, ODP, ODS, ODT, PDF, POT, PPS, PPSX, PPT, PPTM, PPTX, XLC, XLS, XLSB, XLSM, XLSX, XLT
-- The Microsoft Purview Data Map also supports [custom file extensions and custom parsers](create-a-scan-rule-set.md#create-a-custom-file-type).
-
-> [!Note]
-> **Known Limitations:**
-> * The Microsoft Purview Data Map scanner only supports schema extraction for the structured file types listed above.
-> * For AVRO, ORC, and PARQUET file types, the scanner does not support schema extraction for files that contain complex data types (for example, MAP, LIST, STRUCT).
-> * The scanner supports scanning snappy compressed PARQUET types for schema extraction and classification.
-> * For GZIP file types, the GZIP must map to a single CSV file within.
-> Gzip files are subject to system and custom classification rules. We currently don't support scanning a gzip file that maps to multiple files within, or any file type other than CSV.
-> * **For delimited file types (CSV, PSV, SSV, TSV, TXT)**:
-> * We do not support data type detection. The data type will be listed as "string" for all columns.
-> * We only support comma (','), semicolon (';'), vertical bar ('|'), and tab ('\t') as delimiters.
-> * Delimited files with fewer than three rows can't be determined to be CSV files if they use a custom delimiter. For example, files with a ~ delimiter and fewer than three rows can't be identified as CSV files.
-> * If a field contains double quotes, they can only appear at the beginning and end of the field and must be matched. Double quotes that appear in the middle of a field, or that appear at the beginning and end but aren't matched, are treated as bad data, and no schema is parsed from the file. Rows that have a different number of columns than the header row are counted as error rows; the ratio of error rows to sampled rows must be less than 0.1.
- > * For Parquet files, if you are using a self-hosted integration runtime, you need to install the **64-bit JRE 8 (Java Runtime Environment) or OpenJDK** on your IR machine. Check our [Java Runtime Environment section at the bottom of the page](manage-integration-runtimes.md#java-runtime-environment-installation) for an installation guide.
-
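As a rough pre-flight check, the delimited-file constraints above can be approximated in a few lines. This is an illustrative sketch (the `validate_delimited` helper is hypothetical and not part of Microsoft Purview), mirroring the supported delimiters, the three-row minimum, and the error-row ratio limit:

```python
import csv
import io

SUPPORTED_DELIMITERS = {",", ";", "|", "\t"}  # the only delimiters the scanner accepts

def validate_delimited(text: str, delimiter: str) -> bool:
    """Rough pre-flight check mirroring the documented scanner limits."""
    if delimiter not in SUPPORTED_DELIMITERS:
        return False  # custom delimiters like '~' aren't supported
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    if len(rows) < 3:
        return False  # fewer than three rows can't be reliably detected as CSV
    header_len = len(rows[0])
    error_rows = sum(1 for row in rows[1:] if len(row) != header_len)
    # error rows / rows sampled must stay under 0.1 for a schema to be parsed
    return error_rows / len(rows) < 0.1

print(validate_delimited("id,name\n1,alice\n2,bob\n", ","))  # expected: True
print(validate_delimited("a~b\n1~2\n3~4\n", "~"))            # expected: False
```

Running a check like this before a scan can save you from waiting on a scan that silently skips schema extraction.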
-## Schema extraction
-
-Currently, the maximum number of columns supported in the asset schema tab is 800 for Azure sources, Power BI, and SQL Server.
-
-## Nested data
-
-Currently, nested data is only supported for JSON content.
-
-For all [system supported file types](#file-types-supported-for-scanning), if there's nested JSON content in a column, then the scanner parses the nested JSON data and surfaces it within the schema tab of the asset.
-
-Nested data, or nested schema parsing, isn't supported in SQL. A column with nested data will be reported and classified as is, and subdata won't be parsed.
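To illustrate how nested JSON in a column can surface as additional schema entries, here is a minimal sketch (the helper name is hypothetical; this is not Purview's actual parser):

```python
import json

def surface_nested_fields(column_name, cell_value):
    """Flatten nested JSON in a cell into dotted schema paths (sketch)."""
    try:
        data = json.loads(cell_value)
    except (TypeError, ValueError):
        return [column_name]  # not JSON: the column is reported as-is
    paths = []
    def walk(prefix, node):
        if isinstance(node, dict):
            for key, value in node.items():
                walk(f"{prefix}.{key}", value)
        else:
            paths.append(prefix)
    walk(column_name, data)
    return paths

print(surface_nested_fields("address", '{"city": "Oslo", "geo": {"lat": 59.9}}'))
# expected: ['address.city', 'address.geo.lat']
```

For a SQL column, the equivalent of the `except` branch applies: the column is reported as-is and subdata isn't parsed.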
-
-## Sampling data for classification
-
-In Microsoft Purview Data Map terminology,
-- L1 scan: Extracts basic information and metadata like file name, size, and fully qualified name
-- L2 scan: Extracts schema for structured file types and database tables
-- L3 scan: Extracts schema where applicable and subjects the sampled file to system and custom classification rules
-
-For all structured file formats, the Microsoft Purview Data Map scanner samples files in the following way:
-
-- For structured file types, it samples the top 128 rows in each column or the first 1 MB, whichever is lower.
-- For document file formats, it samples the first 20 MB of each file.
- - If a document file is larger than 20 MB, then it isn't subject to a deep scan (subject to classification). In that case, Microsoft Purview captures only basic meta data like file name and fully qualified name.
-- For **tabular data sources (SQL)**, it samples the top 128 rows.
-- For **Azure Cosmos DB for NoSQL**, up to 300 distinct properties from the first 10 documents in a container will be collected for the schema, and for each property, values from up to 128 documents or the first 1 MB will be sampled.
-
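The structured-file rule (top 128 rows or the first 1 MB, whichever comes first) can be sketched as follows. `sample_structured` is an illustrative name, not a Purview API:

```python
MAX_ROWS = 128
MAX_BYTES = 1 * 1024 * 1024  # 1 MB

def sample_structured(rows):
    """Take up to 128 rows, stopping earlier if the 1 MB budget is reached (sketch)."""
    sampled, total_bytes = [], 0
    for row in rows[:MAX_ROWS]:
        encoded = row.encode("utf-8")
        if total_bytes + len(encoded) > MAX_BYTES:
            break  # the byte budget wins when it's lower than the row budget
        sampled.append(row)
        total_bytes += len(encoded)
    return sampled

rows = [f"row-{i}" for i in range(1000)]
print(len(sample_structured(rows)))  # expected: 128
```

The practical consequence: classifications are based on this sample, so values that only appear deep in a large file may never be inspected.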
-## Resource set file sampling
-
-A folder or group of partition files is detected as a *resource set* in the Microsoft Purview Data Map if it matches a system resource set policy or a customer-defined resource set policy. If a resource set is detected, the scanner samples each folder that it contains. Learn more about resource sets [here](concept-resource-sets.md).
-
-File sampling for resource sets by file types:
-
-- **Delimited files (CSV, PSV, SSV, TSV)** - 1 in 100 files are sampled (L3 scan) within a folder or group of partition files that are considered a 'Resource set'
-- **Data Lake file types (Parquet, Avro, Orc)** - 1 in 18446744073709551615 (long max) files are sampled (L3 scan) within a folder or group of partition files that are considered a 'Resource set'
-- **Other structured file types (JSON, XML, TXT)** - 1 in 100 files are sampled (L3 scan) within a folder or group of partition files that are considered a 'Resource set'
-- **SQL objects and Azure Cosmos DB entities** - Each file is L3 scanned.
-- **Document file types** - Each file is L3 scanned. Resource set patterns don't apply to these file types.
-
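The per-type sampling rates above can be illustrated with a small sketch. The helper and rate table below are hypothetical (the real scanner's selection logic isn't public); they only model "1 in N files gets an L3 scan":

```python
SAMPLE_RATE_BY_TYPE = {
    "csv": 100, "psv": 100, "ssv": 100, "tsv": 100,              # delimited: 1 in 100
    "json": 100, "xml": 100, "txt": 100,                         # other structured: 1 in 100
    "parquet": 2**64 - 1, "avro": 2**64 - 1, "orc": 2**64 - 1,   # 1 in long max
}

def files_to_deep_scan(files, extension):
    """Pick every Nth file in a resource set for an L3 scan (sketch)."""
    rate = SAMPLE_RATE_BY_TYPE.get(extension, 1)  # unknown types: scan every file
    return files[::rate]

partition = [f"part-{i:05}.csv" for i in range(250)]
print(len(files_to_deep_scan(partition, "csv")))  # expected: 3
```

In other words, for Data Lake file types effectively a single file per resource set receives the deep scan, while delimited and other structured types get roughly one file per hundred.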
-## Next steps
-
-- [Discover and govern Azure Blob storage source](register-scan-azure-blob-storage-source.md)
-- [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md)
-- [Manage data sources in Microsoft Purview](manage-data-sources.md)
purview Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/overview.md
- Title: Introduction to Microsoft Purview governance solutions
-description: This article is an overview of the solutions that Microsoft Purview provides through the Microsoft Purview governance portal, and describes how they work together to help you manage your on-premises, multicloud, and software-as-a-service data.
-Previously updated: 03/04/2023
-# What's available in the Microsoft Purview governance portal?
-
-Microsoft Purview's solutions in the governance portal provide a unified data governance service that helps you manage your on-premises, multicloud, and software-as-a-service (SaaS) data. The Microsoft Purview governance portal allows you to:
-- Create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage.
-- Enable data curators and security administrators to manage and keep your data estate secure.
-- Empower data consumers to find valuable, trustworthy data.
-
 Diagram showing the high-level architecture of Microsoft Purview: multicloud and on-premises sources flow into Microsoft Purview's Data Map. On top of it, Microsoft Purview's apps (Data Catalog, Data Estate Insights, Data Policy, and Data Sharing) allow data consumers, data curators, and security administrators to view and manage metadata, share data, and protect assets. This metadata is also ported to external analytics services from Microsoft Purview for more processing.
-
->[!TIP]
-> Looking to govern your data in Microsoft 365 by keeping what you need and deleting what you don't? Use [Microsoft Purview Data Lifecycle Management](/microsoft-365/compliance/data-lifecycle-management).
-
-## Data Map
-
-Microsoft Purview automates data discovery by providing data scanning and classification for assets across your data estate. Metadata and descriptions of discovered data assets are integrated into a holistic map of your data estate. Microsoft Purview Data Map provides the foundation for data discovery and data governance. It is a cloud-native PaaS service that captures metadata about enterprise data present in analytics and operational systems, both on-premises and in the cloud. The Data Map is kept up to date automatically by a built-in scanning and classification system. Business users can configure and use the data map through an intuitive UI, and developers can programmatically interact with the Data Map using open-source Apache Atlas 2.2 APIs.
-Microsoft Purview Data Map powers the Microsoft Purview Data Catalog, the Microsoft Purview Data Estate Insights and the Microsoft Purview Data Policy as unified experiences within the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-For more information, see our [introduction to Data Map](concept-elastic-data-map.md).
-
-Atop the Data Map, there are purpose-built apps that create environments for data discovery, access management, and insights about your data landscape.
-
-|App |Description |
-|-|--|
-|[Data Catalog](#data-catalog-app) | Finds trusted data sources by browsing and searching your data assets. The data catalog aligns your assets with friendly business terms and data classification to identify data sources. |
-|[Data Estate Insights](#data-estate-insights-app) | Gives you an overview of your data estate to help you discover what kinds of data you have and where it is. |
-|[Data Sharing](#data-sharing-app) | Allows you to securely share data within your organization or across organizations with business partners and customers. |
-|[Data Policy](#data-policy-app) | A set of central, cloud-based experiences that help you provision access to data securely and at scale. |
-|||
-
-## Data Catalog app
-
-With the Microsoft Purview Data Catalog, business and technical users can quickly and easily find relevant data using a search experience with filters based on lenses such as glossary terms, classifications, sensitivity labels and more. For subject matter experts, data stewards and officers, the Microsoft Purview Data Catalog provides data curation features such as business glossary management and the ability to automate tagging of data assets with glossary terms. Data consumers and producers can also visually trace the lineage of data assets: for example, starting from operational systems on-premises, through movement, transformation & enrichment with various data storage and processing systems in the cloud, to consumption in an analytics system like Power BI.
-For more information, see our [introduction to search using Data Catalog](how-to-search-catalog.md).
-
-## Data Estate Insights app
-
-With Microsoft Purview Data Estate Insights, chief data officers and other governance stakeholders can get a bird's-eye view of their data estate and gain actionable insights into governance gaps that can be resolved from the experience itself.
-
-For more information, see our [introduction to Data Estate Insights](concept-insights.md).
-
-## Data Sharing app
-
-Microsoft Purview Data Sharing enables organizations to securely share data, both within your organization and across organizations with business partners and customers. You can share or receive data with just a few clicks. Data providers can centrally manage and monitor data sharing relationships, and revoke sharing at any time. Data consumers can access received data with their own analytics tools and turn data into insights.
-
-For more information, see our [introduction to Data Sharing](concept-data-share.md).
-
-## Data Policy app
-Microsoft Purview Data Policy is a set of central, cloud-based experiences that help you manage access to data sources and datasets securely and at scale.
-- Manage access to data sources from a single-pane of glass, cloud-based experience
-- Enables at-scale access provisioning
-- Introduces a new data-plane permission model that is external to data sources
-- It is seamlessly integrated with Microsoft Purview Data Map and Catalog:
- - Search for data assets and grant access only to what is required via fine-grained policies.
- - Path to support SaaS, on-premises, and multicloud data sources.
- - Path to create policies that leverage any metadata associated to the data objects.
-- Based on role definitions that are simple and abstracted (for example: Read, Modify)
-
-For more information, see our introductory guides:
-* [Data owner access policies](concept-policies-data-owner.md) (preview): Provision fine-grained to broad access to users and groups via an intuitive authoring experience.
-* [Self-service access policies](concept-self-service-data-access-policy.md) (preview): Workflow approval and automatic provisioning of access requests initiated by business analysts who discover data assets in Microsoft Purview's catalog.
-* [DevOps policies](concept-policies-devops.md): Provision IT operations personnel access to SQL system metadata so that they can monitor performance and health and audit security, while limiting the insider threat.
-
-Here are the benefits of the Data Policy app:
-
-| **Principle** | **Benefit** |
-|-|-|
-|*Simplify* |Permissions are bundled into role definitions that are abstracted and consistent across data source types, like Read and Modify.|
-| |Reduce the need of permission expertise for each data source type.|
-|||
-|*Reduce effort* |Graphical interface lets you navigate the data object hierarchy quickly.|
-| |Supports policies on entire Azure resource groups and subscriptions.|
-|||
-|*Enhance security*|Access is granted centrally and can be easily reviewed and revoked.|
-| |Reduces the need for privileged accounts to configure access directly at the data source.|
-| |Supports the Principle of Least Privilege via data resource scopes and common role definitions.|
-|||
-
-## Traditional challenges that Microsoft Purview seeks to address
-
-### Challenges for data consumers
-
-Traditionally, discovering enterprise data sources has been an organic process based on communal knowledge. For companies that want the most value from their information assets, this approach presents many challenges:
-
-* Because there's no central location to register data sources, users might be unaware of a data source unless they come into contact with it as part of another process.
-* Unless users know the location of a data source, they can't connect to the data by using a client application. Data-consumption experiences require users to know the connection string or path.
-* The intended use of the data is hidden to users unless they know the location of a data source's documentation. Data sources and documentation might live in several places and be consumed through different kinds of experiences.
-* If users have questions about an information asset, they must locate the expert or team responsible for that data and engage them offline. There's no explicit connection between the data and the experts who understand the data's context.
-* Unless users understand the process for requesting access to the data source, discovering the data source and its documentation won't help them access the data.
-
-### Challenges for data producers
-
-Although data consumers face the previously mentioned challenges, users who are responsible for producing and maintaining information assets face challenges of their own:
-
-* Annotating data sources with descriptive metadata is often a lost effort. Client applications typically ignore descriptions that are stored in the data source.
-* Creating documentation for data sources can be difficult and it's an ongoing responsibility to keep documentation in sync with data sources. Users might not trust documentation that's perceived as being out of date.
-* Creating and maintaining documentation for data sources is complex and time-consuming. Making that documentation readily available to everyone who uses the data source can be even more so.
-* Restricting access to data sources and ensuring that data consumers know how to request access is an ongoing challenge.
-
-When such challenges are combined, they present a significant barrier for companies that want to encourage and promote the use and understanding of enterprise data.
-
-### Challenges for security administrators
-
-Users who are responsible for ensuring the security of their organization's data may have any of the challenges listed above as data consumers and producers, and the following extra challenges:
-
-* An organization's data is constantly growing and being stored and shared in new directions. The task of discovering, protecting, and governing your sensitive data is one that never ends. You need to ensure that your organization's content is being shared with the correct people, applications, and with the correct permissions.
-* Understanding the risk levels in your organization's data requires diving deep into your content, looking for keywords, RegEx patterns, and sensitive data types. For example, sensitive data types might include credit card numbers, social security numbers, or bank account numbers. You must constantly monitor all data sources for sensitive content, as even the smallest amount of data loss can be critical to your organization.
-* Ensuring that your organization continues to comply with corporate security policies is a challenging task as your content grows and changes, and as those requirements and policies are updated for changing digital realities. Security administrators need to ensure data security in the quickest time possible.
-
-## Microsoft Purview advantages
-
-Microsoft Purview is designed to address the issues mentioned in the previous sections and to help enterprises get the most value from their existing information assets. The catalog makes data sources easily discoverable and understandable by the users who manage the data.
-
-Microsoft Purview provides a cloud-based service into which you can register data sources. During registration, the data remains in its existing location, but a copy of its metadata is added to Microsoft Purview, along with a reference to the data source location. The metadata is also indexed to make each data source easily discoverable via search and understandable to the users who discover it.
-
-After you register a data source, you can then enrich its metadata. Either the user who registered the data source or another user in the enterprise can add more metadata. Any user can annotate a data source by providing descriptions, tags, or other metadata for requesting data source access. This descriptive metadata supplements the structural metadata, such as column names and data types that are registered from the data source.
-
-Discovering and understanding data sources and their use is the primary purpose of registering the sources. Enterprise users might need data for business intelligence, application development, data science, or any other task where the correct data is required. They can use the data catalog discovery experience to quickly find data that matches their needs, understand the data to evaluate its fitness for purpose, and consume the data by opening the data source in their tool of choice.
-
-At the same time, users can contribute to the catalog by tagging, documenting, and annotating data sources that have already been registered. They can also register new data sources, which are then discovered, understood, and consumed by the community of catalog users.
-
-Lastly, Microsoft Purview Data Policy app provides a superior solution to keep your data secure.
-
-## In-region data residency
-
-Microsoft Purview processes data and stores metadata information, but does not store customer data. Data is processed in its data region, and customer metadata stays within the region where Microsoft Purview is deployed. For regions with data residency requirements, customer data stays within its region, and customer metadata is always kept within the same region where Microsoft Purview is deployed.
-
-## Next steps
-
->[!TIP]
-> Check if Microsoft Purview is available in your region on the [regional availability page](https://azure.microsoft.com/explore/global-infrastructure/products-by-region/).
-
-To get started with Microsoft Purview, see [Create a Microsoft Purview account](create-catalog-portal.md).
purview Quickstart ARM Create Microsoft Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-ARM-create-microsoft-purview.md
- Title: 'Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using an ARM Template'
-description: This Quickstart describes how to create a Microsoft Purview (formerly Azure Purview) account using an ARM Template.
-Previously updated: 05/18/2023
-# Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using an ARM template
-
-This quickstart describes the steps to deploy a Microsoft Purview (formerly Azure Purview) account using an Azure Resource Manager (ARM) template.
-
-After you've created the account, you can begin registering your data sources and using the Microsoft Purview governance portal to understand and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end data lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
-
-To deploy a Microsoft Purview account to your subscription using an ARM template, follow the guide below.
--
-## Deploy a custom template
-
-If your environment meets the prerequisites and you're familiar with using ARM templates, select the **Deploy to Azure** button. The template will open in the Azure portal where you can customize values and deploy.
-The template will deploy a Microsoft Purview account into a new or existing resource group in your subscription.
-
-[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.azurepurview%2Fazure-purview-deployment%2Fazuredeploy.json)
--
-## Review the template
-
-The template used in this quickstart is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/azure-purview-deployment/).
-
-
-The following resources are defined in the template:
-
-* [Microsoft.Purview/accounts](/azure/templates/microsoft.purview/accounts?pivots=deployment-language-arm-template)
-
-The template performs the following tasks:
-
-* Creates a Microsoft Purview account in a specified resource group.
-
-## Customize network settings for your account
-
-When you're deploying your ARM template, you can also use the following settings in the template to manage your public network access settings:
-
-- **Enabled for all networks**
- `"publicNetworkAccess": "Enabled",
- "managedResourcesPublicNetworkAccess": "Enabled"`
-- **Enabled for the account, disabled for managed resources**
- `"publicNetworkAccess": "Enabled",
- "managedResourcesPublicNetworkAccess": "Disabled"`
-- **Disabled for all networks**
- `"publicNetworkAccess": "Disabled",
- "managedResourcesPublicNetworkAccess": "Disabled"`
-
-For example:
-```json
-"resources": [
-    {
-        "type": "Microsoft.Purview/accounts",
-        "apiVersion": "2021-12-01",
-        "name": "[parameters('purviewName')]",
-        "location": "[parameters('location')]",
-        "sku": {
-            "name": "Standard",
-            "capacity": 1
-        },
-        "identity": {
-            "type": "SystemAssigned"
-        },
-        "properties": {
-            "publicNetworkAccess": "Enabled",
-            "managedResourcesPublicNetworkAccess": "Enabled",
-            "managedResourceGroupName": "[format('managed-rg-{0}', parameters('purviewName'))]"
-        }
-    }
-]
-```
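If you generate or post-process templates in a build script, toggling these two properties can be sketched like this. The `set_network_access` helper below is hypothetical (not part of any Azure SDK); it simply rewrites the properties on Purview account resources in an in-memory template:

```python
def set_network_access(template: dict, public: str, managed: str) -> dict:
    """Toggle network-access properties on Purview account resources in an ARM template (sketch)."""
    allowed = {"Enabled", "Disabled"}
    if public not in allowed or managed not in allowed:
        raise ValueError("values must be 'Enabled' or 'Disabled'")
    for resource in template.get("resources", []):
        if resource.get("type") == "Microsoft.Purview/accounts":
            props = resource.setdefault("properties", {})
            props["publicNetworkAccess"] = public
            props["managedResourcesPublicNetworkAccess"] = managed
    return template

template = {"resources": [{"type": "Microsoft.Purview/accounts", "properties": {}}]}
locked = set_network_access(template, "Disabled", "Disabled")
print(locked["resources"][0]["properties"])  # both settings now "Disabled"
```

This kind of check keeps the two settings consistent before you submit the template for deployment.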
-
-## Open Microsoft Purview governance portal
-
-After your Microsoft Purview account is created, you'll use the Microsoft Purview governance portal to access and manage it. There are two ways to open the Microsoft Purview governance portal:
-
-* Open your Microsoft Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Microsoft Purview governance portal" tile on the overview page.
- :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Microsoft Purview account overview page, with the Microsoft Purview governance portal tile highlighted.":::
-
-* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Microsoft Purview account, and sign in to your workspace.
-
-## Get started with your Purview resource
-
-After deployment, the first activities are usually:
-
-* [Create a collection](quickstart-create-collection.md)
-* [Register a resource](azure-purview-connector-overview.md)
-* [Scan the resource](concept-scans-and-ingestion.md)
-
-At this time, these actions can't be performed through an Azure Resource Manager template. Follow the guides above to get started!
-
-## Clean up resources
-
-To clean up the resources deployed in this quickstart, delete the resource group, which deletes all resources in the group.
-You can delete the resources either through the Azure portal or by using the PowerShell script below.
-
-```azurepowershell-interactive
-$resourceGroupName = Read-Host -Prompt "Enter the resource group name"
-Remove-AzResourceGroup -Name $resourceGroupName
-Write-Host "Press [ENTER] to continue..."
-```
-
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account and how to access the Microsoft Purview governance portal.
-
-Next, you can create a user-assigned managed identity (UAMI) that will enable your new Microsoft Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
-
-To create a UAMI, follow our [guide to create a user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to Microsoft Purview:
-
-> [!div class="nextstepaction"]
-> [Using the Microsoft Purview governance portal](use-azure-purview-studio.md)
-> [Create a collection](quickstart-create-collection.md)
-> [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Quickstart Bicep Create Microsoft Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-bicep-create-microsoft-purview.md
- Title: 'Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using a Bicep file'
-description: This Quickstart describes how to create a Microsoft Purview (formerly Azure Purview) account using a Bicep file.
-Previously updated: 09/12/2022
-# Quickstart: Create a Microsoft Purview (formerly Azure Purview) account using a Bicep file
-
-This quickstart describes the steps to deploy a Microsoft Purview (formerly Azure Purview) account using a Bicep file.
-
-After you've created the account, you can begin registering your data sources, and using the Microsoft Purview governance portal to understand and govern your data landscape. By connecting to data across your on-premises, multicloud, and software-as-a-service (SaaS) sources, the Microsoft Purview Data Map creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end data lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-
-For more information about the governance capabilities of Microsoft Purview, formerly Azure Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
-
-To deploy a Microsoft Purview account to your subscription using a Bicep file, follow the guide below.
--
-## Review the Bicep file
-
-The Bicep file used in this quickstart is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/azure-purview-deployment/).
-
-
-The following resources are defined in the Bicep file:
-
-* [Microsoft.Purview/accounts](/azure/templates/microsoft.purview/accounts?pivots=deployment-language-bicep)
-
-The Bicep file performs the following tasks:
-
-* Creates a Microsoft Purview account in a specified resource group.
-
-## Deploy the Bicep file
-
-1. Save the Bicep file as **main.bicep** to your local computer.
-1. Deploy the Bicep file using either Azure CLI or Azure PowerShell.
-
- # [CLI](#tab/CLI)
-
- ```azurecli
- az group create --name exampleRG --location eastus
- az deployment group create --resource-group exampleRG --template-file main.bicep
- ```
-
- # [PowerShell](#tab/PowerShell)
-
- ```azurepowershell
- New-AzResourceGroup -Name exampleRG -Location eastus
- New-AzResourceGroupDeployment -ResourceGroupName exampleRG -TemplateFile ./main.bicep
- ```
-
-
-
-You'll be prompted to enter the following values:
-
-* **Purview name**: enter a name for the Microsoft Purview account.
-
-When the deployment finishes, you should see a message indicating the deployment succeeded.
-
-## Open Microsoft Purview governance portal
-
-After your Microsoft Purview account is created, you'll use the Microsoft Purview governance portal to access and manage it. There are two ways to open the Microsoft Purview governance portal:
-
-* Open your Microsoft Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Microsoft Purview governance portal" tile on the overview page.
- :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Microsoft Purview account overview page, with the Microsoft Purview governance portal tile highlighted.":::
-
-* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Microsoft Purview account, and sign in to your workspace.
-
-## Get started with your Purview resource
-
-After deployment, the first activities are usually:
-
-* [Create a collection](quickstart-create-collection.md)
-* [Register a resource](azure-purview-connector-overview.md)
-* [Scan the resource](concept-scans-and-ingestion.md)
-
-At this time, these actions can't be performed through a Bicep file. Follow the guides above to get started!
-
-## Clean up resources
-
-When you no longer need them, use the Azure portal, Azure CLI, or Azure PowerShell to remove the resource group, firewall, and all related resources.
-
-# [CLI](#tab/CLI)
-
-```azurecli-interactive
-az group delete --name exampleRG
-```
-
-# [PowerShell](#tab/PowerShell)
-
-```azurepowershell-interactive
-Remove-AzResourceGroup -Name exampleRG
-```
---
-## Next steps
-
-In this quickstart, you learned how to create a Microsoft Purview (formerly Azure Purview) account and how to access the Microsoft Purview governance portal.
-
-Next, you can create a user-assigned managed identity (UAMI) that will enable your new Microsoft Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
-
-To create a UAMI, follow our [guide to create a user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-Follow these next articles to learn how to navigate the Microsoft Purview governance portal, create a collection, and grant access to Microsoft Purview:
-
-> [!div class="nextstepaction"]
-> [Using the Microsoft Purview governance portal](use-azure-purview-studio.md)
-> [Create a collection](quickstart-create-collection.md)
-> [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Quickstart Create Collection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-create-collection.md
- Title: 'Quickstart: Create a collection'
-description: Collections are used for access control, and asset organization in the Microsoft Purview Data Map. This article describes how to create a collection and add permissions, register sources, and register assets to collections.
-
- Previously updated: 06/17/2022
-
-# Quickstart: Create a collection and assign permissions in the Microsoft Purview Data Map
-
-Collections are the Microsoft Purview Data Map's tool to manage ownership and access control across assets, sources, and information. They also organize your sources and assets into categories that are customized to match your management experience with your data. This guide will take you through setting up your first collection and collection admin to prepare your Microsoft Purview environment for your organization.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* Your own [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-## Check permissions
-
-In order to create and manage collections in the Microsoft Purview Data Map, you'll need to be a **Collection Admin** within the Microsoft Purview governance portal. You can check these permissions in the [governance portal](use-azure-purview-studio.md), which you can find by:
-
-* Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-* Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account you want to use, and then opening [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-1. Select **Data Map** > **Collections** from the left pane to open the collection management page.
-
- :::image type="content" source="./media/quickstart-create-collection/find-collections.png" alt-text="Screenshot of the Microsoft Purview governance portal opened to the Data Map, with the Collections tab selected." border="true":::
-
-1. Select your root collection. This is the top collection in your collection list and will have the same name as your Microsoft Purview account. In our example below, it's called ContosoPurview.
-
- :::image type="content" source="./media/quickstart-create-collection/select-root-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the root collection highlighted." border="true":::
-
-1. Select the **Role assignments** tab in the collection window.
-
- :::image type="content" source="./media/quickstart-create-collection/role-assignments.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-
-1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
-
- :::image type="content" source="./media/quickstart-create-collection/collection-admins.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the collection admin section highlighted." border="true":::
-
-## Create a collection in the portal
-
-To create your collection, we'll start in the [Microsoft Purview governance portal](use-azure-purview-studio.md). You can find the portal by:
-
-* Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-* Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select **Data Map** > **Collections** from the left pane to open the collection management page.
-
- :::image type="content" source="./media/quickstart-create-collection/find-collections.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the Collections tab selected." border="true":::
-
-1. Select **+ Add a collection**.
-
- :::image type="content" source="./media/quickstart-create-collection/select-add-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the Collections tab selected and Add a Collection highlighted." border="true":::
-
-1. In the right panel, enter the collection name, description, and search for users to add them as collection admins.
-
- :::image type="content" source="./media/quickstart-create-collection/create-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
-
-1. Select **Create**. The new collection's information appears on the page.
-
- :::image type="content" source="./media/quickstart-create-collection/created-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, showing the newly created collection window." border="true":::
-
-## Assign permissions to collection
-
-Now that you have a collection, you can assign permissions to it to manage your users' access to the Microsoft Purview governance portal.
-
-### Roles
-
-All assigned roles apply to sources, assets, and other objects within the collection where the role is applied.
-
-* **Collection admins** - can edit a collection, its details, manage access in the collection, and add subcollections.
-* **Data source admins** - can manage data sources and data scans.
-* **Data curators** - can perform create, read, modify, and delete actions on catalog data objects.
-* **Data readers** - can access but not modify catalog data objects.
-
-### Assign permissions
-
-1. Select the **Role assignments** tab to see all the roles in a collection.
-
- :::image type="content" source="./media/quickstart-create-collection/select-role-assignments.png" alt-text="Screenshot of the Microsoft Purview governance portal collection window, with the role assignments tab highlighted." border="true":::
-
-1. Select **Edit role assignments** or the person icon to edit each role member.
-
- :::image type="content" source="./media/quickstart-create-collection/edit-role-assignments.png" alt-text="Screenshot of the Microsoft Purview governance portal collection window, with the edit role assignments dropdown list selected." border="true":::
-
-1. Type in the textbox to search for users you want to add as role members. Select **OK** to save the change.
-
-## Next steps
-
-Now that you have a collection, you can follow the guides below to add resources, scan, and manage your collections.
-
-* [Register source to collection](how-to-create-and-manage-collections.md#register-source-to-a-collection)
-* [Access management through collections](how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections)
purview Quickstart Data Share Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-data-share-dotnet.md
- Title: 'Quickstart: Share data using the .NET SDK'
-description: This article guides you through sharing and receiving data using Microsoft Purview Data Sharing account through the .NET SDK.
-
- Previously updated: 02/16/2023
-
-# Quickstart: Share and receive data with the Microsoft Purview Data Sharing .NET SDK
--
-In this quickstart, you'll use the .NET SDK to share data and receive shares from Azure Data Lake Storage (ADLS Gen2) or Blob storage accounts. The article includes code snippets that allow you to create, accept, and manage shares using Microsoft Purview Data Sharing.
-
-For an overview of how data sharing works, watch this short [demo](https://aka.ms/purview-data-share/overview-demo).
-
->[!NOTE]
->This feature has been updated in February 2023, and the SDK and permissions needed to view and manage data shares in Microsoft Purview have changed.
->
>- Permissions are no longer required in Microsoft Purview to use the SDK to create and manage shares. (Reader permissions are needed to use the Microsoft Purview governance portal for sharing data.)
->- Permissions are still required on storage accounts.
->
-> See the updated [NuGet package](#install-nuget-packages) and [updated code snippets](#create-a-sent-share) to use the updated SDK.
--
-### Visual Studio
-
-The walkthrough in this article uses Visual Studio 2022. The procedures for Visual Studio 2013, 2015, 2017, or 2019 may differ slightly.
-
-### Azure .NET SDK
-
-Download and install [Azure .NET SDK](https://azure.microsoft.com/downloads/) on your machine.
-
-## Use a service principal
-
-In the code snippets in this tutorial, you can authenticate either using your own credentials or using a service principal.
-To set up a service principal, follow these instructions:
-
-1. In [Create an Azure Active Directory application](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal), create an application that represents the .NET application you're creating in this tutorial. For the sign-on URL, you can provide a dummy URL as shown in the article (`https://contoso.org/exampleapp`).
-1. In [Get values for signing in](../active-directory/develop/howto-create-service-principal-portal.md#sign-in-to-the-application), get the **application ID**, **tenant ID**, and **object ID**, and note down these values to use later in this tutorial.
-1. In [Certificates and secrets](../active-directory/develop/howto-create-service-principal-portal.md#set-up-authentication), get the **authentication key**, and note down this value to use later in this tutorial.
-1. [Assign the application to these roles](../active-directory/develop/howto-create-service-principal-portal.md#assign-a-role-to-the-application):
-
- |User| Azure Storage Account Roles | Microsoft Purview Collection Roles |
- |: |: |: |
- | **Data Provider** |Owner OR Blob Storage Data Owner|Data Share Contributor|
- | **Data Consumer** |Contributor OR Owner OR Storage Blob Data Contributor OR Blob Storage Data Owner|Data Share Contributor|
-
-## Create a Visual Studio project
-
-Next, create a C# .NET console application in Visual Studio:
-
-1. Launch **Visual Studio**.
-2. In the Start window, select **Create a new project** > **Console App**. .NET version 6.0 or above is required.
-3. In **Project name**, enter **PurviewDataSharingQuickStart**.
-4. Select **Create** to create the project.
-
-## Install NuGet packages
-
-1. Select **Tools** > **NuGet Package Manager** > **Package Manager Console**.
-1. In the **Package Manager Console**, run the .NET CLI add package command shown on this page to add the NuGet package: [Azure.Analytics.Purview.Sharing NuGet package](https://www.nuget.org/packages/Azure.Analytics.Purview.Sharing).
-1. In the **Package Manager Console** pane, run the following commands to install packages.
-
- ```powershell
- Install-Package Azure.Analytics.Purview.Sharing -IncludePrerelease
- Install-Package Azure.Identity
- ```
-
- >[!TIP]
    >If you get an error that reads "Could not find any project in..." when attempting these commands, you may need to move down a folder level in your project. Run `dir` to list the folders in your directory, then use `cd <name of the project folder>` to move into your project folder. Then try again.
-
-## Create a sent share
-
-This script creates a data share that you can send to internal or external users.
-To use it, be sure to fill out these variables:
-
-- **SenderTenantId** - the [Azure Tenant ID](/partner-center/find-ids-and-domain-names#find-the-microsoft-azure-ad-tenant-id-and-primary-domain-name) for the sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data will be sent from.
-- **ShareName** - a display name for your sent share.
-- **ShareDescription** - (optional) a description for your sent share.
-- **SenderStorageKind** - either BlobAccount or AdlsGen2Account.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **SenderStorageContainer** - the name of the container where the data to be shared is stored.
-- **SenderPathToShare** - the file/folder path to the data to be shared.
-- **UseServiceTokenCredentials** - (optional) if you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) if using a service principal to create the shares, this is the application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) if using a service principal to create the shares, add your client secret/authentication key.
-- **SentShareId** - (optional) this value must be a GUID; the current value generates one for you, but you can replace it with a different value if you would like.
-- **ReceiverVisiblePath** - (optional) the name for the share that the receiver will see. Currently set to a GUID, but a GUID isn't required.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.ComponentModel;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string SenderTenantId = "<Sender Identity's Tenant ID>";
- private static string SenderPurviewAccountName = "<Sender Purview Account Name>";
-
- private static string ShareName = "<Share Display Name>";
- private static string ShareDescription = "Share created using the SDK.";
- private static string SenderStorageKind = "<Sender Storage Account Kind (BlobAccount / AdlsGen2Account)>";
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
- private static string SenderStorageContainer = "<Share Data Container Name>";
- private static string SenderPathToShare = "<File/Folder Path To Share>";
-
- // Set if using Service principal to create shares
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // [OPTIONAL INPUTS] Override Value If Desired.
- private static string SentShareId = Guid.NewGuid().ToString();
- private static string ReceiverVisiblePath = Guid.NewGuid().ToString();
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- /// Replace all placeholder inputs above with actual values before running this program.
-            /// This updated Share experience API creates shares based on the caller's RBAC role on the storage account.
-            /// To view/manage shares via the UX in the Purview governance portal, storage accounts need to be registered (a one-time action) in the Purview account with DSA permissions.
-
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: CreateShare - START");
- await Sender_CreateSentShare();
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: CreateShare - FINISH");
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- private static async Task<BinaryData> Sender_CreateSentShare()
- {
-
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- if (sentSharesClient == null)
- {
- throw new InvalidEnumArgumentException("Invalid Sent Shares Client.");
- }
-
- // Create sent share
- var inPlaceSentShareDto = new
- {
- shareKind = "InPlace",
- properties = new
- {
- displayName = ShareName,
- description = ShareDescription,
- artifact = new
- {
- storeKind = SenderStorageKind,
- storeReference = new
- {
- referenceName = SenderStorageResourceId,
- type = "ArmResourceReference"
- },
- properties = new
- {
- paths = new[]
- {
- new
- {
- receiverPath = ReceiverVisiblePath,
- containerName = SenderStorageContainer,
- senderPath = SenderPathToShare
- }
- }
- }
- }
- },
- };
-
- Operation<BinaryData> sentShare = await sentSharesClient.CreateOrReplaceSentShareAsync(WaitUntil.Completed, SentShareId, RequestContent.Create(inPlaceSentShareDto));
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine(sentShare.Value);
-        Console.ResetColor();
- return sentShare.Value;
- }
-}
-```
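For reference, the anonymous `inPlaceSentShareDto` object in the script above serializes to a JSON body shaped roughly as follows. This is a sketch in Python with placeholder values only (it isn't part of the SDK), which can be handy for checking the structure of the request body:

```python
import json

# Placeholder values standing in for the C# variables above (not real inputs).
in_place_sent_share = {
    "shareKind": "InPlace",
    "properties": {
        "displayName": "<Share Display Name>",
        "description": "Share created using the SDK.",
        "artifact": {
            "storeKind": "AdlsGen2Account",  # or "BlobAccount"
            "storeReference": {
                "referenceName": "<Sender Storage Account Resource Id>",
                "type": "ArmResourceReference",
            },
            "properties": {
                "paths": [
                    {
                        "receiverPath": "<Receiver Visible Path>",
                        "containerName": "<Share Data Container Name>",
                        "senderPath": "<File/Folder Path To Share>",
                    }
                ]
            },
        },
    },
}

# Serialize the payload the same way RequestContent.Create serializes the anonymous object.
body = json.dumps(in_place_sent_share)
```

The `storeReference.type` value is always `ArmResourceReference`, and `paths` is a list, so a single share can expose multiple files or folders.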
-
-## Send invitation to a user
-
-This script sends an email invitation for a share to a user. If you want to send an invitation to a service principal, [see the next code example](#send-invitation-to-a-service).
-To use it, be sure to fill out these variables:
-
-- **RecipientUserEmailId** - the email address of the user to send the invitation to.
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data will be sent from.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **SentShareDisplayName** - the name of the sent share you're sending an invitation for.
-- **UseServiceTokenCredentials** - (optional) if you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) if using a service principal to create the shares, this is the application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) if using a service principal to create the shares, add your client secret/authentication key.
-- **InvitationId** - (optional) this value must be a GUID; the current value generates one for you, but you can replace it with a different value if you would like.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.ComponentModel;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string RecipientUserEmailId = "<Target User's Email Id>";
-
-    private static string SenderTenantId = "<Sender Identity's Tenant ID>";
- private static string SenderPurviewAccountName = "<Sender Purview Account Name>";
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
-
- // Set if using Service principal to send invitation
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- private static string SentShareDisplayName = "<Name of share you're sending an invite for.>";
- private static string InvitationId = Guid.NewGuid().ToString();
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
- private static int StepCounter = 0;
-
- private static async Task Main(string[] args)
- {
- try
- {
-
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: SendtoUser - START");
- await Sender_CreateUserRecipient();
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: SendtoUser - FINISH");
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-
- private static async Task<BinaryData> Sender_CreateUserRecipient()
- {
-
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- if (string.IsNullOrEmpty(RecipientUserEmailId))
- {
- throw new InvalidEnumArgumentException("Invalid Recipient User Email Id.");
- }
-
- // Create user recipient and invite
- var invitationData = new
- {
- invitationKind = "User",
- properties = new
- {
- expirationDate = DateTime.Now.AddDays(7).ToString(),
- notify = true, // Send invitation email
- targetEmail = RecipientUserEmailId
- }
- };
-
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
-
- Console.ForegroundColor = ConsoleColor.Yellow;
- Console.WriteLine("{0}. {1}...", ++StepCounter, "Get a Specific Sent Share");
-        Console.ResetColor();
-
- var mySentShare = allSentShares.First(sentShareDoc =>
- {
- var doc = JsonDocument.Parse(sentShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == SentShareDisplayName;
- });
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("My Sent Share Id: " + JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString());
-        Console.ResetColor();
-
- var SentShareId = JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString();
-
- var sentInvitation = await sentSharesClient.CreateSentShareInvitationAsync(SentShareId, InvitationId, RequestContent.Create(invitationData));
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine(sentInvitation.Content);
-        Console.ResetColor();
-
- return sentInvitation.Content;
- }
-}
-```
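The LINQ `First` call above selects the sent share whose `properties.displayName` matches `SentShareDisplayName`. The same selection logic, sketched in Python against hypothetical documents shaped like those returned by `GetAllSentSharesAsync` (illustrative data only):

```python
import json

# Hypothetical share documents, for illustration only.
all_sent_shares = [
    '{"id": "aaa-111", "properties": {"displayName": "FinanceShare"}}',
    '{"id": "bbb-222", "properties": {"displayName": "MarketingShare"}}',
]

target_name = "MarketingShare"

# Mirrors: allSentShares.First(doc => doc.properties.displayName == SentShareDisplayName)
my_sent_share = next(
    doc for doc in map(json.loads, all_sent_shares)
    if doc["properties"]["displayName"] == target_name
)
sent_share_id = my_sent_share["id"]
print(sent_share_id)  # → bbb-222
```

Like `First` in C#, `next` on an exhausted generator raises if no document matches, so a nonexistent display name fails fast rather than silently inviting against the wrong share.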
-
-## Send invitation to a service
-
-This script sends an email invitation for a share to a service principal. If you want to send an invitation to a user, [see the previous sample](#send-invitation-to-a-user).
-To use it, be sure to fill out these variables:
-
-- **RecipientApplicationTenantId** - the Azure Tenant ID for the receiving service principal.
-- **RecipientApplicationObjectId** - the object ID for the receiving service principal.
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data will be sent from.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **SentShareDisplayName** - the name of the sent share you're sending an invitation for.
-- **UseServiceTokenCredentials** - (optional) if you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) if using a service principal to create the shares, this is the application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) if using a service principal to create the shares, add your client secret/authentication key.
-- **InvitationId** - (optional) this value must be a GUID; the current value generates one for you, but you can replace it with a different value if you would like.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.ComponentModel;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string RecipientApplicationTenantId = "<Target Application's Tenant Id>";
- private static string RecipientApplicationObjectId = "<Target Application's Object Id>";
-
- private static string SentShareDisplayName = "<Name of share you're sending an invite for.>";
- private static string InvitationId = Guid.NewGuid().ToString();
-
-    private static string SenderTenantId = "<Sender Identity's Tenant ID>";
- private static string SenderPurviewAccountName = "<Sender Purview Account Name>";
-
- private static string SenderStorageResourceId = "<Resource ID for storage account that has been shared>";
-
- // Set if using Service principal to send invitation
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
-
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: SendtoService - START");
- await Sender_CreateServiceRecipient();
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: SendtoService - FINISH");
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-
- private static async Task<BinaryData> Sender_CreateServiceRecipient()
- {
-
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- if (!Guid.TryParse(RecipientApplicationTenantId, out Guid _))
- {
- throw new InvalidEnumArgumentException("Invalid Recipient Service Tenant Id.");
- }
-
- if (!Guid.TryParse(RecipientApplicationObjectId, out Guid _))
- {
- throw new InvalidEnumArgumentException("Invalid Recipient Service Object Id.");
- }
-
- // Create service recipient
- var invitationData = new
- {
- invitationKind = "Service",
- properties = new
- {
- expirationDate = DateTime.Now.AddDays(5).ToString(),
- targetActiveDirectoryId = RecipientApplicationTenantId,
- targetObjectId = RecipientApplicationObjectId
- }
- };
--
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
- var mySentShare = allSentShares.First(sentShareDoc =>
- {
- var doc = JsonDocument.Parse(sentShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == SentShareDisplayName;
- });
-
- var SentShareId = JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString();
-
- var sentInvitation = await sentSharesClient.CreateSentShareInvitationAsync(SentShareId, InvitationId, RequestContent.Create(invitationData));
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine(sentInvitation.Content);
-        Console.ResetColor();
- return sentInvitation.Content;
- }
-
-}
-```
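The service invitation body differs from the user invitation only in its `invitationKind` and targeting fields: a user invitation carries `targetEmail` (plus `notify`), while a service invitation carries `targetActiveDirectoryId` and `targetObjectId`. A Python sketch of the two shapes side by side, with placeholder values, for illustration:

```python
# Invitation sent to a user identity (matches the "Send invitation to a user" script).
user_invitation = {
    "invitationKind": "User",
    "properties": {
        "expirationDate": "2023-03-01",
        "notify": True,  # send the invitation email
        "targetEmail": "user@contoso.com",
    },
}

# Invitation sent to a service principal (matches the script above).
service_invitation = {
    "invitationKind": "Service",
    "properties": {
        "expirationDate": "2023-03-01",
        "targetActiveDirectoryId": "<tenant GUID>",
        "targetObjectId": "<service principal object GUID>",
    },
}
```

Note that service invitations have no `notify` field; no email is sent, so the receiving application must be told the invitation exists out of band.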
-
-## List sent shares
-
-This script lists all the sent shares for a specific storage resource.
-To use it, be sure to fill out these variables:
-
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data was sent from.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where shares have been sent from.
-- **UseServiceTokenCredentials** - (optional) if you want to use a service principal to list the shares, set this to **true**.
-- **SenderClientId** - (optional) if using a service principal, this is the application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) if using a service principal, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string SenderTenantId = "<Sender Tenant ID>";
- private static string SenderPurviewAccountName = "<Name of the Microsoft Purview account>";
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
-
- // Set if using Service principal to list shares
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
--
- try
- {
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
- Console.WriteLine(allSentShares);
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
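Each of these samples repeats the same `ToResultList` helper to drain the SDK's `AsyncPageable<T>` into a list. The pattern in isolation is just materializing an async sequence into memory; the sketch below substitutes a plain `IAsyncEnumerable<int>` for `AsyncPageable<T>`, and the `Numbers` sequence is an illustrative stand-in, not part of the Purview SDK:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// A small async sequence standing in for AsyncPageable<T>.
static async IAsyncEnumerable<int> Numbers()
{
    for (int i = 1; i <= 3; i++)
    {
        await Task.Yield();   // simulate asynchronous page retrieval
        yield return i;
    }
}

// Same shape as the ToResultList helper in the scripts, minus the
// AsyncPageable-specific receiver: drain the sequence into a list.
static async Task<List<T>> ToResultList<T>(IAsyncEnumerable<T> source)
{
    List<T> list = new List<T>();
    await foreach (T item in source)
    {
        list.Add(item);
    }
    return list;
}

List<int> result = await ToResultList(Numbers());
Console.WriteLine(string.Join(",", result)); // prints: 1,2,3
```

If the `System.Linq.Async` package is available, `ToListAsync()` gives the same result, but the hand-rolled helper keeps the samples dependency-free.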
-
-## List all share recipients
-
-This script lists all recipients for a specific share.
-To use it, be sure to fill out these variables:
-
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data was sent from.
-- **SentShareDisplayName** - the name of the sent share you're listing recipients for.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) If using a service principal to create the shares, this is the Application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) If using a service principal to create the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string SentShareDisplayName = "<Name of share you're listing recipients for.>";
- private static string SenderTenantId = "<Sender Tenant ID>";
- private static string SenderPurviewAccountName = "<Name of the Microsoft Purview account>";
-
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
-
- // Set if using Service principal to list recipients
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
- var mySentShare = allSentShares.First(sentShareDoc =>
- {
- var doc = JsonDocument.Parse(sentShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == SentShareDisplayName;
- });
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("My Sent Share Id: " + JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString());
-            Console.ResetColor();
-
- var SentShareId = JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString();
-
- var allRecipients = await sentSharesClient.GetAllSentShareInvitationsAsync(SentShareId).ToResultList();
- Console.WriteLine(allRecipients);
-
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
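The scripts above locate a share by parsing the raw JSON documents the service returns and matching on `properties.displayName`. That filtering step, isolated from the service calls, looks like this; the JSON documents below are made-up stand-ins shaped like the fields the scripts read, not real API responses:

```csharp
using System;
using System.Linq;
using System.Text.Json;

// Illustrative documents with the same fields the scripts read
// (top-level "id", nested "properties.displayName"); values are invented.
string[] sentShareDocs = new[]
{
    "{\"id\":\"aaaa-1111\",\"properties\":{\"displayName\":\"MarketingShare\"}}",
    "{\"id\":\"bbbb-2222\",\"properties\":{\"displayName\":\"FinanceShare\"}}"
};

string targetDisplayName = "FinanceShare";

// Same pattern as the scripts: parse each document, match on
// properties.displayName, then read the top-level id of the match.
string matchingDoc = sentShareDocs.First(doc =>
    JsonDocument.Parse(doc).RootElement
        .GetProperty("properties")
        .GetProperty("displayName")
        .ToString() == targetDisplayName);

string shareId = JsonDocument.Parse(matchingDoc).RootElement.GetProperty("id").ToString();
Console.WriteLine(shareId); // prints: bbbb-2222
```

In the real scripts the items are `BinaryData` from `GetAllSentSharesAsync`, which `JsonDocument.Parse` accepts the same way.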
-
-## Delete recipient
-
-This script removes a share invitation, and therefore the share, for a recipient.
-To use it, be sure to fill out these variables:
-
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data was sent from.
-- **SentShareDisplayName** - the name of the sent share you're removing a recipient for.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **RecipientUserEmailId** - Email address for the user you want to delete.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) If using a service principal to create the shares, this is the Application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) If using a service principal to create the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string SentShareDisplayName = "<Name of share you're removing a recipient for.>";
- private static string SenderTenantId = "<Sender Tenant ID>";
- private static string SenderPurviewAccountName = "<Name of the Microsoft Purview account>";
- private static string RecipientUserEmailId = "<Target User's Email Id>";
-
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
-
- // Set if using Service principal to delete recipients
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
-
- var mySentShare = allSentShares.First(sentShareDoc =>
- {
- var doc = JsonDocument.Parse(sentShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == SentShareDisplayName;
- });
-
- var SentShareId = JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString();
-
- var allRecipients = await sentSharesClient.GetAllSentShareInvitationsAsync(SentShareId).ToResultList();
-
- var recipient = allRecipients.First(recipient =>
- {
- var doc = JsonDocument.Parse(recipient).RootElement;
- var props = doc.GetProperty("properties");
- return props.TryGetProperty("targetEmail", out JsonElement rcpt) && rcpt.ToString() == RecipientUserEmailId;
- });
-
- var recipientId = JsonDocument.Parse(recipient).RootElement.GetProperty("id").ToString();
-
- await sentSharesClient.DeleteSentShareInvitationAsync(WaitUntil.Completed, SentShareId, recipientId);
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("Remove Id: " + JsonDocument.Parse(recipient).RootElement.GetProperty("id").ToString());
- Console.WriteLine("Complete");
-            Console.ResetColor();
-
- }
-
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## Delete sent share
-
-This script deletes a sent share.
-To use it, be sure to fill out these variables:
-
-- **SenderTenantId** - the Azure Tenant ID for the share sender's identity.
-- **SenderPurviewAccountName** - the name of the Microsoft Purview account where the data was sent from.
-- **SentShareDisplayName** - the name of the sent share you're deleting.
-- **SenderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be sent from.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to create the shares, set this to **true**.
-- **SenderClientId** - (optional) If using a service principal to create the shares, this is the Application (client) ID for the service principal.
-- **SenderClientSecret** - (optional) If using a service principal to create the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
- private static string SenderTenantId = "<Sender Tenant ID>";
- private static string SenderPurviewAccountName = "<Name of the Microsoft Purview account>";
- private static string SentShareDisplayName = "<Name of share you're removing.>";
-
- private static string SenderStorageResourceId = "<Sender Storage Account Resource Id>";
-
- // Set if using Service principal to delete share
- private static bool UseServiceTokenCredentials = false;
- private static string SenderClientId = "<Sender Application (Client) Id>";
- private static string SenderClientSecret = "<Sender Application (Client) Secret>";
-
- // General Configs
- private static string SenderPurviewEndPoint = $"https://{SenderPurviewAccountName}.purview.azure.com";
- private static string SenderShareEndPoint = $"{SenderPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- TokenCredential senderCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(SenderTenantId, SenderClientId, SenderClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = SenderTenantId });
-
- SentSharesClient? sentSharesClient = new SentSharesClient(SenderShareEndPoint, senderCredentials);
-
- var allSentShares = await sentSharesClient.GetAllSentSharesAsync(SenderStorageResourceId).ToResultList();
-
- var mySentShare = allSentShares.First(sentShareDoc =>
- {
- var doc = JsonDocument.Parse(sentShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == SentShareDisplayName;
- });
-
- var SentShareId = JsonDocument.Parse(mySentShare).RootElement.GetProperty("id").ToString();
-
- await sentSharesClient.DeleteSentShareAsync(WaitUntil.Completed, SentShareId);
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("Remove Id: " + SentShareId);
- Console.WriteLine("Complete");
-            Console.ResetColor();
-
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## Create a received share
-
-This script allows you to receive a data share.
-To use it, be sure to fill out these variables:
-
-- **ReceiverTenantId** - the Azure Tenant ID for the user/service that is receiving the shared data.
-- **ReceiverPurviewAccountName** - the name of the Microsoft Purview account where the data will be received.
-- **ReceiverStorageKind** - either BlobAccount or AdlsGen2Account.
-- **ReceiverStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be received.
-- **ReceiverStorageContainer** - the name of the container where the shared data will be stored.
-- **ReceiverTargetFolderName** - the folder path to where the shared data will be stored.
-- **ReceiverTargetMountPath** - the mount path you'd like to use to store your data in the folder.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to receive the shares, set this to **true**.
-- **ReceiverClientId** - (optional) If using a service principal to receive the shares, this is the Application (client) ID for the service principal.
-- **ReceiverClientSecret** - (optional) If using a service principal to receive the shares, add your client secret/authentication key.
-- **ReceivedShareId** - (optional) This option must be a GUID; the current value generates one for you, but you can replace it with a different value if you would like.
-- **ReceiverVisiblePath** - (optional) Name you want to use for the path for your received share.
-- **ReceivedShareDisplayName** - (optional) A display name for your received share.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.ComponentModel;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
-    private static string ReceiverTenantId = "<Receiver Identity's Tenant ID>";
- private static string ReceiverPurviewAccountName = "<Receiver Purview Account Name>";
-
- private static string ReceiverStorageKind = "<Receiver Storage Account Kind (BlobAccount / AdlsGen2Account)>";
- private static string ReceiverStorageResourceId = "<Receiver Storage Account Resource Id>";
- private static string ReceiverStorageContainer = "<Container Name To Receive Data Under>";
-    private static string ReceiverTargetFolderName = "<Folder Name to Receive Data Under>";
-    private static string ReceiverTargetMountPath = "<Mount Path to Store Received Data Under>";
-
- //Use if using a service principal to receive a share
- private static bool UseServiceTokenCredentials = false;
- private static string ReceiverClientId = "<Receiver Caller Application (Client) Id>";
- private static string ReceiverClientSecret = "<Receiver Caller Application (Client) Secret>";
-
- // [OPTIONAL INPUTS] Override Values If Desired.
- private static string ReceivedShareId = Guid.NewGuid().ToString();
- private static string ReceiverVisiblePath = "ReceivedSharePath";
-
- private static string ReceivedShareDisplayName = "ReceivedShare";
-
- // General Configs
- private static string ReceiverPurviewEndPoint = $"https://{ReceiverPurviewAccountName}.purview.azure.com";
- private static string ReceiverShareEndPoint = $"{ReceiverPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
-
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: CreateReceivedShare - START");
- await Receiver_CreateReceivedShare();
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: CreateReceivedShare - FINISH");
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- private static async Task<BinaryData> Receiver_CreateReceivedShare()
- {
-
- TokenCredential receiverCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(ReceiverTenantId, ReceiverClientId, ReceiverClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = ReceiverTenantId });
-
- ReceivedSharesClient? receivedSharesClient = new ReceivedSharesClient(ReceiverShareEndPoint, receiverCredentials);
-
- if (receivedSharesClient == null)
- {
- throw new InvalidEnumArgumentException("Invalid Received Shares Client.");
- }
-
- var results = await receivedSharesClient.GetAllDetachedReceivedSharesAsync().ToResultList();
- var detachedReceivedShare = results;
-
- if (detachedReceivedShare == null)
- {
- throw new InvalidOperationException("No received shares found.");
- }
-
- var myReceivedShare = detachedReceivedShare.First(recShareDoc =>
- {
- var doc = JsonDocument.Parse(recShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == ReceivedShareDisplayName;
- });
-
- var ReceivedShareId = JsonDocument.Parse(myReceivedShare).RootElement.GetProperty("id").ToString();
-
- var attachedReceivedShareData = new
- {
- shareKind = "InPlace",
- properties = new
- {
- displayName = ReceivedShareDisplayName,
- sink = new
- {
- storeKind = ReceiverStorageKind,
- properties = new
- {
- containerName = ReceiverStorageContainer,
- folder = ReceiverTargetFolderName,
- mountPath = ReceiverTargetMountPath
- },
- storeReference = new
- {
- referenceName = ReceiverStorageResourceId,
- type = "ArmResourceReference"
- }
- }
- }
- };
-
- var receivedShare = await receivedSharesClient.CreateOrReplaceReceivedShareAsync(WaitUntil.Completed, ReceivedShareId, RequestContent.Create(attachedReceivedShareData));
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine(receivedShare.Value);
-        Console.ResetColor();
-
- return receivedShare.Value;
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## List all received shares
-
-This script lists all received shares on a storage account.
-To use it, be sure to fill out these variables:
-
-- **ReceiverTenantId** - the Azure Tenant ID for the user/service that is receiving the shared data.
-- **ReceiverPurviewAccountName** - the name of the Microsoft Purview account where the data was received.
-- **ReceiverStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data has been shared.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to receive the shares, set this to **true**.
-- **ReceiverClientId** - (optional) If using a service principal to receive the shares, this is the Application (client) ID for the service principal.
-- **ReceiverClientSecret** - (optional) If using a service principal to receive the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
-    private static string ReceiverTenantId = "<Receiver Identity's Tenant ID>";
- private static string ReceiverPurviewAccountName = "<Receiver Purview Account Name>";
-
- private static string ReceiverStorageResourceId = "<Storage Account Resource Id that is housing shares>";
-
- //Use if using a service principal to list shares
- private static bool UseServiceTokenCredentials = false;
- private static string ReceiverClientId = "<Receiver Caller Application (Client) Id>";
- private static string ReceiverClientSecret = "<Receiver Caller Application (Client) Secret>";
-
- // General Configs
- private static string ReceiverPurviewEndPoint = $"https://{ReceiverPurviewAccountName}.purview.azure.com";
- private static string ReceiverShareEndPoint = $"{ReceiverPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- TokenCredential receiverCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(ReceiverTenantId, ReceiverClientId, ReceiverClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = ReceiverTenantId });
-
- ReceivedSharesClient? receivedSharesClient = new ReceivedSharesClient(ReceiverShareEndPoint, receiverCredentials);
-
- var allReceivedShares = await receivedSharesClient.GetAllAttachedReceivedSharesAsync(ReceiverStorageResourceId).ToResultList();
- Console.WriteLine(allReceivedShares);
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## Update received share
-
-This script allows you to update the storage location for a received share. Just like creating a received share, you add the information for the storage account where you want the data to be housed.
-To use it, be sure to fill out these variables:
-
-- **ReceiverTenantId** - the Azure Tenant ID for the user/service that is receiving the shared data.
-- **ReceiverPurviewAccountName** - the name of the Microsoft Purview account where the data will be received.
-- **ReceiverStorageKind** - either BlobAccount or AdlsGen2Account.
-- **ReceiverStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data has been shared.
-- **ReAttachStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data will be received.
-- **ReceiverStorageContainer** - the name of the container where the shared data will be stored.
-- **ReceiverTargetFolderName** - the folder path to where the shared data will be stored.
-- **ReceiverTargetMountPath** - the mount path you'd like to use to store your data in the folder.
-- **ReceivedShareDisplayName** - The display name for your received share.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to receive the shares, set this to **true**.
-- **ReceiverClientId** - (optional) If using a service principal to receive the shares, this is the Application (client) ID for the service principal.
-- **ReceiverClientSecret** - (optional) If using a service principal to receive the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.ComponentModel;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
-    private static string ReceiverTenantId = "<Receiver Identity's Tenant ID>";
- private static string ReceiverPurviewAccountName = "<Receiver Purview Account Name>";
-
- private static string ReceiverStorageKind = "<Receiver Storage Account Kind (BlobAccount / AdlsGen2Account)>";
- private static string ReceiverStorageResourceId = "<Storage Account Resource Id for the account where the share is currently attached.>";
- private static string ReAttachStorageResourceId = "<Storage Account Resource Id For Reattaching Received Share>";
- private static string ReceiverStorageContainer = "<Container Name To Receive Data Under>";
-    private static string ReceiverTargetFolderName = "<Folder Name to Receive Data Under>";
-    private static string ReceiverTargetMountPath = "<Mount Path to Store Received Data Under>";
-
- private static string ReceivedShareDisplayName = "<Display name of your received share>";
-
- //Use if using a service principal to update the share
- private static bool UseServiceTokenCredentials = false;
- private static string ReceiverClientId = "<Receiver Caller Application (Client) Id>";
- private static string ReceiverClientSecret = "<Receiver Caller Application (Client) Secret>";
-
- // General Configs
- private static string ReceiverPurviewEndPoint = $"https://{ReceiverPurviewAccountName}.purview.azure.com";
- private static string ReceiverShareEndPoint = $"{ReceiverPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
-
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: UpdateReceivedShare - START");
- await Receiver_UpdateReceivedShare();
- Console.WriteLine($"{DateTime.Now.ToShortTimeString()}: UpdateReceivedShare - FINISH");
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
- Console.ForegroundColor = Console.ForegroundColor;
-            Console.ResetColor();
- }
-
- private static async Task<BinaryData> Receiver_UpdateReceivedShare()
- {
-
- TokenCredential receiverCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(ReceiverTenantId, ReceiverClientId, ReceiverClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = ReceiverTenantId });
-
- ReceivedSharesClient? receivedSharesClient = new ReceivedSharesClient(ReceiverShareEndPoint, receiverCredentials);
-
- if (receivedSharesClient == null)
- {
- throw new InvalidEnumArgumentException("Invalid Received Shares Client.");
- }
-
- var attachedReceivedShareData = new
- {
- shareKind = "InPlace",
- properties = new
- {
- displayName = ReceivedShareDisplayName,
- sink = new
- {
- storeKind = ReceiverStorageKind,
- properties = new
- {
- containerName = ReceiverStorageContainer,
- folder = ReceiverTargetFolderName,
- mountPath = ReceiverTargetMountPath
- },
- storeReference = new
- {
- referenceName = ReAttachStorageResourceId,
- type = "ArmResourceReference"
- }
- }
- }
- };
-
- var allReceivedShares = await receivedSharesClient.GetAllAttachedReceivedSharesAsync(ReceiverStorageResourceId).ToResultList();
-
- var myReceivedShare = allReceivedShares.First(recShareDoc =>
- {
- var doc = JsonDocument.Parse(recShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == ReceivedShareDisplayName;
- });
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("My Received Share Id: " + JsonDocument.Parse(myReceivedShare).RootElement.GetProperty("id").ToString());
-        Console.ResetColor();
-
- var ReceivedShareId = JsonDocument.Parse(myReceivedShare).RootElement.GetProperty("id").ToString();
-
- var receivedShare = await receivedSharesClient.CreateOrReplaceReceivedShareAsync(WaitUntil.Completed, ReceivedShareId, RequestContent.Create(attachedReceivedShareData));
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine(receivedShare.Value);
-        Console.ResetColor();
-
- return receivedShare.Value;
- }
-
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## Delete received share
-
-This script deletes a received share.
-To use it, be sure to fill out these variables:
-
-- **ReceiverTenantId** - the Azure Tenant ID for the user/service that is receiving the shared data.
-- **ReceiverPurviewAccountName** - the name of the Microsoft Purview account where the data will be received.
-- **ReceivedShareDisplayName** - The display name for your received share.
-- **ReceiverStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data has been shared.
-- **UseServiceTokenCredentials** - (optional) If you want to use a service principal to receive the shares, set this to **true**.
-- **ReceiverClientId** - (optional) If using a service principal to receive the shares, this is the Application (client) ID for the service principal.
-- **ReceiverClientSecret** - (optional) If using a service principal to receive the shares, add your client secret/authentication key.
-
-```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
-using Azure;
-using Azure.Analytics.Purview.Sharing;
-using Azure.Core;
-using Azure.Identity;
-using System.Text.Json;
-
-public static class PurviewDataSharingQuickStart
-{
- // [REQUIRED INPUTS] Set To Actual Values.
-    private static string ReceiverTenantId = "<Receiver Identity's Tenant ID>";
- private static string ReceiverPurviewAccountName = "<Receiver Purview Account Name>";
-
- private static string ReceivedShareDisplayName = "<Display name of your received share>";
-
- private static string ReceiverStorageResourceId = "<Storage Account Resource Id for the account where the share is currently attached.>";
-
- //Use if using a service principal to delete share.
- private static bool UseServiceTokenCredentials = false;
- private static string ReceiverClientId = "<Receiver Caller Application (Client) Id>";
- private static string ReceiverClientSecret = "<Receiver Caller Application (Client) Secret>";
-
- // General Configs
- private static string ReceiverPurviewEndPoint = $"https://{ReceiverPurviewAccountName}.purview.azure.com";
- private static string ReceiverShareEndPoint = $"{ReceiverPurviewEndPoint}/share";
-
- private static async Task Main(string[] args)
- {
- try
- {
- TokenCredential receiverCredentials = UseServiceTokenCredentials
- ? new ClientSecretCredential(ReceiverTenantId, ReceiverClientId, ReceiverClientSecret)
- : new DefaultAzureCredential(new DefaultAzureCredentialOptions { AuthorityHost = new Uri("https://login.windows.net"), TenantId = ReceiverTenantId });
-
- ReceivedSharesClient? receivedSharesClient = new ReceivedSharesClient(ReceiverShareEndPoint, receiverCredentials);
-
- var allReceivedShares = await receivedSharesClient.GetAllAttachedReceivedSharesAsync(ReceiverStorageResourceId).ToResultList();
-
- var myReceivedShare = allReceivedShares.First(recShareDoc =>
- {
- var doc = JsonDocument.Parse(recShareDoc).RootElement;
- var props = doc.GetProperty("properties");
- return props.GetProperty("displayName").ToString() == ReceivedShareDisplayName;
- });
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("My Received Share Id: " + JsonDocument.Parse(myReceivedShare).RootElement.GetProperty("id").ToString());
-            Console.ResetColor();
-
- var ReceivedShareId = JsonDocument.Parse(myReceivedShare).RootElement.GetProperty("id").ToString();
-
- await receivedSharesClient.DeleteReceivedShareAsync(WaitUntil.Completed, ReceivedShareId);
-
- Console.ForegroundColor = ConsoleColor.Green;
- Console.WriteLine("Delete Complete");
-            Console.ResetColor();
- }
- catch (Exception ex)
- {
- Console.ForegroundColor = ConsoleColor.Red;
- Console.WriteLine(ex);
-            Console.ResetColor();
- }
- }
- public static async Task<List<T>> ToResultList<T>(this AsyncPageable<T> asyncPageable)
- {
- List<T> list = new List<T>();
-
- await foreach (T item in asyncPageable)
- {
- list.Add(item);
- }
-
- return list;
- }
-}
-```
-
-## Clean up resources
-
-To clean up the resources created for the quickstart, use the following guidelines:
-
-1. Within the [Microsoft Purview governance portal](https://web.purview.azure.com/), [delete the sent share](how-to-share-data.md#delete-share).
-1. Also [delete your received share](how-to-receive-share.md#delete-received-share).
-1. Once the shares are successfully deleted, delete the target container and folder Microsoft Purview created in your target storage account when you received shared data.
-
-## Next steps
-
-* [FAQ for data sharing](how-to-data-share-faq.md)
-* [How to share data](how-to-share-data.md)
-* [How to receive share](how-to-receive-share.md)
-* [REST API reference](/rest/api/purview/)
purview Quickstart Data Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-data-share.md
- Title: 'Quickstart: Share data'
-description: Learn how to securely share data from your environment using Microsoft Purview Data Sharing.
-Previously updated: 02/16/2023
-# Quickstart: Share and receive Azure Storage data in-place with Microsoft Purview Data Sharing (preview)
-This article provides a quick guide on how to share data and receive shares from Azure Data Lake Storage (ADLS) Gen2 or Blob storage accounts.
-## Create a share
-
-1. You can create a share by starting from the **Data Map**:
-
- Open the [Microsoft Purview governance portal](https://web.purview.azure.com/). Select the **Data Map** icon from the left navigation. Then select **Shares**. Select **+New Share**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-datamap-new-share.png" alt-text="Screenshot that shows the Microsoft Purview governance portal Data Map with Data Map, Shares and New Share highlighted." border="true":::
-
- Select the Storage account type and the Storage account you want to share data from. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-datamap-select-type-account.png" alt-text="Screenshot that shows the New Share creation step with Type and Storage account options highlighted." border="true":::
-
-1. You can create a share by starting from the **Data Catalog**:
-
- Within the [Microsoft Purview governance portal](https://web.purview.azure.com/), find the Azure Storage or Azure Data Lake Storage (ADLS) Gen 2 data asset you would like to share data from using either the [data catalog search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md).
-
- :::image type="content" source="./media/how-to-share-data/search-or-browse.png" alt-text="Screenshot that shows the Microsoft Purview governance portal homepage with the search and browse options highlighted." border="true":::
-
- Once you have found your data asset, select the **Data Share** button.
-
- :::image type="content" source="./media/how-to-share-data/select-data-share-inline.png" alt-text="Screenshot of a data asset in the Microsoft Purview governance portal with the Data Share button highlighted." border="true" lightbox="./media/how-to-share-data/select-data-share-large.png":::
-
- Select **+New Share**.
-
- :::image type="content" source="./media/how-to-share-data/select-new-share-inline.png" alt-text="Screenshot of the Data Share management window with the New Share button highlighted." border="true" lightbox="./media/how-to-share-data/select-new-share-large.png":::
-
-1. Specify a name and a description of share contents (optional). Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/create-share-details-inline.png" alt-text="Screenshot showing create share and enter details window, with the Continue button highlighted." border="true" lightbox="./media/how-to-share-data/create-share-details-large.png":::
-
-1. Search for and add all the assets you'd like to share out at the container, folder, or file level, and then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/add-asset.png" alt-text="Screenshot showing the add assets window, with a file and a folder selected to share." border="true":::
-
-1. You can edit the display names the shared data will have, if you like. Then select **Continue**.
-
- :::image type="content" source="./media/how-to-share-data/provide-display-names.png" alt-text="Screenshot showing the second add assets window with the display names unchanged." border="true":::
-
-1. Select **Add Recipient** and select **User** or **App**.
-
-   To share data with a user, select **User**, then enter the Azure sign-in email address of the person you want to share data with. By default, the option to enter a user's email address is shown.
-
- :::image type="content" source="./media/how-to-share-data/create-share-add-user-recipient-inline.png" alt-text="Screenshot showing the add recipients page, with the add recipient button highlighted, default user email option shown." border="true" lightbox="./media/how-to-share-data/create-share-add-user-recipient-large.png":::
-
-   To share data with a service principal, select **App**. Enter the object ID and tenant ID of the recipient you want to share data with.
-
- :::image type="content" source="./media/how-to-share-data/create-share-add-app-recipient-inline.png" alt-text="Screenshot showing the add app recipients page, with the add app option and required fields highlighted." border="true" lightbox="./media/how-to-share-data/create-share-add-app-recipient-large.png":::
-
-1. Select **Create and Share**. Optionally, you can specify an **Expiration date** for when to terminate the share. You can share the same data with multiple recipients by selecting **Add Recipient** multiple times.
-
-You've now created your share. The recipients of your share will receive an invitation and they can view the pending share in their Microsoft Purview account.
-
-## Receive share
-
-1. You can view your share invitations in any Microsoft Purview account. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-   Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select the **Data Map** icon from the left navigation. Then select **Share invites**. If you received an email invitation, you can also select the **View share invite** link in the email to select a Microsoft Purview account.
-
-   If you're a guest user of a tenant, you'll be asked to verify your email address for the tenant before viewing a pending received share for the first time. [See our guide for the steps.](how-to-receive-share.md#guest-user-verification) Once verified, your verification is valid for 12 months.
-
- :::image type="content" source="./media/how-to-receive-share/view-invites.png" alt-text="Screenshot showing the Share invites page in the Microsoft Purview governance portal." border="true":::
-
-1. Select the name of the pending share you want to view or configure.
-
-1. If you don't want to accept the invitation, select **Delete**.
-
- :::image type="content" source="./media/how-to-receive-share/select-delete-invitation-inline.png" alt-text="Screenshot showing the share attachment page with the delete button highlighted." border="true" lightbox="./media/how-to-receive-share/select-delete-invitation-large.png":::
-
- >[!NOTE]
-   > If you delete an invitation, it must be resent before you can accept the share in the future. To deselect the share without deleting the invitation, select the **Cancel** button instead.
-
-1. You can edit the **Received share name** if you like. Then select a **Storage account name** for a target storage account in the same region as the source. You can choose to **Register a new storage account to map assets** in the drop-down as well.
-
- >[!IMPORTANT]
- >The target storage account needs to be in the same Azure region as the source storage account.
-
-1. Configure the **Path** (either a new container name, or the name of an existing share container) and, if needed, **New folder**.
-
-1. Select **Attach to target**.
-
- :::image type="content" source="./media/how-to-receive-share/attach-shared-data-inline.png" alt-text="Screenshot showing pending share configuration page, with a share name added, a collection selected, and the accept and configure button highlighted." border="true" lightbox="./media/how-to-receive-share/attach-shared-data-large.png":::
-
-1. On the Manage data shares page, you'll see the new share with the status of **Creating** until it has completed and is attached.
-
- :::image type="content" source="./media/how-to-receive-share/manage-data-shares-window-creating.png" alt-text="Screenshot showing the map asset window, with the map button highlighted next to the asset to specify a target data store to receive or access shared data." border="true":::
-
-1. You can access shared data from the target storage account through the Azure portal, Azure Storage Explorer, the Azure Storage SDK, PowerShell, or the Azure CLI. You can also analyze the shared data by connecting your storage account to Azure Synapse Analytics Spark or Databricks.
-
-## Clean up resources
-
-To clean up the resources created for the quickstart, follow these steps:
-
-1. Within the [Microsoft Purview governance portal](https://web.purview.azure.com/), [delete the sent share](how-to-share-data.md#delete-share).
-1. Also [delete your received share](how-to-receive-share.md#delete-received-share).
-1. Once the shares are successfully deleted, delete the target container and folder Microsoft Purview created in your target storage account when you received shared data.
-
-## Troubleshoot
-
-To troubleshoot issues with sharing data, refer to the [troubleshooting section of the how to share data article](how-to-share-data.md#troubleshoot). To troubleshoot issues with receiving share, refer to the [troubleshooting section of the how to receive shared data article](how-to-receive-share.md#troubleshoot).
-
-## Next steps
-
-* [FAQ for data sharing](how-to-data-share-faq.md)
-* [How to share data](how-to-share-data.md)
-* [How to receive share](how-to-receive-share.md)
-* [REST API reference](/rest/api/purview/)
purview Reference Microsoft Purview Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/reference-microsoft-purview-glossary.md
- Title: Microsoft Purview governance portal product glossary
-description: A glossary defining the terminology used throughout the Microsoft Purview governance portal
-Previously updated: 06/17/2022
-# Microsoft Purview governance portal product glossary
-
-Below is a glossary of terminology used throughout the Microsoft Purview governance portal and its documentation.
-
-## Advanced resource sets
-A set of features activated at the Microsoft Purview instance level that, when enabled, enrich resource set assets by computing extra aggregations on the metadata to provide information such as partition counts, total size, and schema counts. Resource set pattern rules are also included.
-## Annotation
-Information that is associated with data assets in the Microsoft Purview Data Map, for example, glossary terms and classifications. After they're applied, annotations can be used within Search to aid in the discovery of the data assets.
-## Approved
-The state given to any request that has been accepted as satisfactory by the designated individual or group who has authority to change the state of the request.
-## Asset
-Any single object that is stored within a Microsoft Purview Data Catalog.
-> [!NOTE]
-> A single object in the catalog could potentially represent many objects in storage, for example, a resource set is an asset but it's made up of many partition files in storage.
-## Azure Information Protection
-A cloud solution that supports labeling of documents and emails to classify and protect information. Labeled items can be protected by encryption, marked with a watermark, or restricted to specific actions or users, and is bound to the item. This cloud-based solution relies on Azure Rights Management Service (RMS) for enforcing restrictions.
-## Business glossary
-A searchable list of specialized terms that an organization uses to describe key business words and their definitions. Using a business glossary can provide consistent data usage across the organization.
-## Capacity unit
-A measure of data map usage. All Microsoft Purview Data Maps include one capacity unit by default, which provides up to 10 GB of metadata storage and has a throughput of 25 data map operations/second.
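The two per-unit limits above imply a simple sizing estimate. As a rough sketch (the helper function, its name, and the rounding behavior are illustrative assumptions, not a Purview API), you could compute how many capacity units a workload needs:

```python
import math

# Per-unit limits stated in the definition above.
GB_PER_UNIT = 10      # metadata storage per capacity unit
OPS_PER_UNIT = 25     # data map operations per second per capacity unit

def estimate_capacity_units(metadata_gb: float, ops_per_second: float) -> int:
    """Estimate capacity units needed to cover both storage and throughput."""
    units_for_storage = math.ceil(metadata_gb / GB_PER_UNIT)
    units_for_throughput = math.ceil(ops_per_second / OPS_PER_UNIT)
    # Every Data Map includes at least one capacity unit by default.
    return max(1, units_for_storage, units_for_throughput)

print(estimate_capacity_units(8, 20))   # fits in the default unit -> 1
print(estimate_capacity_units(25, 60))  # 3 units for storage, 3 for throughput -> 3
```

The estimate takes the larger of the two requirements because a unit must cover both storage and throughput simultaneously.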
-## Classification report
-A report that shows key classification details about the scanned data.
-## Classification
-A type of annotation used to identify an attribute of an asset or a column such as "Age", "Email Address", and "Street Address". These attributes can be assigned during scans or added manually.
-## Classification rule
-A classification rule is a set of conditions that determine how scanned data should be classified when content matches the specified pattern.
-## Classified asset
-An asset where Microsoft Purview extracts schema and applies classifications during an automated scan. The scan rule set determines which assets get classified. If the asset is considered a candidate for classification and no classifications are applied during scan time, an asset is still considered a classified asset.
-## Collection
-An organization-defined grouping of assets, terms, annotations, and sources. Collections allow for easier fine-grained access control and discoverability of assets within a data catalog.
-## Collection admin
-A role that can assign roles in the Microsoft Purview governance portal. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
-## Column pattern
-A regular expression included in a classification rule that represents the column names that you want to match.
-## Contact
-An individual who is associated with an entity in the data catalog.
-## Control plane operation
-An operation that manages resources in your subscription, such as a role-based access control or Azure Policy operation, and that is sent to the Azure Resource Manager endpoint. Control plane operations can also apply to resources outside of Azure across on-premises, multicloud, and SaaS sources.
-## Credential
-A verification of identity or tool used in an access control system. Credentials can be used to authenticate an individual or group to grant access to a data asset.
-## Data Catalog
-A searchable inventory of assets and their associated metadata that allows users to find and curate data across a data estate. The Data Catalog also includes a business glossary where subject matter experts can provide terms and definitions to add a business context to an asset.
-## Data curator
-A role that provides access to the data catalog to manage assets, configure custom classifications, set up glossary terms, and view insights. Data curators can create, read, modify, move, and delete assets. They can also apply annotations to assets.
-## Data Estate Insights
-An area of the Microsoft Purview governance portal that provides up-to-date reports and actionable insights about the data estate.
-## Data map
-A metadata repository that is the foundation of the Microsoft Purview governance portal. The data map is a graph that describes assets across a data estate and is populated through scans and other data ingestion processes. This graph helps organizations understand and govern their data by providing rich descriptions of assets, representing data lineage, classifying assets, storing relationships between assets, and housing information at both the technical and semantic layers. The data map is an open platform that can be interacted with and accessed through Apache Atlas APIs or the Microsoft Purview governance portal.
-## Data map operation
-A create, read, update, or delete action performed on an entity in the data map. For example, creating an asset in the data map is considered a data map operation.
-## Data owner
-An individual or group responsible for managing a data asset.
-## Data pattern
-A regular expression that represents the data that is stored in a data field. For example, a data pattern for employee ID could be Employee{GUID}.
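For illustration, the Employee{GUID} example could be written as a regular expression. The pattern and helper below are hypothetical sketches, not the syntax Purview uses internally:

```python
import re

# Hypothetical data pattern: the literal prefix "Employee" followed by a GUID.
EMPLOYEE_ID_PATTERN = re.compile(
    r"^Employee[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
    r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$"
)

def matches_employee_id(value: str) -> bool:
    """Return True if the value matches the Employee{GUID} data pattern."""
    return EMPLOYEE_ID_PATTERN.match(value) is not None

print(matches_employee_id("Employee3f2504e0-4f89-11d3-9a0c-0305e82c3301"))  # True
print(matches_employee_id("Customer42"))                                    # False
```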
-## Data plane operation
-An operation within a specific Microsoft Purview instance, such as editing an asset or creating a glossary term. Each instance has predefined roles, such as "data reader" and "data curator" that control which data plane operations a user can perform.
-## Data reader
-A role that provides read-only access to data assets, classifications, classification rules, collections, glossary terms, and insights.
-## Data Sharing
-Microsoft Purview Data Sharing is a set of features in Microsoft Purview that enables you to securely share data across organizations.
-## Data Share contributor
-A role that can share data within an organization and with other organizations using data share capabilities in Microsoft Purview. Data share contributors can view, create, update, and delete sent and received shares.
-## Data source admin
-A role that can manage data sources and scans. A user in the Data source admin role doesn't have access to the Microsoft Purview governance portal. Combining this role with the Data reader or Data curator roles at any collection scope provides Microsoft Purview governance portal access.
-## Data steward
-An individual or group responsible for maintaining nomenclature, data quality standards, security controls, compliance requirements, and rules for the associated object.
-## Data dictionary
-A list of canonical names of database columns and their corresponding data types. It's often used to describe the format and structure of a database, and the relationship between its elements.
-## Discovered asset
-An asset that the Microsoft Purview Data Map identifies in a data source during the scanning process. The number of discovered assets includes all files or tables before resource set grouping.
-## Distinct match threshold
-The total number of distinct data values that need to be found in a column before the scanner runs the data pattern on it. For example, a distinct match threshold of eight for employee ID requires that there are at least eight unique data values among the sampled values in the column that match the data pattern set for employee ID.
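A minimal sketch of this gate, assuming a simplified matcher (the helper and pattern below are illustrative, not the scanner's implementation): count the distinct sampled values that match the data pattern and compare against the threshold.

```python
import re

def meets_distinct_threshold(sampled_values, pattern, threshold=8):
    """True when at least `threshold` distinct sampled values match the pattern."""
    distinct_matches = {v for v in set(sampled_values) if re.fullmatch(pattern, v)}
    return len(distinct_matches) >= threshold

# Nine distinct employee IDs (plus noise and a duplicate) clear a threshold of 8.
samples = [f"Employee{n:04d}" for n in range(9)] + ["N/A", "Employee0001"]
print(meets_distinct_threshold(samples, r"Employee\d{4}"))  # True
```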
-## Expert
-An individual within an organization who understands the full context of a data asset or glossary term.
-## Full scan
-A scan that processes all assets within a selected scope of a data source.
-## Fully Qualified Name (FQN)
-A path that defines the location of an asset within its data source.
-## Glossary term
-An entry in the Business glossary that defines a concept specific to an organization. Glossary terms can contain information on synonyms, acronyms, and related terms.
-## Incremental scan
-A scan that detects and processes assets that have been created, modified, or deleted since the previous successful scan. To run an incremental scan, at least one full scan must be completed on the source.
-## Ingested asset
-An asset that has been scanned, classified (when applicable), and added to the Microsoft Purview Data Map. Ingested assets are discoverable and consumable within the data catalog through automated scanning or external connections, such as Azure Data Factory and Azure Synapse.
-## Insight reader
-A role that provides read-only access to Data Estate Insights reports. Insight readers must have at least data reader role access to a collection to view reports about that specific collection.
-## Integration runtime
-The compute infrastructure used to scan a data source.
-## Lineage
-How data transforms and flows as it moves from its origin to its destination. Understanding this flow across the data estate helps organizations see the history of their data, and aid in troubleshooting or impact analysis.
-## Management
-An area within the Microsoft Purview Governance Portal where you can manage connections, users, roles, and credentials. Also referred to as "Management center."
-## Minimum match threshold
-The minimum percentage of matches among the distinct data values in a column that must be found by the scanner for a classification to be applied.
-
-For example, a minimum match threshold of 60% for employee ID requires that 60% of all distinct values among the sampled data in a column match the data pattern set for employee ID. If the scanner samples 128 values in a column and finds 60 distinct values in that column, then at least 36 of the distinct values (60%) must match the employee ID data pattern for the classification to be applied.
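The worked example can be checked with a few lines of arithmetic. This is an illustrative sketch; the function is hypothetical, not part of Purview:

```python
import math

def classification_applies(distinct_matching, distinct_total, minimum_match=0.60):
    """True when the share of distinct matching values meets the threshold."""
    required = math.ceil(distinct_total * minimum_match)
    return distinct_matching >= required

# 60 distinct sampled values at a 60% threshold require at least 36 matches.
print(classification_applies(36, 60))  # True
print(classification_applies(35, 60))  # False
```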
-## Physical asset
-An asset that represents a physical data object. Physical assets are different from business assets because they represent real data. For example, a database is a physical asset.
-## Policy
-A statement or collection of statements that controls how access to data and data sources should be authorized.
-## Object type
-A categorization of assets based upon common data structures. For example, an Azure SQL Server table and Oracle database table both have an object type of table.
-## On-premises data
-Data that is in a data center controlled by a customer; that is, data that isn't in the cloud or in a software as a service (SaaS) offering.
-## Owner
-An individual or group in charge of managing a data asset.
-## Pattern rule
-A configuration that overrides how the Microsoft Purview Data Map groups assets as resource sets and displays them within the catalog.
-## Microsoft Purview instance
-A single Microsoft Purview (formerly Azure Purview) account.
-## Registered source
-A source that has been added to a Microsoft Purview instance and is now managed as a part of the Data catalog.
-## Related terms
-Glossary terms that are linked to other terms within the organization.
-## Resource set
-A single asset that represents many partitioned files or objects in storage. For example, the Microsoft Purview Data Map stores partitioned Apache Spark output as a single resource set instead of unique assets for each individual file.
-## Role
-Permissions assigned to a user within a Microsoft Purview instance. Roles, such as Microsoft Purview Data Curator or Microsoft Purview Data Reader, determine what can be done within the product.
-## Root collection
-A system-generated collection that has the same friendly name as the Microsoft Purview account. All assets belong to the root collection by default.
-## Scan
-A Microsoft Purview Data Map process that discovers and examines metadata in a source or set of sources to populate the data map. A scan automatically connects to a source, extracts metadata, captures lineage, and applies classifications. Scans can be run manually or on a schedule.
-## Scan rule set
-A set of rules that define which data types and classifications a scan ingests into a catalog.
-## Scan trigger
-A schedule that determines the recurrence of when a scan runs.
-## Schema classification
-A classification applied to one of the columns in an asset schema.
-## Search
-A feature that allows users to find items in the data catalog by entering in a set of keywords.
-## Search relevance
-The scoring of data assets that determines the order in which search results are returned. Multiple factors determine an asset's relevance score.
-## Self-hosted integration runtime
-An integration runtime installed on an on-premises machine or virtual machine inside a private network that is used to connect to data on-premises or in a private network.
-## Sensitivity label
-Annotations that classify and protect an organization's data. The Microsoft Purview Data Map integrates with Microsoft Purview Information Protection for creation of sensitivity labels.
-## Sensitivity label report
-A summary of which sensitivity labels are applied across the data estate.
-## Service
-A product that provides standalone functionality and is available to customers by subscription or license.
-## Share
-A group of assets that are shared as a single entity.
-## Source
-A system where data is stored. Sources can be hosted in various places such as a cloud or on-premises. You register and scan sources so that you can manage them in the Microsoft Purview governance portal.
-## Source type
-A categorization of the registered sources used in the Microsoft Purview Data Map, for example, Azure SQL Database, Azure Blob Storage, Amazon S3, or SAP ECC.
-## Steward
-An individual who defines the standards for a glossary term. They're responsible for maintaining quality standards, nomenclature, and rules for the assigned entity.
-## Term template
-A definition of attributes included in a glossary term. Users can either use the system-defined term template or create their own to include custom attributes.
-## Workflow
-An automated process that coordinates the creation and modification of catalog entities, including validation and approval. Workflows define repeatable business processes to achieve high quality data, policy compliance, and user collaboration across an organization.
-
-## Next steps
-
-To get started with other Microsoft Purview governance services, see [Quickstart: Create a Microsoft Purview (formerly Azure Purview) account](create-catalog-portal.md).
purview Register Scan Adls Gen1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-adls-gen1.md
- Title: 'Connect to and manage Azure Data Lake Storage (ADLS) Gen1'
-description: This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen 1 source.
-Previously updated: 09/14/2022
-# Connect to Azure Data Lake Gen1 in Microsoft Purview
-
-This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Microsoft Purview, including instructions to authenticate and interact with the Azure Data Lake Storage Gen1 source.
-
-> [!Note]
-> Azure Data Lake Storage Gen2 is now generally available. We recommend that you start using it today. For more information, see the [product page](https://azure.microsoft.com/services/storage/data-lake-storage/).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)|[Yes](#scan) | [Yes](#scan)|[Yes](#scan)|[Yes](create-sensitivity-label.md)| No |Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source/sink in a [Data Factory Copy activity](how-to-link-azure-data-factory.md).
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register the ADLS Gen1 data source and set up an appropriate authentication mechanism to ensure the data source can be scanned successfully.
-
-### Steps to register
-
-Register the data source in Microsoft Purview before setting up a scan for it.
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-   Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to **Data Map** > **Sources**.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview governance portal":::
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
-
-1. Create the [Collection hierarchy](./quickstart-create-collection.md) using the **Collections** menu and assign permissions to individual subcollections, as required
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-collection.png" alt-text="Screenshot that shows the collection menu to create collection hierarchy":::
-
-1. Navigate to the appropriate collection under the **Sources** menu and select the **Register** icon to register a new ADLS Gen1 data source
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-register.png" alt-text="Screenshot that shows the collection used to register the data source":::
-
-1. Select the **Azure Data Lake Storage Gen1** data source and select **Continue**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-select-source.png" alt-text="Screenshot that allows selection of the data source":::
-
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, the existing **Data Lake Store account name**, and the **collection**, and then select **Apply**.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
-
-1. The ADLS Gen1 storage account will be shown under the selected Collection
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-source-hierarchy.png" alt-text="Screenshot that shows the data source mapped to the collection to initiate scanning":::
-
-## Scan
-
-### Prerequisites for scan
-
-To scan the data source, an authentication method must be configured in the ADLS Gen1 storage account. The following options are supported:
-
-> [!Note]
-> If you have a firewall enabled for the storage account, you must use the managed identity authentication method when setting up a scan.
-
-* **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in the Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform scans.
-
-* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-* **Service Principal** - In this method, you can create a new or use an existing service principal in your Azure Active Directory tenant.
-
-### Authentication for a scan
-
-#### Using system or user-assigned managed identity for scanning
-
-It is important to give your Microsoft Purview account permission to scan the ADLS Gen1 data source. You can add the system-assigned or user-assigned managed identity at the subscription, resource group, or resource level, depending on what you want it to have scan permissions on.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find either the subscription, resource group, or resource (for example, an Azure Data Lake Storage Gen1 storage account) that you would like to allow the catalog to scan.
-1. Select **Overview** and then select **Data explorer**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-data-explorer.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access** in the top navigation
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-storage-access.png" alt-text="Screenshot that shows the Data explorer for the storage account":::
-
-1. Choose **Select** and add the _Microsoft Purview account name_ (which is the system-assigned managed identity) or the _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ (preview) that has already been registered in Microsoft Purview, in the **Select user or group** menu.
-1. Select **Read** and **Execute** permissions. Make sure to choose **This folder and all children**, and **An access permission entry and a default permission entry** in the Add options, as shown in the following screenshot. Select **OK**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
-
-> [!Tip]
-> An **access permission entry** is a permission entry on _current_ files and folders. A **default permission entry** is a permission entry that will be _inherited_ by new files and folders.
-> To grant permission only to currently existing files, choose an **access permission entry**.
-> To grant permission to scan files and folders that will be added in the future, include a **default permission entry**.
-
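The inheritance behavior described in the tip can be sketched as a small model (illustrative only; `Node` and the entry tuples are invented for this sketch, not an Azure SDK API):

```python
# Illustrative model of ADLS Gen1 POSIX-style ACLs (not an Azure SDK API).
# An "access" entry applies to the item itself; a "default" entry on a
# folder is copied onto any NEW child created under it.

class Node:
    def __init__(self, name, parent=None):
        self.name = name
        self.access_acl = set()   # entries in effect on this item now
        self.default_acl = set()  # entries new children will inherit
        if parent is not None:
            # New children receive the parent's default entries as both
            # their access entries and their own default entries.
            self.access_acl |= parent.default_acl
            self.default_acl |= parent.default_acl

root = Node("root")
# Grant the scanning identity Read + Execute as BOTH entry types, matching
# "An access permission entry and a default permission entry" above.
grant = {("purview-mi", "r"), ("purview-mi", "x")}
root.access_acl |= grant
root.default_acl |= grant

old_file = Node("old-file")               # stands in for a file created before the grant
new_file = Node("new-file", parent=root)  # created after the grant

assert ("purview-mi", "r") in new_file.access_acl      # inherited via the default entry
assert ("purview-mi", "r") not in old_file.access_acl  # default entries don't reach back
```

This is why a default-entry-only grant lets the scan see future files but not files that already exist.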
-#### Using Service Principal for scanning
-
-##### Creating a new service principal
-
-If you need to [create a new service principal](./create-service-principal-azure.md), you must register an application in your Azure AD tenant and grant the service principal access to your data sources. Your Azure AD Global Administrator, or another role such as Application Administrator, can perform this operation.
-
-##### Getting the Service Principal's application ID
-
-1. Copy the **Application (client) ID** present in the **Overview** of the [_Service Principal_](./create-service-principal-azure.md) already created
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sp-appl-id.png" alt-text="Screenshot that shows the Application (client) ID for the Service Principal":::
-
-##### Granting the Service Principal access to your ADLS Gen1 account
-
-It is important to give your service principal the permission to scan the ADLS Gen1 data source. You can add access for the service principal at the Subscription, Resource Group, or Resource level, depending on what permissions it needs.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a service principal on an Azure resource.
-
-1. Provide the service principal access to the storage account by opening the storage account and selecting **Overview** --> **Data Explorer**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-data-explorer.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access** in the top navigation
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-storage-access.png" alt-text="Screenshot that shows the Data explorer for the storage account":::
-
-1. Choose **Select** and add the _Service Principal_ in the **Select user or group** menu.
-1. Select **Read** and **Execute** permissions. Make sure to choose **This folder and all children**, and **An access permission entry and a default permission entry** in the Add options. Select **OK**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sp-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the service principal":::
-
-### Creating the scan
-
-1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview governance portal**
-
-1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-open-purview-studio.png" alt-text="Screenshot that shows the collection hierarchy":::
-
-1. Select the **New Scan** icon under the **ADLS Gen1 data source** registered earlier
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-new-scan.png" alt-text="Screenshot that shows the data source with the new scan icon":::
-
-#### If using system or user-assigned managed identity
-
-Provide a **Name** for the scan, select the system or user-assigned managed identity under **Credential**, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**.
--
-#### If using Service Principal
-
-1. Provide a **Name** for the scan, choose the appropriate collection for the scan, and select the **+ New** under **Credential**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sp.png" alt-text="Screenshot that shows the service principal option":::
-
-1. Select the appropriate **Key vault connection** and the **Secret name** that was used while creating the _Service Principal_. The **Service Principal ID** is the **Application (client) ID** copied as indicated earlier
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sp-key-vault.png" alt-text="Screenshot that shows the service principal key vault option":::
-
-1. Select **Test connection**. On a successful connection, select **Continue**
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sp-test-connection.png" alt-text="Screenshot that shows the test connection for service principal":::
-
-### Scoping and running the scan
-
-1. You can scope your scan to specific folders and subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. If creating a new _scan rule set_, select the **file types** to be included in the scan rule.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-file-types.png" alt-text="Scan rule set file types":::
-
-1. You can select the **classification rules** to be included in the scan rule
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-classification-rules.png" alt-text="Scan rule set classification rules":::
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-select-scan-rule-set.png" alt-text="Scan rule set selection":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scan-trigger.png" alt-text="scan trigger":::
-
- :::image type="content" source="media/register-scan-adls-gen1/register-register-adls-gen1-scan-trigger-selection.png" alt-text="scan trigger selection":::
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-review-scan.png" alt-text="review scan":::
-
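Conceptually, the scope you choose and the scan rule set's file-type filter combine to decide which files get scanned; a minimal sketch, assuming hypothetical folder paths and file types (not a Purview API):

```python
# Minimal sketch of how scan scoping plus a scan rule set's file-type
# filter select files (paths and types are made up; not a Purview API).

SCOPED_FOLDERS = ["/sales", "/hr/reports"]           # folders ticked in the scope step
RULE_SET_FILE_TYPES = {".csv", ".json", ".parquet"}  # file types in the rule set

def in_scope(path):
    # A file is in scope if it sits inside any selected folder.
    return any(path == f or path.startswith(f + "/") for f in SCOPED_FOLDERS)

def will_scan(path):
    ext = "." + path.rsplit(".", 1)[-1] if "." in path else ""
    return in_scope(path) and ext in RULE_SET_FILE_TYPES

files = ["/sales/q1.csv", "/sales/readme.txt", "/hr/reports/roster.json", "/eng/build.log"]
scanned = [f for f in files if will_scan(f)]
print(scanned)  # ['/sales/q1.csv', '/hr/reports/roster.json']
```

A file must pass both filters: out-of-scope files are skipped even if their type is selected, and in-scope files are skipped if their type is not in the rule set.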
-### Viewing Scan
-
-1. Navigate to the _data source_ in the _Collection_ and select **View Details** to check the status of the scan
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-view-scan.png" alt-text="view scan":::
-
-1. The scan details indicate the progress of the scan in the **Last run status** and the number of assets _scanned_ and _classified_
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scan-details.png" alt-text="view scan detail":::
-
-1. The **Last run status** will be updated to **In progress** and then **Completed** once the entire scan has run successfully
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scan-in-progress.png" alt-text="view scan in progress":::
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-scan-completed.png" alt-text="view scan completed":::
-
-### Managing Scan
-
-Scans can be managed or run again on completion.
-
-1. Select the **Scan name** to manage the scan
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-manage-scan.png" alt-text="manage scan":::
-
-1. You can _run the scan_ again, _edit the scan_, or _delete the scan_
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-manage-scan-options.png" alt-text="manage scan options":::
-
- > [!NOTE]
- > * Deleting your scan does not delete catalog assets created from previous scans.
- > * The asset will no longer be updated with schema changes if your source table has changed and you re-scan the source table after editing the description in the schema tab of Microsoft Purview.
-
-1. You can _run an incremental scan_ or a _full scan_ again.
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-full-inc-scan.png" alt-text="manage scan full or incremental":::
-
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-manage-scan-results.png" alt-text="manage scan results":::
-
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Adls Gen2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-adls-gen2.md
- Title: 'Discover and govern Azure Data Lake Storage (ADLS) Gen2'
-description: This article outlines the process to register an Azure Data Lake Storage Gen2 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen2 source.
-Previously updated: 03/17/2023
-# Connect to Azure Data Lake Storage Gen2 in Microsoft Purview
-
-This article outlines the process to register and govern an Azure Data Lake Storage (ADLS Gen2) data source in Microsoft Purview including instructions to authenticate and interact with the ADLS Gen2 source.
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)|[Yes](#scan) | [Yes](#scan)|[Yes](#scan)| [Yes](create-sensitivity-label.md)| [Yes (preview)](#access-policy) | Limited* |[Yes](#data-sharing)|
-
-\* *Lineage is supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-When scanning an Azure Data Lake Storage Gen2 source, Microsoft Purview supports extracting technical metadata including:
-
-- Storage account
-- Data Lake Storage Gen2 Service
-- File system (container)
-- Folders
-- Files
-- Resource sets
-
-When setting up a scan, you can choose to scan the entire ADLS Gen2 storage account or selective folders. Learn about the supported file formats [here](microsoft-purview-connector-overview.md#file-types-supported-for-scanning).
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-* You need to have at least [Reader permission on the ADLS Gen 2 account](../storage/blobs/data-lake-storage-access-control-model.md#role-based-access-control-azure-rbac) to be able to register it.
-
-## Register
-This section enables you to register the ADLS Gen2 data source for scanning and data sharing in Microsoft Purview.
-
-### Prerequisites for register
-* You'll need to be a Data Source Admin and one of the other Purview roles (for example, Data Reader or Data Share Contributor) to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-### Steps to register
-
-It's important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-
-1. Go to the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to the **Data Map --> Sources**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview governance portal":::
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
-
-1. Create the [Collection hierarchy](./quickstart-create-collection.md) using the **Collections** menu and assign permissions to individual subcollections, as required
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-collections.png" alt-text="Screenshot that shows the collection menu to create collection hierarchy":::
-
-1. Navigate to the appropriate collection under the **Sources** menu and select the **Register** icon to register a new ADLS Gen2 data source
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-register-source.png" alt-text="Screenshot that shows the collection used to register the data source":::
-
-1. Select the **Azure Data Lake Storage Gen2** data source and select **Continue**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-select-data-source.png" alt-text="Screenshot that allows selection of the data source":::
-
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, the existing **Data Lake Store account name**, and the **collection**, and then select **Apply**. Leave the **Data Use Management** toggle in the **Disabled** position until you have reviewed this [document](./how-to-policies-data-owner-storage.md).
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-data-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
-
-1. The ADLS Gen2 storage account will be shown under the selected Collection
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-data-source-collection.png" alt-text="Screenshot that shows the data source mapped to the collection to initiate scanning":::
-
-## Scan
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have properly set up [**authentication for scanning**](#authentication-for-a-scan)
-> 1. Review our [**scan troubleshooting documentation**](troubleshoot-connections.md).
-
-### Authentication for a scan
-
-Your Azure network may allow communication between your Azure resources, but if you've set up firewalls, private endpoints, or virtual networks within Azure, you'll need to follow one of the configurations below.
-
-|Networking constraints |Integration runtime type |Available credential types |
-||||
-|No private endpoints or firewalls | Azure IR | Managed identity (Recommended), service principal, or account key|
-|Firewall enabled but no private endpoints| Azure IR | Managed identity |
-|Private endpoints enabled | *Self-Hosted IR | Service principal, account key|
-
-*To use a self-hosted integration runtime, you'll first need to [create one](manage-integration-runtimes.md) and confirm your [network settings for Microsoft Purview](catalog-private-link.md)
-
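The table above can be read as a simple lookup; a sketch (illustrative only — Microsoft Purview exposes no such function):

```python
# The networking-constraints table as a lookup (illustrative only;
# Microsoft Purview exposes no such function).

def scan_config(firewall_enabled, private_endpoints_enabled):
    """Return (integration runtime type, available credential types)."""
    if private_endpoints_enabled:
        return ("Self-Hosted IR", ["service principal", "account key"])
    if firewall_enabled:
        return ("Azure IR", ["managed identity"])
    return ("Azure IR", ["managed identity", "service principal", "account key"])

# Mirrors the three table rows:
assert scan_config(False, False)[0] == "Azure IR"
assert scan_config(True, False) == ("Azure IR", ["managed identity"])
assert scan_config(True, True)[0] == "Self-Hosted IR"
```

Note the ordering: private endpoints force a self-hosted integration runtime regardless of the firewall setting, which is why that check comes first.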
-# [System or user assigned managed identity](#tab/MI)
-
-#### Using a system or user assigned managed identity for scanning
-
-There are two types of managed identity you can use:
-
-* **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in your Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform the scans.
-
-* **User-assigned managed identity** (preview) - Similar to a system managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-It's important to give your Microsoft Purview account or user-assigned managed identity (UAMI) the permission to scan the ADLS Gen2 data source. You can add your Microsoft Purview account's system-assigned managed identity (which has the same name as your Microsoft Purview account) or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permissions are needed.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find either the subscription, resource group, or resource (for example, an Azure Data Lake Storage Gen2 storage account) that you would like to allow the catalog to scan.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-storage-acct.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access Control (IAM)** in the left navigation and then select **+ Add** --> **Add role assignment**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under the **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
-
- > [!Note]
- > For more details, please see steps in [Authorize access to blobs and queues using Azure Active Directory](../storage/blobs/authorize-access-azure-active-directory.md)
-
- > [!NOTE]
- > If you have firewall enabled for the storage account, you must use **managed identity** authentication method when setting up a scan.
-
-1. Go into your ADLS Gen2 storage account in the [Azure portal](https://portal.azure.com)
-1. Navigate to **Security + networking > Networking**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-networking.png" alt-text="Screenshot that shows the details to provide firewall access":::
-
-1. Choose **Selected Networks** under **Allow access from**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-network-access.png" alt-text="Screenshot that shows the details to allow access to selected networks":::
-
-1. In the **Exceptions** section, select **Allow trusted Microsoft services to access this storage account** and hit **Save**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-permission-microsoft-services.png" alt-text="Screenshot that shows the exceptions to allow trusted Microsoft services to access the storage account":::
-
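Behind the portal's **Add role assignment** step is an ARM role-assignment resource; a sketch of roughly what gets sent, assuming placeholder IDs throughout (the *Storage Blob Data Reader* role-definition GUID shown is the documented built-in value, and the `api-version` is an assumption — verify both against the Azure reference documentation before relying on them):

```python
# Sketch of the ARM role-assignment request behind "Add role assignment".
# All IDs below are placeholders; nothing is sent anywhere.
import json
import uuid

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/my-rg"
    "/providers/Microsoft.Storage/storageAccounts/myadlsgen2"
)
# Built-in role-definition GUID for Storage Blob Data Reader
# (verify against the Azure built-in roles documentation).
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/2a2b9908-6ea1-4ae2-8e65-a410df84e7d1"
)

body = {
    "properties": {
        "roleDefinitionId": role_definition_id,
        # Object ID of the Purview managed identity (placeholder here):
        "principalId": str(uuid.uuid4()),
        "principalType": "ServicePrincipal",
    }
}
# A PUT to this URL creates the assignment (api-version is an assumption):
request_url = (
    f"https://management.azure.com{scope}/providers/Microsoft.Authorization"
    f"/roleAssignments/{uuid.uuid4()}?api-version=2022-04-01"
)
print(json.dumps(body, indent=2))
```

The scope path in the URL is what controls whether the grant lands at the subscription, resource group, or storage-account level.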
-# [Account Key](#tab/AK)
-
-#### Using Account Key for scanning
-
-> [!Note]
-> If you use this option, you need to deploy an _Azure key vault_ resource in your subscription and [assign the _Microsoft Purview account's_ System Assigned Managed Identity (SAMI) the required access permission to secrets inside _Azure key vault_.](manage-credentials.md#microsoft-purview-permissions-on-the-azure-key-vault)
-
-When the selected authentication method is **Account Key**, you need to get your access key and store it in the key vault:
-
-1. Navigate to your ADLS Gen2 storage account
-1. Select **Security + networking > Access keys**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-access-keys.png" alt-text="Screenshot that shows the access keys in the storage account":::
-
-1. Copy your *key* and save it separately for the next steps
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-key.png" alt-text="Screenshot that shows the access keys to be copied":::
-
-1. Navigate to your key vault
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-key-vault.png" alt-text="Screenshot that shows the key vault":::
-
-1. Select **Settings > Secrets** and select **+ Generate/Import**
-
-1. Enter the **Name** and **Value** as the *key* from your storage account
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-secret-values.png" alt-text="Screenshot that shows the key vault option to enter the secret values":::
-
-1. Select **Create** to complete
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-secret.png" alt-text="Screenshot that shows the key vault option to create a secret":::
-
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan
-
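The resulting lookup chain — scan credential → key vault connection → secret → account key — can be modeled as a toy resolver (purely illustrative; these dict structures are not Purview's actual data model or API):

```python
# Toy model of how a scan resolves an Account Key credential
# (illustrative only; not Purview's actual data model or API).

key_vault = {"adls-account-key": "storage-key-value"}  # stands in for Azure Key Vault
key_vault_connections = {"kv-conn": key_vault}         # connections registered in Purview

credentials = {
    "adls-key-cred": {
        "type": "Account Key",
        "key_vault_connection": "kv-conn",
        "secret_name": "adls-account-key",
    }
}

def resolve_account_key(credential_name):
    """Follow credential -> key vault connection -> secret to the stored key."""
    cred = credentials[credential_name]
    vault = key_vault_connections[cred["key_vault_connection"]]
    return vault[cred["secret_name"]]

assert resolve_account_key("adls-key-cred") == "storage-key-value"
```

The indirection is the point: the scan configuration never holds the account key itself, only a pointer into the key vault.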
-# [Service Principal](#tab/SP)
-
-#### Using Service Principal for scanning
-
-##### Creating a new service principal
-
-If you need to [create a new service principal](./create-service-principal-azure.md), you must register an application in your Azure AD tenant and grant the service principal access to your data sources. Your Azure AD Global Administrator, or another role such as Application Administrator, can perform this operation.
-
-##### Getting the Service Principal's Application ID
-
-1. Copy the **Application (client) ID** present in the **Overview** of the [_Service Principal_](./create-service-principal-azure.md) already created
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sp-appln-id.png" alt-text="Screenshot that shows the Application (client) ID for the Service Principal":::
-
-##### Granting the Service Principal access to your ADLS Gen2 account
-
-It's important to give your service principal the permission to scan the ADLS Gen2 data source. You can add access for the service principal at the Subscription, Resource Group, or Resource level, depending on what level scan permissions are needed.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a service principal on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find either the subscription, resource group, or resource (for example, an Azure Data Lake Storage Gen2 storage account) that you would like to allow the catalog to scan.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-storage-acct.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access Control (IAM)** in the left navigation and then select **+ Add** --> **Add role assignment**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ in the **Select** input box. Then, select **Save** to give this role assignment to your service principal.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sp-permission.png" alt-text="Screenshot that shows the details to provide storage account permissions to the service principal":::
---
-### Create the scan
-
-1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview governance portal**
-1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
-1. Select the **New Scan** icon under the **ADLS Gen2 data source** registered earlier
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-new-scan.png" alt-text="Screenshot that shows the screen to create a new scan":::
-
-# [System or user assigned managed identity](#tab/MI)
-
-#### If using a system or user assigned managed identity
-
-1. Provide a **Name** for the scan, select the system-assigned or user-assigned managed identity under **Credential**, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-managed-identity.png" alt-text="Screenshot that shows the managed identity option to run the scan":::
-
-# [Account Key](#tab/AK)
-
-#### If using Account Key
-
-1. Provide a **Name** for the scan, select the Azure IR or your Self-Hosted IR depending on your configuration, choose the appropriate collection for the scan, and select **+ New** under credential.
-
-1. Select **Account Key** as the authentication method, then select the appropriate **Key vault connection**, and provide the name of the secret you used to store the account key. Then select **Create**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-acct-key.png" alt-text="Screenshot that shows the Account Key option for scanning":::
-
-1. Select **Test connection**. On a successful connection, select **Continue**
-
-# [Service Principal](#tab/SP)
-
-#### If using Service Principal
-
-1. Provide a **Name** for the scan, select the Azure IR or your Self-Hosted IR depending on your configuration, choose the appropriate collection for the scan, and select the **+ New** under **Credential**
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sp-option.png" alt-text="Screenshot that shows the option for service principal to enable scanning":::
-
-1. Select the appropriate **Key vault connection** and the **Secret name** that was used while creating the _Service Principal_. The **Service Principal ID** is the **Application (client) ID** copied earlier.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-service-principal-option.png" alt-text="Screenshot that shows the service principal option":::
-
-1. Select **Test connection**. On a successful connection, select **Continue**
---
-### Scope and run the scan
-
-1. You can scope your scan to specific folders and subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. If creating a new _scan rule set_, select the **file types** to be included in the scan rule.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-file-types.png" alt-text="Scan rule set file types":::
-
-1. You can select the **classification rules** to be included in the scan rule
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-classification rules.png" alt-text="Scan rule set classification rules":::
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-select-scan-rule-set.png" alt-text="Scan rule set selection":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-scan-trigger.png" alt-text="scan trigger":::
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-review-scan.png" alt-text="review scan":::
--
-## Data sharing
-
-Microsoft Purview Data Sharing (preview) enables in-place sharing of data from ADLS Gen2 to ADLS Gen2. This section details the ADLS Gen2-specific requirements for sharing and receiving data in-place. See [How to share data](how-to-share-data.md) and [How to receive share](how-to-receive-share.md) for a step-by-step guide.
-
-### Storage accounts supported for in-place data sharing
-
-The following storage accounts are supported for in-place data sharing:
-
-* Regions: Canada Central, Canada East, UK South, UK West, Australia East, Japan East, Korea South, and South Africa North
-* Redundancy options: LRS, GRS, RA-GRS
-* Tiers: Hot, Cool
-
-For the preview, only use storage accounts without production workloads.
-
->[!NOTE]
-> Source and target storage accounts must be in the same region as each other. They don't need to be in the same region as the Microsoft Purview account.
-
-### Storage account permissions required to share data
-
-To add or update a storage account asset to a share, you need ONE of the following permissions:
-
-* **Microsoft.Authorization/roleAssignments/write** - This permission is available in the _Owner_ role.
-* **Microsoft.Storage/storageAccounts/blobServices/containers/blobs/modifyPermissions/** - This permission is available in the _Storage Blob Data Owner_ role.
-
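Whether a given role carries one of these permissions comes down to Azure RBAC action-string matching, where `*` is a wildcard; a rough sketch with abbreviated action lists (illustrative subsets, not the full built-in role definitions):

```python
# Rough sketch of RBAC action matching with "*" wildcards; the role
# action lists below are abbreviated illustrations, not full definitions.
from fnmatch import fnmatchcase

ROLES = {
    "Owner": ["*"],
    "Storage Blob Data Owner": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/modifyPermissions/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/*",
    ],
    "Reader": ["*/read"],
}

def role_grants(role, action):
    return any(fnmatchcase(action, pattern) for pattern in ROLES[role])

needed = "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/modifyPermissions/action"
print(sorted(r for r in ROLES if role_grants(r, needed)))
# ['Owner', 'Storage Blob Data Owner']
```

Owner qualifies through its `*` wildcard even though the data-plane action isn't listed explicitly, which is why either role on the list is sufficient.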
-### Storage account permissions required to receive shared data
-
-To map a storage account asset in a received share, you need ONE of the following permissions:
-
-* **Microsoft.Storage/storageAccounts/write** - This permission is available in the _Contributor_ and _Owner_ roles.
-* **Microsoft.Storage/storageAccounts/blobServices/containers/write** - This permission is available in the _Contributor_, _Owner_, _Storage Blob Data Contributor_, and _Storage Blob Data Owner_ roles.
-
-### Update shared data in source storage account
-
-Updates you make to shared files, or to data in the shared folder, in the source storage account are made available to the recipient in the target storage account in near real time. When you delete a subfolder or files within the shared folder, they disappear for the recipient. To delete the shared folder, file, or parent folders or containers, you need to first revoke access to all your shares from the source storage account.
-
-### Access shared data in target storage account
-
-The target storage account enables the recipient to access the shared data read-only in near real time. You can connect analytics tools such as Synapse Workspace and Databricks to the shared data to perform analytics. The cost of accessing the shared data is charged to the target storage account.
-
-### Service limit
-
-A source storage account can support up to 20 targets, and a target storage account can support up to 100 sources. If you require an increase in these limits, contact Support.
-
-## Access policy
-
-### Supported policies
-The following types of policies are supported on this data resource from Microsoft Purview:
-- [Data owner policies](concept-policies-data-owner.md)
-- [Self-service access policies](concept-self-service-data-access-policy.md)
-
-### Access policy pre-requisites on Azure Storage accounts
-
-### Configure the Microsoft Purview account for policies
-
-### Register the data source in Microsoft Purview for Data Use Management
-The Azure Storage resource needs to be registered first with Microsoft Purview before you can create access policies.
-To register your resource, follow the **Prerequisites** and **Register** sections of this guide:
-- [Register Azure Data Lake Storage (ADLS) Gen2 in Microsoft Purview](register-scan-adls-gen2.md#prerequisites)
-
-After you've registered the data source, you'll need to enable Data Use Management. This is a prerequisite before you can create policies on the data source. Data Use Management can impact the security of your data, as it delegates management of access to the data sources to certain Microsoft Purview roles. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** option set to **Enabled**, it will look like this screenshot:
-![Screenshot shows how to register a data source for policy with the option Data use management set to Enabled.](./media/how-to-policies-data-owner-storage/register-data-source-for-policy-storage.png)
-
-### Create a policy
-To create an access policy for Azure Data Lake Storage Gen2, follow this guide:
-* [Provision read/modify access on a single storage account](./how-to-policies-data-owner-storage.md#create-and-publish-a-data-owner-policy)
-
-To create policies that cover all data sources inside a resource group or Azure subscription you can refer to [this section](register-scan-azure-multiple-sources.md#access-policy).
-
-## Next steps
-Follow the guides below to learn more about Microsoft Purview and your data.
-- [Data owner policies in Microsoft Purview](concept-policies-data-owner.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Data share in Microsoft Purview](concept-data-share.md)
purview Register Scan Amazon Rds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-amazon-rds.md
- Title: Amazon RDS Multicloud scanning connector for Microsoft Purview
-description: This how-to guide describes details of how to scan Amazon RDS databases, including both Microsoft SQL and PostgreSQL data.
- Previously updated: 12/07/2022
-# Customer intent: As a security officer, I need to understand how to use the Microsoft Purview connector for Amazon RDS service to set up, configure, and scan my Amazon RDS databases.
--
-# Amazon RDS Multicloud Scanning Connector for Microsoft Purview (Public preview)
-
-The Multicloud Scanning Connector for Microsoft Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services, in addition to Azure storage services.
--
-This article describes how to use Microsoft Purview to scan your structured data currently stored in Amazon RDS, including both Microsoft SQL and PostgreSQL databases, and discover what types of sensitive information exists in your data. You'll also learn how to identify the Amazon RDS databases where the data is currently stored for easy information protection and data compliance.
-
-For this service, use Microsoft Purview to provide a Microsoft account with secure access to AWS, where the Multicloud Scanning Connectors for Microsoft Purview will run. The Multicloud Scanning Connectors for Microsoft Purview use this access to your Amazon RDS databases to read your data, and then report the scanning results, including only the metadata and classification, back to Azure. Use the Microsoft Purview classification and labeling reports to analyze and review your data scan results.
-
-> [!IMPORTANT]
-> The Multicloud Scanning Connectors for Microsoft Purview are separate add-ons to Microsoft Purview. The terms and conditions for the Multicloud Scanning Connectors for Microsoft Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
->
-
-## Microsoft Purview scope for Amazon RDS
-- **Supported database engines**: Amazon RDS structured data storage supports multiple database engines. Microsoft Purview supports Amazon RDS based on Microsoft SQL and PostgreSQL.
-
-- **Maximum columns supported**: Scanning RDS tables with more than 300 columns isn't supported.
-
-- **Public access support**: Microsoft Purview supports scanning only with VPC Private Link in AWS, and doesn't include public access scanning.
-
-- **Supported regions**: Microsoft Purview only supports Amazon RDS databases that are located in the following AWS regions:
-
- - US East (Ohio)
- - US East (N. Virginia)
- - US West (N. California)
- - US West (Oregon)
- - Europe (Frankfurt)
- - Asia Pacific (Tokyo)
- - Asia Pacific (Singapore)
- - Asia Pacific (Sydney)
- - Europe (Ireland)
- - Europe (London)
- - Europe (Paris)
-- **IP address requirements**: Your RDS database must have a static IP address. The static IP address is used to configure AWS PrivateLink, as described in this article.
-
-- **Known issues**: The following functionality isn't currently supported:
-
- - The **Test connection** button. The scan status messages will indicate any errors related to connection setup.
- - Selecting specific tables in your database to scan.
- - [Data lineage](concept-data-lineage.md).
-
-For more information, see:
-
-- [Manage and increase quotas for resources with Microsoft Purview](how-to-manage-quotas.md)
-- [Supported data sources and file types in Microsoft Purview](sources-and-scans.md)
-- [Use private endpoints for your Microsoft Purview account](catalog-private-link.md)
-
-## Prerequisites
-
-Ensure that you've performed the following prerequisites before adding your Amazon RDS database as a Microsoft Purview data source and scanning your RDS data.
-
-> [!div class="checklist"]
-> * You need to be a Microsoft Purview Data Source Admin.
-> * You need a Microsoft Purview account. [Create a Microsoft Purview account instance](create-catalog-portal.md), if you don't yet have one.
-> * You need an Amazon RDS PostgreSQL or Microsoft SQL database, with data.
--
-## Configure AWS to allow Microsoft Purview to connect to your RDS VPC
-
-Microsoft Purview supports scanning only when your database is hosted in a virtual private cloud (VPC), where your RDS database can only be accessed from within the same VPC.
-
-The Azure Multicloud Scanning Connectors for Microsoft Purview service runs in a separate Microsoft account in AWS. To scan your RDS databases, the Microsoft AWS account needs to be able to access your RDS databases in your VPC. To allow this access, you'll need to configure [AWS PrivateLink](https://aws.amazon.com/privatelink/) between the RDS VPC (in the customer account) and the VPC where the Multicloud Scanning Connectors for Microsoft Purview run (in the Microsoft account).
-
-The following diagram shows the components in both your customer account and the Microsoft account. Highlighted in yellow are the components you'll need to create to enable connectivity from the RDS VPC in your account to the VPC where the Multicloud Scanning Connectors for Microsoft Purview run in the Microsoft account.
---
-> [!IMPORTANT]
-> Any AWS resources created for a customer's private network will incur extra costs on the customer's AWS bill.
->
-
-### Configure AWS PrivateLink using a CloudFormation template
-
-The following procedure describes how to use an AWS CloudFormation template to configure AWS PrivateLink, allowing Microsoft Purview to connect to your RDS VPC. This procedure is performed in AWS and is intended for an AWS admin.
-
-This CloudFormation template is available for download from the [Azure GitHub repository](https://github.com/Azure/Azure-Purview-Starter-Kit/tree/main/Amazon/AWS/RDS), and will help you create a target group, load balancer, and endpoint service.
-- **If you have multiple RDS servers in the same VPC**, perform this procedure once, [specifying all RDS server IP addresses and ports](#parameters). In this case, the CloudFormation output will include different ports for each RDS server.
-
- When [registering these RDS servers as data sources in Microsoft Purview](#register-an-amazon-rds-data-source), use the ports included in the output instead of the real RDS server ports.
-- **If you have RDS servers in multiple VPCs**, perform this procedure for each of the VPCs.
-
-> [!TIP]
-> You can also perform this procedure manually. For more information, see [Configure AWS PrivateLink manually (advanced)](#configure-aws-privatelink-manually-advanced).
->
-
-**To prepare your RDS database with a CloudFormation template**:
-
-1. Download the CloudFormation [RDSPrivateLink_CloudFormation.yaml](https://github.com/Azure/Azure-Purview-Starter-Kit/tree/main/Amazon/AWS/RDS) template required for this procedure from the Azure GitHub repository:
-
- 1. At the right of the [linked GitHub page](https://github.com/Azure/Azure-Purview-Starter-Kit/blob/main/Amazon/AWS/RDS/Amazon_AWS_RDS_PrivateLink_CloudFormation.zip), select **Download** to download the zip file.
-
- 1. Extract the .zip file to a local location so that you can access the **RDSPrivateLink_CloudFormation.yaml** file.
-
-1. In the AWS portal, navigate to the **CloudFormation** service. At the top-right of the page, select **Create stack** > **With new resources (standard)**.
-
-1. On the **Prerequisite - Prepare Template** page, select **Template is ready**.
-
-1. In the **Specify template** section, select **Upload a template file**. Select **Choose file**, navigate to the **RDSPrivateLink_CloudFormation.yaml** file you downloaded earlier, and then select **Next** to continue.
-
-1. In the **Stack name** section, enter a name for your stack. This name will be used, together with an automatically added suffix, for the resource names created later in the process. Therefore:
-
- - Make sure to use a meaningful name for your stack.
- - Make sure that the stack name is no longer than 19 characters.
-
-1. <a name="parameters"></a>In the **Parameters** area, enter the following values, using data available from your RDS database page in AWS:
-
- |Name |Description |
- |||
|**Endpoint & port** | Enter the resolved IP address of the RDS endpoint URL and port. For example: `192.168.1.1:5432` <br><br>- **If an RDS proxy is configured**, use the IP address of the read/write endpoint of the proxy for the relevant database. We recommend using an RDS proxy when working with Microsoft Purview, as the IP address is static.<br><br>- **If you have multiple endpoints behind the same VPC**, enter up to 10 comma-separated endpoints. In this case, a single load balancer is created in the VPC, allowing a connection from the Amazon RDS Multicloud Scanning Connector for Microsoft Purview in AWS to all RDS endpoints in the VPC. |
- |**Networking** | Enter your VPC ID |
|**VPC IPv4 CIDR** | Enter your VPC's IPv4 CIDR value. You can find this value by selecting the VPC link on your RDS database page. For example: `192.168.0.0/16` |
- |**Subnets** |Select all the subnets that are associated with your VPC. |
- |**Security** | Select the VPC security group associated with the RDS database. |
- | | |
-
- When you're done, select **Next** to continue.
-
-1. The settings on the **Configure stack options** are optional for this procedure.
-
- Define your settings as needed for your environment. For more information, select the **Learn more** links to access the AWS documentation. When you're done, select **Next** to continue.
-
-1. On the **Review** page, check to make sure that the values you selected are correct for your environment. Make any changes needed and then select **Create stack** when you're done.
-
-1. Watch for the resources to be created. When complete, relevant data for this procedure is shown on the following tabs:
-
- - **Events**: Shows the events / activities performed by the CloudFormation template
- - **Resources**: Shows the newly created target group, load balancer, and endpoint service
- - **Outputs**: Displays the **ServiceName** value, and the IP address and port of the RDS servers
-
- If you have multiple RDS servers configured, a different port is displayed. In this case, use the port shown here instead of the actual RDS server port when [registering your RDS database](#register-an-amazon-rds-data-source) as a Microsoft Purview data source.
-
-1. In the **Outputs** tab, copy the **ServiceName** key value to the clipboard.
-
- You'll use the value of the **ServiceName** key in the Microsoft Purview governance portal when [registering your RDS database](#register-an-amazon-rds-data-source) as a Microsoft Purview data source. There, enter the **ServiceName** key in the **Connect to private network via endpoint service** field.
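The **Endpoint & port** parameter described earlier accepts up to 10 comma-separated `IP:port` entries. As a quick sanity check before pasting the value into the CloudFormation **Parameters** area, you can validate the list locally. This helper (`parse_endpoints`) is purely illustrative and is not part of the template or any AWS tooling:

```python
import ipaddress

MAX_ENDPOINTS = 10  # limit stated for the CloudFormation template

def parse_endpoints(value: str):
    """Validate a comma-separated list of 'IP:port' RDS endpoints."""
    entries = [e.strip() for e in value.split(",") if e.strip()]
    if not entries:
        raise ValueError("at least one endpoint is required")
    if len(entries) > MAX_ENDPOINTS:
        raise ValueError(f"at most {MAX_ENDPOINTS} endpoints are supported")
    parsed = []
    for entry in entries:
        ip_text, sep, port_text = entry.rpartition(":")
        if not sep:
            raise ValueError(f"missing port in {entry!r}")
        ip = ipaddress.IPv4Address(ip_text)  # raises ValueError on a bad address
        port = int(port_text)                # raises ValueError on a bad port
        if not 0 < port < 65536:
            raise ValueError(f"port out of range in {entry!r}")
        parsed.append((str(ip), port))
    return parsed

print(parse_endpoints("192.168.1.1:5432, 192.168.1.2:1433"))
```

A malformed entry, a bad address, or more than 10 endpoints raises `ValueError` before you commit the value to the stack.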
-
-## Register an Amazon RDS data source
-
-**To add your Amazon RDS server as a Microsoft Purview data source**:
-
-1. In Microsoft Purview, navigate to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png).
-
-1. On the **Sources** page, select **Register**. On the **Register sources** page that appears on the right, select the **Database** tab, and then select **Amazon RDS (PostgreSQL)** or **Amazon RDS (SQL)**.
-
- :::image type="content" source="media/register-scan-amazon-rds/register-amazon-rds.png" alt-text="Screenshot of the Register sources page to select Amazon RDS (PostgreSQL)." lightbox="media/register-scan-amazon-rds/register-amazon-rds.png":::
-
-1. Enter the details for your source:
-
- |Field |Description |
- |||
- |**Name** | Enter a meaningful name for your source, such as `AmazonPostgreSql-Ups` |
- |**Server name** | Enter the name of your RDS database in the following syntax: `<instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com` <br><br>We recommend that you copy this URL from the Amazon RDS portal, and make sure that the URL includes the AWS region. |
|**Port** | Enter the port used to connect to the RDS database:<br><br> - PostgreSQL: `5432`<br> - Microsoft SQL: `1433`<br><br> If you've [configured AWS PrivateLink using a CloudFormation template](#configure-aws-privatelink-using-a-cloudformation-template) and have multiple RDS servers in the same VPC, use the ports listed in the CloudFormation **Outputs** tab instead of the real RDS server ports. |
- |**Connect to private network via endpoint service** | Enter the **ServiceName** key value obtained at the end of the [previous procedure](#configure-aws-privatelink-using-a-cloudformation-template). <br><br>If you've prepared your RDS database manually, use the **Service Name** value obtained at the end of [Step 5: Create an endpoint service](#step-5-create-an-endpoint-service). |
- |**Collection** (optional) | Select a collection to add your data source to. For more information, see [Manage data sources in Microsoft Purview (Preview)](manage-data-sources.md). |
- | | |
-
-1. Select **Register** when you're ready to continue.
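The **Server name** value above follows the syntax `<instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com`, and must include the AWS region. If you want to sanity-check a copied endpoint URL before registering, a sketch like the following can help; the regex is an illustrative approximation of the endpoint format, not an official AWS definition:

```python
import re

# <instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com
RDS_HOST_RE = re.compile(
    r"^(?P<instance>[a-z][a-z0-9-]*)\."
    r"(?P<account_hash>[a-z0-9]+)\."
    r"(?P<region>[a-z]{2}-[a-z]+-\d)"
    r"\.rds\.amazonaws\.com$"
)

def check_server_name(host: str) -> str:
    """Return the AWS region embedded in an RDS endpoint URL, or raise."""
    match = RDS_HOST_RE.match(host)
    if not match:
        raise ValueError(f"{host!r} does not look like an RDS endpoint URL")
    return match.group("region")

# Hypothetical endpoint, for illustration only:
print(check_server_name("mydb.abcdefghijkl.us-east-1.rds.amazonaws.com"))  # → us-east-1
```

A URL that is missing the region segment, or that points at a non-RDS host, fails the check immediately rather than at scan time.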
-
-Your RDS data source appears in the Sources map or list. For example:
--
-## Create Microsoft Purview credentials for your RDS scan
-
-Credentials supported for Amazon RDS data sources include username/password authentication only, with a password stored in an Azure KeyVault secret.
-
-### Create a secret for your RDS credentials to use in Microsoft Purview
-
-1. Add your password to an Azure KeyVault as a secret. For more information, see [Set and retrieve a secret from Key Vault using Azure portal](../key-vault/secrets/quick-create-portal.md).
-
-1. Add an access policy to your KeyVault with **Get** and **List** permissions. For example:
-
- :::image type="content" source="media/register-scan-amazon-rds/keyvault-for-rds.png" alt-text="Screenshot of an access policy for RDS in Microsoft Purview.":::
-
- When defining the principal for the policy, select your Microsoft Purview account. For example:
-
- :::image type="content" source="media/register-scan-amazon-rds/select-purview-as-principal.png" alt-text="Screenshot of selecting your Microsoft Purview account as Principal.":::
-
- Select **Save** to save your Access Policy update. For more information, see [Assign an Azure Key Vault access policy](/azure/key-vault/general/assign-access-policy-portal).
-
-1. In Microsoft Purview, add a KeyVault connection to connect the KeyVault with your RDS secret to Microsoft Purview. For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
-### Create your Microsoft Purview credential object for RDS
-
-In Microsoft Purview, create a credentials object to use when scanning your Amazon RDS account.
-
-1. In the Microsoft Purview **Management** area, select **Security and access** > **Credentials** > **New**.
-
-1. Select **SQL authentication** as the authentication method. Then, enter details for the Key Vault where your RDS credentials are stored, including the names of your Key Vault and secret.
-
- For example:
-
- :::image type="content" source="media/register-scan-amazon-rds/new-credential-for-rds.png" alt-text="Screenshot of a new credential for RDS.":::
-
-For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
-## Scan an Amazon RDS database
-
-To configure a Microsoft Purview scan for your RDS database:
-
-1. From the Microsoft Purview **Sources** page, select the Amazon RDS data source to scan.
-
-1. Select :::image type="icon" source="media/register-scan-amazon-s3/new-scan-button.png" border="false"::: **New scan** to start defining your scan. In the pane that opens on the right, enter the following details, and then select **Continue**.
-
- - **Name**: Enter a meaningful name for your scan.
- **Database name**: Enter the name of the database you want to scan. You'll need to find the names available from outside Microsoft Purview, and create a separate scan for each database in the registered RDS server.
- - **Credential**: Select the credential you created earlier for the Multicloud Scanning Connectors for Microsoft Purview to access the RDS database.
-
-1. On the **Select a scan rule set** pane, select the scan rule set you want to use, or create a new one. For more information, see [Create a scan rule set](create-a-scan-rule-set.md).
-
-1. On the **Set a scan trigger** pane, select whether you want to run the scan once, or at a recurring time, and then select **Continue**.
-
-1. On the **Review your scan** pane, review the details and then select **Save and Run**, or **Save** to run it later.
-
-While you run your scan, select **Refresh** to monitor the scan progress.
-
-> [!NOTE]
-> When working with Amazon RDS PostgreSQL databases, only full scans are supported. Incremental scans are not supported as PostgreSQL does not have a **Last Modified Time** value.
->
-
-## Explore scanning results
-
-After a Microsoft Purview scan is complete on your Amazon RDS databases, drill down in the Microsoft Purview **Data Map** area to view the scan history. Select a data source to view its details, and then select the **Scans** tab to view any currently running or completed scans.
-
-Use the other areas of Microsoft Purview to find out details about the content in your data estate, including your Amazon RDS databases:
-- **Explore RDS data in the catalog**. The Microsoft Purview catalog shows a unified view across all source types, and RDS scanning results are displayed in a similar way to Azure SQL. You can browse the catalog using filters or browse the assets and navigate through the hierarchy. For more information, see:
-
- - [Tutorial: Browse assets in Microsoft Purview (preview) and view their lineage](tutorial-browse-and-view-lineage.md)
- - [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
- - [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md)
-- **View Insight reports** to view statistics for the classification, sensitivity labels, file types, and more details about your content.
-
- All Microsoft Purview Insight reports include the Amazon RDS scanning results, along with the rest of the results from your Azure data sources. When relevant, an **Amazon RDS** asset type is added to the report filtering options.
-
- For more information, see the [Understand Data Estate Insights in Microsoft Purview](concept-insights.md).
-- **View RDS data in other Microsoft Purview features**, such as the **Scans** and **Glossary** areas. For more information, see:
-
- - [Create a scan rule set](create-a-scan-rule-set.md)
- - [Tutorial: Create and import glossary terms in Microsoft Purview (preview)](tutorial-import-create-glossary-terms.md)
--
-## Configure AWS PrivateLink manually (advanced)
-
-This procedure describes the manual steps required for preparing your RDS database in a VPC to connect to Microsoft Purview.
-
-By default, we recommend that you use a CloudFormation template instead, as described earlier in this article. For more information, see [Configure AWS PrivateLink using a CloudFormation template](#configure-aws-privatelink-using-a-cloudformation-template).
-
-### Step 1: Retrieve your Amazon RDS endpoint IP address
-
-Locate the IP address of your Amazon RDS endpoint, hosted inside an Amazon VPC. You'll use this IP address later in the process when you create your target group.
-
-**To retrieve your RDS endpoint IP address**:
-
-1. In Amazon RDS, navigate to your RDS database, and identify your endpoint URL. This is located under **Connectivity & security**, as your **Endpoint** value.
-
- > [!TIP]
- > Use the following command to get a list of the databases in your endpoint: `aws rds describe-db-instances`
- >
-
-1. Use the endpoint URL to find the IP address of your Amazon RDS database. For example, use one of the following methods:
-
- - **Ping**: `ping <DB-Endpoint>`
-
- - **nslookup**: `nslookup <Db-Endpoint>`
-
- - **Online nslookup**. Enter your database **Endpoint** value in the search box and select **Find DNS records**. **NSLookup.io** shows your IP address on the next screen.
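The same lookup can also be scripted with the standard library, which is convenient if you resolve several endpoints at once. A minimal sketch (the endpoint URL in the comment is hypothetical):

```python
import socket

def resolve_endpoint(endpoint: str) -> str:
    """Resolve an RDS endpoint hostname to its IPv4 address, like nslookup."""
    return socket.gethostbyname(endpoint)

# Example with a hypothetical RDS endpoint URL:
# resolve_endpoint("mydb.abcdefghijkl.us-east-1.rds.amazonaws.com")
```

Note that the resolved address is only stable if your RDS instance has a static IP, which is why this article recommends an RDS proxy.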
-
-### Step 2: Enable your RDS connection from a load balancer
-
-To ensure that your RDS connection will be allowed from the load balancer you create later in the process:
-
-1. **Find the VPC IP range**.
-
- In Amazon RDS, navigate to your RDS database. In the **Connectivity & security** area, select the **VPC** link to find its IP range (IPv4 CIDR).
-
- In the **Your VPCs** area, your IP range is shown in the **IPv4 CIDR** column.
-
- > [!TIP]
- > To perform this step via CLI, use the following command: `aws ec2 describe-vpcs`
- >
- > For more information, see [ec2 - AWS CLI 1.19.105 Command Reference (amazon.com)](https://docs.aws.amazon.com/cli/latest/reference/ec2/).
- >
-
-1. <a name="security-group"></a>**Create a Security Group for this IP range**.
-
- 1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/ and navigate to **Security Groups**.
-
- 1. Select **Create security group** and then create your security group, making sure to include the following details:
-
- - **Security group name**: Enter a meaningful name
- - **Description**: Enter a description for your security group
- - **VPC**: Select your RDS database VPC
-
- 1. Under **Inbound rules**, select **Add rule** and enter the following details:
-
- - **Type**: Select **Custom TCP**
- - **Port range**: Enter your RDS database port
- - **Source**: Select **Custom** and enter the VPC IP range from the previous step.
-
- 1. Scroll to the bottom of the page and select **Create security group**.
-
-1. **Associate the new security group to RDS**.
-
- 1. In Amazon RDS, navigate to your RDS database, and select **Modify**.
-
- 1. Scroll down to the **Connectivity** section, and in the **Security group** field, add the new security group that you created in the [previous step](#security-group). Then scroll down to the bottom of the page and select **Continue.**
-
- 1. In the **Scheduling of modifications** section, select **Apply immediately** to update the security group immediately.
-
- 1. Select **Modify DB instance**.
-
-> [!TIP]
-> To perform this step via CLI, use the following commands:
->
> - `aws ec2 create-security-group --description <value> --group-name <value> [--vpc-id <value>]`
->
> For more information, see [create-security-group - AWS CLI 1.19.105 Command Reference (amazon.com)](https://docs.aws.amazon.com/cli/latest/reference/ec2/create-security-group.html).
->
> - `aws rds modify-db-instance --db-instance-identifier <value> --vpc-security-group-ids <value>`
->
> For more information, see [modify-db-instance - AWS CLI 1.19.105 Command Reference (amazon.com)](https://docs.aws.amazon.com/cli/latest/reference/rds/modify-db-instance.html).
->
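Since the inbound rule above opens the RDS port to the entire VPC IPv4 CIDR, it's worth confirming that the RDS endpoint IP you retrieved in Step 1 actually falls inside that range before moving on. A small illustrative check, using the example values from earlier in this article:

```python
import ipaddress

def ip_in_vpc(ip: str, vpc_cidr: str) -> bool:
    """Check whether an RDS endpoint IP falls inside a VPC IPv4 CIDR."""
    return ipaddress.IPv4Address(ip) in ipaddress.IPv4Network(vpc_cidr)

# Example values used in this article (192.168.1.1 endpoint, 192.168.0.0/16 VPC):
print(ip_in_vpc("192.168.1.1", "192.168.0.0/16"))  # → True
```

If this returns `False`, the endpoint is in a different VPC and the security group rule won't cover it.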
-
-### Step 3: Create a target group
-
-**To create your target group in AWS**:
-
-1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/ and navigate to **Load Balancing** > **Target Groups**.
-
-1. Select **Create target group**, and create your target group, making sure to include the following details:
-
- - **Target type**: Select **IP addresses** (optional)
- - **Protocol**: Select **TCP**
- - **Port**: Enter your RDS database port
- - **VPC**: Enter your RDS database VPC
-
- > [!NOTE]
- > You can find the RDS database port and VPC values on your RDS database page, under **Connectivity & security**
-
- When you're done, select **Next** to continue.
-
-1. In the **Register targets** page, enter your RDS database IP address, and then select **Include as pending below**.
-
-1. After you see the new target listed in the **Targets** table, select **Create target group** at the bottom of the page.
-
-> [!TIP]
-> To perform this step via CLI, use the following command:
->
-> - `aws elbv2 create-target-group --name <tg-name> --protocol <db-protocol> --port <db-port> --target-type ip --vpc-id <db-vpc-id>`
->
> For more information, see [create-target-group - AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/elbv2/create-target-group.html).
->
-> - `aws elbv2 register-targets --target-group-arn <tg-arn> --targets Id=<db-ip>,Port=<db-port>`
->
> For more information, see [register-targets - AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/elbv2/register-targets.html).
->
-
-### Step 4: Create a load balancer
-
-You can either [create a new network load balancer](#create-load-balancer) to forward traffic to the RDS IP address, or [add a new listener](#listener) to an existing load balancer.
-
-<a name="create-load-balancer"></a>**To create a network load balancer to forward traffic to the RDS IP address**:
-
-1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/ and navigate to **Load Balancing** > **Load Balancers**.
-
-1. Select **Create Load Balancer** > **Network Load Balancer** and then select or enter the following values:
-
- - **Scheme**: Select **Internal**
-
- - **VPC**: Select your RDS database VPC
-
- - **Mapping**: Make sure that the RDS is defined for all AWS regions, and then make sure to select all of those regions. You can find this information in the **Availability zone** value on the [RDS database](#step-1-retrieve-your-amazon-rds-endpoint-ip-address) page, on the **Connectivity & security** tab.
-
- - **Listeners and Routing**:
-
- - *Protocol*: Select **TCP**
- - *Port*: Select **RDS DB port**
- - *Default action*: Select the target group created in the [previous step](#step-3-create-a-target-group)
-
-1. At the bottom of the page, select **Create Load Balancer** > **View Load Balancers**.
-
-1. Wait a few minutes and refresh the screen until the **State** column of the new load balancer is **Active**.
--
-> [!TIP]
-> To perform this step via CLI, use the following commands:
-> - `aws elbv2 create-load-balancer --name <lb-name> --type network --scheme internal --subnet-mappings SubnetId=<value>`
->
> For more information, see [create-load-balancer - AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/elbv2/create-load-balancer.html).
->
-> - `aws elbv2 create-listener --load-balancer-arn <lb-arn> --protocol TCP --port 80 --default-actions Type=forward,TargetGroupArn=<tg-arn>`
->
> For more information, see [create-listener - AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/elbv2/create-listener.html).
->
-
-<a name="listener"></a>**To add a listener to an existing load balancer**:
-
-1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/ and navigate to **Load Balancing** > **Load Balancers**.
-
-1. Select your load balancer > **Listeners** > **Add listener**.
-
-1. On the **Listeners** tab, in the **Protocol : port** area, select **TCP** and enter a new port for your listener.
--
-> [!TIP]
-> To perform this step via CLI, use the following command: `aws elbv2 create-listener --load-balancer-arn <value> --protocol <value> --port <value> --default-actions Type=forward,TargetGroupArn=<target_group_arn>`
->
-> For more information, see the [AWS documentation](https://docs.aws.amazon.com/elasticloadbalancing/latest/network/create-listener.html).
->
-
-### Step 5: Create an endpoint service
-
-After the [load balancer is created](#step-4-create-a-load-balancer) and its state is **Active**, you can create the endpoint service.
-
-**To create the endpoint service**:
-
-1. Open the Amazon VPC console at https://console.aws.amazon.com/vpc/ and navigate to **Virtual Private Cloud > Endpoint Services**.
-
-1. Select **Create Endpoint Service**, and in the **Available Load Balancers** dropdown list, select the new load balancer created in the [previous step](#step-4-create-a-load-balancer), or the load balancer where you added a new listener.
-
-1. In the **Create endpoint service** page, clear the selection for the **Require acceptance for endpoint** option.
-
-1. At the bottom of the page, select **Create Service** > **Close**.
-
-1. Back in the **Endpoint services** page:
-
- 1. Select the new endpoint service you created.
- 1. In the **Allow principals** tab, select **Add principals**.
- 1. In the **Principals to add > ARN** field, enter `arn:aws:iam::181328463391:root`.
- 1. Select **Add principals**.
-
- > [!NOTE]
- > When adding an identity, use an asterisk (*****) to add permissions for all principals. This enables all principals, in all AWS accounts to create an endpoint to your endpoint service. For more information, see the [AWS documentation](https://docs.aws.amazon.com/vpc/latest/privatelink/add-endpoint-service-permissions.html).
-
-> [!TIP]
-> To perform this step via CLI, use the following commands:
->
-> - `aws ec2 create-vpc-endpoint-service-configuration --network-load-balancer-arns <lb-arn> --no-acceptance-required`
->
> For more information, see [create-vpc-endpoint-service-configuration - AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/create-vpc-endpoint-service-configuration.html).
->
-> - `aws ec2 modify-vpc-endpoint-service-permissions --service-id <endpoint-service-id> --add-allowed-principals <purview-scanner-arn>`
->
-> For more information, see the [modify-vpc-endpoint-service-permissions AWS CLI Command Reference](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/modify-vpc-endpoint-service-permissions.html).
->
-
-<a name="service-name"></a>**To copy the service name for use in Microsoft Purview**:
-
-After you've created your endpoint service, copy the **Service name** value for use in the Microsoft Purview governance portal when [registering your RDS database](#register-an-amazon-rds-data-source) as a Microsoft Purview data source.
-
-Locate the **Service name** on the **Details** tab for your selected endpoint service.
-
-> [!TIP]
-> To perform this step via CLI, use the following command: `aws ec2 describe-vpc-endpoint-services`
->
-> For more information, see the [describe-vpc-endpoint-services AWS CLI Command Reference](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/describe-vpc-endpoint-services.html).
->
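-
-The **Service name** you need appears under `ServiceName` in the CLI output. The following is a trimmed, illustrative sketch of the JSON returned; the service identifier shown is a placeholder:
-
-```json
-{
-    "ServiceDetails": [
-        {
-            "ServiceName": "com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0",
-            "ServiceType": [ { "ServiceType": "Interface" } ],
-            "AcceptanceRequired": false
-        }
-    ]
-}
-```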
-
-## Troubleshoot your VPC connection
-
-This section describes common errors that may occur when configuring your VPC connection with Microsoft Purview, and how to troubleshoot and resolve them.
-
-### Invalid VPC service name
-
-If an error of `Invalid VPC service name` or `Invalid endpoint service` appears in Microsoft Purview, use the following steps to troubleshoot:
-
-1. Make sure that your VPC service name is correct. For example:
-
- :::image type="content" source="media/register-scan-amazon-rds/locate-service-name.png" alt-text="Screenshot of the VPC service name in AWS." lightbox="media/register-scan-amazon-rds/locate-service-name.png":::
-
-1. Make sure that the Microsoft ARN is listed in the allowed principals: `arn:aws:iam::181328463391:root`
-
- For more information, see [Step 5: Create an endpoint service](#step-5-create-an-endpoint-service).
-
-1. Make sure that your RDS database is listed in one of the supported regions. For more information, see [Microsoft Purview scope for Amazon RDS](#microsoft-purview-scope-for-amazon-rds).
-
-### Invalid availability zone
-
-If an error of `Invalid Availability Zone` appears in Microsoft Purview, make sure that your RDS is defined for at least one of the following three availability zones:
-
-- **us-east-1a**
-- **us-east-1b**
-- **us-east-1c**
-
-For more information, see the [AWS documentation](https://docs.aws.amazon.com/elasticloadbalancing/latest/classic/elb-getting-started.html).
-
-### RDS errors
-
-The following errors may appear in Microsoft Purview:
-
-- `Unknown database`: The specified database doesn't exist. Check that the configured database name is correct.
-- `Failed to login to the Sql data source. The given auth credential does not have permission on the target database.`: Your username or password is incorrect. Check your credentials and update them as needed.
-
-## Next steps
-
-Learn more about Microsoft Purview Insight reports:
-
-> [!div class="nextstepaction"]
-> [Understand Data Estate Insights in Microsoft Purview](concept-insights.md)
purview Register Scan Amazon S3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-amazon-s3.md
- Title: Amazon S3 multi-cloud scanning connector for Microsoft Purview
-description: This how-to guide describes details of how to scan Amazon S3 buckets in Microsoft Purview.
- Previously updated: 05/08/2023
-# Customer intent: As a security officer, I need to understand how to use the Microsoft Purview connector for Amazon S3 service to set up, configure, and scan my Amazon S3 buckets.
---
-# Amazon S3 Multi-Cloud Scanning Connector for Microsoft Purview
-
-The Multi-Cloud Scanning Connector for Microsoft Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services in addition to Azure storage services.
-
-This article describes how to use Microsoft Purview to scan your unstructured data currently stored in Amazon S3 standard buckets, and discover what types of sensitive information exist in your data. This how-to guide also describes how to identify the Amazon S3 Buckets where the data is currently stored for easy information protection and data compliance.
-
-For this service, use Microsoft Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connector for Microsoft Purview will run. The Multi-Cloud Scanning Connector for Microsoft Purview uses this access to your Amazon S3 buckets to read your data, and then reports the scanning results, including only the metadata and classification, back to Azure. Use the Microsoft Purview classification and labeling reports to analyze and review your data scan results.
-
-> [!IMPORTANT]
-> The Multi-Cloud Scanning Connector for Microsoft Purview is a separate add-on to Microsoft Purview. The terms and conditions for the Multi-Cloud Scanning Connector for Microsoft Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
->
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| Yes | Yes | Yes | Yes | Yes | [Yes](create-sensitivity-label.md)| No | Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source/sink in the [Data Factory Copy activity](how-to-link-azure-data-factory.md)
-
-### Known limitations
-
-When scanning Amazon S3 Glacier storage classes, schema extraction, classification, and sensitivity labels aren't supported.
-
-Microsoft Purview private endpoints are not supported when scanning Amazon S3.
-
-For more information about Microsoft Purview limits, see:
-
-- [Manage and increase quotas for resources with Microsoft Purview](how-to-manage-quotas.md)
-- [Supported data sources and file types in Microsoft Purview](sources-and-scans.md)
-
-### Storage and scanning regions
-
-The Microsoft Purview connector for the Amazon S3 service is currently deployed in specific regions only. The following table maps the regions where your data is stored to the region where it would be scanned by Microsoft Purview.
-
-> [!IMPORTANT]
-> Customers are charged for all related data transfers according to the region of their bucket.
->
-
-| Storage region | Scanning region |
-| - | - |
-| US East (Ohio) | US East (Ohio) |
-| US East (N. Virginia) | US East (N. Virginia) |
-| US West (N. California) | US West (N. California) |
-| US West (Oregon) | US West (Oregon) |
-| Africa (Cape Town) | Europe (Frankfurt) |
-| Asia Pacific (Hong Kong Special Administrative Region) | Asia Pacific (Tokyo) |
-| Asia Pacific (Mumbai) | Asia Pacific (Singapore) |
-| Asia Pacific (Osaka-Local) | Asia Pacific (Tokyo) |
-| Asia Pacific (Seoul) | Asia Pacific (Tokyo) |
-| Asia Pacific (Singapore) | Asia Pacific (Singapore) |
-| Asia Pacific (Sydney) | Asia Pacific (Sydney) |
-| Asia Pacific (Tokyo) | Asia Pacific (Tokyo) |
-| Canada (Central) | US East (Ohio) |
-| China (Beijing) | Not supported |
-| China (Ningxia) | Not supported |
-| Europe (Frankfurt) | Europe (Frankfurt) |
-| Europe (Ireland) | Europe (Ireland) |
-| Europe (London) | Europe (London) |
-| Europe (Milan) | Europe (Paris) |
-| Europe (Paris) | Europe (Paris) |
-| Europe (Stockholm) | Europe (Frankfurt) |
-| Middle East (Bahrain) | Europe (Frankfurt) |
-| South America (São Paulo) | US East (Ohio) |
-| | |
-
-## Prerequisites
-
-Ensure that you've performed the following prerequisites before adding your Amazon S3 buckets as Microsoft Purview data sources and scanning your S3 data.
-
-> [!div class="checklist"]
-> * You need to be a Microsoft Purview Data Source Admin.
-> * [Create a Microsoft Purview account](#create-a-microsoft-purview-account) if you don't yet have one
-> * [Create a new AWS role for use with Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview)
-> * [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan)
-> * [Configure scanning for encrypted Amazon S3 buckets](#configure-scanning-for-encrypted-amazon-s3-buckets), if relevant
-> * Make sure that your bucket policy does not block the connection. For more information, see [Bucket policy requirements](#confirm-your-bucket-policy-access) and [SCP policy requirements](#confirm-your-scp-policy-access). For these items, you may need to consult with an AWS expert to ensure that your policies allow required access.
-> * When adding your buckets as Microsoft Purview resources, you'll need the values of your [AWS ARN](#retrieve-your-new-role-arn), [bucket name](#retrieve-your-amazon-s3-bucket-name), and sometimes your [AWS account ID](#locate-your-aws-account-id).
--
-### Create a Microsoft Purview account
-
-- **If you already have a Microsoft Purview account,** you can continue with the configurations required for AWS S3 support. Start with [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan).
-
-- **If you need to create a Microsoft Purview account,** follow the instructions in [Create a Microsoft Purview account instance](create-catalog-portal.md). After creating your account, return here to complete configuration and begin using the Microsoft Purview connector for Amazon S3.
-
-### Create a new AWS role for Microsoft Purview
-
-The Microsoft Purview scanner is deployed in a Microsoft account in AWS. To allow the Microsoft Purview scanner to read your S3 data, you must create a dedicated role in the AWS portal, in the IAM area, to be used by the scanner.
-
-This procedure describes how to create the AWS role, with the required Microsoft Account ID and External ID from Microsoft Purview, and then enter the Role ARN value in Microsoft Purview.
--
-**To locate your Microsoft Account ID and External ID**:
-
-1. In Microsoft Purview, go to the **Management Center** > **Security and access** > **Credentials**.
-
-1. Select **New** to create a new credential.
-
- In the **New credential** pane that appears, in the **Authentication method** dropdown, select **Role ARN**.
-
- Then copy the **Microsoft account ID** and **External ID** values that appear to a separate file, or have them handy for pasting into the relevant field in AWS. For example:
-
- [ ![Locate your Microsoft account ID and External ID values.](./media/register-scan-amazon-s3/locate-account-id-external-id.png) ](./media/register-scan-amazon-s3/locate-account-id-external-id.png#lightbox)
--
-**To create your AWS role for Microsoft Purview**:
-
-1. Open your **Amazon Web Services** console, and under **Security, Identity, and Compliance**, select **IAM**.
-
-1. Select **Roles** and then **Create role**.
-
-1. Select **Another AWS account**, and then enter the following values:
-
- |Field |Description |
- |||
- |**Account ID** | Enter your Microsoft Account ID. For example: `181328463391` |
- |**External ID** | Under options, select **Require external ID...**, and then enter your External ID in the designated field. <br>For example: `e7e2b8a3-0a9f-414f-a065-afaf4ac6d994` |
- | | |
-
- For example:
-
- ![Add the Microsoft Account ID to your AWS account.](./media/register-scan-amazon-s3/aws-create-role-amazon-s3.png)
-
-1. In the **Create role > Attach permissions policies** area, filter the permissions displayed to **S3**. Select **AmazonS3ReadOnlyAccess**, and then select **Next: Tags**.
-
- ![Select the ReadOnlyAccess policy for the new Amazon S3 scanning role.](./media/register-scan-amazon-s3/aws-permission-role-amazon-s3.png)
-
- > [!IMPORTANT]
- > The **AmazonS3ReadOnlyAccess** policy provides minimum permissions required for scanning your S3 buckets, and may include other permissions as well.
- >
- >To apply only the minimum permissions required for scanning your buckets, create a new policy with the permissions listed in [Minimum permissions for your AWS policy](#minimum-permissions-for-your-aws-policy), depending on whether you want to scan a single bucket or all the buckets in your account.
- >
- >Apply your new policy to the role instead of **AmazonS3ReadOnlyAccess.**
-
-1. In the **Add tags (optional)** area, you can optionally choose to create a meaningful tag for this new role. Useful tags enable you to organize, track, and control access for each role you create.
-
- Enter a new key and value for your tag as needed. When you're done, or if you want to skip this step, select **Next: Review** to review the role details and complete the role creation.
-
- ![Add a meaningful tag to organize, track, or control access for your new role.](./media/register-scan-amazon-s3/add-tag-new-role.png)
-
-1. In the **Review** area, do the following:
-
- - In the **Role name** field, enter a meaningful name for your role
- - In the **Role description** box, enter an optional description to identify the role's purpose
- - In the **Policies** section, confirm that the correct policy (**AmazonS3ReadOnlyAccess**) is attached to the role.
-
- Then select **Create role** to complete the process. For example:
-
- ![Review details before creating your role.](./media/register-scan-amazon-s3/review-role.png)
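-
-Behind the scenes, the **Another AWS account** option combined with the **Require external ID** setting produces a role *trust policy* roughly like the following sketch, shown here with the example Microsoft account ID and External ID values from the table above (yours will differ). You can review the actual document on the role's **Trust relationships** tab:
-
-```json
-{
-    "Version": "2012-10-17",
-    "Statement": [
-        {
-            "Effect": "Allow",
-            "Principal": { "AWS": "arn:aws:iam::181328463391:root" },
-            "Action": "sts:AssumeRole",
-            "Condition": {
-                "StringEquals": { "sts:ExternalId": "e7e2b8a3-0a9f-414f-a065-afaf4ac6d994" }
-            }
-        }
-    ]
-}
-```
-
-This is what restricts the role so that only the Microsoft account, presenting the correct External ID, can assume it.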
-
-**Extra required configurations**:
-
-- For buckets that use **AWS-KMS** encryption, [special configuration](#configure-scanning-for-encrypted-amazon-s3-buckets) is required to enable scanning.
-
-- Make sure that your bucket policy doesn't block the connection. For more information, see:
- - [Confirm your bucket policy access](#confirm-your-bucket-policy-access)
- - [Confirm your SCP policy access](#confirm-your-scp-policy-access)
-
-### Create a Microsoft Purview credential for your AWS S3 scan
-
-This procedure describes how to create a new Microsoft Purview credential to use when scanning your AWS buckets.
-
-> [!TIP]
-> If you're continuing directly on from [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview), you may already have the **New credential** pane open in Microsoft Purview.
->
-> You can also create a new credential in the middle of the process, while [configuring your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets). In that case, in the **Credential** field, select **New**.
->
-
-1. In Microsoft Purview, go to the **Management Center**, and under **Security and access**, select **Credentials**.
-
-1. Select **New**, and in the **New credential** pane that appears on the right, use the following fields to create your Microsoft Purview credential:
-
- |Field |Description |
- |||
- |**Name** |Enter a meaningful name for this credential. |
- |**Description** |Enter an optional description for this credential, such as `Used to scan the tutorial S3 buckets` |
- |**Authentication method** |Select **Role ARN**, since you're using a role ARN to access your bucket. |
- |**Role ARN** | Once you've [created your Amazon IAM role](#create-a-new-aws-role-for-microsoft-purview), navigate to your role in the AWS IAM area, copy the **Role ARN** value, and enter it here. For example: `arn:aws:iam::181328463391:role/S3Role`. <br><br>For more information, see [Retrieve your new Role ARN](#retrieve-your-new-role-arn). |
- | | |
-
- The **Microsoft account ID** and the **External ID** values are used when [creating your Role ARN in AWS](#create-a-new-aws-role-for-microsoft-purview).
-
-1. Select **Create** when you're done to finish creating the credential.
-
-For more information about Microsoft Purview credentials, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
--
-### Configure scanning for encrypted Amazon S3 buckets
-
-AWS buckets support multiple encryption types. For buckets that use **AWS-KMS** encryption, special configuration is required to enable scanning.
-
-> [!NOTE]
-> For buckets that use no encryption, AES-256, or AWS-KMS S3 encryption, skip this section and continue to [Retrieve your Amazon S3 bucket name](#retrieve-your-amazon-s3-bucket-name).
->
-
-**To check the type of encryption used in your Amazon S3 buckets:**
-
-1. In AWS, navigate to **Storage** > **S3** > and select **Buckets** from the menu on the left.
-
- ![Select the Amazon S3 Buckets tab.](./media/register-scan-amazon-s3/check-encryption-type-buckets.png)
-
-1. Select the bucket you want to check. On the bucket's details page, select the **Properties** tab and scroll down to the **Default encryption** area.
-
- - If the bucket you selected is configured for anything but **AWS-KMS** encryption, including if default encryption for your bucket is **Disabled**, skip the rest of this procedure and continue with [Retrieve your Amazon S3 bucket name](#retrieve-your-amazon-s3-bucket-name).
-
- - If the bucket you selected is configured for **AWS-KMS** encryption, continue as described below to add a new policy that allows for scanning a bucket with custom **AWS-KMS** encryption.
-
- For example:
-
- ![View an Amazon S3 bucket configured with AWS-KMS encryption](./media/register-scan-amazon-s3/default-encryption-buckets.png)
-
-**To add a new policy to allow for scanning a bucket with custom AWS-KMS encryption:**
-
-1. In AWS, navigate to **Services** > **IAM** > **Policies**, and select **Create policy**.
-
-1. On the **Create policy** > **Visual editor** tab, define your policy with the following values:
-
- |Field |Description |
- |||
- |**Service** | Enter and select **KMS**. |
- |**Actions** | Under **Access level**, select **Write** to expand the **Write** section.<br>Once expanded, select only the **Decrypt** option. |
- |**Resources** |Select a specific resource or **All resources**. |
- | | |
-
- When you're done, select **Review policy** to continue.
-
- ![Create a policy for scanning a bucket with AWS-KMS encryption.](./media/register-scan-amazon-s3/create-policy-kms.png)
-
-1. On the **Review policy** page, enter a meaningful name for your policy and an optional description, and then select **Create policy**.
-
- The newly created policy is added to your list of policies.
-
-1. Attach your new policy to the role you added for scanning.
-
- 1. Navigate back to the **IAM** > **Roles** page, and select the role you added [earlier](#create-a-new-aws-role-for-microsoft-purview).
-
- 1. On the **Permissions** tab, select **Attach policies**.
-
- ![On your role's Permissions tab, select Attach policies.](./media/register-scan-amazon-s3/iam-attach-policies.png)
-
- 1. On the **Attach Permissions** page, search for and select the new policy you created above. Select **Attach policy** to attach your policy to the role.
-
- The **Summary** page is updated, with your new policy attached to your role.
-
- ![View an updated Summary page with the new policy attached to your role.](./media/register-scan-amazon-s3/attach-policy-role.png)
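-
-The visual-editor choices above (service **KMS**, access level **Write** > **Decrypt**) correspond roughly to a policy document like this sketch. If you selected a specific resource instead of **All resources**, replace the wildcard with your KMS key ARN:
-
-```json
-{
-    "Version": "2012-10-17",
-    "Statement": [
-        {
-            "Effect": "Allow",
-            "Action": "kms:Decrypt",
-            "Resource": "*"
-        }
-    ]
-}
-```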
-
-### Confirm your bucket policy access
-
-Make sure that the S3 bucket [policy](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-iam-policies.html) doesn't block the connection:
-
-1. In AWS, navigate to your S3 bucket, and then select the **Permissions** tab > **Bucket policy**.
-1. Check the policy details to make sure that it doesn't block the connection from the Microsoft Purview scanner service.
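-
-For example, a bucket policy that allow-lists specific source IP ranges, like the following hypothetical sketch, would block the scanner. The bucket name and CIDR range here are placeholders:
-
-```json
-{
-    "Version": "2012-10-17",
-    "Statement": [
-        {
-            "Effect": "Deny",
-            "Principal": "*",
-            "Action": "s3:*",
-            "Resource": [
-                "arn:aws:s3:::purview-tutorial-bucket",
-                "arn:aws:s3:::purview-tutorial-bucket/*"
-            ],
-            "Condition": {
-                "NotIpAddress": { "aws:SourceIp": "203.0.113.0/24" }
-            }
-        }
-    ]
-}
-```
-
-If you find a statement like this, either remove it or scope it so that it doesn't apply to the Microsoft Purview scanner's requests.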
-
-### Confirm your SCP policy access
-
-Make sure that there's no [SCP policy](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps.html) that blocks the connection to the S3 bucket.
-
-For example, your SCP policy might block read API calls to the [AWS Region](#storage-and-scanning-regions) where your S3 bucket is hosted.
-
-- Required API calls, which must be allowed by your SCP policy, include `AssumeRole`, `GetBucketLocation`, `GetObject`, `ListBucket`, and `GetBucketPublicAccessBlock`.
-- Your SCP policy must also allow calls to the **us-east-1** AWS Region, which is the default Region for API calls. For more information, see the [AWS documentation](https://docs.aws.amazon.com/general/latest/gr/rande.html).
-
-Follow the [SCP documentation](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps_create.html), review your organization's SCP policies, and make sure all the [permissions required for the Microsoft Purview scanner](#create-a-new-aws-role-for-microsoft-purview) are available.
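-
-For example, an SCP that restricts API calls to specific Regions, a common pattern sketched below for illustration only, would block the required calls to **us-east-1**:
-
-```json
-{
-    "Version": "2012-10-17",
-    "Statement": [
-        {
-            "Sid": "DenyOutsideAllowedRegions",
-            "Effect": "Deny",
-            "Action": "*",
-            "Resource": "*",
-            "Condition": {
-                "StringNotEquals": { "aws:RequestedRegion": ["eu-west-1"] }
-            }
-        }
-    ]
-}
-```
-
-With an SCP like this in place, you'd need to add `us-east-1` (and the Region hosting your bucket) to the `aws:RequestedRegion` list to unblock the scanner.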
--
-### Retrieve your new Role ARN
-
-You'll need to record your AWS Role ARN and copy it into Microsoft Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-
-**To retrieve your role ARN:**
-
-1. In the AWS **Identity and Access Management (IAM)** > **Roles** area, search for and select the new role you [created for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview).
-
-1. On the role's **Summary** page, select the **Copy to clipboard** button to the right of the **Role ARN** value.
-
- ![Copy the role ARN value to the clipboard.](./media/register-scan-amazon-s3/aws-copy-role-purview.png)
-
-In Microsoft Purview, you can edit your credential for AWS S3, and paste the retrieved role in the **Role ARN** field. For more information, see [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-
-### Retrieve your Amazon S3 bucket name
-
-You'll need the name of your Amazon S3 bucket to copy into Microsoft Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-
-**To retrieve your bucket name:**
-
-1. In AWS, navigate to **Storage** > **S3** > and select **Buckets** from the menu on the left.
-
- ![View the Amazon S3 Buckets tab.](./media/register-scan-amazon-s3/check-encryption-type-buckets.png)
-
-1. Search for and select your bucket to view the bucket details page, and then copy the bucket name to the clipboard.
-
- For example:
-
- ![Retrieve and copy the S3 bucket URL.](./media/register-scan-amazon-s3/retrieve-bucket-url-amazon.png)
-
- Paste your bucket name in a secure file, and add an `s3://` prefix to it to create the value you'll need to enter when configuring your bucket as a Microsoft Purview data source.
-
- For example: `s3://purview-tutorial-bucket`
-
-> [!TIP]
-> Only the root level of your bucket is supported as a Microsoft Purview data source. For example, the following URL, which includes a sub-folder is *not* supported: `s3://purview-tutorial-bucket/view-data`
->
-> However, if you configure a scan for a specific S3 bucket, you can select one or more specific folders for your scan. For more information, see the step to [scope your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets).
->
-
-### Locate your AWS account ID
-
-You'll need your AWS account ID to register your AWS account as a Microsoft Purview data source, together with all of its buckets.
-
-Your AWS account ID is the ID you use to sign in to the AWS console. After you sign in, you can also find it on the IAM dashboard, under the navigation options on the left, and at the top of the page as the numerical part of your sign-in URL:
-
-For example:
-
-![Retrieve your AWS account ID.](./media/register-scan-amazon-s3/aws-locate-account-id.png)
--
-## Add a single Amazon S3 bucket as a Microsoft Purview account
-
-Use this procedure if you only have a single S3 bucket that you want to register to Microsoft Purview as a data source, or if you have multiple buckets in your AWS account, but don't want to register all of them to Microsoft Purview.
-
-**To add your bucket**:
-
-1. In Microsoft Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon S3** > **Continue**.
-
- ![Add an Amazon AWS bucket as a Microsoft Purview data source.](./media/register-scan-amazon-s3/add-s3-datasource-to-purview.png)
-
- > [!TIP]
- > If you have multiple [collections](manage-data-sources.md#manage-collections) and want to add your Amazon S3 to a specific collection, select the **Map view** at the top right, and then select the **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) button inside your collection.
- >
-
-1. In the **Register sources (Amazon S3)** pane that opens, enter the following details:
-
- |Field |Description |
- |||
- |**Name** |Enter a meaningful name, or use the default provided. |
- |**Bucket URL** | Enter your AWS bucket URL, using the following syntax: `s3://<bucketName>` <br><br>**Note**: Make sure to use only the root level of your bucket. For more information, see [Retrieve your Amazon S3 bucket name](#retrieve-your-amazon-s3-bucket-name). |
- |**Select a collection** |If you chose to register a data source from within a collection, that collection is already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Microsoft Purview collections, see [Manage data sources in Microsoft Purview](manage-data-sources.md#manage-collections).|
- | | |
-
- When you're done, select **Finish** to complete the registration.
-
-Continue with [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-
-## Add an AWS account as a Microsoft Purview account
-
-Use this procedure if you have multiple S3 buckets in your Amazon account, and you want to register all of them as Microsoft Purview data sources.
-
-When [configuring your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets), you'll be able to select the specific buckets you want to scan, if you don't want to scan all of them together.
-
-**To add your Amazon account**:
-
-1. In Microsoft Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon accounts** > **Continue**.
-
- ![Add an Amazon account as a Microsoft Purview data source.](./media/register-scan-amazon-s3/add-s3-account-to-purview.png)
-
- > [!TIP]
- > If you have multiple [collections](manage-data-sources.md#manage-collections) and want to add your Amazon S3 to a specific collection, select the **Map view** at the top right, and then select the **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) button inside your collection.
- >
-
-1. In the **Register sources (Amazon S3)** pane that opens, enter the following details:
-
- |Field |Description |
- |||
- |**Name** |Enter a meaningful name, or use the default provided. |
- |**AWS account ID** | Enter your AWS account ID. For more information, see [Locate your AWS account ID](#locate-your-aws-account-id)|
- |**Select a collection** |If you chose to register a data source from within a collection, that collection is already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Microsoft Purview collections, see [Manage data sources in Microsoft Purview](manage-data-sources.md#manage-collections).|
- | | |
-
- When you're done, select **Finish** to complete the registration.
-
-Continue with [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-
-## Create a scan for one or more Amazon S3 buckets
-
-Once you've added your buckets as Microsoft Purview data sources, you can configure a scan to run at scheduled intervals or immediately.
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), and then do one of the following:
-
- - In the **Map view**, select **New scan** ![New scan icon.](./media/register-scan-amazon-s3/new-scan-button.png) in your data source box.
- - In the **List view**, hover over the row for your data source, and select **New scan** ![New scan icon.](./media/register-scan-amazon-s3/new-scan-button.png).
-
-1. On the **Scan...** pane that opens on the right, define the following fields and then select **Continue**:
-
- |Field |Description |
- |||
- |**Name** | Enter a meaningful name for your scan or use the default. |
- |**Type** |Displayed only if you've added your AWS account, with all buckets included. <br><br>Current options include only **All** > **Amazon S3**. Stay tuned for more options to select as Microsoft Purview's support matrix expands. |
- |**Credential** | Select a Microsoft Purview credential with your role ARN. <br><br>**Tip**: If you want to create a new credential at this time, select **New**. For more information, see [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan). |
- | **Amazon S3** | Displayed only if you've added your AWS account, with all buckets included. <br><br>Select one or more buckets to scan, or **Select all** to scan all the buckets in your account. |
- | | |
-
- Microsoft Purview automatically checks that the role ARN is valid, and that the buckets and objects within the buckets are accessible, and then continues if the connection succeeds.
-
- > [!TIP]
- > To enter different values and test the connection yourself before continuing, select **Test connection** at the bottom right before selecting **Continue**.
- >
-
-1. <a name="scope-your-scan"></a>On the **Scope your scan** pane, select the specific buckets or folders you want to include in your scan.
-
- When creating a scan for an entire AWS account, you can select specific buckets to scan. When creating a scan for a specific AWS S3 bucket, you can select specific folders to scan.
-
-1. On the **Select a scan rule set** pane, either select the **AmazonS3** default rule set, or select **New scan rule set** to create a new custom rule set. Once you have your rule set selected, select **Continue**.
-
- If you choose to create a new custom scan rule set, use the wizard to define the following settings:
-
- |Pane |Description |
- |||
- |**New scan rule set** /<br>**Scan rule description** | Enter a meaningful name and an optional description for your rule set |
- |**Select file types** | Select all the file types you want to include in the scan, and then select **Continue**.<br><br>To add a new file type, select **New file type**, and define the following: <br>- The file extension you want to add <br>- An optional description <br>- Whether the file contents have a custom delimiter, or are a system file type. Then, enter your custom delimiter, or select your system file type. <br><br>Select **Create** to create your custom file type. |
- |**Select classification rules** | Navigate to and select the classification rules you want to run on your dataset. |
- | | |
-
- Select **Create** when you're done to create your rule set.
-
-1. On the **Set a scan trigger** pane, select one of the following, and then select **Continue**:
-
- - **Recurring** to configure a schedule for a recurring scan
- - **Once** to configure a scan that starts immediately
-
-1. On the **Review your scan** pane, check your scanning details to confirm that they're correct, and then select **Save** or **Save and Run** if you selected **Once** in the previous pane.
-
- > [!NOTE]
- > Once started, scanning can take up to 24 hours to complete. You'll be able to review your **Insight Reports** and search the catalog 24 hours after you started each scan.
- >
-
-For more information, see [Explore Microsoft Purview scanning results](#explore-microsoft-purview-scanning-results).
-
-## Explore Microsoft Purview scanning results
-
-Once a Microsoft Purview scan is complete on your Amazon S3 buckets, drill down in the Microsoft Purview **Data Map** area to view the scan history.
-
-Select a data source to view its details, and then select the **Scans** tab to view any currently running or completed scans.
-If you've added an AWS account with multiple buckets, the scan history for each bucket is shown under the account.
-
-For example:
-
-![Show the AWS S3 bucket scans under your AWS account source.](./media/register-scan-amazon-s3/account-scan-history.png)
-
-Use the other areas of Microsoft Purview to find out details about the content in your data estate, including your Amazon S3 buckets:
-- **Search the Microsoft Purview data catalog,** and filter for a specific bucket. For example:
-
- ![Search the catalog for AWS S3 assets.](./media/register-scan-amazon-s3/search-catalog-screen-aws.png)
-- **View Insight reports** to view statistics for the classification, sensitivity labels, file types, and more details about your content.
-
-  All Microsoft Purview Insight reports include the Amazon S3 scanning results, along with the rest of the results from your Azure data sources. Where relevant, an **Amazon S3** asset type has been added to the report filtering options.
-
-  For more information, see [Understand Data Estate Insights in Microsoft Purview](concept-insights.md).
-
-## Minimum permissions for your AWS policy
-
-The default procedure for [creating an AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) to use when scanning your S3 buckets uses the **AmazonS3ReadOnlyAccess** policy.
-
-The **AmazonS3ReadOnlyAccess** policy includes the minimum permissions required for scanning your S3 buckets, and may include other permissions as well.
-
-To apply only the minimum permissions required for scanning your buckets, create a new policy with the permissions listed in the following sections, depending on whether you want to scan a single bucket or all the buckets in your account.
-
-Apply your new policy to the role instead of **AmazonS3ReadOnlyAccess**.
-
-### Individual buckets
-
-When scanning individual S3 buckets, minimum AWS permissions include:
-- `GetBucketLocation`
-- `GetBucketPublicAccessBlock`
-- `GetObject`
-- `ListBucket`
-
-Make sure to define your resource with the specific bucket name.
-For example:
-
-```json
-{
-  "Version": "2012-10-17",
-  "Statement": [
-    {
-      "Effect": "Allow",
-      "Action": [
-        "s3:GetBucketLocation",
-        "s3:GetBucketPublicAccessBlock",
-        "s3:GetObject",
-        "s3:ListBucket"
-      ],
-      "Resource": "arn:aws:s3:::<bucketname>"
-    },
-    {
-      "Effect": "Allow",
-      "Action": [
-        "s3:GetObject"
-      ],
-      "Resource": "arn:aws:s3:::<bucketname>/*"
-    }
-  ]
-}
-```
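
Before attaching your new policy, it can help to sanity-check that the document grants every action the scanner needs. The following is a minimal sketch in Python; the inline `policy_json` string stands in for your own policy file, and `<bucketname>` remains a placeholder:

```python
import json

# Sample single-bucket policy (replace <bucketname> with your bucket).
policy_json = """
{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow",
     "Action": ["s3:GetBucketLocation", "s3:GetBucketPublicAccessBlock",
                "s3:GetObject", "s3:ListBucket"],
     "Resource": "arn:aws:s3:::<bucketname>"},
    {"Effect": "Allow",
     "Action": ["s3:GetObject"],
     "Resource": "arn:aws:s3:::<bucketname>/*"}
  ]
}
"""

policy = json.loads(policy_json)

# Collect every allowed action across all statements.
allowed = {action
           for stmt in policy["Statement"] if stmt["Effect"] == "Allow"
           for action in stmt["Action"]}

required = {"s3:GetBucketLocation", "s3:GetBucketPublicAccessBlock",
            "s3:GetObject", "s3:ListBucket"}
assert required <= allowed, f"missing actions: {required - allowed}"
print("policy grants all required scan actions")
```

The same check works for the all-buckets variant; just add `s3:ListAllMyBuckets` to the `required` set.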
-
-### All buckets in your account
-
-When scanning all the buckets in your AWS account, minimum AWS permissions include:
-- `GetBucketLocation`
-- `GetBucketPublicAccessBlock`
-- `GetObject`
-- `ListAllMyBuckets`
-- `ListBucket`
-
-Make sure to define your resource with a wildcard. For example:
-
-```json
-{
-  "Version": "2012-10-17",
-  "Statement": [
-    {
-      "Effect": "Allow",
-      "Action": [
-        "s3:GetBucketLocation",
-        "s3:GetBucketPublicAccessBlock",
-        "s3:GetObject",
-        "s3:ListAllMyBuckets",
-        "s3:ListBucket"
-      ],
-      "Resource": "*"
-    },
-    {
-      "Effect": "Allow",
-      "Action": [
-        "s3:GetObject"
-      ],
-      "Resource": "*"
-    }
-  ]
-}
-```
-
-## Troubleshooting
-
-Scanning Amazon S3 resources requires [creating a role in AWS IAM](#create-a-new-aws-role-for-microsoft-purview) to allow the Microsoft Purview scanner service running in a Microsoft account in AWS to read the data.
-
-Configuration errors in the role can lead to connection failure. This section describes some examples of connection failures that may occur while setting up the scan, and the troubleshooting guidelines for each case.
-
-If all of the items described in the following sections are properly configured, and scanning S3 buckets still fails with errors, contact Microsoft support.
-
-> [!NOTE]
-> For policy access issues, make sure that neither your bucket policy nor your SCP policy is blocking access to your S3 bucket from Microsoft Purview.
->
->For more information, see [Confirm your bucket policy access](#confirm-your-bucket-policy-access) and [Confirm your SCP policy access](#confirm-your-scp-policy-access).
->
-### Bucket is encrypted with KMS
-
-Make sure that the AWS role has **KMS Decrypt** permissions. For more information, see [Configure scanning for encrypted Amazon S3 buckets](#configure-scanning-for-encrypted-amazon-s3-buckets).
-
-### AWS role is missing an external ID
-
-Make sure that the AWS role has the correct external ID:
-
-1. In the AWS IAM area, select the **Role > Trust relationships** tab.
-1. Follow the steps in [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) again to verify your details.
-
-### Error found with the role ARN
-
-This is a general error that indicates an issue when using the Role ARN. For example, you may want to troubleshoot as follows:
-- Make sure that the AWS role has the required permissions to read the selected S3 bucket. Required permissions include `AmazonS3ReadOnlyAccess` or the [minimum read permissions](#minimum-permissions-for-your-aws-policy), and `KMS Decrypt` for encrypted buckets.
-
-- Make sure that the AWS role has the correct Microsoft account ID. In the AWS IAM area, select the **Role > Trust relationships** tab and then follow the steps in [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) again to verify your details.
-
-For more information, see [Can't find the specified bucket](#cant-find-the-specified-bucket).
-
-### Can't find the specified bucket
-
-Make sure that the S3 bucket URL is properly defined:
-
-1. In AWS, navigate to your S3 bucket, and copy the bucket name.
-1. In Microsoft Purview, edit the Amazon S3 data source, and update the bucket URL to include your copied bucket name, using the following syntax: `s3://<BucketName>`
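
If you're scripting this update, a small helper can normalize a bucket name into the `s3://<BucketName>` form shown above. This is a hypothetical convenience function for illustration only; `make_bucket_url` isn't part of any Microsoft Purview or AWS SDK:

```python
def make_bucket_url(bucket_name: str) -> str:
    """Build the s3://<BucketName> URL form that the data source expects.

    Strips stray whitespace and any s3:// scheme the caller already included.
    """
    name = bucket_name.strip()
    if name.startswith("s3://"):
        name = name[len("s3://"):]
    name = name.strip("/")
    if not name:
        raise ValueError("bucket name is empty")
    return f"s3://{name}"


print(make_bucket_url("  my-purview-bucket "))    # s3://my-purview-bucket
print(make_bucket_url("s3://my-purview-bucket"))  # already normalized
```

The sketch deliberately doesn't validate AWS bucket-naming rules; it only guards against the copy-paste slips (extra whitespace, duplicated scheme, trailing slash) that cause this error.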
--
-## Next steps
-
-Learn more about Microsoft Purview Insight reports:
-
-> [!div class="nextstepaction"]
-> [Understand Data Estate Insights in Microsoft Purview](concept-insights.md)
purview Register Scan Azure Arc Enabled Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-arc-enabled-sql-server.md
- Title: Connect to and manage Azure Arc-enabled SQL Server
-description: This guide describes how to connect to Azure Arc-enabled SQL Server in Microsoft Purview, and use Microsoft Purview features to scan and manage your Azure Arc-enabled SQL Server source.
----- Previously updated : 04/05/2023---
-# Connect to and manage Azure Arc-enabled SQL Server in Microsoft Purview
-
-This article shows how to register an Azure Arc-enabled SQL Server instance. It also shows how to authenticate and interact with Azure Arc-enabled SQL Server in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)(GA) | [Yes](#scan)(preview) | [Yes](#scan)(preview) | [Yes](#scan)(preview) | [Yes](#scan)(preview) | No | [Yes](#access-policy)(GA) | Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source/sink in the [Azure Data Factory copy activity](how-to-link-azure-data-factory.md).
-
-The supported SQL Server versions are 2012 and later. SQL Server Express LocalDB is not supported.
-
-When you're scanning Azure Arc-enabled SQL Server, Microsoft Purview supports extracting the following technical metadata:
-- Instances
-- Databases
-- Schemas
-- Tables, including the columns
-- Views, including the columns
-
-When you're setting up a scan, you can choose to specify the database name to scan one database. You can further scope the scan by selecting tables and views as needed. The whole Azure Arc-enabled SQL Server instance will be scanned if you don't provide a database name.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. See [Access control in the Microsoft Purview governance portal](catalog-permissions.md) for details.
-
-* The latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and manage a self-hosted integration runtime](manage-integration-runtimes.md).
-
-## Register
-
-This section describes how to register an Azure Arc-enabled SQL Server instance in Microsoft Purview by using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-There are two ways to set up authentication for scanning Azure Arc-enabled SQL Server with a self-hosted integration runtime:
-- Windows authentication
-- SQL Server authentication
-
-To configure authentication for the SQL Server deployment:
-
-1. In SQL Server Management Studio (SSMS), go to **Server Properties**, and then select **Security** on the left pane.
-1. Under **Server authentication**:
-
- - For Windows authentication, select either **Windows Authentication mode** or **SQL Server and Windows Authentication mode**.
- - For SQL Server authentication, select **SQL Server and Windows Authentication mode**.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/enable-sql-server-authentication.png" alt-text="Screenshot that shows the Security page of the Server Properties window, with options for selecting authentication mode.":::
-
-A change to the server authentication mode requires you to restart the SQL Server instance and SQL Server Agent. In SSMS, right-click the SQL Server instance, and then select **Restart**.
-
-#### Create a new login and user
-
-If you want to create a new login and user to scan your SQL Server instance, use the following steps.
-
-The account must have access to the master database, because `sys.databases` is in the master database. The Microsoft Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
-
-> [!Note]
-> You can run all the following steps by using [this code](https://github.com/Azure/Purview-Samples/blob/master/TSQL-Code-Permissions/grant-access-to-on-prem-sql-databases.sql).
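
The linked sample script is the authoritative version of these steps. As a rough sketch of what it does, you could generate the corresponding T-SQL for a given login; this helper and the `purview_scanner` login name are hypothetical, and the password is a placeholder:

```python
def build_grant_script(login: str, databases: list[str]) -> str:
    """Emit CREATE LOGIN/USER statements plus db_datareader grants."""
    lines = [
        f"CREATE LOGIN [{login}] WITH PASSWORD = '<strong-password-here>';",
        "USE master;",
        # The scanner needs master access to enumerate sys.databases.
        f"CREATE USER [{login}] FOR LOGIN [{login}];",
    ]
    for db in databases:
        lines += [
            f"USE [{db}];",
            f"CREATE USER [{login}] FOR LOGIN [{login}];",
            f"ALTER ROLE db_datareader ADD MEMBER [{login}];",
        ]
    return "\n".join(lines)


print(build_grant_script("purview_scanner", ["SalesDb", "HrDb"]))
```

Hand the generated script to a DBA to review and run; it mirrors the UI steps below (create the login, map the user into each database, and assign **db_datareader**).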
-
-1. Go to SSMS, connect to the server, and then select **Security** on the left pane.
-
-1. Select and hold (or right-click) **Login**, and then select **New login**. If Windows authentication is applied, select **Windows authentication**. If SQL Server authentication is applied, select **SQL Server authentication**.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/create-new-login-user.png" alt-text="Screenshot that shows selections for creating a new login and user.":::
-
-1. Select **Server Roles** on the left pane, and ensure that the **public** role is assigned.
-
-1. Select **User Mapping** on the left pane, select all the databases in the map, and then select the **db_datareader** database role.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/user-mapping.png" alt-text="Screenshot that shows user mapping.":::
-
-1. Select **OK** to save.
-
-1. If SQL Server authentication is applied, you must change your password as soon as you create a new login:
-
- 1. Select and hold (or right-click) the user that you created, and then select **Properties**.
- 1. Enter a new password and confirm it.
- 1. Select the **Specify old password** checkbox and enter the old password.
- 1. Select **OK**.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/change-password.png" alt-text="Screenshot that shows selections for changing a password.":::
-
-#### Store your SQL Server login password in a key vault and create a credential in Microsoft Purview
-
-1. Go to your key vault in the Azure portal. Select **Settings** > **Secrets**.
-1. Select **+ Generate/Import**. For **Name** and **Value**, enter the password from your SQL Server login.
-1. Select **Create**.
-1. If your key vault is not connected to Microsoft Purview yet, [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-1. [Create a new credential](manage-credentials.md#create-a-new-credential) by using the username and password to set up your scan.
-
- Be sure to select the right authentication method when you're creating a new credential. If Windows authentication is applied, select **Windows authentication**. If SQL Server authentication is applied, select **SQL Server authentication**.
-
-### Steps to register
-
-1. Go to your Microsoft Purview account.
-
-1. Under **Sources and scanning** on the left pane, select **Integration runtimes**. Make sure that a self-hosted integration runtime is set up. If it isn't set up, [follow the steps to create a self-hosted integration runtime](manage-integration-runtimes.md) for scanning on an on-premises or Azure virtual machine that has access to your on-premises network.
-
-1. Select **Data Map** on the left pane.
-
-1. Select **Register**.
-
-1. Select **Azure Arc-enabled SQL Server**, and then select **Continue**.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/set-up-azure-arc-enabled-sql-data-source.png" alt-text="Screenshot that shows selecting a SQL data source.":::
-
-1. Provide a friendly name, which is a short name that you can use to identify your server. Also provide the server endpoint.
-
-1. Select **Finish** to register the data source.
-
-## Scan
-
-Use the following steps to scan Azure Arc-enabled SQL Server instances to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
-
-To create and run a new scan:
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), select the **Data Map** tab on the left pane.
-
-1. Select the Azure Arc-enabled SQL Server source that you registered.
-
-1. Select **New scan**.
-
-1. Select the credential to connect to your data source. Credentials are grouped and listed under the authentication methods.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/azure-arc-enabled-sql-set-up-scan-win-auth.png" alt-text="Screenshot that shows selecting a credential for a scan.":::
-
-1. You can scope your scan to specific tables by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/azure-arc-enabled-sql-scope-your-scan.png" alt-text="Screenshot that shows selected assets for scoping a scan.":::
-
-1. Select a scan rule set. You can choose between the system default, existing custom rule sets, or creation of a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/azure-arc-enabled-sql-scan-rule-set.png" alt-text="Screenshot that shows selecting a scan rule set.":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-arc-enabled-sql-server/trigger-scan.png" alt-text="Screenshot that shows setting up a recurring scan trigger.":::
-
-1. Review your scan, and then select **Save and run**.
--
-## Access policy
-
-### Supported policies
-
-The following types of policies are supported on this data resource from Microsoft Purview:
-- [DevOps policies](concept-policies-devops.md)
-- [Data Owner policies](concept-policies-data-owner.md) (preview)
-
-### Access policy prerequisites on Azure Arc-enabled SQL Server
--
-### Configure the Microsoft Purview account for policies
--
-### Register the data source and enable Data use management
-
-Before you can create policies, you must register the Azure Arc-enabled SQL Server data source with Microsoft Purview:
-
-1. Sign in to Microsoft Purview Studio.
-
-1. Go to **Data map** on the left pane, select **Sources**, and then select **Register**. Enter **Azure Arc** in the search box and select **SQL Server on Azure Arc**. Then select **Continue**.
-
- ![Screenshot that shows selecting a source for registration.](./media/how-to-policies-data-owner-sql/select-arc-sql-server-for-registration.png)
-
-1. For **Name**, enter a name for this registration. It's best practice to make the name of the registration the same as the server name in the next step.
-
-1. Select values for **Azure subscription**, **Server name**, and **Server endpoint**.
-
-1. For **Select a collection**, choose a collection to put this registration in.
-
-1. Enable **Data use management**. **Data use management** needs certain permissions and can affect the security of your data, because it delegates the management of access to your data sources to certain Microsoft Purview roles. Review the secure practices related to **Data use management** in this guide: [Enable Data use management on your Microsoft Purview sources](./how-to-enable-data-use-management.md).
-
-1. When you enable **Data use management**, Microsoft Purview automatically captures the **Application ID** of the app registration associated with this Azure Arc-enabled SQL Server, if one has been configured. If that association changes later, return to this screen and select the refresh button next to the field to update the captured value.
-
-1. Select **Register** or **Apply**.
-
-![Screenshot that shows selections for registering a data source for a policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-arc-sql.png)
-
-### Enable policies in Azure Arc-enabled SQL Server
-
-### Create a policy
-
-To create an access policy for Azure Arc-enabled SQL Server, follow these guides:
-
-* [Provision access to system health, performance and audit information in SQL Server 2022](./how-to-policies-devops-arc-sql-server.md#create-a-new-devops-policy)
-* [Provision read/modify access on a single SQL Server 2022](./how-to-policies-data-owner-arc-sql-server.md#create-and-publish-a-data-owner-policy)
-
-To create policies that cover all data sources inside a resource group or Azure subscription, see [Discover and govern multiple Azure sources in Microsoft Purview](register-scan-azure-multiple-sources.md#access-policy).
-
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-- [DevOps policies in Microsoft Purview](concept-policies-devops.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Azure Blob Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-blob-storage-source.md
- Title: 'Discover and govern Azure Blob Storage'
-description: This article outlines the process to register an Azure Blob Storage data source in Microsoft Purview, including instructions to authenticate and interact with the Azure Blob Storage source.
----- Previously updated : 02/16/2023---
-# Connect to Azure Blob storage in Microsoft Purview
-
-This article outlines the process to register and govern Azure Blob Storage accounts in Microsoft Purview, including instructions to authenticate and interact with the Azure Blob Storage source.
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)|[Yes](#scan) | [Yes](#scan)|[Yes](#scan)| [Yes](create-sensitivity-label.md)| [Yes (preview)](#access-policy) | Limited** |[Yes](#data-sharing)|
-
-\** Lineage is supported if dataset is used as a source/sink in [Data Factory Copy activity](how-to-link-azure-data-factory.md)
-
-For delimited file types such as CSV, TSV, PSV, and SSV, the schema is extracted when the following conditions are met:
-
-* First row values are non-empty
-* First row values are unique
-* First row values aren't a date or a number
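
As a rough illustration, the three conditions above can be approximated in a few lines of Python. This is only a sketch of the listed rules, not the scanner's actual implementation, and `looks_like_header` is a hypothetical name:

```python
from datetime import datetime


def looks_like_header(first_row: list[str]) -> bool:
    """Approximate the schema-extraction conditions: first-row values
    must be non-empty, unique, and not dates or numbers."""
    if any(not v.strip() for v in first_row):
        return False  # non-empty check
    if len(set(first_row)) != len(first_row):
        return False  # uniqueness check
    for v in first_row:
        try:
            float(v)  # numeric value disqualifies the row
            return False
        except ValueError:
            pass
        try:
            datetime.fromisoformat(v)  # one common date format
            return False
        except ValueError:
            pass
    return True


print(looks_like_header(["id", "name", "signup_date"]))  # True
print(looks_like_header(["1", "2", "3"]))                # False: numbers
```

The real parser handles more date formats and delimiters than this sketch; it only mirrors the three bullet points.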
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-
-## Register
This section describes how to register an Azure Blob Storage account for scanning and data sharing in Microsoft Purview.
-
-### Prerequisites for register
-* You'll need to be a Data Source Admin and one of the other Purview roles (for example, Data Reader or Data Share Contributor) to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-### Steps to register
-
-It's important to register the data source in Microsoft Purview before setting up a scan for it.
-
-1. Go to the Microsoft Purview governance portal by:
-
- * Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-   * Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to the **Data Map --> Sources**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview governance portal":::
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
-
-1. Create the [Collection hierarchy](./quickstart-create-collection.md) using the **Collections** menu and assign permissions to individual subcollections, as required
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-collections.png" alt-text="Screenshot that shows the collection menu to create collection hierarchy":::
-
-1. Navigate to the appropriate collection under the **Sources** menu and select the **Register** icon to register a new Azure Blob data source
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-register-source.png" alt-text="Screenshot that shows the collection used to register the data source":::
-
-1. Select the **Azure Blob Storage** data source and select **Continue**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-select-data-source.png" alt-text="Screenshot that allows selection of the data source":::
-
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, the existing **Azure Blob Storage account name**, and the **collection**, and then select **Apply**. Leave the **Data Use Management** toggle in the **Disabled** position until you've had a chance to carefully review [this document](./how-to-policies-data-owner-storage.md).
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-data-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
-
-1. The Azure Blob storage account will be shown under the selected Collection
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-data-source-collection.png" alt-text="Screenshot that shows the data source mapped to the collection to initiate scanning":::
-
-## Scan
-
-For delimited file types such as CSV, TSV, PSV, and SSV, the schema is extracted when the following conditions are met:
-
-* First row values are non-empty
-* First row values are unique
-* First row values are not a date or a number
-
-### Authentication for a scan
-
-Your Azure network may allow communication between your Azure resources, but if you've set up firewalls, private endpoints, or virtual networks within Azure, you'll need to use one of the following configurations.
-
-|Networking constraints |Integration runtime type |Available credential types |
-||||
-|No private endpoints or firewalls | Azure IR | Managed identity (Recommended), service principal, or account key|
-|Firewall enabled but no private endpoints| Azure IR | Managed identity |
-|Private endpoints enabled | *Self-Hosted IR | Service principal, account key|
-
-*To use a self-hosted integration runtime, you'll first need to [create one](manage-integration-runtimes.md) and confirm your [network settings for Microsoft Purview](catalog-private-link.md)
-
-#### Using a system or user assigned managed identity for scanning
-
-There are two types of managed identity you can use:
-
-* **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in your Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform scans.
-
-* **User-assigned managed identity** (preview) - Similar to a system managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-It's important to give your Microsoft Purview account the permission to scan the Azure Blob data source. You can add access for the SAMI or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permission is needed.
-
-> [!NOTE]
-> If you have a firewall enabled for the storage account, you must use the **managed identity** authentication method when setting up a scan.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find either the subscription, resource group, or resource (for example, an Azure Blob storage account) that you would like to allow the catalog to scan.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-storage-acct.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access Control (IAM)** in the left navigation and then select **+ Add** --> **Add role assignment**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
-
-1. Go into your Azure Blob storage account in [Azure portal](https://portal.azure.com)
-1. Navigate to **Security + networking > Networking**
-
-1. Choose **Selected Networks** under **Allow access from**
-
-1. In the **Exceptions** section, select **Allow trusted Microsoft services to access this storage account** and hit **Save**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-permission.png" alt-text="Screenshot that shows the exceptions to allow trusted Microsoft services to access the storage account":::
-
-> [!Note]
-> For more details, please see steps in [Authorize access to blobs and queues using Azure Active Directory](../storage/blobs/authorize-access-azure-active-directory.md)
-
-#### Using Account Key for scanning
-
-When the selected authentication method is **Account Key**, you need to get your access key and store it in the key vault:
-
-1. Navigate to your Azure Blob storage account
-1. Select **Security + networking > Access keys**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-access-keys.png" alt-text="Screenshot that shows the access keys in the storage account":::
-
-1. Copy your *key* and save it separately for the next steps
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-key.png" alt-text="Screenshot that shows the access keys to be copied":::
-
-1. Navigate to your key vault
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-key-vault.png" alt-text="Screenshot that shows the key vault":::
-
-1. Select **Settings > Secrets** and select **+ Generate/Import**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-generate-secret.png" alt-text="Screenshot that shows the key vault option to generate a secret":::
-
-1. Enter the **Name** and **Value** as the *key* from your storage account
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-secret-values.png" alt-text="Screenshot that shows the key vault option to enter the secret values":::
-
-1. Select **Create** to complete
-
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan
-
-#### Using Service Principal for scanning
-
-##### Creating a new service principal
-
-If you need to [create a new service principal](./create-service-principal-azure.md), you must register an application in your Azure AD tenant and grant the service principal access to your data sources. Your Azure AD Global Administrator, or another role such as Application Administrator, can perform this operation.
-
-##### Getting the Service Principal's Application ID
-
-1. Copy the **Application (client) ID** present in the **Overview** of the [_Service Principal_](./create-service-principal-azure.md) already created
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sp-appln-id.png" alt-text="Screenshot that shows the Application (client) ID for the Service Principal":::
-
-##### Granting the Service Principal access to your Azure Blob account
-
-It's important to give your service principal the permission to scan the Azure Blob data source. You can add access for the service principal at the Subscription, Resource Group, or Resource level, depending on what level scan access is needed.
-
-> [!Note]
-> You need to be an owner of the subscription to be able to add a service principal on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find either the subscription, resource group, or resource (for example, an Azure Blob storage account) that you would like to allow the catalog to scan.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-storage-acct.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access Control (IAM)** in the left navigation and then select **+ Add** --> **Add role assignment**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ in the **Select** input box. Then, select **Save** to complete the role assignment.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sp-permission.png" alt-text="Screenshot that shows the details to provide storage account permissions to the service principal":::
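If you prefer to script the assignment, the scope at which the role is granted follows the standard ARM resource ID pattern for the three levels named above. A minimal sketch of building that scope string, with placeholder names (`role_assignment_scope` is our own helper, not an Azure SDK function):

```python
def role_assignment_scope(subscription_id, resource_group=None, storage_account=None):
    """Build the ARM scope at which the Storage Blob Data Reader role is
    assigned: subscription, resource group, or individual storage account."""
    scope = f"/subscriptions/{subscription_id}"
    if resource_group:
        scope += f"/resourceGroups/{resource_group}"
        if storage_account:
            scope += f"/providers/Microsoft.Storage/storageAccounts/{storage_account}"
    return scope

# Resource-level scope for a single storage account (placeholder names):
scope = role_assignment_scope("00000000-0000-0000-0000-000000000000",
                              "my-resource-group", "mystorageaccount")
```

The narrower the scope, the less the service principal can scan, so choose the level that matches your needs.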
-
-### Creating the scan
-
-1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview governance portal**
-1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
-1. Select the **New Scan** icon under the **Azure Blob data source** registered earlier
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-new-scan.png" alt-text="Screenshot that shows the screen to create a new scan":::
-
-#### If using a system or user assigned managed identity
-
-Provide a **Name** for the scan, select the Microsoft Purview account's SAMI or UAMI under **Credential**, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-managed-identity.png" alt-text="Screenshot that shows the managed identity option to run the scan":::
-
-#### If using Account Key
-
-Provide a **Name** for the scan, select the Azure IR or your self-hosted IR depending on your configuration, choose the appropriate collection for the scan, set the **Authentication method** to _Account Key_, and select **Create**.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-acct-key.png" alt-text="Screenshot that shows the Account Key option for scanning":::
-
-#### If using Service Principal
-
-1. Provide a **Name** for the scan, select the Azure IR or your self-hosted IR depending on your configuration, choose the appropriate collection for the scan, and select **+ New** under **Credential**
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sp-option.png" alt-text="Screenshot that shows the option for service principal to enable scanning":::
-
-1. Select the appropriate **Key vault connection** and the **Secret name** that was used while creating the _service principal_. The **Service Principal ID** is the **Application (client) ID** you copied earlier.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-service-principal-option.png" alt-text="Screenshot that shows the service principal option":::
-
-1. Select **Test connection**. On a successful connection, select **Continue**
-
-### Scoping and running the scan
-
-1. You can scope your scan to specific folders and subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. If creating a new _scan rule set_, select the **file types** to be included in the scan rule.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-file-types.png" alt-text="Scan rule set file types":::
-
-1. You can select the **classification rules** to be included in the scan rule
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-classification rules.png" alt-text="Scan rule set classification rules":::
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-select-scan-rule-set.png" alt-text="Scan rule set selection":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scan-trigger.png" alt-text="scan trigger":::
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-review-scan.png" alt-text="review scan":::
-
-### Viewing Scan
-
-1. Navigate to the _data source_ in the _Collection_ and select **View Details** to check the status of the scan
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-view-scan.png" alt-text="view scan":::
-
-1. The scan details indicate the progress of the scan in the **Last run status** and the number of assets _scanned_ and _classified_
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scan-details.png" alt-text="view scan details":::
-
-1. The **Last run status** will be updated to **In progress** and then **Completed** once the entire scan has run successfully
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scan-in-progress.png" alt-text="view scan in progress":::
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-scan-completed.png" alt-text="view scan completed":::
-
-### Managing Scan
-
-Scans can be managed or run again on completion.
-
-1. Select the **Scan name** to manage the scan
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-manage-scan.png" alt-text="manage scan":::
-
-1. You can _run the scan again_, _edit the scan_, or _delete the scan_
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-manage-scan-options.png" alt-text="manage scan options":::
-
-1. You can _run an incremental scan_ or a _full scan_ again.
-
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-full-inc-scan.png" alt-text="full or incremental scan":::
-
-## Data sharing
-
-Microsoft Purview Data Sharing (preview) enables in-place sharing of data from one Azure Blob Storage account to another. This section details the specific requirements for sharing and receiving data in-place between Azure Blob Storage accounts. See [How to share data](how-to-share-data.md) and [How to receive share](how-to-receive-share.md) for step-by-step guides on how to use data sharing.
-
-### Storage accounts supported for in-place data sharing
-
-The following storage accounts are supported for in-place data sharing:
-
-* Regions: Canada Central, Canada East, UK South, UK West, Australia East, Japan East, Korea South, and South Africa North
-* Redundancy options: LRS, GRS, RA-GRS
-* Tiers: Hot, Cool
-
-Only use storage accounts without production workload for the preview.
-
->[!NOTE]
-> Source and target storage accounts must be in the same region as each other. They don't need to be in the same region as the Microsoft Purview account.
-
-### Storage account permissions required to share data
-
-To add or update a storage account asset to a share, you need ONE of the following permissions:
-
-* **Microsoft.Authorization/roleAssignments/write** - This permission is available in the _Owner_ role.
-* **Microsoft.Storage/storageAccounts/blobServices/containers/blobs/modifyPermissions/** - This permission is available in the _Blob Storage Data Owner_ role.
-
-### Storage account permissions required to receive shared data
-
-To map a storage account asset in a received share, you need ONE of the following permissions:
-
-* **Microsoft.Storage/storageAccounts/write** - This permission is available in the _Contributor_ and _Owner_ roles.
-* **Microsoft.Storage/storageAccounts/blobServices/containers/write** - This permission is available in the _Contributor_, _Owner_, _Storage Blob Data Contributor_, and _Storage Blob Data Owner_ roles.
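Because only ONE of the permissions above is needed, the check can be sketched as follows. The role-to-permission mapping restates the two bullets above; `can_map_received_share` is a hypothetical helper, not a Microsoft Purview API:

```python
# The two permissions from the bullets above, and which built-in roles include them.
ACCOUNT_WRITE = "Microsoft.Storage/storageAccounts/write"
CONTAINER_WRITE = "Microsoft.Storage/storageAccounts/blobServices/containers/write"

ROLE_PERMISSIONS = {
    "Contributor": {ACCOUNT_WRITE, CONTAINER_WRITE},
    "Owner": {ACCOUNT_WRITE, CONTAINER_WRITE},
    "Storage Blob Data Contributor": {CONTAINER_WRITE},
    "Storage Blob Data Owner": {CONTAINER_WRITE},
}

def can_map_received_share(assigned_roles):
    """True if any assigned role carries at least one of the required permissions."""
    required = {ACCOUNT_WRITE, CONTAINER_WRITE}
    granted = set().union(*(ROLE_PERMISSIONS.get(r, set()) for r in assigned_roles))
    return bool(granted & required)
```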
-
-### Update shared data in source storage account
-
-Updates you make to shared files, or to data in the shared folder, of the source storage account are made available to the recipient in the target storage account in near real time. When you delete a subfolder or files within the shared folder, they disappear for the recipient. To delete the shared folder, files, parent folders, or containers, you must first revoke access to all your shares from the source storage account.
-
-### Access shared data in target storage account
-
-The target storage account gives the recipient read-only access to the shared data in near real time. You can connect analytics tools, such as a Synapse workspace or Azure Databricks, to the shared data to perform analytics. The cost of accessing the shared data is charged to the target storage account.
-
-### Service limit
-
-A source storage account can support up to 20 targets, and a target storage account can support up to 100 sources. If you require a limit increase, contact support.
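A sketch of how a pre-flight check against these documented limits might look (the constants restate the limits above; the helper is our own, not an Azure API):

```python
MAX_TARGETS_PER_SOURCE = 20   # one source storage account -> up to 20 targets
MAX_SOURCES_PER_TARGET = 100  # one target storage account -> up to 100 sources

def within_share_limits(targets_for_source, sources_for_target):
    """True if both counts are within the documented service limits."""
    return (targets_for_source <= MAX_TARGETS_PER_SOURCE
            and sources_for_target <= MAX_SOURCES_PER_TARGET)
```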
-
-## Access policy
-
-The following types of policies are supported on this data resource from Microsoft Purview:
-- [Data owner policies](concept-policies-data-owner.md)
-- [Self-service access policies](concept-self-service-data-access-policy.md)
-
-### Access policy pre-requisites on Azure Storage accounts
-
-### Configure the Microsoft Purview account for policies
-
-### Register the data source in Microsoft Purview for Data Use Management
-The Azure Storage resource needs to be registered first with Microsoft Purview before you can create access policies.
-To register your resource, follow the **Prerequisites** and **Register** sections of this guide:
-- [Register Azure Blob Storage in Microsoft Purview](register-scan-azure-blob-storage-source.md#prerequisites)
-
-After you've registered the data source, you need to enable Data Use Management. This is a prerequisite before you can create policies on the data source. Data Use Management can affect the security of your data, because it delegates management of access to the data sources to certain Microsoft Purview roles. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** option set to **Enabled**, it will look like this screenshot:
-![Screenshot shows how to register a data source for policy with the option Data use management set to enable](./media/how-to-policies-data-owner-storage/register-data-source-for-policy-storage.png)
-
-### Create a policy
-To create an access policy for Azure Blob Storage, follow this guide: [Provision read/modify access on a single storage account](./how-to-policies-data-owner-storage.md#create-and-publish-a-data-owner-policy).
-
-To create policies that cover all data sources inside a resource group or Azure subscription you can refer to [this section](register-scan-azure-multiple-sources.md#access-policy).
-
-## Next steps
-
-Follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data owner policies in Microsoft Purview](concept-policies-data-owner.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Data Sharing in Microsoft Purview](concept-data-share.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Cosmos Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-cosmos-database.md
- Title: 'Connect to Azure Cosmos DB for SQL API'
-description: This article outlines the process to register an Azure Cosmos DB instance in Microsoft Purview including instructions to authenticate and interact with the Azure Cosmos DB database
-Previously updated: 09/14/2022
-
-# Connect to Azure Cosmos DB for SQL API in Microsoft Purview
-
-This article outlines the process to register and scan an Azure Cosmos DB for SQL API instance in Microsoft Purview, including instructions to authenticate and interact with the Azure Cosmos DB database source.
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)|[No](#scan) | [Yes](#scan)|[Yes](#scan)|[Yes](create-sensitivity-label.md)|No|No** | No |
-
-\** Lineage is supported if a dataset is used as a source/sink in the [Data Factory Copy activity](how-to-link-azure-data-factory.md)
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section enables you to register the Azure Cosmos DB for SQL API instance and set up an appropriate authentication mechanism to ensure successful scanning of the data source.
-
-### Steps to register
-
-It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-
-1. Open the Microsoft Purview governance portal by:
-
- * Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- * Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to the **Data Map --> Collections**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-open-purview-studio.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
-
-1. Create the [Collection hierarchy](./quickstart-create-collection.md) using the **Collections** menu and assign permissions to individual subcollections, as required
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-collections.png" alt-text="Screenshot that shows the collection menu to create collection hierarchy":::
-
-1. Navigate to the appropriate collection under the **Sources** menu and select the **Register** icon to register a new Azure Cosmos DB database
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-register-data-source.png" alt-text="Screenshot that shows the collection used to register the data source":::
-
-1. Select the **Azure Cosmos DB for SQL API** data source and select **Continue**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-select-data-source.png" alt-text="Screenshot that allows selection of the data source":::
-
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, **Cosmos DB account name**, and **collection**, and then select **Apply**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-data-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
-
-1. The _Azure Cosmos DB database_ account will be shown under the selected collection
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-collection-mapping.png" alt-text="Screenshot that shows the data source mapped to the collection to initiate scanning":::
-
-## Scan
-
-### Authentication for a scan
-
-To scan the data source, an authentication method needs to be configured for the Azure Cosmos DB account.
-
-There is only one way to set up authentication for an Azure Cosmos DB database:
-
-**Account Key** - You can create secrets inside an Azure key vault to store credentials, so that Microsoft Purview can scan data sources securely by using the secrets. A secret can be a storage account key, a SQL login password, or another password.
-
-> [!Note]
-> You need to deploy an _Azure key vault_ resource in your subscription and grant the _Microsoft Purview account's_ MSI the required access permissions to secrets inside the _Azure key vault_.
-
-#### Using Account Key for scanning
-
-You need to get your access key and store it in the key vault:
-
-1. Navigate to your Azure Cosmos DB database storage account
-1. Select **Settings > Keys**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-access-keys.png" alt-text="Screenshot that shows the access keys in the storage account":::
-
-1. Copy your *key* and save it separately for the next steps
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-key.png" alt-text="Screenshot that shows the access keys to be copied":::
-
-1. Navigate to your key vault
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-key-vault.png" alt-text="Screenshot that shows the key vault":::
-
-1. Select **Settings > Secrets** and select **+ Generate/Import**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-generate-secret.png" alt-text="Screenshot that shows the key vault option to generate a secret":::
-
-1. Enter the **Name** and **Value** as the *key* from your storage account, and select **Create** to complete
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-key-vault-options.png" alt-text="Screenshot that shows the key vault option to enter the secret values":::
-
-1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan.
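Key Vault only accepts secret names made of 1-127 alphanumeric characters and hyphens. A quick local validation, assuming that documented naming rule (`is_valid_secret_name` is a convenience helper, not part of any Azure SDK):

```python
import re

# Key Vault object names: 1-127 characters, alphanumerics and hyphens only.
_SECRET_NAME = re.compile(r"^[0-9a-zA-Z-]{1,127}$")

def is_valid_secret_name(name: str) -> bool:
    """Check a candidate name against Key Vault's naming rules before creating it."""
    return bool(_SECRET_NAME.match(name))
```

Validating the name up front avoids a round trip that would fail at secret-creation time.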
-
-### Creating the scan
-
-1. Open your **Microsoft Purview account** and select **Open Microsoft Purview governance portal**
-1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
-1. Select the **New Scan** icon under the **Azure Cosmos database** registered earlier
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-create-scan.png" alt-text="Screenshot that shows the screen to create a new scan":::
-
-1. Provide a **Name** for the scan, choose the appropriate collection for the scan and select **+ New** under **Credential**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-acct-key-option.png" alt-text="Screenshot that shows the Account Key option for scanning":::
-
-1. Select the appropriate **Key vault connection** and the **Secret name** that was used while creating the _account key_ secret. Set the **Authentication method** to _Account Key_
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-acct-key-details.png" alt-text="Screenshot that shows the account key options":::
-
-1. Select **Test connection**. On a successful connection, select **Continue**
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-test-connection.png" alt-text="Screenshot that shows Test Connection success":::
-
-### Scoping and running the scan
-
-1. You can scope your scan to specific folders and subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-scan-rule-set.png" alt-text="Scan rule set":::
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-new-scan-rule-set.png" alt-text="New Scan rule":::
-
-1. You can select the **classification rules** to be included in the scan rule
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-classification.png" alt-text="Scan rule set classification rules":::
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-select-scan-rule-set.png" alt-text="Scan rule set selection":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-scan-trigger.png" alt-text="scan trigger":::
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-review-scan.png" alt-text="review scan":::
-
-### Viewing Scan
-
-1. Navigate to the _data source_ in the _Collection_ and select **View Details** to check the status of the scan
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-view-scan.png" alt-text="view scan":::
-
-1. The scan details indicate the progress of the scan in the **Last run status** and the number of assets _scanned_ and _classified_
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-last-run-status.png" alt-text="view scan details":::
-
-1. The **Last run status** will be updated to **In progress** and then **Completed** once the entire scan has run successfully
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-scan-in-progress.png" alt-text="view scan in progress":::
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-scan-completed.png" alt-text="view scan completed":::
-
-### Managing Scan
-
-Scans can be managed or run again on completion.
-
-1. Select the **Scan name** to manage the scan
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-manage-scan.png" alt-text="manage scan":::
-
-1. You can _run the scan again_, _edit the scan_, or _delete the scan_
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-manage-scan-options.png" alt-text="manage scan options":::
-
-1. You can run a _Full Scan_ again
-
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-full-scan.png" alt-text="full scan":::
-
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Data Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-data-explorer.md
- Title: 'Connect to and manage Azure Data Explorer'
-description: This guide describes how to connect to Azure Data Explorer in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Data Explorer source.
-Previously updated: 02/27/2023
-
-# Connect to and manage Azure Data Explorer in Microsoft Purview
--
-This article outlines how to register Azure Data Explorer, and how to authenticate and interact with Azure Data Explorer in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan)| [Yes](#scan)| [Yes](create-sensitivity-label.md)| No | Limited* | No |
-
-\* *Lineage is supported if a dataset is used as a sink in a [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register Azure Data Explorer in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-Several authentication methods are available for Azure Data Explorer:
-
-- [Service Principal](#service-principal-to-register)
-- [System-assigned managed identity (SAMI)](#system-or-user-assigned-managed-identity-to-register)
-- [User-assigned managed identity (UAMI)](#system-or-user-assigned-managed-identity-to-register)
-
-#### Service Principal to register
-
-To use service principal authentication for scans, you can use an existing one or create a new one.
-
-> [!Note]
-> If you have to create a new Service Principal, please follow these steps:
-> 1. Navigate to the [Azure portal](https://portal.azure.com).
-> 1. Select **Azure Active Directory** from the left-hand side menu.
-> 1. Select **App registrations**.
-> 1. Select **+ New application registration**.
-> 1. Enter a name for the **application** (the service principal name).
-> 1. Select **Accounts in this organizational directory only**.
-> 1. For Redirect URI select **Web** and enter any URL you want; it doesn't have to be real or work.
-> 1. Then select **Register**.
-
-You must get the service principal's application ID and secret:
-
-1. Navigate to your Service Principal in the [Azure portal](https://portal.azure.com)
-1. Copy the values of the **Application (client) ID** from **Overview** and the **Client secret** from **Certificates & secrets**.
-1. Navigate to your key vault
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
-1. Select **Create** to complete
-1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan
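The **Application (client) ID** you copy is a GUID, so a simple local sanity check can catch copy-paste mistakes before you store the secret. This sketch uses only the Python standard library; `looks_like_client_id` is our own name, not an Azure SDK function:

```python
import uuid

def looks_like_client_id(value: str) -> bool:
    """True if value parses as a canonical hyphenated GUID,
    e.g. '12345678-1234-1234-1234-123456789abc'."""
    try:
        return str(uuid.UUID(value)) == value.lower()
    except ValueError:
        return False
```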
-
-#### Granting the Service Principal access to your Azure data explorer instance
-
-1. Navigate to the Azure portal. Then navigate to your Azure data explorer instance.
-
-1. Add the service principal to the **AllDatabasesViewer** role in the **Permissions** tab.
-
-### System or user assigned managed identity to register
-
-* **System-assigned managed identity** - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in your Azure AD tenant. It has the same name as your Microsoft Purview account.
-
-* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-To register using either of these managed identities, follow these steps:
-
-1. If you would like to use a user-assigned managed identity and have not created one, follow the steps to create the identity in the [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-1. Navigate to the [Azure portal](https://portal.azure.com). Then navigate to your Azure data explorer instance.
-
-1. Select the **Permissions** tab on the left pane.
-
-1. Add the SAMI or UAMI to the **AllDatabasesViewer** role in the **Permissions** tab.
-
-### Steps to register
-
-To register a new Azure Data Explorer (Kusto) account in your data catalog, follow these steps:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On **Register sources**, select **Azure Data Explorer**
-1. Select **Continue**
--
-On the **Register sources (Azure Data Explorer (Kusto))** screen, do the following:
-
-1. Enter a **Name** that the data source will be listed with in the Catalog.
-2. Choose your Azure subscription to filter down Azure Data Explorer.
-3. Select an appropriate cluster.
-4. Select a collection or create a new one (Optional).
-5. Select **Register** to register the data source.
--
-## Scan
-
-Follow the steps below to scan Azure Data Explorer to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the Azure Data Explorer source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source.
-
- :::image type="content" source="media/register-scan-azure-data-explorer/set-up-scan-data-explorer.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific databases by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-data-explorer/scope-your-scan-data-explorer.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-data-explorer/scan-rule-set-data-explorer.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-data-explorer/trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
--
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Databricks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-databricks.md
- Title: Connect to and manage Azure Databricks
-description: This guide describes how to connect to Azure Databricks in Microsoft Purview, and how to use Microsoft Purview to scan and manage your Azure Databricks source.
-
- Previously updated : 04/20/2023
-
-# Connect to and manage Azure Databricks in Microsoft Purview (Preview)
-
-This article outlines how to register Azure Databricks, and how to authenticate and interact with Azure Databricks in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
--
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | No | No | No| No| [Yes](#lineage) | No |
-
-When scanning the Azure Databricks source, Microsoft Purview supports:
-
-- Extracting technical metadata, including:
- - Azure Databricks workspace
- - Hive server
- - Databases
- - Tables including the columns, foreign keys, unique constraints, and storage description
- - Views including the columns and storage description
-
-- Fetching the relationship between external tables and Azure Data Lake Storage Gen2/Azure Blob assets (external locations).
-- Fetching static lineage between tables and views based on the view definition.
-This connector brings metadata from the Databricks metastore. Compared to scanning via the [Hive Metastore connector](register-scan-hive-metastore-source.md), which you may have used to scan Azure Databricks previously:
-
-- You can set up a scan for Azure Databricks workspaces directly, without direct HMS access. The scan uses a Databricks personal access token for authentication and connects to a cluster to perform the scan.
-- The Databricks workspace information is captured.
-- The relationship between tables and storage assets is captured.
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently doesn't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need an Azure Key Vault, and to [grant Microsoft Purview permissions to access secrets](manage-credentials.md#grant-microsoft-purview-access-to-your-azure-key-vault).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md). The minimal supported self-hosted Integration Runtime version is 5.20.8227.2.
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure that Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the machine where the self-hosted integration runtime is running. If you don't have this update installed, [download it now](/cpp/windows/latest-supported-vc-redist).
-
-* In your Azure Databricks workspace:
-
- * [Generate a personal access token](/azure/databricks/dev-tools/auth#--azure-databricks-personal-access-tokens), and store it as a secret in Azure Key Vault.
- * [Create a cluster](/azure/databricks/clusters/create-cluster). Note down the cluster ID - you can find it in Azure Databricks workspace -> Compute -> your cluster -> Tags -> Automatically added tags -> `ClusterId`.
- * Make sure the user has the following [permissions](/azure/databricks/security/access-control/cluster-acl) to connect to the Azure Databricks cluster:
-
- * **Can Attach To** permission to connect to the running cluster.
- * **Can Restart** permission to automatically trigger the cluster to start if its state is terminated when connecting.
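Before setting up the scan, you can optionally sanity-check the personal access token and cluster ID against the Databricks REST API (`GET /api/2.0/clusters/get`, which returns the cluster's state). The sketch below only assembles the request pieces; the workspace URL, token, and cluster ID are placeholders, and you'd issue the GET with any HTTP client:

```python
# Hedged sketch: build the Databricks REST API request used to check a
# cluster's state with a personal access token (PAT). The workspace URL,
# token, and cluster ID below are placeholders, not real values.

def cluster_get_request(workspace_url: str, token: str, cluster_id: str):
    """Return (url, headers, params) for GET /api/2.0/clusters/get."""
    url = f"{workspace_url.rstrip('/')}/api/2.0/clusters/get"
    headers = {"Authorization": f"Bearer {token}"}  # the PAT goes in a Bearer header
    params = {"cluster_id": cluster_id}
    return url, headers, params

url, headers, params = cluster_get_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    "dapiXXXXXXXXXXXXXXXX",                                # placeholder PAT
    "0123-456789-abcdefg1",                                # placeholder ClusterId tag value
)
print(url)
```

The response's `state` field should be `RUNNING`, or `TERMINATED` if the **Can Restart** permission is what will bring the cluster up during a scan.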
-
-## Register
-
-This section describes how to register an Azure Databricks workspace in Microsoft Purview by using [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-1. Go to your Microsoft Purview account.
-
-1. Select **Data Map** on the left pane.
-
-1. Select **Register**.
-
-1. In **Register sources**, select **Azure Databricks** > **Continue**.
-
-1. On the **Register sources (Azure Databricks)** screen, do the following:
-
- 1. For **Name**, enter a name that Microsoft Purview will list as the data source.
-
- 1. For **Azure subscription** and **Databricks workspace name**, select the subscription and workspace that you want to scan from the dropdown. The Databricks workspace URL will be automatically populated.
-
- 1. For **Select a collection**, choose a collection from the list or create a new one. This step is optional.
-
- :::image type="content" source="media/register-scan-azure-databricks/configure-sources.png" alt-text="Screenshot of registering Azure Databricks source." border="true":::
-
-1. Select **Finish**.
-
-## Scan
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have followed all [**prerequisites**](#prerequisites).
-> 1. Review our [**scan troubleshooting documentation**](troubleshoot-connections.md).
-
-Use the following steps to scan Azure Databricks to automatically identify assets. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
-
-1. In the Management Center, select integration runtimes. Make sure that a self-hosted integration runtime is set up. If it isn't set up, use the steps in [Create and manage a self-hosted integration runtime](./manage-integration-runtimes.md).
-
-1. Go to **Sources**.
-
-1. Select the registered Azure Databricks source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: Enter a name for the scan.
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
- * Select **Access Token Authentication** while creating a credential.
- * Provide secret name of the personal access token that you created in [Prerequisites](#prerequisites) in the appropriate box.
-
- For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
- 1. **Cluster ID**: Specify the cluster ID that Microsoft Purview will connect to and perform the scan. You can find it in Azure Databricks workspace -> Compute -> your cluster -> Tags -> Automatically added tags -> `ClusterId`.
-
- 1. **Mount points**: Provide the mount point and Azure Storage source location string when you have external storage manually mounted to Databricks. Use the format `/mnt/<path>=abfss://<container>@<adls_gen2_storage_account>.dfs.core.windows.net/;/mnt/<path>=wasbs://<container>@<blob_storage_account>.blob.core.windows.net`. This value is used to capture the relationship between tables and the corresponding storage assets in Microsoft Purview. The setting is optional; if it's not specified, the relationship isn't retrieved.
-
- You can get the list of mount points in your Databricks workspace by running the following Python command in a notebook:
-
- ```python
- dbutils.fs.mounts()
- ```
-
- It prints all the mount points, as shown below:
-
- ```
- [MountInfo(mountPoint='/databricks-datasets', source='databricks-datasets', encryptionType=''),
- MountInfo(mountPoint='/mnt/ADLS2', source='abfss://samplelocation1@azurestorage1.dfs.core.windows.net/', encryptionType=''),
- MountInfo(mountPoint='/databricks/mlflow-tracking', source='databricks/mlflow-tracking', encryptionType=''),
- MountInfo(mountPoint='/mnt/Blob', source='wasbs://samplelocation2@azurestorage2.blob.core.windows.net', encryptionType=''),
- MountInfo(mountPoint='/databricks-results', source='databricks-results', encryptionType=''),
- MountInfo(mountPoint='/databricks/mlflow-registry', source='databricks/mlflow-registry', encryptionType=''),
- MountInfo(mountPoint='/', source='DatabricksRoot', encryptionType='')]
- ```
-
- In this example, specify the following as mount points:
-
- `/mnt/ADLS2=abfss://samplelocation1@azurestorage1.dfs.core.windows.net/;/mnt/Blob=wasbs://samplelocation2@azurestorage2.blob.core.windows.net`
-
- 1. **Maximum memory available**: Maximum memory (in gigabytes) available on the customer's machine for the scanning processes to use. This value depends on the size of the Azure Databricks environment to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-azure-databricks/scan.png" alt-text="Screenshot of setting up Azure Databricks scan." border="true":::
-
-1. Select **Continue**.
-
-1. For **Scan trigger**, choose whether to set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
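The **Mount points** string in the scan settings above can be assembled from the `dbutils.fs.mounts()` output. A minimal sketch — `MountInfo` is stubbed here for illustration, since `dbutils` is only available inside a Databricks notebook:

```python
from collections import namedtuple

# Stand-in for the MountInfo records returned by dbutils.fs.mounts();
# inside a Databricks notebook you'd call dbutils.fs.mounts() directly.
MountInfo = namedtuple("MountInfo", ["mountPoint", "source", "encryptionType"])

def format_mount_points(mounts):
    """Keep only external ADLS Gen2 (abfss) and Blob (wasbs) mounts and
    join them as /mnt/<path>=<source>;... for the Mount points field."""
    external = [m for m in mounts if m.source.startswith(("abfss://", "wasbs://"))]
    return ";".join(f"{m.mountPoint}={m.source}" for m in external)

mounts = [
    MountInfo("/databricks-datasets", "databricks-datasets", ""),
    MountInfo("/mnt/ADLS2", "abfss://samplelocation1@azurestorage1.dfs.core.windows.net/", ""),
    MountInfo("/mnt/Blob", "wasbs://samplelocation2@azurestorage2.blob.core.windows.net", ""),
]
print(format_mount_points(mounts))
# /mnt/ADLS2=abfss://samplelocation1@azurestorage1.dfs.core.windows.net/;/mnt/Blob=wasbs://samplelocation2@azurestorage2.blob.core.windows.net
```

Internal mounts such as `DatabricksRoot` or `databricks-datasets` carry no external storage location, so they're filtered out.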
-
-Once the scan successfully completes, see how to [browse and search Azure Databricks assets](#browse-and-search-assets).
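The memory sizing note in the scan settings above (roughly 1 GB per 1,000 tables) works out to a simple calculation:

```python
import math

# Illustrative helper for the sizing rule of thumb above: recommended
# "Maximum memory available" (GB) at 1 GB per 1,000 tables, minimum 1 GB.
def recommended_memory_gb(table_count: int) -> int:
    return max(1, math.ceil(table_count / 1000))

print(recommended_memory_gb(250))   # small metastore -> 1 GB
print(recommended_memory_gb(4500))  # larger metastore -> 5 GB
```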
--
-## Browse and search assets
-
-After scanning your Azure Databricks, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-From the Databricks workspace asset, you can find the associated Hive Metastore and its tables/views, and the reverse also applies.
----
-## Lineage
-
-Refer to the [supported capabilities](#supported-capabilities) section on the supported Azure Databricks scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and [lineage user guide](catalog-lineage-user-guide.md).
-
-Go to the Hive table/view asset and select the lineage tab to see the asset relationships when applicable. For the relationship between a table and external storage assets, the Hive table asset and the storage asset are directly connected bi-directionally, because they mutually affect each other. If you use a mount point in the create table statement, you need to provide the mount point information in the [scan settings](#scan) to extract this relationship.
--
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Azure Files Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-files-storage-source.md
- Title: Connect to and manage Azure Files
-description: This guide describes how to connect to Azure Files in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Files source.
-
- Previously updated : 02/14/2023
-
-# Connect to and manage Azure Files in Microsoft Purview
-
-This article outlines how to register Azure Files, and how to authenticate and interact with Azure Files in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](create-sensitivity-label.md)| No | Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source/sink in the [Data Factory Copy activity](how-to-link-azure-data-factory.md).
-
-Azure Files supports full and incremental scans to capture the metadata and classifications, based on system default and custom classification rules.
-
-For file types such as csv, tsv, psv, and ssv, the schema is extracted when the following conditions are met:
-
-1. The first row's values are non-empty.
-2. The first row's values are unique.
-3. The first row's values are neither dates nor numbers.
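As an illustration (not Purview's actual implementation), the three header conditions above can be sketched in Python — the date formats checked here are samples only:

```python
# Illustrative sketch of the header heuristic described above: treat the
# first row as a schema header only when its values are non-empty, unique,
# and neither dates nor numbers. Not Purview's real parser.
from datetime import datetime

def _is_number(value: str) -> bool:
    try:
        float(value)
        return True
    except ValueError:
        return False

def _is_date(value: str) -> bool:
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):  # sample formats only
        try:
            datetime.strptime(value, fmt)
            return True
        except ValueError:
            pass
    return False

def first_row_is_header(row) -> bool:
    values = [v.strip() for v in row]
    if any(not v for v in values):          # condition 1: non-empty
        return False
    if len(set(values)) != len(values):     # condition 2: unique
        return False
    # condition 3: neither dates nor numbers
    return not any(_is_number(v) or _is_date(v) for v in values)

print(first_row_is_header(["id", "name", "signup_date"]))  # True
print(first_row_is_header(["1", "Alice", "2021-01-01"]))   # False
```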
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register Azure Files in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-Currently there's only one way to set up authentication for Azure file shares:
-
-- Account Key
-#### Account Key to register
-
-When the selected authentication method is **Account Key**, you need to get your access key and store it in the key vault:
-
-1. Navigate to your storage account
-1. Select **Security + networking > Access keys**
-1. Copy your *key* and save it somewhere for the next steps
-1. Navigate to your key vault
-1. Select **Objects > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *key* from your storage account
-1. Select **Create** to complete
-1. If your key vault isn't connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan
-
-### Steps to register
-
-To register a new Azure Files account in your data catalog, follow these steps:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On **Register sources**, select **Azure Files**
-1. Select **Continue**
--
-On the **Register sources (Azure Files)** screen, follow these steps:
-
-1. Enter a **Name** that the data source will be listed with in the Catalog.
-2. Choose your Azure subscription to filter down Azure Storage Accounts.
-3. Select an Azure Storage Account.
-4. Select a collection or create a new one (Optional).
-5. Select **Register** to register the data source.
--
-## Scan
-
-Follow the steps below to scan Azure Files to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the Azure Files source that you registered.
-
-1. Select **New scan**
-
-1. Select the account key credential to connect to your data source.
-
- :::image type="content" source="media/register-scan-azure-files/set-up-scan-azure-file.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific file shares by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-files/azure-file-scope-your-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose the system default, an existing custom rule set, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-files/azure-file-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a recurring schedule, or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-files/trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
--
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-machine-learning.md
- Title: Connect to and manage Azure Machine Learning
-description: This guide describes how to connect to Azure Machine Learning in Microsoft Purview.
-
- Previously updated : 05/23/2023
-
-# Connect to and manage Azure Machine Learning in Microsoft Purview (Preview)
-
-This article outlines how to register Azure Machine Learning and how to authenticate and interact with Azure Machine Learning in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-This integration between Azure Machine Learning and Microsoft Purview uses an auto-push model: once the Azure Machine Learning workspace has been registered in Microsoft Purview, metadata from the workspace is pushed to Microsoft Purview automatically on a daily basis. You don't need to run a manual scan to bring metadata from the workspace into Microsoft Purview.
--
-## Supported capabilities
-
-|**Metadata Extraction**|  **Full Scan**  |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| Yes | Yes | No | No | No| No| [Yes](#lineage) | No |
-
-When scanning the Azure Machine Learning source, Microsoft Purview supports:
-
-- Extracting technical metadata from Azure Machine Learning, including:
- - Workspace
- - Models
- - Datasets
- - Jobs
-
-## Prerequisites
-
-* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* An active Azure Machine Learning workspace
-
-* A user must have, at minimum, read access to the Azure Machine Learning workspace to enable auto push from the workspace.
-
-## Register
-
-This section describes how to register an Azure Machine Learning workspace in Microsoft Purview by using [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-1. Go to your Microsoft Purview account.
-
-1. Select **Data Map** on the left pane.
-
-1. Select **Register**.
-
-1. In **Register sources**, select **Azure Machine Learning (Preview)** > **Continue**.
-
- :::image type="content" source="./media/register-scan-azure-machine-learning/register-source.png" alt-text="Screenshot of the Azure Machine Learning source entry.":::
-
-1. On the **Register sources (Azure Machine Learning)** screen, do the following:
-
- 1. For **Name**, enter a friendly name that Microsoft Purview lists as the data source for the workspace.
-
- 1. For **Azure subscription** and **Workspace name**, select the subscription and workspace that you want to push from the dropdown. The Azure Machine Learning workspace URL is automatically populated.
-
- 1. For **Select a collection**, choose a collection from the list or create a new one. This step is optional.
-
-1. Select **Register** to register the source.   
-
-## Scan
-
-After you register your Azure Machine Learning workspace, the metadata will be automatically pushed to Microsoft Purview on a daily basis.
-
-## Browse and discover
-
-To access the browse experience for data assets from your Azure Machine Learning workspace, select __Browse Assets__.
--
-### Browse by collection
-
-Browse by collection allows you to explore the different collections you're a data reader or curator for.
--
-### Browse by source type
-
-1. On the browse by source types page, select __Azure Machine Learning__.
-
- :::image type="content" source="./media/register-scan-azure-machine-learning/browse-by-type.png" alt-text="Screenshot of the Azure Machine Learning source type." lightbox="./media/register-scan-azure-machine-learning/browse-by-type.png":::
-
-1. The top-level assets under your selected data type are listed. Pick one of the assets to further explore its contents. For example, after selecting Azure Machine Learning, you'll see a list of workspaces with assets in the data catalog.
-
- :::image type="content" source="./media/register-scan-azure-machine-learning/top-level-assets.png" alt-text="Screenshot of the top level assets." lightbox="./media/register-scan-azure-machine-learning/top-level-assets.png":::
-
-1. Selecting one of the workspaces displays the child assets.
-
- :::image type="content" source="./media/register-scan-azure-machine-learning/child-assets.png" alt-text="Screenshot of child assets." lightbox="./media/register-scan-azure-machine-learning/child-assets.png":::
-
-1. From the list, you can select any of the asset items to view details. For example, selecting one of the Azure Machine Learning job assets displays the details of the job.
-
- :::image type="content" source="./media/register-scan-azure-machine-learning/asset-details.png" alt-text="Screenshot of asset details." lightbox="./media/register-scan-azure-machine-learning/asset-details.png":::
-
-## Lineage
-
-To view lineage information, select an asset and then select the __Lineage__ tab. From the lineage tab, you can see the asset's relationships when applicable. You can see what source data was used (if registered in Purview), the data asset created in Azure Machine Learning, any jobs, and finally the resulting machine learning model. In more advanced scenarios, you can see:
-
-- If multiple data sources were used
-- Multiple stages of training on multiple data assets
-- If multiple models were created from the same data sources
-
-For more information on lineage in general, see [data lineage](concept-data-lineage.md) and [lineage users guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Azure Multiple Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-multiple-sources.md
- Title: Discover and govern multiple Azure sources
-description: This guide describes how to connect to multiple Azure sources in Microsoft Purview at once, and use Microsoft Purview's features to scan and manage your sources.
-
- Previously updated : 04/13/2023
-
-# Discover and govern multiple Azure sources in Microsoft Purview
-
-This article outlines how to register multiple Azure sources and how to authenticate and interact with them in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan)| [Yes](#scan)| [Source dependent](create-sensitivity-label.md) | [Yes](#access-policy) | [Source Dependent](catalog-lineage-user-guide.md)| No |
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register multiple Azure sources in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Prerequisites for registration
-
-Microsoft Purview needs permissions to be able to list resources under a subscription or resource group.
--
-### Authentication for registration
-
-There are two ways to set up authentication for multiple sources in Azure:
-
-* Managed identity
-* Service principal
-
-You must set up authentication on each resource within your subscription or resource group that you want to register and scan. Azure Storage resource types (Azure Blob Storage and Azure Data Lake Storage Gen2) make this easy by allowing you to add the managed identity or service principal at the subscription or resource group level as a Storage Blob Data Reader. The permissions then trickle down to each storage account within that subscription or resource group. For all other resource types, you must apply the managed identity or service principal on each resource, or create a script to do so.
-
-To learn how to add permissions on each resource type within a subscription or resource group, see the following resources:
-
-- [Azure Blob Storage](register-scan-azure-blob-storage-source.md#authentication-for-a-scan)
-- [Azure Data Lake Storage Gen1](register-scan-adls-gen1.md#authentication-for-a-scan)
-- [Azure Data Lake Storage Gen2](register-scan-adls-gen2.md#authentication-for-a-scan)
-- [Azure SQL Database](register-scan-azure-sql-database.md#configure-authentication-for-a-scan)
-- [Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md#authentication-for-registration)
-- [Azure Synapse Analytics](register-scan-azure-synapse-analytics.md#authentication-for-registration)
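As an illustrative sketch of the trickle-down approach described above, the snippet below builds the Azure CLI command that grants the Storage Blob Data Reader role at resource-group scope (the role name is real; the principal ID, subscription ID, and resource group are placeholders):

```python
# Hypothetical helper that builds the Azure CLI command used to grant the
# Microsoft Purview managed identity "Storage Blob Data Reader" at
# resource-group scope, so the permission trickles down to every storage
# account in the group. The GUIDs below are placeholders.
def role_assignment_command(principal_id: str, subscription_id: str, resource_group: str) -> str:
    scope = f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
    return (
        "az role assignment create "
        f"--assignee {principal_id} "
        '--role "Storage Blob Data Reader" '
        f"--scope {scope}"
    )

print(role_assignment_command(
    "00000000-0000-0000-0000-000000000000",  # placeholder Purview managed identity
    "11111111-1111-1111-1111-111111111111",  # placeholder subscription ID
    "my-data-rg",                            # placeholder resource group
))
```

The same scope string without the `/resourceGroups/...` suffix grants the role at subscription level instead.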
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left menu.
-1. Select **Register**.
-1. On **Register sources**, select **Azure (multiple)**.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/register-azure-multiple.png" alt-text="Screenshot that shows the tile for Azure Multiple on the screen for registering multiple sources.":::
-
-1. Select **Continue**.
-1. On the **Register sources (Azure)** screen, do the following:
-
- 1. In the **Name** box, enter a name that the data source will be listed with in the catalog.
- 1. In the **Management group** box, optionally choose a management group to filter down to.
- 1. In the **Subscription** and **Resource group** dropdown list boxes, select a subscription or a specific resource group, respectively. The registration scope will be set to the selected subscription or resource group.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/azure-multiple-source-setup.png" alt-text="Screenshot that shows the boxes for selecting a subscription and resource group.":::
-
- 1. In the **Select a collection** box, select a collection or create a new one (optional).
- 1. Select **Register** to register the data sources.
-
-## Scan
-
->[!IMPORTANT]
-> Currently, scanning multiple Azure sources is supported only by using the Azure integration runtime. Therefore, only Microsoft Purview accounts that allow public access on the firewall can use this option.
-
-Follow the steps below to scan multiple Azure sources to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. Select the **Data Map** tab on the left pane in the Microsoft Purview governance portal.
-1. Select the data source that you registered.
-1. Select **View details** > **+ New scan**, or use the **Scan** quick-action icon on the source tile.
-1. For **Name**, fill in the name.
-1. For **Type**, select the types of resources that you want to scan within this source. Choose one of these options:
-
- - Leave it as **All**. This selection includes future resource types that might not currently exist within that subscription or resource group.
- - Use the boxes to specifically select resource types that you want to scan. If you choose this option, future resource types that might be created within this subscription or resource group won't be included for scans, unless the scan is explicitly edited in the future.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/multiple-source-scan.png" alt-text="Screenshot that shows options for scanning multiple sources.":::
-
-1. Select the credential to connect to the resources within your data source:
- - You can select a credential at the parent level as a managed identity, or you can select a credential for a particular service principal type. You can then use that credential for all the resource types under the subscription or resource group.
- - You can specifically select the resource type and apply a different credential for that resource type.
-
- Each credential will be considered as the method of authentication for all the resources under a particular type. You must set the chosen credential on the resources in order to successfully scan them, as described [earlier in this article](#authentication-for-registration).
-1. Within each type, you can select to either scan all the resources or scan a subset of them by name:
- - If you leave the option as **All**, then future resources of that type will also be scanned in future scan runs.
- - If you select specific storage accounts or SQL databases, then future resources of that type created within this subscription or resource group won't be included for scans, unless the scan is explicitly edited in the future.
-
-1. Select **Test connection**. This will first test access to check whether you've applied the Microsoft Purview managed identity as a reader on the subscription or resource group. If you get an error message, follow [these instructions](#prerequisites-for-registration) to resolve it. Then it will test your authentication and connection to each of your selected sources and generate a report. The number of sources selected affects the time it takes to generate this report. If the connection fails for some resources, hover over the **X** icon to view the detailed error message.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/test-connection.png" alt-text="Screenshot showing the scan setup slider, with the Test Connection button highlighted.":::
- :::image type="content" source="media/register-scan-azure-multiple-sources/test-connection-report.png" alt-text="Screenshot showing an example test connection report, with some connections passing and some failing. Hovering over one of the failed connections shows a detailed error report.":::
-
-1. After your test connection has passed, select **Continue** to proceed.
-
-1. Select scan rule sets for each resource type that you chose in the previous step. You can also create scan rule sets inline.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/multiple-scan-rule-set.png" alt-text="Screenshot that shows scan rules for each resource type.":::
-
-1. Choose your scan trigger. You can schedule it to run weekly, monthly, or once.
-
-1. Review your scan and select **Save** to complete setup.
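Scans saved this way can also be triggered programmatically. As a sketch only — the endpoint shape and `api-version` below are assumptions modeled on the Purview Scanning REST API and should be verified against the current REST reference before use — the run-scan URL can be built like this:

```python
def scan_run_url(account_name: str, data_source: str, scan_name: str) -> str:
    """Build the (assumed) Purview Scanning data-plane URL that triggers a scan run.

    The path and api-version are assumptions based on the public Purview
    Scanning REST API; check them against the current reference.
    """
    return (
        f"https://{account_name}.purview.azure.com/scan/datasources/"
        f"{data_source}/scans/{scan_name}:run"
        "?api-version=2022-02-01-preview"
    )

# A scan run would then be started by POST-ing to this URL with an
# Azure AD bearer token for the account (not shown here).
print(scan_run_url("contoso-purview", "AzureMultiSource", "WeeklyScan"))
```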
-
-## View your scans and scan runs
-
-1. View source details by selecting **View details** on the tile under the **Data Map** section.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/multiple-source-detail.png" alt-text="Screenshot that shows source details.":::
-
-1. View scan run details by going to the **Scan details** page.
-
- The *status bar* is a brief summary of the running status of the child resources. It's displayed on the subscription level or resource group level. The colors have the following meanings:
-
- - Green: The scan was successful.
- - Red: The scan failed.
- - Gray: The scan is still in progress.
-
- You can select each scan to view finer details.
-
- :::image type="content" source="media/register-scan-azure-multiple-sources/multiple-scan-full-details.png" alt-text="Screenshot that shows scan details.":::
-
-1. View a summary of recent failed scan runs at the bottom of the source details. You can also view more granular details about these runs.
-
-## Manage your scans: edit, delete, or cancel
-
-To manage a scan, do the following:
-
-1. Go to the management center.
-1. Select **Data sources** under the **Sources and scanning** section, and then select the desired data source.
-1. Select the scan that you want to manage. Then:
-
- - You can edit the scan by selecting **Edit**.
- - You can delete the scan by selecting **Delete**.
- - If the scan is running, you can cancel it by selecting **Cancel**.
-
-## Access Policy
-
-### Supported policies
-The following types of policies are supported on this data resource from Microsoft Purview:
-- [DevOps policies](concept-policies-devops.md)
-- [Data owner policies](concept-policies-data-owner.md)
-### Access policy prerequisites on Azure Storage accounts
-To be able to enforce policies from Microsoft Purview, data sources under a resource group or subscription need to be configured first. Instructions vary based on the data source type.
-Review whether the data sources support Microsoft Purview policies and, if so, the specific instructions to enable them, under the Access Policy link in the [Microsoft Purview connector document](./microsoft-purview-connector-overview.md).
-
-### Configure the Microsoft Purview account for policies
-
-### Register the data source in Microsoft Purview for Data Use Management
-The Azure subscription or resource group needs to be registered first with Microsoft Purview before you can create access policies.
-To register your resource, follow the **Prerequisites** and **Register** sections of this guide:
-- [Register multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md#prerequisites)
-After you've registered the data resource, you'll need to enable Data Use Management. This is a prerequisite before you can create policies on the data resource. Data Use Management can impact the security of your data, as it delegates management of access to the data sources to certain Microsoft Purview roles. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
-
-Once your data source has the **Data Use Management** option set to **Enabled**, it will look like this screenshot:
-![Screenshot shows how to register a data source for policy with the option Data use management set to enable.](./media/how-to-policies-data-owner-resource-group/register-resource-group-for-policy.png)
-
-### Create a policy
-To create an access policy on an entire Azure subscription or resource group, follow these guides:
-* [DevOps policy covering all sources in a subscription or resource group](./how-to-policies-devops-resource-group.md#create-a-new-devops-policy)
-* [Provision read/modify access to all sources in a subscription or resource group](./how-to-policies-data-owner-resource-group.md#create-and-publish-a-data-owner-policy)
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [DevOps policies in Microsoft Purview](concept-policies-devops.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Mysql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-mysql-database.md
- Title: 'Connect to and manage Azure Database for MySQL'
-description: This guide describes how to connect to Azure Database for MySQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Database for MySQL source.
-Previously updated: 12/13/2022
-# Connect to and manage Azure Database for MySQL in Microsoft Purview
-
-This article outlines how to register a database in Azure Database for MySQL, and how to authenticate and interact with Azure Database for MySQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)| [Yes*](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](create-sensitivity-label.md) | No | Limited** | No |
-
-\* Microsoft Purview relies on UPDATE_TIME metadata from Azure Database for MySQL for incremental scans. In some cases, this field might not persist in the database and a full scan is performed. For more information, see [The INFORMATION_SCHEMA TABLES Table](https://dev.mysql.com/doc/refman/5.7/en/information-schema-tables-table.html) for MySQL.
-
-\** Lineage is supported if the dataset is used as a source or sink in a [Data Factory Copy activity](how-to-link-azure-data-factory.md).
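As a quick way to check which tables expose the `UPDATE_TIME` metadata that incremental scans rely on, you can query `INFORMATION_SCHEMA` directly (the database name is a placeholder):

```sql
-- Tables whose UPDATE_TIME is NULL may fall back to a full scan
SELECT TABLE_NAME, UPDATE_TIME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = '<your-database>';
```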
-
-> [!Important]
-> Microsoft Purview supports only the Single Server deployment option for Azure Database for MySQL.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register an Azure Database for MySQL in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-You'll need a **username** and **password** for the next steps.
-
-Follow the instructions in [CREATE DATABASES AND USERS](../mysql/howto-create-users.md) to create a login for your Azure Database for MySQL.
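As a sketch (the user name and password are placeholders), a dedicated read-only account suitable for scanning can be created like this:

```sql
-- Dedicated, read-only account for Microsoft Purview scans
CREATE USER 'purview_reader'@'%' IDENTIFIED BY '<strong-password>';
-- SELECT is sufficient for metadata extraction and classification sampling
GRANT SELECT ON *.* TO 'purview_reader'@'%';
FLUSH PRIVILEGES;
```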
-
-1. Navigate to your key vault in the Azure portal.
-1. Select **Settings > Secrets**.
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* for your Azure Database for MySQL.
-1. Select **Create** to complete.
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) of type SQL authentication using the **username** and **password** to set up your scan.
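If you prefer the Azure CLI over the portal for the secret steps above, the password can be stored with a single command (the vault and secret names are placeholders):

```shell
# Store the database password as a Key Vault secret
az keyvault secret set \
  --vault-name <your-key-vault> \
  --name mysql-purview-password \
  --value '<password>'
```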
-
-### Steps to register
-
-To register a new Azure Database for MySQL in your data catalog, do the following:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Then select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-
-1. Select **Register**.
-
-1. On **Register sources**, select **Azure Database for MySQL**. Select **Continue**.
-On the **Register sources (Azure Database for MySQL)** screen, do the following:
-
-1. Enter a **Name** for your data source. This will be the display name for this data source in your Catalog.
-1. Select **From Azure subscription**, select the appropriate subscription from the **Azure subscription** drop-down box and the appropriate server from the **Server name** drop-down box.
-1. Select **Register** to register the data source.
-## Scan
-
-Follow the steps below to scan Azure Database for MySQL to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the Azure Database for MySQL source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source and check your connection to make sure your credential is properly configured.
-
- :::image type="content" source="media/register-scan-azure-mysql/03-new-scan-azure-mysql-connection-credential.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific folders or subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-mysql/04-scope-azure-mysql-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-mysql/05-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-mysql/06-trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-postgresql.md
- Title: 'Connect to and manage an Azure Database for PostgreSQL'
-description: This guide describes how to connect to an Azure Database for PostgreSQL single server in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Database for PostgreSQL source.
-Previously updated: 12/13/2022
-# Connect to and manage an Azure Database for PostgreSQL in Microsoft Purview
-
-This article outlines how to register an Azure Database for PostgreSQL deployed with the single server deployment option, and how to authenticate and interact with an Azure Database for PostgreSQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)| [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](create-sensitivity-label.md)| No | Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source or sink in a [Data Factory Copy activity](how-to-link-azure-data-factory.md).
-
-> [!Important]
-> Microsoft Purview supports only the Single Server deployment option for Azure Database for PostgreSQL, versions 8.x to 12.x.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register an Azure Database for PostgreSQL in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-Currently, only SQL authentication is supported to manage and interact with an Azure Database for PostgreSQL single server.
-
-#### SQL Authentication
-
-Connecting to an Azure Database for PostgreSQL database requires the fully qualified server name and login credentials. If you don't have a login available, follow the instructions in [CONNECT AND QUERY](../postgresql/connect-python.md) to create one for your Azure Database for PostgreSQL. You'll need the **username** and **password** for the next steps.
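A minimal sketch of such a login (the role name, database, and password are placeholders) — read access on the schemas to be scanned is all Microsoft Purview needs:

```sql
-- Dedicated read-only role for Microsoft Purview scans
CREATE ROLE purview_reader WITH LOGIN PASSWORD '<strong-password>';
GRANT CONNECT ON DATABASE <your_database> TO purview_reader;
GRANT USAGE ON SCHEMA public TO purview_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO purview_reader;
```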
-
-1. If you don't have an Azure Key vault already, follow [this guide to create an Azure Key Vault](../key-vault/certificates/quick-create-portal.md).
-1. Navigate to your key vault in the Azure portal.
-1. Select **Settings > Secrets**.
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* for your Azure Database for PostgreSQL.
-1. Select **Create** to complete.
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) of type SQL authentication using the **username** and **password** to set up your scan.
-
-### Steps to register
-
-To register a new Azure Database for PostgreSQL in your data catalog, do the following:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Then select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-
-1. Select **Register**
-
-1. On **Register sources**, select **Azure Database for PostgreSQL**. Select **Continue**.
-On the **Register sources (Azure Database for PostgreSQL)** screen, do the following:
-
-1. Enter a **Name** for your data source. This will be the display name for this data source in your Catalog.
-1. Select **From Azure subscription**, select the appropriate subscription from the **Azure subscription** drop-down box and the appropriate server from the **Server name** drop-down box.
-1. Select **Register** to register the data source.
-## Scan
-
-Follow the steps below to scan an Azure Database for PostgreSQL database to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the Azure Database for PostgreSQL source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source.
-
- :::image type="content" source="media/register-scan-azure-postgresql/03-azure-postgres-scan.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific folders or subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-postgresql/04-scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-postgresql/05-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-postgresql/06-trigger-scan.png" alt-text="trigger scan":::
-
-1. Review your scan and select **Save and run**.
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-sql-database.md
- Title: 'Discover and govern Azure SQL Database'
-description: Learn how to register, authenticate with, and interact with an Azure SQL database in Microsoft Purview.
-Previously updated: 06/12/2023
-# Discover and govern Azure SQL Database in Microsoft Purview
-
-This article outlines the process to register an Azure SQL database source in Microsoft Purview. It includes instructions to authenticate and interact with the SQL database.
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register-the-data-source) | [Yes](#scope-and-run-the-scan)|[Yes](#scope-and-run-the-scan) | [Yes](#scope-and-run-the-scan)|[Yes](#scope-and-run-the-scan)| [Yes](create-sensitivity-label.md)| [Yes](#set-up-access-policies) | [Yes (preview)](#extract-lineage-preview) | No |
-
-> [!NOTE]
-> [Data lineage extraction is currently supported only for stored procedure runs.](#troubleshoot-lineage-extraction) Lineage is also supported if Azure SQL tables or views are used as a source/sink in [Azure Data Factory Copy and Data Flow activities](how-to-link-azure-data-factory.md).
-
-When you're scanning Azure SQL Database, Microsoft Purview supports extracting technical metadata from these sources:
-- Server
-- Database
-- Schemas
-- Tables, including columns
-- Views, including columns
-- Stored procedures (with lineage extraction enabled)
-- Stored procedure runs (with lineage extraction enabled)
-When you're setting up a scan, you can further scope it after providing the database name by selecting tables and views as needed.
-
-### Known limitations
-
-* Microsoft Purview supports a maximum of 800 columns on the schema tab. If there are more than 800 columns, Microsoft Purview will show **Additional-Columns-Truncated**.
-* For [lineage extraction scan](#extract-lineage-preview):
- * Lineage extraction scan is currently not supported if your logical server in Azure disables public access or doesn't allow Azure services to access it.
- * Lineage extraction scan is scheduled to run every six hours by default. The frequency can't be changed.
- * Lineage is captured only when the stored procedure execution transfers data from one table to another. And it's not supported for temporary tables.
- * Lineage extraction is not supported for functions or triggers.
- * Because of the following limitations, you might currently see duplicate assets in the catalog:
- * The object names in assets and fully qualified names follow the case used in stored procedure statements, which may not align with the object case in original data source.
- * When SQL views are referenced in stored procedures, they're currently captured as SQL tables.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* Data Source Administrator and Data Reader permissions, so you can register a source and manage it in the Microsoft Purview governance portal. For details, see [Access control in the Microsoft Purview governance portal](catalog-permissions.md).
-
-## Register the data source
-
-Before you scan, it's important to register the data source in Microsoft Purview:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to the **Data Map**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-open-purview-studio.png" alt-text="Screenshot that shows the area for opening a Microsoft Purview governance portal.":::
-
-1. Create the [collection hierarchy](./quickstart-create-collection.md) by going to **Collections** and then selecting **Add a collection**. Assign permissions to individual subcollections as required.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-collections.png" alt-text="Screenshot that shows selections for assigning access control permissions to the collection hierarchy.":::
-
-1. Go to the appropriate collection under **Sources**, and then select the **Register** icon to register a new SQL database.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-data-source.png" alt-text="Screenshot that shows the collection that's used to register the data source.":::
-
-1. Select the **Azure SQL Database** data source, and then select **Continue**.
-
-1. For **Name**, provide a suitable name for the data source. Select relevant names for **Azure subscription**, **Server name**, and **Select a collection**, and then select **Apply**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-ds-details.png" alt-text="Screenshot that shows details entered to register a data source.":::
-
-1. Confirm that the SQL database appears under the selected collection.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-ds-collections.png" alt-text="Screenshot that shows a data source mapped to a collection to initiate scanning.":::
-
-## Update firewall settings
-
-If your database server has a firewall enabled, you need to update the firewall to allow access in one of the following ways:
-- [Allow Azure connections through the firewall](#allow-azure-connections). This is a straightforward option to route traffic through Azure networking, without needing to manage virtual machines.
-- [Install a self-hosted integration runtime on a machine in your network and give it access through the firewall](#install-a-self-hosted-integration-runtime). If you have a private virtual network set up within Azure, or have any other closed network set up, using a self-hosted integration runtime on a machine within that network will allow you to fully manage traffic flow and utilize your existing network.
-- [Use a managed virtual network](catalog-managed-vnet.md). Setting up a managed virtual network with your Microsoft Purview account will allow you to connect to Azure SQL by using the Azure integration runtime in a closed network.
-For more information about the firewall, see the [Azure SQL Database firewall documentation](/azure/azure-sql/database/firewall-configure).
-
-### Allow Azure connections
-
-Enabling Azure connections will allow Microsoft Purview to connect to the server without requiring you to update the firewall itself.
-
-1. Go to your database account.
-1. On the **Overview** page, select the server name.
-1. Select **Security** > **Firewalls and virtual networks**.
-1. For **Allow Azure services and resources to access this server**, select **Yes**.
-For more information about allowing connections from inside Azure, see the [how-to guide](/azure/azure-sql/database/firewall-configure#connections-from-inside-azure).
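The portal steps above can also be done with the Azure CLI: a firewall rule with start and end address `0.0.0.0` is the special rule that allows Azure services to reach the server (the resource names below are placeholders):

```shell
# Allow Azure services and resources to access this logical server
az sql server firewall-rule create \
  --resource-group <resource-group> \
  --server <server-name> \
  --name AllowAzureServices \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0
```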
-
-### Install a self-hosted integration runtime
-
-You can install a self-hosted integration runtime on a machine to connect with a resource in a private network:
-
-1. [Create and install a self-hosted integration runtime](./manage-integration-runtimes.md) on a personal machine, or on a machine inside the same virtual network as your database server.
-1. Check your database server's networking configuration to confirm that a private endpoint is accessible to the machine that contains the self-hosted integration runtime. Add the IP address of the machine if it doesn't already have access.
-1. If your logical server is behind a private endpoint or in a virtual network, you can use an [ingestion private endpoint](catalog-private-link-ingestion.md#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources) to ensure end-to-end network isolation.
-
-## Configure authentication for a scan
-
-To scan your data source, you need to configure an authentication method in Azure SQL Database.
-
->[!IMPORTANT]
-> If you're using a [self-hosted integration runtime](manage-integration-runtimes.md) to connect to your resource, system-assigned and user-assigned managed identities won't work. You need to use service principal authentication or SQL authentication.
-
-Microsoft Purview supports the following options:
-
-* **System-assigned managed identity (SAMI)** (recommended). This is an identity that's associated directly with your Microsoft Purview account. It allows you to authenticate directly with other Azure resources without needing to manage a go-between user or credential set.
-
- The SAMI is created when your Microsoft Purview resource is created. It's managed by Azure and uses your Microsoft Purview account's name. The SAMI can't currently be used with a self-hosted integration runtime for Azure SQL.
-
- For more information, see the [managed identity overview](../active-directory/managed-identities-azure-resources/overview.md).
-
-* **User-assigned managed identity (UAMI)** (preview). Similar to a SAMI, a UAMI is a credential resource that allows Microsoft Purview to authenticate against Azure Active Directory (Azure AD).
-
- The UAMI is managed by users in Azure, rather than by Azure itself, which gives you more control over security. The UAMI can't currently be used with a self-hosted integration runtime for Azure SQL.
-
- For more information, see the [guide for user-assigned managed identities](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-* **Service principal**. A service principal is an application that can be assigned permissions like any other group or user, without being associated directly with a person. Authentication for service principals has an expiration date, so it can be useful for temporary projects.
-
- For more information, see the [service principal documentation](../active-directory/develop/app-objects-and-service-principals.md).
-
-* **SQL authentication**. Connect to the SQL database with a username and password. For more information, see the [SQL authentication documentation](/sql/relational-databases/security/choose-an-authentication-mode#connecting-through-sql-server-authentication).
-
- If you need to create a login, follow [this guide to query a SQL database](/azure/azure-sql/database/connect-query-portal). Use [this guide to create a login by using T-SQL](/sql/t-sql/statements/create-login-transact-sql).
-
- > [!NOTE]
- > Be sure to select the **Azure SQL Database** option on the page.
-
-For steps to authenticate with your SQL database, select your chosen method of authentication from the following tabs.
-
-# [SQL authentication](#tab/sql-authentication)
-
-> [!Note]
-> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. The Microsoft Purview account should be able to scan the resources about 15 minutes after it gets permissions.
-
-1. You need a SQL login with at least `db_datareader` permissions to access the information that Microsoft Purview needs to scan the database. You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azuresqldb-current&preserve-view=true#examples-1) to create a login for Azure SQL Database. Save the username and password for the next steps.
-
-1. Go to your key vault in the Azure portal.
-
-1. Select **Settings** > **Secrets**, and then select **+ Generate/Import**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-secret.png" alt-text="Screenshot that shows the key vault option to generate a secret.":::
-
-1. For **Name** and **Value**, use the username and password (respectively) from your SQL database.
-
-1. Select **Create**.
-
-1. If your key vault isn't connected to Microsoft Purview yet, [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-1. [Create a new credential](manage-credentials.md#create-a-new-credential) by using the key to set up your scan.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-credentials.png" alt-text="Screenshot that shows the key vault option to set up credentials.":::
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-key-vault-options.png" alt-text="Screenshot that shows the key vault option to create a secret.":::
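The login and database user from step 1 can be sketched in T-SQL as follows (the name and password are placeholders; run the first batch in `master` and the rest in the database to be scanned):

```sql
-- In the master database: create the server-level login
CREATE LOGIN purview_reader WITH PASSWORD = '<strong-password>';
GO

-- In the target database: map a user to the login and grant read access
CREATE USER purview_reader FOR LOGIN purview_reader;
GO
EXEC sp_addrolemember 'db_datareader', 'purview_reader';
GO
```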
-
-# [Managed identity](#tab/managed-identity)
-
->[!IMPORTANT]
-> If you're using a [self-hosted integration runtime](manage-integration-runtimes.md) to connect to your resource, system-assigned and user-assigned managed identities won't work. You need to use SQL authentication or service principal authentication.
-
-### Configure Azure AD authentication in the database account
-
-The managed identity needs permission to get metadata for the database, schemas, and tables. It must also be authorized to query the tables to sample for classification.
-
-1. If you haven't already, [configure Azure AD authentication with Azure SQL](/azure/azure-sql/database/authentication-aad-configure).
-1. Create an Azure AD user in Azure SQL Database with the exact managed identity from Microsoft Purview. Follow the steps in [Create the service principal user in Azure SQL Database](/azure/azure-sql/database/authentication-aad-service-principal-tutorial#create-the-service-principal-user-in-azure-sql-database).
-1. Assign proper permission (for example: `db_datareader`) to the identity. Here's example SQL syntax to create the user and grant permission:
-
- ```sql
- CREATE USER [Username] FROM EXTERNAL PROVIDER
- GO
-
- EXEC sp_addrolemember 'db_datareader', [Username]
- GO
- ```
-
- > [!Note]
- > The `[Username]` value is your managed identity name from Microsoft Purview. You can [read more about fixed-database roles and their capabilities](/sql/relational-databases/security/authentication-access/database-level-roles#fixed-database-roles).
-
-### Configure portal authentication
-
-It's important to give your Microsoft Purview account's system-assigned managed identity or [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) the permission to scan the SQL database. You can add the SAMI or UAMI at the subscription, resource group, or resource level, depending on the breadth of the scan.
-
-> [!Note]
-> To add a managed identity on an Azure resource, you need to be an owner of the subscription.
-
-1. From the [Azure portal](https://portal.azure.com), find the subscription, resource group, or resource (for example, a SQL database) that the catalog should scan.
-
-1. Select **Access control (IAM)** on the left menu, and then select **+ Add** > **Add role assignment**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sql-ds.png" alt-text="Screenshot that shows selections for adding a role assignment for access control.":::
-
-1. Set **Role** to **Reader**. In the **Select** box, enter your Microsoft Purview account name or UAMI. Then, select **Save** to give this role assignment to your Microsoft Purview account.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-access-managed-identity.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account.":::
-
-# [Service principal](#tab/service-principal)
-
-### Create a new service principal
-
-If you don't have a service principal, you can follow the [service principal guide](./create-service-principal-azure.md) to create one.
-
-> [!NOTE]
-> To create a service principal, you must register an application in your Azure AD tenant. If you don't have the required access, your Azure AD Global Administrator or Application Administrator can perform this operation.
-
-### Grant the service principal access to your SQL database
-
-The service principal needs permission to get metadata for the database, schemas, and tables. It must also be authorized to query the tables to sample for classification.
-
-1. If you haven't already, [configure Azure AD authentication with Azure SQL](/azure/azure-sql/database/authentication-aad-configure).
-1. Create an Azure AD user in Azure SQL Database with your service principal. Follow the steps in [Create the service principal user in Azure SQL Database](/azure/azure-sql/database/authentication-aad-service-principal-tutorial#create-the-service-principal-user-in-azure-sql-database).
-1. Assign proper permission (for example: `db_datareader`) to the identity. Here's example SQL syntax to create the user and grant permission:
-
- ```sql
- CREATE USER [Username] FROM EXTERNAL PROVIDER
- GO
-
- EXEC sp_addrolemember 'db_datareader', [Username]
- GO
- ```
-
- > [!Note]
- > The `[Username]` value is your own service principal's name. You can [read more about fixed-database roles and their capabilities](/sql/relational-databases/security/authentication-access/database-level-roles#fixed-database-roles).
-
-### Create the credential
-
-1. Go to your key vault in the Azure portal.
-
-1. Select **Settings** > **Secrets**, and then select **+ Generate/Import**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-secret.png" alt-text="Screenshot that shows the key vault option to generate a secret for a service principal.":::
-
-1. For **Name**, give the secret a name of your choice.
-
-1. For **Value**, use the service principal's secret value. If you've already created a secret for your service principal, you can find its value in **Client credentials** on your secret's overview page.
-
- If you need to create a secret, you can follow the steps in the [service principal guide](create-service-principal-azure.md#adding-a-secret-to-the-client-credentials).
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sp-client-credentials.png" alt-text="Screenshot that shows the client credentials for a service principal.":::
-
-1. Select **Create** to create the secret.
-
-1. If your key vault isn't connected to Microsoft Purview yet, [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-1. [Create a new credential](manage-credentials.md#create-a-new-credential).
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-credentials.png" alt-text="Screenshot that shows the key vault option to add a credential for a service principal.":::
-
-1. For **Service Principal ID**, use the application (client) ID of your service principal.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sp-appln-id.png" alt-text="Screenshot that shows the application ID for a service principal.":::
-
-1. For **Secret name**, use the name of the secret that you created in previous steps.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sp-cred.png" alt-text="Screenshot that shows the key vault option to create a secret for a service principal.":::
---
-## Create the scan
-
-1. Open your Microsoft Purview account and select **Open Microsoft Purview governance portal**.
-1. Go to **Data map** > **Sources** to view the collection hierarchy.
-1. Select the **New Scan** icon under the SQL database that you registered earlier.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-new-scan.png" alt-text="Screenshot that shows the pane for creating a new scan.":::
-
-To learn more about data lineage in Azure SQL Database, see the [Extract lineage (preview)](#extract-lineage-preview) section of this article.
-
-For scanning steps, select your method of authentication from the following tabs.
-
-# [SQL authentication](#tab/sql-authentication)
-
-1. For **Name**, provide a name for the scan.
-
-1. For **Database selection method**, select **Enter manually**.
-
-1. For **Database name** and **Credential**, enter the values that you created earlier.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sql-auth.png" alt-text="Screenshot that shows database and credential information for the SQL authentication option to run a scan.":::
-
-1. For **Select a connection**, choose the appropriate collection for the scan.
-
-1. Select **Test connection** to validate the connection. After the connection is successful, select **Continue**.
-
-# [Managed identity](#tab/managed-identity)
-
-1. For **Name**, provide a name for the scan.
-
-1. Select the SAMI or UAMI under **Credential**, and choose the appropriate collection for the scan.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-managed-id.png" alt-text="Screenshot that shows credential and collection information for the managed identity option to run a scan.":::
-
-1. Select **Test connection**. After the connection is successful, select **Continue**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-test.png" alt-text="Screenshot that shows the message for a successful connection for the managed identity option to run a scan.":::
-
-# [Service principal](#tab/service-principal)
-
-1. For **Name**, provide a name for the scan.
-
-1. Choose the appropriate collection for the scan, and select the credential that you created earlier under **Credential**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sp.png" alt-text="Screenshot that shows collection and credential information for the service principal option to enable scanning.":::
-
-1. Select **Test connection**. After the connection is successful, select **Continue**.
---
-## Scope and run the scan
-
-1. You can scope your scan to specific database objects by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-scope-scan.png" alt-text="Screenshot that shows options for scoping a scan.":::
-
-1. Select a scan rule set. You can use the system default, choose from existing custom rule sets, or create a new rule set inline. Select **Continue** when you're finished.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-scan-rule-set.png" alt-text="Screenshot that shows options for selecting a scan rule set.":::
-
- If you select **New scan rule set**, a pane opens so that you can enter the source type, the name of the rule set, and a description. Select **Continue** when you're finished.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-new-scan-rule-set.png" alt-text="Screenshot that shows information for creating a new scan rule set.":::
-
- For **Select classification rules**, choose the classification rules that you want to include in the scan rule set, and then select **Create**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-classification.png" alt-text="Screenshot that shows a list of classification rules for a scan rule set.":::
-
- The new scan rule set then appears in the list of available rule sets.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sel-scan-rule.png" alt-text="Screenshot that shows the selection of a new scan rule set.":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
-1. Review your scan, and then select **Save and run**.
-
-### View a scan
-
-To check the status of a scan, go to the data source in the collection, and then select **View details**.
--
-The scan details indicate the progress of the scan in **Last run status**, along with the number of assets scanned and classified.
-**Last run status** is updated to **In progress** and then **Completed** after the entire scan has run successfully.
--
-### Manage a scan
-
-After you run a scan, you can use the run history to manage it:
-
-1. Under **Recent scans**, select a scan.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-manage scan.png" alt-text="Screenshot that shows the selection of a recently completed scan.":::
-
-1. In the run history, you have options for running the scan again, editing it, or deleting it.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-manage-scan-options.png" alt-text="Screenshot that shows options for running, editing, and deleting a scan.":::
-
- If you select **Run scan now** to rerun the scan, you can then choose either **Incremental scan** or **Full scan**.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-full-inc.png" alt-text="Screenshot that shows options for full or incremental scan.":::
--
-### Troubleshoot scanning
-
-If you have problems with scanning, try these tips:
-
-- Confirm that you followed all [prerequisites](#prerequisites).
-- Check the network by confirming [firewall](#update-firewall-settings), [Azure connections](#allow-azure-connections), or [integration runtime](#install-a-self-hosted-integration-runtime) settings.
-- Confirm that [authentication](#configure-authentication-for-a-scan) is properly set up.
-
-For more information, review [Troubleshoot your connections in Microsoft Purview](troubleshoot-connections.md).
-
-## Set up access policies
-
-The following types of policies are supported on this data resource from Microsoft Purview:
-- [DevOps policies](concept-policies-devops.md)
-- [Data owner policies](concept-policies-data-owner.md)
-- [Self-service policies](concept-self-service-data-access-policy.md)
-
-### Access policy prerequisites on Azure SQL Database
--
-### Configure the Microsoft Purview account for policies
--
-### Register the data source and enable Data use management
-
-The Azure SQL Database resource needs to be registered with Microsoft Purview before you can create access policies. To register your resources, follow the "Prerequisites" and "Register the data source" sections in [Enable Data use management on your Microsoft Purview sources](./register-scan-azure-sql-database.md#prerequisites).
-
-After you register the data source, you need to enable **Data use management**. This is a prerequisite before you can create policies on the data source. **Data use management** can affect the security of your data, because it delegates to certain Microsoft Purview roles that manage access to the data sources. Go through the security practices in [Enable Data use management on your Microsoft Purview sources](./how-to-enable-data-use-management.md).
-
-After your data source has the **Data use management** option set to **Enabled**, it will look like this screenshot:
-
-![Screenshot that shows the panel for registering a data source for a policy, including areas for name, server name, and data use management.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
--
-### Create a policy
-
-To create an access policy for Azure SQL Database, follow these guides:
-
-* [Provision access to system health, performance and audit information in Azure SQL Database](./how-to-policies-devops-azure-sql-db.md#create-a-new-devops-policy). Use this guide to apply a DevOps policy on a single SQL database.
-* [Provision read/modify access on a single Azure SQL Database](./how-to-policies-data-owner-azure-sql-db.md#create-and-publish-a-data-owner-policy). Use this guide to provision access on a single SQL database account in your subscription.
-* [Self-service access policies for Azure SQL Database](./how-to-policies-self-service-azure-sql-db.md). Use this guide to allow data consumers to request access to data assets by using a self-service workflow.
-
-To create policies that cover all data sources inside a resource group or Azure subscription, see [Discover and govern multiple Azure sources in Microsoft Purview](register-scan-azure-multiple-sources.md#access-policy).
---
-## Extract lineage (preview)
-<a id="lineagepreview"></a>
-
->[!NOTE]
->Lineage is not currently supported using a self-hosted integration runtime or managed VNET runtime and a private endpoint. You need to enable Azure services to access the server under network settings for your Azure SQL Database.
-
-Microsoft Purview supports lineage from Azure SQL Database. When you're setting up a scan, you turn on the **Lineage extraction** toggle to extract lineage.
-
-### Prerequisites for setting up a scan with lineage extraction
-
-1. Follow the steps in the [Configure authentication for a scan](#configure-authentication-for-a-scan) section of this article to authorize Microsoft Purview to scan your SQL database.
-
-1. Sign in to Azure SQL Database with your Azure AD account, and assign `db_owner` permissions to the Microsoft Purview managed identity.
-
- >[!NOTE]
-    > The `db_owner` permission is needed because lineage is based on XEvent sessions, so Microsoft Purview needs permission to manage the XEvent sessions in SQL.
-
- Use the following example SQL syntax to create a user and grant permission. Replace `<purview-account>` with your account name.
-
- ```sql
-    CREATE USER <purview-account> FROM EXTERNAL PROVIDER
-    GO
-    EXEC sp_addrolemember 'db_owner', <purview-account>
-    GO
- ```
-1. Run the following command on your SQL database to create a master key:
-
- ```sql
-    CREATE MASTER KEY
-    GO
- ```
-1. Ensure that **Allow Azure services and resources to access this server** is enabled under networking/firewall for your Azure SQL resource.
-
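-After the steps above, you can optionally verify the setup from the database. This is a hedged sketch: `##MS_DatabaseMasterKey##` is the fixed system name of the database master key, and `sp_helprolemember` lists the members of a database role.
-
-```sql
--- Confirm the database master key exists (created by CREATE MASTER KEY).
-SELECT name, create_date
-FROM sys.symmetric_keys
-WHERE name = '##MS_DatabaseMasterKey##';
-
--- Confirm the Microsoft Purview identity is a member of db_owner.
-EXEC sp_helprolemember 'db_owner';
-
--- Optional: list the XEvent sessions running in the database, since lineage
--- extraction is based on XEvent sessions.
-SELECT name FROM sys.dm_xe_database_sessions;
-```
-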
-### Create a scan with lineage extraction turned on
-
-1. On the pane for setting up a scan, turn on the **Enable lineage extraction** toggle.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage-extraction.png" alt-text="Screenshot that shows the pane for creating a new scan, with lineage extraction turned on." lightbox="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage-extraction-expanded.png":::
-
-2. Select your method of authentication by following the steps in the [Create the scan](#create-the-scan) section of this article.
-3. After you successfully set up the scan, a new scan type called **Lineage extraction** will run incremental scans every six hours to extract lineage from Azure SQL Database. Lineage is extracted based on the stored procedure runs in the SQL database.
-
-    :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage-extraction-runs.png" alt-text="Screenshot that shows the screen that runs lineage extraction every six hours." lightbox="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage-extraction-runs-expanded.png":::
-
- > [!Note]
-    > Turning on **Lineage extraction** triggers a daily scan.
-
-### Search Azure SQL Database assets and view runtime lineage
-
-You can [browse through the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view asset details for Azure SQL Database. The following steps describe how to view runtime lineage details:
-
-1. Go to the **Lineage** tab for the asset. When applicable, the asset lineage appears here.
--
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage.png" alt-text="Screenshot that shows lineage details from stored procedures.":::
-
-    When applicable, you can drill down further to see lineage at the SQL statement level within a stored procedure, along with column-level lineage. When you use a self-hosted integration runtime for the scan, retrieving the lineage drilldown information during the scan is supported starting with version 5.25.8374.1.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-lineage-drilldown.png" alt-text="Screenshot that shows stored procedure lineage drilldown.":::
-
- For information about supported Azure SQL Database lineage scenarios, refer to the [Supported capabilities](#supported-capabilities) section of this article. For more information about lineage in general, see [Data lineage in Microsoft Purview](concept-data-lineage.md) and [Microsoft Purview Data Catalog lineage user guide](catalog-lineage-user-guide.md).
-
-2. Go to the stored procedure asset. On the **Properties** tab, go to **Related assets** to get the latest run details of stored procedures.
-
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-stored-procedure-properties.png" alt-text="Screenshot that shows run details for stored procedure properties.":::
-
-3. Select the stored procedure hyperlink next to **Runs** to see the **Azure SQL Stored Procedure Run** overview. Go to the **Properties** tab to see enhanced runtime information from the stored procedure, such as **executedTime**, **rowCount**, and **Client Connection**.
-
-    :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-stored-procedure-run-properties.png" alt-text="Screenshot that shows run properties for a stored procedure." lightbox="media/register-scan-azure-sql-database/register-scan-azure-sql-db-stored-procedure-run-properties-expanded.png":::
-
-### Troubleshoot lineage extraction
-
-The following tips can help you solve problems related to lineage:
-
-* If no lineage is captured after a successful **Lineage extraction** run, it's possible that no stored procedures have run at least once since you set up the scan.
-* Lineage is captured for stored procedure runs that happen after a successful scan is set up. Lineage from past stored procedure runs isn't captured.
-* If your database is processing heavy workloads with lots of stored procedure runs, lineage extraction will filter only the most recent runs. Stored procedure runs early in the six-hour window, or the run instances that create heavy query load, won't be extracted. Contact support if you're missing lineage from any stored procedure runs.
-* If a stored procedure contains drop or create statements, they aren't currently captured in lineage.
-
-## Next steps
-
-To learn more about Microsoft Purview and your data, use these guides:
-
-- [Concepts for Microsoft Purview DevOps policies](concept-policies-devops.md)
-- [Understand the Microsoft Purview Data Estate Insights application](concept-insights.md)
-- [Microsoft Purview Data Catalog lineage user guide](catalog-lineage-user-guide.md)
-- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Sql Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-sql-managed-instance.md
- Title: 'Connect to and manage Azure SQL Managed Instance'
-description: This guide describes how to connect to Azure SQL Managed Instance in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure SQL Managed Instance source.
-Previously updated: 12/13/2022
-# Connect to and manage an Azure SQL Managed Instance in Microsoft Purview
-
-This article outlines how to register an Azure SQL Managed Instance, as well as how to authenticate and interact with the Azure SQL Managed Instance in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|||||||||--|
-| [Yes](#register) | [Yes](#scan)| [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](create-sensitivity-label.md)| No | Limited** | No |
-
-\** Lineage is supported if the dataset is used as a source/sink in a [Data Factory Copy activity](how-to-link-azure-data-factory.md).
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-* [Configure public endpoint in Azure SQL Managed Instance](/azure/azure-sql/managed-instance/public-endpoint-configure)
-
- > [!Note]
- > We now support scanning Azure SQL Managed Instances over the private connection using Microsoft Purview ingestion private endpoints and a self-hosted integration runtime VM.
- > For more information related to prerequisites, see [Connect to your Microsoft Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md)
-
-## Register
-
-This section describes how to register an Azure SQL Managed Instance in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-If you need to create new authentication, you need to [authorize database access to SQL Managed Instance](/azure/azure-sql/database/logins-create-manage). There are three authentication methods that Microsoft Purview supports today:
-
-- [System or user assigned managed identity](#system-or-user-assigned-managed-identity-to-register)
-- [Service Principal](#service-principal-to-register)
-- [SQL authentication](#sql-authentication-to-register)
-
-#### System or user assigned managed identity to register
-
-You can use either your Microsoft Purview system-assigned managed identity (SAMI), or a [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Microsoft Purview, like you would for any other user, group, or service principal. The Microsoft Purview system-assigned managed identity is created automatically when the account is created and has the same name as your Microsoft Purview account. A user-assigned managed identity is a resource that can be created independently. To create one, you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-
-You can find your managed identity Object ID in the Azure portal by following these steps:
-
-For the Microsoft Purview account's system-assigned managed identity:
-1. Open the Azure portal, and navigate to your Microsoft Purview account.
-1. Select the **Properties** tab on the left side menu.
-1. Select the **Managed identity object ID** value and copy it.
-
-For user-assigned managed identity (preview):
-1. Open the Azure portal, and navigate to your Microsoft Purview account.
-1. Select the **Managed identities** tab on the left side menu.
-1. Under user-assigned managed identities, select the intended identity to view its details.
-1. The object (principal) ID is displayed in the **Essentials** section of the overview page.
-
-Either managed identity will need permission to get metadata for the database, schemas and tables, and to query the tables for classification.
-- Create an Azure AD user in Azure SQL Managed Instance by following the prerequisites and tutorial on [Create contained users mapped to Azure AD identities](/azure/azure-sql/database/authentication-aad-configure?tabs=azure-powershell#create-contained-users-mapped-to-azure-ad-identities)
-- Assign `db_datareader` permission to the identity.
-
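-The tutorial above covers the details; as a quick hedged sketch, the contained user and role assignment look like this, where `<purview-account>` is a placeholder for your managed identity name (the same pattern the Azure SQL Database examples in this article use):
-
-```sql
--- Run in the target database on the managed instance.
-CREATE USER [<purview-account>] FROM EXTERNAL PROVIDER
-GO
-EXEC sp_addrolemember 'db_datareader', [<purview-account>]
-GO
-```
-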
-#### Service Principal to register
-
-There are several steps required before Microsoft Purview can use a service principal to scan your Azure SQL Managed Instance.
-
-#### Create or use an existing service principal
-
-To use a service principal, you can use an existing one or create a new one. If you're going to use an existing service principal, skip to the next step.
-If you have to create a new Service Principal, follow these steps:
-
- 1. Navigate to the [Azure portal](https://portal.azure.com).
- 1. Select **Azure Active Directory** from the left-hand side menu.
- 1. Select **App registrations**.
- 1. Select **+ New application registration**.
- 1. Enter a name for the **application** (the service principal name).
- 1. Select **Accounts in this organizational directory only**.
- 1. For Redirect URI, select **Web** and enter any URL you want; it doesn't have to be real or work.
- 1. Then select **Register**.
-
-#### Configure Azure AD authentication in the database account
-
-The service principal must have permission to get metadata for the database, schemas, and tables. It must also be able to query the tables to sample for classification.
-- [Configure and manage Azure AD authentication with Azure SQL](/azure/azure-sql/database/authentication-aad-configure)
-- Create an Azure AD user in Azure SQL Managed Instance by following the prerequisites and tutorial on [Create contained users mapped to Azure AD identities](/azure/azure-sql/database/authentication-aad-configure?tabs=azure-powershell#create-contained-users-mapped-to-azure-ad-identities)
-- Assign `db_datareader` permission to the identity.
-
-#### Add service principal to key vault and Microsoft Purview's credential
-
-You need the service principal's application ID and secret:
-
-1. Navigate to your Service Principal in the [Azure portal](https://portal.azure.com)
-1. Copy the values of the **Application (client) ID** from **Overview** and the **Client secret** from **Certificates & secrets**.
-1. Navigate to your key vault
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
-1. Select **Create** to complete
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan.
-
-#### SQL authentication to register
-
-> [!Note]
-> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. About **15 minutes** after you grant permission, the Microsoft Purview account should have the appropriate permissions to scan the resource(s).
-
-You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azuresqldb-current&preserve-view=true#examples-1) to create a login for Azure SQL Managed Instance if you don't have one available. You'll need the **username** and **password** for the next steps.
-
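-As a hedged sketch (the login name here is a placeholder; choose your own strong password), creating the login and granting it read access looks like this:
-
-```sql
--- Run in the master database to create the server-level login.
-CREATE LOGIN purview_reader WITH PASSWORD = '<strong-password>'
-GO
-
--- Run in each database to be scanned: map a user to the login and grant read access.
-CREATE USER purview_reader FOR LOGIN purview_reader
-GO
-EXEC sp_addrolemember 'db_datareader', purview_reader
-GO
-```
-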
-1. Navigate to your key vault in the Azure portal
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your Azure SQL Managed Instance
-1. Select **Create** to complete
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the **username** and **password** to set up your scan.
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Navigate to the **Data Map**.
-
-1. Select **Register**
-
-1. Select **Azure SQL Managed Instance** and then **Continue**.
-
-1. Select **From Azure subscription**, select the appropriate subscription from the **Azure subscription** drop-down box and the appropriate server from the **Server name** drop-down box.
-
-1. Provide the **public endpoint fully qualified domain name** and **port number**. Then select **Register** to register the data source.
-
- :::image type="content" source="media/register-scan-azure-sql-managed-instance/add-azure-sql-database-managed-instance.png" alt-text="Screenshot of register sources screen, with Name, subscription, server name, and endpoint filled out.":::
-
- For Example: `foobar.public.123.database.windows.net,3342`
-
-## Scan
-
-Follow the steps below to scan an Azure SQL Managed Instance to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, complete the following steps:
-
-1. Select the **Data Map** tab on the left pane in the Microsoft Purview governance portal.
-
-1. Select the Azure SQL Managed Instance source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source.
-
- :::image type="content" source="media/register-scan-azure-sql-managed-instance/set-up-scan-sql-mi.png" alt-text="Screenshot of new scan window, with the Purview MSI selected as the credential, but a service principal, or SQL authentication also available.":::
-
-1. You can scope your scan to specific tables by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-sql-managed-instance/scope-your-scan.png" alt-text="Screenshot of the scope your scan window, with a subset of tables selected for scanning.":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-sql-managed-instance/scan-rule-set.png" alt-text="Screenshot of scan rule set window, with the system default scan rule set selected.":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-sql-managed-instance/trigger-scan.png" alt-text="Screenshot of the set scan trigger window, with the recurring tab selected.":::
-
-1. Review your scan and select **Save and run**.
-
-If you're having trouble connecting to your data source or running your scan, see our [troubleshooting guide for scans and connections](troubleshoot-connections.md).
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Synapse Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-synapse-analytics.md
- Title: 'Connect to and manage dedicated SQL pools (formerly SQL DW)'
-description: This guide describes how to connect to dedicated SQL pools (formerly SQL DW) in Microsoft Purview, and use Microsoft Purview's features to scan and manage your dedicated SQL pools source.
- Previously updated: 01/31/2023
-# Connect to and manage dedicated SQL pools in Microsoft Purview
-
-This article outlines how to register dedicated SQL pools (formerly SQL DW), and how to authenticate and interact with dedicated SQL pools in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-> [!NOTE]
-> If you are looking to register and scan a dedicated SQL database within a Synapse workspace, you must follow instructions [here](register-scan-synapse-workspace.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|||||||||--|
-| [Yes](#register) | [Yes](#scan)| [Yes](#scan)| [Yes](#scan)| [Yes](#scan)|[Yes](create-sensitivity-label.md)| No | Limited* | No |
-
-\* *Lineage is supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-When scanning a dedicated SQL pool (formerly SQL DW) source, Microsoft Purview supports extracting technical metadata including:
-
-- Server
-- Dedicated SQL pools
-- Schemas
-- Tables including columns
-- Views including columns
-
-When setting up the scan, you can further scope it after providing the dedicated SQL pool name by selecting tables and views as needed.
-
-### Known limitations
-
-* Microsoft Purview doesn't support more than 800 columns in the Schema tab; additional columns are shown as "Additional-Columns-Truncated".
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-## Register
-
-This section describes how to register dedicated SQL pools in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-There are three ways to set up authentication:
-- [System or user assigned managed identity](#system-or-user-assigned-managed-identity-to-register) (Recommended)
-- [Service Principal](#service-principal-to-register)
-- [SQL authentication](#sql-authentication-to-register)
-
- > [!Note]
- > Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It takes about 15 minutes after granting permission before the Microsoft Purview account has the appropriate permissions to scan the resource(s).
-
-#### System or user assigned managed identity to register
-
-You can use either your Microsoft Purview system-assigned managed identity (SAMI), or a [User-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Microsoft Purview, like you would for any other user, group, or service principal. The Microsoft Purview SAMI is created automatically when the account is created. A UAMI is a resource that can be created independently, and to create one you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity). Create an Azure AD user in the dedicated SQL pool using your managed identity object name by following the prerequisites and tutorial on [Create Azure AD users using Azure AD applications](/azure/azure-sql/database/authentication-aad-service-principal-tutorial).
-
-Example SQL syntax to create the user and grant permission:
-
-```sql
-CREATE USER [PurviewManagedIdentity] FROM EXTERNAL PROVIDER
-GO
-
-EXEC sp_addrolemember 'db_datareader', [PurviewManagedIdentity]
-GO
-```
-
-The identity must have permission to get metadata for the database, schemas, and tables. It must also be able to query the tables to sample for classification. The recommendation is to assign `db_datareader` permission to the identity.
-
-#### Service Principal to register
-
-To use service principal authentication for scans, you can use an existing one or create a new one.
-
-If you need to create a new Service Principal, follow these steps:
- 1. Navigate to the [Azure portal](https://portal.azure.com).
- 1. Select **Azure Active Directory** from the left-hand side menu.
- 1. Select **App registrations**.
- 1. Select **+ New application registration**.
- 1. Enter a name for the **application** (the service principal name).
- 1. Select **Accounts in this organizational directory only**.
- 1. For Redirect URI, select **Web** and enter any URL you want; it doesn't have to be real or work.
- 1. Then select **Register**.
-
-Next, get the Service Principal's application ID and secret:
-
-1. Navigate to your Service Principal in the [Azure portal](https://portal.azure.com)
-1. Copy the values of the **Application (client) ID** from **Overview** and the **Client secret** from **Certificates & secrets**.
-1. Navigate to your key vault
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
-1. Select **Create** to complete
-1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan.
-
-##### Granting the Service Principal access
-
-You must also create an Azure AD user in the dedicated pool by following the prerequisites and tutorial on [Create Azure AD users using Azure AD applications](/azure/azure-sql/database/authentication-aad-service-principal-tutorial). Example SQL syntax to create the user and grant permission:
-
-```sql
-CREATE USER [ServicePrincipalName] FROM EXTERNAL PROVIDER
-GO
-
-ALTER ROLE db_datareader ADD MEMBER [ServicePrincipalName]
-GO
-```
-
-> [!Note]
-> Microsoft Purview will need the **Application (client) ID** and the **client secret** in order to scan.
-
-#### SQL authentication to register
-
-You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azure-sqldw-latest&preserve-view=true#examples-1) to create a login for your dedicated SQL pool (formerly SQL DW) if you don't already have one.
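For reference, a minimal sketch of the T-SQL involved, assuming a hypothetical login name and a placeholder password (the login is created in the master database, then a user is created in the dedicated SQL pool itself and granted read access, mirroring the managed-identity example earlier in this article):

```sql
-- In the master database: create a SQL login (hypothetical name; replace the placeholder password).
CREATE LOGIN [PurviewScanLogin] WITH PASSWORD = '<strong-password-here>';
GO

-- In the dedicated SQL pool: create a user for that login.
CREATE USER [PurviewScanUser] FOR LOGIN [PurviewScanLogin];
GO

-- Grant read access so the scan can read metadata and sample tables for classification.
EXEC sp_addrolemember 'db_datareader', 'PurviewScanUser';
GO
```

The password stored in your key vault secret must match the one used for this login.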
-
-When the selected authentication method is **SQL Authentication**, you need to get your password and store it in the key vault:
-
-1. Get the password for your SQL login
-1. Navigate to your key vault
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* for your SQL login
-1. Select **Create** to complete
-1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan.
-
-### Steps to register
-
-To register a new dedicated SQL pool in Microsoft Purview, complete the following steps:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On **Register sources**, select **Azure Dedicated SQL Pool (formerly SQL DW)**.
-1. Select **Continue**
-
-On the **Register sources** screen, complete the following steps:
-
-1. Enter a **Name** that the data source will be listed with in the Catalog.
-2. Choose your Azure subscription to filter down dedicated SQL pools.
-3. Select your dedicated SQL pool.
-4. Select a collection or create a new one (Optional).
-5. Select **Register** to register the data source.
-
-## Scan
-
-Follow the steps below to scan dedicated SQL pools to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, complete the following steps:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the dedicated SQL pool source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source.
-
- :::image type="content" source="media/register-scan-azure-synapse-analytics/sql-dedicated-pool-set-up-scan.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific tables by choosing the appropriate items in the list.
-
- :::image type="content" source="media/register-scan-azure-synapse-analytics/scope-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-azure-synapse-analytics/select-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-azure-synapse-analytics/trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Cassandra Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-cassandra-source.md
- Title: Connect to and manage Cassandra
-description: This guide describes how to connect to Cassandra in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Cassandra source.
- Previously updated: 07/18/2023
-# Connect to and manage Cassandra in Microsoft Purview
-
-This article outlines how to register Cassandra, and how to authenticate and interact with Cassandra in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register) | [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes](#lineage)| No |
-
-The supported Cassandra server versions are 3.*x* or 4.*x*.
-
-When scanning a Cassandra source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Cluster
- - Keyspaces
- - Tables including the columns and indexes
- - Materialized views including the columns
-
-- Fetching static lineage on asset relationships among tables and materialized views.
-
-When setting up a scan, you can choose to scan an entire Cassandra instance, or scope the scan to a subset of keyspaces matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-Currently, when an object is deleted from the data source, the subsequent scan won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self hosted integration runtime to connect to it**.
--- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
- - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-## Register
-
-This section describes how to register Cassandra in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new Cassandra server in your data catalog:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left pane.
-1. Select **Register**.
-1. On the **Register sources** screen, select **Cassandra**, and then select **Continue**:
-
- :::image type="content" source="media/register-scan-cassandra-source/register-sources.png" alt-text="Screenshot that shows the Register sources screen." border="true":::
-
-1. On the **Register sources (Cassandra)** screen:
-
- 1. Enter a **Name**. The data source will use this name in the
- catalog.
- 1. In the **Host** box, enter the server address where the Cassandra server is running. For example, 20.190.193.10.
- 1. In the **Port** box, enter the port used by the Cassandra server.
- 1. Select a collection or create a new one (optional).
- :::image type="content" source="media/register-scan-cassandra-source/configure-sources.png" alt-text="Screenshot that shows the Register sources (Cassandra) screen." border="true":::
- 1. Select **Register**.
-
-## Scan
-
-Follow the steps below to scan Cassandra to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
- 1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered Cassandra server.
-
-1. Select **New scan**.
-
-1. Provide the following details.
-
- 1. **Name**: Specify a name for the scan.
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Credential**: When you configure the Cassandra credentials, be sure
- to:
-
- * Select **Basic Authentication** as the authentication method.
- * In the **User name** box, provide the name of the user you're making the connection for.
- * In the key vault's secret, save the password of the Cassandra user you're making the connection for.
-
- For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
- 1. **Keyspaces**: Specify a list of Cassandra keyspaces to import. Multiple keyspaces must be separated with semicolons. For example, keyspace1; keyspace2. When the list is empty, all available keyspaces are imported.
-
- You can use keyspace name patterns that use SQL LIKE expression syntax, including %.
-
- For example: A%; %B; %C%; D
-
- This expression means:
- * Starts with A or
- * Ends with B or
- * Contains C or
- * Equals D
-
- You can't use NOT or special characters.
-
- 1. **Use Secure Sockets Layer (SSL)**: Select **True** or **False** to specify whether
- to use Secure Sockets Layer (SSL) when connecting to the
- Cassandra server. By default, this option is set to **False**.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Specify the maximum memory (in GB) available on your VM to be used for scanning processes. This value depends on the size of Cassandra server to be scanned.
- :::image type="content" source="media/register-scan-cassandra-source/scan.png" alt-text="scan Cassandra source" border="true":::
-
-1. Select **Test connection** to validate the settings.
-
-1. Select **Continue**.
-
-1. Select a **scan trigger**. You can set up a schedule or run the
- scan once.
-
-1. Review your scan, and then select **Save and Run**.
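The keyspace name patterns accepted in the **Keyspaces** setting above follow SQL LIKE semantics. As an illustrative sketch only (the table and column names are hypothetical, not something Microsoft Purview runs), the example list `A%; %B; %C%; D` behaves like this LIKE filter:

```sql
-- Illustrative only: how the pattern list A%; %B; %C%; D would read as a SQL LIKE filter.
SELECT keyspace_name
FROM hypothetical_keyspaces
WHERE keyspace_name LIKE 'A%'    -- starts with A
   OR keyspace_name LIKE '%B'    -- ends with B
   OR keyspace_name LIKE '%C%'   -- contains C
   OR keyspace_name = 'D';      -- equals D
```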
-## Lineage
-
-After scanning your Cassandra source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Cassandra lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Db2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-db2.md
- Title: Connect to and manage Db2
-description: This guide describes how to connect to Db2 in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Db2 source.
- Previously updated: 04/20/2023
-# Connect to and manage Db2 in Microsoft Purview
-
-This article outlines how to register Db2, and how to authenticate and interact with Db2 in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes](#lineage)| No |
-The supported IBM Db2 versions are Db2 for LUW 9.7 to 11.x. Db2 for z/OS (mainframe) and iSeries (AS/400) aren't currently supported.
-
-When scanning an IBM Db2 source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Databases
- - Schemas
- - Tables including the columns, foreign keys, indexes, and constraints
- - Views including the columns
- - Triggers
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire Db2 database, or scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-Currently, when an object is deleted from the data source, the subsequent scan won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.12.7984.1.
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * Download the [Db2 JDBC driver](https://www.ibm.com/support/pages/db2-jdbc-driver-versions-and-downloads) on the machine where your self-hosted integration runtime is running. Note down the folder path which you will use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
-* The Db2 user must have the CONNECT permission. Microsoft Purview connects to the syscat tables in IBM Db2 environment when importing metadata.
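A minimal sketch of the Db2 grant, assuming a hypothetical user name for the scan credential (the exact user depends on your environment):

```sql
-- Hypothetical scan user: grant the CONNECT authority that Microsoft Purview needs on the database.
GRANT CONNECT ON DATABASE TO USER purview_scan_user;
```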
-
-## Register
-
-This section describes how to register Db2 in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new Db2 source in your data catalog, do the following:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On Register sources, select **Db2**. Select **Continue**.
-
-On the **Register sources (Db2)** screen, do the following:
-
-1. Enter a **Name** that the data source will be listed within the Catalog.
-
-1. Enter the **Server** name to connect to a Db2 source. This can either be:
- * A host name used to connect to the database server. For example: `MyDatabaseServer.com`
- * An IP address. For example: `192.169.1.2`
-
-1. Enter the **Port** used to connect to the database server (446 by default for Db2).
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Register** to register the data source.
-
- :::image type="content" source="media/register-scan-db2/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan Db2 to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for a Db2 source is **Basic authentication**.
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered Db2 source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to the database server in the User name input field.
- * Store the user password used to connect to the database server in the secret key.
-
- 1. **Database**: The name of the database instance to import.
-
- 1. **Schema**: A subset of schemas to import, expressed as a semicolon-separated list. For example, `schema1; schema2`. All user schemas are imported if that list is empty. All system schemas (for example, SysAdmin) and objects are ignored by default.
-
- Acceptable schema name patterns using SQL LIKE expressions syntax include using %. For example: `A%; %B; %C%; D`
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters isn't acceptable.
-
- 1. **Driver location**: Specify the path to the JDBC driver on the machine where your self-hosted integration runtime is running, for example, `D:\Drivers\Db2`. It's the path to a valid JAR folder location. Make sure the driver is accessible by the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Maximum memory available**: Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This depends on the size of the Db2 source to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-db2/scan.png" alt-text="scan Db2" border="true":::
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
-## Lineage
-
-After scanning your Db2 source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Db2 lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Erwin Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-erwin-source.md
- Title: Connect to and manage erwin Mart servers
-description: This guide describes how to connect to erwin Mart servers in Microsoft Purview, and use Microsoft Purview's features to scan and manage your erwin Mart server source.
- Previously updated: 04/20/2023
-# Connect to and manage erwin Mart servers in Microsoft Purview
-
-This article outlines how to register erwin Mart servers, and how to authenticate and interact with erwin Mart Servers in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes](#lineage)| No |
-
-The supported erwin Mart versions are 9.x to 2021.
-
-When scanning an erwin Mart source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Mart
- - Libraries
- - Models
- - Entities including the attributes, foreign keys, indexes, index members, candidate keys, and triggers
- - Default values
- - Synonyms
- - Sequences
- - Domains
- - Subject areas
- - Relationships
- - Validation rules including the valid values
- - ER diagrams
- - Views including the attributes
- - Stored procedures including the parameters
- - Schemas
- - Subtype relationships
- - View relationship
- - User defined properties
-
-- Fetching static lineage on asset relationships among entities, views, and stored procedures.
-
-When setting up a scan, you can choose to scan an entire erwin Mart server, or scope the scan to a list of models matching the given name(s).
-
-### Known limitations
-
-Currently, when an object is deleted from the data source, the subsequent scan won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- > [!IMPORTANT]
- > Make sure to install the self-hosted integration runtime and the Erwin Data Modeler software on the same machine where erwin Mart instance is running.
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-## Register
-
-This section describes how to register erwin Mart servers in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-The only supported authentication for an erwin Mart source is **Server Authentication** in the form of username and password.
-
-### Steps to register
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On Register sources, select **erwin**. Select **Continue.**
- :::image type="content" source="media/register-scan-erwin-source/register-sources.png" alt-text="register erwin source" border="true":::
-
-On the Register sources (erwin) screen, do the following:
-
-1. Enter a **Name** by which the data source will be listed in the Catalog.
-1. Enter the erwin Mart **Server name**. This is the network host name used to connect to the erwin Mart server. For example, `localhost`.
-1. Enter the **Port** number used when connecting to erwin Mart. By default, this value is 18170.
-1. Enter the **Application name**.
-
- >[!Note]
- > The above details can be found by navigating to your erwin Data Modeler. Select Mart -\> Connect to see details related to server name, port and application name.
-
- :::image type="content" source="media/register-scan-erwin-source/erwin-details.png" alt-text="find erwin details" border="true":::
-
-1. Select a collection or create a new one (Optional)
-
-1. Finish registering the data source.
-
- :::image type="content" source="media/register-scan-erwin-source/register-erwin.png" alt-text="register source" border="true":::
-
-## Scan
-
-Follow the steps below to scan erwin Mart servers to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up on the VM where the erwin Mart instance is running. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-1. Navigate to **Sources**.
-
-1. Select the registered **erwin** Mart.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured
- self-hosted integration runtime.
-
-    1. **Server name**, **Port**, and **Application name** are auto-populated based on the values entered during registration.
-
- 1. **Credential:** Select the credential configured to connect to your erwin Mart server. While creating a credential, make sure to:
- * Select **Basic Authentication** as the Authentication method
- * Provide your erwin Mart server's username in the User name field.
- * Save your user password for server authentication in the key vault's secret.
-
-    For more information about credentials, see [Manage credentials](manage-credentials.md).
-
-    1. **Use Internet Information Services (IIS)** - Select True or False to indicate whether Microsoft Internet Information Services (IIS) must be used when connecting to the erwin Mart server. By default, this value is set to False.
-
-    1. **Use Secure Sockets Layer (SSL)** - Select True or False to indicate whether Secure Sockets Layer (SSL) must be used when connecting to the erwin Mart server. By default, this value is set to False.
-
- > [!Note]
- > This parameter is only applicable for erwin Mart version 9.1 or later.
-
-    1. **Models** - Scope your scan by providing a semicolon-separated list of erwin model locator strings. For example, `mart://Mart/Samples/eMovies;mart://Mart/Erwin_Tutorial/AP_Physical`.
-
-    1. **Maximum memory available**: Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This depends on the size of the erwin Mart to be scanned.
-
- :::image type="content" source="media/register-scan-erwin-source/setup-scan.png" alt-text="trigger scan" border="true":::
-
-1. Select **Test connection.**
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
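The semicolon-separated **Models** value used to scope the scan above can be split into individual erwin model locator strings. A minimal sketch (the helper name is illustrative only, not part of any Purview API):

```python
def parse_model_locators(models: str) -> list[str]:
    """Split a semicolon-separated list of erwin mart:// model
    locator strings, ignoring empty entries and stray whitespace.
    Illustrative helper only -- not part of the Purview API."""
    return [m.strip() for m in models.split(";") if m.strip()]

locators = parse_model_locators(
    "mart://Mart/Samples/eMovies;mart://Mart/Erwin_Tutorial/AP_Physical")
print(locators)
# -> ['mart://Mart/Samples/eMovies', 'mart://Mart/Erwin_Tutorial/AP_Physical']
```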
--
-## Lineage
-
-After scanning your erwin source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported erwin lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Google Bigquery Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-google-bigquery-source.md
- Title: Connect to and manage Google BigQuery projects
-description: This guide describes how to connect to Google BigQuery projects in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Google BigQuery source.
----- Previously updated : 04/20/2023---
-# Connect to and manage Google BigQuery projects in Microsoft Purview
-
-This article outlines how to register Google BigQuery projects, and how to authenticate and interact with Google BigQuery in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes](#lineage)| No |
--
-When scanning a Google BigQuery source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Projects
- - Datasets
- - Tables including the columns
- - Views including the columns
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire Google BigQuery project, or scope the scan to a subset of datasets matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-- Currently, Microsoft Purview only supports scanning Google BigQuery datasets in the US multi-regional location. If the specified dataset is in another location, for example us-east1 or EU, the scan completes but no assets show up in Microsoft Purview.
-- When an object is deleted from the data source, the subsequent scan currently doesn't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * Download and unzip the [BigQuery JDBC driver](https://cloud.google.com/bigquery/providers/simba-drivers) on the machine where your self-hosted integration runtime is running. Note down the folder path which you will use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
-## Register
-
-This section describes how to register a Google BigQuery project in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register.**
-1. On Register sources, select **Google BigQuery**. Select **Continue**.
-
- :::image type="content" source="media/register-scan-google-bigquery-source/register-sources.png" alt-text="register BigQuery source" border="true":::
-
-On the Register sources (Google BigQuery) screen, do the following:
-
-1. Enter a **Name** by which the data source will be listed in the Catalog.
-
-1. Enter the **ProjectID**. This should be a fully qualified project ID. For example, `mydomain.com:myProject`.
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Register**.
-
- :::image type="content" source="media/register-scan-google-bigquery-source/configure-sources.png" alt-text="configure BigQuery source" border="true":::
-
-## Scan
-
-Follow the steps below to scan a Google BigQuery project to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md).
-
-1. Navigate to **Sources**.
-
-1. Select the registered **BigQuery** project.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime
-
- 1. **Credential**: While configuring BigQuery credential, make sure to:
-
- * Select **Basic Authentication** as the Authentication method
-      * Provide the email ID of the service account in the User name field. For example, `xyz@developer.gserviceaccount.com`
-      * Follow the steps below to generate the private key; copy the entire JSON key file, then store it as the value of a Key Vault secret.
-
- To create a new private key from Google's cloud platform:
-        1. In the navigation menu, select IAM & Admin -\> Service Accounts, and select a project.
- 1. Select the email address of the service account that you want to create a key for.
- 1. Select the **Keys** tab.
- 1. Select the **Add key** drop-down menu, then select Create new key.
- 1. Choose JSON format.
-
- > [!Note]
- > The contents of the private key are saved in a temp file on the VM when scanning processes are running. This temp file is deleted after the scans are successfully completed. In the event of a scan failure, the system will continue to retry until success. Please make sure access is appropriately restricted on the VM where SHIR is running.
-
-      For more information about credentials, see [Manage credentials](manage-credentials.md).
-
-    1. **Driver location**: Specify the path to the JDBC driver location on the machine where the self-hosted integration runtime is running, e.g. `D:\Drivers\GoogleBigQuery`. It's the path to a valid JAR folder location. Make sure the driver is accessible by the self-hosted integration runtime; learn more from the [prerequisites section](#prerequisites).
-
- 1. **Dataset**: Specify a list of BigQuery datasets to import.
- For example, dataset1; dataset2. When the list is empty, all available datasets are imported.
-      Acceptable dataset name patterns use SQL LIKE expression syntax with the % wildcard.
-
-      For example, the pattern list `A%; %B; %C%; D` matches dataset names that:
-      * start with A, or
-      * end with B, or
-      * contain C, or
-      * equal D
-
-      Usage of NOT and special characters isn't acceptable.
-
- 1. **Maximum memory available**: Maximum memory (in GB) available on your VM to be used by scanning processes. This is dependent on the size of Google BigQuery project to be scanned.
-
- :::image type="content" source="media/register-scan-google-bigquery-source/scan.png" alt-text="scan BigQuery source" border="true":::
-
-1. Select **Test connection.**
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
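The dataset scoping patterns described in the scan settings above can be sketched in a few lines. This is a hypothetical helper (not the connector's actual implementation) that assumes only the documented semantics: a semicolon-separated list of SQL LIKE-style patterns where % matches any sequence of characters:

```python
import re

def matches_dataset_pattern(name: str, patterns: str) -> bool:
    """Check a dataset name against a semicolon-separated list of
    SQL LIKE-style patterns (% = any sequence of characters).
    Illustrative only -- not part of the Purview API."""
    for pattern in patterns.split(";"):
        pattern = pattern.strip()
        if not pattern:
            continue
        # Escape literal characters, then turn each % into the regex wildcard .*
        regex = "^" + ".*".join(re.escape(p) for p in pattern.split("%")) + "$"
        if re.match(regex, name):
            return True
    return False

print(matches_dataset_pattern("Alpha", "A%; %B; %C%; D"))  # starts with A -> True
print(matches_dataset_pattern("xyz", "A%; %B; %C%; D"))    # no match -> False
```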
--
-## Lineage
-
-After scanning your Google BigQuery source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Google BigQuery lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Hdfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-hdfs.md
- Title: Connect to and manage HDFS
-description: This guide describes how to connect to HDFS in Microsoft Purview, and use Microsoft Purview's features to scan and manage your HDFS source.
----- Previously updated : 08/03/2022---
-# Connect to and manage HDFS in Microsoft Purview
-
-This article outlines how to register Hadoop Distributed File System (HDFS), and how to authenticate and interact with HDFS in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | No| No | No | No|
-
-When scanning an HDFS source, Microsoft Purview supports extracting technical metadata including:
-
-- Namenode
-- Folders
-- Files
-- Resource sets
-
-When setting up a scan, you can choose to scan the entire HDFS or selected folders. Learn about the supported file formats [here](microsoft-purview-connector-overview.md#file-types-supported-for-scanning).
-
-The connector uses the *webhdfs* protocol to connect to HDFS and retrieve metadata. The MapR Hadoop distribution isn't supported.
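The *webhdfs* REST endpoints the connector relies on follow a fixed URL shape. A minimal sketch of building one such request URL (the host name and port are placeholders, and no network call is made):

```python
from urllib.parse import urlencode

def webhdfs_url(namenode: str, port: int, path: str, op: str,
                scheme: str = "http") -> str:
    """Build a webhdfs REST URL of the form
    <scheme>://<host>:<port>/webhdfs/v1<path>?op=<OP>.
    The host and port used below are placeholders, not a real cluster."""
    return f"{scheme}://{namenode}:{port}/webhdfs/v1{path}?{urlencode({'op': op})}"

# List the contents of a (hypothetical) directory on the NameNode:
print(webhdfs_url("namenodeserver.com", 50070, "/user/hive", "LISTSTATUS"))
# -> http://namenodeserver.com:50070/webhdfs/v1/user/hive?op=LISTSTATUS
```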
-
-## Prerequisites
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-- Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimal supported self-hosted integration runtime version is 5.20.8235.2.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
- * Ensure JRE or OpenJDK is installed on the self-hosted integration runtime machine for parsing Parquet and ORC files. Learn more from [here](manage-integration-runtimes.md#java-runtime-environment-installation).
- * To set up your environment to enable Kerberos authentication, see the [Use Kerberos authentication for the HDFS connector](#use-kerberos-authentication-for-the-hdfs-connector) section.
-
-## Register
-
-This section describes how to register HDFS in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new HDFS source in your data catalog, follow these steps:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On Register sources, select **HDFS**. Select **Continue**.
-
-On the **Register sources (HDFS)** screen, follow these steps:
-
-1. Enter a **Name** by which the data source will be listed in the Catalog.
-
-1. Enter the **Cluster URL** of the HDFS NameNode in the form of `https://<namenode>:<port>` or `http://<namenode>:<port>`, for example `https://namenodeserver.com:50470` or `http://namenodeserver.com:50070`.
-
-1. Select a collection or create a new one (Optional)
-
-1. Finish registering the data source.
-
- :::image type="content" source="media/register-scan-hdfs/register-sources.png" alt-text="Screenshot of HDFS source registration in Purview." border="true":::
-
-## Scan
-
-Follow the steps below to scan HDFS to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for an HDFS source is **Kerberos authentication**.
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered HDFS source.
-
-1. Select **+ New scan**.
-
-1. On the "**Scan *source_name***" page, provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime. See setup requirements in [Prerequisites](#prerequisites) section.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Kerberos Authentication** while creating a credential.
- * Provide the user name in the format of `<username>@<domain>.com` in the User name input field. Learn more from [Use Kerberos authentication for the HDFS connector](#use-kerberos-authentication-for-the-hdfs-connector).
- * Store the user password used to connect to HDFS in the secret key.
-
- :::image type="content" source="media/register-scan-hdfs/scan.png" alt-text="Screenshot of HDFS scan configurations in Purview." border="true":::
-
-1. Select **Test connection**.
-
-1. Select **Continue**.
-
-1. On the "**Scope your scan**" page, select the path(s) that you want to scan.
-
-1. On the "**Select a scan rule set**" page, select the scan rule set you want to use for schema extraction and classification. You can choose between the system default, existing custom rule sets, or create a new rule set inline. Learn more from [Create a scan rule set](create-a-scan-rule-set.md).
-
-1. On the "**Set a scan trigger**" page, choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Use Kerberos authentication for the HDFS connector
-
-There are two options for setting up the on-premises environment to use Kerberos authentication for the HDFS connector. You can choose the one that better fits your situation.
-* Option 1: [Join a self-hosted integration runtime machine in the Kerberos realm](#kerberos-join-realm)
-* Option 2: [Enable mutual trust between the Windows domain and the Kerberos realm](#kerberos-mutual-trust)
-
-For either option, make sure you turn on webhdfs for Hadoop cluster:
-
-1. Create the HTTP principal and keytab for webhdfs.
-
- > [!IMPORTANT]
- > The HTTP Kerberos principal must start with "**HTTP/**" according to Kerberos HTTP SPNEGO specification. Learn more from [here](https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#HDFS_Configuration_Options).
-
- ```bash
- Kadmin> addprinc -randkey HTTP/<namenode hostname>@<REALM.COM>
- Kadmin> ktadd -k /etc/security/keytab/spnego.service.keytab HTTP/<namenode hostname>@<REALM.COM>
- ```
-
-2. HDFS configuration options: add the following three properties in `hdfs-site.xml`.
- ```xml
- <property>
- <name>dfs.webhdfs.enabled</name>
- <value>true</value>
- </property>
- <property>
- <name>dfs.web.authentication.kerberos.principal</name>
- <value>HTTP/_HOST@<REALM.COM></value>
- </property>
- <property>
- <name>dfs.web.authentication.kerberos.keytab</name>
- <value>/etc/security/keytab/spnego.service.keytab</value>
- </property>
- ```
-
-### <a name="kerberos-join-realm"></a>Option 1: Join a self-hosted integration runtime machine in the Kerberos realm
-
-#### Requirements
-
-* The self-hosted integration runtime machine needs to join the Kerberos realm and can't join any Windows domain.
-
-#### How to configure
-
-**On the KDC server:**
-
-Create a principal, and specify the password.
-
-> [!IMPORTANT]
-> The username should not contain the hostname.
-
-```bash
-Kadmin> addprinc <username>@<REALM.COM>
-```
-
-**On the self-hosted integration runtime machine:**
-
-1. Run the Ksetup utility to configure the Kerberos Key Distribution Center (KDC) server and realm.
-
- The machine must be configured as a member of a workgroup, because a Kerberos realm is different from a Windows domain. You can achieve this configuration by setting the Kerberos realm and adding a KDC server by running the following commands. Replace *REALM.COM* with your own realm name.
-
- ```cmd
- C:> Ksetup /setdomain REALM.COM
- C:> Ksetup /addkdc REALM.COM <your_kdc_server_address>
- ```
-
- After you run these commands, restart the machine.
-
-2. Verify the configuration with the `Ksetup` command. The output should be like:
-
- ```cmd
- C:> Ksetup
- default realm = REALM.COM (external)
- REALM.com:
- kdc = <your_kdc_server_address>
- ```
-
-**In your Purview account:**
-
-* Configure a credential with Kerberos authentication type with your Kerberos principal name and password to scan the HDFS. For configuration details, check the credential setting part in [Scan section](#scan).
-
-### <a name="kerberos-mutual-trust"></a>Option 2: Enable mutual trust between the Windows domain and the Kerberos realm
-
-#### Requirements
-
-* The self-hosted integration runtime machine must join a Windows domain.
-* You need permission to update the domain controller's settings.
-
-#### How to configure
-
-> [!NOTE]
-> Replace REALM.COM and AD.COM in the following tutorial with your own realm name and domain controller.
-
-**On the KDC server:**
-
-1. Edit the KDC configuration in the *krb5.conf* file to let KDC trust the Windows domain by referring to the following configuration template. By default, the configuration is located at */etc/krb5.conf*.
-
- ```config
- [logging]
- default = FILE:/var/log/krb5libs.log
- kdc = FILE:/var/log/krb5kdc.log
- admin_server = FILE:/var/log/kadmind.log
-
- [libdefaults]
- default_realm = REALM.COM
- dns_lookup_realm = false
- dns_lookup_kdc = false
- ticket_lifetime = 24h
- renew_lifetime = 7d
- forwardable = true
-
- [realms]
- REALM.COM = {
- kdc = node.REALM.COM
- admin_server = node.REALM.COM
- }
- AD.COM = {
- kdc = windc.ad.com
- admin_server = windc.ad.com
- }
-
- [domain_realm]
- .REALM.COM = REALM.COM
- REALM.COM = REALM.COM
- .ad.com = AD.COM
- ad.com = AD.COM
-
- [capaths]
- AD.COM = {
- REALM.COM = .
- }
- ```
-
- After you configure the file, restart the KDC service.
-
-2. Prepare a principal named *krbtgt/REALM.COM\@AD.COM* in the KDC server with the following command:
-
- ```cmd
- Kadmin> addprinc krbtgt/REALM.COM@AD.COM
- ```
-
-3. In the HDFS service configuration, add the rule `RULE:[1:$1@$0](.*@AD.COM)s/@.*//` to the *hadoop.security.auth_to_local* property.
-
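The rule added in the step above strips the @AD.COM realm suffix so that a cross-realm Windows principal maps to a short local user name. A simplified sketch of its effect (not Hadoop's full auth_to_local rule engine):

```python
import re

def apply_auth_to_local(principal: str) -> str:
    """Mimic RULE:[1:$1@$0](.*@AD.COM)s/@.*// -- principals in the
    AD.COM realm have their @-realm suffix stripped; anything else
    passes through. Simplified illustration only."""
    if re.fullmatch(r".*@AD\.COM", principal):
        return re.sub(r"@.*", "", principal)
    return principal

print(apply_auth_to_local("alice@AD.COM"))    # -> alice
print(apply_auth_to_local("hdfs@REALM.COM"))  # -> hdfs@REALM.COM (unchanged)
```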
-**On the domain controller:**
-
-1. Run the following `Ksetup` commands to add a realm entry:
-
- ```cmd
- C:> Ksetup /addkdc REALM.COM <your_kdc_server_address>
- C:> ksetup /addhosttorealmmap HDFS-service-FQDN REALM.COM
- ```
-
-2. Establish trust from the Windows domain to the Kerberos realm. [password] is the password for the principal *krbtgt/REALM.COM\@AD.COM*.
-
- ```cmd
- C:> netdom trust REALM.COM /Domain: AD.COM /add /realm /password:[password]
- ```
-
-3. Select the encryption algorithm that's used in Kerberos.
-
- 1. Select **Server Manager** > **Group Policy Management** > **Domain** > **Group Policy Objects** > **Default or Active Domain Policy**, and then select **Edit**.
-
- 1. On the **Group Policy Management Editor** pane, select **Computer Configuration** > **Policies** > **Windows Settings** > **Security Settings** > **Local Policies** > **Security Options**, and then configure **Network security: Configure Encryption types allowed for Kerberos**.
-
- 1. Select the encryption algorithm you want to use when you connect to the KDC server. You can select all the options.
-
- :::image type="content" source="media/register-scan-hdfs/config-encryption-types-for-kerberos.png" alt-text="Screenshot of the Network security: Configure encryption types allowed for Kerberos pane.":::
-
- 1. Use the `Ksetup` command to specify the encryption algorithm to be used on the specified realm.
-
- ```cmd
- C:> ksetup /SetEncTypeAttr REALM.COM DES-CBC-CRC DES-CBC-MD5 RC4-HMAC-MD5 AES128-CTS-HMAC-SHA1-96 AES256-CTS-HMAC-SHA1-96
- ```
-
-4. Create the mapping between the domain account and the Kerberos principal, so that you can use the Kerberos principal in the Windows domain.
-
- 1. Select **Administrative tools** > **Active Directory Users and Computers**.
-
- 1. Configure advanced features by selecting **View** > **Advanced Features**.
-
- 1. On the **Advanced Features** pane, right-click the account to which you want to create mappings and, on the **Name Mappings** pane, select the **Kerberos Names** tab.
-
- 1. Add a principal from the realm.
-
- :::image type="content" source="media/register-scan-hdfs/map-security-identity.png" alt-text="Screenshot of the Security Identity Mapping pane.":::
-
-**On the self-hosted integration runtime machine:**
-
-* Run the following `Ksetup` commands to add a realm entry.
-
- ```cmd
- C:> Ksetup /addkdc REALM.COM <your_kdc_server_address>
- C:> ksetup /addhosttorealmmap HDFS-service-FQDN REALM.COM
- ```
-
-**In your Purview account:**
-
-* Configure a credential with Kerberos authentication type with your Kerberos principal name and password to scan the HDFS. For configuration details, check the credential setting part in [Scan section](#scan).
-
-## Known limitations
-
-Currently, the HDFS connector doesn't support custom resource set pattern rules for [advanced resource sets](concept-resource-sets.md#advanced-resource-sets); the built-in resource set patterns are applied.
-
-[Sensitivity label](create-sensitivity-label.md) is not yet supported.
-
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Search Data Catalog](how-to-search-catalog.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
purview Register Scan Hive Metastore Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-hive-metastore-source.md
- Title: Connect to and manage Hive Metastore databases
-description: This guide describes how to connect to Hive Metastore databases in Microsoft Purview, and how to use Microsoft Purview to scan and manage your Hive Metastore database source.
----- Previously updated : 06/08/2023---
-# Connect to and manage Hive Metastore databases in Microsoft Purview
-
-This article outlines how to register Hive Metastore databases, and how to authenticate and interact with Hive Metastore databases in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes*](#lineage) | No |
-
-\* *Besides the lineage on assets within the data source, lineage is also supported if the dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or a [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-The supported Hive versions are 2.x to 3.x. The supported platforms are Apache Hadoop, Cloudera, and Hortonworks. To scan Azure Databricks, use the [Azure Databricks connector](register-scan-azure-databricks.md) instead, which is more compatible and user friendly.
-
-When scanning a Hive Metastore source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Databases
- - Tables including the columns, foreign keys, unique constraints, and storage description
- - Views including the columns and storage description
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire Hive Metastore database, or scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently doesn't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md).
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure that Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the machine where the self-hosted integration runtime is running. If you don't have this update installed, [download it now](/cpp/windows/latest-supported-vc-redist).
-
- * Download the Hive Metastore database's JDBC driver on the machine where your self-hosted integration runtime is running. For example, if the database is *mssql*, download [Microsoft's JDBC driver for SQL Server](/sql/connect/jdbc/download-microsoft-jdbc-driver-for-sql-server). Note down the folder path that you'll use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
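As a quick sanity check before configuring the scan, you can confirm the driver folder is readable and actually contains a JAR. This is only an illustrative helper (the folder path is an example); it checks access for the account running the script, not specifically for the "NT SERVICE\DIAHostService" service account:

```python
import os

def driver_jars(folder: str) -> list:
    """List .jar files in the JDBC driver folder, verifying the folder is
    readable and listable. Note: this checks the account running the script,
    not the integration runtime's service account."""
    if not os.access(folder, os.R_OK | os.X_OK):
        raise PermissionError(f"Cannot read or list {folder}")
    return sorted(f for f in os.listdir(folder) if f.lower().endswith(".jar"))
```

Running it over, for example, `D:\Drivers\HiveMetastore` should list the JDBC driver JAR you downloaded.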
-
-## Register
-
-This section describes how to register a Hive Metastore database in Microsoft Purview by using [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-The only supported authentication for a Hive Metastore database is Basic Authentication.
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
 - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Select **Data Map** on the left pane.
-
-1. Select **Register**.
-
-1. In **Register sources**, select **Hive Metastore** > **Continue**.
-
-1. On the **Register sources (Hive Metastore)** screen, do the following:
-
- 1. For **Name**, enter a name that Microsoft Purview will list as the data source.
-
- 1. For **Hive Cluster URL**, enter a value that you get from the Ambari URL. For example, enter **hive.azurehdinsight.net**.
-
- 1. For **Hive Metastore Server URL**, enter a URL for the server. For example, enter **sqlserver://hive.database.windows.net**.
-
- 1. For **Select a collection**, choose a collection from the list or create a new one. This step is optional.
-
- :::image type="content" source="media/register-scan-hive-metastore-source/configure-sources.png" alt-text="Screenshot that shows boxes for registering Hive sources." border="true":::
-
-1. Select **Finish**.
-
-## Scan
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have followed all [**prerequisites**](#prerequisites).
-> 1. Review our [**scan troubleshooting documentation**](troubleshoot-connections.md).
-
-Use the following steps to scan Hive Metastore databases to automatically identify assets. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
-
-1. In the Management Center, select **Integration runtimes**. Make sure that a self-hosted integration runtime is set up. If it isn't set up, use the steps in [Create and manage a self-hosted integration runtime](./manage-integration-runtimes.md).
-
-1. Go to **Sources**.
-
-1. Select the registered Hive Metastore database.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: Enter a name for the scan.
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
- * Select Basic Authentication while creating a credential.
- * Provide the Metastore username in the appropriate box.
- * Store the Metastore password in the secret key.
-
- For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
- 1. **Metastore JDBC Driver Location**: Specify the path to the JDBC driver folder on the machine where your self-hosted integration runtime is running, for example, `D:\Drivers\HiveMetastore`. It's the path to a valid folder containing the driver JAR files. Make sure the driver is accessible to the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Metastore JDBC Driver Class**: Provide the class name for the connection driver. For example, enter **com.microsoft.sqlserver.jdbc.SQLServerDriver**.
-
- 1. **Metastore JDBC URL**: Provide the connection URL value and define the connection to the URL of the Metastore database server. For example: `jdbc:sqlserver://hive.database.windows.net;database=hive;encrypt=true;trustServerCertificate=true;create=false;loginTimeout=300`.
-
- > [!NOTE]
- > When you copy the URL from *hive-site.xml*, remove `amp;` from the string or the scan will fail.
- >
- > [Download the SSL certificate](https://www.digicert.com/CACerts/BaltimoreCyberTrustRoot.crt.pem) to the self-hosted integration runtime machine, then update the path to the SSL certificate's location on your machine in the URL.
- >
- > When you enter local file paths in the scan configuration, change the Windows path separator character from a backslash (`\`) to a forward slash (`/`). For example, if you place the SSL certificate at local file path *D:\Drivers\SSLCert\BaltimoreCyberTrustRoot.crt.pem*, change the `serverSslCert` parameter value to *D:/Drivers/SSLCert/BaltimoreCyberTrustRoot.crt.pem*.
-
- The **Metastore JDBC URL** value will look like this example:
-
- `jdbc:mariadb://samplehost.mysql.database.azure.com:3306/XXXXXXXXXXXXXXXX?useSSL=true&enabledSslProtocolSuites=TLSv1,TLSv1.1,TLSv1.2&serverSslCert=D:/Drivers/SSLCert/BaltimoreCyberTrustRoot.crt.pem`
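The two fixes called out in the note (dropping the `amp;` residue and switching Windows path separators) can be applied mechanically. A minimal sketch, assuming the only backslashes in the URL are local Windows path separators:

```python
def sanitize_jdbc_url(url: str) -> str:
    """Drop the 'amp;' XML-escape residue left by copying from hive-site.xml,
    and replace Windows backslashes with forward slashes (for example in the
    serverSslCert file path). Assumes backslashes occur only in local paths."""
    return url.replace("amp;", "").replace("\\", "/")

fixed = sanitize_jdbc_url(
    "jdbc:mariadb://samplehost.mysql.database.azure.com:3306/hive"
    "?useSSL=true&amp;serverSslCert=D:\\Drivers\\SSLCert\\BaltimoreCyberTrustRoot.crt.pem"
)
```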
-
- 1. **Metastore database name**: Provide the name of the Hive Metastore database.
-
- 1. **Schema**: Specify a list of Hive schemas to import. For example: **schema1; schema2**.
-
- All user schemas are imported if that list is empty. All system schemas (for example, SysAdmin) and objects are ignored by default.
-
- Acceptable schema name patterns that use SQL `LIKE` expression syntax include the percent sign (%). For example, `A%; %B; %C%; D` means:
-
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of `NOT` and special characters isn't acceptable.
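The pattern semantics above can be illustrated with a small matcher. This is only an illustrative sketch, not the scanner's actual implementation; it treats `%` as the sole wildcard and matches case-sensitively:

```python
import re

def matches_patterns(name: str, patterns: str) -> bool:
    """Return True if the schema name matches any pattern in the
    semicolon-separated list, where % is a SQL LIKE-style wildcard."""
    for pat in (p.strip() for p in patterns.split(";")):
        if not pat:
            continue
        # Escape literal parts, join them with ".*" where each % appeared.
        regex = "^" + ".*".join(re.escape(part) for part in pat.split("%")) + "$"
        if re.match(regex, name):
            return True
    return False
```

For example, `matches_patterns("Alpha", "A%; %B; %C%; D")` is true (starts with A), while a name such as `zzz` matches none of the four patterns.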
-
- 1. **Maximum memory available**: Maximum memory (in gigabytes) available on the customer's machine for the scanning processes to use. This value depends on the size of the Hive Metastore database to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
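That rule of thumb translates into a simple estimate. This is an illustrative calculation only; actual requirements vary with table width and metadata volume:

```python
import math

def estimated_memory_gb(table_count: int) -> int:
    """Estimate scan memory from the 1 GB per 1,000 tables rule of thumb."""
    return max(1, math.ceil(table_count / 1000))
```

A metastore with 2,500 tables would call for about 3 GB under this rule.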
-
- :::image type="content" source="media/register-scan-hive-metastore-source/scan.png" alt-text="Screenshot that shows boxes for scan details." border="true":::
-
-1. Select **Continue**.
-
-1. For **Scan trigger**, choose whether to set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your Hive Metastore source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Hive Metastore lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Looker Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-looker-source.md
- Title: Connect to and manage Looker
-description: This guide describes how to connect to Looker in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Looker source.
----- Previously updated : 07/18/2023---
-# Connect to and manage Looker in Microsoft Purview (Preview)
-
-This article outlines how to register Looker, and how to authenticate and interact with Looker in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
--
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| [Yes](#lineage)| No |
-
-The supported Looker server version is 7.2.
-
-When scanning a Looker source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Folders
- - Projects
- - Models
- - Dashboards
- - Looks
- - Explore diagrams including the joins
- - Views including the dimensions, measures, parameters, and filters
- - Layouts including the chart layouts, table layouts, text, and fields
-
-- Fetching static lineage on asset relationships among views and layouts.
-
-When setting up a scan, you can choose to scan an entire Looker server, or scope the scan to a subset of Looker projects matching the given name(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self hosted integration runtime to connect to it**.
-
-- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
- - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-## Register
-
-This section describes how to register Looker in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-An API3 key is required to connect to the Looker server. The API3 key consists of a public client_id and a private client_secret and follows an OAuth2 authentication pattern.
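In that pattern, a client POSTs the client_id and client_secret to the server's login endpoint and receives a short-lived access token. A hedged sketch of how such a request is formed (the `/api/4.0/login` path is an assumption based on Looker's public API documentation, and the request is built but not sent here):

```python
import urllib.parse
import urllib.request

def build_login_request(server_api_url: str, client_id: str, client_secret: str):
    """Build (but do not send) the login request that exchanges an API3 key
    for an access token. The API version path is an assumption; adjust it to
    whatever API version your Looker server exposes."""
    body = urllib.parse.urlencode(
        {"client_id": client_id, "client_secret": client_secret}
    ).encode()
    return urllib.request.Request(
        server_api_url.rstrip("/") + "/api/4.0/login", data=body, method="POST"
    )

# urllib.request.urlopen(req) would return the token response on success.
req = build_login_request(
    "https://azurepurview.cloud.looker.com:19999", "client_id", "client_secret"
)
```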
-
-### Steps to register
-
-To register a new Looker server in your data catalog, follow these steps:
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
 - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **Looker**. Select **Continue**.
--
-On the Register sources (Looker) screen, follow these steps:
-
-1. Enter a **Name** that the data source will be listed as within the Catalog.
-
-1. Enter the Looker API URL in the **Server API URL** field. The default port for API requests is port 19999. Also, all Looker API endpoints require an HTTPS connection. For example: `https://azurepurview.cloud.looker.com`
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Finish** to register the data source.
-
- :::image type="content" source="media/register-scan-looker-source/scan-source.png" alt-text="scan looker source" border="true":::
-
-## Scan
-
-Follow the steps below to scan Looker to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md)
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
- 1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered **Looker** server.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Server API URL** is auto populated based on the value entered during registration.
-
- 1. **Credential:** While configuring Looker credential, make sure to:
-
- * Select **Basic Authentication** as the Authentication method
- * Provide your Looker API3 key's client ID in the User name field
- * Save your Looker API3 key's client secret in the key vault's secret.
-
- To access the client ID and client secret, navigate to Looker -\> Admin -\> Users -\> select **Edit** on a user -\> select **Edit Keys** -\> use the existing Client ID and Client Secret or create a new pair.
-
- :::image type="content" source="media/register-scan-looker-source/looker-details.png" alt-text="get looker details" border="true":::
-
- For more information about credentials, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
- 1. **Project filter** - Scope your scan by providing a semicolon separated list of Looker projects. This option is used to select looks and dashboards by their parent project.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This value depends on the size of the Looker source to be scanned.
-
- :::image type="content" source="media/register-scan-looker-source/setup-scan.png" alt-text="trigger scan" border="true":::
-
-1. Select **Test connection** to validate the settings.
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your Looker source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Looker lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-mongodb.md
- Title: Connect to and manage MongoDB
-description: This guide describes how to connect to MongoDB in Microsoft Purview, and use Microsoft Purview's features to scan and manage your MongoDB source.
----- Previously updated : 04/20/2023---
-# Connect to and manage MongoDB in Microsoft Purview
-
-This article outlines how to register MongoDB, and how to authenticate and interact with MongoDB in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No| No | No |
-
-The supported MongoDB versions are 2.6 to 5.1.
-
-When scanning a MongoDB source, Microsoft Purview supports extracting technical metadata including:
-
-- Server
-- Databases
-- Collections including the schema
-- Views including the schema
-
-During a scan, Microsoft Purview retrieves and analyzes sample documents to infer the collection/view schema. The sample size is configurable.
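The idea behind sampling-based inference can be sketched as a union of the fields and value types seen across documents. This is a simplified illustration, not Purview's actual inference logic:

```python
def infer_schema(sample_docs):
    """Union field names and value type names across sample documents,
    approximating how sampling-based schema inference works."""
    schema = {}
    for doc in sample_docs:
        for field, value in doc.items():
            schema.setdefault(field, set()).add(type(value).__name__)
    return schema
```

For two samples `{"_id": 1, "name": "a"}` and `{"_id": 2, "tags": ["x"]}`, the inferred schema records `_id` as int, `name` as str, and `tags` as list.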
-
-When setting up a scan, you can choose to scan one or more MongoDB databases entirely, or further scope the scan to a subset of collections matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.16.8093.1.
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-## Register
-
-This section describes how to register MongoDB in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new MongoDB source in your data catalog, do the following:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **MongoDB**. Select **Continue**.
-
-On the **Register sources (MongoDB)** screen, do the following:
-
-1. Enter a **Name** that the data source will be listed as within the Catalog.
-
-1. Enter the **server** name. Specify a name to uniquely identify your MongoDB instance in your company. For example: `host` for a standalone deployment, `MyReplicaSetName` for a replica set, or `MyClusterName` for a sharded cluster. This value is used in the asset qualified name and cannot be changed.
-
-1. Select a collection or create a new one (Optional).
-
-1. Select **Finish** to register the data source.
-
- :::image type="content" source="media/register-scan-mongodb/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan MongoDB to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for a MongoDB source is **Basic authentication**.
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered MongoDB source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the self-hosted integration runtime used to perform scan.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to MongoDB in the User name input field.
- * Store the user password used to connect to MongoDB in the secret key.
-
- 1. **Connection string**: Specify the MongoDB connection string used to connect to your MongoDB, excluding the username and password. For example, `mongodb://mongodb0.example.com:27017,mongodb1.example.com:27017/?replicaSet=myRepl`.
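At connection time, the credential and the connection string combine into a standard MongoDB URI with the username and password URL-encoded. A sketch of that composition (Purview keeps the credential separate; this only illustrates the URI format):

```python
from urllib.parse import quote_plus

def with_credentials(connection_string: str, username: str, password: str) -> str:
    """Insert URL-encoded credentials into a mongodb:// connection string.
    Special characters such as '@' and '/' must be percent-encoded."""
    scheme, rest = connection_string.split("://", 1)
    return f"{scheme}://{quote_plus(username)}:{quote_plus(password)}@{rest}"
```

For example, a password containing `@` or `/` is percent-encoded so the URI still parses correctly.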
-
- 1. **Databases**: Specify a list of MongoDB databases to be imported. The list can have one or more database names separated by semicolons (;), for example, `database1; database2`.
-
- 1. **Collections**: The subset of collections to import, expressed as a semicolon-separated list, for example, `collection1; collection2`. All collections are imported if the list is empty.
-
- Acceptable collection name patterns using SQL LIKE expression syntax include the percent sign (%). For example, `A%; %B; %C%; D` means:
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters isn't acceptable.
-
- 1. **Number of sample documents**: Number of sample documents to be analyzed for schema extraction. Default is 10.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This value depends on the size of the MongoDB source to be scanned.
-
- :::image type="content" source="media/register-scan-mongodb/scan.png" alt-text="scan MongoDB" border="true":::
-
-1. Select **Test connection** to validate the configurations.
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-mysql.md
- Title: Connect to and manage MySQL
-description: This guide describes how to connect to MySQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your MySQL source.
----- Previously updated : 07/18/2023---
-# Connect to and manage MySQL in Microsoft Purview
-
-This article outlines how to register MySQL, and how to authenticate and interact with MySQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No |No| No| [Yes](#lineage)| No|
-
-The supported MySQL server versions are 5.7 to 8.x.
-
-When scanning a MySQL source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Databases
- - Tables including the columns
- - Views including the columns
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire MySQL server, or scope the scan to a subset of databases matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self hosted integration runtime to connect to it**.
-
-- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
- - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-### Required permissions for scan
-
-The MySQL user must have the SELECT, SHOW VIEW, and EXECUTE permissions for each target MySQL schema that contains metadata.
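Those grants can be scripted per schema to be scanned. A hedged sketch that only generates the statements (the user and host values are illustrative placeholders; run the output as a MySQL administrator):

```python
def grant_statements(schemas, user, host="%"):
    """Generate the SELECT, SHOW VIEW, and EXECUTE grants for each schema
    the scan will read. User and host are illustrative placeholders."""
    return [
        f"GRANT SELECT, SHOW VIEW, EXECUTE ON `{s}`.* TO '{user}'@'{host}';"
        for s in schemas
    ]
```

For example, `grant_statements(["sales"], "purview_user")` yields one GRANT statement covering the `sales` schema.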
-
-## Register
-
-This section describes how to register MySQL in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new MySQL source in your data catalog, follow these steps:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **MySQL**. Select **Continue**.
-
-On the **Register sources (MySQL)** screen, follow these steps:
-
-1. Enter a **Name** that the data source will be listed as within the Catalog.
-
-1. Enter the **Server** name to connect to a MySQL source. This can either be:
- * A host name used to connect to the database server. For example: `MyDatabaseServer.com`
- * An IP address. For example: `192.169.1.2`
-
-1. Enter the **Port** used to connect to the database server (3306 by default for MySQL).
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Finish** to register the data source.
-
- :::image type="content" source="media/register-scan-mysql/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan MySQL to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for a MySQL source is **Basic authentication**.
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
- 1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered MySQL source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to the database server in the User name input field.
- * Store the user password used to connect to the database server in the secret key.
-
- 1. **Database**: List subset of databases to import expressed as a semicolon separated list. For example, `database1; database2`. All user databases are imported if the list is empty. All system databases (for example, SysAdmin) are ignored by default.
-
- Acceptable schema name patterns using SQL LIKE expression syntax include the percent sign (%). For example, `A%; %B; %C%; D` means:
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters isn't acceptable.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This value depends on the size of the MySQL source to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-mysql/scan.png" alt-text="scan MySQL" border="true":::
-
-1. Select **Test connection** to validate the settings (available when using Azure Integration Runtime).
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your MySQL source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported MySQL lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan On Premises Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-on-premises-sql-server.md
- Title: Connect to and manage on-premises SQL server instances
-description: This guide describes how to connect to on-premises SQL server instances in Microsoft Purview, and use Microsoft Purview's features to scan and manage your on-premises SQL server source.
----- Previously updated : 2/14/2023---
-# Connect to and manage an on-premises SQL server instance in Microsoft Purview
-
-This article outlines how to register on-premises SQL server instances, and how to authenticate and interact with an on-premises SQL server instance in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan) | [Yes](create-sensitivity-label.md)| No| Limited** | No |
-
-\** Lineage is supported if dataset is used as a source/sink in [Data Factory Copy activity](how-to-link-azure-data-factory.md)
-
-The supported SQL Server versions are 2005 and above. SQL Server Express LocalDB is not supported.
-
-When scanning on-premises SQL server, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Instance
- - Databases
- - Schemas
- - Tables including the columns
- - Views including the columns
-
-When setting up a scan, you can choose to specify the database name to scan one database, and you can further scope the scan by selecting tables and views as needed. The whole SQL Server instance is scanned if a database name isn't provided.
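-As an illustrative sketch (not the scanner's actual queries), the kinds of technical metadata listed above can be previewed with standard catalog views:
-
-```sql
--- Illustrative only: preview the metadata a scan surfaces, using standard catalog views.
-SELECT name FROM sys.schemas;                      -- schemas in the current database
-SELECT s.name AS schema_name, t.name AS table_name, c.name AS column_name
-FROM sys.tables t
-JOIN sys.schemas s ON t.schema_id = s.schema_id
-JOIN sys.columns c ON c.object_id = t.object_id;   -- tables including the columns
-SELECT name FROM sys.views;                        -- views in the current database
-```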
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview governance portal. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
-## Register
-
-This section describes how to register an on-premises SQL server instance in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-There are two ways to set up authentication for SQL server on-premises:
-
-- SQL Authentication
-- Windows Authentication
-
-#### Set up SQL server authentication
-
-If SQL Authentication is applied, ensure the SQL Server deployment is configured to allow SQL Server and Windows Authentication.
-
-To enable this, within SQL Server Management Studio (SSMS), navigate to "Server Properties" and change from "Windows Authentication Mode" to "SQL Server and Windows Authentication mode".
--
-If Windows Authentication is applied, configure the SQL Server deployment to use Windows Authentication mode.
-
-A change to the server authentication mode requires a restart of the SQL Server instance and SQL Server Agent. This can be triggered within SSMS by navigating to the SQL Server instance and selecting "Restart" in the right-click options pane.
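-As a sketch of an alternative to the SSMS UI, the authentication mode can also be switched with T-SQL by writing the documented `LoginMode` registry value (2 enables SQL Server and Windows Authentication mode); a restart is still required afterward:
-
-```sql
--- Sketch: enable SQL Server and Windows Authentication mode (LoginMode = 2).
-USE [master];
-EXEC xp_instance_regwrite
-    N'HKEY_LOCAL_MACHINE',
-    N'Software\Microsoft\MSSQLServer\MSSQLServer',
-    N'LoginMode',
-    REG_DWORD,
-    2;
--- Restart the SQL Server instance and SQL Server Agent for the change to take effect.
-```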
-
-##### Creating a new login and user
-
-If you would like to create a new login and user to be able to scan your SQL server, follow the steps below:
-
-The account must have access to the **master** database, because the `sys.databases` catalog view resides there. The Microsoft Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
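-This requirement can be sanity-checked by running a query such as the following as the scanning account; if access to **master** is missing, the enumeration is incomplete:
-
-```sql
--- Illustrative check: the scanning account must be able to enumerate all databases.
-SELECT name FROM master.sys.databases;
-```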
-
-> [!Note]
-> All the steps below can be executed using the code provided [here](https://github.com/Azure/Purview-Samples/blob/master/TSQL-Code-Permissions/grant-access-to-on-prem-sql-databases.sql)
-
-1. Navigate to SQL Server Management Studio (SSMS), connect to the server, expand **Security**, select and hold (or right-click) **Logins**, and create a new login. If Windows Authentication is applied, select "Windows authentication". If SQL Authentication is applied, make sure to select "SQL authentication".
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/create-new-login-user.png" alt-text="Create new login and user.":::
-
-1. Select **Server roles** on the left navigation and ensure that the **public** role is assigned.
-
-1. Select **User mapping** on the left navigation, select all the databases in the map, and select the database role **db_datareader**.
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/user-mapping.png" alt-text="user mapping.":::
-
-1. Select OK to save.
-
-1. If SQL Authentication is applied, navigate again to the user you created, by selecting and holding (or right-clicking) and selecting **Properties**. Enter a new password and confirm it. Select the 'Specify old password' and enter the old password. **It is required to change your password as soon as you create a new login.**
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/change-password.png" alt-text="change password.":::
-
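-The steps above can be sketched in T-SQL for the SQL Authentication case (a minimal sketch with a hypothetical login name, passwords, and database name; the linked sample script is the authoritative version):
-
-```sql
--- Hypothetical login name and passwords, shown for illustration only.
-CREATE LOGIN purview_scanner WITH PASSWORD = 'InitialP@ssw0rd!';
-
--- In each database to be scanned, map a user and grant read access.
-USE [SomeDatabase];
-CREATE USER purview_scanner FOR LOGIN purview_scanner;
-ALTER ROLE db_datareader ADD MEMBER purview_scanner;
-
--- Change the password right after creating the login, as required above.
-ALTER LOGIN purview_scanner WITH PASSWORD = 'NewP@ssw0rd!' OLD_PASSWORD = 'InitialP@ssw0rd!';
-```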
-##### Storing your SQL login password in a key vault and creating a credential in Microsoft Purview
-
-1. Navigate to your key vault in the Azure portal
-1. Select **Settings > Secrets**
-1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your SQL server login
-1. Select **Create** to complete
-1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the **username** and **password** to set up your scan. Make sure the right authentication method is selected when creating a new credential. If SQL Authentication is applied, select "SQL authentication" as the authentication method. If Windows Authentication is applied, then select "Windows authentication".
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Under Sources and scanning in the left navigation, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it is not set up, follow the steps mentioned [here](manage-integration-runtimes.md) to create a self-hosted integration runtime for scanning on an on-premises or Azure VM that has access to your on-premises network.
-
-1. Select **Data Map** on the left navigation.
-
-1. Select **Register**
-
-1. Select **SQL server** and then **Continue**
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/set-up-sql-data-source.png" alt-text="Set up the SQL data source.":::
-
-1. Provide a friendly name, which will be a short name you can use to identify your server, and the server endpoint.
-
-1. Select **Finish** to register the data source.
-
-## Scan
-
-Follow the steps below to scan on-premises SQL server instances to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md)
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the SQL Server source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source. The credentials are grouped and listed under different authentication methods.
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/on-premises-sql-set-up-scan-win-auth.png" alt-text="Set up scan":::
-
-1. You can scope your scan to specific tables by choosing the appropriate items in the list after entering the database name.
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/on-premises-sql-scope-your-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/on-premises-sql-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/register-scan-on-premises-sql-server/trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
--
-## Next steps
-
-Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Oracle Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-oracle-source.md
- Title: Connect to and manage Oracle
-description: This guide describes how to connect to Oracle in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Oracle source.
----- Previously updated : 04/20/2023---
-# Connect to and manage Oracle in Microsoft Purview
-
-This article outlines how to register Oracle, and how to authenticate and interact with Oracle in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | [Yes](#scan) | No |No| [Yes*](#lineage)| No |
-
-\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-The supported Oracle server versions are 6i to 19c. Oracle proxy server isn't supported when scanning Oracle source.
-
-When scanning Oracle source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Schemas
- - Packages
- - Tables including the columns, foreign keys, indexes, triggers and unique constraints
- - Views including the columns and triggers
- - Stored procedures including the parameter dataset and result set
- - Functions including the parameter dataset
- - Sequences
- - Synonyms
- - Types including the type attributes
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire Oracle server, or scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-- Currently, the Oracle service name isn't captured in the metadata or hierarchy.
-- When an object is deleted from the data source, the subsequent scan currently doesn't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * Download the [Oracle JDBC driver](https://www.oracle.com/database/technologies/appdev/jdbc-downloads.html) on the machine where your self-hosted integration runtime is running. Note down the folder path that you'll use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
-### Required permissions for scan
-
-Microsoft Purview supports basic authentication (username and password) for scanning Oracle. The Oracle user must have read access to system tables in order to access advanced metadata.
-
-For classification, the user also needs to be the owner of the table.
-
->[!IMPORTANT]
->If the user is not the owner of the table, the scan will run successfully and ingest metadata, but will not identify any classifications.
-
-The user should have permission to create a session and have the role SELECT\_CATALOG\_ROLE assigned. Alternatively, the user may have SELECT permission granted for every individual system table that this connector queries metadata from:
-
-```sql
-grant create session to [user];
-grant select on all_users to [user];
-grant select on dba_objects to [user];
-grant select on dba_tab_comments to [user];
-grant select on dba_external_locations to [user];
-grant select on dba_directories to [user];
-grant select on dba_mviews to [user];
-grant select on dba_clu_columns to [user];
-grant select on dba_tab_columns to [user];
-grant select on dba_col_comments to [user];
-grant select on dba_constraints to [user];
-grant select on dba_cons_columns to [user];
-grant select on dba_indexes to [user];
-grant select on dba_ind_columns to [user];
-grant select on dba_procedures to [user];
-grant select on dba_synonyms to [user];
-grant select on dba_views to [user];
-grant select on dba_source to [user];
-grant select on dba_triggers to [user];
-grant select on dba_arguments to [user];
-grant select on dba_sequences to [user];
-grant select on dba_dependencies to [user];
-grant select on dba_type_attrs to [user];
-grant select on V_$INSTANCE to [user];
-grant select on v_$database to [user];
-```
-
-## Register
-
-This section describes how to register Oracle in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new Oracle source in your data catalog, do the following:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On Register sources, select **Oracle**. Select **Continue**.
-
- :::image type="content" source="media/register-scan-oracle-source/register-sources.png" alt-text="register sources" border="true":::
-
-On the **Register sources (Oracle)** screen, do the following:
-
-1. Enter a **Name** under which the data source will be listed in the Catalog.
-
-1. Enter the **Host** name to connect to an Oracle source. This can either be:
- * A host name used to connect to the database server. For example: `MyDatabaseServer.com`
- * An IP address. For example: `192.169.1.2`
-
-1. Enter the **Port number** used to connect to the database server (1521 by default for Oracle).
-
-1. Enter the **Oracle service name** (not Oracle UID) used to connect to the database server.
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Finish** to register the data source.
-
- :::image type="content" source="media/register-scan-oracle-source/register-sources-2-inline.png" alt-text="register sources options" lightbox="media/register-scan-oracle-source/register-sources-2-expanded.png" border="true":::
-
-## Scan
-
-Follow the steps below to scan Oracle to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have followed all [**prerequisites**](#prerequisites).
-> 1. Review our [**scan troubleshooting documentation**](troubleshoot-connections.md).
-
-### Create and run scan
-
-To create and run a new scan, do the following:
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered Oracle source.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured
- self-hosted integration runtime
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select Basic Authentication while creating a credential.
- * Provide the user name used to connect to the database server in the User name input field.
- * Store the user password used to connect to the database server in the secret key.
-
- 1. **Schema**: List a subset of schemas to import, expressed as a semicolon-separated list, in a **case-sensitive** manner. For example, `schema1; schema2`. All user schemas are imported if that list is empty. All system schemas (for example, SysAdmin) and objects are ignored by default.
-
- Acceptable schema name patterns using SQL LIKE expressions syntax include using %. For example: `A%; %B; %C%; D`
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters aren't acceptable.
-
- 1. **Driver location**: Specify the path to the JDBC driver location on the machine where your self-hosted integration runtime is running, for example, `D:\Drivers\Oracle`. It's the path to a valid JAR folder location. Make sure the driver is accessible by the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Stored procedure details**: Controls the number of details imported from stored procedures:
-
- - Signature: The name and parameters of stored procedures.
- - Code, signature: The name, parameters and code of stored procedures.
- - Lineage, code, signature: The name, parameters and code of stored procedures, and the data lineage derived from the code.
- - None: Stored procedure details aren't included.
-
- 1. **Maximum memory available**: Maximum memory (in GB) available on customer's VM to be used by scanning processes. This is dependent on the size of Oracle source to be scanned.
-
- > [!Note]
- > As a rule of thumb, please provide 1GB memory for every 1000 tables
-
- :::image type="content" source="media/register-scan-oracle-source/scan.png" alt-text="scan oracle" border="true":::
-
-1. Select **Test connection**.
-
- > [!Note]
- > Use the "Test connection" button in the scan setup UI to test the connection. The "Test Connection" in self-hosted integration runtime configuration manager UI -> Diagnostics tab does not fully validate the connectivity.
-
-1. Select **Continue**.
-
-1. Select a **scan rule set** for classification. You can choose between the system default, existing custom rule sets, or [create a new rule set](create-a-scan-rule-set.md) inline.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
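-The schema name patterns accepted in the scan setup above follow SQL LIKE semantics. As an illustrative sketch, the pattern list `A%; %B; %C%; D` corresponds to a filter like:
-
-```sql
--- Illustrative only: which schema (user) names the pattern list A%; %B; %C%; D would match.
-SELECT username
-FROM all_users
-WHERE username LIKE 'A%'   -- start with A
-   OR username LIKE '%B'   -- end with B
-   OR username LIKE '%C%'  -- contain C
-   OR username = 'D';      -- equal D
-```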
--
-## Lineage
-
-After scanning your Oracle source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Oracle lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-postgresql.md
- Title: Connect to and manage PostgreSQL
-description: This guide describes how to connect to PostgreSQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your PostgreSQL source.
----- Previously updated : 07/18/2023---
-# Connect to and manage PostgreSQL in Microsoft Purview
-
-This article outlines how to register PostgreSQL, and how to authenticate and interact with PostgreSQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No |No| No| [Yes](#lineage) | No |
--
-The supported PostgreSQL server versions are 8.4 to 12.x.
-
-When scanning PostgreSQL source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Databases
- - Schemas
- - Tables including the columns
- - Views including the columns
-
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up scan, you can choose to scan an entire PostgreSQL database, or scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently doesn't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self-hosted integration runtime to connect to it**.
-
-- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
- - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-### Required permissions for scan
-
-The PostgreSQL user must have read access to system tables in order to access advanced metadata.
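-A quick way to confirm this access is to query the standard catalogs as the scanning user (illustrative; not necessarily the exact queries the scanner issues):
-
-```sql
--- Should return rows for the schemas, tables, and columns visible to the scanning user.
-SELECT table_schema, table_name, column_name
-FROM information_schema.columns
-WHERE table_schema NOT IN ('pg_catalog', 'information_schema');
-```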
-
-## Register
-
-This section describes how to register PostgreSQL in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new PostgreSQL source in your data catalog, follow these steps:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**
-1. On Register sources, select **PostgreSQL**. Select **Continue**.
-
-On the **Register sources (PostgreSQL)** screen, follow these steps:
-
-1. Enter a **Name** under which the data source will be listed in the Catalog.
-
-1. Enter the **Server** name to connect to a PostgreSQL source. This can either be:
- * A host name used to connect to the database server. For example: `MyDatabaseServer.com`
- * An IP address. For example: `192.169.1.2`
-
-1. Enter the **Port** used to connect to the database server (5432 by default for PostgreSQL).
-
-1. Select a collection or create a new one (Optional)
-
-1. Select **Finish** to register the data source.
-
- :::image type="content" source="media/register-scan-postgresql/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan PostgreSQL to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for a PostgreSQL source is **Basic authentication**.
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
- 1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered PostgreSQL source.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to the database server in the User name input field.
- * Store the user password used to connect to the database server in the secret key.
-
- 1. **Database**: Specify the name of the database instance to import.
-
- 1. **Schema**: List a subset of schemas to import, expressed as a semicolon-separated list. For example, `schema1; schema2`. All user schemas are imported if that list is empty. All system schemas (for example, SysAdmin) and objects are ignored by default.
-
- Acceptable schema name patterns using SQL LIKE expressions syntax include using %. For example: `A%; %B; %C%; D`
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters aren't acceptable.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on customer's VM to be used by scanning processes. This is dependent on the size of PostgreSQL source to be scanned.
-
- > [!Note]
- > As a rule of thumb, please provide 1GB memory for every 1000 tables
-
- :::image type="content" source="media/register-scan-postgresql/scan.png" alt-text="scan PostgreSQL" border="true":::
-
-1. Select **Test connection** to validate the settings (available when using Azure Integration Runtime).
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your PostgreSQL source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's lineage tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported PostgreSQL lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Power Bi Tenant Cross Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-power-bi-tenant-cross-tenant.md
- Title: Connect to and manage a Power BI tenant (cross-tenant)
-description: This guide describes how to connect to a Power BI tenant in a cross-tenant scenario. You use Microsoft Purview to scan and manage your Power BI tenant source.
----- Previously updated : 06/08/2023---
-# Connect to and manage a Power BI tenant in Microsoft Purview (cross-tenant)
-
-This article outlines how to register a Power BI tenant in a cross-tenant scenario, and how to authenticate and interact with the tenant in Microsoft Purview. If you're unfamiliar with the service, see [What is Microsoft Purview?](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#deployment-checklist)| [Yes](#deployment-checklist)| Yes | No | No | No| No| [Yes](how-to-lineage-powerbi.md)| No|
-
-When scanning Power BI source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Workspaces
- - Dashboards
- - Reports
- - Datasets including the tables and columns
- - Dataflows
- - Datamarts
-
-- Fetching static lineage on asset relationships among the above Power BI artifacts, as well as external data source assets. Learn more from [Power BI lineage](how-to-lineage-powerbi.md).
-
-### Supported scenarios for Power BI scans
-
-|**Scenario** |**Microsoft Purview public access** |**Power BI public access** | **Runtime option** | **Authentication option** | **Deployment checklist** |
-|---|---|---|---|---|---|
-|Public access with Azure integration runtime |Allowed |Allowed |Azure runtime |Delegated authentication | [Deployment checklist](#deployment-checklist) |
-|Public access with self-hosted integration runtime |Allowed |Allowed |Self-hosted runtime |Delegated authentication / service principal | [Deployment checklist](#deployment-checklist) |
-
-### Known limitations
-
-- For the cross-tenant scenario, delegated authentication and service principal are the only supported authentication options for scanning.
-- You can create only one scan for a Power BI data source that is registered in your Microsoft Purview account.
-- If the Power BI dataset schema isn't shown after the scan, it's due to one of the current limitations with the [Power BI metadata scanner](/power-bi/admin/service-admin-metadata-scanning).
-- Empty workspaces are skipped.
-
-## Prerequisites
-
-Before you start, make sure you have the following prerequisites:
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-- An active [Microsoft Purview account](create-catalog-portal.md).
-
-## Deployment checklist
-
-Use either of the following deployment checklists during the setup, or for troubleshooting purposes, based on your scenario.
-
-# [Public access with Azure integration runtime](#tab/Scenario1)
-
-### Scan cross-tenant Power BI by using delegated authentication in a public network
-
-1. Make sure the Power BI and Microsoft Purview accounts are in the cross-tenant mode.
-
-1. Make sure the Power BI tenant ID is entered correctly during the registration. By default, the Power BI tenant ID that exists in the same Azure Active Directory (Azure AD) instance as Microsoft Purview will be populated.
-
-1. Make sure your [Power BI metadata model is up to date by enabling metadata scanning](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning).
-
-1. From the Azure portal, validate that the Microsoft Purview account network is set to **public access**.
-
-1. From the Power BI tenant admin portal, make sure the Power BI tenant is configured to allow a public network.
-
-1. Check your instance of Azure Key Vault to make sure:
- 1. There are no typos in the password or secret.
- 2. Microsoft Purview managed identity has **get** and **list** access to secrets.
-
-1. Review your credential to validate that the:
- 1. Client ID matches the _Application (Client) ID_ of the app registration.
- 2. For **delegated auth**, username includes the user principal name, such as `johndoe@contoso.com`.
-
-1. In the Power BI Azure AD tenant, validate the following Power BI admin user settings:
- 1. The user is assigned to the Power BI administrator role.
- 2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If the user is recently created, sign in with the user at least once, to make sure that the password is reset successfully, and the user can successfully initiate the session.
- 4. There are no multifactor authentication or conditional access policies enforced on the user.
-
-1. In the Power BI Azure AD tenant, validate the following app registration settings:
- 1. The app registration exists in your Azure AD tenant where the Power BI tenant is located.
-
-    2. If a service principal is used, under **API permissions**, the following **delegated permissions** are assigned:
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
-    3. If delegated authentication is used, under **API permissions**, the following **delegated permissions** are assigned, and **Grant admin consent for the tenant** is selected:
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
-    4. Under **Authentication**:
- 1. **Supported account types** > **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
- 2. **Implicit grant and hybrid flows** > **ID tokens (used for implicit and hybrid flows)** is selected.
- 3. **Allow public client flows** is enabled.
-
-1. In the Power BI tenant's Azure Active Directory, create a security group.
-1. In the Power BI tenant's Azure Active Directory, make sure the [service principal is a member of the new security group](#authenticate-to-power-bi-tenant).
-1. In the Power BI tenant admin portal, validate that [Allow service principals to use read-only Power BI admin APIs](#associate-the-security-group-with-power-bi-tenant) is enabled for the new security group.
-
-# [Public access with self-hosted integration runtime](#tab/Scenario2)
-
-### Scan cross-tenant Power BI by using delegated authentication in a public network
-
-1. Make sure the Power BI and Microsoft Purview accounts are in the cross-tenant mode.
-
-1. Make sure the Power BI tenant ID is entered correctly during the registration. By default, the Power BI tenant ID that exists in the same Azure Active Directory (Azure AD) instance as Microsoft Purview will be populated.
-
-1. Make sure your [Power BI metadata model is up to date by enabling metadata scanning](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning).
-
-1. From the Azure portal, validate that the Microsoft Purview account network is set to **public access**.
-
-1. From the Power BI tenant admin portal, make sure the Power BI tenant is configured to allow a public network.
-
-1. Check your instance of Azure Key Vault to make sure:
- 1. There are no typos in the password.
- 2. Microsoft Purview managed identity has **get** and **list** access to secrets.
-
-1. Review your credential to validate that the:
- 1. Client ID matches the _Application (Client) ID_ of the app registration.
- 2. Username includes the user principal name, such as `johndoe@contoso.com`.
-
-1. In the Power BI Azure AD tenant, validate the following app registration settings:
- 1. The app registration exists in your Azure AD tenant where the Power BI tenant is located.
-
-    2. If a service principal is used, under **API permissions**, the following **delegated permissions** are assigned:
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
-    3. If delegated authentication is used, under **API permissions**, the following **delegated permissions** are assigned, and **Grant admin consent for the tenant** is selected:
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
-    4. Under **Authentication**:
- 1. **Supported account types** > **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
- 2. **Implicit grant and hybrid flows** > **ID tokens (used for implicit and hybrid flows)** is selected.
- 3. **Allow public client flows** is enabled.
-
-1. If delegated authentication is used, in the Power BI Azure AD tenant, validate the following Power BI admin user settings:
- 1. The user is assigned to the Power BI administrator role.
- 2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If the user is recently created, sign in with the user at least once, to make sure that the password is reset successfully, and the user can successfully initiate the session.
- 4. There are no multifactor authentication or conditional access policies enforced on the user.
-
-1. Validate the following self-hosted runtime settings:
- 1. The latest version of the [self-hosted runtime](https://www.microsoft.com/download/details.aspx?id=39717) is installed on the VM.
- 1. Network connectivity from the self-hosted runtime to the Power BI tenant is enabled. The following endpoints must be reachable from self-hosted runtime VM:
- - `*.powerbi.com`
- - `*.analysis.windows.net`
-
- 1. Network connectivity from the self-hosted runtime to Microsoft services is enabled.
- 1. [JDK 8 or later](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html) is installed. Restart the machine after you newly install the JDK for it to take effect.
-1. In the Power BI tenant's Azure Active Directory, create a security group.
-1. In the Power BI tenant's Azure Active Directory, make sure the [service principal is a member of the new security group](#authenticate-to-power-bi-tenant).
-1. In the Power BI tenant admin portal, validate that [Allow service principals to use read-only Power BI admin APIs](#associate-the-security-group-with-power-bi-tenant) is enabled for the new security group.
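As a quick self-check before running a scan, the two wildcard endpoints in the checklist above can be matched against the hosts your self-hosted runtime needs to reach. A minimal sketch; the sample host names are illustrative only, not an exhaustive list of Power BI endpoints:

```python
from fnmatch import fnmatch

# Wildcard endpoints from the checklist that the self-hosted
# runtime VM must be able to reach.
REQUIRED_PATTERNS = ["*.powerbi.com", "*.analysis.windows.net"]

def is_allowed(host: str) -> bool:
    """Return True if the host matches one of the required wildcard endpoints."""
    return any(fnmatch(host, pattern) for pattern in REQUIRED_PATTERNS)

print(is_allowed("api.powerbi.com"))              # True
print(is_allowed("eastus.analysis.windows.net"))  # True
print(is_allowed("login.microsoftonline.com"))    # False: not covered by these patterns
```

Note that authentication traffic (for example, to `login.microsoftonline.com`) also needs to be reachable; the two patterns above cover only the Power BI endpoints named in the checklist.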
----
-## Register the Power BI tenant
-
-1. From the options on the left, select **Data Map**.
-
-1. Select **Register**, and then select **Power BI** as your data source.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/select-power-bi-data-source.png" alt-text="Screenshot that shows the list of data sources available to choose.":::
-
-1. Give your Power BI instance a friendly name. The name must be 3 to 63 characters long, and must contain only letters, numbers, underscores, and hyphens. Spaces aren't allowed.
-
-1. Edit the **Tenant ID** field, to replace with the cross-tenant Power BI that you want to register and scan. By default, the Power BI tenant ID that exists in the same Azure AD instance as Microsoft Purview is populated.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/register-cross-tenant.png" alt-text="Screenshot that shows the registration experience for cross-tenant Power BI.":::
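The friendly-name rule above can be expressed as a simple validation check. A sketch whose regex is an assumption that mirrors the stated rule (3 to 63 characters; letters, numbers, underscores, and hyphens; no spaces), not an official validator:

```python
import re

# Mirrors the documented rule: 3-63 characters, only letters,
# numbers, underscores, and hyphens; spaces aren't allowed.
NAME_PATTERN = re.compile(r"^[A-Za-z0-9_-]{3,63}$")

def is_valid_source_name(name: str) -> bool:
    """Return True if the name satisfies the stated naming rule."""
    return NAME_PATTERN.fullmatch(name) is not None

print(is_valid_source_name("contoso-powerbi_01"))  # True
print(is_valid_source_name("ab"))                  # False: too short
print(is_valid_source_name("has space"))           # False: spaces not allowed
```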
-
-## Scan cross-tenant Power BI
-
-Delegated authentication is the only supported option for cross-tenant scanning. You can use either Azure runtime or a self-hosted integration runtime to run a scan.
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have completed the [deployment checklist for your scenario](#deployment-checklist).
-> 1. Review the [scan troubleshooting documentation](register-scan-power-bi-tenant-troubleshoot.md).
-
-### Authenticate to Power BI tenant
-
-In the Azure Active Directory tenant where the Power BI tenant is located:
-
-1. In the [Azure portal](https://portal.azure.com), search for **Azure Active Directory**.
-
-2. Create a new security group in your Azure Active Directory, by following [Create a basic group and add members using Azure Active Directory](../active-directory/fundamentals/active-directory-groups-create-azure-portal.md).
-
- > [!Tip]
- > You can skip this step if you already have a security group you want to use.
-
-3. Select **Security** as the **Group Type**.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/security-group.png" alt-text="Screenshot of security group type.":::
-
-4. Add your **service principal** to this security group. Select **Members**, then select **+ Add members**.
-
-5. Search for your Microsoft Purview managed identity or service principal and select it.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/add-catalog-to-group-by-search.png" alt-text="Screenshot showing how to add catalog by searching for its name.":::
-
- You should see a success notification showing you that it was added.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/success-add-catalog-msi.png" alt-text="Screenshot showing successful addition of catalog managed identity.":::
-
-### Associate the security group with Power BI tenant
-
-1. Sign in to the [Power BI admin portal](https://app.powerbi.com/admin-portal/tenantSettings).
-
-2. Select the **Tenant settings** page.
-
- > [!Important]
- > You need to be a Power BI Admin to see the tenant settings page.
-
-3. Select **Admin API settings** > **Allow service principals to use read-only Power BI admin APIs (Preview)**.
-
-4. Select **Specific security groups**.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/allow-service-principals-power-bi-admin.png" alt-text="Image showing how to allow service principals to get read-only Power BI admin API permissions.":::
-
-5. Select **Admin API settings** > **Enhance admin APIs responses with detailed metadata** and **Enhance admin APIs responses with DAX and mashup expressions**, and enable the toggles to allow the Microsoft Purview Data Map to automatically discover the detailed metadata of Power BI datasets as part of its scans.
-
- > [!IMPORTANT]
-    > After you update the Admin API settings on your Power BI tenant, wait around 15 minutes before you register a scan and test the connection.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-sub-artifacts.png" alt-text="Image showing the Power BI admin portal config to enable subartifact scan.":::
-
- > [!Caution]
-    > When you allow the security group you created (that has your Microsoft Purview managed identity as a member) to use read-only Power BI admin APIs, you also allow it to access the metadata (for example, dashboard and report names, owners, and descriptions) for all of your Power BI artifacts in this tenant. Once the metadata has been pulled into Microsoft Purview, Microsoft Purview's permissions, not Power BI permissions, determine who can see that metadata.
-
- > [!Note]
- > You can remove the security group from your developer settings, but the metadata previously extracted won't be removed from the Microsoft Purview account. You can delete it separately, if you wish.
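Once the security group is allowed to use the read-only admin APIs, a member of that group can enumerate workspaces through the Power BI admin `groups` endpoint, which is the kind of call the scanner relies on. A hedged sketch that only builds the request and sends nothing; the token value is a placeholder:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_admin_groups_request(access_token: str, top: int = 100) -> Request:
    """Build (but do not send) a GET request for the Power BI read-only
    admin API that lists workspaces (GetGroupsAsAdmin)."""
    query = urlencode({"$top": top})  # $top is required by this admin API
    url = f"https://api.powerbi.com/v1.0/myorg/admin/groups?{query}"
    return Request(url, headers={"Authorization": f"Bearer {access_token}"})

# "<token-from-aad>" is a placeholder for a real Azure AD access token.
req = build_admin_groups_request("<token-from-aad>")
print(req.full_url)
```

Sending this request yourself is a useful manual check that the **Allow service principals to use read-only Power BI admin APIs** setting took effect, independent of a Purview scan.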
-
-### Create scan for cross-tenant using Azure IR with delegated authentication
-
-To create and run a new scan by using the Azure runtime, perform the following steps:
-
-1. Create a user account in the Azure AD tenant where the Power BI tenant is located, and assign the user to the Azure AD role, **Power BI Administrator**. Take note of the username and sign in to change the password.
-
-1. Assign the proper Power BI license to the user.
-
-1. Go to the instance of Azure Key Vault in the tenant where Microsoft Purview is created.
-
-1. Select **Settings** > **Secrets**, and then select **+ Generate/Import**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot of the instance of Azure Key Vault.":::
-
-1. Enter a name for the secret. For **Value**, type the newly created password for the Azure AD user. Select **Create** to complete.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot that shows how to generate a secret in Azure Key Vault.":::
-
-1. If your key vault isn't connected to Microsoft Purview yet, you need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-1. Create an app registration in your Azure AD tenant where Power BI is located. Provide a web URL in the **Redirect URI**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant-app-registration.png" alt-text="Screenshot how to create App in Azure AD for cross tenant.":::
-
-1. Take note of the client ID (app ID).
-
-    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot that shows how to create a service principal.":::
-
-1. From the Azure AD dashboard, select the newly created application, and then select **API permissions**. Assign the application the following delegated permissions, and grant admin consent for the tenant:
-
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-delegated-permissions.png" alt-text="Screenshot of delegated permissions on Power BI and Microsoft Graph.":::
-
-1. From the Azure AD dashboard, select the newly created application, and then select **Authentication**. Under **Supported account types**, select **Accounts in any organizational directory (Any Azure AD directory - Multitenant)**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-multitenant.png" alt-text="Screenshot of account type support multitenant.":::
-
-1. Under **Implicit grant and hybrid flows**, select **ID tokens (used for implicit and hybrid flows)**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-id-token-hybrid-flows.png" alt-text="Screenshot of ID token hybrid flows.":::
-
-1. Under **Advanced settings**, enable **Allow Public client flows**.
-
-1. In the Microsoft Purview Studio, go to the **Data map** in the left menu. Go to **Sources**.
-
-1. Select the registered Power BI source from cross-tenant.
-
-1. Select **+ New scan**.
-
-1. Give your scan a name. Then select the option to include or exclude the personal workspaces.
-
- > [!Note]
- > If you switch the configuration of a scan to include or exclude a personal workspace, you trigger a full scan of the Power BI source.
-
-1. Select **Azure AutoResolveIntegrationRuntime** from the dropdown list.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant.png" alt-text="Screenshot that shows the Power BI scan setup, using Azure integration runtime for cross-tenant.":::
-
-1. For the **Credential**, select **Delegated authentication**, and then select **+ New** to create a new credential.
-
-1. Create a new credential and provide the following required parameters:
-
- - **Name**: Provide a unique name for the credential.
-
- - **Client ID**: Use the service principal client ID (app ID) that you created earlier.
-
- - **User name**: Provide the username of the Power BI administrator that you created earlier.
-
- - **Password**: Select the appropriate Key Vault connection, and the **Secret name** where the Power BI account password was saved earlier.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-delegated-authentication.png" alt-text="Screenshot that shows the Power BI scan setup, using delegated authentication.":::
-
-1. Select **Test connection** before continuing to the next steps.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant-test.png" alt-text="Screenshot that shows the test connection status.":::
-
- If the test fails, select **View Report** to see the detailed status and troubleshoot the problem:
-
-    1. An *Access - Failed* status means that user authentication failed. Validate that the username and password are correct, and review whether the credential contains the correct client (app) ID from the app registration.
-    2. An *Assets (+ lineage) - Failed* status means that authorization between Microsoft Purview and Power BI failed. Make sure that the user is added to the Power BI administrator role and has the proper Power BI license assigned.
-    3. A *Detailed metadata (Enhanced) - Failed* status means that the **Enhance admin APIs responses with detailed metadata** setting is disabled in the Power BI admin portal.
-
-1. Set up a scan trigger. Your options are **Recurring** or **Once**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-
-1. On **Review new scan**, select **Save and run** to launch your scan.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/save-run-power-bi-scan.png" alt-text="Screenshot that shows how to save and run the Power BI source.":::
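Delegated authentication corresponds to the OAuth 2.0 resource owner password credentials (ROPC) flow, which is why the app registration must allow public client flows. A sketch of the token request this flow implies; all values are placeholders, and in practice Microsoft Purview acquires the token for you from the credential you configured:

```python
from urllib.parse import urlencode

# Scope for the Power BI REST APIs when requesting a v2.0 token.
POWER_BI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def build_ropc_token_request(tenant_id: str, client_id: str,
                             username: str, password: str) -> tuple[str, str]:
    """Build (but do not send) the Azure AD v2.0 token endpoint URL and
    form body for a resource owner password credentials request."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "password",       # ROPC grant
        "client_id": client_id,         # app (client) ID from the registration
        "scope": POWER_BI_SCOPE,
        "username": username,           # Power BI admin user's UPN
        "password": password,           # the secret stored in Key Vault
    })
    return url, body

# Placeholder values for illustration only.
url, body = build_ropc_token_request(
    "<powerbi-tenant-id>", "<app-client-id>",
    "admin@contoso.com", "<password-from-key-vault>")
print(url)
```

This also explains two failure modes in the checklist: MFA or conditional access blocks the password grant, and a missing **Allow public client flows** setting rejects the request outright.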
-
-### Create scan for cross-tenant using self-hosted IR with service principal
-
-To create and run a new scan by using the self-hosted integration runtime, perform the following steps:
-
-1. Create an app registration in your Azure AD tenant where Power BI is located. Provide a web URL in the **Redirect URI**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant-app-registration.png" alt-text="Screenshot how to create App in Azure AD for cross tenant.":::
-
-1. Take note of the client ID (app ID).
-
-    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot that shows how to create a service principal.":::
-
-1. From the Azure AD dashboard, select the newly created application, and then select **API permissions**. Assign the application the following delegated permissions:
-
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-spn-api-permissions.png" alt-text="Screenshot of delegated permissions on Microsoft Graph.":::
-
-1. From the Azure AD dashboard, select the newly created application, and then select **Authentication**. Under **Supported account types**, select **Accounts in any organizational directory (Any Azure AD directory - Multitenant)**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-multitenant.png" alt-text="Screenshot of account type support multitenant.":::
-
-1. Under **Implicit grant and hybrid flows**, select **ID tokens (used for implicit and hybrid flows)**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-id-token-hybrid-flows.png" alt-text="Screenshot of ID token hybrid flows.":::
-
-1. Under **Advanced settings**, enable **Allow Public client flows**.
-
-1. In your app registration, under **Certificates & secrets**, create a new client secret and save it securely for the next steps.
-
-1. In the tenant where Microsoft Purview is created, go to the instance of Azure Key Vault.
-
-1. Select **Settings** > **Secrets**, and then select **+ Generate/Import**.
-
-    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot of the instance of Azure Key Vault.":::
-
-1. Enter a name for the secret. For **Value**, type the newly created secret for the app registration. Select **Create** to complete.
-
-    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret-spn.png" alt-text="Screenshot how to generate an Azure Key Vault secret for SPN.":::
-
-1. If your key vault isn't connected to Microsoft Purview yet, you need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-1. In the Microsoft Purview Studio, go to the **Data map** in the left menu. Go to **Sources**.
-
-1. Select the registered Power BI source from cross-tenant.
-
-1. Select **+ New scan**.
-
-1. Give your scan a name. Then select the option to include or exclude the personal workspaces.
-
- > [!Note]
- > If you switch the configuration of a scan to include or exclude a personal workspace, you trigger a full scan of the Power BI source.
-
-1. Select your self-hosted integration runtime from the drop-down list.
-
-1. For the **Credential**, select **Service Principal**, and then select **+ New** to create a new credential.
-
-1. Create a new credential and provide the following required parameters:
-
-    - **Name**: Provide a unique name for the credential.
-    - **Authentication method**: Service principal.
-    - **Tenant ID**: Your Power BI tenant ID.
-    - **Client ID**: Use the service principal client ID (app ID) that you created earlier.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-spn-authentication.png" alt-text="Screenshot of the new credential menu, showing Power BI credential for SPN with all required values supplied.":::
-
-1. Select **Test connection** before continuing to the next steps.
-
- If the test fails, select **View Report** to see the detailed status and troubleshoot the problem:
-
-    1. An *Access - Failed* status means that authentication failed. Validate that the app ID and secret are correct, and review whether the credential contains the correct client (app) ID from the app registration.
-    2. An *Assets (+ lineage) - Failed* status means that authorization between Microsoft Purview and Power BI failed. Make sure that the user is added to the Power BI administrator role and has the proper Power BI license assigned.
-    3. A *Detailed metadata (Enhanced) - Failed* status means that the **Enhance admin APIs responses with detailed metadata** setting is disabled in the Power BI admin portal.
-
-1. Set up a scan trigger. Your options are **Recurring** or **Once**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-
-1. On **Review new scan**, select **Save and run** to launch your scan.
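For the service principal option, the underlying flow is the OAuth 2.0 client credentials grant: the app authenticates with its own secret instead of a user's password. A sketch with placeholder values, shown alongside the delegated flow for contrast; Microsoft Purview performs the equivalent request using the credential you configure:

```python
from urllib.parse import urlencode

# Scope for the Power BI REST APIs when requesting a v2.0 token.
POWER_BI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def build_client_credentials_request(tenant_id: str, client_id: str,
                                     client_secret: str) -> tuple[str, str]:
    """Build (but do not send) the Azure AD v2.0 token endpoint URL and
    form body for a client credentials (service principal) request."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",  # no user involved
        "client_id": client_id,              # app (client) ID
        "client_secret": client_secret,      # the secret stored in Key Vault
        "scope": POWER_BI_SCOPE,
    })
    return url, body

# Placeholder values for illustration only.
url, body = build_client_credentials_request(
    "<powerbi-tenant-id>", "<app-client-id>", "<secret-from-key-vault>")
print(url)
```

Because no user signs in, this flow is unaffected by MFA and conditional access policies, but it does require the service principal to be in the security group allowed to use the read-only admin APIs.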
-
-## Next steps
-
-Now that you've registered your source, see the following guides to learn more about Microsoft Purview and your data.
-
-- [Data estate insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search data catalog](how-to-search-catalog.md)
purview Register Scan Power Bi Tenant Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-power-bi-tenant-troubleshoot.md
- Title: Troubleshoot Power BI tenant scans
-description: This guide describes how to troubleshoot Power BI tenant scans in Microsoft Purview.
- Previously updated: 09/22/2022
-# Troubleshoot Power BI tenant scans in Microsoft Purview
-
-This article explores common troubleshooting methods for scanning Power BI tenants in [Microsoft Purview](overview.md).
-
-## Supported scenarios for Power BI scans
-
-### Same-tenant
-
-|**Scenarios** |**Microsoft Purview public access allowed/denied** |**Power BI public access allowed /denied** | **Runtime option** | **Authentication option** | **Deployment checklist** |
-|||||||
-|Public access with Azure IR |Allowed |Allowed |Azure Runtime | Microsoft Purview Managed Identity | [Review deployment checklist](register-scan-power-bi-tenant.md#deployment-checklist) |
-|Public access with self-hosted IR |Allowed |Allowed |Self-hosted runtime |Delegated authentication / Service principal | [Review deployment checklist](register-scan-power-bi-tenant.md#deployment-checklist) |
-|Private access |Allowed |Denied |Self-hosted runtime |Delegated authentication / Service principal | [Review deployment checklist](register-scan-power-bi-tenant.md#deployment-checklist) |
-|Private access |Denied |Allowed |Self-hosted runtime |Delegated authentication / Service principal | [Review deployment checklist](register-scan-power-bi-tenant.md#deployment-checklist) |
-|Private access |Denied |Denied |Self-hosted runtime |Delegated authentication / Service principal | [Review deployment checklist](register-scan-power-bi-tenant.md#deployment-checklist) |
-
-### Cross-tenant
-
-|**Scenarios** |**Microsoft Purview public access allowed/denied** |**Power BI public access allowed /denied** | **Runtime option** | **Authentication option** | **Deployment checklist** |
-|||||||
-|Public access with Azure IR |Allowed |Allowed |Azure runtime |Delegated Authentication | [Deployment checklist](register-scan-power-bi-tenant-cross-tenant.md#deployment-checklist) |
-|Public access with Self-hosted IR |Allowed |Allowed |Self-hosted runtime |Delegated authentication / Service principal | [Deployment checklist](register-scan-power-bi-tenant-cross-tenant.md#deployment-checklist) |
-
-## Troubleshooting tips
-
-If delegated auth is used:
-
-- Check your key vault. Make sure there are no typos in the password.
-- Assign a proper [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) to the Power BI administrator user.
-- Validate that the user is assigned to the Power BI administrator role.
-- If the user is recently created, make sure the password is reset successfully and the user can successfully initiate the session.
-
-## My schema is not showing up after scanning
-
-It can take some time for the schema to finish the scanning and ingestion process, depending on the size of your Power BI tenant. Currently, if you have a large Power BI tenant, this process could take a few hours.
-
-## Error code: Test connection failed - AASDST50079
-
-- **Message**: `Failed to get access token with given credential to access Power BI tenant. Authentication type PowerBIDelegated Message: AASDST50079 Due to a configuration change made by your administrator or because you moved to a new location, you must enroll in multi-factor authentication.`
-
-- **Cause**: Authentication is interrupted, due to a multi-factor authentication requirement for the Power BI admin user.
-
-- **Recommendation**: Disable the multi-factor authentication requirement and exclude the user from conditional access policies. Sign in to the Power BI dashboard with the user to validate that the user can successfully sign in to the application.
-
-## Error code: Test connection failed - AASTS70002
-
-- **Message**: `Failed to access token with given credential to access Power BI tenant. Authentication type: PowerBiDelegated Message AASTS70002: The request body must contain the following parameter: 'client_assertion' or 'client_secret'.`
-
-- **Cause**: If delegated authentication is used, the problem could be a misconfiguration on the app registration.
-
-- **Recommendation**: Review the Power BI deployment checklist based on your scenario.
-
-## Error code: Test connection failed - Detailed metadata
-
-- **Message**: `Failed to enable the PowerBI administrator API to fetch basic metadata and lineage.`
-
-- **Cause**: **Allow service principals to use read-only Power BI admin APIs** is disabled.
-
-- **Recommendation**: In the Power BI admin portal, enable **Allow service principals to use read-only Power BI admin APIs**.
-
-## Issue: Test Connection succeeded. No assets discovered.
-
-- **Message**: N/A
-
-- **Cause**: This problem can occur in same-tenant or cross-tenant scenarios, due to networking or authentication issues.
-
-- **Recommendation**:
-  - If delegated authentication is used, validate the Power BI admin user's sign-in logs in Azure Active Directory to make sure sign-in is successful. Sign in to the Power BI dashboard with the user to validate that the user can successfully sign in to the application.
-
-  - Review your network configurations. A private endpoint is required for **both** the Power BI tenant and the Microsoft Purview account, if either of these services is configured to block public access.
-
-## Next steps
-
-Follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Power Bi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-power-bi-tenant.md
- Title: Connect to and manage a Power BI tenant same tenant
-description: This guide describes how to connect to a Power BI tenant in the same tenant as Microsoft Purview, and use Microsoft Purview's features to scan and manage your Power BI tenant source.
- Previously updated: 07/17/2023
-# Connect to and manage a Power BI tenant in Microsoft Purview (Same Tenant)
-
-This article outlines how to register a Power BI tenant in a **same-tenant scenario**, and how to authenticate and interact with the tenant in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#deployment-checklist)| [Yes](#deployment-checklist)| Yes | No | No |No| No| [Yes](how-to-lineage-powerbi.md)| No |
-
-When scanning a Power BI source, Microsoft Purview supports:
-
-- Extracting technical metadata, including:
-
- - Workspaces
- - Dashboards
- - Reports
- - Datasets including the tables and columns
- - Dataflows
- - Datamarts
-
-- Fetching static lineage on asset relationships among the above Power BI artifacts, as well as external data source assets. Learn more from [Power BI lineage](how-to-lineage-powerbi.md).
-
-For a list of metadata available for Power BI, see our [available metadata documentation](available-metadata.md).
-
-### Supported scenarios for Power BI scans
-
-|**Scenarios** |**Microsoft Purview public access allowed/denied** |**Power BI public access allowed/denied** | **Runtime option** | **Authentication option** | **Deployment checklist** |
-|||||||
-|Public access with Azure IR |Allowed |Allowed |Azure Runtime | Microsoft Purview Managed Identity | [Review deployment checklist](#deployment-checklist) |
-|Public access with Self-hosted IR |Allowed |Allowed |Self-hosted runtime |Delegated authentication / Service principal| [Review deployment checklist](#deployment-checklist) |
-|Private access |Allowed |Denied |Self-hosted runtime |Delegated authentication / Service principal| [Review deployment checklist](#deployment-checklist) |
-|Private access |Denied |Allowed |Self-hosted runtime |Delegated authentication / Service principal| [Review deployment checklist](#deployment-checklist) |
-|Private access |Denied |Denied |Self-hosted runtime |Delegated authentication / Service principal| [Review deployment checklist](#deployment-checklist) |
-
-### Known limitations
-
-- If the Microsoft Purview account or Power BI tenant is protected behind a private endpoint, a self-hosted runtime is the only option to scan.
-- Delegated authentication and service principal are the only supported authentication options when a self-hosted integration runtime is used during the scan.
-- You can create only one scan for a Power BI data source registered in your Microsoft Purview account.
-- If the Power BI dataset schema isn't shown after a scan, it's due to one of the current limitations of the [Power BI metadata scanner](/power-bi/admin/service-admin-metadata-scanning).
-- Empty workspaces are skipped.
-- The payload is currently limited to 2 MB and 800 columns when scanning an asset.
-
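The payload limitation can be turned into a quick pre-check when auditing large assets before a scan (a sketch only; the constants restate the limits above, and `exceeds_scan_limits` is our own illustrative name, not a Purview API):

```python
# Current scan payload limits, as stated in the known limitations above.
MAX_PAYLOAD_BYTES = 2 * 1024 * 1024  # 2 MB
MAX_COLUMNS = 800


def exceeds_scan_limits(payload_bytes: int, column_count: int) -> bool:
    """Return True if an asset would exceed the current scan payload limits."""
    return payload_bytes > MAX_PAYLOAD_BYTES or column_count > MAX_COLUMNS
```

For example, a dataset with 900 columns would exceed the column limit even if its metadata payload is small.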
-## Prerequisites
-
-Before you start, make sure you have the following prerequisites:
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-- An active [Microsoft Purview account](create-catalog-portal.md).
-
-## Authentication options
-
-- Managed Identity
-- Delegated Authentication
-- Service Principal
-
-## Deployment checklist
-Use any of the following deployment checklists during setup or for troubleshooting, based on your scenario:
-
-# [Public access with Azure IR](#tab/Scenario1)
-### Scan same-tenant Power BI using Azure IR and Managed Identity in public network
-
-1. Make sure Power BI and Microsoft Purview accounts are in the same tenant.
-
-1. Make sure Power BI tenant ID is entered correctly during the registration.
-
-1. Make sure your [Power BI Metadata model is up to date by enabling metadata scanning.](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning)
-
-1. From the Azure portal, validate that the Microsoft Purview account network is set to allow public access.
-
-1. From Power BI tenant Admin Portal, make sure Power BI tenant is configured to allow public network.
-
-1. In Azure Active Directory tenant, create a security group.
-
-1. From Azure Active Directory tenant, make sure [Microsoft Purview account MSI is member of the new security group](#authenticate-to-power-bi-tenant).
-
-1. On the Power BI Tenant Admin portal, validate if [Allow service principals to use read-only Power BI admin APIs](#associate-the-security-group-with-power-bi-tenant) is enabled for the new security group.
-
-# [Public access with Self-hosted IR](#tab/Scenario2)
-### Scan same-tenant Power BI using self-hosted IR with Delegated Authentication or Service Principal in public network
-
-1. Make sure Power BI and Microsoft Purview accounts are in the same tenant.
-
-1. Make sure Power BI tenant ID is entered correctly during the registration.
-
-1. Make sure your [Power BI Metadata model is up to date by enabling metadata scanning.](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning)
-1. From the Azure portal, validate that the Microsoft Purview account network is set to allow public access.
-
-1. From Power BI tenant Admin Portal, make sure Power BI tenant is configured to allow public network.
-
-1. Check your Azure Key Vault to make sure:
- 1. There are no typos in the password or secret.
- 2. Microsoft Purview Managed Identity has get/list access to secrets.
-
-1. Review your credential to validate:
- 1. Client ID matches _Application (Client) ID_ of the app registration.
- 2. Username includes the user principal name such as `johndoe@contoso.com`.
-
-1. Validate App registration settings to make sure:
- 1. App registration exists in your Azure Active Directory tenant.
-
 2. If service principal is used, under **API permissions**, the following **delegated permissions** are assigned for these APIs:
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
 3. If delegated authentication is used, under **API permissions**, the following **delegated permissions** are assigned and **admin consent for the tenant** is granted for these APIs:
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
 4. Under **Authentication**, **Allow public client flows** is enabled.
-
-2. If delegated authentication is used, validate Power BI admin user settings to make sure:
- 1. User is assigned to Power BI Administrator role.
- 2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If user is recently created, sign in with the user at least once to make sure password is reset successfully and user can successfully initiate the session.
 4. No MFA or Conditional Access policies are enforced on the user.
-
-3. Validate Self-hosted runtime settings:
- 1. Latest version of [Self-hosted runtime](https://www.microsoft.com/download/details.aspx?id=39717) is installed on the VM.
- 2. Network connectivity from Self-hosted runtime to Power BI tenant is enabled. The following endpoints must be reachable from self-hosted runtime VM:
- - `*.powerbi.com`
- - `*.analysis.windows.net`
-
- 3. Network connectivity from Self-hosted runtime to Microsoft services is enabled.
- 4. [JDK 8 or later](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html) is installed. Restart the machine after you newly install the JDK for it to take effect.
-
-1. In Azure Active Directory tenant, create a security group.
-
-1. From Azure Active Directory tenant, make sure [Service Principal is member of the new security group](#authenticate-to-power-bi-tenant).
-
-1. On the Power BI Tenant Admin portal, validate if [Allow service principals to use read-only Power BI admin APIs](#associate-the-security-group-with-power-bi-tenant) is enabled for the new security group.
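The endpoint reachability items in this checklist can be probed from the self-hosted runtime VM with a short script (a sketch: the concrete hostnames below stand in for the wildcard domains above and may differ for your tenant; `is_reachable` is our own helper name):

```python
import socket

# Representative hosts for the wildcard domains listed above
# (*.powerbi.com, *.analysis.windows.net); adjust for your tenant.
ENDPOINTS = [("api.powerbi.com", 443), ("api.analysis.windows.net", 443)]


def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Try a TCP connection to host:port and report success."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, refusal, or timeout
        return False


if __name__ == "__main__":
    for host, port in ENDPOINTS:
        status = "OK" if is_reachable(host, port) else "UNREACHABLE"
        print(f"{host}:{port} {status}")
```

An `UNREACHABLE` result points at a firewall, proxy, or DNS issue between the runtime VM and the Power BI endpoints.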
-
-# [Private access](#tab/Scenario3)
-### Scan same-tenant Power BI using self-hosted IR with Delegated Authentication or Service Principal in a private network
-
-1. Make sure Power BI and Microsoft Purview accounts are in the same tenant.
-
-1. Make sure Power BI tenant ID is entered correctly during the registration.
-
-1. Make sure your [Power BI Metadata model is up to date by enabling metadata scanning.](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning)
-
-1. Check your Azure Key Vault to make sure:
- 1. There are no typos in the password.
- 2. Microsoft Purview Managed Identity has get/list access to secrets.
-
-1. Review your credential to validate:
- 1. Client ID matches _Application (Client) ID_ of the app registration.
- 2. Username includes the user principal name such as `johndoe@contoso.com`.
-
-1. If Delegated Authentication is used, validate Power BI admin user settings to make sure:
- 1. User is assigned to Power BI Administrator role.
- 2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If user is recently created, sign in with the user at least once to make sure password is reset successfully and user can successfully initiate the session.
 4. No MFA or Conditional Access policies are enforced on the user.
-
-1. Validate Self-hosted runtime settings:
- 1. Latest version of [Self-hosted runtime](https://www.microsoft.com/download/details.aspx?id=39717) is installed on the VM.
- 2. [JDK 8 or later](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html) is installed. Restart the machine after you newly install the JDK for it to take effect.
-
-1. Validate App registration settings to make sure:
- 1. App registration exists in your Azure Active Directory tenant.
-
 2. If service principal is used, under **API permissions**, the following **delegated permissions** are assigned for these APIs:
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
 3. If delegated authentication is used, under **API permissions**, the following **delegated permissions** are assigned and **admin consent for the tenant** is granted for these APIs:
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
 4. Under **Authentication**, **Allow public client flows** is enabled.
-
-2. Review your network configuration and validate that:
 1. If your Power BI tenant doesn't allow public access, a [private endpoint for the Power BI tenant](/power-bi/enterprise/service-security-private-links) is deployed.
- 2. All required [private endpoints for Microsoft Purview](./catalog-private-link-end-to-end.md) are deployed.
- 3. Network connectivity from Self-hosted runtime to Power BI tenant is enabled. The following endpoints must be reachable from self-hosted runtime VM:
- - `*.powerbi.com`
- - `*.analysis.windows.net`
-
- 4. Network connectivity from Self-hosted runtime to Microsoft services is enabled through private network.
-
-1. In Azure Active Directory tenant, create a security group.
-
-1. From Azure Active Directory tenant, make sure [Service Principal is member of the new security group](#authenticate-to-power-bi-tenant).
-
-1. On the Power BI Tenant Admin portal, validate if [Allow service principals to use read-only Power BI admin APIs](#associate-the-security-group-with-power-bi-tenant) is enabled for the new security group.
--
-## Register Power BI tenant
-
-This section describes how to register a Power BI tenant in Microsoft Purview for the same-tenant scenario.
-
-1. Select the **Data Map** on the left navigation.
-
-1. Then select **Register**.
-
- Select **Power BI** as your data source.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/select-power-bi-data-source.png" alt-text="Image showing the list of data sources available to choose.":::
-
-1. Give your Power BI instance a friendly name.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-friendly-name.png" alt-text="Image showing Power BI data source-friendly name.":::
-
 The name must be between 3 and 63 characters long, and must contain only letters, numbers, underscores, and hyphens. Spaces aren't allowed.
-
- By default, the system will find the Power BI tenant that exists in the same Azure Active Directory tenant.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-datasource-registered.png" alt-text="Image showing the registered Power BI data source.":::
-
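The friendly-name rule above can be captured in a quick validation helper (illustrative only; `validate_source_name` is our own name, not a Purview API):

```python
import re

# Purview data source friendly-name rule described above:
# 3-63 characters; letters, numbers, underscores, and hyphens only; no spaces.
_NAME_RE = re.compile(r"^[A-Za-z0-9_-]{3,63}$")


def validate_source_name(name: str) -> bool:
    """Return True if the name satisfies the friendly-name rule."""
    return _NAME_RE.fullmatch(name) is not None
```

For example, `power-bi_prod01` passes, while `ab` (too short) and `has space` do not.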
-## Scan same-tenant Power BI
-
-> [!TIP]
-> To troubleshoot any issues with scanning:
-> 1. Confirm you have completed the [**deployment checklist for your scenario**](#deployment-checklist).
-> 1. Review our [**scan troubleshooting documentation**](register-scan-power-bi-tenant-troubleshoot.md).
-
-### Authenticate to Power BI tenant
-
-In Azure Active Directory Tenant, where Power BI tenant is located:
-
-1. In the [Azure portal](https://portal.azure.com), search for **Azure Active Directory**.
-
-2. Create a new security group in your Azure Active Directory, by following [Create a basic group and add members using Azure Active Directory](../active-directory/fundamentals/active-directory-groups-create-azure-portal.md).
-
- > [!Tip]
- > You can skip this step if you already have a security group you want to use.
-
-3. Select **Security** as the **Group Type**.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/security-group.png" alt-text="Screenshot of security group type.":::
-
-4. Add relevant user to the security group:
-
- - If you are using **Managed Identity** as authentication method, add your Microsoft Purview managed identity to this security group. Select **Members**, then select **+ Add members**.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/add-group-member.png" alt-text="Screenshot of how to add the catalog's managed instance to group.":::
-
- - If you are using **delegated authentication** or **service principal** as authentication method, add your **service principal** to this security group. Select **Members**, then select **+ Add members**.
-
-5. Search for your Microsoft Purview managed identity or service principal and select it.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/add-catalog-to-group-by-search.png" alt-text="Screenshot showing how to add catalog by searching for its name.":::
-
- You should see a success notification showing you that it was added.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/success-add-catalog-msi.png" alt-text="Screenshot showing successful addition of catalog managed identity.":::
-
-### Associate the security group with Power BI tenant
-
-1. Log into the [Power BI admin portal](https://app.powerbi.com/admin-portal/tenantSettings).
-
-2. Select the **Tenant settings** page.
-
- > [!Important]
- > You need to be a Power BI Admin to see the tenant settings page.
-
-3. Select **Admin API settings** > **Allow service principals to use read-only Power BI admin APIs (Preview)**.
-
-4. Select **Specific security groups**.
-
- :::image type="content" source="./media/setup-power-bi-scan-PowerShell/allow-service-principals-power-bi-admin.png" alt-text="Image showing how to allow service principals to get read-only Power BI admin API permissions.":::
-
-5. Select **Admin API settings** > **Enhance admin APIs responses with detailed metadata** and **Enhance admin APIs responses with DAX and mashup expressions**, and enable the toggles to allow the Microsoft Purview Data Map to automatically discover the detailed metadata of Power BI datasets as part of its scans.
-
- > [!IMPORTANT]
 > After you update the Admin API settings on your Power BI tenant, wait about 15 minutes before registering a scan and testing the connection.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-sub-artifacts.png" alt-text="Image showing the Power BI admin portal config to enable subartifact scan.":::
-
- > [!Caution]
- > When you allow the security group you created (that has your Microsoft Purview managed identity as a member) to use read-only Power BI admin APIs, you also allow it to access the metadata (e.g. dashboard and report names, owners, descriptions, etc.) for all of your Power BI artifacts in this tenant. Once the metadata has been pulled into the Microsoft Purview, Microsoft Purview's permissions, not Power BI permissions, determine who can see that metadata.
-
- > [!Note]
- > You can remove the security group from your developer settings, but the metadata previously extracted won't be removed from the Microsoft Purview account. You can delete it separately, if you wish.
-
-### Create scan for same-tenant Power BI using Azure IR and Managed Identity
-This scenario is suitable if both the Microsoft Purview account and the Power BI tenant are configured to allow public access in their network settings.
-
-To create and run a new scan, do the following:
-
-1. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu.
-
-1. Navigate to **Sources**.
-
-1. Select the registered Power BI source.
-
-1. Select **+ New scan**.
-
-2. Give your scan a name. Then select the option to include or exclude the personal workspaces.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-setup.png" alt-text="Image showing Power BI scan setup.":::
-
- > [!Note]
- > Switching the configuration of a scan to include or exclude a personal workspace will trigger a full scan of Power BI source.
-
-3. Select **Test Connection** before continuing to the next steps. If **Test Connection** fails, select **View Report** to see the detailed status and troubleshoot the problem:
 1. Access - A Failed status means the user authentication failed. Scans using managed identity always pass, because no user authentication is required.
 2. Assets (+ lineage) - A Failed status means the authorization between Microsoft Purview and Power BI failed. Make sure the Microsoft Purview managed identity is added to the security group associated in the Power BI admin portal.
 3. Detailed metadata (Enhanced) - A Failed status means the **Enhance admin APIs responses with detailed metadata** setting is disabled in the Power BI admin portal.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-test-connection-status-report.png" alt-text="Screenshot of test connection status report page.":::
-
-4. Set up a scan trigger. Your options are **Recurring** and **Once**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-
-5. On **Review new scan**, select **Save and run** to launch your scan.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/save-run-power-bi-scan-managed-identity.png" alt-text="Screenshot of Save and run Power BI source using Managed Identity.":::
-
-### Create scan for same-tenant using self-hosted IR with service principal
-
-This scenario can be used when the Microsoft Purview account, the Power BI tenant, or both are configured to use a private endpoint and deny public access. This option is also applicable if both Microsoft Purview and the Power BI tenant are configured to allow public access.
-
-For more information related to Power BI network, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links).
-
-For more information about Microsoft Purview network settings, see [Use private endpoints for your Microsoft Purview account](catalog-private-link.md).
-
-To create and run a new scan, do the following:
-
-1. In the [Azure portal](https://portal.azure.com), select **Azure Active Directory** and create an App Registration in the tenant. Provide a web URL in the **Redirect URI**. [For information about the Redirect URI see this documentation from Azure Active Directory](../active-directory/develop/reply-url.md).
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-app-registration.png" alt-text="Screenshot how to create App in Azure AD.":::
-
-2. Take note of the Client ID (App ID).
-
    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot showing how to create a service principal.":::
-
-1. From the Azure Active Directory dashboard, select the newly created application, and then select **App registration**. From **API Permissions**, assign the application the following delegated permissions:
-
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-spn-api-permissions.png" alt-text="Screenshot of delegated permissions on Microsoft Graph.":::
-
-1. Under **Advanced settings**, enable **Allow Public client flows**.
-
-2. Under **Certificates & secrets**, create a new secret and save it securely for next steps.
-
-3. In Azure portal, navigate to your Azure key vault.
-
-4. Select **Settings** > **Secrets** and select **+ Generate/Import**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot how to navigate to Azure Key Vault.":::
-
-5. Enter a name for the secret and for **Value**, type the newly created secret for the App registration. Select **Create** to complete.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret-spn.png" alt-text="Screenshot how to generate an Azure Key Vault secret for SPN.":::
-
-6. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-7. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu.
-
-8. Navigate to **Sources**.
-
-9. Select the registered Power BI source.
-
-10. Select **+ New scan**.
-
-11. Give your scan a name. Then select the option to include or exclude the personal workspaces.
-
- >[!Note]
- > Switching the configuration of a scan to include or exclude a personal workspace will trigger a full scan of Power BI source.
-
-12. Select your self-hosted integration runtime from the drop-down list.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-shir.png" alt-text="Image showing Power BI scan setup using SHIR for same tenant.":::
-
-13. For the **Credential**, select **service principal** and select **+ New** to create a new credential.
-
-14. Create a new credential and provide required parameters:
-
- - **Name**: Provide a unique name for credential
- - **Authentication method**: Service principal
- - **Tenant ID**: Your Power BI tenant ID
- - **Client ID**: Use Service Principal Client ID (App ID) you created earlier
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-spn-authentication.png" alt-text="Screenshot of the new credential menu, showing Power BI credential for SPN with all required values supplied.":::
-
-15. Select **Test Connection** before continuing to the next steps. If **Test Connection** fails, select **View Report** to see the detailed status and troubleshoot the problem:
 1. Access - A Failed status means the user authentication failed. Scans using managed identity always pass, because no user authentication is required.
 2. Assets (+ lineage) - A Failed status means the authorization between Microsoft Purview and Power BI failed. Make sure the Microsoft Purview managed identity is added to the security group associated in the Power BI admin portal.
 3. Detailed metadata (Enhanced) - A Failed status means the **Enhance admin APIs responses with detailed metadata** setting is disabled in the Power BI admin portal.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-test-connection-status-report.png" alt-text="Screenshot of test connection status report page.":::
-
-16. Set up a scan trigger. Your options are **Recurring** and **Once**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-
-17. On **Review new scan**, select **Save and run** to launch your scan.
-
-### Create scan for same-tenant using self-hosted IR with delegated authentication
-
-This scenario can be used when the Microsoft Purview account, the Power BI tenant, or both are configured to use a private endpoint and deny public access. This option is also applicable if both Microsoft Purview and the Power BI tenant are configured to allow public access.
-
-For more information related to Power BI network, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links).
-
-For more information about Microsoft Purview network settings, see [Use private endpoints for your Microsoft Purview account](catalog-private-link.md).
-
-To create and run a new scan, do the following:
-
-1. Create a user account in the Azure Active Directory tenant and assign the user the **Power BI Administrator** Azure Active Directory role. Take note of the username, and sign in to change the password.
-
-1. Assign a proper Power BI license to the user.
-
-1. Navigate to your Azure key vault.
-
-1. Select **Settings** > **Secrets** and select **+ Generate/Import**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot how to navigate to Azure Key Vault.":::
-
-1. Enter a name for the secret and for **Value**, type the newly created password for the Azure AD user. Select **Create** to complete.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot how to generate an Azure Key Vault secret.":::
-
-1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-
-1. Create an App Registration in your Azure Active Directory tenant. Provide a web URL in the **Redirect URI**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-app-registration.png" alt-text="Screenshot how to create App in Azure AD.":::
-
-2. Take note of the Client ID (App ID).
-
    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot showing how to create a service principal.":::
-
-1. From the Azure Active Directory dashboard, select the newly created application, and then select **App registration**. Assign the application the following delegated permissions, and grant admin consent for the tenant:
-
- - Power BI Service Tenant.Read.All
- - Microsoft Graph openid
- - Microsoft Graph User.Read
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-delegated-permissions.png" alt-text="Screenshot of delegated permissions on Power BI Service and Microsoft Graph.":::
-
-1. Under **Advanced settings**, enable **Allow Public client flows**.
-
-2. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu.
-
-1. Navigate to **Sources**.
-
-1. Select the registered Power BI source.
-
-1. Select **+ New scan**.
-
-1. Give your scan a name. Then select the option to include or exclude the personal workspaces.
-
- >[!Note]
- > Switching the configuration of a scan to include or exclude a personal workspace will trigger a full scan of Power BI source.
-
-1. Select your self-hosted integration runtime from the drop-down list.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-shir.png" alt-text="Image showing Power BI scan setup using SHIR for same tenant.":::
-
-1. For the **Credential**, select **Delegated authentication** and select **+ New** to create a new credential.
-
-1. Create a new credential and provide required parameters:
-
- - **Name**: Provide a unique name for credential
- - **Authentication method**: Delegated auth
- - **Client ID**: Use Service Principal Client ID (App ID) you created earlier
- - **User name**: Provide the username of Power BI Administrator you created earlier
- - **Password**: Select the appropriate Key vault connection and the **Secret name** where the Power BI account password was saved earlier.
-
    :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-delegated-authentication.png" alt-text="Screenshot of the new credential menu, showing Power BI credential with all required values supplied.":::
-
-2. Select **Test Connection** before continuing to the next steps. If **Test Connection** fails, select **View Report** to see the detailed status and troubleshoot the problem:
 1. Access - A Failed status means the user authentication failed. Scans using managed identity always pass, because no user authentication is required.
 2. Assets (+ lineage) - A Failed status means the authorization between Microsoft Purview and Power BI failed. Make sure the Microsoft Purview managed identity is added to the security group associated in the Power BI admin portal.
 3. Detailed metadata (Enhanced) - A Failed status means the **Enhance admin APIs responses with detailed metadata** setting is disabled in the Power BI admin portal.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-test-connection-status-report.png" alt-text="Screenshot of test connection status report page.":::
-
-3. Set up a scan trigger. Your options are **Recurring** and **Once**.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-
-4. On **Review new scan**, select **Save and run** to launch your scan.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/save-run-power-bi-scan.png" alt-text="Screenshot of Save and run Power BI source.":::
-
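Under the hood, delegated authentication corresponds to an OAuth2 resource owner password credentials (ROPC) token request against the Azure AD token endpoint, which is why **Allow public client flows** and the delegated `Tenant.Read.All` permission are required. A minimal sketch of assembling that request (request assembly only, no network call; all values are placeholders, and `build_ropc_token_request` is our own name):

```python
from urllib.parse import urlencode


def build_ropc_token_request(tenant_id: str, client_id: str,
                             username: str, password: str):
    """Assemble the ROPC token request used by delegated authentication.

    Returns the token endpoint URL and the form-encoded body. Actually
    sending the request (e.g. with urllib.request) is left out of this sketch.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "password",  # resource owner password credentials
        "client_id": client_id,
        "username": username,
        "password": password,
        # Delegated Power BI scope granted to the app registration above.
        "scope": "https://analysis.windows.net/powerbi/api/Tenant.Read.All openid",
    })
    return url, body
```

Because ROPC sends the user's password directly, MFA or Conditional Access policies on the account will cause the request to fail, matching the checklist items above.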
-## Next steps
-
-Now that you've registered your source, follow these guides to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-salesforce.md
- Title: Connect to and manage Salesforce
-description: This guide describes how to connect to Salesforce in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Salesforce source.
----- Previously updated : 07/18/2023---
-# Connect to and manage Salesforce in Microsoft Purview
-
-This article outlines how to register Salesforce, and how to authenticate and interact with Salesforce in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No | No | No | No |
-
-When scanning a Salesforce source, Microsoft Purview supports extracting technical metadata, including:
-
-- Organization
-- Objects, including the fields, foreign keys, and unique_constraints
-
-When setting up a scan, you can choose to scan an entire Salesforce organization, or scope the scan to a subset of objects matching the given name(s) or name pattern(s).
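Scoping behaves like shell-style name matching. A rough model of the selection (our own illustration, not Purview's implementation):

```python
from fnmatch import fnmatch


def scope_objects(object_names, patterns):
    """Return the Salesforce objects whose names match any given pattern.

    Patterns use shell-style wildcards (e.g. "*__c" for custom objects);
    a plain name matches itself exactly.
    """
    return [name for name in object_names
            if any(fnmatch(name, p) for p in patterns)]
```

For example, scoping `["Account", "Contact", "Invoice__c"]` with patterns `["Account", "*__c"]` keeps `Account` and `Invoice__c`.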
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-- An active [Microsoft Purview account](create-catalog-portal.md).
-
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-- A Salesforce connected app, which will be used to access your Salesforce information.
- - If you need to create a connected app, you can follow the [Salesforce documentation](https://help.salesforce.com/s/articleView?id=sf.connected_app_create_basics.htm&type=5).
- - You will need to [enable OAuth for your Salesforce application](https://help.salesforce.com/s/articleView?id=sf.connected_app_create_api_integration.htm&type=5).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self hosted integration runtime to connect to it**.
--- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
    - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the installation to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
- - Ensure the self-hosted integration runtime machine's IP is within the [trusted IP ranges for your organization](https://help.salesforce.com/s/articleView?id=sf.security_networkaccess.htm&type=5) set on Salesforce.
-
-### Required permissions for scan
-
-If users will be submitting Salesforce Documents, certain security settings must be configured to allow this access on Standard Objects and Custom Objects. To configure permissions:
-
-- Within Salesforce, select Setup and then select Manage Users.
-- Under the Manage Users tree, select Profiles.
-- Once the Profiles appear on the right, select the Profile you want to edit and select the Edit link next to the corresponding profile.
-
-For Standard Objects, ensure that the "Documents" section has the Read permission selected. For Custom Objects, ensure that the Read permission is selected for each custom object.
-
-## Register
-
-This section describes how to register Salesforce in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new Salesforce source in your data catalog, follow these steps:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On Register sources, select **Salesforce**. Select **Continue**.
-
-On the **Register sources (Salesforce)** screen, follow these steps:
-
-1. Enter a **Name** for the data source as it will be listed within the Catalog.
-
-1. Enter the Salesforce login endpoint URL as **Domain URL**. For example, `https://login.salesforce.com`. You can use your company's instance URL (such as `https://na30.salesforce.com`) or My Domain URL (such as `https://myCompanyName.my.salesforce.com/`).
-
-1. Select a collection or create a new one (optional).
-
-1. Finish to register the data source.
-
- :::image type="content" source="media/register-scan-salesforce/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan Salesforce to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-Microsoft Purview uses Salesforce REST API version 41.0 to extract metadata, including REST requests like the 'Describe Global' URI (/v41.0/sobjects/), 'sObject Basic Information' URI (/v41.0/sobjects/sObject/), and 'SOQL Query' URI (/v41.0/query?).
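To illustrate, the request URIs above can be composed from the API version and a Salesforce instance URL. The following is a minimal sketch, assuming the standard Salesforce `/services/data/` base path; the helper names and instance URL are hypothetical, not part of Microsoft Purview:

```python
from urllib.parse import quote

# Illustrative sketch of composing the Salesforce REST API v41.0 request
# URIs described above. Helper names are hypothetical.
API_VERSION = "41.0"

def describe_global_uri(instance_url: str) -> str:
    # 'Describe Global': lists all objects in the organization.
    return f"{instance_url}/services/data/v{API_VERSION}/sobjects/"

def sobject_basic_info_uri(instance_url: str, sobject: str) -> str:
    # 'sObject Basic Information': metadata for a single object.
    return f"{instance_url}/services/data/v{API_VERSION}/sobjects/{sobject}/"

def soql_query_uri(instance_url: str, soql: str) -> str:
    # 'SOQL Query': the query string must be URL-encoded.
    return f"{instance_url}/services/data/v{API_VERSION}/query?q={quote(soql)}"
```

For example, `describe_global_uri("https://myCompanyName.my.salesforce.com")` yields the object-listing endpoint for that instance.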
-
-### Authentication for a scan
-
-The supported authentication type for a Salesforce source is **Consumer key authentication**.
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
    1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered Salesforce source.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Consumer key** while creating a credential.
- * Provide the username of the user that the [connected app](#prerequisites) is imitating in the User name input field.
- * Store the password of the user that the connected app is imitating in an Azure Key Vault secret.
- * If your self-hosted integration runtime machine's IP is within the [trusted IP ranges for your organization](https://help.salesforce.com/s/articleView?id=sf.security_networkaccess.htm&type=5) set on Salesforce, provide just the password of the user.
- * Otherwise, **concatenate the password and security token as the value of the secret**. The security token is an automatically generated key that must be added to the end of the password when logging in to Salesforce from an untrusted network. Learn more about how to [get or reset a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm).
- * Provide the consumer key from the connected app definition. You can find it on the connected app's Manage Connected Apps page or from the connected app's definition.
    * Store the consumer secret from the connected app definition in an Azure Key Vault secret. You can find it along with the consumer key.
-
- 1. **Objects**: Provide a list of object names to scope your scan. For example, `object1; object2`. An empty list means retrieving all available objects. You can specify object names as a wildcard pattern. For example, `topic?`, `*topic*`, or `topic_?,*topic*`.
-
    1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This is dependent on the size of the Salesforce source to be scanned.
-
- > [!Note]
    > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-salesforce/scan.png" alt-text="scan Salesforce" border="true":::
-
-1. Select **Test connection** to validate the settings (available when using Azure Integration Runtime).
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
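The credential secret described in the steps above is either the user's password alone (when the runtime's IP is within Salesforce's trusted ranges) or the password with the security token appended. A minimal sketch of composing that value, using a hypothetical helper name:

```python
# Hypothetical sketch (not part of Microsoft Purview) of composing the
# Azure Key Vault secret value for Consumer key authentication:
# the password alone when the IP is trusted, otherwise the password
# with the Salesforce security token concatenated to the end.
def compose_secret(password: str, security_token: str, ip_is_trusted: bool) -> str:
    if ip_is_trusted:
        return password
    # Salesforce expects the security token appended directly to the password.
    return password + security_token
```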
--
-## Next steps
-
-Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Sap Bw https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sap-bw.md
- Title: Connect to and manage an SAP Business Warehouse
-description: This guide describes how to connect to SAP Business Warehouse in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP BW source.
-
-Previously updated: 04/20/2023
-
-# Connect to and manage SAP Business Warehouse in Microsoft Purview
-
-This article outlines how to register SAP Business Warehouse (BW), and how to authenticate and interact with SAP BW in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register)| [Yes](#scan)| No | No | No | No| No|No|No|
-
-The supported SAP BW versions are 7.3 to 7.5. SAP BW/4HANA isn't supported.
-
-When scanning an SAP BW source, Microsoft Purview supports extracting technical metadata including:
-
-- Instance
-- InfoArea
-- InfoSet
-- InfoSet query
-- Classic InfoSet
-- InfoObject including unit of measurement, time characteristic, navigation attribute, data packet characteristic, currency, characteristic, field, and key figure
-- Data store object (DSO)
-- Aggregation level
-- Open hub destination
-- Query including the query condition
-- Query view
-- HybridProvider
-- MultiProvider
-- InfoCube
-- Aggregate
-- Dimension
-- Time dimension
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview resource](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.15.8079.1.
-
    * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the installation to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * The connector reads metadata from SAP using the [SAP Java Connector (JCo)](https://support.sap.com/en/product/connectors/jco.html) 3.0 API. Make sure the Java Connector is available on your machine where self-hosted integration runtime is installed. Make sure that you use the correct JCo distribution for your environment, and the **sapjco3.jar** and **sapjco3.dll** files are available.
-
- > [!Note]
- > The driver should be accessible to all accounts in the machine. Don't put it in a path under user account.
-
- * Self-hosted integration runtime communicates with the SAP server over dispatcher port 32NN and gateway port 33NN, where NN is your SAP instance number from 00 to 99. Make sure the outbound traffic is allowed on your firewall.
-
-* Deploy the metadata extraction ABAP function module on the SAP server by following the steps mentioned in [ABAP functions deployment guide](abap-functions-deployment-guide.md). You need an ABAP developer account to create the RFC function module on the SAP server. For scan execution, the user account requires sufficient permissions to connect to the SAP server and execute the following RFC function modules:
-
- * STFC_CONNECTION (check connectivity)
- * RFC_SYSTEM_INFO (check system information)
- * OCS_GET_INSTALLED_COMPS (check software versions)
- * Z_MITI_BW_DOWNLOAD (main metadata import, the function module you create following the Purview guide)
-
    The underlying SAP Java Connector (JCo) libraries may call additional RFC function modules, for example RFC_PING and RFC_METADATA_GET. Refer to [SAP support note 460089](https://launchpad.support.sap.com/#/notes/460089) for details.
-
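The dispatcher and gateway port scheme described in the prerequisites above (32NN and 33NN, where NN is the two-digit instance number) can be sketched as a small helper; the function name is illustrative and not part of any SAP or Purview API:

```python
# Illustrative helper computing the SAP dispatcher and gateway ports
# from the two-digit instance number NN, per the 32NN / 33NN scheme
# described in the prerequisites above.
def sap_ports(instance_number: int) -> tuple[int, int]:
    if not 0 <= instance_number <= 99:
        raise ValueError("SAP instance number must be between 00 and 99")
    dispatcher = 3200 + instance_number
    gateway = 3300 + instance_number
    return dispatcher, gateway
```

For instance number 00 this gives dispatcher port 3200 and gateway port 3300; both need outbound access through your firewall.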
-## Register
-
-This section describes how to register SAP BW in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-The only supported authentication for SAP BW source is **Basic authentication**.
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. In **Register sources**, select **SAP BW** > **Continue**.
-
-On the **Register sources (SAP BW)** screen, do the following:
-
-1. Enter a **Name** for the data source as it will be listed within the Catalog.
-
-1. Enter the **Application server** name to connect to SAP BW source. It can also be an IP address of the SAP application server host.
-
-1. Enter the SAP **System number**. It's a two-digit integer between 00 and 99.
-
-1. Select a collection or create a new one (Optional).
-
-1. Finish to register the data source.
-
- :::image type="content" source="media/register-scan-sap-bw/register-sap-bw.png" alt-text="Screenshot of registering an SAP BW source." border="true":::
-
-## Scan
-
-Follow the steps below to scan SAP BW to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-1. In the Management Center, select Integration runtimes. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered SAP BW source.
-
-1. Select **+ New scan**.
-
-1. Provide the below details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
    * Select **Basic Authentication** while creating a credential.
    * Provide a user ID to connect to the SAP server in the User name input field.
    * Store the user password used to connect to the SAP server in the secret key.
-
    1. **Client ID**: Enter the SAP Client ID. It's a three-digit number from 000 to 999.
-
    1. **JCo library path**: Specify the directory path where the JCo libraries are located, for example, `D:\Drivers\SAPJCo`. Make sure the path is accessible by the self-hosted integration runtime. Learn more in the [prerequisites section](#prerequisites).
-
    1. **Maximum memory available:** Maximum memory (in GB) available on the Self-hosted Integration Runtime machine to be used by scanning processes. This is dependent on the size of the SAP BW source to be scanned.
-
- :::image type="content" source="media/register-scan-sap-bw/scan-sap-bw.png" alt-text="Screenshot of setting up an SAP BW scan." border="true":::
-
-1. Select **Test connection**.
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Next steps
-
-Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-
-- [Search Data Catalog](how-to-search-catalog.md)
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Supported data sources and file types](azure-purview-connector-overview.md)
purview Register Scan Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sap-hana.md
- Title: Connect to and manage SAP HANA
-description: This guide describes how to connect to SAP HANA in Microsoft Purview, and how to use Microsoft Purview to scan and manage your SAP HANA source.
-
-Previously updated: 04/20/2023
-
-# Connect to and manage SAP HANA in Microsoft Purview
-
-This article outlines how to register SAP HANA, and how to authenticate and interact with SAP HANA in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No|No | No | No |
-
-When scanning an SAP HANA source, Microsoft Purview supports extracting technical metadata including:
-
-- Server
-- Databases
-- Schemas
-- Tables including the columns, foreign keys, indexes, and unique constraints
-- Views including the columns. Note that SAP HANA calculation views aren't currently supported.
-- Stored procedures including the parameter dataset and result set
-- Functions including the parameter dataset
-- Sequences
-- Synonyms
-
-When setting up a scan, you can choose to scan an entire SAP HANA database, or scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.13.8013.1.
-
    * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the installation to take effect.
-
- * Ensure that Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the machine where the self-hosted integration runtime is running. If you don't have this update installed, [download it now](/cpp/windows/latest-supported-vc-redist).
-
- * Download the SAP HANA JDBC driver ([JAR ngdbc](https://mvnrepository.com/artifact/com.sap.cloud.db.jdbc/ngdbc)) on the machine where your self-hosted integration runtime is running. Note down the folder path which you will use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
-### Required permissions for scan
-
-Microsoft Purview supports basic authentication (username and password) for scanning SAP HANA.
-
-The SAP HANA user you specify must have permission to select metadata from the schemas you want to import.
-
-```sql
-CREATE USER <user> PASSWORD <password> NO FORCE_FIRST_PASSWORD_CHANGE;
-GRANT SELECT METADATA ON SCHEMA <schema1> TO <user>;
-GRANT SELECT METADATA ON SCHEMA <schema2> TO <user>;
-```
-
-The user must also have permission to select from the system table _SYS_REPO.ACTIVE_OBJECT and from the system schemas _SYS_BI and _SYS_BIC.
-
-```sql
-GRANT SELECT ON _SYS_REPO.ACTIVE_OBJECT TO <user>;
-GRANT SELECT ON SCHEMA _SYS_BI TO <user>;
-GRANT SELECT ON SCHEMA _SYS_BIC TO <user>;
-```
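When granting many schemas, the statements above can be generated programmatically. This is a hypothetical convenience sketch, not part of Microsoft Purview; identifiers are inserted verbatim, so only pass trusted schema and user names:

```python
# Hypothetical sketch generating the GRANT statements shown above
# for a scan user across several schemas. Identifiers are inserted
# verbatim (no quoting/escaping), so only use trusted names.
def metadata_grants(user: str, schemas: list[str]) -> list[str]:
    stmts = [f"GRANT SELECT METADATA ON SCHEMA {s} TO {user};" for s in schemas]
    # System objects required by the scanner, per the snippets above.
    stmts.append(f"GRANT SELECT ON _SYS_REPO.ACTIVE_OBJECT TO {user};")
    stmts.append(f"GRANT SELECT ON SCHEMA _SYS_BI TO {user};")
    stmts.append(f"GRANT SELECT ON SCHEMA _SYS_BIC TO {user};")
    return stmts
```

You would then run the returned statements against SAP HANA using the SQL client of your choice.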
-
-## Register
-
-This section describes how to register an SAP HANA source in Microsoft Purview by using [the Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-1. Open the Microsoft Purview governance portal by:
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left pane.
-
-1. Select **Register**.
-
-1. In **Register sources**, select **SAP HANA** > **Continue**.
-
-1. On the **Register sources (SAP HANA)** screen, do the following:
-
- 1. For **Name**, enter a name that Microsoft Purview will list as the data source.
-
- 1. For **Server**, enter the host name or IP address used to connect to an SAP HANA source. For example, `MyDatabaseServer.com` or `192.169.1.2`.
-
- 1. For **Port**, enter the port number used to connect to the database server (39013 by default for SAP HANA).
-
- 1. For **Select a collection**, choose a collection from the list or create a new one. This step is optional.
-
- :::image type="content" source="media/register-scan-sap-hana/configure-sources.png" alt-text="Screenshot that shows boxes for registering SAP HANA sources." border="true":::
-
-1. Select **Finish**.
-
-## Scan
-
-Use the following steps to scan SAP HANA databases to automatically identify assets. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for an SAP HANA source is **Basic authentication**.
-
-### Create and run scan
-
-1. In the Management Center, select integration runtimes. Make sure that a self-hosted integration runtime is set up. If it isn't set up, use the steps in [Create and manage a self-hosted integration runtime](./manage-integration-runtimes.md).
-
-1. Go to **Sources**.
-
-1. Select the registered SAP HANA source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: Enter a name for the scan.
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to the database server in the User name input field.
- * Store the user password used to connect to the database server in the secret key.
-
- For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-
- 1. **Database**: Specify the name of the database instance to import.
-
    1. **Schema**: List a subset of schemas to import, expressed as a semicolon-separated list. For example, `schema1; schema2`. All user schemas are imported if that list is empty. All system schemas and objects are ignored by default.
-
- Acceptable schema name patterns that use SQL `LIKE` expression syntax include the percent sign (%). For example, `A%; %B; %C%; D` means:
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
    Usage of NOT and special characters isn't acceptable.
-
    1. **Driver location**: Specify the path to the JDBC driver location on the machine where the self-hosted integration runtime is running, for example, `D:\Drivers\SAPHANA`. It's the path to a valid folder containing the JAR file. Make sure the driver is accessible by the self-hosted integration runtime. Learn more in the [prerequisites section](#prerequisites).
-
    1. **Maximum memory available**: Maximum memory (in gigabytes) available on the customer's machine for the scanning processes to use. This value is dependent on the size of the SAP HANA database to be scanned.
-
- :::image type="content" source="media/register-scan-sap-hana/scan.png" alt-text="Screenshot that shows boxes for scan details." border="true":::
-
-1. Select **Continue**.
-
-1. For **Scan trigger**, choose whether to set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
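As an illustration of the schema pattern semantics described in the scan settings above (a semicolon-separated list of SQL `LIKE` patterns where `%` matches any character sequence), a minimal sketch, which is not Purview's actual implementation, might look like:

```python
import re

# Illustrative sketch of the schema pattern semantics described above:
# a semicolon-separated list of SQL LIKE patterns in which % matches
# any sequence of characters. Not Purview's implementation.
def schema_matches(name: str, pattern_list: str) -> bool:
    for pattern in (p.strip() for p in pattern_list.split(";")):
        if not pattern:
            continue
        # Translate each LIKE pattern to a regex: % becomes .*
        regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
        if re.match(regex, name):
            return True
    return False
```

With the example pattern list `A%; %B; %C%; D`, a schema matches if it starts with A, ends with B, contains C, or equals D.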
--
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Sapecc Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sapecc-source.md
- Title: Connect to and manage an SAP ECC source
-description: This guide describes how to connect to SAP ECC in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP ECC source.
-
-Previously updated: 04/20/2023
-
-# Connect to and manage SAP ECC in Microsoft Purview
-
-This article outlines how to register SAP ECC, and how to authenticate and interact with SAP ECC in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-|---|---|---|---|---|---|---|---|---|
-| [Yes](#register)| [Yes](#scan)| No | No | No | No| No| [Yes*](#lineage)| No |
-
-\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-When scanning an SAP ECC source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Instance
- - Application components
- - Packages
- - Tables including the fields, foreign keys, indexes, and index members
- - Views including the fields
- - Transactions
- - Programs
- - Classes
- - Function groups
- - Function modules
- - Domains including the domain values
- - Data elements
-
-- Fetching static lineage on asset relationships among tables and views.
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- >[!NOTE]
    >Scanning SAP ECC is a memory-intensive operation. It's recommended that you install the Self-hosted Integration Runtime on a machine with at least 128 GB of RAM.
-
    * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the installation to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
    * Download the 64-bit [SAP Connector for Microsoft .NET 3.0](https://support.sap.com/en/product/connectors/msnet.html) from SAP's website and install it on the self-hosted integration runtime machine. During installation, make sure you select the **Install Assemblies to GAC** option in the **Optional setup steps** window.
-
- :::image type="content" source="media/register-scan-saps4hana-source/requirement.png" alt-text="pre-requisite" border="true":::
-
- * The connector reads metadata from SAP using the [SAP Java Connector (JCo)](https://support.sap.com/en/product/connectors/jco.html) 3.0 API. Make sure the Java Connector is available on your virtual machine where self-hosted integration runtime is installed. Make sure that you're using the correct JCo distribution for your environment. For example: on a Microsoft Windows machine, make sure the sapjco3.jar and sapjco3.dll files are available. Note down the folder path which you will use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
- * Self-hosted integration runtime communicates with the SAP server over dispatcher port 32NN and gateway port 33NN, where NN is your SAP instance number from 00 to 99. Make sure the outbound traffic is allowed on your firewall.
-
-* Deploy the metadata extraction ABAP function module on the SAP server by following the steps mentioned in [ABAP functions deployment guide](abap-functions-deployment-guide.md). You'll need an ABAP developer account to create the RFC function module on the SAP server. For scan execution, the user account requires sufficient permissions to connect to the SAP server and execute the following RFC function modules:
-
- * STFC_CONNECTION (check connectivity)
- * RFC_SYSTEM_INFO (check system information)
- * OCS_GET_INSTALLED_COMPS (check software versions)
- * Z_MITI_DOWNLOAD (main metadata import, the function module you create following the Purview guide)
-
    The underlying SAP Java Connector (JCo) libraries may call additional RFC function modules, for example RFC_PING and RFC_METADATA_GET. Refer to [SAP support note 460089](https://launchpad.support.sap.com/#/notes/460089) for details.
-
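Before creating the scan, you may want to verify that the JCo files named in the prerequisites above are present in the folder you'll later supply as the JCo library path. This is a hypothetical pre-check, not part of Microsoft Purview, and the file list assumes a Windows self-hosted integration runtime:

```python
import os

# Hypothetical pre-check: confirm the SAP JCo files required by the
# prerequisites above are present in the folder you'll supply as the
# 'JCo library path' scan setting.
def missing_jco_files(jco_dir: str) -> list[str]:
    required = ["sapjco3.jar", "sapjco3.dll"]  # Windows JCo distribution
    return [f for f in required if not os.path.isfile(os.path.join(jco_dir, f))]
```

An empty return list means both files were found; otherwise the list names what still needs to be copied into the folder.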
-## Register
-
-This section describes how to register SAP ECC in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-The only supported authentication for SAP ECC source is **Basic authentication**.
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
   - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On Register sources, select **SAP ECC**. Select **Continue**.
-
- :::image type="content" source="media/register-scan-sapecc-source/register-sapecc.png" alt-text="register SAPECC options" border="true":::
-
-On the **Register sources (SAP ECC)** screen, do the following:
-
-1. Enter a **Name** for the data source as it will be listed within the Catalog.
-
-1. Enter the **Application server** name to connect to SAP ECC source. It can also be an IP address of the SAP application server host.
-
-1. Enter the SAP **System number**. This is a two-digit integer between 00 and 99.
-
-1. Select a collection or create a new one (optional).
-
-1. Finish to register the data source.
-
- :::image type="content" source="media/register-scan-sapecc-source/register-sapecc-2.png" alt-text="register SAPECC" border="true":::
-
-## Scan
-
-Follow the steps below to scan SAP ECC to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-1. In the Management Center, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**
-
-1. Select the registered SAP ECC source.
-
-1. Select **+ New scan**
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
- * Select **Basic Authentication** while creating a credential.
- * Provide a user ID to connect to the SAP server in the **User name** input field.
- * Store the user password used to connect to the SAP server in the secret key.
-
- 1. **Client ID**: Enter the SAP Client ID. This is a three-digit number from 000 to 999.
-
- 1. **JCo library path**: Specify the directory path where the JCo libraries are located, for example, `D:\Drivers\SAPJCo`. Make sure the path is accessible to the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Maximum memory available:** Maximum memory (in GB) available on the self-hosted integration runtime machine to be used by scanning processes. This depends on the size of the SAP ECC source to be scanned. It's recommended to provide a large amount of memory, for example, 100 GB.
-
- :::image type="content" source="media/register-scan-sapecc-source/scan-sapecc-inline.png" alt-text="scan SAPECC" lightbox="media/register-scan-sapecc-source/scan-sapecc-expanded.png" border="true":::
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your SAP ECC source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's **Lineage** tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported SAP ECC lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Saps4hana Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-saps4hana-source.md
- Title: Connect to and manage an SAP S/4HANA source
-description: This guide describes how to connect to SAP S/4HANA in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP S/4HANA source.
----- Previously updated : 04/20/2023---
-# Connect to and manage SAP S/4HANA in Microsoft Purview
-
-This article outlines how to register SAP S/4HANA, and how to authenticate and interact with SAP S/4HANA in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | No | No | No| No| [Yes*](#lineage)| No |
-
-\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-When scanning an SAP S/4HANA source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Instance
- - Application components
- - Packages
- - Tables including the fields, foreign keys, indexes, and index members
- - Views including the fields
- - Transactions
- - Programs
- - Classes
- - Function groups
- - Function modules
- - Domains including the domain values
- - Data elements
-
-- Fetching static lineage on asset relationships among tables and views.
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- >[!NOTE]
- >Scanning SAP S/4HANA is a memory-intensive operation. It's recommended that you install the self-hosted integration runtime on a machine with at least 128 GB of RAM.
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the change to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * Download the 64-bit [SAP Connector for Microsoft .NET 3.0](https://support.sap.com/en/product/connectors/msnet.html) from SAP's website and install it on the self-hosted integration runtime machine. During installation, make sure you select the **Install Assemblies to GAC** option in the **Optional setup steps** window.
-
- :::image type="content" source="media/register-scan-saps4hana-source/requirement.png" alt-text="pre-requisite" border="true":::
-
- * The connector reads metadata from SAP using the [SAP Java Connector (JCo)](https://support.sap.com/en/product/connectors/jco.html) 3.0 API. Make sure the Java Connector is available on the machine where the self-hosted integration runtime is installed, and that you're using the correct JCo distribution for your environment. For example, on a Microsoft Windows machine, make sure the sapjco3.jar and sapjco3.dll files are available. Note down the folder path; you'll use it to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
- * Self-hosted integration runtime communicates with the SAP server over dispatcher port 32NN and gateway port 33NN, where NN is your SAP instance number from 00 to 99. Make sure the outbound traffic is allowed on your firewall.
-
-* Deploy the metadata extraction ABAP function module on the SAP server by following the steps mentioned in [ABAP functions deployment guide](abap-functions-deployment-guide.md). You'll need an ABAP developer account to create the RFC function module on the SAP server. For scan execution, the user account requires sufficient permissions to connect to the SAP server and execute the following RFC function modules:
-
- * STFC_CONNECTION (check connectivity)
- * RFC_SYSTEM_INFO (check system information)
- * OCS_GET_INSTALLED_COMPS (check software versions)
- * Z_MITI_DOWNLOAD (main metadata import, the function module you create following the Purview guide)
-
- The underlying SAP Java Connector (JCo) libraries may call additional RFC function modules, such as RFC_PING and RFC_METADATA_GET; refer to [SAP support note 460089](https://launchpad.support.sap.com/#/notes/460089) for details.
-
-## Register
-
-This section describes how to register SAP S/4HANA in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Authentication for registration
-
-The only supported authentication for SAP S/4HANA source is **Basic authentication**.
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **SAP S/4HANA**, and then select **Continue**.
-
- :::image type="content" source="media/register-scan-saps4hana-source/register-saps-4-hana.png" alt-text="register SAPS/4Hana options" border="true":::
-
-On the **Register sources (SAP S/4HANA)** screen, do the following:
-
-1. Enter a **Name** under which the data source will be listed in the Catalog.
-
-1. Enter the **Application server** name to connect to SAP S/4HANA source. It can also be an IP address of the SAP application server host.
-
-1. Enter the SAP **System number**. This is a two-digit integer between 00 and 99.
-
-1. (Optional) Select a collection or create a new one.
-
-1. Finish registering the data source.
-
- :::image type="content" source="media/register-scan-saps4hana-source/register-saps-4-hana-2.png" alt-text="register SAP S/4HANA" border="true":::
-
-## Scan
-
-Follow the steps below to scan SAP S/4HANA to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-1. In the Management Center, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to create a self-hosted integration runtime.
-
-1. Navigate to **Sources**.
-
-1. Select the registered SAP S/4HANA source.
-
-1. Select **+ New scan**
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
-
- * Select **Basic Authentication** while creating a credential.
- * Provide a user ID to connect to the SAP server in the **User name** input field.
- * Store the user password used to connect to the SAP server in the secret key.
-
- 1. **Client ID**: Enter the SAP system client ID. The client is identified by a three-digit number from 000 to 999.
-
- 1. **JCo library path**: Specify the directory path where the JCo libraries are located, for example, `D:\Drivers\SAPJCo`. Make sure the path is accessible to the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Maximum memory available:** Maximum memory (in GB) available on the self-hosted integration runtime machine to be used by scanning processes. This depends on the size of the SAP S/4HANA source to be scanned. It's recommended to provide a large amount of memory, for example, 100 GB.
-
- :::image type="content" source="media/register-scan-saps4hana-source/scan-saps-4-hana.png" alt-text="scan SAP S/4HANA" border="true":::
-
-1. Select **Continue**.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
--
-## Lineage
-
-After scanning your SAP S/4HANA source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's **Lineage** tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported SAP S/4HANA lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and [lineage user guide](catalog-lineage-user-guide.md).
-
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Snowflake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-snowflake.md
- Title: Connect to and manage Snowflake
-description: This guide describes how to connect to Snowflake in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Snowflake source.
----- Previously updated : 07/17/2023---
-# Connect to and manage Snowflake in Microsoft Purview
-
-This article outlines how to register Snowflake, and how to authenticate and interact with Snowflake in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | [Yes](#scan) | No| No| [Yes](#lineage) | No|
-
-When scanning a Snowflake source, Microsoft Purview supports:
-
-- Extracting technical metadata including:
-
- - Server
- - Databases
- - Schemas
- - Tables including the columns, foreign keys and unique constraints
- - Views including the columns
- - Stored procedures including the parameter dataset and result set
- - Functions including the parameter dataset
- - Pipes
- - Stages
- - Streams including the columns
- - Tasks
- - Sequences
-
-- Fetching static lineage on asset relationships among tables, views, streams, and stored procedures.
-
-For stored procedures, you can choose the level of details to extract on [scan settings](#scan). Stored procedure lineage is supported for Snowflake Scripting (SQL) and JavaScript languages, and generated based on the procedure definition.
-
-When setting up scan, you can choose to scan one or more Snowflake database(s) entirely based on the given name(s) or name pattern(s), or further scope the scan to a subset of schemas matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-- When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-- Stored procedure lineage is not supported for the following patterns:
- - Stored procedures defined in Java, Python, and Scala languages.
- - Stored procedure using SQL [EXECUTE IMMEDIATE](https://docs.snowflake.com/en/sql-reference/sql/execute-immediate) with static SQL query as variable.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Microsoft Purview account](create-catalog-portal.md).
-- You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-> [!NOTE]
-> **If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.), **you will need to configure a self hosted integration runtime to connect to it**.
-
-- If your data store isn't publicly accessible, set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
- - Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you install the JDK for the change to take effect.
- - Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
-### Required permissions for scan
-
-Microsoft Purview supports basic authentication (username and password) for scanning Snowflake. The default role of the given user will be used to perform the scan. The Snowflake user must have usage rights on a warehouse and the database(s) to be scanned, and read access to system tables in order to access advanced metadata.
-
-Here's a sample walkthrough to create a user specifically for Microsoft Purview scan and set up the permissions. If you choose to use an existing user, make sure it has adequate rights to the warehouse and database objects.
-
-1. Set up a `purview_reader` role. You need _ACCOUNTADMIN_ rights to do this.
-
- ```sql
- USE ROLE ACCOUNTADMIN;
-
- --create role to allow read only access - this will later be assigned to the Microsoft Purview user
- CREATE OR REPLACE ROLE purview_reader;
-
- --make sysadmin the parent role
- GRANT ROLE purview_reader TO ROLE sysadmin;
- ```
-
-2. Create a warehouse for Microsoft Purview to use and grant rights.
-
- ```sql
- --create warehouse - account admin required
- CREATE OR REPLACE WAREHOUSE purview_wh WITH
- WAREHOUSE_SIZE = 'XSMALL'
- WAREHOUSE_TYPE = 'STANDARD'
- AUTO_SUSPEND = 300
- AUTO_RESUME = TRUE
- MIN_CLUSTER_COUNT = 1
- MAX_CLUSTER_COUNT = 2
- SCALING_POLICY = 'STANDARD';
-
- --grant rights to the warehouse
- GRANT USAGE ON WAREHOUSE purview_wh TO ROLE purview_reader;
- ```
-
-3. Create a user `purview` for Microsoft Purview scan.
-
- ```sql
- CREATE OR REPLACE USER purview
- PASSWORD = '<password>';
-
- --note the default role will be used during scan
- ALTER USER purview SET DEFAULT_ROLE = purview_reader;
-
- --add user to purview_reader role
- GRANT ROLE purview_reader TO USER purview;
- ```
-
-4. Grant reader rights to the database objects.
-
- ```sql
- GRANT USAGE ON DATABASE <your_database_name> TO ROLE purview_reader;
-
- --grant reader access to all the database structures that purview can currently scan
- GRANT USAGE ON ALL SCHEMAS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE ON ALL FUNCTIONS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE ON ALL PROCEDURES IN DATABASE <your_database_name> TO role purview_reader;
- GRANT SELECT ON ALL TABLES IN DATABASE <your_database_name> TO role purview_reader;
- GRANT SELECT ON ALL VIEWS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE, READ ON ALL STAGES IN DATABASE <your_database_name> TO role purview_reader;
-
- --grant reader access to any future objects that could be created
- GRANT USAGE ON FUTURE SCHEMAS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE ON FUTURE FUNCTIONS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE ON FUTURE PROCEDURES IN DATABASE <your_database_name> TO role purview_reader;
- GRANT SELECT ON FUTURE TABLES IN DATABASE <your_database_name> TO role purview_reader;
- GRANT SELECT ON FUTURE VIEWS IN DATABASE <your_database_name> TO role purview_reader;
- GRANT USAGE, READ ON FUTURE STAGES IN DATABASE <your_database_name> TO role purview_reader;
- ```
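As an optional sanity check, not part of the official setup steps, you can confirm the grants landed as expected. `SHOW GRANTS` is standard Snowflake syntax; the role, user, and warehouse names below are the ones from the sample walkthrough above:

```sql
-- List the privileges held by the reader role
SHOW GRANTS TO ROLE purview_reader;

-- Confirm the scan user is a member of the reader role
SHOW GRANTS TO USER purview;

-- Confirm USAGE on the warehouse was granted to the reader role
SHOW GRANTS ON WAREHOUSE purview_wh;
```

If the `SHOW GRANTS TO USER` output doesn't list the `PURVIEW_READER` role, re-run the `GRANT ROLE` statement from step 3.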
-
-## Register
-
-This section describes how to register Snowflake in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-To register a new Snowflake source in your data catalog, follow these steps:
-
-1. Navigate to your Microsoft Purview account in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **Snowflake**, and then select **Continue**.
-
-On the **Register sources (Snowflake)** screen, follow these steps:
-
-1. Enter a **Name** under which the data source will be listed in the Catalog.
-
-1. Enter the **server** URL used to connect to the Snowflake account in the form of `<account_identifier>.snowflakecomputing.com`, for example, `orgname-accountname.snowflakecomputing.com`. Learn more about Snowflake [account identifier](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#).
-
-1. (Optional) Select a collection or create a new one.
-
-1. Finish registering the data source.
-
- :::image type="content" source="media/register-scan-snowflake/register-sources.png" alt-text="register sources options" border="true":::
-
-## Scan
-
-Follow the steps below to scan Snowflake to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Authentication for a scan
-
-The supported authentication type for a Snowflake source is **Basic authentication**.
-
-### Create and run scan
-
-To create and run a new scan, follow these steps:
-
-1. If your server is publicly accessible, skip to step two. Otherwise, you'll need to make sure your self-hosted integration runtime is configured:
- 1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), go to the Management Center, and select **Integration runtimes**.
- 1. Make sure a self-hosted integration runtime is available. If one isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime.
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/), navigate to **Sources**.
-
-1. Select the registered Snowflake source.
-
-1. Select **+ New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the Azure auto-resolved integration runtime if your server is publicly accessible, or your configured self-hosted integration runtime if it isn't publicly available.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select **Basic Authentication** while creating a credential.
- * Provide the user name used to connect to Snowflake in the User name input field.
- * Store the user password used to connect to Snowflake in the secret key.
-
- 1. **Warehouse**: Specify, in capital case, the name of the warehouse instance that the scan will use. The default role assigned to the user specified in the credential must have USAGE rights on this warehouse.
-
- 1. **Databases**: Specify one or more database instance names to import in capital case. Separate the names in the list with a semi-colon (;). For example, `db1;db2`. The default role assigned to the user specified in the credential must have adequate rights on the database objects.
-
- Acceptable database name patterns use SQL LIKE expression syntax, including the % wildcard. For example, `A%;%B;%C%;D` matches database names that:
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- 1. **Schema**: List a subset of schemas to import, expressed as a semicolon-separated list. For example, `schema1;schema2`. All user schemas are imported if the list is empty. All system schemas and objects are ignored by default.
-
- Acceptable schema name patterns use SQL LIKE expression syntax, including the % wildcard. For example, `A%;%B;%C%;D` matches schema names that:
- * Start with A or
- * End with B or
- * Contain C or
- * Equal D
-
- Usage of NOT and special characters isn't acceptable.
-
- 1. **Stored procedure details**: Controls the number of details imported from stored procedures:
-
- - Signature (default): The name and parameters of stored procedures.
- - Code, signature: The name, parameters and code of stored procedures.
- - Lineage, code, signature: The name, parameters and code of stored procedures, and the data lineage derived from the code.
- - None: Stored procedure details aren't included.
-
- > [!Note]
- > If you use the self-hosted integration runtime for scan, customized settings other than the default Signature are supported starting with version 5.30.8541.1. Earlier versions always extract the name and parameters of stored procedures.
-
- 1. **Maximum memory available** (applicable when using self-hosted integration runtime): Maximum memory (in GB) available on your VM to be used by scanning processes. It depends on the size of the Snowflake source to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 1 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-snowflake/scan.png" alt-text="scan Snowflake" border="true":::
-
-1. Select **Test connection** to validate the settings (available when using Azure Integration Runtime).
-
-1. Select **Continue**.
-
-1. Select a **scan rule set** for classification. You can choose between the system default, existing custom rule sets, or [create a new rule set](create-a-scan-rule-set.md) inline. Check the [Classification](apply-classifications.md) article to learn more.
- > [!NOTE]
- > If you are using Self-hosted runtime then you will need to upgrade to version 5.26.404.1 or higher to use Snowflake classification. You can find the latest version of Microsoft Integration runtime [here](https://www.microsoft.com/download/details.aspx?id=39717).
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
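The database and schema name patterns in the scan settings above follow SQL LIKE semantics. As an informal illustration only (the `A%;%B;%C%;D` pattern list is the example from the settings; querying `INFORMATION_SCHEMA.DATABASES` this way is an assumption for demonstration, not part of the product), the pattern list is equivalent to a set of OR'ed LIKE predicates:

```sql
-- Which databases would the pattern list A%;%B;%C%;D select?
SELECT database_name
FROM information_schema.databases
WHERE database_name LIKE 'A%'   -- start with A, or
   OR database_name LIKE '%B'   -- end with B, or
   OR database_name LIKE '%C%'  -- contain C, or
   OR database_name = 'D';      -- equal D exactly
```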
--
-## Lineage
-
-After scanning your Snowflake source, you can [browse data catalog](how-to-browse-catalog.md) or [search data catalog](how-to-search-catalog.md) to view the asset details.
-
-Go to the asset's **Lineage** tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Snowflake lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and [lineage user guide](catalog-lineage-user-guide.md).
-
-> [!NOTE]
-> If a view was created by tables from different databases, scan all databases simultaneously using the names in the semicolon (;) list.
-
-## Troubleshooting tips
-
-- Check your account identifier in the source registration step. Don't include the `https://` part at the front.
-- Make sure the warehouse name and database name are in capital case on the scan setup page.
-- Check your key vault. Make sure there are no typos in the password.
-- Check the credential you set up in Microsoft Purview. The user you specify must have a default role with the necessary access rights to both the warehouse and the database you're trying to scan. See [Required permissions for scan](#required-permissions-for-scan). Use `DESCRIBE USER;` to verify the default role of the user you've specified for Microsoft Purview.
-- Use Query History in Snowflake to see if any activity is coming across.
- - If there's a problem with the account identifier or password, you won't see any activity.
- - If there's a problem with the default role, you should at least see a `USE WAREHOUSE . . .` statement.
- - You can use the [QUERY_HISTORY_BY_USER table function](https://docs.snowflake.com/en/sql-reference/functions/query_history.html) to identify what role is being used by the connection. Setting up a dedicated Microsoft Purview user will make troubleshooting easier.
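As a sketch of that check, assuming the dedicated `purview` user from the permissions walkthrough above (both statements use documented Snowflake syntax, but the user name is an assumption for illustration):

```sql
-- Show the most recent queries issued by the scan user, and the role they ran under
SELECT query_text, role_name, warehouse_name, start_time
FROM TABLE(information_schema.query_history_by_user(USER_NAME => 'PURVIEW', RESULT_LIMIT => 100))
ORDER BY start_time DESC;

-- Verify the scan user's default role
DESCRIBE USER purview;
```

If the `role_name` column shows an unexpected role, fix the user's `DEFAULT_ROLE` and re-run the scan.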
-
-## Next steps
-
-Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Synapse Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-synapse-workspace.md
- Title: Connect to and manage Azure Synapse Analytics workspaces
-description: This guide describes how to connect to Azure Synapse Analytics workspaces in Microsoft Purview, and how to use Microsoft Purview features to scan and manage your Azure Synapse Analytics workspace source.
----- Previously updated : 05/15/2023---
-# Connect to and manage Azure Synapse Analytics workspaces in Microsoft Purview
-
-This article outlines how to register Azure Synapse Analytics workspaces. It also describes how to authenticate and interact with Azure Synapse Analytics workspaces in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|Metadata extraction|Full scan|Incremental scan|Scoped scan|Classification|Labeling|Access policy|Lineage|Data sharing|
-||||||||||
-| [Yes](#register) | [Yes](#scan)| [Yes](#scan) | No| [Yes](#scan)| No| No| [Yes - pipelines](how-to-lineage-azure-synapse-analytics.md)| No|
-
-Currently, Azure Synapse Analytics lake databases are not supported.
-
-For external tables, Azure Synapse Analytics doesn't currently capture the relationship of those tables to their original files.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* Data source administrator and data reader permissions to register a source and manage it in the Microsoft Purview governance portal. For details, see [Access control in the Microsoft Purview governance portal](catalog-permissions.md).
-
-## Register
-
-The following procedure describes how to register Azure Synapse Analytics workspaces in Microsoft Purview by using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-Only a user who has at least a data reader role on the Azure Synapse Analytics workspace and who is also a data source administrator in Microsoft Purview can register an Azure Synapse Analytics workspace.
-
-1. Open the [Microsoft Purview governance portal](https://web.purview.azure.com) and select your Microsoft Purview account.
-
- Alternatively, go to the [Azure portal](https://portal.azure.com), search for and select the Microsoft Purview account, and then select the **Microsoft Purview governance portal** button.
-1. On the left pane, select **Sources**.
-1. Select **Register**.
-1. Under **Register sources**, select **Azure Synapse Analytics (multiple)**.
-1. Select **Continue**.
-
- :::image type="content" source="media/register-scan-synapse-workspace/register-synapse-source.png" alt-text="Screenshot of a selection of sources in Microsoft Purview, including Azure Synapse Analytics.":::
-
-1. On the **Register sources (Azure Synapse Analytics)** page, do the following:
-
- 1. For **Name**, enter a name for the data source to be listed in the data catalog.
- 1. Optionally, for **Azure subscription**, choose a subscription to filter down to.
- 1. For **Workspace name**, select the workspace that you're working with.
-
- The boxes for the SQL endpoints are automatically filled in based on your workspace selection.
- 1. For **Select a collection**, choose the collection that you're working with, or create a new one.
- 1. Select **Register** to finish registering the data source.
-
- :::image type="content" source="media/register-scan-synapse-workspace/register-synapse-source-details.png" alt-text="Screenshot of the page for entering details about the Azure Synapse source.":::
-
-## Scan
-
-Use the following steps to scan Azure Synapse Analytics workspaces to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
-
-1. Set up authentication for enumerating your [dedicated](#authentication-for-enumerating-dedicated-sql-database-resources) or [serverless](#authentication-for-enumerating-serverless-sql-database-resources) resources. This step will allow Microsoft Purview to enumerate your workspace assets and perform scans.
-1. Apply [permissions to scan the contents of the workspace](#apply-permissions-to-scan-the-contents-of-the-workspace).
-1. Confirm that your [network is set up to allow access for Microsoft Purview](#set-up-firewall-access-for-the-azure-synapse-analytics-workspace).
-
-### Enumeration authentication
-
-Use the following procedures to set up authentication. You must be an owner or a user access administrator to add the specified roles.
-
-#### Authentication for enumerating dedicated SQL database resources
-
-1. In the Azure portal, go to the Azure Synapse Analytics workspace resource.
-1. On the left pane, select **Access Control (IAM)**.
-1. Select the **Add** button.
-1. Set the **Reader** role and enter your Microsoft Purview account name, which represents its managed service identity (MSI).
-1. Select **Save** to finish assigning the role.
-
-> [!NOTE]
-> If you're planning to register and scan multiple Azure Synapse Analytics workspaces in your Microsoft Purview account, you can also assign the role from a higher level, such as a resource group or a subscription.
-
-#### Authentication for enumerating serverless SQL database resources
-
-There are three places where you need to set authentication to allow Microsoft Purview to enumerate your serverless SQL database resources.
-
-To set authentication for the Azure Synapse Analytics workspace:
-
-1. In the Azure portal, go to the Azure Synapse Analytics workspace resource.
-1. On the left pane, select **Access Control (IAM)**.
-1. Select the **Add** button.
-1. Set the **Reader** role and enter your Microsoft Purview account name, which represents its MSI.
-1. Select **Save** to finish assigning the role.
-
-To set authentication for the storage account:
-
-1. In the Azure portal, go to the resource group or subscription that contains the storage account associated with the Azure Synapse Analytics workspace.
-1. On the left pane, select **Access Control (IAM)**.
-1. Select the **Add** button.
-1. Set the **Storage blob data reader** role and enter your Microsoft Purview account name (which represents its MSI) in the **Select** box.
-1. Select **Save** to finish assigning the role.
-
-To set authentication for the Azure Synapse Analytics serverless database:
-
-1. Go to your Azure Synapse Analytics workspace and open Synapse Studio.
-1. On the left pane, select **Data**.
-1. Select the ellipsis (**...**) next to one of your databases, and then start a new SQL script.
-1. Run the following command in your SQL script to add the Microsoft Purview account MSI (represented by the account name) on the serverless SQL databases:
-
- ```sql
- CREATE LOGIN [PurviewAccountName] FROM EXTERNAL PROVIDER;
- ```
-
-### Apply permissions to scan the contents of the workspace
-
-You must set up authentication on each SQL database that you want to register and scan from your Azure Synapse Analytics workspace. Select from the following scenarios for steps to apply permissions.
-
-> [!IMPORTANT]
-> The following steps for serverless databases *do not* apply to replicated databases. In Azure Synapse Analytics, serverless databases that are replicated from Spark databases are currently read-only. For more information, see [Operation isn't allowed for a replicated database](../synapse-analytics/sql/resources-self-help-sql-on-demand.md#operation-isnt-allowed-for-a-replicated-database).
-
-# [Managed identity](#tab/MI)
-
-#### Use a managed identity for dedicated SQL databases
-
-To run the commands in the following procedure, you must be an Azure Synapse administrator on the workspace. For more information about Azure Synapse Analytics permissions, see [Set up access control for your Azure Synapse Analytics workspace](../synapse-analytics/security/how-to-set-up-access-control.md).
-
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and then look for one of your dedicated SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the Microsoft Purview account MSI (represented by the account name) as `db_datareader` on the dedicated SQL database:
-
- ```sql
- CREATE USER [PurviewAccountName] FROM EXTERNAL PROVIDER
- GO
-
- EXEC sp_addrolemember 'db_datareader', [PurviewAccountName]
- GO
- ```
-
-1. Run the following command in your SQL script to verify the addition of the role:
-
- ```sql
- SELECT p.name AS UserName, r.name AS RoleName
- FROM sys.database_principals p
- LEFT JOIN sys.database_role_members rm ON p.principal_id = rm.member_principal_id
- LEFT JOIN sys.database_principals r ON rm.role_principal_id = r.principal_id
- WHERE p.authentication_type_desc = 'EXTERNAL'
- ORDER BY p.name;
- ```
-
-Follow the same steps for each database that you want to scan.
-
-#### Use a managed identity for serverless SQL databases
-
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and select one of your SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the Microsoft Purview account MSI (represented by the account name) as `db_datareader` on the serverless SQL databases:
-
- ```sql
- CREATE USER [PurviewAccountName] FOR LOGIN [PurviewAccountName];
- ALTER ROLE db_datareader ADD MEMBER [PurviewAccountName];
- ```
-
-1. Run the following command in your SQL script to verify the addition of the role:
-
- ```sql
- SELECT p.name AS UserName, r.name AS RoleName
- FROM sys.database_principals p
- LEFT JOIN sys.database_role_members rm ON p.principal_id = rm.member_principal_id
- LEFT JOIN sys.database_principals r ON rm.role_principal_id = r.principal_id
- WHERE p.authentication_type_desc = 'EXTERNAL'
- ORDER BY p.name;
- ```
-
-Follow the same steps for each database that you want to scan.
-
-#### Grant permission to use credentials for external tables
-
-If the Azure Synapse Analytics workspace has any external tables, you must give the Microsoft Purview managed identity References permission on the external table's scoped credentials. With the References permission, Microsoft Purview can read data from external tables.
-
-1. Run the following command in your SQL script to get the list of database scoped credentials:
-
- ```sql
- Select name, credential_identity
- from sys.database_scoped_credentials;
- ```
-
-1. To grant access to the database scoped credentials, run the following command. Replace `scoped_credential` with the name of the database scoped credential.
-
- ```sql
- GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[scoped_credential] TO [PurviewAccountName];
- ```
-
-1. To verify the permission assignment, run the following command in your SQL script:
-
- ```sql
- SELECT dp.permission_name, dp.grantee_principal_id, p.name AS grantee_principal_name
- FROM sys.database_permissions AS dp
- JOIN sys.database_principals AS p ON dp.grantee_principal_id = p.principal_id
- JOIN sys.database_scoped_credentials AS c ON dp.major_id = c.credential_id;
- ```
-
-# [Service principal](#tab/SP)
-
-#### Use a service principal for dedicated SQL databases
-
-1. Set up a new credential of type *service principal* by following the instructions in [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and then look for one of your dedicated SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the service principal ID as `db_datareader` on the dedicated SQL database:
-
- ```sql
- CREATE USER [ServicePrincipalID] FROM EXTERNAL PROVIDER
- GO
-
- EXEC sp_addrolemember 'db_datareader', [ServicePrincipalID]
- GO
- ```
-
-Repeat the previous steps for all dedicated SQL databases in your Azure Synapse Analytics workspace.
-
-#### Use a service principal for serverless SQL databases
-
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and then look for one of your serverless SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the service principal ID on the serverless SQL databases:
-
- ```sql
- CREATE LOGIN [ServicePrincipalID] FROM EXTERNAL PROVIDER;
- ```
-
-1. Run the following command in your SQL script to add the service principal ID as `db_datareader` on each of the serverless SQL databases that you want to scan:
-
- ```sql
- CREATE USER [ServicePrincipalID] FOR LOGIN [ServicePrincipalID];
- ALTER ROLE db_datareader ADD MEMBER [ServicePrincipalID];
- ```
-
-# [SQL authentication](#tab/SQLAuth)
-
-#### Use SQL authentication for dedicated SQL databases
-
-1. Set up a new credential of type *SQL authentication* by following the instructions in [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and then look for one of your dedicated SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the SQL authentication login name as `db_datareader` on the dedicated SQL database:
-
- ```sql
- CREATE USER [SQLUser] FROM LOGIN [SQLUser];
- GO
-
- EXEC sp_addrolemember 'db_datareader', [SQLUser];
- GO
- ```
-
-Repeat the previous steps for all dedicated SQL databases in your Azure Synapse Analytics workspace.
-
-#### Use SQL authentication for serverless SQL databases
-
-1. Go to your Azure Synapse Analytics workspace.
-1. Go to the **Data** section, and then look for one of your serverless SQL databases.
-1. Select the ellipsis (**...**) next to the database name, and then start a new SQL script.
-1. Run the following command in your SQL script to add the SQL authentication login name on the serverless SQL databases:
-
- ```sql
- CREATE USER [SQLUser] FROM LOGIN [SQLUser];
- GO
- ```
-
-1. Run the following command in your SQL script to add the SQL authentication login name as `db_datareader` on each of the serverless SQL databases that you want to scan:
-
- ```sql
- ALTER ROLE db_datareader ADD MEMBER [SQLUser];
- GO
- ```
---
-### Set up firewall access for the Azure Synapse Analytics workspace
-
-1. In the Azure portal, go to the Azure Synapse Analytics workspace.
-
-1. On the left pane, select **Networking**.
-
-1. For the **Allow Azure services and resources to access this workspace** control, select **ON**.
-
-1. Select **Save**.
-
-> [!IMPORTANT]
-> If you can't enable **Allow Azure services and resources to access this workspace** on your Azure Synapse Analytics workspaces, you'll get a serverless database enumeration failure when you set up a scan in the Microsoft Purview governance portal. In this case, you can choose the [Enter manually](#create-and-run-a-scan) option to specify the database names that you want to scan, and then proceed.
-
-### Create and run a scan
-
-1. In the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/), on the left pane, select **Data Map**.
-
-1. Select the data source that you registered.
-
-1. Select **View details**, and then select **New scan**. Alternatively, you can select the **Scan quick action** icon on the source tile.
-
-1. On the **Scan** details pane, in the **Name** box, enter a name for the scan.
-
-1. In the **Credential** dropdown list, select the credential to connect to the resources within your data source.
-
-1. For **Database selection method**, select **From Synapse workspace** or **Enter manually**. By default, Microsoft Purview tries to enumerate the databases under the workspace, and you can select the ones that you want to scan.
-
- :::image type="content" source="media/register-scan-synapse-workspace/synapse-scan-setup.png" alt-text="Screenshot of the details pane for the Azure Synapse source scan.":::
-
- If you get an error that says Microsoft Purview failed to load the serverless databases, you can select **Enter manually** to specify the type of database (dedicated or serverless) and the corresponding database name.
-
- :::image type="content" source="media/register-scan-synapse-workspace/synapse-scan-setup-enter-manually.png" alt-text="Screenshot of the selection for manually entering database names when setting up a scan.":::
-
-1. Select **Test connection** to validate the settings. If you get any error, on the report page, hover over the connection status to see details.
-
-1. Select **Continue**.
-
-1. Select **Scan rule sets** of type **Azure Synapse SQL**. You can also create scan rule sets inline.
-
-1. Choose your scan trigger. You can schedule it to run **weekly/monthly** or **once**.
-
-1. Review your scan, and then select **Save** to complete the setup.
--
-### Set up a scan by using an API
-
-Here's an example of creating a scan for a serverless database by using the Microsoft Purview REST API. Replace the placeholders in braces (`{}`) with your actual settings. Learn more from [Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/).
-
-```http
-PUT https://{purview_account_name}.purview.azure.com/scan/datasources/{data_source_name}/scans/{scan_name}?api-version=2022-02-01-preview
-```
-
-In the following code, `collection_id` is not the friendly name of the collection but a five-character ID. For the root collection, `collection_id` is the name of the collection. For all subcollections, it's instead the ID, which you can find in one of these places:
-
-- The URL in the Microsoft Purview governance portal. Select the collection, and check the URL to find where it says **collection=**. That's your ID. In the following example, the **Investment** collection has the ID **50h55c**.
-
-  :::image type="content" source="media/register-scan-synapse-workspace/find-collection-id.png" alt-text="Screenshot of a collection ID in a URL." lightbox="media/register-scan-synapse-workspace/find-collection-id.png" :::
-
-- You can [list child collection names](/rest/api/purview/accountdataplane/collections/list-child-collection-names) of the root collection to list the collections, and then use the name instead of the friendly name.
-```json
-{
- "properties":{
- "resourceTypes":{
- "AzureSynapseServerlessSql":{
- "scanRulesetName":"AzureSynapseSQL",
- "scanRulesetType":"System",
- "resourceNameFilter":{
- "resources":[ "{serverless_database_name_1}", "{serverless_database_name_2}", ...]
- }
- }
- },
- "credential":{
- "referenceName":"{credential_name}",
- "credentialType":"SqlAuth | ServicePrincipal | ManagedIdentity (if UAMI authentication)"
- },
- "collection":{
- "referenceName":"{collection_id}",
- "type":"CollectionReference"
- },
- "connectedVia":{
- "referenceName":"{integration_runtime_name}",
- "integrationRuntimeType":"SelfHosted (if self-hosted IR) | Managed (if VNet IR)"
- }
- },
- "kind":"AzureSynapseWorkspaceCredential | AzureSynapseWorkspaceMsi (if system-assigned managed identity authentication)"
-}
-```
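
If you're scripting this call, you can pull the collection ID straight out of the governance portal URL. Here's a minimal sketch; the `contoso` account segment in the sample URL is a placeholder:

```python
from urllib.parse import parse_qs, urlparse

def collection_id_from_url(portal_url: str) -> str:
    """Return the value of the collection query parameter from a
    Microsoft Purview governance portal URL."""
    query = parse_qs(urlparse(portal_url).query)
    values = query.get("collection")
    if not values:
        raise ValueError("URL has no collection= parameter")
    return values[0]

# Using the example ID shown in the screenshot above:
url = "https://web.purview.azure.com/resource/contoso?collection=50h55c"
print(collection_id_from_url(url))  # 50h55c
```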
-
-To schedule the scan, create a trigger for it after scan creation. For more information, see [Triggers - Create Trigger](/rest/api/purview/scanningdataplane/triggers/create-trigger).
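
To assemble the request in code before sending it with your preferred HTTP client, a sketch like the following works. It's trimmed to the system-assigned managed identity case, and the account, source, scan, database, and collection values are placeholders:

```python
import json

def build_scan_request(account, data_source, scan, databases, collection_id):
    """Build the PUT URL and JSON body for creating a serverless SQL scan,
    mirroring the request body shown above."""
    url = (f"https://{account}.purview.azure.com/scan/datasources/"
           f"{data_source}/scans/{scan}?api-version=2022-02-01-preview")
    body = {
        "properties": {
            "resourceTypes": {
                "AzureSynapseServerlessSql": {
                    "scanRulesetName": "AzureSynapseSQL",
                    "scanRulesetType": "System",
                    "resourceNameFilter": {"resources": databases},
                }
            },
            "collection": {
                "referenceName": collection_id,
                "type": "CollectionReference",
            },
        },
        "kind": "AzureSynapseWorkspaceMsi",
    }
    return url, json.dumps(body, indent=2)

url, body = build_scan_request(
    "contoso-purview", "MySynapseSource", "WeeklyScan", ["db1", "db2"], "50h55c")
print(url)
```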
-
-### Troubleshooting
-
-If you have any problems with scanning:
-- Confirm that you followed all [prerequisites](#prerequisites).
-- Confirm that you set up [enumeration authentication](#enumeration-authentication) for your resources.
-- Confirm that you set up [authentication](#apply-permissions-to-scan-the-contents-of-the-workspace).
-- Check the network by confirming [firewall settings](#set-up-firewall-access-for-the-azure-synapse-analytics-workspace).
-- Review the [scan troubleshooting documentation](troubleshoot-connections.md).
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview Register Scan Teradata Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-teradata-source.md
- Title: Connect to and manage Teradata
-description: This guide describes how to connect to Teradata in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Teradata source.
- Previously updated: 04/20/2023
-# Connect to and manage Teradata in Microsoft Purview
-
-This article outlines how to register Teradata, and how to authenticate and interact with Teradata in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
-
-## Supported capabilities
-
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Labeling**|**Access Policy**|**Lineage**|**Data Sharing**|
-||||||||||
-| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan)| [Yes](#scan)| No| No | [Yes*](#lineage)| No |
-
-\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-
-The supported Teradata database versions are 12.x to 17.x.
-
-When scanning a Teradata source, Microsoft Purview supports:
-
-- Extracting technical metadata, including:
-
- - Server
- - Databases
- - Tables including the columns, foreign keys, indexes, and constraints
- - Views including the columns
- - Stored procedures including the parameter dataset and result set
- - Functions including the parameter dataset
-- Fetching static lineage on asset relationships among tables and views.
-
-When setting up a scan, you can choose to scan an entire Teradata server, or scope the scan to a subset of databases matching the given name(s) or name pattern(s).
-
-### Known limitations
-
-When an object is deleted from the data source, the subsequent scan currently won't automatically remove the corresponding asset in Microsoft Purview.
-
-### Required permissions for scan
-
-Microsoft Purview supports basic authentication (username and password) for scanning Teradata. The user should have SELECT permission granted for every individual system table listed below:
-
-```sql
-grant select on dbc.tvm to [user];
-grant select on dbc.dbase to [user];
-grant select on dbc.tvfields to [user];
-grant select on dbc.udtinfo to [user];
-grant select on dbc.idcol to [user];
-grant select on dbc.udfinfo to [user];
-```
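
If you provision scan users for several environments, the grant statements above can be generated with a small helper. This is a sketch; `purview_scan_user` is a placeholder name:

```python
# The DBC system tables Microsoft Purview needs SELECT on, per the list above.
SYSTEM_TABLES = [
    "dbc.tvm", "dbc.dbase", "dbc.tvfields",
    "dbc.udtinfo", "dbc.idcol", "dbc.udfinfo",
]

def teradata_scan_grants(user):
    """Return the GRANT statements for a Teradata scan user."""
    return [f"grant select on {table} to {user};" for table in SYSTEM_TABLES]

print("\n".join(teradata_scan_grants("purview_scan_user")))
```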
-
-To retrieve the data types of view columns, Microsoft Purview issues a prepare statement for `select * from <view>` for each view and parses the metadata that contains the data type details, for better performance. This requires the SELECT data permission on views. If the permission is missing, view column data types are skipped.
-
-For classification, the user also needs read permission on the tables/views to retrieve sample data.
-
-## Prerequisites
-
-* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* An active [Microsoft Purview account](create-catalog-portal.md).
-
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
-
- * Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.
-
- * Ensure Visual C++ Redistributable (version Visual Studio 2012 Update 4 or newer) is installed on the self-hosted integration runtime machine. If you don't have this update installed, [you can download it here](/cpp/windows/latest-supported-vc-redist).
-
- * Download the [Teradata JDBC driver](https://downloads.teradata.com/) on the machine where your self-hosted integration runtime is running. Note down the folder path which you will use to set up the scan.
-
- > [!Note]
- > The driver should be accessible by the self-hosted integration runtime. By default, self-hosted integration runtime uses [local service account "NT SERVICE\DIAHostService"](manage-integration-runtimes.md#service-account-for-self-hosted-integration-runtime). Make sure it has "Read and execute" and "List folder contents" permission to the driver folder.
-
-## Register
-
-This section describes how to register Teradata in Microsoft Purview using the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-
-### Steps to register
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-1. Select **Data Map** on the left navigation.
-1. Select **Register**.
-1. On **Register sources**, select **Teradata**, and then select **Continue**.
-
- :::image type="content" source="media/register-scan-teradata-source/register-sources.png" alt-text="register Teradata options" border="true":::
-
-On the **Register sources (Teradata)** screen, do the following:
-
-1. Enter a **Name** that the data source will be listed with in the Catalog.
-
-1. Enter the **Host** name to connect to a Teradata source. It can also be an IP address of the server.
-
-1. Select a collection or create a new one (Optional)
-
-1. Finish registering the data source.
-
- :::image type="content" source="media/register-scan-teradata-source/register-sources-2.png" alt-text="register Teradata" border="true":::
-
-## Scan
-
-Follow the steps below to scan Teradata to automatically identify assets. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-
-### Create and run scan
-
-1. In the Management Center, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it isn't, use the steps mentioned [here](./manage-integration-runtimes.md) to set one up.
-
-1. Select the **Data Map** tab on the left pane in the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Select the registered Teradata source.
-
-1. Select **New scan**.
-
-1. Provide the following details:
-
- 1. **Name**: The name of the scan
-
- 1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.
-
- 1. **Credential**: Select the credential to connect to your data source. Make sure to:
- * Select Basic Authentication while creating a credential.
- * Provide the user name used to connect to the database server in the **User name** input field.
- * Store the database server password in the secret key.
-
- For more information about credentials, see [Credentials for source authentication in Microsoft Purview](./manage-credentials.md).
-
- 1. **Schema**: List a subset of databases to import, expressed as a semicolon-separated list. For example: `schema1; schema2`. All user databases are imported if the list is empty. All system databases (for example, SysAdmin) and objects are ignored by default.
-
- Acceptable database name patterns use SQL LIKE expression syntax, including `%`. For example, `A%; %B; %C%; D` matches database names that:
- * Start with A, or
- * End with B, or
- * Contain C, or
- * Equal D
-
- Usage of NOT and special characters isn't acceptable.
-
- 1. **Driver location**: Specify the path to the JDBC driver on the machine where the self-hosted integration runtime is running, for example, `D:\Drivers\Teradata`. It's the path to a valid JAR folder location. Make sure the driver is accessible by the self-hosted integration runtime; learn more in the [prerequisites section](#prerequisites).
-
- 1. **Stored procedure details**: Controls the number of details imported from stored procedures:
-
- - Signature: The name and parameters of stored procedures.
- - Code, signature: The name, parameters and code of stored procedures.
- - Lineage, code, signature: The name, parameters and code of stored procedures, and the data lineage derived from the code.
- - None: Stored procedure details aren't included.
-
- 1. **Maximum memory available:** Maximum memory (in GB) available on the customer's VM to be used by scanning processes. This depends on the size of the Teradata source to be scanned.
-
- > [!Note]
- > As a rule of thumb, provide 2 GB of memory for every 1,000 tables.
-
- :::image type="content" source="media/register-scan-teradata-source/setup-scan.png" alt-text="setup scan" border="true":::
-
-1. Select **Continue**.
-
-1. Select a **scan rule set** for classification. You can choose between the system default, existing custom rule sets, or [create a new rule set](create-a-scan-rule-set.md) inline.
-
-1. Choose your **scan trigger**. You can set up a schedule or run the scan once.
-
-1. Review your scan and select **Save and Run**.
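
To preview locally which databases a **Schema** filter such as `A%; %B; %C%; D` would cover, you can translate the `%` wildcard into a regular expression. A sketch, assuming case-sensitive matching (the server's collation may differ):

```python
import re

def matches_filter(db_name, pattern_list):
    """Return True if db_name matches any pattern in a semicolon-separated
    list; % is the only wildcard (NOT and special characters aren't supported)."""
    for pattern in (p.strip() for p in pattern_list.split(";") if p.strip()):
        # Escape the literal parts, then join them with "match anything".
        regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
        if re.match(regex, db_name):
            return True
    return False

filters = "A%; %B; %C%; D"
candidates = ["Alpha", "LabB", "DoCs", "D", "Extra"]
print([db for db in candidates if matches_filter(db, filters)])  # ['Alpha', 'LabB', 'DoCs', 'D']
```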
--
-## Lineage
-
-After scanning your Teradata source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view asset details.
-
-Go to the asset's **Lineage** tab to see the asset relationships when applicable. Refer to the [supported capabilities](#supported-capabilities) section for the supported Teradata lineage scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).
--
-## Next steps
-
-Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
-- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
-- [Search Data Catalog](how-to-search-catalog.md)
purview Scan Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/scan-data-sources.md
- Title: How to scan sources
-description: Learn how to scan registered data sources in Microsoft Purview.
- Previously updated: 02/23/2023
-# Scan data sources in Microsoft Purview
-
-In Microsoft Purview, after you [register your data source](manage-data-sources.md#register-a-new-source), you can scan your source to capture technical metadata, extract schema, and apply classifications to your data.
-
-* For more information about scanning in general, see our [scanning concept article](concept-scans-and-ingestion.md).
-* For best practices, see our [scanning best practices article.](concept-best-practices-scanning.md)
-
-In this article, you'll learn the basic steps for scanning any data source.
-
->[!TIP]
->Each source has its own instructions and prerequisites for scanning. For the most complete scanning instructions, select your source from the [supported sources list](microsoft-purview-connector-overview.md) and review its scanning instructions.
-
-## Prerequisites
-
-[Here's a list of all the sources that are currently available to register and scan in Microsoft Purview.](microsoft-purview-connector-overview.md)
-
-Before you can scan your data source, you must take these steps:
-
-1. [Register your data source](manage-data-sources.md#register-a-new-source) - This essentially gives Microsoft Purview the address of your data source, and maps it to a [collection](catalog-permissions.md#a-collections-example) in the Microsoft Purview Data Map.
-1. Consider your network - If your source is in an on-premises network, or a virtual private network (VPN), or if your [Microsoft Purview account is using private endpoints](catalog-private-link-end-to-end.md), you'll need a self-hosted integration runtime, which is a tool that will sit on a machine in your private network so your source and Microsoft Purview can connect during the scan. [Here are the instructions to create a self-hosted integration runtime.](manage-integration-runtimes.md)
-1. Consider what credentials you're going to use to connect to your source. All [source pages](microsoft-purview-connector-overview.md) will have a **Scan** section that will include details about what authentication types are available.
-
-## Create a scan
-
-In the steps below we'll be using [Azure Blob Storage](register-scan-azure-blob-storage-source.md) as an example, and authenticating with the Microsoft Purview Managed Identity.
-
->[!IMPORTANT]
-> These are the general steps for creating a scan, but you should refer to [the source page](microsoft-purview-connector-overview.md) for source-specific prerequistes and scanning instructions.
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
- - Opening the [Azure portal](https://portal.azure.com), searching for and selecting the Microsoft Purview account. Select the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
- :::image type="content" source="./media/scan-data-sources/open-purview-studio.png" alt-text="Screenshot of Microsoft Purview window in Azure portal, with the Microsoft Purview governance portal button highlighted." border="true":::
-
-1. Navigate to the **Data map** -> **Sources** to view your registered sources either in a map or table view.
-1. Find your source and select the **New Scan** icon.
-
- :::image type="content" source="media/scan-data-sources/register-blob-new-scan.png" alt-text="Screenshot the new scan button highlighted by a registered source and the new scan window.":::
-
-1. Provide a **Name** for the scan.
-1. Select your authentication method. Here we chose the Purview MSI (managed identity).
-
- :::image type="content" source="media/scan-data-sources/register-blob-managed-identity.png" alt-text="Screenshot that shows the managed identity option to run the scan.":::
-
-1. Choose the current collection, or a subcollection, for the scan. The collection you choose will house the metadata discovered during the scan.
-
-1. Select **Test connection**. If it isn't successful, see the [scan troubleshooting documentation](troubleshoot-connections.md). On a successful connection, select **Continue**.
-
-1. Depending on the source, you can scope your scan to a specific subset of data. For Azure Blob Storage, we can select folders and subfolders by choosing the appropriate items in the list.
-
- :::image type="content" source="media/scan-data-sources/register-blob-scope-scan.png" alt-text="Screenshot showing the scope your scan window with files and folders selected.":::
-
-1. Select a scan rule set. The scan rule set contains the kinds of data [classifications](concept-classification.md) your scan will check for. You can choose between the system default (that will contain all classifications available for the source), existing custom rule sets made by others in your organization, or [create a new rule set inline](create-a-scan-rule-set.md).
-
- :::image type="content" source="media/scan-data-sources/register-blob-scan-rule-set.png" alt-text="Screenshot of the select a scan rule set page with the default set selected.":::
-
-1. Choose your scan trigger. You can set up a schedule (monthly or weekly) or run the scan once.
- >[!NOTE]
- > **Start recurrence at** must be at least 1 minute earlier than the **schedule scan time**; otherwise, the scan will be triggered in the next recurrence.
-
- :::image type="content" source="media/scan-data-sources/register-blob-scan-trigger.png" alt-text="Screenshot of the set a scan trigger page showing a recurring monthly schedule.":::
-
-1. Review your scan and select **Save and run**.
-
- :::image type="content" source="media/scan-data-sources/register-blob-review-scan.png" alt-text="Screenshot of the scan review page with the save and run button highlighted.":::
-
-## View a scan
-
-Depending on the amount of data in your data source, a scan can take some time to run, so here's how you can check on progress and see results when the scan is complete.
-
-1. You can view your scan from the collection or from the source itself.
-
-1. To view from the collection, navigate to your _Collection_ in the data map, and select the **Scans** button.
-
- :::image type="content" source="media/scan-data-sources/select-scans.png" alt-text="Screenshot of the collection page with the scans button highlighted.":::
-
-1. Select your scan name to see details.
-
- :::image type="content" source="media/scan-data-sources/select-scan-name.png" alt-text="Screenshot of the scans in the collection list with the most recent scan name highlighted.":::
-
-1. Or, you can navigate directly to the _data source_ in the _Collection_ and select **View Details** to check the status of the scan.
-
- :::image type="content" source="media/scan-data-sources/register-blob-view-scan.png" alt-text="Screenshot of the data map with a source's view details button highlighted.":::
-
-1. The scan details indicate the progress of the scan in the **Last run status** and the number of assets _scanned_ and _classified_.
-
- :::image type="content" source="media/scan-data-sources/register-blob-scan-details.png" alt-text="Screenshot of a source detail page, with the assets and scans highlighted.":::
-
-1. The **Last run status** will be updated to **In progress**, and then to **Completed** once the entire scan has run successfully.
-
- :::image type="content" source="media/scan-data-sources/register-blob-scan-in-progress.png" alt-text="Screenshot of a source detail page with a scan showing an in progress status.":::
-
- :::image type="content" source="media/scan-data-sources/register-blob-scan-completed.png" alt-text="Screenshot of a source detail page with a scan showing a completed status.":::
-
-## Manage a scan
-
-After a scan is complete, it can be managed or run again.
-
-1. Select the **Scan name** from either the collections list or the source page to manage the scan.
-
- :::image type="content" source="media/scan-data-sources/register-blob-manage-scan.png" alt-text="Screenshot of a source details page with the scan name link highlighted.":::
-
-1. You can _run the scan_ again, _edit the scan_, or _delete the scan_.
-
- :::image type="content" source="media/scan-data-sources/register-blob-manage-scan-options.png" alt-text="Screenshot of a manage scan page with the run, edit, and delete buttons highlighted.":::
-
-1. You can run a full scan, which will scan all the content in your scope, but some sources also have **incremental scan** available. Incremental scan will scan only those resources that have been updated since the last scan. Check the **supported capabilities** table in your source page to see if incremental scan is available for your source after the first scan.
-
- :::image type="content" source="media/scan-data-sources/register-blob-full-inc-scan.png" alt-text="Screenshot of the run scan now button showing the full and incremental scan options.":::
-
-## Troubleshooting
-
-Setting up the connection for your scan can be complex, since it's a custom setup for your network and your credentials.
-
-If you're unable to connect to your source, follow these steps:
-
-1. Review your [source page](microsoft-purview-connector-overview.md) prerequisites to make sure there's nothing you've missed.
-1. Review your authentication option in the **Scan** section of your source page to confirm you have set up the authentication method correctly.
-1. Review our [troubleshoot connections page](troubleshoot-connections.md).
-1. [Create a support request](../azure-portal/supportability/how-to-create-azure-support-request.md#go-to-help--support-from-the-global-header), so our support team can help you troubleshoot your specific environment.
-
-## Next steps
-
-* [Scanning best practices](concept-best-practices-scanning.md)
-* [Azure Data Lake Storage Gen 2](register-scan-adls-gen2.md)
-* [Power BI tenant](register-scan-power-bi-tenant.md)
-* [Azure SQL Database](register-scan-azure-sql-database.md)
purview Scanning Shir Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/scanning-shir-troubleshooting.md
- Title: Troubleshoot the self-hosted integration runtime in Microsoft Purview
-description: Learn how to troubleshoot self-hosted integration runtime issues in Microsoft Purview.
- Previously updated: 04/03/2023
-# Troubleshoot Microsoft Purview self-hosted integration runtime (SHIR)
-
-This article explores common troubleshooting methods for self-hosted integration runtime (SHIR) in Microsoft Purview.
-
-The self-hosted integration runtime (SHIR) is also used by Azure Data Factory and Azure Synapse Analytics. Though many of the troubleshooting steps overlap, follow this guide to troubleshoot your SHIR for those products:
-- [Troubleshoot SHIR for Azure Data Factory and Azure Synapse Analytics](../data-factory/self-hosted-integration-runtime-troubleshoot-guide.md)
-
-## Gather Microsoft Purview-specific self-hosted integration runtime (SHIR) logs
-
-For failed Microsoft Purview activities that are running on a self-hosted IR or shared IR, the service supports viewing and uploading error logs from the [Windows Event Viewer](/shows/inside/event-viewer).
-
-You can look up any errors you see in the error guide below.
-To get support and troubleshooting guidance for SHIR issues, you may need to generate an error report ID and [reach out to Microsoft support](https://azure.microsoft.com/support/create-ticket/).
-
-To generate the error report ID for Microsoft Support, follow these instructions:
-
-1. Before starting a scan in the Microsoft Purview governance portal:
-
- 1. Navigate to the machine where the self-hosted integration runtime is installed and open the Windows Event Viewer.
- 1. Clear the Windows Event Viewer logs in the **Integration Runtime** section. Right-click the logs and select the clear logs option.
- 1. Navigate back to the Microsoft Purview governance portal and start the scan.
-
-1. Once the scan shows a **Failed** status, navigate back to the SHIR VM or machine and refresh the Event Viewer in the **Integration Runtime** section.
-1. The activity logs are displayed for the failed scan run.
-
-1. For further assistance from Microsoft, select **Send Logs**.
-
- The **Share the self-hosted integration runtime (SHIR) logs with Microsoft** window opens.
-
- :::image type="content" source="media/scanning-shir-troubleshooting/shir-send-logs-ir.png" lightbox="media/scanning-shir-troubleshooting/shir-send-logs-ir.png" alt-text="Screenshot of the send logs button on the self-hosted integration runtime (SHIR) to upload logs to Microsoft.":::
-
-1. Select which logs you want to send.
-
- * For a *self-hosted IR*, you can upload logs that are related to the failed activity or all logs on the self-hosted IR node.
- * For a *shared IR*, you can upload only logs that are related to the failed activity.
-
-1. When the logs are uploaded, keep a record of the Report ID for later use if you need further assistance to solve the issue.
-
- :::image type="content" source="media/scanning-shir-troubleshooting/shir-send-logs-complete.png" lightbox="media/scanning-shir-troubleshooting/shir-send-logs-complete.png" alt-text="Screenshot of the displayed report ID in the upload progress window for the Purview SHIR logs.":::
-
-> [!NOTE]
-> Log viewing and uploading requests are executed on all online self-hosted IR instances. If any logs are missing, make sure that all the self-hosted IR instances are online.
-
-## Self-hosted integration runtime (SHIR) general failures and errors
-
-Many errors, warnings, and issues are common to the Purview SHIR and the Azure Data Factory or Azure Synapse SHIR. If your SHIR issues aren't resolved at this stage, refer to the [Azure Data Factory (ADF) or Azure Synapse SHIR troubleshooting guide](../data-factory/self-hosted-integration-runtime-troubleshoot-guide.md).
--
-## Next steps
-
-For more help with troubleshooting, try the following resources:
-
-- [Getting started with Microsoft Purview](https://azure.microsoft.com/products/purview/)
-- [Create and manage self-hosted integration runtimes (SHIR) in Purview](manage-integration-runtimes.md)
-- [Microsoft Purview frequently asked questions](frequently-asked-questions.yml)
purview Self Hosted Integration Runtime Version https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/self-hosted-integration-runtime-version.md
- Title: Self-hosted integration runtime auto-update and expiration
-description: Learn about self-hosted integration runtime auto-update and expiration in Microsoft Purview.
- Previously updated: 01/31/2023
-# Self-hosted integration runtime auto-update and expiration
-
-This article describes how to let the self-hosted integration runtime auto-update to the latest version, and how Microsoft Purview manages the versions of the self-hosted integration runtime.
-
-## How to check your self-hosted integration runtime version
-
-You can check the version of your self-hosted integration runtime in the Microsoft Purview governance portal, under **Data map** > **Integration runtimes**:
--
-You can also check the version in your self-hosted integration runtime client, on the **Help** tab.
-
-## Self-hosted Integration Runtime auto-update
-
-Auto-update is enabled by default when you install a self-hosted integration runtime. You have two options to manage the version of the self-hosted integration runtime: auto-update, or maintain it manually. Typically, Microsoft Purview releases two new versions of the self-hosted integration runtime every month, each including new features, bug fixes, or enhancements. So we recommend that you update to the newer version in order to get the latest features and enhancements.
-
-The self-hosted integration runtime will be automatically updated to the newer version. When a new version is available but not yet scheduled for your instance, you can also trigger the update from the portal.
--
-> [!NOTE]
-> If you have multiple self-hosted integration runtime nodes, there's no downtime during auto-update. The auto-update happens on one node first while the other nodes are working on tasks. When the first node finishes the update, it takes over the remaining tasks while the other nodes update. If you only have one self-hosted integration runtime node, there's some downtime during the auto-update.
-
-## Auto-update version vs latest version
-
-To ensure the stability of the self-hosted integration runtime, although we release two versions each month, we only push one version via auto-update. So sometimes you'll find that the auto-update version is the version before the actual latest version. If you want the latest version, you can go to the [download center](https://www.microsoft.com/download/details.aspx?id=39717) and update manually. Additionally, **auto-update** to a new version is managed by the service, and you can't change it.
-
-The self-hosted integration runtime **Version** tab in the Microsoft Purview governance portal shows the newer version if the current version is old. When your self-hosted integration runtime is online, this is the auto-update version, and it automatically updates your self-hosted integration runtime at the scheduled time. If your self-hosted integration runtime is offline, the page only shows the newer version.
-
-If you have multiple nodes and, for some reason, some of them aren't auto-updated successfully, those nodes roll back to the version that was the same across all nodes before the auto-update.
-
-## Self-hosted Integration Runtime expiration
-
-Each version of the self-hosted integration runtime expires one year after its release. An expiration message is shown in the Microsoft Purview governance portal and in the self-hosted integration runtime client **90 days** before expiration.
-
-## Next steps
--- [Create and manage a self-hosted integration runtime](manage-integration-runtimes.md)
purview Sensitivity Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/sensitivity-insights.md
- Title: Sensitivity label reporting on your data in Microsoft Purview using Microsoft Purview Data Estate Insights
-description: This how-to guide describes how to view and use sensitivity label reporting on your data.
- Previously updated: 02/20/2023
-#Customer intent: As a security officer, I need to understand how to use Microsoft Purview Data Estate Insights to learn about sensitive data identified and classified and labeled during scanning.
--
-# Sensitivity label insights about your data in Microsoft Purview
-
-This how-to guide describes how to access, view, and filter security insights provided by sensitivity labels applied to your data.
-
-Supported data sources include: Azure Blob Storage, Azure Data Lake Storage (ADLS) Gen1, Azure Data Lake Storage (ADLS) Gen2, SQL Server, Azure SQL Database, Azure SQL Managed Instance, Amazon S3 buckets, Amazon RDS databases (public preview), and Power BI.
-
-In this how-to guide, you'll learn how to:
-
-> [!div class="checklist"]
-> - Launch your Microsoft Purview account from Azure.
-> - View sensitivity labeling insights on your data
-> - Drill down for more sensitivity labeling details on your data
-
-
-## Prerequisites
-
-Before getting started with Microsoft Purview Data Estate Insights, make sure that you've completed the following steps:
-- Set up your Azure resources and populated the relevant accounts with test data.
-
-- [Extended sensitivity labels to assets in the Microsoft Purview Data Map](how-to-automatically-label-your-content.md), and created or selected the labels you want to apply to your data.
-
-- Set up and completed a scan on the test data in each data source. For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
-
-- Signed in to Microsoft Purview with an account that has a [Data Reader or Data Curator role](catalog-permissions.md#roles).
-
-For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Automatically label your data in Microsoft Purview](create-sensitivity-label.md).
-
-## Use Microsoft Purview Data Estate Insights for sensitivity labels
-
-Classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
-
-Sensitivity labels enable you to state how sensitive certain data is in your organization. For example, a specific project name might be highly confidential within your organization, while that same term is not confidential to other organizations.
-
-Classifications are matched directly, such as a social security number, which has a classification of **Social Security Number**.
-
-In contrast, sensitivity labels are applied when one or more classifications and conditions are found together. In this context, [conditions](/microsoft-365/compliance/apply-sensitivity-label-automatically) refer to all the parameters that you can define for unstructured data, such as **proximity to another classification**, and **% confidence**.
-
-Microsoft Purview Data Estate Insights uses the same classifications, also known as [sensitive information types](/microsoft-365/compliance/sensitive-information-type-entity-definitions), as those used with Microsoft 365 apps and services. This enables you to extend your existing sensitivity labels to assets in the data map.
-
-> [!NOTE]
-> After you have scanned your source types, give **Sensitivity labeling** Insights a couple of hours to reflect the new assets.
-
-**To view sensitivity labeling insights:**
-
-1. Go to the **Microsoft Purview** home page.
-
-1. On the **Overview** page, in the **Get Started** section, select the **Launch Microsoft Purview account** tile.
-
-1. In Microsoft Purview, select the **Data Estate Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Data Estate Insights** area.
-
-1. In the **Data Estate Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Sensitivity labels** to display the Microsoft Purview **Sensitivity labeling insights** report.
-
- > [!NOTE]
- > If this report is empty, you may not have extended your sensitivity labels to Microsoft Purview Data Map. For more information, see [Labeling in the Microsoft Purview Data Map](create-sensitivity-label.md).
-
- :::image type="content" source="media/insights/sensitivity-labeling-insights-small.png" alt-text="Sensitivity labeling insights":::
-
- The main **Sensitivity labeling insights** page displays the following areas:
-
- |Area |Description |
- |||
- |**Overview of sources with sensitivity labels** |Displays tiles that provide: <br>- The number of subscriptions found in your data. <br>- The number of unique sensitivity labels applied on your data <br>- The number of sources with sensitivity labels applied <br>- The number of files and tables found with sensitivity labels applied|
- |**Top sources with labeled data (last 30 days)** | Shows the trend, over the past 30 days, of the number of sources with sensitivity labels applied. |
- |**Top labels applied across sources** |Shows the top labels applied across all of your Microsoft Purview data resources. |
- |**Top labels applied on files** |Shows the top sensitivity labels applied to files in your data. |
- |**Top labels applied on tables** | Shows the top sensitivity labels applied to database tables in your data. |
- | **Labeling activity** | Displays separate graphs for files and tables, each showing the number of files or tables labeled over the selected time frame. <br>**Default**: 30 days<br>Select the **Time** filter above the graphs to select a different time frame to display. |
- | | |
-
-## Sensitivity labeling insights drilldown
-
-In any of the following **Sensitivity labeling insights** graphs, select the **View more** link to drill down for more details:
-
-- **Top labels applied across sources**
-- **Top labels applied on files**
-- **Top labels applied on tables**
-- **Labeling activity > Labeled data**
-
-For example:
--
-Do any of the following to learn more:
-
-|Option |Description |
-|||
-|**Filter your data** | Use the filters above the grid to filter the data shown, including the label name, subscription name, or source type. <br><br>If you're not sure of the exact label name, you can enter part or all of the name in the **Filter by keyword** box. |
-|**Sort the grid** |Select a column header to sort the grid by that column. |
-|**Edit columns** | To display more or fewer columns in your grid, select **Edit Columns** :::image type="icon" source="media/insights/ico-columns.png" border="false":::, and then select the columns you want to view or change the order. <br><br>Select a column header to sort the grid by that column. |
-|**Drill down further** | To drill down to a specific label, select a name in the **Sensitivity label** column to view the **Label by source** report. <br><br>This report displays data for the selected label, including the source name, source type, subscription ID, and the numbers of classified files and tables. |
-|**Browse assets** | To browse through the assets found with a specific label or source, select one or more labels or sources, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
-| | |
-
-## Sensitivity label integration with Microsoft Purview Information Protection
-
-Close integration with [Microsoft Purview Information Protection](/microsoft-365/compliance/information-protection) means that you have direct ways to extend visibility into your data estate, and classify and label your data.
-
-For sensitivity labels to be extended to your assets in the data map, you must actively turn on this capability in the Microsoft Purview compliance portal.
-
-For more information, see [How to automatically apply sensitivity labels to your data in the Microsoft Purview Data Map](how-to-automatically-label-your-content.md).
-
-## Next steps
-
-Learn how to use Data Estate Insights with sources below:
-
-* [Learn how to use Asset insights](asset-insights.md)
-* [Learn how to use Data Stewardship](data-stewardship.md)
-* [Learn how to use Classification insights](classification-insights.md)
-* [Learn how to use Glossary insights](glossary-insights.md)
purview Supported Browsers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/supported-browsers.md
- Title: Supported browsers
-description: This article provides the list of supported browsers for Microsoft Purview.
- Previously updated: 12/06/2022
-# Supported Browsers
-
-Microsoft Purview supports the following browsers. We recommend that you use the most up-to-date browser that's compatible with your operating system.
-
-* Microsoft Edge (latest version)
-* Safari (latest version, Mac only)
-* Chrome (latest version)
-* Firefox (latest version)
-
-## Chrome Incognito mode
-
In Chrome Incognito mode, the blocking of third-party cookies must be disabled for the Microsoft Purview governance portal to work.
--
-## Microsoft Edge InPrivate mode
-
-In Microsoft Edge InPrivate mode, Strict tracking prevention must be disabled for the Microsoft Purview governance portal to work.
-
purview Supported Classifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/supported-classifications.md
- Title: List of supported classifications
-description: This page lists the supported system classifications in Microsoft Purview.
- Previously updated: 04/25/2023
-#Customer intent: As a data steward or catalog administrator, I need to understand what's supported under classifications.
--
-# System classifications in Microsoft Purview
-
-This article lists the supported system classifications in Microsoft Purview. To learn more about classification, see [Classification](concept-classification.md).
-
-Microsoft Purview classifies data by using [RegEx](https://wikipedia.org/wiki/Regular_expression), [Bloom Filter](https://wikipedia.org/wiki/Bloom_filter), and machine learning models. The following lists describe the format, pattern, and keywords for the Microsoft Purview-defined system classifications. Each classification name is prefixed by *MICROSOFT*.
-
-> [!Note]
-> Microsoft Purview can classify both structured data (CSV, TSV, JSON, SQL tables, etc.) and unstructured data (DOC, PDF, TXT, etc.). However, certain classifications are applicable only to structured data. Here's the list of classifications that Microsoft Purview doesn't apply to unstructured data: City Name, Country Name, Date Of Birth, Email, Ethnic Group, GeoLocation, Person Name, U.S. Phone Number, U.S. States, U.S. ZipCode.
-
-> [!Note]
-> **Minimum match threshold**: The minimum percentage of data value matches in a column that the scanner must find for the classification to be applied. For system classifications, the minimum match threshold is set at 60% and can't be changed. For custom classifications, this value is configurable.
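The threshold rule above can be illustrated with a short sketch. This isn't Purview's implementation; the `classify_column` helper is hypothetical and only shows how a minimum match threshold gates a column-level classification:

```python
import re

def classify_column(values, pattern, threshold=0.60):
    """Apply a classification only if the fraction of non-empty values
    matching the pattern meets the minimum match threshold (60% is the
    fixed threshold for system classifications)."""
    non_empty = [v for v in values if v]
    if not non_empty:
        return False
    matches = sum(1 for v in non_empty if re.fullmatch(pattern, v))
    return matches / len(non_empty) >= threshold

# Three of four values are nine-digit numbers: 75% >= 60%, so classify.
column = ["021000021", "123456789", "987654321", "n/a"]
print(classify_column(column, r"\d{9}"))  # True
```

A custom classification would pass its own configured threshold instead of the 60% default.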
-
-## Bloom Filter based classifications
-
-### World Cities, Country
-
-The City and Country classifiers identify this data based on both full names and short codes.
-
-#### Keywords
-
-##### Keywords for City
-- burg
-- city
-- cities
-- city names
-- cosmopolis
-- metropolis
-- municipality
-- place
-- town
-
-##### Keywords for Country
-- country
-- countries
-- country names
-- nation
-- nationality
-
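To show why a Bloom filter suits large keyword sets like these city and country names, here's a minimal, self-contained sketch (not Purview's implementation): membership tests can return false positives but never false negatives, which is acceptable when a match only nominates a value for classification.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, value):
        # Derive k independent bit positions from salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{value}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, value):
        for pos in self._positions(value):
            self.bits |= 1 << pos

    def __contains__(self, value):
        # All k bits set => "probably present"; any bit clear => definitely absent.
        return all(self.bits >> pos & 1 for pos in self._positions(value))

cities = BloomFilter()
for name in ["Seattle", "Mumbai", "Nairobi"]:
    cities.add(name.lower())

print("seattle" in cities)  # True (added values are never missed)
```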
-## Machine Learning based classifications
-
-> [!NOTE]
-> Machine learning based classifiers are only supported for structured data like tabular or columnar data sources.
-
-### Person's Name
-
-The Person's Name machine learning model has been trained using global datasets of names in the English language. Microsoft Purview classifies full names stored in a single column, as well as first and last names stored in separate columns.
----
-### Person's Address
-The Person's Address classification is used to detect a full address stored in a single column containing the following elements: house number, street name, city, state, country/region, and zip code. The Person's Address classifier uses a machine learning model trained on a global address dataset in the English language.
-
-#### Supported formats
-Currently, the address model supports the following formats in the same column:
-
-- number, street, city
-- name, street, pincode or zipcode
-- number, street, area, pincode or zipcode
-- street, city, pincode or zipcode
-- landmark, city
-
-### Person's Gender
-The Person's Gender machine learning model has been trained using US Census data and other public data sources in the English language. It supports classifying 50+ genders out of the box.
-
-#### Keywords
-- sex
-- gender
-- sexual
-- orientation
-
-### Person's Age
-The Person's Age machine learning model detects the age of an individual specified in various formats. The qualifiers for days, months, and years must be in the English language.
-
-#### Keywords
-- age
-- ages
-
-#### Supported formats
-- {%y} y, {%m} m
-- {%y} years {%m} months
-- {%y} years and {%m} months
-- {%y} years {%w} weeks
-- {%y} years and {%w} weeks
-- {%y} y, {%d} d
-- {%y} y, {%w} w
-- {%y} years, {%d} days
-- {%y} years and {%d} days
-- {%y} years, {%m} months and {%d} days
-- {%y} months and {%d} days
-- {%y} yr
-- {%y}.{%yd} yr
-- {%y} years
-- {%y} years old
-- {%y}.{%yd} years
-- age {%y}
-- {%y} to {%y2}
-- {%y} to {%y2} yrs
-- {%y} years to {%y2} years
-- {%m} months to {%y} years
-- {%m} m to {%y} years
-- {%y}-{%y2} yrs
-- {%y}-{%y2}
-- {%y} - {%y2}
-- {%y}+
-- {%m}-{%m2} mos
-- {%y} and over
-- {%y} and under
-- below {%y}
-- above {%y}
-- month {%m}
-- week {%w}
-- {%y}
-
-#### Unsupported formats
-- {%y}y {%m}m
-- {%y}y {%d}d
-- {%y}y {%w}w
-- {%y}.{%m}
-- {%y}.{%yd}
-
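As an illustration of how a couple of the supported formats above can be recognized, here's a regex sketch. The pattern is hypothetical (it covers only the `{%y} years {%m} months`-style variants, not the whole list) and isn't Purview's actual detector:

```python
import re

# Matches e.g. "12 years and 6 months", "12 years 6 months", "7 years".
AGE = re.compile(
    r"(?P<years>\d{1,3})\s*years?"
    r"(?:\s*(?:and\s*)?(?P<months>\d{1,2})\s*months?)?",
    re.IGNORECASE,
)

m = AGE.fullmatch("12 years and 6 months")
print(m.group("years"), m.group("months"))  # 12 6
```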
-## RegEx Classifications
-
-### ABA routing number
-
-#### Format
-
-Nine digits that can be in a formatted or unformatted pattern.
-
-#### Pattern
-
-- two digits in the ranges 00-12, 21-32, 61-72, or 80
-- two digits
-- an optional hyphen
-- four digits
-- an optional hyphen
-- a digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_aba_routing
-
-- aba number
-- aba#
-- aba
-- abarouting#
-- abaroutingnumber
-- americanbankassociationrouting#
-- americanbankassociationroutingnumber
-- bankrouting#
-- bankroutingnumber
-- routing #
-- routing no
-- routing number
-- routing transit number
-- routing#
-- RTN
-
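The ABA checksum uses the well-known 3-7-1 weighting: the nine digits, weighted 3, 7, 1 repeating, must sum to a multiple of 10. A minimal sketch (the `valid_aba` helper is hypothetical and checks only the checksum, not the leading two-digit range rule described above):

```python
def valid_aba(number):
    """ABA routing number checksum: weight the nine digits by
    3, 7, 1 repeating; the weighted sum must be divisible by 10."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) != 9:
        return False
    weights = [3, 7, 1, 3, 7, 1, 3, 7, 1]
    return sum(d * w for d, w in zip(digits, weights)) % 10 == 0

print(valid_aba("021000021"))  # True
print(valid_aba("123456789"))  # False
```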
-### Argentina national identity (DNI) number
-
-#### Format
-
-Eight digits with or without periods
-
-#### Pattern
-
-Eight digits:
-
-- two digits
-- an optional period
-- three digits
-- an optional period
-- three digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_argentina_national_id
-
-- Argentina National Identity number
-- cedula
-- cédula
-- dni
-- documento nacional de identidad
-- documento número
-- documento numero
-- registro nacional de las personas
-- rnp
-
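The DNI pattern above translates directly into a regex. This is an illustrative sketch (the pattern name is hypothetical, and Purview's actual expression may differ):

```python
import re

# Eight digits, with optional periods after the second and fifth digits.
DNI = re.compile(r"\d{2}\.?\d{3}\.?\d{3}")

print(bool(DNI.fullmatch("20.123.456")))  # True
print(bool(DNI.fullmatch("20123456")))    # True
print(bool(DNI.fullmatch("2012345")))     # False (only seven digits)
```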
-### Australia bank account number
-
-#### Format
-
-six to 10 digits with or without a bank state branch number
-
-#### Pattern
-
-Account number is 6 to 10 digits.
-
-Australia bank state branch number:
-- three digits
-- a hyphen
-- three digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_australia_bank_account_number
-
-- swift bank code
-- correspondent bank
-- base currency
-- usa account
-- holder address
-- bank address
-- information account
-- fund transfers
-- bank charges
-- bank details
-- banking information
-- full names
-- iaea
-
-### Australia business number
-
-#### Format
-
-11 digits with optional delimiters
-
-#### Pattern
-
-11 digits with optional delimiters:
-
-- two digits
-- an optional hyphen or space
-- three digits
-- an optional hyphen or space
-- three digits
-- an optional hyphen or space
-- three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_australia_business_number
-
-- australia business no
-- business number
-- abn#
-- businessid#
-- business id
-- abn
-- businessno#
-
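The ABN checksum algorithm published by the Australian Taxation Office subtracts 1 from the first digit, weights the 11 digits by 10, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, and requires the sum to be divisible by 89. A sketch under that assumption (the helper name is illustrative):

```python
def valid_abn(number):
    """ABN check: subtract 1 from the first digit, apply the weights
    10, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, and require the weighted
    sum to be divisible by 89."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) != 11:
        return False
    digits[0] -= 1
    weights = [10, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19]
    return sum(d * w for d, w in zip(digits, weights)) % 89 == 0

print(valid_abn("51 824 753 556"))  # True (the ATO's own ABN)
```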
-### Australia company number
-
-#### Format
-
-nine digits with delimiters
-
-#### Pattern
-
-nine digits with delimiters:
-
-- three digits
-- a space
-- three digits
-- a space
-- three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_australia_company_number
-
-- acn
-- australia company no
-- australia company no#
-- australia company number
-- australian company no
-- australian company no#
-- australian company number
-
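The ACN check digit algorithm published by ASIC weights the first eight digits by 8 down to 1 and derives the ninth digit as a complement. A sketch under that assumption (the `valid_acn` helper is illustrative, not Purview's implementation):

```python
def valid_acn(number):
    """ACN check: weight the first eight digits by 8 down to 1, compute
    (10 - (sum mod 10)) mod 10, and compare with the ninth (check) digit."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) != 9:
        return False
    check = (10 - sum(d * w for d, w in zip(digits, range(8, 0, -1))) % 10) % 10
    return check == digits[8]

print(valid_acn("000 000 019"))  # True (ASIC's documented example)
```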
-### Australia driver's license number
-
-#### Format
-
-nine letters and digits
-
-#### Pattern
-
-nine letters and digits:
-
-- two digits or letters (not case-sensitive)
-- two digits
-- five digits or letters (not case-sensitive)
-
-OR
-
-- one to two optional letters (not case-sensitive)
-- four to nine digits
-
-OR
-
-- nine digits or letters (not case-sensitive)
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_australia_drivers_license_number
-
-- international driving permits
-- australian automobile association
-- international driving permit
-- DriverLicence
-- DriverLicences
-- Driver Lic
-- Driver Licence
-- Driver Licences
-- DriversLic
-- DriversLicence
-- DriversLicences
-- Drivers Lic
-- Drivers Lics
-- Drivers Licence
-- Drivers Licences
-- Driver'Lic
-- Driver'Lics
-- Driver'Licence
-- Driver'Licences
-- Driver' Lic
-- Driver' Lics
-- Driver' Licence
-- Driver' Licences
-- Driver'sLic
-- Driver'sLics
-- Driver'sLicence
-- Driver'sLicences
-- Driver's Lic
-- Driver's Lics
-- Driver's Licence
-- Driver's Licences
-- DriverLic#
-- DriverLics#
-- DriverLicence#
-- DriverLicences#
-- Driver Lic#
-- Driver Lics#
-- Driver Licence#
-- Driver Licences#
-- DriversLic#
-- DriversLics#
-- DriversLicence#
-- DriversLicences#
-- Drivers Lic#
-- Drivers Lics#
-- Drivers Licence#
-- Drivers Licences#
-- Driver'Lic#
-- Driver'Lics#
-- Driver'Licence#
-- Driver'Licences#
-- Driver' Lic#
-- Driver' Lics#
-- Driver' Licence#
-- Driver' Licences#
-- Driver'sLic#
-- Driver'sLics#
-- Driver'sLicence#
-- Driver'sLicences#
-- Driver's Lic#
-- Driver's Lics#
-- Driver's Licence#
-- Driver's Licences#
-
-##### Keyword_australia_drivers_license_number_exclusions
-
-- aaa
-- DriverLicense
-- DriverLicenses
-- Driver License
-- Driver Licenses
-- DriversLicense
-- DriversLicenses
-- Drivers License
-- Drivers Licenses
-- Driver'License
-- Driver'Licenses
-- Driver' License
-- Driver' Licenses
-- Driver'sLicense
-- Driver'sLicenses
-- Driver's License
-- Driver's Licenses
-- DriverLicense#
-- DriverLicenses#
-- Driver License#
-- Driver Licenses#
-- DriversLicense#
-- DriversLicenses#
-- Drivers License#
-- Drivers Licenses#
-- Driver'License#
-- Driver'Licenses#
-- Driver' License#
-- Driver' Licenses#
-- Driver'sLicense#
-- Driver'sLicenses#
-- Driver's License#
-- Driver's Licenses#
-
-### Australia medical account number
-
-#### Format
-
-10-11 digits
-
-#### Pattern
-
-10-11 digits:
-- First digit is in the range 2-6
-- Ninth digit is a check digit
-- Tenth digit is the issue digit
-- Eleventh digit (optional) is the individual number
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_Australia_Medical_Account_Number
-
-- bank account details
-- medicare payments
-- mortgage account
-- bank payments
-- information branch
-- credit card loan
-- department of human services
-- local service
-- medicare
-
### Australia passport number

#### Format

eight or nine alphanumeric characters

#### Pattern

- one letter (N, E, D, F, A, C, U, X) followed by seven digits

**or**

- two letters (PA, PB, PC, PD, PE, PF, PU, PW, PX, PZ) followed by seven digits

#### Checksum

No

#### Keywords

##### Keyword_australia_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
- passport details
- immigration and citizenship
- commonwealth of australia
- department of immigration
- national identity card
- travel document
- issuing authority
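The two letter-plus-digit shapes above translate directly into a regular expression. The sketch below is illustrative only (it is not the expression the service itself uses) and encodes exactly the single-letter and two-letter prefixes listed in the pattern:

```python
import re

# One of N, E, D, F, A, C, U, X followed by seven digits,
# or P plus one of A, B, C, D, E, F, U, W, X, Z followed by seven digits.
AU_PASSPORT_RE = re.compile(r"^(?:[NEDFACUX]\d{7}|P[ABCDEFUWXZ]\d{7})$")

def looks_like_au_passport(value: str) -> bool:
    """Return True if the string matches the documented shape."""
    return AU_PASSPORT_RE.match(value) is not None
```

There is no checksum for this entity, so a match on shape alone is only a weak signal; the keyword list above is what raises confidence.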
### Australia tax file number

#### Format

eight to nine digits

#### Pattern

eight to nine digits typically presented with spaces as follows:

- three digits
- an optional space
- three digits
- an optional space
- two to three digits where the last digit is a check digit

#### Checksum

Yes

#### Keywords

##### Keyword_australia_tax_file_number

- australian business number
- marginal tax rate
- medicare levy
- portfolio number
- service veterans
- withholding tax
- individual tax return
- tax file number
- tfn
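The TFN check digit is commonly described as a weighted sum that must divide evenly by 11. The sketch below covers the usual nine-digit case with the widely published weights; it is an illustration of that public algorithm, not the service's own validator:

```python
def tfn_valid(tfn: str) -> bool:
    """Weighted-sum check for a nine-digit Australian tax file number.

    Weights 1, 4, 3, 7, 5, 8, 6, 9, 10 are applied left to right;
    a valid TFN's weighted sum is divisible by 11.
    """
    digits = [int(c) for c in tfn if c.isdigit()]
    if len(digits) != 9:
        return False  # eight-digit TFNs exist but use a different first weight
    weights = [1, 4, 3, 7, 5, 8, 6, 9, 10]
    return sum(d * w for d, w in zip(digits, weights)) % 11 == 0
```

For example, the commonly cited test value `123 456 782` sums to 253, which is 23 × 11.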
### Austria driver's license number

#### Format

eight digits without spaces and delimiters

#### Pattern

eight digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_austria_eu_driver's_license_number

- fuhrerschein
- führerschein
- Führerscheine
- Führerscheinnummer
- Führerscheinnummern
### Austria identity card

#### Format

A 24-character combination of letters, digits, and special characters

#### Pattern

24 characters:

- 22 letters (not case-sensitive), digits, backslashes, forward slashes, or plus signs
- two letters (not case-sensitive), digits, backslashes, forward slashes, plus signs, or equal signs

#### Checksum

Not applicable

#### Keywords

##### Keywords_austria_eu_national_id_card

- identity number
- national id
- personalausweis republik österreich
### Austria passport number

#### Format

One letter followed by an optional space and seven digits

#### Pattern

A combination of one letter, seven digits, and one space:

- one letter (not case-sensitive)
- one space (optional)
- seven digits

#### Checksum

Not applicable

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_austria_eu_passport_number

- reisepassnummer
- reisepasse
- No-Reisepass
- Nr-Reisepass
- Reisepass-Nr
- Passnummer
- reisepässe

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### Austria social security number

#### Format

10 digits in the specified format

#### Pattern

10 digits:

- three digits that correspond to a serial number
- one check digit
- six digits that correspond to the birth date (DDMMYY)

#### Checksum

Yes

#### Keywords

##### Keywords_austria_eu_ssn_or_equivalent

- austrian ssn
- ehic number
- ehic no
- insurance code
- insurancecode#
- insurance number
- insurance no
- krankenkassennummer
- krankenversicherung
- socialsecurityno
- socialsecurityno#
- social security no
- social security number
- social security code
- sozialversicherungsnummer
- sozialversicherungsnummer#
- soziale sicherheit kein
- sozialesicherheitkein#
- ssn#
- ssn
- versicherungscode
- versicherungsnummer
- zdravstveno zavarovanje
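The check digit in position four is commonly described as a weighted sum modulo 11 over the other nine digits (weights 3, 7, 9 for the serial number and 5, 8, 4, 2, 1, 6 for the birth date). The sketch below illustrates that published scheme; it is not the service's own validator, and serial numbers whose weighted sum is 10 mod 11 are simply never issued:

```python
def svnr_valid(svnr: str) -> bool:
    """Check an Austrian social security number (Versicherungsnummer).

    Layout: NNN C DDMMYY, where C (the fourth digit) is the check digit.
    """
    digits = [int(c) for c in svnr if c.isdigit()]
    if len(digits) != 10:
        return False
    # Weight 0 at index 3 skips the check digit itself.
    weights = [3, 7, 9, 0, 5, 8, 4, 2, 1, 6]
    return sum(d * w for d, w in zip(digits, weights)) % 11 == digits[3]
```

The frequently cited example `1237 010180` validates: 1·3 + 2·7 + 3·9 + 0·5 + 1·8 + 0·4 + 1·2 + 8·1 + 0·6 = 62, and 62 mod 11 = 7, the fourth digit.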
### Austria tax identification number

#### Format

nine digits with optional hyphen and forward slash

#### Pattern

nine digits with optional hyphen and forward slash:

- two digits
- a hyphen (optional)
- three digits
- a forward slash (optional)
- four digits

#### Checksum

Yes

#### Keywords

##### Keywords_austria_eu_tax_file_number

- österreich
- st.nr.
- steuernummer
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
### Austria value added tax

#### Format

11-character alphanumeric pattern

#### Pattern

11-character alphanumeric pattern:

- A or a
- T or t
- optional space
- U or u
- optional space
- two or three digits
- optional space
- four digits
- optional space
- one or two digits

#### Checksum

Yes

#### Keywords

##### Keyword_austria_value_added_tax

- vat number
- vat#
- austrian vat number
- vat no.
- vatno#
- value added tax number
- austrian vat
- mwst
- umsatzsteuernummer
- mwstnummer
- ust.-identifikationsnummer
- umsatzsteuer-identifikationsnummer
- vat identification number
- atu number
- uid number
### Belgium driver's license number

#### Format

10 digits without spaces and delimiters

#### Pattern

10 digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_belgium_eu_driver's_license_number

- rijbewijs
- rijbewijsnummer
- führerschein
- führerscheinnummer
- füehrerscheinnummer
- fuhrerschein
- fuehrerschein
- fuhrerscheinnummer
- fuehrerscheinnummer
- permis de conduire
- numéro permis conduire
### Belgium national number

#### Format

11 digits plus optional delimiters

#### Pattern

11 digits plus delimiters:

- six digits and two optional periods in the format YY.MM.DD for date of birth
- an optional delimiter from dot, dash, space
- three sequential digits (odd for males, even for females)
- an optional delimiter from dot, dash, space
- two check digits

#### Checksum

Yes

#### Keywords

##### Keyword_belgium_national_number

- belasting aantal
- bnn#
- bnn
- carte d'identité
- identifiant national
- identifiantnational#
- identificatie
- identification
- identifikation
- identifikationsnummer
- identifizierung
- identité
- identiteit
- identiteitskaart
- identity
- inscription
- national number
- national register
- nationalnumber#
- nationalnumber
- nif#
- nif
- numéro d'assuré
- numéro de registre national
- numéro de sécurité
- numéro d'identification
- numéro d'immatriculation
- numéro national
- numéronational#
- personal id number
- personalausweis
- personalidnumber#
- registratie
- registration
- registrationsnumme
- registrierung
- social security number
- ssn#
- ssn
- steuernummer
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
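The two check digits are commonly described as 97 minus the first nine digits taken modulo 97, with the digit 2 prefixed for people born in or after 2000. The sketch below illustrates that published rule; it is not the service's own validator:

```python
def rrn_check(first9: str, born_2000s: bool = False) -> int:
    """Compute the two-digit check for a Belgian national number.

    first9: birth date (YYMMDD) plus three-digit serial, digits only.
    born_2000s: prefix '2' before taking mod 97, per the post-1999 rule.
    """
    base = int(("2" if born_2000s else "") + first9)
    return 97 - (base % 97)
```

For instance, for the (made-up) base `850730033` the check works out to 97 − (850730033 mod 97) = 97 − 69 = 28, so the full number would read 85.07.30-033.28.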
### Belgium passport number

#### Format

two letters followed by six digits with no spaces or delimiters

#### Pattern

two letters followed by six digits

#### Checksum

Not applicable

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_belgium_eu_passport_number

- numéro passeport
- paspoort nr
- paspoort-nr
- paspoortnummer
- paspoortnummers
- Passeport carte
- Passeport livre
- Pass-Nr
- Passnummer
- reisepass kein

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### Belgium value added tax number

#### Format

12-character alphanumeric pattern

#### Pattern

12-character alphanumeric pattern:

- a letter B or b
- a letter E or e
- a digit 0
- a digit from 1 to 9
- an optional dot, hyphen, or space
- four digits
- an optional dot, hyphen, or space
- four digits

#### Checksum

Yes

#### Keywords

##### Keyword_belgium_value_added_tax_number

- nº tva
- vat number
- vat no
- numéro t.v.a
- umsatzsteuer-identifikationsnummer
- umsatzsteuernummer
- btw
- btw#
- vat#
### Brazil CPF number

#### Format

11 digits that include a check digit and can be formatted or unformatted

#### Pattern

Formatted:

- three digits
- a period
- three digits
- a period
- three digits
- a hyphen
- two digits that are check digits

Unformatted:

- 11 digits where the last two digits are check digits

#### Checksum

Yes

#### Keywords

##### Keyword_brazil_cpf

- CPF
- Identification
- Registration
- Revenue
- Cadastro de Pessoas Físicas
- Imposto
- Identificação
- Inscrição
- Receita
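The two CPF check digits follow a well-known weighted mod-11 scheme: the first is computed over the nine base digits with weights 10 down to 2, the second over those nine plus the first check digit with weights 11 down to 2. The sketch below illustrates that public algorithm (not the service's own implementation):

```python
def cpf_check_digits(base9: str) -> str:
    """Compute both CPF check digits for a nine-digit base."""
    def dv(digs: str) -> int:
        top = len(digs) + 1  # 10 for the first digit, 11 for the second
        s = sum(int(d) * w for d, w in zip(digs, range(top, 1, -1)))
        r = (s * 10) % 11
        return 0 if r == 10 else r

    d1 = dv(base9)
    d2 = dv(base9 + str(d1))
    return f"{d1}{d2}"
```

The widely used test value `111.444.777-35` checks out: the base `111444777` yields check digits `35`.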
### Brazil legal entity number (CNPJ)

#### Format

14 digits that include a registration number, branch number, and check digits, plus delimiters

#### Pattern

14 digits, plus delimiters:

- two digits
- a period
- three digits
- a period
- three digits (these first eight digits are the registration number)
- a forward slash
- four-digit branch number
- a hyphen
- two digits that are check digits

#### Checksum

Yes

#### Keywords

##### Keyword_brazil_cnpj

- CNPJ
- CNPJ/MF
- CNPJ-MF
- National Registry of Legal Entities
- Taxpayers Registry
- Legal entity
- Legal entities
- Registration Status
- Business
- Company
- Cadastro Nacional da Pessoa Jurídica
- Cadastro Geral de Contribuintes
- CGC
- Pessoa jurídica
- Pessoas jurídicas
- Situação cadastral
- Inscrição
- Empresa
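Like the CPF, the CNPJ's two check digits use a weighted mod-11 rule, but with the weight sequences 5,4,3,2,9,8,7,6,5,4,3,2 and 6,5,4,3,2,9,8,7,6,5,4,3,2. A sketch of that public algorithm (again, illustrative, not the service's detection code):

```python
def cnpj_check_digits(base12: str) -> str:
    """Compute both CNPJ check digits for the 12-digit registration+branch base."""
    w1 = [5, 4, 3, 2, 9, 8, 7, 6, 5, 4, 3, 2]

    def dv(digs: str, weights: list) -> int:
        r = sum(int(d) * w for d, w in zip(digs, weights)) % 11
        return 0 if r < 2 else 11 - r

    d1 = dv(base12, w1)
    d2 = dv(base12 + str(d1), [6] + w1)
    return f"{d1}{d2}"
```

The common test value `11.222.333/0001-81` agrees: the base `112223330001` produces `81`.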
### Brazil national identification card (RG)

#### Format

Registro Geral (old format): Nine digits

Registro de Identidade (RIC) (new format): 11 digits

#### Pattern

Registro Geral (old format):

- two digits
- a period
- three digits
- a period
- three digits
- a hyphen
- one digit that is a check digit

Registro de Identidade (RIC) (new format):

- 10 digits
- a hyphen
- one digit that is a check digit

#### Checksum

Yes

#### Keywords

##### Keyword_brazil_rg

- Cédula de identidade
- identity card
- national id
- número de rregistro
- registro de Iidentidade
- registro geral
- RG (this keyword is case-sensitive)
- RIC (this keyword is case-sensitive)
### Bulgaria driver's license number

#### Format

nine digits without spaces and delimiters

#### Pattern

nine digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_bulgaria_eu_driver's_license_number

- свидетелство за управление на мпс
- свидетелство за управление на моторно превозно средство
- сумпс
- шофьорска книжка
- шофьорски книжки
### Bulgaria uniform civil number

#### Format

10 digits without spaces and delimiters

#### Pattern

10 digits without spaces and delimiters:

- six digits that correspond to the birth date (YYMMDD)
- two digits that correspond to the birth order
- one digit that corresponds to gender: an even digit for male and an odd digit for female
- one check digit

#### Checksum

Yes

#### Keywords

##### Keywords_bulgaria_eu_national_id_card

- bnn#
- bnn
- bucn#
- bucn
- edinen grazhdanski nomer
- egn#
- egn
- identification number
- national id
- national number
- nationalnumber#
- nationalnumber
- personal id
- personal no
- personal number
- personalidnumber#
- social security number
- ssn#
- ssn
- uniform civil id
- uniform civil no
- uniform civil number
- uniformcivilno#
- uniformcivilno
- uniformcivilnumber#
- uniformcivilnumber
- unique citizenship number
- егн#
- егн
- единен граждански номер
- идентификационен номер
- личен номер
- лична идентификация
- лично не
- национален номер
- номер на гражданството
- униформ id
- униформ граждански id
- униформ граждански не
- униформ граждански номер
- униформгражданскиid#
- униформгражданскине.#
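The EGN check digit is commonly described as a weighted sum of the first nine digits modulo 11, with a remainder of 10 mapping to 0. The sketch below illustrates that published rule, not the service's own logic:

```python
def egn_check_digit(first9: str) -> int:
    """Compute the check digit for a Bulgarian uniform civil number (EGN)."""
    weights = [2, 4, 8, 5, 10, 9, 7, 3, 6]
    r = sum(int(d) * w for d, w in zip(first9, weights)) % 11
    return 0 if r == 10 else r
```

The frequently cited sample EGN `7523169263` agrees: the first nine digits `752316926` give a weighted sum of 234, and 234 mod 11 = 3, the final digit.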
### Bulgaria passport number

#### Format

nine digits without spaces and delimiters

#### Pattern

nine digits

#### Checksum

No

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_bulgaria_eu_passport_number

- номер на паспорта
- номер на паспорт
- паспорт №

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### Canada bank account number

#### Format

7 or 12 digits

#### Pattern

A Canada Bank Account Number is 7 or 12 digits.

A Canada bank account transit number is:

- five digits
- a hyphen
- three digits

**or**

- a zero "0"
- eight digits

#### Checksum

No

#### Keywords

##### Keyword_canada_bank_account_number

- canada savings bonds
- canada revenue agency
- canadian financial institution
- direct deposit form
- canadian citizen
- legal representative
- notary public
- commissioner for oaths
- child care benefit
- universal child care
- canada child tax benefit
- income tax benefit
- harmonized sales tax
- social insurance number
- income tax refund
- child tax benefit
- territorial payments
- institution number
- deposit request
- banking information
- direct deposit
### Canada driver's license number

#### Format

Varies by province

#### Pattern

Various patterns covering:

- Alberta
- British Columbia
- Manitoba
- New Brunswick
- Newfoundland/Labrador
- Nova Scotia
- Ontario
- Prince Edward Island
- Quebec
- Saskatchewan

#### Checksum

No

#### Keywords

##### Keyword_[province_name]_drivers_license_name

- The province abbreviation, for example AB
- The province name, for example Alberta

##### Keyword_canada_drivers_license

- DL
- DLS
- CDL
- CDLS
- DriverLic
- DriverLics
- DriverLicense
- DriverLicenses
- DriverLicence
- DriverLicences
- Driver Lic
- Driver Lics
- Driver License
- Driver Licenses
- Driver Licence
- Driver Licences
- DriversLic
- DriversLics
- DriversLicence
- DriversLicences
- DriversLicense
- DriversLicenses
- Drivers Lic
- Drivers Lics
- Drivers License
- Drivers Licenses
- Drivers Licence
- Drivers Licences
- Driver'Lic
- Driver'Lics
- Driver'License
- Driver'Licenses
- Driver'Licence
- Driver'Licences
- Driver' Lic
- Driver' Lics
- Driver' License
- Driver' Licenses
- Driver' Licence
- Driver' Licences
- Driver'sLic
- Driver'sLics
- Driver'sLicense
- Driver'sLicenses
- Driver'sLicence
- Driver'sLicences
- Driver's Lic
- Driver's Lics
- Driver's License
- Driver's Licenses
- Driver's Licence
- Driver's Licences
- Permis de Conduire
- id
- ids
- idcard number
- idcard numbers
- idcard #
- idcard #s
- idcard card
- idcard cards
- idcard
- identification number
- identification numbers
- identification #
- identification #s
- identification card
- identification cards
- identification
- DL#
- DLS#
- CDL#
- CDLS#
- DriverLic#
- DriverLics#
- DriverLicense#
- DriverLicenses#
- DriverLicence#
- DriverLicences#
- Driver Lic#
- Driver Lics#
- Driver License#
- Driver Licenses#
- Driver Licence#
- Driver Licences#
- DriversLic#
- DriversLics#
- DriversLicense#
- DriversLicenses#
- DriversLicence#
- DriversLicences#
- Drivers Lic#
- Drivers Lics#
- Drivers License#
- Drivers Licenses#
- Drivers Licence#
- Drivers Licences#
- Driver'Lic#
- Driver'Lics#
- Driver'License#
- Driver'Licenses#
- Driver'Licence#
- Driver'Licences#
- Driver' Lic#
- Driver' Lics#
- Driver' License#
- Driver' Licenses#
- Driver' Licence#
- Driver' Licences#
- Driver'sLic#
- Driver'sLics#
- Driver'sLicense#
- Driver'sLicenses#
- Driver'sLicence#
- Driver'sLicences#
- Driver's Lic#
- Driver's Lics#
- Driver's License#
- Driver's Licenses#
- Driver's Licence#
- Driver's Licences#
- Permis de Conduire#
- id#
- ids#
- idcard card#
- idcard cards#
- idcard#
- identification card#
- identification cards#
- identification#
### Canada health service number

#### Format

10 digits

#### Pattern

10 digits

#### Checksum

No

#### Keywords

##### Keyword_canada_health_service_number

- personal health number
- patient information
- health services
- speciality services
- automobile accident
- patient hospital
- psychiatrist
- workers compensation
- disability
### Canada passport number

#### Format

two uppercase letters followed by six digits

#### Pattern

two uppercase letters followed by six digits

#### Checksum

No

#### Keywords

##### Keyword_canada_passport_number

- canadian citizenship
- canadian passport
- passport application
- passport photos
- certified translator
- canadian citizens
- processing times
- renewal application

##### Keyword_passport

- Passport Number
- Passport No
- Passport #
- Passport#
- PassportID
- Passportno
- passportnumber
- パスポート
- パスポート番号
- パスポートのNum
- パスポート#
- Numéro de passeport
- Passeport n °
- Passeport Non
- Passeport #
- Passeport#
- PasseportNon
- Passeportn °
### Canada personal health identification number (PHIN)

#### Format

nine digits

#### Pattern

nine digits

#### Checksum

No

#### Keywords

##### Keyword_canada_phin

- social insurance number
- health information act
- income tax information
- manitoba health
- health registration
- prescription purchases
- benefit eligibility
- personal health
- power of attorney
- registration number
- personal health number
- practitioner referral
- wellness professional
- patient referral
- health and wellness

##### Keyword_canada_provinces

- Nunavut
- Quebec
- Northwest Territories
- Ontario
- British Columbia
- Alberta
- Saskatchewan
- Manitoba
- Yukon
- Newfoundland and Labrador
- New Brunswick
- Nova Scotia
- Prince Edward Island
- Canada
### Canada social insurance number

#### Format

nine digits with optional hyphens or spaces

#### Pattern

Formatted:

- three digits
- a hyphen or space
- three digits
- a hyphen or space
- three digits

Unformatted: nine digits

#### Checksum

Yes

#### Keywords

##### Keyword_sin

- sin
- social insurance
- numero d'assurance sociale
- sins
- ssn
- ssns
- social security
- numero d'assurance social
- national identification number
- national id
- sin#
- soc ins
- social ins

##### Keyword_sin_collaborative

- driver's license
- drivers license
- driver's licence
- drivers licence
- DOB
- Birthdate
- Birthday
- Date of Birth
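The SIN checksum is the standard Luhn algorithm applied to the nine digits (delimiters ignored). A sketch of that public algorithm, for illustration only:

```python
def sin_valid(sin: str) -> bool:
    """Luhn check for a Canadian social insurance number."""
    digits = [int(c) for c in sin if c.isdigit()]
    if len(digits) != 9:
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9          # equivalent to summing the two resulting digits
        total += d
    return total % 10 == 0
```

The widely used sample SIN `046 454 286` passes this check.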
### Chile identity card number

#### Format

seven to eight digits plus delimiters and a check digit or letter

#### Pattern

seven to eight digits plus delimiters:

- one to two digits
- an optional period
- three digits
- an optional period
- three digits
- a dash
- one digit or letter (not case-sensitive) which is a check digit

#### Checksum

Yes

#### Keywords

##### Keyword_chile_id_card

- cédula de identidad
- identificación
- national identification
- national identification number
- national id
- número de identificación nacional
- rol único nacional
- rol único tributario
- RUN
- RUT
- tarjeta de identificación
- Rol Unico Nacional
- Rol Unico Tributario
- RUN#
- RUT#
- nationaluniqueroleID#
- nacional identidad
- número identificación
- identidad número
- numero identificacion
- identidad numero
- Chilean identity no.
- Chilean identity number
- Chilean identity #
- Unique Tax Registry
- Unique Tributary Role
- Unique Tax Role
- Unique Tributary Number
- Unique National Number
- Unique National Role
- National unique role
- Chile identity no.
- Chile identity number
- Chile identity #
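The RUN/RUT check character uses the well-known mod-11 scheme with weights 2 through 7 cycling from the rightmost digit; a result of 10 is written as the letter K and 11 as 0. A sketch of that public rule (illustrative, not the service's code):

```python
def run_check_digit(number: str) -> str:
    """Compute the Chilean RUN/RUT check character for the digits before the dash."""
    weights = [2, 3, 4, 5, 6, 7]  # applied right to left, cycling
    s = sum(int(d) * weights[i % 6] for i, d in enumerate(reversed(number)))
    r = 11 - (s % 11)
    return {10: "K", 11: "0"}.get(r, str(r))
```

For example, the base `12345678` yields a weighted sum of 138; 138 mod 11 = 6, so the check character is 11 − 6 = 5 and the formatted number is 12.345.678-5.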
### China resident identity card (PRC) number

#### Format

18 digits

#### Pattern

18 digits:

- six digits that are an address code
- eight digits in the form YYYYMMDD, which are the date of birth
- three digits that are an order code
- one digit that is a check digit

#### Checksum

Yes

#### Keywords

##### Keyword_china_resident_id

- Resident Identity Card
- PRC
- National Identification Card
- 身份证
- 居民 身份证
- 居民身份证
- 鉴定
- 身分證
- 居民 身份證
- 鑑定
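The final character is the ISO 7064 MOD 11-2 check over the first 17 digits, with the value 10 written as the letter X. A sketch of that public algorithm, for illustration only:

```python
PRC_WEIGHTS = [7, 9, 10, 5, 8, 4, 2, 1, 6, 3, 7, 9, 10, 5, 8, 4, 2]
PRC_CHECK_MAP = "10X98765432"  # indexed by (weighted sum mod 11)

def prc_check_digit(first17: str) -> str:
    """Compute the 18th character of a PRC resident identity card number."""
    s = sum(int(d) * w for d, w in zip(first17, PRC_WEIGHTS))
    return PRC_CHECK_MAP[s % 11]
```

The commonly used sample number `11010519491231002X` agrees: its first 17 digits produce a weighted sum of 167, 167 mod 11 = 2, and position 2 in the map is X.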
### Credit card number

#### Format

14 to 16 digits that can be formatted or unformatted (dddddddddddddddd) and that must pass the Luhn test.

#### Pattern

Detects cards from all major brands worldwide, including Visa, MasterCard, Discover Card, JCB, American Express, gift cards, diner's cards, Rupay and China UnionPay.

#### Checksum

Yes, the Luhn checksum

#### Keywords

##### Keyword_cc_verification

- card verification
- card identification number
- cvn
- cid
- cvc2
- cvv2
- pin block
- security code
- security number
- security no
- issue number
- issue no
- cryptogramme
- numéro de sécurité
- numero de securite
- kreditkartenprüfnummer
- kreditkartenprufnummer
- prüfziffer
- prufziffer
- sicherheits Kode
- sicherheitscode
- sicherheitsnummer
- verfalldatum
- codice di verifica
- cod. sicurezza
- cod sicurezza
- n autorizzazione
- código
- codigo
- cod. seg
- cod seg
- código de segurança
- codigo de seguranca
- codigo de segurança
- código de seguranca
- cód. segurança
- cod. seguranca
- cod. segurança
- cód. seguranca
- cód segurança
- cod seguranca
- cod segurança
- cód seguranca
- número de verificação
- numero de verificacao
- ablauf
- gültig bis
- gültigkeitsdatum
- gultig bis
- gultigkeitsdatum
- scadenza
- data scad
- fecha de expiracion
- fecha de venc
- vencimiento
- válido hasta
- valido hasta
- vto
- data de expiração
- data de expiracao
- data em que expira
- validade
- valor
- vencimento
- transaction
- transaction number
- reference number
- セキュリティコード
- セキュリティ コード
- セキュリティナンバー
- セキュリティ ナンバー
- セキュリティ番号

##### Keyword_cc_name

- amex
- american express
- americanexpress
- americano espresso
- Visa
- mastercard
- master card
- mc
- mastercards
- master cards
- diner's Club
- diners club
- dinersclub
- discover
- discover card
- discovercard
- discover cards
- JCB
- BrandSmart
- japanese card bureau
- carte blanche
- carteblanche
- credit card
- cc#
- cc#:
- expiration date
- exp date
- expiry date
- date d'expiration
- date d'exp
- date expiration
- bank card
- bankcard
- card number
- card num
- cardnumber
- cardnumbers
- card numbers
- creditcard
- credit cards
- creditcards
- ccn
- card holder
- cardholder
- card holders
- cardholders
- check card
- checkcard
- check cards
- checkcards
- debit card
- debitcard
- debit cards
- debitcards
- atm card
- atmcard
- atm cards
- atmcards
- enroute
- en route
- card type
- Cardmember Acct
- cardmember account
- Cardno
- Corporate Card
- Corporate cards
- Type of card
- card account number
- card member account
- Cardmember Acct.
- card no.
- card no
- carte bancaire
- carte de crédit
- carte de credit
- numéro de carte
- numero de carte
- nº de la carte
- nº de carte
- kreditkarte
- karte
- karteninhaber
- karteninhabers
- kreditkarteninhaber
- kreditkarteninstitut
- kreditkartentyp
- eigentümername
- kartennr
- kartennummer
- kreditkartennummer
- kreditkarten-nummer
- carta di credito
- carta credito
- n. carta
- n carta
- nr. carta
- nr carta
- numero carta
- numero della carta
- numero di carta
- tarjeta credito
- tarjeta de credito
- tarjeta crédito
- tarjeta de crédito
- tarjeta de atm
- tarjeta atm
- tarjeta debito
- tarjeta de debito
- tarjeta débito
- tarjeta de débito
- nº de tarjeta
- no. de tarjeta
- no de tarjeta
- numero de tarjeta
- número de tarjeta
- tarjeta no
- tarjetahabiente
- cartão de crédito
- cartão de credito
- cartao de crédito
- cartao de credito
- cartão de débito
- cartao de débito
- cartão de debito
- cartao de debito
- débito automático
- debito automatico
- número do cartão
- numero do cartão
- número do cartao
- numero do cartao
- número de cartão
- numero de cartão
- número de cartao
- numero de cartao
- nº do cartão
- nº do cartao
- nº. do cartão
- no do cartão
- no do cartao
- no. do cartão
- no. do cartao
- rupay
- union pay
- unionpay
- diner's
- diners
- クレジットカード番号
- クレジットカードナンバー
- クレジットカード#
- クレジットカード
- クレジット
- クレカ
- カード番号
- カードナンバー
- カード#
- アメックス
- アメリカンエクスプレス
- アメリカン エクスプレス
- Visaカード
- Visa カード
- マスターカード
- マスター カード
- マスター
- ダイナースクラブ
- ダイナース クラブ
- ダイナース
- 有効期限
- 期限
- キャッシュカード
- キャッシュ カード
- カード名義人
- カードの名義人
- カードの名義
- デビット カード
- デビットカード
- 中国银联
- 银联
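The Luhn test named above is a public algorithm: working from the rightmost digit, double every second digit (subtracting 9 when the result exceeds 9) and require the total to be divisible by 10. A minimal sketch, for illustration only:

```python
def luhn_valid(number: str) -> bool:
    """Luhn checksum over the digits of a card number; delimiters are ignored."""
    digits = [int(c) for c in number if c.isdigit()]
    if not digits:
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

The standard test number `4111 1111 1111 1111` passes, while changing any single digit makes it fail.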
### Croatia driver's license number

#### Format

eight digits without spaces and delimiters

#### Pattern

eight digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_croatia_eu_driver's_license_number

- vozačka dozvola
- vozačke dozvole
-### Croatia identity card number
-
-This entity is included in the EU National Identification Number sensitive information type. It's available as a stand-alone sensitive information type entity.
-
-#### Format
-
-nine digits
-
-#### Pattern
-
-nine consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_croatia_id_card
- majstorski broj građana
- master citizen number
- nacionalni identifikacijski broj
- national identification number
- oib#
- oib
- osobna iskaznica
- osobni id
- osobni identifikacijski broj
- personal identification number
- porezni broj
- porezni identifikacijski broj
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
-### Croatia passport number
-
-#### Format
-
-nine digits without spaces and delimiters
-
-#### Pattern
-
-nine digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keywords_croatia_eu_passport_number
- broj putovnice
- br. Putovnice
- br putovnice
-### Croatia personal identification (OIB) number
-
-#### Format
-
-11 digits
-
-#### Pattern
-
-11 digits:
- 10 digits
- final digit is a check digit
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_croatia_oib_number
- majstorski broj građana
- master citizen number
- nacionalni identifikacijski broj
- national identification number
- oib#
- oib
- osobna iskaznica
- osobni id
- osobni identifikacijski broj
- personal identification number
- porezni broj
- porezni identifikacijski broj
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
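Because the checksum above is listed as **Yes**, a detector can recompute the final digit. The Croatian OIB check digit follows the ISO 7064 MOD 11,10 scheme; a minimal validation sketch (the function name is illustrative):

```python
def valid_oib(oib: str) -> bool:
    """Validate a Croatian OIB check digit (ISO 7064 MOD 11,10)."""
    if len(oib) != 11 or not oib.isdigit():
        return False
    remainder = 10
    for digit in oib[:10]:
        remainder = (remainder + int(digit)) % 10
        remainder = 10 if remainder == 0 else remainder
        remainder = (remainder * 2) % 11
    # The check digit makes the running remainder come out to 1 (mod 11).
    return (11 - remainder) % 10 == int(oib[10])
```

In a DLP-style pipeline this check would run only on 11-digit candidates that already matched the pattern and a nearby keyword, which keeps false positives low.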
-### Cyprus drivers license number
-
-#### Format
-
-12 digits without spaces and delimiters
-
-#### Pattern
-
-12 digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number-
-##### Keywords_cyprus_eu_driver's_license_number
- άδεια οδήγησης
- αριθμό άδειας οδήγησης
- άδειες οδήγησης
-### Cyprus identity card
-
-#### Format
-
-10 digits without spaces and delimiters
-
-#### Pattern
-
-10 digits
-
-#### Checksum
-
-not applicable
-
-#### Keywords
-
-##### Keywords_cyprus_eu_national_id_card
- id card number
- identity card number
- kimlik karti
- national identification number
- personal id number
- ταυτοτητασ
-### Cyprus passport number
-
-#### Format
-
-one letter followed by 6-8 digits with no spaces or delimiters
-
-#### Pattern
-
-one letter followed by six to eight digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keywords_cyprus_eu_passport_number
- αριθμό διαβατηρίου
- pasaportu
- Αριθμός Διαβατηρίου
- κυπριακό διαβατήριο
- διαβατήριο#
- διαβατήριο
- αριθμός διαβατηρίου
- Pasaport Kimliği
- pasaport numarası
- Pasaport no.
- Αρ. Διαβατηρίου
-##### Keywords_cyprus_eu_passport_date
- expires on
- issued on
-### Cyprus tax identification number
-
-#### Format
-
-eight digits and one letter in the specified pattern
-
-#### Pattern
-
-eight digits and one letter:
- a "0" or "9"
- seven digits
- one letter (not case-sensitive)
-#### Checksum
-
-not applicable
-
-#### Keywords
-
-##### Keywords_cyprus_eu_tax_file_number
--- tax id-- tax identification code-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tic#-- tic-- tin id-- tin no-- tin#-- vergi kimlik kodu-- vergi kimlik numarası-- αριθμός φορολογικού μητρώου-- κωδικός φορολογικού μητρώου-- φορολογική ταυτότητα-- φορολογικού κωδικού----
-### Czech Republic Driver's License Number
-
-#### Format
-
-two letters followed by six digits
-
-#### Pattern
-
-eight letters and digits:
- letter 'E' (not case-sensitive)
- a letter
- a space (optional)
- six digits
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number-
-##### Keywords_czech_republic_eu_driver's_license_number
- řidičský prúkaz
- řidičské průkazy
- číslo řidičského průkazu
- čísla řidičských průkazů
-### Czech passport number
-
-#### Format
-
-eight digits without spaces or delimiters
-
-#### Pattern
-
-eight digits without spaces or delimiters
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keywords_czech_republic_eu_passport_number
- cestovní pas
- číslo pasu
- cestovní pasu
- passeport no
- čísla pasu
-##### Keywords_eu_passport_date
- date of issue
- date of expiry
-### Czech National Identity Card Number
-
-#### Format
-
-nine digits with optional forward slash (old format)
-10 digits with optional forward slash (new format)
-
-#### Pattern
-
-nine digits (old format):
- six digits that represent date of birth
- an optional forward slash
- three digits
-10 digits (new format):
- six digits that represent date of birth
- an optional forward slash
- four digits where the last digit is a check digit
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_czech_id_card
--- birth number-- czech republic id-- czechidno#-- daňové číslo-- identifikační číslo-- identity no-- identity number-- identityno#-- identityno-- insurance number-- national identification number-- nationalnumber#-- national number-- osobní číslo-- personalidnumber#-- personal id number-- personal identification number-- personal number-- pid#-- pid-- pojištění číslo-- rč-- rodne cislo-- rodné číslo-- ssn-- ssn#-- social security number-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- unique identification number----
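For the 10-digit (new) format the commonly documented checksum rule is that the whole number, read as an integer, is divisible by 11; the old nine-digit format carries no check digit. A sketch under that assumption (function name is illustrative):

```python
def valid_czech_birth_number(value: str) -> bool:
    """Check a 10-digit Czech birth number (rodné číslo).

    The optional forward slash after the date-of-birth part is ignored.
    Assumes the mod-11 divisibility rule used for numbers issued after
    1954; nine-digit numbers from before then cannot be checked this way.
    """
    digits = value.replace("/", "")
    if len(digits) != 10 or not digits.isdigit():
        return False
    return int(digits) % 11 == 0
```
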
-### Date Of Birth
-
-#### Format
-Any valid date
-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_date_of_birth
- dob
- birth day
- natal day
- any word containing the string *birth*
-### Denmark driver's license number
-
-#### Format
-
-eight digits without spaces and delimiters
-
-#### Pattern
-
-eight digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number-
-##### Keywords_denmark_eu_driver's_license_number
- kørekort
- kørekortnummer
-### Denmark passport number
-
-#### Format
-
-nine digits without spaces and delimiters
-
-#### Pattern
-
-nine digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keywords_denmark_eu_passport_number
- pasnummer
- Passeport n°
- pasnumre
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry----
-### Denmark personal identification number
-
-#### Format
-
-10 digits containing a hyphen
-
-#### Pattern
-
-10 digits:
- six digits in the format DDMMYY, which are the date of birth
- a hyphen
- four digits where the final digit is a check digit
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_denmark_id
--- centrale personregister-- civilt registreringssystem-- cpr-- cpr#-- gesundheitskarte nummer-- gesundheitsversicherungkarte nummer-- health card-- health insurance card number-- health insurance number-- identification number-- identifikationsnummer-- identifikationsnummer#-- identity number-- krankenkassennummer-- nationalid#-- nationalnumber#-- national number-- personalidnumber#-- personalidentityno#-- personal id number-- personnummer-- personnummer#-- reisekrankenversicherungskartenummer-- rejsesygesikringskort-- ssn-- ssn#-- skat id-- skat kode-- skat nummer-- skattenummer-- social security number-- sundhedsforsikringskort-- sundhedsforsikringsnummer-- sundhedskort-- sundhedskortnummer-- sygesikring-- sygesikringkortnummer-- tax code-- travel health insurance card-- uniqueidentityno#-- tax number-- tax registration number-- tax id-- tax identification number-- taxid#-- taxnumber#-- tax no-- taxno#-- taxnumber-- tax identification no-- tin#-- taxidno#-- taxidnumber#-- tax no#-- tin id-- tin no-- cpr.nr-- cprnr-- cprnummer-- personnr-- personregister-- sygesikringsbevis-- sygesikringsbevisnr-- sygesikringsbevisnummer-- sygesikringskort-- sygesikringskortnr-- sygesikringskortnummer-- sygesikringsnr-- sygesikringsnummer----
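The CPR check digit historically satisfied a weighted mod-11 test (weights 4, 3, 2, 7, 6, 5, 4, 3, 2, 1). CPR numbers allocated in recent years are no longer guaranteed to have this property, so the sketch below should be read as a heuristic, not an authoritative validator:

```python
CPR_WEIGHTS = (4, 3, 2, 7, 6, 5, 4, 3, 2, 1)

def cpr_mod11_ok(cpr: str) -> bool:
    """Heuristic mod-11 test for a Danish CPR number (hyphen optional).

    Assumption: newer CPR numbers may fail this test even when genuine,
    so a failing result should lower confidence, not reject outright.
    """
    digits = cpr.replace("-", "")
    if len(digits) != 10 or not digits.isdigit():
        return False
    total = sum(w * int(d) for w, d in zip(CPR_WEIGHTS, digits))
    return total % 11 == 0
```
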
-### Email
-
-#### Format
-Any valid email address that abides by [RFC 5322](https://www.ietf.org/rfc/rfc5322.txt)
-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_email
- contact
- email
- electronic
- login
- mail
- online
- user
- webmail
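A detection pass typically pairs these keywords with a pattern match. The full RFC 5322 address grammar (quoted local parts, comments, folding whitespace) is far more permissive than any short regular expression, so a simplified pattern like the one below is a deliberate approximation, not the RFC grammar:

```python
import re

# Simplified pattern covering common address shapes; intentionally
# narrower than full RFC 5322 (no quoted local parts, no comments).
EMAIL_PATTERN = re.compile(
    r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b"
)

def find_emails(text: str) -> list[str]:
    """Return all substrings of text that look like email addresses."""
    return EMAIL_PATTERN.findall(text)
```
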
-### Estonia driver's license number
-
-#### Format
-
-two letters followed by six digits
-
-#### Pattern
-
-two letters and six digits:
- the letters "ET" (not case-sensitive)
- six digits
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number-
-##### Keywords_estonia_eu_driver's_license_number
- permis de conduire
- juhilubade numbrid
- juhiloa number
- juhiluba
-### Estonia Personal Identification Code
-
-#### Format
-
-11 digits without spaces and delimiters
-
-#### Pattern
-
-11 digits:
- one digit that corresponds to sex and century of birth (odd number male, even number female; 1-2: 19th century; 3-4: 20th century; 5-6: 21st century)
- six digits that correspond to date of birth (YYMMDD)
- three digits that correspond to a serial number separating persons born on the same date
- one check digit
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_estonia_eu_national_id_card
--- id-kaart-- ik-- isikukood#-- isikukood-- maksu id-- maksukohustuslase identifitseerimisnumber-- maksunumber-- national identification number-- national number-- personal code-- personal id number-- personal identification code-- personal identification number-- personalidnumber#-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#----
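The check digit for this code is commonly documented as a two-pass weighted mod-11 scheme: if the first pass yields 10, a second set of weights is tried, and if that also yields 10 the check digit is 0. A sketch under that assumption:

```python
def valid_estonian_code(code: str) -> bool:
    """Validate the check digit of an Estonian personal identification code.

    Assumes the documented two-pass weighted mod-11 scheme.
    """
    if len(code) != 11 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    for weights in ((1, 2, 3, 4, 5, 6, 7, 8, 9, 1),
                    (3, 4, 5, 6, 7, 8, 9, 1, 2, 3)):
        check = sum(w * d for w, d in zip(weights, digits)) % 11
        if check < 10:
            return check == digits[10]
    return digits[10] == 0  # both passes gave 10 -> check digit is 0
```
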
-### Estonia passport number
-
-#### Format
-
-one letter followed by seven digits with no spaces or delimiters
-
-#### Pattern
-
-one letter followed by seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keywords_estonia_eu_passport_number
- eesti kodaniku pass
- passi number
- passinumbrid
- document number
- document no
- dokumendi nr
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry----
-### Ethnic groups
-
-#### Format
-
-This classifier consists of the most common ethnic groups. For a reference list, see this [article](https://en.wikipedia.org/wiki/List_of_contemporary_ethnic_groups).
-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_ethnic_group
- ethnic
- ethnic groups
- ethnicity
- ethnicities
- nationality
- race
-### EU debit card number
-
-#### Format
-
-16 digits
-
-#### Pattern
-
-Complex and robust pattern
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_eu_debit_card
- account number
- card number
- card no.
- security number
- cc#
-##### Keyword_card_terms_dict
--- acct nbr-- acct num-- acct no-- american express-- americanexpress-- americano espresso-- amex-- atm card-- atm cards-- atm kaart-- atmcard-- atmcards-- atmkaart-- atmkaarten-- bancontact-- bank card-- bankkaart-- card holder-- card holders-- card num-- card number-- card numbers-- card type-- cardano numerico-- cardholder-- cardholders-- cardnumber-- cardnumbers-- carta bianca-- carta credito-- carta di credito-- cartao de credito-- cartao de crédito-- cartao de debito-- cartao de débito-- carte bancaire-- carte blanche-- carte bleue-- carte de credit-- carte de crédit-- carte di credito-- carteblanche-- cartão de credito-- cartão de crédito-- cartão de debito-- cartão de débito-- cb-- ccn-- check card-- check cards-- checkcard-- checkcards-- chequekaart-- cirrus-- cirrus-edc-maestro-- controlekaart-- controlekaarten-- credit card-- credit cards-- creditcard-- creditcards-- debetkaart-- debetkaarten-- debit card-- debit cards-- debitcard-- debitcards-- debito automatico-- diners club-- dinersclub-- discover-- discover card-- discover cards-- discovercard-- discovercards-- débito automático-- edc-- eigentümername-- european debit card-- hoofdkaart-- hoofdkaarten-- in viaggio-- japanese card bureau-- japanse kaartdienst-- jcb-- kaart-- kaart num-- kaartaantal-- kaartaantallen-- kaarthouder-- kaarthouders-- karte-- karteninhaber-- karteninhabers-- kartennr-- kartennummer-- kreditkarte-- kreditkarten-nummer-- kreditkarteninhaber-- kreditkarteninstitut-- kreditkartennummer-- kreditkartentyp-- maestro-- master card-- master cards-- mastercard-- mastercards-- mc-- mister cash-- n carta-- carta-- no de tarjeta-- no do cartao-- no do cartão-- no. de tarjeta-- no. do cartao-- no. do cartão-- nr carta-- nr. 
carta-- numeri di scheda-- numero carta-- numero de cartao-- numero de carte-- numero de cartão-- numero de tarjeta-- numero della carta-- numero di carta-- numero di scheda-- numero do cartao-- numero do cartão-- numéro de carte-- nº carta-- nº de carte-- nº de la carte-- nº de tarjeta-- nº do cartao-- nº do cartão-- nº. do cartão-- número de cartao-- número de cartão-- número de tarjeta-- número do cartao-- scheda dell'assegno-- scheda dell'atmosfera-- scheda dell'atmosfera-- scheda della banca-- scheda di controllo-- scheda di debito-- scheda matrice-- schede dell'atmosfera-- schede di controllo-- schede di debito-- schede matrici-- scoprono la scheda-- scoprono le schede-- solo-- supporti di scheda-- supporto di scheda-- switch-- tarjeta atm-- tarjeta credito-- tarjeta de atm-- tarjeta de credito-- tarjeta de debito-- tarjeta debito-- tarjeta no-- tarjetahabiente-- tipo della scheda-- ufficio giapponese della-- scheda-- v pay-- v-pay-- visa-- visa plus-- visa electron-- visto-- visum-- vpay-
-##### Keyword_card_security_terms_dict
--- card identification number-- card verification-- cardi la verifica-- cid-- cod seg-- cod seguranca-- cod segurança-- cod sicurezza-- cod. seg-- cod. seguranca-- cod. segurança-- cod. sicurezza-- codice di sicurezza-- codice di verifica-- codigo-- codigo de seguranca-- codigo de segurança-- crittogramma-- cryptogram-- cryptogramme-- cv2-- cvc-- cvc2-- cvn-- cvv-- cvv2-- cód seguranca-- cód segurança-- cód. seguranca-- cód. segurança-- código-- código de seguranca-- código de segurança-- de kaart controle-- geeft nr uit-- issue no-- issue number-- kaartidentificatienummer-- kreditkartenprufnummer-- kreditkartenprüfnummer-- kwestieaantal-- no. dell'edizione-- no. di sicurezza-- numero de securite-- numero de verificacao-- numero dell'edizione-- numero di identificazione della-- scheda-- numero di sicurezza-- numero van veiligheid-- numéro de sécurité-- nº autorizzazione-- número de verificação-- perno il blocco-- pin block-- prufziffer-- prüfziffer-- security code-- security no-- security number-- sicherheits kode-- sicherheitscode-- sicherheitsnummer-- speldblok-- veiligheid nr-- veiligheidsaantal-- veiligheidscode-- veiligheidsnummer-- verfalldatum-
-##### Keyword_card_expiration_terms_dict
--- ablauf-- data de expiracao-- data de expiração-- data del exp-- data di exp-- data di scadenza-- data em que expira-- data scad-- data scadenza-- date de validité-- datum afloop-- datum van exp-- de afloop-- espira-- espira-- exp date-- exp datum-- expiration-- expire-- expires-- expiry-- fecha de expiracion-- fecha de venc-- gultig bis-- gultigkeitsdatum-- gültig bis-- gültigkeitsdatum-- la scadenza-- scadenza-- valable-- validade-- valido hasta-- valor-- venc-- vencimento-- vencimiento-- verloopt-- vervaldag-- vervaldatum-- vto-- válido hasta----
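Payment card numbers carry a Luhn (mod 10) check digit, which is presumably the checksum the **Yes** above refers to. A minimal sketch:

```python
def luhn_valid(card_number: str) -> bool:
    """Return True if the digit string passes the Luhn mod-10 check."""
    if not card_number.isdigit():
        return False
    total = 0
    for i, ch in enumerate(reversed(card_number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9          # same as summing the two digits of d
        total += d
    return total % 10 == 0
```

Running candidate 16-digit strings through a check like this before raising a match is what lets a "complex and robust pattern" reject most random digit runs.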
-### EU driver's license number
-
These entities are included in the EU Driver's License Number sensitive information type. Each is also available as a stand-alone sensitive information type entity.
- [Austria](#austria-drivers-license-number)
- [Belgium](#belgium-drivers-license-number)
- [Bulgaria](#bulgaria-drivers-license-number)
- [Croatia](#croatia-drivers-license-number)
- [Cyprus](#cyprus-drivers-license-number)
- [Czech](#czech-republic-drivers-license-number)
- [Denmark](#denmark-drivers-license-number)
- [Estonia](#estonia-drivers-license-number)
- [Finland](#finland-drivers-license-number)
- [France](#france-drivers-license-number)
- [Germany](#germany-drivers-license-number)
- [Greece](#greece-drivers-license-number)
- [Hungary](#hungary-drivers-license-number)
- [Ireland](#ireland-drivers-license-number)
- [Italy](#italy-drivers-license-number)
- [Latvia](#latvia-drivers-license-number)
- [Lithuania](#lithuania-drivers-license-number)
- [Luxemburg](#luxemburg-drivers-license-number)
- [Malta](#malta-drivers-license-number)
- [Netherlands](#netherlands-drivers-license-number)
- [Poland](#poland-drivers-license-number)
- [Portugal](#portugal-drivers-license-number)
- [Romania](#romania-drivers-license-number)
- [Slovakia](#slovakia-drivers-license-number)
- [Slovenia](#slovenia-drivers-license-number)
- [Spain](#spain-drivers-license-number)
- [Sweden](#sweden-drivers-license-number)
- [U.K.](#uk-drivers-license-number)
-### EU passport number
-
These entities are included in the EU passport number bundle of sensitive information types. Each is also available as a stand-alone sensitive information type entity.
- [Austria](#austria-passport-number)
- [Belgium](#belgium-passport-number)
- [Bulgaria](#bulgaria-passport-number)
- [Croatia](#croatia-passport-number)
- [Cyprus](#cyprus-passport-number)
- [Czech](#czech-passport-number)
- [Denmark](#denmark-passport-number)
- [Estonia](#estonia-passport-number)
- [Finland](#finland-passport-number)
- [France](#france-passport-number)
- [Germany](#germany-passport-number)
- [Greece](#greece-passport-number)
- [Hungary](#hungary-passport-number)
- [Ireland](#ireland-passport-number)
- [Italy](#italy-passport-number)
- [Latvia](#latvia-passport-number)
- [Lithuania](#lithuania-passport-number)
- [Luxemburg](#luxemburg-passport-number)
- [Malta](#malta-passport-number)
- [Netherlands](#netherlands-passport-number)
- [Poland](#poland-passport-number)
- [Portugal](#portugal-passport-number)
- [Romania](#romania-passport-number)
- [Slovakia](#slovakia-passport-number)
- [Slovenia](#slovenia-passport-number)
- [Spain](#spain-passport-number)
- [Sweden](#sweden-passport-number)
- [U.K.](#us--uk-passport-number)
-### Finland driver's license number
-
-#### Format
-
-10 digits containing a hyphen
-
-#### Pattern
-
-10 digits containing a hyphen:
- six digits
- a hyphen
- three digits
- a digit or letter
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_finland_eu_driver's_license_number
- ajokortti
- permis de conduire
- ajokortin numero
- kuljettaja lic.
- körkort
- körkortnummer
- förare lic.
- ajokortit
- ajokortin numerot
-### Finland european health insurance number
-
-#### Format
-
-20-digit number
-
-#### Pattern
-
-20-digit number:
- 10 digits - 8024680246
- an optional space or hyphen
- 10 digits
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_finland_european_health_insurance_number
- ehic#
- ehic
- finlandehicnumber#
- finska sjukförsäkringskort
- health card
- health insurance card
- health insurance number
- hälsokort
- sairaanhoitokortin
- sairausvakuutuskortti
- sairausvakuutusnumero
- sjukförsäkring nummer
- sjukförsäkringskort
- suomen sairausvakuutuskortti
- terveyskortti
-### Finland national ID
-
-#### Format
-
-six digits plus a character indicating a century plus three digits plus a check digit
-
-#### Pattern
-
-Pattern must include all of the following:
- six digits in the format DDMMYY, which are a date of birth
- century marker (either '-', '+' or 'a')
- three-digit personal identification number
- a digit or letter (case insensitive) which is a check digit
-#### Checksum
-
-Yes
-
-#### Keywords
--- ainutlaatuinen henkilökohtainen tunnus-- henkilökohtainen tunnus-- henkilötunnus-- henkilötunnusnumero#-- henkilötunnusnumero-- hetu-- id no-- id number-- identification number-- identiteetti numero-- identity number-- idnumber-- kansallinen henkilötunnus-- kansallisen henkilökortin-- national id card-- national id no.-- personal id-- personal identity code-- personalidnumber#-- personbeteckning-- personnummer-- social security number-- sosiaaliturvatunnus-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- tunnistenumero-- tunnus numero-- tunnusluku-- tunnusnumero-- verokortti-- veronumero-- verotunniste-- verotunnus-----
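The check character described in the pattern above is derived from the nine digits (date of birth plus the three-digit personal number) taken modulo 31 and used as an index into a fixed 31-character alphabet. A sketch following that rule (century markers limited to the '-', '+' and 'a' listed above; newer marker letters exist and would need to be added):

```python
CHECK_CHARS = "0123456789ABCDEFHJKLMNPRSTUVWXY"

def valid_hetu(hetu: str) -> bool:
    """Validate the check character of a Finnish personal identity code.

    Format assumed: DDMMYYCZZZQ, where C is the century marker and Q is
    the check character. Only the '-', '+' and 'A' markers named in the
    pattern description are accepted here.
    """
    if len(hetu) != 11:
        return False
    birth, century, serial, check = hetu[:6], hetu[6], hetu[7:10], hetu[10]
    if century.upper() not in "+-A" or not (birth + serial).isdigit():
        return False
    return CHECK_CHARS[int(birth + serial) % 31] == check.upper()
```
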
-### Finland passport number
-
-This entity is available in the EU Passport Number sensitive information type and is available as a stand-alone sensitive information type entity.
-
-#### Format
-combination of nine letters and digits
-
-#### Pattern
-combination of nine letters and digits:
- two letters (not case-sensitive)
- seven digits
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
-##### Keyword_finland_passport_number
- suomalainen passi
- passin numero
- passin numero.#
- passin numero#
- passin numero.
- passi#
- passi number
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry----
### France driver's license number

This entity is included in the EU Driver's License Number sensitive information type. It's also available as a stand-alone sensitive information type entity.

#### Format

12 digits

#### Pattern

12 digits with validation to discount similar patterns such as French telephone numbers

#### Checksum

No

#### Keywords

##### Keyword_french_drivers_license

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number
- permis de conduire
- licence number
- license number
- licence numbers
- license numbers
- numéros de licence
### France health insurance number

#### Format

21-digit number

#### Pattern

21-digit number:

- 10 digits
- an optional space
- 10 digits
- an optional space
- a digit

#### Checksum

No

#### Keywords

##### Keyword_France_health_insurance_number

- insurance card
- carte vitale
- carte d'assuré social
### France national id card (CNI)

#### Format

12 digits

#### Pattern

12 digits

#### Checksum

No

#### Keywords

##### Keywords_france_eu_national_id_card

- card number
- carte nationale d’identité
- carte nationale d'idenite no
- cni#
- cni
- compte bancaire
- national identification number
- national identity
- nationalidno#
- numéro d'assurance maladie
- numéro de carte vitale
### France passport number

This entity is available in the EU Passport Number sensitive information type. It's also available as a stand-alone sensitive information type entity.

#### Format

nine digits and letters

#### Pattern

nine digits and letters:

- two digits
- two letters (not case-sensitive)
- five digits

#### Checksum

No

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_france_eu_passport_number

- numéro de passeport
- passeport n °
- passeport non
- passeport #
- passeport#
- passeportnon
- passeportn °
- passeport français
- passeport livre
- passeport carte
- numéro passeport
- passeport n°
- n° du passeport
- n° passeport

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### France social security number (INSEE)

#### Format

15 digits

#### Pattern

Must match one of two patterns:

- 13 digits followed by a space followed by two digits
- 15 consecutive digits

#### Checksum

Yes

#### Keywords

##### Keyword_fr_insee

- code sécu
- d'identité nationale
- insee
- fssn#
- le numéro d'identification nationale
- le code de la sécurité sociale
- national id
- national identification
- no d'identité
- no. d'identité
- numéro d'assurance
- numéro d'identité
- numero d'identite
- numéro de sécu
- numéro de sécurité sociale
- no d'identite
- no. d'identite
- ssn
- ssn#
- sécurité sociale
- securité sociale
- securite sociale
- socialsecuritynumber
- social security number
- social security code
- social insurance number
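The two trailing digits of the INSEE number are a control key derived from the first 13 digits. A hedged sketch, assuming the commonly documented rule key = 97 − (first 13 digits mod 97), and ignoring the special substitution used for Corsican department codes 2A/2B:

```python
def insee_key(first13: int) -> int:
    """Return the two-digit control key for a 13-digit INSEE number."""
    return 97 - (first13 % 97)

def is_valid_insee(number: str) -> bool:
    """Check a 15-digit INSEE number (optionally '13 digits, space, 2 digits')."""
    digits = number.replace(" ", "")
    if len(digits) != 15 or not digits.isdigit():
        return False
    return insee_key(int(digits[:13])) == int(digits[13:])
```

The function accepts both spellings listed in the pattern above because it strips the optional space before validating.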
### France tax identification number

#### Format

13 digits

#### Pattern

13 digits:

- One digit that must be 0, 1, 2, or 3
- One digit
- A space (optional)
- Two digits
- A space (optional)
- Three digits
- A space (optional)
- Three digits
- A space (optional)
- Three check digits

#### Checksum

Yes

#### Keywords

##### Keywords_france_eu_tax_file_number

- numéro d'identification fiscale
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
### France value added tax number

#### Format

13-character alphanumeric pattern

#### Pattern

13-character alphanumeric pattern:

- two letters - FR (case insensitive)
- an optional space or hyphen
- two letters or digits
- an optional space, dot, hyphen, or comma
- three digits
- an optional space, dot, hyphen, or comma
- three digits
- an optional space, dot, hyphen, or comma
- three digits

#### Checksum

Yes

#### Keywords

##### Keyword_France_value_added_tax_number

- vat number
- vat no
- vat#
- value added tax
- siren identification no
- numéro d'identification taxe sur valeur ajoutée
- taxe valeur ajoutée
- taxe sur la valeur ajoutée
- n° tva
- numéro de tva
- numéro d'identification siren
### Germany driver's license number

This sensitive information type entity is included in the EU Driver's License Number sensitive information type. It's also available as a stand-alone sensitive information type entity.

#### Format

combination of 11 digits and letters

#### Pattern

11 digits and letters (not case-sensitive):

- a digit or letter
- two digits
- six digits or letters
- a digit
- a digit or letter

#### Checksum

Yes

#### Keywords

##### Keyword_german_drivers_license_number

- ausstellungsdatum
- ausstellungsort
- ausstellende behörde
- ausstellende behorde
- ausstellende behoerde
- führerschein
- fuhrerschein
- fuehrerschein
- führerscheinnummer
- fuhrerscheinnummer
- fuehrerscheinnummer
- führerschein-
- fuhrerschein-
- fuehrerschein-
- führerscheinnummernr
- fuhrerscheinnummernr
- fuehrerscheinnummernr
- führerscheinnummerklasse
- fuhrerscheinnummerklasse
- fuehrerscheinnummerklasse
- nr-führerschein
- nr-fuhrerschein
- nr-fuehrerschein
- no-führerschein
- no-fuhrerschein
- no-fuehrerschein
- n-führerschein
- n-fuhrerschein
- n-fuehrerschein
- permis de conduire
- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dlno
### Germany identity card number

#### Format

since 1 November 2010: Nine letters and digits

from 1 April 1987 until 31 October 2010: 10 digits

#### Pattern

since 1 November 2010:

- one letter (not case-sensitive)
- eight digits

from 1 April 1987 until 31 October 2010:

- 10 digits

#### Checksum

No

#### Keywords

##### Keyword_germany_id_card

- ausweis
- gpid
- identification
- identifikation
- identifizierungsnummer
- identity card
- identity number
- id-nummer
- personal id
- personalausweis
- persönliche id nummer
- persönliche identifikationsnummer
- persönliche-id-nummer
### Germany passport number

This entity is included in the EU Passport Number sensitive information type and is available as a stand-alone sensitive information type entity.

#### Format

10 digits or letters

#### Pattern

Pattern must include all of the following:

- first character is a digit or a letter from this set (C, F, G, H, J, K)
- three digits
- five digits or letters from this set (C-H, J-N, P, R, T, V-Z)
- a digit

#### Checksum

Yes

#### Keywords

##### Keyword_german_passport

- reisepasse
- reisepassnummer
- No-Reisepass
- Nr-Reisepass
- Reisepass-Nr
- Passnummer
- reisepässe
- passeport no.
- passeport no

##### Keywords_eu_passport_number_common

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers
### Germany tax identification number

#### Format

11 digits without spaces and delimiters

#### Pattern

11 digits:

- Two digits
- An optional space
- Three digits
- An optional space
- Three digits
- An optional space
- Two digits
- One check digit

#### Checksum

Yes

#### Keywords

##### Keywords_germany_eu_tax_file_number

- identifikationsnummer
- steuer id
- steueridentifikationsnummer
- steuernummer
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
- zinn#
- zinn
- zinnnummer
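The final digit of the Steueridentifikationsnummer is a check digit. A hedged sketch, assuming the ISO 7064 MOD 11,10 procedure that is commonly described for this identifier:

```python
def steuer_id_check_digit(first10: str) -> int:
    """Compute the ISO 7064 MOD 11,10 check digit over the first ten digits."""
    product = 10
    for ch in first10:
        total = (int(ch) + product) % 10
        if total == 0:
            total = 10
        product = (total * 2) % 11
    return (11 - product) % 10  # a raw result of 10 maps to check digit 0

def is_valid_steuer_id(number: str) -> bool:
    """Validate an 11-digit number (spaces allowed, per the pattern above)."""
    digits = number.replace(" ", "")
    if len(digits) != 11 or not digits.isdigit():
        return False
    return int(digits[10]) == steuer_id_check_digit(digits[:10])
```

This is an illustrative sketch only; it does not enforce the additional digit-distribution rules that real identifiers follow.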
### Germany value added tax number

#### Format

11-character alphanumeric pattern

#### Pattern

11-character alphanumeric pattern:

- a letter D or d
- a letter E or e
- an optional space
- three digits
- an optional space or comma
- three digits
- an optional space or comma
- three digits

#### Checksum

Yes

#### Keywords

##### Keyword_germany_value_added_tax_number

- vat number
- vat no
- vat#
- mehrwertsteuer
- mwst
- mehrwertsteuer identifikationsnummer
- mehrwertsteuer nummer
### Greece driver's license number

This entity is included in the EU Driver's License Number sensitive information type. It's also available as a stand-alone sensitive information type entity.

#### Format

nine digits without spaces and delimiters

#### Pattern

nine digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_greece_eu_driver's_license_number

- δεια οδήγησης
- Adeia odigisis
- Άδεια οδήγησης
- Δίπλωμα οδήγησης
### Greece national ID card

#### Format

Combination of 7-8 letters and numbers plus a dash

#### Pattern

Seven letters and numbers (old format):

- One letter (any letter of the Greek alphabet)
- A dash
- Six digits

Eight letters and numbers (new format):

- Two letters whose uppercase character occurs in both the Greek and Latin alphabets (ABEZHIKMNOPTYX)
- A dash
- Six digits

#### Checksum

No

#### Keywords

##### Keyword_greece_id_card

- greek id
- greek national id
- greek personal id card
- greek police id
- identity card
- tautotita
- ταυτότητα
- ταυτότητας
### Greece passport number

#### Format

Two letters followed by seven digits with no spaces or delimiters

#### Pattern

Two letters followed by seven digits

#### Checksum

No

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_greece_eu_passport_number

- αριθμός διαβατηρίου
- αριθμούς διαβατηρίου
- αριθμός διαβατηριο
### Greece Social Security Number (AMKA)

#### Format

11 digits without spaces and delimiters

#### Pattern

- Six digits as date of birth YYMMDD
- Four digits
- A check digit

#### Checksum

Yes

#### Keywords

##### Keywords_greece_eu_ssn_or_equivalent

- ssn
- ssn#
- social security no
- socialsecurityno#
- social security number
- amka
- a.m.k.a.
- Αριθμού Μητρώου Κοινωνικής Ασφάλισης
### Greece tax identification number

#### Format

Nine digits without spaces and delimiters

#### Pattern

Nine digits

#### Checksum

Not applicable

#### Keywords

##### Keywords_greece_eu_tax_file_number

- afm#
- afm
- aφμ αριθμός
- aφμ
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- tax registry no
- tax registry number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- taxregistryno#
- tin id
- tin no
- tin#
- αριθμός φορολογικού μητρώου
- τον αριθμό φορολογικού μητρώου
- φορολογικού μητρώου νο
### Hong Kong identity card (HKID) number

#### Format

Combination of 8-9 letters and numbers plus optional parentheses around the final character

#### Pattern

Combination of 8-9 letters and numbers:

- 1-2 letters (not case-sensitive)
- Six digits
- The final character (any digit or the letter A), which is the check digit and is optionally enclosed in parentheses

#### Checksum

Yes

#### Keywords

##### Keyword_hong_kong_id_card

- hkid
- hong kong identity card
- HKIDC
- id card
- identity card
- hk identity card
- hong kong id
- 香港身份證
- 香港永久性居民身份證
- 身份證
- 身份証
- 身分證
- 身分証
- 香港身份証
- 香港身分證
- 香港身分証
- 香港身份證
- 香港居民身份證
- 香港居民身份証
- 香港居民身分證
- 香港居民身分証
- 香港永久性居民身份証
- 香港永久性居民身分證
- 香港永久性居民身分証
- 香港永久性居民身份證
- 香港非永久性居民身份證
- 香港非永久性居民身份証
- 香港非永久性居民身分證
- 香港非永久性居民身分証
- 香港特別行政區永久性居民身份證
- 香港特別行政區永久性居民身份証
- 香港特別行政區永久性居民身分證
- 香港特別行政區永久性居民身分証
- 香港特別行政區非永久性居民身份證
- 香港特別行政區非永久性居民身份証
- 香港特別行政區非永久性居民身分證
- 香港特別行政區非永久性居民身分証
### Hungary driver's license number

#### Format

Two letters followed by six digits

#### Pattern

Two letters and six digits:

- Two letters (not case-sensitive)
- Six digits

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_hungary_eu_driver's_license_number

- vezetoi engedely
- vezetői engedély
- vezetői engedélyek
### Hungary personal identification number

#### Format

11 digits

#### Pattern

11 digits:

- One digit that corresponds to gender: 1 for male, 2 for female. Other numbers are also possible for citizens born before 1900 or citizens with double citizenship.
- Six digits that correspond to the birth date (YYMMDD)
- Three digits that correspond to a serial number
- One check digit

#### Checksum

Yes

#### Keywords

##### Keywords_hungary_eu_national_id_card

- id number
- identification number
- sz ig
- sz. ig.
- sz.ig.
- személyazonosító igazolvány
- személyi igazolvány
### Hungary passport number

#### Format

Two letters followed by six or seven digits with no spaces or delimiters

#### Pattern

Two letters followed by six or seven digits

#### Checksum

No

#### Keywords

##### Keywords_eu_passport_number

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_hungary_eu_passport_number

- útlevél száma
- Útlevelek száma
- útlevél szám

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### Hungary social security number (TAJ)

#### Format

Nine digits without spaces and delimiters

#### Pattern

Nine digits

#### Checksum

Yes

#### Keywords

##### Keywords_hungary_eu_ssn_or_equivalent

- hungarian social security number
- social security number
- socialsecuritynumber#
- hssn#
- socialsecuritynno
- hssn
- taj
- taj#
- ssn
- ssn#
- social security no
- áfa
- közösségi adószám
- általános forgalmi adó szám
- hozzáadottérték adó
- áfa szám
- magyar áfa szám
### Hungary tax identification number

#### Format

10 digits with no spaces or delimiters

#### Pattern

10 digits:

- One digit that must be "8"
- Eight digits
- One check digit

#### Checksum

Yes

#### Keywords

##### Keywords_hungary_eu_tax_file_number

- adóazonosító szám
- adóhatóság szám
- adószám
- hungarian tin
- hungatiantin#
- tax authority no
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
- vat number
### Hungary value added tax number

#### Format

10-character alphanumeric pattern

#### Pattern

10-character alphanumeric pattern:

- two letters - HU or hu
- optional space
- eight digits

#### Checksum

Yes

#### Keywords

##### Keyword_Hungary_value_added_tax_number

- vat
- value added tax number
- vat#
- vatno#
- hungarianvatno#
- tax no.
- value added tax
- áfa
- közösségi adószám
- általános forgalmi adó szám
- hozzáadottérték adó
- áfa szám
### India permanent account number (PAN)

#### Format

10 letters or digits

#### Pattern

10 letters or digits:

- Three letters (not case-sensitive)
- A letter in C, P, H, F, A, T, B, L, J, G (not case-sensitive)
- A letter
- Four digits
- A letter that is an alphabetic check digit

#### Checksum

No

#### Keywords

##### Keyword_india_permanent_account_number

- Permanent Account Number
- PAN
### India unique identification (Aadhaar) number

#### Format

12 digits containing optional spaces or dashes

#### Pattern

12 digits:

- A digit that is not 0 or 1
- Three digits
- An optional space or dash
- Four digits
- An optional space or dash
- The final digit, which is the check digit

#### Checksum

Yes

#### Keywords

##### Keyword_india_aadhar

- aadhaar
- aadhar
- aadhar#
- uid
- आधार
- uidai
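The Aadhaar check digit is widely documented as a Verhoeff checksum over all 12 digits. A sketch of that validation, using the standard dihedral-group tables; treat this as illustrative, not as the official UIDAI implementation:

```python
# Verhoeff checksum tables: multiplication (D) and permutation (P)
# for the dihedral group D5.
D = [
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9],
    [1, 2, 3, 4, 0, 6, 7, 8, 9, 5],
    [2, 3, 4, 0, 1, 7, 8, 9, 5, 6],
    [3, 4, 0, 1, 2, 8, 9, 5, 6, 7],
    [4, 0, 1, 2, 3, 9, 5, 6, 7, 8],
    [5, 9, 8, 7, 6, 0, 4, 3, 2, 1],
    [6, 5, 9, 8, 7, 1, 0, 4, 3, 2],
    [7, 6, 5, 9, 8, 2, 1, 0, 4, 3],
    [8, 7, 6, 5, 9, 3, 2, 1, 0, 4],
    [9, 8, 7, 6, 5, 4, 3, 2, 1, 0],
]
P = [
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9],
    [1, 5, 7, 6, 2, 8, 3, 0, 9, 4],
    [5, 8, 0, 3, 7, 9, 6, 1, 4, 2],
    [8, 9, 1, 6, 0, 4, 3, 5, 2, 7],
    [9, 4, 5, 3, 1, 2, 6, 8, 7, 0],
    [4, 2, 8, 6, 5, 7, 3, 9, 0, 1],
    [2, 7, 9, 3, 8, 0, 6, 4, 1, 5],
    [7, 0, 4, 6, 9, 1, 3, 2, 5, 8],
]

def verhoeff_valid(number: str) -> bool:
    """Return True when the trailing digit is a correct Verhoeff check digit.

    Works for any digit string; for Aadhaar, pass all 12 digits with
    spaces and dashes stripped.
    """
    c = 0
    for i, ch in enumerate(reversed(number)):
        c = D[c][P[i % 8][int(ch)]]
    return c == 0
```

The checksum is evaluated right-to-left, which is why the loop iterates over the reversed digit string.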
### Indonesia identity card (KTP) number

#### Format

16 digits containing optional periods

#### Pattern

16 digits:

- Two-digit province code
- A period (optional)
- Two-digit regency or city code
- Two-digit subdistrict code
- A period (optional)
- Six digits in the format DDMMYY, which are the date of birth
- A period (optional)
- Four digits

#### Checksum

No

#### Keywords

##### Keyword_indonesia_id_card

- KTP
- Kartu Tanda Penduduk
- Nomor Induk Kependudukan
### International banking account number (IBAN)

#### Format

Country code (two letters) plus check digits (two digits) plus BBAN (up to 30 characters)

#### Pattern

Pattern must include all of the following:

- Two-letter country code
- Two check digits (followed by an optional space)
- 1-7 groups of four letters or digits (can be separated by spaces)
- 1-3 letters or digits

The format for each country is slightly different. The IBAN sensitive information type covers these 60 countries/regions:

- ad
- ae
- al
- at
- az
- ba
- be
- bg
- bh
- ch
- cr
- cy
- cz
- de
- dk
- do
- ee
- es
- fi
- fo
- fr
- gb
- ge
- gi
- gl
- gr
- hr
- hu
- ie
- il
- is
- it
- kw
- kz
- lb
- li
- lt
- lu
- lv
- mc
- md
- me
- mk
- mr
- mt
- mu
- nl
- no
- pl
- pt
- ro
- rs
- sa
- se
- si
- sk
- sm
- tn
- tr
- vg

#### Checksum

Yes

#### Keywords

None
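The IBAN checksum is the ISO 13616 mod-97 check. A self-contained sketch (the structural regex here is a simplification; real per-country BBAN layouts are stricter):

```python
import re

def is_valid_iban(iban: str) -> bool:
    """ISO 13616 mod-97 check: move the first four characters to the end,
    replace each letter with its two-digit value (A=10 ... Z=35), and the
    resulting number must leave remainder 1 when divided by 97."""
    compact = re.sub(r"\s+", "", iban).upper()
    if not re.fullmatch(r"[A-Z]{2}\d{2}[A-Z0-9]{1,30}", compact):
        return False
    rearranged = compact[4:] + compact[:4]
    digits = "".join(str(int(c, 36)) for c in rearranged)  # 'A' -> '10', ...
    return int(digits) % 97 == 1
```

`int(c, 36)` maps digits to themselves and letters to 10-35, which is exactly the substitution the standard prescribes.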
### IP address

#### Format

##### IPv4:

Complex pattern that accounts for formatted (periods) and unformatted (no periods) versions of the IPv4 addresses

##### IPv6:

Complex pattern that accounts for formatted IPv6 numbers (which include colons)

#### Pattern

- 000.00.000.00
- 000.000.00.00
- 00.000.000.00
- 000.000.000.00
- 000.000.000.000
- 0000:0000:00a0:0:00a:0b00:0a0a:0ea
- 0000:0000:0000::00:00:000:0
- 0a0a:a000:00a::
- 0000:0000:0000:00::0
- 0a00:00a0:a000:0000:a00a:a00a:00a:aaa0

#### Checksum

No

#### Keywords

##### Keyword_ipaddress

- IP (this keyword is case-sensitive)
- ip address
- ip addresses
- internet protocol
- כתובת ה-IP
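A hedged sketch of the kind of formatted IPv4 matching described above; this is an illustrative regular expression, not the service's actual detection logic:

```python
import re

# One octet: 0-255, without requiring leading zeros.
OCTET = r"(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)"
IPV4_RE = re.compile(rf"^{OCTET}(\.{OCTET}){{3}}$")

def looks_like_ipv4(text: str) -> bool:
    """True when the whole string is a dotted-quad IPv4 address."""
    return IPV4_RE.match(text) is not None
```

The alternation rejects out-of-range octets such as 256, which a naive `\d{1,3}` pattern would accept.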
### Personal IP Address

#### Format

Complex pattern that accounts for formatted (periods) versions of the IPv4 addresses

#### Pattern

000.000.000.000

#### Checksum

No

#### Keywords

##### Keyword_ipaddress

- IP (case sensitive)
- ip address
- ip addresses
- internet protocol
- כתובת ה-IP
### Ireland driver's license number

#### Format

Six digits followed by four letters

#### Pattern

Six digits and four letters:

- Six digits
- Four letters (not case-sensitive)

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keywords_ireland_eu_driver's_license_number

- ceadúnas tiomána
- ceadúnais tiomána
### Ireland passport number

#### Format

Two letters or digits followed by seven digits with no spaces or delimiters

#### Pattern

Two letters or digits followed by seven digits:

- Two digits or letters (not case-sensitive)
- Seven digits

#### Checksum

No

#### Keywords

##### Keywords_eu_passport_number_common

- passport#
- passport #
- passportid
- passports
- passportno
- passport no
- passportnumber
- passport number
- passportnumbers
- passport numbers

##### Keywords_ireland_eu_passport_number

- passeport numero
- uimhreacha pasanna
- uimhir pas
- uimhir phas
- uimhreacha pas
- uimhir cárta
- uimhir chárta

##### Keywords_eu_passport_date

- date of issue
- date of expiry
### Ireland personal public service (PPS) number

#### Format

Old format (until 31 December 2012):

- seven digits followed by 1-2 letters

New format (1 January 2013 and after):

- seven digits followed by two letters

#### Pattern

Old format (until 31 December 2012):

- seven digits
- one to two letters (not case-sensitive)

New format (1 January 2013 and after):

- seven digits
- a letter (not case-sensitive), which is an alphabetic check digit
- an optional letter in the range A-I, or "W"

#### Checksum

Yes

#### Keywords

##### Keywords_ireland_eu_national_id_card

- client identity service
- identification number
- personal id number
- personal public service number
- personal service no
- phearsanta seirbhíse poiblí
- pps no
- pps number
- pps num
- pps service no
- ppsn
- ppsno#
- ppsno
- psp
- public service no
- publicserviceno#
- publicserviceno
- revenue and social insurance number
- rsi no
- rsi number
- rsin
- seirbhís aitheantais cliant
- uimh
- uimhir aitheantais chánach
- uimhir aitheantais phearsanta
- uimhir phearsanta seirbhíse poiblí
- tax id
- tax identification no
- tax identification number
- tax no#
- tax no
- tax number
- tax registration number
- taxid#
- taxidno#
- taxidnumber#
- taxno#
- taxnumber#
- taxnumber
- tin id
- tin no
- tin#
### Israel bank account number

#### Format

13 digits

#### Pattern

Formatted:

- two digits
- a dash
- three digits
- a dash
- eight digits

Unformatted:

- 13 consecutive digits

#### Checksum

No

#### Keywords

##### Keyword_israel_bank_account_number

- Bank Account Number
- Bank Account
- Account Number
- מספר חשבון בנק
### Israel national identification number

#### Format

nine digits

#### Pattern

nine consecutive digits

#### Checksum

Yes

#### Keywords

##### Keyword_Israel_National_ID

- מספר זהות
- מספר זיהוי
- מספר זיהוי ישראלי
- זהות ישראלית
- هوية اسرائيلية عدد
- هوية إسرائيلية
- رقم الهوية
- عدد هوية فريدة من نوعها
- idnumber#
- id number
- identity no
- identitynumber#
- identity number
- israeliidentitynumber
- personal id
- unique id
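The Israeli identity number's ninth digit is commonly documented as a Luhn-style check digit: alternate weights of 1 and 2, with two-digit products reduced to their digit sum. An illustrative sketch under that assumption:

```python
def is_valid_israeli_id(number: str) -> bool:
    """Validate nine digits with the commonly documented weighted checksum:
    weights alternate 1, 2, 1, 2, ... from the left; products above 9 are
    reduced by 9 (digit sum); the total must be divisible by 10."""
    if len(number) != 9 or not number.isdigit():
        return False
    total = 0
    for i, ch in enumerate(number):
        product = int(ch) * (1 if i % 2 == 0 else 2)
        total += product - 9 if product > 9 else product
    return total % 10 == 0
```

Shorter legacy numbers are typically zero-padded on the left to nine digits before validation.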
### Italy driver's license number

This entity is included in the EU Driver's License Number sensitive information type. It's also available as a stand-alone sensitive information type entity.

#### Format

a combination of 10 letters and digits

#### Pattern

a combination of 10 letters and digits:

- one letter (not case-sensitive)
- the letter "A" or "V" (not case-sensitive)
- seven digits
- one letter (not case-sensitive)

#### Checksum

No

#### Keywords

##### Keywords_eu_driver's_license_number

- driverlic
- driverlics
- driverlicense
- driverlicenses
- driverlicence
- driverlicences
- driver lic
- driver lics
- driver license
- driver licenses
- driver licence
- driver licences
- driverslic
- driverslics
- driverslicence
- driverslicences
- driverslicense
- driverslicenses
- drivers lic
- drivers lics
- drivers license
- drivers licenses
- drivers licence
- drivers licences
- driver'lic
- driver'lics
- driver'license
- driver'licenses
- driver'licence
- driver'licences
- driver' lic
- driver' lics
- driver' license
- driver' licenses
- driver' licence
- driver' licences
- driver'slic
- driver'slics
- driver'slicense
- driver'slicenses
- driver'slicence
- driver'slicences
- driver's lic
- driver's lics
- driver's license
- driver's licenses
- driver's licence
- driver's licences
- dl#
- dls#
- driverlic#
- driverlics#
- driverlicense#
- driverlicenses#
- driverlicence#
- driverlicences#
- driver lic#
- driver lics#
- driver license#
- driver licenses#
- driver licences#
- driverslic#
- driverslics#
- driverslicense#
- driverslicenses#
- driverslicence#
- driverslicences#
- drivers lic#
- drivers lics#
- drivers license#
- drivers licenses#
- drivers licence#
- drivers licences#
- driver'lic#
- driver'lics#
- driver'license#
- driver'licenses#
- driver'licence#
- driver'licences#
- driver' lic#
- driver' lics#
- driver' license#
- driver' licenses#
- driver' licence#
- driver' licences#
- driver'slic#
- driver'slics#
- driver'slicense#
- driver'slicenses#
- driver'slicence#
- driver'slicences#
- driver's lic#
- driver's lics#
- driver's license#
- driver's licenses#
- driver's licence#
- driver's licences#
- driving licence
- driving license
- dlno#
- driv lic
- driv licen
- driv license
- driv licenses
- driv licence
- driv licences
- driver licen
- drivers licen
- driver's licen
- driving lic
- driving licen
- driving licenses
- driving licence
- driving licences
- driving permit
- dl no
- dlno
- dl number

##### Keyword_italy_drivers_license_number

- numero di patente
- patente di guida
- patente guida
- patenti di guida
- patenti guida
-### Italy fiscal code
-
-#### Format
-
-a 16-character combination of letters and digits in the specified pattern
-
-#### Pattern
-
-A 16-character combination of letters and digits:
-- three letters that correspond to the first three consonants in the family name
-- three letters that correspond to the first, third, and fourth consonants in the first name
-- two digits that correspond to the last digits of the birth year
-- one letter that corresponds to the month of birth; letters are used in alphabetical order, but only the letters A to E, H, L, M, P, and R to T are used (so January is A and October is R)
-- two digits that correspond to the day of the month of birth; to differentiate between genders, 40 is added to the day of birth for women
-- four digits that correspond to the area code specific to the municipality where the person was born (country-wide codes are used for foreign countries/regions)
-- one parity digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_italy_eu_national_id_card
--- codice fiscal-- codice fiscale-- codice id personale-- codice personale-- fiscal code-- numero certificato personale-- numero di identificazione fiscale-- numero id personale-- numero personale-- personal certificate number-- personal code-- personal id code-- personal id number-- personalcodeno#-- tax code-- tax id-- tax identification no-- tax identification number-- tax identity number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-----
-### Italy passport number
-
-#### Format
-
-two letters or digits followed by seven digits with no spaces or delimiters
-
-#### Pattern
-
-two letters or digits followed by seven digits:
-- two digits or letters (not case-sensitive)
-- seven digits
-
-#### Checksum
-
-not applicable
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_italy_eu_passport_number
--- italiana passaporto-- passaporto italiana-- passaporto numero-- numéro passeport-- numero di passaporto-- numeri del passaporto-- passeport italien-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry-----
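As a quick illustration (not part of the service definition), the documented Italy passport pattern can be expressed as a regular expression; the helper name is hypothetical, and the check is format-only:

```python
import re

# Documented shape: two letters or digits (case-insensitive) followed by seven digits.
ITALY_PASSPORT_RE = re.compile(r"^[A-Za-z0-9]{2}[0-9]{7}$")

def looks_like_italy_passport(value: str) -> bool:
    """Return True if the string matches the documented shape (no registry lookup)."""
    return ITALY_PASSPORT_RE.fullmatch(value) is not None
```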
-### Italy value added tax number
-
-#### Format
-
-13 character alphanumeric pattern with optional delimiters
-
-#### Pattern
-
-13 character alphanumeric pattern with optional delimiters:
-- I or i
-- T or t
-- optional space, dot, hyphen, or comma
-- 11 digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_italy_value_added_tax_number
--- vat number-- vat no-- vat#-- iva-- iva#-----
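The documented IT-prefix-plus-11-digit shape can be sketched as a regular expression (an illustration only; the helper name is hypothetical and no checksum is applied here, even though the definition above states one exists):

```python
import re

# Documented shape: I/i, T/t, an optional space/dot/hyphen/comma, then 11 digits.
ITALY_VAT_RE = re.compile(r"^[Ii][Tt][ .,-]?\d{11}$")

def looks_like_italy_vat(value: str) -> bool:
    """Format-only check against the documented pattern."""
    return ITALY_VAT_RE.fullmatch(value) is not None
```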
-### Japan bank account number
-
-#### Format
-
-seven or eight digits
-
-#### Pattern
-
-bank account number:
-- seven or eight digits
-
-bank account branch code:
-
-- four digits
-- a space or dash (optional)
-- three digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_bank_account
--- Checking Account Number-- Checking Account-- Checking Account #-- Checking Acct Number-- Checking Acct #-- Checking Acct No.-- Checking Account No.-- Bank Account Number-- Bank Account-- Bank Account #-- Bank Acct Number-- Bank Acct #-- Bank Acct No.-- Bank Account No.-- Savings Account Number-- Savings Account-- Savings Account #-- Savings Acct Number-- Savings Acct #-- Savings Acct No.-- Savings Account No.-- Debit Account Number-- Debit Account-- Debit Account #-- Debit Acct Number-- Debit Acct #-- Debit Acct No.-- Debit Account No.-- 口座番号-- 銀行口座-- 銀行口座番号-- 総合口座-- 普通預金口座-- 普通口座-- 当座預金口座-- 当座口座-- 預金口座-- 振替口座-- 銀行-- バンク-
-##### Keyword_jp_bank_branch_code
--- 支店番号-- 支店コード-- 店番号----
-### Japan driver's license number
-
-#### Format
-
-12 digits
-
-#### Pattern
-
-12 consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_drivers_license_number
--- driverlicense-- driverslicense-- driver'slicense-- driverslicenses-- driver'slicenses-- driverlicenses-- dl#-- dls#-- lic#-- lics#-- 運転免許証-- 運転免許-- 免許証-- 免許-- 運転免許証番号-- 運転免許番号-- 免許証番号-- 免許番号-- 運転免許証ナンバー-- 運転免許ナンバー-- 免許証ナンバー-- 運転免許証no-- 運転免許no-- 免許証no-- 免許no-- 運転経歴証明書番号-- 運転経歴証明書-- 運転免許証No.-- 運転免許No.-- 免許証No.-- 免許No.-- 運転免許証#-- 運転免許#-- 免許証#-- 免許#-----
-### Japan My Number - Corporate
-
-#### Format
-
-13-digit number
-
-#### Pattern
-
-13-digit number:
-- one digit from one to nine
-- 12 digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_japan_my_number_corporate
--- corporate number-- マイナンバー-- 共通番号-- マイナンバーカード-- マイナンバーカード番号-- 個人番号カード-- 個人識別番号-- 個人識別ナンバー-- 法人番号-- 指定通知書-----
-### Japan My Number - Personal
-
-#### Format
-
-12-digit number
-
-#### Pattern
-
-12-digit number:
-- four digits
-- an optional space, dot, or hyphen
-- four digits
-- an optional space, dot, or hyphen
-- four digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_japan_my_number_personal
--- my number-- マイナンバー-- 個人番号-- 共通番号-- マイナンバーカード-- マイナンバーカード番号-- 個人番号カード-- 個人識別番号-- 個人識別ナンバー-- 通知カード-----
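The 12-digit layout and check digit above can be sketched in Python. The weighting rule shown is the commonly cited one and should be treated as an assumption rather than the official definition; the function names are illustrative:

```python
# Assumed check-digit rule (verify against the official specification):
# number the first 11 digits 1..11 from the RIGHT; the weight is n + 1
# for n <= 6 and n - 5 otherwise; the check digit is 11 - (sum mod 11),
# taken as 0 when the remainder is 0 or 1.

def my_number_check_digit(first11: str) -> int:
    total = 0
    for n, ch in enumerate(reversed(first11), start=1):
        weight = n + 1 if n <= 6 else n - 5
        total += int(ch) * weight
    remainder = total % 11
    return 0 if remainder <= 1 else 11 - remainder

def is_valid_my_number(value: str) -> bool:
    # Strip the optional space, dot, or hyphen delimiters described above.
    digits = value.replace(" ", "").replace(".", "").replace("-", "")
    if len(digits) != 12 or not digits.isdigit():
        return False
    return int(digits[-1]) == my_number_check_digit(digits[:11])
```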
-### Japan passport number
-
-#### Format
-
-two letters followed by seven digits
-
-#### Pattern
-
-two letters (not case-sensitive) followed by seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_passport
--- Passport-- Passport Number-- Passport No.-- Passport #-- パスポート-- パスポート番号-- パスポートナンバー-- パスポート#-- パスポート#-- パスポートNo.-- 旅券番号-- 旅券番号#-- 旅券番号♯-- 旅券ナンバー-----
-### Japan residence card number
-
-#### Format
-
-12 letters and digits
-
-#### Pattern
-
-12 letters and digits:
-- two letters (not case-sensitive)
-- eight digits
-- two letters (not case-sensitive)
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_residence_card_number
--- Residence card number-- Residence card no-- Residence card #-- 在留カード番号-- 在留カード-- 在留番号----
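The residence card layout above maps directly to a regular expression (illustrative helper name; format-only check):

```python
import re

# Documented shape: two letters, eight digits, two letters (case-insensitive).
JP_RESIDENCE_CARD_RE = re.compile(r"^[A-Za-z]{2}\d{8}[A-Za-z]{2}$")

def looks_like_jp_residence_card(value: str) -> bool:
    """Return True if the string matches the documented 12-character shape."""
    return JP_RESIDENCE_CARD_RE.fullmatch(value) is not None
```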
-### Japan resident registration number
-
-#### Format
-
-11 digits
-
-#### Pattern
-
-11 consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_resident_registration_number
--- Resident Registration Number-- Residents Basic Registry Number-- Resident Registration No.-- Resident Register No.-- Residents Basic Registry No.-- Basic Resident Register No.-- 外国人登録証明書番号-- 証明書番号-- 登録番号-- 外国人登録証-----
-### Japan social insurance number (SIN)
-
-#### Format
-
-7-12 digits
-
-#### Pattern
-
-7-12 digits:
-- four digits
-- a hyphen (optional)
-- six digits
-
-*or*
--- 7-12 consecutive digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_jp_sin
--- Social Insurance No.-- Social Insurance Num-- Social Insurance Number-- 健康保険被保険者番号-- 健保番号-- 基礎年金番号-- 雇用保険被保険者番号-- 雇用保険番号-- 保険証番号-- 社会保険番号-- 社会保険No.-- 社会保険-- 介護保険-- 介護保険被保険者番号-- 健康保険被保険者整理番号-- 雇用保険被保険者整理番号-- 厚生年金-- 厚生年金被保険者整理番号-----
-### Latvia driver's license number
-
-#### Format
-
-three letters followed by six digits
-
-#### Pattern
-
-three letters and six digits:
-- three letters (not case-sensitive)
-- six digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_latvia_eu_driver's_license_number
--- autovadītāja apliecība-- autovadītāja apliecības-- vadītāja apliecība----
-### Latvia Personal Code
-
-#### Format
-
-11 digits and an optional hyphen
-
-#### Pattern
-
-Old format
-
-11 digits and a hyphen:
-- six digits that correspond to the birth date (DDMMYY)
-- a hyphen
-- one digit that corresponds to the century of birth ("0" for 19th century, "1" for 20th century, and "2" for 21st century)
-- four digits, randomly generated
-
-New format
-
-11 digits
--- Two digits "32"-- Nine digits-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_latvia_eu_national_id_card
--- administrative number-- alvas nē-- birth number-- citizen number-- civil number-- electronic census number-- electronic number-- fiscal code-- healthcare user number-- id#-- id-code-- identification number-- identifikācijas numurs-- id-number-- individual number-- latvija alva-- nacionālais id-- national id-- national identifying number-- national identity number-- national insurance number-- national register number-- nodokļa numurs-- nodokļu id-- nodokļu identifikācija numurs-- personal certificate number-- personal code-- personal id code-- personal id number-- personal identification code-- personal identifier-- personal identity number-- personal number-- personal numeric code-- personalcodeno#-- personas kods-- population identification code-- public service number-- registration number-- revenue number-- social insurance number-- social security number-- state tax code-- tax file number-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- voter's number----
-### Latvia passport number
-
-#### Format
-
-two letters or digits followed by seven digits with no spaces or delimiters
-
-#### Pattern
-
-two letters or digits followed by seven digits:
-- two digits or letters (not case-sensitive)
-- seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number_common
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_latvia_eu_passport_number
--- pase numurs-- pase numur-- pases numuri-- pases nr-- passeport no-- n° du Passeport-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry-----
-### Lithuania driver's license number
-
-#### Format
-
-eight digits without spaces and delimiters
-
-#### Pattern
-
-eight digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_lithuania_eu_driver's_license_number
--- vairuotojo pažymėjimas-- vairuotojo pažymėjimo numeris-- vairuotojo pažymėjimo numeriai----
-### Lithuania Personal Code
-
-#### Format
-
-11 digits without spaces and delimiters
-
-#### Pattern
-
-11 digits without spaces and delimiters:
-- one digit (1-6) that corresponds to the person's gender and century of birth
-- six digits that correspond to birth date (YYMMDD)
-- three digits that correspond to the serial number of the date of birth
-- one check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_lithuania_eu_national_id_card
--- asmeninis skaitmeninis kodas-- asmens kodas-- citizen service number-- mokesčių id-- mokesčių identifikavimas numeris-- mokesčių identifikavimo numeris-- mokesčių numeris-- national identification number-- personal code-- personal numeric code-- piliečio paslaugos numeris-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- unikalus identifikavimo kodas-- unikalus identifikavimo numeris-- unique identification number-- unique identity number-- uniqueidentityno#----
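The check digit described above can be sketched in Python. The weighting scheme shown is the commonly described one and is an assumption to verify against the official definition; the function names are illustrative:

```python
# Assumed rule: weight the first ten digits with 1,2,...,9,1; if the sum
# mod 11 is not 10 that is the check digit, otherwise reweight with
# 3,4,...,9,1,2,3 and use the new sum mod 11 (0 when that is again 10).

W1 = (1, 2, 3, 4, 5, 6, 7, 8, 9, 1)
W2 = (3, 4, 5, 6, 7, 8, 9, 1, 2, 3)

def lt_check_digit(first10: str) -> int:
    s = sum(int(d) * w for d, w in zip(first10, W1)) % 11
    if s != 10:
        return s
    s = sum(int(d) * w for d, w in zip(first10, W2)) % 11
    return 0 if s == 10 else s

def is_valid_lt_personal_code(code: str) -> bool:
    # 11 digits; the leading digit (gender/century) must be 1-6.
    if len(code) != 11 or not code.isdigit() or code[0] not in "123456":
        return False
    return int(code[10]) == lt_check_digit(code[:10])
```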
-### Lithuania passport number
-
-#### Format
-
-eight digits or letters with no spaces or delimiters
-
-#### Pattern
-
-eight digits or letters (not case-sensitive)
-
-#### Checksum
-
-not applicable
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_lithuania_eu_passport_number
--- paso numeris-- paso numeriai-- paso nr-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry----
-### Location
-
-#### Format
-Longitude can range from -180.0 to 180.0. Latitude can range from -90.0 to 90.0.
-
-#### Checksum
-Not applicable
-
-#### Keywords
--- lat-- latitude-- long-- longitude-- coord-- coordinates-- geo-- geolocation-- loc-- location-- position----
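The documented coordinate ranges translate to a simple bounds check (illustrative helper name):

```python
def is_valid_coordinate(latitude: float, longitude: float) -> bool:
    """Check the documented ranges: latitude -90.0..90.0, longitude -180.0..180.0."""
    return -90.0 <= latitude <= 90.0 and -180.0 <= longitude <= 180.0
```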
-### Luxemburg driver's license number
-
-#### Format
-
-six digits without spaces and delimiters
-
-#### Pattern
-
-six digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_luxemburg_eu_driver's_license_number
--- fahrerlaubnis-- Führerschäin----
-### Luxemburg national identification number natural persons
-
-#### Format
-
-13 digits with no spaces or delimiters
-
-#### Pattern
-
-13 digits:
-- 11 digits
-- two check digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_luxemburg_eu_national_id_card
--- eindeutige id-- eindeutige id-nummer-- eindeutigeid#-- id personnelle-- idpersonnelle#-- idpersonnelle-- individual code-- individual id-- individual identification-- individual identity-- numéro d'identification personnel-- personal id-- personal identification-- personal identity-- personalidno#-- personalidnumber#-- persönliche identifikationsnummer-- unique id-- unique identity-- uniqueidkey#----
-### Luxemburg national identification number non-natural persons
-
-#### Format
-
-11 digits
-
-#### Pattern
-
-11 digits
-- two digits
-- an optional space
-- three digits
-- an optional space
-- three digits
-- an optional space
-- two digits
-- one check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_luxemburg_eu_tax_file_number
--- carte de sécurité sociale-- étain non-- étain#-- identifiant d'impôt-- luxembourg tax identifikatiounsnummer-- numéro d'étain-- numéro d'identification fiscal luxembourgeois-- numéro d'identification fiscale-- social security-- sozialunterstützung-- sozialversécherung-- sozialversicherungsausweis-- steier id-- steier identifikatiounsnummer-- steier nummer-- steuer id-- steueridentifikationsnummer-- steuernummer-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- zinn#-- zinn-- zinnzahl----
-### Luxemburg passport number
-
-#### Format
-
-eight digits or letters with no spaces or delimiters
-
-#### Pattern
-
-eight digits or letters (not case-sensitive)
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_luxemburg_eu_passport_number
-- ausweisnummer-- luxembourg pass-- luxembourg passeport-- luxembourg passport-- no de passeport-- no-reisepass-- nr-reisepass-- numéro de passeport-- pass net-- pass nr-- passnummer-- passeport nombre-- reisepässe-- reisepass-nr-- reisepassnummer-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry-----
-### Malaysia identification card number
-
-#### Format
-
-12 digits containing optional hyphens
-
-#### Pattern
-
-12 digits:
-- six digits in the format YYMMDD, which are the date of birth
-- a dash (optional)
-- two-digit place-of-birth code
-- a dash (optional)
-- three random digits
-- one-digit gender code
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_malaysia_id_card_number
--- digital application card-- i/c-- i/c no-- ic-- ic no-- id card-- identification Card-- identity card-- k/p-- k/p no-- kad akuan diri-- kad aplikasi digital-- kad pengenalan malaysia-- kp-- kp no-- mykad-- mykas-- mykid-- mypr-- mytentera-- malaysia identity card-- malaysian identity card-- nric-- personal identification card----
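The 12-digit layout with optional hyphens maps to a short regular expression (illustrative helper name; format-only, no date-of-birth validation):

```python
import re

# Documented shape: six digits (YYMMDD), optional dash, two digits,
# optional dash, four digits (three random digits plus a gender digit).
MYKAD_RE = re.compile(r"^\d{6}-?\d{2}-?\d{4}$")

def looks_like_mykad(value: str) -> bool:
    """Return True if the string matches the documented MyKad shape."""
    return MYKAD_RE.fullmatch(value) is not None
```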
-### Malta driver's license number
-
-#### Format
-
-Combination of two characters and six digits in the specified pattern
-
-#### Pattern
-
-combination of two characters and six digits:
-- two characters (digits or letters, not case-sensitive)
-- a space (optional)
-- three digits
-- a space (optional)
-- three digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_malta_eu_driver's_license_number
--- liċenzja tas-sewqan-- liċenzji tas-sewwieq-----
-### Malta identity card number
-
-#### Format
-
-seven digits followed by one letter
-
-#### Pattern
-
-seven digits followed by one letter:
-- seven digits
-- one letter in "M, G, A, P, L, H, B, Z" (not case-sensitive)
-
-#### Checksum
-
-Not applicable
-
-#### Keywords
-
-##### Keywords_malta_eu_national_id_card
--- citizen service number-- id tat-taxxa-- identifika numru tal-biljett-- kodiċi numerali personali-- numru ta 'identifikazzjoni personali-- numru ta 'identifikazzjoni tat-taxxa-- numru ta 'identifikazzjoni uniku-- numru ta' identità uniku-- numru tas-servizz taċ-ċittadin-- numru tat-taxxa-- personal numeric code-- unique identification number-- unique identity number-- uniqueidentityno#-----
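The identity card layout above, seven digits plus one letter from the documented set, can be checked with a regular expression (illustrative helper name; format-only):

```python
import re

# Documented shape: seven digits followed by one of M, G, A, P, L, H, B, Z
# (case-insensitive).
MALTA_ID_RE = re.compile(r"^\d{7}[MGAPLHBZmgaplhbz]$")

def looks_like_malta_id(value: str) -> bool:
    """Return True if the string matches the documented identity card shape."""
    return MALTA_ID_RE.fullmatch(value) is not None
```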
-### Malta passport number
-
-#### Format
-
-seven digits without spaces or delimiters
-
-#### Pattern
-
-seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_malta_eu_passport_number
--- numru tal-passaport-- numri tal-passaport-- Nru tal-passaport-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry-----
-### Malta tax identification number
-
-#### Format
-
-For Maltese nationals:
-- seven digits and one letter in the specified pattern-
-Non-Maltese nationals and Maltese entities:
-- nine digits-
-#### Pattern
-
-Maltese nationals: seven digits and one letter
--- seven digits-- one letter (not case-sensitive)-
-Non-Maltese nationals and Maltese entities: nine digits
--- nine digits-
-#### Checksum
-
-Not applicable
-
-#### Keywords
-
-##### Keywords_malta_eu_tax_file_number
--- citizen service number-- id tat-taxxa-- identifika numru tal-biljett-- kodiċi numerali personali-- numru ta 'identifikazzjoni personali-- numru ta 'identifikazzjoni tat-taxxa-- numru ta 'identifikazzjoni uniku-- numru ta' identità uniku-- numru tas-servizz taċ-ċittadin-- numru tat-taxxa-- personal numeric code-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- unique identification number-- unique identity number-- uniqueidentityno#----
-### Netherlands citizen's service (BSN) number
-
-#### Format
-
-eight or nine digits containing optional spaces
-
-#### Pattern
-
-eight or nine digits:
-
-- three digits
-- a space (optional)
-- three digits
-- a space (optional)
-- two or three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_netherlands_eu_national_id_card
--- bsn#-- bsn-- burgerservicenummer-- citizen service number-- person number-- personal number-- personal numeric code-- person-related number-- persoonlijk nummer-- persoonlijke numerieke code-- persoonsgebonden-- persoonsnummer-- sociaal-fiscaal nummer-- social-fiscal number-- sofi-- sofinummer-- uniek identificatienummer-- uniek identiteitsnummer-- unique identification number-- unique identity number-- uniqueidentityno#----
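The BSN checksum is the widely documented "elfproef" (11-test); a minimal sketch, with an illustrative helper name:

```python
# Elfproef: weight the nine digits with 9,8,...,2 and -1 for the last
# digit; the weighted sum must be divisible by 11. Eight-digit numbers
# are left-padded with a zero.

def is_valid_bsn(value: str) -> bool:
    digits = value.replace(" ", "")  # the documented optional spaces
    if len(digits) == 8:
        digits = "0" + digits
    if len(digits) != 9 or not digits.isdigit():
        return False
    weights = (9, 8, 7, 6, 5, 4, 3, 2, -1)
    total = sum(int(d) * w for d, w in zip(digits, weights))
    return total % 11 == 0
```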
-### Netherlands driver's license number
-
-#### Format
-
-10 digits without spaces and delimiters
-
-#### Pattern
-
-10 digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_netherlands_eu_driver's_license_number
--- permis de conduire-- rijbewijs-- rijbewijsnummer-- rijbewijzen-- rijbewijs nummer-- rijbewijsnummers-----
-### Netherlands passport number
-
-#### Format
-
-nine letters or digits with no spaces or delimiters
-
-#### Pattern
-
-nine letters or digits
-
-#### Checksum
-
-not applicable
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_netherlands_eu_passport_number
--- paspoort nummer-- paspoortnummers-- paspoortnummer-- paspoort nr----
-### Netherlands tax identification number
-
-#### Format
-
-nine digits without spaces or delimiters
-
-#### Pattern
-
-nine digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_netherlands_eu_tax_file_number
--- btw nummer-- hollânske tax identification-- hulandes impuesto id number-- hulandes impuesto identification-- identificatienummer belasting-- identificatienummer van belasting-- impuesto identification number-- impuesto number-- nederlands belasting id nummer-- nederlands belasting identificatie-- nederlands belasting identificatienummer-- nederlands belastingnummer-- nederlandse belasting identificatie-- netherlands tax identification-- netherland's tax identification-- netherlands tin-- netherland's tin-- tax id-- tax identification no-- tax identification number-- tax identification tal-- tax no#-- tax no-- tax number-- tax registration number-- tax tal-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-----
-### Netherlands value added tax number
-
-#### Format
-
-14 character alphanumeric pattern
-
-#### Pattern
-
-14-character alphanumeric pattern:
-- N or n
-- L or l
-- optional space, dot, or hyphen
-- nine digits
-- optional space, dot, or hyphen
-- B or b
-- two digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_netherlands_value_added_tax_number
--- vat number-- vat no-- vat#-- wearde tafoege tax getal-- btw nûmer-- btw-nummer-----
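The NL-prefix layout above can be sketched as a regular expression (an illustration only; the helper name is hypothetical and no checksum is applied here, even though the definition above states one exists):

```python
import re

# Documented shape: N/n, L/l, optional space/dot/hyphen, nine digits,
# optional space/dot/hyphen, B/b, two digits.
NL_VAT_RE = re.compile(r"^[Nn][Ll][ .-]?\d{9}[ .-]?[Bb]\d{2}$")

def looks_like_nl_vat(value: str) -> bool:
    """Format-only check against the documented pattern."""
    return NL_VAT_RE.fullmatch(value) is not None
```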
-### New Zealand bank account number
-
-#### Format
-
-14-digit to 16-digit pattern with optional delimiter
-
-#### Pattern
-
-14-digit to 16-digit pattern with optional delimiter:
-- two digits
-- an optional hyphen or space
-- three to four digits
-- an optional hyphen or space
-- seven digits
-- an optional hyphen or space
-- two to three digits
-- an optional hyphen or space
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_new_zealand_bank_account_number
--- account number-- bank account-- bank_acct_id-- bank_acct_branch-- bank_acct_nbr-----
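The bank/branch/account/suffix layout above maps to a regular expression (illustrative helper name; format-only, the documented checksum is not applied here):

```python
import re

# Documented shape: two digits, optional hyphen or space, three to four
# digits, optional separator, seven digits, optional separator, two to
# three digits.
NZ_BANK_ACCOUNT_RE = re.compile(r"^\d{2}[- ]?\d{3,4}[- ]?\d{7}[- ]?\d{2,3}$")

def looks_like_nz_bank_account(value: str) -> bool:
    """Return True if the string matches the documented account shape."""
    return NZ_BANK_ACCOUNT_RE.fullmatch(value) is not None
```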
-### New Zealand driver's license number
-
-#### Format
-
-eight character alphanumeric pattern
-
-#### Pattern
-
-eight character alphanumeric pattern
-- two letters
-- six digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_new_zealand_drivers_license_number
--- driverlicence-- driverlicences-- driver lic-- driver licence-- driver licences-- driverslic-- driverslicence-- driverslicences-- drivers lic-- drivers lics-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's licence-- driver's licences-- driverlic#-- driverlics#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver licence#-- driver licences#-- driverslic#-- driverslics#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's licence#-- driver's licences#-- international driving permit-- international driving permits-- nz automobile association-- new zealand automobile association-----
-### New Zealand inland revenue number
-
-#### Format
-
-eight or nine digits with optional delimiters
-
-#### Pattern
-
-eight or nine digits with optional delimiters
-
-- two or three digits
-- an optional space or hyphen
-- three digits
-- an optional space or hyphen
-- three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_new_zealand_inland_revenue_number
-
-- ird no.
-- ird no#
-- nz ird
-- new zealand ird
-- ird number
-- inland revenue number
-
----
-### New Zealand ministry of health number
-
-#### Format
-
-three letters, a space (optional), and four digits
-
-#### Pattern
-
-- three letters (not case-sensitive) except 'I' and 'O'
-- a space (optional)
-- four digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_nz_terms
-
-- NHI
-- New Zealand
-- Health
-- treatment
-- National Health Index Number
-- nhi number
-- nhi no.
-- NHI#
-- National Health Index No.
-- National Health Index Id
-- National Health Index #
-
---
-### New Zealand social welfare number
-
-#### Format
-
-nine digits
-
-#### Pattern
-
-nine digits
-
-- three digits
-- an optional hyphen
-- three digits
-- an optional hyphen
-- three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_new_zealand_social_welfare_number
-
-- social welfare #
-- social welfare#
-- social welfare No.
-- social welfare number
-- swn#
-
----
-### Norway identification number
-
-#### Format
-
-11 digits
-
-#### Pattern
-
-11 digits:
-- six digits in the format DDMMYY, which are the date of birth
-- three-digit individual number
-- two check digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_norway_id_number
-
-- Personal identification number
-- Norwegian ID Number
-- ID Number
-- Identification
-- Personnummer
-- Fødselsnummer
-
----
-### Philippines unified multi-purpose identification number
-
-#### Format
-
-12 digits separated by hyphens
-
-#### Pattern
-
-12 digits:
-- four digits
-- a hyphen
-- seven digits
-- a hyphen
-- one digit
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_philippines_id
-
-- Unified Multi-Purpose ID
-- UMID
-- Identity Card
-- Pinag-isang Multi-Layunin ID
-
---
-### Poland driver's license number
-
-#### Format
-
-14 digits containing two forward slashes
-
-#### Pattern
-
-14 digits and two forward slashes:
-
-- five digits
-- a forward slash
-- two digits
-- a forward slash
-- seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_poland_eu_driver's_license_number
-
-- prawo jazdy
-- prawa jazdy
-
---
-### Poland identity card
-
-#### Format
-
-three letters and six digits
-
-#### Pattern
-
-three letters (not case-sensitive) followed by six digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_poland_national_id_passport_number
-
-- Dowód osobisty
-- Numer dowodu osobistego
-- Nazwa i numer dowodu osobistego
-- Nazwa i nr dowodu osobistego
-- Nazwa i nr dowodu tożsamości
-- Dowód Tożsamości
-- dow. os.
-
----
-### Poland national ID (PESEL)
-
-#### Format
-
-11 digits
-
-#### Pattern
-- six digits representing date of birth in the format YYMMDD
-- four digits
-- one check digit
-
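The PESEL check digit is computed with the publicly documented weighting key (1, 3, 7, 9 repeated over the first ten digits). A minimal validation sketch; the function name is our choice, and the sample value used below is a synthetic, well-known test number:

```python
# PESEL check-digit validation: multiply the first ten digits by the
# published weights, take the sum modulo 10, and the eleventh digit
# must equal (10 - sum % 10) % 10.
PESEL_WEIGHTS = (1, 3, 7, 9, 1, 3, 7, 9, 1, 3)

def pesel_checksum_ok(pesel: str) -> bool:
    if len(pesel) != 11 or not pesel.isdigit():
        return False
    s = sum(int(d) * w for d, w in zip(pesel[:10], PESEL_WEIGHTS))
    return (10 - s % 10) % 10 == int(pesel[10])
```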
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_pesel_identification_number
-
-- dowód osobisty
-- dowódosobisty
-- niepowtarzalny numer
-- niepowtarzalnynumer
-- nr.-pesel
-- nr-pesel
-- numer identyfikacyjny
-- pesel
-- tożsamości narodowej
-
----
-### Poland passport number
-This sensitive information type entity is included in the EU Passport Number sensitive information type. It's also available as a stand-alone sensitive information type entity.
-
-#### Format
-
-two letters and seven digits
-
-#### Pattern
-
-Two letters (not case-sensitive) followed by seven digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keyword_polish_national_passport_number
-
-- numer paszportu
-- numery paszportów
-- numery paszportowe
-- nr paszportu
-- nr. paszportu
-- nr paszportów
-- n° passeport
-- passeport n°
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Poland REGON number
-
-#### Format
-
-9-digit or 14-digit number
-
-#### Pattern
-
-nine-digit or 14-digit number:
-
-- nine digits
-
-**or**
-
-- nine digits
-- hyphen
-- five digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_poland_regon_number
-
-- regon id
-- statistical number
-- statistical id
-- statistical no
-- regon number
-- regonid#
-- regonno#
-- company id
-- companyid#
-- companyidno#
-- numer statystyczny
-- numeru regon
-- numerstatystyczny#
-- numeruregon#
-
----
-### Poland tax identification number
-
-#### Format
-
-11 digits with no spaces or delimiters
-
-#### Pattern
-
-11 digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_poland_eu_tax_file_number
-
-- nip#
-- nip
-- numer identyfikacji podatkowej
-- numeridentyfikacjipodatkowej#
-- tax id
-- tax identification no
-- tax identification number
-- tax no#
-- tax no
-- tax number
-- tax registration number
-- taxid#
-- taxidno#
-- taxidnumber#
-- taxno#
-- taxnumber#
-- taxnumber
-- tin id
-- tin no
-- tin#
-- vat id#
-- vat id
-- vat no
-- vat number
-- vatid#
-- vatid
-- vatno#
-
----
-### Portugal citizen card number
-
-#### Format
-
-eight digits
-
-#### Pattern
-
-eight digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_portugal_citizen_card
-
-- bilhete de identidade
-- cartão de cidadão
-- citizen card
-- document number
-- documento de identificação
-- id number
-- identification no
-- identification number
-- identity card no
-- identity card number
-- national id card
-- nic
-- número bi de portugal
-- número de identificação civil
-- número de identificação fiscal
-- número do documento
-- portugal bi number
-
----
-### Portugal driver's license number
-
-#### Format
-
-two patterns - one or two letters followed by 5-8 digits with special characters
-
-#### Pattern
-
-Pattern 1:
-Two letters followed by 5 or 6 digits with special characters:
-
-- Two letters (not case-sensitive)
-- A hyphen
-- Five or six digits
-- A space
-- One digit
-
-Pattern 2:
-One letter followed by 6 or 8 digits with special characters:
-
-- One letter (not case-sensitive)
-- A hyphen
-- Six or eight digits
-- A space
-- One digit
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_portugal_eu_driver's_license_number
-
-- carteira de motorista
-- carteira motorista
-- carteira de habilitação
-- carteira habilitação
-- número de licença
-- número licença
-- permissão de condução
-- permissão condução
-- Licença condução Portugal
-- carta de condução
-
---
-### Portugal passport number
-
-#### Format
-
-one letter followed by six digits with no spaces or delimiters
-
-#### Pattern
-
-one letter followed by six digits:
-
-- one letter (not case-sensitive)
-- six digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keywords_portugal_eu_passport_number
-
-- número do passaporte
-- portuguese passport
-- portuguese passeport
-- portuguese passaporte
-- passaporte nº
-- passeport nº
-- números de passaporte
-- portuguese passports
-- número passaporte
-- números passaporte
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Portugal tax identification number
-
-#### Format
-
-nine digits with optional spaces
-
-#### Pattern
-- three digits
-- an optional space
-- three digits
-- an optional space
-- three digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_portugal_eu_tax_file_number
-
-- cpf#
-- cpf
-- nif#
-- nif
-- número de identificação fisca
-- numero fiscal
-- tax id
-- tax identification no
-- tax identification number
-- tax no#
-- tax no
-- tax number
-- tax registration number
-- taxid#
-- taxidno#
-- taxidnumber#
-- taxno#
-- taxnumber#
-- taxnumber
-- tin id
-- tin no
-- tin#
-
----
-### Romania driver's license number
-
-#### Format
-
-one character followed by eight digits
-
-#### Pattern
-
-one character followed by eight digits:
-- one letter (not case-sensitive) or digit
-- eight digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_romania_eu_driver's_license_number
-
-- permis de conducere
-- permisului de conducere
-- permisului conducere
-- permisele de conducere
-- permisele conducere
-- permis conducere
-
---
-### Romania personal numeric code (CNP)
-
-#### Format
-
-13 digits without spaces and delimiters
-
-#### Pattern
-- one digit from 1-9
-- six digits representing date of birth (YYMMDD)
-- two digits, which can be 01-52 or 99
-- four digits
-
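The last of the CNP's 13 digits is a check digit computed with the public "279146358279" weighting key (sum of the first 12 digits times the key digits, modulo 11, with a remainder of 10 mapping to 1). A minimal sketch; the function name and the synthetic sample value used below are ours, and only the checksum is validated, not the date or county fields:

```python
# Romanian CNP check digit: multiply the first 12 digits by the fixed
# key 2-7-9-1-4-6-3-5-8-2-7-9 and reduce the sum modulo 11; a
# remainder of 10 maps to check digit 1.
CNP_WEIGHTS = (2, 7, 9, 1, 4, 6, 3, 5, 8, 2, 7, 9)

def cnp_checksum_ok(cnp: str) -> bool:
    if len(cnp) != 13 or not cnp.isdigit():
        return False
    r = sum(int(d) * w for d, w in zip(cnp[:12], CNP_WEIGHTS)) % 11
    return (1 if r == 10 else r) == int(cnp[12])
```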
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_romania_eu_national_id_card
--- cnp#-- cnp-- cod identificare personal-- cod numeric personal-- cod unic identificare-- codnumericpersonal#-- codul fiscal nr.-- identificarea fiscală nr#-- id-ul taxei-- insurance number-- insurancenumber#-- national id#-- national id-- national identification number-- număr identificare personal-- număr identitate-- număr personal unic-- număridentitate#-- număridentitate-- numărpersonalunic#-- numărpersonalunic-- număru de identificare fiscală-- numărul de identificare fiscală-- personal numeric code-- pin#-- pin-- tax file no-- tax file number-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-- unique identification number-- unique identity number-- uniqueidentityno#-- uniqueidentityno----
-### Romania passport number
-
-#### Format
-
-eight or nine digits without spaces and delimiters
-
-#### Pattern
-
-eight or nine digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keywords_romania_eu_passport_number
-
-- numărul pașaportului
-- numarul pasaportului
-- numerele pașaportului
-- Pașaport nr
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Russia passport number domestic
-
-#### Format
-
-10-digit number
-
-#### Pattern
-
-10-digit number:
-
-- two digits
-- an optional space or hyphen
-- two digits
-- an optional space
-- six digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_russia_passport_number_domestic
-
-- passport number
-- passport no
-- passport #
-- passport id
-- passportno#
-- passportnumber#
-- паспорт нет
-- паспорт id
-- pоссийской паспорт
-- pусский номер паспорта
-- паспорт#
-- паспортid#
-- номер паспорта
-- номерпаспорта#
-
----
-### Russia passport number international
-
-#### Format
-
-nine-digit number
-
-#### Pattern
-
-nine-digit number:
-
-- two digits
-- an optional space or hyphen
-- seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_russia_passport_number_international
-
-- passport number
-- passport no
-- passport #
-- passport id
-- passportno#
-- passportnumber#
-- паспорт нет
-- паспорт id
-- pоссийской паспорт
-- pусский номер паспорта
-- паспорт#
-- паспортid#
-- номер паспорта
-- номерпаспорта#
-
----
-### Saudi Arabia National ID
-
-#### Format
-
-10 digits
-
-#### Pattern
-
-10 consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_saudi_arabia_national_id
-
-- Identification Card
-- I card number
-- ID number
-- الوطنية الهوية بطاقة رقم
-
----
-### Singapore national registration identity card (NRIC) number
-
-#### Format
-
-nine letters and digits
-
-#### Pattern
-
-- nine letters and digits:
-- the letter "F", "G", "S", or "T" (not case-sensitive)
-- seven digits
-- an alphabetic check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_singapore_nric
-
-- National Registration Identity Card
-- Identity Card Number
-- NRIC
-- IC
-- Foreign Identification Number
-- FIN
-- 身份证
-- 身份證
-
---
-### Slovakia driver's license number
-
-#### Format
-
-one character followed by seven digits
-
-#### Pattern
-
-one character followed by seven digits
-- one letter (not case-sensitive) or digit
-- seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_slovakia_eu_driver's_license_number
-
-- vodičský preukaz
-- vodičské preukazy
-- vodičského preukazu
-- vodičských preukazov
-
---
-### Slovakia personal number
-
-#### Format
-
-nine or ten digits containing an optional forward slash
-
-#### Pattern
-- six digits representing date of birth
-- optional slash (/)
-- three digits
-- one optional check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_slovakia_eu_national_id_card
--- azonosító szám-- birth number-- číslo národnej identifikačnej karty-- číslo občianského preukazu-- daňové číslo-- id number-- identification no-- identification number-- identifikačná karta č-- identifikačné číslo-- identity card no-- identity card number-- národná identifikačná značka č-- national number-- nationalnumber#-- nemzeti személyazonosító igazolvány-- personalidnumber#-- rč-- rodne cislo-- rodné číslo-- social security number-- ssn#-- ssn-- személyi igazolvány szám-- személyi igazolvány száma-- személyigazolvány szám-- tax file no-- tax file number-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#----
-### Slovakia passport number
-
-#### Format
-
-one digit or letter followed by seven digits with no spaces or delimiters
-
-#### Pattern
-
-one digit or letter (not case-sensitive) followed by seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keywords_slovakia_eu_passport_number
-
-- číslo pasu
-- čísla pasov
-- pas č.
-- Passeport n°
-- n° Passeport
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Slovenia driver's license number
-
-#### Format
-
-nine digits without spaces and delimiters
-
-#### Pattern
-
-nine digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_slovenia_eu_driver's_license_number
-
-- vozniško dovoljenje
-- vozniška številka licence
-- vozniških dovoljenj
-- številka vozniškega dovoljenja
-- številke vozniških dovoljenj
-
---
-### Slovenia Unique Master Citizen Number
-
-#### Format
-
-13 digits without spaces or delimiters
-
-#### Pattern
-
-13 digits in the specified pattern:
-
-- seven digits that correspond to the birth date (DDMMLLL) where "LLL" corresponds to the last three digits of the birth year
-- two digits that correspond to the area of birth "50"
-- three digits that correspond to a combination of gender and serial number for persons born on the same day. 000-499 for male and 500-999 for female.
-- one check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_slovenia_eu_national_id_card
-
-- edinstvena številka glavnega državljana
-- emšo
-- enotna maticna številka obcana
-- id card
-- identification number
-- identifikacijska številka
-- identity card
-- nacionalna id
-- nacionalni potni list
-- national id
-- osebna izkaznica
-- osebni koda
-- osebni ne
-- osebni številka
-- personal code
-- personal number
-- personal numeric code
-- številka državljana
-- unique citizen number
-- unique id number
-- unique identity number
-- unique master citizen number
-- unique registration number
-- uniqueidentityno #
-- uniqueidentityno
-
---
-### Slovenia passport number
-
-#### Format
-
-two letters followed by seven digits with no spaces or delimiters
-
-#### Pattern
-
-two letters followed by seven digits:
-
-- the letter "P"
-- one uppercase letter
-- seven digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keywords_slovenia_eu_passport_number
-
-- številka potnega lista
-- potek veljavnosti
-- potni list#
-- datum rojstva
-- potni list
-- številke potnih listov
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Slovenia tax identification number
-
-#### Format
-
-eight digits with no spaces or delimiters
-
-#### Pattern
-- one digit from 1-9
-- six digits
-- one check digit
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_slovenia_eu_tax_file_number
-
-- davčna številka
-- identifikacijska številka davka
-- številka davčne datoteke
-- tax file no
-- tax file number
-- tax id
-- tax identification no
-- tax identification number
-- tax no#
-- tax no
-- tax number
-- tax registration number
-- taxid#
-- taxidno#
-- taxidnumber#
-- taxno#
-- taxnumber#
-- taxnumber
-- tin id
-- tin no
-- tin#
-
----
-### South Africa identification number
-
-#### Format
-
-13 digits that may contain spaces
-
-#### Pattern
-
-13 digits:
-- six digits in the format YYMMDD, which are the date of birth
-- four digits
-- a single-digit citizenship indicator
-- the digit "8" or "9"
-- one digit, which is a checksum digit
-
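The final checksum digit of a South African ID number is widely described as a standard Luhn check over all 13 digits. A minimal sketch under that assumption (function names and the synthetic sample value below are ours):

```python
def luhn_ok(number: str) -> bool:
    """Standard Luhn check: double every second digit from the right,
    subtract 9 from doubled values over 9, require sum % 10 == 0."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def sa_id_checksum_ok(sa_id: str) -> bool:
    # 13 digits with a trailing Luhn check digit; spaces stripped first.
    sa_id = sa_id.replace(" ", "")
    return len(sa_id) == 13 and sa_id.isdigit() and luhn_ok(sa_id)
```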
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_south_africa_identification_number
-
-- Identity card
-- ID
-- Identification
-
---
-### South Korea resident registration number
-
-#### Format
-
-13 digits containing a hyphen
-
-#### Pattern
-
-13 digits:
-- six digits in the format YYMMDD, which are the date of birth
-- a hyphen
-- one digit determined by the century and gender
-- four-digit region-of-birth code
-- one digit used to differentiate people for whom the preceding numbers are identical
-- a check digit.
-
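The trailing check digit is commonly described (this scheme is an assumption on our part, not stated in this article) as a weighted sum over the first 12 digits with weights 2-9 then 2-5, reduced as `(11 - sum % 11) % 10`. A minimal sketch with a synthetic sample value of our own:

```python
# Commonly described RRN check-digit scheme (assumed, not from this
# article): weights 2..9 then 2..5 over the first 12 digits, and the
# 13th digit must equal (11 - weighted_sum % 11) % 10.
RRN_WEIGHTS = (2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 4, 5)

def rrn_checksum_ok(rrn: str) -> bool:
    digits = rrn.replace("-", "")
    if len(digits) != 13 or not digits.isdigit():
        return False
    s = sum(int(d) * w for d, w in zip(digits[:12], RRN_WEIGHTS))
    return (11 - s % 11) % 10 == int(digits[12])
```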
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_south_korea_resident_number
-
-- National ID card
-- Citizen's Registration Number
-- Jumin deungnok beonho
-- RRN
-- 주민등록번호
-
---
-### Spain driver's license number
-
-#### Format
-
-eight digits followed by one character
-
-#### Pattern
-
-eight digits followed by one character:
-
-- eight digits
-- one digit or letter (not case-sensitive)
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_spain_eu_driver's_license_number
-
-- permiso de conducción
-- permiso conducción
-- licencia de conducir
-- licencia conducir
-- permiso conducir
-- permiso de conducir
-- permisos de conducir
-- permisos conducir
-- carnet conducir
-- carnet de conducir
-- licencia de manejo
-- licencia manejo
-
---
-### Spain DNI
-
-#### Format
-
-eight digits followed by one character
-
-#### Pattern
-
-eight digits followed by one character:
-
-- eight digits
-- an optional space or hyphen
-- one check letter (not case-sensitive)
-
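The DNI check letter is the number modulo 23 used as an index into the fixed table published by the Spanish authorities. A minimal sketch (the function name is ours; `12345678Z` is the standard public test value):

```python
# Fixed DNI/NIF check-letter table: the letter at index (number % 23).
DNI_LETTERS = "TRWAGMYFPDXBNJZSQVHLCKE"

def dni_checksum_ok(dni: str) -> bool:
    """Validate an 8-digit DNI plus its check letter."""
    dni = dni.replace("-", "").replace(" ", "").upper()
    if len(dni) != 9 or not dni[:8].isdigit():
        return False
    return DNI_LETTERS[int(dni[:8]) % 23] == dni[8]
```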
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_spain_eu_national_id_card
-
-- carné de identidad
-- dni#
-- dni
-- dninúmero#
-- documento nacional de identidad
-- identidad único
-- identidadúnico#
-- insurance number
-- national identification number
-- national identity
-- nationalid#
-- nationalidno#
-- nie#
-- nie
-- nienúmero#
-- número de identificación
-- número nacional identidad
-- personal identification number
-- personal identity no
-- unique identity number
-- uniqueid#
-
---
-### Spain passport number
-
-#### Format
-
-an eight- or nine-character combination of letters and numbers with no spaces or delimiters
-
-#### Pattern
-
-an eight- or nine-character combination of letters and numbers:
-
-- two digits or letters
-- one digit or letter (optional)
-- six digits
-
-#### Checksum
-
-Not applicable
-
-#### Keywords
-
-##### Keywords_eu_passport_number
-
-- passport#
-- passport #
-- passportid
-- passports
-- passportno
-- passport no
-- passportnumber
-- passport number
-- passportnumbers
-- passport numbers
-
-##### Keywords_spain_eu_passport_number
-
-- libreta pasaporte
-- número pasaporte
-- españa pasaporte
-- números de pasaporte
-- número de pasaporte
-- números pasaporte
-- pasaporte no
-- Passeport n°
-- n° Passeport
-- pasaporte no.
-- pasaporte n°
-- spain passport
-
-##### Keywords_eu_passport_date
-
-- date of issue
-- date of expiry
-
----
-### Spain social security number (SSN)
-
-#### Format
-
-11-12 digits
-
-#### Pattern
-
-11-12 digits:
-- two digits
-- a forward slash (optional)
-- seven to eight digits
-- a forward slash (optional)
-- two digits
-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_spain_eu_ssn_or_equivalent
--- ssn-- ssn#-- socialsecurityno-- social security no-- social security number-- número de la seguridad social----
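A minimal sketch of the pattern and checksum above, assuming the commonly described mod-97 rule in which the last two digits are a control value over the preceding digits; this is an illustration, not the official validation routine.

```python
import re

def is_valid_spain_ssn(value: str) -> bool:
    """11-12 digits, optionally separated by forward slashes; last two digits are
    assumed to be a mod-97 control value over the preceding digits."""
    digits = re.sub(r"[/\s]", "", value)
    if not re.fullmatch(r"\d{11,12}", digits):
        return False
    body, control = digits[:-2], digits[-2:]
    return int(body) % 97 == int(control)
```

For instance, `28/12345678/40` passes because 2812345678 mod 97 equals 40.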
-### Spain tax identification number
-
-#### Format
-
-seven or eight digits and one or two letters in the specified pattern
-
-#### Pattern
-
-Spanish Natural Persons with a Spain National Identity Card:
--- eight digits-- one uppercase letter (case-sensitive)-
-Non-resident Spaniards without a Spain National Identity Card
--- one uppercase letter "L" (case-sensitive)-- seven digits-- one uppercase letter (case-sensitive)-
-Resident Spaniards under the age of 14 years without a Spain National Identity Card:
--- one uppercase letter "K" (case-sensitive)-- seven digits-- one uppercase letter (case-sensitive)-
-Foreigners with a Foreigner's Identification Number
--- one uppercase letter that is "X", "Y", or "Z" (case-sensitive)-- seven digits-- one uppercase letter (case-sensitive)-
-Foreigners without a Foreigner's Identification Number
--- one uppercase letter that is "M" (case-sensitive)-- seven digits-- one uppercase letter (case-sensitive)-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_spain_eu_tax_file_number
--- cif-- cifid#-- cifnúmero#-- número de contribuyente-- número de identificación fiscal-- número de impuesto corporativo-- spanishcifid#-- spanishcifid-- spanishcifno#-- spanishcifno-- tax file no-- tax file number-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#----
-### Sweden driver's license number
-
-#### Format
-
-10 digits containing a hyphen
-
-#### Pattern
-
-10 digits containing a hyphen:
--- six digits-- a hyphen-- four digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number--
-##### Keywords_sweden_eu_driver's_license_number
--- ajokortti-- permis de conducere-- ajokortin numero-- kuljettajat lic.-- drivere lic.-- körkort-- numărul permisului de conducere-- שאָפער דערלויבעניש נומער-- förare lic.-- דריווערס דערלויבעניש-- körkortsnummer----
-### Sweden national ID
-
-#### Format
-
-10 or 12 digits and an optional delimiter
-
-#### Pattern
-
-10 or 12 digits and an optional delimiter:
-- two digits (optional)-- Six digits in date format YYMMDD-- delimiter of "-" or "+" (optional)-- four digits-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_swedish_national_identifier
--- id no-- id number-- id#-- identification no-- identification number-- identifikationsnumret#-- identifikationsnumret-- identitetshandling-- identity document-- identity no-- identity number-- id-nummer-- personal id-- personnummer#-- personnummer-- skatteidentifikationsnummer----
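The checksum for the Swedish personal identity number is a Luhn check over the 10-digit form (the optional century digits are excluded). A minimal sketch, with the helper name as an illustration:

```python
import re

def personnummer_checksum_ok(value: str) -> bool:
    """Luhn check over the 10-digit form; century digits (YYYYMMDD form) are dropped."""
    digits = re.sub(r"[-+\s]", "", value)
    if len(digits) == 12:
        digits = digits[2:]                      # drop the century, keeping YYMMDDNNNC
    if not re.fullmatch(r"\d{10}", digits):
        return False
    total = 0
    for i, ch in enumerate(digits):
        n = int(ch) * (2 if i % 2 == 0 else 1)   # double every other digit from the left
        total += n - 9 if n > 9 else n           # same as summing the digits of n
    return total % 10 == 0
```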
-### Sweden passport number
-
-#### Format
-
-eight digits
-
-#### Pattern
-
-eight consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keyword_sweden_passport
--- alien registration card-- g3 processing fees-- multiple entry-- Numéro de passeport-- passeport n °-- passeport non-- passeport #-- passeport#-- passeportnon-- passeportn °-- passnummer-- pass nr-- schengen visa-- schengen visas-- single entry-- sverige pass-- visa requirements-- visa processing-- visa type-
-##### Keywords_eu_passport_date
--- date of issue-- date of expiry-----
-### Sweden tax identification number
-
-#### Format
-
-10 digits and a symbol in the specified pattern
-
-#### Pattern
-
-10 digits and a symbol:
--- six digits that correspond to the birth date (YYMMDD)-- a plus sign or minus sign-- three digits that make the identification number unique where:
- - for numbers issued before 1990, the seventh and eighth digit identify the county of birth or foreign-born people
- - the digit in the ninth position indicates gender by either odd for male or even for female
-- one check digit-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_sweden_eu_tax_file_number
--- personal id number-- personnummer-- skatt id nummer-- skatt identifikation-- skattebetalarens identifikationsnummer-- sverige tin-- tax file-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax number-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#-----
-### SWIFT code
-
-#### Format
-
-four letters followed by 5-31 letters or digits
-
-#### Pattern
-
-four letters followed by 5-31 letters or digits:
-- four-letter bank code (not case-sensitive)-- an optional space-- 4-28 letters or digits (the Basic Bank Account Number (BBAN))-- an optional space-- one to three letters or digits (remainder of the BBAN)-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_swift
--- international organization for standardization 9362-- iso 9362-- iso9362-- swift#-- swiftcode-- swiftnumber-- swiftroutingnumber-- swift code-- swift number #-- swift routing number-- bic number-- bic code-- bic #-- bic#-- bank identifier code-- Organisation internationale de normalisation 9362-- rapide #-- code SWIFT-- le numéro de swift-- swift numéro d'acheminement-- le numéro BIC-- \# BIC-- code identificateur de banque-- SWIFTコード-- SWIFT番号-- BIC番号-- BICコード-- SWIFT コード-- SWIFT 番号-- BIC 番号-- BIC コード-- 金融機関識別コード-- 金融機関コード-- 銀行コード----
-### Switzerland SSN AHV number
-
-#### Format
-
-13-digit number
-
-#### Pattern
-
-13-digit number:
--- three digits - 756-- an optional dot-- four digits-- an optional dot-- four digits-- an optional dot-- two digits-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_swiss_ssn_AHV_number
--- ahv-- ssn-- pid-- insurance number-- personalidno#-- social security number-- personal id number-- personal identification no.-- insuranceno#-- uniqueidno#-- unique identification no.-- avs number-- personal identity no versicherungsnummer-- identifikationsnummer-- einzigartige identität nicht-- sozialversicherungsnummer-- identification personnelle id-- numéro de sécurité sociale-----
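The 13th digit of the AHV number is commonly described as an EAN-13 style check digit (alternating weights 1 and 3 over the first 12 digits). A minimal sketch under that assumption:

```python
import re

def ahv_checksum_ok(value: str) -> bool:
    """13 digits starting with 756, with optional dots; final digit is assumed to be
    an EAN-13 style check digit over the first 12 digits."""
    digits = re.sub(r"\.", "", value)
    if not re.fullmatch(r"756\d{10}", digits):
        return False
    weights = [1, 3] * 6
    total = sum(int(d) * w for d, w in zip(digits[:12], weights))
    return (10 - total % 10) % 10 == int(digits[12])
```

The frequently cited sample number `756.9217.0769.85` passes this check.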
-### Taiwanese identification number
-
-#### Format
-
-one letter (in English) followed by nine digits
-
-#### Pattern
-
-one letter (in English) followed by nine digits:
-- one letter (in English, not case-sensitive)-- the digit "1" or "2"-- eight digits-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_taiwan_national_id
--- 身份證字號-- 身份證-- 身份證號碼-- 身份證號-- 身分證字號-- 身分證-- 身分證號碼-- 身份證號-- 身分證統一編號-- 國民身分證統一編號-- 簽名-- 蓋章-- 簽名或蓋章-- 簽章----
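A sketch of the commonly described checksum for this pattern: the leading letter maps to a two-digit code (with a documented irregular ordering), and a weighted digit sum must be divisible by 10. The letter table and helper name here are illustrative.

```python
# Two-digit codes for the leading letter, A=10 onward in the documented irregular order
LETTER_CODES = {c: i for i, c in enumerate("ABCDEFGHJKLMNPQRSTUVXYWZIO", start=10)}

def taiwan_id_ok(value: str) -> bool:
    """One letter, then the digit 1 or 2, then eight digits; weighted sum mod 10 == 0."""
    value = value.upper()
    if len(value) != 10 or value[0] not in LETTER_CODES or not value[1:].isdigit():
        return False
    if value[1] not in "12":                    # second character must be 1 or 2
        return False
    code = LETTER_CODES[value[0]]
    total = code // 10 + (code % 10) * 9        # the letter contributes two weighted digits
    total += sum(int(d) * w for d, w in zip(value[1:9], range(8, 0, -1)))
    total += int(value[9])                      # check digit, weight 1
    return total % 10 == 0
```

The widely used sample `A123456789` passes this check.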
-### Taiwan passport number
-
-#### Format
--- biometric passport number: nine digits-- non-biometric passport number: nine digits-
-#### Pattern
-biometric passport number:
-- the character "3"-- eight digits-
-non-biometric passport number:
-- nine digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_taiwan_passport
--- ROC passport number-- Passport number-- Passport no-- Passport Num-- Passport #-- 护照-- 中華民國護照-- Zhōnghuá Mínguó hùzhào----
-### Taiwan-resident certificate (ARC/TARC) number
-
-#### Format
-
-10 letters and digits
-
-#### Pattern
-
-10 letters and digits:
-- two letters (not case-sensitive)-- eight digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_taiwan_resident_certificate
--- Resident Certificate-- Resident Cert-- Resident Cert.-- Identification card-- Alien Resident Certificate-- ARC-- Taiwan Area Resident Certificate-- TARC-- 居留證-- 外僑居留證-- 台灣地區居留證----
-### U.K. driver's license number
-
-#### Format
-
-Combination of 18 letters and digits in the specified format
-
-#### Pattern
-
-18 letters and digits:
-- Five letters (not case-sensitive) or the digit "9" in place of a letter.-- One digit.-- Five digits in the date format MMDDY for date of birth. The seventh character is incremented by 50 if driver is female; for example, 51 to 62 instead of 01 to 12.-- Two letters (not case-sensitive) or the digit "9" in place of a letter.-- Five digits.-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keywords_eu_driver's_license_number
--- driverlic-- driverlics-- driverlicense-- driverlicenses-- driverlicence-- driverlicences-- driver lic-- driver lics-- driver license-- driver licenses-- driver licence-- driver licences-- driverslic-- driverslics-- driverslicence-- driverslicences-- driverslicense-- driverslicenses-- drivers lic-- drivers lics-- drivers license-- drivers licenses-- drivers licence-- drivers licences-- driver'lic-- driver'lics-- driver'license-- driver'licenses-- driver'licence-- driver'licences-- driver' lic-- driver' lics-- driver' license-- driver' licenses-- driver' licence-- driver' licences-- driver'slic-- driver'slics-- driver'slicense-- driver'slicenses-- driver'slicence-- driver'slicences-- driver's lic-- driver's lics-- driver's license-- driver's licenses-- driver's licence-- driver's licences-- dl#-- dls#-- driverlic#-- driverlics#-- driverlicense#-- driverlicenses#-- driverlicence#-- driverlicences#-- driver lic#-- driver lics#-- driver license#-- driver licenses#-- driver licences#-- driverslic#-- driverslics#-- driverslicense#-- driverslicenses#-- driverslicence#-- driverslicences#-- drivers lic#-- drivers lics#-- drivers license#-- drivers licenses#-- drivers licence#-- drivers licences#-- driver'lic#-- driver'lics#-- driver'license#-- driver'licenses#-- driver'licence#-- driver'licences#-- driver' lic#-- driver' lics#-- driver' license#-- driver' licenses#-- driver' licence#-- driver' licences#-- driver'slic#-- driver'slics#-- driver'slicense#-- driver'slicenses#-- driver'slicence#-- driver'slicences#-- driver's lic#-- driver's lics#-- driver's license#-- driver's licenses#-- driver's licence#-- driver's licences#-- driving licence -- driving license-- dlno#-- driv lic-- driv licen-- driv license-- driv licenses-- driv licence-- driv licences-- driver licen-- drivers licen-- driver's licen-- driving lic-- driving licen-- driving licenses-- driving licence-- driving licences-- driving permit-- dl no-- dlno-- dl number-----
-### U.K. electoral roll number
-
-#### Format
-
-two letters followed by 1-4 digits
-
-#### Pattern
-
-two letters (not case-sensitive) followed by 1-4 numbers
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_uk_electoral
--- council nomination-- nomination form-- electoral register-- electoral roll-----
-### U.K. national health service number
-
-#### Format
-
-10-17 digits separated by spaces
-
-#### Pattern
-
-10-17 digits:
-- either 3 or 10 digits-- a space-- three digits-- a space-- four digits-
-#### Checksum
-
-Yes
-
-#### Keywords
-
-##### Keyword_uk_nhs_number
--- national health service-- nhs-- health services authority-- health authority-
-##### Keyword_uk_nhs_number1
--- patient id-- patient identification-- patient no-- patient number-
-##### Keyword_uk_nhs_number_dob
--- GP-- DOB-- D.O.B-- Date of Birth-- Birth Date----
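The 10-digit NHS number uses a mod-11 checksum: the first nine digits are weighted 10 down to 2, and the check digit is 11 minus the remainder (11 maps to 0; 10 is invalid). A minimal sketch:

```python
import re

def nhs_number_ok(value: str) -> bool:
    """Mod-11 check over a 10-digit NHS number; spaces between groups are ignored."""
    digits = value.replace(" ", "")
    if not re.fullmatch(r"\d{10}", digits):
        return False
    total = sum(int(d) * w for d, w in zip(digits[:9], range(10, 1, -1)))
    check = 11 - total % 11
    if check == 11:
        check = 0
    return check != 10 and check == int(digits[9])
```

The standard sample number `943 476 5919` passes this check.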
-### U.K. national insurance number (NINO)
-This sensitive information type entity is included in the EU National Identification Number sensitive information type. It's also available as a stand-alone sensitive information type entity.
-
-#### Format
-
-seven characters or nine characters separated by spaces or dashes
-
-#### Pattern
-
-two possible patterns:
--- two letters (valid NINOs use only certain characters in this prefix, which this pattern validates; not case-sensitive)-- six digits-- either 'A', 'B', 'C', or 'D' (like the prefix, only certain characters are allowed in the suffix; not case-sensitive)-
-OR
--- two letters-- a space or dash-- two digits-- a space or dash-- two digits-- a space or dash-- two digits-- a space or dash-- either 'A', 'B', 'C', or 'D'-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_uk_nino
--- national insurance number-- national insurance contributions-- protection act-- insurance-- social security number-- insurance application-- medical application-- social insurance-- medical attention-- social security-- great britain-- NI Number-- NI No.-- NI #-- NI#-- insurance#-- insurancenumber-- nationalinsurance#-- nationalinsurancenumber-----
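A loose regex sketch covering both shapes above. Note it deliberately does not encode the full prefix restrictions (real NINOs disallow D, F, I, Q, U, and V anywhere in the prefix, and O as the second letter), so treat it as illustrative only.

```python
import re

# Two letters, three pairs of digits with optional space/dash separators,
# and a suffix letter A-D; case-insensitive.
NINO_RE = re.compile(r"(?i)^[A-Z]{2}(?:[ -]?\d{2}){3}[ -]?[A-D]$")
```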
-### U.K. Unique Taxpayer Reference Number
-
-#### Format
-
-10 digits without spaces and delimiters
--
-#### Pattern
-
-10 digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_uk_eu_tax_file_number
--- tax number-- tax file-- tax id-- tax identification no-- tax identification number-- tax no#-- tax no-- tax registration number-- taxid#-- taxidno#-- taxidnumber#-- taxno#-- taxnumber#-- taxnumber-- tin id-- tin no-- tin#----
-### U.S. bank account number
-
-#### Format
-
-6-17 digits
-
-#### Pattern
-
-6-17 consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_usa_Bank_Account
--- Checking Account Number-- Checking Account-- Checking Account #-- Checking Acct Number-- Checking Acct #-- Checking Acct No.-- Checking Account No.-- Bank Account Number-- Bank Account #-- Bank Acct Number-- Bank Acct #-- Bank Acct No.-- Bank Account No.-- Savings Account Number-- Savings Account.-- Savings Account #-- Savings Acct Number-- Savings Acct #-- Savings Acct No.-- Savings Account No.-- Debit Account Number-- Debit Account-- Debit Account #-- Debit Acct Number-- Debit Acct #-- Debit Acct No.-- Debit Account No.----
-### U.S. driver's license number
-
-#### Format
-
-Depends on the state
-
-#### Pattern
-
-depends on the state - for example, New York:
-- nine digits formatted like ddd ddd ddd will match.-- nine digits like ddddddddd won't match.-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_us_drivers_license_abbreviations
--- DL-- DLS-- CDL-- CDLS-- ID-- IDs-- DL#-- DLS#-- CDL#-- CDLS#-- ID#-- IDs#-- ID number-- ID numbers-- LIC-- LIC#-
-##### Keyword_us_drivers_license
--- DriverLic-- DriverLics-- DriverLicense-- DriverLicenses-- Driver Lic-- Driver Lics-- Driver License-- Driver Licenses-- DriversLic-- DriversLics-- DriversLicense-- DriversLicenses-- Drivers Lic-- Drivers Lics-- Drivers License-- Drivers Licenses-- Driver'Lic-- Driver'Lics-- Driver'License-- Driver'Licenses-- Driver' Lic-- Driver' Lics-- Driver' License-- Driver' Licenses-- Driver'sLic-- Driver'sLics-- Driver'sLicense-- Driver'sLicenses-- Driver's Lic-- Driver's Lics-- Driver's License-- Driver's Licenses-- identification number-- identification numbers-- identification #-- id card-- id cards-- identification card-- identification cards-- DriverLic#-- DriverLics#-- DriverLicense#-- DriverLicenses#-- Driver Lic#-- Driver Lics#-- Driver License#-- Driver Licenses#-- DriversLic#-- DriversLics#-- DriversLicense#-- DriversLicenses#-- Drivers Lic#-- Drivers Lics#-- Drivers License#-- Drivers Licenses#-- Driver'Lic#-- Driver'Lics#-- Driver'License#-- Driver'Licenses#-- Driver' Lic#-- Driver' Lics#-- Driver' License#-- Driver' Licenses#-- Driver'sLic#-- Driver'sLics#-- Driver'sLicense#-- Driver'sLicenses#-- Driver's Lic#-- Driver's Lics#-- Driver's License#-- Driver's Licenses#-- id card#-- id cards#-- identification card#-- identification cards#--
-##### Keyword_[state_name]_drivers_license_name
--- state abbreviation (for example, "NY")-- state name (for example, "New York")----
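The New York example above can be sketched as a regex in which the spacing is load-bearing: three space-separated groups of three digits match, while nine consecutive digits do not.

```python
import re

# Three groups of three digits separated by single spaces; word boundaries keep the
# pattern from firing inside longer digit runs.
NY_DL_RE = re.compile(r"\b\d{3} \d{3} \d{3}\b")
```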
-### U.S. individual taxpayer identification number (ITIN)
-
-#### Format
-
-nine digits that start with a "9" and contain a "7" or "8" as the fourth digit, optionally formatted with spaces or dashes
-
-#### Pattern
-
-formatted:
-- the digit "9"-- two digits-- a space or dash-- a "7" or "8"-- a digit-- a space, or dash-- four digits-
-unformatted:
-- the digit "9"-- two digits-- a "7" or "8"-- five digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_itin
--- taxpayer-- tax id-- tax identification-- itin-- i.t.i.n.-- ssn-- tin-- social security-- tax payer-- itins-- taxid-- individual taxpayer---
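Both ITIN shapes above reduce to one regex sketch: a leading 9, a fourth digit of 7 or 8, and optional space or dash delimiters in the formatted variant.

```python
import re

# Leading 9, fourth digit 7 or 8, optional space/dash delimiters.
ITIN_RE = re.compile(r"^9\d{2}[- ]?[78]\d[- ]?\d{4}$")
```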
-### U.S. phone number
-
-#### Pattern
-- 10-digit number, for example, +1 nxx-nxx-xxxx-- Optional country code: +1-- n can be any digit between 2-9-- x can be any digit between 0-9-- Optional parentheses around the area code-- Optional space or - between area code, exchange code, and the last four digits-- Optional four-digit extension-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_us_phone_number
-- cell-- cellphone-- contact-- landline-- mobile-- mob-- mob#-- ph#-- phone-- telephone-- tel#----
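The bullets in the pattern above can be sketched as one loose regex. The `x`-prefixed extension format is an assumption for illustration; real detection also uses surrounding context.

```python
import re

# Optional +1 country code, optional parentheses around the area code, optional
# space/dash separators, optional four-digit extension. n = 2-9 for the first
# digit of the area and exchange codes.
US_PHONE_RE = re.compile(
    r"^(\+?1[ -]?)?\(?[2-9]\d{2}\)?[ -]?[2-9]\d{2}[ -]?\d{4}( ?x\d{4})?$"
)
```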
-### U.S. social security number (SSN)
-
-#### Format
-
-nine digits, which may be in a formatted or unformatted pattern
-
-> [!NOTE]
-> If issued before mid-2011, an SSN has strong formatting where certain parts of the number must fall within certain ranges to be valid (but there's no checksum).
-
-#### Pattern
-
-four functions look for SSNs in four different patterns:
-- Func_ssn finds SSNs with pre-2011 strong formatting that are formatted with dashes or spaces (ddd-dd-dddd OR ddd dd dddd)-- Func_unformatted_ssn finds SSNs with pre-2011 strong formatting that are unformatted as nine consecutive digits (ddddddddd)-- Func_randomized_formatted_ssn finds post-2011 SSNs that are formatted with dashes or spaces (ddd-dd-dddd OR ddd dd dddd)-- Func_randomized_unformatted_ssn finds post-2011 SSNs that are unformatted as nine consecutive digits (ddddddddd)-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_ssn
--- SSA Number-- social security number-- social security #-- social security#-- social security no-- Social Security#-- Soc Sec-- SSN-- SSNS-- SSN#-- SS#-- SSID----
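A sketch of the pre-2011 strong-formatting rules mentioned in the note above: the area number can't be 000, 666, or 900-999, the group can't be 00, and the serial can't be 0000. Post-2011 randomized SSNs follow only the shape, not these ranges.

```python
import re

def ssn_pre2011_ok(value: str) -> bool:
    """Nine digits with optional dashes or spaces, checked against the pre-2011
    invalid ranges (no checksum exists for SSNs)."""
    m = re.fullmatch(r"(\d{3})[- ]?(\d{2})[- ]?(\d{4})", value)
    if not m:
        return False
    area, group, serial = m.groups()
    a = int(area)
    return a not in (0, 666) and a < 900 and group != "00" and serial != "0000"
```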
-### U.S. states
-
-#### Format
-Includes all 50 U.S. state names and the two digit short codes.
-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_us_states
-- State----
-### U.S. zipcode
-
-#### Format
-Five-digit U.S. ZIP code and an optional four-digit code separated by a hyphen (-).
-
-#### Checksum
-Not applicable
-
-#### Keywords
-
-##### Keywords_us_zip_code
-- zip-- zipcode-- postal-- postalcode----
-### U.S. / U.K. passport number
-
-#### Format
-
-nine digits
-
-#### Pattern
-
-nine consecutive digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keywords_eu_passport_number
--- passport#-- passport #-- passportid-- passports-- passportno-- passport no-- passportnumber-- passport number-- passportnumbers-- passport numbers-
-##### Keywords_uk_eu_passport_number
--- british passport-- uk passport-----
-### Ukraine passport domestic
-
-#### Format
-
-nine digits
-
-#### Pattern
-
-nine digits
-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_ukraine_passport_domestic
--- ukraine passport-- passport number-- passport no-- паспорт України-- номер паспорта-- персональний-----
-### Ukraine passport international
-
-#### Format
-
-eight-character alphanumeric pattern
-
-#### Pattern
-
-eight-character alphanumeric pattern:
-- two letters or digits-- six digits-
-#### Checksum
-
-No
-
-#### Keywords
-
-##### Keyword_ukraine_passport_international
--- ukraine passport-- passport number-- passport no-- паспорт України-- номер паспорта
purview Troubleshoot Connections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/troubleshoot-connections.md
- Title: Troubleshoot scans and connections in Microsoft Purview
-description: This article explains the steps to troubleshoot your scans and source connections in Microsoft Purview.
----- Previously updated : 03/07/2023--
-# Troubleshoot your scans and connections in Microsoft Purview
-
-This article describes how to troubleshoot connection errors while setting up scans on data sources in Microsoft Purview, or errors that may occur with your scans.
-
-## Permission the credential on the data source
-
-If you're using a managed identity or service principal as a method of authentication for scans, you'll have to allow these identities to have access to your data source.
-
-There are specific instructions for each [source type](azure-purview-connector-overview.md).
-
-> [!IMPORTANT]
-> Verify that you have followed all prerequisite and authentication steps for the source you are connecting to.
-> You can find all available sources listed in the [Microsoft Purview supported sources article](azure-purview-connector-overview.md).
-
-## Verifying Azure Role-based Access Control to enumerate Azure resources in the Microsoft Purview governance portal
-
-### Registering single Azure data source
-
-To register a single data source in Microsoft Purview, such as an Azure Blob Storage account or an Azure SQL Database, you must be granted at least the **Reader** role on the resource, either directly or inherited from a higher scope such as the resource group or subscription. Some Azure RBAC roles, such as Security Admin, don't have read access to view Azure resources in the control plane.
-
-Verify this by following the steps below:
-
-1. From the [Azure portal](https://portal.azure.com), navigate to the resource that you're trying to register in Microsoft Purview. If you can view the resource, it's likely that you already have at least the Reader role on it.
-2. Select **Access control (IAM)** > **Role Assignments**.
-3. Search by name or email address of the user who is trying to register data sources in Microsoft Purview.
-4. Verify if any role assignments, such as Reader, exist in the list or add a new role assignment if needed.
-
-### Scanning multiple Azure data sources
-
-1. From the [Azure portal](https://portal.azure.com), navigate to the subscription or the resource group.
-2. Select **Access Control (IAM)** from the left menu.
-3. Select **+Add**.
-4. In the **Select input** box, select the **Reader** role and enter your Microsoft Purview account name (which represents its MSI name).
-5. Select **Save** to finish the role assignment.
-6. Repeat the steps above to add the identity of the user who is trying to create a new scan for multiple data sources in Microsoft Purview.
-
-## Scanning data sources using Private Link
-
-If the public endpoint is restricted on your data sources, you need to set up a self-hosted integration runtime and create a credential to scan those Azure data sources using Private Link.
-
-> [!IMPORTANT]
-> Scanning multiple data sources that contain databases, such as Azure SQL Database with _Deny public network access_, will fail. To scan these data sources by using a private endpoint, register each data source individually instead.
-
-For more information about setting up a self-hosted integration runtime, see [Ingestion private endpoints and scanning sources](catalog-private-link-ingestion.md#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources)
-
-For more information how to create a new credential in Microsoft Purview, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
-
-## Storing your credential in your key vault and using the right secret name and version
-
-You must also store your credential in your Azure Key Vault instance and use the right secret name and version.
-
-Verify this by following the steps below:
-
-1. Navigate to your Key Vault.
-1. Select **Settings** > **Secrets**.
-1. Select the secret you're using to authenticate against your data source for scans.
-1. Select the version that you intend to use and verify that the password or account key is correct by selecting **Show Secret Value**.
-
-## Verify permissions for the Microsoft Purview managed identity on your Azure Key Vault
-
-Verify that the correct permissions have been configured for the Microsoft Purview managed identity to access your Azure Key Vault.
-
-To verify this, do the following steps:
-
-1. Navigate to your key vault and to the **Access policies** section
-
-1. Verify that your Microsoft Purview managed identity shows under the _Current access policies_ section with at least **Get** and **List** permissions on Secrets
-
- :::image type="content" source="./media/troubleshoot-connections/verify-minimum-permissions.png" alt-text="Image showing dropdown selection of both Get and List permission options":::
-
-If you don't see your Microsoft Purview managed identity listed, then follow the steps in [Create and manage credentials for scans](manage-credentials.md) to add it.
-
-## Scans no longer run
-
-If your Microsoft Purview scans used to run successfully but are now failing, check these things:
-1. Have credentials to your resource changed or been rotated? If so, you'll need to update your scan to have the correct credentials.
-1. Is an [Azure Policy](../governance/policy/overview.md) preventing **updates to Storage accounts**? If so follow the [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) to create an exception for Microsoft Purview accounts.
-1. Are you using a self-hosted integration runtime? Check that it's up to date with the latest software and that it's connected to your network.
-
-## Test connection passes but scan fails with connection error
-
-Are you using private endpoints or virtual networks? Confirm your [network settings](concept-best-practices-security.md#network-security), paying attention to your Network Security Group (NSG) rules.
-
-## Next steps
--- [Browse the Microsoft Purview Data Catalog](how-to-browse-catalog.md)-- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview Troubleshoot Policy Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/troubleshoot-policy-sql.md
- Title: Troubleshoot Microsoft Purview policies for SQL data sources
-description: Check how to see if SQL data sources are receiving policies from Microsoft Purview.
----- Previously updated : 03/10/2023--
-# Tutorial: Troubleshoot Microsoft Purview policies for SQL data sources
-
-In this tutorial, you learn how to issue SQL commands to inspect the Microsoft Purview policies that have been communicated to the SQL instance, where they're enforced. You'll also learn how to force a download of the policies to the SQL instance. These commands are used only for troubleshooting and aren't required during normal operation of Microsoft Purview policies. They require a higher level of privileges in the SQL instance.
-
-For more information about Microsoft Purview policies, see the concept guides listed in the [Next steps](#next-steps) section.
-
-## Prerequisites
-
-* An Azure subscription. If you don't already have one, [create a free subscription](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-* A Microsoft Purview account. If you don't have one, see the [quickstart for creating a Microsoft Purview account](create-catalog-portal.md).
-* Register a data source, enable *Data use management*, and create a policy. To do so, use one of the Microsoft Purview policy guides. To follow along with the examples in this tutorial, you can [create a DevOps policy for Azure SQL Database](how-to-policies-devops-azure-sql-db.md).
-
-## Test the policy
-Once you create a policy, the Azure AD principals referenced in the Subject of the policy should be able to connect to any database in the server to which the policies are published.
-
-### Force policy download
-You can force an immediate download of the latest published policies to the current SQL database by running the following command. The minimal permission required to run it is membership in the ##MS_ServerStateManager## server role.
-
-```sql
--- Force immediate download of latest published policies
-exec sp_external_policy_refresh reload
-```
-
-### Analyze downloaded policy state from SQL
-The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD principals. The minimal permission required to run them is the VIEW DATABASE SECURITY STATE permission, or the assigned action group *SQL Security Auditor*.
-
-```sql
--- Lists generally supported actions
-SELECT * FROM sys.dm_server_external_policy_actions
--- Lists the roles that are part of a policy published to this server
-SELECT * FROM sys.dm_server_external_policy_roles
--- Lists the links between the roles and actions, could be used to join the two
-SELECT * FROM sys.dm_server_external_policy_role_actions
--- Lists all Azure AD principals that were given connect permissions
-SELECT * FROM sys.dm_server_external_policy_principals
--- Lists Azure AD principals assigned to a given role on a given resource scope
-SELECT * FROM sys.dm_server_external_policy_role_members
--- Lists Azure AD principals, joined with roles, joined with their data actions
-SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
-```
-
-## Next steps
-
-Concept guides for Microsoft Purview access policies:
-- [DevOps policies](concept-policies-devops.md)-- [Self-service access policies](concept-self-service-data-access-policy.md)-- [Data owner policies](concept-policies-data-owner.md)
purview Tutorial Atlas 2 2 Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-atlas-2-2-apis.md
- Title: "Use new APIs available with Atlas 2.2."
-description: This tutorial describes the new APIs available with the Atlas 2.2 upgrade.
----- Previously updated : 04/18/2022-
-# Customer intent: As a developer, I want to use the new APIs available with Atlas 2.2 to interact programmatically with the data map in Microsoft Purview.
--
-# Tutorial: Atlas 2.2 new functionality
-
-In this tutorial, learn how to use the new Atlas 2.2 APIs to interact programmatically with the data map in Microsoft Purview.
-
-## Prerequisites
-
-* If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-
-* You must have an existing Microsoft Purview account. If you don't have a catalog, see the [quickstart for creating a Microsoft Purview account](create-catalog-portal.md).
-
-* To establish a bearer token and to call any data plane APIs, see [the documentation about how to call REST APIs for Microsoft Purview data planes](tutorial-using-rest-apis.md).
-
-## Business metadata APIs
-
-Business metadata is a template that contains custom attributes (key values). You can create these attributes globally and then apply them across multiple typedefs.
-
-### Atlas endpoint
-
-For all the requests, you'll need the Atlas endpoint for your Microsoft Purview account.
-
-1. Find your Microsoft Purview account in the [Azure portal](https://portal.azure.com).
-1. Select the **Properties** page on the left-side menu.
-1. Copy the **Atlas endpoint** value.
--
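Every request in this article combines this Atlas endpoint with a bearer token. As a minimal sketch of that shared plumbing (Python, standard library only; the account name shown is a placeholder, not a real endpoint):

```python
# Helpers for assembling the URL and headers shared by every request below.
# The endpoint and token values are placeholders you supply yourself.

def atlas_url(endpoint: str, path: str) -> str:
    """Join the Atlas endpoint and an API path, normalizing stray slashes."""
    return endpoint.rstrip("/") + "/" + path.lstrip("/")

def auth_headers(token: str) -> dict:
    """Headers expected by the Microsoft Purview data plane APIs."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

url = atlas_url("https://contoso.purview.azure.com", "/api/atlas/v2/types/typedefs")
```

You would then pass `url` and `auth_headers(token)` to your HTTP client of choice.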
-### Create business metadata with attributes
-
-You can send a `POST` request to the following endpoint:
-
-```
-POST {{endpoint}}/api/atlas/v2/types/typedefs
-```
-
->[!TIP]
-> The **applicableEntityTypes** property specifies which entity types the business metadata can be applied to.
-
-Sample JSON:
-
-```json
- {
- "businessMetadataDefs": [
- {
- "category": "BUSINESS_METADATA",
- "createdBy": "admin",
- "updatedBy": "admin",
- "version": 1,
- "typeVersion": "1.1",
- "name": "<Name of Business Metadata>",
- "description": "",
- "attributeDefs": [
- {
- "name": "<Attribute Name>",
- "typeName": "string",
- "isOptional": true,
- "cardinality": "SINGLE",
- "isUnique": false,
- "isIndexable": true,
- "options": {
- "maxStrLength": "50",
- "applicableEntityTypes": "[\"Referenceable\"]"
- }
- }
- ]
- }
- ]
-}
-```
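If you script this call, the request body can be assembled programmatically. The sketch below builds the same shape as the sample above (Python, standard library only); the template and attribute names are hypothetical, and note that `applicableEntityTypes` is a JSON-encoded string, not a JSON array:

```python
import json

def business_metadata_def(name: str, attribute_defs: list) -> dict:
    """Build the request body for POST /api/atlas/v2/types/typedefs."""
    return {
        "businessMetadataDefs": [
            {
                "category": "BUSINESS_METADATA",
                "name": name,
                "description": "",
                "attributeDefs": attribute_defs,
            }
        ]
    }

attribute = {
    "name": "ExpiryDate",  # hypothetical attribute name
    "typeName": "string",
    "isOptional": True,
    "cardinality": "SINGLE",
    "isUnique": False,
    "isIndexable": True,
    "options": {
        "maxStrLength": "50",
        # applicableEntityTypes is a JSON string, not a list
        "applicableEntityTypes": json.dumps(["Referenceable"]),
    },
}

payload = business_metadata_def("AssetGovernance", [attribute])
# e.g. requests.post(f"{endpoint}/api/atlas/v2/types/typedefs",
#                    headers=headers, json=payload)
```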
-
-### Add or update an attribute to existing business metadata
-
-You can send a `PUT` request to the following endpoint:
-
-```
-PUT {{endpoint}}/api/atlas/v2/types/typedefs
-```
-
-Sample JSON:
-
-```json
- {
- "businessMetadataDefs": [
- {
- "category": "BUSINESS_METADATA",
- "createdBy": "admin",
- "updatedBy": "admin",
- "version": 1,
- "typeVersion": "1.1",
- "name": "<Name of Business Metadata>",
- "description": "",
- "attributeDefs": [
- {
- "name": "<Attribute Name>",
- "typeName": "string",
- "isOptional": true,
- "cardinality": "SINGLE",
- "isUnique": false,
- "isIndexable": true,
- "options": {
- "maxStrLength": "500",
- "applicableEntityTypes": "[\"Referenceable\"]"
- }
- },
- {
- "name": "<Attribute Name 2>",
- "typeName": "int",
- "isOptional": true,
- "cardinality": "SINGLE",
- "isUnique": false,
- "isIndexable": true,
- "options": {
- "applicableEntityTypes": "[\"Referenceable\"]"
- }
- }
- ]
- }
- ]
-}
-```
-
-### Get a business metadata definition
-
-You can send a `GET` request to the following endpoint:
-
-```
-GET {{endpoint}}/api/atlas/v2/types/typedef/name/{{Business Metadata Name}}
-```
-
-### Set a business metadata attribute to an entity
-
-You can send a `POST` request to the following endpoint:
-
-```
-POST {{endpoint}}/api/atlas/v2/entity/guid/{{GUID}}/businessmetadata?isOverwrite=true
-```
-
-Sample JSON:
-
-```json
-{
- "myBizMetaData1": {
- "bizAttr1": "I am myBizMetaData1.bizAttr1",
- "bizAttr2": 123
- }
- }
-```
-
-### Delete a business metadata attribute from an entity
--
-You can send a `DELETE` request to the following endpoint:
-
-```
-DELETE {{endpoint}}/api/atlas/v2/entity/guid/{{GUID}}/businessmetadata?isOverwrite=true
-```
-
-Sample JSON:
-
-```json
-{
- "myBizMetaData1": {
- "bizAttr1": ""
- }
-}
-```
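The set and delete bodies differ only in their values: setting sends the new attribute values, while deleting sends the attributes with empty values, as the sample above shows. A small sketch of both shapes (Python; the template and attribute names are hypothetical):

```python
def set_business_metadata_body(template: str, values: dict) -> dict:
    """Body for POST .../entity/guid/{guid}/businessmetadata?isOverwrite=true."""
    return {template: values}

def delete_business_metadata_body(template: str, attribute_names) -> dict:
    """The DELETE call clears attributes by sending them with empty values."""
    return {template: {name: "" for name in attribute_names}}

set_body = set_business_metadata_body(
    "myBizMetaData1", {"bizAttr1": "some text", "bizAttr2": 123}
)
del_body = delete_business_metadata_body("myBizMetaData1", ["bizAttr1"])
```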
-
-### Delete a business metadata type definition
-
->[!NOTE]
->You can only delete a business metadata type definition if it has no references, that is, if it hasn't been assigned to any assets in the catalog.
-
-You can send a `DELETE` request to the following endpoint:
-
-```
-DELETE {{endpoint}}/api/atlas/v2/types/typedef/name/{{Business Metadata Name}}
-```
-
-## Custom attribute APIs
-
-Custom attributes are key/value pairs that can be directly added to an Atlas entity.
-
-### Set a custom attribute to an entity
-
-You can send a `POST` request to the following endpoint:
-
-```
-POST {{endpoint}}/api/atlas/v2/entity
-```
-
-Sample JSON:
-
-```json
-{
- "entity": {
- "typeName": "azure_datalake_gen2_path",
- "attributes": {
-
- "qualifiedName": "<FQN of the asset>",
- "name": "data6.csv"
- },
- "guid": "3ffb28ff-138f-419e-84ba-348b0165e9e0",
- "customAttributes": {
- "custAttr1": "attr1",
- "custAttr2": "attr2"
- }
- }
-}
-```
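Because custom attributes ride along on the entity payload itself, a scripted update typically merges them into an existing entity body before the `POST`. A hedged sketch of that merge (Python; the attribute names are the placeholders from the sample above):

```python
def with_custom_attributes(entity: dict, custom: dict) -> dict:
    """Return an entity-update body with customAttributes merged in."""
    merged = {**entity.get("customAttributes", {}), **custom}
    return {"entity": {**entity, "customAttributes": merged}}

entity = {
    "typeName": "azure_datalake_gen2_path",
    "attributes": {"qualifiedName": "<FQN of the asset>", "name": "data6.csv"},
    "guid": "3ffb28ff-138f-419e-84ba-348b0165e9e0",
}
body = with_custom_attributes(entity, {"custAttr1": "attr1", "custAttr2": "attr2"})
# e.g. requests.post(f"{endpoint}/api/atlas/v2/entity", headers=headers, json=body)
```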
-## Label APIs
-
-Labels are free text tags that can be applied to any Atlas entity.
-
-### Set labels to an entity
-
-You can send a `POST` request to the following endpoint:
-
-```
-POST {{endpoint}}/api/atlas/v2/entity/guid/{{GUID}}/labels
-```
-
-Sample JSON:
-
-```json
-[
- "label1",
- "label2"
-]
-```
-
-### Delete labels from an entity
-
-You can send a `DELETE` request to the following endpoint:
-
-```
-DELETE {{endpoint}}/api/atlas/v2/entity/guid/{{GUID}}/labels
-```
-
-Sample JSON:
-
-```json
-[
- "label2"
-]
-```
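Taken together, the two calls treat labels as a set on the entity: assuming the `POST` replaces the full label list with the posted array (as the samples suggest) and the `DELETE` removes only the listed labels, their effect can be modeled locally like this:

```python
def set_labels(current: set, new_labels) -> set:
    """Model POST .../labels: replace the entity's labels with the posted list."""
    return set(new_labels)

def delete_labels(current: set, to_remove) -> set:
    """Model DELETE .../labels: remove only the listed labels."""
    return set(current) - set(to_remove)

labels = set_labels(set(), ["label1", "label2"])  # after the POST sample
labels = delete_labels(labels, ["label2"])        # after the DELETE sample
```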
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Manage data sources](manage-data-sources.md)
-> [Microsoft Purview data plane REST APIs](/rest/api/purview/)
purview Tutorial Azure Purview Checklist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-azure-purview-checklist.md
- Title: Prerequisites to successfully deploy a Microsoft Purview (formerly Azure Purview) account
-description: This tutorial lists a prerequisite checklist to deploy a Microsoft Purview (formerly Azure Purview) account.
----- Previously updated : 12/09/2022
-# Customer Intent: As a Data and Data Security administrator, I want to deploy Microsoft Purview as a unified data governance solution.
--
-# Microsoft Purview (formerly Azure Purview) deployment checklist
-
-This article lists prerequisites that help you get started quickly on planning and deployment for your Microsoft Purview (formerly Azure Purview) account.
-
-If you are creating a plan to deploy Microsoft Purview, and also want to consider best practices as you develop your deployment strategy, then use [our deployment best practices guide](deployment-best-practices.md) to get started.
-
-If you are looking for a strictly technical deployment guide, this deployment checklist is for you.
-
-|No. |Prerequisite / Action |Required permission |More guidance and recommendations |
-|:|:|:|:|
-|1 | Azure Active Directory Tenant |N/A |An [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md) should be associated with your subscription. <ul><li>*Global Administrator* or *Information Protection Administrator* role is required, if you plan to [extend Microsoft 365 Sensitivity Labels to the Microsoft Purview Data Map for files and db columns](create-sensitivity-label.md)</li><li> *Global Administrator* or *Power BI Administrator* role is required, if you're planning to [scan Power BI tenants](register-scan-power-bi-tenant.md).</li></ul> |
-|2 |An active Azure Subscription |*Subscription Owner* |An Azure subscription is needed to deploy Microsoft Purview and its managed resources. If you don't have an Azure subscription, create a [free subscription](https://azure.microsoft.com/free/) before you begin. |
-|3 |Define whether you plan to deploy Microsoft Purview with a managed event hub | N/A | You can choose to configure an existing Event Hubs namespace during Microsoft Purview account creation; see [Microsoft Purview account creation](create-catalog-portal.md). With this managed namespace, you can publish messages to the event hub Kafka topic ATLAS_HOOK, and Microsoft Purview will consume and process them. Microsoft Purview will also notify entity changes to the event hub Kafka topic ATLAS_ENTITIES, which you can consume and process. You can enable or disable this feature at any time after account creation. |
-|4 |Register the following resource providers: <ul><li>Microsoft.Storage</li><li>Microsoft.EventHub (optional)</li><li>Microsoft.Purview</li></ul> |*Subscription Owner* or custom role to register Azure resource providers (_/register/action_) | [Register required Azure Resource Providers](../azure-resource-manager/management/resource-providers-and-types.md) in the Azure Subscription that is designated for Microsoft Purview Account. Review [Azure resource provider operations](../role-based-access-control/resource-provider-operations.md). |
-|5 |Update Azure Policy to allow deployment of the following resources in your Azure subscription: <ul><li>Microsoft Purview</li><li>Azure Storage</li></ul> |*Subscription Owner* |Use this step if an existing Azure Policy prevents deploying such Azure resources. If a blocking policy exists and needs to remain in place, follow our [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) and follow the steps to create an exception for Microsoft Purview accounts. |
-|6 | Define your network security requirements. | Network and Security architects. |<ul><li> Review [Microsoft Purview network architecture and best practices](concept-best-practices-network.md) to define what scenario is more relevant to your network requirements. </li><li>If private network is needed, use [Microsoft Purview Managed IR](catalog-managed-vnet.md) to scan Azure data sources when possible to reduce complexity and administrative overhead. </li></ul> |
-|7 |An Azure Virtual Network and Subnet(s) for Microsoft Purview private endpoints. | *Network Contributor* to create or update Azure VNet. |Use this step if you're planning to deploy [private endpoint connectivity with Microsoft Purview](catalog-private-link.md): <ul><li>Private endpoints for **Ingestion**.</li><li>Private endpoint for Microsoft Purview **Account**.</li><li>Private endpoint for Microsoft Purview **Portal**.</li></ul> <br> Deploy [Azure Virtual Network](../virtual-network/quick-create-portal.md) if you need one. |
-|8 |Deploy private endpoint for Azure data sources. |*Network Contributor* to set up private endpoints for each data source. |Perform this step, if you're planning to use [Private Endpoint for Ingestion](catalog-private-link-end-to-end.md). |
-|9 |Define whether to deploy new or use existing Azure Private DNS Zones. |Required [Azure Private DNS Zones](catalog-private-link-name-resolution.md) can be created automatically during Purview Account deployment using Subscription Owner / Contributor role |Use this step if you're planning to use Private Endpoint connectivity with Microsoft Purview. Required DNS Zones for Private Endpoint: <ul><li>privatelink.purview.azure.com</li><li>privatelink.purviewstudio.azure.com</li><li>privatelink.blob.core.windows.net</li><li>privatelink.queue.core.windows.net</li><li>privatelink.servicebus.windows.net</li></ul> |
-|10 |A management machine in your CorpNet or inside Azure VNet to launch the Microsoft Purview governance portal. |N/A |Use this step if you're planning to set **Allow Public Network** to **deny** on your Microsoft Purview Account. |
-|11 |Deploy a Microsoft Purview Account |Subscription Owner / Contributor |Purview account is deployed with one Capacity Unit and will scale up based [on demand](concept-elastic-data-map.md). |
-|12 |Deploy a Managed Integration Runtime and Managed private endpoints for Azure data sources. |*Data source admin* to set up Managed VNet inside Microsoft Purview. <br> *Network Contributor* to approve managed private endpoint for each Azure data source. |Perform this step if you're planning to use a [Managed VNet](catalog-managed-vnet.md) within your Microsoft Purview account for scanning purposes. |
-|13 |Deploy Self-hosted integration runtime VMs inside your network. |Azure: *Virtual Machine Contributor* <br> On-premises: Application owner |Use this step if you're planning to perform any scans using [Self-hosted Integration Runtime](manage-integration-runtimes.md). |
-|14 |Create a Self-hosted integration runtime inside Microsoft Purview. |Data curator <br> VM Administrator or application owner |Use this step if you're planning to use a Self-hosted Integration Runtime instead of a Managed Integration Runtime or Azure Integration Runtime. <br><br> [Download the Self-hosted Integration Runtime](https://www.microsoft.com/en-us/download/details.aspx?id=39717). |
-|15 |Register your Self-hosted integration runtime | Virtual machine administrator |Use this step if you have **on-premises** or **VM-based data sources** (for example, SQL Server). <br> Use this step if you're using a **Private Endpoint** to scan **any** data sources. |
-|16 |Grant Azure RBAC **Reader** role to **Microsoft Purview MSI** at data sources' Subscriptions |*Subscription owner* or *User Access Administrator* |Use this step if you're planning to register [multiple](register-scan-azure-multiple-sources.md) or **any** of the following data sources: <ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md)</li><li>[Azure Data Lake Storage Gen1](register-scan-adls-gen1.md)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md)</li><li>[Azure SQL Database](register-scan-azure-sql-database.md)</li><li>[Azure SQL Managed Instance](register-scan-azure-sql-managed-instance.md)</li><li>[Azure Synapse Analytics](register-scan-synapse-workspace.md)</li></ul> |
-|17 |Grant Azure RBAC **Storage Blob Data Reader** role to **Microsoft Purview MSI** at data sources Subscriptions. |*Subscription owner* or *User Access Administrator* | **Skip** this step if you're using Private Endpoint to connect to data sources. Use this step if you have these data sources:<ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li></ul> |
-|18 |Enable network connectivity to allow AzureServices to access data sources: <br> for example, Enable "**Allow trusted Microsoft services to access this storage account**". |*Owner* or *Contributor* at Data source |Use this step if **Service Endpoint** is used in your data sources. (Don't use this step if Private Endpoint is used) |
-|19 |Enable **Azure Active Directory Authentication** on **Azure SQL Servers**, **Azure SQL Managed Instance** and **Azure Synapse Analytics** |Azure SQL Server Contributor |Use this step if you have **Azure SQL DB** or **Azure SQL Managed Instance** or **Azure Synapse Analytics** as data source. **Skip** this step if you're using **Private Endpoint** to connect to data sources. |
-|20 |Grant **Microsoft Purview MSI** account with **db_datareader** role to Azure SQL databases and Azure SQL Managed Instance databases |Azure SQL Administrator |Use this step if you have **Azure SQL DB** or **Azure SQL Managed Instance** as data source. **Skip** this step if you're using **Private Endpoint** to connect to data sources. |
-|21 |Grant Azure RBAC **Storage Blob Data Reader** to **Synapse SQL Server** for staging Storage Accounts |Owner or User Access Administrator at data source |Use this step if you have **Azure Synapse Analytics** as data sources. **Skip** this step if you're using Private Endpoint to connect to data sources. |
-|22 |Grant Azure RBAC **Reader** role to **Microsoft Purview MSI** at **Synapse workspace** resources |Owner or User Access Administrator at data source |Use this step if you have **Azure Synapse Analytics** as data sources. **Skip** this step if you're using Private Endpoint to connect to data sources. |
-|23 |Grant Azure **Purview MSI account** with **db_datareader** role |Azure SQL Administrator |Use this step if you have **Azure Synapse Analytics (Dedicated SQL databases)**. <br> **Skip** this step if you're using **Private Endpoint** to connect to data sources. |
-|24 |Grant **Microsoft Purview MSI** account with **sysadmin** role |Azure SQL Administrator |Use this step if you have Azure Synapse Analytics (Serverless SQL databases). **Skip** this step if you're using **Private Endpoint** to connect to data sources. |
-|25 |Create an app registration or service principal inside your Azure Active Directory tenant | Azure Active Directory *Global Administrator* or *Application Administrator* | Use this step if you're planning to perform a scan on a data source using Delegated Author [Service Principal](create-service-principal-azure.md).|
-|26 |Create an **Azure Key Vault** and a **Secret** to save data source credentials or service principal secret. |*Contributor* or *Key Vault Administrator* |Use this step if you have **on-premises** or **VM-based data sources** (for example, SQL Server). <br> Use this step if you're using **ingestion private endpoints** to scan a data source. |
-|27 |Grant Key **Vault Access Policy** to Microsoft Purview MSI: **Secret: get/list** |*Key Vault Administrator* |Use this step if you have **on-premises** / **VM-based data sources** (for example, SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Vault Access Policy](../key-vault/general/assign-access-policy.md). |
-|28 |Grant **Key Vault RBAC role** Key Vault Secrets User to Microsoft Purview MSI. | *Owner* or *User Access Administrator* |Use this step if you have **on-premises** or **VM-based data sources** (for example, SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Azure role-based access control](../key-vault/general/rbac-guide.md). |
-|29 | Create a new connection to Azure Key Vault from the Microsoft Purview governance portal | *Data source admin* | Use this step if you're planning to use any of the following [authentication options](manage-credentials.md#create-a-new-credential) to scan a data source in Microsoft Purview: <ul><li>Account key</li><li>Basic Authentication</li><li>Delegated Auth</li><li>SQL Authentication</li><li>Service Principal</li><li>Consumer Key</li></ul>
-|30 |Deploy a private endpoint for Power BI tenant |*Power BI Administrator* <br> *Network contributor* |Use this step if you're planning to register a Power BI tenant as data source and your Microsoft Purview account is set to **deny public access**. <br> For more information, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links). |
-|31 |Connect Azure Data Factory to Microsoft Purview from Azure Data Factory Portal. **Manage** -> **Microsoft Purview**. Select **Connect to a Purview account**. <br> Validate if Azure resource tag **catalogUri** exists in ADF Azure resource. |Azure Data Factory Contributor / Data curator |Use this step if you have **Azure Data Factory**. |
-|32 |Verify if you have at least one **Microsoft 365 required license** in your Azure Active Directory tenant to use sensitivity labels in Microsoft Purview. |Azure Active Directory *Global Reader* |Perform this step if you're planning to extend **sensitivity labels to Microsoft Purview Data Map** <br> For more information, see [licensing requirements to use sensitivity labels on files and database columns in Microsoft Purview](sensitivity-labels-frequently-asked-questions.yml) |
-|33 |Consent "**Extend labeling to assets in Microsoft Purview Data Map**" |Compliance Administrator <br> Azure Information Protection Administrator |Use this step if you're interested in extending sensitivity labels to your data in the data map. <br> For more information, see [Labeling in the Microsoft Purview Data Map](create-sensitivity-label.md). |
-|34 |Create new collections and assign roles in Microsoft Purview |*Collection admin* | [Create a collection and assign permissions in Microsoft Purview](./quickstart-create-collection.md). |
-|35 |Grant access to data roles in the organization |*Collection admin* |Provide access to other teams to use Microsoft Purview: <ul><li> Data curator</li><li>Data reader</li><li>Collection admin</li><li>Data source admin</li><li>Policy Author</li><li>Workflow admin</li></ul> <br> For more information, see [Access control in Microsoft Purview](catalog-permissions.md). |
-|36 |Govern data sources in Microsoft Purview |*Data Source admin* <br> *Data Reader* or *Data Curator* | For more information, see [supported data sources and file types](azure-purview-connector-overview.md). |
-
-## Next steps
-- [Review Microsoft Purview deployment best practices](./deployment-best-practices.md)
purview Tutorial Azure Purview Tools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-azure-purview-tools.md
- Title: Learn about open-source tools and utilities for Microsoft Purview governance services
-description: This tutorial lists various tools and utilities available for Microsoft Purview governance services and discusses their usage.
----- Previously updated : 12/07/2022
-# Customer Intent: As a Microsoft Purview administrator, I want to kickstart and be up and running with Microsoft Purview service in a matter of minutes; additionally, I want to perform and set up automations, batch-mode API executions and scripts that help me run Microsoft Purview smoothly and effectively for the long-term on a regular basis.
--
-# Microsoft Purview governance services open-source tools and utilities
-
-This article lists several open-source tools and utilities (command-line, Python, and PowerShell interfaces) that help you get started quickly with Microsoft Purview governance services, like Microsoft Purview Data Map, Data Catalog, and Data Estate Insights, in a matter of minutes. These tools were authored and developed through the collective effort of the Microsoft Purview product group and the open-source community. Their objective is to make learning, starting up, regular usage, and long-term adoption of Microsoft Purview fast and easy.
-
-## Intended audience
-
-- Microsoft Purview community including customers, developers, ISVs, partners, evangelists, and enthusiasts.
-
-- The Microsoft Purview Data Catalog is based on [Apache Atlas](https://atlas.apache.org/) and extends full support for Apache Atlas APIs. We welcome the Apache Atlas community, enthusiasts, and developers to build on and evangelize Microsoft Purview.
-
-## Microsoft Purview customer journey stages
-
-- *Microsoft Purview Learners*: Learners who are starting fresh with Microsoft Purview governance services and are keen to understand and explore how a multicloud unified data governance solution works. Some learners want to compare and contrast Microsoft Purview with competing solutions in the data governance market and try it before adopting it for long-term usage.
-
-- *Microsoft Purview Innovators*: Innovators who are keen to understand existing and upcoming features, and to ideate and conceptualize new features for Microsoft Purview. They're adept at building and developing solutions for customers and have forward-looking ideas for the next-gen, cutting-edge data governance product.
-
-- *Microsoft Purview Enthusiasts/Evangelists*: Enthusiasts who are a combination of Learners and Innovators. They have developed a solid understanding and knowledge of Microsoft Purview and are upbeat about its adoption. They can help evangelize Microsoft Purview as a service and educate other Microsoft Purview users and prospective customers across the globe.
-
-- *Microsoft Purview Adopters*: Adopters who have moved past starting up and exploring the Microsoft Purview governance portal and have been using Microsoft Purview smoothly for more than a few months.
-
-- *Microsoft Purview Long-Term Regular Users*: Long-term users who have been using the Microsoft Purview governance portal for more than one year, are confident and comfortable with most advanced use cases in the Azure portal and the Microsoft Purview governance portal, and have near-complete knowledge of the Microsoft Purview REST APIs and the use cases they support.
-
-## Microsoft Purview open-source tools and utilities list
-
-1. [Purview-API-via-PowerShell](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/README.md)
-
- - **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: This utility covers the entire set of functionality described in the [Microsoft Purview REST API reference](/rest/api/purview/). [Download and install it from the PowerShell Gallery](https://aka.ms/purview-api-ps). It helps you execute all the documented Microsoft Purview REST APIs through a fast, easy-to-use PowerShell interface. Use it to automate Microsoft Purview APIs for regular and long-term usage via command-line and scripted methods. It's an alternative for customers looking to do bulk tasks in an automated manner, in batch mode, or as scheduled cron jobs, as opposed to the GUI method of using the Azure portal and the Microsoft Purview governance portal. Detailed documentation, a sample usage guide, self-help, and examples are available on [GitHub: Azure-Purview-API-PowerShell](https://github.com/Azure/Azure-Purview-API-PowerShell).
-
-1. [Microsoft Purview Lab](https://aka.ms/purviewlab)
-
- - **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts*
- - **Description**: A hands-on-lab introducing the myriad features of Microsoft Purview and helping you learn the concepts in a practical and hands-on approach where you execute each step on your own by hand to develop the best possible understanding of Microsoft Purview.
-
-1. [Microsoft Purview CLI](https://aka.ms/purviewcli)
-
- - **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: A Python-based tool to execute the Microsoft Purview APIs, similar to [Purview-API-via-PowerShell](https://aka.ms/purview-api-ps), but with more limited functionality than the PowerShell-based framework.
-
-1. [Microsoft Purview Demo](https://aka.ms/pvdemo)
-
- - **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts*
- - **Description**: An Azure Resource Manager (ARM) template-based tool to automatically set up and deploy a new Microsoft Purview account quickly and securely with a single command. It's similar to [Purview-Starter-Kit](https://aka.ms/PurviewKickstart), with the extra feature that it deploys a few more preconfigured data sources: Azure SQL Database, Azure Data Lake Storage Gen2, Azure Data Factory, and an Azure Synapse Analytics workspace.
-
-1. [PyApacheAtlas: Interface between Microsoft Purview and Apache Atlas](https://github.com/wjohnson/pyapacheatlas) using Atlas APIs
-
- - **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: A Python package to work with Microsoft Purview and Apache Atlas API. Supports bulk loading, custom lineage, and more from a Pythonic set of classes and Excel templates. The package supports programmatic interaction and an Excel template for low-code uploads.
-
-1. [Microsoft Purview Event Hubs Notifications Reader](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/purview_atlas_eventhub_sample.py)
-
- - **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: This tool demonstrates how to read Microsoft Purview's Event Hubs and catch real-time Kafka notifications from the Event Hubs in [Atlas Notifications](https://atlas.apache.org/2.0.0/Notifications.html) format. It also generates, on the fly, a CSV file of the entities and assets discovered live during a scan, plus any other notifications of interest that Microsoft Purview generates.
--
-## Feedback and disclaimer
-
-None of these tools comes with an express warranty from Microsoft of its efficacy or functionality. They're certified to be free of any malicious activity or viruses, and guaranteed not to collect any private or sensitive data.
-
-For feedback or questions about efficacy and functionality during usage, contact the respective tool owners and authors on the contact details mentioned in the respective GitHub repo.
--
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Purview-API-PowerShell](https://aka.ms/purview-api-ps)
purview Tutorial Custom Types https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-custom-types.md
- Title: Type definitions and how to create custom types in Microsoft Purview
-description: This tutorial will explain what type definitions are, how to create custom type definitions, and how to initialize assets of those custom types in Microsoft Purview.
---- Previously updated : 03/14/2023--
-# Type definitions and how to create custom types
-
-This tutorial will explain what type definitions are, how to create custom types, and how to initialize assets of custom types in Microsoft Purview.
-
-In this tutorial, you'll learn:
-
-> [!div class="checklist"]
->* How Microsoft Purview uses the *type system* from [*Apache Atlas*](https://atlas.apache.org/#/)
->* How to create a new custom type
->* How to create relationships between custom types
->* How to initialize new entities of custom types
-
-## Prerequisites
-
-For this tutorial you'll need:
-
-* An Azure account with an active subscription. If you don't have one, you can [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active Microsoft Purview (formerly Azure Purview) account. If you don't have one, see the [quickstart for creating a Microsoft Purview account](create-microsoft-purview-portal.md).
-* A bearer token to your Microsoft Purview account. To establish a bearer token and to call any data plane APIs, see the documentation about how to [call REST APIs for Microsoft Purview data planes](tutorial-using-rest-apis.md).
-* Apache Atlas endpoint of your Microsoft Purview account. To get your Apache Atlas endpoint, follow the *Apache Atlas endpoint* section from [here](tutorial-atlas-2-2-apis.md#atlas-endpoint).
-
-> [!NOTE]
-> Before moving to the hands-on part of the tutorial, the first four sections will explain what a System Type is and how it is used in Microsoft Purview.
-> All the REST API calls described further will use the **bearer token** and the **endpoint** which are described in the prerequisites.
->
-> To skip directly to the steps, use these links:
->
->* [Create custom type definitions](#create-definitions)
->* [Initialize assets of custom types](#initialize-assets-of-custom-types)
-
-## What are *asset* and *type* in Microsoft Purview?
-
-An *asset* is a metadata element that describes a digital or physical resource. The digital or physical resources that are expected to be cataloged as assets include:
-
-* Data sources such as databases, files, and data feed.
-* Analytical models and processes.
-* Business policies and terms.
-* Infrastructure like the server.
-
-Microsoft Purview provides users a flexible *type system* to expand the definition of the asset to include new kinds of resources as they become relevant. Microsoft Purview relies on the [Type System](https://atlas.apache.org/2.0.0/TypeSystem.html) from Apache Atlas. All metadata objects (assets) managed by Microsoft Purview are modeled using type definitions. Understanding the Type System is fundamental to create new custom types in Microsoft Purview.
-
-Essentially, a *Type* can be seen as a *Class* from Object Oriented Programming (OOP):
-
-* It defines the properties that represent that type.
-* Each type is uniquely identified by its *name*.
-* A *type* can inherit from a *superType*. This is equivalent to inheritance in OOP: a type that extends a superType inherits the attributes of the superType.
-
-You can see all type definitions in your Microsoft Purview account by sending a `GET` request to the [All Type Definitions](/rest/api/purview/catalogdataplane/types/get-all-type-definitions) endpoint:
-
-```http
-GET https://{{ENDPOINT}}/catalog/api/atlas/v2/types/typedefs
-```
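The response bundles every definition kind in one document, so a short helper can make it easier to skim. A sketch (Python; the function name is ours, and the response shape is assumed from the Atlas typedefs format):

```python
def typedef_names(typedefs_response: dict) -> dict:
    """Summarize a GET /types/typedefs response as {definition kind: [type names]}."""
    kinds = ("enumDefs", "structDefs", "classificationDefs",
             "entityDefs", "relationshipDefs", "businessMetadataDefs")
    return {k: [d.get("name") for d in typedefs_response.get(k, [])] for k in kinds}

# A tiny stand-in for the real (much larger) response:
sample = {"entityDefs": [{"name": "azure_sql_table"}, {"name": "azure_sql_schema"}]}
summary = typedef_names(sample)
```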
-
-Apache Atlas has a few predefined system types that are commonly used as supertypes.
-
-For example:
-
-* **Referenceable**: This type represents all entities that can be searched for using a unique attribute called *qualifiedName*.
-
-* **Asset**: This type extends from Referenceable and has other attributes such as: *name*, *description* and *owner*.
-
-* **DataSet**: This type extends Referenceable and Asset. Conceptually, it can be used to represent a type that stores data. Types that extend DataSet can be expected to have a Schema. For example, a SQL table.
-
-* **Lineage**: Lineage information helps one understand the origin of data and the transformations it may have gone through before arriving in a file or table. Lineage is calculated through *DataSet* and *Process*: DataSets (input of process) impact some other DataSets (output of process) through Process.
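Before the hands-on steps, it can help to see roughly what a custom type definition payload looks like. The sketch below (Python; the type and attribute names are hypothetical) builds an entity definition that extends *DataSet*, so the new type inherits attributes such as *qualifiedName*, *name*, *description*, and *owner*:

```python
def custom_entity_def(name: str, attribute_defs: list,
                      super_types=("DataSet",)) -> dict:
    """Body for POST /api/atlas/v2/types/typedefs defining a custom entity type."""
    return {
        "entityDefs": [
            {
                "category": "ENTITY",
                "name": name,
                "superTypes": list(super_types),
                "attributeDefs": attribute_defs,
            }
        ]
    }

payload = custom_entity_def(
    "custom_ml_model",  # hypothetical type name
    [{"name": "framework", "typeName": "string",
      "isOptional": True, "cardinality": "SINGLE"}],
)
```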
-
-## Example of a *Type* definition
-
-To better understand the Type system, let's look at an example and see how an **Azure SQL Table** is defined.
-
-You can get the complete type definition by sending a `GET` request to the Type Definition [endpoint](/rest/api/purview/catalogdataplane/types/get-type-definition-by-name):
-
-```http
-GET https://{{ENDPOINT}}/catalog/api/atlas/v2/types/typedef/name/{name}
-```
-
->[!TIP]
-> The **{name}** parameter specifies which definition you're interested in. In this case, use **azure_sql_table**.
-
-Below you can see a simplified JSON result:
-
-```json
-{
- "category": "ENTITY",
- "guid": "7d92a449-f7e8-812f-5fc8-ca6127ba90bd",
- "name": "azure_sql_table",
- "description": "azure_sql_table",
- "typeVersion": "1.0",
- "serviceType": "Azure SQL Database",
- "options": {
- "schemaElementsAttribute": "columns",
- },
- "attributeDefs": [
- { "name": "principalId", ...},
- { "name": "objectType", ...},
- { "name": "createTime", ...},
- { "name": "modifiedTime", ... }
- ],
- "superTypes": [
- "DataSet",
- "Purview_Table",
- "Table"
- ],
- "subTypes": [],
- "relationshipAttributeDefs": [
- {
- "name": "dbSchema",
- "typeName": "azure_sql_schema",
- "isOptional": false,
- "cardinality": "SINGLE",
- "relationshipTypeName": "azure_sql_schema_tables",
- },
- {
- "name": "columns",
- "typeName": "array<azure_sql_column>",
- "isOptional": true,
- "cardinality": "SET",
- "relationshipTypeName": "azure_sql_table_columns",
- },
- ]
-}
-```
-
-Based on the JSON type definition, let's look at some properties:
-
-* **Category** field describes which category your type belongs to. The list of categories supported by Apache Atlas can be found [here](https://atlas.apache.org/api/v2/json_TypeCategory.html).
-
-* **ServiceType** field is useful when browsing assets *by source type* in Microsoft Purview. The *service type* is an entry point for finding all assets that share the same *service type*, as defined in their type definitions. In the screenshot of the Purview UI below, the user narrows the results to entities whose **serviceType** is *Azure SQL Database*:
-
- :::image type="content" source="./media/tutorial-custom-types/browse-assets.png" alt-text="Screenshot of the portal showing the path from Data Catalog to Browse to By source type and the asset highlighted.":::
-
- > [!NOTE]
- > **Azure SQL Database** is defined with the same *serviceType* as **Azure SQL Table**.
-
-* **SuperTypes** describes the *"parent"* types you want to "*inherit*" from.
-
-* **schemaElementsAttributes** from **options** influences what appears in the **Schema** tab of your asset in Microsoft Purview.
-
- Below you can see how the **Schema** tab looks for an asset of type Azure SQL Table:
-
- :::image type="content" source="./media/tutorial-custom-types/schema-tab.png" alt-text="Screenshot of the schema tab for an Azure SQL Table asset." lightbox="./media/tutorial-custom-types/schema-tab.png":::
-
-* **relationshipAttributeDefs** are calculated through the relationship type definitions. In our JSON, we can see that **schemaElementsAttributes** points to the relationship attribute called **columns** - which is one of the elements of the **relationshipAttributeDefs** array, as shown below:
-
- ```json
- ...
- "relationshipAttributeDefs": [
- ...
- {
- "name": "columns",
- "typeName": "array<azure_sql_column>",
- "isOptional": true,
- "cardinality": "SET",
- "relationshipTypeName": "azure_sql_table_columns",
- },
- ]
- ```
-
- Each relationship has its own definition. The name of the definition is found in the **relationshipTypeName** attribute. In this case, it's *azure_sql_table_columns*.
-
- * The **cardinality** of this relationship attribute is set to *SET*, which suggests that it holds a list of related assets.
- * The related asset is of type *azure_sql_column*, as visible in the *typeName* attribute.
-
- In other words, the *columns* relationship attribute relates the Azure SQL Table to a list of Azure SQL Columns that show up in the Schema tab.
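This wiring can be sketched in a few lines of Python (illustrative only — the dict below is a trimmed copy of the *azure_sql_table* definition shown above):

```python
# Resolve which relationship attribute feeds the Schema tab: options'
# schemaElementsAttribute names one entry in relationshipAttributeDefs.

table_typedef = {
    "name": "azure_sql_table",
    "options": {"schemaElementsAttribute": "columns"},
    "relationshipAttributeDefs": [
        {"name": "dbSchema", "typeName": "azure_sql_schema", "cardinality": "SINGLE"},
        {"name": "columns", "typeName": "array<azure_sql_column>", "cardinality": "SET"},
    ],
}

def schema_relationship(typedef: dict) -> dict:
    """Find the relationship attribute named by schemaElementsAttribute."""
    wanted = typedef["options"]["schemaElementsAttribute"]
    return next(
        rel for rel in typedef["relationshipAttributeDefs"] if rel["name"] == wanted
    )

rel = schema_relationship(table_typedef)
print(rel["typeName"])  # array<azure_sql_column>
```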
-
-## Example of a *relationship Type definition*
-
-Each relationship consists of two ends, called *endDef1* and *endDef2*.
-
-In the previous example, *azure_sql_table_columns* was the name of the relationship that characterizes a table (endDef1) and its columns (endDef2).
-
-For the full definition, you can make a `GET` request to the following [endpoint](/rest/api/purview/catalogdataplane/types/get-type-definition-by-name) using *azure_sql_table_columns* as the name:
-
-```http
-GET https://{{ENDPOINT}}/catalog/api/atlas/v2/types/typedef/name/azure_sql_table_columns
-```
-
-Below you can see a simplified JSON result:
-
-```json
-{
- "category": "RELATIONSHIP",
- "guid": "c80d0027-8f29-6855-6395-d243b37d8a93",
- "name": "azure_sql_table_columns",
- "description": "azure_sql_table_columns",
- "serviceType": "Azure SQL Database",
- "relationshipCategory": "COMPOSITION",
- "endDef1": {
- "type": "azure_sql_table",
- "name": "columns",
- "isContainer": true,
- "cardinality": "SET",
- },
- "endDef2": {
- "type": "azure_sql_column",
- "name": "table",
- "isContainer": false,
- "cardinality": "SINGLE",
- }
-}
-```
-
-* **name** is the name of the relationship definition. The value, in this case *azure_sql_table_columns*, is used in the *relationshipTypeName* attribute of any entity that has this relationship, as you can see it referenced in the JSON above.
-
-* **relationshipCategory** is the category of the relationship. It can be COMPOSITION, AGGREGATION, or ASSOCIATION, as described [here](https://atlas.apache.org/api/v2/json_RelationshipCategory.html).
-
-* **endDef1** is the first end of the definition and contains the attributes:
-
- * **type** is the type of the entity that this relationship expects as end1.
-
- * **name** is the attribute that will appear on this entity's relationship attribute.
-
- * **cardinality** is either SINGLE, SET or LIST.
-
- * **isContainer** is a boolean and applies to containment relationship category. When set to true in one end, it indicates that this end is the container of the other end. Therefore:
- * Only *Composition* or *Aggregation* category relationships can have *isContainer* set to true, and on exactly one end.
- * An *Association* category relationship shouldn't have the *isContainer* property set to true on either end.
-
-* **endDef2** is the second end of the definition and describes, similarly to endDef1, the properties of the second part of the relationship.
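The *isContainer* rules above can be expressed as a small validation sketch (illustrative only, not a Purview API): a COMPOSITION or AGGREGATION relationship must have exactly one containing end, while an ASSOCIATION must have none.

```python
# Validate the containment rules for a relationship definition.

def valid_containment(relationship_def: dict) -> bool:
    containers = sum(
        1
        for end in (relationship_def["endDef1"], relationship_def["endDef2"])
        if end.get("isContainer", False)
    )
    if relationship_def["relationshipCategory"] in ("COMPOSITION", "AGGREGATION"):
        return containers == 1
    return containers == 0  # ASSOCIATION

table_columns = {
    "relationshipCategory": "COMPOSITION",
    "endDef1": {"type": "azure_sql_table", "isContainer": True},
    "endDef2": {"type": "azure_sql_column", "isContainer": False},
}
print(valid_containment(table_columns))  # True
```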
-
-## Schema tab
-
-### What is *Schema* in Microsoft Purview?
-
-Schema is an important concept that reflects how data is stored and organized in the data store. It reflects the structure of the data and the data restrictions of the elements that construct the structure.
-
-Elements on the same schema can be classified differently (due to their content). Also, different transformation (lineage) can happen to only a subset of elements. Due to these aspects, Purview can model schema and schema elements **as entities**, hence schema is usually a relationship attribute to the data asset entity. Examples of schema elements are: **columns** of a table, **json properties** of json schema, **xml elements** of xml schema etc.
-
-There are two types of schemas:
-
-* **Intrinsic Schema** - In some systems, schema is intrinsic. For example, when you create a SQL table, the system requires you to define the columns that make up the table; in this sense, the schema of a table is reflected by its columns.
-
- For data stores with a predefined schema, Purview uses the corresponding relationship between the data asset and the schema elements to reflect the schema. This relationship attribute is specified by the **schemaElementsAttribute** keyword in the **options** property of the entity type definition.
-
-* **Non-Intrinsic Schema** - Some systems don't enforce schema restrictions, but users can still store structured data by applying a schema protocol to the data. For example, Azure Blobs store binary data and don't interpret the bytes in the stream. The store is therefore unaware of any schema, but a user can serialize their data with a schema protocol such as JSON before storing it in the blob. In this sense, the schema is maintained by extra protocols and corresponding validation enforced by the user.
-
- For data stores without an inherent schema, the schema model is independent of the data store. For such cases, Purview defines an interface for schema and a relationship between DataSet and schema, called **dataset_attached_schemas**. This extends any entity type that inherits from DataSet with an **attachedSchema** relationship attribute that links to its schema representation.
-
-### Example of *Schema tab*
-
-The Azure SQL Table example from above has an intrinsic schema. The information that shows up in the Schema tab of the Azure SQL Table comes from the Azure SQL Columns themselves.
-
-If you select one of the column items, you can see its details.
-
-The question is: how did Microsoft Purview select the *data_type* property from the column and show it in the Schema tab of the table?
-
-You can get the type definition of an Azure SQL Column by making a `GET` request to the [endpoint](/rest/api/purview/catalogdataplane/types/get-type-definition-by-name):
-
-```http
-GET https://{{ENDPOINT}}/catalog/api/atlas/v2/types/typedef/name/{name}
-```
-
->[!NOTE]
-> {name} in this case is: azure_sql_column
-
-Here's a simplified JSON result:
-
-```json
-{
- "category": "ENTITY",
- "guid": "58034a18-fc2c-df30-e474-75803c3a8957",
- "name": "azure_sql_column",
- "description": "azure_sql_column",
- "serviceType": "Azure SQL Database",
- "options": {
- "schemaAttributes": "[\"data_type\"]"
- },
- "attributeDefs":
- [
- {
- "name": "data_type",
- "typeName": "string",
- "isOptional": false,
- "cardinality": "SINGLE",
- "valuesMinCount": 1,
- "valuesMaxCount": 1,
- "isUnique": false,
- "isIndexable": false,
- "includeInNotification": false
- },
- ...
- ]
- ...
-}
-```
-
->[!NOTE]
->serviceType is Azure SQL Database, the same as for the table
-
-* *schemaAttributes* is set to **data_type**, which is one of the attributes of this type.
-
-Azure SQL Table used *schemaElementsAttribute* to point to a relationship consisting of a list of Azure SQL Columns. The type definition of a column has *schemaAttributes* defined.
-
-In this way, the Schema tab in the table displays the attribute(s) listed in the *schemaAttributes* of the related assets.
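Putting the two halves together, the Schema tab logic can be sketched as follows (illustrative only; the entity dicts are hypothetical, but note that `schemaAttributes` really is a JSON-encoded string in the type definition above, so it needs parsing):

```python
import json

# The column type says which of its attributes belong in the Schema tab.
column_typedef = {
    "name": "azure_sql_column",
    "options": {"schemaAttributes": "[\"data_type\"]"},
}

# Hypothetical related column entities of an Azure SQL Table.
columns = [
    {"attributes": {"name": "CustomerId", "data_type": "int"}},
    {"attributes": {"name": "Email", "data_type": "nvarchar"}},
]

def schema_rows(column_entities, column_type):
    """Build one Schema-tab row per column, showing only schemaAttributes."""
    shown = json.loads(column_type["options"]["schemaAttributes"])
    return [
        {"name": c["attributes"]["name"],
         **{attr: c["attributes"][attr] for attr in shown}}
        for c in column_entities
    ]

rows = schema_rows(columns, column_typedef)
print(rows[0])  # {'name': 'CustomerId', 'data_type': 'int'}
```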
-
-## Create custom type definitions
-
-### Why?
-
-First, why would someone want to create a custom type definition?
-
-There can be cases where no built-in type corresponds to the structure of the metadata you want to import into Microsoft Purview.
-
-In such cases, you need to define a new type.
-
->[!NOTE]
->The usage of built-in types should be favored over the creation of custom types, whenever possible.
-
-Now that we have a general understanding of type definitions, let's create custom type definitions.
-
-### Scenario
-
-In this tutorial, we would like to model a 1:n relationship between two types, called *custom_type_parent* and *custom_type_child*.
-
-A *custom_type_child* should reference one parent, whereas a *custom_type_parent* can reference a list of children.
-
-They should be linked together through a 1:n relationship.
-
->[!TIP]
-> [Here](https://github.com/wjohnson/purview-ingestor-scenarios) you can find a few tips for creating a new custom type.
-
-### Create definitions
-
-1. Create the *custom_type_parent* type definition by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/types/typedefs
- ```
-
- With the body:
-
- ```json
- {
- "entityDefs":
- [
- {
- "category": "ENTITY",
- "version": 1,
- "name": "custom_type_parent",
- "description": "Sample custom type of a parent object",
- "typeVersion": "1.0",
- "serviceType": "Sample-Custom-Types",
- "superTypes": [
- "DataSet"
- ],
- "subTypes": [],
- "options":{
- "schemaElementsAttribute": "columns"
- }
- }
- ]
- }
- ```
-
-1. Create the *custom_type_child* type definition by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/types/typedefs
- ```
-
- With the body:
-
- ```json
- {
- "entityDefs":
- [
- {
- "category": "ENTITY",
- "version": 1,
- "name": "custom_type_child",
- "description": "Sample custom type of a CHILD object",
- "typeVersion": "1.0",
- "serviceType": "Sample-Custom-Types",
- "superTypes": [
- "DataSet"
- ],
- "subTypes": [],
- "options":{
- "schemaAttributes": "data_type"
- }
- }
- ]
- }
- ```
-
-1. Create a custom type relationship definition by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/types/typedefs
- ```
-
- With the body:
-
- ```json
- {
- "relationshipDefs": [
- {
- "category": "RELATIONSHIP",
- "endDef1" : {
- "cardinality" : "SET",
- "isContainer" : true,
- "name" : "Children",
- "type" : "custom_type_parent"
- },
- "endDef2" : {
- "cardinality" : "SINGLE",
- "isContainer" : false,
- "name" : "Parent",
- "type" : "custom_type_child"
- },
- "relationshipCategory" : "COMPOSITION",
- "serviceType": "Sample-Custom-Types",
- "name": "custom_parent_child_relationship"
- }
- ]
- }
- ```
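The two entity-definition requests above share most of their shape, so a sketch can assemble both bodies with one helper (the helper name is an assumption; the payload mirrors the JSON bodies above, and the actual `POST` is left as a comment so the sketch stays self-contained):

```python
# Assemble the typedefs payload for both custom entity types.

def make_entity_def(name: str, description: str, options: dict) -> dict:
    return {
        "category": "ENTITY",
        "version": 1,
        "name": name,
        "description": description,
        "typeVersion": "1.0",
        "serviceType": "Sample-Custom-Types",
        "superTypes": ["DataSet"],
        "subTypes": [],
        "options": options,
    }

payload = {
    "entityDefs": [
        make_entity_def("custom_type_parent", "Sample custom type of a parent object",
                        {"schemaElementsAttribute": "columns"}),
        make_entity_def("custom_type_child", "Sample custom type of a CHILD object",
                        {"schemaAttributes": "data_type"}),
    ]
}

# With a valid Azure AD bearer token, something like:
# requests.post(f"https://{account}.purview.azure.com/catalog/api/atlas/v2/types/typedefs",
#               json=payload, headers={"Authorization": f"Bearer {token}"})
print(len(payload["entityDefs"]))  # 2
```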
-
-## Initialize assets of custom types
-
-1. Initialize a new asset of type *custom_type_parent* by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/entity
- ```
-
- With the body:
-
- ```json
-
- {
- "entity": {
- "typeName":"custom_type_parent",
- "status": "ACTIVE",
- "version": 1,
- "attributes":{
- "name": "First_parent_object",
- "description": "This is the first asset of type custom_type_parent",
- "qualifiedName": "custom//custom_type_parent:First_parent_object"
- }
-
- }
- }
- ```
-
- Save the *guid* as you'll need it later.
-
-1. Initialize a new asset of type *custom_type_child* by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/entity
- ```
-
- With the body:
-
- ```json
- {
- "entity": {
- "typeName":"custom_type_child",
- "status": "ACTIVE",
- "version": 1,
- "attributes":{
- "name": "First_child_object",
- "description": "This is the first asset of type custom_type_child",
- "qualifiedName": "custom//custom_type_child:First_child_object"
- }
- }
- }
- ```
-
- Save the *guid* as you'll need it later.
-
-1. Initialize a new relationship of type *custom_parent_child_relationship* by making a `POST` request to:
-
- ```http
- POST https://{{ENDPOINT}}.purview.azure.com/catalog/api/atlas/v2/relationship/
- ```
-
- With the following body:
-
- >[!NOTE]
- > Replace the *guid* in end1 with the guid of the parent asset you created earlier, and the *guid* in end2 with the guid of the child asset.
-
- ```json
- {
- "typeName": "custom_parent_child_relationship",
- "end1": {
- "guid": "...",
- "typeName": "custom_type_parent"
- },
- "end2": {
- "guid": "...",
- "typeName": "custom_type_child"
- }
- }
- ```
-
-## View the assets in Microsoft Purview
-
-1. Go to *Data Catalog* in Microsoft Purview.
-1. Select *Browse*.
-1. Select *By source type*.
-1. Select *Sample-Custom-Types*.
-
- :::image type="content" source="./media/tutorial-custom-types/custom-types-objects.png" alt-text="Screenshot showing the path from the Data Catalog to Browse assets with the filter narrowed to Sample-Custom-Types.":::
-
-1. Select the *First_parent_object*:
-
- :::image type="content" source="./media/tutorial-custom-types/first-parent-object.png" alt-text="Screenshot of the First_parent_object page.":::
-
-1. Select the *Properties* tab:
-
- :::image type="content" source="./media/tutorial-custom-types/children.png" alt-text="Screenshot of the properties tab with the related assets highlighted, showing one child asset.":::
-
-1. You can see the *First_child_object* being linked there.
-
-1. Select the *First_child_object*:
-
- :::image type="content" source="./media/tutorial-custom-types/first-child-object.png" alt-text="Screenshot of the First_child_object page, showing the overview tab.":::
-
-1. Select the *Properties* tab:
-
- :::image type="content" source="./media/tutorial-custom-types/parent.png" alt-text="Screenshot of the properties page, showing the related assets with a single parent asset.":::
-
-1. You can see the Parent object being linked there.
-
-1. Similarly, you can select the *Related* tab and will see the relationship between the two objects:
-
- :::image type="content" source="./media/tutorial-custom-types/relationship.png" alt-text="Screenshot of the Related tab, showing the relationship between the child and parent.":::
-
-1. You can create multiple children by initializing a new child asset and initializing a new relationship for each one.
-
- >[!NOTE]
- >The *qualifiedName* is unique per asset, therefore, the second child should be called differently, such as: *custom//custom_type_child:Second_child_object*
-
- :::image type="content" source="./media/tutorial-custom-types/two_children.png" alt-text="Screenshot of the First_parent_object, showing the two child assets highlighted.":::
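Because *qualifiedName* must be unique, creating extra children is mostly a matter of varying the name. A hedged helper (the naming scheme is this tutorial's own convention, not a Purview rule):

```python
# Build entity bodies for additional children, each with a unique qualifiedName.

def child_entity(name: str) -> dict:
    return {
        "entity": {
            "typeName": "custom_type_child",
            "status": "ACTIVE",
            "version": 1,
            "attributes": {
                "name": name,
                "description": f"Asset {name} of type custom_type_child",
                "qualifiedName": f"custom//custom_type_child:{name}",
            },
        }
    }

bodies = [child_entity(n) for n in ("Second_child_object", "Third_child_object")]
print(bodies[0]["entity"]["attributes"]["qualifiedName"])
# custom//custom_type_child:Second_child_object
```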
-
- >[!TIP]
- > If you delete the *First_parent_object* you will notice that the children will also be removed, due to the *COMPOSITION* relationship that we chose in the definition.
-
-## Limitations
-
-There are several known limitations when working with custom types that will be addressed in future releases:
-* The Relationship tab looks different compared to built-in types.
-* Custom types have no icons.
-* Hierarchy isn't supported.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Manage data sources](manage-data-sources.md)
-> [Microsoft Purview data plane REST APIs](/rest/api/purview/)
purview Tutorial Data Owner Policies Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-data-owner-policies-storage.md
- Title: Tutorial to provision access for Azure Storage
-description: This tutorial describes how a data owner can create access policies for Azure Storage resources.
- Previously updated: 04/08/2022
-# Tutorial: Access provisioning by data owner to Azure Storage datasets (preview)
-
-[Policies](concept-policies-data-owner.md) in Microsoft Purview allow you to enable access to data sources that have been registered to a collection. This tutorial describes how a data owner can use Microsoft Purview to enable access to datasets in Azure Storage.
-
-In this tutorial, you learn how to:
-> [!div class="checklist"]
-> * Prepare your Azure environment
-> * Configure permissions to allow Microsoft Purview to connect to your resources
-> * Register your Azure Storage resource for Data Use Management
-> * Create and publish a policy for your resource group or subscription
-
-## Prerequisites
---
-## Configuration
-
-### Register the data sources in Microsoft Purview for Data Use Management
-
-Your Azure Storage account needs to be registered in Microsoft Purview before you can define access policies, and during registration we'll enable Data Use Management. **Data Use Management** is a Microsoft Purview feature that allows users to manage access to a resource from within Microsoft Purview. This lets you centralize data discovery and access management; however, it's a feature that directly affects your data security.
-
-> [!WARNING]
-> Before enabling Data Use Management for any of your resources, read through our [**Data Use Management article**](how-to-enable-data-use-management.md).
->
-> This article includes Data Use Management best practices to help you ensure that your information is secure.
-
-To register your resource and enable Data Use Management, follow these steps:
-
-> [!Note]
-> You need to be an owner of the subscription or resource group to be able to add a managed identity on an Azure resource.
-
-1. From the [Azure portal](https://portal.azure.com), find the Azure Blob storage account that you would like to register.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-storage-acct.png" alt-text="Screenshot that shows the storage account":::
-
-1. Select **Access Control (IAM)** in the left navigation and then select **+ Add** --> **Add role assignment**.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ under the **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
-
-1. If you have a firewall enabled on your Storage account, follow these steps as well:
- 1. Go into your Azure Storage account in [Azure portal](https://portal.azure.com).
- 1. Navigate to **Security + networking > Networking**.
- 1. Choose **Selected Networks** under **Allow access from**.
- 1. In the **Exceptions** section, select **Allow trusted Microsoft services to access this storage account** and select **Save**.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-permission.png" alt-text="Screenshot that shows the exceptions to allow trusted Microsoft services to access the storage account.":::
-
-1. Once you have set up authentication for your storage account, go to the [Microsoft Purview governance portal](https://web.purview.azure.com/).
-1. Select **Data Map** on the left menu.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/select-data-map.png" alt-text="Screenshot that shows the far left menu in the Microsoft Purview governance portal open with Data Map highlighted.":::
-
-1. Select **Register**.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/select-register.png" alt-text="Screenshot that shows the Microsoft Purview governance portal Data Map sources, with the register button highlighted at the top.":::
-
-1. On **Register sources**, select **Azure Blob Storage**.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/select-azure-blob-storage.png" alt-text="Screenshot that shows the tile for Azure Multiple on the screen for registering multiple sources.":::
-
-1. Select **Continue**.
-1. On the **Register sources (Azure)** screen, do the following:
- 1. In the **Name** box, enter a friendly name for the data source as it will be listed in the catalog.
- 1. In the **Subscription** dropdown list boxes, select the subscription where your storage account is housed. Then select your storage account under **Storage account name**. In **Select a collection** select the collection where you'd like to register your Azure Storage account.
-
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-data-source-for-policy-storage.png" alt-text="Screenshot that shows the boxes for selecting a storage account.":::
-
- 1. In the **Select a collection** box, select a collection or create a new one (optional).
- 1. Set the *Data Use Management* toggle to **Enabled**, as shown in the image below.
-
- :::image type="content" source="./media/tutorial-data-owner-policies-storage/register-data-source-for-policy-storage.png" alt-text="Screenshot that shows Data Use Management toggle set to active on the registered resource page.":::
-
- >[!TIP]
- >If the Data Use Management toggle is greyed out and unable to be selected:
- > 1. Confirm you have followed all prerequisites to enable Data Use Management across your resources.
- > 1. Confirm that you have selected a storage account to be registered.
- > 1. It may be that this resource is already registered in another Microsoft Purview account. Hover over it to see the name of the Microsoft Purview account that registered the data resource first. Only one Microsoft Purview account can register a resource for Data Use Management at a time.
-
- 1. Select **Register** to register the resource group or subscription with Microsoft Purview with Data Use Management enabled.
-
->[!TIP]
-> For more information about Data Use Management, including best practices or known issues, see our [Data Use Management article](how-to-enable-data-use-management.md).
-
-## Create a data owner policy
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
-1. Select the **New Policy** button in the policy page.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-1.png" alt-text="Data owner can access the Policy functionality in Microsoft Purview when it wants to create policies.":::
-
-1. The new policy page will appear. Enter the policy **Name** and **Description**.
-
-1. To add policy statements to the new policy, select the **New policy statement** button. This will bring up the policy statement builder.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/create-new-policy.png" alt-text="Data owner can create a new policy statement.":::
-
-1. Select the **Effect** button and choose *Allow* from the drop-down list.
-
-1. Select the **Action** button and choose *Read* or *Modify* from the drop-down list.
-
-1. Select the **Data Resources** button to bring up the window to enter Data resource information, which will open to the right.
-
-1. Under the **Data Resources** Panel do one of two things depending on the granularity of the policy:
- - To create a broad policy statement that covers an entire data source, resource group, or subscription that was previously registered, use the **Data sources** box and select its **Type**.
- - To create a fine-grained policy, use the **Assets** box instead. Enter the **Data Source Type** and the **Name** of a previously registered and scanned data source. See example in the image.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-data-source-type.png" alt-text="Screenshot showing the policy editor, with Data Resources selected, and Data source Type highlighted in the data resources menu.":::
-
-1. Select the **Continue** button and traverse the hierarchy to select an underlying data object (for example, a folder or file). Select **Recursive** to apply the policy from that point in the hierarchy down to any child data objects. Then select the **Add** button. This will take you back to the policy editor.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-asset.png" alt-text="Screenshot showing the Select asset menu, and the Add button highlighted.":::
-
-1. Select the **Subjects** button and enter the subject identity as a principal, group, or MSI. Then select the **OK** button. This will take you back to the policy editor.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-subject.png" alt-text="Screenshot showing the Subject menu, with a subject select from the search and the OK button highlighted at the bottom.":::
-
-1. Repeat steps 5 through 11 to enter any more policy statements.
-
-1. Select the **Save** button to save the policy.
-
- :::image type="content" source="./media/tutorial-data-owner-policies-storage/data-owner-policy-example-storage.png" alt-text="Screenshot showing a sample data owner policy giving access to an Azure Storage account.":::
-
-## Publish a data owner policy
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-2.png" alt-text="Screenshot showing the Microsoft Purview governance portal with the leftmost menu open, Policy Management highlighted, and Data Policies selected on the next page.":::
-
-1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be published. Select the **Publish** button on the right top corner of the page.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/publish-policy.png" alt-text="Screenshot showing the policy editing menu with the Publish button highlighted in the top right of the page.":::
-
-1. A list of data sources is displayed. You can enter a name to filter the list. Then, select each data source where this policy is to be published and then select the **Publish** button.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/select-data-sources-publish-policy.png" alt-text="Screenshot showing with Policy publish menu with a data resource selected and the publish button highlighted.":::
-
->[!Important]
-> - Publish is a background operation. It can take up to **2 hours** for the changes to be reflected in Storage account(s).
-
-## Clean up resources
-
-To delete a policy in Microsoft Purview, follow these steps:
-
-1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/policy-onboard-guide-2.png" alt-text="Screenshot showing the leftmost menu open, Policy Management highlighted, and Data Policies selected on the next page.":::
-
-1. The Policy portal will present the list of existing policies in Microsoft Purview. Select the policy that needs to be updated.
-
-1. The policy details page will appear, including Edit and Delete options. Select the **Edit** button, which brings up the policy statement builder. Now, any parts of the statements in this policy can be updated. To delete the policy, use the **Delete** button.
-
- :::image type="content" source="./media/how-to-policies-data-owner-authoring-generic/edit-policy.png" alt-text="Screenshot showing an open policy with the Edit button highlighted in the top menu on the page.":::
-
-## Next steps
-
-Check our demo and related tutorials:
-
-> [!div class="nextstepaction"]
-> [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
-> [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
-> [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
purview Tutorial Data Sources Readiness https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-data-sources-readiness.md
- Title: 'Check data source readiness at scale'
-description: In this tutorial, you'll verify the readiness of your Azure data sources before you register and scan them in Microsoft Purview.
- Previously updated: 12/12/2022
-# Customer intent: As a data steward or catalog administrator, I need to onboard Azure data sources at scale before I register and scan them.
-
-# Tutorial: Check data source readiness at scale
-
-To scan data sources, Microsoft Purview requires access to them. It uses credentials to obtain this access. A *credential* is the authentication information that Microsoft Purview can use to authenticate to your registered data sources. There are a few ways to set up the credentials for Microsoft Purview, including:
-- The managed identity assigned to the Microsoft Purview account.
-- Secrets stored in Azure Key Vault.
-- Service principals.
-
-In this two-part tutorial series, we'll help you verify and configure required Azure role assignments and network access for various Azure data sources across your Azure subscriptions at scale. You can then register and scan your Azure data sources in Microsoft Purview.
-
-Run the [Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script after you deploy your Microsoft Purview account and before you register and scan your Azure data sources.
-
-In part 1 of this tutorial series, you'll:
-
-> [!div class="checklist"]
->
-> * Locate your data sources and prepare a list of data source subscriptions.
-> * Run the readiness checklist script to find any missing role-based access control (RBAC) or network configurations across your data sources in Azure.
-> * In the output report, review missing network configurations and role assignments required by Microsoft Purview Managed Identity (MSI).
-> * Share the report with the Azure subscription owners of your data sources so they can take the suggested actions.
-
-## Prerequisites
-
-* Azure subscriptions where your data sources are located. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-* A [Microsoft Purview account](create-catalog-portal.md).
-* An Azure Key Vault resource in each subscription that has data sources like Azure SQL Database, Azure Synapse Analytics, or Azure SQL Managed Instance.
-* The [Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script.
-
-> [!NOTE]
-> The Microsoft Purview data sources readiness checklist is available only for Windows.
-> This readiness checklist script is currently supported for Microsoft Purview MSI.
-
-## Prepare Azure subscriptions list for data sources
-
-Before running the script, create a .csv file (for example, C:\temp\Subscriptions.csv) with four columns:
-
-|Column name|Description|Example|
-|-|-|-|
-|`SubscriptionId`|Azure subscription IDs for your data sources.|12345678-aaaa-bbbb-cccc-1234567890ab|
-|`KeyVaultName`|Name of an existing key vault that's deployed in the data source subscription.|ContosoDevKeyVault|
-|`SecretNameSQLUserName`|Name of an existing Azure Key Vault secret that contains an Azure Active Directory (Azure AD) user name that can sign in to Azure Synapse, Azure SQL Database, or Azure SQL Managed Instance by using Azure AD authentication.|ContosoDevSQLAdmin|
-|`SecretNameSQLPassword`|Name of an existing Azure Key Vault secret that contains an Azure AD user password that can sign in to Azure Synapse, Azure SQL Database, or Azure SQL Managed Instance by using Azure AD authentication.|ContosoDevSQLPassword|
-
-
-**Sample .csv file:**
-
-
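A minimal example of the file's contents, using the hypothetical values from the table above:

```csv
SubscriptionId,KeyVaultName,SecretNameSQLUserName,SecretNameSQLPassword
12345678-aaaa-bbbb-cccc-1234567890ab,ContosoDevKeyVault,ContosoDevSQLAdmin,ContosoDevSQLPassword
```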
-> [!NOTE]
-> You can update the file name and path in the code, if you need to.
-
-## Run the script and install the required PowerShell modules
-
-Follow these steps to run the script from your Windows computer:
-
-1. [Download the Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script to the location of your choice.
-
-2. On your computer, enter **PowerShell** in the search box on the Windows taskbar. In the search list, select and hold (or right-click) **Windows PowerShell** and then select **Run as administrator**.
-
-3. In the PowerShell window, enter the following command. (Replace `<path-to-script>` with the folder path of the extracted script file.)
-
- ```powershell
- dir -Path <path-to-script> | Unblock-File
- ```
-
-4. Enter the following command to install the Azure cmdlets:
-
- ```powershell
- Install-Module -Name Az -AllowClobber -Scope CurrentUser
- ```
-5. If you see the prompt *NuGet provider is required to continue*, enter **Y**, and then select **Enter**.
-
-6. If you see the prompt *Untrusted repository*, enter **A**, and then select **Enter**.
-
-7. Repeat the previous steps to install the `Az.Synapse` and `AzureAD` modules.
-
-It might take up to a minute for PowerShell to install the required modules.
-
-## Collect other data needed to run the script
-
-Before you run the PowerShell script to verify the readiness of data source subscriptions, obtain the values of the following arguments to use in the scripts:
-
-- `AzureDataType`: Choose any of the following options as your data-source type to check readiness for that data type across your subscriptions:
-
- - `BlobStorage`
-
- - `AzureSQLMI`
-
- - `AzureSQLDB`
-
- - `ADLSGen2`
-
- - `ADLSGen1`
-
- - `Synapse`
-
- - `All`
-
-- `PurviewAccount`: Your existing Microsoft Purview account resource name.
-
-- `PurviewSub`: Subscription ID where the Microsoft Purview account is deployed.
-
-## Verify your permissions
-
-Make sure your user has the following roles and permissions:
-
-| Role or permission | Scope |
-|-|--|
-| **Global Reader** | Azure AD tenant |
-| **Reader** | Azure subscriptions where your Azure data sources are located |
-| **Reader** | Subscription where your Microsoft Purview account was created |
-| **SQL Admin** (Azure AD Authentication) | Azure Synapse dedicated pools, Azure SQL Database instances, Azure SQL managed instances |
-| Access to your Azure key vault | **Get** and **List** access to the key vault's secrets, or the **Key Vault Secrets User** role |
-
-## Run the client-side readiness script
-
-Run the script by completing these steps:
-
-1. Use the following command to go to the script's folder. Replace `<path-to-script>` with the folder path of the extracted file.
-
- ```powershell
- cd <path-to-script>
- ```
-
-2. Run the following command to set the execution policy for the local computer. Enter **A** for *Yes to All* when you're prompted to change the execution policy.
-
- ```powershell
- Set-ExecutionPolicy -ExecutionPolicy Unrestricted
- ```
-
-3. Run the script with the following parameters. Replace the `DataType`, `PurviewName`, and `SubscriptionID` placeholders.
-
- ```powershell
- .\purview-data-sources-readiness-checklist.ps1 -AzureDataType <DataType> -PurviewAccount <PurviewName> -PurviewSub <SubscriptionID>
- ```
-
- When you run the command, a pop-up window might appear twice prompting you to sign in to Azure and Azure AD by using your Azure Active Directory credentials.
-
-It can take several minutes to create the report, depending on the number of Azure subscriptions and resources in the environment.
-
-After the process completes, review the output report, which shows any missing configurations detected in your Azure subscriptions or resources. Results appear as _Passed_, _Not Passed_, or _Awareness_. You can share the results with the corresponding subscription admins in your organization so they can configure the required settings.
-
-## More information
-
-### What data sources are supported by the script?
-
-Currently, the following data sources are supported by the script:
-
-- Azure Blob Storage (BlobStorage)
-- Azure Data Lake Storage Gen2 (ADLSGen2)
-- Azure Data Lake Storage Gen1 (ADLSGen1)
-- Azure SQL Database (AzureSQLDB)
-- Azure SQL Managed Instance (AzureSQLMI)
-- Azure Synapse (Synapse) dedicated pool
-
-You can choose all or any of these data sources as the input parameter when you run the script.
-
-### What checks are included in the results?
-
-#### Azure Blob Storage (BlobStorage)
-
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
-- Service endpoint. Check whether service endpoint is on, and check whether **Allow trusted Microsoft services to access this storage account** is enabled.
-- Networking. Check whether private endpoint is created for storage and enabled for Blob Storage.
-
-#### Azure Data Lake Storage Gen2 (ADLSGen2)
-
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
-- Service endpoint. Check whether service endpoint is on, and check whether **Allow trusted Microsoft services to access this storage account** is enabled.
-- Networking. Check whether private endpoint is created for storage and enabled for Blob Storage.
-
-#### Azure Data Lake Storage Gen1 (ADLSGen1)
-
-- Networking. Check whether service endpoint is on, and check whether **Allow all Azure services to access this Data Lake Storage Gen1 account** is enabled.
-- Permissions. Check whether Microsoft Purview MSI has Read/Execute permissions.
-
-#### Azure SQL Database (AzureSQLDB)
-
-- SQL Server instances:
- - Network. Check whether public endpoint or private endpoint is enabled.
- - Firewall. Check whether **Allow Azure services and resources to access this server** is enabled.
- - Azure AD administration. Check whether Azure SQL Server has Azure AD authentication.
- - Azure AD administration. Populate the Azure SQL Server Azure AD admin user or group.
-
-- SQL databases:
- - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
-
-#### Azure SQL Managed Instance (AzureSQLMI)
-
-- SQL Managed Instance servers:
- - Network. Check whether public endpoint or private endpoint is enabled.
- - ProxyOverride. Check whether Azure SQL Managed Instance is configured as Proxy or Redirect.
- - Networking. Check whether NSG has an inbound rule to allow AzureCloud over required ports:
- - Redirect: 1433 and 11000-11999
- or
- - Proxy: 3342
- - Azure AD administration. Check whether Azure SQL Server has Azure AD authentication.
- - Azure AD administration. Populate the Azure SQL Server Azure AD admin user or group.
-
-- SQL databases:
- - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
-
-#### Azure Synapse (Synapse) dedicated pool
-
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
-- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
-- SQL Server instances (dedicated pools):
- - Network: Check whether public endpoint or private endpoint is enabled.
- - Firewall: Check whether **Allow Azure services and resources to access this server** is enabled.
- - Azure AD administration: Check whether Azure SQL Server has Azure AD authentication.
- - Azure AD administration: Populate the Azure SQL Server Azure AD admin user or group.
-
-- SQL databases:
- - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
-
-## Next steps
-
-In this tutorial, you learned how to:
-> [!div class="checklist"]
->
-> * Run the Microsoft Purview readiness checklist to check, at scale, whether your Azure subscriptions are missing configuration, before you register and scan them in Microsoft Purview.
-
-Go to the next tutorial to learn how to identify the required access and set up required authentication and network rules for Microsoft Purview across Azure data sources:
-
-> [!div class="nextstepaction"]
-> [Configure access to data sources for Microsoft Purview MSI at scale](tutorial-msi-configuration.md)
purview Tutorial Metadata Policy Collections Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-metadata-policy-collections-apis.md
- Title: Learn about Microsoft Purview collections metadata policy and roles APIs
-description: This tutorial discusses how to manage role-based access control over Microsoft Purview collections to users, groups, or service principals.
- Previously updated: 12/07/2022
-# Customer intent: As a Microsoft Purview collection administrator, I want to manage collections and control access to each collection in the Microsoft Purview account by adding or removing users, groups, or service principals via the REST API interface.
-
-# Tutorial: Use REST APIs to manage role-based access control on Microsoft Purview collections
-
-In August 2021, access control in Microsoft Purview moved from Azure Identity & Access Management (IAM) (control plane) to [Microsoft Purview collections](how-to-create-and-manage-collections.md) (data plane). This change gives enterprise data curators and administrators more precise, granular access control over their data sources scanned by Microsoft Purview. It also enables organizations to audit proper access to, and use of, their data.
-
-This tutorial guides you through step-by-step usage of the Microsoft Purview Metadata Policy APIs to help you add users, groups, or service principals to a collection, and manage or remove their roles within that collection. REST APIs are an alternative method to using the Azure portal or Microsoft Purview governance portal to achieve the same granular role-based access control.
-
-For more information about the built-in roles in Microsoft Purview, see the [Microsoft Purview permissions guide](catalog-permissions.md#roles). The guide maps the roles to the level of access permissions that are granted to users.
-
-## Metadata Policy API Reference summary
-
-The following table gives an overview of the [Microsoft Purview Metadata Policy API Reference](/rest/api/purview/metadatapolicydataplane/Metadata-Policy).
-
-> [!NOTE]
-> Replace {pv-acc-name} with the name of your Microsoft Purview account before running these APIs. For instance, if your Microsoft Purview account name is *FabrikamPurviewAccount*, your API endpoints will become *FabrikamPurviewAccount.purview.azure.com*. The "api-version" parameter is subject to change. Refer to the [Microsoft Purview Metadata policy REST API documentation](/rest/api/purview/metadatapolicydataplane/Metadata-Policy) for the latest "api-version" and the API signature.
-
-| API&nbsp;function | REST&nbsp;method | API&nbsp;endpoint | Description |
-| | | | |
-| Read All Metadata Roles| GET| https://{pv-acc-name}.purview.azure.com/policystore/metadataroles?api-version={latest-api-version}^| Reads all metadata roles from your Microsoft Purview account.|
-| Read Metadata Policy By Collection Name| GET| https://{pv-acc-name}.purview.azure.com/policystore/collections/{collectionName}/metadataPolicy?api-version={latest-api-version}^| Reads the metadata policy by using a specified collection name (the six-character random name that's generated by Microsoft Purview when it creates the policy).|
-| Read Metadata Policy By PolicyID| GET| https://{pv-acc-name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version={latest-api-version}^| Reads the metadata policy by using a specified policy ID. The policy ID is in GUID format.|
-| Read All Metadata Policies| GET| https://{pv-acc-name}.purview.azure.com/policystore/metadataPolicies?api-version={latest-api-version}^| Reads all metadata policies from your Microsoft Purview account. You can pick a certain policy to work with from the JSON output list that's generated by this API.|
-| Update/PUT Metadata Policy| PUT| https://{pv-acc-name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version={latest-api-version}^| Updates the metadata policy by using a specified policy ID. The policy ID is in GUID format.|
-
-^ Refer to the [Microsoft Purview Metadata Policy API Reference](/rest/api/purview/metadatapolicydataplane/Metadata-Policy) for the value of {latest-api-version}.
-
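As a sketch of how these endpoints compose, the following Python builds the three read URLs. The account name is hypothetical, and `{latest-api-version}` is left as a placeholder to fill in from the REST reference; this is illustration only, not part of any SDK.

```python
# Sketch only: compose Metadata Policy API URLs.
# ACCOUNT is a hypothetical account name; API_VERSION is a placeholder
# you must replace with the latest api-version from the REST reference.
ACCOUNT = "FabrikamPurviewAccount"
API_VERSION = "{latest-api-version}"

BASE = f"https://{ACCOUNT}.purview.azure.com/policystore"

def metadata_roles_url() -> str:
    # Read All Metadata Roles
    return f"{BASE}/metadataroles?api-version={API_VERSION}"

def metadata_policy_by_collection_url(collection_name: str) -> str:
    # Read Metadata Policy By Collection Name (six-character name, not friendlyName)
    return f"{BASE}/collections/{collection_name}/metadataPolicy?api-version={API_VERSION}"

def metadata_policy_by_id_url(policy_id: str) -> str:
    # Read Metadata Policy By PolicyID (GUID)
    return f"{BASE}/metadataPolicies/{policy_id}?api-version={API_VERSION}"
```

Call these helpers with your own collection name or policy GUID to produce the request URL for an authenticated GET.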
-
-## Microsoft Purview catalog collections API reference summary
-
-The following table gives an overview of the Microsoft Purview collections APIs. For complete documentation about each API, select the API operation in the left column.
-
-| Operation | Description |
-| | |
-| [Create or update collection](/rest/api/purview/accountdataplane/collections/create-or-update-collection) | Creates or updates a collection entity. |
-| [Delete collection](/rest/api/purview/accountdataplane/collections/delete-collection) | Deletes a collection entity. |
-| [Get collection](/rest/api/purview/accountdataplane/collections/get-collection) | Gets a collection.|
-| [Get collection path](/rest/api/purview/accountdataplane/collections/get-collection-path) | Gets the parent name and display name chains that represent the collection path.|
-| [List child collection names](/rest/api/purview/accountdataplane/collections/list-child-collection-names) | Lists the child collections names in the collection.|
-| [List collections](/rest/api/purview/accountdataplane/collections/list-collections) | Lists the collections in the account.|
-
-- If you're using the API, the service principal, user, or group that executes it should have a [Collection Admin](how-to-create-and-manage-collections.md#check-permissions) role assigned in Microsoft Purview to run it successfully.
-
-- For all Microsoft Purview APIs that require {collectionName}, use *"name"* (not *"friendlyName"*). Replace {collectionName} with the actual six-character alphanumeric collection name string.
- > [!NOTE]
- > This name is different from the friendly display name you supplied when you created the collection. If you don't have {collectionName} handy, use the [List Collections API](/rest/api/purview/accountdataplane/collections/list-collections) to select the six-character collection name from the JSON output.
-
-Here's an example JSON file:
-
-```json
-{
- "name": "74dhe7",
- "friendlyName": "Friendly Name",
- "parentCollection": {
- "type": "CollectionReference",
- "referenceName": "{your_purview_account_name}"
- },
- "systemData": {
- "createdBy": "{guid}",
- "createdByType": "Application",
- "createdAt": "2021-08-26T21:21:51.2646627Z",
- "lastModifiedBy": "7f8d47e2-330c-42f0-8744-fcfb1ecb3ea0",
- "lastModifiedByType": "Application",
- "lastModifiedAt": "2021-08-26T21:21:51.2646628Z"
- },
- "collectionProvisioningState": "Succeeded"
-}
-```
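If you only have a collection's friendly display name, a small helper can map it to the six-character *name* that the {collectionName} APIs expect. This is a sketch over the JSON shape shown above; the sample record is hypothetical.

```python
def collection_name_for(friendly_name: str, collections: list) -> str:
    # Map a collection's display name to its six-character "name",
    # which is what APIs that take {collectionName} expect.
    for collection in collections:
        if collection.get("friendlyName") == friendly_name:
            return collection["name"]
    raise KeyError(f"No collection with friendlyName {friendly_name!r}")

# Hypothetical record mirroring the List Collections API output above:
sample = [{"name": "74dhe7", "friendlyName": "Friendly Name"}]
```

Feed it the `value` list returned by the List Collections API to resolve names in bulk.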
-
-### Policy JSON description
-
-Here are some of the important identifiers in the JSON output that's received from the collection APIs:
-
-**Name**: The name of the policy.
-
-**Id**: The unique identifier for the policy.
-
-**Version**: The latest version number of the policy.
- > [!IMPORTANT]
- > The version number is incremented each time the Update-Metadata-Policy API is called. Be sure to fetch the latest copy of the policy by invoking the Get-Policy-by-Policy-ID API. Perform this refresh every time before you call the Update Policy (PUT) API, so that you always have the latest version of the JSON file.
-
-**DecisionRules**: Lists the rules and the effect of this policy. For metadata policies, the effect is always *"Permit"*.
-
-## Add or remove users from a collection or role
-
-Use Microsoft Purview REST APIs to add or remove a user, group, or service principal from a collection or role. Detailed API usage is provided along with sample JSON outputs. We recommend that you follow the next sections in sequence to best understand the Microsoft Purview metadata policy APIs.
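As an illustration of the edit you make to the policy JSON before calling the Update Policy (PUT) API, this sketch appends a user's object ID to a role's attribute rule. The field names follow the policy examples in this article; the rule ID and GUIDs used in testing are hypothetical.

```python
def add_principal_to_role(policy: dict, rule_id: str, principal_id: str) -> None:
    # Find the attribute rule for the target role and append the
    # principal's Azure AD object ID to its "principal.microsoft.id" list.
    for rule in policy["properties"]["attributeRules"]:
        if rule["id"] != rule_id:
            continue
        for conjunction in rule["dnfCondition"]:
            for condition in conjunction:
                if condition.get("attributeName") == "principal.microsoft.id":
                    ids = condition["attributeValueIncludedIn"]
                    if principal_id not in ids:
                        ids.append(principal_id)
                    return
    raise KeyError(f"No principal.microsoft.id condition found for rule {rule_id!r}")
```

After editing the JSON this way, send the whole policy body back with the PUT API; removing a principal is the symmetric edit (delete the ID from the list).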
-
-## Get all metadata roles
-
-To list all the available metadata access permission roles, run the following command:
-
-```ruby
-GET https://{your_purview_account_name}.purview.azure.com/policystore/metadataroles?api-version={latest-api-version}
-```
-
-The output JSON describes the roles and their associated permissions, as shown in the example later in this section.
-
-The default metadata roles are listed in the following table:
-
-| Role ID | Permissions | Role description |
-| | | |
-| purviewmetadatarole\_builtin\_data-source-administrator| Microsoft.Purview/accounts/scan/read Microsoft.Purview/accounts/scan/write Microsoft.Purview/accounts/collection/read| Grants access to others to read, write collection, register data sources, and trigger scans.|
-| purviewmetadatarole\_builtin\_collection-administrator| Microsoft.Purview/accounts/collection/read Microsoft.Purview/accounts/collection/write| Grants administrator-level full access to the entire collection, including adding or removing users and service principal names (SPNs), managing the collection, and granting or revoking access. In some cases, the Collection Administrator might be different from the creator of the collection.|
-| purviewmetadatarole\_builtin\_purview-reader| Microsoft.Purview/accounts/data/read Microsoft.Purview/accounts/collection/read| Grants only read access to data handling and all metadata, including classifications, sensitivity labels, insights, and read assets in a collection, except scan bindings.|
-| purviewmetadatarole\_builtin\_data-curator| Microsoft.Purview/accounts/data/read Microsoft.Purview/accounts/data/write Microsoft.Purview/accounts/collection/read| Grants full access to data handling and all metadata, including classifications, sensitivity labels, insights, and read assets in a collection, except scan bindings.|
-| purviewmetadatarole\_builtin\_data-share-contributor| Microsoft.Purview/accounts/share/read Microsoft.Purview/accounts/share/write| Grants access to data shares as a contributor. |
-
-```json
-{
- "values": [
- {
- "id": "purviewmetadatarole_builtin_data-curator",
- "name": "data-curator",
- "type": "Microsoft.Purview/role",
- "properties": {
- "provisioningState": "Provisioned",
- "roleType": "BuiltIn",
- "friendlyName": "Data Curator",
- "cnfCondition": [
- [
- {
- "attributeName": "request.azure.dataAction",
- "attributeValueIncludedIn": [
- "Microsoft.Purview/accounts/data/read",
- "Microsoft.Purview/accounts/data/write",
- "Microsoft.Purview/accounts/collection/read"
- ]
- }
- ]
- ],
- "version": 1
- }
- },
- {
- "id": "purviewmetadatarole_builtin_data-source-administrator",
- "name": "data-source-administrator",
- "type": "Microsoft.Purview/role",
- "properties": {
- "provisioningState": "Provisioned",
- "roleType": "BuiltIn",
- "friendlyName": "Data Source Administrator",
- "cnfCondition": [
- [
- {
- "attributeName": "request.azure.dataAction",
- "attributeValueIncludedIn": [
- "Microsoft.Purview/accounts/scan/read",
- "Microsoft.Purview/accounts/scan/write",
- "Microsoft.Purview/accounts/collection/read"
- ]
- }
- ]
- ],
- "version": 1
- }
- },
- {
- "id": "purviewmetadatarole_builtin_collection-administrator",
- "name": "collection-administrator",
- "type": "Microsoft.Purview/role",
- "properties": {
- "provisioningState": "Provisioned",
- "roleType": "BuiltIn",
- "friendlyName": "Collection Administrator",
- "cnfCondition": [
- [
- {
- "attributeName": "request.azure.dataAction",
- "attributeValueIncludedIn": [
- "Microsoft.Purview/accounts/collection/read",
- "Microsoft.Purview/accounts/collection/write"
- ]
- }
- ]
- ],
- "version": 1
- }
- },
- {
- "id": "purviewmetadatarole_builtin_purview-reader",
- "name": "purview-reader",
- "type": "Microsoft.Purview/role",
- "properties": {
- "provisioningState": "Provisioned",
- "roleType": "BuiltIn",
- "friendlyName": "Microsoft Purview Reader",
- "cnfCondition": [
- [
- {
- "attributeName": "request.azure.dataAction",
- "attributeValueIncludedIn": [
- "Microsoft.Purview/accounts/data/read",
- "Microsoft.Purview/accounts/collection/read"
- ]
- }
- ]
- ],
- "version": 1
- }
- },
- {
- "id": "purviewmetadatarole_builtin_data-share-contributor",
- "name": "data-share-contributor",
- "type": "Microsoft.Purview/role",
- "properties": {
- "provisioningState": "Provisioned",
- "roleType": "BuiltIn",
- "friendlyName": "Data share contributor",
- "cnfCondition": [
- [
- {
- "attributeName": "request.azure.dataAction",
- "attributeValueIncludedIn": [
- "Microsoft.Purview/accounts/share/read",
- "Microsoft.Purview/accounts/share/write"
- ]
- }
- ]
- ],
- "version": 1
- }
- }
- ]
-}
-```
-
-## Get all metadata policies
-
-```ruby
-GET https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies?api-version={latest-api-version}
-```
-The preceding command lists all available metadata policies across the entire collections hierarchy in tree format, from the root collection at the top down through all of its child policies. Each child collection in turn contains its own next-level children.
-
-Example:
-
-```json
-{
- "values": [
- {
- "name": "policy_FabrikamPurview",
- "id": "9b2f1cb9-584c-4a16-811e-9232884b5cac",
- "version": 30,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "fabrikampurview"
- },
- {
- "fromRule": "permission:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:fabrikampurview"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "name": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0",
- "04314867-60a4-4e5a-ae16-8e5856f415d9",
- "8988fe5c-5736-4179-9435-0a64c273b90b",
- "6d563253-1d5b-48f2-baaa-5489f22ddce9",
- "26f98046-5b02-4fa9-b709-e0519c658891",
- "73fc02dc-becd-468b-a2a3-82238e722dae"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- },
- {
- "attributeName": "principal.microsoft.groups",
- "attributeValueIncludedIn": [
- "ffd851fa-86ec-431b-95ea-8b84d5012383",
- "cf84b126-4384-4952-91f1-7f705b25e569",
- "5046aba1-5b81-411c-8fec-b84600f3f08b",
- "b055a5c6-a04e-4d1a-8524-001ad81bfb28",
- "cc194892-92fa-4ce3-96ae-1f98bef8211c"
- ]
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_data-curator:fabrikampurview",
- "name": "purviewmetadatarole_builtin_data-curator:fabrikampurview",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0",
- "649f56ab-2dd2-40de-a731-3d3f28e7af92",
- "c29a5809-f9ec-49fd-b762-2d4d64abb93e",
- "04314867-60a4-4e5a-ae16-8e5856f415d9",
- "73fc02dc-becd-468b-a2a3-82238e722dae",
- "517a27d2-39ba-4c91-a032-dd9ecf8ad6f1",
- "6d563253-1d5b-48f2-baaa-5489f22ddce9"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_data-curator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-curator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_data-curator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-curator"
- },
- {
- "attributeName": "principal.microsoft.groups",
- "attributeValueIncludedIn": [
- "b055a5c6-a04e-4d1a-8524-001ad81bfb28",
- "cc194892-92fa-4ce3-96ae-1f98bef8211c",
- "5046aba1-5b81-411c-8fec-b84600f3f08b"
- ]
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_data-source-administrator:fabrikampurview",
- "name": "purviewmetadatarole_builtin_data-source-administrator:fabrikampurview",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0",
- "04314867-60a4-4e5a-ae16-8e5856f415d9",
- "517a27d2-39ba-4c91-a032-dd9ecf8ad6f1",
- "6d563253-1d5b-48f2-baaa-5489f22ddce9"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_data-source-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-source-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_data-source-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-source-administrator"
- },
- {
- "attributeName": "principal.microsoft.groups",
- "attributeValueIncludedIn": [
- "b055a5c6-a04e-4d1a-8524-001ad81bfb28",
- "cc194892-92fa-4ce3-96ae-1f98bef8211c",
- "d34eb741-be5e-4098-90d7-eca8d4a5153f",
- "664ec992-9af0-4773-88f2-dc39edc46f6f",
- "5046aba1-5b81-411c-8fec-b84600f3f08b"
- ]
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:fabrikampurview",
- "name": "permission:fabrikampurview",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_purview-reader:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_purview-reader:fabrikampurview"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_data-curator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-curator:fabrikampurview"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_data-source-administrator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_data-source-administrator:fabrikampurview"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "fabrikampurview"
- }
- }
- },
- {
- "name": "policy_b2zpf1",
- "id": "12b0bb28-2acc-413e-8fe1-179ff9cc54c3",
- "version": 0,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "b2zpf1"
- },
- {
- "fromRule": "permission:b2zpf1",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:b2zpf1"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:b2zpf1",
- "name": "purviewmetadatarole_builtin_collection-administrator:b2zpf1",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:ukx7pq",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:ukx7pq"
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:b2zpf1",
- "name": "permission:b2zpf1",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:b2zpf1",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:b2zpf1"
- }
- ],
- [
- {
- "fromRule": "permission:ukx7pq",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:ukx7pq"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "b2zpf1"
- },
- "parentCollectionName": "ukx7pq"
- }
- },
- {
- "name": "policy_7wte2n",
- "id": "a72084e4-ccab-4aec-a364-08ab001e4999",
- "version": 0,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "7wte2n"
- },
- {
- "fromRule": "permission:7wte2n",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:7wte2n"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:7wte2n",
- "name": "purviewmetadatarole_builtin_collection-administrator:7wte2n",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:ukx7pq",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:ukx7pq"
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:7wte2n",
- "name": "permission:7wte2n",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:7wte2n",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:7wte2n"
- }
- ],
- [
- {
- "fromRule": "permission:ukx7pq",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:ukx7pq"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "7wte2n"
- },
- "parentCollectionName": "ukx7pq"
- }
- }
- ]
-}
-```
-
-## Get the selected metadata policy
-
-You can use either of two APIs to fetch a particular collection's metadata policy JSON structure by supplying either {collectionName} or {policyId}.
-
-As described in the following two sections, both APIs serve the same purpose, and the JSON outputs of both are exactly the same.
-
-### Get the metadata policy of the collection by using the collection name
-
-```ruby
-GET https://{your_purview_account_name}.purview.azure.com/policystore/collections/{collectionName}/metadataPolicy?api-version={latest-api-version}
-```
-
-1. Replace {your_purview_account_name} with your Microsoft Purview account name.
-
-1. In the JSON output of the previous API, "Get All Metadata Policies," look for the following section:
-
- { "type": "CollectionReference", "referenceName": "7xkdg2"}
-
-1. Replace "{collectionName}" in the API URL with the value of "referenceName": "{6-char-collection-name}". For example, if your six-character collection name is "7xkdg2," the API URL will be formatted as:
-
-   https://{your_purview_account_name}.purview.azure.com/policystore/collections/7xkdg2/metadataPolicy?api-version={latest-api-version}
-
-1. Execute the API. The following is a sample JSON output:
-
- ```json
- {
- "name": "policy_qu45fs",
- "id": "c6639bb2-9c41-4be0-912b-775750e725de",
- "version": 0,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "qu45fs"
- },
- {
- "fromRule": "permission:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:qu45fs"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "name": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview"
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:qu45fs",
- "name": "permission:qu45fs",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:qu45fs"
- }
- ],
- [
- {
- "fromRule": "permission:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:fabrikampurview"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "qu45fs"
- },
- "parentCollectionName": "fabrikampurview"
- }
- }
- ```
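As a sketch, the request for this API can be assembled in Python. The account name, collection name, and API version below are placeholders, and acquiring the Azure AD bearer token (for example, with the azure-identity library) is assumed to happen elsewhere:

```python
def build_policy_url(account_name: str, collection_name: str, api_version: str) -> str:
    """Return the policystore URL for a collection's metadata policy.

    All three arguments are placeholders here; substitute your own
    Microsoft Purview account name, six-character collection name,
    and the latest supported api-version.
    """
    return (
        f"https://{account_name}.purview.azure.com"
        f"/policystore/collections/{collection_name}"
        f"/metadataPolicy?api-version={api_version}"
    )

# Hypothetical values: "contosopurview" account, collection "7xkdg2".
url = build_policy_url("contosopurview", "7xkdg2", "2021-07-01")
print(url)
```

You would then issue a GET against this URL with an `Authorization: Bearer {token}` header to receive the policy JSON shown above.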
-
-### Get the metadata policy of the collection by using the policy ID
-
-```ruby
-GET https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version={latest-api-version}
-```
-
-1. Replace {your_purview_account_name} with your Microsoft Purview account name.
-
-1. In the JSON output of the previous API, "Get All Metadata Policies," look for the following section:
-
- {.... "name": "policy_qu45fs", "id": "{policy-guid}", "version": N ....}
-
-1. Replace "{policyId}" in the API URL with the value of "id". For example, if your "{policy-guid}" is "c6639bb2-9c41-4be0-912b-775750e725de," the API URL will be formatted as:
-
-   https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies/c6639bb2-9c41-4be0-912b-775750e725de?api-version={latest-api-version}
-
-1. Now execute the API. The following is a sample JSON output:
-
- > [!NOTE]
- > The output of this API call and the previous API call is the same. As mentioned previously, you can choose either one.
-
-```json
-{
- "name": "policy_qu45fs",
- "id": "c6639bb2-9c41-4be0-912b-775750e725de",
- "version": 0,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "qu45fs"
- },
- {
- "fromRule": "permission:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:qu45fs"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "name": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview"
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:qu45fs",
- "name": "permission:qu45fs",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:qu45fs"
- }
- ],
- [
- {
- "fromRule": "permission:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:fabrikampurview"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "qu45fs"
- },
- "parentCollectionName": "fabrikampurview"
- }
-}
-```
--
-## Update the collection policy
-
-```ruby
-PUT https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version={latest-api-version}
-```
-
-In this section, you update the policy JSON that you obtained in the preceding step by adding or removing a user, group, or service principal from the collection. You then push it to the Microsoft Purview service by using a PUT REST method.
-
-Whether you're adding or removing a user, group, or service principal, you'll follow the same API process.
-
-1. Supply the user, group, or service principal object ID {guid} in the "attributeValueIncludedIn" array of the JSON.
-
-1. Search the JSON output of the Get-Policy-by-ID API for the "attributeValueIncludedIn" array in the previous step, and either add or remove the user, group, or service principal object ID in the array. If you're unsure about how to fetch the user, group, or service principal object ID, see [Get-AzureADUser](/powershell/module/azuread/get-azureaduser).
-
-1. There are multiple sections in the JSON mapping, one for each of the four roles. For the Collection Administrator role, use the section whose ID is "purviewmetadatarole_builtin_collection-administrator". Likewise, use the corresponding section for the other roles.
-
-1. To better understand the add/remove operation, carefully examine the difference between the JSON output from the previous API and the following output. In the following JSON output, we've added user ID "3a3a3a3a-2c2c-4b4b-1c1c-2a3b4c5d6e7f" as a Collection Administrator.
-
-```json
-{
- "name": "policy_qu45fs",
- "id": "c6639bb2-9c41-4be0-912b-775750e725de",
- "version": 0,
- "properties": {
- "description": "",
- "decisionRules": [
- {
- "kind": "decisionrule",
- "effect": "Permit",
- "dnfCondition": [
- [
- {
- "attributeName": "resource.purview.collection",
- "attributeValueIncludes": "qu45fs"
- },
- {
- "fromRule": "permission:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:qu45fs"
- }
- ]
- ]
- }
- ],
- "attributeRules": [
- {
- "kind": "attributerule",
- "id": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "name": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "dnfCondition": [
- [
- {
- "attributeName": "principal.microsoft.id",
- "attributeValueIncludedIn": [
- "2f656762-e440-4b62-9eb6-a991d17d64b0",
- "3a3a3a3a-2c2c-4b4b-1c1c-2a3b4c5d6e7f"
- ]
- },
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator",
- "attributeName": "derived.purview.role",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator"
- }
- ],
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:fabrikampurview"
- }
- ]
- ]
- },
- {
- "kind": "attributerule",
- "id": "permission:qu45fs",
- "name": "permission:qu45fs",
- "dnfCondition": [
- [
- {
- "fromRule": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "purviewmetadatarole_builtin_collection-administrator:qu45fs"
- }
- ],
- [
- {
- "fromRule": "permission:fabrikampurview",
- "attributeName": "derived.purview.permission",
- "attributeValueIncludes": "permission:fabrikampurview"
- }
- ]
- ]
- }
- ],
- "collection": {
- "type": "CollectionReference",
- "referenceName": "qu45fs"
- },
- "parentCollectionName": "fabrikampurview"
- }
-}
-```
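The add/remove step itself is plain JSON manipulation before the PUT. A minimal Python sketch, assuming the policy dict was fetched with one of the Get APIs above (the helper name, trimmed policy shape, and object IDs here are illustrative, not part of the Purview API):

```python
def set_collection_admin(policy: dict, object_id: str, present: bool = True) -> None:
    """Add or remove an object ID in the collection-administrator attribute rule.

    Finds the attribute rule whose id starts with the built-in
    collection-administrator role name and edits its
    "attributeValueIncludedIn" principal list in place.
    """
    for rule in policy["properties"]["attributeRules"]:
        if not rule["id"].startswith("purviewmetadatarole_builtin_collection-administrator"):
            continue
        for clause in rule["dnfCondition"]:
            for atom in clause:
                if atom.get("attributeName") == "principal.microsoft.id":
                    ids = atom["attributeValueIncludedIn"]
                    if present and object_id not in ids:
                        ids.append(object_id)
                    elif not present and object_id in ids:
                        ids.remove(object_id)

# Trimmed-down policy dict mirroring the JSON structure shown above.
policy = {
    "properties": {
        "attributeRules": [
            {
                "id": "purviewmetadatarole_builtin_collection-administrator:qu45fs",
                "dnfCondition": [[{
                    "attributeName": "principal.microsoft.id",
                    "attributeValueIncludedIn": ["2f656762-e440-4b62-9eb6-a991d17d64b0"],
                }]],
            }
        ]
    }
}
set_collection_admin(policy, "3a3a3a3a-2c2c-4b4b-1c1c-2a3b4c5d6e7f")
```

After editing, you push the complete, modified policy JSON back with the PUT request shown above.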
-## Add the Root Collection Administrator role
-
-By default, the user who created the Microsoft Purview account is the Root Collection Administrator (that is, the administrator of the topmost level of the collection hierarchy). However, in some cases, an organization may want to change the Root Collection Administrator using the API.
-
-```ruby
-POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Purview/accounts/{accountName}/addRootCollectionAdmin?api-version={latest-api-version}
-```
-To run the preceding command, you only need to pass the new Root Collection Administrator's object ID in the request body. As mentioned earlier, the object ID can be that of any user, group, or service principal.
-
-```json
-{"objectId": "{guid}"}
-```
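A hedged Python sketch of assembling this management-plane call (the subscription ID, resource group, account name, API version, and object ID below are placeholders; acquiring the management-plane bearer token is assumed to happen elsewhere):

```python
import json

def build_add_root_admin_request(sub_id, rg, account, api_version, object_id):
    """Return the (url, body) pair for the addRootCollectionAdmin action.

    All arguments are placeholders; substitute your own subscription,
    resource group, Purview account name, and target object ID.
    """
    url = (
        f"https://management.azure.com/subscriptions/{sub_id}"
        f"/resourceGroups/{rg}/providers/Microsoft.Purview/accounts/{account}"
        f"/addRootCollectionAdmin?api-version={api_version}"
    )
    body = json.dumps({"objectId": object_id})
    return url, body

# Hypothetical values for illustration only.
url, body = build_add_root_admin_request(
    "00000000-0000-0000-0000-000000000000", "contoso-rg",
    "contosopurview", "2021-07-01", "3a3a3a3a-2c2c-4b4b-1c1c-2a3b4c5d6e7f")
```

You would POST `body` to `url` with an `Authorization: Bearer {token}` header.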
-
-> [!NOTE]
-> Users who call this API must have Owner or User Access Administrator (UAA) permissions on the Microsoft Purview account to execute a write action on the account.
-
-## More resources
-
-You may choose to execute Microsoft Purview REST APIs by using the [PowerShell utility](https://aka.ms/purview-api-ps). It can be readily installed from the PowerShell Gallery. With this utility, you can execute all the same commands, but from Windows PowerShell.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Purview-API-PowerShell](https://aka.ms/purview-api-ps)
purview Tutorial Msi Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-msi-configuration.md
- Title: 'Configure access to data sources for Microsoft Purview MSI at scale'
-description: In this tutorial, you'll configure Azure MSI settings on your Azure data source subscriptions.
----- Previously updated : 12/12/2022
-# Customer intent: As a data steward or catalog administrator, I need to onboard Azure data sources at scale before I register and scan them.
-
-# Tutorial: Configure access to data sources for Microsoft Purview MSI at scale
-
-To scan data sources, Microsoft Purview requires access to them. This tutorial is intended for Azure subscription owners and Microsoft Purview Data Source Administrators. It helps you identify the required access and set up authentication and network rules for Microsoft Purview across Azure data sources.
-
-In part 2 of this tutorial series, you'll:
-
-> [!div class="checklist"]
->
-> * Locate your data sources and prepare a list of data source subscriptions.
-> * Run a script to configure any missing role-based access control (RBAC) or required network configurations across your data sources in Azure.
-> * Review the output report.
-
-## Prerequisites
-
-* Azure subscriptions where your data sources are located. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-* A [Microsoft Purview account](create-catalog-portal.md).
-* An Azure Key Vault resource in each subscription that has data sources like Azure SQL Database, Azure Synapse Analytics, or Azure SQL Managed Instance.
-* The [Microsoft Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script.
-
-> [!NOTE]
-> The Microsoft Purview MSI Configuration script is available only for Windows.
-> The script currently supports only Microsoft Purview Managed Identity (MSI).
-
-> [!IMPORTANT]
-> We strongly recommend that you test and verify all the changes the script performs in your Azure environment before you deploy it into your production environment.
-
-## Prepare Azure subscriptions list for data sources
-
-Before you run the script, create a .csv file (for example, "C:\temp\Subscriptions.csv") with four columns:
-
-|Column name|Description|Example|
-|-|-|-|
-|`SubscriptionId`|Azure subscription IDs for your data sources.|12345678-aaaa-bbbb-cccc-1234567890ab|
-|`KeyVaultName`|Name of the existing key vault that's deployed in the data source subscription.|ContosoDevKeyVault|
-|`SecretNameSQLUserName`|Name of an existing Azure Key Vault secret that contains an Azure Active Directory (Azure AD) user name that can sign in to Azure Synapse, Azure SQL Database, or Azure SQL Managed Instance by using Azure AD authentication.|ContosoDevSQLAdmin|
-|`SecretNameSQLPassword`|Name of an existing Azure Key Vault secret that contains an Azure AD user password that can sign in to Azure Synapse, Azure SQL Database, or Azure SQL Managed Instance by using Azure AD authentication.|ContosoDevSQLPassword|
-
--
-**Sample .csv file:**
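Using the example values from the table above, the file would look like this:

```csv
SubscriptionId,KeyVaultName,SecretNameSQLUserName,SecretNameSQLPassword
12345678-aaaa-bbbb-cccc-1234567890ab,ContosoDevKeyVault,ContosoDevSQLAdmin,ContosoDevSQLPassword
```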
-
-
-> [!NOTE]
-> You can update the file name and path in the code, if you need to.
--
-## Run the script and install the required PowerShell modules
-
-Follow these steps to run the script from your Windows computer:
-
-1. Download the [Microsoft Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script to the location of your choice.
-
-2. On your computer, enter **PowerShell** in the search box on the Windows taskbar. In the search list, select and hold (or right-click) **Windows PowerShell** and then select **Run as administrator**.
-
-3. In the PowerShell window, enter the following command. (Replace `<path-to-script>` with the folder path of the extracted script file.)
-
- ```powershell
- dir -Path <path-to-script> | Unblock-File
- ```
-
-4. Enter the following command to install the Azure cmdlets:
-
- ```powershell
- Install-Module -Name Az -AllowClobber -Scope CurrentUser
- ```
-5. If you see the prompt *NuGet provider is required to continue*, enter **Y**, and then select **Enter**.
-
-6. If you see the prompt *Untrusted repository*, enter **A**, and then select **Enter**.
-
-7. Repeat the previous steps to install the `Az.Synapse` and `AzureAD` modules.
-
-It might take up to a minute for PowerShell to install the required modules.
--
-## Collect other data needed to run the script
-
-Before you run the PowerShell script to verify the readiness of data source subscriptions, obtain the values of the following arguments to use in the scripts:
-
-- `AzureDataType`: Choose any of the following options as your data-source type to check the readiness for the data type across your subscriptions:
-
- - `BlobStorage`
-
- - `AzureSQLMI`
-
- - `AzureSQLDB`
-
- - `ADLSGen2`
-
- - `ADLSGen1`
-
- - `Synapse`
-
- - `All`
-
-- `PurviewAccount`: Your existing Microsoft Purview account resource name.
-
-- `PurviewSub`: Subscription ID where the Microsoft Purview account is deployed.
-
-## Verify your permissions
-
-At a minimum, your user needs the following roles and permissions to run the script in your Azure environment:
-
| Role | Scope | Why is it needed? |
-|-|--|--|
-| **Global Reader** | Azure AD tenant | To read Azure SQL Admin user group membership and Microsoft Purview MSI |
-| **Global Administrator** | Azure AD tenant | To assign the **Directory Reader** role to Azure SQL managed instances |
-| **Contributor** | Subscription or resource group where your Microsoft Purview account is created | To read the Microsoft Purview account resource and create a Key Vault resource and secret |
-| **Owner or User Access Administrator** | Management group or subscription where your Azure data sources are located | To assign RBAC |
-| **Contributor** | Management group or subscription where your Azure data sources are located | To set up network configuration |
-| **SQL Admin** (Azure AD Authentication) | Azure SQL Server instances or Azure SQL managed instances | To assign the **db_datareader** role to Microsoft Purview |
| **Get/list** access to Key Vault secrets | Your Azure key vault | To get/list the Key Vault secrets used for Azure SQL Database, Azure SQL Managed Instance, or Azure Synapse authentication |
--
-## Run the client-side readiness script
-
-Run the script by completing these steps:
-
-1. Use the following command to go to the script's folder. Replace `<path-to-script>` with the folder path of the extracted file.
-
- ```powershell
- cd <path-to-script>
- ```
-
-2. Run the following command to set the execution policy for the local computer. Enter **A** for *Yes to All* when you're prompted to change the execution policy.
-
- ```powershell
- Set-ExecutionPolicy -ExecutionPolicy Unrestricted
- ```
-
-3. Run the script with the following parameters. Replace the `DataType`, `PurviewName`, and `SubscriptionID` placeholders.
-
- ```powershell
- .\purview-msi-configuration.ps1 -AzureDataType <DataType> -PurviewAccount <PurviewName> -PurviewSub <SubscriptionID>
- ```
-
-   When you run the command, a pop-up window might appear twice, prompting you to sign in to Azure and Azure AD by using your Azure Active Directory credentials.
-
-It can take several minutes to create the report, depending on the number of Azure subscriptions and resources in the environment.
-
-You might be prompted to sign in to your Azure SQL Server instances if the credentials in the key vault don't match. You can provide the credentials or select **Enter** to skip the specific server.
-
-After the process completes, view the output report to review the changes.
--
-## More information
-
-### What data sources are supported by the script?
-
-Currently, the following data sources are supported by the script:
-
-- Azure Blob Storage (BlobStorage)
-- Azure Data Lake Storage Gen2 (ADLSGen2)
-- Azure Data Lake Storage Gen1 (ADLSGen1)
-- Azure SQL Database (AzureSQLDB)
-- Azure SQL Managed Instance (AzureSQLMI)
-- Azure Synapse (Synapse) dedicated pool
-
-You can choose all or any of these data sources as input parameters when you run the script.
-
-### What configurations are included in the script?
-
-This script can help you automatically complete the following tasks:
-
-#### Azure Blob Storage (BlobStorage)
-
-- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
-- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
-- Networking. Report whether private endpoint is created for storage and enabled for Blob Storage.
-- Service endpoint. If private endpoint is off, check whether service endpoint is on, and enable **Allow trusted Microsoft services to access this storage account**.
-
-#### Azure Data Lake Storage Gen2 (ADLSGen2)
-
-- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
-- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
-- Networking. Report whether private endpoint is created for storage and enabled for Blob Storage.
-- Service endpoint. If private endpoint is off, check whether service endpoint is on, and enable **Allow trusted Microsoft services to access this storage account**.
-
-#### Azure Data Lake Storage Gen1 (ADLSGen1)
-
-- Networking. Verify that service endpoint is on, and enable **Allow all Azure services to access this Data Lake Storage Gen1 account** on Data Lake Storage.
-- Permissions. Assign Read/Execute access to Microsoft Purview MSI. Verify the access.
-
-#### Azure SQL Database (AzureSQLDB)
-
-- SQL Server instances:
- - Network. Report whether public endpoint or private endpoint is enabled.
- - Firewall. If private endpoint is off, verify firewall rules and enable **Allow Azure services and resources to access this server**.
- - Azure AD administration. Enable Azure AD authentication for Azure SQL Database.
-
-- SQL databases:
- - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
-
-#### Azure SQL Managed Instance (AzureSQLMI)
-
-- SQL Managed Instance servers:
- - Network. Verify that public endpoint or private endpoint is on. Report if public endpoint is off.
- - ProxyOverride. Verify that Azure SQL Managed Instance is configured as Proxy or Redirect.
- - Networking. Update NSG rules to allow AzureCloud inbound access to SQL Server instances over required ports:
- - Redirect: 1433 and 11000-11999
-
- or
- - Proxy: 3342
-
- Verify this access.
- - Azure AD administration. Enable Azure AD authentication for Azure SQL Managed Instance.
-
-- SQL databases:
- - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
-
-#### Azure Synapse (Synapse) dedicated pool
-
-- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
-- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
-- SQL Server instances (dedicated pools):
- - Network. Report whether public endpoint or private endpoint is on.
- - Firewall. If private endpoint is off, verify firewall rules and enable **Allow Azure services and resources to access this server**.
- - Azure AD administration. Enable Azure AD authentication for Azure SQL Database.
-
-- SQL databases:
- - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
-
-## Next steps
-
-In this tutorial, you learned how to:
-> [!div class="checklist"]
->
-> * Identify required access and set up required authentication and network rules for Microsoft Purview across Azure data sources.
-
-Go to the next tutorial to learn how to [Register and scan multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md).
purview Tutorial Purview Audit Logs Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-purview-audit-logs-diagnostics.md
- Title: Enable and capture audit logs and time series activity history for applications in the Microsoft Purview governance portal
-description: This tutorial lists the step-by-step configuration required to enable and capture audit logs for applications in the Microsoft Purview governance portal and time series activity history via Azure Diagnostics event hubs.
----- Previously updated : 09/28/2022--
-# Audit logs, diagnostics, and activity history
-
-This tutorial lists the step-by-step configuration required to enable and capture audit and diagnostics logs for applications in the Microsoft Purview governance portal via Azure Event Hubs.
-
-A Microsoft Purview administrator or Microsoft Purview data-source admin needs the ability to monitor audit and diagnostics logs captured from [applications in the Microsoft Purview governance portal](https://azure.microsoft.com/services/purview/#get-started). Audit and diagnostics information consists of the timestamped history of actions taken and changes made to a Microsoft Purview account by every user. Captured activity history includes actions in the [Microsoft Purview governance portal](https://ms.web.purview.azure.com) and outside the portal. Actions outside the portal include calling [Microsoft Purview REST APIs](/rest/api/purview/) to perform write operations.
-
-This tutorial takes you through the steps to enable audit logging. It also shows you how to configure and capture streaming audit events from the Microsoft Purview governance portal via Azure Diagnostics event hubs.
-
-## Audit events categories
-
-Some of the important categories of Microsoft Purview governance portal audit events that are currently available for capture and analysis are listed in the table.
-
-More types and categories of activity audit events will be added.
-
-| Category | Activity | Operation |
-|||--|
-| Management | Collections | Create |
-| Management | Collections | Update |
-| Management | Collections | Delete |
-| Management | Role assignments | Create |
-| Management | Role assignments | Update |
-| Management | Role assignments | Delete |
-| Management | Scan rule set | Create |
-| Management | Scan rule set | Update |
-| Management | Scan rule set | Delete |
-| Management | Classification rule | Create |
-| Management | Classification rule | Update |
-| Management | Classification rule | Delete |
-| Management | Scan | Create |
-| Management | Scan | Update |
-| Management | Scan | Delete |
-| Management | Scan | Run |
-| Management | Scan | Cancel |
-| Management | Scan | Schedule |
-| Management | Data source | Register |
-| Management | Data source | Update |
-| Management | Data source | Delete |
-
-## Enable audit and diagnostics
-
-The following sections walk you through the process of enabling audit and diagnostics.
-
-### Configure Event Hubs
-
-Create an [Azure Event Hubs namespace by using an Azure Resource Manager (ARM) template (GitHub)](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.eventhub/eventhubs-create-namespace-and-enable-capture). This ARM template deploys and creates your Event Hubs instance with the required configuration.
-
-For step-by-step explanations and manual setup:
-
-- [Event Hubs: Use an ARM template to enable Event Hubs capture](../event-hubs/event-hubs-resource-manager-namespace-event-hub-enable-capture.md)
-- [Event Hubs: Enable capturing of events streaming manually by using the Azure portal](../event-hubs/event-hubs-capture-enable-through-portal.md)
-
-### Connect a Microsoft Purview account to Diagnostics event hubs
-
-Now that Event Hubs is deployed and created, connect your Microsoft Purview account diagnostics audit logging to Event Hubs.
-
-1. Go to your Microsoft Purview account home page. This page is where the overview information is displayed in the [Azure portal](https://portal.azure.com). It's not the Microsoft Purview governance portal home page.
-
-1. On the left menu, select **Monitoring** > **Diagnostic settings**.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-e.png" alt-text="Screenshot that shows selecting Diagnostic settings." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-e.png":::
-
-1. Select **Add diagnostic setting** or **Edit setting**. Adding more than one diagnostic setting row in the context of Microsoft Purview isn't recommended. In other words, if you already have a diagnostic setting row, don't select **Add diagnostic**. Select **Edit** instead.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-f.png" alt-text="Screenshot that shows the Add or Edit Diagnostic settings screen." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-f.png":::
-
-1. Select the **audit** and **allLogs** checkboxes to enable collection of audit logs. Optionally, select **AllMetrics** if you also want to capture Data Map capacity units and Data Map size metrics of the account.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png" alt-text="Screenshot that shows configuring Microsoft Purview Diagnostic settings and selecting diagnostic types." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png":::
-
-Diagnostics configuration on the Microsoft Purview account is complete.
-
-Now that diagnostics audit logging configuration is complete, configure the data capture and data retention settings for Event Hubs.
-
-1. Go to the [Azure portal](https://portal.azure.com) home page, and search for the name of the Event Hubs namespace you created earlier.
-
-1. Go to the Event Hubs namespace. Select **Event Hubs** > **Capture Data**.
-
-1. Supply the name of the Event Hubs namespace and the event hub where you want the audit and diagnostics to be captured and streamed. Modify the **Time Window** and **Size Window** values for the retention period of the streaming events. Select **Save**.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-h.png" alt-text="Screenshot that shows Capture settings on the Event Hubs namespace and Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-h.png":::
-
-1. Optionally, on the left menu, go to **Properties** and change **Message Retention** to any value between one and seven days. The retention period value depends on the frequency of scheduled jobs or the scripts you've created to continuously listen and capture the streaming events. If you schedule a capture once every week, move the slider to seven days.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-i.png" alt-text="Screenshot that shows Event Hubs Properties message retention period." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-i.png":::
-
-1. At this stage, the Event Hubs configuration is complete. The Microsoft Purview governance portal will start streaming all its audit history and diagnostics data to this event hub. You can now proceed to read, extract, and perform further analytics and operations on the captured diagnostics and audit events.
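As a rule of thumb for the retention step above, pick a retention period that covers the gap between scheduled capture runs, within the one-to-seven-day slider range. A minimal sketch, assuming a hypothetical helper name:

```python
def retention_days_for_capture_interval(interval_days: int) -> int:
    """Pick an Event Hubs message retention period (1-7 days) that covers
    the interval between scheduled capture jobs. A weekly capture needs
    the full 7-day maximum; shorter intervals can use a matching value."""
    if interval_days < 1:
        raise ValueError("capture interval must be at least one day")
    # The slider caps retention at seven days, so longer intervals
    # still map to the maximum.
    return min(interval_days, 7)
```

For example, a weekly capture schedule maps to the seven-day maximum, matching the guidance above.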
-
-### Read captured audit events
-
-To analyze the captured audit and diagnostics log data:
-
-1. Go to **Process data** on the Event Hubs page to see a preview of the captured audit logs and diagnostics.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-d.png" alt-text="Screenshot that shows configuring Event Hubs Process data." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-d.png":::
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-c.png" alt-text="Screenshot that shows navigating Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-c.png":::
-
-1. Switch between the **Table** and **Raw** views of the JSON output.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png" alt-text="Screenshot that shows exploring Microsoft Purview audit events on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png":::
-
-1. Select **Download sample data** and analyze the results carefully.
-
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png" alt-text="Screenshot that shows Query and Process Microsoft Purview Audit data on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png":::
-
-Now that you know how to gather this information, you can use automatic, scheduled scripts to extract, read, and perform further analytics on the Event Hubs audit and diagnostics data. You can even build your own utilities and custom code to extract business value from captured audit events.
-
-These audit logs can also be transformed to Excel, any database, Dataverse, or Synapse Analytics database for analytics and reporting by using Power BI.
-
-While you're free to use any programming or scripting language of your choice to read the event hubs, here's a ready-made [Python-based script](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/purview_atlas_eventhub_sample.py). See this Python tutorial on how to [capture Event Hubs data in Azure Storage and read it by using Python (azure-eventhub)](../event-hubs/event-hubs-capture-python.md).
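Whichever language you choose, each event read from the event hub carries a JSON envelope whose top-level `records` array holds the individual audit and diagnostic entries. A minimal parsing sketch — the helper name and sample payload are illustrative, not part of the Microsoft Purview event schema:

```python
import json

def extract_audit_records(event_body: str) -> list:
    """Parse one Event Hubs event body. Azure diagnostic logs arrive as a
    JSON envelope whose top-level "records" array holds one entry per
    audit or diagnostic event."""
    envelope = json.loads(event_body)
    return envelope.get("records", [])

# Illustrative payload; real Microsoft Purview events carry more fields.
sample = '{"records": [{"time": "2022-11-03T10:00:00Z", "operationName": "Scan"}]}'
records = extract_audit_records(sample)
```

From here, each record can be flattened into rows for Excel, a database, or Synapse Analytics as described above.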
-
-## Next steps
-
-Enable diagnostic audit logging and kickstart your Microsoft Purview journey.
-
-> [!div class="nextstepaction"]
-> [Microsoft Purview: Automated new account setup](https://aka.ms/PurviewKickstart)
purview Tutorial Register Scan On Premises Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-register-scan-on-premises-sql-server.md
- Title: 'Tutorial: Register and scan an on-premises SQL Server'
-description: This tutorial describes how to register an on-prem SQL Server to Microsoft Purview, and scan the server using a self-hosted IR.
- Previously updated: 11/03/2022
-# Tutorial: Register and scan an on-premises SQL Server
-
-Microsoft Purview is designed to connect to data sources to help you manage sensitive data, simplify data discovery, and ensure right use. Microsoft Purview can connect to sources across your entire landscape, including multi-cloud and on-premises. For this scenario, you'll use a self-hosted integration runtime to connect to data on an on-premises SQL server. Then you'll use Microsoft Purview to scan and classify that data.
-
-In this tutorial, you'll learn how to:
-
-> [!div class="checklist"]
-> * Sign in to the Microsoft Purview governance portal.
-> * Create a collection in Microsoft Purview.
-> * Create a self-hosted integration runtime.
-> * Store credentials in an Azure Key Vault.
-> * Register an on-premises SQL Server to Microsoft Purview.
-> * Scan the SQL Server.
-> * Browse your data catalog to view assets in your SQL Server.
-
-## Prerequisites
-
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Azure Key Vault](../key-vault/general/quick-create-portal.md).
-- A Microsoft Purview account. If you don't already have one, you can [follow our quickstart guide to create one](create-catalog-portal.md).
-- An [on-premises SQL Server](https://www.microsoft.com/sql-server/sql-server-downloads).
-## Sign in to the Microsoft Purview governance portal
-
-To interact with Microsoft Purview, you'll connect to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/). You can open the portal by:
-
-- Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
-- Opening the [Azure portal](https://portal.azure.com), searching for and selecting your Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-## Create a collection
-
-Collections in Microsoft Purview are used to organize assets and sources into a custom hierarchy for organization and discoverability. They're also the tool used to manage access across Microsoft Purview. In this tutorial, we'll create one collection to house your SQL Server source and all its assets. This tutorial won't cover information about assigning permissions to other users, so for more information you can follow our [Microsoft Purview permissions guide](catalog-permissions.md).
-
-### Check permissions
-
-To create and manage collections in Microsoft Purview, you'll need to be a **Collection Admin** within Microsoft Purview. You can check these permissions in the [Microsoft Purview governance portal](use-azure-purview-studio.md).
-
-1. Select **Data Map > Collections** from the left pane to open the collection management page.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/find-collections.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the Collections tab selected." border="true":::
-
-1. Select your root collection. The root collection is the top collection in your collection list and will have the same name as your Microsoft Purview account. In our example below, it is called Microsoft Purview Account.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-root-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the root collection highlighted." border="true":::
-
-1. Select **Role assignments** in the collection window.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/role-assignments.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-
-1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the Microsoft Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/collection-admins.png" alt-text="Screenshot of the Microsoft Purview governance portal window, opened to the Data Map, with the collection admin section highlighted." border="true":::
-
-### Create the collection
-
-1. Select **+ Add a collection**. Again, only [collection admins](#check-permissions) can manage collections.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-add-a-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, showing the new collection window, with the 'add a collection' buttons highlighted." border="true":::
-
-1. In the right panel, enter the collection name and description. If needed you can also add users or groups as collection admins to the new collection.
-1. Select **Create**.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/create-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
-
-1. The new collection's information will be reflected on the page.
-
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/created-collection.png" alt-text="Screenshot of the Microsoft Purview governance portal window, showing the newly created collection window." border="true":::
-
-## Create a self-hosted integration runtime
-
-The Self-Hosted Integration Runtime (SHIR) is the compute infrastructure used by Microsoft Purview to connect to on-premises data sources. The SHIR is downloaded and installed on a machine within the same network as the on-premises data source.
-
-This tutorial assumes the machine where you'll install your self-hosted integration runtime can make network connections to the internet. This connection allows the SHIR to communicate between your source and Microsoft Purview. If your machine has a restricted firewall, or if you would like to secure your firewall, look into the [network requirements for the self-hosted integration runtime](manage-integration-runtimes.md#networking-requirements).
-
-1. On the home page of the Microsoft Purview governance portal, select **Data Map** from the left navigation pane.
-
-1. Under **Source management** on the left pane, select **Integration runtimes**, and then select **+ New**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-integration-runtime.png" alt-text="Select the Integration Runtimes button.":::
-
-1. On the **Integration runtime setup** page, select **Self-Hosted** to create a Self-Hosted IR, and then select **Continue**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-self-hosted-ir.png" alt-text="Create new SHIR.":::
-
-1. Enter a name for your IR, and select **Create**.
-
-1. On the **Integration Runtime settings** page, follow the steps under the **Manual setup** section. You'll have to download the integration runtime from the download site onto a VM or machine that is in the same network as your on-premises SQL Server. For information about the kind of machine needed, you can follow our [guide to manage integration runtimes](manage-integration-runtimes.md#prerequisites).
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/integration-runtime-settings.png" alt-text="get key":::
-
- - Copy and paste the authentication key.
-
- - Download the self-hosted integration runtime from [Microsoft Integration Runtime](https://www.microsoft.com/download/details.aspx?id=39717) on a local Windows machine. Run the installer. Self-hosted integration runtime versions such as 5.4.7803.1 and 5.6.7795.1 are supported.
-
- - On the **Register Integration Runtime (Self-hosted)** page, paste one of the two keys you saved earlier, and select **Register**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/register-integration-runtime.png" alt-text="input key.":::
-
- - On the **New Integration Runtime (Self-hosted) Node** page, select **Finish**.
-
-1. After the Self-hosted integration runtime is registered successfully, you'll see this window:
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/successfully-registered.png" alt-text="successfully registered.":::
-
-## Set up authentication
-
-There are two ways to set up authentication for SQL server on-premises:
-
-- SQL Authentication
-- Windows Authentication
-This tutorial includes steps to use SQL authentication. For more information about scanning on-premises SQL Server with Windows authentication, see [Set up SQL server authentication](register-scan-on-premises-sql-server.md#set-up-sql-server-authentication).
-
-### SQL authentication
-
-The SQL account must have access to the **master** database, because the `sys.databases` catalog view is stored there. The Microsoft Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
-
-#### Create a new login and user
-
-If you would like to create a new login and user to be able to scan your SQL server, follow the steps below:
-
-> [!Note]
-> All the steps below can be executed using the code provided [here](https://github.com/Azure/Purview-Samples/blob/master/TSQL-Code-Permissions/grant-access-to-on-prem-sql-databases.sql).
-
-1. Open SQL Server Management Studio (SSMS), connect to the server, navigate to **Security**, select and hold (or right-click) **Logins**, and create a new login. Make sure to select SQL authentication.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/create-new-login-user.png" alt-text="Create new login and user.":::
-
-1. Select Server roles on the left navigation and ensure that public role is assigned.
-
-1. Select User mapping on the left navigation, select all the databases in the map and select the Database role: **db_datareader**.
-
-1. Select **OK** to save.
-
-1. Navigate again to the user you created, by selecting and holding (or right-clicking) on the user and selecting **Properties**. Enter a new password and confirm it. Select the 'Specify old password' and enter the old password. **It's required to change your password as soon as you create a new login.**
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/change-password.png" alt-text="change password.":::
-
-#### Create a Key Vault credential
-
-1. Navigate to your key vault in the Azure portal. Select **Settings > Secrets**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-secrets.png" alt-text="Select Secrets from Left Menu":::
-
-1. Select **+ Generate/Import**
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-generate-import.png" alt-text="Select Generate/Import from the top menu.":::
-
-1. For upload options, select **Manual** and enter a **Name** for your secret. The **Value** will be the *password* from your SQL server login. Ensure **Enabled** is set to **Yes**. If you set an activation and expiration date, ensure that today's date is between the two, or you won't be able to use the credential.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/create-credential-secret.png" alt-text="Add values to key vault credential.":::
-
-1. Select **Create** to complete.
-1. In the [Microsoft Purview governance portal](#sign-in-to-the-microsoft-purview-governance-portal), navigate to the **Management** page in the left menu.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-management.png" alt-text="Select Management page on left menu.":::
-
-1. Select the **Credentials** page.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-credentials.png" alt-text="The credentials button on the Management page is highlighted.":::
-
-1. From the **Credentials** page, select **Manage Key Vault connections**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/manage-key-vault-connections.png" alt-text="Manage Azure Key Vault connections.":::
-
-1. Select **+ New** from the Manage Key Vault connections page.
-
-1. Provide the required information, then select **Create**.
-
-1. Confirm that your Key Vault has been successfully associated with your Microsoft Purview account as shown in this example:
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/view-kv-connections.png" alt-text="View Azure Key Vault connections to confirm.":::
-
-1. Create a new Credential for SQL Server by selecting **+ New**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-new.png" alt-text="Select +New to create credential.":::
-
-1. Provide the required information. Select the **Authentication method** and a **Key Vault connection** from which to select a secret.
-
-1. Once all the details have been filled in, select **Create**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/new-credential.png" alt-text="New credential":::
-
-1. Verify that your new credential shows up in the list view and is ready to use.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/view-credentials.png" alt-text="View credential":::
-
-## Register SQL Server
-
-1. Open the Microsoft Purview governance portal by:
-
- - Browsing directly to [https://web.purview.azure.com](https://web.purview.azure.com) and selecting your Microsoft Purview account.
    - Opening the [Azure portal](https://portal.azure.com), searching for and selecting your Microsoft Purview account, and then selecting the [**Microsoft Purview governance portal**](https://web.purview.azure.com/) button.
-
-1. Under Sources and scanning in the left navigation, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it's not set up, follow the steps mentioned [here](manage-integration-runtimes.md) to create a self-hosted integration runtime for scanning on an on-premises or Azure VM that has access to your on-premises network.
-
-1. Select **Data Map** on the left navigation.
-
-1. Select **Register**
-
-1. Select **SQL server** and then **Continue**
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/set-up-sql-data-source.png" alt-text="Set up the SQL data source.":::
-
-1. Provide a friendly name and server endpoint and then select **Finish** to register the data source. If, for example, your SQL server FQDN is **foobar.database.windows.net**, then enter *foobar* as the server endpoint.
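The endpoint-shortening rule above can be expressed as a one-line helper (the function name is hypothetical):

```python
def server_endpoint_from_fqdn(fqdn: str) -> str:
    """Return the short server endpoint expected by the registration form:
    for "foobar.database.windows.net", enter just "foobar"."""
    return fqdn.split(".", 1)[0]
```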
-
-To create and run a new scan, do the following:
-
-1. Select the **Data Map** tab on the left pane in the Microsoft Purview governance portal.
-
-1. Select the SQL Server source that you registered.
-
-1. Select **New scan**
-
-1. Select the credential to connect to your data source.
-
-1. You can scope your scan to specific tables by choosing the appropriate items in the list.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/on-premises-sql-scope-your-scan.png" alt-text="Scope your scan":::
-
-1. Then select a scan rule set. You can choose between the system default, existing custom rule sets, or create a new rule set inline.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/on-premises-sql-scan-rule-set.png" alt-text="Scan rule set":::
-
-1. Choose your scan trigger. You can set up a schedule or run the scan once.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/trigger-scan.png" alt-text="trigger":::
-
-1. Review your scan and select **Save and run**.
--
-## Clean up resources
-
-If you're not going to continue to use this Microsoft Purview account or SQL source, you can follow the steps below to delete the integration runtime, SQL credential, and Microsoft Purview resources.
-
-### Remove SHIR from Microsoft Purview
-
-1. On the home page of [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
-
-1. Under **Source management** on the left pane, select **Integration runtimes**.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-integration-runtime.png" alt-text="Select the Integration Runtimes button.":::
-
-1. Select the checkbox next to your integration runtime, then select the **delete** button.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/delete-integration-runtime.png" alt-text="Check box next to integration runtime and delete button highlighted.":::
-
-1. Select **Confirm** on the next window to confirm the delete.
-
-1. The window will self-refresh and you should no longer see your SHIR listed under Integration runtimes.
-
-### Uninstall self-hosted integration runtime
-
-1. Sign in to the machine where your self-hosted integration runtime is installed.
-1. Open the Control Panel, and under *Uninstall a Program*, search for "Microsoft Integration Runtime".
-
-1. Uninstall the existing integration runtime.
-
-> [!IMPORTANT]
-> In the following process, select Yes. Do not keep data during the uninstallation process.
--
-### Remove SQL credentials
-
-1. Go to the [Azure portal](https://portal.azure.com) and navigate to the Key Vault resource where you stored your Microsoft Purview credentials.
-
-1. Under **Settings** in the left menu, select **Secrets**
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-secrets.png" alt-text="Select Secrets from Left Menu in Azure Key Vault.":::
-
-1. Select the SQL Server credential secret you created for this tutorial.
-1. Select **Delete**
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-delete-credential.png" alt-text="Delete Secret from top Menu in Azure Key Vault Secret.":::
-
-1. Select **Yes** to permanently delete the resource.
-
-### Delete Microsoft Purview account
-
-If you would like to delete your Microsoft Purview account after completing this tutorial, follow these steps.
-
-1. Go to the [Azure portal](https://portal.azure.com) and navigate to your Microsoft Purview account.
-
-1. At the top of the page, select the **Delete** button.
-
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-delete.png" alt-text="Delete button on the Microsoft Purview account page in the Azure portal is selected.":::
-
-1. When the process is complete, you'll receive a notification in the Azure portal.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Use Microsoft Purview REST APIs](tutorial-using-rest-apis.md)
purview Tutorial Using Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-using-python-sdk.md
- Title: "Tutorial: How to use Microsoft Purview Python SDK"
-description: This tutorial describes how to use the Microsoft Purview Python SDK to scan data and search the catalog.
- Previously updated: 05/27/2022
-# Customer intent: I can use the scanning and catalog Python SDKs to perform CRUD operations on data sources and scans, trigger scans and also to search the catalog.
--
-# Tutorial: Use the Microsoft Purview Python SDK
-
-This tutorial will introduce you to using the Microsoft Purview Python SDK. You can use the SDK to do all the most common Microsoft Purview operations programmatically, rather than through the Microsoft Purview governance portal.
-
-In this tutorial, you'll learn how to use the SDK to:
-
-> [!div class="checklist"]
->* Grant the required rights to work programmatically with Microsoft Purview
->* Register a Blob Storage container as a data source in Microsoft Purview
->* Define and run a scan
->* Search the catalog
->* Delete a data source
-
-## Prerequisites
-
-For this tutorial, you'll need:
-* [Python 3.6 or higher](https://www.python.org/downloads/)
-* An active Azure Subscription. [If you don't have one, you can create one for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An Azure Active Directory tenant associated with your subscription.
-* An Azure Storage account. If you don't already have one, you can [follow our quickstart guide to create one](../storage/common/storage-account-create.md).
-* A Microsoft Purview account. If you don't already have one, you can [follow our quickstart guide to create one](create-catalog-portal.md).
-* A [service principal](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal) with a [client secret](../active-directory/develop/howto-create-service-principal-portal.md#set-up-authentication).
-
-## Give Microsoft Purview access to the Storage account
-
-Before Microsoft Purview can scan the content of the Storage account, you need to grant it the right role.
-
-1. Go to your Storage Account through the [Azure portal](https://portal.azure.com).
-1. Select Access Control (IAM).
-1. Select the Add button and select **Add role assignment**.
-
- :::image type="content" source="media/tutorial-using-python-sdk/add-role-assignment-storage.png" alt-text="Screenshot of the Access Control menu in the Storage Account with the add button selected and then add role assignment selected.":::
-
-1. In the next window, search for the **Storage Blob Data Reader** role and select it:
-
- :::image type="content" source="media/tutorial-using-python-sdk/storage-blob-reader-role.png" alt-text="Screenshot of the add role assignment menu, with Storage Blob Data Reader selected from the list of available roles.":::
-
-1. Then go to the **Members** tab and select **Select members**:
-
- :::image type="content" source="media/tutorial-using-python-sdk/select-members-blob-reader-role.png" alt-text="Screenshot of the add role assignment menu with the + Select members button selected.":::
-
-1. A new pane appears on the right. Search and select the name of your existing Microsoft Purview instance.
-1. You can then select **Review + Assign**.
-
-Microsoft Purview now has the read access it needs to scan your Blob Storage.
-
-## Grant your application access to your Microsoft Purview account
-
-1. First, you'll need the Client ID, Tenant ID, and Client secret from your service principal. To find this information, select your **Azure Active Directory**.
-1. Then, select **App registrations**.
-1. Select your application and locate the required information:
- * Name
- * Client ID (or Application ID)
- * Tenant ID (or Directory ID)
-
- :::image type="content" source="media/tutorial-using-python-sdk/app-registration-info.png" alt-text="Screenshot of the service principal page in the Azure portal with the Client ID and Tenant ID highlighted.":::
- * [Client secret](../active-directory/develop/howto-create-service-principal-portal.md#set-up-authentication)
-
- :::image type="content" source="media/tutorial-using-python-sdk/get-service-principal-secret.png" alt-text="Screenshot of the service principal page in the Azure portal, with the Certificates & secrets tab selected, showing the available client certificates and secrets.":::
-
-1. You now need to give the relevant Microsoft Purview roles to your service principal. To do so, access your Microsoft Purview instance. Select **Open Microsoft Purview governance portal** or open the [Microsoft Purview governance portal](https://web.purview.azure.com/) directly and choose the instance that you deployed.
-
-1. Inside the Microsoft Purview governance portal, select **Data map**, then **Collections**:
-
- :::image type="content" source="media/tutorial-using-python-sdk/purview-collections.png" alt-text="Screenshot of the Microsoft Purview governance portal left menu. The data map tab is selected, then the collections tab is selected.":::
-
-1. Select the collection you want to work with, and go to the **Role assignments** tab. Add the service principal to the following roles:
- * Collection admins
- * Data source admins
- * Data curators
- * Data readers
-
-1. For each role, select the **Edit role assignments** button and select the role you want to add the service principal to. Or select the **Add** button next to each role, and add the service principal by searching its name or Client ID as shown below:
-
- :::image type="content" source="media/tutorial-using-python-sdk/add-role-purview.png" alt-text="Screenshot of the Role assignments menu under a collection in the Microsoft Purview governance portal. The add user button is select next to the Collection admins tab. The add or remove collection admins pane is shown, with a search for the service principal in the text box.":::
-
-## Install the Python packages
-
-1. Open a new command prompt or terminal
-1. Install the Azure identity package for authentication:
- ```bash
- pip install azure-identity
- ```
-1. Install the Microsoft Purview Scanning Client package:
- ```bash
- pip install azure-purview-scanning
- ```
-1. Install the Microsoft Purview Administration Client package:
- ```bash
- pip install azure-purview-administration
- ```
-1. Install the Microsoft Purview Client package:
- ```bash
- pip install azure-purview-catalog
- ```
-1. Install the Microsoft Purview Account package:
- ```bash
- pip install azure-purview-account
- ```
-1. Install the Azure Core package:
- ```bash
- pip install azure-core
- ```
-
-## Create Python script file
-
-Create a plain text file, and save it as a Python script with the suffix .py.
-For example: tutorial.py.
-
-## Instantiate a Scanning, Catalog, and Administration client
-
-In this section, you learn how to instantiate:
-* A scanning client, useful for registering data sources, creating and managing scan rules, triggering scans, and so on.
-* A catalog client, useful for interacting with the catalog through searching, browsing the discovered assets, identifying the sensitivity of your data, and so on.
-* An administration client, useful for interacting with the Microsoft Purview Data Map itself, for operations like listing collections.
-
-First you need to authenticate to your Azure Active Directory. For this, you'll use the [client secret you created](../active-directory/develop/howto-create-service-principal-portal.md#option-3-create-a-new-application-secret).
--
-1. Start with required import statements: our three clients, the credentials statement, and an Azure exceptions statement.
- ```python
- from azure.purview.scanning import PurviewScanningClient
- from azure.purview.catalog import PurviewCatalogClient
- from azure.purview.administration.account import PurviewAccountClient
- from azure.identity import ClientSecretCredential
- from azure.core.exceptions import HttpResponseError
- ```
-
-1. Specify the following information in the code:
- * Client ID (or Application ID)
- * Tenant ID (or Directory ID)
- * Client secret
-
- ```python
- client_id = "<your client id>"
- client_secret = "<your client secret>"
- tenant_id = "<your tenant id>"
- ```
-
-1. You also need to specify the name of your Microsoft Purview account:
-
- ```python
- reference_name_purview = "<name of your Microsoft Purview account>"
- ```
-1. You can now instantiate the three clients:
-
- ```python
- def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
- def get_purview_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
- def get_catalog_client():
- credentials = get_credentials()
- client = PurviewCatalogClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
- def get_admin_client():
- credentials = get_credentials()
- client = PurviewAccountClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
- ```
-
-Many of our scripts will start with these same steps, as we'll need these clients to interact with the account.
-
-## Register a data source
-
-In this section, you'll register your Blob Storage.
-
-1. As discussed in the previous section, first import the clients you'll need to access your Microsoft Purview account. Also import the Azure error response package so you can troubleshoot, and ClientSecretCredential to construct your Azure credentials.
-
- ```python
- from azure.purview.administration.account import PurviewAccountClient
- from azure.purview.scanning import PurviewScanningClient
- from azure.core.exceptions import HttpResponseError
- from azure.identity import ClientSecretCredential
- ```
-
-1. Gather the resource ID for your storage account by following this guide: [get the resource ID for a storage account.](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account)
-
-1. Then, in your Python file, define the following information to be able to register the Blob storage programmatically:
-
- ```python
- storage_name = "<name of your Storage Account>"
- storage_id = "<id of your Storage Account>"
- rg_name = "<name of your resource group>"
- rg_location = "<location of your resource group>"
- reference_name_purview = "<name of your Microsoft Purview account>"
- ```
-
-1. Provide the name of the collection where you'd like to register your blob storage. (It should be the same collection where you applied permissions earlier. If it isn't, first apply permissions to this collection.) If it's the root collection, use the same name as your Microsoft Purview instance.
-
- ```python
- collection_name = "<name of your collection>"
- ```
-
-1. Create a function to construct the credentials to access your Microsoft Purview account:
-
- ```python
- client_id = "<your client id>"
- client_secret = "<your client secret>"
- tenant_id = "<your tenant id>"
-
- def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
- ```
-
-1. All collections in the Microsoft Purview Data Map have a **friendly name** and a **name**.
- * The **friendly name** is the one you see on the collection. For example: Sales.
- * The **name** for all collections (except the root collection) is a six-character name assigned by the data map.
-
- Python needs this six-character name to reference any sub collections. To convert your **friendly name** automatically to the six-character collection name needed in your script, add this block of code:
-
- ```python
- def get_admin_client():
- credentials = get_credentials()
- client = PurviewAccountClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
- try:
- admin_client = get_admin_client()
- except ValueError as e:
- print(e)
-
- collection_list = admin_client.collections.list_collections()
- for collection in collection_list:
- if collection["friendlyName"].lower() == collection_name.lower():
- collection_name = collection["name"]
- ```
-
-1. For both clients, and depending on the operations, you also need to provide an input body. To register a source, you'll need to provide an input body for data source registration:
-
- ```python
- ds_name = "<friendly name for your data source>"
-
- body_input = {
- "kind": "AzureStorage",
- "properties": {
- "endpoint": f"https://{storage_name}.blob.core.windows.net/",
- "resourceGroup": rg_name,
- "location": rg_location,
- "resourceName": storage_name,
- "resourceId": storage_id,
- "collection": {
- "type": "CollectionReference",
- "referenceName": collection_name
- },
- "dataUseGovernance": "Disabled"
- }
- }
- ```
-
-1. Now you can call your Microsoft Purview clients and register the data source.
-
- ```python
- def get_purview_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
- try:
- client = get_purview_client()
- except ValueError as e:
- print(e)
-
- try:
- response = client.data_sources.create_or_update(ds_name, body=body_input)
- print(response)
- print(f"Data source {ds_name} successfully created or updated")
- except HttpResponseError as e:
- print(e)
- ```
-
-When the registration process succeeds, you can see an enriched body response from the client.
-
-In the following sections, you'll scan the data source you registered and search the catalog. Each of these scripts will be similarly structured to this registration script.
-
-### Full code
-
-```python
-from azure.purview.scanning import PurviewScanningClient
-from azure.identity import ClientSecretCredential
-from azure.core.exceptions import HttpResponseError
-from azure.purview.administration.account import PurviewAccountClient
-
-client_id = "<your client id>"
-client_secret = "<your client secret>"
-tenant_id = "<your tenant id>"
-reference_name_purview = "<name of your Microsoft Purview account>"
-storage_name = "<name of your Storage Account>"
-storage_id = "<id of your Storage Account>"
-rg_name = "<name of your resource group>"
-rg_location = "<location of your resource group>"
-collection_name = "<name of your collection>"
-ds_name = "<friendly data source name>"
-
-def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
-def get_purview_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
-def get_admin_client():
- credentials = get_credentials()
- client = PurviewAccountClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
-try:
- admin_client = get_admin_client()
-except ValueError as e:
- print(e)
-
-collection_list = admin_client.collections.list_collections()
-for collection in collection_list:
- if collection["friendlyName"].lower() == collection_name.lower():
- collection_name = collection["name"]
-
-body_input = {
- "kind": "AzureStorage",
- "properties": {
- "endpoint": f"https://{storage_name}.blob.core.windows.net/",
- "resourceGroup": rg_name,
- "location": rg_location,
- "resourceName": storage_name,
- "resourceId": storage_id,
- "collection": {
- "type": "CollectionReference",
- "referenceName": collection_name
- },
- "dataUseGovernance": "Disabled"
- }
-}
-
-try:
- client = get_purview_client()
-except ValueError as e:
- print(e)
-
-try:
- response = client.data_sources.create_or_update(ds_name, body=body_input)
- print(response)
- print(f"Data source {ds_name} successfully created or updated")
-except HttpResponseError as e:
- print(e)
-```
-
-## Scan the data source
-
-Scanning a data source can be done in two steps:
-
-1. Create a scan definition
-1. Trigger a scan run
-
-In this tutorial, you'll use the default scan rules for Blob Storage containers. However, you can also [create custom scan rules programmatically with the Microsoft Purview Scanning Client](/python/api/azure-purview-scanning/azure.purview.scanning.operations.scanrulesetsoperations).
-
-Now let's scan the data source you registered above.
-
-1. Add import statements for generating a [unique identifier](https://en.wikipedia.org/wiki/Universally_unique_identifier), the Microsoft Purview scanning client, the Microsoft Purview administration client, the Azure error response package for troubleshooting, and the client secret credential for gathering your Azure credentials.
-
- ```python
- import uuid
- from azure.purview.scanning import PurviewScanningClient
- from azure.purview.administration.account import PurviewAccountClient
- from azure.core.exceptions import HttpResponseError
- from azure.identity import ClientSecretCredential
- ```
-
-1. Create a scanning client using your credentials:
-
- ```python
- client_id = "<your client id>"
- client_secret = "<your client secret>"
- tenant_id = "<your tenant id>"
-
- def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
- def get_purview_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
- try:
- client = get_purview_client()
- except ValueError as e:
- print(e)
- ```
-
-1. Add the code to gather the internal name of your collection. (For more information, see the previous section):
-
- ```python
- collection_name = "<name of the collection where you will be creating the scan>"
-
- def get_admin_client():
- credentials = get_credentials()
- client = PurviewAccountClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
- try:
- admin_client = get_admin_client()
- except ValueError as e:
- print(e)
-
- collection_list = admin_client.collections.list_collections()
- for collection in collection_list:
- if collection["friendlyName"].lower() == collection_name.lower():
- collection_name = collection["name"]
- ```
-
-1. Then, create a scan definition:
-
- ```python
- ds_name = "<name of your registered data source>"
- scan_name = "<name of the scan you want to define>"
- reference_name_purview = "<name of your Microsoft Purview account>"
-
- body_input = {
- "kind":"AzureStorageMsi",
- "properties": {
- "scanRulesetName": "AzureStorage",
- "scanRulesetType": "System", #We use the default scan rule set
- "collection":
- {
- "referenceName": collection_name,
- "type": "CollectionReference"
- }
- }
- }
-
- try:
- response = client.scans.create_or_update(data_source_name=ds_name, scan_name=scan_name, body=body_input)
- print(response)
- print(f"Scan {scan_name} successfully created or updated")
- except HttpResponseError as e:
- print(e)
- ```
-
-1. Now that the scan is defined you can trigger a scan run with a unique ID:
-
- ```python
- run_id = uuid.uuid4() #unique id of the new scan
-
- try:
- response = client.scan_result.run_scan(data_source_name=ds_name, scan_name=scan_name, run_id=run_id)
- print(response)
- print(f"Scan {scan_name} successfully started")
- except HttpResponseError as e:
- print(e)
- ```
-
-### Full code
-
-```python
-import uuid
-from azure.purview.scanning import PurviewScanningClient
-from azure.purview.administration.account import PurviewAccountClient
-from azure.identity import ClientSecretCredential
-from azure.core.exceptions import HttpResponseError
-
-ds_name = "<name of your registered data source>"
-scan_name = "<name of the scan you want to define>"
-reference_name_purview = "<name of your Microsoft Purview account>"
-client_id = "<your client id>"
-client_secret = "<your client secret>"
-tenant_id = "<your tenant id>"
-collection_name = "<name of the collection where you will be creating the scan>"
-
-def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
-def get_purview_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
-def get_admin_client():
- credentials = get_credentials()
- client = PurviewAccountClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
-try:
- admin_client = get_admin_client()
-except ValueError as e:
- print(e)
-
-collection_list = admin_client.collections.list_collections()
-for collection in collection_list:
- if collection["friendlyName"].lower() == collection_name.lower():
- collection_name = collection["name"]
-
-try:
- client = get_purview_client()
-except ValueError as e:
- print(e)
-
-body_input = {
- "kind":"AzureStorageMsi",
- "properties": {
- "scanRulesetName": "AzureStorage",
- "scanRulesetType": "System",
- "collection": {
- "type": "CollectionReference",
- "referenceName": collection_name
- }
- }
-}
-
-try:
- response = client.scans.create_or_update(data_source_name=ds_name, scan_name=scan_name, body=body_input)
- print(response)
- print(f"Scan {scan_name} successfully created or updated")
-except HttpResponseError as e:
- print(e)
-
-run_id = uuid.uuid4() #unique id of the new scan
-
-try:
- response = client.scan_result.run_scan(data_source_name=ds_name, scan_name=scan_name, run_id=run_id)
- print(response)
- print(f"Scan {scan_name} successfully started")
-except HttpResponseError as e:
- print(e)
-```
-
-## Search catalog
-
-Once a scan is complete, it's likely that assets have been discovered and even classified. This process can take some time to complete after a scan, so you may need to wait before running this next portion of code. Wait for your scan to show **completed**, and the assets to appear in the Microsoft Purview Data Catalog.
-
-Once the assets are ready, you can use the Microsoft Purview Catalog client to search the whole catalog.
-
-1. This time you need to import the **catalog** client instead of the scanning one. Also include HttpResponseError and ClientSecretCredential.
-
- ```python
- from azure.purview.catalog import PurviewCatalogClient
- from azure.identity import ClientSecretCredential
- from azure.core.exceptions import HttpResponseError
- ```
-
-1. Create a function to get the credentials to access your Microsoft Purview account, and instantiate the catalog client.
-
- ```python
- client_id = "<your client id>"
- client_secret = "<your client secret>"
- tenant_id = "<your tenant id>"
- reference_name_purview = "<name of your Microsoft Purview account>"
-
- def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
- def get_catalog_client():
- credentials = get_credentials()
- client = PurviewCatalogClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
- try:
- client_catalog = get_catalog_client()
- except ValueError as e:
- print(e)
- ```
-
-1. Configure your search criteria and keywords in the input body:
-
- ```python
- keywords = "keywords you want to search"
-
- body_input={
- "keywords": keywords
- }
- ```
-
- Here you only specify keywords, but keep in mind [you can add many other fields to further specify your query](/python/api/azure-purview-catalog/azure.purview.catalog.operations.discoveryoperations#azure-purview-catalog-operations-discoveryoperations-query).
-
-1. Search the catalog:
-
- ```python
- try:
- response = client_catalog.discovery.query(search_request=body_input)
- print(response)
- except HttpResponseError as e:
- print(e)
- ```
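Immediately after a scan, the query may legitimately return zero results while discovery is still running. If you want to block until assets appear, you could wrap the query in a small polling helper. This is a sketch under the assumption that the response dict exposes its result count as `@search.count`; `wait_for_assets` and its parameters are hypothetical names, not part of the SDK:

```python
import time

def wait_for_assets(search_fn, max_attempts=10, delay_seconds=30):
    """Poll a zero-argument search callable until it reports at least one asset.

    `search_fn` should return the discovery query response as a dict;
    `@search.count` is assumed to be the result-count field.
    Returns the first non-empty response, or None if attempts run out.
    """
    for _ in range(max_attempts):
        result = search_fn()
        if result and result.get("@search.count", 0) > 0:
            return result
        time.sleep(delay_seconds)
    return None
```

For example, once the catalog client and search body from the step above exist, you could call `wait_for_assets(lambda: client_catalog.discovery.query(search_request=body_input))`.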
-
-### Full code
-
-```python
-from azure.purview.catalog import PurviewCatalogClient
-from azure.identity import ClientSecretCredential
-from azure.core.exceptions import HttpResponseError
-
-client_id = "<your client id>"
-client_secret = "<your client secret>"
-tenant_id = "<your tenant id>"
-reference_name_purview = "<name of your Microsoft Purview account>"
-keywords = "<keywords you want to search for>"
-
-def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
-def get_catalog_client():
- credentials = get_credentials()
- client = PurviewCatalogClient(endpoint=f"https://{reference_name_purview}.purview.azure.com/", credential=credentials, logging_enable=True)
- return client
-
-body_input={
- "keywords": keywords
-}
-
-try:
- catalog_client = get_catalog_client()
-except ValueError as e:
- print(e)
-
-try:
- response = catalog_client.discovery.query(search_request=body_input)
- print(response)
-except HttpResponseError as e:
- print(e)
-```
-
-## Delete a data source
-
-In this section, you'll learn how to delete the data source you registered earlier. This operation is fairly simple, and is done with the scanning client.
-
-1. Import the **scanning** client. Also include HttpResponseError and ClientSecretCredential.
-
- ```python
- from azure.purview.scanning import PurviewScanningClient
- from azure.identity import ClientSecretCredential
- from azure.core.exceptions import HttpResponseError
- ```
-
-1. Create a function to get the credentials to access your Microsoft Purview account, and instantiate the scanning client.
-
- ```python
- client_id = "<your client id>"
- client_secret = "<your client secret>"
- tenant_id = "<your tenant id>"
- reference_name_purview = "<name of your Microsoft Purview account>"
-
- def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
- def get_scanning_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
- try:
- client_scanning = get_scanning_client()
- except ValueError as e:
- print(e)
- ```
-
-1. Delete the data source:
-
- ```python
- ds_name = "<name of the registered data source you want to delete>"
- try:
- response = client_scanning.data_sources.delete(ds_name)
- print(response)
- print(f"Data source {ds_name} successfully deleted")
- except HttpResponseError as e:
- print(e)
- ```
-
-### Full code
-
-```python
-from azure.purview.scanning import PurviewScanningClient
-from azure.identity import ClientSecretCredential
-from azure.core.exceptions import HttpResponseError
-
-client_id = "<your client id>"
-client_secret = "<your client secret>"
-tenant_id = "<your tenant id>"
-reference_name_purview = "<name of your Microsoft Purview account>"
-ds_name = "<name of the registered data source you want to delete>"
-
-def get_credentials():
- credentials = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)
- return credentials
-
-def get_scanning_client():
- credentials = get_credentials()
- client = PurviewScanningClient(endpoint=f"https://{reference_name_purview}.scan.purview.azure.com", credential=credentials, logging_enable=True)
- return client
-
-try:
- client_scanning = get_scanning_client()
-except ValueError as e:
- print(e)
-
-try:
- response = client_scanning.data_sources.delete(ds_name)
- print(response)
- print(f"Data source {ds_name} successfully deleted")
-except HttpResponseError as e:
- print(e)
-```
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Learn more about the Python Microsoft Purview Scanning Client](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-purview-scanning/1.0.0b2/index.html)
-> [Learn more about the Python Microsoft Purview Catalog Client](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-purview-catalog/1.0.0b2/index.html)
purview Tutorial Using Rest Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-using-rest-apis.md
- Title: "How to use REST APIs for Microsoft Purview Data Planes"
-description: This tutorial describes how to use the Microsoft Purview REST APIs to access the contents of your Microsoft Purview account.
-Previously updated : 12/06/2022
-# Customer intent: I can call the Data plane REST APIs to perform CRUD operations on a Microsoft Purview account.
-
-# Tutorial: Use the REST APIs
-
-In this tutorial, you learn how to use the Microsoft Purview REST APIs. Anyone who wants to submit data to Microsoft Purview, include Microsoft Purview as part of an automated process, or build their own user experience on Microsoft Purview can use the REST APIs to do so.
-
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-
-## Prerequisites
-
-* To get started, you must have an existing Microsoft Purview account. If you don't have a catalog, see the [quickstart for creating a Microsoft Purview account](create-catalog-portal.md).
-
-## Create a service principal (application)
-
-For a REST API client to access the catalog, the client must have a service principal (application), and an identity that the catalog recognizes and is configured to trust. When you make REST API calls to the catalog, they use the service principal's identity.
-
-Customers who have used existing service principals (application IDs) have had a high rate of failure. Therefore, we recommend creating a new service principal for calling APIs.
-
-To create a new service principal:
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. From the portal, search for and select **Azure Active Directory**.
-1. From the **Azure Active Directory** page, select **App registrations** from the left pane.
-1. Select **New registration**.
-1. On the **Register an application** page:
- 1. Enter a **Name** for the application (the service principal name).
- 1. Select **Accounts in this organizational directory only (_&lt;your tenant's name&gt;_ only - Single tenant)**.
- 1. For **Redirect URI (optional)**, select **Web** and enter a value. This value doesn't need to be a valid endpoint. `https://exampleURI.com` will do.
- 1. Select **Register**.
-
- :::image type="content" source="./media/tutorial-using-rest-apis/application-registration.png" alt-text="Screenshot of the application registration page, with the above options filled out.":::
-
-1. On the new service principal page, copy the values of the **Display name** and the **Application (client) ID** to save for later.
-
- The application ID is the `client_id` value in the sample code.
-
- :::image type="content" source="./media/tutorial-using-rest-apis/application-id.png" alt-text="Screenshot of the application page in the portal with the Application (client) ID highlighted.":::
-
-To use the service principal (application), you need to know the service principal's password, which you can find as follows:
-
-1. From the Azure portal, search for and select **Azure Active Directory**, and then select **App registrations** from the left pane.
-1. Select your service principal (application) from the list.
-1. Select **Certificates & secrets** from the left pane.
-1. Select **New client secret**.
-1. On the **Add a client secret** page, enter a **Description**, select an expiration time under **Expires**, and then select **Add**.
-
- On the **Client secrets** page, the string in the **Value** column of your new secret is your password. Save this value.
-
- :::image type="content" source="./media/tutorial-using-rest-apis/client-secret.png" alt-text="Screenshot showing a client secret.":::
-
-## Set up authentication using service principal
-
-Once the new service principal is created, you need to assign your Microsoft Purview account's data plane roles to it. Follow the steps below to assign the correct role and establish trust between the service principal and the Microsoft Purview account:
-
-1. Navigate to your [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-1. Select the Data Map in the left menu.
-1. Select Collections.
-1. Select the root collection in the collections menu. This will be the top collection in the list, and will have the same name as your Microsoft Purview account.
-
- >[!NOTE]
- >You can also assign your service principal permission to any sub-collections, instead of the root collection. However, all APIs will be scoped to that collection (and sub-collections that inherit permissions), and users trying to call the API for another collection will get errors.
-
-1. Select the **Role assignments** tab.
-
-1. Assign the following roles to the service principal created previously to access various data planes in Microsoft Purview. For detailed steps, see [Assign Azure roles using the Microsoft Purview governance portal](./how-to-create-and-manage-collections.md#add-role-assignments).
-
- * Data Curator role to access Catalog Data plane.
- * Data Source Administrator role to access Scanning Data plane.
- * Collection Admin role to access Account Data Plane and Metadata policy Data Plane.
-
- > [!Note]
- > Only members of the Collection Admin role can assign data plane roles in Microsoft Purview. For more information about Microsoft Purview roles, see [Access Control in Microsoft Purview](./catalog-permissions.md).
-
-## Get token
-
-You can send a POST request to the following URL to get access token.
-
-`https://login.microsoftonline.com/{your-tenant-id}/oauth2/token`
-
-You can find your Tenant ID by searching for **Tenant Properties** in the Azure portal. The ID will be available on the tenant properties page.
-
-The following parameters need to be passed to the above URL:
-
-* **client_id**: The client ID of the application registered in Azure Active Directory, which is assigned to a data plane role for the Microsoft Purview account.
-* **client_secret**: The client secret created for the above application.
-* **grant_type**: This should be 'client_credentials'.
-* **resource**: This should be 'https://purview.azure.net'.
-
-Here's a sample POST request in PowerShell:
-
-```azurepowershell
-$tenantID = "12a345bc-67d1-ef89-abcd-efg12345abcde"
-
-$url = "https://login.microsoftonline.com/$tenantID/oauth2/token"
-$params = @{ client_id = "a1234bcd-5678-9012-abcd-abcd1234abcd"; client_secret = "abcd~a1234bcd56789012abcdabcd1234abcd"; grant_type = "client_credentials"; resource = "https://purview.azure.net" }
-
-Invoke-WebRequest $url -Method Post -Body $params -UseBasicParsing | ConvertFrom-Json
-```
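Since the rest of this tutorial series uses Python, here's an equivalent token request built with only the standard library. This is a sketch: `build_token_request` and `get_token` are hypothetical helper names, and the IDs you pass in are placeholders:

```python
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret):
    """Return the token endpoint URL and form fields for the client-credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    fields = {
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "resource": "https://purview.azure.net",
    }
    return url, fields

def get_token(tenant_id, client_id, client_secret):
    """POST the form fields and return the bearer token from the JSON response."""
    url, fields = build_token_request(tenant_id, client_id, client_secret)
    data = urllib.parse.urlencode(fields).encode("utf-8")
    with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
        return json.load(resp)["access_token"]
```

Calling `get_token(...)` sends the same POST as the PowerShell sample above and returns only the `access_token` value.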
-
-Sample response token:
-
-```json
- {
- "token_type": "Bearer",
- "expires_in": "86399",
- "ext_expires_in": "86399",
- "expires_on": "1621038348",
- "not_before": "1620951648",
- "resource": "https://purview.azure.net",
- "access_token": "<<access token>>"
- }
-```
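Note that `expires_on` in the response is a Unix epoch, returned as a string. A small helper (hypothetical, with an assumed clock-skew margin) can tell you when it's time to request a fresh token:

```python
import time

def token_expired(token_response, skew_seconds=300):
    """Return True when the token's `expires_on` epoch has passed.

    `token_response` is the parsed JSON shown above; a skew margin is
    subtracted so the token is refreshed slightly before real expiry.
    """
    return time.time() >= int(token_response["expires_on"]) - skew_seconds
```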
-
-> [!TIP]
-> If you get an error message that reads: *Cross-origin token redemption is permitted only for the 'Single-Page Application' client-type.*
-> * Check your request headers and confirm that your request **doesn't** contain the 'origin' header.
-> * Confirm that your redirect URI is set to **web** in your service principal.
-> * If you are using an application like Postman, make sure your software is up to date.
-
-Use the access token above to call the Data plane APIs.
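As a sketch of what a data plane call looks like, the following builds an authenticated request against the catalog search endpoint using only the standard library. The path and API version here are assumptions based on the data plane REST reference, and `build_search_request` is a hypothetical helper; substitute the operation you actually need:

```python
import json
import urllib.request

def build_search_request(account_name, access_token, keywords):
    """Build a POST request to the catalog discovery query endpoint with a bearer token."""
    url = (
        f"https://{account_name}.purview.azure.com"
        "/catalog/api/search/query?api-version=2022-03-01-preview"
    )
    body = json.dumps({"keywords": keywords}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To send it (network call, shown but not executed here):
# with urllib.request.urlopen(build_search_request("myaccount", token, "sales")) as resp:
#     results = json.load(resp)
```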
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Manage data sources](manage-data-sources.md)
-> [Microsoft Purview Data Plane REST APIs](/rest/api/purview/)
purview Use Microsoft Purview Governance Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/use-microsoft-purview-governance-portal.md
- Title: Use the Microsoft Purview governance portal
-description: This article describes how to use the Microsoft Purview governance portal.
-Previously updated : 02/13/2023
-
-# Use the Microsoft Purview governance portal
-
-This article gives an overview of some of the main features of Microsoft Purview.
-
-## Prerequisites
-
-* An active Microsoft Purview account already created in the Azure portal
-* The user has permissions to access [the Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
-
-## Launch Microsoft Purview account
-
-* You can launch the Microsoft Purview account directly by going to `https://web.purview.azure.com`, selecting **Azure Active Directory** and the account name, or by going to `https://web.purview.azure.com/resource/yourpurviewaccountname`.
-
-* To launch your Microsoft Purview account from the [Azure portal](https://portal.azure.com), go to **Microsoft Purview accounts**, select the account you want, and then select the **Microsoft Purview governance portal** button.
-
- :::image type="content" source="./media/use-purview-studio/open-purview-studio.png" alt-text="Screenshot of Microsoft Purview window in Azure portal, with the Microsoft Purview governance portal button highlighted." border="true":::
-
->[!TIP]
->If you can't access the portal, [confirm you have the necessary permissions](catalog-permissions.md#permissions-to-access-the-microsoft-purview-governance-portal).
-
-## Home page
-
-**Home** is the starting page for the Microsoft Purview client.
-
-The following list summarizes the main features of **Home page**. Each number in the list corresponds to a highlighted number in the preceding screenshot.
-
-1. Friendly name of the catalog. You can set catalog name in **Management** > **Account information**.
-
-2. Catalog analytics shows the number of:
-
- * Data sources
- * Assets
- * Glossary terms
-
-3. The search box allows you to search for data assets across the data catalog.
-
-4. The quick access buttons give access to frequently used functions of the application. The buttons that are presented, depend on the role assigned to your user account at the root collection.
-
- * For *collection admin*, the available button is **Knowledge center**.
- * For *data curator*, the buttons are **Browse assets**, **Manage glossary**, and **Knowledge center**.
- * For *data reader*, the buttons are **Browse assets**, **View glossary**, and **Knowledge center**.
- * For *data source admin* + *data curator*, the buttons are **Browse assets**, **Manage glossary**, and **Knowledge center**.
- * For *data source admin* + *data reader*, the buttons are **Browse assets**, **View glossary**, and **Knowledge center**.
-
- > [!NOTE]
- > For more information about Microsoft Purview roles, see [Access control in Microsoft Purview](catalog-permissions.md).
-
-5. The left navigation bar helps you locate the main pages of the application.
-6. The **Recently accessed** tab shows a list of recently accessed data assets. For information about accessing assets, see [Search the Data Catalog](how-to-search-catalog.md) and [Browse by asset type](how-to-browse-catalog.md). The **My items** tab lists data assets owned by the signed-in user.
-7. **Links** contains links to region status, documentation, pricing, overview, and Microsoft Purview status.
-8. The top navigation bar contains the release notes/updates, Purview account switching, notifications, help, and feedback sections.
-
-## Knowledge center
-
-Knowledge center is where you can find all the videos and tutorials related to Microsoft Purview.
-
-## Localization
-
-Microsoft Purview is localized in 18 languages. To change the language used, go to the **Settings** from the top bar and select the desired language from the dropdown.
-
-> [!NOTE]
-> Only generally available features are localized. Features still in preview are in English regardless of which language is selected.
-
-## Guided tours
-
-Each page in the Microsoft Purview governance portal has a guided tour that gives an overview of the page. To start the guided tour, select **Help** on the top bar, and then select **Guided tours**.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Add a security principal](tutorial-scan-data.md)
remote-rendering Convert Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/remote-rendering/quickstarts/convert-model.md
You need:
## Azure setup
-If you don't have an account yet, go to [https://azure.microsoft.com/get-started/](https://azure.microsoft.com/get-started/), select the free account option, and follow the instructions.
+If you don't have an account yet, go to [Get started with Azure](https://azure.microsoft.com/get-started/), select the free account option, and follow the instructions.
-Once you have an Azure account, go to [https://portal.azure.com/#home](https://portal.azure.com/#home).
+Once you have an Azure account, sign in to the [Azure portal](https://portal.azure.com).
### Storage account creation
The conversion script generates a *Shared Access Signature (SAS)* URI for the co
The SAS URI created by the conversion script expires after 24 hours. However, after it expires you don't need to convert your model again. Instead, you can create a new SAS in the portal as described in the next steps:
-1. Go to the [Azure portal](https://www.portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Select your **Storage account** resource: ![Screenshot that highlights the selected Storage account resource.](./media/portal-storage-accounts.png)
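Because the generated SAS URI encodes its expiry time in the `se` (signed expiry) query parameter, you can check locally whether it has already expired before deciding to create a new one. A minimal sketch (the URI below is a placeholder, not a real SAS):

```shell
# Extract the 'se' (signed expiry) parameter from a SAS URI (placeholder URI).
sas_uri='https://example.blob.core.windows.net/container/model.arrAsset?sv=2021-08-06&se=2023-07-23T01%3A32%3A06Z&sig=abc'
expiry=$(printf '%s' "$sas_uri" | sed -n 's/.*[?&]se=\([^&]*\).*/\1/p')
# URL-decode the escaped colons to get an ISO 8601 timestamp.
expiry=$(printf '%s' "$expiry" | sed 's/%3A/:/g; s/%3a/:/g')
echo "SAS expires at: $expiry"
```

Comparing this timestamp against the current UTC time tells you whether a fresh SAS is needed.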
role-based-access-control Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/role-based-access-control/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure RBAC description: Lists Azure Policy Regulatory Compliance controls available for Azure role-based access control (Azure RBAC). These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
sap Acss Backup Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/center-sap-solutions/acss-backup-integration.md
Before you can go ahead and use this feature in preview, register for it from th
## Register for Backup integration preview feature Before you can configure Backup from the VIS resource, or view Backup status on the VIS resource if Backup is already configured, you need to register for the Backup integration feature in Azure Center for SAP solutions. Follow these steps to register for the feature:
-1. Sign into the [Azure portal](https://portal.azure.com) as a user with **Contributor** role access.
+1. Sign in to the [Azure portal](https://portal.azure.com) as a user with **Contributor** role access.
2. Search for **ACSS** and select **Azure Center for SAP solutions** from search results. 3. On the left navigation, select **Virtual Instance for SAP solutions**. 4. Select the **Backup (preview)** tab on the left navigation.
Before you can start configuring Backup from the VIS resource or viewing Backup
## Configure Backup for your SAP system You can configure Backup for your Central service and Application server virtual machines and HANA database from the Virtual Instance for SAP solutions resource by following these steps:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **ACSS** and select **Azure Center for SAP solutions** from search results. 3. On the left navigation, select **Virtual Instance for SAP solutions**. 4. Select the **Backup (preview)** tab on the left navigation.
You can configure Backup for your Central service and Application server virtual
After you configure Backup for the Virtual Machines and HANA Database of your SAP system either from the Virtual Instance for SAP solutions resource or from the Backup Center, you can monitor the status of Backup from the Virtual Instance for SAP solutions resource. To monitor Backup status:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for **ACSS** and select **Azure Center for SAP solutions** from search results. 3. On the left navigation, select **Virtual Instance for SAP solutions**. 4. Select the **Backup (preview)** tab on the left navigation.
sap Quickstart Register System Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/center-sap-solutions/quickstart-register-system-cli.md
To register an existing SAP system in Azure Center for SAP solutions:
--sap-product s4hana \ --central-server-vm <Virtual Machine resource ID> \ --identity "{type:UserAssigned,userAssignedIdentities:{<Managed Identity resource ID>:{}}}" \
+ --managed-rg-name "acss-C36" \
```
+ - **g** is used to specify the name of the existing resource group into which you want the Virtual Instance for SAP solutions resource to be deployed. It can be the resource group that contains the compute and storage resources of your SAP system, or a different one.
- **n** parameter is used to specify the SAP System ID (SID) that you are registering with Azure Center for SAP solutions. - **environment** parameter is used to specify the type of SAP environment you are registering. Valid values are *NonProd* and *Prod*. - **sap-product** parameter is used to specify the type of SAP product you are registering. Valid values are *S4HANA*, *ECC*, *Other*.
+ - **managed-rg-name** parameter is used to specify the name of the managed resource group that the Azure Center for SAP solutions service deploys in your subscription. This resource group is unique for each SAP system (SID) you register. If you don't specify a name, the service generates one using the naming convention 'mrg-{SID}-{random string}'.
2. Once you trigger the registration process, you can view its status by getting the status of the Virtual Instance for SAP solutions resource that gets deployed as part of the registration process.
sap Quickstart Register System Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/center-sap-solutions/quickstart-register-system-powershell.md
To register an existing SAP system in Azure Center for SAP solutions:
-IdentityType 'UserAssigned' ` -UserAssignedIdentity @{'/subscriptions/sub1/resourcegroups/rg1/providers/Microsoft.ManagedIdentity/userAssignedIdentities/ACSS-MSI'= @{}} ` ```
+ - **ResourceGroupName** is used to specify the name of the existing resource group into which you want the Virtual Instance for SAP solutions resource to be deployed. It can be the resource group that contains the compute and storage resources of your SAP system, or a different one.
- **Name** attribute is used to specify the SAP System ID (SID) that you are registering with Azure Center for SAP solutions. - **Location** attribute is used to specify the Azure Center for SAP solutions service location. Following table has the mapping that enables you to choose the right service location based on where your SAP system infrastructure is located on Azure.
To register an existing SAP system in Azure Center for SAP solutions:
- **Environment** is used to specify the type of SAP environment you are registering. Valid values are *NonProd* and *Prod*. - **SapProduct** is used to specify the type of SAP product you are registering. Valid values are *S4HANA*, *ECC*, *Other*.
+ - **ManagedResourceGroupName** is used to specify the name of the managed resource group that the Azure Center for SAP solutions service deploys in your subscription. This resource group is unique for each SAP system (SID) you register. If you don't specify a name, the service generates one using the naming convention 'mrg-{SID}-{random string}'.
+ - **ManagedRgStorageAccountName** is used to specify the name of the storage account that is deployed into the managed resource group. This storage account is unique for each SAP system (SID) you register. By default, the service sets a name using the '{SID}{random string}' naming convention.
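As a rough illustration of the default naming conventions described above (the random suffix is generated by the service; the value shown here is a placeholder, and this is not the actual generation logic):

```shell
# Sketch of the default managed resource group and storage account names.
sid="C36"          # placeholder SAP System ID
suffix="a1b2c3"    # placeholder for the service-generated random string
managed_rg="mrg-${sid}-${suffix}"
# Azure storage account names must be lowercase, so the SID is lowercased.
storage_account=$(printf '%s%s' "$sid" "$suffix" | tr '[:upper:]' '[:lower:]')
echo "$managed_rg $storage_account"
```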
2. Once you trigger the registration process, you can view its status by getting the status of the Virtual Instance for SAP solutions resource that gets deployed as part of the registration process.
sap Hana Li Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/large-instances/hana-li-portal.md
>[!NOTE] >For Rev 4.2, follow the instructions in the [Manage BareMetal Instances through the Azure portal](../../baremetal-infrastructure/connect-baremetal-infrastructure.md) topic.
-This document covers the way how [HANA Large Instances](./hana-overview-architecture.md) are presented in [Azure portal](https://portal.azure.com) and what activities can be conducted through Azure portal with HANA Large Instance units that are deployed for you. Visibility of HANA Large Instances in Azure portal is provided through an Azure resource provider for HANA Large Instances, which currently is in public preview
+This document covers how [HANA Large Instances](./hana-overview-architecture.md) are presented in the [Azure portal](https://portal.azure.com) and what activities you can conduct through the Azure portal with HANA Large Instance units that are deployed for you. Visibility of HANA Large Instances in the Azure portal is provided through an Azure resource provider for HANA Large Instances, which is currently in public preview.
## Register HANA Large Instance Resource Provider Usually, the Azure subscription you used for HANA Large Instance deployments is registered for the HANA Large Instance Resource Provider. However, if you can't see your deployed HANA Large Instance units, register the Resource Provider in your Azure subscription. There are two ways to register the HANA Large Instance Resource Provider
As you answered the questions and provided additional details, you can go the ne
## Next steps - [How to monitor SAP HANA (large instances) on Azure](./troubleshooting-monitoring.md)-- [Monitoring and troubleshooting from HANA side](./hana-monitor-troubleshoot.md)
+- [Monitoring and troubleshooting from HANA side](./hana-monitor-troubleshoot.md)
sap Provider Netweaver https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/monitor/provider-netweaver.md
This step is **mandatory** when configuring SAP NetWeaver Provider. To fetch spe
```Value field
SDEFAULT -GetQueueStatistic -ABAPGetWPTable -EnqGetStatistic -GetProcessList -GetEnvironment -ABAPGetSystemWPTable
```

1. Select **Copy**.
-1. Select **Profile** &gt; **Save** to save the changes.
+1. Select **Profile** &gt; **Save** to save the changes.
1. Restart the **SAPStartSRV** service on each instance in the SAP system. Restarting the services doesn't restart the entire system. This process only restarts **SAPStartSRV** (on Windows) or the daemon process (in Unix or Linux).
- 1. On Windows systems, use the SAP Microsoft Management Console (MMC) or SAP Management Console (MC) to restart the service. Right-click each instance. Then, choose **All Tasks** &gt; **Restart Service**.
+ You must restart **SAPStartSRV** on each instance of the SAP system for the SAP Control web methods to be unprotected. These read-only SOAP APIs are required for the NetWeaver provider to fetch metric data from the SAP system. Failure to unprotect these methods results in empty or missing visualizations on the NetWeaver metric workbook.
+
+ 1. On Windows systems, use the SAP Microsoft Management Console (MMC) or SAP Management Console (MC) to restart the service. Right-click each instance. Then, choose **All Tasks** &gt; **Restart Service**.
+ ![Screenshot of the MMC console, showing the Restart Service option being selected.](./media/provider-netweaver/azure-monitor-providers-netweaver-mmc-output.png)
+
   2. On Linux systems, use the following command to restart the host. Replace `<instance number>` with your SAP system's instance number.

      ```Command to restart the service
      sapcontrol -nr <instance number> -function RestartService
      ```
- 3. Repeat the previous steps for each instance profile.
+    3. Repeat the previous steps for each instance profile. Alternatively, in lower environments, you can restart the entire SAP system.
- **Powershell script to unprotect web methods**
+#### PowerShell script to unprotect web methods
- You can refer to the [link](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/Windows) to unprotect the web-methods in the SAP windows virtual machine.
+You can refer to this [script](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/Windows) to unprotect the web methods on the SAP Windows virtual machine.
### Prerequisite to enable RFC metrics RFC metrics are only supported for **AS ABAP applications** and do not apply to SAP JAVA systems. This step is **mandatory** when the connection type selected is **SOAP+RFC**. Below steps need to be performed as a pre-requisite to enable RFC
-1. **Create or upload role** in the SAP NW ABAP system. Azure Monitor for SAP solutions requires this role to connect to SAP. The role uses the least privileged access. Download and unzips [Z_AMS_NETWEAVER_MONITORING.zip](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/files/8710130/Z_AMS_NETWEAVER_MONITORING.zip).
+1. **Create or upload a role** in the SAP NW ABAP system. Azure Monitor for SAP solutions requires this role to connect to SAP. The role uses the least privileged access. Download and unzip [Z_AMS_NETWEAVER_MONITORING.zip](https://github.com/hsridharan/azure-docs-pr/files/12114525/Z_AMS_NETWEAVER_MONITORING.zip).
+ 1. Sign in to your SAP system. 1. Use the transaction code **PFCG** &gt; select on **Role Upload** in the menu. 1. Upload the **Z_AMS_NETWEAVER_MONITORING.SAP** file from the ZIP file.
Below steps need to be performed as a pre-requisite to enable RFC
**Transport to import role in SAP System**
- You can also refer to the [link](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/SAP%20Role%20Transport) to import role in PFCG and generate profile for successfully configuring Netweaver provider for you SAP system.
   You can also refer to this [transport](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/SAP%20Role%20Transport) to import the role in PFCG and generate the profile for successfully configuring the NetWeaver provider for your SAP system.
-2. **Create and authorize a new RFC user**.
+3. **Create and authorize a new RFC user**.
1. Create an RFC user. 1. Assign the role **Z_AMS_NETWEAVER_MONITORING** to the user. It's the role that you uploaded in the previous section.
-3. **Enable SICF Services** to access the RFC via the SAP Internet Communication Framework (ICF)
+4. **Enable SICF Services** to access the RFC via the SAP Internet Communication Framework (ICF)
1. Go to transaction code **SICF**. 1. Go to the service path `/default_host/sap/bc/soap/`. 1. Activate the services **wsdl**, **wsdl11** and **RFC**. It's also recommended to check that you enabled the ICF ports.
-4. **SMON** - Enable **SMON** to monitor the system performance.Make sure the version of **ST-PI** is **SAPK-74005INSTPI**. You'll see empty visualization as part of the workbook when it isn't configured.
+5. **SMON** - Enable **SMON** to monitor the system performance. Make sure the version of **ST-PI** is **SAPK-74005INSTPI**.
+   You'll see an empty visualization as part of the workbook when it isn't configured.
1. Enable the **SDF/SMON** snapshot service for your system. Turn on daily monitoring. For instructions, see [SAP Note 2651881](https://userapps.support.sap.com/sap/support/knowledge/en/2651881). 2. Configure **SDF/SMON** metrics to be aggregated every minute.
- 3. recommended scheduling **SDF/SMON** as a background job in your target SAP client each minute.
+    3. We recommend scheduling **SDF/SMON** as a background job in your target SAP client every minute.
+    4. If the "System Performance - CPU and Memory (/SDF/SMON)" workbook tab shows an empty visualization, apply the following SAP notes:
+       1. [SAP Note 2246160](https://launchpad.support.sap.com/#/notes/2246160) for releases 740 (SAPKB74006 to SAPKB74025) through 755 (until SAPK-75502INSAPBASIS). Refer to the note for the specific support package versions.
+       2. If the metric collection still doesn't work after applying the note above, try [SAP Note 3268727](https://launchpad.support.sap.com/#/notes/3268727).
-5. **To enable secure communication**
+6. **To enable secure communication**
   To [enable TLS 1.2 or higher](enable-tls-azure-monitor-sap-solutions.md) with the SAP NetWeaver provider, follow the steps in this [SAP document](https://help.sap.com/docs/ABAP_PLATFORM_NEW/e73bba71770e4c0ca5fb2a3c17e8e229/4923501ebf5a1902e10000000a42189c.html?version=201909.002).
Ensure all the prerequisites are successfully completed. To add the NetWeaver pr
9. For **SAP username**, enter the name of the user that you created to connect to the SAP system. 10. For **SAP password**, enter the password for the user. 11. For **Host file entries**, provide the DNS mappings for all SAP VMs associated with the SID
- Enter **all SAP application servers and ASCS** host file entries in **Host file entries**. Enter host file mappings in comma-separated format. The expected format for each entry is IP address, FQDN, hostname. For example: **192.X.X.X sapservername.contoso.com sapservername,192.X.X.X sapservername2.contoso.com sapservername2**. Make sure that host file entries are provided for all hostnames that the [command returns](#determine-all-hostname-associated-with-an-sap-system)
-
- **Scripts to generate hostfiles entries**
+ Enter **all SAP application servers and ASCS** host file entries in **Host file entries**. Enter host file mappings in comma-separated format. The expected format for each entry is IP address, FQDN, hostname. For example: **192.X.X.X sapservername.contoso.com sapservername,192.X.X.X sapservername2.contoso.com sapservername2**.
+    To determine all SAP hostnames associated with the SID, sign in to the SAP system as the `sidadm` user and run the following command. Alternatively, you can use the script below to generate the host file entries.
+
+ Command to find a list of instances associated with a given SID
+
+ ```bash
+ /usr/sap/hostctrl/exe/sapcontrol -nr <instancenumber> -function GetSystemInstanceList
+ ```
+
+ **Scripts to generate hostfile entries**
- We highly recommend to follow the detailed instructions in the [link](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/GenerateHostfileMappings) for generating hostfile entries. These entries are crucial for the successful creation of the Netweaver provider for your SAP system.
+    We highly recommend following the detailed instructions at this [link](https://github.com/Azure/Azure-Monitor-for-SAP-solutions-preview/tree/main/Provider_Pre_Requisites/SAP_NetWeaver_Pre_Requisites/GenerateHostfileMappings) to generate host file entries. These entries are crucial for the successful creation of the NetWeaver provider for your SAP system.
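As a minimal, unofficial sketch of what such a script produces, the comma-separated **Host file entries** value can be assembled from `IP FQDN hostname` triples; the addresses and names below are placeholders:

```shell
# Join "IP FQDN hostname" triples (one per line) into the comma-separated
# format the portal expects. All values below are placeholders.
entries=$(printf '%s\n' \
  "192.0.2.10 sapascs.contoso.com sapascs" \
  "192.0.2.11 sapapp1.contoso.com sapapp1" \
  | paste -sd, -)
echo "$entries"
```

Each triple corresponds to one host returned by `GetSystemInstanceList`.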
## Troubleshooting for SAP Netweaver Provider
-List of common commands and troubleshooting solution for errors.
+### Common issues when adding the NetWeaver provider
-### Ensuring Internet communication Framework port is open
+1. **Unable to reach the SAP hostname. ErrorCode: SOAPApiConnectionError**
-1. Sign in to the SAP system
-2. Go to transaction code **SICF**.
-3. Navigate to the service path `/default_host/sap/bc/soap/`.
-3. Right-click the ping service and choose **Test Service**. SAP starts your default browser.
-4. If the port can't be reached, or the test fails, open the port in the SAP VM.
+    1. Check the input hostname, instance number, and host file mappings for the hostname provided.
+    2. Follow the instructions in the **Host file entries** section under [Adding NetWeaver provider](#adding-netweaver-provider) to determine the host file entries.
+    3. Ensure the NSG/firewall isn't blocking port 5XX13 or 5XX14 (where XX is the SAP instance number).
+    4. Check if the AMS and SAP VMs are in the same vNet or are attached by using vNet peering.
- 1. For Linux, run the following commands. Replace `<your port>` with your configured port.
+      If they aren't attached, see this [tutorial](https://learn.microsoft.com/azure/virtual-network/tutorial-connect-virtual-networks-portal) to connect the vNets.
+
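The sapcontrol web service ports follow the 5XX13 (HTTP) and 5XX14 (HTTPS) pattern, so they can be derived from the two-digit instance number; a quick illustrative computation (the instance number below is a placeholder):

```shell
# Derive the sapcontrol HTTP/HTTPS ports from a two-digit instance number.
instance_number=0   # placeholder; e.g. instance 00
http_port=$(printf '5%02d13' "$instance_number")
https_port=$(printf '5%02d14' "$instance_number")
echo "HTTP: $http_port  HTTPS: $https_port"
```

These are the ports to allow through the NSG/firewall for the provider to reach the instance.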
+2. **Check for unprotected updated rules. ErrorCode: SOAPWebMethodsValidationFailed**
- ```bash
- sudo firewall-cmd --permanent --zone=public --add-port=<your port>/TCP
+ After you restart the SAP service, check that your updated rules are applied to each instance.
+
+    1. When signed in to the SAP system as `sidadm`, run the following command. Replace `<instance number>` with your system's instance number.
+
+ ```Command to list unprotectedmethods
+ sapcontrol -nr <instance number> -function ParameterValue service/protectedwebmethods
```
- ```bash
- sudo firewall-cmd --reload
+
+    1. When signed in as a non-`sidadm` user, run the following command. Replace `<instance number>` with your system's instance number, `<admin user>` with your administrator username, and `<admin password>` with the password.
+
+ ```Command to list unprotectedmethods
+ sapcontrol -nr <instance number> -function ParameterValue service/protectedwebmethods -user "<admin user>" "<admin password>"
```
- 1. For Windows, open Windows Defender Firewall from the Start menu. Select **Advanced settings** in the side menu, then select **Inbound Rules**. To open a port, select **New Rule**. Add your port and set the protocol to TCP.
-
-### Check for unprotected updated rules
-
-After you restart the SAP service, check that your updated rules are applied to each instance.
-
-1. When Sign in to the SAP system as `sidadm`. Run the following command. Replace `<instance number>` with your system's instance number.
-
- ```Command to list unprotectedmethods
- sapcontrol -nr <instance number> -function ParameterValue service/protectedwebmethods
- ```
-
-1. When sign in as non SIDADM user. Run the following command, replace `<instance number>` with your system's instance number, `<admin user>` with your administrator username, and `<admin password>` with the password.
-
- ```Command to list unprotectedmethods
- sapcontrol -nr <instance number> -function ParameterValue service/protectedwebmethods -user "<admin user>" "<admin password>"
- ```
-
-1. Review the output. Ensure in the output you see the name of methods **GetQueueStatistic ABAPGetWPTable EnqGetStatistic GetProcessList GetEnvironment ABAPGetSystemWPTable**
-
-1. Repeat the previous steps for each instance profile.
-
-To validate the rules, run a test query against the web methods. Replace the `<hostname>` with your hostname, `<instance number>` with your SAP instance number, and the method name with the appropriate method.
-
- ```powershell
- $SAPHostName = "<hostname>"
- $InstanceNumber = "<instance number>"
+    1. Review the output. Ensure it lists the methods **GetQueueStatistic ABAPGetWPTable EnqGetStatistic GetProcessList GetEnvironment ABAPGetSystemWPTable**.
- $Function = "ABAPGetWPTable"
+ 1. Repeat the previous steps for each instance profile.
+
- [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
+ To validate the rules, run a test query against the web methods. Replace the `<hostname>` with your hostname, `<instance number>` with your SAP instance number, and the method name with the appropriate method.
- $sapcntrluri = "https://" + $SAPHostName + ":5" + $InstanceNumber + "14/?wsdl"
+ ```powershell
+ $SAPHostName = "<hostname>"
+ $InstanceNumber = "<instance number>"
+ $Function = "ABAPGetWPTable"
+ [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
+ $sapcntrluri = "https://" + $SAPHostName + ":5" + $InstanceNumber + "14/?wsdl"
+ $sapcntrl = New-WebServiceProxy -uri $sapcntrluri -namespace WebServiceProxy -class sapcntrl
+ $FunctionObject = New-Object ($sapcntrl.GetType().NameSpace + ".$Function")
+ $sapcntrl.$Function($FunctionObject)
+ ```
- $sapcntrl = New-WebServiceProxy -uri $sapcntrluri -namespace WebServiceProxy -class sapcntrl
-
- $FunctionObject = New-Object ($sapcntrl.GetType().NameSpace + ".$Function")
-
- $sapcntrl.$Function($FunctionObject)
- ```
-
-### Determine all hostname associated with an SAP system
-
-To determine all SAP hostnames associated with the SID, Sign in to the SAP system using the `sidadm` user. Then, run the following command:
-
- ```Command to find list of instances associated to given instance
- /usr/sap/hostctrl/exe/sapcontrol -nr <instancenumber> -function GetSystemInstanceList
- ```
+3. **Ensure the Internet Communication Framework port is open. ErrorCode: RFCSoapApiNotEnabled**
+    1. Sign in to the SAP system.
+    2. Go to transaction code **SICF**.
+    3. Navigate to the service path `/default_host/sap/bc/soap/`.
+    4. Right-click the ping service and choose **Test Service**. SAP starts your default browser.
+    5. If the port can't be reached, or the test fails, open the port in the SAP VM.
+
+ 1. For Linux, run the following commands. Replace `<your port>` with your configured port.
+
+ ```bash
+ sudo firewall-cmd --permanent --zone=public --add-port=<your port>/TCP
+ ```
+ ```bash
+ sudo firewall-cmd --reload
+ ```
+ 1. For Windows, open Windows Defender Firewall from the Start menu. Select **Advanced settings** in the side menu, then select **Inbound Rules**. To open a port, select **New Rule**. Add your port and set the protocol to TCP.
+
### Common issues with the metric collection and possible solutions
-#### Batch job metrics not fetched
-Apply the OSS Note - 2469926 in your SAP System to resolve the issues with batch job metrics.
-
-After you apply this OSS note you need to execute the RFC function module - BAPI_XMI_LOGON_WS with the following parameters:
-
-This function module has the same parameters as BAPI_XMI_LOGON but stores them in the table BTCOPTIONS.
-
-INTERFACE = XBP
-VERSION = 3.0
-EXTCOMPANY = TESTC
-EXTPRODUCT = TESTP
-
-#### SWNC metrics not fetched
-In order to retrieve the SWNC metrics, it is important to ensure that the application servers, central instance, and database are consistently set to the same timezone.
-
-### Unprotect methods
-
-To fetch specific metrics, you need to unprotect some methods for the current release. Follow these steps for each SAP system:
-
-1. Open an SAP GUI connection to the SAP server.
-1. Sign in with an administrative account.
-1. Execute transaction **RZ10**.
-1. Select the appropriate profile (*DEFAULT.PFL*).
-1. Select **Extended Maintenance** &gt; **Change**.
-1. Select the profile parameter `service/protectedwebmethods`.
-1. Change the value to `SDEFAULT -GetQueueStatistic -ABAPGetWPTable -EnqGetStatistic -GetProcessList -GetEnvironment -ABAPGetSystemWPTable`.
-1. Select **Copy**.
-1. Go back and select **Profile** &gt; **Save**.
-
-### Restart SAP start service
-
-After updating the parameter, restart the **SAPStartSRV** service on each of the instances in the SAP system. Restarting the services doesn't restart the SAP system. Only the **SAPStartSRV** service (in Windows) or daemon process (in Unix/Linux) is restarted.
-
-You must restart **SAPStartSRV** on each instance of the SAP system for the SAP Control web methods to be unprotected. These read-only SOAP APIs are required for the NetWeaver provider to fetch metric data from the SAP system. Failure to unprotect these methods results empty or missing visualizations on the NetWeaver metric workbook.
-
-On Windows, open the SAP Microsoft Management Console (MMC) / SAP Management Console (MC). Right-click on each instance and select **All Tasks** &gt; **Restart Service**.
-
-![Screenshot of the MMC console, showing the Restart Service option being selected.](./media/provider-netweaver/azure-monitor-providers-netweaver-mmc-output.png)
-
-On Linux, run the command `sapcontrol -nr <NN> -function RestartService`. Replace `<NN>` with the SAP instance number to restart the host.
-
-## Validate changes
-
-After the SAP service restarts, check that the updated web method protection exclusion rules have been applied for each instance. Run one of the following commands. Again, replace `<NN>` with the SAP instance number.
--- If you're logged in as `<sidadm\>`, run `sapcontrol -nr <NN> -function ParameterValue service/protectedwebmethods`.--- If you're logged in as another user, run `sapcontrol -nr <NN> -function ParameterValue service/protectedwebmethods -user "<adminUser>" "<adminPassword>"`.-
-To validate your settings, run a test query against web methods. Replace the hostname, instance number, and method name with the appropriate values.
-
-```powershell
-$SAPHostName = "<hostname>"
-$InstanceNumber = "<instance-number>"
-$Function = "ABAPGetWPTable"
-[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
-$sapcntrluri = "https://" + $SAPHostName + ":5" + $InstanceNumber + "14/?wsdl"
-$sapcntrl = New-WebServiceProxy -uri $sapcntrluri -namespace WebServiceProxy -class sapcntrl
-$FunctionObject = New-Object ($sapcntrl.GetType().NameSpace + ".$Function")
-$sapcntrl.$Function($FunctionObject)
-```
-
-Repeat the previous steps for each instance profile.
-
-You can use an access control list (ACL) to filter the access to a server port. For more information, see [SAP note 1495075](https://launchpad.support.sap.com/#/notes/1495075).
-
-### Install NetWeaver provider
-
-To install the NetWeaver provider in the Azure portal:
-
-1. Sign in to the Azure portal.
+1. SMON metrics
+
+    Refer to the SMON section in the [prerequisites](#prerequisite-to-enable-rfc-metrics).
-1. Go to the **Azure Monitor for SAP solutions** service.
-
-1. Select **Create** to add a new Azure Monitor for SAP solutions resource.
-
-1. Select **Add provider**.
-
- 1. For **Type**, select **SAP NetWeaver**.
-
- 1. *Optional* Select **Enable Secure communication**, choose certificate from drop down.
-
- 1. For **Hostname**, enter the host name of the SAP system.
-
- 1. For **Subdomain**, enter a subdomain if applicable.
-
- 1. For **Instance No**, enter the instance number that corresponds to the host name you entered.
-
- 1. For **SID**, enter the system ID.
-
-1. Select **Add provider** to save your changes.
-
-1. Continue to add more providers as needed.
-
-1. Select **Review + create** to review the deployment.
-
-1. Select **Create** to finish creating the resource.
-
-If the SAP application servers (VMs) are part of a network domain, such as an Azure Active Directory (Azure AD) managed domain, you must provide the corresponding subdomain. The Azure Monitor for SAP solutions collector VM exists inside the virtual network, and isn't joined to the domain. Azure Monitor for SAP solutions can't resolve the hostname of instances inside the SAP system unless the hostname is an FQDN. If you don't provide the subdomain, there can be missing or incomplete visualizations in the NetWeaver workbook.
-
-For example, if the hostname of the SAP system has an FQDN of `myhost.mycompany.contoso.com`:
--- The hostname is `myhost`-- The subdomain is `mycompany.contoso.com`
+2. Batch job metrics
+
+    If the "Application Performance - Batch Jobs (SM37)" workbook tab shows an empty visualization, apply
+    [SAP Note 2469926](https://launchpad.support.sap.com/#/notes/2469926) in your SAP system.
+
+    After you apply this OSS note, execute the RFC function module **BAPI_XMI_LOGON_WS** with the following parameters:
+
+ This function module has the same parameters as BAPI_XMI_LOGON but stores them in the table BTCOPTIONS.
+
+ INTERFACE = XBP
+ VERSION = 3.0
+ EXTCOMPANY = TESTC
+ EXTPRODUCT = TESTP
-When the NetWeaver provider invokes the **GetSystemInstanceList** API on the SAP system, SAP returns the hostnames of all instances in the system. The collect VM uses this list to make more API calls to fetch metrics for each instances feature. For example, ABAP, J2EE, MESSAGESERVER, ENQUE, ENQREP, and more. If you specify the subdomain, the collect VM uses the subdomain to build the FQDN of each instance in the system.
+3. SWNC metrics
-Don't specify an IP address for the hostname if your SAP system is part of network domain.
+    To retrieve the SWNC metrics successfully, confirm that the SAP system and the operating system (OS) have synchronized times.
## Next steps
sap Quickstart Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/monitor/quickstart-powershell.md
To create an SAP NetWeaver provider, use the [New-AzWorkloadsProviderInstance](/
Set-AzContext -SubscriptionId 00000000-0000-0000-0000-000000000000 ```
-In the following code, `hostname` is the host name or IP address for SAP Web Dispatcher or the application server. `SapHostFileEntry` is the IP address, fully qualified domain name, or host name of every instance that's listed in [GetSystemInstanceList](./provider-netweaver.md#determine-all-hostname-associated-with-an-sap-system).
+In the following code, `hostname` is the host name or IP address for SAP Web Dispatcher or the application server. `SapHostFileEntry` is the IP address, fully qualified domain name, or host name of every instance that's listed in [GetSystemInstanceList](./provider-netweaver.md#adding-netweaver-provider) point 6 (xi).
```azurepowershell-interactive $subscription_id = '00000000-0000-0000-0000-000000000000'
sap High Availability Guide Suse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/high-availability-guide-suse.md
The following tests are a copy of the test cases in the best practices guides of
rsc_sap_NW1_ERS02 (ocf::heartbeat:SAPInstance): Started nw1-cl-0 ```
- Execute firewall rule to drop communication on one of the nodes
+ Execute a firewall rule to block communication on one of the nodes.
```bash # Execute iptable rule on nw1-cl-0 (10.0.0.5) to block the incoming and outgoing traffic to nw1-cl-1 (10.0.0.6)
The following tests are a copy of the test cases in the best practices guides of
When configuring a fencing device, it's recommended to configure the [`pcmk_delay_max`](https://www.suse.com/support/kb/doc/?id=000019110) property. In the event of a split-brain scenario, the cluster introduces a random delay, up to the `pcmk_delay_max` value, to the fencing action on each node. The node with the shortest delay is selected for fencing.
- Additionally, in ENSA 2 configuration, to prioritize the node hosting the ASCS resource over the other node during a split brain scenario, it's recommended to configure [`priority-fencing-delay`](https://documentation.suse.com/sle-ha/15-SP3/single-html/SLE-HA-administration/#pro-ha-storage-protect-fencing) property in the cluster. Enabling priority-fencing-delay property allows the cluster to introduce an extra delay in the fencing action specifically on the node hosting the ASCS resource, allowing the ASCS node to win the fence race.
+ Additionally, in an ENSA 2 configuration, to prioritize the node hosting the ASCS resource over the other node during a split-brain scenario, it's recommended to configure the [`priority-fencing-delay`](https://documentation.suse.com/sle-ha/15-SP3/single-html/SLE-HA-administration/#pro-ha-storage-protect-fencing) property in the cluster. Enabling the priority-fencing-delay property allows the cluster to introduce an additional delay in the fencing action specifically on the node hosting the ASCS resource, allowing the ASCS node to win the fence race.
+
+ Execute the following command to delete the firewall rule.
+
+ ```bash
+ # A reboot clears runtime iptables rules. If the rules are still in place, remove them with the following command.
+ iptables -D INPUT -s 10.0.0.6 -j DROP; iptables -D OUTPUT -d 10.0.0.6 -j DROP
+ ```
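The block and unblock commands are symmetric: only the iptables action flag changes (`-A` appends the DROP rules, `-D` deletes them, assuming the rules were added with `-A`). A minimal sketch of a helper that prints both variants; the function name is illustrative and not part of the guide:

```shell
# Illustrative helper (not part of the guide): print the iptables commands to
# block (-A) or unblock (-D) traffic to a peer node. Run the printed commands
# as root on the cluster node.
peer_fw_cmds() {
  local action=$1 peer=$2   # action: A to add the DROP rules, D to delete them
  echo "iptables -${action} INPUT -s ${peer} -j DROP; iptables -${action} OUTPUT -d ${peer} -j DROP"
}
peer_fw_cmds A 10.0.0.6   # block the peer
peer_fw_cmds D 10.0.0.6   # remove the rules again
```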
1. Test manual restart of ASCS instance
sap High Availability Guide Windows Netapp Files Smb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/high-availability-guide-windows-netapp-files-smb.md
Perform the following steps, as preparation for using Azure NetApp Files.
7. Mount the SMB volume on your Windows Virtual Machine. > [!TIP]
-> You can find the instructions on how to mount the Azure NetApp Files volume, if you navigate in [Azure Portal](https://portal.azure.com/#home) to the Azure NetApp Files object, click on the **Volumes** blade, then **Mount Instructions**.
+> For instructions on how to mount the Azure NetApp Files volume, sign in to the [Azure portal](https://portal.azure.com), then navigate to the Azure NetApp Files object, select the **Volumes** blade, then select **Mount Instructions**.
### Important considerations
Using DFS-N allows you to utilize individual sapmnt volumes for SAP systems depl
* To learn how to establish high availability and plan for disaster recovery of SAP HANA on Azure VMs, see [High Availability of SAP HANA on Azure Virtual Machines (VMs)][sap-hana-ha] [sap-ha-guide-figure-8007A]:./media/virtual-machines-shared-sap-high-availability-guide/ha-smb-as.png
-[sap-ha-guide-figure-8007B]:./media/virtual-machines-shared-sap-high-availability-guide/ha-sql-ascs-smb.png
+[sap-ha-guide-figure-8007B]:./media/virtual-machines-shared-sap-high-availability-guide/ha-sql-ascs-smb.png
sap Sap Hana Availability One Region https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/sap-hana-availability-one-region.md
description: Describes SAP HANA operations on Azure native VMs in one Azure regi
tags: azure-resource-manager+
sap Sap Hana High Availability Netapp Files Red Hat https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/sap-hana-high-availability-netapp-files-red-hat.md
vm-linux
Last updated 07/11/2023 -+ # High availability of SAP HANA Scale-up with Azure NetApp Files on Red Hat Enterprise Linux
sap Sap Hana High Availability Netapp Files Suse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/sap-hana-high-availability-netapp-files-suse.md
documentationcenter: saponazure
tags: azure-resource-manager+
sap Sap Hana High Availability Rhel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/sap-hana-high-availability-rhel.md
vm-linux-+ Previously updated : 04/06/2023 Last updated : 04/27/2023 # High availability of SAP HANA on Azure VMs on Red Hat Enterprise Linux
SAP HANA System Replication setup uses a dedicated virtual hostname and virtual
## Deploy for Linux
-The Azure Marketplace contains an image for Red Hat Enterprise Linux 7.4 for SAP HANA that you can use to deploy new virtual machines.
+The Azure Marketplace contains images qualified for SAP HANA with the High Availability add-on, which you can use to deploy new virtual machines using various versions of Red Hat.
### Deploy with a template
To deploy the template, follow these steps:
1. Create a load balancer (internal). We recommend [standard load balancer](../../load-balancer/load-balancer-overview.md). * Select the virtual network created in step 2. 1. Create virtual machine 1.
- Use at least Red Hat Enterprise Linux 7.4 for SAP HANA. This example uses the [Red Hat Enterprise Linux 7.4 for SAP HANA image](https://portal.azure.com/#create/RedHat.RedHatEnterpriseLinux75forSAP-ARM).
- Select the scale set, availability zone or availability set created in step 3.
+ Use a version of Red Hat Enterprise Linux for SAP with High Availability that is supported for your version of SAP HANA. This example uses the image [Red Hat Enterprise Linux- SAP, HA, Update Services](https://portal.azure.com/#create/redhat.rhel-sap-ha).
+ Select the availability set created in step 3.
1. Create virtual machine 2.
- Use at least Red Hat Enterprise Linux 7.4 for SAP HANA. This example uses the [Red Hat Enterprise Linux 7.4 for SAP HANA image](https://portal.azure.com/#create/RedHat.RedHatEnterpriseLinux75forSAP-ARM).
- Select the scale set, availability zone or availability set created in step 3 (not the same zone as in step 4).
+ Use a version of Red Hat Enterprise Linux for SAP with High Availability that is supported for your version of SAP HANA. This example uses the image [Red Hat Enterprise Linux- SAP, HA, Update Services](https://portal.azure.com/#create/redhat.rhel-sap-ha).
+ Select the availability set created in step 3.
1. Add data disks. > [!IMPORTANT]
The steps in this section use the following prefixes:
List all of the available disks:
- <pre><code>ls /dev/disk/azure/scsi1/lun*
- </code></pre>
+ ```bash
+ ls /dev/disk/azure/scsi1/lun*
+ ```
Example output:
- <pre><code>
+ ```output
/dev/disk/azure/scsi1/lun0 /dev/disk/azure/scsi1/lun1 /dev/disk/azure/scsi1/lun2 /dev/disk/azure/scsi1/lun3
- </code></pre>
-
+ ```
+
Create physical volumes for all of the disks that you want to use:
- <pre><code>sudo pvcreate /dev/disk/azure/scsi1/lun0
+ ```bash
+ sudo pvcreate /dev/disk/azure/scsi1/lun0
sudo pvcreate /dev/disk/azure/scsi1/lun1 sudo pvcreate /dev/disk/azure/scsi1/lun2 sudo pvcreate /dev/disk/azure/scsi1/lun3
- </code></pre>
+ ```
Create a volume group for the data files. Use one volume group for the log files and one for the shared directory of SAP HANA:
- <pre><code>sudo vgcreate vg_hana_data_<b>HN1</b> /dev/disk/azure/scsi1/lun0 /dev/disk/azure/scsi1/lun1
- sudo vgcreate vg_hana_log_<b>HN1</b> /dev/disk/azure/scsi1/lun2
- sudo vgcreate vg_hana_shared_<b>HN1</b> /dev/disk/azure/scsi1/lun3
- </code></pre>
+ ```bash
+ sudo vgcreate vg_hana_data_HN1 /dev/disk/azure/scsi1/lun0 /dev/disk/azure/scsi1/lun1
+ sudo vgcreate vg_hana_log_HN1 /dev/disk/azure/scsi1/lun2
+ sudo vgcreate vg_hana_shared_HN1 /dev/disk/azure/scsi1/lun3
+ ```
Create the logical volumes. A linear volume is created when you use `lvcreate` without the `-i` switch. We suggest that you create a striped volume for better I/O performance, and align the stripe sizes to the values documented in [SAP HANA VM storage configurations](./hana-vm-operations-storage.md). The `-i` argument should be the number of the underlying physical volumes and the `-I` argument is the stripe size. In this document, two physical volumes are used for the data volume, so the `-i` switch argument is set to **2**. The stripe size for the data volume is **256KiB**. One physical volume is used for the log volume, so no `-i` or `-I` switches are explicitly used for the log volume commands. > [!IMPORTANT] > Use the `-i` switch and set it to the number of the underlying physical volume when you use more than one physical volume for each data, log, or shared volumes. Use the `-I` switch to specify the stripe size, when creating a striped volume.
- > See [SAP HANA VM storage configurations](./hana-vm-operations-storage.md) for recommended storage configurations, including stripe sizes and number of disks.
-
- <pre><code>sudo lvcreate <b>-i 2</b> <b>-I 256</b> -l 100%FREE -n hana_data vg_hana_data_<b>HN1</b>
- sudo lvcreate -l 100%FREE -n hana_log vg_hana_log_<b>HN1</b>
- sudo lvcreate -l 100%FREE -n hana_shared vg_hana_shared_<b>HN1</b>
- sudo mkfs.xfs /dev/vg_hana_data_<b>HN1</b>/hana_data
- sudo mkfs.xfs /dev/vg_hana_log_<b>HN1</b>/hana_log
- sudo mkfs.xfs /dev/vg_hana_shared_<b>HN1</b>/hana_shared
- </code></pre>
-
- Create the mount directories and copy the UUID of all of the logical volumes:
-
- <pre><code>sudo mkdir -p /hana/data/<b>HN1</b>
- sudo mkdir -p /hana/log/<b>HN1</b>
- sudo mkdir -p /hana/shared/<b>HN1</b>
- # Write down the ID of /dev/vg_hana_data_<b>HN1</b>/hana_data, /dev/vg_hana_log_<b>HN1</b>/hana_log, and /dev/vg_hana_shared_<b>HN1</b>/hana_shared
- sudo blkid
- </code></pre>
-
- Create `fstab` entries for the three logical volumes:
-
- <pre><code>sudo vi /etc/fstab
- </code></pre>
-
- Insert the following line in the `/etc/fstab` file:
+ > See [SAP HANA VM storage configurations](./hana-vm-operations-storage.md) for recommended storage configurations, including stripe sizes and number of disks. The following layout examples do not necessarily meet the performance guidelines for a particular system size; they are for illustration only.
- <pre><code>/dev/disk/by-uuid/<b>&lt;UUID of /dev/mapper/vg_hana_data_<b>HN1</b>-hana_data&gt;</b> /hana/data/<b>HN1</b> xfs defaults,nofail 0 2
- /dev/disk/by-uuid/<b>&lt;UUID of /dev/mapper/vg_hana_log_<b>HN1</b>-hana_log&gt;</b> /hana/log/<b>HN1</b> xfs defaults,nofail 0 2
- /dev/disk/by-uuid/<b>&lt;UUID of /dev/mapper/vg_hana_shared_<b>HN1</b>-hana_shared&gt;</b> /hana/shared/<b>HN1</b> xfs defaults,nofail 0 2
- </code></pre>
-
- Mount the new volumes:
-
- <pre><code>sudo mount -a
- </code></pre>
-
-1. **[A]** Set up the disk layout: **Plain Disks**.
-
- For demo systems, you can place your HANA data and log files on one disk. Create a partition on /dev/disk/azure/scsi1/lun0 and format it with xfs:
-
- <pre><code>sudo sh -c 'echo -e "n\n\n\n\n\nw\n" | fdisk /dev/disk/azure/scsi1/lun0'
- sudo mkfs.xfs /dev/disk/azure/scsi1/lun0-part1
+ ```bash
+ sudo lvcreate -i 2 -I 256 -l 100%FREE -n hana_data vg_hana_data_HN1
+ sudo lvcreate -l 100%FREE -n hana_log vg_hana_log_HN1
+ sudo lvcreate -l 100%FREE -n hana_shared vg_hana_shared_HN1
+ sudo mkfs.xfs /dev/vg_hana_data_HN1/hana_data
+ sudo mkfs.xfs /dev/vg_hana_log_HN1/hana_log
+ sudo mkfs.xfs /dev/vg_hana_shared_HN1/hana_shared
+ ```
- # Write down the ID of /dev/disk/azure/scsi1/lun0-part1
- sudo /sbin/blkid
- sudo vi /etc/fstab
- </code></pre>
+ Do not mount the directories by issuing individual mount commands. Instead, enter the configurations into `/etc/fstab` and issue a final `mount -a` to validate the syntax. Start by creating the mount directories for each volume:
- Insert this line in the /etc/fstab file:
+ ```bash
+ sudo mkdir -p /hana/data
+ sudo mkdir -p /hana/log
+ sudo mkdir -p /hana/shared
+ ```
+
+ Next, create `fstab` entries for the three logical volumes by inserting the following lines in the `/etc/fstab` file:
- <pre><code>/dev/disk/by-uuid/<b>&lt;UUID&gt;</b> /hana xfs defaults,nofail 0 2
- </code></pre>
+ ```output
+ /dev/mapper/vg_hana_data_HN1-hana_data /hana/data xfs defaults,nofail 0 2
+ /dev/mapper/vg_hana_log_HN1-hana_log /hana/log xfs defaults,nofail 0 2
+ /dev/mapper/vg_hana_shared_HN1-hana_shared /hana/shared xfs defaults,nofail 0 2
+ ```
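Before running `mount -a`, a quick sanity check can catch malformed entries: each fstab line should have exactly six fields (device, mount point, filesystem type, options, dump, pass). A sketch against a scratch copy of the entries above; the scratch path is illustrative:

```shell
# Write the example entries to a scratch file and verify that every line has
# the six fields fstab expects. Prints 0 when no malformed lines are found.
cat > /tmp/fstab.sap <<'EOF'
/dev/mapper/vg_hana_data_HN1-hana_data /hana/data xfs defaults,nofail 0 2
/dev/mapper/vg_hana_log_HN1-hana_log /hana/log xfs defaults,nofail 0 2
/dev/mapper/vg_hana_shared_HN1-hana_shared /hana/shared xfs defaults,nofail 0 2
EOF
awk 'NF != 6 { bad++ } END { print bad+0 }' /tmp/fstab.sap
```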
- Create the target directory and mount the disk:
+ Finally, mount the new volumes all at once:
- <pre><code>sudo mkdir /hana
+ ```bash
sudo mount -a
- </code></pre>
+ ```
1. **[A]** Set up host name resolution for all hosts.
- You can either use a DNS server or modify the /etc/hosts file on all nodes. This example shows you how to use the /etc/hosts file.
- Replace the IP address and the hostname in the following commands:
+ You can either use a DNS server or modify the `/etc/hosts` file on all nodes, creating an entry for each node like this:
- <pre><code>sudo vi /etc/hosts
- </code></pre>
+ ```output
+ 10.0.0.5 hn1-db-0
+ 10.0.0.6 hn1-db-1
+ ```
- Insert the following lines in the /etc/hosts file. Change the IP address and hostname to match your environment:
-
- <pre><code><b>10.0.0.5 hn1-db-0</b>
- <b>10.0.0.6 hn1-db-1</b>
- </code></pre>
1. **[A]** RHEL for HANA configuration
The steps in this section use the following prefixes:
* Enter Instance Number [00]: Enter the HANA Instance number. Enter **03** if you used the Azure template or followed the manual deployment section of this article. * Select Database Mode / Enter Index [1]: Select Enter. * Select System Usage / Enter Index [4]: Select the system usage value.
- * Enter Location of Data Volumes [/hana/data/HN1]: Select Enter.
- * Enter Location of Log Volumes [/hana/log/HN1]: Select Enter.
+ * Enter Location of Data Volumes [/hana/data]: Select Enter.
+ * Enter Location of Log Volumes [/hana/log]: Select Enter.
* Restrict maximum memory allocation? [n]: Select Enter. * Enter Certificate Host Name For Host '...' [...]: Select Enter. * Enter SAP Host Agent User (sapadm) Password: Enter the host agent user password.
The steps in this section use the following prefixes:
Download the latest SAP Host Agent archive from the [SAP Software Center][sap-swcenter] and run the following command to upgrade the agent. Replace the path to the archive to point to the file that you downloaded:
- <pre><code>sudo /usr/sap/hostctrl/exe/saphostexec -upgrade -archive &lt;path to SAP Host Agent SAR&gt;
- </code></pre>
+ ```bash
+ sudo /usr/sap/hostctrl/exe/saphostexec -upgrade -archive <path to SAP Host Agent SAR>
+ ```
1. **[A]** Configure firewall Create the firewall rule for the Azure load balancer probe port.
- <pre><code>sudo firewall-cmd --zone=public --add-port=625<b>03</b>/tcp
- sudo firewall-cmd --zone=public --add-port=625<b>03</b>/tcp --permanent
- </code></pre>
+ ```bash
+ sudo firewall-cmd --zone=public --add-port=62503/tcp
+ sudo firewall-cmd --zone=public --add-port=62503/tcp --permanent
+ ```
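The probe port opened above follows the 625&lt;InstanceNumber&gt; convention used for the Azure load balancer health probe in this guide, so instance number 03 yields 62503. A quick sketch; the variable names are illustrative:

```shell
# Derive the health probe port from the SAP instance number (625<NN> pattern).
instance_no=03
probe_port="625${instance_no}"
echo "$probe_port"
```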
## Configure SAP HANA 2.0 System Replication
The steps in this section use the following prefixes:
Create firewall rules to allow HANA System Replication and client traffic. The required ports are listed on [TCP/IP Ports of All SAP Products](https://help.sap.com/viewer/ports). The following commands are just an example to allow HANA 2.0 System Replication and client traffic to the databases SYSTEMDB, HN1, and NW1.
- <pre><code>sudo firewall-cmd --zone=public --add-port=40302/tcp --permanent
+ ```bash
+ sudo firewall-cmd --zone=public --add-port=40302/tcp --permanent
sudo firewall-cmd --zone=public --add-port=40302/tcp sudo firewall-cmd --zone=public --add-port=40301/tcp --permanent sudo firewall-cmd --zone=public --add-port=40301/tcp
The steps in this section use the following prefixes:
sudo firewall-cmd --zone=public --add-port=30341/tcp sudo firewall-cmd --zone=public --add-port=30342/tcp --permanent sudo firewall-cmd --zone=public --add-port=30342/tcp
- </code></pre>
+ ```
1. **[1]** Create the tenant database.
The steps in this section use the following prefixes:
Execute the following command as <hanasid\>adm:
- <pre><code>hdbsql -u SYSTEM -p "<b>passwd</b>" -i <b>03</b> -d SYSTEMDB 'CREATE DATABASE <b>NW1</b> SYSTEM USER PASSWORD "<b>passwd</b>"'
- </code></pre>
+ ```bash
+ hdbsql -u SYSTEM -p "<passwd>" -i 03 -d SYSTEMDB 'CREATE DATABASE NW1 SYSTEM USER PASSWORD "<passwd>"'
+ ```
1. **[1]** Configure System Replication on the first node: Backup the databases as <hanasid\>adm:
- <pre><code>hdbsql -d SYSTEMDB -u SYSTEM -p "<b>passwd</b>" -i <b>03</b> "BACKUP DATA USING FILE ('<b>initialbackupSYS</b>')"
- hdbsql -d <b>HN1</b> -u SYSTEM -p "<b>passwd</b>" -i <b>03</b> "BACKUP DATA USING FILE ('<b>initialbackupHN1</b>')"
- hdbsql -d <b>NW1</b> -u SYSTEM -p "<b>passwd</b>" -i <b>03</b> "BACKUP DATA USING FILE ('<b>initialbackupNW1</b>')"
- </code></pre>
+ ```bash
+ hdbsql -d SYSTEMDB -u SYSTEM -p "<passwd>" -i 03 "BACKUP DATA USING FILE ('initialbackupSYS')"
+ hdbsql -d HN1 -u SYSTEM -p "<passwd>" -i 03 "BACKUP DATA USING FILE ('initialbackupHN1')"
+ hdbsql -d NW1 -u SYSTEM -p "<passwd>" -i 03 "BACKUP DATA USING FILE ('initialbackupNW1')"
+ ```
Copy the system PKI files to the secondary site:
- <pre><code>scp /usr/sap/<b>HN1</b>/SYS/global/security/rsecssfs/data/SSFS_<b>HN1</b>.DAT <b>hn1-db-1</b>:/usr/sap/<b>HN1</b>/SYS/global/security/rsecssfs/data/
- scp /usr/sap/<b>HN1</b>/SYS/global/security/rsecssfs/key/SSFS_<b>HN1</b>.KEY <b>hn1-db-1</b>:/usr/sap/<b>HN1</b>/SYS/global/security/rsecssfs/key/
- </code></pre>
+ ```bash
+ scp /usr/sap/HN1/SYS/global/security/rsecssfs/data/SSFS_HN1.DAT hn1-db-1:/usr/sap/HN1/SYS/global/security/rsecssfs/data/
+ scp /usr/sap/HN1/SYS/global/security/rsecssfs/key/SSFS_HN1.KEY hn1-db-1:/usr/sap/HN1/SYS/global/security/rsecssfs/key/
+ ```
Create the primary site:
- <pre><code>hdbnsutil -sr_enable --name=<b>SITE1</b>
- </code></pre>
+ ```bash
+ hdbnsutil -sr_enable --name=SITE1
+ ```
1. **[2]** Configure System Replication on the second node: Register the second node to start the system replication. Run the following command as <hanasid\>adm:
- <pre><code>sapcontrol -nr <b>03</b> -function StopWait 600 10
- hdbnsutil -sr_register --remoteHost=<b>hn1-db-0</b> --remoteInstance=<b>03</b> --replicationMode=sync --name=<b>SITE2</b>
- </code></pre>
+ ```bash
+ sapcontrol -nr 03 -function StopWait 600 10
+ hdbnsutil -sr_register --remoteHost=hn1-db-0 --remoteInstance=03 --replicationMode=sync --name=SITE2
+ ```
1. **[1]** Check replication status Check the replication status and wait until all databases are in sync. If the status remains UNKNOWN, check your firewall settings.
- <pre><code>sudo su - <b>hn1</b>adm -c "python /usr/sap/<b>HN1</b>/HDB<b>03</b>/exe/python_support/systemReplicationStatus.py"
+ ```bash
+ sudo su - hn1adm -c "python /usr/sap/HN1/HDB03/exe/python_support/systemReplicationStatus.py"
# | Database | Host | Port | Service Name | Volume ID | Site ID | Site Name | Secondary | Secondary | Secondary | Secondary | Secondary | Replication | Replication | Replication | # | | | | | | | | Host | Port | Site ID | Site Name | Active Status | Mode | Status | Status Details | # | -- | -- | -- | | | - | | | | | | - | -- | -- | -- |
- # | SYSTEMDB | <b>hn1-db-0</b> | 30301 | nameserver | 1 | 1 | <b>SITE1</b> | <b>hn1-db-1</b> | 30301 | 2 | <b>SITE2</b> | YES | SYNC | ACTIVE | |
- # | <b>HN1</b> | <b>hn1-db-0</b> | 30307 | xsengine | 2 | 1 | <b>SITE1</b> | <b>hn1-db-1</b> | 30307 | 2 | <b>SITE2</b> | YES | SYNC | ACTIVE | |
- # | <b>NW1</b> | <b>hn1-db-0</b> | 30340 | indexserver | 2 | 1 | <b>SITE1</b> | <b>hn1-db-1</b> | 30340 | 2 | <b>SITE2</b> | YES | SYNC | ACTIVE | |
- # | <b>HN1</b> | <b>hn1-db-0</b> | 30303 | indexserver | 3 | 1 | <b>SITE1</b> | <b>hn1-db-1</b> | 30303 | 2 | <b>SITE2</b> | YES | SYNC | ACTIVE | |
+ # | SYSTEMDB | hn1-db-0 | 30301 | nameserver | 1 | 1 | SITE1 | hn1-db-1 | 30301 | 2 | SITE2 | YES | SYNC | ACTIVE | |
+ # | HN1 | hn1-db-0 | 30307 | xsengine | 2 | 1 | SITE1 | hn1-db-1 | 30307 | 2 | SITE2 | YES | SYNC | ACTIVE | |
+ # | NW1 | hn1-db-0 | 30340 | indexserver | 2 | 1 | SITE1 | hn1-db-1 | 30340 | 2 | SITE2 | YES | SYNC | ACTIVE | |
+ # | HN1 | hn1-db-0 | 30303 | indexserver | 3 | 1 | SITE1 | hn1-db-1 | 30303 | 2 | SITE2 | YES | SYNC | ACTIVE | |
# # status system replication site "2": ACTIVE # overall system replication status: ACTIVE
The steps in this section use the following prefixes:
# # mode: PRIMARY # site id: 1
- # site name: <b>SITE1</b>
- </code></pre>
+ # site name: SITE1
+ ```
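For scripting a health check, the overall state can be extracted from the systemReplicationStatus.py output, for example with awk. A sketch using a sample line; in practice, pipe the real script output instead:

```shell
# Extract the value after the colon from the summary line of the replication
# status output. The sample string mirrors the output shown above.
sample='overall system replication status: ACTIVE'
printf '%s\n' "$sample" | awk -F': ' '{print $2}'
```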
## Configure SAP HANA 1.0 System Replication
The steps in this section use the following prefixes:
Create firewall rules to allow HANA System Replication and client traffic. The required ports are listed on [TCP/IP Ports of All SAP Products](https://help.sap.com/viewer/ports). The following commands are just an example to allow HANA 2.0 System Replication. Adapt it to your SAP HANA 1.0 installation.
- <pre><code>sudo firewall-cmd --zone=public --add-port=40302/tcp --permanent
+ ```bash
+ sudo firewall-cmd --zone=public --add-port=40302/tcp --permanent
sudo firewall-cmd --zone=public --add-port=40302/tcp
- </code></pre>
+ ```
1. **[1]** Create the required users.
- Run the following command as root. Make sure to replace bold strings (HANA System ID **HN1** and instance number **03**) with the values of your SAP HANA installation:
+ Run the following commands as root. Make sure to replace the HANA system ID (for example, **HN1**), the instance number (**03**), and any user names with the values of your SAP HANA installation:
- <pre><code>PATH="$PATH:/usr/sap/<b>HN1</b>/HDB<b>03</b>/exe"
- hdbsql -u system -i <b>03</b> 'CREATE USER <b>hdb</b>hasync PASSWORD "<b>passwd</b>"'
- hdbsql -u system -i <b>03</b> 'GRANT DATA ADMIN TO <b>hdb</b>hasync'
- hdbsql -u system -i <b>03</b> 'ALTER USER <b>hdb</b>hasync DISABLE PASSWORD LIFETIME'
- </code></pre>
+ ```bash
+ PATH="$PATH:/usr/sap/HN1/HDB03/exe"
+ hdbsql -u system -i 03 'CREATE USER hdbhasync PASSWORD "passwd"'
+ hdbsql -u system -i 03 'GRANT DATA ADMIN TO hdbhasync'
+ hdbsql -u system -i 03 'ALTER USER hdbhasync DISABLE PASSWORD LIFETIME'
+ ```
1. **[A]** Create the keystore entry. Run the following command as root to create a new keystore entry:
- <pre><code>PATH="$PATH:/usr/sap/<b>HN1</b>/HDB<b>03</b>/exe"
- hdbuserstore SET <b>hdb</b>haloc localhost:3<b>03</b>15 <b>hdb</b>hasync <b>passwd</b>
- </code></pre>
+ ```bash
+ PATH="$PATH:/usr/sap/HN1/HDB03/exe"
+ hdbuserstore SET hdbhaloc localhost:30315 hdbhasync passwd
+ ```
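The port in the keystore entry follows the 3&lt;InstanceNumber&gt;15 pattern for the SQL port used here, so instance 03 gives 30315. A quick sketch; confirm the correct SQL port for your installation before creating the key:

```shell
# Derive the SQL port from the instance number (3<NN>15 pattern).
instance_no=03
sql_port="3${instance_no}15"
echo "$sql_port"
```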
1. **[1]** Back up the database. Back up the databases as root:
- <pre><code>PATH="$PATH:/usr/sap/<b>HN1</b>/HDB<b>03</b>/exe"
- hdbsql -d SYSTEMDB -u system -i <b>03</b> "BACKUP DATA USING FILE ('<b>initialbackup</b>')"
- </code></pre>
+ ```bash
+ PATH="$PATH:/usr/sap/HN1/HDB03/exe"
+ hdbsql -d SYSTEMDB -u system -i 03 "BACKUP DATA USING FILE ('initialbackup')"
+ ```
If you use a multi-tenant installation, also back up the tenant database:
- <pre><code>hdbsql -d <b>HN1</b> -u system -i <b>03</b> "BACKUP DATA USING FILE ('<b>initialbackup</b>')"
- </code></pre>
+ ```bash
+ hdbsql -d HN1 -u system -i 03 "BACKUP DATA USING FILE ('initialbackup')"
+ ```
1. **[1]** Configure System Replication on the first node. Create the primary site as <hanasid\>adm:
- <pre><code>su - <b>hdb</b>adm
- hdbnsutil -sr_enable ΓÇô-name=<b>SITE1</b>
- </code></pre>
+ ```bash
+ su - hdbadm
+ hdbnsutil -sr_enable --name=SITE1
+ ```
1. **[2]** Configure System Replication on the secondary node. Register the secondary site as <hanasid\>adm:
- <pre><code>HDB stop
- hdbnsutil -sr_register --remoteHost=<b>hn1-db-0</b> --remoteInstance=<b>03</b> --replicationMode=sync --name=<b>SITE2</b>
+ ```bash
+ HDB stop
+ hdbnsutil -sr_register --remoteHost=hn1-db-0 --remoteInstance=03 --replicationMode=sync --name=SITE2
HDB start
- </code></pre>
+ ```
## Create a Pacemaker cluster
This is an important step to optimize the integration with the cluster and improve
1. Prepare the hook as `root`.
- ```bash
- mkdir -p /hana/shared/myHooks
- cp /usr/share/SAPHanaSR/srHook/SAPHanaSR.py /hana/shared/myHooks
- chown -R hn1adm:sapsys /hana/shared/myHooks
- ```
+ ```bash
+ mkdir -p /hana/shared/myHooks
+ cp /usr/share/SAPHanaSR/srHook/SAPHanaSR.py /hana/shared/myHooks
+ chown -R hn1adm:sapsys /hana/shared/myHooks
+ ```
- 2. Stop HANA on both nodes. Execute as <sid\>adm:
-
- ```bash
- sapcontrol -nr 03 -function StopSystem
- ```
+ 1. Stop HANA on both nodes. Execute as <sid\>adm:
- 3. Adjust `global.ini` on each cluster node.
-
- ```bash
- # add to global.ini
- [ha_dr_provider_SAPHanaSR]
- provider = SAPHanaSR
- path = /hana/shared/myHooks
- execution_order = 1
+ ```bash
+ sapcontrol -nr 03 -function StopSystem
+ ```
+
+ 1. Adjust `global.ini` on each cluster node.
+
+ ```output
+ [ha_dr_provider_SAPHanaSR]
+ provider = SAPHanaSR
+ path = /hana/shared/myHooks
+ execution_order = 1
- [trace]
- ha_dr_saphanasr = info
- ```
+ [trace]
+ ha_dr_saphanasr = info
+ ```
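The `global.ini` edit can also be scripted, for example with a heredoc. A sketch against a scratch file; on a real system the file typically lives under `/usr/sap/<SID>/SYS/global/hdb/custom/config/global.ini`, and the path used here is illustrative:

```shell
# Write the hook section to a scratch copy of global.ini and confirm the
# provider line landed. On a real node, append to the actual global.ini instead.
cfg=/tmp/global.ini.example
cat > "$cfg" <<'EOF'
[ha_dr_provider_SAPHanaSR]
provider = SAPHanaSR
path = /hana/shared/myHooks
execution_order = 1

[trace]
ha_dr_saphanasr = info
EOF
grep -c '^provider = SAPHanaSR$' "$cfg"
```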
+
+3. **[A]** The cluster requires sudoers configuration on each cluster node for <sid\>adm. In this example, that is achieved by creating a new file. As `root`, use the `visudo` command to edit the `20-saphana` drop-in file.
-3. **[A]** The cluster requires sudoers configuration on each cluster node for <sid\>adm. In this example that is achieved by creating a new file. Execute the commands as `root`.
```bash sudo visudo -f /etc/sudoers.d/20-saphana
- # Insert the following lines and then save
+ ```
+
+ Insert the following lines, and then save the file:
+
+ ```output
Cmnd_Alias SITE1_SOK = /usr/sbin/crm_attribute -n hana_hn1_site_srHook_SITE1 -v SOK -t crm_config -s SAPHanaSR Cmnd_Alias SITE1_SFAIL = /usr/sbin/crm_attribute -n hana_hn1_site_srHook_SITE1 -v SFAIL -t crm_config -s SAPHanaSR Cmnd_Alias SITE2_SOK = /usr/sbin/crm_attribute -n hana_hn1_site_srHook_SITE2 -v SOK -t crm_config -s SAPHanaSR
This is an important step to optimize the integration with the cluster and improve
cdtrace awk '/ha_dr_SAPHanaSR.*crm_attribute/ \ { printf "%s %s %s %s\n",$2,$3,$5,$16 }' nameserver_*
- # Example output
+ ```
+
+ ```output
# 2021-04-12 21:36:16.911343 ha_dr_SAPHanaSR SFAIL # 2021-04-12 21:36:29.147808 ha_dr_SAPHanaSR SFAIL # 2021-04-12 21:37:04.898680 ha_dr_SAPHanaSR SOK
For more details on the implementation of the SAP HANA system replication hook s
## Create SAP HANA cluster resources
-Create the HANA topology. Run the following commands on one of the Pacemaker cluster nodes:
+Create the HANA topology. Run the following commands on one of the Pacemaker cluster nodes. Throughout these instructions, be sure to substitute your instance number, HANA system ID, IP addresses, and system names where appropriate:
-<pre><code>sudo pcs property set maintenance-mode=true
+ ```bash
+ sudo pcs property set maintenance-mode=true
-# Replace the bold string with your instance number and HANA system ID
-sudo pcs resource create SAPHanaTopology_<b>HN1</b>_<b>03</b> SAPHanaTopology SID=<b>HN1</b> InstanceNumber=<b>03</b> \
-op start timeout=600 op stop timeout=300 op monitor interval=10 timeout=600 \
-clone clone-max=2 clone-node-max=1 interleave=true
-</code></pre>
+ sudo pcs resource create SAPHanaTopology_HN1_03 SAPHanaTopology SID=HN1 InstanceNumber=03 \
+ op start timeout=600 op stop timeout=300 op monitor interval=10 timeout=600 \
+ clone clone-max=2 clone-node-max=1 interleave=true
+ ```
Next, create the HANA resources. > [!NOTE]
-> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we’ll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
If building a cluster on **RHEL 7.x**, use the following commands:
-<pre><code># Replace the bold string with your instance number, HANA system ID, and the front-end IP address of the Azure load balancer.
-
-sudo pcs resource create SAPHana_<b>HN1</b>_<b>03</b> SAPHana SID=<b>HN1</b> InstanceNumber=<b>03</b> PREFER_SITE_TAKEOVER=true DUPLICATE_PRIMARY_TIMEOUT=7200 AUTOMATED_REGISTER=false \
-op start timeout=3600 op stop timeout=3600 \
-op monitor interval=61 role="Slave" timeout=700 \
-op monitor interval=59 role="Master" timeout=700 \
-op promote timeout=3600 op demote timeout=3600 \
-master notify=true clone-max=2 clone-node-max=1 interleave=true
+```bash
+sudo pcs resource create SAPHana_HN1_03 SAPHana SID=HN1 InstanceNumber=03 PREFER_SITE_TAKEOVER=true DUPLICATE_PRIMARY_TIMEOUT=7200 AUTOMATED_REGISTER=false \
+ op start timeout=3600 op stop timeout=3600 \
+ op monitor interval=61 role="Slave" timeout=700 \
+ op monitor interval=59 role="Master" timeout=700 \
+ op promote timeout=3600 op demote timeout=3600 \
+ master notify=true clone-max=2 clone-node-max=1 interleave=true
-sudo pcs resource create vip_<b>HN1</b>_<b>03</b> IPaddr2 ip="<b>10.0.0.13</b>"
-sudo pcs resource create nc_<b>HN1</b>_<b>03</b> azure-lb port=625<b>03</b>
-sudo pcs resource group add g_ip_<b>HN1</b>_<b>03</b> nc_<b>HN1</b>_<b>03</b> vip_<b>HN1</b>_<b>03</b>
+sudo pcs resource create vip_HN1_03 IPaddr2 ip="10.0.0.13"
+sudo pcs resource create nc_HN1_03 azure-lb port=62503
+sudo pcs resource group add g_ip_HN1_03 nc_HN1_03 vip_HN1_03
-sudo pcs constraint order SAPHanaTopology_<b>HN1</b>_<b>03</b>-clone then SAPHana_<b>HN1</b>_<b>03</b>-master symmetrical=false
-sudo pcs constraint colocation add g_ip_<b>HN1</b>_<b>03</b> with master SAPHana_<b>HN1</b>_<b>03</b>-master 4000
+sudo pcs constraint order SAPHanaTopology_HN1_03-clone then SAPHana_HN1_03-master symmetrical=false
+sudo pcs constraint colocation add g_ip_HN1_03 with master SAPHana_HN1_03-master 4000
sudo pcs resource defaults resource-stickiness=1000 sudo pcs resource defaults migration-threshold=5000 sudo pcs property set maintenance-mode=false
-</code></pre>
+```
If building a cluster on **RHEL 8.x**, use the following commands:
-<pre><code># Replace the bold string with your instance number, HANA system ID, and the front-end IP address of the Azure load balancer.
-
-sudo pcs resource create SAPHana_<b>HN1</b>_<b>03</b> SAPHana SID=<b>HN1</b> InstanceNumber=<b>03</b> PREFER_SITE_TAKEOVER=true DUPLICATE_PRIMARY_TIMEOUT=7200 AUTOMATED_REGISTER=false \
-op start timeout=3600 op stop timeout=3600 \
-op monitor interval=61 role="Slave" timeout=700 \
-op monitor interval=59 role="Master" timeout=700 \
-op promote timeout=3600 op demote timeout=3600 \
-promotable notify=true clone-max=2 clone-node-max=1 interleave=true
+```bash
+sudo pcs resource create SAPHana_HN1_03 SAPHana SID=HN1 InstanceNumber=03 PREFER_SITE_TAKEOVER=true DUPLICATE_PRIMARY_TIMEOUT=7200 AUTOMATED_REGISTER=false \
+ op start timeout=3600 op stop timeout=3600 \
+ op monitor interval=61 role="Slave" timeout=700 \
+ op monitor interval=59 role="Master" timeout=700 \
+ op promote timeout=3600 op demote timeout=3600 \
+ promotable notify=true clone-max=2 clone-node-max=1 interleave=true
-sudo pcs resource create vip_<b>HN1</b>_<b>03</b> IPaddr2 ip="<b>10.0.0.13</b>"
-sudo pcs resource create nc_<b>HN1</b>_<b>03</b> azure-lb port=625<b>03</b>
-sudo pcs resource group add g_ip_<b>HN1</b>_<b>03</b> nc_<b>HN1</b>_<b>03</b> vip_<b>HN1</b>_<b>03</b>
+sudo pcs resource create vip_HN1_03 IPaddr2 ip="10.0.0.13"
+sudo pcs resource create nc_HN1_03 azure-lb port=62503
+sudo pcs resource group add g_ip_HN1_03 nc_HN1_03 vip_HN1_03
-sudo pcs constraint order SAPHanaTopology_<b>HN1</b>_<b>03</b>-clone then SAPHana_<b>HN1</b>_<b>03</b>-clone symmetrical=false
-sudo pcs constraint colocation add g_ip_<b>HN1</b>_<b>03</b> with master SAPHana_<b>HN1</b>_<b>03</b>-clone 4000
+sudo pcs constraint order SAPHanaTopology_HN1_03-clone then SAPHana_HN1_03-clone symmetrical=false
+sudo pcs constraint colocation add g_ip_HN1_03 with master SAPHana_HN1_03-clone 4000
sudo pcs resource defaults update resource-stickiness=1000
sudo pcs resource defaults update migration-threshold=5000
sudo pcs property set maintenance-mode=false
-</code></pre>
+```
> [!IMPORTANT]
> It's a good idea to set `AUTOMATED_REGISTER` to `false` while you're performing failover tests, to prevent a failed primary instance from automatically registering as secondary. After testing, as a best practice, set `AUTOMATED_REGISTER` to `true`, so that after takeover, system replication can resume automatically.

Make sure that the cluster status is ok and that all of the resources are started. It's not important on which node the resources are running.

> [!NOTE]
> The timeouts in the above configuration are just examples and may need to be adapted to the specific HANA setup. For instance, you may need to increase the start timeout, if it takes longer to start the SAP HANA database.
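
As a side note on the values used above: the `azure-lb` probe port follows the 625&lt;instance number&gt; convention, so instance 03 probes on 62503. A small illustrative sketch (not part of the article) deriving it:

```shell
# The azure-lb health-probe port above follows the 625<instance number>
# convention, e.g. 62503 for instance number 03.
INSTANCE=03
PROBE_PORT="625${INSTANCE}"
echo "$PROBE_PORT"
```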
-<pre><code>sudo pcs status
+Use the command `sudo pcs status` to check the state of the cluster resources just created:
+```output
# Online: [ hn1-db-0 hn1-db-1 ]
#
# Full list of resources:
Make sure that the cluster status is ok and that all of the resources are starte
# Resource Group: g_ip_HN1_03
#     nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
#     vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0
-</code></pre>
-
+```
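
Before proceeding, you may want a scripted check that nothing is left unstarted. A minimal sketch (illustrative only, not from the article), run against captured `pcs status` text like the sample above:

```shell
# Count lines in the captured resource listing that are not in the
# "Started" state; 0 means everything is up. Sample mirrors the output above.
status='nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0'
not_started=$(printf '%s\n' "$status" | grep -cv 'Started') || true
echo "$not_started"
```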
## Configure HANA active/read enabled system replication in Pacemaker cluster
pcs property set maintenance-mode=false
```

Make sure that the cluster status is ok and that all of the resources are started. The second virtual IP will run on the secondary site along with the SAPHana secondary resource.
-```
+```output
sudo pcs status

# Online: [ hn1-db-0 hn1-db-1 ]
The setup maximizes the time that the second virtual IP resource will be assigne
This section describes how you can test your setup. Before you start a test, make sure that Pacemaker does not have any failed action (via pcs status), there are no unexpected location constraints (for example leftovers of a migration test) and that HANA is sync state, for example with systemReplicationStatus:
-<pre><code>[root@hn1-db-0 ~]# sudo su - hn1adm -c "python /usr/sap/HN1/HDB03/exe/python_support/systemReplicationStatus.py"
-</code></pre>
+```bash
+sudo su - hn1adm -c "python /usr/sap/HN1/HDB03/exe/python_support/systemReplicationStatus.py"
+```
### Test the migration

Resource state before starting the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-0 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0
-</code></pre>
+```
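
Before issuing the move, it can help to confirm programmatically which node currently hosts the master. A sketch (not part of the article) that parses captured `pcs status` text like the sample above:

```shell
# Extract the current master node from a captured "Masters: [ ... ]" line.
status='Masters: [ hn1-db-0 ]'
master=$(printf '%s\n' "$status" | sed -n 's/.*Masters: \[ \(.*\) \].*/\1/p')
echo "$master"
```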
+
+You can migrate the SAP HANA master node by executing the following command as root:
-You can migrate the SAP HANA master node by executing the following command:
+#### On RHEL 7.x
-<pre><code># On RHEL <b>7.x</b>
-[root@hn1-db-0 ~]# pcs resource move SAPHana_HN1_03-master
-# On RHEL <b>8.x</b>
-[root@hn1-db-0 ~]# pcs resource move SAPHana_HN1_03-clone --master
-</code></pre>
+```bash
+pcs resource move SAPHana_HN1_03-master
+```
+
+#### On RHEL 8.x
+
+```bash
+pcs resource move SAPHana_HN1_03-clone --master
+```
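
Because the command form differs between RHEL releases, a hypothetical wrapper (not from the article) could select the right one; on a real node, the major release could be read from `/etc/os-release`:

```shell
# Pick the pcs move command form by RHEL major release.
# rhel_major is hardcoded here for illustration.
rhel_major=7
if [ "$rhel_major" -ge 8 ]; then
  move_cmd="pcs resource move SAPHana_HN1_03-clone --master"
else
  move_cmd="pcs resource move SAPHana_HN1_03-master"
fi
echo "$move_cmd"
```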
If you set `AUTOMATED_REGISTER="false"`, this command should migrate the SAP HANA master node and the group that contains the virtual IP address to hn1-db-1. Once the migration is done, the `sudo pcs status` output looks like this:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-1 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-1
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-1
-</code></pre>
-
-The SAP HANA resource on hn1-db-0 is stopped. In this case, configure the HANA instance as secondary by executing this command:
+```
-<pre><code>[root@hn1-db-0 ~]# su - hn1adm
+The SAP HANA resource on hn1-db-0 is stopped. In this case, configure the HANA instance as secondary by executing these commands, as **hn1adm**:
-# Stop the HANA instance just in case it is running
-hn1adm@hn1-db-0:/usr/sap/HN1/HDB03> sapcontrol -nr 03 -function StopWait 600 10
-hn1adm@hn1-db-0:/usr/sap/HN1/HDB03> hdbnsutil -sr_register --remoteHost=hn1-db-1 --remoteInstance=03 --replicationMode=sync --name=SITE1
+```bash
+sapcontrol -nr 03 -function StopWait 600 10
+hdbnsutil -sr_register --remoteHost=hn1-db-1 --remoteInstance=03 --replicationMode=sync --name=SITE1
-</code></pre>
+```
-The migration creates location constraints that need to be deleted again:
+The migration creates location constraints that need to be deleted again. Do the following as root, or via sudo:
-<pre><code># Switch back to root
-exit
-[root@hn1-db-0 ~]# pcs resource clear SAPHana_HN1_03-master
-</code></pre>
+```bash
+pcs resource clear SAPHana_HN1_03-master
+```
-Monitor the state of the HANA resource using 'pcs status'. Once HANA is started on hn1-db-0, the output should look like this
+Monitor the state of the HANA resource using `pcs status`. Once HANA is started on hn1-db-0, the output should look like this:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-1 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-1
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-1
-</code></pre>
+```
### Test the Azure fencing agent

> [!NOTE]
-> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we’ll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
Resource state before starting the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-1 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-1
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-1
-</code></pre>
+```
You can test the setup of the Azure fencing agent by disabling the network interface on the node where SAP HANA is running as Master.
-See [Red Hat Knowledgebase article 79523](https://access.redhat.com/solutions/79523) for a description on how to simulate a network failure. In this example we use the net_breaker script to block all access to the network.
+See [Red Hat Knowledgebase article 79523](https://access.redhat.com/solutions/79523) for a description on how to simulate a network failure. In this example we use the net_breaker script, as root, to block all access to the network.
-<pre><code>[root@hn1-db-1 ~]# sh ./net_breaker.sh BreakCommCmd 10.0.0.6
-</code></pre>
+```bash
+sh ./net_breaker.sh BreakCommCmd 10.0.0.6
+```
The virtual machine should now restart or stop depending on your cluster configuration. If you set the `stonith-action` setting to off, the virtual machine is stopped and the resources are migrated to the running virtual machine.
-After you start the virtual machine again, the SAP HANA resource fails to start as secondary if you set `AUTOMATED_REGISTER="false"`. In this case, configure the HANA instance as secondary by executing this command:
+After you start the virtual machine again, the SAP HANA resource fails to start as secondary if you set `AUTOMATED_REGISTER="false"`. In this case, configure the HANA instance as secondary by executing this command as the **hn1adm** user:
+
+```bash
+sapcontrol -nr 03 -function StopWait 600 10
+hdbnsutil -sr_register --remoteHost=hn1-db-0 --remoteInstance=03 --replicationMode=sync --name=SITE2
+```
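
Since the same re-registration is needed in both failover directions, a parameterized sketch (illustrative only, values mirror the example above) keeps the command consistent:

```shell
# Build the re-registration command from variables; swap REMOTE_HOST and
# SITE when registering the other node.
REMOTE_HOST=hn1-db-0
INSTANCE=03
SITE=SITE2
cmd="hdbnsutil -sr_register --remoteHost=${REMOTE_HOST} --remoteInstance=${INSTANCE} --replicationMode=sync --name=${SITE}"
echo "$cmd"
```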
-<pre><code>su - <b>hn1</b>adm
+Switch back to root and clean up the failed state:
-# Stop the HANA instance just in case it is running
-hn1adm@hn1-db-1:/usr/sap/HN1/HDB03> sapcontrol -nr <b>03</b> -function StopWait 600 10
-hn1adm@hn1-db-1:/usr/sap/HN1/HDB03> hdbnsutil -sr_register --remoteHost=<b>hn1-db-0</b> --remoteInstance=<b>03</b> --replicationMode=sync --name=<b>SITE2</b>
+#### On RHEL 7.x
-# Switch back to root and clean up the failed state
-exit
-# On RHEL <b>7.x</b>
-[root@hn1-db-1 ~]# pcs resource cleanup SAPHana_HN1_03-master
-# On RHEL <b>8.x</b>
-[root@hn1-db-1 ~]# pcs resource cleanup SAPHana_HN1_03 node=&lt;hostname on which the resource needs to be cleaned&gt;
-</code></pre>
+```bash
+pcs resource cleanup SAPHana_HN1_03-master
+```
+
+#### On RHEL 8.x
+
+```bash
+pcs resource cleanup SAPHana_HN1_03 node=<hostname on which the resource needs to be cleaned>
+```
Resource state after the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-0 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0
-</code></pre>
+```
### Test a manual failover

Resource state before starting the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-0 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0
-</code></pre>
+```
+
+You can test a manual failover by stopping the cluster on the hn1-db-0 node, as root:
+
+```bash
+pcs cluster stop
+```
+
+After the failover, you can start the cluster again. If you set `AUTOMATED_REGISTER="false"`, the SAP HANA resource on the hn1-db-0 node fails to start as secondary and must be registered manually. First, start the cluster as root:
+
+```bash
+pcs cluster start
+```
-You can test a manual failover by stopping the cluster on the hn1-db-0 node:
+Execute the following as **hn1adm**:
-<pre><code>[root@hn1-db-0 ~]# pcs cluster stop
-</code></pre>
+```bash
+sapcontrol -nr 03 -function StopWait 600 10
+hdbnsutil -sr_register --remoteHost=hn1-db-1 --remoteInstance=03 --replicationMode=sync --name=SITE1
+```
-After the failover, you can start the cluster again. If you set `AUTOMATED_REGISTER="false"`, the SAP HANA resource on the hn1-db-0 node fails to start as secondary. In this case, configure the HANA instance as secondary by executing this command:
+Then, as root:
+#### On RHEL 7.x
-<pre><code>[root@hn1-db-0 ~]# pcs cluster start
-[root@hn1-db-0 ~]# su - hn1adm
+```bash
+pcs resource cleanup SAPHana_HN1_03-master
+```
-# Stop the HANA instance just in case it is running
-hn1adm@hn1-db-0:/usr/sap/HN1/HDB03> sapcontrol -nr 03 -function StopWait 600 10
-hn1adm@hn1-db-0:/usr/sap/HN1/HDB03> hdbnsutil -sr_register --remoteHost=<b>hn1-db-1</b> --remoteInstance=<b>03</b> --replicationMode=sync --name=<b>SITE1</b>
+#### On RHEL 8.x
-# Switch back to root and clean up the failed state
-hn1adm@hn1-db-0:/usr/sap/HN1/HDB03> exit
-# On RHEL <b>7.x</b>
-[root@hn1-db-1 ~]# pcs resource cleanup SAPHana_HN1_03-master
-# On RHEL <b>8.x</b>
-[root@hn1-db-1 ~]# pcs resource cleanup SAPHana_HN1_03 node=&lt;hostname on which the resource needs to be cleaned&gt;
-</code></pre>
+```bash
+pcs resource cleanup SAPHana_HN1_03 node=<hostname on which the resource needs to be cleaned>
+```
Resource state after the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-1 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-1
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-1
-</code></pre>
+```
### Test a manual failover

Resource state before starting the test:
-<pre><code>Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
+```output
+Clone Set: SAPHanaTopology_HN1_03-clone [SAPHanaTopology_HN1_03]
    Started: [ hn1-db-0 hn1-db-1 ]
Master/Slave Set: SAPHana_HN1_03-master [SAPHana_HN1_03]
    Masters: [ hn1-db-0 ]
Resource Group: g_ip_HN1_03
    nc_HN1_03  (ocf::heartbeat:azure-lb):      Started hn1-db-0
    vip_HN1_03 (ocf::heartbeat:IPaddr2):       Started hn1-db-0
-</code></pre>
+```
-You can test a manual failover by stopping the cluster on the hn1-db-0 node:
+You can test a manual failover by stopping the cluster on the hn1-db-0 node, as root:
-<pre><code>[root@hn1-db-0 ~]# pcs cluster stop
-</code></pre>
+```bash
+pcs cluster stop
+```
## Next steps
sap Sap Hana High Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sap/workloads/sap-hana-high-availability.md
Replace `<placeholders>` with the values for your SAP HANA installation.
sudo mkfs.xfs /dev/vg_hana_log_<HANA SID>/hana_log sudo mkfs.xfs /dev/vg_hana_shared_<HANA SID>/hana_shared ```
-
+
1. Create the mount directories and copy the universally unique identifier (UUID) of all the logical volumes: ```bash
With susChkSrv implemented, an immediate and configurable action is executed. Th
provider = SAPHanaSR path = /usr/share/SAPHanaSR execution_order = 1
-
+
[ha_dr_provider_suschksrv] provider = susChkSrv path = /usr/share/SAPHanaSR execution_order = 3 action_on_lost = fence-
+
[trace] ha_dr_saphanasr = info ```
sudo crm configure ms msl_SAPHana_<HANA SID>_HDB<instance number> rsc_SAPHana_<H
meta notify="true" clone-max="2" clone-node-max="1" \ target-role="Started" interleave="true"
+sudo crm resource meta msl_SAPHana_<HANA SID>_HDB<instance number> set priority 100
+ sudo crm configure primitive rsc_ip_<HANA SID>_HDB<instance number> ocf:heartbeat:IPaddr2 \ meta target-role="Started" \ operations \$id="rsc_ip_<HANA SID>_HDB<instance number>-operations" \
sudo crm configure order ord_SAPHana_<HANA SID>_HDB<instance number> Optional: c
# Clean up the HANA resources. The HANA resources might have failed because of a known issue. sudo crm resource cleanup rsc_SAPHana_<HANA SID>_HDB<instance number>
+sudo crm configure property priority-fencing-delay=30
+
sudo crm configure property maintenance-mode=false
sudo crm configure rsc_defaults resource-stickiness=1000
sudo crm configure rsc_defaults migration-threshold=5000
stonith-sbd (stonith:external/sbd): Started hn1-db-1
rsc_nc_HN1_HDB03 (ocf::heartbeat:azure-lb): Started hn1-db-1 ```
-### Test the Azure fencing agent
+### Blocking network communication
-You can test the setup of the Azure fencing agent (not the *SBD*) by disabling the network interface on the `hn1-db-0` node:
+Resource state before starting the test:
-```bash
-sudo ifdown eth0
-```
+ ```bash
+ Online: [ hn1-db-0 hn1-db-1 ]
+
+ Full list of resources:
+ stonith-sbd (stonith:external/sbd): Started hn1-db-1
+ Clone Set: cln_SAPHanaTopology_HN1_HDB03 [rsc_SAPHanaTopology_HN1_HDB03]
+ Started: [ hn1-db-0 hn1-db-1 ]
+ Master/Slave Set: msl_SAPHana_HN1_HDB03 [rsc_SAPHana_HN1_HDB03]
+ Masters: [ hn1-db-1 ]
+ Slaves: [ hn1-db-0 ]
+ Resource Group: g_ip_HN1_HDB03
+ rsc_ip_HN1_HDB03 (ocf::heartbeat:IPaddr2): Started hn1-db-1
+ rsc_nc_HN1_HDB03 (ocf::heartbeat:azure-lb): Started hn1-db-1
+ ```
-The VM now restarts or stops, depending on your cluster configuration.
+Execute a firewall rule to block communication on one of the nodes.
-If you set the `stonith-action` setting to `off`, the VM is stopped and the resources are migrated to the running VM.
+ ```bash
+ # Execute iptable rule on hn1-db-1 (10.0.0.6) to block the incoming and outgoing traffic to hn1-db-0 (10.0.0.5)
+ iptables -A INPUT -s 10.0.0.5 -j DROP; iptables -A OUTPUT -d 10.0.0.5 -j DROP
+ ```
-After you start the VM again, the SAP HANA resource fails to start as secondary if you set `AUTOMATED_REGISTER="false"`. In this case, configure the HANA instance as secondary by running this command:
+When cluster nodes can't communicate with each other, there's a risk of a split-brain scenario. In such situations, cluster nodes will try to fence each other simultaneously, resulting in a fence race.
-```bash
-su - <hana sid>adm
+When configuring a fencing device, it's recommended to configure the [`pcmk_delay_max`](https://www.suse.com/support/kb/doc/?id=000019110) property. In the event of a split-brain scenario, the cluster then introduces a random delay, up to the `pcmk_delay_max` value, before the fencing action on each node. The node with the shortest delay is selected for fencing.
-# Stop the HANA instance, just in case it is running
-sapcontrol -nr <instance number> -function StopWait 600 10
-hdbnsutil -sr_register --remoteHost=hn1-db-1 --remoteInstance=<instance number> --replicationMode=sync --name=<site 1>
+Additionally, to ensure that the node running the HANA master takes priority and wins the fence race in a split-brain scenario, it's recommended to set the [`priority-fencing-delay`](https://documentation.suse.com/sle-ha/15-SP3/single-html/SLE-HA-administration/#pro-ha-storage-protect-fencing) property in the cluster configuration. Enabling the `priority-fencing-delay` property lets the cluster introduce an additional delay in the fencing action specifically on the node hosting the HANA master resource, allowing that node to win the fence race.
-# Switch back to root and clean up the failed state
-exit
-crm resource cleanup msl_SAPHana_<HANA SID>_HDB<instance number> hn1-db-0
-```
+Execute the command below to delete the firewall rule.
+
+ ```bash
+ # If the iptables rule set on the server gets reset after a reboot, the rules will be cleared out. In case they have not been reset, please proceed to remove the iptables rule using the following command.
+ iptables -D INPUT -s 10.0.0.5 -j DROP; iptables -D OUTPUT -d 10.0.0.5 -j DROP
+ ```
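
To keep the add and delete rules from drifting apart, a small illustrative sketch (not from the article) derives both command strings from the same peer address:

```shell
# Build matching add/delete iptables command pairs from one peer address.
# The commands are only constructed here, not executed.
PEER=10.0.0.5
block="iptables -A INPUT -s $PEER -j DROP; iptables -A OUTPUT -d $PEER -j DROP"
unblock="iptables -D INPUT -s $PEER -j DROP; iptables -D OUTPUT -d $PEER -j DROP"
echo "$unblock"
```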
### Test SBD fencing
search Search Security Manage Encryption Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-security-manage-encryption-keys.md
The following tools and services are used in this scenario.
+ [Azure Cognitive Search](search-create-service-portal.md) on a [billable tier](search-sku-tier.md#tier-descriptions) (Basic or above, in any region).
-+ [Azure Key Vault](../key-vault/general/overview.md), you can create key vault using [Azure portal](../key-vault//general/quick-create-portal.md), [Azure CLI](../key-vault//general/quick-create-cli.md), or [Azure PowerShell](../key-vault//general/quick-create-powershell.md). Create the resource in the same subscription as Azure Cognitive Search. The key vault must have **soft-delete** and **purge protection** enabled.
++ [Azure Key Vault](../key-vault/general/overview.md), you can [create a key vault using the Azure portal](../key-vault/general/quick-create-portal.md), [Azure CLI](../key-vault/general/quick-create-cli.md), or [Azure PowerShell](../key-vault/general/quick-create-powershell.md). Create the resource in the same subscription as Azure Cognitive Search. The key vault must have **soft-delete** and **purge protection** enabled.
+ [Azure Active Directory](../active-directory/fundamentals/active-directory-whatis.md). If you don't have one, [set up a new tenant](../active-directory/develop/quickstart-create-new-tenant.md).
Conditions that will prevent you from adopting this approach include:
> > The REST API version 2021-04-30-Preview and [Management REST API 2021-04-01-Preview](/rest/api/searchmanagement/2021-04-01-preview/services/create-or-update) provide this feature.
-1. [Sign into Azure portal](https://portal.azure.com/).
+1. [Sign into Azure portal](https://portal.azure.com).
1. Select **+ Create a new resource**.
Conditions that will prevent you from adopting this approach include:
### [**Register an app**](#tab/register-app)
-1. In [Azure portal](https://portal.azure.com), find the Azure Active Directory resource for your subscription.
+1. In the [Azure portal](https://portal.azure.com), find the Azure Active Directory resource for your subscription.
1. On the left, under **Manage**, select **App registrations**, and then select **New registration**.
search Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Cognitive Search description: Lists Azure Policy Regulatory Compliance controls available for Azure Cognitive Search. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
search Service Create Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/service-create-private-endpoint.md
In this section, you'll create a new Azure Cognitive Search service with a Priva
| VM architecture | Accept the default **x64**. | | Size | Accept the default **Standard D2S v3**. | | **ADMINISTRATOR ACCOUNT** | |
- | Username | Enter the user name of the administrator. Use an account that's valid for your Azure subscription. You'll want to sign into Azure portal from the VM so that you can manage your search service. |
+ | Username | Enter the user name of the administrator. Use an account that's valid for your Azure subscription. You'll want to sign in to the Azure portal from the VM so that you can manage your search service. |
| Password | Enter the account password. The password must be at least 12 characters long and meet the [defined complexity requirements](../virtual-machines/windows/faq.yml?toc=%2fazure%2fvirtual-network%2ftoc.json#what-are-the-password-requirements-when-creating-a-vm-).| | Confirm Password | Reenter password. | | **INBOUND PORT RULES** | |
To work around this restriction, connect to Azure portal from a browser on a vir
1. Follow the [steps to provision a VM that can access the search service through a private endpoint](#create-virtual-machine-private-endpoint).
-1. On a virtual machine in your virtual network, open a browser and sign into the Azure portal. The portal will use the private endpoint attached to the virtual machine to connect to your search service.
+1. On a virtual machine in your virtual network, open a browser and sign in to the Azure portal. The portal will use the private endpoint attached to the virtual machine to connect to your search service.
## Clean up resources
security Operational Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/operational-overview.md
For more information, see [Configure Network Watcher](../../network-watcher/netw
[Customer Lockbox for Microsoft Azure](customer-lockbox-overview.md) is a service integrated into Azure portal that gives you explicit control in the rare instance when a Microsoft Support Engineer may need access to your data to resolve an issue. There are very few instances, such as a debugging remote access issue, where a Microsoft Support Engineer requires elevated permissions to resolve this issue. In such cases, Microsoft engineers use just-in-time access service that provides limited, time-bound authorization with access limited to the service.
-While Microsoft has always obtained customer consent for access, Customer Lockbox now gives you the ability to review and approve or deny such requests from the Azure Portal. Microsoft support engineers will not be granted access until you approve the request.
+While Microsoft has always obtained customer consent for access, Customer Lockbox now gives you the ability to review and approve or deny such requests from the Azure portal. Microsoft support engineers will not be granted access until you approve the request.
## Standardized and Compliant Deployments
To learn about the Security and Audit solution, see the following articles:
- [Security and compliance](https://azure.microsoft.com/overview/trusted-cloud/) - [Microsoft Defender for Cloud](../../security-center/security-center-introduction.md)-- [Azure Monitor](../../azure-monitor/overview.md)
+- [Azure Monitor](../../azure-monitor/overview.md)
sentinel Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/best-practices.md
Schedule the following Microsoft Sentinel activities regularly to ensure continu
### Weekly tasks -- **Content review of solutions or standalone content**. Get any content updates for your installed solutions or standalone content from the the [Content hub](sentinel-solutions-deploy.md). Review new solutions or standalone content that might be of value for your environment, such as analytics rules, workbooks, hunting queries, or playbooks.
+- **Content review of solutions or standalone content**. Get any content updates for your installed solutions or standalone content from the [Content hub](sentinel-solutions-deploy.md). Review new solutions or standalone content that might be of value for your environment, such as analytics rules, workbooks, hunting queries, or playbooks.
- **Microsoft Sentinel auditing**. Review Microsoft Sentinel activity to see who has updated or deleted resources, such as analytics rules, bookmarks, and so on. For more information, see [Audit Microsoft Sentinel queries and activities](audit-sentinel-data.md).
sentinel Forward Syslog Monitor Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/forward-syslog-monitor-agent.md
Last updated 01/05/2023-+ #Customer intent: As a security engineer, I want to get Syslog data into Microsoft Sentinel so that I can do attack detection, threat visibility, proactive hunting, and threat response. As an IT administrator, I want to get Syslog data into my Log Analytics workspace to monitor my Linux-based devices.
sentinel Normalization Schema Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/normalization-schema-network.md
The following list mentions fields that have specific guidelines for Network Ses
| Field | Class | Type | Description | ||-||--| | **EventCount** | Mandatory | Integer | Netflow sources support aggregation, and the **EventCount** field should be set to the value of the Netflow **FLOWS** field. For other sources, the value is typically set to `1`. |
-| <a name="eventtype"></a> **EventType** | Mandatory | Enumerated | Describes the scenario reported by the record.<br><br> For Network Session records, the allowed values are:<br> - `EndpointNetworkSession`<br> - `NetworkSession` <br> - `L2NetworkSession`<br>- `IDS` <br> - `Flow`<br><br>For more information on event types, refer to the the [schema overview](#schema-overview) |
+| <a name="eventtype"></a> **EventType** | Mandatory | Enumerated | Describes the scenario reported by the record.<br><br> For Network Session records, the allowed values are:<br> - `EndpointNetworkSession`<br> - `NetworkSession` <br> - `L2NetworkSession`<br>- `IDS` <br> - `Flow`<br><br>For more information on event types, refer to the [schema overview](#schema-overview) |
| <a name="eventsubtype"></a>**EventSubType** | Optional | String | Additional description of the event type, if applicable. <br> For Network Session records, supported values include:<br>- `Start`<br>- `End`<br><br>This is field is not relevant for `Flow` events. | | <a name="eventresult"></a>**EventResult** | Mandatory | Enumerated | If the source device does not provide an event result, **EventResult** should be based on the value of [DvcAction](#dvcaction). If [DvcAction](#dvcaction) is `Deny`, `Drop`, `Drop ICMP`, `Reset`, `Reset Source`, or `Reset Destination`<br>, **EventResult** should be `Failure`. Otherwise, **EventResult** should be `Success`. | | **EventResultDetails** | Recommended | Enumerated | Reason or details for the result reported in the [EventResult](#eventresult) field. Supported values are:<br> - Failover <br> - Invalid TCP <br> - Invalid Tunnel<br> - Maximum Retry<br> - Reset<br> - Routing issue<br> - Simulation<br> - Terminated<br> - Timeout<br> - Transient error<br> - Unknown<br> - NA.<br><br>The original, source specific, value is stored in the [EventOriginalResultDetails](normalization-common-fields.md#eventoriginalresultdetails) field. |
sentinel Deploy Sap Btp Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/sap/deploy-sap-btp-solution.md
Before you begin, verify that:
- **uaa.clientsecret**: `682323d2-42a0-45db-a939-74639efde986$gR3x3ohHTB8iyYSKHW0SNIWG4G0tQkkMdBwO7lKhwcQ=` - **uaa.url**: `https://915a0312trial.authentication.us10.hana.ondemand.com`
-1. Log into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to the **Microsoft Sentinel** service. 1. Select **Content hub**, and in the search bar, search for *BTP*. 1. Select **SAP BTP**.
service-bus-messaging Automate Update Messaging Units https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/automate-update-messaging-units.md
In this section, you learn how to use the Azure portal to configure autoscaling
## Autoscale setting page First, follow these steps to navigate to the **Autoscale settings** page for your Service Bus namespace.
-1. Sign into [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the search bar, type **Service Bus**, select **Service Bus** from the drop-down list, and press **ENTER**. 1. Select your **premium namespace** from the list of namespaces. 1. Switch to the **Scale** page.
We recommend that you create rules such that messaging units are increased or de
## Next steps To learn about messaging units, see the [Premium messaging](service-bus-premium-messaging.md)-
service-bus-messaging Private Link Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/private-link-service.md
If you select the **Private access** option on the **Networking** page of the na
If you already have an existing namespace, you can create a private endpoint by following these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. In the search bar, type in **Service Bus**. 3. Select the **namespace** from the list to which you want to add a private endpoint. 2. On the left menu, select **Networking** option under **Settings**.
service-bus-messaging Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Service Bus Messaging description: Lists Azure Policy Regulatory Compliance controls available for Azure Service Bus Messaging. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
service-bus-messaging Service Bus Azure And Service Bus Queues Compared Contrasted https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted.md
Service Bus queues provide many advanced features such as the following ones. So
The following articles provide more guidance and information about using Storage queues or Service Bus queues. * [Get started with Service Bus queues](service-bus-dotnet-get-started-with-queues.md)
-* [How to Use the Queue Storage Service](../storage/queues/storage-dotnet-how-to-use-queues.md)
+* [How to Use the Queue Storage Service](/azure/storage/queues/storage-quickstart-queues-dotnet?tabs=passwordless%2Croles-azure-portal%2Cenvironment-variable-windows%2Csign-in-azure-cli)
* [Best practices for performance improvements using Service Bus brokered messaging](service-bus-performance-improvements.md) [Azure portal]: https://portal.azure.com
service-bus-messaging Service Bus Quickstart Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-quickstart-cli.md
This quickstart shows you how to create a Service Bus namespace and a queue usin
## Prerequisites If you don't have an Azure subscription, you can create a [free account][free account] before you begin.
-In this quickstart, you use Azure Cloud Shell that you can launch after sign into the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/cli/azure/install-azure-cli) and use Azure PowerShell on your machine.
+In this quickstart, you use Azure Cloud Shell that you can launch after signing in to the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/cli/azure/install-azure-cli) and use the Azure CLI on your machine.
## Provision resources
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Launch Azure Cloud Shell by selecting the icon shown in the following image. Switch to **Bash** mode if the Cloud Shell is in **PowerShell** mode. :::image type="content" source="./media/service-bus-quickstart-powershell/launch-cloud-shell.png" alt-text="Launch Cloud Shell":::
service-bus-messaging Service Bus Quickstart Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-quickstart-powershell.md
This quickstart shows you how to create a Service Bus namespace and a queue usin
To complete this quickstart, make sure you have an Azure subscription. If you don't have an Azure subscription, you can create a [free account][] before you begin.
-In this quickstart, you use Azure Cloud Shell that you can launch after sign into the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/powershell/azure/install-azure-powershell) and use Azure PowerShell on your machine.
+In this quickstart, you use Azure Cloud Shell that you can launch after signing in to the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/powershell/azure/install-azure-powershell) and use Azure PowerShell on your machine.
## Provision resources
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Launch Azure Cloud Shell by selecting the icon shown in the following image: :::image type="content" source="./media/service-bus-quickstart-powershell/launch-cloud-shell.png" alt-text="Launch Cloud Shell":::
service-bus-messaging Service Bus Tutorial Topics Subscriptions Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-tutorial-topics-subscriptions-cli.md
Service Bus topics and subscriptions enable you to scale to process a large numb
## Prerequisites If you don't have an Azure subscription, you can create a [free account][free account] before you begin.
-In this quickstart, you use Azure Cloud Shell that you can launch after sign into the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/cli/azure/install-azure-cli) and use Azure PowerShell on your machine.
+In this quickstart, you use Azure Cloud Shell that you can launch after signing in to the Azure portal. For details about Azure Cloud Shell, see [Overview of Azure Cloud Shell](../cloud-shell/overview.md). You can also [install](/cli/azure/install-azure-cli) and use the Azure CLI on your machine.
## Create a Service Bus topic and subscriptions Each [subscription to a topic](service-bus-messaging-overview.md#topics) can receive a copy of each message. Topics are fully protocol- and semantically compatible with Service Bus queues. Service Bus topics support a wide array of selection rules with filter conditions, with optional actions that set or modify message properties. Each time a rule matches, it produces a message. To learn more about rules, filters, and actions, see [Topic filters and actions](topic-filters.md).
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Launch Azure Cloud Shell by selecting the icon shown in the following image. Switch to **Bash** mode if the Cloud Shell is in **PowerShell** mode. :::image type="content" source="./media/service-bus-quickstart-powershell/launch-cloud-shell.png" alt-text="Launch Cloud Shell":::
service-fabric Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-fabric/security-controls-policy.md
Previously updated : 07/06/2023 Last updated : 07/20/2023 # Azure Policy Regulatory Compliance controls for Azure Service Fabric
site-recovery Azure To Azure Replicate After Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/azure-to-azure-replicate-after-migration.md
Title: Set up disaster recovery after migration to Azure with Azure Site Recover
description: This article describes how to prepare machines to set up disaster recovery between Azure regions after migration to Azure using Azure Site Recovery. + Last updated 05/02/2023
site-recovery Azure To Azure Tutorial Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/azure-to-azure-tutorial-migrate.md
The following steps show how to prepare the virtual machine for the move using A
### Create the vault in any region, except the source region
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In search, type **Recovery Services** > click **Recovery Services vaults**.
-1. In the Recovery Services vaults menu, click +Add.
+1. In the Recovery Services vaults menu, click **+ Add**.
1. In **Name**, specify the friendly name **ContosoVMVault**. If you have more than one subscription, select the appropriate one. 1. Create the resource group **ContosoRG**. 1. Specify an Azure region. To check supported regions, see geographic availability in [Azure Site Recovery pricing details](https://azure.microsoft.com/pricing/details/site-recovery/).
site-recovery Hyper V Azure Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/hyper-v-azure-tutorial.md
description: Learn how to set up disaster recovery of on-premises Hyper-V VMs (w
Last updated 05/04/2023-+ - # Set up disaster recovery of on-premises Hyper-V VMs to Azure
site-recovery Site Recovery Deployment Planner History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/site-recovery-deployment-planner-history.md
+ Last updated 6/4/2020
site-recovery Site Recovery Extension Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/site-recovery-extension-troubleshoot.md
description: Troubleshoot issues with the Azure VM extension for disaster recove
+ Last updated 05/03/2023
site-recovery Site Recovery Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/site-recovery-whats-new.md
Last updated 05/02/2023--+ # What's new in Site Recovery
site-recovery Vmware Azure Deploy Configuration Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/vmware-azure-deploy-configuration-server.md
Minimum hardware requirements for a configuration server are summarized in the f
You must have a user with one of the following permissions set in Azure Active Directory (Azure AD) to register the configuration server with Azure Site Recovery services. 1. The user must have an application developer role to create an application.
- - To verify, sign in to the Azure portal.</br>
- - Go to **Azure Active Directory** > **Roles and administrators**.</br>
+ - To verify, sign in to the Azure portal.
+ - Go to **Azure Active Directory** > **Roles and administrators**.
- Verify that the application developer role is assigned to the user. If not, use a user with this permission or contact an [administrator to enable the permission](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md#assign-roles). 2. If the application developer role can't be assigned, ensure that the **Users can register applications** flag is set as **true** for the user to create an identity. To enable these permissions:
- - Sign in to the Azure portal.
- - Go to **Azure Active Directory** > **User settings**.
- - Under **App registrations**, **Users can register applications**, select **Yes**.
+ 1. Sign in to the Azure portal.
+ 1. Go to **Azure Active Directory** > **User settings**.
+ 1. Under **App registrations**, **Users can register applications**, select **Yes**.
![Azure AD_application_permission](media/vmware-azure-deploy-configuration-server/AAD_application_permission.png)
site-recovery Vmware Azure Install Linux Master Target https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/vmware-azure-install-linux-master-target.md
+ Last updated 05/02/2023
From version 9.42, ASR supports the Linux master target server on Ubuntu 20.04. To u
After the installation and registration of the master target has finished, you can see the master target appear on the **master target** section in **Site Recovery Infrastructure**, under the configuration server overview. You can now proceed with [reprotection](vmware-azure-reprotect.md), followed by failback.-
site-recovery Vmware Physical Manage Mobility Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/vmware-physical-manage-mobility-service.md
description: Manage Mobility Service agent for disaster recovery of VMware VMs a
+ Last updated 05/02/2023
site-recovery Vmware Physical Mobility Service Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/vmware-physical-mobility-service-overview.md
Last updated 05/02/2023-+ # About the Mobility service for VMware VMs and physical servers
spring-apps Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-apps/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Spring Apps description: Lists Azure Policy Regulatory Compliance controls available for Azure Spring Apps. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
storage-mover Agent Deploy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage-mover/agent-deploy.md
You can unregister an agent in the Azure portal by navigating to your storage mo
You can unregister an agent using the Az PowerShell. As a prerequisite, ensure that you have the latest version of PowerShell on your machine, and also the latest versions of the Az and Az.StorageMover PowerShell modules installed. ```powershell
-Login-AzAccount -subscriptionId <YourSubscriptionId> #log into the Azure subscription that contains the storage mover resource the agent is registered with.
+Login-AzAccount -subscriptionId <YourSubscriptionId> #Sign in to the Azure subscription that contains the storage mover resource the agent is registered with.
Unregister-AzStorageMoverAgent -ResourceGroupName <YourResourceGroupName> -StorageMoverName <YourStorageMoverName> -AgentName <YourAgentName> ```
storage Blobfuse2 Commands Mount All https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-mount-all.md
description: Learn how to use the 'blobfuse2 mount all' all command to mount all blob containers in a storage account as a Linux file system. + Last updated 12/02/2022
storage Blobfuse2 Commands Mount List https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-mount-list.md
description: Learn how to use the 'blobfuse2 mount list' command to display all BlobFuse2 mount points. + Last updated 12/02/2022
storage Blobfuse2 Commands Mount https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-mount.md
description: Learn how to use the 'blobfuse2 mount' command to mount a Blob Storage container as a file system in Linux, or to display and manage existing mount points. + Last updated 12/02/2022
storage Blobfuse2 Commands Mountv1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-mountv1.md
description: How to generate a configuration file for BlobFuse2 from a BlobFuse v1 configuration file. + Last updated 12/02/2022
storage Blobfuse2 Commands Unmount All https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-unmount-all.md
description: Learn how to use the 'blobfuse2 unmount all' command to unmount all blob containers in a storage account as a Linux file system. + Last updated 12/02/2022
storage Blobfuse2 Commands Unmount https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-commands-unmount.md
description: How to use the 'blobfuse2 unmount' command to unmount an existing mount point. + Last updated 12/02/2022
storage Blobfuse2 How To Deploy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blobfuse2-how-to-deploy.md
Last updated 01/26/2023-+ # How to mount an Azure Blob Storage container on Linux with BlobFuse2
storage Simulate Primary Region Failure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/simulate-primary-region-failure.md
Last updated 09/06/2022
ms.devlang: javascript-+ # Tutorial: Simulate a failure in reading data from the primary region
storage Storage How To Mount Container Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-how-to-mount-container-linux.md
Last updated 12/02/2022 -+ # How to mount Azure Blob Storage as a file system with BlobFuse v1
storage Upgrade To Data Lake Storage Gen2 How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/upgrade-to-data-lake-storage-gen2-how-to.md
Previously updated : 07/19/2023 Last updated : 07/20/2023
To prepare to upgrade your storage account to Data Lake Storage Gen2:
> [!div class="checklist"] > - [Review feature support](#review-feature-support) > - [Ensure the segments of each blob path are named](#ensure-the-segments-of-each-blob-path-are-named)
+> - [Prevent write activity to the storage account](#prevent-write-activity-to-the-storage-account)
### Review feature support
In some cases, you will have to allow time for clean-up operations after a featu
The migration process creates a directory for each path segment of a blob. Data Lake Storage Gen2 directories must have a name, so for migration to succeed, each path segment in a virtual directory must have a name. The same requirement applies to segments that are named only with a space character. If any path segments are either unnamed (`//`) or named only with a space character (`_`), then before you proceed with the migration, you must copy those blobs to a new path that is compatible with these naming requirements.
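The segment-naming rule above is mechanical, so it can be checked before you start the migration. A minimal bash sketch (the `is_gen2_compatible` helper is hypothetical, not part of any Azure tooling) that flags blob paths containing unnamed or space-only segments:

```bash
#!/usr/bin/env bash
# Hypothetical helper: succeeds only if every "/"-separated segment of a
# blob path is non-empty and not composed solely of spaces.
is_gen2_compatible() {
  local path="$1"
  local -a segs
  IFS='/' read -r -a segs <<< "$path"
  local seg
  for seg in "${segs[@]}"; do
    # Strip spaces; an empty result means the segment was unnamed
    # (from "//") or named only with space characters.
    if [[ -z "${seg// /}" ]]; then
      return 1
    fi
  done
  return 0
}

is_gen2_compatible "folder1/folder2/blob.txt" && echo "compatible"
is_gen2_compatible "folder1//blob.txt"        || echo "needs copying"
is_gen2_compatible "folder1/ /blob.txt"       || echo "needs copying"
```

Blobs for which the check fails must be copied to a compliant path before the upgrade.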
+### Prevent write activity to the storage account
+
+The upgrade might fail if an application writes to the storage account during the upgrade. To prevent such write activity:
+
+1. Quiesce any applications or services that might perform write operations.
+1. Release or break existing leases on containers and blobs in the storage account.
+1. Acquire new leases on all containers and blobs in the account. The new leases should be infinite or long enough to prevent write access for the duration of the upgrade.
+
+After the upgrade has completed, break the leases you created to resume allowing write access to the containers and blobs.
+
+> [!WARNING]
+> Breaking an active lease without gracefully disabling applications or virtual machines that are currently accessing those resources could have unexpected results. Be sure to quiesce any current write activities before breaking any current leases.
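The lease steps above can be sketched with the Azure CLI. This is a non-authoritative sketch: `<account>`, `<container>`, and `<blob>` are placeholders, it assumes you're already signed in with `az login`, and you'd repeat it for every container and blob in the account.

```azurecli
# Take an infinite lease (-1) on a blob so that no client can write to it
# until the lease is broken or released.
az storage blob lease acquire \
    --account-name <account> \
    --container-name <container> \
    --blob-name <blob> \
    --lease-duration -1 \
    --auth-mode login

# After the upgrade has completed, break the lease to restore write access.
az storage blob lease break \
    --account-name <account> \
    --container-name <container> \
    --blob-name <blob> \
    --auth-mode login
```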
+ ## Perform the upgrade ### [Portal](#tab/azure-portal)
storage Azure Defender Storage Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/azure-defender-storage-configure.md
There are several ways to enable Defender for Storage on subscriptions:
To enable Defender for Storage at the subscription level using the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to **Microsoft Defender for Cloud** > **Environment settings**. 1. Select the subscription for which you want to enable Defender for Storage.
If you want to disable the plan, toggle the status button to **Off** for the Sto
To enable and configure Defender for Storage at scale with an Azure built-in policy to ensure that consistent security policies are applied across all existing and new storage accounts within the subscriptions, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com/) and navigate to the Policy dashboard.
+1. Sign in to the [Azure portal](https://portal.azure.com) and navigate to the Policy dashboard.
1. In the Policy dashboard, select **Definitions** from the left-side menu. 1. In the "Security Center" category, search for and then select **Configure Microsoft Defender for Storage to be enabled**. This policy will enable all Defender for Storage capabilities: Activity Monitoring, Malware Scanning, and Sensitive Data Threat Detection. You can also get it here: [List of built-in policy definitions](../../governance/policy/samples/built-in-policies.md#security-center). If you want to enable a policy without the configurable features, use **Configure basic Microsoft Defender for Storage to be enabled (Activity Monitoring only)**.
The steps below include instructions on how to set up logging and an Event Grid
To enable and configure Microsoft Defender for Storage for a specific account using the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your storage account. 1. In the storage account menu, in the **Security + networking** section, select **Microsoft Defender for Cloud**. 1. **On-upload Malware Scanning** and **Sensitive data threat detection** are enabled by default. You can disable the features by unselecting them.
The override setting is usually used for the following scenarios:
To override Defender for Storage subscription-level settings to configure settings that are different from the settings that are configured on the subscription-level using the Azure portal:
-1. Sign in to the [Azure portal](https://portal.azure.com/)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your storage account that you want to configure custom settings.
To override Defender for Storage subscription-level settings to configure settin
} 1. Make sure you add the parameter `overrideSubscriptionLevelSettings` and set its value to `true`. This ensures that the settings are saved only for this storage account and will not be overridden by the subscription settings.-
storage Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Storage description: Lists Azure Policy Regulatory Compliance controls available for Azure Storage. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
storage Storage Plan Manage Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-plan-manage-costs.md
When you use cost analysis, you can view Azure Storage costs in graphs and table
To view Azure Storage costs in cost analysis:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Open the **Cost Management + Billing** window, select **Cost management** from the menu and then select **Cost analysis**. You can then change the scope for a specific subscription from the **Scope** dropdown.
storage Container Storage Aks Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/container-storage/container-storage-aks-quickstart.md
Azure Container Storage is a separate service from AKS, so you'll need to grant
# [Azure portal](#tab/portal)
-1. Sign into the [Azure portal](https://portal.azure.com?azure-portal=true), and search for and select **Kubernetes services**.
+1. Sign in to the [Azure portal](https://portal.azure.com?azure-portal=true), and search for and select **Kubernetes services**.
1. Locate and select your AKS cluster. Select **Settings** > **Properties** from the left navigation. 1. Under **Infrastructure resource group**, you should see a link to the resource group that AKS created when you created the cluster. Select it. 1. Select **Access control (IAM)** from the left pane.
storage Container Storage Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/container-storage/container-storage-faq.md
* <a id="azure-container-storage-delete-aks-resource-group"></a> **I've created an Elastic SAN storage pool, and I'm trying to delete my resource group where my AKS cluster is located and it's not working. Why?**
- Sign into the [Azure portal](https://portal.azure.com?azure-portal=true) and select **Resource groups**. Locate the resource group that AKS created (the resource group name starts with **MC_**). Select the SAN resource object within that resource group. Manually remove all volumes and volume groups. Then retry deleting the resource group that includes your AKS cluster.
+ Sign in to the [Azure portal](https://portal.azure.com?azure-portal=true) and select **Resource groups**. Locate the resource group that AKS created (the resource group name starts with **MC_**). Select the SAN resource object within that resource group. Manually remove all volumes and volume groups. Then retry deleting the resource group that includes your AKS cluster.
* <a id="azure-container-storage-autoupgrade"></a> **Is there any performance impact when upgrading to a new version of Azure Container Storage?**
storage Use Container Storage With Elastic San https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/container-storage/use-container-storage-with-elastic-san.md
When the storage pool is created, Azure Container Storage will create a storage
Next, you must assign the [Contributor](../../role-based-access-control/built-in-roles.md#contributor) Azure RBAC built-in role to the AKS managed identity on your Azure Elastic SAN Preview subscription. You'll need an [Owner](../../role-based-access-control/built-in-roles.md#owner) role for your Azure subscription in order to do this. If you don't have sufficient permissions, ask your admin to perform these steps.
-1. Sign into the [Azure portal](https://portal.azure.com?azure-portal=true).
+1. Sign in to the [Azure portal](https://portal.azure.com?azure-portal=true).
1. Select **Subscriptions**, and locate and select the subscription associated with the Azure Elastic SAN Preview resource that Azure Container Storage created on your behalf. This will likely be the same subscription as the AKS cluster that Azure Container Storage is installed on. You can verify this by locating the Elastic SAN resource in the resource group that AKS created (`MC_YourResourceGroup_YourAKSClusterName_Region`). 1. Select **Access control (IAM)** from the left pane. 1. Select **Add > Add role assignment**.
kubectl delete sp -n acstor <storage-pool-name>
## See also - [What is Azure Container Storage?](container-storage-introduction.md)-- [What is Azure Elastic SAN? Preview](../elastic-san/elastic-san-introduction.md)
+- [What is Azure Elastic SAN? Preview](../elastic-san/elastic-san-introduction.md)
storage Elastic San Connect Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/elastic-san/elastic-san-connect-linux.md
Last updated 04/24/2023 -+ # Connect to Elastic SAN Preview volumes - Linux
storage File Sync Networking Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/file-sync/file-sync-networking-endpoints.md
Last updated 04/26/2023 -+ # Configuring Azure File Sync network endpoints
storage Files Nfs Protocol https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/files-nfs-protocol.md
Last updated 11/15/2022 -+ # NFS file shares in Azure Files
storage Geo Redundant Storage For Large File Shares https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/geo-redundant-storage-for-large-file-shares.md
description: Azure Files geo-redundancy for large file shares (preview) signific
Previously updated : 05/24/2023 Last updated : 07/21/2023
Azure Files geo-redundancy for large file shares preview is currently available
- Australia Central 2 - Australia East - Australia Southeast
+- Central India
- Central US - China East 2 - China East 3
Azure Files geo-redundancy for large file shares preview is currently available
- Norway West - South Africa North - South Africa West
+- South India
- Southeast Asia - Sweden Central - Sweden South
+- Switzerland North
+- Switzerland West
- UAE Central - UAE North - UK South - UK West - West Central US
+- West India
- West US 2 ## Pricing
storage Storage Files Enable Soft Delete https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/storage-files-enable-soft-delete.md
The following sections show how to enable and use soft delete for Azure file sha
# [Portal](#tab/azure-portal)
-1. Sign into the [Azure portal](https://portal.azure.com/).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your storage account and select **File shares** under **Data storage**. 1. Select **Enabled** next to **Soft delete**. 1. Select **Enabled** for **Soft delete for all file shares**.
storage Storage Files Quick Create Use Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/storage-files-quick-create-use-linux.md
Title: Tutorial - Create an NFS Azure file share and mount it on a Linux VM usin
description: This tutorial covers how to use the Azure portal to deploy a Linux virtual machine, create an Azure file share using the NFS protocol, and mount the file share so that it's ready to store files. + Last updated 10/21/2022
storage Storage How To Use Files Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/storage-how-to-use-files-linux.md
Title: Mount SMB Azure file share on Linux
description: Learn how to mount an Azure file share over SMB on Linux and review SMB security considerations on Linux clients. + Last updated 01/10/2023
storage Authorize Data Operations Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/queues/authorize-data-operations-cli.md
For details about the permissions required for each Azure Storage operation on a
### Example: Authorize an operation to create a queue with Azure AD credentials
-The following example shows how to create a queue from Azure CLI using your Azure AD credentials. To create the queue, you'll need to log in to the Azure CLI, and you'll need a resource group and a storage account.
+The following example shows how to create a queue from Azure CLI using your Azure AD credentials. To create the queue, you'll need to sign in to the Azure CLI, and you'll need a resource group and a storage account.
1. Before you create the queue, assign the [Storage Queue Data Contributor](../../role-based-access-control/built-in-roles.md#storage-queue-data-contributor) role to yourself. Even though you are the account owner, you need explicit permissions to perform data operations against the storage account. For more information about assigning Azure roles, see [Assign an Azure role for access to queue data](assign-azure-role-data-access.md).
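The flow above can be sketched with the Azure CLI (a hedged sketch, not the article's exact commands; every `<...>` value is a placeholder, and role assignments can take a few minutes to propagate):

```azurecli
# Step 1: assign the Storage Queue Data Contributor role to yourself,
# scoped to the storage account. Data operations require an explicit
# data-plane role even for the account owner.
az role assignment create \
    --assignee "<your-sign-in-email>" \
    --role "Storage Queue Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# Step 2: create the queue, authorizing with your Azure AD credentials.
az storage queue create \
    --name <queue-name> \
    --account-name <storage-account> \
    --auth-mode login
```

The `--auth-mode login` flag tells the CLI to authorize the data operation with your Azure AD token instead of an account key.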
stream-analytics Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Stream Analytics description: Lists Azure Policy Regulatory Compliance controls available for Azure Stream Analytics. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
stream-analytics Stream Analytics Build An Iot Solution Using Stream Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/stream-analytics-build-an-iot-solution-using-stream-analytics.md
There are several resources that can easily be deployed in a resource group toge
### Review the Azure Stream Analytics TollApp resources
-1. Sign in to the Azure portal
+1. Sign in to the Azure portal.
2. Locate the Resource Group that you named in the previous section.
stream-analytics Stream Analytics Clean Up Your Job https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/stream-analytics-clean-up-your-job.md
Azure Stream Analytics jobs can be easily stopped or deleted through the Azure p
When you stop a job, the resources are deprovisioned and it stops processing events. Charges related to this job are also stopped. However, all your configuration is kept, and you can restart the job later.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Locate your running Stream Analytics job and select it.
synapse-analytics Quickstart Copy Activity Load Sql Pool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/quickstart-copy-activity-load-sql-pool.md
In this quickstart, you learn how to *load data from Azure SQL Database into Azu
After your Synapse workspace is created, you have two ways to open Synapse Studio:
-* Open your Synapse workspace in the [Azure portal](https://portal.azure.com/#home). Select **Open** on the Open Synapse Studio card under Getting started.
+* Open your Synapse workspace in the [Azure portal](https://portal.azure.com). Select **Open** on the Open Synapse Studio card under **Getting started**.
* Open [Azure Synapse Analytics](https://web.azuresynapse.net/) and sign in to your workspace. In this quickstart, we use the workspace named "adftest2020" as an example. It will automatically navigate you to the Synapse Studio home page.
synapse-analytics Quickstart Data Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/quickstart-data-flow.md
In this quickstart, you do the following steps:
After your Azure Synapse workspace is created, you have two ways to open Synapse Studio:
-* Open your Synapse workspace in the [Azure portal](https://portal.azure.com/#home). Select **Open** on the Open Synapse Studio card under Getting started.
+* Open your Synapse workspace in the [Azure portal](https://portal.azure.com). Select **Open** on the Open Synapse Studio card under **Getting started**.
* Open [Azure Synapse Analytics](https://web.azuresynapse.net/) and sign in to your workspace. In this quickstart, we use the workspace named "adftest2020" as an example. It will automatically navigate you to the Synapse Studio home page.
synapse-analytics Quickstart Transform Data Using Spark Job Definition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md
In this quickstart, you'll use Azure Synapse Analytics to create a pipeline usin
After your Azure Synapse workspace is created, you have two ways to open Synapse Studio:
-* Open your Synapse workspace in the [Azure portal](https://portal.azure.com/#home). Select **Open** on the Open Synapse Studio card under Getting started.
+* Open your Synapse workspace in the [Azure portal](https://portal.azure.com). Select **Open** on the Open Synapse Studio card under **Getting started**.
* Open [Azure Synapse Analytics](https://web.azuresynapse.net/) and sign in to your workspace. In this quickstart, we use the workspace named "sampletest" as an example. It will automatically navigate you to the Synapse Studio home page.
synapse-analytics Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Synapse Analytics description: Lists Azure Policy Regulatory Compliance controls available for Azure Synapse Analytics. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
synapse-analytics Synapse Spark Sql Pool Import Export https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/spark/synapse-spark-sql-pool-import-export.md
Connect to the Synapse Dedicated SQL Pool database and run following setup state
#### Azure Active Directory based authentication
-Azure Active Directory based authentication is an integrated authentication approach. The user is required to successfully log in to the Azure Synapse Analytics Workspace.
+Azure Active Directory based authentication is an integrated authentication approach. The user is required to successfully sign in to the Azure Synapse Analytics Workspace.
#### Basic authentication
traffic-manager Traffic Manager Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/traffic-manager/traffic-manager-diagnostic-logs.md
If you choose to install and use PowerShell locally, this article requires the A
To access log files, follow these steps.
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Navigate to your Azure Storage account in the portal. 2. On the left pane of your Azure storage account, under **Data Storage** select **Containers**. 3. For **Containers**, select **$logs**, and navigate down to the PT1H.json file and select **Download** to download and save a copy of this log file.
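Once downloaded, an hourly PT1H.json log file typically holds one JSON record per line. A minimal Python sketch for reading those records (the function name and the `category` field used below are illustrative, not a documented schema):

```python
import json

def load_pt1h_records(path):
    """Parse an hourly PT1H.json log file: one JSON record per line."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records
```

From here you can filter records by fields such as `category` or `time` before loading them into an analysis tool.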
update-center Deploy Updates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/update-center/deploy-updates.md
Update management center (preview) is available in all [Azure public regions](su
To install one time updates on a single VM, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In **Update management center (preview)**, **Overview**, choose your **Subscription** and select **One-time update** to install updates.
To install one time updates on a single VM, follow these steps:
## Install updates at scale
-
+ To create a new update deployment for multiple machines, follow these steps: >[!NOTE]
You can schedule updates
# [From Overview blade](#tab/install-scale-overview)
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. In **Update management center (Preview)**, **Overview**, choose your **Subscription**, select **One-time update**, and **Install now** to install updates.
The **Machines** displays a list of machines for which you can deploy one-time u
# [From Machines blade](#tab/install-scale-machines)
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to **Machines**, select your subscription and choose your machines. You can choose **Select all** to select all the machines.
update-center Manage Multiple Machines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/update-center/manage-multiple-machines.md
Instead of performing these actions from a selected Azure VM or Arc-enabled serv
## View update management center (Preview) status
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. To view update assessment across all machines, including Azure Arc-enabled servers, navigate to **Update management center (Preview)**.
update-center Prerequsite For Schedule Patching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/update-center/prerequsite-for-schedule-patching.md
You can select the patch orchestration option for new VMs that would be associat
To update the patch mode, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to **Virtual machine**, and select **+Create** to open the *Create a virtual machine* page. 1. In the **Basics** tab, complete all the mandatory fields. 1. In the **Management** tab, under **Guest OS updates**, for **Patch orchestration options**, select *Azure-orchestrated*.
You can update the patch orchestration option for existing VMs that either alrea
To update the patch mode, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to **Update management center (Preview)**, select **Update Settings**. 1. In **Change update settings**, select **+Add machine**. 1. In **Select resources**, select your VMs and then select **Add**.
You can select the patch orchestration option for new VMs that would be associat
To update the patch mode, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to **Virtual machine**, and select **+Create** to open the *Create a virtual machine* page. 1. In the **Basics** tab, complete all the mandatory fields. 1. In the **Management** tab, under **Guest OS updates**, for **Patch orchestration options**, select *Azure-orchestrated*.
To update the patch mode, follow these steps:
To update the patch mode, follow these steps:
-1. Sign in to the [Azure portal](https://portal.azure.com)
+1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to **Update management center (Preview)**, select **Update Settings**. 1. In **Change update settings**, select **+Add machine**. 1. In **Select resources**, select your VMs and then select **Add**.
virtual-desktop Configure Host Pool Load Balancing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/configure-host-pool-load-balancing.md
You can also configure load balancing with the Azure portal.
To configure load balancing:
-1. Sign into the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Azure portal](https://portal.azure.com).
2. Search for and select **Azure Virtual Desktop** under Services. 3. In the Azure Virtual Desktop page, select **Host pools**. 4. Select the name of the host pool you want to edit.
virtual-desktop Diagnostics Log Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/diagnostics-log-analytics.md
You can access Log Analytics workspaces on the Azure portal or Azure Monitor.
### Access Log Analytics on Azure Monitor
-1. Sign into the Azure portal
+1. Sign in to the Azure portal.
2. Search for and select **Monitor**.
virtual-desktop Screen Capture Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/screen-capture-protection.md
Title: Screen capture protection in Azure Virtual Desktop description: Learn how to enable screen capture protection in Azure Virtual Desktop (preview) to help prevent sensitive information from being captured on client endpoints.-+ Previously updated : 01/27/2023- Last updated : 07/21/2023+
-# Screen capture protection in Azure Virtual Desktop
+# Enable screen capture protection in Azure Virtual Desktop
-Screen capture protection, alongside [watermarking](watermarking.md), helps prevent sensitive information from being captured on client endpoints. When you enable screen capture protection, remote content will be automatically blocked or hidden in screenshots and screen sharing. Also, the Remote Desktop client will hide content from malicious software that may be capturing the screen.
+Screen capture protection, alongside [watermarking](watermarking.md), helps prevent sensitive information from being captured on client endpoints through a specific set of operating system (OS) features and Application Programming Interfaces (APIs). When you enable screen capture protection, remote content is automatically blocked in screenshots and screen sharing.
-In Windows 11, version 22H2 or later, you can enable screen capture protection on session host VMs as well as remote clients. Protection on session host VMs works just like protection for remote clients.
+There are two supported scenarios for screen capture protection, depending on the version of Windows you're using:
+
+- **Block screen capture on client**: the session host instructs a supported Remote Desktop client to enable screen capture protection for a remote session. This prevents screen capture from the client of applications running in the remote session.
+
+- **Block screen capture on client and server**: the session host instructs a supported Remote Desktop client to enable screen capture protection for a remote session. This prevents screen capture from the client of applications running in the remote session, but also prevents tools and services within the session host from capturing the screen.
+
+When screen capture protection is enabled, users can't share their Remote Desktop window using local collaboration software, such as Microsoft Teams. With Teams, neither the local Teams app nor [Teams with media optimization](teams-on-avd.md) can share protected content.
+
+> [!TIP]
+> - To increase the security of your sensitive information, you should also disable clipboard, drive, and printer redirection. Disabling redirection helps prevent users from copying content from the remote session. To learn about supported redirection values, see [Device redirection](rdp-properties.md#device-redirection).
+>
+> - To discourage other methods of screen capture, such as taking a photo of a screen with a physical camera, you can enable [watermarking](watermarking.md), where admins can use a QR code to trace the session.
## Prerequisites
-Screen capture protection is configured on the session host level and enforced on the client. Only clients that support this feature can connect to the remote session.
+- Your session hosts must be running one of the following versions of Windows to use screen capture protection:
-You must connect to Azure Virtual Desktop with one of the following clients to use support screen capture protection:
+ - **Block screen capture on client** is available with a [supported version of Windows 10 or Windows 11](prerequisites.md#operating-systems-and-licenses).
+ - **Block screen capture on client and server** is available starting with Windows 11, version 22H2.
-- The Remote Desktop client for Windows and the Azure Virtual Desktop Store app support screen capture protection for full desktops. You can also use them with RemoteApps when using the client on Windows 11, version 22H2 or later.-- The Remote Desktop client for macOS (version 10.7.0 or later) supports screen capture protection for both RemoteApps and full desktops.
+- Users must connect to Azure Virtual Desktop with one of the following Remote Desktop clients to use screen capture protection. If a user tries to connect with a different client or version, the connection is denied and shows an error message with the code `0x1151`.
-## Configure screen capture protection
+ | Client | Client version | Desktop session | RemoteApp session |
+ |--|--|--|--|
+ | Remote Desktop client for Windows | 1.2.1672 or later | Yes | Yes. Client device OS must be Windows 11, version 22H2 or later. |
+ | Azure Virtual Desktop Store app | Any | Yes | Yes. Client device OS must be Windows 11, version 22H2 or later. |
+ | Remote Desktop client for macOS | 10.7.0 or later | Yes | Yes |
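The support matrix above reduces to a simple check per client, version, and session kind. A rough sketch (the function name, client keys, and the use of tuple comparison for dotted versions are this sketch's own conventions, not an Azure Virtual Desktop API):

```python
def supports_screen_capture_protection(client, version, session_kind,
                                       client_os_win11_22h2=False):
    """Rough sketch of the client support matrix for screen capture protection.

    Minimum versions and the Windows 11 22H2 requirement for RemoteApp
    mirror the table; tuple comparison handles dotted version strings.
    """
    def ver(v):
        return tuple(int(p) for p in v.split("."))

    minimums = {
        "windows": "1.2.1672",   # Remote Desktop client for Windows
        "store-app": "0",        # Azure Virtual Desktop Store app: any version
        "macos": "10.7.0",       # Remote Desktop client for macOS
    }
    if client not in minimums or ver(version) < ver(minimums[client]):
        return False  # unsupported client or version: connection is denied (0x1151)
    if session_kind == "remoteapp" and client in ("windows", "store-app"):
        return client_os_win11_22h2  # RemoteApp needs Windows 11, version 22H2
    return True
```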
-To configure screen capture protection:
+## Enable screen capture protection
-1. Download the [Azure Virtual Desktop policy templates file](https://aka.ms/avdgpo) (*AVDGPTemplate.cab*). You can use File Explorer to open *AVDGPTemplate.cab*, then extract the zip archive inside the *AVDGPTemplate.cab* file to a temporary location.
-2. Copy the **terminalserver-avd.admx** file to the **%windir%\PolicyDefinitions** folder.
-3. Copy the **en-us\terminalserver-avd.adml** file to the **%windir%\PolicyDefinitions\en-us** folder.
-4. To confirm the files copied correctly, open the Group Policy Editor and go to **Computer Configuration** > **Administrative Templates** > **Windows Components** > **Remote Desktop Services** > **Remote Desktop Session Host** > **Azure Virtual Desktop**. You should see one or more Azure Virtual Desktop policies, as shown in the following screenshot.
+Screen capture protection is configured on session hosts and enforced by the client. You configure the settings by using Intune or Group Policy.
- :::image type="content" source="media/administrative-template/azure-virtual-desktop-gpo.png" alt-text="Screenshot of the group policy editor." lightbox="media/administrative-template/azure-virtual-desktop-gpo.png":::
+To configure screen capture protection:
- > [!TIP]
- > You can also install administrative templates to the group policy Central Store in your Active Directory domain.
- > For more information, see [How to create and manage the Central Store for Group Policy Administrative Templates in Windows](/troubleshoot/windows-client/group-policy/create-and-manage-central-store).
+1. Follow the steps to make the [Administrative template for Azure Virtual Desktop](administrative-template.md) available.
-5. Open the **"Enable screen capture protection"** policy and set it to **"Enabled"**.
-6. To configure screen capture for client and server, set the **"Enable screen capture protection"** policy to **"Block Screen capture on client and server"**. By default, the policy will be set to **"Block Screen capture on client"**.
+1. Once you've verified that the administrative template is available, open the policy setting **Enable screen capture protection** and set it to **Enabled**.
- >[!NOTE]
- >You can only use screen capture protection on session host VMs that use Windows 11, version 22H2 or later.
+1. From the drop-down menu, select the screen capture protection scenario you want to use from **Block screen capture on client** or **Block screen capture on client and server**.
-## Limitations and known issues
+1. Apply the policy settings to your session hosts by running a Group Policy update or Intune device sync.
-- If a user tries to connect to a capture-protected session host with an unsupported client, the connection won't work and will instead show an error message with the code `0x1151`.-- This feature protects the Remote Desktop window from being captured through a specific set of public operating system features and Application Programming Interfaces (APIs). However, there's no guarantee that this feature will strictly protect content in scenarios where a user were to take a photo of their screen with a physical camera.-- For maximum security, customers should use this feature while also disabling clipboard, drive, and printer redirection. Disabling redirection prevents users from copying any captured screen content from the remote session.-- Users can't share their Remote Desktop window using local collaboration software, such as Microsoft Teams, while this feature is enabled. When they use Microsoft Teams, neither the local Teams app nor Teams with media optimization can share protected content.
+1. Connect to a remote session with a supported client and test that screen capture protection is working by taking a screenshot or sharing your screen. The content should be blocked or hidden. Users in existing sessions need to sign out and sign in again for the change to take effect.
## Next steps
virtual-desktop Watermarking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/watermarking.md
You'll need the following things before you can use watermarking:
## Enable watermarking
-To enable watermarking, follow the steps below:
+To enable watermarking:
-1. Follow the steps to download and add the [Administrative template for Azure Virtual Desktop](administrative-template.md).
+1. Follow the steps to make the [Administrative template for Azure Virtual Desktop](administrative-template.md) available.
-1. Once you've verified that the Azure Virtual Desktop administrative template is available, open the policy setting **Enable watermarking** and set it to **Enabled**.
+1. Once you've verified that the administrative template is available, open the policy setting **Enable watermarking** and set it to **Enabled**.
1. You can configure the following options:
To enable watermarking, follow the steps below:
1. Apply the policy settings to your session hosts by running a Group Policy update or Intune device sync.
-1. Connect to a remote session, where you should see QR codes appear. For any changes you make to the policy and apply to the session host, you'll need to disconnect and reconnect to your remote session to see the difference.
+1. Connect to a remote session with a supported client, where you should see QR codes appear. Users in existing sessions need to sign out and sign in again for the change to take effect.
## Find session information
virtual-desktop Whats New Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/whats-new-agent.md
Title: What's new in the Azure Virtual Desktop Agent? - Azure
description: New features and product updates for the Azure Virtual Desktop Agent. Previously updated : 07/12/2023 Last updated : 07/21/2023
New versions of the Azure Virtual Desktop Agent are installed automatically. Whe
A rollout may take several weeks before the agent is available in all environments. Some agent versions may not reach non-validation environments, so you may see multiple versions of the agent deployed across your environments.
+## Version 1.0.6713.1603
+
+This update was released at the end of July 2023 and includes the following changes:
+
+- Security improvements and bug fixes.
+ ## Version 1.0.7033.900 This update was released at the beginning of July 2023 and includes the following changes:
virtual-desktop Whats New Client Windows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/whats-new-client-windows.md
description: Learn about recent changes to the Remote Desktop client for Windows
Previously updated : 07/11/2023 Last updated : 07/21/2023 # What's new in the Remote Desktop client for Windows
The following table lists the current versions available for the public and Insi
| Release | Latest version | Download | ||-|-| | Public | 1.2.4419 | [Windows 64-bit](https://go.microsoft.com/fwlink/?linkid=2139369) *(most common)*<br />[Windows 32-bit](https://go.microsoft.com/fwlink/?linkid=2139456)<br />[Windows ARM64](https://go.microsoft.com/fwlink/?linkid=2139370) |
-| Insider | 1.2.4485 | [Windows 64-bit](https://go.microsoft.com/fwlink/?linkid=2139233) *(most common)*<br />[Windows 32-bit](https://go.microsoft.com/fwlink/?linkid=2139144)<br />[Windows ARM64](https://go.microsoft.com/fwlink/?linkid=2139368) |
+| Insider | 1.2.4487 | [Windows 64-bit](https://go.microsoft.com/fwlink/?linkid=2139233) *(most common)*<br />[Windows 32-bit](https://go.microsoft.com/fwlink/?linkid=2139144)<br />[Windows ARM64](https://go.microsoft.com/fwlink/?linkid=2139368) |
-## Updates for version 1.2.4485 (Insider)
+## Updates for version 1.2.4487 (Insider)
-*Date published: July 11, 2023*
+*Date published: July 21, 2023*
Download: [Windows 64-bit](https://go.microsoft.com/fwlink/?linkid=2139233), [Windows 32-bit](https://go.microsoft.com/fwlink/?linkid=2139144), [Windows ARM64](https://go.microsoft.com/fwlink/?linkid=2139368)
In this release, we've made the following changes:
- Added a new RDP file property called "allowed security protocols." This property restricts the list of security protocols the client can negotiate. - Fixed an issue where, in Azure Arc, Connection Information dialog gave inconsistent information about identity verification. - Added heading-level description to subscribe with URL.-- Improved client logging, diagnostics, and error classification to help admins troubleshoot connection and feed issues.
+- Improved client logging, diagnostics, and error classification to help admins troubleshoot connection and feed issues.
+- Fixed an issue where the Client doesn't auto-reconnect when the Gateway WebSocket connection shuts down normally.
## Updates for version 1.2.4419
virtual-machine-scale-sets Disk Encryption Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/disk-encryption-cli.md
Last updated 11/22/2022 --+ # Encrypt OS and attached data disks in a Virtual Machine Scale Set with the Azure CLI
virtual-machine-scale-sets Flexible Virtual Machine Scale Sets Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/flexible-virtual-machine-scale-sets-cli.md
Last updated 11/22/2022 -+ # Create virtual machines in a scale set using Azure CLI
virtual-machine-scale-sets Flexible Virtual Machine Scale Sets Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/flexible-virtual-machine-scale-sets-portal.md
Last updated 11/22/2022 -+ # Create virtual machines in a scale set using Azure portal
virtual-machine-scale-sets Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/quick-create-cli.md
Last updated 11/22/2022 -+ # Quickstart: Create a Virtual Machine Scale Set with the Azure CLI
virtual-machine-scale-sets Quick Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/quick-create-portal.md
Last updated 04/18/2023 -+ # Quickstart: Create a Virtual Machine Scale Set in the Azure portal
virtual-machine-scale-sets Tutorial Autoscale Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/tutorial-autoscale-cli.md
Last updated 12/16/2022 --+ # Tutorial: Automatically scale a Virtual Machine Scale Set with the Azure CLI When you create a scale set, you define the number of VM instances that you wish to run. As your application demand changes, you can automatically increase or decrease the number of VM instances. The ability to autoscale lets you keep up with customer demand or respond to application performance changes throughout the lifecycle of your app. In this tutorial you learn how to:
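The autoscale behavior this tutorial configures boils down to a decision rule evaluated against a metric: scale out above a high threshold, scale in below a low one, always staying within the configured instance bounds. A minimal sketch of that rule (the thresholds, step size, and function name here are illustrative defaults, not the service's actual values):

```python
def next_instance_count(current, avg_cpu, minimum=2, maximum=10,
                        out_threshold=70.0, in_threshold=30.0, step=1):
    """Sketch of an autoscale decision rule: scale out on high average CPU,
    scale in on low average CPU, clamped to the configured instance bounds."""
    if avg_cpu > out_threshold:
        current += step
    elif avg_cpu < in_threshold:
        current -= step
    return max(minimum, min(maximum, current))
```

In the real service, the equivalent rules are defined as autoscale settings on the scale set (for example with `az monitor autoscale rule create`), and the platform evaluates the metric over a time window rather than a single sample.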
virtual-machine-scale-sets Tutorial Create And Manage Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/tutorial-create-and-manage-cli.md
Last updated 12/16/2022 --+ # Tutorial: Create and manage a Virtual Machine Scale Set with the Azure CLI A Virtual Machine Scale Set allows you to deploy and manage a set of virtual machines. Throughout the lifecycle of a Virtual Machine Scale Set, you may need to run one or more management tasks. In this tutorial you learn how to:
virtual-machine-scale-sets Tutorial Modify Scale Sets Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/tutorial-modify-scale-sets-cli.md
Last updated 11/22/2022 -+ # Tutorial: Modify a Virtual Machine Scale Set using Azure CLI Throughout the lifecycle of your applications, you may need to modify or update your Virtual Machine Scale Set. These updates may include how to update the configuration of the scale set, or change the application configuration. This article describes how to modify an existing scale set using the Azure CLI.
virtual-machine-scale-sets Tutorial Use Custom Image Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/tutorial-use-custom-image-cli.md
Last updated 12/16/2022 -+ # Tutorial: Create and use a custom image for Virtual Machine Scale Sets with the Azure CLI
virtual-machine-scale-sets Virtual Machine Scale Sets Automatic Instance Repairs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-instance-repairs.md
Last updated 11/22/2022 --+ # Automatic instance repairs for Azure Virtual Machine Scale Sets
virtual-machine-scale-sets Virtual Machine Scale Sets Automatic Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade.md
+ Last updated 11/22/2022
virtual-machine-scale-sets Virtual Machine Scale Sets Use Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/virtual-machine-scale-sets-use-availability-zones.md
Last updated 11/22/2022 -+ # Create a Virtual Machine Scale Set that uses Availability Zones
virtual-machines Automatic Vm Guest Patching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/automatic-vm-guest-patching.md
Last updated 10/20/2021 -+ # Automatic VM guest patching for Azure VMs
virtual-machines Boot Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/boot-diagnostics.md
Title: Azure boot diagnostics
description: Overview of Azure boot diagnostics and managed boot diagnostics +
virtual-machines Compiling Scaling Applications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/compiling-scaling-applications.md
Title: Scaling HPC applications - Azure Virtual Machines | Microsoft Docs
description: Learn how to scale HPC applications on Azure VMs. + Last updated 04/11/2023
virtual-machines Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/configure.md
Title: Configuration and Optimization of InfiniBand enabled H-series and N-serie
description: Learn about configuring and optimizing the InfiniBand enabled H-series and N-series VMs for HPC. + Last updated 04/11/2023
virtual-machines Dedicated Hosts How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/dedicated-hosts-how-to.md
-+ Last updated 07/12/2023
virtual-machines Delete https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/delete.md
Last updated 05/09/2022 -+ # Delete a VM and attached resources
virtual-machines Disks Deploy Premium V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disks-deploy-premium-v2.md
Title: Deploy a Premium SSD v2 managed disk
-description: Learn how to deploy a Premium SSD v2.
+description: Learn how to deploy a Premium SSD v2 and about its regional availability.
Previously updated : 04/10/2023 Last updated : 07/21/2023
virtual-machines Disks Enable Customer Managed Keys Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disks-enable-customer-managed-keys-portal.md
The VM deployment process is similar to the standard deployment process, the onl
:::image type="content" source="media/virtual-machines-disk-encryption-portal/server-side-encryption-encrypt-existing-disk-customer-managed-key.png" alt-text="Screenshot of your example OS disk, the encryption pane is open, encryption at rest with a customer-managed key is selected, as well as your example Azure Key Vault." lightbox="media/virtual-machines-disk-encryption-portal/server-side-encryption-encrypt-existing-disk-customer-managed-key.png"::: 1. Repeat this process for any other disks attached to the VM you'd like to encrypt.
-1. When your disks finish switching over to customer-managed keys, if there are no there no other attached disks you'd like to encrypt, start your VM.
+1. When your disks finish switching over to customer-managed keys, if there are no other attached disks you'd like to encrypt, start your VM.
> [!IMPORTANT] > Customer-managed keys rely on managed identities for Azure resources, a feature of Azure Active Directory (Azure AD). When you configure customer-managed keys, a managed identity is automatically assigned to your resources under the covers. If you subsequently move the subscription, resource group, or managed disk from one Azure AD directory to another, the managed identity associated with the managed disks is not transferred to the new tenant, so customer-managed keys may no longer work. For more information, see [Transferring a subscription between Azure AD directories](../active-directory/managed-identities-azure-resources/known-issues.md#transferring-a-subscription-between-azure-ad-directories).
The VM deployment process is similar to the standard deployment process, the onl
- [Replicate machines with customer-managed keys enabled disks](../site-recovery/azure-to-azure-how-to-enable-replication-cmk-disks.md) - [Set up disaster recovery of VMware VMs to Azure with PowerShell](../site-recovery/vmware-azure-disaster-recovery-powershell.md#replicate-vmware-vms) - [Set up disaster recovery to Azure for Hyper-V VMs using PowerShell and Azure Resource Manager](../site-recovery/hyper-v-azure-powershell-resource-manager.md#step-7-enable-vm-protection)-- See [Create a managed disk from a snapshot with CLI](scripts/create-managed-disk-from-snapshot.md#disks-with-customer-managed-keys) for a code sample.
+- See [Create a managed disk from a snapshot with CLI](scripts/create-managed-disk-from-snapshot.md#disks-with-customer-managed-keys) for a code sample.
virtual-machines Dv2 Dsv2 Series Memory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/dv2-dsv2-series-memory.md
DSv2-series sizes run on the 3rd Generation Intel® Xeon® Platinum 8370C (Ice L
| Standard_DS14_v2 <sup>3</sup> | 16 | 112 | 224 | 64 | 64000/512 (576) | 51200/768 | 8|12000 | | Standard_DS15_v2 <sup>2</sup> | 20 | 140 | 280 | 64 | 80000/640 (720) | 64000/960 | 8|25000 <sup>4</sup> |
-<sup>1</sup> The maximum disk throughput (IOPS or MBps) possible with a DSv2 series VM may be limited by the number, size and striping of the attached disk(s). For details, see [Designing for high performance](./premium-storage-performance.md).
-<sup>2</sup> Instance is isolated to the Intel Haswell based hardware and dedicated to a single customer.
-<sup>3</sup> Constrained core sizes available.
-<sup>4</sup> 25000 Mbps with Accelerated Networking.<br>
+1. The maximum disk throughput (IOPS or MBps) possible with a DSv2-series VM may be limited by the number, size, and striping of the attached disks. For details, see [Designing for high performance](./premium-storage-performance.md).
+2. Instance is isolated to the Intel Haswell based hardware and dedicated to a single customer.
+3. Constrained core sizes available.
+4. 25000 Mbps with Accelerated Networking.<br>
[!INCLUDE [virtual-machines-common-sizes-table-defs](../../includes/virtual-machines-common-sizes-table-defs.md)]
virtual-machines Ephemeral Os Disks Deploy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/ephemeral-os-disks-deploy.md
Last updated 07/23/2020 -+ # How to deploy Ephemeral OS disks for Azure VMs
virtual-machines Ephemeral Os Disks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/ephemeral-os-disks.md
description: Learn more about ephemeral OS disks for Azure VMs.
+ Last updated 07/23/2020
virtual-machines Agent Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/agent-linux.md
-+ Last updated 03/28/2023- # Azure Linux VM Agent overview
virtual-machines Custom Script Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/custom-script-linux.md
-+ Last updated 03/31/2023
virtual-machines Diagnostics Linux V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/diagnostics-linux-v3.md
Last updated 12/13/2018 -+ ms.devlang: azurecli- # Use Linux diagnostic extension 3.0 to monitor metrics and logs
virtual-machines Diagnostics Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/diagnostics-linux.md
Last updated 04/04/2023-+ ms.devlang: azurecli- # Use the Linux diagnostic extension 4.0 to monitor metrics and logs
virtual-machines Enable Infiniband https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/enable-infiniband.md
Title: Enable InfiniBand on HPC VMs - Azure Virtual Machines | Microsoft Docs
description: Learn how to enable InfiniBand on Azure HPC VMs. + Last updated 04/12/2023
virtual-machines Extensions Rmpolicy Howto Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/extensions-rmpolicy-howto-cli.md
description: Use Azure Policy to restrict VM extension deployments.
-+
virtual-machines Extensions Rmpolicy Howto Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/extensions-rmpolicy-howto-ps.md
Last updated 04/11/2023 --+ # Use Azure Policy to restrict extensions installation on Windows VMs
virtual-machines Features Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/features-linux.md
Last updated 05/24/2022-+ # Virtual machine extensions and features for Linux
virtual-machines Hpc Compute Infiniband Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/hpc-compute-infiniband-linux.md
vm-linux Last updated 04/21/2023-+
virtual-machines Hpccompute Gpu Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/hpccompute-gpu-linux.md
vm-linux+ Last updated 07/19/2023
virtual-machines Key Vault Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/key-vault-linux.md
Last updated 12/02/2019 -+ # Key Vault virtual machine extension for Linux
virtual-machines Network Watcher Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/network-watcher-linux.md
Last updated 06/29/2023-+ # Network Watcher Agent virtual machine extension for Linux
az vm extension show --name NetworkWatcherAgentLinux --resource-group myResource
### Support If you need more help at any point in this article, you can refer to the [Network Watcher documentation](../../network-watcher/index.yml), or contact the Azure experts on the [MSDN Azure and Stack Overflow forums](https://azure.microsoft.com/support/forums/). Alternatively, you can file an Azure support incident. Go to the [Azure support site](https://azure.microsoft.com/support/options/) and select **Get support**. For information about using Azure Support, see the [Microsoft Azure support FAQ](https://azure.microsoft.com/support/faq/).-
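The `az vm extension show` snippet above is truncated in this digest; a fuller form can be sketched as follows. The resource group and VM names are hypothetical placeholders, and the command is echoed rather than executed because a live run needs an authenticated Azure CLI session:

```shell
# Hypothetical names: substitute your own resource group and VM.
RG="myResourceGroup"
VM="myLinuxVM"

# Query the provisioning state of the Network Watcher agent extension.
CMD="az vm extension show --name NetworkWatcherAgentLinux --resource-group $RG --vm-name $VM --query provisioningState"

# Echo instead of running, since a live run needs `az login` first.
echo "$CMD"
```

In a logged-in session, running the command directly typically reports `Succeeded` once the extension is provisioned.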
virtual-machines Stackify Retrace Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/stackify-retrace-linux.md
Last updated 04/12/2018 -+ ms.devlang: azurecli # Stackify Retrace Linux Agent Extension
virtual-machines Update Linux Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/update-linux-agent.md
description: Learn how to update Azure Linux Agent for your Linux VM in Azure
+ Last updated 02/03/2023- # How to update the Azure Linux Agent on a VM
virtual-machines Hb Hc Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hb-hc-known-issues.md
Title: Troubleshooting known issues with HPC and GPU VMs - Azure Virtual Machine
description: Learn about troubleshooting known issues with HPC and GPU VM sizes in Azure. + Last updated 03/10/2023
virtual-machines Hb Series Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hb-series-overview.md
Title: HB-series VM overview - Azure Virtual Machines | Microsoft Docs
description: Learn about the preview support for the HB-series VM size in Azure. + Last updated 04/20/2023
virtual-machines Hbv2 Performance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hbv2-performance.md
+ Last updated 03/04/2023
virtual-machines Hbv2 Series Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hbv2-series-overview.md
Title: HBv2-series VM overview - Azure Virtual Machines | Microsoft Docs
description: Learn about the HBv2-series VM size in Azure. tags: azure-resource-manager +
virtual-machines Hbv3 Performance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hbv3-performance.md
+ Last updated 03/04/2023
virtual-machines Hbv3 Series Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hbv3-series-overview.md
Title: HBv3-series VM overview, architecture, topology - Azure Virtual Machines
description: Learn about the HBv3-series VM size in Azure. tags: azure-resource-manager +
virtual-machines Hc Series Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/hc-series-overview.md
Title: HC-series VM overview - Azure Virtual Machines| Microsoft Docs
description: Learn about the preview support for the HC-series VM size in Azure. + Last updated 04/18/2023
virtual-machines How To Enable Write Accelerator https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/how-to-enable-write-accelerator.md
Last updated 04/11/2023 -+ # Enable Write Accelerator
virtual-machines Instance Metadata Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/instance-metadata-service.md
+ Last updated 04/11/2023
virtual-machines Linux Vm Connect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux-vm-connect.md
+ Last updated 04/06/2023 - # Connect to a Linux VM
virtual-machines Add Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/add-disk.md
Title: Add a data disk to Linux VM using the Azure CLI
description: Learn to add a persistent data disk to your Linux VM with the Azure CLI -+ Last updated 01/09/2023
virtual-machines Attach Disk Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/attach-disk-portal.md
Title: Attach a data disk to a Linux VM
description: Use the portal to attach new or existing data disk to a Linux VM. + Last updated 01/09/2023 - # Use the portal to attach a data disk to a Linux VM
virtual-machines Azure Dns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/azure-dns.md
description: Name Resolution scenarios for Linux virtual machines in Azure IaaS,
+ Last updated 04/11/2023 - # DNS Name Resolution options for Linux virtual machines in Azure
virtual-machines Azure Hybrid Benefit Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/azure-hybrid-benefit-linux.md
Last updated 05/02/2023 -+ # Azure Hybrid Benefit for Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server (SLES) virtual machines
virtual-machines Build Image With Packer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/build-image-with-packer.md
+ Last updated 04/11/2023
virtual-machines Cli Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cli-manage.md
Title: Common Azure CLI commands
description: Learn some of the common Azure CLI commands to get you started managing your VMs in Azure Resource Manager mode -+ Last updated 04/11/2023
virtual-machines Cli Ps Findimage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cli-ps-findimage.md
Last updated 02/09/2023
-+ # Find Azure Marketplace image information using the Azure CLI
virtual-machines Cloud Init Deep Dive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloud-init-deep-dive.md
Last updated 03/29/2023
+ # Diving deeper into cloud-init
virtual-machines Cloud Init Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloud-init-troubleshooting.md
Last updated 03/29/2023
+ # Troubleshooting VM provisioning with cloud-init
virtual-machines Cloudinit Add User https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloudinit-add-user.md
Last updated 03/29/2022 -+ # Use cloud-init to add a user to a Linux VM in Azure
virtual-machines Cloudinit Bash Script https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloudinit-bash-script.md
Last updated 03/29/2023 -+ # Use cloud-init to run a bash script in a Linux VM in Azure
virtual-machines Cloudinit Configure Swapfile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloudinit-configure-swapfile.md
Last updated 03/29/2023 -+ # Use cloud-init to configure a swap partition on a Linux VM
virtual-machines Cloudinit Update Vm Hostname https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloudinit-update-vm-hostname.md
Last updated 03/29/2023 -+ # Use cloud-init to set hostname for a Linux VM in Azure
virtual-machines Cloudinit Update Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/cloudinit-update-vm.md
Last updated 03/29/2023 -+ # Use cloud-init to update and install packages in a Linux VM in Azure
virtual-machines Create Cli Complete https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/create-cli-complete.md
Title: Create a Linux environment with the Azure CLI
description: Create storage, a Linux VM, a virtual network and subnet, a load balancer, an NIC, a public IP, and a network security group, all from the ground up by using the Azure CLI. -+ Last updated 3/29/2023
virtual-machines Create Upload Centos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/create-upload-centos.md
Title: Create and upload a CentOS-based Linux VHD
description: Learn to create and upload an Azure virtual hard disk (VHD) that contains a CentOS-based Linux operating system. + Last updated 12/14/2022
virtual-machines Create Upload Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/create-upload-generic.md
Title: Prepare Linux for imaging
description: Learn to prepare a Linux system to be used for an image in Azure. + Last updated 12/14/2022
virtual-machines Create Upload Openbsd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/create-upload-openbsd.md
Title: Create and upload an OpenBSD image
description: Learn how to create and upload a virtual hard disk (VHD) that contains the OpenBSD operating system to create an Azure virtual machine through Azure CLI -+ Last updated 05/24/2017
virtual-machines Create Upload Ubuntu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/create-upload-ubuntu.md
Title: Create and upload an Ubuntu Linux VHD in Azure
description: Learn to create and upload an Azure virtual hard disk (VHD) that contains an Ubuntu Linux operating system. + Last updated 07/28/2021 - # Prepare an Ubuntu virtual machine for Azure
virtual-machines Detach Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/detach-disk.md
Last updated 01/09/2023 --+ # How to detach a data disk from a Linux virtual machine
virtual-machines Disable Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disable-provisioning.md
-+ Last updated 04/11/2023
virtual-machines Disk Encryption Cli Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-cli-quickstart.md
Last updated 03/29/2023-+ # Quickstart: Create and encrypt a Linux VM with the Azure CLI
virtual-machines Disk Encryption Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-linux.md
Last updated 07/07/2023-+ # Azure Disk Encryption scenarios on Linux VMs
In all cases, you should [take a snapshot](snapshot-copy-managed-disk.md) and/or
>[!WARNING] > - If you have previously used Azure Disk Encryption with Azure AD to encrypt a VM, you must continue use this option to encrypt your VM. See [Azure Disk Encryption with Azure AD (previous release)](disk-encryption-overview-aad.md) for details. >
-> - When encrypting Linux OS volumes, the VM should be considered unavailable. We strongly recommend to avoid SSH logins while the encryption is in progress to avoid issues blocking any open files that will need to be accessed during the encryption process. To check progress, use the the [Get-AzVMDiskEncryptionStatus](/powershell/module/az.compute/get-azvmdiskencryptionstatus) PowerShell cmdlet or the [vm encryption show](/cli/azure/vm/encryption#az-vm-encryption-show) CLI command. This process can be expected to take a few hours for a 30GB OS volume, plus additional time for encrypting data volumes. Data volume encryption time will be proportional to the size and quantity of the data volumes unless the encrypt format all option is used.
+> - When encrypting Linux OS volumes, the VM should be considered unavailable. We strongly recommend avoiding SSH logins while encryption is in progress, so that open files that need to be accessed during the encryption process aren't blocked. To check progress, use the [Get-AzVMDiskEncryptionStatus](/powershell/module/az.compute/get-azvmdiskencryptionstatus) PowerShell cmdlet or the [vm encryption show](/cli/azure/vm/encryption#az-vm-encryption-show) CLI command. Expect this process to take a few hours for a 30-GB OS volume, plus additional time for encrypting data volumes. Data volume encryption time is proportional to the size and quantity of the data volumes unless the encrypt format all option is used.
> - Disabling encryption on Linux VMs is only supported for data volumes. It is not supported on data or OS volumes if the OS volume has been encrypted. ## Install tools and connect to Azure
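The progress check described in the warning above can be scripted from the CLI. A minimal sketch with hypothetical resource names (the command is echoed, not executed, because it needs an authenticated Azure CLI session):

```shell
# Hypothetical names: substitute your own resource group and VM.
RG="myResourceGroup"
VM="myLinuxVM"

# Show overall encryption status; progress messages appear while encryption runs.
STATUS_CMD="az vm encryption show --resource-group $RG --name $VM --query status"

echo "$STATUS_CMD"
```

The PowerShell equivalent mentioned above is `Get-AzVMDiskEncryptionStatus -ResourceGroupName $RG -VMName $VM`.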
Azure Disk Encryption does not work for the following Linux scenarios, features,
- Encrypting VMs in failover clusters. - Encryption of [Azure ultra disks](../disks-enable-ultra-ssd.md). - Encryption of [Premium SSD v2 disks](../disks-types.md#premium-ssd-v2-limitations).-- Encryption of VMs in subscriptions that have the [Secrets should have the specified maximum validity period](https://ms.portal.azure.com/#view/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F342e8053-e12e-4c44-be01-c3c2f318400f) policy enabled with the [DENY effect](../../governance/policy/concepts/effects.md).
+- Encryption of VMs in subscriptions that have the [Secrets should have the specified maximum validity period](https://portal.azure.com/#view/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F342e8053-e12e-4c44-be01-c3c2f318400f) policy enabled with the [DENY effect](../../governance/policy/concepts/effects.md).
## Next steps
virtual-machines Disk Encryption Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-overview.md
Last updated 06/14/2023--+ # Azure Disk Encryption for Linux VMs
virtual-machines Disk Encryption Portal Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-portal-quickstart.md
Last updated 01/04/2023-+ # Quickstart: Create and encrypt a virtual machine with the Azure portal
virtual-machines Disk Encryption Powershell Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-powershell-quickstart.md
Last updated 01/04/2023-+ # Quickstart: Create and encrypt a Linux VM in Azure with Azure PowerShell
virtual-machines Disk Encryption Sample Scripts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-sample-scripts.md
Last updated 03/29/2023-+ # Azure Disk Encryption sample scripts for Linux VMs
virtual-machines Disk Encryption Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disk-encryption-upgrade.md
Last updated 05/27/2021-+ # Upgrading the Azure Disk Encryption version
virtual-machines Disks Enable Customer Managed Keys Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disks-enable-customer-managed-keys-cli.md
Last updated 05/03/2023
-+ # Use the Azure CLI to enable server-side encryption with customer-managed keys for managed disks
az disk-encryption-set update -n keyrotationdes -g keyrotationtesting --key-url
- [Replicate machines with customer-managed keys enabled disks](../../site-recovery/azure-to-azure-how-to-enable-replication-cmk-disks.md) - [Set up disaster recovery of VMware VMs to Azure with PowerShell](../../site-recovery/vmware-azure-disaster-recovery-powershell.md#replicate-vmware-vms) - [Set up disaster recovery to Azure for Hyper-V VMs using PowerShell and Azure Resource Manager](../../site-recovery/hyper-v-azure-powershell-resource-manager.md#step-7-enable-vm-protection)-- See [Create a managed disk from a snapshot with CLI](../scripts/create-managed-disk-from-snapshot.md#disks-with-customer-managed-keys) for a code sample.
+- See [Create a managed disk from a snapshot with CLI](../scripts/create-managed-disk-from-snapshot.md#disks-with-customer-managed-keys) for a code sample.
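The truncated `az disk-encryption-set update` snippet above omits the key URL. A sketch of the full shape, with a hypothetical Key Vault name and a placeholder key version (echoed rather than executed, since it needs a logged-in session):

```shell
# Resource group and disk encryption set names from the snippet above;
# the Key Vault name and key version are hypothetical placeholders.
RG="keyrotationtesting"
DES="keyrotationdes"
KEY_URL="https://mykeyvault.vault.azure.net/keys/mykey/<new-key-version>"

# Point the disk encryption set at the new key version.
CMD="az disk-encryption-set update -n $DES -g $RG --key-url $KEY_URL"

echo "$CMD"
```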
virtual-machines Disks Enable Host Based Encryption Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/disks-enable-host-based-encryption-cli.md
Last updated 03/29/2023 -+ # Use the Azure CLI to enable end-to-end encryption using encryption at host
virtual-machines Expand Disks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/expand-disks.md
Last updated 07/12/2023 -+ # Expand virtual hard disks on a Linux VM
virtual-machines How To Configure Lvm Raid On Crypt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/how-to-configure-lvm-raid-on-crypt.md
Last updated 04/06/2023---+ # Configure LVM and RAID on encrypted devices
virtual-machines How To Resize Encrypted Lvm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/how-to-resize-encrypted-lvm.md
description: This article provides instructions for resizing ADE encrypted disks
+ Last updated 04/11/2023
virtual-machines How To Verify Encryption Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/how-to-verify-encryption-status.md
Last updated 04/11/2023--+ # Verify encryption status for Linux
virtual-machines Image Builder Devops Task https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-devops-task.md
Last updated 04/11/2023
-+ ms.devlang: azurecli
virtual-machines Image Builder Gallery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-gallery.md
Last updated 04/11/2023
-+ # Create a Linux image and distribute it to an Azure Compute Gallery
virtual-machines Image Builder Json https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-json.md
Last updated 07/17/2023
-+ # Create an Azure Image Builder Bicep or ARM template JSON template
virtual-machines Image Builder Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-troubleshoot.md
Last updated 06/07/2023
-+ # Troubleshoot Azure VM Image Builder
virtual-machines Image Builder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder.md
Last updated 04/11/2023
-+ # Create a Linux image and distribute it to an Azure Compute Gallery by using the Azure CLI
virtual-machines Mac Create Ssh Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/mac-create-ssh-keys.md
+ Last updated 04/11/2023
virtual-machines Multiple Nics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/multiple-nics.md
-+ Last updated 04/06/2023
virtual-machines N Series Driver Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/n-series-driver-setup.md
+ Last updated 04/06/2023
virtual-machines No Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/no-agent.md
-+ Last updated 04/11/2023
virtual-machines Oracle Create Upload Vhd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/oracle-create-upload-vhd.md
+ Last updated 11/09/2021
virtual-machines Proximity Placement Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/proximity-placement-groups.md
-+ Last updated 4/6/2023
virtual-machines Quick Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/quick-create-portal.md
Last updated 3/29/2023 -+ # Quickstart: Create a Linux virtual machine in the Azure portal
virtual-machines Redhat Create Upload Vhd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/redhat-create-upload-vhd.md
vm-linux+ Last updated 04/25/2023
For more details, see the information about [rebuilding initramfs](https://acces
* You're now ready to use your Red Hat Enterprise Linux virtual hard disk to create new virtual machines in Azure. If this is the first time that you're uploading the .vhd file to Azure, see [Create a Linux VM from a custom disk](upload-vhd.md#option-1-upload-a-vhd). * For more details about the hypervisors that are certified to run Red Hat Enterprise Linux, see [the Red Hat website](https://access.redhat.com/certified-hypervisors). * To learn more about using production-ready RHEL BYOS images, go to the documentation page for [BYOS](../workloads/redhat/byos.md).-
virtual-machines Spot Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/spot-template.md
+ Last updated 05/31/2023
virtual-machines Static Dns Name Resolution For Linux On Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/static-dns-name-resolution-for-linux-on-azure.md
-+ Last updated 04/06/2023
virtual-machines Suse Create Upload Vhd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/suse-create-upload-vhd.md
+ Last updated 12/14/2022 - # Prepare a SLES or openSUSE Leap virtual machine for Azure
virtual-machines Tutorial Automate Vm Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/tutorial-automate-vm-deployment.md
Last updated 04/06/2023 -+ #Customer intent: As an IT administrator or developer, I want learn about cloud-init so that I customize and configure Linux VMs in Azure on first boot to minimize the number of post-deployment configuration tasks required.
virtual-machines Tutorial Lamp Stack https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/tutorial-lamp-stack.md
ms.devlang: azurecli+ Last updated 4/4/2023 - #Customer intent: As an IT administrator, I want to learn how to install the LAMP stack so that I can quickly prepare a Linux VM to run web applications.
virtual-machines Tutorial Secure Web Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/tutorial-secure-web-server.md
Last updated 04/09/2023 --+ #Customer intent: As an IT administrator or developer, I want to learn how to secure a web server with TLS/SSL certificates so that I can protect my customer data on web applications that I build and run.
virtual-machines Use Remote Desktop https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/use-remote-desktop.md
+ Last updated 03/28/2023
If you don't receive any response in your remote desktop client and don't see an
For more information about creating and using SSH keys with Linux VMs, see [Create SSH keys for Linux VMs in Azure](mac-create-ssh-keys.md). For information on using SSH from Windows, see [How to use SSH keys with Windows](ssh-from-windows.md).-
virtual-machines Lsv2 Series https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/lsv2-series.md
description: Specifications for the Lsv2-series VMs.
+ Last updated 06/01/2022
virtual-machines M Series https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/m-series.md
description: Specifications for the M-series VMs.
+ Last updated 04/12/2023
virtual-machines Maintenance Configurations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/maintenance-configurations.md
This scope is integrated with [update management center](../update-center/overvi
- [Patch orchestration](automatic-vm-guest-patching.md#patch-orchestration-modes) for virtual machines needs to be set to AutomaticByPlatform -- A minimum of 1 hour and 10 minutes is required for the maintenance window. :::image type="content" source="./media/maintenance-configurations/add-schedule-maintenance-window.png" alt-text="Screenshot of the upper maintenance window minimum time specification."::: - The upper maintenance window is 3 hours 55 minutes. - A minimum of 1 hour and 30 minutes is required for the maintenance window. - The value of **Repeat** should be at least 6 hours.
+>[!IMPORTANT]
+> The minimum maintenance window has been increased from 1 hour 10 minutes to 1 hour 30 minutes, and the minimum repeat value has been set to 6 hours for new schedules. **Existing schedules are not affected; however, we strongly recommend updating them to meet these new minimums.**
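The window and repeat constraints above can be sketched as a quick local pre-check (plain shell, values in minutes; this is an illustration only, not an Azure API):

```shell
# Constraints from the maintenance configuration notes above, in minutes.
MIN_WINDOW=90    # minimum maintenance window: 1 hour 30 minutes
MAX_WINDOW=235   # upper maintenance window: 3 hours 55 minutes
MIN_REPEAT=360   # Repeat value: at least 6 hours

# Hypothetical proposed schedule to validate.
window=120
repeat=720

if [ "$window" -ge "$MIN_WINDOW" ] && [ "$window" -le "$MAX_WINDOW" ] \
   && [ "$repeat" -ge "$MIN_REPEAT" ]; then
  result="schedule ok"
else
  result="schedule invalid"
fi
echo "$result"
```

With `window=120` and `repeat=720`, this prints `schedule ok`; a 60-minute window would fail the 90-minute minimum.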
+ To learn more about this topic, check out [update management center and scheduled patching](../update-center/scheduled-patching.md) ## Shut Down Machines
virtual-machines Migration Classic Resource Manager Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/migration-classic-resource-manager-cli.md
Last updated 04/12/2023 -+ # Migrate IaaS resources from classic to Azure Resource Manager by using Azure CLI
virtual-machines Restore Point Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/restore-point-troubleshooting.md
description: Symptoms, causes, and resolutions of restore point failures related
Last updated 04/12/2023 + # Troubleshoot restore point failures: Issues with the agent or extension
virtual-machines Security Controls Policy Image Builder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/security-controls-policy-image-builder.md
Title: Azure Policy Regulatory Compliance controls for Azure VM Image Builder description: Lists Azure Policy Regulatory Compliance controls available for Azure VM Image Builder. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
virtual-machines Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Virtual Machines description: Lists Azure Policy Regulatory Compliance controls available for Azure Virtual Machines. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
virtual-machines Vm Applications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/vm-applications.md
Last updated 07/17/2023
--+ # VM Applications overview
virtual-machines Build Image With Packer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/build-image-with-packer.md
Last updated 03/31/2023 -+ # PowerShell: How to use Packer to create virtual machine images in Azure
virtual-machines Image Builder Gallery Update Image Version https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/image-builder-gallery-update-image-version.md
Title: Create a new Windows image version from an existing image version using A
description: Create a new Windows VM image version from an existing image version using Azure VM Image Builder. - Previously updated : 03/02/2021+ Last updated : 07/21/2023
virtual-machines Image Builder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/image-builder.md
Last updated 06/12/2023
-+ # Create a Windows VM by using Azure VM Image Builder
virtual-machines Template Description https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/template-description.md
description: Learn more about how the virtual machine resource is defined in an
-+ Last updated 04/11/2023
virtual-machines Deploy Ibm Db2 Purescale Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/mainframe-rehosting/ibm/deploy-ibm-db2-purescale-azure.md
editor: swread
+ Last updated 04/19/2023
virtual-machines Install Openframe Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/mainframe-rehosting/tmaxsoft/install-openframe-azure.md
Last updated 04/19/2023
+ # Install TmaxSoft OpenFrame on Azure
virtual-machines Configure Oracle Asm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/configure-oracle-asm.md
description: Quickly get Oracle ASM up and running in your Azure environment.
-+ Last updated 07/13/2022
virtual-machines Configure Oracle Dataguard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/configure-oracle-dataguard.md
description: Quickly get Oracle Data Guard up and running in your Azure environm
-+ Last updated 03/23/2023
az group delete --name $RESOURCE_GROUP
- [Tutorial: Create highly available virtual machines](../../linux/create-cli-complete.md) - [Explore Azure CLI samples for VM deployment](https://github.com/Azure-Samples/azure-cli-samples/tree/master/virtual-machine)-
virtual-machines Configure Oracle Golden Gate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/configure-oracle-golden-gate.md
description: Quickly get an Oracle Golden Gate up and running in your Azure envi
-+ Last updated 08/02/2018
virtual-machines Oracle Database Backup Azure Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-azure-backup.md
description: Learn how to back up and recover an Oracle Database instance by usi
-+ Last updated 01/28/2021
virtual-machines Oracle Database Backup Azure Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-azure-storage.md
description: Learn how to back up and recover an Oracle Database instance to an
-+ Last updated 01/28/2021
virtual-machines Oracle Vm Solutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-vm-solutions.md
description: Learn about supported configurations and limitations of Oracle virt
+ Last updated 04/11/2023 - # Oracle VM images and their deployment on Microsoft Azure
virtual-machines Byos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/byos.md
description: Learn about bring-your-own-subscription images for Red Hat Enterpri
-+ Last updated 06/10/2020
virtual-machines Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/overview.md
description: Learn about the Red Hat product offerings available on Azure.
+ Last updated 02/10/2020
virtual-machines Redhat Extended Lifecycle Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/redhat-extended-lifecycle-support.md
description: Learn about adding Red Hat Enterprise Extended Lifecycle support ad
+ Last updated 04/16/2020 - # Red Hat Enterprise Linux (RHEL) Extended Lifecycle Support
virtual-machines Redhat In Place Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/redhat-in-place-upgrade.md
description: Learn how to do an in-place upgrade from Red Hat Enterprise 7.x ima
+ Last updated 04/16/2020
virtual-machines Redhat Rhui https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/redhat-rhui.md
description: Learn about Red Hat Update Infrastructure for on-demand Red Hat Ent
+ Last updated 04/06/2023
virtual-network Accelerated Networking How It Works https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/accelerated-networking-how-it-works.md
vm-linux+ Last updated 04/18/2023
virtual-network Accelerated Networking Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/accelerated-networking-overview.md
Title: Accelerated Networking overview
description: Learn how Accelerated Networking can improve the networking performance of Azure VMs. + Last updated 04/18/2023
virtual-network Create Peering Different Subscriptions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/create-peering-different-subscriptions.md
Last updated 12/30/2022-+ # Create a virtual network peering - Resource Manager, different subscriptions and Azure Active Directory tenants
virtual-network Create Vm Accelerated Networking Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/create-vm-accelerated-networking-cli.md
Last updated 04/18/2023 -+ # Use Azure CLI to create a Windows or Linux VM with Accelerated Networking
virtual-network Create Vm Dual Stack Ipv6 Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/create-vm-dual-stack-ipv6-cli.md
Last updated 04/19/2023-+ ms.devlang: azurecli
For more information about IPv6 and IP addresses in Azure, see:
- [Overview of IPv6 for Azure Virtual Network.](ipv6-overview.md) - [What is Azure Virtual Network IP Services?](ip-services-overview.md)--
virtual-network Create Vm Dual Stack Ipv6 Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/create-vm-dual-stack-ipv6-portal.md
Last updated 08/17/2022-+ # Create an Azure Virtual Machine with a dual-stack network using the Azure portal
For more information about IPv6 and IP addresses in Azure, see:
- [Overview of IPv6 for Azure Virtual Network.](ipv6-overview.md) - [What is Azure Virtual Network IP Services?](ip-services-overview.md)--
virtual-network Ip Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/ip-services-overview.md
Last updated 04/19/2023-+ # What is Azure Virtual Network IP Services?
virtual-network Public Ip Address Prefix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/public-ip-address-prefix.md
-
+ Title: Azure Public IP address prefix description: Learn about what an Azure public IP address prefix is and how it can help you assign public IP addresses to your resources.
+ Last updated 04/19/2023
virtual-network Public Ip Basic Upgrade Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/public-ip-basic-upgrade-guidance.md
We recommend the following approach to upgrade to Standard SKU public IP address
| | |
| Virtual Machine or Virtual Machine Scale Sets (flex model) | Disassociate IP(s) and utilize the upgrade options detailed after the table. For virtual machines, you can use the [upgrade script](public-ip-upgrade-vm.md). |
| Load Balancer (Basic SKU) | New LB SKU required. Use the upgrade scripts for [virtual machines](../../load-balancer/upgrade-basic-standard.md) or [Virtual Machine Scale Sets (without IPs per VM)](../../load-balancer/upgrade-basic-standard-virtual-machine-scale-sets.md) to upgrade to Standard Load Balancer |
- | VPN Gateway (Basic SKU or VpnGw1-5 SKU using Basic IPs) | New VPN Gateway SKU required. Create a [new Virtual Network Gateway with a Standard SKU IP](../../vpn-gateway/tutorial-create-gateway-portal.md). |
- | ExpressRoute Gateway (using Basic IPs) | New ExpressRoute Gateway required. Create a [new ExpressRoute Gateway with a Standard SKU IP](../../expressroute/expressroute-howto-add-gateway-portal-resource-manager.md). |
+ | VPN Gateway (Basic SKU or VpnGw1-5 SKU using Basic IPs) | No action required for existing VPN gateways that use Basic SKU public IP addresses. For new VPN gateways, we recommend that you use Standard SKU public IP addresses.|
+| ExpressRoute Gateway (using Basic IPs) | New ExpressRoute Gateway required. Create a [new ExpressRoute Gateway with a Standard SKU IP](../../expressroute/expressroute-howto-add-gateway-portal-resource-manager.md). |
| Application Gateway (v1 SKU) | New AppGW SKU required. Use this [migration script to migrate from v1 to v2](../../application-gateway/migrate-v1-v2.md). |

> [!NOTE]
virtual-network Virtual Network Multiple Ip Addresses Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/virtual-network-multiple-ip-addresses-cli.md
Last updated 04/19/2023 -+ # Assign multiple IP addresses to virtual machines using the Azure CLI
virtual-network Manage Network Security Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/manage-network-security-group.md
Last updated 04/24/2023 -+ # Create, change, or delete a network security group
virtual-network Manage Route Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/manage-route-table.md
+ Last updated 04/24/2023
virtual-network Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure Virtual Network description: Lists Azure Policy Regulatory Compliance controls available for Azure Virtual Network. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Previously updated : 07/06/2023 Last updated : 07/20/2023
virtual-network Setup Dpdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/setup-dpdk.md
description: Learn the benefits of the Data Plane Development Kit (DPDK) and how
+ Last updated 04/24/2023
virtual-network Tutorial Create Route Table Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/tutorial-create-route-table-cli.md
documentationcenter: virtual-network
tags: azure-resource-manager
-# Customer intent: I want to route traffic from one subnet, to a different subnet, through a network virtual appliance.
ms.devlang: azurecli
virtual-network
Last updated 04/20/2022 -+
+# Customer intent: I want to route traffic from one subnet, to a different subnet, through a network virtual appliance.
# Route network traffic with a route table using the Azure CLI
virtual-network Virtual Network Scenario Udr Gw Nva https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-network-scenario-udr-gw-nva.md
description: Learn how to deploy virtual appliances and route tables to create a
+ Last updated 03/22/2023
virtual-network Virtual Network Service Endpoint Policies Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-network-service-endpoint-policies-cli.md
virtual-network
Last updated 02/03/2020 -+ # Customer intent: I want only specific Azure Storage account to be allowed access from a virtual network subnet.
virtual-network Virtual Network Test Latency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-network-test-latency.md
+ Last updated 03/23/2023
virtual-network Virtual Network Troubleshoot Peering Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-network-troubleshoot-peering-issues.md
To learn more about global peering requirements and restraints, see [Virtual net
## Troubleshoot a connectivity issue between two peered virtual networks
-Sign in to the [Azure portal](https://portal.azure.com/) with an account that has the necessary [roles and permissions](virtual-network-manage-peering.md#permissions). Select the virtual network, select **Peering**, and then check the **Status** field. What is the status?
+Sign in to the [Azure portal](https://portal.azure.com) with an account that has the necessary [roles and permissions](virtual-network-manage-peering.md#permissions). Select the virtual network, select **Peering**, and then check the **Status** field. What is the status?
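The same peering status check can also be done from the command line; a minimal Azure CLI sketch, in which the resource names (`myResourceGroup`, `myVNet1`, `myPeering`) are hypothetical placeholders:

```shell
# Show the state of one side of a virtual network peering.
# "Connected" means both sides of the peering are linked;
# "Initiated" or "Disconnected" indicates the remote side needs attention.
az network vnet peering show \
  --resource-group myResourceGroup \
  --vnet-name myVNet1 \
  --name myPeering \
  --query peeringState \
  --output tsv
```

Repeat the command against the remote virtual network's peering resource as well, since each side reports its own status.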
### The peering status is "Connected"
virtual-network Virtual Networks Name Resolution Ddns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-networks-name-resolution-ddns.md
ms.assetid: c315961a-fa33-45cf-82b9-4551e70d32dd
+ Last updated 04/27/2023
If needed, you can add a DNS search suffix to your VMs. The DNS suffix is specif
``` supersede domain-name <required-dns-suffix>;
-```
+```
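For context, the `supersede` statement shown in the diff above lives in the DHCP client configuration file, typically `/etc/dhcp/dhclient.conf` (the path varies by distribution). A minimal sketch of that config fragment, using `contoso.internal` as a hypothetical suffix — note that dhclient string values are quoted:

```
# /etc/dhcp/dhclient.conf — path varies by Linux distribution.
# Replace the DNS search suffix offered by the DHCP server with a fixed value.
supersede domain-name "contoso.internal";
```

After editing the file, renew the DHCP lease (or reboot the VM) for the change to take effect.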
virtual-network Virtual Networks Name Resolution For Vms And Role Instances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/virtual-networks-name-resolution-for-vms-and-role-instances.md
Last updated 04/27/2023 -+ # Name resolution for resources in Azure virtual networks
Azure Resource Manager deployment model:
* [Manage a virtual network](manage-virtual-network.md) * [Manage a network interface](virtual-network-network-interface.md)--
virtual-wan Howto Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-wan/howto-private-link.md
Last updated 03/30/2023 --+ # Use Private Link in Virtual WAN
virtual-wan Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-wan/whats-new.md
You can also find the latest Azure Virtual WAN updates and subscribe to the RSS
|Feature|Remote User connectivity/Point-to-site VPN|[Dual-RADIUS server](virtual-wan-point-to-site-portal.md)|Ability to specify primary and backup RADIUS servers to service authentication traffic.|March 2021| | |Feature|Remote User connectivity/Point-to-site VPN|[Custom IPsec policies](point-to-site-ipsec.md)|Ability to specify connection/encryption parameters for IKEv2 point-to-site connections.|March 2021|Only supported for IKEv2- based connections.<br><br>View the [list of available parameters](point-to-site-ipsec.md). | |SKU|Remote User connectivity/Point-to-site VPN|[Support up to 100K users connected to a single hub](about-client-address-pools.md)|Increased maximum number of concurrent users connected to a single gateway to 100,000.|March 2021| |
-|Feature|Remote User connectivity/Point-to-site VPN|Multiple-authentication methods|Ability for a single gateway to use multiple authentication mechanisms.|June 2023|Supported for gateways running all protocol combinations. Note that Azure AD authentication still required used of OpenVPN|
+|Feature|Remote User connectivity/Point-to-site VPN|Multiple-authentication methods|Ability for a single gateway to use multiple authentication mechanisms.|June 2023|Supported for gateways running all protocol combinations. Note that Azure AD authentication still requires the use of OpenVPN|
## Preview
vpn-gateway Vpn Gateway Vpn Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/vpn-gateway/vpn-gateway-vpn-faq.md
No.
### Can I get my VPN gateway IP address before I create it?
-Zone-redundant and zonal gateways (gateway SKUs that have _AZ_ in the name) both rely on a _Standard SKU_ Azure public IP resource. Azure Standard SKU public IP resources must use a static allocation method. Therefore, you'll have the public IP address for your VPN gateway as soon as you create the Standard SKU public IP resource you intend to use for it.
+Azure Standard SKU public IP resources must use a static allocation method. Therefore, you'll have the public IP address for your VPN gateway as soon as you create the Standard SKU public IP resource you intend to use for it.
-For non-zone-redundant and non-zonal gateways (gateway SKUs that do *not* have *AZ* in the name), you can't obtain the VPN gateway IP address before it's created. The IP address changes only if you delete and re-create your VPN gateway.
+### Can I request a static public IP address for my VPN gateway?
-### Can I request a Static Public IP address for my VPN gateway?
+We recommend that you use a Standard SKU public IP address for your VPN gateway. Standard SKU public IP address resources use a static allocation method. While we do support dynamic IP address assignment for certain gateway SKUs (gateway SKUs that do not have *AZ* in the name), we recommend that you use a Standard SKU public IP address going forward for all virtual network gateways.
-Zone-redundant and zonal gateways (gateway SKUs that have *AZ* in the name) both rely on a *Standard SKU* Azure public IP resource. Azure Standard SKU public IP resources must use a static allocation method.
+For non-zone-redundant and non-zonal gateways (gateway SKUs that do *not* have *AZ* in the name), dynamic IP address assignment is supported, but is being phased out. When you use a dynamic IP address, the IP address doesn't change after it has been assigned to your VPN gateway. The only time the VPN gateway IP address changes is when the gateway is deleted and then re-created. The VPN gateway public IP address doesn't change when you resize, reset, or complete other internal maintenance and upgrades of your VPN gateway.
-For non-zone-redundant and non-zonal gateways (gateway SKUs that do *not* have *AZ* in the name), dynamic IP address assignment is supported. When you use a dynamic IP address, the IP address doesn't change after it has been assigned to your VPN gateway. The only time the VPN gateway IP address changes is when the gateway is deleted and then re-created. The VPN gateway public IP address doesn't change when you resize, reset, or complete other internal maintenance and upgrades of your VPN gateway.
+### How does the retirement of Basic SKU public IP addresses affect my VPN gateways?
+
+We are taking action to ensure the continued operation of deployed VPN gateways that utilize Basic SKU public IP addresses. If you already have VPN gateways with Basic SKU public IP addresses, there is no need for you to take any action.
+
+However, it's important to note that Basic SKU public IP addresses are being phased out. We highly recommend using **Standard SKU** public IP addresses when creating new VPN gateways. Further details on the retirement of Basic SKU public IP addresses can be found [here](https://azure.microsoft.com/updates/upgrade-to-standard-sku-public-ip-addresses-in-azure-by-30-september-2025-basic-sku-will-be-retired).
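Because Standard SKU public IPs are always statically allocated, creating the IP resource up front is what makes the gateway's address known before the gateway itself exists. A hedged Azure CLI sketch of that step, with hypothetical resource names:

```shell
# Create a Standard SKU public IP for a VPN gateway.
# Standard SKU requires static allocation, so the address in the output
# is fixed as soon as this command returns — before any gateway uses it.
az network public-ip create \
  --resource-group myResourceGroup \
  --name myGatewayIP \
  --sku Standard \
  --allocation-method Static \
  --query publicIp.ipAddress \
  --output tsv
```

Reference this IP resource when creating the virtual network gateway, and the gateway keeps that address for its lifetime.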
### How does my VPN tunnel get authenticated?