Updates from: 04/20/2022 01:12:23
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Add Password Reset Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/add-password-reset-policy.md
To test the user flow:
### Create a password reset policy
-Custom policies are a set of XML files that you upload to your Azure AD B2C tenant to define user journeys. We provide starter packs that have several pre-built policies, including sign-up and sign-in, password reset, and profile editing policies. For more information, see [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy).
+Custom policies are a set of XML files that you upload to your Azure AD B2C tenant to define user journeys. We provide [starter packs](https://github.com/Azure-Samples/active-directory-b2c-custom-policy-starterpack) that have several pre-built policies, including sign up and sign in, password reset, and profile editing policies. For more information, see [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy).
::: zone-end
active-directory-domain-services Tutorial Create Instance Advanced https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/tutorial-create-instance-advanced.md
To complete this tutorial, you need the following resources and privileges:
* An Azure Active Directory tenant associated with your subscription, either synchronized with an on-premises directory or a cloud-only directory.
  * If needed, [create an Azure Active Directory tenant][create-azure-ad-tenant] or [associate an Azure subscription with your account][associate-azure-ad-tenant].
* You need [Application Administrator](../active-directory/roles/permissions-reference.md#application-administrator) and [Groups Administrator](../active-directory/roles/permissions-reference.md#groups-administrator) Azure AD roles in your tenant to enable Azure AD DS.
-* You need [Domain Services Contributor](/azure/role-based-access-control/built-in-roles#domain-services-contributor) Azure role to create the required Azure AD DS resources.
+* You need [Domain Services Contributor](../role-based-access-control/built-in-roles.md#domain-services-contributor) Azure role to create the required Azure AD DS resources.
Although not required for Azure AD DS, it's recommended to [configure self-service password reset (SSPR)][configure-sspr] for the Azure AD tenant. Users can change their password without SSPR, but SSPR helps if they forget their password and need to reset it.
active-directory-domain-services Tutorial Create Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-domain-services/tutorial-create-instance.md
To complete this tutorial, you need the following resources and privileges:
* An Azure Active Directory tenant associated with your subscription, either synchronized with an on-premises directory or a cloud-only directory.
  * If needed, [create an Azure Active Directory tenant][create-azure-ad-tenant] or [associate an Azure subscription with your account][associate-azure-ad-tenant].
* You need [Application Administrator](../active-directory/roles/permissions-reference.md#application-administrator) and [Groups Administrator](../active-directory/roles/permissions-reference.md#groups-administrator) Azure AD roles in your tenant to enable Azure AD DS.
-* You need [Domain Services Contributor](/azure/role-based-access-control/built-in-roles#domain-services-contributor) Azure role to create the required Azure AD DS resources.
+* You need [Domain Services Contributor](../role-based-access-control/built-in-roles.md#domain-services-contributor) Azure role to create the required Azure AD DS resources.
* A virtual network with DNS servers that can query necessary infrastructure such as storage. DNS servers that can't perform general internet queries might block the ability to create a managed domain.

Although not required for Azure AD DS, it's recommended to [configure self-service password reset (SSPR)][configure-sspr] for the Azure AD tenant. Users can change their password without SSPR, but SSPR helps if they forget their password and need to reset it.
active-directory Use Scim To Provision Users And Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/use-scim-to-provision-users-and-groups.md
Previously updated : 07/26/2021 Last updated : 04/13/2022

# Tutorial: Develop and plan provisioning for a SCIM endpoint in Azure Active Directory
-As an application developer, you can use the System for Cross-Domain Identity Management (SCIM) user management API to enable automatic provisioning of users and groups between your application and Azure AD (AAD). This article describes how to build a SCIM endpoint and integrate with the AAD provisioning service. The SCIM specification provides a common user schema for provisioning. When used in conjunction with federation standards like SAML or OpenID Connect, SCIM gives administrators an end-to-end, standards-based solution for access management.
+As an application developer, you can use the System for Cross-Domain Identity Management (SCIM) user management API to enable automatic provisioning of users and groups between your application and Azure AD. This article describes how to build a SCIM endpoint and integrate with the Azure AD provisioning service. The SCIM specification provides a common user schema for provisioning. When used in conjunction with federation standards like SAML or OpenID Connect, SCIM gives administrators an end-to-end, standards-based solution for access management.
![Provisioning from Azure AD to an app with SCIM](media/use-scim-to-provision-users-and-groups/scim-provisioning-overview.png)
To automate provisioning to an application will require building and integrating
1. Design your user and group schema
- Identify the application's objects and attributes to determine how they map to the user and group schema supported by the AAD SCIM implementation.
+ Identify the application's objects and attributes to determine how they map to the user and group schema supported by the Azure AD SCIM implementation.
-1. Understand the AAD SCIM implementation
+1. Understand the Azure AD SCIM implementation
- Understand how the AAD SCIM client is implemented to model your SCIM protocol request handling and responses.
+ Understand how the Azure AD SCIM client is implemented to model your SCIM protocol request handling and responses.
1. Build a SCIM endpoint
- An endpoint must be SCIM 2.0-compatible to integrate with the AAD provisioning service. As an option, use Microsoft Common Language Infrastructure (CLI) libraries and code samples to build your endpoint. These samples are for reference and testing only; we recommend against using them as dependencies in your production app.
+ An endpoint must be SCIM 2.0-compatible to integrate with the Azure AD provisioning service. As an option, use Microsoft Common Language Infrastructure (CLI) libraries and code samples to build your endpoint. These samples are for reference and testing only; we recommend against using them as dependencies in your production app.
-1. Integrate your SCIM endpoint with the AAD SCIM client
+1. Integrate your SCIM endpoint with the Azure AD SCIM client
- If your organization uses a third-party application to implement a profile of SCIM 2.0 that AAD supports, you can quickly automate both provisioning and deprovisioning of users and groups.
+ If your organization uses a third-party application to implement a profile of SCIM 2.0 that Azure AD supports, you can quickly automate both provisioning and deprovisioning of users and groups.
-1. Publish your application to the AAD application gallery
+1. Publish your application to the Azure AD application gallery
Make it easy for customers to discover your application and easily configure provisioning.
There are several endpoints defined in the SCIM RFC. You can start with the `/Us
> [!NOTE]
> Use the `/Schemas` endpoint to support custom attributes or if your schema changes frequently as it enables a client to retrieve the most up-to-date schema automatically. Use the `/Bulk` endpoint to support groups.
-## Understand the AAD SCIM implementation
+## Understand the Azure AD SCIM implementation
-To support a SCIM 2.0 user management API, this section describes how the AAD SCIM client is implemented and shows how to model your SCIM protocol request handling and responses.
+To support a SCIM 2.0 user management API, this section describes how the Azure AD SCIM client is implemented and shows how to model your SCIM protocol request handling and responses.
> [!IMPORTANT]
> The behavior of the Azure AD SCIM implementation was last updated on December 18, 2018. For information on what changed, see [SCIM 2.0 protocol compliance of the Azure AD User Provisioning service](application-provisioning-config-problem-scim-compatibility.md).
Within the [SCIM 2.0 protocol specification](http://www.simplecloud.info/#Specif
|Retrieve a known resource for a user or group created earlier|[section 3.4.1](https://tools.ietf.org/html/rfc7644#section-3.4.1)|
|Query users or groups|[section 3.4.2](https://tools.ietf.org/html/rfc7644#section-3.4.2). By default, users are retrieved by their `id` and queried by their `username` and `externalId`, and groups are queried by `displayName`.|
|The filter [excludedAttributes=members](#get-group) when querying the group resource|section 3.4.2.5|
-|Accept a single bearer token for authentication and authorization of AAD to your application.||
+|Accept a single bearer token for authentication and authorization of Azure AD to your application.||
|Soft-deleting a user `active=false` and restoring the user `active=true`|The user object should be returned in a request whether or not the user is active. The only time the user should not be returned is when it is hard deleted from the application.|
|Support the /Schemas endpoint|[section 7](https://tools.ietf.org/html/rfc7643#page-30) The schema discovery endpoint will be used to discover additional attributes.|
+|Support listing users and paginating|[section 3.4.2.4](https://datatracker.ietf.org/doc/html/rfc7644#section-3.4.2.4).|
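The listing-and-pagination requirement can be sketched as follows. This is an illustrative sketch only: `all_users` and `list_response` are hypothetical names standing in for your application's user store and handler, not part of any Azure AD contract.

```python
# Minimal sketch of SCIM list pagination (RFC 7644, section 3.4.2.4).
# `all_users` stands in for the application's user store; names are illustrative.
def list_response(all_users, start_index=1, count=100):
    # startIndex is 1-based per the SCIM spec; clamp bad input defensively.
    start = max(start_index, 1)
    page = all_users[start - 1:start - 1 + max(count, 0)]
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:ListResponse"],
        "totalResults": len(all_users),
        "startIndex": start,
        "itemsPerPage": len(page),
        "Resources": page,
    }
```

A query such as `GET /Users?startIndex=2&count=2` would then return the second and third stored users while `totalResults` still reports the full count.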
-Use the general guidelines when implementing a SCIM endpoint to ensure compatibility with AAD:
+Use the general guidelines when implementing a SCIM endpoint to ensure compatibility with Azure AD:
##### General:

* `id` is a required property for all resources. Every response that returns a resource should ensure each resource has this property, except for `ListResponse` with zero members.
-* Values sent should be stored in the same format as what the were sent in. Invalid values should be rejected with a descriptive, actionable error message. Transformations of data should not happen between data being sent by Azure AD and data being stored in the SCIM application. (e.g. A phone number sent as 55555555555 should not be saved/returned as +5 (555) 555-5555)
+* Values sent should be stored in the same format they were sent in. Invalid values should be rejected with a descriptive, actionable error message. Transformations of data should not happen between data being sent by Azure AD and data being stored in the SCIM application. (For example, a phone number sent as 55555555555 should not be saved/returned as +5 (555) 555-5555.)
* It isn't necessary to include the entire resource in the **PATCH** response.
-* Don't require a case-sensitive match on structural elements in SCIM, in particular **PATCH** `op` operation values, as defined in [section 3.5.2](https://tools.ietf.org/html/rfc7644#section-3.5.2). AAD emits the values of `op` as **Add**, **Replace**, and **Remove**.
-* Microsoft AAD makes requests to fetch a random user and group to ensure that the endpoint and the credentials are valid. It's also done as a part of the **Test Connection** flow in the [Azure portal](https://portal.azure.com).
+* Don't require a case-sensitive match on structural elements in SCIM, in particular **PATCH** `op` operation values, as defined in [section 3.5.2](https://tools.ietf.org/html/rfc7644#section-3.5.2). Azure AD emits the values of `op` as **Add**, **Replace**, and **Remove**.
+* Azure AD makes requests to fetch a random user and group to ensure that the endpoint and the credentials are valid. It's also done as a part of the **Test Connection** flow in the [Azure portal](https://portal.azure.com).
* Support HTTPS on your SCIM endpoint.
-* Custom complex and multivalued attributes are supported but AAD does not have many complex data structures to pull data from in these cases. Simple paired name/value type complex attributes can be mapped to easily, but flowing data to complex attributes with three or more subattributes are not well supported at this time.
-* The "type" sub-attribute values of multivalued complex attributes must be unique. For example, there can not be two different email addresses with the "work" sub-type.
+* Custom complex and multivalued attributes are supported but Azure AD does not have many complex data structures to pull data from in these cases. Simple paired name/value type complex attributes can be mapped to easily, but flowing data to complex attributes with three or more subattributes are not well supported at this time.
+* The "type" sub-attribute values of multivalued complex attributes must be unique. For example, there cannot be two different email addresses with the "work" sub-type.
##### Retrieving Resources:

* Response to a query/filter request should always be a `ListResponse`.
-* Microsoft AAD only uses the following operators: `eq`, `and`
+* Azure AD only uses the following operators: `eq`, `and`
* The attribute that the resources can be queried on should be set as a matching attribute on the application in the [Azure portal](https://portal.azure.com), see [Customizing User Provisioning Attribute Mappings](customize-application-attributes.md).

##### /Users:
Use the general guidelines when implementing a SCIM endpoint to ensure compatibi
* If a value is not present, do not send null values.
* Property values should be camel cased (e.g. readWrite).
* Must return a list response.
-* The /schemas request will be made by the Azure AD SCIM client every time someone saves the provisioning configuration in the Azure Portal or every time a user lands on the edit provisioning page in the Azure Portal. Any additional attributes discovered will be surfaced to customers in the attribute mappings under the target attribute list. Schema discovery only leads to additional target attributes being added. It will not result in attributes being removed.
+* The /schemas request will be made by the Azure AD SCIM client every time someone saves the provisioning configuration in the Azure portal or every time a user lands on the edit provisioning page in the Azure portal. Any additional attributes discovered will be surfaced to customers in the attribute mappings under the target attribute list. Schema discovery only leads to additional target attributes being added. It will not result in attributes being removed.
### User provisioning and deprovisioning
-The following illustration shows the messages that AAD sends to a SCIM service to manage the lifecycle of a user in your application's identity store.
+The following illustration shows the messages that Azure AD sends to a SCIM service to manage the lifecycle of a user in your application's identity store.
![Shows the user provisioning and deprovisioning sequence](media/use-scim-to-provision-users-and-groups/scim-figure-4.png)<br/>
*User provisioning and deprovisioning sequence*

### Group provisioning and deprovisioning
-Group provisioning and deprovisioning are optional. When implemented and enabled, the following illustration shows the messages that AAD sends to a SCIM service to manage the lifecycle of a group in your application's identity store. Those messages differ from the messages about users in two ways:
+Group provisioning and deprovisioning are optional. When implemented and enabled, the following illustration shows the messages that Azure AD sends to a SCIM service to manage the lifecycle of a group in your application's identity store. Those messages differ from the messages about users in two ways:
* Requests to retrieve groups specify that the members attribute is to be excluded from any resource provided in response to the request. * Requests to determine whether a reference attribute has a certain value are requests about the members attribute.
Group provisioning and deprovisioning are optional. When implemented and enabled
*Group provisioning and deprovisioning sequence*

### SCIM protocol requests and responses
-This section provides example SCIM requests emitted by the AAD SCIM client and example expected responses. For best results, you should code your app to handle these requests in this format and emit the expected responses.
+This section provides example SCIM requests emitted by the Azure AD SCIM client and example expected responses. For best results, you should code your app to handle these requests in this format and emit the expected responses.
> [!IMPORTANT]
-> To understand how and when the AAD user provisioning service emits the operations described below, see the section [Provisioning cycles: Initial and incremental](how-provisioning-works.md#provisioning-cycles-initial-and-incremental) in [How provisioning works](how-provisioning-works.md).
+> To understand how and when the Azure AD user provisioning service emits the operations described below, see the section [Provisioning cycles: Initial and incremental](how-provisioning-works.md#provisioning-cycles-initial-and-incremental) in [How provisioning works](how-provisioning-works.md).
[User Operations](#user-operations)

- [Create User](#create-user) ([Request](#request) / [Response](#response))
TLS 1.2 Cipher Suites minimum bar:
### IP Ranges

The Azure AD provisioning service currently operates under the IP Ranges for AzureActiveDirectory as listed [here](https://www.microsoft.com/download/details.aspx?id=56519&WT.mc_id=rss_alldownloads_all). You can add the IP ranges listed under the AzureActiveDirectory tag to allow traffic from the Azure AD provisioning service into your application. Note that you will need to review the IP range list carefully for computed addresses. An address such as '40.126.25.32' could be represented in the IP range list as '40.126.0.0/18'. You can also programmatically retrieve the IP range list using the following [API](/rest/api/virtualnetwork/servicetags/list).
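The computed-address caveat can be checked programmatically rather than by scanning the list for literal matches; for example, with Python's standard `ipaddress` module (the range list here is a placeholder, not the full published list):

```python
# Check whether a provisioning-service address falls inside a published
# Azure AD CIDR range. '40.126.25.32' is covered by the '40.126.0.0/18'
# block even though it never appears literally in the range list.
import ipaddress

def address_in_ranges(address, cidr_ranges):
    ip = ipaddress.ip_address(address)
    return any(ip in ipaddress.ip_network(cidr) for cidr in cidr_ranges)

allowed = ["40.126.0.0/18"]  # placeholder; use the downloaded service-tag list
print(address_in_ranges("40.126.25.32", allowed))  # True
```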
-Azure AD also supports an agent based solution to provide connectivity to applications in private networks (on-premises, hosted in Azure, hosted in AWS, etc.). Customers can deploy a lightweight agent, which provides connectivity to Azure AD without opening an inbound ports, on a server in their private network. Learn more [here](./on-premises-scim-provisioning.md).
+Azure AD also supports an agent-based solution to provide connectivity to applications in private networks (on-premises, hosted in Azure, hosted in AWS, etc.). Customers can deploy a lightweight agent, which provides connectivity to Azure AD without opening any inbound ports, on a server in their private network. Learn more [here](./on-premises-scim-provisioning.md).
## Build a SCIM endpoint
private string GenerateJSONWebToken()
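The `GenerateJSONWebToken` sample above is from the C# reference implementation. For local testing of a SCIM endpoint, an equivalent HS256 token can be assembled from standard-library primitives alone. This is a hedged sketch: the signing key and claims are placeholders, not values the Azure AD provisioning service uses.

```python
# Illustrative HS256 JWT builder for exercising a SCIM endpoint locally.
# The key, issuer, and audience are placeholders for testing only.
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    # JWT uses base64url without padding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def generate_json_web_token(secret: str, issuer: str, audience: str) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    payload = {"iss": issuer, "aud": audience, "iat": now, "exp": now + 3600}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    signature = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(signature)}"
```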
***Example 1. Query the service for a matching user***
-Azure Active Directory (AAD) queries the service for a user with an `externalId` attribute value matching the mailNickname attribute value of a user in AAD. The query is expressed as a Hypertext Transfer Protocol (HTTP) request such as this example, wherein jyoung is a sample of a mailNickname of a user in Azure Active Directory.
+Azure Active Directory queries the service for a user with an `externalId` attribute value matching the mailNickname attribute value of a user in Azure AD. The query is expressed as a Hypertext Transfer Protocol (HTTP) request such as this example, wherein jyoung is a sample of a mailNickname of a user in Azure Active Directory.
>[!NOTE]
-> This is an example only. Not all users will have a mailNickname attribute, and the value a user has may not be unique in the directory. Also, the attribute used for matching (which in this case is `externalId`) is configurable in the [AAD attribute mappings](customize-application-attributes.md).
+> This is an example only. Not all users will have a mailNickname attribute, and the value a user has may not be unique in the directory. Also, the attribute used for matching (which in this case is `externalId`) is configurable in the [Azure AD attribute mappings](customize-application-attributes.md).
``` GET https://.../scim/Users?filter=externalId eq jyoung HTTP/1.1
In the sample query, for a user with a given value for the `externalId` attribut
***Example 2. Provision a user***
-If the response to a query to the web service for a user with an `externalId` attribute value that matches the mailNickname attribute value of a user doesn't return any users, then AAD requests that the service provision a user corresponding to the one in AAD. Here is an example of such a request:
+If the response to a query to the web service for a user with an `externalId` attribute value that matches the mailNickname attribute value of a user doesn't return any users, then Azure AD requests that the service provision a user corresponding to the one in Azure AD. Here is an example of such a request:
``` POST https://.../scim/Users HTTP/1.1
In the example of a request to update a user, the object provided as the value o
***Example 6. Deprovision a user***
-To deprovision a user from an identity store fronted by an SCIM service, AAD sends a request such as:
+To deprovision a user from an identity store fronted by a SCIM service, Azure AD sends a request such as:
``` DELETE ~/scim/Users/54D382A4-2050-4C03-94D1-E769F1D15682 HTTP/1.1
The object provided as the value of the resourceIdentifier argument has these pr
* ResourceIdentifier.Identifier: "54D382A4-2050-4C03-94D1-E769F1D15682"
* ResourceIdentifier.SchemaIdentifier: "urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
-## Integrate your SCIM endpoint with the AAD SCIM client
+## Integrate your SCIM endpoint with the Azure AD SCIM client
-Azure AD can be configured to automatically provision assigned users and groups to applications that implement a specific profile of the [SCIM 2.0 protocol](https://tools.ietf.org/html/rfc7644). The specifics of the profile are documented in [Understand the Azure AD SCIM implementation](#understand-the-aad-scim-implementation).
+Azure AD can be configured to automatically provision assigned users and groups to applications that implement a specific profile of the [SCIM 2.0 protocol](https://tools.ietf.org/html/rfc7644). The specifics of the profile are documented in [Understand the Azure AD SCIM implementation](#understand-the-azure-ad-scim-implementation).
Check with your application provider, or your application provider's documentation for statements of compatibility with these requirements.
Applications that support the SCIM profile described in this article can be conn
**To connect an application that supports SCIM:**
-1. Sign in to the [AAD portal](https://aad.portal.azure.com). Note that you can get access a free trial for Azure Active Directory with P2 licenses by signing up for the [developer program](https://developer.microsoft.com/office/dev-program)
+1. Sign in to the [Azure AD portal](https://aad.portal.azure.com). Note that you can get a free trial of Azure Active Directory with P2 licenses by signing up for the [developer program](https://developer.microsoft.com/office/dev-program).
1. Select **Enterprise applications** from the left pane. A list of all configured apps is shown, including apps that were added from the gallery.
1. Select **+ New application** > **+ Create your own application**.
1. Enter a name for your application, choose the option "*integrate any other application you don't find in the gallery*" and select **Add** to create an app object. The new app is added to the list of enterprise applications and opens to its app management screen.
Once the initial cycle has started, you can select **Provisioning logs** in the
> [!NOTE]
> The initial cycle takes longer to perform than later syncs, which occur approximately every 40 minutes as long as the service is running.
-## Publish your application to the AAD application gallery
+## Publish your application to the Azure AD application gallery
If you're building an application that will be used by more than one tenant, you can make it available in the Azure AD application gallery. This will make it easy for organizations to discover the application and configure provisioning. Publishing your app in the Azure AD gallery and making provisioning available to others is easy. Check out the steps [here](../manage-apps/v2-howto-app-gallery-listing.md). Microsoft will work with you to integrate your application into our gallery, test your endpoint, and release onboarding [documentation](../saas-apps/tutorial-list.md) for customers to use.

### Gallery onboarding checklist

Use the checklist to onboard your application quickly and give customers a smooth deployment experience. The information will be gathered from you when onboarding to the gallery.

> [!div class="checklist"]
-> * Support a [SCIM 2.0](#understand-the-aad-scim-implementation) user and group endpoint (Only one is required but both are recommended)
+> * Support a [SCIM 2.0](#understand-the-azure-ad-scim-implementation) user and group endpoint (Only one is required but both are recommended)
> * Support at least 25 requests per second per tenant to ensure that users and groups are provisioned and deprovisioned without delay (Required)
> * Establish engineering and support contacts to guide customers post gallery onboarding (Required)
> * 3 Non-expiring test credentials for your application (Required)
The SCIM spec doesn't define a SCIM-specific scheme for authentication and autho
|OAuth client credentials grant|Access tokens are much shorter-lived than passwords, and have an automated refresh mechanism that long-lived bearer tokens do not have. Both the authorization code grant and the client credentials grant create the same type of access token, so moving between these methods is transparent to the API. Provisioning can be completely automated, and new tokens can be silently requested without user interaction.||Not supported for gallery and non-gallery apps. Support is in our backlog.|

> [!NOTE]
-> It's not recommended to leave the token field blank in the AAD provisioning configuration custom app UI. The token generated is primarily available for testing purposes.
+> It's not recommended to leave the token field blank in the Azure AD provisioning configuration custom app UI. The token generated is primarily available for testing purposes.
### OAuth code grant flow
Best practices (recommended, but not required):
1. When the provisioning cycle begins, the service checks if the current access token is valid and exchanges it for a new token if needed. The access token is provided in each request made to the app and the validity of the request is checked before each request.

> [!NOTE]
-> While it's not possible to setup OAuth on the non-gallery applications, you can manually generate an access token from your authorization server and input it as the secret token to a non-gallery application. This allows you to verify compatibility of your SCIM server with the AAD SCIM client before onboarding to the app gallery, which does support the OAuth code grant.
+> While it's not possible to set up OAuth on the non-gallery applications, you can manually generate an access token from your authorization server and input it as the secret token to a non-gallery application. This allows you to verify compatibility of your SCIM server with the Azure AD SCIM client before onboarding to the app gallery, which does support the OAuth code grant.
**Long-lived OAuth bearer tokens:** If your application doesn't support the OAuth authorization code grant flow, instead generate a long-lived OAuth bearer token that an administrator can use to set up the provisioning integration. The token should be perpetual, or else the provisioning job will be [quarantined](application-provisioning-quarantine-status.md) when the token expires.
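Whichever grant is used, the SCIM endpoint should validate the bearer token on every request. A minimal sketch, assuming a shared secret token rather than a signed JWT (the function name and header shape are illustrative):

```python
# Minimal sketch of checking the bearer token a SCIM request carries.
# A production endpoint would validate a signed token's issuer, audience,
# and expiry instead of comparing a shared secret.
import hmac

def is_authorized(authorization_header, expected_token):
    scheme, _, token = (authorization_header or "").partition(" ")
    if scheme.lower() != "bearer" or not token:
        return False
    # compare_digest avoids leaking the secret's prefix via timing
    return hmac.compare_digest(token, expected_token)
```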
active-directory Howto Authentication Passwordless Security Key On Premises https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-authentication-passwordless-security-key-on-premises.md
You must also meet the following system requirements:
- [Windows Server 2016](https://support.microsoft.com/help/4534307/windows-10-update-kb4534307)
- [Windows Server 2019](https://support.microsoft.com/help/4534321/windows-10-update-kb4534321)
-- AES256_HMAC_SHA1 must be enabled when **Network security: Configure encryption types allowed for Kerberos** policy is [configured](https://docs.microsoft.com/windows/security/threat-protection/security-policy-settings/network-security-configure-encryption-types-allowed-for-kerberos) on domain controllers.
+- AES256_HMAC_SHA1 must be enabled when **Network security: Configure encryption types allowed for Kerberos** policy is [configured](/windows/security/threat-protection/security-policy-settings/network-security-configure-encryption-types-allowed-for-kerberos) on domain controllers.
- Have the credentials required to complete the steps in the scenario:
  - An Active Directory user who is a member of the Domain Admins group for a domain and a member of the Enterprise Admins group for a forest. Referred to as **$domainCred**.
An FIDO2 Windows login looks for a writable DC to exchange the user TGT. As long
## Next steps
-[Learn more about passwordless authentication](concept-authentication-passwordless.md)
+[Learn more about passwordless authentication](concept-authentication-passwordless.md)
active-directory Concept Conditional Access Cloud Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-cloud-apps.md
Previously updated : 02/08/2022 Last updated : 04/19/2022
Administrators can select published authentication contexts in their Conditional
For more information about authentication context use in applications, see the following articles.

-- [Microsoft Information Protection sensitivity labels to protect SharePoint sites](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites#more-information-about-the-dependencies-for-the-authentication-context-option)
+- [Use sensitivity labels to protect content in Microsoft Teams, Microsoft 365 groups, and SharePoint sites](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites)
- [Microsoft Defender for Cloud Apps](/cloud-app-security/session-policy-aad?branch=pr-en-us-2082#require-step-up-authentication-authentication-context)
- [Custom applications](../develop/developer-guide-conditional-access-authentication-context.md)
active-directory Howto Conditional Access Policy Risk User https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-risk-user.md
Previously updated : 11/05/2021 Last updated : 03/21/2022
# Conditional Access: User risk-based Conditional Access
-Microsoft works with researchers, law enforcement, various security teams at Microsoft, and other trusted sources to find leaked username and password pairs. Organizations with Azure AD Premium P2 licenses can create Conditional Access policies incorporating [Azure AD Identity Protection user risk detections](../identity-protection/concept-identity-protection-risks.md#user-linked-detections).
+Microsoft works with researchers, law enforcement, various security teams at Microsoft, and other trusted sources to find leaked username and password pairs. Organizations with Azure AD Premium P2 licenses can create Conditional Access policies incorporating [Azure AD Identity Protection user risk detections](../identity-protection/concept-identity-protection-risks.md).
There are two locations where this policy may be configured: Conditional Access and Identity Protection. Configuration using a Conditional Access policy is the preferred method, providing more context, including enhanced diagnostic data, report-only mode integration, Graph API support, and the ability to use other Conditional Access attributes in the policy.
Organizations can choose to deploy this policy using the steps outlined below or
1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts. 1. Select **Done**. 1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**.
-1. Under **Conditions** > **User risk**, set **Configure** to **Yes**. Under **Configure user risk levels needed for policy to be enforced** select **High**, then select **Done**.
-1. Under **Access controls** > **Grant**, select **Grant access**, **Require password change**, and select **Select**.
-1. Confirm your settings and set **Enable policy** to **Report-only**.
+1. Under **Conditions** > **User risk**, set **Configure** to **Yes**.
+ 1. Under **Configure user risk levels needed for policy to be enforced**, select **High**.
+ 1. Select **Done**.
+1. Under **Access controls** > **Grant**:
+ 1. Select **Grant access** and **Require password change**.
+ 1. Select **Select**.
+1. Confirm your settings, and set **Enable policy** to **Report-only**.
1. Select **Create** to create and enable your policy. After confirming your settings using [report-only mode](howto-conditional-access-insights-reporting.md), an administrator can move the **Enable policy** toggle from **Report-only** to **On**.
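Because Graph API support is one of the advantages of the Conditional Access approach noted above, the same policy can be sketched as a Microsoft Graph `conditionalAccessPolicy` payload. This is a hedged sketch only: the break-glass group ID is a placeholder, and the field names should be verified against the current Graph reference before use.

```python
# Sketch: the user-risk policy steps above expressed as a Microsoft Graph
# conditionalAccessPolicy payload (POST /identity/conditionalAccess/policies).
# Field names follow the Graph conditionalAccessPolicy schema; verify them
# against the current API reference before relying on this.
import json

BREAK_GLASS_GROUP_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

policy = {
    "displayName": "User risk-based password change",
    # Start in report-only mode, as the steps above recommend.
    "state": "enabledForReportingButNotEnforced",
    "conditions": {
        "users": {
            "includeUsers": ["All"],
            "excludeGroups": [BREAK_GLASS_GROUP_ID],
        },
        "applications": {"includeApplications": ["All"]},
        "userRiskLevels": ["high"],
    },
    "grantControls": {
        # In the Graph schema, requiring a password change is combined
        # with MFA using the AND operator.
        "operator": "AND",
        "builtInControls": ["mfa", "passwordChange"],
    },
}

print(json.dumps(policy, indent=2))
```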
-## Enable through Identity Protection
-
-1. Sign in to the **Azure portal**.
-1. Select **All services**, then browse to **Azure AD Identity Protection**.
-1. Select **User risk policy**.
-1. Under **Assignments**, select **Users**.
- 1. Under **Include**, select **All users**.
- 1. Under **Exclude**, select **Select excluded users**, choose your organization's emergency access or break-glass accounts, and select **Select**.
- 1. Select **Done**.
-1. Under **Conditions**, select **User risk**, then choose **High**.
- 1. Select **Select**, then **Done**.
-1. Under **Controls** > **Access**, choose **Allow access**, and then select **Require password change**.
- 1. Select **Select**.
-1. Set **Enforce Policy** to **On**.
-1. Select **Save**.
- ## Next steps [Conditional Access common policies](concept-conditional-access-policy-common.md)
active-directory Howto Conditional Access Policy Risk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-risk.md
Previously updated : 11/05/2021 Last updated : 03/21/2022
Organizations can choose to deploy this policy using the steps outlined below or
1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts. 1. Select **Done**. 1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**.
-1. Under **Conditions** > **Sign-in risk**, set **Configure** to **Yes**. Under **Select the sign-in risk level this policy will apply to**
+1. Under **Conditions** > **Sign-in risk**, set **Configure** to **Yes**. Under **Select the sign-in risk level this policy will apply to**:
1. Select **High** and **Medium**. 1. Select **Done**.
-1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and select **Select**.
+1. Under **Access controls** > **Grant**:
+ 1. Select **Grant access** and **Require multi-factor authentication**.
+ 1. Select **Select**.
1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy. After confirming your settings using [report-only mode](howto-conditional-access-insights-reporting.md), an administrator can move the **Enable policy** toggle from **Report-only** to **On**.
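Since Graph API support is cited as an advantage of configuring this through Conditional Access, the sign-in risk policy above can also be sketched as a Graph `conditionalAccessPolicy` payload. As before, this is a sketch under assumptions: the excluded-group ID is a placeholder, and field names should be checked against the current Graph reference.

```python
# Sketch: the sign-in risk policy steps above as a Microsoft Graph
# conditionalAccessPolicy payload (POST /identity/conditionalAccess/policies).
# Verify field names against the current Graph reference before use.
import json

BREAK_GLASS_GROUP_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

policy = {
    "displayName": "Sign-in risk-based multi-factor authentication",
    # Report-only first, as the steps above recommend.
    "state": "enabledForReportingButNotEnforced",
    "conditions": {
        "users": {
            "includeUsers": ["All"],
            "excludeGroups": [BREAK_GLASS_GROUP_ID],
        },
        "applications": {"includeApplications": ["All"]},
        "signInRiskLevels": ["high", "medium"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}

print(json.dumps(policy, indent=2))
```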
-## Enable through Identity Protection
-
-1. Sign in to the **Azure portal**.
-1. Select **All services**, then browse to **Azure AD Identity Protection**.
-1. Select **Sign-in risk policy**.
-1. Under **Assignments**, select **Users**.
- 1. Under **Include**, select **All users**.
- 1. Under **Exclude**, select **Select excluded users**, choose your organization's emergency access or break-glass accounts, and select **Select**.
- 1. Select **Done**.
-1. Under **Conditions**, select **Sign-in risk**, then choose **Medium and above**.
- 1. Select **Select**, then **Done**.
-1. Under **Controls** > **Access**, choose **Allow access**, and then select **Require multi-factor authentication**.
- 1. Select **Select**.
-1. Set **Enforce Policy** to **On**.
-1. Select **Save**.
- ## Next steps [Conditional Access common policies](concept-conditional-access-policy-common.md)
active-directory Workload Identity Federation Create Trust https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/workload-identity-federation-create-trust.md
For examples, see [Configure an app to trust a GitHub repo](workload-identity-fe
Run the following command to configure a federated identity credential on an app and create a trust relationship with a Kubernetes service account. Specify the following parameters: -- *issuer* is your service account issuer URL (the [OIDC issuer URL](/azure/aks/cluster-configuration#oidc-issuer-preview) for the managed cluster or the [OIDC Issuer URL](https://azure.github.io/azure-workload-identity/docs/installation/self-managed-clusters/oidc-issuer.html) for a self-managed cluster).
+- *issuer* is your service account issuer URL (the [OIDC issuer URL](../../aks/cluster-configuration.md#oidc-issuer-preview) for the managed cluster or the [OIDC Issuer URL](https://azure.github.io/azure-workload-identity/docs/installation/self-managed-clusters/oidc-issuer.html) for a self-managed cluster).
- *subject* is the subject name in the tokens issued to the service account. Kubernetes uses the following format for subject names: `system:serviceaccount:<SERVICE_ACCOUNT_NAMESPACE>:<SERVICE_ACCOUNT_NAME>`. - *name* is the name of the federated credential, which cannot be changed later. - *audiences* lists the audiences that can appear in the 'aud' claim of the external token. This field is mandatory, and defaults to "api://AzureADTokenExchange".
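As an illustration of how the parameters above fit together, here is a minimal Python sketch composing the request body for a federated identity credential (sent to `POST /applications/{object-id}/federatedIdentityCredentials` in Microsoft Graph). The namespace, service account, and issuer values are placeholders, not real values.

```python
# Illustrative sketch only: composing a federated identity credential body
# from the parameters described above. All concrete values are placeholders.
namespace = "my-namespace"
service_account = "my-service-account"
issuer = "https://example.oidc.issuer/"  # your cluster's OIDC issuer URL

credential = {
    "name": "kubernetes-federated-credential",  # cannot be changed later
    "issuer": issuer,
    # Kubernetes subject format: system:serviceaccount:<NAMESPACE>:<NAME>
    "subject": f"system:serviceaccount:{namespace}:{service_account}",
    # Mandatory; defaults to the Azure AD token-exchange audience.
    "audiences": ["api://AzureADTokenExchange"],
}

print(credential["subject"])
```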
Select the **Kubernetes accessing Azure resources** scenario from the dropdown m
Fill in the **Cluster issuer URL**, **Namespace**, **Service account name**, and **Name** fields: -- **Cluster issuer URL** is the [OIDC issuer URL](/azure/aks/cluster-configuration#oidc-issuer-preview) for the managed cluster or the [OIDC Issuer URL](https://azure.github.io/azure-workload-identity/docs/installation/self-managed-clusters/oidc-issuer.html) for a self-managed cluster.
+- **Cluster issuer URL** is the [OIDC issuer URL](../../aks/cluster-configuration.md#oidc-issuer-preview) for the managed cluster or the [OIDC Issuer URL](https://azure.github.io/azure-workload-identity/docs/installation/self-managed-clusters/oidc-issuer.html) for a self-managed cluster.
- **Service account name** is the name of the Kubernetes service account, which provides an identity for processes that run in a Pod. - **Namespace** is the service account namespace. - **Name** is the name of the federated credential, which cannot be changed later.
active-directory Directory Delete Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/directory-delete-howto.md
You can put a subscription into the **Deprovisioned** state to be deleted in thr
If you have an Active or Cancelled Azure Subscription associated to your Azure AD Tenant then you would not be able to delete Azure AD Tenant. After you cancel, billing is stopped immediately. However, Microsoft waits 30 - 90 days before permanently deleting your data in case you need to access it or you change your mind. We don't charge you for keeping the data. -- If you have a free trial or pay-as-you-go subscription, you don't have to wait 90 days for the subscription to automatically delete. You can delete your subscription three days after you cancel it. The Delete subscription option isn't available until three days after you cancel your subscription. For more details please read through [Delete free trial or pay-as-you-go subscriptions](https://docs.microsoft.com/azure/cost-management-billing/manage/cancel-azure-subscription#delete-free-trial-or-pay-as-you-go-subscriptions).-- All other subscription types are deleted only through the [subscription cancellation](https://docs.microsoft.com/azure/cost-management-billing/manage/cancel-azure-subscription#cancel-subscription-in-the-azure-portal) process. In other words, you can't delete a subscription directly unless it's a free trial or pay-as-you-go subscription. However, after you cancel a subscription, you can create an [Azure support request](https://go.microsoft.com/fwlink/?linkid=2083458) to ask to have the subscription deleted immediately.-- Alternatively, you can also move/transfer the Azure subscription to another Azure AD tenant account. When you transfer billing ownership of your subscription to an account in another Azure AD tenant, you can move the subscription to the new account's tenant. Additionally, perfoming Switch Directory on the subscription would not help as the billing would still be aligned with Azure AD Tenant which was used to sign up for the subscription. 
For more information review [Transfer a subscription to another Azure AD tenant account](https://docs.microsoft.com/azure/cost-management-billing/manage/billing-subscription-transfer#transfer-a-subscription-to-another-azure-ad-tenant-account)
+- If you have a free trial or pay-as-you-go subscription, you don't have to wait 90 days for the subscription to automatically delete. You can delete your subscription three days after you cancel it. The Delete subscription option isn't available until three days after you cancel your subscription. For more details, see [Delete free trial or pay-as-you-go subscriptions](../../cost-management-billing/manage/cancel-azure-subscription.md#delete-free-trial-or-pay-as-you-go-subscriptions).
+- All other subscription types are deleted only through the [subscription cancellation](../../cost-management-billing/manage/cancel-azure-subscription.md#cancel-subscription-in-the-azure-portal) process. In other words, you can't delete a subscription directly unless it's a free trial or pay-as-you-go subscription. However, after you cancel a subscription, you can create an [Azure support request](https://go.microsoft.com/fwlink/?linkid=2083458) to ask to have the subscription deleted immediately.
+- Alternatively, you can also move/transfer the Azure subscription to another Azure AD tenant account. When you transfer billing ownership of your subscription to an account in another Azure AD tenant, you can move the subscription to the new account's tenant. Additionally, performing a Switch Directory on the subscription would not help, as billing would still be aligned with the Azure AD tenant that was used to sign up for the subscription. For more information, review [Transfer a subscription to another Azure AD tenant account](../../cost-management-billing/manage/billing-subscription-transfer.md#transfer-a-subscription-to-another-azure-ad-tenant-account).
Once you have cancelled and deleted all the Azure and Office/Microsoft 365 subscriptions, you can proceed with cleaning up the rest of the items within the Azure AD tenant before actually deleting it.
You can put a self-service sign-up product like Microsoft Power BI or Azure Righ
## Next steps
-[Azure Active Directory documentation](../index.yml)
+[Azure Active Directory documentation](../index.yml)
active-directory Groups Assign Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-assign-sensitivity-labels.md
Previously updated : 11/19/2021 Last updated : 04/19/2022
# Assign sensitivity labels to Microsoft 365 groups in Azure Active Directory
-Azure Active Directory (Azure AD) supports applying sensitivity labels published by the [Microsoft 365 compliance center](https://sip.protection.office.com/homepage) to Microsoft 365 groups. Sensitivity labels apply to group across services like Outlook, Microsoft Teams, and SharePoint. For more information about Microsoft 365 apps support, see [Microsoft 365 support for sensitivity labels](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites#support-for-the-sensitivity-labels).
+Azure Active Directory (Azure AD) supports applying sensitivity labels published by the [Microsoft Purview compliance portal](https://compliance.microsoft.com) to Microsoft 365 groups. Sensitivity labels apply to groups across services like Outlook, Microsoft Teams, and SharePoint. For more information about Microsoft 365 apps support, see [Microsoft 365 support for sensitivity labels](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites#support-for-the-sensitivity-labels).
> [!IMPORTANT] > To configure this feature, there must be at least one active Azure Active Directory Premium P1 license in your Azure AD organization.
After you enable this feature, the "classic" classifications for groups will
The sensitivity label option is only displayed for groups when all the following conditions are met:
-1. Labels are published in the Microsoft 365 Compliance Center for this Azure AD organization.
+1. Labels are published in the Microsoft Purview compliance portal for this Azure AD organization.
1. The feature is enabled: EnableMIPLabels is set to True in the Azure AD PowerShell module. 1. Labels are synchronized to Azure AD with the Execute-AzureAdLabelSync cmdlet in the Security & Compliance PowerShell module. It can take up to 24 hours after synchronization for the label to be available to Azure AD. 1. The group is a Microsoft 365 group.
Please make sure all the conditions are met in order to assign labels to a group
If the label you are looking for is not in the list, this could be the case for one of the following reasons: -- The label might not be published in the Microsoft 365 Compliance Center. This could also apply to labels that are no longer published. Please check with your administrator for more information.
+- The label might not be published in the Microsoft Purview compliance portal. This could also apply to labels that are no longer published. Please check with your administrator for more information.
- The label may be published, however, it is not available to the user that is signed-in. Please check with your administrator for more information on how to get access to the label. ### How to change the label on a group
Labels can be swapped at any time using the same steps as assigning a label to a
### Group setting changes to published labels aren't updated on the groups
-When you make changes to group settings for a published label in [Microsoft 365 compliance center](https://sip.protection.office.com/homepage), those policy changes aren't automatically applied on the labeled groups. Once the sensitivity label is published and applied to groups, Microsoft recommend that you not change the group settings for the label in Microsoft 365 Compliance Center.
+When you make changes to group settings for a published label in the [Microsoft Purview compliance portal](https://compliance.microsoft.com), those policy changes aren't automatically applied on the labeled groups. Once the sensitivity label is published and applied to groups, Microsoft recommends that you not change the group settings for the label in the Microsoft Purview compliance portal.
If you must make a change, use an [Azure AD PowerShell script](https://github.com/microsoftgraph/powershell-aad-samples/blob/master/ReassignSensitivityLabelToO365Groups.ps1) to manually apply updates to the impacted groups. This method makes sure that all existing groups enforce the new setting.
active-directory Active Directory Get Started Premium https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/active-directory-get-started-premium.md
Before you sign up for Active Directory Premium 1 or Premium 2, you must first d
Signing up using your Azure subscription with previously purchased and activated Azure AD licenses automatically activates the licenses in the same directory. If that's not the case, you must still activate your license plan and your Azure AD access. For more information about activating your license plan, see [Activate your new license plan](#activate-your-new-license-plan). For more information about activating your Azure AD access, see [Activate your Azure AD access](#activate-your-azure-ad-access). ## Sign up using your existing Azure or Microsoft 365 subscription
-As an Azure or Microsoft 365 subscriber, you can purchase the Azure Active Directory Premium editions online. For detailed steps, see [Buy or remove licenses](https://docs.microsoft.com/microsoft-365/commerce/licenses/buy-licenses?view=o365-worldwide).
+As an Azure or Microsoft 365 subscriber, you can purchase the Azure Active Directory Premium editions online. For detailed steps, see [Buy or remove licenses](/microsoft-365/commerce/licenses/buy-licenses?view=o365-worldwide).
## Sign up using your Enterprise Mobility + Security licensing plan Enterprise Mobility + Security is a suite, comprised of Azure AD Premium, Azure Information Protection, and Microsoft Intune. If you already have an EMS license, you can get started with Azure AD, using one of these licensing options:
After your purchased licenses are provisioned in your directory, you'll receive
The activation process typically takes only a few minutes and then you can use your Azure AD tenant. ## Next steps
-Now that you have Azure AD Premium, you can [customize your domain](add-custom-domain.md), add your [corporate branding](customize-branding.md), [create a tenant](active-directory-access-create-new-tenant.md), and [add groups](active-directory-groups-create-azure-portal.md) and [users](add-users-azure-active-directory.md).
+Now that you have Azure AD Premium, you can [customize your domain](add-custom-domain.md), add your [corporate branding](customize-branding.md), [create a tenant](active-directory-access-create-new-tenant.md), and [add groups](active-directory-groups-create-azure-portal.md) and [users](add-users-azure-active-directory.md).
active-directory Active Directory Whatis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/active-directory-whatis.md
Azure Active Directory (Azure AD) is a cloud-based identity and access management service. This service helps your employees access external resources, such as Microsoft 365, the Azure portal, and thousands of other SaaS applications. Azure Active Directory also helps them access internal resources like apps on your corporate intranet network, along with any cloud apps developed for your own organization. For more information about creating a tenant for your organization, see [Quickstart: Create a new tenant in Azure Active Directory](active-directory-access-create-new-tenant.md).
-To learn the differences between Azure Active Directory and Azure Active Directory, see [Compare Active Directory to Azure Active Directory](active-directory-compare-azure-ad-to-ad.md). You can also refer [Microsoft Cloud for Enterprise Architects Series](/microsoft-365/solutions/cloud-architecture-models) posters to better understand the core identity services in Azure like Azure AD and Microsoft-365.
+To learn the differences between Active Directory and Azure Active Directory, see [Compare Active Directory to Azure Active Directory](active-directory-compare-azure-ad-to-ad.md). You can also refer to the [Microsoft Cloud for Enterprise Architects Series](/microsoft-365/solutions/cloud-architecture-models) posters to better understand the core identity services in Azure, like Azure AD and Microsoft 365.
## Who uses Azure AD?
active-directory How To Connect Azure Ad Trust https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-azure-ad-trust.md
You can restore the issuance transform rules using the suggested steps below
## Best practice for securing and monitoring the AD FS trust with Azure AD When you federate your AD FS with Azure AD, it is critical that the federation configuration (trust relationship configured between AD FS and Azure AD) is monitored closely, and any unusual or suspicious activity is captured. To do so, we recommend setting up alerts and getting notified whenever any changes are made to the federation configuration. To learn how to setup alerts, see [Monitor changes to federation configuration](how-to-connect-monitor-federation-changes.md).
-If you are using cloud Azure MFA, for multi factor authentication, with federated users, we highly recommend enabling additional security protection. This security protection prevents bypassing of cloud Azure MFA when federated with Azure AD. When enabled, for a federated domain in your Azure AD tenant, it ensures that a bad actor cannot bypass Azure MFA by imitating that a multi factor authentication has already been performed by the identity provider. The protection can be enabled via new security setting, `federatedIdpMfaBehavior`.For additional information see [Best practices for securing Active Directory Federation Services](https://docs.microsoft.com/windows-server/identity/ad-fs/deployment/best-practices-securing-ad-fs#enable-protection-to-prevent-by-passing-of-cloud-azure-mfa-when-federated-with-azure-ad)
+If you are using cloud Azure MFA for multi-factor authentication with federated users, we highly recommend enabling additional security protection. This security protection prevents bypassing of cloud Azure MFA when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a bad actor cannot bypass Azure MFA by imitating that multi-factor authentication has already been performed by the identity provider. The protection can be enabled via the new security setting `federatedIdpMfaBehavior`. For additional information, see [Best practices for securing Active Directory Federation Services](/windows-server/identity/ad-fs/deployment/best-practices-securing-ad-fs#enable-protection-to-prevent-by-passing-of-cloud-azure-mfa-when-federated-with-azure-ad)
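As a rough sketch of what enabling this setting might involve: the `federatedIdpMfaBehavior` property lives on the domain's federation configuration in Microsoft Graph. Both the endpoint path and the enum value below are assumptions to be verified against the current Graph documentation.

```python
# Assumed sketch: enabling federatedIdpMfaBehavior via Microsoft Graph
# (PATCH /domains/{domain-id}/federationConfiguration/{config-id} on the
# internalDomainFederation resource). Verify the endpoint and value against
# the current Graph reference before relying on this.
update = {
    # Reject MFA claims from the federated IdP so Azure AD performs MFA itself.
    "federatedIdpMfaBehavior": "rejectMfaByFederatedIdp",
}

print(update)
```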
## Next steps
-* [Manage and customize Active Directory Federation Services using Azure AD Connect](how-to-connect-fed-management.md)
+* [Manage and customize Active Directory Federation Services using Azure AD Connect](how-to-connect-fed-management.md)
active-directory Plan Hybrid Identity Design Considerations Tools Comparison https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/plan-hybrid-identity-design-considerations-tools-comparison.md
na Previously updated : 04/07/2020 Last updated : 04/18/2022
Over the years the directory integration tools have grown and evolved. -- [FIM](/previous-versions/windows/desktop/forefront-2010/ff182370(v=vs.100)) and [MIM](/microsoft-identity-manager/microsoft-identity-manager-2016) are still supported and primarily enable synchronization between on-premises systems. The [FIM Windows Azure AD Connector](/previous-versions/mim/dn511001(v=ws.10)) is supported in both FIM and MIM, but not recommended for new deployments - customers with on-premises sources such as Notes or SAP HCM should use MIM to populate Active Directory Domain Services (AD DS) and then also use either Azure AD Connect sync or Azure AD Connect cloud provisioning to synchronize from AD DS to Azure AD.
+- [MIM](/microsoft-identity-manager/microsoft-identity-manager-2016) is still supported, and primarily enables synchronization from or between on-premises systems. The [FIM Windows Azure AD Connector](/previous-versions/mim/dn511001(v=ws.10)) is deprecated. Customers with on-premises sources such as Notes or SAP HCM should use MIM in one of two topologies.
+ - If users and groups are needed in Active Directory Domain Services (AD DS), then use MIM to populate users and groups into AD DS, and use either Azure AD Connect sync or Azure AD Connect cloud provisioning to synchronize those users and groups from AD DS to Azure AD.
+ - If users and groups are not needed in AD DS, then use MIM to populate users and groups into Azure AD through the [MIM Graph connector](/microsoft-identity-manager/microsoft-identity-manager-2016-connector-graph).
- [Azure AD Connect sync](how-to-connect-sync-whatis.md) incorporates the components and functionality previously released in DirSync and Azure AD Sync, for synchronizing between AD DS forests and Azure AD. - [Azure AD Connect cloud provisioning](../cloud-sync/what-is-cloud-sync.md) is a new Microsoft agent for synching from AD DS to Azure AD, useful for scenarios such as merger and acquisition where the acquired company's AD forests are isolated from the parent company's AD forests.
-To learn more about the differences between Azure AD Connect sync and Azure AD Connect cloud provisioning, see the article [What is Azure AD Connect cloud provisioning?](../cloud-sync/what-is-cloud-sync.md)
+To learn more about the differences between Azure AD Connect sync and Azure AD Connect cloud provisioning, see the article [What is Azure AD Connect cloud provisioning?](../cloud-sync/what-is-cloud-sync.md). For more information on deployment options with multiple HR sources or directories, see the article [parallel and combined identity infrastructure options](../fundamentals/azure-active-directory-parallel-identity-options.md).
## Next steps Learn more about [Integrating your on-premises identities with Azure Active Directory](whatis-hybrid-identity.md).
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/reference-connect-version-history.md
To read more about auto-upgrade, see [Azure AD Connect: Automatic upgrade](how-t
### Bug fixes - Fixed an issue where some sync rule functions were not parsing surrogate pairs properly.
 - Fixed an issue where, under certain circumstances, the sync service would not start due to a model db corruption. You can read more about the model db corruption issue in [this article](/troubleshoot/azure/active-directory/resolve-model-database-corruption-sqllocaldb).
## 2.0.91.0
This is a bug fix release. There are no functional changes in this release.
## Next steps
-Learn more about how to [integrate your on-premises identities with Azure AD](whatis-hybrid-identity.md).
+Learn more about how to [integrate your on-premises identities with Azure AD](whatis-hybrid-identity.md).
active-directory Concept Identity Protection Risks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/concept-identity-protection-risks.md
Previously updated : 01/24/2022 Last updated : 04/15/2022
Identity Protection provides organizations access to powerful resources to see a
## Risk types and detection
-Risk can be detected at the **User** and **Sign-in** level and two types of detection or calculation **Real-time** and **Offline**.
+Risk can be detected at the **User** and **Sign-in** levels, with two types of detection or calculation: **Real-time** and **Offline**. Some risks are considered premium and are available to Azure AD Premium P2 customers only, while others are available to Free and Azure AD Premium P1 customers.
+
+A sign-in risk represents the probability that a given authentication request isn't authorized by the identity owner. Risky activity can be detected for a user that isn't linked to a specific malicious sign-in but to the user itself.
Real-time detections may not show up in reporting for five to 10 minutes. Offline detections may not show up in reporting for 48 hours. > [!NOTE]
-> Our system may detect that the risk event that contributed to the risk user risk score was a false positives or the user risk was remediated with policy enforcement such as completing an MFA prompt or secure password change. Therefore our system will dismiss the risk state and a risk detail of ΓÇ£AI confirmed sign-in safeΓÇ¥ will surface and it will no longer contribute to the userΓÇÖs risk.
-
-### User-linked detections
+> Our system may detect that the risk event that contributed to the user risk score was a false positive, or that the user risk was remediated with policy enforcement such as completing multi-factor authentication or a secure password change. Therefore our system will dismiss the risk state, a risk detail of "AI confirmed sign-in safe" will surface, and the event will no longer contribute to the user's risk.
-Risky activity can be detected for a user that is not linked to a specific malicious sign-in but to the user itself.
+### Premium detections
-These risks are calculated offline using Microsoft's internal and external threat intelligence sources including security researchers, law enforcement professionals, security teams at Microsoft, and other trusted sources.
+Premium detections are visible only to Azure AD Premium P2 customers. Customers without Azure AD Premium P2 licenses still receive the premium detections, but they'll be titled "additional risk detected".
-| Risk detection | Description |
-| | |
-| Leaked credentials | This risk detection type indicates that the user's valid credentials have been leaked. When cybercriminals compromise valid passwords of legitimate users, they often share those credentials. This sharing is typically done by posting publicly on the dark web, paste sites, or by trading and selling the credentials on the black market. When the Microsoft leaked credentials service acquires user credentials from the dark web, paste sites, or other sources, they are checked against Azure AD users' current valid credentials to find valid matches. For more information about leaked credentials, see [Common questions](#common-questions). |
-| Azure AD threat intelligence | This risk detection type indicates user activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
-| Possible attempt to access Primary Refresh Token (PRT)| This risk detection type is detected by Microsoft Defender for Endpoint (MDE). A Primary Refresh Token (PRT) is a key artifact of Azure AD authentication on Windows 10, Windows Server 2016, and later versions, iOS, and Android devices. It is a JSON Web Token (JWT) that's specially issued to Microsoft first-party token brokers to enable single sign-on (SSO) across the applications used on those devices. Attackers can attempt to access this resource to move laterally into an organization or perform credential theft. This detection will move users to high risk and will only fire in organizations that have deployed MDE. This is a low-volume detection that will be infrequently seen by most organizations. However, when it does occur it is high risk and users should be remediated.|
### Sign-in risk
-A sign-in risk represents the probability that a given authentication request is not authorized by the identity owner.
-
-These risks can be calculated in real-time or calculated offline using Microsoft's internal and external threat intelligence sources including security researchers, law enforcement professionals, security teams at Microsoft, and other trusted sources.
+#### Premium sign-in risk detections
| Risk detection | Detection type | Description |
| | | |
-| Anonymous IP address | Real-time | This risk detection type indicates sign-ins from an anonymous IP address (for example, Tor browser or anonymous VPN). These IP addresses are typically used by actors who want to hide their login telemetry (IP address, location, device, and so on) for potentially malicious intent. |
| Atypical travel | Offline | This risk detection type identifies two sign-ins originating from geographically distant locations, where at least one of the locations may also be atypical for the user, given past behavior. Among several other factors, this machine learning algorithm takes into account the time between the two sign-ins and the time it would have taken for the user to travel from the first location to the second, indicating that a different user is using the same credentials. <br><br> The algorithm ignores obvious "false positives" contributing to the impossible travel conditions, such as VPNs and locations regularly used by other users in the organization. The system has an initial learning period of the earliest of 14 days or 10 logins, during which it learns a new user's sign-in behavior. |
-| Anomalous Token | Offline | This detection indicates that there are abnormal characteristics in the token such as an unusual token lifetime or a token that is played from an unfamiliar location. This detection covers Session Tokens and Refresh Tokens. ***NOTE:** Anomalous token is tuned to incur more noise than other detections at the same risk level. This tradeoff is chosen to increase the likelihood of detecting replayed tokens that may otherwise go unnoticed. Because this is a high noise detection, there is a higher than normal chance that some of the sessions flagged by this detection are false positives. We recommend investigating the sessions flagged by this detection in the context of other sign-ins from the user. If the location, application, IP address, User Agent, or other characteristics are unexpected for the user, the tenant admin should consider this as an indicator of potential token replay*. |
+| Anomalous Token | Offline | This detection indicates that there are abnormal characteristics in the token such as an unusual token lifetime or a token that is played from an unfamiliar location. This detection covers Session Tokens and Refresh Tokens. <br><br> **NOTE:** Anomalous token is tuned to incur more noise than other detections at the same risk level. This tradeoff is chosen to increase the likelihood of detecting replayed tokens that may otherwise go unnoticed. Because this is a high noise detection, there's a higher than normal chance that some of the sessions flagged by this detection are false positives. We recommend investigating the sessions flagged by this detection in the context of other sign-ins from the user. If the location, application, IP address, User Agent, or other characteristics are unexpected for the user, the tenant admin should consider this as an indicator of potential token replay. |
| Token Issuer Anomaly | Offline | This risk detection indicates the SAML token issuer for the associated SAML token is potentially compromised. The claims included in the token are unusual or match known attacker patterns. |
| Malware linked IP address | Offline | This risk detection type indicates sign-ins from IP addresses infected with malware that is known to actively communicate with a bot server. This detection is determined by correlating IP addresses of the user's device against IP addresses that were in contact with a bot server while the bot server was active. <br><br> **[This detection has been deprecated](../fundamentals/whats-new-archive.md#planned-deprecationmalware-linked-ip-address-detection-in-identity-protection)**. Identity Protection will no longer generate new "Malware linked IP address" detections. Customers who currently have "Malware linked IP address" detections in their tenant will still be able to view, remediate, or dismiss them until the 90-day detection retention time is reached. |
| Suspicious browser | Offline | Suspicious browser detection indicates anomalous behavior based on suspicious sign-in activity across multiple tenants from different countries in the same browser. |
-| Unfamiliar sign-in properties | Real-time | This risk detection type considers past sign-in history (IP, Latitude / Longitude and ASN) to look for anomalous sign-ins. The system stores information about previous locations used by a user, and considers these "familiar" locations. The risk detection is triggered when the sign-in occurs from a location that is not already in the list of familiar locations. Newly created users will be in "learning mode" for a while where unfamiliar sign-in properties risk detections will be turned off while our algorithms learn the user's behavior. The learning mode duration is dynamic and depends on how much time it takes the algorithm to gather enough information about the user's sign-in patterns. The minimum duration is five days. A user can go back into learning mode after a long period of inactivity and after a secure password reset. The system also ignores sign-ins from familiar devices, and locations that are geographically close to a familiar location. <br><br> We also run this detection for basic authentication (or legacy protocols). Because these protocols do not have modern properties such as client ID, there is limited telemetry to reduce false positives. We recommend our customers to move to modern authentication. <br><br> Unfamiliar sign-in properties can be detected on both interactive and non-interactive sign-ins. When this detection is detected on non-interactive sign-ins, it deserves increased scrutiny due to the risk of token replay attacks. |
-| Admin confirmed user compromised | Offline | This detection indicates an admin has selected 'Confirm user compromised' in the Risky users UI or using riskyUsers API. To see which admin has confirmed this user compromised, check the user's risk history (via UI or API). |
+| Unfamiliar sign-in properties | Real-time | This risk detection type considers past sign-in history to look for anomalous sign-ins. The system stores information about previous sign-ins, and triggers a risk detection when a sign-in occurs with properties that are unfamiliar to the user. These properties can include IP, ASN, location, device, browser, and tenant IP subnet. Newly created users will be in a "learning mode" period where the unfamiliar sign-in properties risk detection will be turned off while our algorithms learn the user's behavior. The learning mode duration is dynamic and depends on how much time it takes the algorithm to gather enough information about the user's sign-in patterns. The minimum duration is five days. A user can go back into learning mode after a long period of inactivity. <br><br> We also run this detection for basic authentication (or legacy protocols). Because these protocols don't have modern properties such as client ID, there's limited telemetry to reduce false positives. We recommend our customers to move to modern authentication. <br><br> Unfamiliar sign-in properties can be detected on both interactive and non-interactive sign-ins. When this detection is detected on non-interactive sign-ins, it deserves increased scrutiny due to the risk of token replay attacks. |
| Malicious IP address | Offline | This detection indicates sign-in from a malicious IP address. An IP address is considered malicious based on high failure rates because of invalid credentials received from the IP address or other IP reputation sources. |
| Suspicious inbox manipulation rules | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/cloud-app-security/anomaly-detection-policy#suspicious-inbox-manipulation-rules). This detection profiles your environment and triggers alerts when suspicious rules that delete or move messages or folders are set on a user's inbox. This detection may indicate that the user's account is compromised, that messages are being intentionally hidden, and that the mailbox is being used to distribute spam or malware in your organization. |
| Password spray | Offline | A password spray attack is where multiple usernames are attacked using common passwords in a unified brute force manner to gain unauthorized access. This risk detection is triggered when a password spray attack has been performed. |
| New country | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/cloud-app-security/anomaly-detection-policy#activity-from-infrequent-country). This detection considers past activity locations to determine new and infrequent locations. The anomaly detection engine stores information about previous locations used by users in the organization. |
| Activity from anonymous IP address | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/cloud-app-security/anomaly-detection-policy#activity-from-anonymous-ip-addresses). This detection identifies that users were active from an IP address that has been identified as an anonymous proxy IP address. |
| Suspicious inbox forwarding | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/cloud-app-security/anomaly-detection-policy#suspicious-inbox-forwarding). This detection looks for suspicious email forwarding rules, for example, if a user created an inbox rule that forwards a copy of all emails to an external address. |
-| Azure AD threat intelligence | Offline | This risk detection type indicates sign-in activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
| Mass Access to Sensitive Files | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/defender-cloud-apps/investigate-anomaly-alerts#unusual-file-access-by-user). This detection profiles your environment and triggers alerts when users access multiple files from Microsoft SharePoint or Microsoft OneDrive. An alert is triggered only if the number of accessed files is uncommon for the user and the files might contain sensitive information|
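To make the "unfamiliar sign-in properties" idea above concrete, here is a deliberately simplified toy sketch. It is **not** Microsoft's algorithm; it only illustrates the general pattern the table describes: past sign-in properties form a "familiar" baseline, and an initial learning period suppresses detections. The threshold of 10 sign-ins is a hypothetical stand-in.

```python
# Toy illustration of an "unfamiliar sign-in properties"-style check.
# NOT Microsoft's algorithm -- hypothetical threshold, simplified properties.
from collections import defaultdict

LEARNING_SIGN_INS = 10  # hypothetical learning-mode length

class SignInRiskDemo:
    def __init__(self):
        self.familiar = defaultdict(set)   # user -> set of (asn, country)
        self.count = defaultdict(int)      # user -> sign-ins observed so far

    def evaluate(self, user, asn, country):
        props = (asn, country)
        self.count[user] += 1
        in_learning = self.count[user] <= LEARNING_SIGN_INS
        risky = (not in_learning) and (props not in self.familiar[user])
        self.familiar[user].add(props)     # every sign-in updates the baseline
        return "risk" if risky else "ok"

demo = SignInRiskDemo()
for _ in range(10):                        # learning mode: nothing is flagged
    assert demo.evaluate("alice", 8075, "US") == "ok"
assert demo.evaluate("alice", 8075, "US") == "ok"      # familiar properties
assert demo.evaluate("alice", 12389, "RU") == "risk"   # unfamiliar properties
```

The real service weighs many more signals (device, browser, IP subnet, geographic proximity) and uses a dynamic learning period, but the baseline-plus-learning-mode shape is the same.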
+#### Nonpremium sign-in risk detections
+
+| Risk detection | Detection type | Description |
+| | | |
+| Additional risk detected | Real-time or Offline | This detection indicates that one of the premium detections was detected. Since the premium detections are visible only to Azure AD Premium P2 customers, they're titled "additional risk detected" for customers without Azure AD Premium P2 licenses. |
+| Anonymous IP address | Real-time | This risk detection type indicates sign-ins from an anonymous IP address (for example, Tor browser or anonymous VPN). These IP addresses are typically used by actors who want to hide their login telemetry (IP address, location, device, and so on) for potentially malicious intent. |
+| Admin confirmed user compromised | Offline | This detection indicates an admin has selected 'Confirm user compromised' in the Risky users UI or using riskyUsers API. To see which admin has confirmed this user compromised, check the user's risk history (via UI or API). |
+| Azure AD threat intelligence | Offline | This risk detection type indicates user activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
+
+### User-linked detections
+
+#### Premium user risk detections
+
+| Risk detection | Detection type | Description |
+| | | |
+| Possible attempt to access Primary Refresh Token (PRT) | Offline | This risk detection type is detected by Microsoft Defender for Endpoint (MDE). A Primary Refresh Token (PRT) is a key artifact of Azure AD authentication on Windows 10, Windows Server 2016, and later versions, iOS, and Android devices. A PRT is a JSON Web Token (JWT) that's specially issued to Microsoft first-party token brokers to enable single sign-on (SSO) across the applications used on those devices. Attackers can attempt to access this resource to move laterally into an organization or perform credential theft. This detection will move users to high risk and will only fire in organizations that have deployed MDE. This detection is low-volume and will be seen infrequently by most organizations. However, when it does occur it's high risk and users should be remediated. |
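Since the PRT described above is a JSON Web Token, the following sketch shows what the JWT container format looks like: three base64url-encoded segments (`header.payload.signature`), where the payload carries the claims. The token and claim values here are entirely made up for illustration; this is not a real PRT.

```python
# Sketch: JWT structure (header.payload.signature), using a made-up token.
import base64
import json

def b64url_decode(segment: str) -> bytes:
    padding = "=" * (-len(segment) % 4)   # restore the stripped '=' padding
    return base64.urlsafe_b64decode(segment + padding)

# Build a hypothetical unsigned token just to show the claims segment.
claims = {"aud": "example-broker", "iss": "https://login.example.com", "exp": 1700000000}
payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).rstrip(b"=").decode()
token = "eyJhbGciOiJSUzI1NiJ9." + payload + ".signature-goes-here"

decoded = json.loads(b64url_decode(token.split(".")[1]))
assert decoded["aud"] == "example-broker"
```

Because the payload is only encoded, not encrypted, possession of a token like a PRT is valuable to an attacker on its own, which is why this detection moves users straight to high risk.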
-### Other risk detections
+#### Nonpremium user risk detections
-| Risk detection | Detection type | Description |
+| Risk detection | Detection type | Description |
| | | |
-| Additional risk detected | Real-time or Offline | This detection indicates that one of the above premium detections was detected. Since the premium detections are visible only to Azure AD Premium P2 customers, they are titled "additional risk detected" for customers without Azure AD Premium P2 licenses. |
+| Additional risk detected | Real-time or Offline | This detection indicates that one of the premium detections was detected. Since the premium detections are visible only to Azure AD Premium P2 customers, they're titled "additional risk detected" for customers without Azure AD Premium P2 licenses. |
+| Leaked credentials | Offline | This risk detection type indicates that the user's valid credentials have been leaked. When cybercriminals compromise valid passwords of legitimate users, they often share those credentials. This sharing is typically done by posting publicly on the dark web, paste sites, or by trading and selling the credentials on the black market. When the Microsoft leaked credentials service acquires user credentials from the dark web, paste sites, or other sources, they're checked against Azure AD users' current valid credentials to find valid matches. For more information about leaked credentials, see [Common questions](#common-questions). |
+| Azure AD threat intelligence | Offline | This risk detection type indicates user activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
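As a toy illustration of the leaked-credentials matching described above: a service can check leaked username/password pairs against stored credential hashes without keeping plaintext around. This is **not** Microsoft's actual mechanism (Azure AD uses a hardened, salted scheme via password hash synchronization); all names and the plain SHA-256 hash here are illustrative only.

```python
# Toy sketch of matching leaked credential pairs against stored hashes.
# NOT the real Azure AD mechanism -- illustration only (no salting here).
import hashlib

def h(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()

stored = {"alice@contoso.example": h("Winter2022!")}   # tenant's current hashes
leaked = [("alice@contoso.example", "Winter2022!"),    # found on a paste site
          ("bob@contoso.example", "hunter2")]

matches = [user for user, pw in leaked if stored.get(user) == h(pw)]
assert matches == ["alice@contoso.example"]
```

Only users whose *current* credentials match a leaked pair are flagged, which is why enabling password hash synchronization is a prerequisite for this detection (see the PHS questions below).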
## Common questions

### Risk levels
-Identity Protection categorizes risk into three tiers: low, medium, and high. When configuring [custom Identity protection policies](./concept-identity-protection-policies.md#custom-conditional-access-policy), you can also configure it to trigger upon **No risk** level. No Risk means there is no active indication that the user's identity has been compromised.
+Identity Protection categorizes risk into three tiers: low, medium, and high. When configuring [custom Identity protection policies](./concept-identity-protection-policies.md#custom-conditional-access-policy), you can also configure it to trigger upon **No risk** level. No Risk means there's no active indication that the user's identity has been compromised.
-While Microsoft does not provide specific details about how risk is calculated, we will say that each level brings higher confidence that the user or sign-in is compromised. For example, something like one instance of unfamiliar sign-in properties for a user might not be as threatening as leaked credentials for another user.
+While Microsoft doesn't provide specific details about how risk is calculated, we'll say that each level brings higher confidence that the user or sign-in is compromised. For example, something like one instance of unfamiliar sign-in properties for a user might not be as threatening as leaked credentials for another user.
### Password hash synchronization
Microsoft finds leaked credentials in various places, including:
#### Why am I not seeing any leaked credentials?
-Leaked credentials are processed anytime Microsoft finds a new, publicly available batch. Because of the sensitive nature, the leaked credentials are deleted shortly after processing. Only new leaked credentials found after you enable password hash synchronization (PHS) will be processed against your tenant. Verifying against previously found credential pairs is not done.
+Leaked credentials are processed anytime Microsoft finds a new, publicly available batch. Because of the sensitive nature, the leaked credentials are deleted shortly after processing. Only new leaked credentials found after you enable password hash synchronization (PHS) will be processed against your tenant. Verifying against previously found credential pairs isn't done.
#### I have not seen any leaked credential risk events for quite some time?
-If you have not seen any leaked credential risk events, it is because of the following reasons:
+If you haven't seen any leaked credential risk events, it's because of one of the following reasons:
-- You do not have PHS enabled for your tenant.
+- You don't have PHS enabled for your tenant.
- Microsoft has not found any leaked credential pairs that match your users.

#### How often does Microsoft process new credentials?
active-directory Howto Identity Protection Configure Risk Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/howto-identity-protection-configure-risk-policies.md
Previously updated : 01/24/2022 Last updated : 03/18/2022
Both policies work to automate the response to risk detections in your environme
## Choosing acceptable risk levels
-Organizations must decide the level of risk they are willing to accept balancing user experience and security posture.
+Organizations must decide the level of risk they're willing to accept balancing user experience and security posture.
Microsoft's recommendation is to set the user risk policy threshold to **High** and the sign-in risk policy to **Medium and above** and allow self-remediation options. Choosing to block access rather than allowing self-remediation options, like password change and multi-factor authentication, will impact your users and administrators. Weigh this choice when configuring your policies.
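The recommended thresholds above can be sketched as simple policy logic. This is an illustration of the recommendation (user risk at **High**, sign-in risk at **Medium and above**, self-remediation rather than blocking), not an implementation of Identity Protection itself; the action names are hypothetical.

```python
# Sketch of the recommended risk-policy thresholds. Illustrative only;
# action names are hypothetical labels, not product settings.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def user_risk_action(level: str) -> str:
    # User risk policy threshold: High -> secure password reset
    return "secure_password_reset" if LEVELS[level] >= LEVELS["high"] else "allow"

def sign_in_risk_action(level: str) -> str:
    # Sign-in risk policy threshold: Medium and above -> require MFA
    return "require_mfa" if LEVELS[level] >= LEVELS["medium"] else "allow"

assert user_risk_action("medium") == "allow"
assert user_risk_action("high") == "secure_password_reset"
assert sign_in_risk_action("medium") == "require_mfa"
assert sign_in_risk_action("low") == "allow"
```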
Organizations can choose to block access when risk is detected. Blocking sometim
- When a user risk policy triggers:
  - Administrators can require a secure password reset, requiring Azure AD MFA be done before the user creates a new password with SSPR, resetting the user risk.
- When a sign-in risk policy triggers:
- - Azure AD MFA can be triggered, allowing to user to prove it is them by using one of their registered authentication methods, resetting the sign-in risk.
+ - Azure AD MFA can be triggered, allowing the user to prove it's them by using one of their registered authentication methods, resetting the sign-in risk.
> [!WARNING] > Users must register for Azure AD MFA and SSPR before they face a situation requiring remediation. Users not registered are blocked and require administrator intervention.
## Exclusions
-Policies allow for excluding users such as your [emergency access or break-glass administrator accounts](../roles/security-emergency-access.md). Organizations may need to exclude other accounts from specific policies based on the way the accounts are used. Exclusions should be reviewed regularly to see if they are still applicable.
+Policies allow for excluding users such as your [emergency access or break-glass administrator accounts](../roles/security-emergency-access.md). Organizations may need to exclude other accounts from specific policies based on the way the accounts are used. Exclusions should be reviewed regularly to see if they're still applicable.
## Enable policies
-There are two locations where these policies may be configured, Conditional Access and Identity Protection. Configuration using Conditional Access policies is the preferred method, providing more context including:
+There are two locations where these policies may be configured, Conditional Access and Identity Protection. Configuration using Conditional Access policies is the preferred method, providing more context including:
- Enhanced diagnostic data
- Report-only mode integration
- Graph API support
- Use more Conditional Access attributes in policy
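Because the preferred Conditional Access path supports the Graph API, a sign-in risk policy can be expressed as a `conditionalAccessPolicy` request body. The sketch below builds such a body in Python; the field names follow the Microsoft Graph `conditionalAccessPolicy` schema as best understood here, and should be treated as assumptions to verify against the Graph API reference before use.

```python
import json

# Sketch of a Graph conditionalAccessPolicy body requiring MFA for
# medium-and-above sign-in risk. Field names are assumptions -- verify
# against the Microsoft Graph reference before sending.
policy = {
    "displayName": "Require MFA for medium+ sign-in risk",
    "state": "enabledForReportingButNotEnforced",  # start in report-only mode
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {"includeApplications": ["All"]},
        "signInRiskLevels": ["medium", "high"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}
body = json.dumps(policy)
assert "signInRiskLevels" in body
```

Starting in report-only mode matches the guidance above: observe the policy's impact via the enhanced diagnostic data before enforcing it.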
+Organizations can choose to deploy policies using the steps outlined below or using the [Conditional Access templates (Preview)](../conditional-access/concept-conditional-access-policy-common.md#conditional-access-templates-preview).
+ > [!VIDEO https://www.youtube.com/embed/zEsbbik-BTE]
-Before enabling remediation policies, organizations may want to [investigate](howto-identity-protection-investigate-risk.md) and [remediate](howto-identity-protection-remediate-unblock.md) any active risks.
+Before organizations enable remediation policies, they may want to [investigate](howto-identity-protection-investigate-risk.md) and [remediate](howto-identity-protection-remediate-unblock.md) any active risks.
### User risk with Conditional Access
active-directory How To Managed Identity Regional Move https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/how-to-managed-identity-regional-move.md
+---
+title: Move managed identities to another region - Azure AD
+description: Steps involved in getting a managed identity recreated in another region
+ms.date: 04/13/2022
+---
+# Move managed identity for Azure resources across regions
+
+There are situations in which you'd want to move your existing user-assigned managed identities from one region to another. For example, you may need to move a solution that uses user-assigned managed identities to another region. You may also want to move an existing identity to another region as part of disaster recovery planning and testing.
+
+Moving user-assigned managed identities across Azure regions isn't supported. You can, however, recreate a user-assigned managed identity in the target region.
+
+## Prerequisites
+
+- Permissions to list permissions granted to existing user-assigned managed identity.
+- Permissions to grant a new user-assigned managed identity the required permissions.
+- Permissions to assign a new user-assigned identity to the Azure resources.
+- Permissions to edit Group membership, if your user-assigned managed identity is a member of one or more groups.
+
+## Prepare and move
+
+1. Copy the permissions assigned to the existing user-assigned managed identity. You can list [Azure role assignments](../../role-based-access-control/role-assignments-list-powershell.md), but that may not be enough depending on how permissions were granted to the user-assigned managed identity. You should confirm that your solution doesn't depend on permissions granted using a service-specific option.
+1. Create a [new user-assigned managed identity](how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-powershell#create-a-user-assigned-managed-identity-2) at the target region.
+1. Grant the managed identity the same permissions as the original identity that it's replacing, including Group membership. You can review [Assign Azure roles to a managed identity](../../role-based-access-control/role-assignments-portal-managed-identity.md), and [Group membership](../../active-directory/fundamentals/active-directory-groups-view-azure-portal.md).
+1. Specify the new identity in the properties of each resource instance that used the original user-assigned managed identity.
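Steps 1 and 3 above amount to copying role assignments from the old identity to the new one. Before cleaning up, it can help to diff the two assignment lists (for example, as exported from a role-assignment listing) to confirm parity. This is an illustrative sketch with made-up scope values, not an Azure SDK call.

```python
# Sketch: compare role assignments of the source and target identities
# to confirm parity before deleting the source. Data is made up.
def assignment_keys(assignments):
    # (role, scope) pairs identify an assignment for this comparison
    return {(a["roleDefinitionName"], a["scope"]) for a in assignments}

old = [{"roleDefinitionName": "Reader", "scope": "/subscriptions/xxx/rg1"},
       {"roleDefinitionName": "Contributor", "scope": "/subscriptions/xxx/rg2"}]
new = [{"roleDefinitionName": "Reader", "scope": "/subscriptions/xxx/rg1"}]

missing = assignment_keys(old) - assignment_keys(new)
assert missing == {("Contributor", "/subscriptions/xxx/rg2")}
```

Remember that role assignments may not be the whole story: group memberships and service-specific grants (step 1's caveat) need the same before/after check.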
+
+## Verify
+
+After reconfiguring your service to use your new managed identities in the target region, you need to confirm that all operations have been restored.
+
+## Clean up
+
+Once you confirm your service is back online, you can delete any resources in the source region that you no longer use.
+
+## Next steps
+
+In this tutorial, you took the steps needed to recreate a user-assigned managed identity in a new region.
+
+- [Manage user-assigned managed identities](how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-powershell#delete-a-user-assigned-managed-identity-2)
active-directory Managed Identities Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/managed-identities-status.md
The following Azure services support managed identities for Azure resources:
| Azure Media services | [Managed identities](/azure/media-services/latest/concept-managed-identities) |
| Azure Monitor | [Azure Monitor customer-managed key](../../azure-monitor/logs/customer-managed-keys.md?tabs=portal) |
| Azure Policy | [Remediate non-compliant resources with Azure Policy](../../governance/policy/how-to/remediate-resources.md) |
-| Azure Purview | [Credentials for source authentication in Azure Purview](../../purview/manage-credentials.md) |
+| Microsoft Purview | [Credentials for source authentication in Microsoft Purview](../../purview/manage-credentials.md) |
| Azure Resource Mover | [Move resources across regions (from resource group)](../../resource-mover/move-region-within-resource-group.md) |
| Azure Site Recovery | [Replicate machines with private endpoints](../../site-recovery/azure-to-azure-how-to-enable-replication-private-endpoints.md#enable-the-managed-identity-for-the-vault) |
| Azure Search | [Set up an indexer connection to a data source using a managed identity](../../search/search-howto-managed-identities-data-sources.md) |
active-directory Delegate By Task https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/delegate-by-task.md
You can further restrict permissions by assigning roles at smaller scopes or by
> [!div class="mx-tableFixed"]
> | Task | Least privileged role | Additional roles |
> | - | - | - |
-> | Create Azure AD Domain Services instance | [Application Administrator](../roles/permissions-reference.md#application-administrator)<br>[Groups Administrator](../roles/permissions-reference.md#groups-administrator)<br> [Domain Services Contributor](/azure/role-based-access-control/built-in-roles#domain-services-contributor)| |
+> | Create Azure AD Domain Services instance | [Application Administrator](../roles/permissions-reference.md#application-administrator)<br>[Groups Administrator](../roles/permissions-reference.md#groups-administrator)<br> [Domain Services Contributor](../../role-based-access-control/built-in-roles.md#domain-services-contributor)| |
> | Perform all Azure AD Domain Services tasks | [AAD DC Administrators group](../../active-directory-domain-services/tutorial-create-management-vm.md#administrative-tasks-you-can-perform-on-a-managed-domain) | |
> | Read all configuration | Reader on Azure subscription containing AD DS service | |
- [Assign Azure AD roles to users](manage-roles-portal.md)
- [Assign Azure AD roles at different scopes](assign-roles-different-scopes.md)
- [Create and assign a custom role in Azure Active Directory](custom-create.md)
-- [Azure AD built-in roles](permissions-reference.md)
+- [Azure AD built-in roles](permissions-reference.md)
active-directory Security Planning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/security-planning.md
keywords:
Previously updated : 11/04/2021 Last updated : 04/19/2022
Evaluate the accounts that are assigned or eligible for the Global Administrator
#### Turn on multi-factor authentication and register all other highly privileged single-user non-federated administrator accounts
-Require Azure AD Multi-Factor Authentication (MFA) at sign-in for all individual users who are permanently assigned to one or more of the Azure AD administrator roles: Global Administrator, Privileged Role Administrator, Exchange Administrator, and SharePoint Administrator. Use the guide to enable [Multi-factor Authentication (MFA) for your administrator accounts](../authentication/howto-mfa-userstates.md) and ensure that all those users have registered at [https://aka.ms/mfasetup](https://aka.ms/mfasetup). More information can be found under step 2 and step 3 of the guide [Protect access to data and services in Microsoft 365](https://support.office.com/article/Protect-access-to-data-and-services-in-Office-365-a6ef28a4-2447-4b43-aae2-f5af6d53c68e).
+Require Azure AD Multi-Factor Authentication (MFA) at sign-in for all individual users who are permanently assigned to one or more of the Azure AD administrator roles: Global Administrator, Privileged Role Administrator, Exchange Administrator, and SharePoint Administrator. Use the guidance at [Enforce multifactor authentication on your administrators](../authentication/how-to-authentication-find-coverage-gaps.md#enforce-multifactor-authentication-on-your-administrators) and ensure that all those users have registered at [https://aka.ms/mfasetup](https://aka.ms/mfasetup). More information can be found under step 2 and step 3 of the guide [Protect user and device access in Microsoft 365](/microsoft-365/compliance/protect-access-to-data-and-services).
## Stage 2: Mitigate frequently used attacks
active-directory Cisco Webex Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/cisco-webex-tutorial.md
Previously updated : 11/01/2021 Last updated : 04/18/2022
In this tutorial, you'll learn how to integrate Cisco Webex Meetings with Azure
* Enable your users to be automatically signed-in to Cisco Webex Meetings with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.

## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). * Cisco Webex Meetings single sign-on (SSO) enabled subscription. * Service Provider Metadata file from Cisco Webex Meetings.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
> [!NOTE] > This integration is also available to use from Azure AD US Government Cloud environment. You can find this application in the Azure AD US Government Cloud Application Gallery and configure it in the same way as you do from public cloud.
In this tutorial, you configure and test Azure AD SSO in a test environment.
* Cisco Webex Meetings supports [**Automated** user provisioning and deprovisioning](cisco-webex-provisioning-tutorial.md) (recommended).
* Cisco Webex Meetings supports **Just In Time** user provisioning.
-## Adding Cisco Webex Meetings from the gallery
+## Add Cisco Webex Meetings from the gallery
To configure the integration of Cisco Webex Meetings into Azure AD, you need to add Cisco Webex Meetings from the gallery to your list of managed SaaS apps.
To configure and test Azure AD SSO with Cisco Webex Meetings, perform the follow
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
-
1. **[Configure Cisco Webex Meetings SSO](#configure-cisco-webex-meetings-sso)** - to configure the single sign-on settings on application side.
- * **[Create Cisco Webex Meetings test user](#create-cisco-webex-meetings-test-user)** - to have a counterpart of B.Simon in Cisco Webex Meetings that is linked to the Azure AD representation of user.
-
+ 1. **[Create Cisco Webex Meetings test user](#create-cisco-webex-meetings-test-user)** - to have a counterpart of B.Simon in Cisco Webex Meetings that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
> You will get the Service Provider Metadata file from the **Configure Cisco Webex Meetings SSO** section, which is explained later in the tutorial.

1. If you wish to configure the application in **SP** initiated mode, perform the following steps:
- 1. On the **Basic SAML Configuration** section, click the edit/pen icon.
+ 1. On the **Basic SAML Configuration** section, click the pencil icon.
![Edit Basic SAML Configuration](common/edit-urls.png)
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Sign in to Cisco Webex Meetings with your administrator credentials.
1. Go to **Common Site Settings** and navigate to **SSO Configuration**.
- ![Screenshot shows Cisco Webex Administration with Common Site Settings and S S O Configuration selected.](./media/cisco-webex-tutorial/tutorial-cisco-webex-11.png)
+ ![Screenshot shows Cisco Webex Administration with Common Site Settings and S S O Configuration selected.](./media/cisco-webex-tutorial/settings.png)
1. On the **Webex Administration** page, perform the following steps:
- ![Screenshot shows the Webex Administration page with the information described in this step.](./media/cisco-webex-tutorial/tutorial-cisco-webex-10.png)
+ ![Screenshot shows the Webex Administration page with the information described in this step.](./media/cisco-webex-tutorial/metadata.png)
- 1. select **SAML 2.0** as **Federation Protocol**.
+ 1. Select **SAML 2.0** as **Federation Protocol**.
    1. Click the **Import SAML Metadata** link to upload the metadata file, which you downloaded from the Azure portal.
    1. Select **IDP initiated** as the **SSO Profile** and click the **Export** button to download the Service Provider Metadata file, then upload it in the **Basic SAML Configuration** section in the Azure portal.
- 1. In the **AuthContextClassRef** textbox, type one of the following values:
- * `urn:oasis:names:tc:SAML:2.0:ac:classes:unspecified`
- * `urn:oasis:names:tc:SAML:2.0:ac:classes:Password`
-
- To enable the MFA by using Azure AD, enter the two values like this:
- `urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport;urn:oasis:names:tc:SAML:2.0:ac:classes:X509`
- 1. Select **Auto Account Creation**. > [!NOTE]
You can also use Microsoft My Apps to test the application in any mode. When you
## Next steps
-Once you configure Cisco Webex Meetings you can enforce Session Control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session Control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-aad).
+Once you configure Cisco Webex Meetings you can enforce Session Control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session Control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-aad).
active-directory F5 Big Ip Oracle Enterprise Business Suite Easy Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/f5-big-ip-oracle-enterprise-business-suite-easy-button.md
Integrating a BIG-IP with Azure AD provides many benefits, including:
* Manage identities and access from a single control plane, the [Azure portal](https://portal.azure.com/)
-To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](/azure/active-directory/manage-apps/f5-aad-integration) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
+To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](../manage-apps/f5-aad-integration.md) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
## Scenario description
Prior BIG-IP experience isn't necessary, but you need:
* An account with Azure AD application admin [permissions](/azure/active-directory/users-groups-roles/directory-assign-admin-roles#application-administrator)
-* An [SSL Web certificate](/azure/active-directory/manage-apps/f5-bigip-deployment-guide#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
+* An [SSL Web certificate](../manage-apps/f5-bigip-deployment-guide.md#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
* An existing Oracle EBS suite including Oracle AccessGate and an LDAP-enabled OID (Oracle Internet Directory)
Along with this the SAML federation metadata for the published application is al
If the BIG-IP webtop portal is used to access published applications, then a sign-out from there is processed by the APM, which also calls the Azure AD sign-out endpoint. But consider a scenario where the BIG-IP webtop portal isn't used: the user then has no way of instructing the APM to sign out. Even if the user signs out of the application itself, the BIG-IP is technically oblivious to this. For this reason, SP-initiated sign-out needs careful consideration to ensure sessions are securely terminated when no longer required. One way of achieving this is to add an SLO function to your application's sign-out button, so that it can redirect your client to either the Azure AD SAML or BIG-IP sign-out endpoint. The URL for the SAML sign-out endpoint for your tenant can be found in **App Registrations > Endpoints**.
-If making a change to the app is a no go, then consider having the BIG-IP listen for the application's sign-out call, and upon detecting the request have it trigger SLO. Refer to our [Oracle PeopleSoft SLO guidance](/azure/active-directory/manage-apps/f5-big-ip-oracle-peoplesoft-easy-button#peoplesoft-single-logout) for using BIG-IP irules to achieve this. More details on using BIG-IP iRules to achieve this is available in the F5 knowledge article [Configuring automatic session termination (logout) based on a URI-referenced file name](https://support.f5.com/csp/article/K42052145) and [Overview of the Logout URI Include option](https://support.f5.com/csp/article/K12056).
+If making a change to the app is a no-go, then consider having the BIG-IP listen for the application's sign-out call, and upon detecting the request have it trigger SLO. Refer to our [Oracle PeopleSoft SLO guidance](../manage-apps/f5-big-ip-oracle-peoplesoft-easy-button.md#peoplesoft-single-logout) for using BIG-IP iRules to achieve this. More details on using BIG-IP iRules are available in the F5 knowledge articles [Configuring automatic session termination (logout) based on a URI-referenced file name](https://support.f5.com/csp/article/K42052145) and [Overview of the Logout URI Include option](https://support.f5.com/csp/article/K12056).
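As an illustration of the listener approach just described — the iRule simply watches each request for the application's sign-out URI and, on a match, triggers SLO — the decision logic can be sketched in Python. This is a sketch only: the paths below are hypothetical placeholders, and a real deployment implements this as a BIG-IP iRule, not Python.

```python
from urllib.parse import urlparse

# Hypothetical sign-out URIs for illustration only; match whatever
# sign-out call your published application actually makes.
APP_SIGNOUT_PATHS = {"/logout.do", "/signout"}

def should_trigger_slo(request_url: str) -> bool:
    """Return True when a request targets the app's sign-out URI, i.e. the
    point at which the BIG-IP should end its session and redirect the client
    to the Azure AD SAML or BIG-IP sign-out endpoint."""
    return urlparse(request_url).path in APP_SIGNOUT_PATHS

print(should_trigger_slo("https://app.example.com/logout.do"))   # True
print(should_trigger_slo("https://app.example.com/dashboard"))   # False
```

The F5 knowledge articles referenced above cover the actual iRule mechanisms (URI-referenced file names and the Logout URI Include option) for wiring this detection into the APM session.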
## Summary
For increased security, organizations using this pattern could also consider blo
## Advanced deployment
-There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for headers-based SSO](/azure/active-directory/manage-apps/f5-big-ip-header-advanced). Alternatively, the BIG-IP gives the option to disable **Guided Configuration's strict management mode**. This allows you to manually tweak your configurations, even though bulk of your configurations are automated through the wizard-based templates.
+There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for headers-based SSO](../manage-apps/f5-big-ip-header-advanced.md). Alternatively, the BIG-IP gives the option to disable **Guided Configuration's strict management mode**. This allows you to manually tweak your configurations, even though the bulk of your configuration is automated through the wizard-based templates.
You can navigate to **Access > Guided Configuration** and select the **small padlock icon** on the far right of the row for your applications' configs.
The following command from a bash shell validates the APM service account used f
```bash
ldapsearch -xLLL -H 'ldap://192.168.0.58' -b "CN=oraclef5,dc=contoso,dc=lds" -s sub -D "CN=f5-apm,CN=partners,DC=contoso,DC=lds" -w 'P@55w0rd!' "(cn=testuser)"
```
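To unpack the command: `-x` requests a simple bind, `-LLL` keeps the LDIF output terse, `-H` names the directory host, `-b`/`-s` set the search base and scope, `-D`/`-w` supply the APM service account's bind DN and password, and the trailing filter targets the test user. As a sketch reusing the placeholder values from the sample command above, the invocation can be assembled programmatically:

```python
import shlex

def build_ldapsearch(server: str, base_dn: str, bind_dn: str,
                     password: str, filter_: str) -> str:
    """Assemble the ldapsearch command shown above: -x simple bind, -LLL
    terse LDIF output, -H server URI, -b search base, -s scope,
    -D bind DN, -w bind password, then the search filter."""
    args = ["ldapsearch", "-xLLL",
            "-H", server, "-b", base_dn, "-s", "sub",
            "-D", bind_dn, "-w", password, filter_]
    return " ".join(shlex.quote(a) for a in args)

# Placeholder values copied from the sample command; substitute your own.
cmd = build_ldapsearch("ldap://192.168.0.58",
                       "CN=oraclef5,dc=contoso,dc=lds",
                       "CN=f5-apm,CN=partners,DC=contoso,DC=lds",
                       "P@55w0rd!",
                       "(cn=testuser)")
print(cmd)
```

`shlex.quote` keeps the password and filter safely quoted for a bash shell, matching the quoting style of the original command.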
-For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this [F5 knowledge article on LDAP Query](https://techdocs.f5.com/en-us/bigip-16-1-0/big-ip-access-policy-manager-authentication-methods/ldap-query.html).
+For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this [F5 knowledge article on LDAP Query](https://techdocs.f5.com/en-us/bigip-16-1-0/big-ip-access-policy-manager-authentication-methods/ldap-query.html).
active-directory F5 Big Ip Oracle Jd Edwards Easy Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/f5-big-ip-oracle-jd-edwards-easy-button.md
Integrating a BIG-IP with Azure AD provides many benefits, including:
* Manage identities and access from a single control plane, the [Azure portal](https://portal.azure.com/)
-To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](/azure/active-directory/manage-apps/f5-aad-integration) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
+To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](../manage-apps/f5-aad-integration.md) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
## Scenario description
Prior BIG-IP experience isn't necessary, but you need:
* An Azure AD free subscription or above
-* An existing BIG-IP or [deploy a BIG-IP Virtual Edition (VE) in Azure](/azure/active-directory/manage-apps/f5-bigip-deployment-guide)
+* An existing BIG-IP or [deploy a BIG-IP Virtual Edition (VE) in Azure](../manage-apps/f5-bigip-deployment-guide.md)
* Any of the following F5 BIG-IP license SKUs
Prior BIG-IP experience isn't necessary, but you need:
* An account with Azure AD application admin [permissions](/azure/active-directory/users-groups-roles/directory-assign-admin-roles#application-administrator)
-* An [SSL Web certificate](/azure/active-directory/manage-apps/f5-bigip-deployment-guide#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
+* An [SSL Web certificate](../manage-apps/f5-bigip-deployment-guide.md#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
* An existing Oracle JDE environment
Along with this the SAML federation metadata for the published application is al
If the BIG-IP webtop portal is used to access published applications, then a sign-out from there is processed by the APM, which also calls the Azure AD sign-out endpoint. But consider a scenario where the BIG-IP webtop portal isn't used: the user then has no way of instructing the APM to sign out. Even if the user signs out of the application itself, the BIG-IP is technically oblivious to this. For this reason, SP-initiated sign-out needs careful consideration to ensure sessions are securely terminated when no longer required. One way of achieving this is to add an SLO function to your application's sign-out button, so that it can redirect your client to either the Azure AD SAML or BIG-IP sign-out endpoint. The URL for the SAML sign-out endpoint for your tenant can be found in **App Registrations > Endpoints**.
-If making a change to the app is a no go, then consider having the BIG-IP listen for the application's sign-out call, and upon detecting the request have it trigger SLO. Refer to our [Oracle PeopleSoft SLO guidance](/azure/active-directory/manage-apps/f5-big-ip-oracle-peoplesoft-easy-button#peoplesoft-single-logout) for using BIG-IP irules to achieve this. More details on using BIG-IP iRules to achieve this is available in the F5 knowledge article [Configuring automatic session termination (logout) based on a URI-referenced file name](https://support.f5.com/csp/article/K42052145) and [Overview of the Logout URI Include option](https://support.f5.com/csp/article/K12056).
+If making a change to the app is a no-go, then consider having the BIG-IP listen for the application's sign-out call, and upon detecting the request have it trigger SLO. Refer to our [Oracle PeopleSoft SLO guidance](../manage-apps/f5-big-ip-oracle-peoplesoft-easy-button.md#peoplesoft-single-logout) for using BIG-IP iRules to achieve this. More details on using BIG-IP iRules are available in the F5 knowledge articles [Configuring automatic session termination (logout) based on a URI-referenced file name](https://support.f5.com/csp/article/K42052145) and [Overview of the Logout URI Include option](https://support.f5.com/csp/article/K12056).
## Summary
For increased security, organizations using this pattern could also consider blo
## Advanced deployment
-There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for headers-based SSO](/azure/active-directory/manage-apps/f5-big-ip-header-advanced). Alternatively, the BIG-IP gives the option to disable **Guided Configuration's strict management mode**. This allows you to manually tweak your configurations, even though bulk of your configurations are automated through the wizard-based templates.
+There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for headers-based SSO](../manage-apps/f5-big-ip-header-advanced.md). Alternatively, the BIG-IP gives the option to disable **Guided Configuration's strict management mode**. This allows you to manually tweak your configurations, even though the bulk of your configuration is automated through the wizard-based templates.
You can navigate to **Access > Guided Configuration** and select the **small padlock icon** on the far right of the row for your applications' configs.
If you don't see a BIG-IP error page, then the issue is probably more related
2. The **View Variables** link in this location may also help root cause SSO issues, particularly if the BIG-IP APM fails to obtain the right attributes from Azure AD or another source
-See [BIG-IP APM variable assign examples](https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference](https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
+See [BIG-IP APM variable assign examples](https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference](https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
active-directory F5 Big Ip Sap Erp Easy Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/f5-big-ip-sap-erp-easy-button.md
Integrating a BIG-IP with Azure Active Directory (Azure AD) provides many benefi
* Manage identities and access from a single control plane, the [Azure portal](https://portal.azure.com/)
-To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](/azure/active-directory/manage-apps/f5-aad-integration) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
+To learn about all the benefits, see the article on [F5 BIG-IP and Azure AD integration](../manage-apps/f5-aad-integration.md) and [what is application access and single sign-on with Azure AD](/azure/active-directory/active-directory-appssoaccess-whatis).
## Scenario description
Prior BIG-IP experience isn't necessary, but you will need:
* An Azure AD free subscription or above
-* An existing BIG-IP or [deploy a BIG-IP Virtual Edition (VE) in Azure](/azure/active-directory/manage-apps/f5-bigip-deployment-guide)
+* An existing BIG-IP or [deploy a BIG-IP Virtual Edition (VE) in Azure](../manage-apps/f5-bigip-deployment-guide.md)
* Any of the following F5 BIG-IP license offers
Prior BIG-IP experience isn't necessary, but you will need:
* An account with Azure AD Application admin [permissions](/azure/active-directory/users-groups-roles/directory-assign-admin-roles#application-administrator)
-* An [SSL Web certificate](/azure/active-directory/manage-apps/f5-bigip-deployment-guide#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
+* An [SSL Web certificate](../manage-apps/f5-bigip-deployment-guide.md#ssl-profile) for publishing services over HTTPS, or use default BIG-IP certs while testing
* An existing SAP ERP environment configured for Kerberos authentication
Easy Button provides a set of pre-defined application templates for Oracle Peopl
When a user successfully authenticates to Azure AD, it issues a SAML token with a default set of claims and attributes uniquely identifying the user. The **User Attributes & Claims tab** shows the default claims to issue for the new application. It also lets you configure more claims.
-As our example AD infrastructure is based on a .com domain suffix used both, internally and externally, we don't require any additional attributes to achieve a functional KCD SSO implementation. See the [advanced tutorial](/azure/active-directory/manage-apps/f5-big-ip-kerberos-advanced) for cases where you have multiple domains or user's log-in using an alternate suffix.
+As our example AD infrastructure is based on a .com domain suffix used both internally and externally, we don't require any additional attributes to achieve a functional KCD SSO implementation. See the [advanced tutorial](../manage-apps/f5-big-ip-kerberos-advanced.md) for cases where you have multiple domains or users log in using an alternate suffix.
![Screenshot for user attributes and claims](./media/f5-big-ip-easy-button-sap-erp/user-attributes-claims.png)
For increased security, organizations using this pattern could also consider blo
## Advanced deployment
-There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for kerberos-based SSO](/azure/active-directory/manage-apps/f5-big-ip-kerberos-advanced).
+There may be cases where the Guided Configuration templates lack the flexibility to achieve more specific requirements. For those scenarios, see [Advanced Configuration for Kerberos-based SSO](../manage-apps/f5-big-ip-kerberos-advanced.md).
Alternatively, the BIG-IP gives you the option to disable **Guided Configuration's strict management mode**. This allows you to manually tweak your configurations, even though the bulk of your configuration is automated through the wizard-based templates.
If you don't see a BIG-IP error page, then the issue is probably more related
2. Select the link for your active session. The **View Variables** link in this location may also help determine the root cause of KCD issues, particularly if the BIG-IP APM fails to obtain the right user and domain identifiers from session variables
-See [BIG-IP APM variable assign examples]( https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference]( https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
+See [BIG-IP APM variable assign examples](https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference](https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
active-directory Fortisase Sia Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/fortisase-sia-tutorial.md
Previously updated : 03/25/2022 Last updated : 04/13/2022
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/saml/metadata`
+ a. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
- b. In the **Reply URL** text box, type a URL using the following pattern:
- `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/saml/login`
+ | User | URL |
+ | --- | --- |
+ | For FortiSASE VPN User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/saml/metadata` |
+ | For FortiSASE SWG User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com:7831/XX/YY/ZZ/saml/metadata` |
+
+ b. In the **Reply URL** text box, type a URL using one of the following patterns:
+
+ | User | URL |
+ | --- | --- |
+ | For FortiSASE VPN User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/saml/login` |
+ | For FortiSASE SWG User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com:7831/XX/YY/ZZ/saml/login` |
- c. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/login`
+ c. In the **Sign on URL** text box, type a URL using one of the following patterns:
+
+ | User | URL |
+ | --- | --- |
+ | For FortiSASE VPN User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com/remote/login` |
+ | For FortiSASE SWG User SSO | `https://<TENANTHOSTNAME>.edge.prod.fortisase.com:7831/XX/YY/ZZ/login` |
> [!NOTE]
- > These values are not real. Update these values with the actual Identifier, Reply URL and Sign on URL. Contact [FortiSASE Client support team](mailto:fgc@fortinet.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign on URL. On the FortiSASE portal, go to **Configuration > VPN User SSO** or **Configuration > SWG User SSO** to find the service provider URLs. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
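Every value in the tables above derives from the tenant hostname plus, for SWG mode, a tenant-specific port and path. As a sketch only — `contoso` is a hypothetical hostname, and `path_segments` stands in for the tenant-specific `XX/YY/ZZ` portion shown in the patterns — the URL sets can be generated consistently before pasting them into **Basic SAML Configuration**:

```python
def vpn_sso_urls(tenant_hostname: str) -> dict:
    """Build the FortiSASE VPN User SSO URL set for one tenant hostname,
    following the patterns in the tables above."""
    base = f"https://{tenant_hostname}.edge.prod.fortisase.com"
    return {
        "identifier": f"{base}/remote/saml/metadata",
        "reply_url": f"{base}/remote/saml/login",
        "sign_on_url": f"{base}/remote/login",
    }

def swg_sso_urls(tenant_hostname: str, path_segments: str) -> dict:
    """Build the FortiSASE SWG User SSO URL set; path_segments stands in
    for the tenant-specific XX/YY/ZZ portion of the pattern."""
    base = f"https://{tenant_hostname}.edge.prod.fortisase.com:7831/{path_segments}"
    return {
        "identifier": f"{base}/saml/metadata",
        "reply_url": f"{base}/saml/login",
        "sign_on_url": f"{base}/login",
    }

print(vpn_sso_urls("contoso")["reply_url"])
# https://contoso.edge.prod.fortisase.com/remote/saml/login
```

The real values come from the FortiSASE portal as described in the note above; this sketch only keeps the three boxes mutually consistent.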
1. FortiSASE application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
## Configure FortiSASE SSO
-To configure single sign-on on **FortiSASE** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [FortiSASE support team](mailto:fgc@fortinet.com). They set this setting to have the SAML SSO connection set properly on both sides.
+1. Log in to your FortiSASE company site as an administrator.
+
+1. Go to **Configuration > VPN User SSO** or **Configuration > SWG User SSO** depending on the FortiSASE mode used.
+
+1. In the **Configure Identity Provider** section, copy the following URLs and paste in the **Basic SAML Configuration** section in the Azure portal.
+
+ ![Screenshot that shows the Configuration](./media/fortisase-tutorial/general.png "Configuration")
+
+1. In the **Configure Service Provider** section, perform the following steps:
+
+ ![Screenshot that shows Service Provider configuration](./media/fortisase-tutorial/certificate.png "Service Provider")
+
+ a. In the **IdP Entity ID** textbox, paste the **Azure AD Identifier** value which you have copied from the Azure portal.
+
+ b. In the **IdP Single Sign-On URL** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+ c. In the **IdP Single Log-Out URL** textbox, paste the **Logout URL** value which you have copied from the Azure portal.
+
+ d. Open the downloaded **Certificate (Base64)** from the Azure portal into Notepad and upload the content into the **IdP Certificate** textbox.
+
+1. Review and submit the configuration.
### Create FortiSASE test user
-In this section, a user called Britta Simon is created in FortiSASE. FortiSASE supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in FortiSASE, a new one is created after authentication.
+FortiSASE supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section.
## Test SSO
active-directory Per Angusta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/per-angusta-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Per Angusta'
+description: Learn how to configure single sign-on between Azure Active Directory and Per Angusta.
+Last updated : 04/06/2022
+# Tutorial: Azure AD SSO integration with Per Angusta
+
+In this tutorial, you'll learn how to integrate Per Angusta with Azure Active Directory (Azure AD). When you integrate Per Angusta with Azure AD, you can:
+
+* Control in Azure AD who has access to Per Angusta.
+* Enable your users to be automatically signed-in to Per Angusta with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Per Angusta single sign-on (SSO) enabled subscription.
+* Besides the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Per Angusta supports **SP** initiated SSO.
+
+## Add Per Angusta from the gallery
+
+To configure the integration of Per Angusta into Azure AD, you need to add Per Angusta from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Per Angusta** in the search box.
+1. Select **Per Angusta** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Per Angusta
+
+Configure and test Azure AD SSO with Per Angusta using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Per Angusta.
+
+To configure and test Azure AD SSO with Per Angusta, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Per Angusta SSO](#configure-per-angusta-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Per Angusta test user](#create-per-angusta-test-user)** - to have a counterpart of B.Simon in Per Angusta that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Per Angusta** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** text box, type a value using the following pattern:
+ `<SUBDOMAIN>.per-angusta.com`
+
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.per-angusta.com/saml/consume`
+
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.per-angusta.com/saml/init`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Per Angusta Client support team](mailto:support@per-angusta.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and save it on your computer.
+
+ ![The Certificate download link](common/copy-metadataurl.png)
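The App Federation Metadata Url returns standard SAML 2.0 federation metadata, from which service providers read values such as the tenant's signing certificate. If you ever need to extract such a value yourself, a sketch with Python's standard library can do it. The abbreviated metadata below is a hypothetical example for illustration, not output from a real tenant:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily abbreviated federation metadata; the real document
# comes from the App Federation Metadata Url copied in the step above.
SAMPLE_METADATA = """\
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
                  entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <KeyDescriptor use="signing">
      <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
        <X509Data><X509Certificate>MIIC...truncated...</X509Certificate></X509Data>
      </KeyInfo>
    </KeyDescriptor>
  </IDPSSODescriptor>
</EntityDescriptor>
"""

NS = {"md": "urn:oasis:names:tc:SAML:2.0:metadata",
      "ds": "http://www.w3.org/2000/09/xmldsig#"}

def signing_certificate(metadata_xml: str) -> str:
    """Return the base64 signing certificate from SAML federation metadata."""
    root = ET.fromstring(metadata_xml)
    key_descriptor = root.find(".//md:KeyDescriptor[@use='signing']", NS)
    cert = key_descriptor.find(".//ds:X509Certificate", NS)
    return cert.text.strip()

print(signing_certificate(SAMPLE_METADATA))  # MIIC...truncated...
```

For this tutorial you only paste the URL into Per Angusta, which performs the equivalent parsing itself.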
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Per Angusta.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Per Angusta**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Per Angusta SSO
+
+1. Log in to your Per Angusta company site as an administrator.
+
+1. Go to the **Administration** tab.
+
+ ![Screenshot that shows the Admin Account](./media/per-angusta-tutorial/users.png "Account")
+
+1. In the left-side menu under **CONFIGURATION**, click **SSO SAML**.
+
+ ![Screenshot that shows the Configuration](./media/per-angusta-tutorial/general.png "Configuration")
+
+1. Perform the following steps on the configuration page:
+
+ ![Screenshot that shows the metadata](./media/per-angusta-tutorial/certificate.png "Metadata")
+
+ ![Screenshot that shows the SSO SAML Certificate](./media/per-angusta-tutorial/claims.png "SAML Certificate")
+
+ 1. Copy the **Reply URL** value and paste it into the **Reply URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Copy the **Entity ID** value and paste it into the **Identifier** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Copy the **SAML initialization URL** value and paste it into the **Sign on URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Select the **Active** SSO checkbox before you test the connection.
+
+ 1. In the **XML URL** textbox, paste the **App Federation Metadata Url** value, which you copied from the Azure portal.
+
+ 1. In the **Claim** dropdown, select **Email**.
+
+ 1. In the **NameID Format** dropdown, select `urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified`.
+
+ 1. Click **Save**.
+
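Before saving, it can help to sanity-check that the **App Federation Metadata Url** you pasted resolves to valid federation metadata. This is an optional check, not part of the official steps; the tenant ID and app ID in the URL are placeholders.

```shell
# Fetch the federation metadata and confirm it declares an entityID.
curl -s "https://login.microsoftonline.com/<tenant-id>/federationmetadata/2007-06/federationmetadata.xml?appid=<app-id>" \
  | grep -o 'entityID="[^"]*"'
```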
+### Create Per Angusta test user
+
+In this section, you create a user called Britta Simon in Per Angusta. Work with the [Per Angusta support team](mailto:support@per-angusta.com) to add the users in the Per Angusta platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. You'll be redirected to the Per Angusta Sign-on URL, where you can initiate the login flow.
+
+* Go to the Per Angusta Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Per Angusta tile in My Apps, you'll be redirected to the Per Angusta Sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+
+## Next steps
+
+Once you configure Per Angusta, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
advisor Advisor Reference Cost Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-cost-recommendations.md
Learn more about [Managed Disk Snapshot - ManagedDiskSnapshot (Use Standard Stor
We've analyzed the usage patterns of your virtual machine over the past 7 days and identified virtual machines with low usage. While certain scenarios can result in low utilization by design, you can often save money by managing the size and number of virtual machines.
-Learn more about [Virtual machine - LowUsageVmV2 (Right-size or shutdown underutilized virtual machines)](/azure/advisor/advisor-cost-recommendations#optimize-virtual-machine-spend-by-resizing-or-shutting-down-underutilized-instances).
+Learn more about [Virtual machine - LowUsageVmV2 (Right-size or shutdown underutilized virtual machines)](./advisor-cost-recommendations.md#optimize-virtual-machine-spend-by-resizing-or-shutting-down-underutilized-instances).
### You have disks which have not been attached to a VM for more than 30 days. Please evaluate if you still need the disk.
Learn more about [Synapse workspace - EnableSynapseSparkComputeAutoScaleGuidance
## Next steps
-Learn more about [Cost Optimization - Microsoft Azure Well Architected Framework](/azure/architecture/framework/cost/overview)
+Learn more about [Cost Optimization - Microsoft Azure Well Architected Framework](/azure/architecture/framework/cost/overview)
advisor Advisor Reference Operational Excellence Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-operational-excellence-recommendations.md
Learn more about [Batch account - OldPool (Recreate your pool to get the latest
Your pool is using a deprecated internal component. Please delete and recreate your pool for improved stability and performance.
-Learn more about [Batch account - RecreatePool (Delete and recreate your pool to remove a deprecated internal component)](/azure/batch/best-practices#pool-lifetime-and-billing).
+Learn more about [Batch account - RecreatePool (Delete and recreate your pool to remove a deprecated internal component)](../batch/best-practices.md#pool-lifetime-and-billing).
### Upgrade to the latest API version to ensure your Batch account remains operational.
Learn more about [Batch account - RemoveA8_A11Pools (Delete and recreate your po
Your pool is using an image with an imminent expiration date. Please recreate the pool with a new image to avoid potential interruptions. A list of newer images is available via the ListSupportedImages API.
-Learn more about [Batch account - EolImage (Recreate your pool with a new image)](/azure/batch/batch-pool-vm-sizes#supported-vm-images).
+Learn more about [Batch account - EolImage (Recreate your pool with a new image)](../batch/batch-pool-vm-sizes.md#supported-vm-images).
## Cognitive Service
Learn more about [Kubernetes service - UpdateServicePrincipal (Update cluster's
The monitoring addon workspace is deleted. Correct issues to set up the monitoring addon.
-Learn more about [Kubernetes service - MonitoringAddonWorkspaceIsDeleted (Monitoring addon workspace is deleted)](/azure/azure-monitor/containers/container-insights-optout#azure-cli).
+Learn more about [Kubernetes service - MonitoringAddonWorkspaceIsDeleted (Monitoring addon workspace is deleted)](../azure-monitor/containers/container-insights-optout.md#azure-cli).
### Deprecated Kubernetes API in 1.16 is found
Learn more about [Cosmos DB account - CosmosDBMigrateToContinuousBackup (Improve
We have detected that one or more of your alert rules have invalid queries specified in their condition section. Log alert rules are created in Azure Monitor and are used to run analytics queries at specified intervals. The results of the query determine if an alert needs to be triggered. Analytics queries may become invalid over time due to changes in referenced resources, tables, or commands. We recommend that you correct the query in the alert rule to prevent it from getting auto-disabled and ensure monitoring coverage of your resources in Azure.
-Learn more about [Alert Rule - ScheduledQueryRulesLogAlert (Repair your log alert rule)](/azure/azure-monitor/alerts/alerts-troubleshoot-log#query-used-in-a-log-alert-is-not-valid).
+Learn more about [Alert Rule - ScheduledQueryRulesLogAlert (Repair your log alert rule)](../azure-monitor/alerts/alerts-troubleshoot-log.md#query-used-in-a-log-alert-isnt-valid).
### Log alert rule was disabled

The alert rule was disabled by Azure Monitor as it was causing service issues. To enable the alert rule, contact support.
-Learn more about [Alert Rule - ScheduledQueryRulesRp (Log alert rule was disabled)](/azure/azure-monitor/alerts/alerts-troubleshoot-log#query-used-in-a-log-alert-is-not-valid).
+Learn more about [Alert Rule - ScheduledQueryRulesRp (Log alert rule was disabled)](../azure-monitor/alerts/alerts-troubleshoot-log.md#query-used-in-a-log-alert-isnt-valid).
## Key Vault
Learn more about [SQL virtual machine - UpgradeToFullMode (SQL IaaS Agent should
A region can support a maximum of 250 storage accounts per subscription. You have either already reached or are about to reach that limit. If you reach that limit, you will be unable to create any more storage accounts in that subscription/region combination. Please evaluate the recommended action below to avoid hitting the limit.
-Learn more about [Storage Account - StorageAccountScaleTarget (Prevent hitting subscription limit for maximum storage accounts)](/azure/storage/blobs/storage-performance-checklist#what-to-do-when-approaching-a-scalability-target).
+Learn more about [Storage Account - StorageAccountScaleTarget (Prevent hitting subscription limit for maximum storage accounts)](../storage/blobs/storage-performance-checklist.md).
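To see how close a subscription is to the 250-accounts-per-region limit, you can count existing accounts by location. This is a sketch using the Azure CLI; `eastus` is an example region.

```azurecli
# Count storage accounts in the current subscription for one region.
az storage account list \
  --query "length([?location=='eastus'])" \
  --output tsv
```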
### Update to newer releases of the Storage Java v12 SDK for better reliability.
Learn more about [App service - AzureAppService-StagingEnv (Set up staging envir
## Next steps
-Learn more about [Operational Excellence - Microsoft Azure Well Architected Framework](/azure/architecture/framework/devops/overview)
+Learn more about [Operational Excellence - Microsoft Azure Well Architected Framework](/azure/architecture/framework/devops/overview)
advisor Advisor Reference Performance Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-performance-recommendations.md
Learn more about [AVS Private cloud - vSANCapacity (vSAN capacity utilization ha
Cache instances perform best when not running under high network bandwidth, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce network bandwidth, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheNetworkBandwidth (Improve your Cache and application performance when running with high network bandwidth)](/azure/azure-cache-for-redis/cache-troubleshoot-server#server-side-bandwidth-limitation).
+Learn more about [Redis Cache Server - RedisCacheNetworkBandwidth (Improve your Cache and application performance when running with high network bandwidth)](../azure-cache-for-redis/cache-troubleshoot-server.md#server-side-bandwidth-limitation).
### Improve your Cache and application performance when running with many connected clients
Learn more about [Redis Cache Server - RedisCacheConnectedClients (Improve your
Cache instances perform best when not running under high server load, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce the server load, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheServerLoad (Improve your Cache and application performance when running with high server load)](/azure/azure-cache-for-redis/cache-troubleshoot-client#high-client-cpu-usage).
+Learn more about [Redis Cache Server - RedisCacheServerLoad (Improve your Cache and application performance when running with high server load)](../azure-cache-for-redis/cache-troubleshoot-client.md#high-client-cpu-usage).
### Improve your Cache and application performance when running with high memory pressure

Cache instances perform best when not running under high memory pressure, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce used memory, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheUsedMemory (Improve your Cache and application performance when running with high memory pressure)](/azure/azure-cache-for-redis/cache-troubleshoot-client#memory-pressure-on-redis-client).
+Learn more about [Redis Cache Server - RedisCacheUsedMemory (Improve your Cache and application performance when running with high memory pressure)](../azure-cache-for-redis/cache-troubleshoot-client.md#memory-pressure-on-redis-client).
## Cognitive Service
Learn more about [Data explorer resource - ReduceCacheForAzureDataExplorerTables
Time to Live (TTL) affects how recent a response is when a client makes a request to Azure Traffic Manager. Reducing the TTL value means that the client is routed to a functioning endpoint faster in the case of a failover. Configure your TTL to 20 seconds to route traffic to a healthy endpoint as quickly as possible.
-Learn more about [Traffic Manager profile - FastFailOverTTL (Configure DNS Time to Live to 20 seconds)](/azure/traffic-manager/traffic-manager-monitoring#endpoint-failover-and-recovery).
+Learn more about [Traffic Manager profile - FastFailOverTTL (Configure DNS Time to Live to 20 seconds)](../traffic-manager/traffic-manager-monitoring.md#endpoint-failover-and-recovery).
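The recommended 20-second TTL can be applied with a single CLI call. A sketch; the profile and resource group names are placeholders.

```azurecli
# Lower the Traffic Manager profile's DNS TTL to 20 seconds for faster failover.
az network traffic-manager profile update \
  --name MyTmProfile \
  --resource-group MyResourceGroup \
  --ttl 20
```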
### Configure DNS Time to Live to 60 seconds
Learn more about [SQL data warehouse - CreateTableStatisticsSqlDW (Create statis
We have detected distribution data skew greater than 15%. This can cause costly performance bottlenecks.
-Learn more about [SQL data warehouse - DataSkewSqlDW (Remove data skew to increase query performance)](/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-distribute#how-to-tell-if-your-distribution-column-is-a-good-choice).
+Learn more about [SQL data warehouse - DataSkewSqlDW (Remove data skew to increase query performance)](../synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-distribute.md#how-to-tell-if-your-distribution-column-is-a-good-choice).
### Update statistics on table columns
Learn more about [SQL data warehouse - SqlDwIncreaseCacheCapacity (Scale up to o
We have detected that you had high tempdb utilization which can impact the performance of your workload.
-Learn more about [SQL data warehouse - SqlDwReduceTempdbContention (Scale up or update resource class to reduce tempdb contention with SQL Data Warehouse)](/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-manage-monitor#monitor-tempdb).
+Learn more about [SQL data warehouse - SqlDwReduceTempdbContention (Scale up or update resource class to reduce tempdb contention with SQL Data Warehouse)](../synapse-analytics/sql-data-warehouse/sql-data-warehouse-manage-monitor.md#monitor-tempdb).
### Convert tables to replicated tables with SQL Data Warehouse
Learn more about [SQL data warehouse - SqlDwReplicateTable (Convert tables to re
We have detected that you can increase load throughput by splitting your compressed files that are staged in your storage account. A good rule of thumb is to split compressed files into 60 or more files to maximize the parallelism of your load.
-Learn more about [SQL data warehouse - FileSplittingGuidance (Split staged files in the storage account to increase load performance)](/azure/synapse-analytics/sql/data-loading-best-practices#preparing-data-in-azure-storage).
+Learn more about [SQL data warehouse - FileSplittingGuidance (Split staged files in the storage account to increase load performance)](../synapse-analytics/sql/data-loading-best-practices.md#prepare-data-in-azure-storage).
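One way to produce the 60 recommended chunks from a single staged file, assuming GNU coreutils `split` and gzip are available (the file names are illustrative):

```shell
# Split data.csv into 60 line-aligned chunks, gzip-compressing each one
# (chunk_00.gz ... chunk_59.gz), so loads can run in parallel.
split -n l/60 -d --filter='gzip > $FILE.gz' data.csv chunk_
```

If the staged source is already compressed, decompress it first (for example with `zcat`) before splitting.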
### Increase batch size when loading to maximize load throughput, data compression, and query performance

We have detected that you can increase load performance and throughput by increasing the batch size when loading into your database. You should consider using the COPY statement. If you are unable to use the COPY statement, consider increasing the batch size when using loading utilities such as the SQLBulkCopy API or BCP - a good rule of thumb is a batch size between 100K and 1M rows.
-Learn more about [SQL data warehouse - LoadBatchSizeGuidance (Increase batch size when loading to maximize load throughput, data compression, and query performance)](/azure/synapse-analytics/sql/data-loading-best-practices#increase-batch-size-when-using-sqlbulkcopy-api-or-bcp).
+Learn more about [SQL data warehouse - LoadBatchSizeGuidance (Increase batch size when loading to maximize load throughput, data compression, and query performance)](../synapse-analytics/sql/data-loading-best-practices.md#increase-batch-size-when-using-sqlbulkcopy-api-or-bcp).
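If you load with `bcp`, the batch size is set with the `-b` flag. A sketch only; the server, database, table, and credentials are placeholders, and 1,000,000 rows is the upper end of the rule of thumb above.

```shell
# Bulk-load a staged file in 1M-row batches (character mode, comma-delimited).
bcp dbo.FactSales in staged_data.csv \
  -S myworkspace.sql.azuresynapse.net -d mydw \
  -U loader -P '<password>' \
  -c -t ',' -b 1000000
```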
### Co-locate the storage account within the same region to minimize latency when loading

We have detected that you are loading from a region that is different from your SQL pool. You should consider loading from a storage account that is within the same region as your SQL pool to minimize latency when loading data.
-Learn more about [SQL data warehouse - ColocateStorageAccount (Co-locate the storage account within the same region to minimize latency when loading)](/azure/synapse-analytics/sql/data-loading-best-practices#preparing-data-in-azure-storage).
+Learn more about [SQL data warehouse - ColocateStorageAccount (Co-locate the storage account within the same region to minimize latency when loading)](../synapse-analytics/sql/data-loading-best-practices.md#prepare-data-in-azure-storage).
## Storage
Learn more about [App service - AppServiceMoveToPremiumV2 (Move your App Service
Your app has opened too many TCP/IP socket connections. Exceeding ephemeral TCP/IP port connection limits can cause unexpected connectivity issues for your apps.
-Learn more about [App service - AppServiceOutboundConnections (Check outbound connections from your App Service resource)](/azure/app-service/app-service-best-practices#socketresources).
+Learn more about [App service - AppServiceOutboundConnections (Check outbound connections from your App Service resource)](../app-service/app-service-best-practices.md#socketresources).
## Next steps
-Learn more about [Performance Efficiency - Microsoft Azure Well Architected Framework](/azure/architecture/framework/scalability/overview)
+Learn more about [Performance Efficiency - Microsoft Azure Well Architected Framework](/azure/architecture/framework/scalability/overview)
advisor Advisor Reference Reliability Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-reliability-recommendations.md
Learn more about [Api Management - TlsRenegotiationBlocked (SSL/TLS renegotiatio
Fragmentation and memory pressure can cause availability incidents during a failover or management operations. Increasing the reservation of memory for fragmentation helps reduce cache failures when running under high memory pressure. Memory for fragmentation can be increased via the maxfragmentationmemory-reserved setting, available in the advanced settings blade.
-Learn more about [Redis Cache Server - RedisCacheMemoryFragmentation (Availability may be impacted from high memory fragmentation. Increase fragmentation memory reservation to avoid potential impact.)](/azure/azure-cache-for-redis/cache-configure#memory-policies).
+Learn more about [Redis Cache Server - RedisCacheMemoryFragmentation (Availability may be impacted from high memory fragmentation. Increase fragmentation memory reservation to avoid potential impact.)](../azure-cache-for-redis/cache-configure.md#memory-policies).
## Compute
Learn more about [Virtual machine (classic) - EnableBackup (Enable Backups on yo
We have identified that you are using standard disks with your premium-capable Virtual Machines and we recommend you consider upgrading the standard disks to premium disks. For any Single Instance Virtual Machine using premium storage for all Operating System Disks and Data Disks, we guarantee Virtual Machine Connectivity of at least 99.9%. Consider these factors when making your upgrade decision. The first is that upgrading requires a VM reboot, and this process takes 3-5 minutes to complete. The second is that if the VMs in the list are mission-critical production VMs, you should evaluate the improved availability against the cost of premium disks.
-Learn more about [Virtual machine - MigrateStandardStorageAccountToPremium (Upgrade the standard disks attached to your premium-capable VM to premium disks)](/azure/virtual-machines/disks-types#premium-ssd).
+Learn more about [Virtual machine - MigrateStandardStorageAccountToPremium (Upgrade the standard disks attached to your premium-capable VM to premium disks)](../virtual-machines/disks-types.md#premium-ssds).
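The conversion itself is an `az disk update` on a deallocated VM. A sketch; the VM and disk names are placeholders, and the deallocate/start steps account for the brief reboot mentioned above.

```azurecli
# Deallocate the VM, convert its disk to Premium SSD, then restart.
az vm deallocate --resource-group MyResourceGroup --name MyVm
az disk update --resource-group MyResourceGroup --name MyOsDisk --sku Premium_LRS
az vm start --resource-group MyResourceGroup --name MyVm
```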
### Enable virtual machine replication to protect your applications from regional outage
Learn more about [Virtual machine - UpgradeVMToManagedDisksWithoutAdditionalCost
Using IP address-based filtering has been identified as a vulnerable way to control outbound connectivity for firewalls. We highly recommend using Service Tags as an alternative for controlling connectivity, to allow connectivity to Azure Site Recovery services for the machines.
-Learn more about [Virtual machine - ASRUpdateOutboundConnectivityProtocolToServiceTags (Update your outbound connectivity protocol to Service Tags for Azure Site Recovery)](/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-using-service-tags).
+Learn more about [Virtual machine - ASRUpdateOutboundConnectivityProtocolToServiceTags (Update your outbound connectivity protocol to Service Tags for Azure Site Recovery)](../site-recovery/azure-to-azure-about-networking.md#outbound-connectivity-using-service-tags).
### Use Managed Disks to improve data reliability
Learn more about [Cosmos DB account - CosmosDBSingleRegionProdAccounts (Add a se
We observed that your account is throwing a TooManyRequests error with the 16500 error code. Enabling Server Side Retry (SSR) can help mitigate this issue for you.
-Learn more about [Cosmos DB account - CosmosDBMongoServerSideRetries (Enable Server Side Retry (SSR) on your Azure Cosmos DB's API for MongoDB account)](/azure/cosmos-db/cassandra/prevent-rate-limiting-errors).
+Learn more about [Cosmos DB account - CosmosDBMongoServerSideRetries (Enable Server Side Retry (SSR) on your Azure Cosmos DB's API for MongoDB account)](../cosmos-db/cassandr).
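Server Side Retry corresponds to the `DisableRateLimitingResponses` capability, which can be set with the CLI. A sketch; the account and group names are placeholders, and because `--capabilities` replaces the full list, include any capabilities the account already has.

```azurecli
# Enable Server Side Retry on an API for MongoDB account.
az cosmosdb update \
  --name mymongoaccount \
  --resource-group MyResourceGroup \
  --capabilities EnableMongo DisableRateLimitingResponses
```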
### Migrate your Azure Cosmos DB API for MongoDB account to v4.0 to save on query/storage costs and utilize new features
Learn more about [Application gateway - AppGateway (Upgrade your SKU or add more
The VPN gateway Basic SKU is designed for development or testing scenarios. Please move to a production SKU if you are using the VPN gateway for production purposes. The production SKUs offer a higher number of tunnels, BGP support, active-active configuration, and custom IPsec/IKE policy, in addition to higher stability and availability.
-Learn more about [Virtual network gateway - BasicVPNGateway (Move to production gateway SKUs from Basic gateways)](/azure/vpn-gateway/vpn-gateway-about-vpn-gateway-settings#gwsku).
+Learn more about [Virtual network gateway - BasicVPNGateway (Move to production gateway SKUs from Basic gateways)](../vpn-gateway/vpn-gateway-about-vpn-gateway-settings.md#gwsku).
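You can confirm which gateways are still on the Basic SKU before planning the move. A sketch; note that moving off Basic typically requires deleting and recreating the gateway rather than an in-place resize.

```azurecli
# List the SKU of each virtual network gateway in a resource group.
az network vnet-gateway list \
  --resource-group MyResourceGroup \
  --query "[].{name:name, sku:sku.name}" \
  --output table
```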
### Add at least one more endpoint to the profile, preferably in another Azure region
Learn more about [ExpressRoute circuit - ExpressRouteGatewayE2EMonitoring (Imple
Try to avoid overriding the hostname when configuring Application Gateway. Having a different domain on the frontend of Application Gateway than the one used to access the backend can potentially lead to cookies or redirect URLs being broken. Note that this might not be the case in all situations, and that certain categories of backends (like REST APIs) are in general less sensitive to this. Please make sure the backend is able to deal with this, or update the Application Gateway configuration so the hostname does not need to be overwritten towards the backend. When used with App Service, attach a custom domain name to the Web App and avoid use of the *.azurewebsites.net host name towards the backend.
-Learn more about [Application gateway - AppGatewayHostOverride (Avoid hostname override to ensure site integrity)](/azure/application-gateway/troubleshoot-app-service-redirection-app-service-url#alternate-solution-use-a-custom-domain-name).
+Learn more about [Application gateway - AppGatewayHostOverride (Avoid hostname override to ensure site integrity)](../application-gateway/troubleshoot-app-service-redirection-app-service-url.md).
### Use ExpressRoute Global Reach to improve your design for disaster recovery
Learn more about [Search service - StandardServiceStorageQuota90percent (You are
After enabling Soft Delete, deleted data transitions to a soft deleted state instead of being permanently deleted. When data is overwritten, a soft deleted snapshot is generated to save the state of the overwritten data. You can configure the amount of time soft deleted data is recoverable before it permanently expires.
-Learn more about [Storage Account - StorageSoftDelete (Enable Soft Delete to protect your blob data)](https://aka.ms/softdelete).
+Learn more about [Storage Account - StorageSoftDelete (Enable Soft Delete to protect your blob data)](../storage/blobs/soft-delete-blob-overview.md).
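Soft delete and its retention window can be turned on with one CLI call. A sketch; the account name, resource group, and the 7-day retention are example values.

```azurecli
# Enable blob soft delete with a 7-day retention period.
az storage account blob-service-properties update \
  --account-name mystorageaccount \
  --resource-group MyResourceGroup \
  --enable-delete-retention true \
  --delete-retention-days 7
```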
### Use Managed Disks for storage accounts reaching capacity limit

We have identified that you are using Premium SSD Unmanaged Disks in storage account(s) that are about to reach the Premium Storage capacity limit. To avoid failures when the limit is reached, we recommend migrating to Managed Disks, which do not have an account capacity limit. This migration can be done through the portal in less than 5 minutes.
-Learn more about [Storage Account - StoragePremiumBlobQuotaLimit (Use Managed Disks for storage accounts reaching capacity limit)](/azure/storage/common/scalability-targets-standard-account#premium-performance-page-blob-storage).
+Learn more about [Storage Account - StoragePremiumBlobQuotaLimit (Use Managed Disks for storage accounts reaching capacity limit)](../storage/common/scalability-targets-standard-account.md).
## Web
Learn more about [Storage Account - StoragePremiumBlobQuotaLimit (Use Managed Di
Your app reached >90% CPU over the last couple of days. High CPU utilization can lead to runtime issues with your apps. To solve this, you can scale out your app.
-Learn more about [App service - AppServiceCPUExhaustion (Consider scaling out your App Service Plan to avoid CPU exhaustion)](/azure/app-service/app-service-best-practices#CPUresources).
+Learn more about [App service - AppServiceCPUExhaustion (Consider scaling out your App Service Plan to avoid CPU exhaustion)](../app-service/app-service-best-practices.md#CPUresources).
### Fix the backup database settings of your App Service resource

Your app's backups are consistently failing due to an invalid database configuration. You can find more details in the backup history.
-Learn more about [App service - AppServiceFixBackupDatabaseSettings (Fix the backup database settings of your App Service resource)](/azure/app-service/app-service-best-practices#appbackup.).
+Learn more about [App service - AppServiceFixBackupDatabaseSettings (Fix the backup database settings of your App Service resource)](../app-service/app-service-best-practices.md#appbackup).
### Consider scaling up your App Service Plan SKU to avoid memory exhaustion

The App Service Plan containing your app reached >85% memory allocated. High memory consumption can lead to runtime issues with your apps. Investigate which app in the App Service Plan is exhausting memory, and scale up to a higher plan with more memory resources if needed.
-Learn more about [App service - AppServiceMemoryExhaustion (Consider scaling up your App Service Plan SKU to avoid memory exhaustion)](/azure/app-service/app-service-best-practices#memoryresources).
+Learn more about [App service - AppServiceMemoryExhaustion (Consider scaling up your App Service Plan SKU to avoid memory exhaustion)](../app-service/app-service-best-practices.md#memoryresources).
### Scale up your App Service resource to remove the quota limit
Learn more about [App service - AppServiceUseDeploymentSlots (Use deployment slo
Your app's backups are consistently failing due to invalid storage settings. You can find more details in the backup history.
-Learn more about [App service - AppServiceFixBackupStorageSettings (Fix the backup storage settings of your App Service resource)](/azure/app-service/app-service-best-practices#appbackup.).
+Learn more about [App service - AppServiceFixBackupStorageSettings (Fix the backup storage settings of your App Service resource)](../app-service/app-service-best-practices.md#appbackup).
### Move your App Service resource to Standard or higher and use deployment slots
aks Configure Kubenet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/configure-kubenet.md
You need the Azure CLI version 2.0.65 or later installed and configured. Run `a
In many environments, you have defined virtual networks and subnets with allocated IP address ranges. These virtual network resources are used to support multiple services and applications. To provide network connectivity, AKS clusters can use *kubenet* (basic networking) or Azure CNI (*advanced networking*).
-With *kubenet*, only the nodes receive an IP address in the virtual network subnet. Pods can't communicate directly with each other. Instead, User Defined Routing (UDR) and IP forwarding is used for connectivity between pods across nodes. By default, UDRs and IP forwarding configuration is created and maintained by the AKS service, but you have to the option to [bring your own route table for custom route management][byo-subnet-route-table]. You could also deploy pods behind a service that receives an assigned IP address and load balances traffic for the application. The following diagram shows how the AKS nodes receive an IP address in the virtual network subnet, but not the pods:
+With *kubenet*, only the nodes receive an IP address in the virtual network subnet. Pods can't communicate directly with each other. Instead, User Defined Routing (UDR) and IP forwarding is used for connectivity between pods across nodes. By default, UDRs and IP forwarding configuration is created and maintained by the AKS service, but you have the option to [bring your own route table for custom route management][byo-subnet-route-table]. You could also deploy pods behind a service that receives an assigned IP address and load balances traffic for the application. The following diagram shows how the AKS nodes receive an IP address in the virtual network subnet, but not the pods:
![Kubenet network model with an AKS cluster](media/use-kubenet/kubenet-overview.png)
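Deploying a kubenet cluster into an existing subnet looks roughly like this. A sketch; the subnet resource ID and CIDR ranges are placeholders you'd align with your own virtual network plan.

```azurecli
# Create a kubenet AKS cluster in an existing subnet; pods get addresses from
# --pod-cidr, which must not overlap the virtual network's address space.
az aks create \
  --resource-group MyResourceGroup \
  --name myAKSCluster \
  --network-plugin kubenet \
  --vnet-subnet-id "<subnet-resource-id>" \
  --pod-cidr 10.244.0.0/16 \
  --service-cidr 10.0.0.0/16 \
  --dns-service-ip 10.0.0.10
```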
With an AKS cluster deployed into your existing virtual network subnet, you can
[express-route]: ../expressroute/expressroute-introduction.md
[network-comparisons]: concepts-network.md#compare-network-models
[custom-route-table]: ../virtual-network/manage-route-table.md
-[user-assigned managed identity]: use-managed-identity.md#bring-your-own-control-plane-mi
+[user-assigned managed identity]: use-managed-identity.md#bring-your-own-control-plane-mi
aks Devops Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/devops-pipeline.md
https://github.com/MicrosoftDocs/pipelines-javascript-docker
## Create the Azure resources
-Sign in to the [Azure portal](https://portal.azure.com/), and then select the [Cloud Shell](/azure/cloud-shell/overview) button in the upper-right corner.
+Sign in to the [Azure portal](https://portal.azure.com/), and then select the [Cloud Shell](../cloud-shell/overview.md) button in the upper-right corner.
### Create a container registry
https://github.com/MicrosoftDocs/pipelines-javascript-docker
## Create the Azure resources
-Sign in to the [Azure portal](https://portal.azure.com/), and then select the [Cloud Shell](/azure/cloud-shell/overview) button in the upper-right corner.
+Sign in to the [Azure portal](https://portal.azure.com/), and then select the [Cloud Shell](../cloud-shell/overview.md) button in the upper-right corner.
### Create a container registry
You're now ready to create a release, which means to start the process of runnin
1. In the pipeline view, choose the status link in the stages of the pipeline to see the logs and agent output.
aks Nat Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/nat-gateway.md
az group create --name myresourcegroup --location southcentralus
```azurecli-interactive
az aks create \
- --resource-group myresourcegroup \
+ --resource-group myResourceGroup \
  --name natcluster \
  --node-count 3 \
- --outbound-type managedNATGateway \
+ --outbound-type managedNATGateway \
  --nat-gateway-managed-outbound-ip-count 2 \
  --nat-gateway-idle-timeout 30
```
api-management Api Management Using With Internal Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-using-with-internal-vnet.md
After successful deployment, you should see your API Management service's **priv
### Enable connectivity using a Resource Manager template (`stv2` platform)
-* Azure Resource Manager [template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.apimanagement/api-management-create-with-internal-vnet-publicip) (API version 2021-01-01-preview )
+* Azure Resource Manager [template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.apimanagement/api-management-create-with-internal-vnet-publicip) (API version 2021-08-01 )
[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.apimanagement%2Fapi-management-create-with-internal-vnet-publicip%2Fazuredeploy.json)
api-management Api Management Using With Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-using-with-vnet.md
It can take 15 to 45 minutes to update the API Management instance. The Develope
### Enable connectivity using a Resource Manager template (`stv2` compute platform)
-* Azure Resource Manager [template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.apimanagement/api-management-create-with-external-vnet-publicip) (API version 2021-01-01-preview)
+* Azure Resource Manager [template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.apimanagement/api-management-create-with-external-vnet-publicip) (API version 2021-08-01)
[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.apimanagement%2Fapi-management-create-with-external-vnet-publicip%2Fazuredeploy.json)
api-management How To Server Sent Events https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/how-to-server-sent-events.md
Follow these guidelines when using API Management to reach a backend API that im
## Next steps
-* Learn more about [configuring policies](/azure/api-management/api-management-howto-policies) in API Management.
-* Learn about API Management [capacity](api-management-capacity.md).
+* Learn more about [configuring policies](./api-management-howto-policies.md) in API Management.
+* Learn about API Management [capacity](api-management-capacity.md).
api-management Websocket Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/websocket-api.md
Below are the current restrictions of WebSocket support in API Management:
* WebSocket APIs are not supported yet in the [self-hosted gateway](./self-hosted-gateway-overview.md). * Azure CLI, PowerShell, and SDK currently do not support management operations of WebSocket APIs. * 200 active connections limit per unit.
-* Websockets APIs support the following valid buffer types for messages: Close, BinaryFragment, BinayrMessage, UTF8Fragment, and UTF8Message.
+* WebSocket APIs support the following valid buffer types for messages: Close, BinaryFragment, BinaryMessage, UTF8Fragment, and UTF8Message.
### Unsupported policies
app-service Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migrate.md
There's no cost to migrate your App Service Environment. You'll stop being charg
> [App Service Environment v3 Networking](networking.md) > [!div class="nextstepaction"]
-> [Using an App Service Environment v3](using.md)
+> [Using an App Service Environment v3](using.md)
app-service Quickstart Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-java.md
cd agoncal-application-petstore-ee7
## Configure the Maven plugin
-The deployment process to Azure App Service will use your Azure credentials from the Azure CLI automatically. If the Azure CLI isn't installed locally, then the Maven plugin will authenticate with Oauth or device sign in. For more information, see [authentication with Maven plugins](https://github.com/microsoft/azure-maven-plugins/wiki/Authentication).
+> [!TIP]
+> The Maven plugin supports **Java 17** and **Tomcat 10.0**. For more information about latest support, see [Java 17 and Tomcat 10.0 are available on Azure App Service](https://devblogs.microsoft.com/java/java-17-and-tomcat-10-0-available-on-azure-app-service/).
++
+The deployment process to Azure App Service will use your Azure credentials from the Azure CLI automatically. If the Azure CLI is not installed locally, then the Maven plugin will authenticate with OAuth or device login. For more information, see [authentication with Maven plugins](https://github.com/microsoft/azure-maven-plugins/wiki/Authentication).
Run the Maven command below to configure the deployment. This command will help you to set up the App Service operating system, Java version, and Tomcat version.
application-gateway Http Response Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/http-response-codes.md
+
+ Title: HTTP response codes - Azure Application Gateway
+description: 'Learn how to troubleshoot Application Gateway HTTP response codes'
++++ Last updated : 04/19/2022++++
+# HTTP response codes in Application Gateway
+
+This article lists some HTTP response codes that can be returned by Azure Application Gateway. Common causes and troubleshooting steps are provided to help you determine the root cause. HTTP response codes can be returned to a client request whether or not a connection was initiated to a backend target.
+
+## 3XX response codes (redirection)
+
+300-399 responses are presented when a client request matches an application gateway rule that has redirects configured. Redirects can be configured on a rule as-is or via a path map rule. For more information about redirects, see [Application Gateway redirect overview](redirect-overview.md).
+
+#### 301 Permanent Redirect
+
+HTTP 301 responses are presented when a redirection rule is specified with the **Permanent** value.
+
+#### 302 Found
+
+HTTP 302 responses are presented when a redirection rule is specified with the **Found** value.
+
+#### 303 See Other
+
+HTTP 303 responses are presented when a redirection rule is specified with the **See Other** value.
+
+#### 307 Temporary Redirect
+
+HTTP 307 responses are presented when a redirection rule is specified with the **Temporary** value.
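For illustration, the behavior a client observes when one of these redirect responses is returned can be reproduced with a small local sketch. A Python standard-library server stands in for an application gateway listener with a **Permanent** (301) redirect rule; none of this is Azure-specific code, and all names here are made up for the example:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative sketch only: a local HTTP server plays the role of a gateway
# listener whose rule is configured with the "Permanent" value.
class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            self.send_response(301)                # 301 Permanent Redirect
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):                  # keep request logging quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 301 automatically; the client ends up at /new with a 200.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old")
final_status, body = resp.status, resp.read().decode()
print(final_status, body)
server.shutdown()
```

The client never surfaces the 301 here because the HTTP library follows it transparently; tools that disable auto-redirect would show the raw 3XX status instead.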
++
+## 4XX response codes (client error)
+
+400-499 response codes indicate an issue initiated by the client, such as requests to an unmatched hostname, request timeouts, unauthenticated requests, or malicious requests.
+
+#### 400 – Bad Request
+
+HTTP 400 response codes are commonly observed when:
+- Non-HTTP / HTTPS traffic is initiated to an application gateway with an HTTP or HTTPS listener.
+- HTTP traffic is initiated to a listener with HTTPS, with no redirection configured.
+- Mutual authentication is configured and unable to properly negotiate.
+
+For cases when mutual authentication is configured, several scenarios can lead to an HTTP 400 response being returned to the client, such as:
+- Client certificate isn't presented, but mutual authentication is enabled.
+- DN validation is enabled and the DN of the client certificate doesn't match the DN of the specified certificate chain.
+- Client certificate chain doesn't match certificate chain configured in the defined SSL Policy.
+- Client certificate is expired.
+- OCSP Client Revocation check is enabled and the certificate is revoked.
+- OCSP Client Revocation check is enabled, but the OCSP responder can't be contacted.
+- OCSP Client Revocation check is enabled, but OCSP responder isn't provided in the certificate.
+
+For more information about troubleshooting mutual authentication, see [Error code troubleshooting](mutual-authentication-troubleshooting.md#solution-2).
+
+#### 403 – Forbidden
+
+HTTP 403 Forbidden is presented when customers are using WAF SKUs and have WAF configured in Prevention mode. If enabled WAF rulesets or custom deny WAF rules match the characteristics of an inbound request, the client is presented a 403 Forbidden response.
+
+#### 404 – Page not found
+
+An HTTP 404 response can be returned if a request is sent to an application gateway that:
+- Is using a [v2 sku](overview-v2.md).
+- Has no hostname match defined in any [multi-site listeners](multiple-site-overview.md).
+- Isn't configured with a [basic listener](application-gateway-components.md#types-of-listeners).
+
+#### 408 – Request Timeout
+
+An HTTP 408 response can be observed when client requests to the frontend listener of the application gateway aren't completed within 60 seconds. This error can occur due to traffic congestion between on-premises networks and Azure, when traffic is inspected by virtual appliances, or when the client itself becomes overwhelmed.
+
+#### 499 – Client closed the connection
+
+An HTTP 499 response is presented if a client request sent to application gateways using the v2 SKU is closed before the server finished responding. This error can be observed when a large response is returned to the client, but the client closed or refreshed the browser/application before the server had a chance to finish responding.
++
+## 5XX response codes (server error)
+
+500-599 response codes indicate a problem has occurred with application gateway or the backend server while performing the request.
+
+#### 500 – Internal Server Error
+
+Azure Application Gateway shouldn't exhibit 500 response codes. Please open a support request if you see this code, because this issue is an internal error to the service. For information on how to open a support case, see [Create an Azure support request](/azure/azure-portal/supportability/how-to-create-azure-support-request).
+
+#### 502 – Bad Gateway
+
+HTTP 502 errors can have several root causes, for example:
+- NSG, UDR, or custom DNS is blocking access to backend pool members.
+- Back-end VMs or instances of [virtual machine scale sets](/azure/virtual-machine-scale-sets/overview) aren't responding to the default health probe.
+- Invalid or improper configuration of custom health probes.
+- Azure Application Gateway's [back-end pool isn't configured or empty](application-gateway-troubleshooting-502.md#empty-backendaddresspool).
+- None of the VMs or instances in [virtual machine scale set are healthy](application-gateway-troubleshooting-502.md#unhealthy-instances-in-backendaddresspool).
+- [Request time-out or connectivity issues](application-gateway-troubleshooting-502.md#request-time-out) with user requests.
+
+For information about scenarios where 502 errors occur, and how to troubleshoot them, see [Troubleshoot Bad Gateway errors](application-gateway-troubleshooting-502.md).
+
+#### 504 – Request timeout
+
+HTTP 504 errors are presented if a request is sent to application gateways using the v2 SKU and the backend response exceeds the time-out value associated with the listener's rule. This value is defined in the HTTP setting.
+
+## Next steps
+
+If the information in this article doesn't help to resolve the issue, [submit a support ticket](https://azure.microsoft.com/support/options/).
application-gateway Redirect Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-overview.md
Title: Redirect overview for Azure Application Gateway description: Learn about the redirect capability in Azure Application Gateway to redirect traffic received on one listener to another listener or to an external site.
-Previously updated : 11/16/2019
+Last updated : 04/19/2022
# Application Gateway redirect overview
You can use application gateway to redirect traffic. It has a generic redirecti
A common redirection scenario for many web applications is to support automatic HTTP to HTTPS redirection to ensure all communication between application and its users occurs over an encrypted path. In the past, customers have used techniques such as creating a dedicated backend pool whose sole purpose is to redirect requests it receives on HTTP to HTTPS. With redirection support in Application Gateway, you can accomplish this simply by adding a new redirect configuration to a routing rule, and specifying another listener with HTTPS protocol as the target listener.
-The following types of redirection are supported:
+## Redirection types
+A redirect type sets the response status code for the clients to understand the purpose of the redirect. The following types of redirection are supported:
-- 301 Permanent Redirect-- 302 Found-- 303 See Other-- 307 Temporary Redirect-
-Application Gateway redirection support offers the following capabilities:
--- **Global redirection**-
- Redirects from one listener to another listener on the gateway. This enables HTTP to HTTPS redirection on a site.
- When configuring redirects with a multi-site target listener, it is required that all the host names (with or without wildcard characters) defined as part of the source listener are also part of the destination listener. This ensures that no traffic is dropped due to missing host names on the destination listener while setting up HTTP to HTTPS redirection.
+- 301 (Moved permanently): Indicates that the target resource has been assigned a new permanent URI. Any future references to this resource will use one of the enclosed URIs. Use 301 status code for HTTP to HTTPS redirection.
+- 302 (Found): Indicates that the target resource is temporarily under a different URI. Since the redirection can change on occasion, the client should continue to use the effective request URI for future requests.
+- 307 (Temporary redirect): Indicates that the target resource is temporarily under a different URI. The user agent MUST NOT change the request method if it does an automatic redirection to that URI. Since the redirection can change over time, the client ought to continue using the original effective request URI for future requests.
+- 308 (Permanent redirect): Indicates that the target resource has been assigned a new permanent URI. Any future references to this resource should use one of the enclosed URIs.
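The method-preservation rule for 307 described above can be seen with a minimal local sketch. A Python standard-library server stands in for a gateway rule with the **Temporary** redirect type; nothing here is an Azure API, and all paths and payloads are invented for the example:

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative only: on a 307 the client must repeat the SAME method
# (here POST, with its body) at the new location.
class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        if self.path == "/old":
            self.send_response(307)                # Temporary redirect
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"POST kept: " + body)

    def log_message(self, *args):                  # keep request logging quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("POST", "/old", body=b"cart=1")
first = conn.getresponse()
location = first.getheader("Location")
first.read()

# Follow the 307 manually with the same method and body.
conn.request("POST", location, body=b"cart=1")
second = conn.getresponse()
result = second.read().decode()
print(first.status, second.status, result)
server.shutdown()
```

With a 301 or 302, older clients may instead switch the follow-up request to GET, which is why 307/308 exist for cases where the method must not change.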
+## Redirection capabilities
+- **Listener redirection**
+
+ Redirects from one listener to another listener. Listener redirection is commonly used to enable HTTP to HTTPS redirection.
+
- **Path-based redirection**
- This type of redirection enables HTTP to HTTPS redirection only on a specific site area, for example a shopping cart area denoted by /cart/*.
+ This type of redirection enables redirection only on a specific site area, for example, redirecting HTTP to HTTPS requests for a shopping cart area denoted by /cart/\*.
+
- **Redirect to external site** ![Diagram shows users and an App Gateway and connections between the two, including an unlocked H T T P red arrow, a not allowed 301 direct red arrow, and a locked H T T P S a green arrow.](./media/redirect-overview/redirect.png)
applied-ai-services Concept Layout https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-layout.md
You'll need a form document. You can use our [sample form document](https://raw.
* For best results, provide one clear photo or high-quality scan per document. * Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location. * For PDF and TIFF, up to 2000 pages can be processed (with a free tier subscription, only the first two pages are processed).
-* The file size must be less than 50 MB.
+* The file size must be less than 50 MB (4 MB for the free tier).
* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels. > [!NOTE]
applied-ai-services Concept Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md
See how text is extracted from forms and documents using the Form Recognizer Stu
* For best results, provide one clear photo or high-quality scan per document. * Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location. * For PDF and TIFF, up to 2000 pages can be processed (with a free tier subscription, only the first two pages are processed).
-* The file size must be less than 50 MB.
+* The file size must be less than 50 MB (4 MB for the free tier)
* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels. ## Supported languages and locales
applied-ai-services Build Custom Model V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/build-custom-model-v3.md
The Form Recognizer Studio provides and orchestrates all the API calls required
1. On the next step in the workflow, choose or create a Form Recognizer resource before you select continue. > [!IMPORTANT]
- > Custom neural models models are only available in a few regions. If you plan on training a neural model, please select or create a resource in one of [these supported regions](/azure/applied-ai-services/form-recognizer/concept-custom-neural#l).
+ > Custom neural models are only available in a few regions. If you plan on training a neural model, please select or create a resource in one of [these supported regions](../concept-custom-neural.md).
:::image type="content" source="../media/how-to/studio-select-resource.png" alt-text="Screenshot: Select the Form Recognizer resource.":::
applied-ai-services Try V3 Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-python-sdk.md
def analyze_general_documents():
docUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-layout.pdf" # create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
- endpoint=endpoint, credential=AzureKeyCredential(key)
- )
+ document_analysis_client = DocumentAnalysisClient(endpoint=endpoint, credential=AzureKeyCredential(key))
poller = document_analysis_client.begin_analyze_document_from_url( "prebuilt-document", docUrl)
def analyze_layout():
formUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-layout.pdf" # create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
- document_analysis_client = DocumentAnalysisClient(
- endpoint=endpoint, credential=AzureKeyCredential(key)
- )
+ document_analysis_client = DocumentAnalysisClient(endpoint=endpoint, credential=AzureKeyCredential(key))
poller = document_analysis_client.begin_analyze_document_from_url( "prebuilt-layout", formUrl)
def analyze_invoice():
invoiceUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-invoice.pdf" # create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
- document_analysis_client = DocumentAnalysisClient(
- endpoint=endpoint, credential=AzureKeyCredential(key)
- )
+ document_analysis_client = DocumentAnalysisClient(endpoint=endpoint, credential=AzureKeyCredential(key))
poller = document_analysis_client.begin_analyze_document_from_url( "prebuilt-invoice", invoiceUrl)
attestation Audit Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/attestation/audit-logs.md
-# Audit logs for Azure Attestation
+# Azure Attestation logging
-Audit logs are secure, immutable, timestamped records of discrete events that happened over time. These logs capture important events that may affect the functionality of your attestation instance.
+If you create one or more Azure Attestation resources, you'll want to monitor how and when your attestation instance is accessed, and by whom. You can do this by enabling logging for Microsoft Azure Attestation, which saves information in an Azure storage account you provide.
-Azure Attestation manages attestation instances and the policies associated with them. Actions associated with instance management and policy changes are audited and logged.
+Logging information will be available up to 10 minutes after the operation occurred (in most cases, it will be quicker than this). Since you provide the storage account, you can secure your logs via standard Azure access controls and delete logs you no longer want to keep in your storage account.
-This article contains information on the events that are logged, the information collected, and the location of these logs.
+## Interpret your Azure Attestation logs
-## About Audit logs
+When logging is enabled, up to three containers may be automatically created for you in your specified storage account: **insights-logs-auditevent, insights-logs-operational, insights-logs-notprocessed**. It is recommended to only use **insights-logs-operational** and **insights-logs-notprocessed**. **insights-logs-auditevent** was created to provide early access to logs for customers using VBS. Future enhancements to logging will occur in the **insights-logs-operational** and **insights-logs-notprocessed** containers.
-Azure Attestation uses code to produce audit logs for events that affect the way attestation is performed. This typically boils down to how or when policy changes are made to your attestation instance as well as some admin actions.
+**Insights-logs-operational** contains generic information across all TEE types.
-### Auditable Events
-Here are some of the audit logs we collect:
+**Insights-logs-notprocessed** contains requests which the service was unable to process, typically due to malformed HTTP headers, incomplete message bodies, or similar issues.
-| Event/API | Event Description |
-|--|--|
-| Create Instance | Creates a new instance of an attestation service. |
-| Destroy Instance | Destroys an instance of an attestation service. |
-| Add Policy Certificate | Addition of a certificate to the current set of policy management certificates. |
-| Remove Policy Certificate | Remove a certificate from the current set of policy management certificates. |
-| Set Current Policy | Sets the attestation policy for a given TEE type. |
-| Reset Attestation Policy | Resets the attestation policy for a given TEE type. |
-| Prepare to Update Policy | Prepare to update attestation policy for a given TEE type. |
-| Rehydrate Tenants After Disaster | Re-seals all of the attestation tenants on this instance of the attestation service. This can only be performed by Attestation Service admins. |
+Individual blobs are stored as text, formatted as a JSON blob. Let's look at an example log entry:
-### Collected information
-For each of these events, Azure Attestation collects the following information:
-- Operation Name-- Operation Success-- Operation Caller, which could be any of the following:
- - Azure AD UPN
- - Object ID
- - Certificate
- - Azure AD Tenant ID
-- Operation Target, which could be any of the following:
- - Environment
- - Service Region
- - Service Role
- - Service Role Instance
- - Resource ID
- - Resource Region
+```json
+{
+ "Time": "2021-11-03T19:33:54.3318081Z",
+ "resourceId": "/subscriptions/<subscription ID>/resourceGroups/<resource group name>/providers/Microsoft.Attestation/attestationProviders/<instance name>",
+ "region": "EastUS",
+ "operationName": "AttestSgxEnclave",
+ "category": "Operational",
+ "resultType": "Succeeded",
+ "resultSignature": "400",
+ "durationMs": 636,
+ "callerIpAddress": "::ffff:24.17.183.201",
+ "traceContext": "{\"traceId\":\"e4c24ac88f33c53f875e5141a0f4ce13\",\"parentId\":\"0000000000000000\",}",
+ "identity": "{\"callerAadUPN\":\"deschuma@microsoft.com\",\"callerAadObjectId\":\"6ab02abe-6ca2-44ac-834d-42947dbde2b2\",\"callerId\":\"deschuma@microsoft.com\"}",
+ "uri": "https://deschumatestrp.eus.test.attest.azure.net:443/attest/SgxEnclave?api-version=2018-09-01-preview",
+ "level": "Informational",
+ "location": "EastUS",
+ "properties":
+ {
+ "failureResourceId": "",
+ "failureCategory": "None",
+ "failureDetails": "",
+ "infoDataReceived":
+ {
+ "Headers":
+ {
+ "User-Agent": "PostmanRuntime/7.28.4"
+ },
+ "HeaderCount": 10,
+ "ContentType": "application/json",
+ "ContentLength": 6912,
+ "CookieCount": 0,
+ "TraceParent": ""
+ }
+ }
+ }
+```
-### Sample Audit log
+Most of these fields are documented in the [Top-level common schema](../azure-monitor/essentials/resource-logs-schema.md#top-level-common-schema). The following table lists the field names and descriptions for the entries not included in the top-level common schema:
-Audit logs are provided in JSON format. Here is an example of what an audit log may look like.
+| Field Name | Description |
+||--|
+| traceContext | JSON blob representing the W3C trace-context |
+| uri | Request URI |
-```json
-{
- "operationName": "SetCurrentPolicy",
- "resultType": "Success",
- "resultDescription": null,
- "auditEventCategory": [
- "ApplicationManagement"
- ],
- "nCloud": null,
- "requestId": null,
- "callerIpAddress": null,
- "callerDisplayName": null,
- "callerIdentities": [
- {
- "callerIdentityType": "ObjectID",
- "callerIdentity": "<some object ID>"
- },
- {
- "callerIdentityType": "TenantId",
- "callerIdentity": "<some tenant ID>"
- }
- ],
- "targetResources": [
- {
- "targetResourceType": "Environment",
- "targetResourceName": "PublicCloud"
- },
- {
- "targetResourceType": "ServiceRegion",
- "targetResourceName": "EastUS2"
- },
- {
- "targetResourceType": "ServiceRole",
- "targetResourceName": "AttestationRpType"
- },
- {
- "targetResourceType": "ServiceRoleInstance",
- "targetResourceName": "<some service role instance>"
- },
- {
- "targetResourceType": "ResourceId",
- "targetResourceName": "/subscriptions/<some subscription ID>/resourceGroups/<some resource group name>/providers/Microsoft.Attestation/attestationProviders/<some instance name>"
- },
- {
- "targetResourceType": "ResourceRegion",
- "targetResourceName": "EastUS2"
- }
- ],
- "ifxAuditFormat": "Json",
- "env_ver": "2.1",
- "env_name": "#Ifx.AuditSchema",
- "env_time": "2020-11-23T18:23:29.9427158Z",
- "env_epoch": "MKZ6G",
- "env_seqNum": 1277,
- "env_popSample": 0.0,
- "env_iKey": null,
- "env_flags": 257,
- "env_cv": "##00000000-0000-0000-0000-000000000000_00000000-0000-0000-0000-000000000000_00000000-0000-0000-0000-000000000000",
- "env_os": null,
- "env_osVer": null,
- "env_appId": null,
- "env_appVer": null,
- "env_cloud_ver": "1.0",
- "env_cloud_name": null,
- "env_cloud_role": null,
- "env_cloud_roleVer": null,
- "env_cloud_roleInstance": null,
- "env_cloud_environment": null,
- "env_cloud_location": null,
- "env_cloud_deploymentUnit": null
-}
-```
+The properties contain additional Azure Attestation-specific context:
+
+| Field Name | Description |
+||--|
+| failureResourceId | Resource ID of component which resulted in request failure |
+| failureCategory | Broad category indicating category of a request failure. Includes categories such as AzureNetworkingPhysical, AzureAuthorization etc. |
+| failureDetails | Detailed information about a request failure, if available |
+| infoDataReceived | Information about the request received from the client. Includes some HTTP headers, the number of headers received, the content type and content length |
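One detail worth noting when consuming these entries: string-valued fields such as `traceContext` and `identity` are themselves JSON documents encoded as strings, so they need a second decoding pass. A minimal sketch, using an entry shaped like the sample above but with invented placeholder values (not real log data):

```python
import json

# Hypothetical log entry shaped like the sample above; the trace ID and
# caller identity values are invented placeholders, not real data.
entry = {
    "operationName": "AttestSgxEnclave",
    "resultType": "Succeeded",
    "traceContext": "{\"traceId\":\"e4c24ac88f33c53f875e5141a0f4ce13\",\"parentId\":\"0000000000000000\"}",
    "identity": "{\"callerAadUPN\":\"user@contoso.com\",\"callerId\":\"user@contoso.com\"}",
}

# traceContext and identity are JSON blobs stored as strings: decode them again.
trace = json.loads(entry["traceContext"])
identity = json.loads(entry["identity"])
print(trace["traceId"], identity["callerAadUPN"])
```

Reading the blobs out of the storage containers themselves is a separate step (for example with the Azure Storage SDK); this only shows the per-entry decoding.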
+
+## Next steps
+- [How to enable Microsoft Azure Attestation logging](azure-diagnostic-monitoring.md)
attestation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/attestation/overview.md
OE standardizes specific requirements for verification of an enclave evidence. T
Client applications can be designed to take advantage of TPM attestation by delegating security-sensitive tasks to only take place after a platform has been validated to be secure. Such applications can then make use of Azure Attestation to routinely establish trust in the platform and its ability to access sensitive data.
-### Azure Confidential VM attestation
+### AMD SEV-SNP attestation
Azure [Confidential VM](../confidential-computing/confidential-vm-overview.md) (CVM) is based on [AMD processors with SEV-SNP technology](../confidential-computing/virtual-machine-solutions-amd.md) and aims to improve VM security posture by removing trust in host, hypervisor and Cloud Service Provider (CSP). To achieve this, CVM offers VM OS disk encryption option with platform-managed keys and binds the disk encryption keys to the virtual machine's TPM. When a CVM boots up, SNP report containing the guest VM firmware measurements will be sent to Azure Attestation. The service validates the measurements and issues an attestation token that is used to release keys from [Managed-HSM](../key-vault/managed-hsm/overview.md) or [Azure Key Vault](../key-vault/general/basic-concepts.md). These keys are used to decrypt the vTPM state of the guest VM, unlock the OS disk and start the CVM. The attestation and key release process is performed automatically on each CVM boot, and the process ensures the CVM boots up only upon successful attestation of the hardware.
automanage Windows Server Azure Edition Vnext https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automanage/windows-server-azure-edition-vnext.md
> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. > For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-[Windows Server: Azure Edition (WSAE)](https://aka.ms/wsae) is a new edition of Windows Server focused on innovation and efficiency. Featuring an annual release cadence and optimized to run on Azure properties, WSAE brings new functionality to Windows Server users faster than the traditional Long-Term Servicing Channel (LTSC) editions of Windows Server (2016,2019,2022, etc.) the first version of this new variant is Windows Server 2022 Datacenter: Azure Edition, announced at Microsoft Ignite in November 2021.
+[Windows Server: Azure Edition (WSAE)](./automanage-windows-server-services-overview.md) is a new edition of Windows Server focused on innovation and efficiency. Featuring an annual release cadence and optimized to run on Azure properties, WSAE brings new functionality to Windows Server users faster than the traditional Long-Term Servicing Channel (LTSC) editions of Windows Server (2016, 2019, 2022, etc.). The first version of this new variant is Windows Server 2022 Datacenter: Azure Edition, announced at Microsoft Ignite in November 2021.
The annual WSAE releases are delivered using Windows Update, rather than a full OS upgrade. As part of this annual release cadence, the WSAE Insider preview program will spin up each spring with the opportunity to access early builds of the next release - leading to general availability in the fall. Install the preview to get early access to all the new features and functionality prior to general availability. If you are a registered Microsoft Server Insider, you have access to create and use virtual machine images from this preview. For more information and to manage your Insider membership, visit the [Windows Insider home page](https://insider.windows.com/) or [Windows Insiders for Business home page.](https://insider.windows.com/for-business/)
automation Automation Manage Send Joblogs Log Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-manage-send-joblogs-log-analytics.md
Azure Automation can send runbook job status and job streams to your Log Analyti
- Trigger an email or alert based on your runbook job status (for example, failed or suspended).
- Write advanced queries across your job streams.
- Correlate jobs across Automation accounts.
- - Use customized views and search queries to visualize your runbook results, runbook job status, and other related key indicators or metrics through an [Azure dashboard](/azure/azure-portal/azure-portal-dashboards).
+ - Use customized views and search queries to visualize your runbook results, runbook job status, and other related key indicators or metrics through an [Azure dashboard](../azure-portal/azure-portal-dashboards.md).
- Get the audit logs related to Automation accounts, runbooks, and other asset create, modify and delete operations.
-Using Azure Monitor logs, you can consolidate logs from different resources in the same workspace where it can be analyzed with [queries](/azure/azure-monitor/logs/log-query-overview) to quickly retrieve, consolidate, and analyze the collected data. You can create and test queries using [Log Analytics](/azure/azure-monitor/logs/log-query-overview) in the Azure portal and then either directly analyze the data using these tools or save queries for use with [visualization](/azure/azure-monitor/best-practices-analysis) or [alert rules](/azure/azure-monitor/alerts/alerts-overview).
+Using Azure Monitor logs, you can consolidate logs from different resources in the same workspace where it can be analyzed with [queries](../azure-monitor/logs/log-query-overview.md) to quickly retrieve, consolidate, and analyze the collected data. You can create and test queries using [Log Analytics](../azure-monitor/logs/log-query-overview.md) in the Azure portal and then either directly analyze the data using these tools or save queries for use with [visualization](../azure-monitor/best-practices-analysis.md) or [alert rules](../azure-monitor/alerts/alerts-overview.md).
-Azure Monitor uses a version of the [Kusto query language (KQL)](/azure/kusto/query/) used by Azure Data Explorer that is suitable for simple log queries. It also includes advanced functionality such as aggregations, joins, and smart analytics. You can quickly learn the query language using [multiple lessons](/azure/azure-monitor/logs/get-started-queries).
+Azure Monitor uses a version of the [Kusto query language (KQL)](/azure/kusto/query/) used by Azure Data Explorer that is suitable for simple log queries. It also includes advanced functionality such as aggregations, joins, and smart analytics. You can quickly learn the query language using [multiple lessons](../azure-monitor/logs/get-started-queries.md).
## Azure Automation diagnostic settings
You can configure diagnostic settings in the Azure portal from the menu for the
:::image type="content" source="media/automation-manage-send-joblogs-log-analytics/destination-details-options-inline.png" alt-text="Screenshot showing selections in destination details section." lightbox="media/automation-manage-send-joblogs-log-analytics/destination-details-options-expanded.png":::
- - **Log Analytics** : Enter the Subscription ID and workspace name. If you don't have a workspace, you must [create one before proceeding](/azure/azure-monitor/logs/quick-create-workspace).
+ - **Log Analytics** : Enter the Subscription ID and workspace name. If you don't have a workspace, you must [create one before proceeding](../azure-monitor/logs/quick-create-workspace.md).
- **Event Hubs**: Specify the following criteria:
   - Subscription: The same subscription as that of the Event Hub.
- - Event Hub namespace: [Create Event Hub](/azure/event-hubs/event-hubs-create) if you don't have one yet.
- - Event Hub name (optional): If you don't specify a name, an event hub is created for each log category. If you are sending multiple categories, specify a name to limit the number of Event Hubs created. See [Azure Event Hubs quotas and limits](/azure/event-hubs/event-hubs-quotas) for details.
- - Event Hub policy (optional): A policy defines the permissions that the streaming mechanism has. See [Event Hubs feature](/azure/event-hubs/event-hubs-features#publisher-policy).
+ - Event Hub namespace: [Create Event Hub](../event-hubs/event-hubs-create.md) if you don't have one yet.
+ - Event Hub name (optional): If you don't specify a name, an event hub is created for each log category. If you are sending multiple categories, specify a name to limit the number of Event Hubs created. See [Azure Event Hubs quotas and limits](../event-hubs/event-hubs-quotas.md) for details.
+ - Event Hub policy (optional): A policy defines the permissions that the streaming mechanism has. See [Event Hubs feature](../event-hubs/event-hubs-features.md#publisher-policy).
- **Storage**: Choose the subscription, storage account, and retention policy.

:::image type="content" source="media/automation-manage-send-joblogs-log-analytics/storage-account-details-inline.png" alt-text="Screenshot showing the storage account." lightbox="media/automation-manage-send-joblogs-log-analytics/storage-account-details-expanded.png":::
- - **Partner integration**: You must first install a partner integration into your subscription. Configuration options will vary by partner. For more information, see [Azure Monitor integration](/azure/partner-solutions/overview).
+ - **Partner integration**: You must first install a partner integration into your subscription. Configuration options will vary by partner. For more information, see [Azure Monitor integration](../partner-solutions/overview.md).
1. Click **Save**.
-After a few moments, the new setting appears in your list of settings for this resource, and logs are streamed to the specified destinations as new event data is generated. There can be up to a 15-minute delay between when an event is emitted and when it appears in the [Log Analytics workspace](/azure/azure-monitor/logs/data-ingestion-time).
+After a few moments, the new setting appears in your list of settings for this resource, and logs are streamed to the specified destinations as new event data is generated. There can be up to a 15-minute delay between when an event is emitted and when it appears in the [Log Analytics workspace](../azure-monitor/logs/data-ingestion-time.md).
## Query the logs
To create an alert rule, create a log search for the runbook job records that sh
```kusto
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.AUTOMATION" and Category == "JobLogs" and (ResultType == "Failed" or ResultType == "Suspended")
| summarize AggregatedValue = count() by RunbookName_s
```
- 1. To open the **Create alert rule** screen, click **+New alert rule** at the top of the page. For more information on the options to configure the alerts, see [Log alerts in Azure](/azure/azure-monitor/alerts/alerts-log#create-a-log-alert-rule-in-the-azure-portal).
+ 1. To open the **Create alert rule** screen, click **+New alert rule** at the top of the page. For more information on the options to configure the alerts, see [Log alerts in Azure](../azure-monitor/alerts/alerts-log.md#create-a-new-log-alert-rule-in-the-azure-portal).
## Azure Automation diagnostic audit logs
You can now send audit logs also to the Azure Monitor workspace. This allows ent
## Difference between activity logs and audit logs
-Activity log is a [platform log](/azure/azure-monitor/essentials/platform-logs-overview) in Azure that provides insight into subscription-level events. The activity log for an Automation account includes information about when an automation resource is created, modified, or deleted. However, it does not capture the name or ID of the resource.
+Activity log is a [platform log](../azure-monitor/essentials/platform-logs-overview.md) in Azure that provides insight into subscription-level events. The activity log for an Automation account includes information about when an automation resource is created, modified, or deleted. However, it does not capture the name or ID of the resource.
Audit logs for Automation accounts capture the name and ID of the resource, such as an automation variable, credential, or connection, along with the type of operation performed on the resource. Azure Automation scrubs some details, such as client IP data, to conform with GDPR.
automation Automation Security Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-security-guidelines.md
Review the Azure Policy recommendations for Azure Automation and act as appropri
## Next steps
-* To learn how to use Azure role-based access control (Azure RBAC), see [Manage role permissions and security in Azure Automation](/azure/automation/automation-role-based-access-control).
-* For information on how Azure protects your privacy and secures your data, see [Azure Automation data security](/azure/automation/automation-managing-data).
-* To learn about configuring the Automation account to use encryption, see [Encryption of secure assets in Azure Automation](/azure/automation/automation-secure-asset-encryption).
+* To learn how to use Azure role-based access control (Azure RBAC), see [Manage role permissions and security in Azure Automation](./automation-role-based-access-control.md).
+* For information on how Azure protects your privacy and secures your data, see [Azure Automation data security](./automation-managing-data.md).
+* To learn about configuring the Automation account to use encryption, see [Encryption of secure assets in Azure Automation](./automation-secure-asset-encryption.md).
automation Automation Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-services.md
Multiple Azure services can fulfill the above requirements. Each service has its
### Azure Resource Manager (ARM) template
-Azure Resource Manager provides a language to develop repeatable and consistent deployment templates for Azure resources. The template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. It uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. In the template, you specify the resources to deploy and the properties for those resources. [Learn more](/azure/azure-resource-manager/templates/overview).
+Azure Resource Manager provides a language to develop repeatable and consistent deployment templates for Azure resources. The template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. It uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. In the template, you specify the resources to deploy and the properties for those resources. [Learn more](../azure-resource-manager/templates/overview.md).
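As a sketch of the declarative syntax described above, a minimal ARM template that deploys a single storage account might look like the following; the parameter name and `apiVersion` are illustrative, not taken from this article:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2021-09-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```

Note that the template states only the desired end state; Resource Manager works out the sequence of create-or-update operations.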
### Bicep
-We've introduced a new language named [Bicep](/azure/azure-resource-manager/bicep/overview) that offers the same capabilities as ARM templates but with a syntax that's easier to use. Each Bicep file is automatically converted to an ARM template during deployment. If you're considering infrastructure as code options, we recommend Bicep. For more information, see [What is Bicep?](/azure/azure-resource-manager/bicep/overview)
+We've introduced a new language named [Bicep](../azure-resource-manager/bicep/overview.md) that offers the same capabilities as ARM templates but with a syntax that's easier to use. Each Bicep file is automatically converted to an ARM template during deployment. If you're considering infrastructure as code options, we recommend Bicep. For more information, see [What is Bicep?](../azure-resource-manager/bicep/overview.md)
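As a sketch of the lighter syntax, a minimal Bicep file for a single storage account might look like the following (the parameter and resource names and the `apiVersion` are illustrative); at deployment time it's converted to equivalent ARM template JSON:

```bicep
// Illustrative names; not from this article.
param storageAccountName string

resource storage 'Microsoft.Storage/storageAccounts@2021-09-01' = {
  name: storageAccountName
  location: resourceGroup().location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```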
The following table describes the scenarios and users for ARM template and Bicep:
The following table describes the scenarios and users for ARM template and Bicep
### Azure Blueprints (Preview)
- Azure Blueprints (Preview) define a repeatable set of Azure resources that implements and adheres to an organization's standards, patterns, and requirements. Blueprints are a declarative way to orchestrate the deployment of various resource templates and other artifacts, such as role assignments, policy assignments, ARM templates, and resource groups. [Learn more](/azure/governance/blueprints/overview).
+ Azure Blueprints (Preview) define a repeatable set of Azure resources that implements and adheres to an organization's standards, patterns, and requirements. Blueprints are a declarative way to orchestrate the deployment of various resource templates and other artifacts, such as role assignments, policy assignments, ARM templates, and resource groups. [Learn more](../governance/blueprints/overview.md).
**Scenarios** | **Users** |
The following table describes the scenarios and users for ARM template and Bicep
-### [Azure Automation](/azure/automation/overview)
+### [Azure Automation](./overview.md)
-Azure Automation orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environments. It provides persistent shared assets, including variables, connections, and objects, that allow orchestration of complex jobs. [Learn more](/azure/automation/automation-runbook-gallery).
+Azure Automation orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environments. It provides persistent shared assets, including variables, connections, and objects, that allow orchestration of complex jobs. [Learn more](./automation-runbook-gallery.md).
There are more than 3,000 modules in the PowerShell Gallery, and the PowerShell community continues to grow. Because Azure Automation is based on PowerShell modules, it can work with multiple applications and vendors, both first party and third party. As more application vendors release PowerShell modules for integration, extensibility, and automation tasks, you can use an existing PowerShell script as-is and execute it as a PowerShell runbook in an Automation account without making any changes.

**Scenarios** | **Users** |
- | Allows you to write an [Automation PowerShell runbook](/azure/automation/learn/powershell-runbook-managed-identity) that deploys an Azure resource by using an [Azure Resource Manager template](/azure/azure-resource-manager/templates/quickstart-create-templates-use-the-portal).</br> </br> Schedule tasks, for example, stop dev/test VMs or services at night and turn them on during the day. </br> </br> Respond to alerts such as system alerts, service alerts, high CPU/memory alerts, create ServiceNow tickets, and so on. </br> </br> Hybrid automation, where you can automate on-premises servers such as SQL Server, Active Directory, and so on. </br> </br> Azure resource life-cycle management and governance, including resource provisioning, de-provisioning, adding correct tags, locks, NSGs, and so on. | IT administrators, system administrators, and IT operations administrators who are skilled at using PowerShell or Python-based scripting. </br> </br> Infrastructure administrators who manage the on-premises infrastructure using scripts or execute long-running jobs such as month-end operations on servers running on-premises.
+ | Allows you to write an [Automation PowerShell runbook](./learn/powershell-runbook-managed-identity.md) that deploys an Azure resource by using an [Azure Resource Manager template](../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md).</br> </br> Schedule tasks, for example, stop dev/test VMs or services at night and turn them on during the day. </br> </br> Respond to alerts such as system alerts, service alerts, high CPU/memory alerts, create ServiceNow tickets, and so on. </br> </br> Hybrid automation, where you can automate on-premises servers such as SQL Server, Active Directory, and so on. </br> </br> Azure resource life-cycle management and governance, including resource provisioning, de-provisioning, adding correct tags, locks, NSGs, and so on. | IT administrators, system administrators, and IT operations administrators who are skilled at using PowerShell or Python-based scripting. </br> </br> Infrastructure administrators who manage the on-premises infrastructure using scripts or execute long-running jobs such as month-end operations on servers running on-premises.
### Azure Automation based in-guest management
-**Configuration management** : Collects inventory and tracks changes in your environment. [Learn more](/azure/automation/change-tracking/overview).
-You can configure the desired state of your machines to discover and correct configuration drift. [Learn more](/azure/automation/automation-dsc-overview).
+**Configuration management** : Collects inventory and tracks changes in your environment. [Learn more](./change-tracking/overview.md).
+You can configure the desired state of your machines to discover and correct configuration drift. [Learn more](./automation-dsc-overview.md).
-**Update management** : Assesses compliance of servers and can schedule update installation on your machines. [Learn more](/azure/automation/update-management/overview).
+**Update management** : Assesses compliance of servers and can schedule update installation on your machines. [Learn more](./update-management/overview.md).
**Scenarios** | **Users** |
You can configure desired the state of your machines to discover and correct con
### Azure Automanage (Preview)
-Replaces repetitive, day-to-day operational tasks with an exception-only management model, where a healthy, steady-state VM equals hands-free management. [Learn more](/azure/automanage/automanage-virtual-machines).
+Replaces repetitive, day-to-day operational tasks with an exception-only management model, where a healthy, steady-state VM equals hands-free management. [Learn more](../automanage/automanage-virtual-machines.md).
**Linux and Windows support** - You can intelligently onboard virtual machines to select best-practice Azure services.
Replaces repetitive, day-to-day operational tasks with an exception-only managem
### Azure Policy based Guest Configuration
-Azure Policy based Guest configuration is the next iteration of Azure Automation State configuration. [Learn more](/azure/governance/policy/concepts/guest-configuration-policy-effects).
+Azure Policy based Guest configuration is the next iteration of Azure Automation State configuration. [Learn more](../governance/policy/concepts/guest-configuration-policy-effects.md).
You can check on what is installed in:
- - The next iteration of [Azure Automation State Configuration](/azure/automation/automation-dsc-overview).
+ - The next iteration of [Azure Automation State Configuration](./automation-dsc-overview.md).
- For known-bad apps, protocols, certificates, administrator privileges, and health of agents.
- For customer-authored content.

**Scenarios** | **Users** |
- | Obtain compliance data that may include: the configuration of the operating system (files, registry, and services), application configuration or presence, and environment settings. </br> </br> Audit or deploy settings to all machines (Set) in scope, either reactively to existing machines or proactively to new machines as they are deployed. </br> </br> Respond to policy events to provide [remediation on demand or continuous remediation.](/azure/governance/policy/concepts/guest-configuration-policy-effects#remediation-on-demand-applyandmonitor) | Central IT, infrastructure administrators, and auditors (cloud custodians) working toward regulatory requirements at scale and ensuring that servers' end state looks as desired. </br> </br> The application teams validate compliance before releasing change.
+ | Obtain compliance data that may include: the configuration of the operating system (files, registry, and services), application configuration or presence, and environment settings. </br> </br> Audit or deploy settings to all machines (Set) in scope, either reactively to existing machines or proactively to new machines as they are deployed. </br> </br> Respond to policy events to provide [remediation on demand or continuous remediation.](../governance/policy/concepts/guest-configuration-policy-effects.md#remediation-on-demand-applyandmonitor) | Central IT, infrastructure administrators, and auditors (cloud custodians) working toward regulatory requirements at scale and ensuring that servers' end state looks as desired. </br> </br> The application teams validate compliance before releasing change.
### Azure Automation - Process Automation
-Orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environment. [Learn more](/azure/automation/automation-runbook-types?).
+Orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environment. [Learn more](./automation-runbook-types.md).
- It provides persistent shared assets, including variables, connections, and objects, that allow orchestration of complex jobs.
- - You can invoke a runbook based on an [Azure Monitor alert](/azure/automation/automation-create-alert-triggered-runbook) or through a [webhook](/azure/automation/automation-webhooks).
+ - You can invoke a runbook based on an [Azure Monitor alert](./automation-create-alert-triggered-runbook.md) or through a [webhook](./automation-webhooks.md).
**Scenarios** | **Users** |
Orchestrates repetitive processes using graphical, PowerShell, and Python runboo
### Azure functions
-Provides a serverless, event-driven compute platform for automation that allows you to write code to react to critical events from various sources, third-party services, and on-premises systems (for example, an HTTP trigger) without worrying about the underlying platform. [Learn more](/azure/azure-functions/functions-overview).
+Provides a serverless, event-driven compute platform for automation that allows you to write code to react to critical events from various sources, third-party services, and on-premises systems (for example, an HTTP trigger) without worrying about the underlying platform. [Learn more](../azure-functions/functions-overview.md).
- You can write functions in a language of your choice, such as C#, Java, JavaScript, PowerShell, or Python, and focus on specific pieces of code. The Functions runtime is open source.
- You can choose the hosting plan according to your function app scaling requirements, functionality, and resources required.
- - You can orchestrate complex workflows through [durable functions](/azure/azure-functions/durable/durable-functions-overview?tabs=csharp).
- - You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](/azure/azure-functions/functions-best-practices?tabs=csharp#write-robust-functions).
- - When you write PowerShell scripts within the Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](/azure/azure-functions/functions-reference-powershell?tabs=portal).
+ - You can orchestrate complex workflows through [durable functions](../azure-functions/durable/durable-functions-overview.md?tabs=csharp).
+ - You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](../azure-functions/functions-best-practices.md?tabs=csharp#write-robust-functions).
+ - When you write PowerShell scripts within the Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](../azure-functions/functions-reference-powershell.md?tabs=portal).
**Scenarios** | **Users** |
Provides a serverless event-driven compute platform for automation that allows y
### Azure logic apps
-Logic Apps is a platform for creating and running complex orchestration workflows that integrate your apps, data, services, and systems. [Learn more](/azure/logic-apps/logic-apps-overview).
+Logic Apps is a platform for creating and running complex orchestration workflows that integrate your apps, data, services, and systems. [Learn more](../logic-apps/logic-apps-overview.md).
- Allows you to build smart integrations between first-party and third-party apps, services, and systems running across on-premises, hybrid, and cloud-native environments.
- Allows you to use managed connectors from a growing ecosystem of more than 450 Azure connectors in your workflows.
Logic Apps is a platform for creating and running complex orchestration workflow
### Azure Automation - Process Automation
-Orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environment. It provides persistent shared assets, including variables, connections, and objects, that allow orchestration of complex jobs. [Learn more](/azure/automation/overview).
+Orchestrates repetitive processes using graphical, PowerShell, and Python runbooks in the cloud or hybrid environment. It provides persistent shared assets, including variables, connections, and objects, that allow orchestration of complex jobs. [Learn more](./overview.md).
**Scenarios** | **Users** |
Orchestrates repetitive processes using graphical, PowerShell, and Python runboo
### Azure functions
-Provides a serverless, event-driven compute platform for automation that allows you to write code to react to critical events from various sources, third-party services, and on-premises systems (for example, an HTTP trigger) without worrying about the underlying platform. [Learn more](/azure/azure-functions/functions-overview).
+Provides a serverless, event-driven compute platform for automation that allows you to write code to react to critical events from various sources, third-party services, and on-premises systems (for example, an HTTP trigger) without worrying about the underlying platform. [Learn more](../azure-functions/functions-overview.md).
- You can write functions in a language of your choice, such as C#, Java, JavaScript, PowerShell, or Python, and focus on specific pieces of code. The Functions runtime is open source.
- You can choose the hosting plan according to your function app scaling requirements, functionality, and resources required.
- - You can orchestrate complex workflows through [durable functions](/azure/azure-functions/durable/durable-functions-overview?tabs=csharp).
- - You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](/azure/azure-functions/functions-best-practices?tabs=csharp#write-robust-functions).
- - When you write PowerShell scripts within the Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](/azure/azure-functions/functions-reference-powershell?tabs=portal).
+ - You can orchestrate complex workflows through [durable functions](../azure-functions/durable/durable-functions-overview.md?tabs=csharp).
+ - You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](../azure-functions/functions-best-practices.md?tabs=csharp#write-robust-functions).
+ - When you write PowerShell scripts within the Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](../azure-functions/functions-reference-powershell.md?tabs=portal).
**Scenarios** | **Users** |
Provides a serverless event-driven compute platform for automation that allows y
## Next steps
-- To learn how to securely execute automation jobs, see [best practices for security in Azure Automation](/azure/automation/automation-security-guidelines).
+- To learn how to securely execute automation jobs, see [best practices for security in Azure Automation](./automation-security-guidelines.md).
automation Automation Webhooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-webhooks.md
Consider the following strategies:
## Create a webhook

> [!NOTE]
-> When you use a webhook with a PowerShell 7 runbook, it auto-converts the webhook input parameter to invalid JSON. For more information, see [Known issues - 7.1 (preview)](/azure/automation/automation-runbook-types#known-issues71-preview). We recommend that you use the webhook with a PowerShell 5 runbook.
+> When you use a webhook with a PowerShell 7 runbook, it auto-converts the webhook input parameter to invalid JSON. For more information, see [Known issues - 7.1 (preview)](./automation-runbook-types.md#known-issues71-preview). We recommend that you use the webhook with a PowerShell 5 runbook.
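For context, a runbook webhook is started with a plain HTTP POST, and the request body becomes the runbook's `WebhookData` input. The Python sketch below shows the shape of such a call; the URI and payload fields are hypothetical, and the actual request is left commented out:

```python
import json
from urllib import request

# Hypothetical webhook URI; the embedded token is the only credential,
# so treat the full URI as a secret.
webhook_uri = "https://<region>.webhook.azure-automation.net/webhooks?token=<token>"

# Hypothetical payload; the runbook reads it back from WebhookData.RequestBody.
payload = json.dumps([{"Name": "vm-demo-01", "ResourceGroup": "rg-demo"}]).encode()

req = request.Request(
    webhook_uri,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req)  # a successful call returns 202 Accepted
```

Because the token in the URI is the only authentication, store the URI securely and rotate the webhook if it leaks.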
1. Create PowerShell runbook with the following code:
Automation webhooks can also be created using [Azure Resource Manager](../azure-
## Next steps
-* To trigger a runbook from an alert, see [Use an alert to trigger an Azure Automation runbook](automation-create-alert-triggered-runbook.md).
+* To trigger a runbook from an alert, see [Use an alert to trigger an Azure Automation runbook](automation-create-alert-triggered-runbook.md).
availability-zones Region Types Service Categories Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/availability-zones/region-types-service-categories-azure.md
As mentioned previously, Azure classifies services into three categories: founda
> | Azure Machine Learning |
> | Azure Managed Instance for Apache Cassandra |
> | Azure NetApp Files |
-> | Azure Purview |
+> | Microsoft Purview |
> | Azure Red Hat OpenShift |
> | Azure Remote Rendering |
> | Azure SignalR Service |
azure-app-configuration Howto Import Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-import-export-data.md
From the Azure portal, follow these steps:
| Separator | The separator is the character parsed in your imported configuration file to separate key-values which will be added to your configuration store. Select one of the following options: `.`, `,`, `:`, `;`, `/`, `-`. | : |
| Prefix | Optional. A key prefix is the beginning part of a key. Prefixes can be used to manage groups of keys in a configuration store. | TestApp:Settings:Backgroundcolor |
| Label | Optional. Select an existing label or enter a new label that will be assigned to your imported key-values. | prod |
- | Content type | Optional. Indicate if the file you're importing is a Key Vault reference or a JSON file. For more information about Key Vault references, go to [Use Key Vault references in an ASP.NET Core app](/azure/azure-app-configuration/use-key-vault-references-dotnet-core). | JSON (application/json) |
+ | Content type | Optional. Indicate if the file you're importing is a Key Vault reference or a JSON file. For more information about Key Vault references, go to [Use Key Vault references in an ASP.NET Core app](./use-key-vault-references-dotnet-core.md). | JSON (application/json) |
1. Select **Apply** to proceed with the import.

### [Azure CLI](#tab/azure-cli)
-Use the Azure CLI as explained below to import App Configuration data. If you don't have the Azure CLI installed locally, you can optionally use [Azure Cloud Shell](/azure/cloud-shell/overview). Specify the source of the data: `appconfig`, `appservice` or `file`. Optionally specify a source label with `--src-label` and a label to apply with `--label`.
+Use the Azure CLI as explained below to import App Configuration data. If you don't have the Azure CLI installed locally, you can optionally use [Azure Cloud Shell](../cloud-shell/overview.md). Specify the source of the data: `appconfig`, `appservice` or `file`. Optionally specify a source label with `--src-label` and a label to apply with `--label`.
Import all keys and feature flags from a file and apply test label.
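To make the separator and prefix options concrete, here's a rough Python sketch of how a nested JSON document is flattened into key-values before import; the `flatten` helper and sample values are illustrative, not App Configuration's actual implementation:

```python
import json

def flatten(obj, sep=":", prefix=""):
    # Recursively walk the nested JSON, joining key segments with `sep`
    # and prepending the optional key prefix.
    items = {}
    for key, value in obj.items():
        full = f"{prefix}{key}"
        if isinstance(value, dict):
            items.update(flatten(value, sep, f"{full}{sep}"))
        else:
            items[full] = value
    return items

doc = json.loads('{"Settings": {"Backgroundcolor": "yellow", "FontSize": 12}}')
pairs = flatten(doc, sep=":", prefix="TestApp:")
# pairs == {"TestApp:Settings:Backgroundcolor": "yellow", "TestApp:Settings:FontSize": 12}
```

Each flattened pair becomes one key-value in the store, optionally tagged with the label you chose.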
From the [Azure portal](https://portal.azure.com), follow these steps:
### [Azure CLI](#tab/azure-cli)
-Use the Azure CLI as explained below to export configurations from App Configuration to another place. If you don't have the Azure CLI installed locally, you can optionally use [Azure Cloud Shell](/azure/cloud-shell/overview). Specify the destination of the data: `appconfig`, `appservice` or `file`. Specify a label for the data you want to export with `--label` or export data with no label by not entering a label.
+Use the Azure CLI as explained below to export configurations from App Configuration to another place. If you don't have the Azure CLI installed locally, you can optionally use [Azure Cloud Shell](../cloud-shell/overview.md). Specify the destination of the data: `appconfig`, `appservice` or `file`. Specify a label for the data you want to export with `--label` or export data with no label by not entering a label.
> [!IMPORTANT]
> If the keys you want to export have labels, be sure to select the corresponding labels. If you don't select a label, only keys without labels will be exported.
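As a sketch of the export direction, the following exports every key-value carrying the `prod` label to a local JSON file (the store name and output path are placeholders):

```azurecli-interactive
# Export key-values labeled "prod" from the store into a JSON file.
az appconfig kv export --name <your-store-name> --destination file --path ./appconfig-prod.json --format json --label prod
```

Omitting `--label` here would export only keys that carry no label.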
For more details and examples, go to [az appconfig kv export](/cli/azure/appconf
## Next steps

> [!div class="nextstepaction"]
-> [Create an ASP.NET Core web app](./quickstart-aspnet-core-app.md)
+> [Create an ASP.NET Core web app](./quickstart-aspnet-core-app.md)
azure-arc Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/connectivity.md
Azure Arc-enabled data services provides you the option to connect to Azure in t
The connectivity mode provides you the flexibility to choose how much data is sent to Azure and how users interact with the Arc Data Controller. Depending on the connectivity mode that is chosen, some functionality of Azure Arc-enabled data services may or may not be available.
-Importantly, if the Azure Arc-enabled data services are directly connected to Azure, then users can use [Azure Resource Manager APIs](/rest/api/resources/), the Azure CLI, and the Azure portal to operate the Azure Arc data services. The experience in directly connected mode is much like how you would use any other Azure service with provisioning/de-provisioning, scaling, configuring, and so on all in the Azure portal. If the Azure Arc-enabled data services are indirectly connected to Azure, then the Azure portal is a read-only view. You can see the inventory of SQL managed instances and Postgres Hyperscale instances that you have deployed and the details about them, but you cannot take action on them in the Azure portal. In the indirectly connected mode, all actions must be taken locally using Azure Data Studio, the appropriate CLI, or Kubernetes native tools like kubectl.
+Importantly, if the Azure Arc-enabled data services are directly connected to Azure, then users can use [Azure Resource Manager APIs](/rest/api/resources/), the Azure CLI, and the Azure portal to operate the Azure Arc data services. The experience in directly connected mode is much like how you would use any other Azure service with provisioning/de-provisioning, scaling, configuring, and so on all in the Azure portal. If the Azure Arc-enabled data services are indirectly connected to Azure, then the Azure portal is a read-only view. You can see the inventory of SQL managed instances and PostgreSQL Hyperscale instances that you have deployed and the details about them, but you cannot take action on them in the Azure portal. In the indirectly connected mode, all actions must be taken locally using Azure Data Studio, the appropriate CLI, or Kubernetes native tools like kubectl.
Additionally, Azure Active Directory and Azure Role-Based Access Control can be used in the directly connected mode only because there is a dependency on a continuous and direct connection to Azure to provide this functionality.
azure-arc View Arc Data Services Inventory In Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/view-arc-data-services-inventory-in-azure-portal.md
You can view your Azure Arc-enabled data services in the Azure portal or in your
## View resources in Azure portal
-After you upload your [metrics, logs](upload-metrics-and-logs-to-azure-monitor.md), or [usage](view-billing-data-in-azure.md), you can view your Azure Arc-enabled SQL managed instances or Azure Arc-enabled Postgres Hyperscale server groups in the Azure portal. To view your resource in the [Azure portal](https://portal.azure.com), follow these steps:
+After you upload your [metrics, logs](upload-metrics-and-logs-to-azure-monitor.md), or [usage](view-billing-data-in-azure.md), you can view your Azure Arc-enabled SQL managed instances or Azure Arc-enabled PostgreSQL Hyperscale server groups in the Azure portal. To view your resource in the [Azure portal](https://portal.azure.com), follow these steps:
1. Go to **All services**.
1. Search for your database instance type.
azure-arc View Data Controller In Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/view-data-controller-in-azure-portal.md
In the **indirect** connected mode, you must export and upload at least one type
## Azure portal
-After you complete your first [metrics or logs upload to Azure](upload-metrics-and-logs-to-azure-monitor.md) or [usage data upload](view-billing-data-in-azure.md), you can see the Azure Arc data controller and any Azure Arc-enabled SQL managed instances or Azure Arc-enabled Postgres Hyperscale server resources in the [Azure portal](https://portal.azure.com).
+After you complete your first [metrics or logs upload to Azure](upload-metrics-and-logs-to-azure-monitor.md) or [usage data upload](view-billing-data-in-azure.md), you can see the Azure Arc data controller and any Azure Arc-enabled SQL managed instances or Azure Arc-enabled PostgreSQL Hyperscale server resources in the [Azure portal](https://portal.azure.com).
To find your data controller, search for it by name in the search bar and then select it.
azure-arc Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/troubleshooting.md
When attempting to onboard Kubernetes clusters to the Azure Arc platform, the lo
Cannot load native module 'Crypto.Hash._MD5'
```
-Sometimes, dependent modules fail to download successfully when adding the extensions `connectedk8s` and `k8s-configuration` through Azure CLI or Azure Powershell. To fix this problem, manually remove and then add the extensions in the local environment.
+Sometimes, dependent modules fail to download successfully when adding the extensions `connectedk8s` and `k8s-configuration` through Azure CLI or Azure PowerShell. To fix this problem, manually remove and then add the extensions in the local environment.
To remove the extensions, use:
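For illustration, the removal and reinstallation might look like this (a sketch; the extension names are taken from the paragraph above):

```azurecli-interactive
# Remove the locally cached extensions along with their dependent modules.
az extension remove --name connectedk8s
az extension remove --name k8s-configuration

# Add them back, forcing a fresh download of the dependencies.
az extension add --name connectedk8s
az extension add --name k8s-configuration
```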
Once above permissions are granted, you can now proceed to [enabling the custom
The following troubleshooting steps provide guidance on validating the deployment of all the Open Service Mesh extension components on your cluster.
-### 1. Check OSM Controller **Deployment**
+### Check OSM Controller **Deployment**
```bash
kubectl get deployment -n arc-osm-system --selector app=osm-controller
```
NAME READY UP-TO-DATE AVAILABLE AGE
osm-controller 1/1 1 1 59m
```
-### 2. Check the OSM Controller **Pod**
+### Check the OSM Controller **Pod**
```bash
kubectl get pods -n arc-osm-system --selector app=osm-controller
```
osm-controller-b5bd66db-wvl9w 1/1 Running 0 31m
Even though we had one controller _evicted_ at some point, we have another one which is `READY 1/1` and `Running` with `0` restarts. If the column `READY` is anything other than `1/1` the service mesh would be in a broken state.
-Column `READY` with `0/1` indicates the control plane container is crashing - we need to get logs. See `Get OSM Controller Logs from Azure Support Center` section below.
+Column `READY` with `0/1` indicates the control plane container is crashing - we need to get logs. Use the following command to inspect controller logs:
+```bash
+kubectl logs -n arc-osm-system -l app=osm-controller
+```
Column `READY` with a number higher than 1 after the `/` would indicate that there are sidecars installed. OSM Controller would most likely not work with any sidecars attached to it.
-### 3. Check OSM Controller **Service**
+### Check OSM Controller **Service**
```bash
kubectl get service -n arc-osm-system osm-controller
```
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AG
osm-controller ClusterIP 10.0.31.254 <none> 15128/TCP,9092/TCP 67m
```
-> Note: The `CLUSTER-IP` would be different. The service `NAME` and `PORT(S)` must be the same as seen in the output.
+> [!NOTE]
+> The `CLUSTER-IP` would be different. The service `NAME` and `PORT(S)` must be the same as seen in the output.
-### 4. Check OSM Controller **Endpoints**:
+### Check OSM Controller **Endpoints**
```bash
kubectl get endpoints -n arc-osm-system osm-controller
```
osm-controller 10.240.1.115:9092,10.240.1.115:15128 69m
If the user's cluster has no `ENDPOINTS` for `osm-controller`, this would indicate that the control plane is unhealthy. This may be caused by the OSM Controller pod crashing or never being deployed correctly.
-### 5. Check OSM Injector **Deployment**
+### Check OSM Injector **Deployment**
```bash
kubectl get deployments -n arc-osm-system osm-injector
```
NAME READY UP-TO-DATE AVAILABLE AGE
osm-injector 1/1 1 1 73m
```
-### 6. Check OSM Injector **Pod**
+### Check OSM Injector **Pod**
```bash
kubectl get pod -n arc-osm-system --selector app=osm-injector
```
osm-injector-5986c57765-vlsdk 1/1 Running 0 73m
The `READY` column must be `1/1`. Any other value would indicate an unhealthy osm-injector pod.
-### 7. Check OSM Injector **Service**
+### Check OSM Injector **Service**
```bash
kubectl get service -n arc-osm-system osm-injector
```
osm-injector ClusterIP 10.0.39.54 <none> 9090/TCP 75m
Ensure the port listed for the `osm-injector` service is `9090`. There should be no `EXTERNAL-IP`.
-### 8. Check OSM Injector **Endpoints**
+### Check OSM Injector **Endpoints**
```bash
kubectl get endpoints -n arc-osm-system osm-injector
```
osm-injector 10.240.1.172:9090 75m
For OSM to function, there must be at least one endpoint for `osm-injector`. The IP address of your OSM Injector endpoints will be different. The port `9090` must be the same.
-### 9. Check **Validating** and **Mutating** webhooks:
+### Check **Validating** and **Mutating** webhooks
```bash
kubectl get ValidatingWebhookConfiguration --selector app=osm-controller
```
If the Validating Webhook is healthy, you will get output similar to the following:
```
-NAME WEBHOOKS AGE
-arc-osm-webhook-osm 1 81m
+NAME WEBHOOKS AGE
+osm-validator-mesh-osm 1 81m
```

```bash
-kubectl get MutatingWebhookConfiguration --selector app=osm-controller
+kubectl get MutatingWebhookConfiguration --selector app=osm-injector
```
If the Mutating Webhook is healthy, you will get output similar to the following:
```
-NAME WEBHOOKS AGE
-arc-osm-webhook-osm 1 102m
+NAME WEBHOOKS AGE
+arc-osm-webhook-osm 1 102m
```
-Check for the service and the CA bundle of the **Validating** webhook:
-```
-kubectl get ValidatingWebhookConfiguration arc-osm-webhook-osm -o json | jq '.webhooks[0].clientConfig.service'
+Check for the service and the CA bundle of the **Validating** webhook
+```bash
+kubectl get ValidatingWebhookConfiguration osm-validator-mesh-osm -o json | jq '.webhooks[0].clientConfig.service'
```
A well-configured Validating Webhook Configuration would have the following output:
A well configured Validating Webhook Configuration would have the following outp
{
  "name": "osm-config-validator",
  "namespace": "arc-osm-system",
- "path": "/validate-webhook",
+ "path": "/validate",
  "port": 9093
}
```
-Check for the service and the CA bundle of the **Mutating** webhook:
+Check for the service and the CA bundle of the **Mutating** webhook
```bash
kubectl get MutatingWebhookConfiguration arc-osm-webhook-osm -o json | jq '.webhooks[0].clientConfig.service'
```
A well configured Mutating Webhook Configuration would have the following output
Check whether OSM Controller has given the Validating (or Mutating) Webhook a CA Bundle by using the following command:
```bash
-kubectl get ValidatingWebhookConfiguration arc-osm-webhook-osm -o json | jq -r '.webhooks[0].clientConfig.caBundle' | wc -c
+kubectl get ValidatingWebhookConfiguration osm-validator-mesh-osm -o json | jq -r '.webhooks[0].clientConfig.caBundle' | wc -c
```
Example output:
```bash
1845
```
-The number in the output indicates the number of bytes, or the size of the CA Bundle. If this is empty, 0, or some number under a 1000, it would indicate that the CA Bundle is not correctly provisioned. Without a correct CA Bundle, the ValidatingWebhook would throw an error and prohibit you from making changes to the `osm-config` ConfigMap in the `arc-osm-system` namespace.
-
-Let's look at a sample error when the CA Bundle is incorrect:
-- An attempt to change the `osm-config` ConfigMap:
- ```bash
- kubectl patch ConfigMap osm-config -n arc-osm-system --type merge --patch '{"data":{"config_resync_interval":"2m"}}'
- ```
-- Error output:
- ```bash
- Error from server (InternalError): Internal error occurred: failed calling webhook "osm-config-webhook.k8s.io": Post https://osm-config-validator.arc-osm-system.svc:9093/validate-webhook?timeout=30s: x509: certificate signed by unknown authority
- ```
-
-Use one of the following workarounds when the **Validating** Webhook Configuration has a bad certificate:
-- Option 1. Restart OSM Controller - This will restart the OSM Controller. On start, it will overwrite the CA Bundle of both the Mutating and Validating webhooks.
- ```bash
- kubectl rollout restart deployment -n arc-osm-system osm-controller
- ```
--- Option 2. Delete the Validating Webhook - Removing the Validating Webhook makes mutations of the `osm-config` ConfigMap no longer validated. Any patch will go through. The OSM Controller may have to be restarted to quickly rewrite the CA Bundle.
- ```bash
- kubectl delete ValidatingWebhookConfiguration arc-osm-webhook-osm
- ```
+The number in the output indicates the number of bytes, or the size of the CA Bundle. If this is empty, 0, or a number under 1000, it would indicate that the CA Bundle is not correctly provisioned. Without a correct CA Bundle, the ValidatingWebhook would throw an error.
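Beyond checking the size, you can decode the CA Bundle and inspect the certificate directly. This is a sketch that assumes `jq`, `base64`, and `openssl` are available on the machine running `kubectl`:

```bash
# Decode the webhook CA bundle and print the certificate subject and validity window.
kubectl get ValidatingWebhookConfiguration osm-validator-mesh-osm -o json \
  | jq -r '.webhooks[0].clientConfig.caBundle' \
  | base64 -d \
  | openssl x509 -noout -subject -dates
```

If `openssl` reports an expired or malformed certificate, restarting the OSM Controller deployment regenerates the CA Bundle for both webhooks.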
-- Option 3. Delete and Patch: The following command will delete the validating webhook, allowing you to add any values, and will immediately try to apply a patch
- ```bash
- kubectl delete ValidatingWebhookConfiguration arc-osm-webhook-osm; kubectl patch ConfigMap osm-config -n arc-osm-system --type merge --patch '{"data":{"config_resync_interval":"15s"}}'
- ```
+### Check the `osm-mesh-config` resource
+Check that the resource exists:
-### 10. Check the `osm-config` **ConfigMap**
+```azurecli-interactive
+kubectl get meshconfig osm-mesh-config -n arc-osm-system
+```
->[!Note]
->The OSM Controller does not require `osm-config` ConfigMap to be present in the `arc-osm-system` namespace. The controller has reasonable default values for the config and can operate without it.
+Check the content of the OSM MeshConfig:
-Check for the existence:
-```bash
-kubectl get ConfigMap -n arc-osm-system osm-config
+```azurecli-interactive
+kubectl get meshconfig osm-mesh-config -n arc-osm-system -o yaml
```
-Check the content of the `osm-config` ConfigMap:
-```bash
-kubectl get ConfigMap -n arc-osm-system osm-config -o json | jq '.data'
-```
-You will get the following output:
-```json
-{
- "egress": "false",
- "enable_debug_server": "false",
- "enable_privileged_init_container": "false",
- "envoy_log_level": "error",
- "permissive_traffic_policy_mode": "true",
- "prometheus_scraping": "true",
- "service_cert_validity_duration": "24h",
- "tracing_enable": "false",
- "use_https_ingress": "false",
-}
+```yaml
+apiVersion: config.openservicemesh.io/v1alpha1
+kind: MeshConfig
+metadata:
+  creationTimestamp: "0000-00-00T00:00:00Z"
+ generation: 1
+ name: osm-mesh-config
+ namespace: arc-osm-system
+ resourceVersion: "2494"
+ uid: 6c4d67f3-c241-4aeb-bf4f-b029b08faa31
+spec:
+ certificate:
+ certKeyBitSize: 2048
+ serviceCertValidityDuration: 24h
+ featureFlags:
+ enableAsyncProxyServiceMapping: false
+ enableEgressPolicy: true
+ enableEnvoyActiveHealthChecks: false
+ enableIngressBackendPolicy: true
+ enableMulticlusterMode: false
+ enableRetryPolicy: false
+ enableSnapshotCacheMode: false
+ enableWASMStats: true
+ observability:
+ enableDebugServer: false
+ osmLogLevel: info
+ tracing:
+ enable: false
+ sidecar:
+ configResyncInterval: 0s
+ enablePrivilegedInitContainer: false
+ logLevel: error
+ resources: {}
+ traffic:
+ enableEgress: false
+ enablePermissiveTrafficPolicyMode: true
+ inboundExternalAuthorization:
+ enable: false
+ failureModeAllow: false
+ statPrefix: inboundExtAuthz
+ timeout: 1s
+ inboundPortExclusionList: []
+ outboundIPRangeExclusionList: []
+ outboundPortExclusionList: []
+kind: List
+metadata:
+ resourceVersion: ""
+ selfLink: ""
```
-Refer [OSM ConfigMap documentation](https://release-v0-8.docs.openservicemesh.io/docs/osm_config_map/) to understand `osm-config` ConfigMap values.
-
-### 11. Check Namespaces
+`osm-mesh-config` resource values:
+
+| Key | Type | Default Value | Kubectl Patch Command Examples |
+|--|||--|
+| spec.traffic.enableEgress | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"enableEgress":false}}}' --type=merge` |
+| spec.traffic.enablePermissiveTrafficPolicyMode | bool | `true` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"enablePermissiveTrafficPolicyMode":true}}}' --type=merge` |
+| spec.traffic.outboundPortExclusionList | array | `[]` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"outboundPortExclusionList":[6379,8080]}}}' --type=merge` |
+| spec.traffic.outboundIPRangeExclusionList | array | `[]` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"outboundIPRangeExclusionList":["10.0.0.0/32","1.1.1.1/24"]}}}' --type=merge` |
+| spec.traffic.inboundPortExclusionList | array | `[]` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"inboundPortExclusionList":[6379,8080]}}}' --type=merge` |
+| spec.certificate.serviceCertValidityDuration | string | `"24h"` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"certificate":{"serviceCertValidityDuration":"24h"}}}' --type=merge` |
+| spec.observability.enableDebugServer | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"observability":{"enableDebugServer":false}}}' --type=merge` |
| spec.observability.osmLogLevel | string | `"info"` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"observability":{"osmLogLevel":"info"}}}' --type=merge` |
+| spec.observability.tracing.enable | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"observability":{"tracing":{"enable":true}}}}' --type=merge` |
+| spec.sidecar.enablePrivilegedInitContainer | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"sidecar":{"enablePrivilegedInitContainer":true}}}' --type=merge` |
+| spec.sidecar.logLevel | string | `"error"` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"sidecar":{"logLevel":"error"}}}' --type=merge` |
| spec.featureFlags.enableWASMStats | bool | `true` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableWASMStats":true}}}' --type=merge` |
| spec.featureFlags.enableEgressPolicy | bool | `true` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableEgressPolicy":true}}}' --type=merge` |
| spec.featureFlags.enableMulticlusterMode | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableMulticlusterMode":false}}}' --type=merge` |
| spec.featureFlags.enableSnapshotCacheMode | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableSnapshotCacheMode":false}}}' --type=merge` |
| spec.featureFlags.enableAsyncProxyServiceMapping | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableAsyncProxyServiceMapping":false}}}' --type=merge` |
| spec.featureFlags.enableIngressBackendPolicy | bool | `true` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableIngressBackendPolicy":true}}}' --type=merge` |
| spec.featureFlags.enableEnvoyActiveHealthChecks | bool | `false` | `kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"featureFlags":{"enableEnvoyActiveHealthChecks":false}}}' --type=merge` |
+
+### Check Namespaces
>[!Note]
>The arc-osm-system namespace will never participate in a service mesh and will never be labeled and/or annotated with the key/values below.
When a Kubernetes namespace is part of the mesh, the following must be true:
View the annotations of the namespace `bookbuyer`:
```bash
-kc get namespace bookbuyer -o json | jq '.metadata.annotations'
+kubectl get namespace bookbuyer -o json | jq '.metadata.annotations'
```
The following annotation must be present:
The following annotation must be present:
View the labels of the namespace `bookbuyer`:
```bash
-kc get namespace bookbuyer -o json | jq '.metadata.labels'
+kubectl get namespace bookbuyer -o json | jq '.metadata.labels'
```
The following label must be present:
The following label must be present:
Note that if you are not using the `osm` CLI, you could also manually add these annotations to your namespaces. If a namespace is not annotated with `"openservicemesh.io/sidecar-injection": "enabled"` or not labeled with `"openservicemesh.io/monitored-by": "osm"`, the OSM Injector will not add Envoy sidecars.

>[!Note]
->After `osm namespace add` is called, only **new** pods will be injected with an Envoy sidecar. Existing pods must be restarted with `kubectl rollout restard deployment` command.
+>After `osm namespace add` is called, only **new** pods will be injected with an Envoy sidecar. Existing pods must be restarted with `kubectl rollout restart deployment` command.
-### 12. Verify the SMI CRDs
+### Verify the SMI CRDs
Check whether the cluster has the required CRDs:
```bash
kubectl get crds
```
-Ensure that the CRDs correspond to the same OSM upstream version. E.g. if you are using v0.8.4, ensure that the CRDs match the ones that are available in the release branch v0.8.4 of [OSM OSS project](https://docs.openservicemesh.io/). Refer [OSM release notes](https://github.com/openservicemesh/osm/releases).
+Ensure that the CRDs correspond to the versions available in the release branch. For example, if you are using OSM-Arc v1.0.0-1, navigate to the [SMI supported versions page](https://docs.openservicemesh.io/docs/overview/smi/) and select v1.0 from the Releases dropdown to check which CRD versions are in use.
Get the versions of the CRDs installed with the following command:
```bash
for x in $(kubectl get crds --no-headers | awk '{print $1}' | grep 'smi-spec.io'
done
```
-If CRDs are missing, use the following commands to install them on the cluster. Ensure that you replace the version in the command.
+If CRDs are missing, use the following commands to install them on the cluster. If you are using a version of OSM-Arc other than v1.0, ensure that you replace the version in the command (for example, v1.1.0 would be release-v1.1).
+ ```bash
-kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v0.8.2/charts/osm/crds/access.yaml
+kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v1.0/cmd/osm-bootstrap/crds/smi_http_route_group.yaml
+
+kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v1.0/cmd/osm-bootstrap/crds/smi_tcp_route.yaml
-kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v0.8.2/charts/osm/crds/specs.yaml
+kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v1.0/cmd/osm-bootstrap/crds/smi_traffic_access.yaml
-kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v0.8.2/charts/osm/crds/split.yaml
+kubectl apply -f https://raw.githubusercontent.com/openservicemesh/osm/release-v1.0/cmd/osm-bootstrap/crds/smi_traffic_split.yaml
```
-### 13. Troubleshoot Certificate Management
+Refer to [OSM release notes](https://github.com/openservicemesh/osm/releases) to see CRD changes between releases.
+
+### Troubleshoot certificate management
Information on how OSM issues and manages certificates to Envoy proxies running on application pods can be found on the [OSM docs site](https://docs.openservicemesh.io/docs/guides/certificates/).
-### 14. Upgrade Envoy
-When a new pod is created in a namespace monitored by the add-on, OSM will inject an [envoy proxy sidecar](https://docs.openservicemesh.io/docs/guides/app_onboarding/sidecar_injection/) in that pod. If the envoy version needs to be updated, steps to do so can be found in the [Upgrade Guide](https://release-v0-11.docs.openservicemesh.io/docs/getting_started/upgrade/#envoy) on the OSM docs site.
+### Upgrade Envoy
+When a new pod is created in a namespace monitored by the add-on, OSM will inject an [Envoy proxy sidecar](https://docs.openservicemesh.io/docs/guides/app_onboarding/sidecar_injection/) in that pod. If the Envoy version needs to be updated, steps to do so can be found in the [Upgrade Guide](https://docs.openservicemesh.io/docs/guides/upgrade/#envoy) on the OSM docs site.
azure-arc Tutorial Arc Enabled Open Service Mesh https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/tutorial-arc-enabled-open-service-mesh.md
Title: Azure Arc-enabled Open Service Mesh (Preview)
+ Title: Azure Arc-enabled Open Service Mesh
description: Open Service Mesh (OSM) extension on Azure Arc-enabled Kubernetes cluster Previously updated : 07/23/2021 Last updated : 04/07/2022
-# Azure Arc-enabled Open Service Mesh (Preview)
+# Azure Arc-enabled Open Service Mesh
[Open Service Mesh (OSM)](https://docs.openservicemesh.io/) is a lightweight, extensible, Cloud Native service mesh that allows users to uniformly manage, secure, and get out-of-the-box observability features for highly dynamic microservice environments.
OSM runs an Envoy-based control plane on Kubernetes, can be configured with [SMI
### Support limitations for Azure Arc-enabled Open Service Mesh
- Only one instance of Open Service Mesh can be deployed on an Azure Arc-connected Kubernetes cluster.
-- Public preview is available for Open Service Mesh version v0.8.4 and above. Find out the latest version of the release [here](https://github.com/Azure/osm-azure/releases). The supported release versions are appended with notes. Ignore the tags associated with intermediate releases.
+- Support is available for Azure Arc-enabled Open Service Mesh version v1.0.0-1 and above. Find the latest version [here](https://github.com/Azure/osm-azure/releases). Supported release versions are appended with notes. Ignore the tags associated with intermediate releases.
- The following Kubernetes distributions are currently supported: - AKS Engine - AKS on HCI
OSM runs an Envoy-based control plane on Kubernetes, can be configured with [SMI
- OpenShift Kubernetes Distribution
- Amazon Elastic Kubernetes Service
- VMware Tanzu Kubernetes Grid
-- Azure Monitor integration with Azure Arc-enabled Open Service Mesh is available with [limited support](https://github.com/microsoft/Docker-Provider/blob/ci_dev/Documentation/OSMPrivatePreview/ReadMe.md).
+- Azure Monitor integration with Azure Arc-enabled Open Service Mesh is available with [limited support](#monitoring-application-using-azure-monitor-and-applications-insights).
[!INCLUDE [preview features note](./includes/preview/preview-callout.md)]

### Prerequisites
- Ensure you have met all the common prerequisites for cluster extensions listed [here](extensions.md#prerequisites).
-- Use az k8s-extension CLI version >= v0.4.0
+- Use az k8s-extension CLI version >= v1.0.4
-## Basic installation of Azure Arc-enabled OSM
+## Basic installation
+
+Arc-enabled Open Service Mesh can be deployed through the Azure portal, an ARM template, a built-in Azure policy, or the CLI.
+
+### Basic installation using Azure portal
+To deploy using the Azure portal, once you have an Arc-connected cluster, go to the cluster's **Open Service Mesh** section.
+
+[ ![Open Service Mesh located under Settings for Arc enabled Kubernetes cluster](media/tutorial-arc-enabled-open-service-mesh/osm-portal-install.jpg) ](media/tutorial-arc-enabled-open-service-mesh/osm-portal-install.jpg#lightbox)
+
+Select the **Install extension** button to deploy the latest version of the extension.
+
+Alternatively, you can use the CLI experience captured below. For at-scale onboarding, read further in this article about deployment using [ARM template](#install-azure-arc-enabled-osm-using-arm-template) and using [Azure Policy](#install-azure-arc-enabled-osm-using-built-in-policy).
+
+### Basic installation using Azure CLI
The following steps assume that you already have a cluster with a supported Kubernetes distribution connected to Azure Arc. Ensure that your KUBECONFIG environment variable points to the kubeconfig of the Arc-enabled Kubernetes cluster.
Ensure that your KUBECONFIG environment variable points to the kubeconfig of the
Set the environment variables:
```azurecli-interactive
-export VERSION=<osm-arc-version>
export CLUSTER_NAME=<arc-cluster-name>
export RESOURCE_GROUP=<resource-group-name>
```
-While Azure Arc-enabled Open Service Mesh is in preview, the `az k8s-extension create` command only accepts `pilot` for the `--release-train` flag. `--auto-upgrade-minor-version` is always set to `false` and a version must be provided. If you are using an OpenShift cluster, use the steps in the [section](#install-osm-on-an-openshift-cluster).
+If you are using an OpenShift cluster, skip to the OpenShift installation steps [below](#install-osm-on-an-openshift-cluster).
+
+Create the extension:
+> [!NOTE]
+> If you would like to pin a specific version of OSM, add the `--version x.y.z` flag to the `create` command. Note that this will set the value for `auto-upgrade-minor-version` to false.
```azurecli-interactive
-az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --release-train pilot --name osm --version $VERSION
+az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --name osm
```
-You should see output similar to the output shown below. It may take 3-5 minutes for the actual OSM helm chart to get deployed to the cluster. Until this deployment happens, you will continue to see installState as Pending.
+You should see output similar to the example below. It may take 3-5 minutes for the actual OSM helm chart to get deployed to the cluster. Until this deployment happens, you will continue to see installState as Pending.
```json
{
- "autoUpgradeMinorVersion": false,
+ "autoUpgradeMinorVersion": true,
  "configurationSettings": {},
  "creationTime": "2021-04-29T17:50:11.4116524+00:00",
  "errorInfo": {
You should see output similar to the output shown below. It may take 3-5 minutes
  "lastStatusTime": null,
  "location": null,
  "name": "osm",
- "releaseTrain": "pilot",
+ "releaseTrain": "stable",
  "resourceGroup": "$RESOURCE_GROUP",
  "scope": {
    "cluster": {
You should see output similar to the output shown below. It may take 3-5 minutes
  },
  "statuses": [],
  "type": "Microsoft.KubernetesConfiguration/extensions",
- "version": "x.x.x"
+ "version": "x.y.z"
} ```
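Until `installState` changes from `Pending`, the deployment is still in progress. If you are scripting the rollout, the state can be read from saved command output; the following is an illustrative sketch (the `osm-extension.json` file name is an assumption, not part of the documented procedure):

```shell
# Hypothetical helper: read installState from saved output of, for example:
#   az k8s-extension show --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP \
#     --cluster-type connectedClusters --name osm -o json > osm-extension.json
state=$(sed -n 's/.*"installState": *"\([^"]*\)".*/\1/p' osm-extension.json 2>/dev/null)
echo "installState: ${state:-unknown}"
```

A dedicated JSON tool such as `jq` would be more robust than `sed` here; the `sed` form is used only to keep the sketch dependency-free.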
-## Custom installations of Azure Arc-enabled OSM
-The following sections describe certain custom installations of Azure Arc-enabled OSM. Custom installations require setting
+Next, [validate your installation](#validate-installation).
+
+## Custom installations
+The following sections describe certain custom installations of Azure Arc-enabled OSM. Custom installations require setting
values of OSM in a JSON file and passing them into the `k8s-extension create` CLI command as described below.

### Install OSM on an OpenShift cluster

  "osm.osm.enablePrivilegedInitContainer": "true"
}
```
-
+ 2. [Install OSM with custom values](#setting-values-during-osm-installation).
-
+ 3. Add the privileged [security context constraint](https://docs.openshift.com/container-platform/4.7/authentication/managing-security-context-constraints.html) to each service account for the applications in the mesh.
   ```azurecli-interactive
   oc adm policy add-scc-to-user privileged -z <service account name> -n <service account namespace>
   ```
It may take 3-5 minutes for the actual OSM helm chart to get deployed to the cluster. Until this deployment happens, you will continue to see installState as Pending.
-To ensure that the privileged init container setting is not reverted to the default, pass in the "osm.osm.enablePrivilegedInitContainer" : "true" configuration setting to all subsequent az k8s-extension create commands.
+To ensure that the privileged init container setting is not reverted to the default, pass in the `"osm.osm.enablePrivilegedInitContainer" : "true"` configuration setting to all subsequent az k8s-extension create commands.
### Enable High Availability features on installation

OSM's control plane components are built with High Availability and Fault Tolerance in mind. This section describes how to
enable Horizontal Pod Autoscaling (HPA) and Pod Disruption Budget (PDB) during installation. You can read more about the
considerations of High Availability on OSM [here](https://docs.openservicemesh.io/docs/guides/ha_scale/high_availability/).

#### Horizontal Pod Autoscaling (HPA)
-HPA automatically scales up or down control plane pods based on the average target CPU utilization (%) and average target
+HPA automatically scales up or down control plane pods based on the average target CPU utilization (%) and average target
memory utilization (%) defined by the user. To enable HPA and set applicable values on OSM control plane pods during installation, create or
-append to your existing JSON settings file as below, repeating the key/value pairs for each control plane pod
-(`osmController`, `injector`) that you want to enable HPA on.
+append to your existing JSON settings file as below, repeating the key/value pairs for each control plane pod
+(`osmController`, `injector`) that you want to enable HPA on.
```json
{
Now, [install OSM with custom values](#setting-values-during-osm-installation).

#### Pod Disruption Budget (PDB)
-In order to prevent disruptions during planned outages, control plane pods `osm-controller` and `osm-injector` have a PDB
+In order to prevent disruptions during planned outages, control plane pods `osm-controller` and `osm-injector` have a PDB
that ensures there is always at least 1 pod corresponding to each control plane application.
-To enable PDB, create or append to your existing JSON settings file as follows for each desired control plane pod
+To enable PDB, create or append to your existing JSON settings file as follows for each desired control plane pod
(`osmController`, `injector`):
```json
{
Now, [install OSM with custom values](#setting-values-during-osm-installation).
-### Install OSM with cert-manager for Certificate Management
+### Install OSM with cert-manager for certificate management
[cert-manager](https://cert-manager.io/) is a provider that can be used for issuing signed certificates to OSM without
-the need for storing private keys in Kubernetes. Refer to OSM's [cert-manager documentation](https://release-v0-11.docs.openservicemesh.io/docs/guides/certificates/)
+the need for storing private keys in Kubernetes. Refer to OSM's [cert-manager documentation](https://docs.openservicemesh.io/docs/guides/certificates/)
and [demo](https://docs.openservicemesh.io/docs/demos/cert-manager_integration/) to learn more.

> [!NOTE]
-> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace name `arc-osm-system`.
-
-To install OSM with cert-manager as the certificate provider, create or append to your existing JSON settings file the `certificateProvider.kind`
+> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace in commands, or specify it with the `--osm-namespace arc-osm-system` flag.
+To install OSM with cert-manager as the certificate provider, create or append to your existing JSON settings file the `certificateProvider.kind`
value set to cert-manager as shown below. If you would like to change from default cert-manager values specified in OSM documentation, also include and update the subsequent `certmanager.issuer` lines.
Now, [install OSM with custom values](#setting-values-during-osm-installation).
-### Install OSM with Contour for Ingress
+### Install OSM with Contour for ingress
OSM provides multiple options to expose mesh services externally using ingress. OSM can use [Contour](https://projectcontour.io/), which works with the ingress controller installed outside the mesh and provisioned with a certificate to participate in the mesh. Refer to [OSM's ingress documentation](https://docs.openservicemesh.io/docs/guides/traffic_management/ingress/#1-using-contour-ingress-controller-and-gateway) and [demo](https://docs.openservicemesh.io/docs/demos/ingress_contour/) to learn more.

> [!NOTE]
-> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace name `arc-osm-system`.
-
+> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace in commands, or specify it with the `--osm-namespace arc-osm-system` flag.
To set required values for configuring Contour during OSM installation, append the following to your JSON settings file:
```json
{
Now, [install OSM with custom values](#setting-values-during-osm-installation).
Any values that need to be set during OSM installation must be saved to a single JSON file and passed in through the Azure CLI install command.
-Once you have created a JSON file with applicable values as described in above custom installation sections, set the
+Once you have created a JSON file with applicable values as described in above custom installation sections, set the
file path as an environment variable:
```azurecli-interactive
export SETTINGS_FILE=<json-file-path>
```
-Run the `az k8s-extension create` command to create the OSM extension, passing in the settings file using the
-`--configuration-settings` flag:
+Run the `az k8s-extension create` command to create the OSM extension, passing in the settings file using the
+`--configuration-settings-file` flag:
```azurecli-interactive
- az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --release-train pilot --name osm --version $VERSION --configuration-settings-file $SETTINGS_FILE
+ az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --name osm --configuration-settings-file $SETTINGS_FILE
```

## Install Azure Arc-enabled OSM using ARM template
After connecting your cluster to Azure Arc, create a JSON file with the followin
} }, "ReleaseTrain": {
- "defaultValue": "Pilot",
+ "defaultValue": "Stable",
"type": "String", "metadata": { "description": "The release train."
az deployment group create --name $DEPLOYMENT_NAME --resource-group $RESOURCE_GR
You should now be able to view the OSM resources and use the OSM extension in your cluster.
+## Install Azure Arc-enabled OSM using built-in policy
+
+A built-in policy is available on Azure portal under the category of **Kubernetes** by the name of **Azure Arc-enabled Kubernetes clusters should have the Open Service Mesh extension installed**.
+This policy can be assigned at the scope of a subscription or a resource group. The default action of this policy is **Deploy if not exists**.
+However, you could choose to simply audit the clusters for extension installations by changing the parameters during assignment.
+You will also be prompted to specify the version you wish to install (v1.0.0-1 or above) as a parameter.
+
## Validate installation

Run the following command.
You should see a JSON output similar to the output below:
```json
{
- "autoUpgradeMinorVersion": false,
+ "autoUpgradeMinorVersion": true,
 "configurationSettings": {},
 "creationTime": "2021-04-29T19:22:00.7649729+00:00",
 "errorInfo": {
 "lastStatusTime": "2021-04-29T19:23:27.642+00:00",
 "location": null,
 "name": "osm",
- "releaseTrain": "pilot",
+ "releaseTrain": "stable",
 "resourceGroup": "$RESOURCE_GROUP",
 "scope": {
 "cluster": {
 },
 "statuses": [],
 "type": "Microsoft.KubernetesConfiguration/extensions",
- "version": "x.x.x"
+ "version": "x.y.z"
}
```

## OSM controller configuration
kubectl describe meshconfig osm-mesh-config -n arc-osm-system
The output would show the default values:
```azurecli-interactive
-Certificate:
+ Certificate:
+ Cert Key Bit Size: 2048
Service Cert Validity Duration: 24h Feature Flags:
- Enable Egress Policy: true
- Enable Multicluster Mode: false
- Enable WASM Stats: true
+ Enable Async Proxy Service Mapping: false
+ Enable Egress Policy: true
+ Enable Envoy Active Health Checks: false
+ Enable Ingress Backend Policy: true
+ Enable Multicluster Mode: false
+ Enable Retry Policy: false
+ Enable Snapshot Cache Mode: false
+ Enable WASM Stats: true
 Observability:
   Enable Debug Server: false
   Osm Log Level: info
   Tracing:
- Address: jaeger.osm-system.svc.cluster.local
- Enable: false
- Endpoint: /api/v2/spans
- Port: 9411
+ Enable: false
 Sidecar:
   Config Resync Interval: 0s
   Enable Privileged Init Container: false
- Envoy Image: mcr.microsoft.com/oss/envoyproxy/envoy:v1.18.3
- Init Container Image: mcr.microsoft.com/oss/openservicemesh/init:v0.9.1
Log Level: error
- Max Data Plane Connections: 0
   Resources:
 Traffic:
   Enable Egress: false
   Failure Mode Allow: false
   Stat Prefix: inboundExtAuthz
   Timeout: 1s
- Use HTTPS Ingress: false
+ Inbound Port Exclusion List:
+ Outbound IP Range Exclusion List:
+ Outbound Port Exclusion List:
```
-Refer to the [Config API reference](https://docs.openservicemesh.io/docs/api_reference/config/v1alpha1/) for more information. Notice that **spec.traffic.enablePermissiveTrafficPolicyMode** is set to **true**. Permissive traffic policy mode in OSM is a mode where the [SMI](https://smi-spec.io/) traffic policy enforcement is bypassed. In this mode, OSM automatically discovers services that are a part of the service mesh and programs traffic policy rules on each Envoy proxy sidecar to be able to communicate with these services.
+Refer to the [Config API reference](https://docs.openservicemesh.io/docs/api_reference/config/v1alpha1/) for more information. Notice that `spec.traffic.enablePermissiveTrafficPolicyMode` is set to `true`. When OSM is in permissive traffic policy mode, [SMI](https://smi-spec.io/) traffic policy enforcement is bypassed. In this mode, OSM automatically discovers services that are a part of the service mesh and programs traffic policy rules on each Envoy proxy sidecar to be able to communicate with these services.
+
+`osm-mesh-config` can also be viewed on Azure portal by selecting **Edit configuration** in the cluster's Open Service Mesh section.
+
+[ ![Edit configuration button located on top of the Open Service Mesh section](media/tutorial-arc-enabled-open-service-mesh/osm-portal-configuration.jpg) ](media/tutorial-arc-enabled-open-service-mesh/osm-portal-configuration.jpg#lightbox)
### Making changes to OSM controller configuration

> [!NOTE]
> Values in the MeshConfig `osm-mesh-config` are persisted across upgrades.

Changes to `osm-mesh-config` can be made using the `kubectl patch` command. In the following example, the permissive traffic policy mode is changed to false.

```azurecli-interactive
If an incorrect value is used, validations on the MeshConfig CRD will prevent the change, as shown in the following example.
```azurecli-interactive
kubectl patch meshconfig osm-mesh-config -n arc-osm-system -p '{"spec":{"traffic":{"enableEgress":"no"}}}' --type=merge

# Validations on the CRD will deny this change
The MeshConfig "osm-mesh-config" is invalid: spec.traffic.enableEgress: Invalid value: "string": spec.traffic.enableEgress in body must be of type boolean: "string"
```
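The CRD validation rejects string-typed values for boolean fields. If you generate patch payloads in scripts, a local pre-check can catch this class of mistake before calling `kubectl`. This is an illustrative sketch only (the payload is hypothetical, and only the `enableEgress` field is checked):

```shell
# Hypothetical pre-flight check: verify enableEgress carries a JSON boolean, not a string.
patch='{"spec":{"traffic":{"enableEgress":"no"}}}'
if printf '%s' "$patch" | grep -Eq '"enableEgress": *(true|false)'; then
  echo "enableEgress is a boolean: safe to apply"
else
  echo "enableEgress is not a boolean: the MeshConfig CRD would reject this patch"
fi
```

A real JSON parser would be sturdier than a regular expression; the pattern above is only meant to illustrate the boolean-vs-string distinction the CRD enforces.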
-## OSM controller configuration (version v0.8.4)
-
-Currently you can access and configure the OSM controller configuration via the ConfigMap. To view the OSM controller configuration settings, query the `osm-config` ConfigMap via `kubectl` to view its configuration settings.
-
-```azurecli-interactive
-kubectl get configmap osm-config -n arc-osm-system -o json
-```
-
-Output:
+Alternatively, to edit `osm-mesh-config` in Azure portal, select **Edit configuration** in the cluster's Open Service Mesh section.
-```json
-{
- "egress": "false",
- "enable_debug_server": "false",
- "enable_privileged_init_container": "false",
- "envoy_log_level": "error",
- "permissive_traffic_policy_mode": "true",
- "prometheus_scraping": "true",
- "service_cert_validity_duration": "24h",
- "tracing_enable": "false",
- "use_https_ingress": "false"
-}
-```
-
-Read [OSM ConfigMap documentation](https://release-v0-8.docs.openservicemesh.io/docs/osm_config_map/) to understand each of the available configurations.
-
-To make changes to the OSM ConfigMap for version v0.8.4, use the following guidance:
-
-1. Copy and save the changes you wish to make in a JSON file. In this example, we are going to change the permissive_traffic_policy_mode from true to false. Each time you make a change to `osm-config`, you will have to provide the full list of changes (compared to the default `osm-config`) in a JSON file.
- ```json
- {
- "osm.osm.enablePermissiveTrafficPolicy" : "false"
- }
- ```
-
- Set the file path as an environment variable:
-
- ```azurecli-interactive
- export SETTINGS_FILE=<json-file-path>
- ```
-
-2. Run the same `az k8s-extension create` command used to create the extension, but now pass in the configuration settings file:
- ```azurecli-interactive
- az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --release-train pilot --name osm --version $VERSION --configuration-settings-file $SETTINGS_FILE
- ```
-
- > [!NOTE]
- > To ensure that the ConfigMap changes are not reverted to the default, pass in the same configuration settings to all subsequent az k8s-extension create commands.
+[ ![Edit configuration button in the Open Service Mesh section](media/tutorial-arc-enabled-open-service-mesh/osm-portal-configuration-edit.jpg) ](media/tutorial-arc-enabled-open-service-mesh/osm-portal-configuration-edit.jpg#lightbox)
## Using Azure Arc-enabled OSM
Add namespaces to the mesh by running the following command:
```azurecli-interactive
osm namespace add <namespace_name>
```
+Namespaces can be onboarded from Azure portal as well by selecting **+Add** in the cluster's Open Service Mesh section.
+
+[ ![+Add button located on top of the Open Service Mesh section](media/tutorial-arc-enabled-open-service-mesh/osm-portal-add-namespace.jpg) ](media/tutorial-arc-enabled-open-service-mesh/osm-portal-add-namespace.jpg#lightbox)
More information about onboarding services can be found [here](https://docs.openservicemesh.io/docs/guides/app_onboarding/#onboard-services).

### Configure OSM with Service Mesh Interface (SMI) policies
-You can start with a [demo application](https://release-v0-11.docs.openservicemesh.io/docs/getting_started/quickstart/manual_demo/#deploy-applications) or use your test environment to try out SMI policies.
-
-> [!NOTE]
-> Ensure that the version of the bookstore application you run matches the version of the OSM extension installed on your cluster. Ex: if you are using v0.8.4 of the OSM extension, use the bookstore demo from release-v0.8 branch of OSM upstream repository.
+You can start with a [sample application](https://docs.openservicemesh.io/docs/getting_started/install_apps/) or use your test environment to try out SMI policies.
+> [!NOTE]
+> If you are using sample applications, ensure that their versions match the version of the OSM extension installed on your cluster. For example, if you are using v1.0.0 of the OSM extension, use the bookstore manifest from the release-v1.0 branch of the OSM upstream repository.
### Configuring your own Jaeger, Prometheus and Grafana instances
-The OSM extension does not install add-ons like [Flagger](https://docs.flagger.app/), [Jaeger](https://www.jaegertracing.io/docs/getting-started/), [Prometheus](https://prometheus.io/docs/prometheus/latest/installation/) and [Grafana](https://grafana.com/docs/grafana/latest/installation/) so that users can integrate OSM with their own running instances of those tools instead. To integrate with your own instances, check the following documentation:
+The OSM extension does not install add-ons like [Jaeger](https://www.jaegertracing.io/docs/getting-started/), [Prometheus](https://prometheus.io/docs/prometheus/latest/installation/), [Grafana](https://grafana.com/docs/grafana/latest/installation/) and [Flagger](https://docs.flagger.app/) so that users can integrate OSM with their own running instances of those tools instead. To integrate with your own instances, refer to the following documentation:
> [!NOTE]
-> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace name 'arc-osm-system' when making changes to `osm-mesh-config`.
-
+> Use the commands provided in the OSM GitHub documentation with caution. Ensure that you use the correct namespace name `arc-osm-system` when making changes to `osm-mesh-config`.
- [BYO-Jaeger instance](https://docs.openservicemesh.io/docs/guides/observability/tracing/#byo-bring-your-own)
-- [BYO-Prometheus instance](https://docs.openservicemesh.io/docs/guides/observability/metrics/#byo-prometheus)
-- [BYO-Grafana dashboard](https://docs.openservicemesh.io/docs/guides/observability/metrics/#importing-dashboards-on-a-byo-grafana-instance)
+- [BYO-Prometheus instance](https://docs.openservicemesh.io/docs/guides/observability/metrics/#prometheus)
+- [BYO-Grafana dashboard](https://docs.openservicemesh.io/docs/guides/observability/metrics/#grafana)
- [OSM Progressive Delivery with Flagger](https://docs.flagger.app/tutorials/osm-progressive-delivery)

## Monitoring application using Azure Monitor and Applications Insights
-Both Azure Monitor and Azure Application Insights helps you maximize the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.
+Both Azure Monitor and Azure Application Insights help you maximize the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.
-Azure Arc-enabled Open Service Mesh will have deep integrations into both of these Azure services, and provide a seemless Azure experience for viewing and responding to critical KPIs provided by OSM metrics. Follow the steps below to allow Azure Monitor to scrape prometheus endpoints for collecting application metrics.
+Azure Arc-enabled Open Service Mesh will have deep integrations into both of these Azure services, and provide a seamless Azure experience for viewing and responding to critical KPIs provided by OSM metrics. Follow the steps below to allow Azure Monitor to scrape Prometheus endpoints for collecting application metrics.
-1. Ensure that the application namespaces that you wish to be monitored are onboarded to the mesh. Follow the guidance [available here](#onboard-namespaces-to-the-service-mesh).
+1. Follow the guidance available [here](#onboard-namespaces-to-the-service-mesh) to ensure that the application namespaces that you wish to be monitored are onboarded to the mesh.
-2. Expose the prometheus endpoints for application namespaces.
+2. Expose the Prometheus endpoints for application namespaces.
```azurecli-interactive
osm metrics enable --namespace <namespace1>
osm metrics enable --namespace <namespace2>
```
- For v0.8.4, ensure that `prometheus_scraping` is set to `true` in the `osm-config` ConfigMap.
3. Install the Azure Monitor extension using the guidance available [here](../../azure-monitor/containers/container-insights-enable-arc-enabled-clusters.md?toc=/azure/azure-arc/kubernetes/toc.json).
-4. Add the namespaces you want to monitor in container-azm-ms-osmconfig ConfigMap. Download the ConfigMap from [here](https://github.com/microsoft/Docker-Provider/blob/ci_prod/kubernetes/container-azm-ms-osmconfig.yaml).
- ```azurecli-interactive
- monitor_namespaces = ["namespace1", "namespace2"]
- ```
+4. Create a Configmap in the `kube-system` namespace that enables Azure Monitor to monitor your namespaces. For example, create a `container-azm-ms-osmconfig.yaml` with the following to monitor `<namespace1>` and `<namespace2>`:
+
+ ```yaml
+ kind: ConfigMap
+ apiVersion: v1
+ data:
+ schema-version: v1
+ config-version: ver1
+ osm-metric-collection-configuration: |-
+ # OSM metric collection settings
+ [osm_metric_collection_configuration]
+ [osm_metric_collection_configuration.settings]
+ # Namespaces to monitor
+ monitor_namespaces = ["<namespace1>", "<namespace2>"]
+ metadata:
+ name: container-azm-ms-osmconfig
+ namespace: kube-system
+ ```
5. Run the following kubectl command
   ```azurecli-interactive
InsightsMetrics
| extend t=parse_json(Tags)
| where t.app == "namespace1"
```
-Read more about integration with Azure Monitor [here](https://github.com/microsoft/Docker-Provider/blob/ci_dev/Documentation/OSMPrivatePreview/ReadMe.md).
-
-### Navigating the OSM dashboard
-
-1. Access your Azure Arc-connected Kubernetes cluster using this [link](https://aka.ms/azmon/osmarcux).
-2. Go to Azure Monitor and navigate to the Reports tab to access the OSM workbook.
-3. Select the time-range & namespace to scope your services.
-
-![OSM workbook](./media/tutorial-arc-enabled-open-service-mesh/osm-workbook.jpg)
#### Requests tab
#### Connections tab

- This tab provides you a summary of all the connections between your services in Open Service Mesh.
-- Outbound connections: Total number of connections between Source and destination services.
-- Outbound active connections: Last count of active connections between source and destination in selected time range.
-- Outbound failed connections: Total number of failed connections between source and destination service
+- Outbound connections: total number of connections between Source and destination services.
+- Outbound active connections: last count of active connections between source and destination in selected time range.
+- Outbound failed connections: total number of failed connections between source and destination service.
-## Upgrade the OSM extension instance to a specific version
+## Upgrade to a specific version of OSM
There may be some downtime of the control plane during upgrades. The data plane will only be affected during CRD upgrades.
-### Supported Upgrades
-
-The OSM extension can be upgraded up to the next minor version. Downgrades and major version upgrades are not supported at this time.
-
-### CRD Upgrades
-
-The OSM extension cannot be upgraded to a new version if that version contains CRD version updates without deleting the existing CRDs first. You can check if an OSM upgrade also includes CRD version updates by checking the CRD Updates section of the [OSM release notes](https://github.com/openservicemesh/osm/releases).
+### Supported upgrades
-Make sure to back up your Custom Resources prior to deleting the CRDs so that they can be easily recreated after upgrading. Afterwards, follow the upgrade instructions captured below.
+The OSM extension can be upgraded manually across minor and major versions. However, auto-upgrades (if enabled) will only work across minor versions.
-> [!NOTE]
-> Upgrading the CRDs will affect the data plane as the SMI policies won't exist between the time they're deleted and the time they're created again.
+### Upgrade to a specific OSM version manually
-### Upgrade instructions
+The following command will upgrade the OSM-Arc extension to a specific version:
-1. Delete the old CRDs and custom resources (Run from the root of the [OSM repo](https://github.com/openservicemesh/osm)). Ensure that the tag of the [OSM CRDs](https://github.com/openservicemesh/osm/tree/main/cmd/osm-bootstrap/crds) corresponds to the new version of the chart.
- ```azurecli-interactive
- kubectl delete --ignore-not-found --recursive -f ./charts/osm/crds/
+```azurecli-interactive
+az k8s-extension update --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --release-train stable --name osm --version x.y.z
+```
-2. Install the updated CRDs.
- ```azurecli-interactive
- kubectl apply -f charts/osm/crds/
- ```
+### Enable auto-upgrades
-3. Set the new chart version as an environment variable:
- ```azurecli-interactive
- export VERSION=<chart version>
- ```
-
-4. Run az k8s-extension create with the new chart version
- ```azurecli-interactive
- az k8s-extension create --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --extension-type Microsoft.openservicemesh --scope cluster --release-train pilot --name osm --version $VERSION --configuration-settings-file $SETTINGS_FILE
- ```
+If auto-upgrades are not already enabled, run the following command to enable them. The current value of `--auto-upgrade-minor-version` can be verified by running the `az k8s-extension show` command as detailed in the [Validate installation](#validate-installation) stage.
-5. Recreate Custom Resources using new CRDs
+```azurecli-interactive
+az k8s-extension update --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP --cluster-type connectedClusters --release-train stable --name osm --auto-upgrade-minor-version true
+```
## Uninstall Azure Arc-enabled OSM
Verify that the extension instance has been deleted:
az k8s-extension list --cluster-type connectedClusters --cluster-name $CLUSTER_NAME --resource-group $RESOURCE_GROUP
```
-This output should not include OSM. If you don't have any other extensions installed on your cluster, it will just be an empty array.
+This output should not include OSM. If you do not have any other extensions installed on your cluster, it will just be an empty array.
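If teardown is scripted, the same check can be automated against saved output. A minimal sketch, assuming the list output was saved to a hypothetical `extensions.json` file:

```shell
# Hypothetical check against saved output of, for example:
#   az k8s-extension list --cluster-type connectedClusters --cluster-name $CLUSTER_NAME \
#     --resource-group $RESOURCE_GROUP -o json > extensions.json
if grep -q '"name": "osm"' extensions.json 2>/dev/null; then
  echo "OSM extension is still present"
else
  echo "OSM extension is not listed"
fi
```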
When you use the az k8s-extension command to delete the OSM extension, the arc-osm-system namespace is not removed, and the actual resources within the namespace (like the mutating webhook configuration and the osm-controller pod) will take around 10 minutes to delete.
-> [!NOTE]
+> [!NOTE]
> Use the az k8s-extension CLI to uninstall OSM components managed by Arc. Using the OSM CLI to uninstall is not supported by Arc and can result in undesirable behavior.

## Troubleshooting

Refer to the troubleshooting guide [available here](troubleshooting.md#azure-arc-enabled-open-service-mesh).
+## Frequently asked questions
+
+### Is the Azure Arc-enabled OSM extension zone redundant?
+Yes, all components of Azure Arc-enabled OSM are deployed across availability zones, making them zone redundant.
+
## Next steps

> **Just want to try things out?**
-> Get started quickly with an [Azure Arc Jumpstart](https://aka.ms/arc-jumpstart-osm) scenario using Cluster API.
+> Get started quickly with an [Azure Arc Jumpstart](https://aka.ms/arc-jumpstart-osm) scenario using Cluster API.
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/overview.md
Currently, Azure Arc allows you to manage the following resource types hosted ou
* [Azure data services](dat): Run Azure data services on-premises, at the edge, and in public clouds using Kubernetes and the infrastructure of your choice. SQL Managed Instance and PostgreSQL Hyperscale (preview) services are currently available. * [SQL Server](/sql/sql-server/azure-arc/overview): Extend Azure services to SQL Server instances hosted outside of Azure.
-* Virtual machines (preview): Provision, resize, delete and manage virtual machines based on [VMware vSphere](/azure/azure-arc/vmware-vsphere/overview) or [Azure Stack HCI](/azure-stack/hci/manage/azure-arc-enabled-virtual-machines) and enable VM self-service through role-based access.
+* Virtual machines (preview): Provision, resize, delete and manage virtual machines based on [VMware vSphere](./vmware-vsphere/overview.md) or [Azure Stack HCI](/azure-stack/hci/manage/azure-arc-enabled-virtual-machines) and enable VM self-service through role-based access.
## Key features and benefits
Some of the key scenarios that Azure Arc supports are:
* Create [custom locations](./kubernetes/custom-locations.md) on top of your [Azure Arc-enabled Kubernetes](./kubernetes/overview.md) clusters, using them as target locations for deploying Azure services instances. Deploy your Azure service cluster extensions for [Azure Arc-enabled Data Services](./dat).
-* Perform virtual machine lifecycle and management operations for [VMware vSphere](/azure/azure-arc/vmware-vsphere/overview) and [Azure Stack HCI](/azure-stack/hci/manage/azure-arc-enabled-virtual-machines) environments.
+* Perform virtual machine lifecycle and management operations for [VMware vSphere](./vmware-vsphere/overview.md) and [Azure Stack HCI](/azure-stack/hci/manage/azure-arc-enabled-virtual-machines) environments.
* A unified experience viewing your Azure Arc-enabled resources, whether you are using the Azure portal, the Azure CLI, Azure PowerShell, or Azure REST API.
For information, see the [Azure pricing page](https://azure.microsoft.com/pricin
* Learn about [Azure Arc-enabled data services](https://azure.microsoft.com/services/azure-arc/hybrid-data-services/). * Learn about [SQL Server on Azure Arc-enabled servers](/sql/sql-server/azure-arc/overview). * Learn about [Azure Arc-enabled VMware vSphere](vmware-vsphere/overview.md) and [Azure Arc-enabled Azure Stack HCI](/azure-stack/hci/manage/azure-arc-enabled-virtual-machines)
-* Experience Azure Arc-enabled services by exploring the [Jumpstart proof of concept](https://azurearcjumpstart.io/azure_arc_jumpstart/).
+* Experience Azure Arc-enabled services by exploring the [Jumpstart proof of concept](https://azurearcjumpstart.io/azure_arc_jumpstart/).
azure-arc Prerequisites https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/prerequisites.md
The following versions of the Windows and Linux operating system are officially
* Amazon Linux 2 * Oracle Linux 7 and 8
+> [!NOTE]
+> On Linux, Azure Arc-enabled servers installs several daemon processes. We only support using systemd to manage these processes. In some environments, systemd may not be installed or available, in which case Arc-enabled servers is not supported, even if the distribution is otherwise supported. These environments include **Windows Subsystem for Linux** (WSL) and most container-based systems, such as Kubernetes or Docker. The Azure Connected Machine agent can be installed on the node that runs the containers but not inside the containers themselves.
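Since the agent only supports systemd-managed daemons, a quick pre-check on the target machine can avoid a failed onboarding attempt. This is an illustrative sketch, not an official tool; the messages are hypothetical:

```shell
# Hypothetical pre-check: the Connected Machine agent requires systemd as the init system.
# WSL and most container-based environments will fail this check.
init_comm=$(ps -p 1 -o comm= 2>/dev/null | tr -d ' ')
if [ "$init_comm" = "systemd" ]; then
  echo "systemd detected: this environment is supported"
else
  echo "systemd not detected: Arc-enabled servers is not supported in this environment"
fi
```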
++ > [!WARNING] > If the Linux hostname or Windows computer name uses a reserved word or trademark, attempting to register the connected machine with Azure will fail. For a list of reserved words, see [Resolve reserved resource name errors](../../azure-resource-manager/templates/error-reserved-resource-name.md).
azure-arc Ssh Arc Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/ssh-arc-troubleshoot.md
Error:
- "Failed to create ssh key file with error: \<ERROR\>." - "Failed to run ssh command with error: \<ERROR\>." - "Failed to get certificate info with error: \<ERROR\>."
+ - "Failed to create ssh key file with error: [WinError 2] The system cannot find the file specified."
+ - "Failed to create ssh key file with error: [Errno 2] No such file or directory: 'ssh-keygen'."
Resolution: - Provide the path to the folder that contains the SSH client executables by using the ```--ssh-client-folder``` parameter.
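Before retrying with `--ssh-client-folder`, it can help to confirm whether the OpenSSH tools are discoverable at all. A minimal local check (a sketch; the folder you would pass to `--ssh-client-folder` depends on your own install):

```shell
# Sketch: confirm the OpenSSH client tools are discoverable on PATH.
# If ssh-keygen is not found, pass its containing folder via --ssh-client-folder instead.
if command -v ssh-keygen >/dev/null 2>&1; then
  status="found: $(command -v ssh-keygen)"
else
  status="ssh-keygen not on PATH; pass its folder with --ssh-client-folder"
fi
echo "$status"
```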
Resolution:
## Disable SSH to Arc-enabled servers This functionality can be disabled by completing the following actions: - Remove the SSH port from the allowed incoming ports: ```azcmagent config set incomingconnections.ports <other open ports,...>```
- - Delete the default connectivity endpoint: ```az rest --method delete --uri https://management.azure.com/subscriptions/<subscription>/resourceGroups/<resourcegroup>/providers/Microsoft.HybridCompute/machines/<arc enabled server name>/providers/Microsoft.HybridConnectivity/endpoints/default?api-version=2021-10-06-preview```
+ - Delete the default connectivity endpoint: ```az rest --method delete --uri https://management.azure.com/subscriptions/<subscription>/resourceGroups/<resourcegroup>/providers/Microsoft.HybridCompute/machines/<arc enabled server name>/providers/Microsoft.HybridConnectivity/endpoints/default?api-version=2021-10-06-preview```
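The long `az rest` URI in the step above is easier to get right when assembled from variables. A sketch with placeholder values (the subscription ID, resource group, and server name below are not real):

```shell
# Placeholder values; substitute your own subscription ID, resource group,
# and Arc-enabled server name before use.
sub="00000000-0000-0000-0000-000000000000"
rg="myResourceGroup"
server="myArcServer"
uri="https://management.azure.com/subscriptions/${sub}/resourceGroups/${rg}/providers/Microsoft.HybridCompute/machines/${server}/providers/Microsoft.HybridConnectivity/endpoints/default?api-version=2021-10-06-preview"
echo "$uri"
# The delete step then becomes:
#   az rest --method delete --uri "$uri"
```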
azure-arc Day2 Operations Resource Bridge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/vmware-vsphere/day2-operations-resource-bridge.md
az connectedvmware vcenter connect --custom-location <name of the custom locatio
## Collecting logs from the Arc resource bridge
-For any issues encountered with the Azure Arc resource bridge, you can collect logs for further investigation. To collect the logs, use the Azure CLI [`Az arcappliance log`](https://docs.microsoft.com/cli/azure/arcappliance/logs?#az-arcappliance-logs-vmware) command.
+For any issues encountered with the Azure Arc resource bridge, you can collect logs for further investigation. To collect the logs, use the Azure CLI [`az arcappliance logs`](/cli/azure/arcappliance/logs#az-arcappliance-logs-vmware) command.
The `az arcappliance logs` command must be run from a workstation that can communicate with the Arc resource bridge either via the cluster configuration IP address or the IP address of the Arc resource bridge VM.
If you're running this command from a different workstation, you must make sure
## Next steps
-[Troubleshoot common issues related to resource bridge](../resource-bridge/troubleshoot-resource-bridge.md)
+[Troubleshoot common issues related to resource bridge](../resource-bridge/troubleshoot-resource-bridge.md)
azure-cache-for-redis Cache Administration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-administration.md
On the left, **Schedule updates** allows you to choose a maintenance window for
:::image type="content" source="media/cache-administration/redis-schedule-updates-2.png" alt-text="Screenshot showing schedule updates":::
-To specify a maintenance window, check the days you want and specify the maintenance window start hour for each day. Then, select **OK**. The maintenance window time is in UTC.
+To specify a maintenance window, check the days you want and specify the maintenance window start hour for each day. Then, select **OK**. The maintenance window time is in UTC and can only be configured on an hourly basis.
The default, and minimum, maintenance window for updates is five hours. This value isn't configurable from the Azure portal, but you can configure it in PowerShell using the `MaintenanceWindow` parameter of the [New-AzRedisCacheScheduleEntry](/powershell/module/az.rediscache/new-azrediscachescheduleentry) cmdlet. For more information, see [Can I manage scheduled updates using PowerShell, CLI, or other management tools?](#can-i-manage-scheduled-updates-using-powershell-cli-or-other-management-tools)
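Assuming the ISO 8601 duration form used in related examples (such as `PT5H`), a quick local sanity check before calling the cmdlet might look like the following sketch; the six-hour value is hypothetical:

```shell
# Hypothetical six-hour window; the documented minimum is five hours.
hours=6
if [ "$hours" -lt 5 ]; then
  echo "maintenance window must be at least five hours" >&2
  exit 1
fi
# Value to pass to the cmdlet's MaintenanceWindow parameter.
window="PT${hours}H"
echo "$window"
```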
azure-cache-for-redis Cache High Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-high-availability.md
Because your cache data is stored in memory, a rare and unplanned failure of mul
### Storage account for persistence
-Consider choosing a geo-redundant storage account to ensure high availability of persisted data. For more information, see [Azure Storage redundancy](/azure/storage/common/storage-redundancy?toc=/azure/storage/blobs/toc.json).
+Consider choosing a geo-redundant storage account to ensure high availability of persisted data. For more information, see [Azure Storage redundancy](../storage/common/storage-redundancy.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json).
## Import/Export
Azure Cache for Redis supports the option to import and export Redis Database (R
### Storage account for export
-Consider choosing a geo-redundant storage account to ensure high availability of your exported data. For more information, see [Azure Storage redundancy](/azure/storage/common/storage-redundancy?toc=/azure/storage/blobs/toc.json).
+Consider choosing a geo-redundant storage account to ensure high availability of your exported data. For more information, see [Azure Storage redundancy](../storage/common/storage-redundancy.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json).
## Geo-replication Applicable tiers: **Premium** [Geo-replication](cache-how-to-geo-replication.md) is a mechanism for linking two or more Azure Cache for Redis instances, typically spanning two Azure regions. Geo-replication is designed mainly for disaster recovery. Two Premium tier cache instances are connected through geo-replication in a way that provides reads and writes to your primary cache, and that data is replicated to the secondary cache.
-For more information on how to set it up, see [Configure geo-replication for Premium Azure Cache for Redis instances](/azure/azure-cache-for-redis/cache-how-to-geo-replication).
+For more information on how to set it up, see [Configure geo-replication for Premium Azure Cache for Redis instances](./cache-how-to-geo-replication.md).
If the region hosting the primary cache goes down, you'll need to start the failover: first, unlink the secondary cache, and then update your application to point to the secondary cache for reads and writes.
Applicable tiers: **Standard**, **Premium**, **Enterprise**, **Enterprise Flash*
If you experience a regional outage, consider recreating your cache in a different region and updating your application to connect to the new cache instead. It's important to understand that data will be lost during a regional outage. Your application code should be resilient to data loss.
-Once the affected region is restored, your unavailable Azure Cache for Redis is automatically restored and available for use again. For more strategies for moving your cache to a different region, see [Move Azure Cache for Redis instances to different regions](/azure/azure-cache-for-redis/cache-moving-resources).
+Once the affected region is restored, your unavailable Azure Cache for Redis is automatically restored and available for use again. For more strategies for moving your cache to a different region, see [Move Azure Cache for Redis instances to different regions](./cache-moving-resources.md).
## Next steps
Learn more about how to configure Azure Cache for Redis high-availability option
- [Azure Cache for Redis Premium service tiers](cache-overview.md#service-tiers) - [Add replicas to Azure Cache for Redis](cache-how-to-multi-replicas.md) - [Enable zone redundancy for Azure Cache for Redis](cache-how-to-zone-redundancy.md)-- [Set up geo-replication for Azure Cache for Redis](cache-how-to-geo-replication.md)
+- [Set up geo-replication for Azure Cache for Redis](cache-how-to-geo-replication.md)
azure-cache-for-redis Cache Troubleshoot Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-troubleshoot-connectivity.md
Steps to check your private endpoint configuration:
1. If you're trying to connect to your cache private endpoint from outside your virtual network of your cache, `Public Network Access` needs to be enabled. 1. If you've deleted your private endpoint, ensure that the public network access is enabled. 1. Verify if your private endpoint is configured correctly. For more information, see [Create a private endpoint with a new Azure Cache for Redis instance](cache-private-link.md#create-a-private-endpoint-with-a-new-azure-cache-for-redis-instance).-
+1. Verify if your application is connecting to `<cachename>.redis.cache.windows.net` on port 6380. We recommend avoiding the use of `<cachename>.privatelink.redis.cache.windows.net` in the configuration or the connection string.
+1. Run a command like `nslookup <hostname>` from within the VNet that is linked to the private endpoint to verify that the command resolves to the private IP address for the cache.
+
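To interpret the `nslookup` output from the step above, the hostname should resolve to an RFC 1918 private address when the private endpoint is in effect. A small pure-shell helper, as a sketch (the sample addresses are illustrative, not real cache IPs):

```shell
# Classify an IPv4 address as RFC 1918 private or not.
is_private_ip() {
  case "$1" in
    10.*|192.168.*|172.1[6-9].*|172.2[0-9].*|172.3[01].*) echo private ;;
    *) echo public ;;
  esac
}

is_private_ip 10.0.1.4     # the kind of address a working private endpoint resolves to
is_private_ip 52.114.0.10  # a public address suggests the private endpoint is not used
```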
### Firewall rules If you have a firewall configured for your Azure Cache For Redis, ensure that your client IP address is added to the firewall rules. You can check **Firewall** on the Resource menu under **Settings** on the Azure portal.
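Because a firewall rule is defined as a start/end IP range, you can check coverage locally by comparing addresses numerically. A sketch using documentation-range placeholder addresses:

```shell
# Convert a dotted-quad IPv4 address to a 32-bit integer, then test range membership.
ip_to_int() {
  IFS=. read -r a b c d <<EOF
$1
EOF
  echo $(( (a << 24) + (b << 16) + (c << 8) + d ))
}

in_range() {
  ip=$(ip_to_int "$1"); lo=$(ip_to_int "$2"); hi=$(ip_to_int "$3")
  if [ "$ip" -ge "$lo" ] && [ "$ip" -le "$hi" ]; then echo yes; else echo no; fi
}

in_range 203.0.113.42 203.0.113.0 203.0.113.255   # client IP covered by the rule
```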
azure-functions Durable Functions Bindings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/durable/durable-functions-bindings.md
async def main(msg: func.QueueMessage, starter: str) -> None:
**run.ps1** ```powershell
-param($[string] $input, $TriggerMetadata)
+param([string] $input, $TriggerMetadata)
$InstanceId = Start-DurableOrchestration -FunctionName $FunctionName -Input $input ```
azure-functions Functions Bindings Azure Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-azure-sql.md
The Azure SQL bindings for Azure Functions are open-source and available on the
- [Save data to a database (Output binding)](./functions-bindings-azure-sql-output.md) - [Review ToDo API sample with Azure SQL bindings](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/) - [Learn how to connect Azure Function to Azure SQL with managed identity](./functions-identity-access-azure-sql-with-managed-identity.md)-- [Use SQL bindings in Azure Stream Analytics](/azure/stream-analytics/sql-database-upsert#option-1-update-by-key-with-the-azure-function-sql-binding)
+- [Use SQL bindings in Azure Stream Analytics](../stream-analytics/sql-database-upsert.md#option-1-update-by-key-with-the-azure-function-sql-binding)
azure-functions Functions Create Maven Intellij https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-create-maven-intellij.md
Title: Create a Java function in Azure Functions using IntelliJ
-description: Learn how to use IntelliJ to create a simple HTTP-triggered Java function, which you then publish to run in a serverless environment in Azure.
+description: Learn how to use IntelliJ to create an HTTP-triggered Java function and then run it in a serverless environment in Azure.
+ Previously updated : 07/01/2018 Last updated : 03/28/2022+ ms.devlang: java # Create your first Java function in Azure using IntelliJ
-This article shows you:
+This article shows you how to use Java and IntelliJ to create an Azure function.
+
+Specifically, this article shows you:
+ - How to create an HTTP-triggered Java function in an IntelliJ IDEA project. - Steps for testing and debugging the project in the integrated development environment (IDE) on your own computer.-- Instructions for deploying the function project to Azure Functions
+- Instructions for deploying the function project to Azure Functions.
<!-- TODO ![Access a Hello World function from the command line with cURL](media/functions-create-java-maven/hello-azure.png) --> -
-## Set up your development environment
-
-To create and publish Java functions to Azure using IntelliJ, install the following software:
+## Prerequisites
-+ An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-+ An [Azure supported Java Development Kit (JDK)](/azure/developer/java/fundamentals/java-support-on-azure) for Java 8
-+ An [IntelliJ IDEA](https://www.jetbrains.com/idea/download/) Ultimate Edition or Community Edition installed
-+ [Maven 3.5.0+](https://maven.apache.org/download.cgi)
-+ Latest [Function Core Tools](https://github.com/Azure/azure-functions-core-tools)
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
+- An [Azure supported Java Development Kit (JDK)](/azure/developer/java/fundamentals/java-support-on-azure) for Java, version 8 or 11
+- An [IntelliJ IDEA](https://www.jetbrains.com/idea/download/) Ultimate Edition or Community Edition installed
+- [Maven 3.5.0+](https://maven.apache.org/download.cgi)
+- Latest [Function Core Tools](https://github.com/Azure/azure-functions-core-tools)
+## Install plugin and sign in
-## Installation and sign in
+To install the Azure Toolkit for IntelliJ and then sign in, follow these steps:
-1. In IntelliJ IDEA's Settings/Preferences dialog (Ctrl+Alt+S), select **Plugins**. Then, find the **Azure Toolkit for IntelliJ** in the **Marketplace** and click **Install**. After installed, click **Restart** to activate the plugin.
+1. In IntelliJ IDEA's **Settings/Preferences** dialog (Ctrl+Alt+S), select **Plugins**. Then, find the **Azure Toolkit for IntelliJ** in the **Marketplace** and click **Install**. After it's installed, click **Restart** to activate the plugin.
- ![Azure Toolkit for IntelliJ plugin in Marketplace][marketplace]
+ :::image type="content" source="media/functions-create-first-java-intellij/marketplace.png" alt-text="Azure Toolkit for IntelliJ plugin in Marketplace." lightbox="media/functions-create-first-java-intellij/marketplace.png":::
-2. To sign in to your Azure account, open sidebar **Azure Explorer**, and then click the **Azure Sign In** icon in the bar on top (or from IDEA menu **Tools/Azure/Azure Sign in**).
- ![The IntelliJ Azure Sign In command][intellij-azure-login]
+2. To sign in to your Azure account, open the **Azure Explorer** sidebar, and then click the **Azure Sign In** icon in the bar on top (or from the IDEA menu, select **Tools > Azure > Azure Sign in**).
-3. In the **Azure Sign In** window, select **Device Login**, and then click **Sign in** ([other sign in options](/azure/developer/java/toolkit-for-intellij/sign-in-instructions)).
+ :::image type="content" source="media/functions-create-first-java-intellij/intellij-azure-login.png" alt-text="The IntelliJ Azure Sign In command." lightbox="media/functions-create-first-java-intellij/intellij-azure-login.png":::
- ![The Azure Sign In window with device login selected][intellij-azure-popup]
+3. In the **Azure Sign In** window, select **OAuth 2.0**, and then click **Sign in**. For other sign-in options, see [Sign-in instructions for the Azure Toolkit for IntelliJ](/azure/developer/java/toolkit-for-intellij/sign-in-instructions).
-4. Click **Copy&Open** in **Azure Device Login** dialog .
+ :::image type="content" source="media/functions-create-first-java-intellij/intellij-azure-login-popup.png" alt-text="The Azure Sign In window with device login selected." lightbox="media/functions-create-first-java-intellij/intellij-azure-login-popup.png":::
- ![The Azure Login Dialog window][intellij-azure-copycode]
+4. In the browser, sign in with your account and then go back to IntelliJ. In the **Select Subscriptions** dialog box, click on the subscriptions that you want to use, then click **Select**.
-5. In the browser, paste your device code (which has been copied when you click **Copy&Open** in last step) and then click **Next**.
+ :::image type="content" source="media/functions-create-first-java-intellij/intellij-azure-login-selectsubs.png" alt-text="The Select Subscriptions dialog box." lightbox="media/functions-create-first-java-intellij/intellij-azure-login-selectsubs.png":::
- ![The device login browser][intellij-azure-link-ms-account]
-
-6. In the **Select Subscriptions** dialog box, select the subscriptions that you want to use, and then click **Select**.
-
- ![The Select Subscriptions dialog box][intellij-azure-login-select-subs]
-
## Create your local project
-In this section, you use Azure Toolkit for IntelliJ to create a local Azure Functions project. Later in this article, you'll publish your function code to Azure.
+To use Azure Toolkit for IntelliJ to create a local Azure Functions project, follow these steps:
-1. Open IntelliJ Welcome dialog, select *Create New Project* to open a new Project wizard, select *Azure Functions*.
+1. Open IntelliJ IDEA's **Welcome** dialog, select **New Project** to open a new project wizard, then select **Azure Functions**.
- ![Create function project](media/functions-create-first-java-intellij/create-functions-project.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/create-functions-project.png" alt-text="Create function project." lightbox="media/functions-create-first-java-intellij/create-functions-project.png":::
-1. Select *Http Trigger*, then click *Next* and follow the wizard to go through all the configurations in the following pages; confirm your project location then click *Finish*; Intellj IDEA will then open your new project.
+1. Select **Http Trigger**, then click **Next** and follow the wizard to go through all the configurations in the following pages. Confirm your project location, then click **Finish**. IntelliJ IDEA will then open your new project.
- ![Create function project finish](media/functions-create-first-java-intellij/create-functions-project-finish.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/create-functions-project-finish.png" alt-text="Create function project finish." lightbox="media/functions-create-first-java-intellij/create-functions-project-finish.png":::
## Run the project locally
-1. Navigate to `src/main/java/org/example/functions/HttpTriggerFunction.java` to see the code generated. Beside the line *17*, you will notice that there is a green *Run* button, click it and select *Run 'azure-function-exam...'*, you will see that your function app is running locally with a few logs.
+To run the project locally, follow these steps:
- ![Local run project](media/functions-create-first-java-intellij/local-run-functions-project.png)
+1. Navigate to *src/main/java/org/example/functions/HttpTriggerFunction.java* to see the code generated. Beside the line *24*, you'll notice that there's a green **Run** button. Click it and select **Run 'Functions-azur...'**. You'll see that your function app is running locally with a few logs.
- ![Local run project output](media/functions-create-first-java-intellij/local-run-functions-output.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/local-run-functions-project.png" alt-text="Local run project." lightbox="media/functions-create-first-java-intellij/local-run-functions-project.png":::
-1. You can try the function by accessing the printed endpoint from browser, like `http://localhost:7071/api/HttpTrigger-Java?name=Azure`.
+ :::image type="content" source="media/functions-create-first-java-intellij/local-run-functions-output.png" alt-text="Local run project output." lightbox="media/functions-create-first-java-intellij/local-run-functions-output.png":::
- ![Local run function test result](media/functions-create-first-java-intellij/local-run-functions-test.png)
+1. You can try the function by accessing the displayed endpoint from a browser, such as `http://localhost:7071/api/HttpExample?name=Azure`.
-1. The log is also printed out in your IDEA, now, stop the function app by clicking the *stop* button.
+ :::image type="content" source="media/functions-create-first-java-intellij/local-run-functions-test.png" alt-text="Local run function test result." lightbox="media/functions-create-first-java-intellij/local-run-functions-test.png":::
- ![Local run function test log](media/functions-create-first-java-intellij/local-run-functions-log.png)
+1. The log is also displayed in your IDEA. Stop the function app by clicking the **Stop** button.
+
+ :::image type="content" source="media/functions-create-first-java-intellij/local-run-functions-log.png" alt-text="Local run function test log." lightbox="media/functions-create-first-java-intellij/local-run-functions-log.png":::
## Debug the project locally
-1. To debug the function code in your project locally, select the *Debug* button in the toolbar. If you don't see the toolbar, enable it by choosing **View** > **Appearance** > **Toolbar**.
+To debug the project locally, follow these steps:
+
+1. Select the **Debug** button in the toolbar. If you don't see the toolbar, enable it by choosing **View** > **Appearance** > **Toolbar**.
- ![Local debug function app button](media/functions-create-first-java-intellij/local-debug-functions-button.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/local-debug-functions-button.png" alt-text="Local debug function app button." lightbox="media/functions-create-first-java-intellij/local-debug-functions-button.png":::
-1. Click on line *20* of the file `src/main/java/org/example/functions/HttpTriggerFunction.java` to add a breakpoint, access the endpoint `http://localhost:7071/api/HttpTrigger-Java?name=Azure` again , you will find the breakpoint is hit, you can try more debug features like *step*, *watch*, *evaluation*. Stop the debug session by click the stop button.
+1. Click on line *31* of the file *src/main/java/org/example/functions/HttpTriggerFunction.java* to add a breakpoint. Access the endpoint `http://localhost:7071/api/HttpTrigger-Java?name=Azure` again and you'll find the breakpoint is hit. You can then try more debug features like **Step**, **Watch**, and **Evaluation**. Stop the debug session by clicking the **Stop** button.
- ![Local debug function app break](media/functions-create-first-java-intellij/local-debug-functions-break.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/local-debug-functions-break.png" alt-text="Local debug function app break." lightbox="media/functions-create-first-java-intellij/local-debug-functions-break.png":::
## Deploy your project to Azure
-1. Right click your project in IntelliJ Project explorer, select *Azure -> Deploy to Azure Functions*
+To deploy your project to Azure, follow these steps:
- ![Deploy project to Azure](media/functions-create-first-java-intellij/deploy-functions-to-azure.png)
+1. Right click your project in IntelliJ Project explorer, then select **Azure -> Deploy to Azure Functions**.
-1. If you don't have any Function App yet, click *+* in the *Function* line. Type in the function app name and choose proper platform, here we can simply accept default. Click *OK* and the new function app you just created will be automatically selected. Click *Run* to deploy your functions.
+ :::image type="content" source="media/functions-create-first-java-intellij/deploy-functions-to-azure.png" alt-text="Deploy project to Azure." lightbox="media/functions-create-first-java-intellij/deploy-functions-to-azure.png":::
- ![Create function app in Azure](media/functions-create-first-java-intellij/deploy-functions-create-app.png)
+1. If you don't have any Function App yet, click **+** in the *Function* line. Type in the function app name and choose the proper platform. Here you can accept the defaults. Click **OK** and the new function app you created will be automatically selected. Click **Run** to deploy your functions.
- ![Deploy function app to Azure log](media/functions-create-first-java-intellij/deploy-functions-log.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/deploy-functions-create-app.png" alt-text="Create function app in Azure." lightbox="media/functions-create-first-java-intellij/deploy-functions-create-app.png":::
+
+ :::image type="content" source="media/functions-create-first-java-intellij/deploy-functions-log.png" alt-text="Deploy function app to Azure log." lightbox="media/functions-create-first-java-intellij/deploy-functions-log.png":::
## Manage function apps from IDEA
-1. You can manage your function apps with *Azure Explorer* in your IDEA, click on *Function App*, you will see all your function apps here.
+To manage your function apps with **Azure Explorer** in your IDEA, follow these steps:
+
+1. Click on **Function App** and you'll see all your function apps listed.
- ![View function apps in explorer](media/functions-create-first-java-intellij/explorer-view-functions.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/explorer-view-functions.png" alt-text="View function apps in explorer." lightbox="media/functions-create-first-java-intellij/explorer-view-functions.png":::
-1. Click to select on one of your function apps, and right click, select *Show Properties* to open the detail page.
+1. Select one of your function apps, then right-click and select **Show Properties** to open the detail page.
- ![Show function app properties](media/functions-create-first-java-intellij/explorer-functions-show-properties.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/explorer-functions-show-properties.png" alt-text="Show function app properties." lightbox="media/functions-create-first-java-intellij/explorer-functions-show-properties.png":::
-1. Right click on your *HttpTrigger-Java* function app, and select *Trigger Function*, you will see that the browser is opened with the trigger URL.
+1. Right click on your **HttpTrigger-Java** function app, then select **Trigger Function in Browser**. You'll see that the browser opens with the trigger URL.
- ![Screenshot shows a browser with the U R L.](media/functions-create-first-java-intellij/explorer-trigger-functions.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/explorer-trigger-functions.png" alt-text="Screenshot shows a browser with the U R L." lightbox="media/functions-create-first-java-intellij/explorer-trigger-functions.png":::
## Add more functions to the project
-1. Right click on the package *org.example.functions* and select *New -> Azure Function Class*.
+To add more functions to your project, follow these steps:
+
+1. Right click on the package **org.example.functions** and select **New -> Azure Function Class**.
+
+ :::image type="content" source="media/functions-create-first-java-intellij/add-functions-entry.png" alt-text="Add functions to the project entry." lightbox="media/functions-create-first-java-intellij/add-functions-entry.png":::
- ![Add functions to the project entry](media/functions-create-first-java-intellij/add-functions-entry.png)
+1. Fill in the class name **HttpTest** and select **HttpTrigger** in the create function class wizard, then click **OK** to create it. You can repeat these steps to add new functions as needed.
-1. Fill in the class name *HttpTest* and select *HttpTrigger* in the create function class wizard, click *OK* to create, in this way, you can create new functions as you want.
+ :::image type="content" source="media/functions-create-first-java-intellij/add-functions-trigger.png" alt-text="Screenshot shows the Create Function Class dialog box." lightbox="media/functions-create-first-java-intellij/add-functions-trigger.png":::
- ![Screenshot shows the Create Function Class dialog box.](media/functions-create-first-java-intellij/add-functions-trigger.png)
-
- ![Add functions to the project output](media/functions-create-first-java-intellij/add-functions-output.png)
+ :::image type="content" source="media/functions-create-first-java-intellij/add-functions-output.png" alt-text="Add functions to the project output." lightbox="media/functions-create-first-java-intellij/add-functions-output.png":::
## Cleaning up functions
-1. Deleting functions in Azure Explorer
-
- ![Screenshot shows Delete selected from a context menu.](media/functions-create-first-java-intellij/delete-function.png)
-
+Select one of your function apps using **Azure Explorer** in your IDEA, then right-click and select **Delete**. This command might take several minutes to run. When it's done, the status will refresh in **Azure Explorer**.
+ ## Next steps
-You've created a Java project with an HTTP triggered function, run it on your local machine, and deployed it to Azure. Now, extend your function by...
+You've created a Java project with an HTTP triggered function, run it on your local machine, and deployed it to Azure. Now, extend your function by continuing to the following article:
> [!div class="nextstepaction"] > [Adding an Azure Storage queue output binding](./functions-add-output-binding-storage-queue-java.md)--
-[marketplace]:./media/functions-create-first-java-intellij/marketplace.png
-[intellij-azure-login]: media/functions-create-first-java-intellij/intellij-azure-login.png
-[intellij-azure-popup]: media/functions-create-first-java-intellij/intellij-azure-login-popup.png
-[intellij-azure-copycode]: media/functions-create-first-java-intellij/intellij-azure-login-copyopen.png
-[intellij-azure-link-ms-account]: media/functions-create-first-java-intellij/intellij-azure-login-linkms-account.png
-[intellij-azure-login-select-subs]: media/functions-create-first-java-intellij/intellij-azure-login-selectsubs.png
azure-functions Functions How To Use Azure Function App Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-how-to-use-azure-function-app-settings.md
In this script, replace `<SUBSCRIPTION_ID>` and `<APP_NAME>` with the ID of your
## Platform features
-Function apps run in, and are maintained by, the Azure App Service platform. As such, your function apps have access to most of the features of Azure's core web hosting platform. The left pane is where you access the many features of the App Service platform that you can use in your function apps.
+Function apps run in, and are maintained by, the Azure App Service platform. As such, your function apps have access to most of the features of Azure's core web hosting platform. When working in the [Azure portal](https://portal.azure.com), the left pane is where you access the many features of the App Service platform that you can use in your function apps.
-> [!NOTE]
-> Not all App Service features are available when a function app runs on the Consumption hosting plan.
+The following matrix indicates portal feature support by hosting plan and operating system:
-The rest of this article focuses on the following App Service features in the Azure portal that are useful for Functions:
+| Feature | Consumption plan | Premium plan | Dedicated plan |
+| --- | --- | --- | --- |
+| [Advanced tools (Kudu)](#kudu) | Windows: ✔ <br/>Linux: **X** | ✔ | ✔ |
+| [App Service editor](#editor) | Windows: ✔ <br/>Linux: **X** | Windows: ✔ <br/>Linux: **X** | Windows: ✔ <br/>Linux: **X** |
+| [Backups](../app-service/manage-backup.md) | **X** | **X** | ✔ |
+| [Console](#console) | Windows: command-line <br/>Linux: **X** | Windows: command-line <br/>Linux: SSH | Windows: command-line <br/>Linux: SSH |
+
+The rest of this article focuses on the following features in the portal that are useful for your function apps:
+ [App Service editor](#editor)
+ [Console](#console)
azure-functions Functions Premium Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-premium-plan.md
See the complete regional availability of Functions on the [Azure web site](http
|--| -- | -- |
|Australia Central| 100 | Not Available |
|Australia Central 2| 100 | Not Available |
-|Australia East| 100 | 20 |
+|Australia East| 100 | 40 |
|Australia Southeast | 100 | 20 |
|Brazil South| 100 | 20 |
|Canada Central| 100 | 20 |
See the complete regional availability of Functions on the [Azure web site](http
|China North 2| 100 | 20 |
|East Asia| 100 | 20 |
|East US | 100 | 60 |
-|East US 2| 100 | 20 |
+|East US 2| 100 | 40 |
|France Central| 100 | 20 |
|Germany West Central| 100 | 20 |
|Japan East| 100 | 20 |
See the complete regional availability of Functions on the [Azure web site](http
|North Europe| 100 | 40 |
|Norway East| 100 | 20 |
|South Africa North| 100 | 20 |
-|South Central US| 100 | 20 |
+|South Central US| 100 | 40 |
|South India | 100 | Not Available |
|Southeast Asia| 100 | 20 |
|Switzerland North| 100 | 20 |
azure-functions Functions Proxies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-proxies.md
This section shows you how to create a proxy in the Functions portal.
> Not all languages and operating system combinations support in-portal editing. If you're unable to create a proxy in the portal, you can instead manually create a _proxies.json_ file in the root of your function app project folder. To learn more about portal editing support, see [Language support details](functions-create-function-app-portal.md#language-support-details).

1. Open the [Azure portal], and then go to your function app.
-2. In the left pane, select **New proxy**.
+2. In the left pane, select **Proxies** and then select **+Add**.
3. Provide a name for your proxy.
4. Configure the endpoint that's exposed on this function app by specifying the **route template** and **HTTP methods**. These parameters behave according to the rules for [HTTP triggers].
5. Set the **backend URL** to another endpoint. This endpoint could be a function in another function app, or it could be any other API. The value does not need to be static, and it can reference [application settings] and [parameters from the original client request].
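The note above mentions that you can manually create a _proxies.json_ file when in-portal editing isn't supported. A minimal sketch follows; the proxy name "HelloProxy", the route, and the backend URL are illustrative placeholders, and the file layout assumes the documented proxies schema:

```shell
# Write a minimal proxies.json to the function app project root.
# "HelloProxy", the route, and the backend URL are illustrative placeholders.
cat > proxies.json <<'EOF'
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "HelloProxy": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/api/hello"
      },
      "backendUri": "https://example.com/api/hello"
    }
  }
}
EOF

# Sanity-check that the file is valid JSON before deploying.
python3 -m json.tool proxies.json > /dev/null && echo "proxies.json is valid JSON"
```

Deploying the project then publishes the proxy along with your functions.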
azure-functions Functions Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-scale.md
Title: Azure Functions scale and hosting
description: Learn how to choose between Azure Functions Consumption plan and Premium plan. ms.assetid: 5b63649c-ec7f-4564-b168-e0a74cb7e0f3 Previously updated : 08/17/2020 Last updated : 03/24/2022
azure-monitor Azure Monitor Agent Windows Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-windows-client.md
Then, proceed with the instructions below to create and associate them to a Moni
#### 1. Assign 'Monitored Object Contributor' role to the operator

This step grants the ability to create and link a monitored object to a user.
-**Permissions required:** Since MO is a tenant level resource, the scope of the permission would be higher than a subscription scope. Therefore, an Azure tenant admin may be needed to perform this step. [Follow these steps to elevate Azure AD Tenant Admin as Azure Tenant Admin](/azure/role-based-access-control/elevate-access-global-admin). It will give the Azure AD admin 'owner' permissions at the root scope.
+**Permissions required:** Since MO is a tenant level resource, the scope of the permission would be higher than a subscription scope. Therefore, an Azure tenant admin may be needed to perform this step. [Follow these steps to elevate Azure AD Tenant Admin as Azure Tenant Admin](../../role-based-access-control/elevate-access-global-admin.md). It will give the Azure AD admin 'owner' permissions at the root scope.
**Request URI** ```HTTP
Make sure to start the installer on administrator command prompt. Silent install
## Questions and feedback
-Take this [quick survey](https://forms.microsoft.com/r/CBhWuT1rmM) or share your feedback/questions regarding the preview on the [Azure Monitor Agent User Community](https://teams.microsoft.com/l/team/19%3af3f168b782f64561b52abe75e59e83bc%40thread.tacv2/conversations?groupId=770d6aa5-c2f7-4794-98a0-84fd6ae7f193&tenantId=72f988bf-86f1-41af-91ab-2d7cd011db47).
+Take this [quick survey](https://forms.microsoft.com/r/CBhWuT1rmM) or share your feedback/questions regarding the preview on the [Azure Monitor Agent User Community](https://teams.microsoft.com/l/team/19%3af3f168b782f64561b52abe75e59e83bc%40thread.tacv2/conversations?groupId=770d6aa5-c2f7-4794-98a0-84fd6ae7f193&tenantId=72f988bf-86f1-41af-91ab-2d7cd011db47).
azure-monitor Data Collection Text Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-collection-text-log.md
This article describes how to configure the collection of file-based text logs,
To complete this procedure, you need the following:
- Log Analytics workspace where you have at least [contributor rights](../logs/manage-access.md#manage-access-using-azure-permissions).
-- [Permissions to create Data Collection Rule objects](/azure/azure-monitor/essentials/data-collection-rule-overview#permissions) in the workspace.
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
- An agent with a supported log file, as described in the next section.

## Log files supported
The final step is to create a data collection association that associates the da
- Learn more about the [Azure Monitor agent](azure-monitor-agent-overview.md).
- Learn more about [data collection rules](../essentials/data-collection-rule-overview.md).
-- Learn more about [data collection endpoints](../essentials/data-collection-endpoint-overview.md).
+- Learn more about [data collection endpoints](../essentials/data-collection-endpoint-overview.md).
azure-monitor Asp Net Trace Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/asp-net-trace-logs.md
You can, for example:
## Troubleshooting ### Delayed telemetry, overloading network, or inefficient transmission
-System.Diagnostics.Tracing has an [Autoflush feature](https://docs.microsoft.com/dotnet/api/system.diagnostics.trace.autoflush). This causes the SDK to flush with every telemetry item, which is undesirable and can cause logging adapter issues such as delayed telemetry, network overload, and inefficient transmission.
+System.Diagnostics.Tracing has an [Autoflush feature](/dotnet/api/system.diagnostics.trace.autoflush). This causes the SDK to flush with every telemetry item, which is undesirable and can cause logging adapter issues such as delayed telemetry, network overload, and inefficient transmission.
If your application sends voluminous amounts of data and you're using the Applic
[exceptions]: asp-net-exceptions.md
[portal]: https://portal.azure.com/
[qna]: ../faq.yml
-[start]: ./app-insights-overview.md
+[start]: ./app-insights-overview.md
azure-monitor Auto Instrumentation Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/auto-instrumentation-troubleshoot.md
- Title: Troubleshoot Azure Application Insights auto-instrumentation
-description: Troubleshoot auto-instrumentation in Azure Application Insights
- Previously updated : 02/28/2022--
-# Troubleshooting Azure Application Insights auto-instrumentation
-
-This article will help you troubleshoot problems with auto-instrumentation in Azure Application Insights.
-
-> [!NOTE]
-> Auto-instrumentation used to be known as "codeless attach" before October 2021.
-
-## Telemetry data isn't reported after enabling auto-instrumentation
-
-Review these common scenarios if you've enabled Azure Application Insights auto-instrumentation for your app service but don't see telemetry data reported.
-
-### The Application Insights SDK was previously installed
-
-Auto-instrumentation will fail when .NET and .NET Core apps were already instrumented with the SDK.
-
-Remove the Application Insights SDK if you would like to auto-instrument your app.
-
-### An app was published using an unsupported version of .NET or .NET Core
-
-Verify a supported version of .NET or .NET Core was used to build and publish applications.
-
-Refer to the .NET or .NET Core documentation to determine if your version is supported.
-
-- [Application Monitoring for Azure App Service and ASP.NET Core](azure-web-apps-net-core.md#application-monitoring-for-azure-app-service-and-aspnet-core)
-
-### A diagnostics library was detected
-
-Auto-instrumentation will fail if it detects the following libraries.
-
-- System.Diagnostics.DiagnosticSource
-- Microsoft.AspNet.TelemetryCorrelation
-- Microsoft.ApplicationInsights
-
-These libraries will need to be removed for auto-instrumentation to succeed.
-
-## More help
-
-If you have questions about Azure Application Insights auto-instrumentation, you can post a question in our [Microsoft Q&A question page](/answers/topics/azure-monitor.html).
azure-monitor Availability Azure Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/availability-azure-functions.md
This article will cover how to create an Azure Function with TrackAvailability()
> [!NOTE]
> This example is designed solely to show you the mechanics of how the TrackAvailability() API call works within an Azure Function, not how to write the underlying HTTP test code/business logic that would be required to turn this into a fully functional availability test. By default, if you walk through this example you will be creating a basic availability HTTP GET test.
-> To follow these instructions, you must use the [dedicated plan](https://docs.microsoft.com/azure/azure-functions/dedicated-plan) to allow editing code in App Service Editor.
+> To follow these instructions, you must use the [dedicated plan](../../azure-functions/dedicated-plan.md) to allow editing code in App Service Editor.
## Create a timer trigger function
You can use Logs (Analytics) to view your availability results, dependencies, and
## Next steps
- [Application Map](./app-map.md)
-- [Transaction diagnostics](./transaction-diagnostics.md)
+- [Transaction diagnostics](./transaction-diagnostics.md)
azure-monitor Ilogger https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/ilogger.md
The Application Insights extension in Azure Web Apps uses the new provider. You
### I can't see some of the logs from my application in the workspace.
-This may happen because of adaptive sampling. Adaptive sampling is enabled by default in all the latest versions of the Application Insights ASP.NET and ASP.NET Core Software Development Kits (SDKs). See the [Sampling in Application Insights](/azure/azure-monitor/app/sampling) for more details.
+This may happen because of adaptive sampling. Adaptive sampling is enabled by default in all the latest versions of the Application Insights ASP.NET and ASP.NET Core Software Development Kits (SDKs). See the [Sampling in Application Insights](./sampling.md) for more details.
## Next steps
* [Logging in .NET](/dotnet/core/extensions/logging)
* [Logging in ASP.NET Core](/aspnet/core/fundamentals/logging)
-* [.NET trace logs in Application Insights](./asp-net-trace-logs.md)
+* [.NET trace logs in Application Insights](./asp-net-trace-logs.md)
azure-monitor Sdk Connection String https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/sdk-connection-string.md
In this example, the connection string specifies the South Central US region.
- The regional service URIs are based on the explicit override values:
  - Ingestion: `https://southcentralus.in.applicationinsights.azure.com/`
-Run the following command in the [Azure Command-Line Interface (CLI)](https://docs.microsoft.com/cli/azure/account?view=azure-cli-latest#az-account-list-locations) to list available regions.
+Run the following command in the [Azure Command-Line Interface (CLI)](/cli/azure/account?view=azure-cli-latest#az-account-list-locations) to list available regions.
`az account list-locations -o table`
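The regional override described above can be sketched as follows; the instrumentation key is a placeholder, and the endpoint follows the `https://<region>.in.applicationinsights.azure.com/` format shown for South Central US:

```shell
# Placeholder instrumentation key; use your Application Insights resource's actual key.
IKEY="00000000-0000-0000-0000-000000000000"
REGION="southcentralus"

# Assemble a connection string with an explicit regional ingestion override,
# matching the endpoint format shown above.
CONNECTION_STRING="InstrumentationKey=$IKEY;IngestionEndpoint=https://$REGION.in.applicationinsights.azure.com/"
echo "$CONNECTION_STRING"
```

Swap `REGION` for any region name returned by `az account list-locations`.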
Get started at development time with:
* [ASP.NET Core](./asp-net-core.md)
* [Java](./java-in-process-agent.md)
* [Node.js](./nodejs.md)
-* [Python](./opencensus-python.md)
+* [Python](./opencensus-python.md)
azure-monitor Snapshot Collector Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/snapshot-collector-release-notes.md
A point release to address user-reported bugs.
### Bug fixes
- Fix [Hide the IDMS dependency from dependency tracker.](https://github.com/microsoft/ApplicationInsights-SnapshotCollector/issues/17)
- Fix [ArgumentException: telemetryProcessorType does not implement ITelemetryProcessor.](https://github.com/microsoft/ApplicationInsights-SnapshotCollector/issues/19)
-<br>Snapshot Collector used via SDK is not supported when Interop feature is enabled. [See more not supported scenarios.](https://docs.microsoft.com/azure/azure-monitor/app/snapshot-debugger-troubleshoot#not-supported-scenarios)
+<br>Snapshot Collector used via SDK is not supported when Interop feature is enabled. [See more not supported scenarios.](./snapshot-debugger-troubleshoot.md#not-supported-scenarios)
## [1.4.2](https://www.nuget.org/packages/Microsoft.ApplicationInsights.SnapshotCollector/1.4.2)
A point release to address a user-reported bug.
Augmented usage telemetry
## [1.1.0](https://www.nuget.org/packages/Microsoft.ApplicationInsights.SnapshotCollector/1.1.0)
### Changes
- Added host memory protection. This feature reduces the impact on the host machine's memory.
-- Improve the Azure portal snapshot viewing experience.
+- Improve the Azure portal snapshot viewing experience.
azure-monitor Troubleshoot Portal Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/troubleshoot-portal-connectivity.md
-- Title: Application Insights portal connectivity troubleshooting
-description: Troubleshooting guide for Application Insights portal connectivity issues
-- Previously updated : 03/09/2022----
-# "Error retrieving data" message on Application Insights portal
-
-This is a troubleshooting guide for the Application Insights portal when encountering connectivity errors similar to `Error retrieving data` or `Missing localization resource`.
-
-![image Portal connectivity error](./media/troubleshoot-portal-connectivity/troubleshoot-portal-connectivity.png)
-
-The source of the issue is likely third-party browser plugins that interfere with the portal's connectivity.
-
-To confirm that this is the source of the issue and to identify which plugin is interfering:
-
-- Open the portal in an InPrivate or Incognito window and verify the site functions correctly.
-
-- Attempt disabling plugins to identify the one that is causing the connectivity issue.
azure-monitor Azure Monitor Monitoring Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/azure-monitor-monitoring-reference.md
This section lists all the platform metrics collected automatically for Azure Mo
|Metric Type | Resource Provider / Type Namespace<br/> and link to individual metrics |
|-|--|
-| [Autoscale behaviors for VMs and AppService](/azure/azure-monitor/autoscale/autoscale-overview) | [microsoft.insights/autoscalesettings](/azure/azure-monitor/platform/metrics-supported#microsoftinsightsautoscalesettings) |
+| [Autoscale behaviors for VMs and AppService](./autoscale/autoscale-overview.md) | [microsoft.insights/autoscalesettings](/azure/azure-monitor/platform/metrics-supported#microsoftinsightsautoscalesettings) |
While technically not about Azure Monitor operations, the following metrics are collected into Azure Monitor namespaces.

|Metric Type | Resource Provider / Type Namespace<br/> and link to individual metrics |
|-|--|
-| Log Analytics agent gathered data for the [Metric alerts on logs](/azure/azure-monitor/alerts/alerts-metric-logs#metrics-and-dimensions-supported-for-logs) feature | [Microsoft.OperationalInsights/workspaces](/azure/azure-monitor/platform/metrics-supported##microsoftoperationalinsightsworkspaces)
-| [Application Insights availability tests](/azure/azure-monitor/app/availability-overview) | [Microsoft.Insights/Components](/azure/azure-monitor/essentials/metrics-supported#microsoftinsightscomponents)
+| Log Analytics agent gathered data for the [Metric alerts on logs](./alerts/alerts-metric-logs.md#metrics-and-dimensions-supported-for-logs) feature | [Microsoft.OperationalInsights/workspaces](/azure/azure-monitor/platform/metrics-supported##microsoftoperationalinsightsworkspaces)
+| [Application Insights availability tests](./app/availability-overview.md) | [Microsoft.Insights/Components](./essentials/metrics-supported.md#microsoftinsightscomponents)
See a complete list of [platform metrics for other resources types](/azure/azure-monitor/platform/metrics-supported).
This section lists all the Azure Monitor resource log category types collected.
|Resource Log Type | Resource Provider / Type Namespace<br/> and link |
|-|--|
-| [Autoscale for VMs and AppService](/azure/azure-monitor/autoscale/autoscale-overview) | [Microsoft.insights/autoscalesettings](/azure/azure-monitor/essentials/resource-logs-categories#microsoftinsightsautoscalesettings)|
-| [Application Insights availability tests](/azure/azure-monitor/app/availability-overview) | [Microsoft.insights/Components](/azure/azure-monitor/essentials/resource-logs-categories#microsoftinsightscomponents) |
+| [Autoscale for VMs and AppService](./autoscale/autoscale-overview.md) | [Microsoft.insights/autoscalesettings](./essentials/resource-logs-categories.md#microsoftinsightsautoscalesettings)|
+| [Application Insights availability tests](./app/availability-overview.md) | [Microsoft.insights/Components](./essentials/resource-logs-categories.md#microsoftinsightscomponents) |
For additional reference, see a list of [all resource logs category types supported in Azure Monitor](/azure/azure-monitor/platform/resource-logs-schema).
This section refers to all of the Azure Monitor Logs Kusto tables relevant to Az
|Resource Type | Notes |
|--|-|
-| [Autoscale for VMs and AppService](/azure/azure-monitor/autoscale/autoscale-overview) | [Autoscale Tables](/azure/azure-monitor/reference/tables/tables-resourcetype#azure-monitor-autoscale-settings) |
+| [Autoscale for VMs and AppService](./autoscale/autoscale-overview.md) | [Autoscale Tables](/azure/azure-monitor/reference/tables/tables-resourcetype#azure-monitor-autoscale-settings) |
## Activity log
-For a partial list of entries that the Azure Monitor service writes to the activity log, see [Azure resource provider operations](/azure/role-based-access-control/resource-provider-operations#monitor). There may be other entries not listed here.
+For a partial list of entries that the Azure Monitor service writes to the activity log, see [Azure resource provider operations](../role-based-access-control/resource-provider-operations.md#monitor). There may be other entries not listed here.
-For more information on the schema of Activity Log entries, see [Activity Log schema](/azure/azure-monitor/essentials/activity-log-schema).
+For more information on the schema of Activity Log entries, see [Activity Log schema](./essentials/activity-log-schema.md).
## Schemas
azure-monitor Container Insights Cost https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-cost.md
After completing your analysis to determine which source or sources are generati
The following are examples of what changes you can apply to your cluster by modifying the ConfigMap file to help control cost.
-1. Disable stdout logs across all namespaces in the cluster by modifying the following in the ConfigMap file:
+1. Disable stdout logs across all namespaces in the cluster by modifying the following in the ConfigMap file for the Azure Container Insights service pulling the metrics:
``` [log_collection_settings]
azure-monitor Activity Logs Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/activity-logs-insights.md
Activity logs insights let you view information about changes to resources and resource groups in your Azure subscription. It uses information from the [Activity log](activity-log.md) to also present data about which users or services performed particular activities in the subscription. This includes which administrators deleted, updated, or created resources, and whether the activities failed or succeeded. This article explains how to enable and use Activity log insights.

## Enable Activity log insights
-The only requirement to enable Activity log insights is to [configure the Activity log to export to a Log Analytics workspace](activity-log.md#send-to-log-analytics-workspace). Pre-built [workbooks](/azure/azure-monitor/visualize/workbooks-overview) curate this data, which is stored in the [AzureActivity](/azure/azure-monitor/reference/tables/azureactivity) table in the workspace.
+The only requirement to enable Activity log insights is to [configure the Activity log to export to a Log Analytics workspace](activity-log.md#send-to-log-analytics-workspace). Pre-built [workbooks](../visualize/workbooks-overview.md) curate this data, which is stored in the [AzureActivity](/azure/azure-monitor/reference/tables/azureactivity) table in the workspace.
:::image type="content" source="media/activity-log/activity-logs-insights-main.png" lightbox="media/activity-log/activity-logs-insights-main.png" alt-text="A screenshot showing Azure Activity logs insights dashboards":::
To view Activity logs insights on a resource level:
1. At the top of the **Activity Logs Insights** page, select:
    1. A time range for which to view data from the **TimeRange** dropdown.
- * **Azure Activity Logs Entries** shows the count of Activity log records in each [activity log category](/azure/azure-monitor/essentials/activity-log-schema#categories).
+ * **Azure Activity Logs Entries** shows the count of Activity log records in each [activity log category](./activity-log-schema.md#categories).
:::image type="content" source="media/activity-log/activity-logs-insights-category-value.png" lightbox= "media/activity-log/activity-logs-insights-category-value.png" alt-text="Azure Activity Logs by Category Value":::
To view Activity logs insights on a resource level:
Learn more about:
* [Platform logs](./platform-logs-overview.md)
* [Activity log event schema](activity-log-schema.md)
-* [Creating a diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
+* [Creating a diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
azure-monitor Metrics Supported https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/metrics-supported.md
This latest update adds a new column and reorders the metrics to be alphabetical
|Metric|Exportable via Diagnostic Settings?|Metric Display Name|Unit|Aggregation Type|Description|Dimensions|
|---|---|---|---|---|---|---|
-|Average_% Available Memory|Yes|% Available Memory|Count|Average|Average_% Available Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Available Swap Space|Yes|% Available Swap Space|Count|Average|Average_% Available Swap Space. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Committed Bytes In Use|Yes|% Committed Bytes In Use|Count|Average|Average_% Committed Bytes In Use. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% DPC Time|Yes|% DPC Time|Count|Average|Average_% DPC Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Free Inodes|Yes|% Free Inodes|Count|Average|Average_% Free Inodes. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Free Space|Yes|% Free Space|Count|Average|Average_% Free Space. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Idle Time|Yes|% Idle Time|Count|Average|Average_% Idle Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Interrupt Time|Yes|% Interrupt Time|Count|Average|Average_% Interrupt Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% IO Wait Time|Yes|% IO Wait Time|Count|Average|Average_% IO Wait Time. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Nice Time|Yes|% Nice Time|Count|Average|Average_% Nice Time. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Privileged Time|Yes|% Privileged Time|Count|Average|Average_% Privileged Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Processor Time|Yes|% Processor Time|Count|Average|Average_% Processor Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Used Inodes|Yes|% Used Inodes|Count|Average|Average_% Used Inodes. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Used Memory|Yes|% Used Memory|Count|Average|Average_% Used Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Used Space|Yes|% Used Space|Count|Average|Average_% Used Space. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% Used Swap Space|Yes|% Used Swap Space|Count|Average|Average_% Used Swap Space. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_% User Time|Yes|% User Time|Count|Average|Average_% User Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Available MBytes|Yes|Available MBytes|Count|Average|Average_Available MBytes. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Available MBytes Memory|Yes|Available MBytes Memory|Count|Average|Average_Available MBytes Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Available MBytes Swap|Yes|Available MBytes Swap|Count|Average|Average_Available MBytes Swap. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Avg. Disk sec/Read|Yes|Avg. Disk sec/Read|Count|Average|Average_Avg. Disk sec/Read. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Avg. Disk sec/Transfer|Yes|Avg. Disk sec/Transfer|Count|Average|Average_Avg. Disk sec/Transfer. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Avg. Disk sec/Write|Yes|Avg. Disk sec/Write|Count|Average|Average_Avg. Disk sec/Write. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Bytes Received/sec|Yes|Bytes Received/sec|Count|Average|Average_Bytes Received/sec. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Bytes Sent/sec|Yes|Bytes Sent/sec|Count|Average|Average_Bytes Sent/sec. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Bytes Total/sec|Yes|Bytes Total/sec|Count|Average|Average_Bytes Total/sec. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Current Disk Queue Length|Yes|Current Disk Queue Length|Count|Average|Average_Current Disk Queue Length. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Disk Read Bytes/sec|Yes|Disk Read Bytes/sec|Count|Average|Average_Disk Read Bytes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Disk Reads/sec|Yes|Disk Reads/sec|Count|Average|Average_Disk Reads/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Disk Transfers/sec|Yes|Disk Transfers/sec|Count|Average|Average_Disk Transfers/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Disk Write Bytes/sec|Yes|Disk Write Bytes/sec|Count|Average|Average_Disk Write Bytes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Disk Writes/sec|Yes|Disk Writes/sec|Count|Average|Average_Disk Writes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Free Megabytes|Yes|Free Megabytes|Count|Average|Average_Free Megabytes. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Free Physical Memory|Yes|Free Physical Memory|Count|Average|Average_Free Physical Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Free Space in Paging Files|Yes|Free Space in Paging Files|Count|Average|Average_Free Space in Paging Files. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Free Virtual Memory|Yes|Free Virtual Memory|Count|Average|Average_Free Virtual Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Logical Disk Bytes/sec|Yes|Logical Disk Bytes/sec|Count|Average|Average_Logical Disk Bytes/sec. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Page Reads/sec|Yes|Page Reads/sec|Count|Average|Average_Page Reads/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Page Writes/sec|Yes|Page Writes/sec|Count|Average|Average_Page Writes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Pages/sec|Yes|Pages/sec|Count|Average|Average_Pages/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Pct Privileged Time|Yes|Pct Privileged Time|Count|Average|Average_Pct Privileged Time. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Pct User Time|Yes|Pct User Time|Count|Average|Average_Pct User Time. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Physical Disk Bytes/sec|Yes|Physical Disk Bytes/sec|Count|Average|Average_Physical Disk Bytes/sec. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Processes|Yes|Processes|Count|Average|Average_Processes. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Processor Queue Length|Yes|Processor Queue Length|Count|Average|Average_Processor Queue Length. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Size Stored In Paging Files|Yes|Size Stored In Paging Files|Count|Average|Average_Size Stored In Paging Files. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Bytes|Yes|Total Bytes|Count|Average|Average_Total Bytes. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Bytes Received|Yes|Total Bytes Received|Count|Average|Average_Total Bytes Received. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Bytes Transmitted|Yes|Total Bytes Transmitted|Count|Average|Average_Total Bytes Transmitted. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Collisions|Yes|Total Collisions|Count|Average|Average_Total Collisions. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Packets Received|Yes|Total Packets Received|Count|Average|Average_Total Packets Received. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Packets Transmitted|Yes|Total Packets Transmitted|Count|Average|Average_Total Packets Transmitted. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Rx Errors|Yes|Total Rx Errors|Count|Average|Average_Total Rx Errors. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Total Tx Errors|Yes|Total Tx Errors|Count|Average|Average_Total Tx Errors. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Uptime|Yes|Uptime|Count|Average|Average_Uptime. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Used MBytes Swap Space|Yes|Used MBytes Swap Space|Count|Average|. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Used Memory kBytes|Yes|Used Memory kBytes|Count|Average|Average_Used Memory kBytes. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Used Memory MBytes|Yes|Used Memory MBytes|Count|Average|Average_Used Memory MBytes. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Users|Yes|Users|Count|Average|Average_Users. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Average_Virtual Shared Memory|Yes|Virtual Shared Memory|Count|Average|Average_Virtual Shared Memory. Supported for: Linux. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
-|Event|Yes|Event|Count|Average|Event. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Source, EventLog, Computer, EventCategory, EventLevel, EventLevelName, EventID|
-|Heartbeat|Yes|Heartbeat|Count|Total|Heartbeat. Supported for: Linux, Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, OSType, Version, SourceComputerId|
-|Update|Yes|Update|Count|Average|Update. Supported for: Windows. Part of [metric alerts for logs feature](https://aka.ms/am-log-to-metric).|Computer, Product, Classification, UpdateState, Optional, Approved|
+|Average_% Available Memory|Yes|% Available Memory|Count|Average|Average_% Available Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Available Swap Space|Yes|% Available Swap Space|Count|Average|Average_% Available Swap Space. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Committed Bytes In Use|Yes|% Committed Bytes In Use|Count|Average|Average_% Committed Bytes In Use. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% DPC Time|Yes|% DPC Time|Count|Average|Average_% DPC Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Free Inodes|Yes|% Free Inodes|Count|Average|Average_% Free Inodes. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Free Space|Yes|% Free Space|Count|Average|Average_% Free Space. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Idle Time|Yes|% Idle Time|Count|Average|Average_% Idle Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Interrupt Time|Yes|% Interrupt Time|Count|Average|Average_% Interrupt Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% IO Wait Time|Yes|% IO Wait Time|Count|Average|Average_% IO Wait Time. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Nice Time|Yes|% Nice Time|Count|Average|Average_% Nice Time. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Privileged Time|Yes|% Privileged Time|Count|Average|Average_% Privileged Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Processor Time|Yes|% Processor Time|Count|Average|Average_% Processor Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Used Inodes|Yes|% Used Inodes|Count|Average|Average_% Used Inodes. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Used Memory|Yes|% Used Memory|Count|Average|Average_% Used Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Used Space|Yes|% Used Space|Count|Average|Average_% Used Space. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% Used Swap Space|Yes|% Used Swap Space|Count|Average|Average_% Used Swap Space. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_% User Time|Yes|% User Time|Count|Average|Average_% User Time. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Available MBytes|Yes|Available MBytes|Count|Average|Average_Available MBytes. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Available MBytes Memory|Yes|Available MBytes Memory|Count|Average|Average_Available MBytes Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Available MBytes Swap|Yes|Available MBytes Swap|Count|Average|Average_Available MBytes Swap. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Avg. Disk sec/Read|Yes|Avg. Disk sec/Read|Count|Average|Average_Avg. Disk sec/Read. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Avg. Disk sec/Transfer|Yes|Avg. Disk sec/Transfer|Count|Average|Average_Avg. Disk sec/Transfer. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Avg. Disk sec/Write|Yes|Avg. Disk sec/Write|Count|Average|Average_Avg. Disk sec/Write. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Bytes Received/sec|Yes|Bytes Received/sec|Count|Average|Average_Bytes Received/sec. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Bytes Sent/sec|Yes|Bytes Sent/sec|Count|Average|Average_Bytes Sent/sec. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Bytes Total/sec|Yes|Bytes Total/sec|Count|Average|Average_Bytes Total/sec. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Current Disk Queue Length|Yes|Current Disk Queue Length|Count|Average|Average_Current Disk Queue Length. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Disk Read Bytes/sec|Yes|Disk Read Bytes/sec|Count|Average|Average_Disk Read Bytes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Disk Reads/sec|Yes|Disk Reads/sec|Count|Average|Average_Disk Reads/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Disk Transfers/sec|Yes|Disk Transfers/sec|Count|Average|Average_Disk Transfers/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Disk Write Bytes/sec|Yes|Disk Write Bytes/sec|Count|Average|Average_Disk Write Bytes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Disk Writes/sec|Yes|Disk Writes/sec|Count|Average|Average_Disk Writes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Free Megabytes|Yes|Free Megabytes|Count|Average|Average_Free Megabytes. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Free Physical Memory|Yes|Free Physical Memory|Count|Average|Average_Free Physical Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Free Space in Paging Files|Yes|Free Space in Paging Files|Count|Average|Average_Free Space in Paging Files. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Free Virtual Memory|Yes|Free Virtual Memory|Count|Average|Average_Free Virtual Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Logical Disk Bytes/sec|Yes|Logical Disk Bytes/sec|Count|Average|Average_Logical Disk Bytes/sec. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Page Reads/sec|Yes|Page Reads/sec|Count|Average|Average_Page Reads/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Page Writes/sec|Yes|Page Writes/sec|Count|Average|Average_Page Writes/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Pages/sec|Yes|Pages/sec|Count|Average|Average_Pages/sec. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Pct Privileged Time|Yes|Pct Privileged Time|Count|Average|Average_Pct Privileged Time. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Pct User Time|Yes|Pct User Time|Count|Average|Average_Pct User Time. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Physical Disk Bytes/sec|Yes|Physical Disk Bytes/sec|Count|Average|Average_Physical Disk Bytes/sec. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Processes|Yes|Processes|Count|Average|Average_Processes. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Processor Queue Length|Yes|Processor Queue Length|Count|Average|Average_Processor Queue Length. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md). |Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Size Stored In Paging Files|Yes|Size Stored In Paging Files|Count|Average|Average_Size Stored In Paging Files. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Bytes|Yes|Total Bytes|Count|Average|Average_Total Bytes. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Bytes Received|Yes|Total Bytes Received|Count|Average|Average_Total Bytes Received. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Bytes Transmitted|Yes|Total Bytes Transmitted|Count|Average|Average_Total Bytes Transmitted. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Collisions|Yes|Total Collisions|Count|Average|Average_Total Collisions. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Packets Received|Yes|Total Packets Received|Count|Average|Average_Total Packets Received. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Packets Transmitted|Yes|Total Packets Transmitted|Count|Average|Average_Total Packets Transmitted. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Rx Errors|Yes|Total Rx Errors|Count|Average|Average_Total Rx Errors. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Total Tx Errors|Yes|Total Tx Errors|Count|Average|Average_Total Tx Errors. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Uptime|Yes|Uptime|Count|Average|Average_Uptime. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Used MBytes Swap Space|Yes|Used MBytes Swap Space|Count|Average|Average_Used MBytes Swap Space. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Used Memory kBytes|Yes|Used Memory kBytes|Count|Average|Average_Used Memory kBytes. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Used Memory MBytes|Yes|Used Memory MBytes|Count|Average|Average_Used Memory MBytes. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Users|Yes|Users|Count|Average|Average_Users. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Average_Virtual Shared Memory|Yes|Virtual Shared Memory|Count|Average|Average_Virtual Shared Memory. Supported for: Linux. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, ObjectName, InstanceName, CounterPath, SourceSystem|
+|Event|Yes|Event|Count|Average|Event. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Source, EventLog, Computer, EventCategory, EventLevel, EventLevelName, EventID|
+|Heartbeat|Yes|Heartbeat|Count|Total|Heartbeat. Supported for: Linux, Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, OSType, Version, SourceComputerId|
+|Update|Yes|Update|Count|Average|Update. Supported for: Windows. Part of [metric alerts for logs feature](../alerts/alerts-metric-logs.md).|Computer, Product, Classification, UpdateState, Optional, Approved|
## Microsoft.Peering/peerings
This latest update adds a new column and reorders the metrics to be alphabetical
- [Read about metrics in Azure Monitor](../data-platform.md)
- [Create alerts on metrics](../alerts/alerts-overview.md)
-- [Export metrics to storage, Event Hub, or Log Analytics](../essentials/platform-logs-overview.md)
+- [Export metrics to storage, Event Hub, or Log Analytics](../essentials/platform-logs-overview.md)
azure-monitor Network Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/insights/network-insights-overview.md
Here are some links to troubleshooting articles for frequently used services. Fo
* [Azure VPN Gateway](../../vpn-gateway/vpn-gateway-troubleshoot.md)
* [Azure ExpressRoute](../../expressroute/expressroute-troubleshooting-expressroute-overview.md)
* [Azure Load Balancer](../../load-balancer/load-balancer-troubleshoot.md)
-* [Azure NAT Gateway](/azure/virtual-network/nat-gateway/troubleshoot-nat)
+* [Azure NAT Gateway](../../virtual-network/nat-gateway/troubleshoot-nat.md)
### Why don't I see the resources for all the subscriptions I've selected?
You can edit the workbook you see in any side-panel or detailed metric view by u
## Next steps
- Learn more about network monitoring: [What is Azure Network Watcher?](../../network-watcher/network-watcher-monitoring-overview.md)
-- Learn the scenarios workbooks are designed to support, how to create reports and customize existing reports, and more: [Create interactive reports with Azure Monitor workbooks](../visualize/workbooks-overview.md)
+- Learn the scenarios workbooks are designed to support, how to create reports and customize existing reports, and more: [Create interactive reports with Azure Monitor workbooks](../visualize/workbooks-overview.md)
azure-monitor Sql Insights Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/insights/sql-insights-troubleshoot.md
description: Learn how to troubleshoot SQL insights in Azure Monitor.
Previously updated : 1/3/2022
Last updated : 4/19/2022
# Troubleshoot SQL insights (preview)
For common cases, we provide troubleshooting tips in our logs view:
During preview of SQL Insights, you may encounter the following known issues.
* **'Login failed' error connecting to server or database**. Using certain special characters in SQL authentication passwords saved in the monitoring VM configuration or in Key Vault may prevent the monitoring VM from connecting to a SQL server or database. This set of characters includes parentheses, square and curly brackets, the dollar sign, forward and back slashes, and dot (`[ { ( ) } ] $ \ / .`).
+* Spaces in the database connection string attributes may be replaced with special characters, leading to database connection failures. For example, if the space in the `User Id` attribute is replaced with a special character, connections will fail with the **Login failed for user ''** error. To resolve the issue, edit the monitoring profile configuration and delete every special character appearing in place of a space. Some special characters may be indistinguishable from a space, so you may want to delete every space character, retype it, and save the configuration.
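Both known issues above are character-level problems in saved credentials and connection-string attributes. The following sketch is illustrative only — the helper names are invented, and only the quoted character set (`[ { ( ) } ] $ \ / .`) comes from the article. It flags problematic password characters and detects space lookalikes such as a non-breaking space in the `User Id` attribute:

```python
# Hypothetical helpers for spotting the characters the known-issues list
# says can break SQL authentication from the monitoring VM.
PROBLEM_CHARS = set("[{()}]$\\/.")

def find_problem_chars(password: str) -> set:
    """Return the characters in a saved password that appear in the
    set known to cause 'Login failed' errors."""
    return set(password) & PROBLEM_CHARS

def has_space_lookalikes(attribute_value: str) -> bool:
    """Detect characters that render like a space but are not one,
    e.g. a non-breaking space (U+00A0)."""
    return any(ch.isspace() and ch != " " for ch in attribute_value)

print(sorted(find_problem_chars("p@ss(word)")))   # ['(', ')']
print(has_space_lookalikes("User\u00a0Id"))       # True
```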
## Best practices
azure-monitor Cost Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/cost-logs.md
Some solutions have more specific policies about free data ingestion. For exampl
See the documentation for different services and solutions for any unique billing calculations.
## Commitment Tiers
-In addition to the Pay-As-You-Go model, Log Analytics has **Commitment Tiers**, which can save you as much as 30 percent compared to the Pay-As-You-Go price. With commitment tier pricing, you can commit to buy data ingestion starting at 100 GB/day at a lower price than Pay-As-You-Go pricing. Any usage above the commitment level (overage) is billed at that same price per GB as provided by the current commitment tier. The commitment tiers have a 31-day commitment period from the time a commitment tier is selected.
+In addition to the Pay-As-You-Go model, Log Analytics has **Commitment Tiers**, which can save you as much as 30 percent compared to the Pay-As-You-Go price. With commitment tier pricing, you can commit to buy data ingestion for a workspace, starting at 100 GB/day, at a lower price than Pay-As-You-Go pricing. Any usage above the commitment level (overage) is billed at that same price per GB as provided by the current commitment tier. The commitment tiers have a 31-day commitment period from the time a commitment tier is selected.
- During the commitment period, you can change to a higher commitment tier (which restarts the 31-day commitment period), but you can't move back to Pay-As-You-Go or to a lower commitment tier until after you finish the commitment period.
- At the end of the commitment period, the workspace retains the selected commitment tier, and the workspace can be moved to Pay-As-You-Go or to a different commitment tier at any time.
-Billing for the commitment tiers is done on a daily basis. See [Azure Monitor pricing](https://azure.microsoft.com/pricing/details/monitor/) for a detailed listing of the commitment tiers and their prices.
+Billing for the commitment tiers is done per workspace on a daily basis. If the workspace is part of a [dedicated cluster](#dedicated-clusters), the billing is done for the cluster (see below). See [Azure Monitor pricing](https://azure.microsoft.com/pricing/details/monitor/) for a detailed listing of the commitment tiers and their prices.
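As a back-of-the-envelope illustration of the rule above — overage is billed at the same effective per-GB rate as the commitment tier — here is a minimal sketch. The $196/day figure is a hypothetical placeholder, not an actual tier price; see the pricing page for current numbers:

```python
# Hypothetical sketch of per-workspace daily billing under a commitment tier.
# tier_price is the flat daily charge for the commitment level; overage is
# billed at the tier's effective per-GB rate, per the paragraph above.
def daily_charge(usage_gb: float, commitment_gb: float, tier_price: float) -> float:
    per_gb = tier_price / commitment_gb            # effective per-GB rate
    overage = max(0.0, usage_gb - commitment_gb)   # usage above the commitment
    return tier_price + overage * per_gb

# 100 GB/day commitment at a hypothetical $196/day:
print(daily_charge(80, 100, 196.0))   # under the commitment: flat 196.0
print(daily_charge(120, 100, 196.0))  # 20 GB overage billed at the tier rate
```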
> [!TIP]
> The **Usage and estimated costs** menu item for each Log Analytics workspace shows an estimate of your monthly charges at each commitment level. You should periodically review this information to determine if you can reduce your charges by moving to another tier. See [Usage and estimated costs](../usage-estimated-costs.md#usage-and-estimated-costs) for information on this view.
-
-> [!NOTE]
-> Starting June 2, 2021, **Capacity Reservations** were renamed to **Commitment Tiers**. Data collected above your commitment tier level (overage) is now billed at the same price-per-GB as the current commitment tier level, lowering costs compared to the old method of billing at the Pay-As-You-Go rate, and reducing the need for users with large data volumes to fine-tune their commitment level. Three new commitment tiers were also added: 1000, 2000, and 5000 GB/day.
-
## Dedicated clusters
An [Azure Monitor Logs dedicated cluster](logs-dedicated-clusters.md) is a collection of workspaces in a single managed Azure Data Explorer cluster. Dedicated clusters support advanced features such as [customer-managed keys](customer-managed-keys.md) and use the same commitment tier pricing model as workspaces, although they must have a commitment level of at least 500 GB/day. Any usage above the commitment level (overage) is billed at that same price per GB as provided by the current commitment tier. There is no Pay-As-You-Go option for clusters.
azure-monitor Tutorial Custom Logs Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-custom-logs-api.md
In this tutorial, you learn to:
To complete this tutorial, you need the following:
- Log Analytics workspace where you have at least [contributor rights](manage-access.md#manage-access-using-azure-permissions).
-- [Permissions to create Data Collection Rule objects](/azure/azure-monitor/essentials/data-collection-rule-overview#permissions) in the workspace.
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
## Collect workspace details
Start by gathering information that you'll need from your workspace.
The cache that drives IntelliSense may take up to 24 hours to update.
- [Complete a similar tutorial using the Azure portal.](tutorial-custom-logs.md)
- [Read more about custom logs.](custom-logs-overview.md)
-- [Learn more about writing transformation queries](../essentials/data-collection-rule-transformations.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-rule-transformations.md)
azure-monitor Tutorial Custom Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-custom-logs.md
In this tutorial, you learn to:
To complete this tutorial, you need the following:
- Log Analytics workspace where you have at least [contributor rights](manage-access.md#manage-access-using-azure-permissions).
-- [Permissions to create Data Collection Rule objects](/azure/azure-monitor/essentials/data-collection-rule-overview#permissions) in the workspace.
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
## Overview of tutorial
Following is sample data that you can use for the tutorial. Alternatively, you c
- [Complete a similar tutorial using the Azure portal.](tutorial-custom-logs-api.md) - [Read more about custom logs.](custom-logs-overview.md)-- [Learn more about writing transformation queries](../essentials/data-collection-rule-transformations.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-rule-transformations.md)
azure-monitor Monitor Azure Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/monitor-azure-monitor.md
Last updated 04/07/2022
When you have critical applications and business processes relying on Azure resources, you want to monitor those resources for their availability, performance, and operation.
-This article describes the monitoring data generated by Azure Monitor. Azure Monitor uses [itself](/azure/azure-monitor/overview) to monitor certain parts of its own functionality. You can monitor:
+This article describes the monitoring data generated by Azure Monitor. Azure Monitor uses [itself](./overview.md) to monitor certain parts of its own functionality. You can monitor:
- Autoscale operations - Monitoring operations in the audit log
- If you're unfamiliar with the features of Azure Monitor common to all Azure services that use it, read [Monitoring Azure resources with Azure Monitor](/azure/azure-monitor/essentials/monitor-azure-resource).
+ If you're unfamiliar with the features of Azure Monitor common to all Azure services that use it, read [Monitoring Azure resources with Azure Monitor](./essentials/monitor-azure-resource.md).
For an overview showing where autoscale and the audit log fit into Azure Monitor, see [Introduction to Azure Monitor](overview.md).
The **Overview** page in the Azure portal for Azure Monitor shows links and tuto
## Monitoring data
-Azure Monitor collects the same kinds of monitoring data as other Azure resources that are described in [Monitoring data from Azure resources](/azure/azure-monitor/essentials/monitor-azure-resource#monitoring-data-from-Azure-resources).
+Azure Monitor collects the same kinds of monitoring data as other Azure resources that are described in [Monitoring data from Azure resources](./essentials/monitor-azure-resource.md#monitoring-data-from-azure-resources).
See [Monitoring *Azure Monitor* data reference](azure-monitor-monitoring-reference.md) for detailed information on the metrics and logs metrics created by Azure Monitor.
The metrics and logs you can collect are discussed in the following sections.
## Analyzing metrics
-You can analyze metrics for *Azure Monitor* with metrics from other Azure services using metrics explorer by opening **Metrics** from the **Azure Monitor** menu. See [Getting started with Azure Metrics Explorer](/azure/azure-monitor/essentials/metrics-getting-started) for details on using this tool.
+You can analyze metrics for *Azure Monitor* with metrics from other Azure services using metrics explorer by opening **Metrics** from the **Azure Monitor** menu. See [Getting started with Azure Metrics Explorer](./essentials/metrics-getting-started.md) for details on using this tool.
For a list of the platform metrics collected for Azure Monitor into itself, see [Azure Monitor monitoring data reference](azure-monitor-monitoring-reference.md#metrics).
-For reference, you can see a list of [all resource metrics supported in Azure Monitor](/azure/azure-monitor/essentials/metrics-supported).
+For reference, you can see a list of [all resource metrics supported in Azure Monitor](./essentials/metrics-supported.md).
<!-- Optional: Call out additional information to help your customers. For example, you can include additional information here about how to use metrics explorer specifically for your service. Remember that the UI is subject to change quite often so you will need to maintain these screenshots yourself if you add them in. -->
For reference, you can see a list of [all resource metrics supported in Azure Mo
Data in Azure Monitor Logs is stored in tables where each table has its own set of unique properties.
-All resource logs in Azure Monitor have the same fields followed by service-specific fields. The common schema is outlined in [Azure Monitor resource log schema](/azure/azure-monitor/essentials/resource-logs-schema) The schemas for autoscale resource logs are found in the [Azure Monitor Data Reference](azure-monitor-monitoring-reference.md#resource-logs)
+All resource logs in Azure Monitor have the same fields followed by service-specific fields. The common schema is outlined in [Azure Monitor resource log schema](./essentials/resource-logs-schema.md). The schemas for autoscale resource logs are found in the [Azure Monitor Data Reference](azure-monitor-monitoring-reference.md#resource-logs).
-The [Activity log](/azure/azure-monitor/essentials/activity-log) is a type of platform log in Azure that provides insight into subscription-level events. You can view it independently or route it to Azure Monitor Logs, where you can do much more complex queries using Log Analytics.
+The [Activity log](./essentials/activity-log.md) is a type of platform log in Azure that provides insight into subscription-level events. You can view it independently or route it to Azure Monitor Logs, where you can do much more complex queries using Log Analytics.
For a list of the types of resource logs collected for Azure Monitor, see [Monitoring Azure Monitor data reference](azure-monitor-monitoring-reference.md#resource-logs).
These are now listed in the [Log Analytics user interface](./logs/queries.md).
## Alerts
-Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues in your system before your customers notice them. You can set alerts on [metrics](/azure/azure-monitor/alerts/alerts-metric-overview), [logs](/azure/azure-monitor/alerts/alerts-unified-log), and the [activity log](/azure/azure-monitor/alerts/activity-log-alerts). Different types of alerts have benefits and drawbacks.
+Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues in your system before your customers notice them. You can set alerts on [metrics](./alerts/alerts-metric-overview.md), [logs](./alerts/alerts-unified-log.md), and the [activity log](./alerts/activity-log-alerts.md). Different types of alerts have benefits and drawbacks.
-For an in-depth discussion of using alerts with autoscale, see [Troubleshoot Azure autoscale](/azure/azure-monitor/autoscale/autoscale-troubleshoot).
+For an in-depth discussion of using alerts with autoscale, see [Troubleshoot Azure autoscale](./autoscale/autoscale-troubleshoot.md).
## Next steps - See [Monitoring Azure Monitor data reference](azure-monitor-monitoring-reference.md) for a reference of the metrics, logs, and other important values created by Azure Monitor to monitor itself.-- See [Monitoring Azure resources with Azure Monitor](/azure/azure-monitor/essentials/monitor-azure-resource) for details on monitoring Azure resources.
+- See [Monitoring Azure resources with Azure Monitor](./essentials/monitor-azure-resource.md) for details on monitoring Azure resources.
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/monitor-reference.md
The following table lists Azure services and the data they collect into Azure Mo
| [Microsoft Power BI](/power-bi/power-bi-overview) | Microsoft.PowerBI/tenants | No | [**Yes**](./essentials/resource-logs-categories.md#microsoftpowerbitenants) | | | | [Microsoft Power BI](/power-bi/power-bi-overview) | Microsoft.PowerBI/tenants/workspaces | No | [**Yes**](./essentials/resource-logs-categories.md#microsoftpowerbitenantsworkspaces) | | | | [Power BI Embedded](/azure/power-bi-embedded/) | Microsoft.PowerBIDedicated/capacities | [**Yes**](./essentials/metrics-supported.md#microsoftpowerbidedicatedcapacities) | [**Yes**](./essentials/resource-logs-categories.md#microsoftpowerbidedicatedcapacities) | | |
- | [Azure Purview](../purview/index.yml) | Microsoft.Purview/accounts | [**Yes**](./essentials/metrics-supported.md#microsoftpurviewaccounts) | [**Yes**](./essentials/resource-logs-categories.md#microsoftpurviewaccounts) | | |
+ | [Microsoft Purview](../purview/index.yml) | Microsoft.Purview/accounts | [**Yes**](./essentials/metrics-supported.md#microsoftpurviewaccounts) | [**Yes**](./essentials/resource-logs-categories.md#microsoftpurviewaccounts) | | |
| [Azure Site Recovery](../site-recovery/index.yml) | Microsoft.RecoveryServices/vaults | [**Yes**](./essentials/metrics-supported.md#microsoftrecoveryservicesvaults) | [**Yes**](./essentials/resource-logs-categories.md#microsoftrecoveryservicesvaults) | | | | [Azure Relay](../azure-relay/relay-what-is-it.md) | Microsoft.Relay/namespaces | [**Yes**](./essentials/metrics-supported.md#microsoftrelaynamespaces) | [**Yes**](./essentials/resource-logs-categories.md#microsoftrelaynamespaces) | | | | [Azure Resource Manager](../azure-resource-manager/index.yml) | Microsoft.Resources/subscriptions | [**Yes**](./essentials/metrics-supported.md#microsoftresourcessubscriptions) | No | | |
azure-monitor Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/whats-new.md
This article lists significant changes to Azure Monitor documentation.
### Application Insights
-**New articles**
--- [Error retrieving data message on Application Insights portal](app/troubleshoot-portal-connectivity.md)-- [Troubleshooting Azure Application Insights auto-instrumentation](app/auto-instrumentation-troubleshoot.md)- **Updated articles** - [Application Insights API for custom events and metrics](app/api-custom-events-metrics.md)
azure-resource-manager Azure Subscription Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/azure-subscription-service-limits.md
The following table applies to v1, v2, Standard, and WAF SKUs unless otherwise s
[!INCLUDE [notification-hub-limits](../../../includes/notification-hub-limits.md)]
-## Azure Purview limits
+## Microsoft Purview limits
-The latest values for Azure Purview quotas can be found in the [Azure Purview quota page](../../purview/how-to-manage-quotas.md).
+The latest values for Microsoft Purview quotas can be found in the [Microsoft Purview quota page](../../purview/how-to-manage-quotas.md).
## Service Bus limits
azure-signalr Concept Connection String https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/concept-connection-string.md
Besides access key, SignalR service also supports other types of authentication
### Azure Active Directory Application
-You can use [Azure AD application](/azure/active-directory/develop/app-objects-and-service-principals) to connect to SignalR service. As long as the application has the right permission to access SignalR service, no access key is needed.
+You can use an [Azure AD application](../active-directory/develop/app-objects-and-service-principals.md) to connect to the SignalR service. As long as the application has the right permissions to access the SignalR service, no access key is needed.
To use Azure AD authentication, you need to remove `AccessKey` from the connection string and add `AuthType=aad`. You also need to specify the credentials of your Azure AD application, including the client ID, client secret, and tenant ID. The connection string will look as follows:
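A sketch of what such a connection string can look like; the endpoint and credential values below are placeholders:

```
Endpoint=https://<resource-name>.service.signalr.net;AuthType=aad;ClientId=<client-id>;ClientSecret=<client-secret>;TenantId=<tenant-id>;Version=1.0;
```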
For more information about how to authenticate using Azure AD application, see t
### Managed identity
-You can also use [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) to authenticate with SignalR service.
+You can also use a [managed identity](../active-directory/managed-identities-azure-resources/overview.md) to authenticate with the SignalR service.
There are two types of managed identity. To use a system-assigned identity, you just need to add `AuthType=aad` to the connection string:
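For example, a sketch with a placeholder endpoint (for a user-assigned identity, an additional `ClientId=<client-id>` entry would be expected):

```
Endpoint=https://<resource-name>.service.signalr.net;AuthType=aad;Version=1.0;
```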
For more information about how to configure managed identity, see this [article]
The connection string contains the HTTP endpoint for the app server to connect to the SignalR service. This is also the endpoint the server will return to clients in the negotiate response, so clients can also connect to the service.
-But in some applications there may be an additional component in front of SignalR service and all client connections need to go through that component first (to gain additional benefits like network security, [Azure Application Gateway](/azure/application-gateway/overview) is a common service that provides such functionality).
+But in some applications there may be an additional component in front of the SignalR service, and all client connections need to go through that component first to gain additional benefits like network security. [Azure Application Gateway](../application-gateway/overview.md) is a common service that provides such functionality.
In such a case, the client will need to connect to an endpoint different from the SignalR service. Instead of manually replacing the endpoint on the client side, you can add `ClientEndpoint` to the connection string:
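A sketch with placeholder values, where clients connect through the component at `ClientEndpoint` while the server still talks to the service endpoint:

```
Endpoint=https://<resource-name>.service.signalr.net;AccessKey=<access-key>;ClientEndpoint=https://<client-endpoint>;Version=1.0;
```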
In a local development environment, the config is usually stored in file (appset
* Use the .NET secret manager (`dotnet user-secrets set Azure:SignalR:ConnectionString "<connection_string>"`) * Set the connection string in an environment variable named `Azure__SignalR__ConnectionString` (the colon needs to be replaced with a double underscore in the [environment variable config provider](/dotnet/core/extensions/configuration-providers#environment-variable-configuration-provider)).
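The two options above can be sketched as commands; `<connection-string>` is a placeholder:

```shell
# Option 1: .NET secret manager (keeps the value out of source control)
dotnet user-secrets set Azure:SignalR:ConnectionString "<connection-string>"

# Option 2: environment variable; ':' becomes a double underscore
export Azure__SignalR__ConnectionString="<connection-string>"
```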
-In production environment, you can use other Azure services to manage config/secrets like Azure [Key Vault](/azure/key-vault/general/overview) and [App Configuration](/azure/azure-app-configuration/overview). See their documentation to learn how to set up config provider for those services.
+In production environment, you can use other Azure services to manage config/secrets like Azure [Key Vault](../key-vault/general/overview.md) and [App Configuration](../azure-app-configuration/overview.md). See their documentation to learn how to set up config provider for those services.
> [!NOTE] > Even if you're setting the connection string directly in code, it's not recommended to hardcode it in source code; you should still first read the connection string from a secret store like Key Vault and pass it to `AddAzureSignalR()`.
azure-signalr Signalr Concept Internals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/signalr-concept-internals.md
Once the application server is started,
- For ASP.NET Core SignalR, Azure SignalR Service SDK opens 5 WebSocket connections per hub to SignalR Service. - For ASP.NET SignalR, Azure SignalR Service SDK opens 5 WebSocket connections per hub to SignalR Service, and one per application WebSocket connection.
-5 WebSocket connections is the default value that can be changed in [configuration](https://github.com/Azure/azure-signalr/blob/dev/docs/run-asp-net-core.md#connectioncount).
+5 WebSocket connections is the default value, which can be changed in [configuration](https://github.com/Azure/azure-signalr/blob/dev/docs/run-asp-net-core.md#connectioncount). Note that this configures the initial number of server connections the SDK starts. While the app server is connected to the SignalR service, the service might send load-balancing messages to the server, and the SDK will start new server connections to the service for better performance.
Messages to and from clients will be multiplexed into these connections.
azure-signalr Signalr Howto Scale Multi Instances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/signalr-howto-scale-multi-instances.md
private class CustomRouter : EndpointRouterDecorator
## Dynamic Scale ServiceEndpoints
-From SDK version 1.5.0, we're enabling dynamic scale ServiceEndpoints for ASP.NET Core version first. So you don't have to restart app server when you need to add/remove a ServiceEndpoint. As ASP.NET Core is supporting default configuration like `appsettings.json` with `reloadOnChange: true`, you don't need to change a code and it's supported by nature. And if you'd like to add some customized configuration and work with hot-reload, please refer to [this](https://docs.microsoft.com/aspnet/core/fundamentals/configuration/?view=aspnetcore-3.1).
+Starting with SDK version 1.5.0, we're enabling dynamic scale for ServiceEndpoints, for the ASP.NET Core version first, so you don't have to restart the app server when you need to add or remove a ServiceEndpoint. Because ASP.NET Core supports default configuration sources like `appsettings.json` with `reloadOnChange: true`, no code changes are needed. If you'd like to add customized configuration and work with hot reload, refer to [this article](/aspnet/core/fundamentals/configuration/?view=aspnetcore-3.1).
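As a sketch of the default-configuration approach, multiple ServiceEndpoints can live in `appsettings.json` (the endpoint names and connection-string values below are placeholders) and be added or removed while the app server is running:

```json
{
  "Azure": {
    "SignalR": {
      "Endpoints": {
        "east-region-a": "<connection-string-1>",
        "backup": "<connection-string-2>"
      }
    }
  }
}
```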
> [!NOTE] >
In this guide, you learned about how to configure multiple instances in the same
Support for multiple endpoints can also be used in high availability and disaster recovery scenarios. > [!div class="nextstepaction"]
-> [Setup SignalR Service for disaster recovery and high availability](./signalr-concept-disaster-recovery.md)
+> [Set up SignalR Service for disaster recovery and high availability](./signalr-concept-disaster-recovery.md)
azure-sql Active Directory Interactive Connect Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/active-directory-interactive-connect-azure-sql-db.md
Last updated 04/06/2022
This article provides a C# program that connects to Azure SQL Database. The program uses interactive mode authentication, which supports [Azure AD Multi-Factor Authentication](../../active-directory/authentication/concept-mfa-howitworks.md).
-For more information about Multi-Factor Authentication support for SQL tools, see [Using multi-factor Azure Active Directory authentication](/azure/azure-sql/database/authentication-mfa-ssms-overview).
+For more information about Multi-Factor Authentication support for SQL tools, see [Using multi-factor Azure Active Directory authentication](./authentication-mfa-ssms-overview.md).
## Multi-Factor Authentication for Azure SQL Database
For more information about Azure AD admins and users for Azure SQL Database, see
The C# example relies on the [Microsoft.Data.SqlClient](/sql/connect/ado-net/introduction-microsoft-data-sqlclient-namespace) namespace. For more information, see [Using Azure Active Directory authentication with SqlClient](/sql/connect/ado-net/sql/azure-active-directory-authentication). > [!NOTE]
-> [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) uses the Azure Active Directory Authentication Library (ADAL), which will be deprecated. If you're using the [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) namespace for Azure Active Directory authentication, migrate applications to [Microsoft.Data.SqlClient](/sql/connect/ado-net/introduction-microsoft-data-sqlclient-namespace) and the [Microsoft Authentication Library (MSAL)](/azure/active-directory/develop/msal-migration). For more information about using Azure AD authentication with SqlClient, see [Using Azure Active Directory authentication with SqlClient](/sql/connect/ado-net/sql/azure-active-directory-authentication).
+> [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) uses the Azure Active Directory Authentication Library (ADAL), which will be deprecated. If you're using the [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) namespace for Azure Active Directory authentication, migrate applications to [Microsoft.Data.SqlClient](/sql/connect/ado-net/introduction-microsoft-data-sqlclient-namespace) and the [Microsoft Authentication Library (MSAL)](../../active-directory/develop/msal-migration.md). For more information about using Azure AD authentication with SqlClient, see [Using Azure Active Directory authentication with SqlClient](/sql/connect/ado-net/sql/azure-active-directory-authentication).
## Verify with SQL Server Management Studio
azure-sql Analyze Prevent Deadlocks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/analyze-prevent-deadlocks.md
GO
## Use Azure Storage Explorer
-[Azure Storage Explorer](/azure/vs-azure-tools-storage-manage-with-storage-explorer) is a standalone application that simplifies working with event file targets stored in blobs in Azure Storage. You can use Storage Explorer to:
+[Azure Storage Explorer](../../vs-azure-tools-storage-manage-with-storage-explorer.md) is a standalone application that simplifies working with event file targets stored in blobs in Azure Storage. You can use Storage Explorer to:
-- [Create a blob container](/azure/vs-azure-tools-storage-explorer-blobs#create-a-blob-container) to hold XEvent session data.-- [Get the shared access signature (SAS)](/azure/vs-azure-tools-storage-explorer-blobs#get-the-sas-for-a-blob-container) for a blob container.
+- [Create a blob container](../../vs-azure-tools-storage-explorer-blobs.md#create-a-blob-container) to hold XEvent session data.
+- [Get the shared access signature (SAS)](../../vs-azure-tools-storage-explorer-blobs.md#get-the-sas-for-a-blob-container) for a blob container.
- As mentioned in [Collect deadlock graphs in Azure SQL Database with Extended Events](#collect-deadlock-graphs-in-azure-sql-database-with-extended-events), the read, write, and list permissions are required. - Remove any leading `?` character from the `Query string` to use the value as the secret when [creating a database scoped credential](?tabs=event-file#create-a-database-scoped-credential).-- [View and download](/azure/vs-azure-tools-storage-explorer-blobs#view-a-blob-containers-contents) extended event files from a blob container.
+- [View and download](../../vs-azure-tools-storage-explorer-blobs.md#view-a-blob-containers-contents) extended event files from a blob container.
[Download Azure Storage Explorer.](https://azure.microsoft.com/features/storage-explorer/).
Learn more about performance in Azure SQL Database:
- [SET TRANSACTION ISOLATION LEVEL](/sql/t-sql/statements/set-transaction-isolation-level-transact-sql) - [Azure SQL Database: Improving Performance Tuning with Automatic Tuning](/Shows/Data-Exposed/Azure-SQL-Database-Improving-Performance-Tuning-with-Automatic-Tuning) - [Deliver consistent performance with Azure SQL](/learn/modules/azure-sql-performance/)-- [Retry logic for transient errors](troubleshoot-common-connectivity-issues.md#retry-logic-for-transient-errors).
+- [Retry logic for transient errors](troubleshoot-common-connectivity-issues.md#retry-logic-for-transient-errors).
azure-sql Authentication Aad Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/authentication-aad-configure.md
For more information about CLI commands, see [az sql server](/cli/azure/sql/serv
## Configure your client computers > [!NOTE]
-> [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) uses the Azure Active Directory Authentication Library (ADAL), which will be deprecated. If you're using the [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) namespace for Azure Active Directory authentication, migrate applications to [Microsoft.Data.SqlClient](/sql/connect/ado-net/introduction-microsoft-data-sqlclient-namespace) and the [Microsoft Authentication Library (MSAL)](/azure/active-directory/develop/msal-migration). For more information about using Azure AD authentication with SqlClient, see [Using Azure Active Directory authentication with SqlClient](/sql/connect/ado-net/sql/azure-active-directory-authentication).
+> [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) uses the Azure Active Directory Authentication Library (ADAL), which will be deprecated. If you're using the [System.Data.SqlClient](/dotnet/api/system.data.sqlclient) namespace for Azure Active Directory authentication, migrate applications to [Microsoft.Data.SqlClient](/sql/connect/ado-net/introduction-microsoft-data-sqlclient-namespace) and the [Microsoft Authentication Library (MSAL)](../../active-directory/develop/msal-migration.md). For more information about using Azure AD authentication with SqlClient, see [Using Azure Active Directory authentication with SqlClient](/sql/connect/ado-net/sql/azure-active-directory-authentication).
> > SSMS and SSDT still uses the Azure Active Directory Authentication Library (ADAL). If you want to continue using *ADAL.DLL* in your applications, you can use the links in this section to install the latest SSMS, ODBC, and OLE DB driver that contains the latest *ADAL.DLL* library. On all client machines, from which your applications or users connect to SQL Database or Azure Synapse using Azure AD identities, you must install the following software: - .NET Framework 4.6 or later from [https://msdn.microsoft.com/library/5a4x27ek.aspx](/dotnet/framework/install/guide-for-developers).-- [Microsoft Authentication Library (MSAL)](/azure/active-directory/develop/msal-migration) or Azure Active Directory Authentication Library for SQL Server (*ADAL.DLL*). Below are the download links to install the latest SSMS, ODBC, and OLE DB driver that contains the *ADAL.DLL* library.
+- [Microsoft Authentication Library (MSAL)](../../active-directory/develop/msal-migration.md) or Azure Active Directory Authentication Library for SQL Server (*ADAL.DLL*). Below are the download links to install the latest SSMS, ODBC, and OLE DB driver that contains the *ADAL.DLL* library.
- [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) - [ODBC Driver 17 for SQL Server](/sql/connect/odbc/download-odbc-driver-for-sql-server?view=sql-server-ver15&preserve-view=true) - [OLE DB Driver 18 for SQL Server](/sql/connect/oledb/download-oledb-driver-for-sql-server?view=sql-server-ver15&preserve-view=true)
Guidance on troubleshooting issues with Azure AD authentication can be found in
[11]: ./media/authentication-aad-configure/active-directory-integrated.png [12]: ./media/authentication-aad-configure/12connect-using-pw-auth2.png
-[13]: ./media/authentication-aad-configure/13connect-to-db2.png
+[13]: ./media/authentication-aad-configure/13connect-to-db2.png
azure-sql Data Discovery And Classification Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/data-discovery-and-classification-overview.md
You can use the following SQL drivers to retrieve classification metadata:
## FAQ - Advanced classification capabilities
-**Question**: Will [Azure Purview](../../purview/overview.md) replace SQL Data Discovery & Classification or will SQL Data Discovery & Classification be retired soon?
-**Answer**: We continue to support SQL Data Discovery & Classification and encourage you to adopt [Azure Purview](../../purview/overview.md) which has richer capabilities to drive advanced classification capabilities and data governance. If we decide to retire any service, feature, API or SKU, you will receive advance notice including a migration or transition path. Learn more about Microsoft Lifecycle policies here.
+**Question**: Will [Microsoft Purview](../../purview/overview.md) replace SQL Data Discovery & Classification or will SQL Data Discovery & Classification be retired soon?
+**Answer**: We continue to support SQL Data Discovery & Classification and encourage you to adopt [Microsoft Purview](../../purview/overview.md), which has richer capabilities for advanced classification and data governance. If we decide to retire any service, feature, API, or SKU, you will receive advance notice, including a migration or transition path. Learn more about Microsoft Lifecycle policies here.
## Next steps - Consider configuring [Azure SQL Auditing](../../azure-sql/database/auditing-overview.md) for monitoring and auditing access to your classified sensitive data. - For a presentation that includes data Discovery & Classification, see [Discovering, classifying, labeling & protecting SQL data | Data Exposed](https://www.youtube.com/watch?v=itVi9bkJUNc).-- To classify your Azure SQL Databases and Azure Synapse Analytics with Azure Purview labels using T-SQL commands, see [Classify your Azure SQL data using Azure Purview labels](../../sql-database/scripts/sql-database-import-purview-labels.md).
+- To classify your Azure SQL Databases and Azure Synapse Analytics with Microsoft Purview labels using T-SQL commands, see [Classify your Azure SQL data using Microsoft Purview labels](../../sql-database/scripts/sql-database-import-purview-labels.md).
azure-sql Maintenance Window https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/maintenance-window.md
Previously updated : 04/04/2022 Last updated : 04/19/2022 # Maintenance window
Choosing a maintenance window other than the default is currently available in t
| Switzerland North | Yes | Yes | | | Switzerland West | Yes | | | | UAE Central | Yes | | |
-| UAE North | Yes | | |
+| UAE North | Yes | Yes | |
| UK South | Yes | Yes | Yes | | UK West | Yes | Yes | | | US Gov Arizona | Yes | | |
azure-sql Troubleshoot Memory Errors Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/troubleshoot-memory-errors-issues.md
ORDER BY max_query_max_used_memory DESC, avg_query_max_used_memory DESC;
### Extended events In addition to the previous information, it may be helpful to capture a trace of the activities on the server to thoroughly investigate an out of memory issue in Azure SQL Database.
-There are two ways to capture traces in SQL Server; Extended Events (XEvents) and Profiler Traces. However, [SQL Server Profiler](/sql/tools/sql-server-profiler/sql-server-profiler) is deprecated trace technology not supported for Azure SQL Database. [Extended Events](/sql/relational-databases/extended-events/extended-events) is the newer tracing technology that allows more versatility and less impact to the observed system, and its interface is integrated into SQL Server Management Studio (SSMS). For more information on querying extended events in Azure SQL Database, see [Extended events in Azure SQL Database](/azure/azure-sql/database/xevent-db-diff-from-svr).
+There are two ways to capture traces in SQL Server: Extended Events (XEvents) and Profiler traces. However, [SQL Server Profiler](/sql/tools/sql-server-profiler/sql-server-profiler) is a deprecated trace technology that is not supported for Azure SQL Database. [Extended Events](/sql/relational-databases/extended-events/extended-events) is the newer tracing technology that allows more versatility and less impact on the observed system, and its interface is integrated into SQL Server Management Studio (SSMS). For more information on querying extended events in Azure SQL Database, see [Extended events in Azure SQL Database](./xevent-db-diff-from-svr.md).
Refer to the document that explains how to use the [Extended Events New Session Wizard](/sql/relational-databases/extended-events/quick-start-extended-events-in-sql-server) in SSMS. For Azure SQL databases, however, SSMS provides an Extended Events subfolder under each database in Object Explorer. Use an Extended Events session to capture these useful events, and identify the queries generating them:
If out of memory errors persist in Azure SQL Database, file an Azure support req
- [Troubleshooting connectivity issues and other errors with Azure SQL Database and Azure SQL Managed Instance](troubleshoot-common-errors-issues.md) - [Troubleshoot transient connection errors in SQL Database and SQL Managed Instance](troubleshoot-common-connectivity-issues.md) - [Demonstrating Intelligent Query Processing](https://github.com/Microsoft/sql-server-samples/tree/master/samples/features/intelligent-query-processing) -- [Resource management in Azure SQL Database](resource-limits-logical-server.md#memory).
+- [Resource management in Azure SQL Database](resource-limits-logical-server.md#memory).
azure-sql Winauth Azuread Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/winauth-azuread-overview.md
Last updated 03/01/2022
## Key capabilities and scenarios
-As customers modernize their infrastructure, application, and data tiers, they also modernize their identity management capabilities by shifting to Azure AD. Azure SQL offers multiple [Azure AD Authentication](/azure/azure-sql/database/authentication-aad-overview) options:
+As customers modernize their infrastructure, application, and data tiers, they also modernize their identity management capabilities by shifting to Azure AD. Azure SQL offers multiple [Azure AD Authentication](../database/authentication-aad-overview.md) options:
- 'Azure Active Directory - Password' offers authentication with Azure AD credentials - 'Azure Active Directory - Universal with MFA' adds multi-factor authentication
azure-sql Winauth Azuread Run Trace Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/winauth-azuread-run-trace-managed-instance.md
To use Windows Authentication to connect to and run a trace against a managed in
- To create or modify extended events sessions, ensure that your account has the [server permission](/sql/t-sql/statements/grant-server-permissions-transact-sql) of ALTER ANY EVENT SESSION on the managed instance. - To create or modify traces in SQL Server Profiler, ensure that your account has the [server permission](/sql/t-sql/statements/grant-server-permissions-transact-sql) of ALTER TRACE on the managed instance.
-If you have not yet enabled Windows authentication for Azure AD principals against your managed instance, you may run a trace against a managed instance using an [Azure AD Authentication](/azure/azure-sql/database/authentication-aad-overview) option, including:
+If you have not yet enabled Windows authentication for Azure AD principals against your managed instance, you may run a trace against a managed instance using an [Azure AD Authentication](../database/authentication-aad-overview.md) option, including:
- 'Azure Active Directory - Password' - 'Azure Active Directory - Universal with MFA'
azure-sql Winauth Azuread Setup Incoming Trust Based Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/winauth-azuread-setup-incoming-trust-based-flow.md
To implement the incoming trust-based authentication flow, first ensure that the
|Prerequisite |Description | ||| |Client must run Windows 10, Windows Server 2012, or a higher version of Windows. | |
-|Clients must be joined to AD. The domain must have a functional level of Windows Server 2012 or higher. | You can determine if the client is joined to AD by running the [dsregcmd command](/azure/active-directory/devices/troubleshoot-device-dsregcmd): `dsregcmd.exe /status` |
+|Clients must be joined to AD. The domain must have a functional level of Windows Server 2012 or higher. | You can determine if the client is joined to AD by running the [dsregcmd command](../../active-directory/devices/troubleshoot-device-dsregcmd.md): `dsregcmd.exe /status` |
|Azure AD Hybrid Authentication Management Module. | This PowerShell module provides management features for on-premises setup. | |Azure tenant. | | |Azure subscription under the same Azure AD tenant you plan to use for authentication.| |
Install-Module -Name AzureADHybridAuthenticationManagement -AllowClobber
- Enter the password for your Azure AD global administrator account. - If your organization uses other modern authentication methods such as Azure Multi-Factor Authentication (MFA) or smart card, follow the sign-in instructions as prompted.
- If this is the first time you're configuring Azure AD Kerberos settings, the [Get-AzureAdKerberosServer cmdlet](/azure/active-directory/authentication/howto-authentication-passwordless-security-key-on-premises#view-and-verify-the-azure-ad-kerberos-server) will display empty information, as in the following sample output:
+ If this is the first time you're configuring Azure AD Kerberos settings, the [Get-AzureAdKerberosServer cmdlet](../../active-directory/authentication/howto-authentication-passwordless-security-key-on-premises.md#view-and-verify-the-azure-ad-kerberos-server) will display empty information, as in the following sample output:
``` ID :
Install-Module -Name AzureADHybridAuthenticationManagement -AllowClobber
1. Add the Trusted Domain Object.
- Run the [Set-AzureAdKerberosServer PowerShell cmdlet](/azure/active-directory/authentication/howto-authentication-passwordless-security-key-on-premises#create-a-kerberos-server-object) to add the Trusted Domain Object. Be sure to include `-SetupCloudTrust` parameter. If there is no Azure AD service account, this command will create a new Azure AD service account. If there is an Azure AD service account already, this command will only create the requested Trusted Domain object.
+ Run the [Set-AzureAdKerberosServer PowerShell cmdlet](../../active-directory/authentication/howto-authentication-passwordless-security-key-on-premises.md#create-a-kerberos-server-object) to add the Trusted Domain Object. Be sure to include the `-SetupCloudTrust` parameter. If there is no Azure AD service account, this command creates a new Azure AD service account. If there is already an Azure AD service account, the command creates only the requested Trusted Domain object.
```powershell Set-AzureAdKerberosServer -Domain $domain `
Install-Module -Name AzureADHybridAuthenticationManagement -AllowClobber
## Configure the Group Policy Object (GPO)
-1. Identify your [Azure AD tenant ID](/azure/active-directory/fundamentals/active-directory-how-to-find-tenant).
+1. Identify your [Azure AD tenant ID](../../active-directory/fundamentals/active-directory-how-to-find-tenant.md).
1. Deploy the following Group Policy setting to client machines using the incoming trust-based flow:
Learn more about implementing Windows Authentication for Azure AD principals on
- [Configure Azure SQL Managed Instance for Windows Authentication for Azure Active Directory (Preview)](winauth-azuread-kerberos-managed-instance.md) - [What is Windows Authentication for Azure Active Directory principals on Azure SQL Managed Instance? (Preview)](winauth-azuread-overview.md)-- [How to set up Windows Authentication for Azure SQL Managed Instance using Azure Active Directory and Kerberos (Preview)](winauth-azuread-setup.md)
+- [How to set up Windows Authentication for Azure SQL Managed Instance using Azure Active Directory and Kerberos (Preview)](winauth-azuread-setup.md)
azure-sql Winauth Azuread Setup Modern Interactive Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/winauth-azuread-setup-modern-interactive-flow.md
There is no AD to Azure AD set up required for enabling software running on Azur
|Prerequisite |Description | ||| |Clients must run Windows 10 20H1, Windows Server 2022, or a higher version of Windows. | |
-|Clients must be joined to Azure AD or Hybrid Azure AD. | You can determine if this prerequisite is met by running the [dsregcmd command](/azure/active-directory/devices/troubleshoot-device-dsregcmd): `dsregcmd.exe /status` |
+|Clients must be joined to Azure AD or Hybrid Azure AD. | You can determine if this prerequisite is met by running the [dsregcmd command](../../active-directory/devices/troubleshoot-device-dsregcmd.md): `dsregcmd.exe /status` |
|Application must connect to the managed instance via an interactive session. | This supports applications such as SQL Server Management Studio (SSMS) and web applications, but won't work for applications that run as a service. | |Azure AD tenant. | | |Azure AD Connect installed. | Hybrid environments where identities exist both in Azure AD and AD. |
Learn more about implementing Windows Authentication for Azure AD principals on
- [How Windows Authentication for Azure SQL Managed Instance is implemented with Azure Active Directory and Kerberos (Preview)](winauth-implementation-aad-kerberos.md) - [How to set up Windows Authentication for Azure AD with the incoming trust-based flow (Preview)](winauth-azuread-setup-incoming-trust-based-flow.md) - [Configure Azure SQL Managed Instance for Windows Authentication for Azure Active Directory (Preview)](winauth-azuread-kerberos-managed-instance.md)-- [Troubleshoot Windows Authentication for Azure AD principals on Azure SQL Managed Instance](winauth-azuread-troubleshoot.md)
+- [Troubleshoot Windows Authentication for Azure AD principals on Azure SQL Managed Instance](winauth-azuread-troubleshoot.md)
azure-sql Winauth Azuread Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/winauth-azuread-setup.md
Following this, a system administrator configures authentication flows. Two auth
### Synchronize AD with Azure AD
-Customers should first implement [Azure AD Connect](/azure/active-directory/hybrid/whatis-azure-ad-connect) to integrate on-premises directories with Azure AD.
+Customers should first implement [Azure AD Connect](../../active-directory/hybrid/whatis-azure-ad-connect.md) to integrate on-premises directories with Azure AD.
### Select which authentication flow(s) you will implement
The following prerequisites are required to implement the modern interactive aut
|Prerequisite |Description | ||| |Clients must run Windows 10 20H1, Windows Server 2022, or a higher version of Windows. | |
-|Clients must be joined to Azure AD or Hybrid Azure AD. | You can determine if this prerequisite is met by running the [dsregcmd command](/azure/active-directory/devices/troubleshoot-device-dsregcmd): `dsregcmd.exe /status` |
+|Clients must be joined to Azure AD or Hybrid Azure AD. | You can determine if this prerequisite is met by running the [dsregcmd command](../../active-directory/devices/troubleshoot-device-dsregcmd.md): `dsregcmd.exe /status` |
|Application must connect to the managed instance via an interactive session. | This supports applications such as SQL Server Management Studio (SSMS) and web applications, but won't work for applications that run as a service. | |Azure AD tenant. | | |Azure AD Connect installed. | Hybrid environments where identities exist both in Azure AD and AD. |
The following prerequisites are required to implement the incoming trust-based a
|Prerequisite |Description | ||| |Client must run Windows 10, Windows Server 2012, or a higher version of Windows. | |
-|Clients must be joined to AD. The domain must have a functional level of Windows Server 2012 or higher. | You can determine if the client is joined to AD by running the [dsregcmd command](/azure/active-directory/devices/troubleshoot-device-dsregcmd): `dsregcmd.exe /status` |
+|Clients must be joined to AD. The domain must have a functional level of Windows Server 2012 or higher. | You can determine if the client is joined to AD by running the [dsregcmd command](../../active-directory/devices/troubleshoot-device-dsregcmd.md): `dsregcmd.exe /status` |
|Azure AD Hybrid Authentication Management Module. | This PowerShell module provides management features for on-premises setup. | |Azure tenant. | | |Azure subscription under the same Azure AD tenant you plan to use for authentication.| |
azure-vmware Deploy Arc For Azure Vmware Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/deploy-arc-for-azure-vmware-solution.md
Before you begin checking off the prerequisites, verify the following actions ha
The following items are needed to ensure you're set up to begin the onboarding process to deploy Arc for Azure VMware Solution (Preview). - A jump box virtual machine (VM) with network access to the Azure VMware Solution vCenter.
- - From the jump-box VM, verify you have access to [vCenter and NSX-T portals](/azure/azure-vmware/tutorial-configure-networking).
+ - From the jump-box VM, verify you have access to [vCenter and NSX-T portals](./tutorial-configure-networking.md).
- Verify that your Azure subscription has been enabled, or that you have connectivity to the Azure endpoints mentioned in the [Appendices](#appendices). - A resource group in the subscription where you have the Owner or Contributor role. - A minimum of three free, non-overlapping IP addresses.
The following items are needed to ensure you're set up to begin the onboarding p
At this point, you should have already deployed an Azure VMware Solution private cloud. You need a connection from your on-premises environment or your native Azure Virtual Network to the Azure VMware Solution private cloud.
-For Network planning and setup, use the [Network planning checklist - Azure VMware Solution | Microsoft Docs](/azure/azure-vmware/tutorial-network-checklist)
+For network planning and setup, use the [Network planning checklist - Azure VMware Solution | Microsoft Docs](./tutorial-network-checklist.md).
### Registration to Arc for Azure VMware Solution feature set
The guest management must be enabled on the VMware virtual machine (VM) before y
>[!NOTE] > The following conditions are necessary to enable guest management on a VM. -- The machine must be running a [Supported operating system](/azure/azure-arc/servers/agent-overview).-- The machine needs to connect through the firewall to communicate over the Internet. Make sure the [URLs](/azure/azure-arc/servers/agent-overview) listed aren't blocked.
+- The machine must be running a [Supported operating system](../azure-arc/servers/agent-overview.md).
+- The machine needs to connect through the firewall to communicate over the Internet. Make sure the [URLs](../azure-arc/servers/agent-overview.md) listed aren't blocked.
- The machine can't be behind a proxy; proxies aren't supported yet. - If you're using a Linux VM, the account must not prompt to sign in on sudo commands.
Use the following tips as a self-help guide.
**Where can I find more information related to Azure Arc resource bridge?** -- For more information, go to [Azure Arc resource bridge (preview) overview](/azure/azure-arc/resource-bridge/overview)
+- For more information, go to [Azure Arc resource bridge (preview) overview](../azure-arc/resource-bridge/overview.md)
## Appendices
Appendix 1 shows proxy URLs required by the Azure Arc-enabled private cloud. The
**Additional URL resources** - [Google Container Registry](http://gcr.io/)-- [Red Hat Quay.io](http://quay.io/)---
+- [Red Hat Quay.io](http://quay.io/)
azure-vmware Ecosystem App Monitoring Solutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/ecosystem-app-monitoring-solutions.md
Title: Application performance monitoring and troubleshooting solutions for Azure VMware Solution description: Learn about leading application monitoring and troubleshooting solutions for your Azure VMware Solution private cloud. Previously updated : 12/10/2021 Last updated : 04/11/2022 # Application performance monitoring and troubleshooting solutions for Azure VMware Solution A key objective of Azure VMware Solution is to maintain the performance and security of applications and services across VMware on Azure and on-premises. Getting there requires visibility into complex infrastructures and quickly pinpointing the root cause of service disruptions across the hybrid cloud.
-Our application performance monitoring and troubleshooting partners have industry-leading solutions in VMware-based environments that assure the availability, reliability, and responsiveness of applications and services. Our customers have adopted many of these solutions integrated with VMware NSX-T for their on-premises deployments. As one of our key principles, we want to enable them to continue to use their investments and VMware solutions running on Azure. Many of these Independent Software Vendors (ISV) have validated their solutions with Azure VMware Solution.
+Our application performance monitoring and troubleshooting partners have industry-leading solutions in VMware-based environments that assure the availability, reliability, and responsiveness of applications and services. Our customers have adopted many of these solutions integrated with VMware NSX-T Data Center for their on-premises deployments. As one of our key principles, we want to enable them to continue to use their investments and VMware solutions running on Azure. Many of these Independent Software Vendors (ISVs) have validated their solutions with Azure VMware Solution.
You can find more information about these solutions here: - [NETSCOUT](https://www.netscout.com/technology-partners/microsoft-azure)-- [Turbonomic](https://blog.turbonomic.com/turbonomic-announces-partnership-and-support-for-azure-vmware-service)
+- [Turbonomic](https://blog.turbonomic.com/turbonomic-announces-partnership-and-support-for-azure-vmware-service)
azure-vmware Ecosystem Os Vms https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/ecosystem-os-vms.md
Title: Operating system support for Azure VMware Solution virtual machines description: Learn about operating system support for your Azure VMware Solution virtual machines. Previously updated : 03/13/2022 Last updated : 04/11/2022 # Operating system support for Azure VMware Solution virtual machines
Azure VMware Solution supports a wide range of operating systems to be used in t
Check the list of operating systems and configurations supported in the [VMware Compatibility Guide](https://www.vmware.com/resources/compatibility/search.php?deviceCategory=software): create a query for ESXi 6.7 Update 3, and select all operating systems and vendors.
-Additionally to the supported operating systems by VMware on vSphere we have worked with Red Hat, SUSE and Canonical to extend the support model currently in place for Azure Virtual Machines to the workloads running on Azure VMware Solution, given that it is a first-party Azure service. You can check the following sites of vendors for more information about the benefits of running their operating system on Azure.
+In addition to the operating systems supported by VMware for vSphere, we have worked with Red Hat, SUSE, and Canonical to extend the support model currently in place for Azure Virtual Machines to workloads running on Azure VMware Solution, given that it is a first-party Azure service. For more information about the benefits of running these operating systems on Azure, see the vendors' sites:
- [Red Hat Enterprise Linux](https://access.redhat.com/ecosystem/microsoft-azure) - [Ubuntu Server](https://ubuntu.com/azure)-- [SUSE Enterprise Linux Server](https://www.suse.com/partners/alliance/microsoft/)
+- [SUSE Enterprise Linux Server](https://www.suse.com/partners/alliance/microsoft/)
azure-vmware Ecosystem Security Solutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/ecosystem-security-solutions.md
Title: Security solutions for Azure VMware Solution description: Learn about leading security solutions for your Azure VMware Solution private cloud. Previously updated : 09/15/2021 Last updated : 04/11/2022 # Security solutions for Azure VMware Solution A fundamental part of Azure VMware Solution is security. It allows customers to run their VMware-based workloads in a safe and trustable environment.
-Our security partners have industry-leading solutions in VMware-based environments that cover many aspects of the security ecosystem like threat protection and security scanning. Our customers have adopted many of these solutions integrated with VMware NSX-T for their on-premises deployments. As one of our key principles, we want to enable them to continue to use their investments and VMware solutions running on Azure. Many of these Independent Software Vendors (ISV) have validated their solutions with Azure VMware Solution.
+Our security partners have industry-leading solutions in VMware-based environments that cover many aspects of the security ecosystem like threat protection and security scanning. Our customers have adopted many of these solutions integrated with VMware NSX-T Data Center for their on-premises deployments. As one of our key principles, we want to enable them to continue to use their investments and VMware solutions running on Azure. Many of these Independent Software Vendors (ISVs) have validated their solutions with Azure VMware Solution.
You can find more information about these solutions here: - [Bitdefender](https://businessinsights.bitdefender.com/expanding-security-support-for-azure-vmware-solution) - [Trend Micro Deep Security](https://www.trendmicro.com/en_us/business/products/hybrid-cloud/deep-security.html)-- [Check Point](https://www.checkpoint.com/cloudguard/cloud-network-security/iaas-public-cloud-security/)
+- [Check Point](https://www.checkpoint.com/cloudguard/cloud-network-security/iaas-public-cloud-security/)
bastion Bastion Vm Copy Paste https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/bastion/bastion-vm-copy-paste.md
description: Learn how copy and paste to and from a Windows VM using Bastion.
Previously updated : 04/18/2022 Last updated : 04/19/2022 # Customer intent: I want to copy and paste to and from VMs using Azure Bastion.
Before you proceed, make sure you have the following items.
## <a name="configure"></a> Configure the bastion host
-By default, Azure Bastion is automatically enabled to allow copy and paste for all sessions connected through the bastion resource. You don't need to configure anything additional. This applies to both the Basic and the Standard SKU tier. If you want to disable the copy and paste feature, the Standard SKU is required.
+By default, Azure Bastion allows copy and paste for all sessions connected through the bastion resource; no additional configuration is needed. This applies to both the Basic and the Standard SKU tier. If you want to disable this feature, you can disable it for web-based clients on the configuration page of your Bastion resource.
1. To view or change your configuration, in the portal, go to your Bastion resource. 1. Go to the **Configuration** page.
bastion Vm About https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/bastion/vm-about.md
description: Learn about VM connections and features when connecting using Azure
Previously updated : 04/18/2022 Last updated : 04/19/2022
You can use a variety of different methods to connect to a target VM. Some conne
## <a name="copy-paste"></a>Copy and paste
-You can copy and paste text between your local device and the remote session. Only text copy/paste is supported. By default, this feature is enabled. If you want to disable this feature, you can change the setting on the configuration page for your bastion host. To disable, your bastion host must be configured with the Standard SKU tier.
+You can copy and paste text between your local device and the remote session. Only text copy/paste is supported. By default, this feature is enabled. If you want to disable this feature for web-based clients, you can change the setting on the configuration page for your bastion host. To disable, your bastion host must be configured with the Standard SKU tier.
For steps and more information, see [Copy and paste - Windows VMs](bastion-vm-copy-paste.md).
cdn Cdn Custom Ssl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-custom-ssl.md
If your CNAME record is in the correct format, DigiCert automatically verifies y
Automatic validation typically takes a few hours. If you don't see your domain validated in 24 hours, open a support ticket. >[!NOTE]
->If you have a Certificate Authority Authorization (CAA) record with your DNS provider, it must include DigiCert as a valid CA. A CAA record allows domain owners to specify with their DNS providers which CAs are authorized to issue certificates for their domain. If a CA receives an order for a certificate for a domain that has a CAA record and that CA is not listed as an authorized issuer, it is prohibited from issuing the certificate to that domain or subdomain. For information about managing CAA records, see [Manage CAA records](https://support.dnsimple.com/articles/manage-caa-record/). For a CAA record tool, see [CAA Record Helper](https://sslmate.com/caa/).
+>If you have a Certificate Authority Authorization (CAA) record with your DNS provider, it must include the appropriate CAs for authorization. DigiCert is the CA for Microsoft and Verizon profiles. The Akamai profile obtains certificates from three CAs: GeoTrust, Let's Encrypt, and DigiCert. If a CA receives an order for a certificate for a domain that has a CAA record and that CA is not listed as an authorized issuer, it is prohibited from issuing the certificate to that domain or subdomain. For information about managing CAA records, see [Manage CAA records](https://support.dnsimple.com/articles/manage-caa-record/). For a CAA record tool, see [CAA Record Helper](https://sslmate.com/caa/).
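For illustration, CAA records authorizing the CAs named above might look like the following in a zone file (hypothetical zone; keep only the issuers your profile actually uses):

```
; Hypothetical zone example: authorize the CAs used by the CDN profiles.
example.com.  IN  CAA  0 issue "digicert.com"
example.com.  IN  CAA  0 issue "letsencrypt.org"
example.com.  IN  CAA  0 issue "geotrust.com"
```

A domain with no CAA record permits any CA to issue; once any CAA `issue` record exists, only the listed issuers may do so.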
### Custom domain isn't mapped to your CDN endpoint
cdn Cdn Verizon Premium Rules Engine https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-verizon-premium-rules-engine.md
Previously updated : 05/31/2019 Last updated : 04/13/2022
To access the rules engine, you must first select **Manage** from the top of the
Select the **HTTP Large** tab, then select **Rules Engine**.
- ![Rules engine for HTTP](./media/cdn-rules-engine/cdn-http-rules-engine.png)
+ :::image type="content" source="./media/cdn-rules-engine/cdn-http-rules-engine.png" alt-text="Screenshot of rules engine for HTTP.":::
- Endpoints optimized for DSA:
To access the rules engine, you must first select **Manage** from the top of the
ADN is a term used by Verizon to specify DSA content. Any rules you create here are ignored by any endpoints in your profile that are not optimized for DSA.
- ![Rules engine for DSA](./media/cdn-rules-engine/cdn-dsa-rules-engine.png)
+ :::image type="content" source="./media/cdn-rules-engine/cdn-dsa-rules-engine.png" alt-text="Screenshot of rules engine for DSA.":::
## Tutorial
-1. From the **CDN profile** page, select **Manage**.
-
- ![CDN profile Manage button](./media/cdn-rules-engine/cdn-manage-btn.png)
-
- The CDN management portal opens.
+1. From the **CDN profile** page, select **Manage** to open the CDN management portal.
-2. Select the **HTTP Large** tab, then select **Rules Engine**.
-
- The options for a new rule are displayed.
-
- ![CDN new rule options](./media/cdn-rules-engine/cdn-new-rule.png)
+ :::image type="content" source="./media/cdn-rules-engine/cdn-manage-btn.png" alt-text="Screenshot of the manage button from the CDN profile.":::
+
+1. Select the **HTTP Large** tab, then select **Rules Engine**.
+
+1. Select **+ New** to create a new draft policy.
+
+ :::image type="content" source="./media/cdn-rules-engine/new-draft.png" alt-text="Screenshot of the create a new policy button.":::
+1. Give the policy a name. Select **Continue**, then select **+ Rule**.
+
+ :::image type="content" source="./media/cdn-rules-engine/new-draft-2.png" alt-text="Screenshot of the policy creation page.":::
+ > [!IMPORTANT] > The order in which multiple rules are listed affects how they are handled. A subsequent rule may override the actions specified by a previous rule. For example, if you have a rule that allows access to a resource based on a request property and a rule that denies access to all requests, the second rule overrides the first one. Rules will override earlier rules only if they interact with the same properties. >
-3. Enter a name in the **Name / Description** textbox.
+1. Enter a name in the **Name / Description** textbox.
-4. Identify the type of requests the rule applies to. Use the default match condition, **Always**.
+1. Select the **+** button and then select **Match** or **Select First Match** for the match logic. The difference between the two is described in [Request Identification](https://docs.edgecast.com/cdn/index.html#HRE/MatchesConcept.htm).
+
+1. Identify the type of requests the rule applies to. Use the default match condition, **Always**.
- ![CDN rule match condition](./media/cdn-rules-engine/cdn-request-type.png)
+ :::image type="content" source="./media/cdn-rules-engine/cdn-request-type.png" alt-text="Screenshot of the CDN rule match condition.":::
> [!NOTE] > Multiple match conditions are available in the dropdown list. For information about the currently selected match condition, select the blue informational icon to its left. >
- > For a detailed list of conditional expressions, see [Rules engine conditional expressions](cdn-verizon-premium-rules-engine-reference-match-conditions.md).
+ > For a detailed list of conditional expressions, see [Rules engine conditional expressions](cdn-verizon-premium-rules-engine-reference-match-conditions.md).
> > For a detailed list of match conditions, see [Rules engine match conditions](cdn-verizon-premium-rules-engine-reference-match-conditions.md). > >
-5. To add a new feature, select the **+** button next to **Features**. In the dropdown on the left, select **Force Internal Max-Age**. In the textbox that appears, enter **300**. Do not change the remaining default values.
+1. To add a new feature, select the **+** button in the conditional statement.
- ![CDN rule feature](./media/cdn-rules-engine/cdn-new-feature.png)
+ :::image type="content" source="./media/cdn-rules-engine/cdn-new-feature.png" alt-text="Screenshot of the CDN rules feature in a rule.":::
+1. From the *category* drop-down, select **Caching**. Then from the *feature* drop-down, select **Force Internal Max-Age**. In the text box enter the value **300**. Leave the rest of the settings as default and select **Save** to complete the configuration of the rule.
+ > [!NOTE] > Multiple features are available in the dropdown list. For information about the currently selected feature, select the blue informational icon to its left. >
To access the rules engine, you must first select **Manage** from the top of the
> >
-6. Select **Add** to save the new rule. The new rule is now awaiting approval. After it has been approved, the status changes from **Pending XML** to **Active XML**.
-
- > [!IMPORTANT]
- > Rules changes can take up to 10 minutes to propagate through Azure CDN.
- >
- >
+1. Select **Lock Draft as Policy**. Once you lock the draft into a policy, you won't be able to add or update any rules within that policy.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-builder.png" alt-text="Screenshot of the CDN policy builder.":::
+
+1. Select **Deploy Request**.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-builder-2.png" alt-text="Screenshot of the deploy request button in policy builder.":::
+
+1. If this CDN profile is new with no previous rules or production traffic, you can select the environment as **Production** in the drop-down menu. Enter a description of the environment and then select **Create Deploy Request**.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-builder-environment.png" alt-text="Screenshot of the CDN policy builder environment.":::
+
+ > [!NOTE]
+ > Once the policy has been deployed, it will take about 30 minutes to propagate. If you want to add or update more rules, you'll need to duplicate the current policy and deploy the new policy.
+
+## Add rules to an existing policy deployed in production
+
+1. Select the policy that is deployed in production.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-production-overview.png" alt-text="Screenshot of the policy production overview page.":::
+
+1. Select **Duplicate** to clone the existing policy in production.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-production-duplicate.png" alt-text="Screenshot of the duplicate button on the policy overview page.":::
+
+1. Select the pencil icon to edit an existing rule or select **+ Rule** to add a new rule to the policy.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-production-edit.png" alt-text="Screenshot of the edit button and new rule for duplicate policy." lightbox="./media/cdn-rules-engine/policy-production-edit-expanded.png":::
+
+1. Once you're happy with the updates, follow steps 10-12 in the last section to deploy the policy.
+
+## Rules Engine staging environment
+
+* The staging environment provides a sandbox where you can test the new CDN configuration end to end without impacting the production environment. This configuration allows you to replicate traffic flow through your staging network to an origin server.
+* The staging environment is designed for functional testing and is at a smaller scale than the production CDN environment. Therefore, you shouldn't use this environment for scale, high-volume, or throughput testing.
+* Traffic should be kept under 50 Mbps or 500 requests per second.
+* Changes made to the staging environment will not affect your live site environment.
+* Testing HTTPS traffic using the staging environment will result in a TLS certificate mismatch.
+* Testing mechanism:
+ * After locking a draft into a policy, select **Deploy Request**. Select the environment as **Staging** and then select **Create Deploy Request**.
+
+ :::image type="content" source="./media/cdn-rules-engine/policy-staging.png" alt-text="Screenshot of a staging policy." lightbox="./media/cdn-rules-engine/policy-staging-expanded.png":::
+
+ * Edit your local host file to create an A record for your endpoint or custom domain.
+ * Check the test asset for the custom domain in the browser and proceed without using HTTPS.
+
+ > [!NOTE]
+ > Once a policy is deployed in the staging environment, it takes about 15 minutes to propagate.
+ >
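The host-file step in the testing mechanism above can be sketched as follows; the IP address (taken from the documentation-reserved range) and the hostname are placeholders for your staging environment's address and your own endpoint or custom domain:

```text
# Hosts file location: /etc/hosts (Linux/macOS)
# or C:\Windows\System32\drivers\etc\hosts (Windows)
# Placeholder entry mapping the custom domain to the staging IP:
192.0.2.10    cdn.contoso.com
```

After saving the file, requests for that hostname from your machine resolve to the staging address, letting you exercise the staged policy without changing public DNS.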
## See also
To access the rules engine, you must first select **Manage** from the top of the
- [Rules engine match conditions](cdn-verizon-premium-rules-engine-reference-match-conditions.md) - [Rules engine conditional expressions](cdn-verizon-premium-rules-engine-reference-conditional-expressions.md) - [Rules engine features](cdn-verizon-premium-rules-engine-reference-features.md)-- [Azure Fridays: Azure CDN's powerful new premium features](https://azure.microsoft.com/documentation/videos/azure-cdns-powerful-new-premium-features/) (video)
+- [Azure Fridays: Azure CDN's powerful new premium features](https://azure.microsoft.com/documentation/videos/azure-cdns-powerful-new-premium-features/) (video)
cognitive-services Overview Ocr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Computer-vision/overview-ocr.md
The **Read** call takes images and documents as its input. They have the followi
* Supported file formats: JPEG, PNG, BMP, PDF, and TIFF * For PDF and TIFF files, up to 2000 pages (only first two pages for the free tier) are processed.
-* The file size must be less than 50 MB (6 MB for the free tier) and dimensions at least 50 x 50 pixels and at most 10000 x 10000 pixels.
+* The file size must be less than 50 MB (4 MB for the free tier) and dimensions at least 50 x 50 pixels and at most 10000 x 10000 pixels.
* The minimum height of the text to be extracted is 12 pixels for a 1024X768 image. This corresponds to about 8 font point text at 150 DPI. ## Supported languages
cognitive-services Luis Reference Prebuilt Domains https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/luis-reference-prebuilt-domains.md
Previously updated : 09/27/2019 Last updated : 04/18/2022 #source: https://raw.githubusercontent.com/Microsoft/luis-prebuilt-domains/master/README.md #acrolinx bug for exception: https://mseng.visualstudio.com/TechnicalContent/_workitems/edit/1518317 # Prebuilt domain reference for your LUIS app+ This reference provides information about the [prebuilt domains](./howto-add-prebuilt-models.md), which are prebuilt collections of intents and entities that LUIS offers. [Custom domains](luis-how-to-start-new-app.md), by contrast, start with no intents and models. You can add any prebuilt domain intents and entities to a custom model.
This reference provides information about the [prebuilt domains](./howto-add-pre
The table below summarizes the currently supported domains. Support for English is usually more complete than others.
-| Entity Type | EN-US | ZH-CN | DE | FR | ES | IT | PT-BR | JP | KO | NL | TR |
-|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
-| Calendar | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Communication | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Email | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| HomeAutomation | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Notes | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Places | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| RestaurantReservation | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| ToDo | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Utilities | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Weather | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Web | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Entity Type | EN-US | ZH-CN | DE | FR | ES | IT | PT-BR | KO | NL | TR |
+|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
+| Calendar | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Communication | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Email | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| HomeAutomation | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Notes | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Places | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| RestaurantReservation | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| ToDo | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Utilities | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Weather | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Web | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Prebuilt domains are **not supported** in: * French Canadian * Hindi * Spanish Mexican
+* Japanese
## Next steps
cognitive-services Reference Pattern Syntax https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/reference-pattern-syntax.md
description: Create entities to extract key data from user utterances in Languag
Previously updated : 04/14/2020- Last updated : 04/18/2022 # Pattern syntax
Pattern syntax is a template for an utterance. The template should contain words
> [!CAUTION] > Patterns only include machine-learning entity parents, not subentities.- Entities in patterns are surrounded by curly brackets, `{}`. Patterns can include entities, and entities with roles. [Pattern.any](concepts/entities.md#patternany-entity) is an entity only used in patterns. Pattern syntax supports the following syntax:
The **optional** syntax, with square brackets, can be nested two levels. For exa
|is a new form|matches outer optional word and non-optional words in pattern| |a new form|matches required words only|
-The **grouping** syntax, with parentheses, can be nested two levels. For example: `(({Entity1.RoleName1} | {Entity1.RoleName2} ) | {Entity2} )`. This feature allows any of the three entities to be matched.
+The **grouping** syntax, with parentheses, can be nested two levels. For example: `(({Entity1:RoleName1} | {Entity1:RoleName2} ) | {Entity2} )`. This feature allows any of the three entities to be matched.
If Entity1 is a Location with roles such as origin (Seattle) and destination (Cairo) and Entity 2 is a known building name from a list entity (RedWest-C), the following utterances would map to this pattern:
A combination of **grouping** with **or-ing** syntax has a limit of 2 vertical b
|No|( test1 &#x7c; test2 &#x7c; test3 &#x7c; ( test4 &#x7c; test5 ) ) | ## Syntax to add an entity to a pattern template+ To add an entity into the pattern template, surround the entity name with curly braces, such as `Who does {Employee} manage?`. |Pattern with entity|
To add an entity into the pattern template, surround the entity name with curly
|`Who does {Employee} manage?`| ## Syntax to add an entity and role to a pattern template+ An entity role is denoted as `{entity:role}` with the entity name followed by a colon, then the role name. To add an entity with a role into the pattern template, surround the entity name and role name with curly braces, such as `Book a ticket from {Location:Origin} to {Location:Destination}`. |Pattern with entity roles|
An entity role is denoted as `{entity:role}` with the entity name followed by a
|`Book a ticket from {Location:Origin} to {Location:Destination}`| ## Syntax to add a pattern.any to pattern template+ The Pattern.any entity allows you to add an entity of varying length to the pattern. As long as the pattern template is followed, the pattern.any can be any length. To add a **Pattern.any** entity into the pattern template, surround the Pattern.any entity with the curly braces, such as `How much does {Booktitle} cost and what format is it available in?`.
In the preceding table, the subject should be `the man from La Mancha` (a book t
To fix this exception to the pattern, add `the man from la mancha` as an explicit list match for the {subject} entity using the [authoring API for explicit list](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5ade550bd5b81c209ce2e5a8). ## Syntax to mark optional text in a template utterance+ Mark optional text in the utterance using the regular expression square bracket syntax, `[]`. The optional text can nest square brackets up to two brackets only. |Pattern with optional text|Meaning|
Learn more about patterns:
* [How to add pattern.any entity](how-to/entities.md#create-a-patternany-entity) * [Patterns Concepts](luis-concept-patterns.md)
-Understand how [sentiment](luis-reference-prebuilt-sentiment.md) is returned in the .json response.
+Understand how [sentiment](luis-reference-prebuilt-sentiment.md) is returned in the .json response.
cognitive-services Captioning Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/captioning-concepts.md
The following are aspects to consider when using captioning:
* Consider output formats such as SRT (SubRip Subtitle) and WebVTT (Web Video Text Tracks). These can be loaded onto most video players such as VLC, automatically adding the captions on to your video. > [!TIP]
-> Try the [Azure Video Analyzer for Media](/azure/azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview) as a demonstration of how you can get captions for videos that you upload.
+> Try the [Azure Video Analyzer for Media](../../azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md) as a demonstration of how you can get captions for videos that you upload.
Captioning can accompany real time or pre-recorded speech. Whether you're showing captions in real time or with a recording, you can use the [Speech SDK](speech-sdk.md) to recognize speech and get transcriptions. You can also use the [Batch transcription API](batch-transcription.md) for pre-recorded video.
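For reference, a single caption cue in the SRT format mentioned above looks like the following (the cue number, timing values, and text are illustrative only):

```text
1
00:00:01,000 --> 00:00:04,000
Welcome to the session.
```

The WebVTT equivalent begins with a `WEBVTT` header line and uses periods instead of commas as the millisecond separator (for example, `00:00:01.000 --> 00:00:04.000`).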
There are some situations where [training a custom model](custom-speech-overview
## Next steps * [Get started with speech to text](get-started-speech-to-text.md)
-* [Get speech recognition results](get-speech-recognition-results.md)
+* [Get speech recognition results](get-speech-recognition-results.md)
cognitive-services How To Use Custom Entity Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-use-custom-entity-pattern-matching.md
Use this sample code if:
If you do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents, this can be helpful since it is embedded within the SDK.
+For supported locales, see [Language support](./language-support.md?tabs=IntentRecognitionPatternMatcher).
+ ## Prerequisites Be sure you have the following items before you begin this guide:
cognitive-services How To Use Simple Language Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-use-simple-language-pattern-matching.md
Use this sample code if:
If you do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents, this can be helpful since it is embedded within the SDK.
+For supported locales, see [Language support](./language-support.md?tabs=IntentRecognitionPatternMatcher).
## Prerequisites
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/language-support.md
The following table outlines supported languages for custom keyword and keyword
| Japanese (Japan) | ja-JP | No | Yes | | Portuguese (Brazil) | pt-BR | No | Yes |
+## Intent Recognition Pattern Matcher
+
+The Intent Recognition Pattern Matcher supports the following locales:
+
+| Language | Locale (BCP-47) |
+|--|--|
+| English (United States) | `en-US` |
+ ## Next steps * [Region support](regions.md)
cognitive-services Speech To Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/speech-to-text.md
The [Speech SDK](speech-sdk.md) provides most of the functionalities that you ne
Use the following list to find the appropriate Speech SDK reference docs: -- <a href="https://aka.ms/csspeech/csharpref">C# SDK </a>-- <a href="https://aka.ms/csspeech/cppref">C++ SDK </a>-- <a href="https://aka.ms/csspeech/javaref">Java SDK </a>-- <a href="https://aka.ms/csspeech/pythonref">Python SDK</a>-- <a href="https://aka.ms/csspeech/javascriptref">JavaScript SDK</a>-- <a href="https://aka.ms/csspeech/objectivecref">Objective-C SDK </a>
+- <a href="/dotnet/api/overview/azure/cognitiveservices/client/speechservice">C# SDK </a>
+- <a href="/cpp/cognitive-services/speech/">C++ SDK </a>
+- <a href="/java/api/com.microsoft.cognitiveservices.speech">Java SDK </a>
+- <a href="/python/api/azure-cognitiveservices-speech/">Python SDK</a>
+- <a href="/javascript/api/microsoft-cognitiveservices-speech-sdk/">JavaScript SDK</a>
+- <a href="/objectivec/cognitive-services/speech/">Objective-C SDK </a>
> [!TIP] > The Speech service SDK is actively maintained and updated. To track changes, updates, and feature additions, see the [Speech SDK release notes](releasenotes.md).
For speech-to-text REST APIs, see the following resources:
## Next steps - [Get a Speech service subscription key for free](overview.md#try-the-speech-service-for-free)-- [Get the Speech SDK](speech-sdk.md)
+- [Get the Speech SDK](speech-sdk.md)
cognitive-services Smart Url Refresh https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/language-service/question-answering/how-to/smart-url-refresh.md
If these two QnA pairs have individual prompts attached to them (for example, Q1
## Next steps
-* [Question answering quickstart](/azure/cognitive-services/language-service/question-answering/quickstart/sdk?pivots=studio)
-* [Update Sources API reference](/rest/api/cognitiveservices/questionanswering/question-answering-projects/update-sources)
+* [Question answering quickstart](../quickstart/sdk.md?pivots=studio)
+* [Update Sources API reference](/rest/api/cognitiveservices/questionanswering/question-answering-projects/update-sources)
communication-services Join Teams Meeting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/join-teams-meeting.md
Additional information on required dataflows for joining Teams meetings is avail
## Chat storage
-During a Teams meeting, all chat messages sent by Teams users or Communication Services users are stored in the geographic region associated with the Microsoft 365 organization hosting the meeting. For more information, review the article [Location of data in Microsoft Teams](/microsoftteams/location-of-data-in-teams). For each Communication Services user in the meetings, there is also a copy of the most recently sent message that is stored in the geographic region associated with the Communication Services resource used to develop the Communication Services application. For more information, review the article [Region availability and data residency](/azure/communication-services/concepts/privacy).
+During a Teams meeting, all chat messages sent by Teams users or Communication Services users are stored in the geographic region associated with the Microsoft 365 organization hosting the meeting. For more information, review the article [Location of data in Microsoft Teams](/microsoftteams/location-of-data-in-teams). For each Communication Services user in the meetings, there is also a copy of the most recently sent message that is stored in the geographic region associated with the Communication Services resource used to develop the Communication Services application. For more information, review the article [Region availability and data residency](./privacy.md).
If the hosting Microsoft 365 organization has defined a retention policy that deletes chat messages for any of the Teams users in the meeting, then all copies of the most recently sent message that have been stored for Communication Services users will also be deleted in accordance with the policy. If there is not a retention policy defined, then the copies of the most recently sent message for all Communication Services users will be deleted after 30 days. For more information about Teams retention policies, review the article [Learn about retention for Microsoft Teams](/microsoft-365/compliance/retention-policies-teams).
Microsoft will indicate to you via the Azure Communication Services API that rec
- [How-to: Join a Teams meeting](../how-tos/calling-sdk/teams-interoperability.md) - [Quickstart: Join a BYOI calling app to a Teams meeting](../quickstarts/voice-video-calling/get-started-teams-interop.md)-- [Quickstart: Join a BYOI chat app to a Teams meeting](../quickstarts/chat/meeting-interop.md)
+- [Quickstart: Join a BYOI chat app to a Teams meeting](../quickstarts/chat/meeting-interop.md)
communication-services Get Started Raw Media Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/voice-video-calling/get-started-raw-media-access.md
Title: Quickstart - Add RAW media access to your app (Android) description: In this quickstart, you'll learn how to add raw media access calling capabilities to your app using Azure Communication Services.-+ - Previously updated : 11/18/2021+ Last updated : 04/19/2022
In this quickstart, you'll learn how implement raw media access using the Azure Communication Services Calling SDK for Android.
-## Outbound virtual video device
- The Azure Communication Services Calling SDK offers APIs allowing apps to generate their own video frames to send to remote participants. This quick start builds upon [QuickStart: Add 1:1 video calling to your app](./get-started-with-video-calling.md?pivots=platform-android) for Android.
-## Overview
-
-Once an outbound virtual video device is created, use DeviceManager to make a new virtual video device that behaves just like any other webcam connected to your computer or mobile phone.
+## Virtual Video Stream Overview
Since the app will be generating the video frames, the app must inform the Azure Communication Services Calling SDK about the video formats it's capable of generating. This information allows the Azure Communication Services Calling SDK to pick the best video format configuration for the network conditions at any given time. The app must register a delegate to get notified when it should start or stop producing video frames. The delegate event will inform the app which video format is more appropriate for the current network conditions.
-The following is an overview of the steps required to create an outbound virtual video device.
-
-1. Create a `VirtualDeviceIdentification` with basic identification information for the new outbound virtual video device.
-
- ```java
- VirtualDeviceIdentification deviceId = new VirtualDeviceIdentification();
- deviceId.setId("QuickStartVirtualVideoDevice");
- deviceId.setName("My First Virtual Video Device");
- ```
+The following is an overview of the steps required to create a virtual video stream.
-2. Create an array of `VideoFormat` with the video formats supported by the app. It is fine to have only one video format supported, but at least one of the provided video formats must be of the `MediaFrameKind::VideoSoftware` type. When multiple formats are provided, the order of the format in the list does not influence or prioritize which one will be used. The selected format is based on external factors like network bandwidth.
+1. Create an array of `VideoFormat` with the video formats supported by the app. It is fine to have only one video format supported, but at least one of the provided video formats must be of the `VideoFrameKind::VideoSoftware` type. When multiple formats are provided, the order of the format in the list does not influence or prioritize which one will be used. The selected format is based on external factors like network bandwidth.
```java ArrayList<VideoFormat> videoFormats = new ArrayList<VideoFormat>();
The following is an overview of the steps required to create an outbound virtual
format.setWidth(1280); format.setHeight(720); format.setPixelFormat(PixelFormat.RGBA);
- format.setMediaFrameKind(MediaFrameKind.VIDEO_SOFTWARE);
+ format.setMediaFrameKind(VideoFrameKind.VIDEO_SOFTWARE);
format.setFramesPerSecond(30); format.setStride1(1280 * 4); // It is times 4 because RGBA is a 32-bit format. videoFormats.add(format); ```
-3. Create `OutboundVirtualVideoDeviceOptions` and set `DeviceIdentification` and `VideoFormats` with the previously created objects.
+2. Create `OutgoingVirtualVideoStreamOptions` and set `VideoFormats` with the previously created object.
```java
- OutboundVirtualVideoDeviceOptions m_options = new OutboundVirtualVideoDeviceOptions();
-
- // ...
-
- m_options.setDeviceIdentification(deviceId);
- m_options.setVideoFormats(videoFormats);
+ OutgoingVirtualVideoStreamOptions options = new OutgoingVirtualVideoStreamOptions();
+ options.setVideoFormats(videoFormats);
```
-4. Make sure the `OutboundVirtualVideoDeviceOptions::OnFlowChanged` delegate is defined. This delegate will inform its listener about events requiring the app to start or stop producing video frames. In this quick start, `m_mediaFrameSender` is used as trigger to let the app know when it's time to start generating frames. Feel free to use any mechanism in your app as a trigger.
+3. Subscribe to the `OutgoingVirtualVideoStreamOptions::addOnOutgoingVideoStreamStateChangedListener` delegate. This delegate reports the state of the current stream; it's important that you don't send frames if the state isn't equal to `OutgoingVideoStreamState.STARTED`.
```java
- private MediaFrameSender m_mediaFrameSender;
+ private OutgoingVideoStreamState outgoingVideoStreamState;
- // ...
+ options.addOnOutgoingVideoStreamStateChangedListener(event -> {
- m_options.addOnFlowChangedListener(virtualDeviceFlowControlArgs -> {
- if (virtualDeviceFlowControlArgs.getMediaFrameSender().getRunningState() == VirtualDeviceRunningState.STARTED) {
- // Tell the app's frame generator to start producing frames.
- m_mediaFrameSender = virtualDeviceFlowControlArgs.getMediaFrameSender();
- } else {
- // Tell the app's frame generator to stop producing frames.
- m_mediaFrameSender = null;
- }
+ outgoingVideoStreamState = event.getOutgoingVideoStreamState();
}); ```
-5. Use `Device
+4. Make sure the `OutgoingVirtualVideoStreamOptions::addOnVideoFrameSenderChangedListener` delegate is defined. This delegate will inform its listener about events requiring the app to start or stop producing video frames. In this quick start, `mediaFrameSender` is used as a trigger to let the app know when it's time to start generating frames. Feel free to use any mechanism in your app as a trigger.
```java
- private OutboundVirtualVideoDevice m_outboundVirtualVideoDevice;
+ private VideoFrameSender mediaFrameSender;
- // ...
+ options.addOnVideoFrameSenderChangedListener(event -> {
- m_outboundVirtualVideoDevice = m_deviceManager.createOutboundVirtualVideoDevice(m_options).get();
+ mediaFrameSender = event.getMediaFrameSender();
+ });
```
-6. Tell device manager to use the recently created virtual camera on calls.
+5. Create an instance of `VirtualVideoStream` using the `OutgoingVirtualVideoStreamOptions` we created previously.
```java
- private LocalVideoStream m_localVideoStream;
-
- // ...
+ private VirtualVideoStream virtualVideoStream;
- for (VideoDeviceInfo videoDeviceInfo : m_deviceManager.getCameras())
- {
- String deviceId = videoDeviceInfo.getId();
- if (deviceId.equalsIgnoreCase("QuickStartVirtualVideoDevice")) // Same id used in step 1.
- {
- m_localVideoStream = LocalVideoStream(videoDeviceInfo, getApplicationContext());
- }
- }
+ virtualVideoStream = new VirtualVideoStream(options);
```
-7. In a non-UI thread or loop in the app, cast the `MediaFrameSender` to the appropriate type defined by the `MediaFrameKind` property of `VideoFormat`. For example, cast it to `SoftwareBasedVideoFrame` and then call the `send` method according to the number of planes defined by the MediaFormat.
+6. Once `outgoingVideoStreamState` is equal to `OutgoingVideoStreamState.STARTED`, create an instance of the `FrameGenerator` class. This class starts a non-UI thread that sends frames. Call `FrameGenerator.SetVideoFrameSender` each time we get an updated `VideoFrameSender` on the previous delegate, and cast the `VideoFrameSender` to the appropriate type defined by the `VideoFrameKind` property of `VideoFormat`. For example, cast it to `SoftwareBasedVideoFrameSender` and then call the `send` method according to the number of planes defined by the MediaFormat.
After that, create the ByteBuffer backing the video frame if needed. Then, update the content of the video frame. Finally, send the video frame to other participants with the `sendFrame` API. ```java
- java.nio.ByteBuffer plane1 = null;
- Random rand = new Random();
- byte greyValue = 0;
-
- // ...
- java.nio.ByteBuffer plane1 = null;
- Random rand = new Random();
-
- while (m_outboundVirtualVideoDevice != null) {
- while (m_mediaFrameSender != null) {
- if (m_mediaFrameSender.getMediaFrameKind() == MediaFrameKind.VIDEO_SOFTWARE) {
- SoftwareBasedVideoFrame sender = (SoftwareBasedVideoFrame) m_mediaFrameSender;
+ public class FrameGenerator {
+
+ private VideoFrameSender videoFrameSender;
+ private Thread frameIteratorThread;
+ private final Random random;
+ private volatile boolean stopFrameIterator = false;
+
+ public FrameGenerator() {
+
+ random = new Random();
+ }
+
+ public void FrameIterator() {
+
+ ByteBuffer plane = null;
+ while (!stopFrameIterator && videoFrameSender != null) {
+
+ plane = GenerateFrame(plane);
+ }
+ }
+
+ private ByteBuffer GenerateFrame(ByteBuffer plane)
+ {
+ try {
+
+ SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;
VideoFormat videoFormat = sender.getVideoFormat();
+ long timeStamp = sender.getTimestamp();
- // Gets the timestamp for when the video frame has been created.
- // This allows better synchronization with audio.
- int timeStamp = sender.getTimestamp();
+ if (plane == null || videoFormat.getStride1() * videoFormat.getHeight() != plane.capacity()) {
- // Adjusts frame dimensions to the video format that network conditions can manage.
- if (plane1 == null || videoFormat.getStride1() * videoFormat.getHeight() != plane1.capacity()) {
- plane1 = ByteBuffer.allocateDirect(videoFormat.getStride1() * videoFormat.getHeight());
- plane1.order(ByteOrder.nativeOrder());
+ plane = ByteBuffer.allocateDirect(videoFormat.getStride1() * videoFormat.getHeight());
+ plane.order(ByteOrder.nativeOrder());
}
- // Generates random gray scaled bands as video frame.
- int bandsCount = rand.nextInt(15) + 1;
+ int bandsCount = random.nextInt(15) + 1;
int bandBegin = 0; int bandThickness = videoFormat.getHeight() * videoFormat.getStride1() / bandsCount; for (int i = 0; i < bandsCount; ++i) {
- byte greyValue = (byte)rand.nextInt(254);
- java.util.Arrays.fill(plane1.array(), bandBegin, bandBegin + bandThickness, greyValue);
+
+ byte greyValue = (byte) random.nextInt(254);
+ java.util.Arrays.fill(plane.array(), bandBegin, bandBegin + bandThickness, greyValue);
bandBegin += bandThickness; }
- // Sends video frame to the other participants in the call.
- FrameConfirmation fr = sender.sendFrame(plane1, timeStamp).get();
+ FrameConfirmation fr = sender.sendFrame(plane, timeStamp).get();
- // Waits before generating the next video frame.
- // Video format defines how many frames per second app must generate.
Thread.sleep((long) (1000.0f / videoFormat.getFramesPerSecond())); }
+ catch (InterruptedException ex) {
+
+ ex.printStackTrace();
+ }
+ catch (ExecutionException ex2)
+ {
+ ex2.getMessage();
+ }
+
+ return plane;
}
- // Virtual camera hasn't been created yet.
- // Let's wait a little bit before checking again.
- // This is for demo only purposes.
- // Feel free to use a better synchronization mechanism.
- Thread.sleep(100);
+ private void StartFrameIterator()
+ {
+ frameIteratorThread = new Thread(this::FrameIterator);
+ frameIteratorThread.start();
+ }
+
+ public void StopFrameIterator()
+ {
+ try
+ {
+ if (frameIteratorThread != null)
+ {
+ stopFrameIterator = true;
+ frameIteratorThread.join();
+ frameIteratorThread = null;
+ stopFrameIterator = false;
+ }
+ }
+ catch (InterruptedException ex)
+ {
+ ex.getMessage();
+ }
+ }
+
+    public void SetVideoFrameSender(VideoFrameSender videoFrameSender) {
+
+        StopFrameIterator();
+        this.videoFrameSender = videoFrameSender;
+ StartFrameIterator();
+ }
} ```+
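As an aside, the band-fill arithmetic in `GenerateFrame` can be exercised outside the SDK. The following standalone sketch (plain Java with no Azure Communication Services types; the class and method names here are our own, not SDK APIs) fills a frame buffer with random grey bands the same way:

```java
import java.util.Arrays;
import java.util.Random;

public class GreyBandDemo {

    // Fill an RGBA frame buffer (stride = width * 4 for a 32-bit format)
    // with horizontal grey bands, mirroring the GenerateFrame logic above.
    static byte[] fillGreyBands(int stride, int height, Random random) {
        byte[] plane = new byte[stride * height];
        int bandsCount = random.nextInt(15) + 1;      // 1..15 bands
        int bandBegin = 0;
        int bandThickness = plane.length / bandsCount;
        for (int i = 0; i < bandsCount; ++i) {
            byte greyValue = (byte) random.nextInt(254);
            Arrays.fill(plane, bandBegin, bandBegin + bandThickness, greyValue);
            bandBegin += bandThickness;
        }
        return plane;
    }

    public static void main(String[] args) {
        // 1280x720 RGBA frame, as in the quickstart's VideoFormat.
        byte[] frame = fillGreyBands(1280 * 4, 720, new Random());
        System.out.println(frame.length); // 3686400
    }
}
```

Because integer division caps `bandBegin + bandThickness * bandsCount` at the buffer length, the fill never runs out of bounds regardless of the random band count.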
+## Screen Share Video Stream Overview
+
+Repeat steps 1 to 4 from the previous `VirtualVideoStream` section.
+
+Since the Android system generates the frames, you have to implement your own foreground service to capture the frames and send them through our API.
+
+The following is an overview of the steps required to create a screen share video stream.
+
+1. Add this permission to your `Manifest.xml` file inside your Android project
+
+ ```xml
+ <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
+ ```
+
+2. Create an instance of `ScreenShareVideoStream` using the `OutgoingVirtualVideoStreamOptions` we created previously.
+
+ ```java
+ private ScreenShareVideoStream screenShareVideoStream;
+
+ screenShareVideoStream = new ScreenShareVideoStream(options);
+ ```
+
+3. Request the permissions needed for screen capture on Android. Once this method is called, Android automatically calls `onActivityResult` with the request code we sent and the result of the operation. Expect `Activity.RESULT_OK` if the user granted permission; if so, attach the `screenShareVideoStream` to the call and start your own foreground service to capture the frames.
+
+ ```java
+ public void GetScreenSharePermissions() {
+
+ try {
+
+ MediaProjectionManager mediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
+ startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE);
+ } catch (Exception e) {
+
+ String error = "Could not start screen share due to failure to startActivityForResult for mediaProjectionManager screenCaptureIntent";
+ }
+ }
+
+ @Override
+ protected void onActivityResult(int requestCode, int resultCode, Intent data) {
+
+ super.onActivityResult(requestCode, resultCode, data);
+
+ if (requestCode == Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE) {
+
+ if (resultCode == Activity.RESULT_OK && data != null) {
+
+ // Attach the screenShareVideoStream to the call
+ // Start your foreground service
+ } else {
+
+ String error = "user cancelled, did not give permission to capture screen";
+ Log.e("MainActivity", error);
+ }
+ }
+ }
+ ```
+
+4. Once you receive a frame on your foreground service, send it through using the provided `VideoFrameSender`:
+
+ ```java
+ public void onImageAvailable(ImageReader reader) {
+
+ Image image = reader.acquireLatestImage();
+ if (image != null) {
+
+ final Image.Plane[] planes = image.getPlanes();
+ if (planes.length > 0) {
+
+ Image.Plane plane = planes[0];
+ final ByteBuffer buffer = plane.getBuffer();
+ try {
+
+ SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;
+ sender.sendFrame(buffer, sender.getTimestamp()).get();
+ } catch (Exception ex) {
+
+ Log.d("MainActivity", "MainActivity.onImageAvailable trace, failed to send Frame");
+ }
+ }
+
+ image.close();
+ }
+ }
+ ```
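A practical wrinkle when copying from `ImageReader` planes: `plane.getRowStride()` can be larger than `width * pixelStride`, so the raw plane buffer may contain per-row padding bytes that shouldn't be sent. The following sketch (plain Java; the helper name is illustrative and not part of the Calling SDK) repacks a padded plane buffer into a tight buffer before handing it to `sendFrame`:

```java
import java.nio.ByteBuffer;

// Hypothetical helper (not part of the Calling SDK): repack an ImageReader
// plane buffer that contains row-stride padding into a tightly packed buffer.
public class FrameRepacker {
    public static ByteBuffer repack(ByteBuffer src, int width, int height,
                                    int pixelStride, int rowStride) {
        int rowBytes = width * pixelStride;          // real pixel bytes per row
        ByteBuffer dst = ByteBuffer.allocateDirect(rowBytes * height);
        byte[] row = new byte[rowBytes];
        for (int y = 0; y < height; y++) {
            src.position(y * rowStride);             // skip padding of earlier rows
            src.get(row, 0, rowBytes);               // copy only the visible pixels
            dst.put(row);
        }
        dst.rewind();
        return dst;
    }

    public static void main(String[] args) {
        // 2x2 image, 1 byte per pixel, rowStride 3 (one padding byte per row).
        byte[] padded = {1, 2, 9, 3, 4, 9};
        ByteBuffer tight = repack(ByteBuffer.wrap(padded), 2, 2, 1, 3);
        System.out.println(tight.remaining()); // 4 pixel bytes, padding removed
    }
}
```

If `rowStride == width * pixelStride`, the plane buffer has no padding and can be sent as-is; only repack when padding is present.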
confidential-computing Confidential Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/confidential-containers.md
Marblerun supports confidential containers created with Graphene, Occlum, and EG
## Confidential Containers reference architectures - [Confidential data messaging for healthcare reference architecture and sample with Intel SGX confidential containers](https://github.com/Azure-Samples/confidential-container-samples/blob/main/confidential-healthcare-scone-confinf-onnx/README.md). -- [Confidential big-data processing with Apache Spark on AKS with Intel SGX confidential containers](https://docs.microsoft.com/azure/architecture/example-scenario/confidential/data-analytics-containers-spark-kubernetes-azure-sql).
+- [Confidential big-data processing with Apache Spark on AKS with Intel SGX confidential containers](/azure/architecture/example-scenario/confidential/data-analytics-containers-spark-kubernetes-azure-sql).
## Get in touch
Do you have questions about your implementation? Do you want to become an enable
- [Deploy AKS cluster with Intel SGX Confidential VM Nodes](./confidential-enclave-nodes-aks-get-started.md) - [Microsoft Azure Attestation](../attestation/overview.md) - [Intel SGX Confidential Virtual Machines](virtual-machine-solutions-sgx.md)-- [Azure Kubernetes Service (AKS)](../aks/intro-kubernetes.md)
+- [Azure Kubernetes Service (AKS)](../aks/intro-kubernetes.md)
container-apps Deploy Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/deploy-visual-studio-code.md
Now that you have a container app environment in Azure you can create a containe
9) Choose **External** to configure the HTTP traffic that the endpoint will accept.
-10) Leave the default value of 80 for the port, and then select **Enter** to complete the workflow.
+10) Enter 3000 for the port, and then select **Enter** to complete the workflow. Set this value to the port number that your container uses; the sample app uses port 3000.
During this process, Visual Studio Code and Azure create the container app for you. The published Docker image you created earlier is also deployed to the app. Once this process finishes, Visual Studio Code displays a notification with a link to browse to the site. Select this link to view your app in the browser.
cosmos-db Audit Restore Continuous https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/audit-restore-continuous.md
+
+ Title: Auditing the point in time restore action for continuous backup mode in Azure Cosmos DB
+description: This article provides details available to audit Azure Cosmos DB's point in time restore feature in continuous backup mode.
+++ Last updated : 04/18/2022++++
+# Audit the point in time restore action for continuous backup mode in Azure Cosmos DB
+
+Azure Cosmos DB provides a list of all the point in time restores for continuous mode that were performed on a Cosmos DB account using [Activity Logs](/azure/azure-monitor/essentials/activity-log). Activity logs can be viewed for any Cosmos DB account from the **Activity Logs** page in the Azure portal. The Activity Log shows all the operations that were triggered on the specific account. When a point in time restore is triggered, it shows up as a `Restore Database Account` operation on both the source account and the target account. The Activity Log for the source account can be used to audit restore events, and the Activity Log on the target account can be used to get updates about the progress of the restore.
+
+## Audit the restores that were triggered on a live database account
+
+When a restore is triggered on a source account, a log is emitted with the status *Started*. When the restore succeeds or fails, a new log is emitted with the status *Succeeded* or *Failed*, respectively.
+
+To get the list of just the restore operations that were triggered on a specific account, open the Activity Log of the source account and search for **Restore database account** in the search bar with the required **Timespan** filter. The `UserPrincipalName` of the user that triggered the restore can be found in the `Event initiated by` column.
++
+The parameters of the restore request can be found by clicking on the event and selecting the JSON tab:
++
+## Audit the restores that were triggered on a deleted database account
+
+For accounts that have already been deleted, there is no database account page. Instead, use the Activity Log on the subscription page to find the restores that were triggered on a deleted account. Once the Activity Log page is open, add a filter to narrow down the results to the resource group the account existed in, or use the database account name in the **Resource** filter. The Resource for the activity log is the database account on which the restore was triggered.
++
+The activity logs can also be accessed using Azure CLI or Azure PowerShell. For more information on activity logs, review [Azure Activity log - Azure Monitor](/azure/azure-monitor/essentials/activity-log).
+
+## Track the progress of the restore operation
+
+Azure Cosmos DB allows you to track the progress of the restore using the activity logs of the restored database account. Once the restore is triggered, you will see a notification with the title **Restore Account**.
++
+The account status will be *Creating*, but the account will have an Activity Log page. A new log event appears after the restore of each collection. Note that there can be a delay of 5-10 minutes before the log event appears after the actual restore of the collection is complete.
+
+ ## Next steps
+
+ * Learn more about [continuous backup](continuous-backup-restore-introduction.md) mode.
+ * Provision an account with continuous backup by using the [Azure portal](provision-account-continuous-backup.md#provision-portal), [PowerShell](provision-account-continuous-backup.md#provision-powershell), the [Azure CLI](provision-account-continuous-backup.md#provision-cli), or [Azure Resource Manager](provision-account-continuous-backup.md#provision-arm-template).
+ * [Manage permissions](continuous-backup-restore-permissions.md) required to restore data with continuous backup mode.
+ * Learn about the [resource model of continuous backup mode](continuous-backup-restore-resource-model.md).
+ * Explore the [Frequently asked questions for continuous mode](continuous-backup-restore-frequently-asked-questions.yml).
cosmos-db Continuous Backup Restore Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/continuous-backup-restore-permissions.md
# Manage permissions to restore an Azure Cosmos DB account Azure Cosmos DB allows you to isolate and restrict the restore permissions for continuous backup account to a specific role or a principal. The owner of the account can trigger a restore and assign a role to other principals to perform the restore operation. These permissions can be applied at the subscription scope as shown in the following image:
cosmos-db Restore Account Continuous Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/restore-account-continuous-backup.md
Title: Restore an Azure Cosmos DB account that uses continuous backup mode.
-description: Learn how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows how to use the event feed to identify the restore time and restore the account using Azure portal, PowerShell, CLI, or a Resource Manager template.
+description: Learn how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows how to use the event feed to identify the restore time and restore the account using Azure portal, PowerShell, CLI, or an Azure Resource Manager template.
Previously updated : 12/09/2021 Last updated : 04/18/2022 -+ # Restore an Azure Cosmos DB account that uses continuous backup mode Azure Cosmos DB's point-in-time restore feature helps you recover from an accidental change within a container, restore a deleted account, database, or container, or restore into any region where backups existed. Continuous backup mode allows you to restore to any point in time within the last 30 days.
-This article describes how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows restore the account using [Azure portal](#restore-account-portal), [PowerShell](#restore-account-powershell), [CLI](#restore-account-cli), or a [Resource Manager template](#restore-arm-template).
+This article describes how to identify the restore time and restore a live or deleted Azure Cosmos DB account. It shows how to restore the account using [Azure portal](#restore-account-portal), [PowerShell](#restore-account-powershell), [CLI](#restore-account-cli), or an [Azure Resource Manager template](#restore-arm-template).
+
+> [!NOTE]
+> Currently in preview, the restore action for Table API and Gremlin API is supported via PowerShell and the Azure CLI.
## <a id="restore-account-portal"></a>Restore an account using Azure portal
Deleting source account while a restore is in-progress could result in failure o
### Restorable timestamp for live accounts
-To restore Azure Cosmos DB live accounts that are not deleted, it is a best practice to always identify the [latest restorable timestamp](get-latest-restore-timestamp.md) for the container. You can then use this timestamp to restore the account to it's latest version.
+To restore Azure Cosmos DB live accounts that are not deleted, it is a best practice to always identify the [latest restorable timestamp](get-latest-restore-timestamp.md) for the container. You can then use this timestamp to restore the account to its latest version.
### <a id="event-feed"></a>Use event feed to identify the restore time
Use the following steps to get the restore details from Azure portal:
1. Navigate to the **Export template** pane. It opens a JSON template, corresponding to the restored account.
-1. The **resources** > **properties** > **restoreParameters** object contains the restore details. The **restoreTimestampInUtc** gives you the time at which the account was restored and the **databasesToRestore** shows the specific database and container from which the account was restored.
- ## <a id="restore-account-powershell"></a>Restore an account using Azure PowerShell Before restoring the account, install the [latest version of Azure PowerShell](/powershell/azure/install-az-ps?view=azps-6.2.1&preserve-view=true) or version higher than 6.2.0. Next connect to your Azure account and select the required subscription with the following commands:
Before restoring the account, install the [latest version of Azure PowerShell](/
```azurepowershell Select-AzSubscription -Subscription <SubscriptionName>
-### <a id="trigger-restore-ps"></a>Trigger a restore operation
+### <a id="trigger-restore-ps"></a>Trigger a restore operation for SQL API account
The following cmdlet is an example to trigger a restore operation with the restore command by using the target account, source account, location, resource group, and timestamp:
Restore-AzCosmosDBAccount `
-Location "West US" ```
+**Example 3:** Restore a Gremlin API account. This example restores the graphs *graph1* and *graph2* from *MyDB1*, and the entire database *MyDB2*, which includes all the graphs under it.
+
+```azurepowershell
+$databaseToRestore1 = New-AzCosmosDBGremlinDatabaseToRestore -DatabaseName "MyDB1" -GraphName "graph1", "graph2"
+$databaseToRestore2 = New-AzCosmosDBGremlinDatabaseToRestore -DatabaseName "MyDB2"
+
+Restore-AzCosmosDBAccount `
+    -TargetResourceGroupName "MyRG" `
+    -TargetDatabaseAccountName "Pitracct" `
+    -SourceDatabaseAccountName "SourceGremlin" `
+    -RestoreTimestampInUtc "2022-04-05T22:06:00" `
+    -DatabasesToRestore $databaseToRestore1, $databaseToRestore2 `
+    -Location "West US"
+```
+
+**Example 4:** Restore a Table API account. This example restores the tables *table1* and *table2* from *MyDB1*.
+
+```azurepowershell
+$tablesToRestore = New-AzCosmosDBTableToRestore -TableName "table1", "table2"
+
+Restore-AzCosmosDBAccount `
+ -TargetResourceGroupName "MyRG" `
+ -TargetDatabaseAccountName "Pitracct" `
+ -SourceDatabaseAccountName "SourceTable" `
+ -RestoreTimestampInUtc "2022-04-06T22:06:00" `
+    -TablesToRestore $tablesToRestore `
+    -Location "West US"
+```
### <a id="get-the-restore-details-powershell"></a>Get the restore details from the restored account
Get-AzCosmosdbMongoDBRestorableDatabase `
```
-#### List all the versions of mongodb collections of a database in a live database account
+#### List all the versions of MongoDB collections of a database in a live database account
```azurepowershell
Get-AzCosmosdbMongoDBRestorableCollection `
-Location "West US" ```
-#### List all the resources of a mongodb database account that are available to restore at a given timestamp and region
+#### List all the resources of a MongoDB database account that are available to restore at a given timestamp and region
```azurepowershell
Get-AzCosmosdbMongoDBRestorableResource `
-RestoreLocation "West US" ` -RestoreTimestamp "2020-07-20T16:09:53+0000" ```
+### <a id="enumerate-gremlin-api-ps"></a>Enumerate restorable resources for Gremlin API
+
+The enumeration cmdlets help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account, database, and graph resources.
+
+#### List all the versions of Gremlin databases in a live database account
+
+Listing all the versions of databases allows you to choose the right database in a scenario where the actual time of existence of the database is unknown.
+Run the following PowerShell command to list all the versions of databases. This command only works with live accounts. The `DatabaseAccountInstanceId` and the `Location` parameters are obtained from the `name` and `location` properties in the response of the `Get-AzCosmosDBRestorableDatabaseAccount` cmdlet. The `DatabaseAccountInstanceId` attribute refers to the `instanceId` property of the source database account being restored:
+
+```azurepowershell
+Get-AzCosmosdbGremlinRestorableDatabase `
+ -Location "East US" `
+ -DatabaseAccountInstanceId <DatabaseAccountInstanceId>
+```
+
+#### List all the versions of Gremlin graphs of a database in a live database account
+
+Use the following command to list all the versions of Gremlin API graphs. This command only works with live accounts. The `DatabaseRId` parameter is the `ResourceId` of the database you want to restore. It is the value of the `ownerResourceId` attribute found in the response of the `Get-AzCosmosdbGremlinRestorableDatabase` cmdlet. The response also includes a list of operations performed on all the graphs inside this database.
+
+```azurepowershell
+Get-AzCosmosdbGremlinRestorableGraph `
+ -DatabaseAccountInstanceId "d056a4f8-044a-436f-80c8-cd3edbc94c68" `
+ -DatabaseRId "AoQ13r==" `
+ -Location "West US"
+```
+
+#### Find databases or graphs that can be restored at any given timestamp
+
+Use the following command to get the list of databases or graphs that can be restored at any given timestamp. This command only works with live accounts.
+
+```azurepowershell
+Get-AzCosmosdbGremlinRestorableResource `
+ -DatabaseAccountInstanceId "d056a4f8-044a-436f-80c8-cd3edbc94c68" `
+ -Location "West US" `
+ -RestoreLocation "East US" `
+ -RestoreTimestamp "2020-07-20T16:09:53+0000"
+```
+
+### <a id="enumerate-table-api-ps"></a>Enumerate restorable resources for Table API
+
+The enumeration cmdlets help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account and table resources.
+
+#### List all the versions of tables of a database in a live database account
+
+Use the following command to list all the versions of tables. This command only works with live accounts.
+
+```azurepowershell
+Get-AzCosmosdbTableRestorableTable `
+    -DatabaseAccountInstanceId "d056a4f8-044a-436f-80c8-cd3edbc94c68" `
+    -Location "West US"
+```
+
+#### Find tables that can be restored at any given timestamp
+
+Use the following command to get the list of tables that can be restored at any given timestamp. This command only works with live accounts.
+
+```azurepowershell
+Get-AzCosmosdbTableRestorableResource `
+ -DatabaseAccountInstanceId "d056a4f8-044a-436f-80c8-cd3edbc94c68" `
+ -Location "West US" `
+ -RestoreLocation "East US" `
+ -RestoreTimestamp "2020-07-20T16:09:53+0000"
+```
+ ## <a id="restore-account-cli"></a>Restore an account using Azure CLI
Before restoring the account, install Azure CLI with the following steps:
1. Install the latest version of Azure CLI
- * Install the latest version of [Azure CLI](/cli/azure/install-azure-cli) or version higher than 2.26.0
+ * Install the latest version of [Azure CLI](/cli/azure/install-azure-cli) or version higher than 2.26.0.
* If you have already installed CLI, run `az upgrade` command to update to the latest version. This command will only work with CLI version higher than 2.11. If you have an earlier version, use the above link to install the latest version. 1. Sign in and select your subscription
- * Sign into your Azure account with `az login` command.
+ * Sign in to your Azure account with `az login` command.
* Select the required subscription using `az account set -s <subscriptionguid>` command.
-### <a id="trigger-restore-cli"></a>Trigger a restore operation with CLI
+### <a id="trigger-restore-cli"></a>Trigger a restore operation with Azure CLI
The simplest way to trigger a restore is by issuing the restore command with name of the target account, source account, location, resource group, timestamp (in UTC), and optionally the database and container names. The following are some examples to trigger the restore operation:
-1. Create a new Azure Cosmos DB account by restoring from an existing account.
+#### Create a new Azure Cosmos DB account by restoring from an existing account
```azurecli-interactive
The simplest way to trigger a restore is by issuing the restore command with nam
```
-2. Create a new Azure Cosmos DB account by restoring only selected databases and containers from an existing database account.
+#### Create a new Azure Cosmos DB account by restoring only selected databases and containers from an existing database account
```azurecli-interactive
The simplest way to trigger a restore is by issuing the restore command with nam
--databases-to-restore name=MyDB2 collections=Collection3 Collection4 ```
+#### Create a new Azure Cosmos DB Gremlin API account by restoring only selected databases and graphs from an existing Gremlin API account
+
+ ```azurecli-interactive
+
+ az cosmosdb restore \
+ --resource-group MyResourceGroup \
+ --target-database-account-name MyRestoredCosmosDBDatabaseAccount \
+ --account-name MySourceAccount \
+ --restore-timestamp 2022-04-13T16:03:41+0000 \
+ --location "West US" \
+ --gremlin-databases-to-restore name=MyDB1 graphs=graph1 graph2 \
+    --gremlin-databases-to-restore name=MyDB2 graphs=graph3 graph4
+ ```
+
+ #### Create a new Azure Cosmos DB Table API account by restoring only selected tables from an existing Table API account
+
+ ```azurecli-interactive
+
+ az cosmosdb restore \
+ --resource-group MyResourceGroup \
+ --target-database-account-name MyRestoredCosmosDBDatabaseAccount \
+ --account-name MySourceAccount \
+ --restore-timestamp 2022-04-14T06:03:41+0000 \
+ --location "West US" \
+ --tables-to-restore table1 table2
+ ```
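The `RestoreTimestampInUtc` and `--restore-timestamp` values above use a UTC timestamp of the shape `2022-04-14T06:03:41+0000`. If you compute the timestamp programmatically, a minimal sketch (plain Java with `java.time`; the class name is illustrative) that produces that shape is:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Illustrative helper: format an Instant as the UTC timestamp shape used by
// the restore commands above (e.g. 2022-04-14T06:03:41+0000).
public class RestoreTimestamp {
    private static final DateTimeFormatter FMT =
        DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ssZ").withZone(ZoneOffset.UTC);

    public static String format(Instant instant) {
        return FMT.format(instant);
    }

    public static void main(String[] args) {
        // The 'Z' pattern letter renders the UTC offset as +0000.
        System.out.println(format(Instant.parse("2022-04-14T06:03:41Z")));
    }
}
```

Keeping the offset explicit (`+0000`) avoids ambiguity when the timestamp is generated on a machine whose default time zone is not UTC.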
### <a id="get-the-restore-details-cli"></a>Get the restore details from the restored account
-Run the following command to get the restore details. The `az cosmosdb show` command output shows the value of `createMode` property. If the value is set to **Restore**. it indicates that the account was restored from another account. The `restoreParameters` property has further details such as `restoreSource`, which has the source account ID. The last GUID in the `restoreSource` parameter is the instanceId of the source account. And the restoreTimestamp will be under the restoreParameters object:
+Run the following command to get the restore details. The `az cosmosdb show` command output shows the value of `createMode` property. If the value is set to **Restore**, it indicates that the account was restored from another account. The `restoreParameters` property has further details such as `restoreSource`, which has the source account ID. The last GUID in the `restoreSource` parameter is the `instanceId` of the source account. And the `restoreTimestamp` will be under the `restoreParameters` object:
```azurecli-interactive az cosmosdb show --name MyCosmosDBDatabaseAccount --resource-group MyResourceGroup ```
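As noted above, the `instanceId` is the last GUID in the `restoreSource` path. When consuming `restoreParameters` from a script or SDK, it can be extracted by taking the final path segment; a minimal sketch (plain Java; the class name is hypothetical):

```java
// Hypothetical helper: the instanceId of the source account is the final
// path segment of the restoreSource resource path.
public class RestoreSourceParser {
    public static String instanceId(String restoreSource) {
        return restoreSource.substring(restoreSource.lastIndexOf('/') + 1);
    }

    public static void main(String[] args) {
        // Sample value shaped like the restoreSource shown in the output above.
        String restoreSource = "/subscriptions/00000000-0000-0000-0000-000000000000"
            + "/providers/Microsoft.DocumentDB/locations/West US"
            + "/restorableDatabaseAccounts/abcd1234-d1c0-4645-a699-abcd1234";
        System.out.println(instanceId(restoreSource));
        // prints abcd1234-d1c0-4645-a699-abcd1234
    }
}
```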
-### <a id="enumerate-sql-api"></a>Enumerate restorable resources for SQL API
+### <a id="enumerate-sql-api-cli"></a>Enumerate restorable resources for SQL API
The enumeration commands described below help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account, database, and container resources. #### List all the accounts that can be restored in the current subscription
-Run the following CLI command to list all the accounts that can be restored in the current subscription
+Run the following Azure CLI command to list all the accounts that can be restored in the current subscription:
```azurecli-interactive az cosmosdb restorable-database-account list --account-name "Pitracct" ```
-The response includes all the database accounts (both live and deleted) that can be restored and the regions that they can be restored from:
+The response includes all the database accounts (both live and deleted) that can be restored, and the regions that they can be restored from:
```json {
The response includes all the database accounts (both live and deleted) that can
"apiType": "Sql", "creationTime": "2021-01-08T23:34:11.095870+00:00", "deletionTime": null,
- "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/7133a59a-d1c0-4645-a699-6e296d6ac865",
+ "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/abcd1234-d1c0-4645-a699-abcd1234",
"identity": null, "location": "West US",
- "name": "7133a59a-d1c0-4645-a699-6e296d6ac865",
+ "name": "abcd1234-d1c0-4645-a699-abcd1234",
"restorableLocations": [ { "creationTime": "2021-01-08T23:34:11.095870+00:00",
Just like the `CreationTime` or `DeletionTime` for the account, there is a `Crea
Listing all the versions of databases allows you to choose the right database in a scenario where the actual time of existence of database is unknown.
-Run the following CLI command to list all the versions of databases. This command only works with live accounts. The `instance-id` and the `location` parameters are obtained from the `name` and `location` properties in the response of `az cosmosdb restorable-database-account list` command. The instanceId attribute is also a property of source database account that is being restored:
+Run the following Azure CLI command to list all the versions of databases. This command only works with live accounts. The `instance-id` and the `location` parameters are obtained from the `name` and `location` properties in the response of `az cosmosdb restorable-database-account list` command. The `instanceId` attribute is also a property of source database account that is being restored:
```azurecli-interactive az cosmosdb sql restorable-database list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--location "West US" ```
This command output now shows when a database was created and deleted.
```json [ {
- "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/7133a59a-d1c0-4645-a699-6e296d6ac865/restorableSqlDatabases/40e93dbd-2abe-4356-a31a-35567b777220",
+ "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/abcd1234-d1c0-4645-a699-abcd1234/restorableSqlDatabases/40e93dbd-2abe-4356-a31a-35567b777220",
.. "name": "40e93dbd-2abe-4356-a31a-35567b777220", "resource": {
This command output now shows when a database was created and deleted.
.. }, {
- "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/7133a59a-d1c0-4645-a699-6e296d6ac865/restorableSqlDatabases/243c38cb-5c41-4931-8cfb-5948881a40ea",
+ "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/abcd1234-d1c0-4645-a699-abcd1234/restorableSqlDatabases/243c38cb-5c41-4931-8cfb-5948881a40ea",
.. "name": "243c38cb-5c41-4931-8cfb-5948881a40ea", "resource": {
Use the following command to list all the versions of SQL containers. This comma
```azurecli-interactive az cosmosdb sql restorable-container list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--database-rid "OIQ1AA==" \ --location "West US" ```
Use the following command to get the list of databases or containers that can be
```azurecli-interactive az cosmosdb sql restorable-resource list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--location "West US" \ --restore-location "West US" \ --restore-timestamp "2021-01-10T01:00:00+0000"
az cosmosdb sql restorable-resource list \
] ```
-### <a id="enumerate-mongodb-api"></a>Enumerate restorable resources for MongoDB API account
+### <a id="enumerate-mongodb-api-cli"></a>Enumerate restorable resources for MongoDB API account
The enumeration commands described below help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account, database, and container resources. These commands only work for live accounts.
-#### List all the versions of mongodb databases in a live database account
+#### List all the versions of MongoDB databases in a live database account
```azurecli-interactive az cosmosdb mongodb restorable-database list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--location "West US" ```
-#### List all the versions of mongodb collections of a database in a live database account
+#### List all the versions of MongoDB collections of a database in a live database account
```azurecli-interactive az cosmosdb mongodb restorable-collection list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--database-rid "AoQ13r==" \ --location "West US" ```
az cosmosdb mongodb restorable-collection list \
```azurecli-interactive az cosmosdb mongodb restorable-resource list \
- --instance-id "7133a59a-d1c0-4645-a699-6e296d6ac865" \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
--location "West US" \ --restore-location "West US" \ --restore-timestamp "2020-07-20T16:09:53+0000" ```
-## <a id="restore-arm-template"></a>Restore using the Resource Manager template
-You can also restore an account using Resource Manager template. When defining the template include the following parameters:
-* Set the `createMode` parameter to *Restore*
-* Define the `restoreParameters`, notice that the `restoreSource` value is extracted from the output of the `az cosmosdb restorable-database-account list` command for your source account. The Instance ID attribute for your account name is used to do the restore.
-* Set the `restoreMode` parameter to *PointInTime* and configure the `restoreTimestampInUtc` value.
+### <a id="enumerate-gremlin-api-cli"></a>Enumerate restorable resources for Gremlin API account
+
+The enumeration commands described below help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account, database, and graph resources. These commands only work for live accounts.
+
+#### List all the versions of Gremlin databases in a live database account
+
+```azurecli-interactive
+az cosmosdb gremlin restorable-database list \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
+ --location "West US"
+```
+
+This command output now shows when a database was created and deleted.
+```json
+[ {
+ "id": "/subscriptions/abcd1234-b6ac-4328-a753-abcd1234/providers/Microsoft.DocumentDB/locations/eastus2euap/restorableDatabaseAccounts/abcd1234-4316-483b-8308-abcd1234/restorableGremlinDatabases/abcd1234-0e32-4036-ac9d-abcd1234",
+ "name": "abcd1234-0e32-4036-ac9d-abcd1234",
+ "resource": {
+ "eventTimestamp": "2022-02-09T17:10:18Z",
+ "operationType": "Create",
+ "ownerId": "db1",
+ "ownerResourceId": "1XUdAA==",
+ "rid": "ymn7kwAAAA=="
+ },
+ "type": "Microsoft.DocumentDB/locations/restorableDatabaseAccounts/restorableGremlinDatabases"
+
+ }
+]
+```
+
+#### List all the versions of Gremlin graphs of a database in a live database account
+
+```azurecli-interactive
+az cosmosdb gremlin restorable-graph list \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
+ --database-rid "OIQ1AA==" \
+ --location "West US"
+```
+
+This command output includes a list of operations performed on all the graphs inside this database:
+```json
+[ {
+
+ "id": "/subscriptions/23587e98-b6ac-4328-a753-03bcd3c8e744/providers/Microsoft.DocumentDB/locations/eastus2euap/restorableDatabaseAccounts/a00d591d-4316-483b-8308-44193c5f3073/restorableGraphs/1792cead-4307-4032-860d-3fc30bd46a20",
+ "name": "1792cead-4307-4032-860d-3fc30bd46a20",
+ "resource": {
+ "eventTimestamp": "2022-02-09T17:10:31Z",
+ "operationType": "Create",
+ "ownerId": "graph1",
+ "ownerResourceId": "1XUdAPv9duQ=",
+ "rid": "IcWqcQAAAA=="
+ },
+ "type": "Microsoft.DocumentDB/locations/restorableDatabaseAccounts/restorableGraphs"
+ }
+]
+```
+
+#### Find databases or graphs that can be restored at any given timestamp
+
+```azurecli-interactive
+
+az cosmosdb gremlin restorable-resource list \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
+ --location "West US" \
+ --restore-location "West US" \
+ --restore-timestamp "2021-01-10T01:00:00+0000"
+```
+```json
+[ {
+ "databaseName": "db1",
+ "graphNames": [
+ "graph1",
+ "graph3",
+ "graph2"
+ ]
+ }
+]
+```
+
+### <a id="enumerate-table-api-cli"></a>Enumerate restorable resources for Table API account
+
+The enumeration commands described below help you discover the resources that are available for restore at various timestamps. Additionally, they also provide a feed of key events on the restorable account and Table API resources. These commands only work for live accounts.
+
+#### List all the versions of tables in a live database account
+
+```azurecli-interactive
+az cosmosdb table restorable-table list \
+    --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
+ --location "West US"
+```
+```
+[ {
+ "id": "/subscriptions/23587e98-b6ac-4328-a753-03bcd3c8e744/providers/Microsoft.DocumentDB/locations/WestUS/restorableDatabaseAccounts/7e4d666a-c6ba-4e1f-a4b9-e92017c5e8df/restorableTables/59781d91-682b-4cc2-93a3-c25d03fab159",
+ "name": "59781d91-682b-4cc2-93a3-c25d03fab159",
+ "resource": {
+ "eventTimestamp": "2022-02-09T17:09:54Z",
+ "operationType": "Create",
+ "ownerId": "table1",
+ "ownerResourceId": "tOdDAKYiBhQ=",
+ "rid": "9pvDGwAAAA=="
+ },
+ "type": "Microsoft.DocumentDB/locations/restorableDatabaseAccounts/restorableTables"
+ },
+ {"id": "/subscriptions/23587e98-b6ac-4328-a753-03bcd3c8e744/providers/Microsoft.DocumentDB/locations/eastus2euap/restorableDatabaseAccounts/7e4d666a-c6ba-4e1f-a4b9-e92017c5e8df/restorableTables/2c9f35eb-a14c-4ab5-a7e0-6326c4f6b785",
+ "name": "2c9f35eb-a14c-4ab5-a7e0-6326c4f6b785",
+ "resource": {
+ "eventTimestamp": "2022-02-09T20:47:53Z",
+ "operationType": "Create",
+ "ownerId": "table3",
+ "ownerResourceId": "tOdDALBwexw=",
+ "rid": "01DtkgAAAA=="
+ },
+ "type": "Microsoft.DocumentDB/locations/restorableDatabaseAccounts/restorableTables"
+  }
+]
+```
+
+### List all the resources of a Table API account that are available to restore at a given timestamp and region
+
+```azurecli-interactive
+az cosmosdb table restorable-resource list \
+ --instance-id "abcd1234-d1c0-4645-a699-abcd1234" \
+ --location "West US" \
+ --restore-location "West US" \
+ --restore-timestamp "2020-07-20T16:09:53+0000"
+```
+```
+{
+ "tableNames": [
+ "table1",
+ "table3",
+ "table2"
+ ]
+}
+```
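
When scripting a restore, you may want to check that a specific table appears in this output before proceeding. A small sketch, assuming the JSON shape shown above:

```python
import json

# Hypothetical captured output of `az cosmosdb table restorable-resource list`
# (shape taken from the sample above).
output = json.loads('{"tableNames": ["table1", "table3", "table2"]}')

def is_restorable(table_name: str) -> bool:
    # True when the table existed at the chosen restore timestamp.
    return table_name in output["tableNames"]
```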
+
+## <a id="restore-arm-template"></a>Restore using the Azure Resource Manager template
+
+You can also restore an account by using an Azure Resource Manager (ARM) template. When defining the template, include the following parameters:
+
+### Restore SQL API or MongoDB API account using ARM template
+
+1. Set the `createMode` parameter to *Restore*.
+1. Define the `restoreParameters` object. Note that the `restoreSource` value is extracted from the output of the `az cosmosdb restorable-database-account list` command for your source account; the instance ID attribute for your account name is used to do the restore.
+1. Set the `restoreMode` parameter to *PointInTime* and configure the `restoreTimestampInUtc` value.
+
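Assembled together, the three steps above amount to a small block of account properties. A sketch in plain Python (the subscription, location, and instance ID are placeholders to substitute with your own values):

```python
# The restore-specific properties set by steps 1-3, as a plain dict.
# restoreSource comes from `az cosmosdb restorable-database-account list`.
restore_properties = {
    "createMode": "Restore",                        # step 1
    "restoreParameters": {
        "restoreSource": (                          # step 2
            "/subscriptions/<subscription-id>/providers/Microsoft.DocumentDB"
            "/locations/<location>/restorableDatabaseAccounts/<instance-id>"
        ),
        "restoreMode": "PointInTime",               # step 3
        "restoreTimestampInUtc": "2021-10-27T23:20:46Z",
    },
}
```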
+Use the following ARM template to restore an account for the Azure Cosmos DB SQL API or MongoDB API. Examples for other APIs are provided next.
```json {
You can also restore an account using Resource Manager template. When defining t
} ```
-Next deploy the template by using Azure PowerShell or CLI. The following example shows how to deploy the template with a CLI command:
+### Restore Gremlin API account using ARM template
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "resources": [
+ {
+ "name": "ademo-pitr1",
+ "type": "Microsoft.DocumentDB/databaseAccounts",
+ "apiVersion": "2016-03-31",
+ "location": "West US",
+ "properties": {
+ "locations": [
+ {
+ "locationName": "West US"
+ }
+ ],
+ "backupPolicy": {
+ "type": "Continuous"
+ },
+ "databaseAccountOfferType": "Standard",
+ "createMode": "Restore",
+ "restoreParameters": {
+ "restoreSource": "/subscriptions/2296c272-5d55-40d9-bc05-4d56dc2d7588/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/5cb9d82e-ec71-430b-b977-cd6641db85bc",
+ "restoreMode": "PointInTime",
+ "restoreTimestampInUtc": "2021-10-27T23:20:46Z",
+ "gremlinDatabasesToRestore": {
+ "databaseName": "db1",
+ "graphNames": [
+ "graph1", "graph2"
+ ]
+ }
+ }
+ }
+ }
+ ]
+}
+```
+
+### Restore Table API account using ARM template
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "resources": [
+ {
+ "name": "ademo-pitr1",
+ "type": "Microsoft.DocumentDB/databaseAccounts",
+ "apiVersion": "2016-03-31",
+ "location": "West US",
+ "properties": {
+ "locations": [
+ {
+ "locationName": "West US"
+ }
+ ],
+ "backupPolicy": {
+ "type": "Continuous"
+ },
+ "databaseAccountOfferType": "Standard",
+ "createMode": "Restore",
+ "restoreParameters": {
+ "restoreSource": "/subscriptions/1296c352-5d33-40d9-bc05-4d56dc2a7521/providers/Microsoft.DocumentDB/locations/West US/restorableDatabaseAccounts/4bcb9d82e-ec71-430b-b977-cd6641db85ad",
+ "restoreMode": "PointInTime",
+ "restoreTimestampInUtc": "2022-04-13T10:20:46Z",
+ "tablesToRestore": [
+ "table1", "table2"
+ ]
+ }
+ }
+ }
+ ]
+}
+```
+
+Next, deploy the template by using Azure PowerShell or Azure CLI. The following example shows how to deploy the template with an Azure CLI command:
```azurecli-interactive
az group deployment create -g <ResourceGroup> --template-file <RestoreTemplateFilePath>
```
cosmos-db Best Practice Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/best-practice-java.md
This article walks through the best practices for using the Azure Cosmos DB Java
| <input type="checkbox"/> | Indexing | The Azure Cosmos DB indexing policy also allows you to specify which document paths to include or exclude from indexing by using indexing paths `IndexingPolicy#getIncludedPaths()` and `IndexingPolicy#getExcludedPaths()`. Ensure that you exclude unused paths from indexing for faster writes. For a sample on how to create indexes using the SDK [visit here](performance-tips-java-sdk-v4-sql.md#indexing-policy) | | <input type="checkbox"/> | Document Size | The request charge of a specified operation correlates directly to the size of the document. We recommend reducing the size of your documents as operations on large documents cost more than operations on smaller documents. | | <input type="checkbox"/> | Enabling Query Metrics | For additional logging of your backend query executions, follow instructions on how to capture SQL Query Metrics using [Java SDK](troubleshoot-java-sdk-v4-sql.md#query-operations) |
-| <input type="checkbox"/> | SDK Logging | Use SDK logging to capture additional diagnostics information and troubleshoot latency issues. Log the [CosmosDiagnostics](/java/api/com.azure.cosmos.cosmosdiagnostics?view=azure-java-stable&preserve-view=true) in Java SDK for more detailed cosmos diagnostic information for the current request to the service. As an example use case, capture Diagnostics on any exception and on completed operations if the `CosmosDiagnostics#getDuration()` is greater than a designated threshold value (i.e. if you have an SLA of 10 seconds, then capture diagnostics when `getDuration()` > 10 seconds). It's advised to only use these diagnostics during performance testing. For more information, follow [capture diagnostics on Java SDK](/azure/cosmos-db/sql/troubleshoot-java-sdk-v4-sql#capture-the-diagnostics) |
+| <input type="checkbox"/> | SDK Logging | Use SDK logging to capture additional diagnostics information and troubleshoot latency issues. Log the [CosmosDiagnostics](/java/api/com.azure.cosmos.cosmosdiagnostics) in Java SDK for more detailed cosmos diagnostic information for the current request to the service. As an example use case, capture diagnostics on any exception and on completed operations if `CosmosDiagnostics#getDuration()` is greater than a designated threshold value (for example, with an SLA of 10 seconds, capture diagnostics when `getDuration()` > 10 seconds). It's advised to only use these diagnostics during performance testing. For more information, see [capture diagnostics on Java SDK](troubleshoot-java-sdk-v4-sql.md#capture-the-diagnostics) |
## Best practices when using Gateway mode

Azure Cosmos DB requests are made over HTTPS/REST when you use Gateway mode. They're subject to the default connection limit per hostname or IP address. You might need to tweak [maxConnectionPoolSize](/java/api/com.azure.cosmos.gatewayconnectionconfig.setmaxconnectionpoolsize?view=azure-java-stable#com-azure-cosmos-gatewayconnectionconfig-setmaxconnectionpoolsize(int)&preserve-view=true) to a different value (from 100 through 1,000) so that the client library can use multiple simultaneous connections to Azure Cosmos DB. In the Java v4 SDK, the default value of `GatewayConnectionConfig#maxConnectionPoolSize` is 1,000; set it to a different value to change the limit.
To learn more about designing your application for scale and high performance, s
Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning. * If all you know is the number of vCores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
-* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Create Sql Api Go https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-go.md
In this quickstart, you'll build a sample Go application that uses the Azure SDK
Azure Cosmos DB is a multi-model database service that lets you quickly create and query document, table, key-value, and graph databases with global distribution and horizontal scale capabilities.
-To learn more about Azure Cosmos DB, go to [Azure Cosmos DB](/azure/cosmos-db/introduction).
+To learn more about Azure Cosmos DB, go to [Azure Cosmos DB](../introduction.md).
## Prerequisites
Trying to do capacity planning for a migration to Azure Cosmos DB? You can use i
* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md) > [!div class="nextstepaction"]
-> [Import data into Azure Cosmos DB for the SQL API](../import-data.md)
+> [Import data into Azure Cosmos DB for the SQL API](../import-data.md)
cosmos-db Performance Tips Dotnet Sdk V3 Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-tips-dotnet-sdk-v3-sql.md
If you're testing at high throughput levels, or at rates that are greater than 5
## <a id="metadata-operations"></a> Metadata operations
-Do not verify a Database and/or Container exists by calling `Create...IfNotExistsAsync` and/or `Read...Async` in the hot path and/or before doing an item operation. The validation should only be done on application startup when it is necessary, if you expect them to be deleted (otherwise it's not needed). These metadata operations will generate extra end-to-end latency, have no SLA, and their own separate [limitations](/azure/cosmos-db/sql/troubleshoot-request-rate-too-large#rate-limiting-on-metadata-requests) that do not scale like data operations.
+Do not verify a Database and/or Container exists by calling `Create...IfNotExistsAsync` and/or `Read...Async` in the hot path and/or before doing an item operation. The validation should only be done on application startup when it is necessary, if you expect them to be deleted (otherwise it's not needed). These metadata operations will generate extra end-to-end latency, have no SLA, and their own separate [limitations](./troubleshoot-request-rate-too-large.md#rate-limiting-on-metadata-requests) that do not scale like data operations.
## <a id="logging-and-tracing"></a> Logging and tracing
cosmos-db Performance Tips https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-tips.md
If you're testing at high throughput levels (more than 50,000 RU/s), the client
## <a id="metadata-operations"></a> Metadata operations
-Do not verify a Database and/or Collection exists by calling `Create...IfNotExistsAsync` and/or `Read...Async` in the hot path and/or before doing an item operation. The validation should only be done on application startup when it is necessary, if you expect them to be deleted (otherwise it's not needed). These metadata operations will generate extra end-to-end latency, have no SLA, and their own separate [limitations](/azure/cosmos-db/sql/troubleshoot-request-rate-too-large#rate-limiting-on-metadata-requests) that do not scale like data operations.
+Do not verify a Database and/or Collection exists by calling `Create...IfNotExistsAsync` and/or `Read...Async` in the hot path and/or before doing an item operation. The validation should only be done on application startup when it is necessary, if you expect them to be deleted (otherwise it's not needed). These metadata operations will generate extra end-to-end latency, have no SLA, and their own separate [limitations](./troubleshoot-request-rate-too-large.md#rate-limiting-on-metadata-requests) that do not scale like data operations.
## <a id="logging-and-tracing"></a> Logging and tracing
cosmos-db Troubleshoot Changefeed Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-changefeed-functions.md
description: Common issues, workarounds, and diagnostic steps, when using the Az
Previously updated : 03/28/2022 Last updated : 04/14/2022
The previous versions of the Azure Cosmos DB Extension did not support using a l
This error means that you are currently using a partitioned lease collection with an old [extension dependency](#dependencies). Upgrade to the latest available version. If you are currently running on Azure Functions V1, you will need to upgrade to Azure Functions V2.
+### Azure Function fails to start with "Forbidden (403); Substatus: 5300... The given request [POST ...] cannot be authorized by AAD token in data plane"
+
+This error means your Function is attempting to [perform a non-data operation using Azure AD identities](troubleshoot-forbidden.md#non-data-operations-are-not-allowed). You cannot use `CreateLeaseContainerIfNotExists = true` when using Azure AD identities.
+
### Azure Function fails to start with "The lease collection, if partitioned, must have partition key equal to id."

This error means that your current leases container is partitioned, but the partition key path is not `/id`. To resolve this issue, you need to recreate the leases container with `/id` as the partition key.
cosmos-db Troubleshoot Dot Net Sdk Slow Request https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-dot-net-sdk-slow-request.md
Consider the following when developing your application:
## Metadata operations
-If you need to verify that a database or container exists, don't do so by calling `Create...IfNotExistsAsync` or `Read...Async` before doing an item operation. The validation should only be done on application startup when it's necessary, if you expect them to be deleted. These metadata operations generate extra latency, have no service-level agreement (SLA), and have their own separate [limitations](/azure/cosmos-db/sql/troubleshoot-request-rate-too-large#rate-limiting-on-metadata-requests). They don't scale like data operations.
+If you need to verify that a database or container exists, don't do so by calling `Create...IfNotExistsAsync` or `Read...Async` before doing an item operation. The validation should only be done on application startup when it's necessary, if you expect them to be deleted. These metadata operations generate extra latency, have no service-level agreement (SLA), and have their own separate [limitations](./troubleshoot-request-rate-too-large.md#rate-limiting-on-metadata-requests). They don't scale like data operations.
## Slow requests on bulk mode
cosmos-db Troubleshoot Forbidden https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-forbidden.md
description: Learn how to diagnose and fix forbidden exceptions.
Previously updated : 10/06/2021 Last updated : 04/14/2022
The HTTP status code 403 represents the request is forbidden to complete.
## Firewall blocking requests
-Data plane requests can come to Cosmos DB via the following 3 paths.
+Data plane requests can come to Cosmos DB via the following three paths.
- Public internet (IPv4)
- Service endpoint
- Private endpoint
-When a data plane request is blocked with 403 Forbidden, the error message will specify via which of the above 3 paths the request came to Cosmos DB.
+When a data plane request is blocked with 403 Forbidden, the error message will specify via which of the above three paths the request came to Cosmos DB.
- `Request originated from client IP {...} through public internet.`
- `Request originated from client VNET through service endpoint.`
Partition key reached maximum size of {...} GB
This error means that your current [partitioning design](../partitioning-overview.md#logical-partitions) and workload is trying to store more than the allowed amount of data for a given partition key value. There is no limit to the number of logical partitions in your container, but the size of data each logical partition can store is limited. You can reach out to support for clarification.

## Non-data operations are not allowed
-This scenario happens when non-data [operations are disallowed in the account](../how-to-setup-rbac.md#permission-model). On this scenario, it's common to see errors like the ones below:
+This scenario happens when [attempting to perform non-data operations](../how-to-setup-rbac.md#permission-model) using Azure Active Directory (Azure AD) identities. In this scenario, it's common to see errors like the ones below:
``` Operation 'POST' on resource 'calls' is not allowed through Azure Cosmos DB endpoint
Forbidden (403); Substatus: 5300; The given request [PUT ...] cannot be authoriz
```

### Solution
-Perform the operation through Azure Resource Manager, Azure portal, Azure CLI, or Azure PowerShell. Or reallow execution of non-data operations.
+Perform the operation through Azure Resource Manager, Azure portal, Azure CLI, or Azure PowerShell.
+If you are using the [Azure Functions Cosmos DB Trigger](../../azure-functions/functions-bindings-cosmosdb-v2-trigger.md), make sure the `CreateLeaseContainerIfNotExists` property of the trigger isn't set to `true`. Using Azure AD identities blocks any non-data operation, such as creating the lease container.
## Next steps

* Configure [IP Firewall](../how-to-configure-firewall.md).
cosmos-db Create Table Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/table/create-table-dotnet.md
public class UpdateWeatherObject
} ```
-In the sample app, this object is passed to the `UpdateEntity` method in the `TableService` class. This method first loads the existing entity from the Table API using the [GetEntity](/dotnet/api/azure.data.tables.tableclient.getentity) method on the [TableClient](/dotnet/api/azure.data.tables.tableclient). It then updates that entity object and uses the `UpdateEntity` method save the updates to the database. Note how the [UpdateEntity](/dotnet/api/azure.data.tables.tableclient.updateentity) method takes the current Etag of the object to insure the object has not changed since it was initially loaded. If you want to update the entity regardless, you may pass a value of `Etag.Any` to the `UpdateEntity` method.
+In the sample app, this object is passed to the `UpdateEntity` method in the `TableService` class. This method first loads the existing entity from the Table API using the [GetEntity](/dotnet/api/azure.data.tables.tableclient.getentity) method on the [TableClient](/dotnet/api/azure.data.tables.tableclient). It then updates that entity object and uses the `UpdateEntity` method to save the updates to the database. Note how the [UpdateEntity](/dotnet/api/azure.data.tables.tableclient.updateentity) method takes the current ETag of the object to ensure the object has not changed since it was initially loaded. If you want to update the entity regardless, you may pass a value of `ETag.All` to the `UpdateEntity` method.
```csharp public void UpdateEntity(UpdateWeatherObject weatherObject)
cosmos-db How To Use Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/table/how-to-use-python.md
Title: Use the Azure Tables client library for Python
-description: Store structured data in the cloud using the Azure Tables client library for Python.
+ Title: 'Quickstart: Table API with Python - Azure Cosmos DB'
+description: This quickstart shows how to access the Azure Cosmos DB Table API from a Python application using the Azure Data Tables SDK
+ ms.devlang: python
+ Last updated: 03/23/2021
-# Get started with Azure Tables client library using Python
+
+# Quickstart: Build a Table API app with Python SDK and Azure Cosmos DB
+ [!INCLUDE[appliesto-table-api](../includes/appliesto-table-api.md)]
+This quickstart shows how to access the Azure Cosmos DB [Table API](https://docs.microsoft.com/azure/cosmos-db/table/introduction) from a Python application. The Cosmos DB Table API is a schemaless data store allowing applications to store structured NoSQL data in the cloud. Because data is stored in a schemaless design, new properties (columns) are automatically added to the table when an object with a new attribute is added to the table. Python applications can access the Cosmos DB Table API using the [Azure Data Tables SDK for Python](https://pypi.org/project/azure-data-tables/) package.
+
+## Prerequisites
-The Azure Table storage and the Azure Cosmos DB are services that store structured NoSQL data in the cloud, providing a key/attribute store with a schemaless design. Because Table storage and Azure Cosmos DB are schemaless, it's easy to adapt your data as the needs of your application evolve. Access to the table storage and table API data is fast and cost-effective for many types of applications, and is typically lower in cost than traditional SQL for similar volumes of data.
+The sample application is written in [Python 3.6](https://www.python.org/downloads/), though the principles apply to all Python 3.6+ applications. You can use [Visual Studio Code](https://code.visualstudio.com/) as an IDE.
-You can use the Table storage or the Azure Cosmos DB to store flexible datasets like user data for web applications, address books, device information, or other types of metadata your service requires. You can store any number of entities in a table, and a storage account may contain any number of tables, up to the capacity limit of the storage account.
+If you don't have an [Azure subscription](https://docs.microsoft.com/azure/guides/developer/azure-developer-guide#understanding-accounts-subscriptions-and-billing), create a [free account](https://azure.microsoft.com/free/dotnet) before you begin.
-### About this sample
+## Sample application
-This sample shows you how to use the [Azure Data Tables SDK for Python](https://pypi.org/project/azure-data-tables/) in common Azure Table storage scenarios. The name of the SDK indicates it is for use with Azure Tables storage, but it works with both Azure Cosmos DB and Azure Tables storage, each service just has a unique endpoint. These scenarios are explored using Python examples that illustrate how to:
+The sample application for this tutorial may be cloned or downloaded from the repository https://github.com/Azure-Samples/msdocs-azure-tables-sdk-python-flask. Both a starter and completed app are included in the sample repository.
-* Create and delete tables
-* Insert and query entities
-* Modify entities
+```bash
+git clone https://github.com/Azure-Samples/msdocs-azure-tables-sdk-python-flask.git
+```
-While working through the scenarios in this sample, you may want to refer to the [Azure Data Tables SDK for Python API reference](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/tables/azure-data-tables).
+The sample application uses weather data as an example to demonstrate the capabilities of the Table API. Objects representing weather observations are stored and retrieved using the Table API, including storing objects with additional properties to demonstrate the schemaless capabilities of the Table API.
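
The schemaless behavior described above can be sketched with plain dictionaries, before any service call is involved (the property names here are illustrative, not taken from the sample repository):

```python
# Two observations for the same table: the second carries extra properties.
# With the Table API, storing the second would simply add the new columns.
base = {"PartitionKey": "Chicago", "RowKey": "2021-07-01 12:00", "Temperature": 21.5}
extended = {**base, "RowKey": "2021-07-01 13:00", "WindSpeed": 12.3, "Precipitation": 0.05}

# Properties present on the new object but not on the old one.
new_properties = sorted(extended.keys() - base.keys())
```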
-## Prerequisites
-You need the following to complete this sample successfully:
+## 1 - Create an Azure Cosmos DB account
-* [Python](https://www.python.org/downloads/) 2.7 or 3.6+.
-* [Azure Data Tables SDK for Python](https://pypi.python.org/pypi/azure-data-tables/). This SDK connects with both Azure Table storage and the Azure Cosmos DB Table API.
-* [Azure Storage account](../../storage/common/storage-account-create.md) or [Azure Cosmos DB account](https://azure.microsoft.com/try/cosmosdb/).
+You first need to create a Cosmos DB Table API account that will contain the table(s) used in your application. This can be done using the Azure portal, Azure CLI, or Azure PowerShell.
-## Create an Azure service account
+### [Azure portal](#tab/azure-portal)
+Log in to the [Azure portal](https://portal.azure.com/) and follow these steps to create a Cosmos DB account.
-**Create an Azure storage account**
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create cosmos db account step 1](./includes/create-table-python/create-cosmos-db-acct-1.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find Cosmos DB accounts in Azure." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-1.png"::: |
+| [!INCLUDE [Create cosmos db account step 2](./includes/create-table-python/create-cosmos-db-acct-2.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-2-240px.png" alt-text="A screenshot showing the Create button location on the Cosmos DB accounts page in Azure." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-2.png"::: |
+| [!INCLUDE [Create cosmos db account step 3](./includes/create-table-python/create-cosmos-db-acct-3.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-3-240px.png" alt-text="A screenshot showing the Azure Table option as the correct option to select." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-3.png"::: |
+| [!INCLUDE [Create cosmos db account step 4](./includes/create-table-python/create-cosmos-db-acct-4.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-4-240px.png" alt-text="A screenshot showing how to fill out the fields on the Cosmos DB Account creation page." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-account-table-api-4.png"::: |
+### [Azure CLI](#tab/azure-cli)
-**Create an Azure Cosmos DB Table API account**
+Cosmos DB accounts are created using the [az cosmosdb create](https://docs.microsoft.com/cli/azure/cosmosdb#az-cosmosdb-create) command. You must include the `--capabilities EnableTable` option to enable table storage within your Cosmos DB account. As all Azure resources must be contained in a resource group, the following code snippet also creates a resource group for the Cosmos DB account.
+Cosmos DB account names must be between 3 and 44 characters in length and may contain only lowercase letters, numbers, and the hyphen (-) character. Cosmos DB account names must also be unique across Azure.
-## Install the Azure Data Tables SDK for Python
+Azure CLI commands can be run in the [Azure Cloud Shell](https://shell.azure.com/) or on a workstation with the [Azure CLI installed](https://docs.microsoft.com/cli/azure/install-azure-cli).
-After you've created a Storage account, your next step is to install the [Microsoft Azure Data Tables SDK for Python](https://pypi.python.org/pypi/azure-data-tables/). For details on installing the SDK, refer to the [README.md](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/tables/azure-data-tables/README.md) file in the Data Tables SDK for Python repository on GitHub.
+It typically takes several minutes for the Cosmos DB account creation process to complete.
-## Import the TableServiceClient and TableEntity classes
+```azurecli
+LOCATION='eastus'
+RESOURCE_GROUP_NAME='rg-msdocs-tables-sdk-demo'
+COSMOS_ACCOUNT_NAME='cosmos-msdocs-tables-sdk-demo-123' # change 123 to a unique set of characters for a unique name
+COSMOS_TABLE_NAME='WeatherData'
-To work with entities in the Azure Data Tables service in Python, you use the `TableServiceClient` and `TableEntity` classes. Add this code near the top your Python file to import both:
+az group create \
+ --location $LOCATION \
+ --name $RESOURCE_GROUP_NAME
-```python
-from azure.data.tables import TableServiceClient
-from azure.data.tables import TableEntity
+az cosmosdb create \
+ --name $COSMOS_ACCOUNT_NAME \
+ --resource-group $RESOURCE_GROUP_NAME \
+ --capabilities EnableTable
```
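
The account name rules above (3 to 44 characters; lowercase letters, digits, and hyphens only) can be sanity-checked client-side before calling the CLI. A minimal sketch; global uniqueness can only be verified by the service:

```python
import re

def looks_like_valid_cosmos_account_name(name: str) -> bool:
    # 3-44 characters; lowercase letters, digits, and hyphens only.
    # Uniqueness across Azure still requires a server-side check.
    return re.fullmatch(r"[a-z0-9-]{3,44}", name) is not None
```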
-## Connect to Azure Table service
-You can either connect to the Azure Storage account or the Azure Cosmos DB Table API account. Get the shared key or connection string based on the type of account you are using.
+### [Azure PowerShell](#tab/azure-powershell)
-### Creating the Table service client from a shared key
+Azure Cosmos DB accounts are created using the [New-AzCosmosDBAccount](https://docs.microsoft.com/powershell/module/az.cosmosdb/new-azcosmosdbaccount) cmdlet. You must include the `-ApiKind "Table"` option to enable table storage within your Cosmos DB account. As all Azure resources must be contained in a resource group, the following code snippet also creates a resource group for the Azure Cosmos DB account.
-Create a `TableServiceClient` object, and pass in your Cosmos DB or Storage account name, account key and table endpoint. Replace `myaccount`, `mykey` and `mytableendpoint` with your Cosmos DB or Storage account name, key and table endpoint.
+Azure Cosmos DB account names must be between 3 and 44 characters in length and may contain only lowercase letters, numbers, and the hyphen (-) character. Azure Cosmos DB account names must also be unique across Azure.
-```python
-from azure.core.credentials import AzureNamedKeyCredential
+Azure PowerShell commands can be run in the [Azure Cloud Shell](https://shell.azure.com) or on a workstation with [Azure PowerShell installed](https://docs.microsoft.com/powershell/azure/install-az-ps).
+
+It typically takes several minutes for the Cosmos DB account creation process to complete.
+
+```azurepowershell
+$location = 'eastus'
+$resourceGroupName = 'rg-msdocs-tables-sdk-demo'
+$cosmosAccountName = 'cosmos-msdocs-tables-sdk-demo-123' # change 123 to a unique set of characters for a unique name
-credential = AzureNamedKeyCredential("myaccount", "mykey")
-table_service = TableServiceClient(endpoint="mytableendpoint", credential=credential)
+# Create a resource group
+New-AzResourceGroup `
+ -Location $location `
+ -Name $resourceGroupName
+
+# Create an Azure Cosmos DB
+New-AzCosmosDBAccount `
+ -Name $cosmosAccountName `
+ -ResourceGroupName $resourceGroupName `
+ -Location $location `
+ -ApiKind "Table"
```
-### Creating the Table service client from a connection string
++
+## 2 - Create a table
-Copy your Cosmos DB or Storage account connection string from the Azure portal, and create a `TableServiceClient` object using your copied connection string:
+Next, you need to create a table within your Cosmos DB account for your application to use. Unlike a traditional database, you only need to specify the name of the table, not the properties (columns) in the table. As data is loaded into your table, the properties (columns) will be automatically created as needed.
-```python
-table_service = TableServiceClient.from_connection_string(conn_str='DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey;TableEndpoint=mytableendpoint;')
+### [Azure portal](#tab/azure-portal)
+
+In the [Azure portal](https://portal.azure.com/), complete the following steps to create a table inside your Cosmos DB account.
+
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create cosmos db table step 1](./includes/create-table-python/create-cosmos-table-1.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-table-api-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find your Cosmos DB account." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-table-api-1.png"::: |
+| [!INCLUDE [Create cosmos db table step 2](./includes/create-table-python/create-cosmos-table-2.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-table-api-2-240px.png" alt-text="A screenshot showing the location of the Add Table button." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-table-api-2.png"::: |
+| [!INCLUDE [Create cosmos db table step 3](./includes/create-table-python/create-cosmos-table-3.md)] | :::image type="content" source="./media/create-table-python/azure-portal-create-cosmos-db-table-api-3-240px.png" alt-text="A screenshot showing the New Table dialog box for a Cosmos DB table." lightbox="./media/create-table-python/azure-portal-create-cosmos-db-table-api-3.png"::: |
+
+### [Azure CLI](#tab/azure-cli)
+
+Tables in Cosmos DB are created using the [az cosmosdb table create](https://docs.microsoft.com/cli/azure/cosmosdb/table#az-cosmosdb-table-create) command.
+
+```azurecli
+COSMOS_TABLE_NAME='WeatherData'
+
+az cosmosdb table create \
+ --account-name $COSMOS_ACCOUNT_NAME \
+ --resource-group $RESOURCE_GROUP_NAME \
+ --name $COSMOS_TABLE_NAME \
+ --throughput 400
```
-## Create a table
+### [Azure PowerShell](#tab/azure-powershell)
-Call `create_table` to create the table.
+Tables in Cosmos DB are created using the [New-AzCosmosDBTable](https://docs.microsoft.com/powershell/module/az.cosmosdb/new-azcosmosdbtable) cmdlet.
-```python
-table_service.create_table('tasktable')
+```azurepowershell
+$cosmosTableName = 'WeatherData'
+
+# Create the table for the application to use
+New-AzCosmosDBTable `
+ -Name $cosmosTableName `
+ -AccountName $cosmosAccountName `
+ -ResourceGroupName $resourceGroupName
```
-## Add an entity to a table
+
-Create a table in your account and get a `TableClient` to perform operations on the newly created table. To add an entity, you first create an object that represents your entity, then pass the object to the `TableClient.create_entity` method. The entity object can be a dictionary or an object of type `TableEntity`, and defines your entity's property names and values. Every entity must include the required [PartitionKey and RowKey](#partitionkey-and-rowkey) properties, in addition to any other properties you define for the entity.
+## 3 - Get Cosmos DB connection string
-This example creates a dictionary object representing an entity, then passes it to the `create_entity` method to add it to the table:
+To access your tables in Cosmos DB, your app needs the table connection string for the Cosmos DB account. The connection string can be retrieved using the Azure portal, the Azure CLI, or Azure PowerShell.
-```python
-table_client = table_service.get_table_client(table_name="tasktable")
-task = {u'PartitionKey': u'tasksSeattle', u'RowKey': u'001',
- u'description': u'Take out the trash', u'priority': 200}
-table_client.create_entity(entity=task)
+### [Azure portal](#tab/azure-portal)
+
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Get cosmos db table connection string step 1](./includes/create-table-python/get-cosmos-connection-string-1.md)] | :::image type="content" source="./media/create-table-python/azure-portal-cosmos-db-table-connection-string-1-240px.png" alt-text="A screenshot showing the location of the connection strings link on the Cosmos DB page." lightbox="./media/create-table-python/azure-portal-cosmos-db-table-connection-string-1.png"::: |
+| [!INCLUDE [Get cosmos db table connection string step 2](./includes/create-table-python/get-cosmos-connection-string-2.md)] | :::image type="content" source="./media/create-table-python/azure-portal-cosmos-db-table-connection-string-2-240px.png" alt-text="A screenshot showing which connection string to select and use in your application." lightbox="./media/create-table-python/azure-portal-cosmos-db-table-connection-string-2.png"::: |
+
+### [Azure CLI](#tab/azure-cli)
+
+To get the primary connection string using Azure CLI, use the [az cosmosdb keys list](https://docs.microsoft.com/cli/azure/cosmosdb/keys#az-cosmosdb-keys-list) command with the option `--type connection-strings`. This command uses a [JMESPath query](https://jmespath.org/) to display only the primary table connection string.
+
+```azurecli
+# This gets the primary connection string
+az cosmosdb keys list \
+ --type connection-strings \
+ --resource-group $RESOURCE_GROUP_NAME \
+ --name $COSMOS_ACCOUNT_NAME \
+ --query "connectionStrings[?description=='Primary Table Connection String'].connectionString" \
+ --output tsv
```
-This example creates an `TableEntity` object, then passes it to the `create_entity` method to add it to the table:
+### [Azure PowerShell](#tab/azure-powershell)
-```python
-task = TableEntity()
-task[u'PartitionKey'] = u'tasksSeattle'
-task[u'RowKey'] = u'002'
-task[u'description'] = u'Wash the car'
-task[u'priority'] = 100
-table_client.create_entity(task)
+To get the primary connection string using Azure PowerShell, use the [Get-AzCosmosDBAccountKey](https://docs.microsoft.com/powershell/module/az.cosmosdb/get-azcosmosdbaccountkey) cmdlet.
+
+```azurepowershell
+# This gets the primary connection string
+$(Get-AzCosmosDBAccountKey `
+ -ResourceGroupName $resourceGroupName `
+ -Name $cosmosAccountName `
+ -Type "ConnectionStrings")."Primary Table Connection String"
+```
+
+The connection string for your Cosmos DB account is considered an app secret and must be protected like any other app secret or password.
+++
+## 4 - Install the Azure Data Tables SDK for Python
+
+After you've created a Cosmos DB account, your next step is to install the Microsoft [Azure Data Tables SDK for Python](https://pypi.python.org/pypi/azure-data-tables/). For details on installing the SDK, refer to the [README.md](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/tables/azure-data-tables/README.md) file in the Data Tables SDK for Python repository on GitHub.
+
+Install the Azure Tables client library for Python with pip:
+
+```bash
+pip install azure-data-tables
```
-### PartitionKey and RowKey
+
-You must specify both a **PartitionKey** and a **RowKey** property for every entity. These are the unique identifiers of your entities, as together they form the primary key of an entity. You can query using these values much faster than you can query any other entity properties because only these properties are indexed.
+## 5 - Configure the Table client in .env file
-The Table service uses **PartitionKey** to intelligently distribute table entities across storage nodes. Entities that have the same **PartitionKey** are stored on the same node. **RowKey** is the unique ID of the entity within the partition it belongs to.
+Copy your Azure Cosmos DB account connection string from the Azure portal, and create a `TableServiceClient` object using your copied connection string. Switch to the `1-starter-app` or `2-completed-app` folder. Then add the values of the corresponding environment variables to the `.env` file.
-## Update an entity
+```python
+# Configuration Parameters
+conn_str = "A connection string to an Azure Cosmos account."
+table_name = "WeatherData"
+project_root_path = "Project abs path"
+```
-To update all of an entity's property values, call the `update_entity` method. This example shows how to replace an existing entity with an updated version:
+The Azure SDK communicates with Azure using client objects to execute different operations against Azure. The `TableServiceClient` object is the object used to communicate with the Cosmos DB Table API. An application typically has a single `TableServiceClient` overall, and one `TableClient` per table.
```python
-task = {u'PartitionKey': u'tasksSeattle', u'RowKey': u'001',
- u'description': u'Take out the garbage', u'priority': 250}
-table_client.update_entity(task)
+self.conn_str = os.getenv("AZURE_CONNECTION_STRING")
+self.table_service = TableServiceClient.from_connection_string(self.conn_str)
```
-If the entity that is being updated doesn't already exist, then the update operation will fail. If you want to store an entity whether it exists or not, use `upsert_entity`. In the following example, the first call will replace the existing entity. The second call will insert a new entity, since no entity with the specified PartitionKey and RowKey exists in the table.
++
+## 6 - Implement Cosmos DB table operations
+
+All Cosmos DB table operations for the sample app are implemented in the `TableServiceHelper` class located in the *helper* file under the *webapp* directory. You need to import the `TableServiceClient` class at the top of this file to work with objects in the `azure.data.tables` SDK package.
```python
-# Replace the entity created earlier
-task = {u'PartitionKey': u'tasksSeattle', u'RowKey': u'001',
- u'description': u'Take out the garbage again', u'priority': 250}
-table_client.upsert_entity(task)
+from azure.data.tables import TableServiceClient
+```
+
+At the start of the `TableServiceHelper` class, create a constructor and add a member variable for the `TableClient` object to allow the `TableClient` object to be injected into the class.
-# Insert a new entity
-task = {u'PartitionKey': u'tasksSeattle', u'RowKey': u'003',
- u'description': u'Buy detergent', u'priority': 300}
-table_client.upsert_entity(task)
+```python
+def __init__(self, table_name=None, conn_str=None):
+ self.table_name = table_name if table_name else os.getenv("table_name")
+ self.conn_str = conn_str if conn_str else os.getenv("conn_str")
+ self.table_service = TableServiceClient.from_connection_string(self.conn_str)
+ self.table_client = self.table_service.get_table_client(self.table_name)
```
-> [!TIP]
-> The **mode=UpdateMode.REPLACE** parameter in `update_entity` method replaces all properties and values of an existing entity, which you can also use to remove properties from an existing entity. The **mode=UpdateMode.MERGE** parameter is used by default to update an existing entity with new or modified property values without completely replacing the entity.
+### Filter rows returned from a table
+
+To filter the rows returned from a table, you can pass an OData-style filter string to the `query_entities` method. For example, if you wanted to get all of the weather readings for Chicago between midnight July 1, 2021 and midnight July 2, 2021 (inclusive), you would pass in the following filter string.
-## Modify multiple entities
+```odata
+PartitionKey eq 'Chicago' and RowKey ge '2021-07-01 12:00 AM' and RowKey le '2021-07-02 12:00 AM'
+```
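As a minimal sketch (pure string formatting, no Azure SDK calls), the same filter string can be assembled from its parts before being passed to `query_entities`:

```python
# Assemble the example filter string from its parts. The station name
# and date range below match the Chicago example above.
partition_key = "Chicago"
row_key_start = "2021-07-01 12:00 AM"
row_key_end = "2021-07-02 12:00 AM"

filter_str = (
    f"PartitionKey eq '{partition_key}' "
    f"and RowKey ge '{row_key_start}' "
    f"and RowKey le '{row_key_end}'"
)
print(filter_str)
```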
-To ensure the atomic processing of a request by the Table service, you can submit multiple operations together in a batch. First, add multiple operations to a list. Next, call `Table_client.submit_transaction` to submit the operations in an atomic operation. All entities to be modified in batch must be in the same partition.
+You can view related OData filter operators in the azure-data-tables samples on GitHub in the section [Writing Filters](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/tables/azure-data-tables/samples#writing-filters).
-This example adds two entities together in a batch:
+When the `request.args` parameter is passed to the `query_entity` method in the `TableServiceHelper` class, it creates a filter string for each non-null property value. It then creates a combined filter string by joining all of the values together with an "and" clause. This combined filter string is passed to the `query_entities` method on the `TableClient` object, and only rows matching the filter string are returned. You can use a similar method in your code to construct suitable filter strings as required by your application.
```python
-task004 = {u'PartitionKey': u'tasksSeattle', u'RowKey': '004',
- 'description': u'Go grocery shopping', u'priority': 400}
-task005 = {u'PartitionKey': u'tasksSeattle', u'RowKey': '005',
- u'description': u'Clean the bathroom', u'priority': 100}
-operations = [("create", task004), ("create", task005)]
-table_client.submit_transaction(operations)
+def query_entity(self, params):
+ filters = []
+ if params.get("partitionKey"):
+ filters.append("PartitionKey eq '{}'".format(params.get("partitionKey")))
+ if params.get("rowKeyDateStart") and params.get("rowKeyTimeStart"):
+ filters.append("RowKey ge '{} {}'".format(params.get("rowKeyDateStart"), params.get("rowKeyTimeStart")))
+ if params.get("rowKeyDateEnd") and params.get("rowKeyTimeEnd"):
+ filters.append("RowKey le '{} {}'".format(params.get("rowKeyDateEnd"), params.get("rowKeyTimeEnd")))
+ if params.get("minTemperature"):
+ filters.append("Temperature ge {}".format(params.get("minTemperature")))
+ if params.get("maxTemperature"):
+ filters.append("Temperature le {}".format(params.get("maxTemperature")))
+ if params.get("minPrecipitation"):
+ filters.append("Precipitation ge {}".format(params.get("minPrecipitation")))
+ if params.get("maxPrecipitation"):
+ filters.append("Precipitation le {}".format(params.get("maxPrecipitation")))
+ return list(self.table_client.query_entities(" and ".join(filters)))
```
-## Query for an entity
+### Insert data using a TableEntity object
-To query for an entity in a table, pass its PartitionKey and RowKey to the `Table_client.get_entity` method.
+The simplest way to add data to a table is by using a `TableEntity` object. In this example, data is mapped from an input model object to a `TableEntity` object. The properties on the input object representing the weather station name and observation date/time are mapped to the `PartitionKey` and `RowKey` properties respectively, which together form a unique key for the row in the table. The additional properties on the input model object are then mapped to dictionary properties on the `TableEntity` object. Finally, the `create_entity` method on the `TableClient` object is used to insert data into the table.
+
+Modify the `insert_entity` function in the example application to contain the following code.
```python
-task = table_client.get_entity('tasksSeattle', '001')
-print(task['description'])
-print(task['priority'])
+def insert_entity(self):
+ entity = self.deserialize()
+ return self.table_client.create_entity(entity)
+
+@staticmethod
+def deserialize():
+ params = {key: request.form.get(key) for key in request.form.keys()}
+ params["PartitionKey"] = params.pop("StationName")
+ params["RowKey"] = "{} {}".format(params.pop("ObservationDate"), params.pop("ObservationTime"))
+ return params
```
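The `deserialize` helper above relies on Flask's `request` object; stripped of that dependency, the key mapping can be sketched with a plain dictionary (the field names follow the sample app):

```python
def to_entity(form: dict) -> dict:
    """Map submitted form fields to a Table entity dictionary (sketch).

    StationName becomes the PartitionKey; the observation date and time
    are combined into the RowKey. All other fields pass through as-is.
    """
    params = dict(form)
    params["PartitionKey"] = params.pop("StationName")
    params["RowKey"] = "{} {}".format(
        params.pop("ObservationDate"), params.pop("ObservationTime")
    )
    return params

entity = to_entity({
    "StationName": "Chicago",
    "ObservationDate": "2021-07-01",
    "ObservationTime": "12:00 AM",
    "Temperature": "73",
})
```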
-## Query a set of entities
+### Upsert data using a TableEntity object
-You can query for a set of entities by supplying a filter string with the **query_filter** parameter. This example finds all tasks in Seattle by applying a filter on PartitionKey:
+If you try to insert a row into a table with a partition key/row key combination that already exists in that table, you'll receive an error. For this reason, it's often preferable to use the `upsert_entity` method instead of the `create_entity` method when adding rows to a table. If the given partition key/row key combination already exists in the table, the `upsert_entity` method updates the existing row. Otherwise, the row is added to the table.
```python
-tasks = table_client.query_entities(query_filter="PartitionKey eq 'tasksSeattle'")
-for task in tasks:
- print(task['description'])
- print(task['priority'])
+def upsert_entity(self):
+ entity = self.deserialize()
+ return self.table_client.upsert_entity(entity)
+
+@staticmethod
+def deserialize():
+ params = {key: request.form.get(key) for key in request.form.keys()}
+ params["PartitionKey"] = params.pop("StationName")
+ params["RowKey"] = "{} {}".format(params.pop("ObservationDate"), params.pop("ObservationTime"))
+ return params
```
-## Query a subset of entity properties
+### Insert or upsert data with variable properties
+
+One of the advantages of using the Cosmos DB Table API is that if an object being loaded to a table contains any new properties, those properties are automatically added to the table and the values stored in Cosmos DB. There's no need to run DDL statements like ALTER TABLE to add columns, as in a traditional database.
-You can also restrict which properties are returned for each entity in a query. This technique, called *projection*, reduces bandwidth and can improve query performance, especially for large entities or result sets. Use the **select** parameter and pass the names of the properties you want returned to the client.
+This model gives your application flexibility when dealing with data sources that may add or modify what data needs to be captured over time or when different inputs provide different data to your application. In the sample application, we can simulate a weather station that sends not just the base weather data but also some additional values. When an object with these new properties is stored in the table for the first time, the corresponding properties (columns) will be automatically added to the table.
-The query in the following code returns only the descriptions of entities in the table.
+To insert or upsert such an object using the Table API, map the properties of the expandable object into a `TableEntity` object and use the `create_entity` or `upsert_entity` methods on the `TableClient` object as appropriate.
-> [!NOTE]
-> The following snippet works only against the Azure Storage. It is not supported by the Storage Emulator.
+In the sample application, the `upsert_entity` function also handles inserting or upserting data with variable properties.
```python
-tasks = table_client.query_entities(
- query_filter="PartitionKey eq 'tasksSeattle'", select='description')
-for task in tasks:
- print(task['description'])
+def insert_entity(self):
+ entity = self.deserialize()
+ return self.table_client.create_entity(entity)
+
+def upsert_entity(self):
+ entity = self.deserialize()
+ return self.table_client.upsert_entity(entity)
+
+@staticmethod
+def deserialize():
+ params = {key: request.form.get(key) for key in request.form.keys()}
+ params["PartitionKey"] = params.pop("StationName")
+ params["RowKey"] = "{} {}".format(params.pop("ObservationDate"), params.pop("ObservationTime"))
+ return params
```
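As a small illustration of this schemaless behavior (plain dictionaries only; the extra field names are hypothetical), an entity with additional properties is just a larger dictionary, and no schema change is needed before upserting it:

```python
# Base entity with the required PartitionKey and RowKey.
base_entity = {
    "PartitionKey": "Chicago",
    "RowKey": "2021-07-01 12:00 AM",
    "Temperature": 73,
}

# The same entity with extra, previously unseen properties. Passing
# this dictionary to upsert_entity would add the new properties
# (columns) on the fly; no ALTER TABLE equivalent is required.
expanded_entity = {**base_entity, "WindSpeed": 9, "SkyCover": "Partly Cloudy"}
print(sorted(expanded_entity))
```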
-## Query for an entity without partition and row keys
+### Update an entity
-You can also list entities within a table without using the partition and row keys. Use the `table_client.list_entities` method as show in the following example:
+Entities can be updated by calling the `update_entity` method on the `TableClient` object.
+
+In the sample app, this object is passed to the `update_entity` method in the `TableServiceHelper` class, which calls the `update_entity` method on the `TableClient` object to save the updates to the database.
```python
-print("Get the first item from the table")
-tasks = table_client.list_entities()
-lst = list(tasks)
-print(lst[0])
+def update_entity(self):
+ entity = self.update_deserialize()
+ return self.table_client.update_entity(entity)
+
+@staticmethod
+def update_deserialize():
+ params = {key: request.form.get(key) for key in request.form.keys()}
+ params["PartitionKey"] = params.pop("StationName")
+ params["RowKey"] = params.pop("ObservationDate")
+ return params
```
-## Delete an entity
-
-Delete an entity by passing its **PartitionKey** and **RowKey** to the `delete_entity` method.
+### Remove an entity
+To remove an entity from a table, call the `delete_entity` method on the `TableClient` object with the partition key and row key of the object.
+
```python
-table_client.delete_entity('tasksSeattle', '001')
+def delete_entity(self):
+ partition_key = request.form.get("StationName")
+ row_key = request.form.get("ObservationDate")
+ return self.table_client.delete_entity(partition_key, row_key)
```
-## Delete a table
+## 7 - Run the code
-If you no longer need a table or any of the entities within it, call the `delete_table` method to permanently delete the table from Azure Storage.
+Run the sample application to interact with the Cosmos DB Table API. The first time you run the application, there will be no data because the table is empty. Use any of the buttons at the top of the application to add data to the table.
-```python
-table_service.delete_table('tasktable')
+
+Selecting the **Insert using Table Entity** button opens a dialog allowing you to insert or upsert a new row using a `TableEntity` object.
++
+Selecting the **Insert using Expandable Data** button brings up a dialog that enables you to insert an object with custom properties, demonstrating how the Cosmos DB Table API automatically adds properties (columns) to the table when needed. Use the *Add Custom Field* button to add one or more new properties and demonstrate this capability.
++
+Use the **Insert Sample Data** button to load some sample data into your Cosmos DB Table.
++
+Select the **Filter Results** item in the top menu to be taken to the Filter Results page. On this page, fill out the filter criteria to demonstrate how a filter clause can be built and passed to the Cosmos DB Table API.
++
+## Clean up resources
+
+When you are finished with the sample application, you should remove all Azure resources related to this article from your Azure account. You can do this by deleting the resource group.
+
+### [Azure portal](#tab/azure-portal)
+
+A resource group can be deleted using the [Azure portal](https://portal.azure.com/) by following these steps.
+
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Delete resource group step 1](./includes/create-table-python/remove-resource-group-1.md)] | :::image type="content" source="./media/create-table-python/azure-portal-remove-resource-group-1-240px.png" alt-text="A screenshot showing how to search for a resource group." lightbox="./media/create-table-python/azure-portal-remove-resource-group-1.png"::: |
+| [!INCLUDE [Delete resource group step 2](./includes/create-table-python/remove-resource-group-2.md)] | :::image type="content" source="./media/create-table-python/azure-portal-remove-resource-group-2-240px.png" alt-text="A screenshot showing the location of the Delete resource group button." lightbox="./media/create-table-python/azure-portal-remove-resource-group-2.png"::: |
+| [!INCLUDE [Delete resource group step 3](./includes/create-table-python/remove-resource-group-3.md)] | :::image type="content" source="./media/create-table-python/azure-portal-remove-resource-group-3-240px.png" alt-text="A screenshot showing the confirmation dialog for deleting a resource group." lightbox="./media/create-table-python/azure-portal-remove-resource-group-3.png"::: |
+
+### [Azure CLI](#tab/azure-cli)
+
+To delete a resource group using the Azure CLI, use the [az group delete](https://docs.microsoft.com/cli/azure/group#az-group-delete) command with the name of the resource group to be deleted. Deleting a resource group will also remove all Azure resources contained in the resource group.
+
+```azurecli
+az group delete --name $RESOURCE_GROUP_NAME
+```
+
+### [Azure PowerShell](#tab/azure-powershell)
+
+To delete a resource group using Azure PowerShell, use the [Remove-AzResourceGroup](https://docs.microsoft.com/powershell/module/az.resources/remove-azresourcegroup) command with the name of the resource group to be deleted. Deleting a resource group will also remove all Azure resources contained in the resource group.
+
+```azurepowershell
+Remove-AzResourceGroup -Name $resourceGroupName
```
++
## Next steps
-* [FAQ - Develop with the Table API](table-api-faq.yml)
-* [Azure Data Tables SDK for Python API reference](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/tables/azure-data-tables)
-* [Python Developer Center](https://azure.microsoft.com/develop/python/)
-* [Microsoft Azure Storage Explorer](../../vs-azure-tools-storage-manage-with-storage-explorer.md): A free, cross-platform application for working visually with Azure Storage data on Windows, macOS, and Linux.
-* [Working with Python in Visual Studio (Windows)](/visualstudio/python/overview-of-python-tools-for-visual-studio)
+In this quickstart, you've learned how to create an Azure Cosmos DB account, create a table using the Data Explorer, and run an app. Now you can query your data using the Table API.
+> [!div class="nextstepaction"]
+> [Import table data to the Table API](https://docs.microsoft.com/azure/cosmos-db/table/table-import)
cost-management-billing Download Azure Invoice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/understand/download-azure-invoice.md
tags: billing
Previously updated : 02/17/2022 Last updated : 04/15/2022
If you pay for Azure with a credit card and you buy reservation, Azure generates
An invoice is only generated for a subscription that belongs to a billing account for an MOSP. [Check your access to an MOSP account](../manage/view-all-accounts.md#check-the-type-of-your-account).
-You must have an account admin role for a subscription to download its invoice. Users with owner, contributor, or reader roles can download its invoice, if the account admin has given them permission. For more information, see [Allow users to download invoices](../manage/manage-billing-access.md#opt-in).
+You must have an *account admin* role for a subscription to download its invoice. Users with owner, contributor, or reader roles can download its invoice, if the account admin has given them permission. For more information, see [Allow users to download invoices](../manage/manage-billing-access.md#opt-in).
+
+Azure Government customers can't request their invoice by email. They can only download it.
1. Select your subscription from the [Subscriptions page](https://portal.azure.com/#blade/Microsoft_Azure_Billing/SubscriptionsBlade) in the Azure portal. 1. Select **Invoices** from the billing section.
data-catalog Data Catalog Adopting Data Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-adopting-data-catalog.md
Last updated 02/17/2022
# Approach and process for adopting Azure Data Catalog This article helps you get started adopting **Azure Data Catalog** in your organization. To successfully adopt **Azure Data Catalog**, focus on three key items: define your vision, identify key business use cases within your organization, and choose a pilot project.
data-catalog Data Catalog Common Scenarios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-common-scenarios.md
Last updated 02/22/2022
# Azure Data Catalog common scenarios This article presents common scenarios where Azure Data Catalog can help your organization get more value from its existing data sources.
data-catalog Data Catalog Developer Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-developer-concepts.md
Last updated 02/16/2022
# Azure Data Catalog developer concepts Microsoft **Azure Data Catalog** is a fully managed cloud service that provides capabilities for data source discovery and for crowdsourcing data source metadata. Developers can use the service via its REST APIs. Understanding the concepts implemented in the service is important for developers to successfully integrate with **Azure Data Catalog**.
data-catalog Data Catalog Dsr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-dsr.md
Last updated 02/24/2022
# Supported data sources in Azure Data Catalog You can publish metadata by using a public API or a click-once registration tool, or by manually entering information directly to the Azure Data Catalog web portal. The following table summarizes all data sources that are supported by the catalog today, and the publishing capabilities for each. Also listed are the external data tools that each data source can launch from our portal "open-in" experience. The second table contains a more technical specification of each data-source connection property.
data-catalog Data Catalog Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-get-started.md
# Quickstart: Create an Azure Data Catalog via the Azure portal Azure Data Catalog is a fully managed cloud service that serves as a system of registration and system of discovery for enterprise data assets. For a detailed overview, see [What is Azure Data Catalog](overview.md).
data-catalog Data Catalog How To Annotate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-annotate.md
Last updated 02/18/2022
# How to annotate data sources in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Big Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-big-data.md
Last updated 02/14/2022
# How to catalog big data in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Business Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-business-glossary.md
Last updated 02/23/2022
# Set up the business glossary for governed tagging ## Introduction
data-catalog Data Catalog How To Connect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-connect.md
Last updated 02/22/2022
# How to connect to data sources ## Introduction
data-catalog Data Catalog How To Data Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-data-profile.md
Last updated 02/18/2022
# How to data profile data sources in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Discover https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-discover.md
Last updated 02/24/2022
# How to discover data sources in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Documentation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-documentation.md
Last updated 02/17/2022
# How to document data sources in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-manage.md
Last updated 02/15/2022
# Manage data assets in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Register https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-register.md
Last updated 02/25/2022
# Register data sources in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Save Pin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-save-pin.md
Last updated 02/10/2022
# Save searches and pin data assets in Azure Data Catalog ## Introduction
data-catalog Data Catalog How To Secure Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-secure-catalog.md
Last updated 02/14/2022
# How to secure access to data catalog and data assets > [!IMPORTANT] > This feature is available only in the standard edition of Azure Data Catalog.
data-catalog Data Catalog How To View Related Data Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-how-to-view-related-data-assets.md
Last updated 02/11/2022
# How to view related data assets in Azure Data Catalog Azure Data Catalog allows you to view data assets that are related to a selected data asset, and see the relationships between them.
data-catalog Data Catalog Keyboard Shortcuts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-keyboard-shortcuts.md
Last updated 02/11/2022
# Keyboard shortcuts for Azure Data Catalog ## Keyboard shortcuts for the Data Catalog data source registration tool
data-catalog Data Catalog Migration To Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-migration-to-azure-purview.md
Title: Migrate from Azure Data Catalog to Azure Purview
-description: Steps to migrate from Azure Data Catalog to Microsoft's unified data governance service--Azure Purview.
+ Title: Migrate from Azure Data Catalog to Microsoft Purview
+description: Steps to migrate from Azure Data Catalog to Microsoft's unified data governance service--Microsoft Purview.
Last updated 01/24/2022
-#Customer intent: As an Azure Data Catalog user, I want to know why and how to migrate to Azure Purview so that I can use the best tools to manage my data.
+#Customer intent: As an Azure Data Catalog user, I want to know why and how to migrate to Microsoft Purview so that I can use the best tools to manage my data.
-# Migrate from Azure Data Catalog to Azure Purview
+# Migrate from Azure Data Catalog to Microsoft Purview
-Microsoft launched a unified data governance service to help manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. Azure Purview creates a map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Azure Purview enables data curators to manage and secure their data estate and empowers data consumers to find valuable, trustworthy data.
+Microsoft launched a unified data governance service to help manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. Microsoft Purview creates a map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Microsoft Purview enables data curators to manage and secure their data estate and empowers data consumers to find valuable, trustworthy data.
-The document shows you how to do the migration from Azure Data Catalog to Azure Purview.
+The document shows you how to do the migration from Azure Data Catalog to Microsoft Purview.
## Recommended approach
-To migrate from Azure Data Catalog to Azure Purview, we recommend the following approach:
+To migrate from Azure Data Catalog to Microsoft Purview, we recommend the following approach:
:heavy_check_mark: Step 1: [Assess readiness](#assess-readiness) :heavy_check_mark: Step 2: [Prepare to migrate](#prepare-to-migrate)
-:heavy_check_mark: Step 3: [Migrate to Azure Purview](#migrate-to-azure-purview)
+:heavy_check_mark: Step 3: [Migrate to Microsoft Purview](#migrate-to-microsoft-purview)
-:heavy_check_mark: Step 4: [Cutover from Azure Data Catalog to Azure Purview](#cutover-from-azure-data-catalog-to-azure-purview)
+:heavy_check_mark: Step 4: [Cutover from Azure Data Catalog to Microsoft Purview](#cutover-from-azure-data-catalog-to-microsoft-purview)
> [!NOTE]
-> Azure Data Catalog and Azure Purview are different services, so there is no in-place upgrade experience. Intentional migration effort required.
+> Azure Data Catalog and Microsoft Purview are different services, so there is no in-place upgrade experience. An intentional migration effort is required.
## Assess readiness
-Look at [Azure Purview](https://azure.microsoft.com/services/purview/) and understand key differences of Azure Data Catalog and Azure Purview.
+Look at [Microsoft Purview](https://azure.microsoft.com/services/purview/) and understand the key differences between Azure Data Catalog and Microsoft Purview.
-||Azure Data Catalog |Azure Purview |
+||Azure Data Catalog |Microsoft Purview |
|||| |**Pricing** |[User based model](https://azure.microsoft.com/pricing/details/data-catalog/) |[Pay-As-You-Go model](https://azure.microsoft.com/pricing/details/azure-purview/) | |**Platform** |[Data catalog](overview.md) |[Unified governance platform for data discoverability, classification, lineage, and governance.](../purview/purview-connector-overview.md) | |**Extensibility** |N/A |[Extensible on Apache Atlas](../purview/tutorial-purview-tools.md)| |**SDK/PowerShell support** |N/A |[Supports REST APIs](/rest/api/purview/) |
-Compare [Azure Data Catalog supported sources](data-catalog-dsr.md) and [Azure Purview supported sources](../purview/purview-connector-overview.md), to confirm you can support your data landscape.
+Compare [Azure Data Catalog supported sources](data-catalog-dsr.md) and [Microsoft Purview supported sources](../purview/purview-connector-overview.md), to confirm you can support your data landscape.
## Prepare to migrate 1. Identify data sources that you'll migrate.
- Take this opportunity to identify logical and business connections between your data sources and assets. Azure Purview will allow you to create a map of your data landscape that reflects how your data is used and discovered in your organization.
-1. Review [Azure Purview best practices for deployment and architecture](../purview/deployment-best-practices.md) to develop a deployment strategy for Azure Purview.
+ Take this opportunity to identify logical and business connections between your data sources and assets. Microsoft Purview will allow you to create a map of your data landscape that reflects how your data is used and discovered in your organization.
+1. Review [Microsoft Purview best practices for deployment and architecture](../purview/deployment-best-practices.md) to develop a deployment strategy for Microsoft Purview.
1. Determine the impact that a migration will have on your business. For example: how will Azure Data catalog be used until the transition is complete? 1. Create a migration plan.
-## Migrate to Azure Purview
+## Migrate to Microsoft Purview
-Manually migrate your data from Azure Data Catalog to Azure Purview.
+Manually migrate your data from Azure Data Catalog to Microsoft Purview.
-[Create an Azure Purview account](../purview/create-catalog-portal.md), [create collections](../purview/create-catalog-portal.md) in your data map, set up [permissions for your users](../purview/catalog-permissions.md), and onboard your data sources.
+[Create a Microsoft Purview account](../purview/create-catalog-portal.md), [create collections](../purview/create-catalog-portal.md) in your data map, set up [permissions for your users](../purview/catalog-permissions.md), and onboard your data sources.
-We suggest you review the Azure Purview best practices documentation before deploying your Azure Purview account, so you can deploy the best environment for your data landscape.
+We suggest you review the Microsoft Purview best practices documentation before deploying your Microsoft Purview account, so you can deploy the best environment for your data landscape.
Here's a selection of articles that may help you get started:-- [Azure Purview security best practices](../purview/concept-best-practices-security.md)
+- [Microsoft Purview security best practices](../purview/concept-best-practices-security.md)
- [Accounts architecture best practices](../purview/concept-best-practices-accounts.md) - [Collections architectures best practices](../purview/concept-best-practices-collections.md) - [Create a collection](../purview/quickstart-create-collection.md)-- [Import Azure sources to Azure Purview at scale](../purview/tutorial-data-sources-readiness.md)
+- [Import Azure sources to Microsoft Purview at scale](../purview/tutorial-data-sources-readiness.md)
- [Tutorial: Onboard an on-premises SQL Server instance](../purview/tutorial-register-scan-on-premises-sql-server.md)
-## Cutover from Azure Data Catalog to Azure Purview
+## Cutover from Azure Data Catalog to Microsoft Purview
-After the business has begun to use Azure Purview, cutover from Azure Data Catalog by deleting the Azure Data Catalog.
+After the business has begun to use Microsoft Purview, cut over from Azure Data Catalog by deleting your Azure Data Catalog.
## Next steps-- Learn how [Azure Purview's data insights](../purview/concept-insights.md) can provide you up-to-date information on your data landscape.-- Learn how [Azure Purview integrations with Azure security products](../purview/how-to-integrate-with-azure-security-products.md) to bring even more security to your data landscape.-- Discover how [sensitivity labels in Azure Purview](../purview/create-sensitivity-label.md) help detect and protect your sensitive information.
+- Learn how [Microsoft Purview's data insights](../purview/concept-insights.md) can provide you with up-to-date information on your data landscape.
+- Learn how [Microsoft Purview integrates with Azure security products](../purview/how-to-integrate-with-azure-security-products.md) to bring even more security to your data landscape.
+- Discover how [sensitivity labels in Microsoft Purview](../purview/create-sensitivity-label.md) help detect and protect your sensitive information.
data-catalog Data Catalog Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-samples.md
Last updated 02/16/2022
# Azure Data Catalog developer samples Get started developing Azure Data Catalog apps using the Data Catalog REST API. The Data Catalog REST API is a REST-based API that provides programmatic access to Data Catalog resources to register, annotate, and search data assets programmatically.
data-catalog Data Catalog Terminology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/data-catalog-terminology.md
Last updated 02/15/2022
# Azure Data Catalog terminology This article provides an introduction to concepts and terms used in Azure Data Catalog documentation.
data-catalog Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/overview.md
Last updated 02/24/2022
# What is Azure Data Catalog? Azure Data Catalog is a fully managed cloud service that lets users discover the data sources they need and understand the data sources they find. At the same time, Data Catalog helps organizations get more value from their existing investments.
data-catalog Register Data Assets Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/register-data-assets-tutorial.md
Last updated 02/24/2022
# Tutorial: Register data assets in Azure Data Catalog In this tutorial, you use the registration tool to register data assets from the database sample with the catalog. Registration is the process of extracting key structural metadata such as names, types, and locations from the data source and the assets it contains, and copying that metadata to the catalog. The data source and data assets remain where they are, but the metadata is used by the catalog to make them more easily discoverable and understandable.
data-catalog Troubleshoot Policy Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-catalog/troubleshoot-policy-configuration.md
Last updated 02/10/2022
# Troubleshooting Azure Data Catalog This article describes common troubleshooting concerns for Azure Data Catalog resources.
data-factory Author Global Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/author-global-parameters.md
There are two ways to integrate global parameters in your continuous integration
* Include global parameters in the ARM template * Deploy global parameters via a PowerShell script
-For general use cases, it is recommended to include global parameters in the ARM template. This integrates natively with the solution outlined in [the CI/CD doc](continuous-integration-delivery.md). In case of automatic publishing and Azure Purview connection, **PowerShell script** method is required. You can find more about PowerShell script method later. Global parameters will be added as an ARM template parameter by default as they often change from environment to environment. You can enable the inclusion of global parameters in the ARM template from the **Manage** hub.
+For general use cases, we recommend including global parameters in the ARM template, because this integrates natively with the solution outlined in [the CI/CD doc](continuous-integration-delivery.md). For automatic publishing and Microsoft Purview connection, the **PowerShell script** method is required; you can find more about the PowerShell script method later in this article. Global parameters will be added as an ARM template parameter by default because they often change from environment to environment. You can enable the inclusion of global parameters in the ARM template from the **Manage** hub.
:::image type="content" source="media/author-global-parameters/include-arm-template.png" alt-text="Include in ARM template"::: > [!NOTE]
-> The **Include in ARM template** configuration is only available in "Git mode". Currently it is disabled in "live mode" or "Data Factory" mode. In case of automatic publishing or Azure Purview connection, do not use Include global parameters method; use PowerShell script method.
+> The **Include in ARM template** configuration is only available in "Git mode". Currently it is disabled in "live mode" or "Data Factory" mode. For automatic publishing or Microsoft Purview connection, do not use the Include global parameters method; use the PowerShell script method.
> [!WARNING] >You cannot use '-' in the parameter name. You will receive an error code "{"code":"BadRequest","message":"ErrorCode=InvalidTemplate,ErrorMessage=The expression >'pipeline().globalParameters.myparam-dbtest-url' is not valid: .....}". But you can use '_' in the parameter name.
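As a quick illustration of the naming rule in the warning (the parameter name and value below are hypothetical placeholders, not values from this article), a global parameter defined with underscores deploys and resolves cleanly:

```json
{
  "globalParameters": {
    "myparam_dbtest_url": {
      "type": "String",
      "value": "https://contoso.example.net"
    }
  }
}
```

Referencing it as `pipeline().globalParameters.myparam_dbtest_url` works, while a hyphenated name such as `myparam-dbtest-url` is rejected with the `InvalidTemplate` error shown in the warning.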
data-factory Ci Cd Github Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/ci-cd-github-troubleshoot-guide.md
Previously updated : 11/09/2021 Last updated : 04/18/2022 # Troubleshoot CI-CD, Azure DevOps, and GitHub issues in Azure Data Factory and Synapse Analytics
Sometimes you encounter Authentication issues like HTTP status 401. Especially w
#### Cause
-What we have observed is that the token was obtained from the original tenant, but the service is in guest tenant and trying to use the token to visit DevOps in guest tenant. This is not the expected behavior.
+The token was obtained from the original tenant, but the service is in the guest tenant and is trying to use the token to access DevOps in the guest tenant. This isn't the expected behavior.
#### Recommendation
When trying to publish changes, you get following error message:
` ### Cause
-You have detached the Git configuration and set it up again with the "Import resources" flag selected, which sets the service as "in sync". This means no change during publication..
+You have detached the Git configuration and set it up again with the "Import resources" flag selected, which sets the service as "in sync". This means no change during publication.
#### Resolution
You are unable to move a data factory from one Resource Group to another, failin
#### Resolution
-You can delete the SSIS-IR and Shared IRs to allow the move operation. If you do not want to delete the integration runtimes, then the best way is to follow the copy and clone document to do the copy and after it's done, delete the old data factory.
+You can delete the SSIS-IR and Shared IRs to allow the move operation. If you don't want to delete the integration runtimes, then the best way is to follow the copy and clone document to do the copy and after it's done, delete the old data factory.
### Unable to export and import ARM template
Until recently, it was only possible to publish a pipeline for deployments b
CI/CD process has been enhanced. The **Automated** publish feature takes, validates, and exports all ARM template features from the UI. It makes the logic consumable via a publicly available npm package [@microsoft/azure-data-factory-utilities](https://www.npmjs.com/package/@microsoft/azure-data-factory-utilities). This method allows you to programmatically trigger these actions instead of having to go to the UI and click a button. This method gives your CI/CD pipelines a **true** continuous integration experience. Follow [CI/CD Publishing Improvements](./continuous-integration-delivery-improvements.md) for details.
-### Cannot publish because of 4 MB ARM template limit
+### Cannot publish because of 4-MB ARM template limit
#### Issue
-You cannot deploy because you hit Azure Resource Manager limit of 4 MB total template size. You need a solution to deploy after crossing the limit.
+You can't deploy because you hit the Azure Resource Manager limit of 4 MB total template size. You need a solution to deploy after crossing the limit.
#### Cause
-Azure Resource Manager restricts template size to be 4-MB. Limit the size of your template to 4-MB, and each parameter file to 64 KB. The 4 MB limit applies to the final state of the template after it has been expanded with iterative resource definitions, and values for variables and parameters. But, you have crossed the limit.
+Azure Resource Manager restricts template size to 4 MB. Limit the size of your template to 4 MB and each parameter file to 64 KB. The 4-MB limit applies to the final state of the template after it has been expanded with iterative resource definitions and values for variables and parameters. In this case, you have crossed that limit.
#### Resolution For small to medium solutions, a single template is easier to understand and maintain. You can see all the resources and values in a single file. For advanced scenarios, linked templates enable you to break down the solution into targeted components. Follow best practice at [Using Linked and Nested Templates](../azure-resource-manager/templates/linked-templates.md?tabs=azure-powershell).
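As a hedged sketch of the linked-template approach described above (the storage URI, names, and API version are placeholders, not values from this article), a parent template can stay under the size limit by delegating resources to a child template through a `Microsoft.Resources/deployments` resource:

```json
{
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2021-04-01",
  "name": "childDeployment",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "https://<storageAccount>.blob.core.windows.net/templates/child.json",
      "contentVersion": "1.0.0.0"
    }
  }
}
```

Each child template is validated and expanded on its own, so splitting a large factory definition this way keeps every individual template within the limit.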
+### DevOps API limit of 20 MB causes ADF to trigger twice instead of once
+
+#### Issue
+
+While publishing ADF resources, the Azure pipeline triggers twice instead of once.
+
+#### Cause
+
+DevOps has a 20-MB REST API payload limit for ARM templates, linked templates, and global parameters. Large ADF resources are reorganized to get around GitHub API rate limits, which may rarely cause ADF DevOps APIs to hit the 20-MB limit.
+
+#### Resolution
+
+Use the ADF **Automated publish** method (preferred) or the **manual trigger** method to trigger once instead of twice.
+ ### Cannot connect to GIT Enterprise ##### Issue
-You cannot connect to GIT Enterprise because of permission issues. You can see error like **422 - Unprocessable Entity.**
+You can't connect to GIT Enterprise because of permission issues. You may see an error like **422 - Unprocessable Entity.**
#### Cause
-* You have not configured Oauth for the service.
+* You haven't configured Oauth for the service.
* Your URL is misconfigured. The repoConfiguration should be of type [FactoryGitHubConfiguration](/dotnet/api/microsoft.azure.management.datafactory.models.factorygithubconfiguration?view=azure-dotnet&preserve-view=true) #### Resolution
An instance of the service, or the resource group containing it, was deleted and
#### Cause
-It is possible to recover the instance only if source control was configured for it with DevOps or Git. This action will bring all the latest published resources, but **will not** restore any unpublished pipelines, datasets, or linked services. If there is no Source control, recovering a deleted instance from the Azure backend is not possible because once the service receives the delete command, the instance is permanently deleted without any backup.
+It is possible to recover the instance only if source control was configured for it with DevOps or Git. This action will bring back all the latest published resources, but **will not** restore any unpublished pipelines, datasets, or linked services. If there is no source control, recovering a deleted instance from the Azure backend isn't possible, because once the service receives the delete command, the instance is permanently deleted without any backup.
#### Resolution
To recover a deleted service instance that has source control configured, refer
* If there was a Self-hosted Integration Runtime in a deleted data factory or Synapse workspace, a new instance of the IR must be created in a new factory or workspace. The on-premises or virtual machine IR instance must be uninstalled and reinstalled, and a new key obtained. After setup of the new IR is completed, the Linked Service must be updated to point to the new IR and the connection tested again, or it will fail with error **invalid reference.**
-### Cannot deploy to different stage using automatic publish method
+### Can't deploy to different stage using automatic publish method
#### Issue Customer followed all necessary steps like installing NPM package and setting up a higher stage using Azure DevOps, but deployment still fails.
The following section is not valid because the package.json folder is not valid.
``` It should have DataFactory included in customCommand like *'run build validate $(Build.Repository.LocalPath)/DataFactory/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/testResourceGroup/providers/Microsoft.DataFactory/factories/yourFactoryName'*. Make sure the generated YAML file for the higher stage has the required JSON artifacts.
-### Git Repository or Azure Purview Connection Disconnected
+### Git Repository or Microsoft Purview Connection Disconnected
#### Issue When deploying a service instance, the Git repository or Microsoft Purview connection is disconnected.
You can monitor the pipeline using **SDK**, **Azure Monitor** or [Monitor](./mon
You want to perform unit testing during development and deployment of your pipelines. #### Cause
-During development and deployment cycles, you may want to unit test your pipeline before you manually or automatically publish your pipeline. Test automation allows you to run more tests, in less time, with guaranteed repeatability. Automatically re-testing all your pipelines before deployment gives you some protection against regression faults. Automated testing is a key component of CI/CD software development approaches: inclusion of automated tests in CI/CD deployment pipelines can significantly improve quality. In long run, tested pipeline artifacts are reused saving you cost and time.
+During development and deployment cycles, you may want to unit test your pipeline before you manually or automatically publish it. Test automation allows you to run more tests in less time, with guaranteed repeatability. Automatically retesting all your pipelines before deployment gives you some protection against regression faults. Automated testing is a key component of CI/CD software development approaches: including automated tests in CI/CD deployment pipelines can significantly improve quality. In the long run, tested pipeline artifacts are reused, saving you cost and time.
#### Resolution Because customers may have different unit testing requirements with different skillsets, the usual practice is to follow these steps:
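One concrete shape such a pipeline test often takes is triggering a run and polling until it reaches a terminal status. The sketch below is a generic, hedged polling helper: `get_status` and the status strings are stand-ins for whatever pipeline-run API you use, not a specific ADF SDK call.

```python
import time

# Terminal pipeline-run states (assumed naming; adjust to your API's values).
TERMINAL = {"Succeeded", "Failed", "Cancelled"}

def wait_for_run(get_status, timeout_s=600, poll_s=5):
    """Poll get_status() until the run reaches a terminal state or times out.

    get_status is any callable returning the current run status string,
    e.g. a thin wrapper around your pipeline-run REST call.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in TERMINAL:
            return status
        time.sleep(poll_s)
    raise TimeoutError("pipeline run did not finish in time")

# A unit test can drive this with a stubbed status sequence instead of a live run:
statuses = iter(["Queued", "InProgress", "Succeeded"])
assert wait_for_run(lambda: next(statuses), timeout_s=5, poll_s=0) == "Succeeded"
```

Stubbing the status source this way lets the test run in CI with no Azure connection, which is what makes automated retesting before every deployment cheap.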
If you want to share integration runtimes across all stages, consider using a te
### GIT publish may fail because of PartialTempTemplates files #### Issue
-When you have 1000s of old temporary ARM json files in PartialTemplates folder, publish may fail.
+When you have thousands of old temporary ARM JSON files in the PartialTemplates folder, publish may fail.
#### Cause On publish, ADF fetches every file inside each folder in the collaboration branch. In the past, publishing generated two folders in the publish branch: PartialArmTemplates and LinkedTemplates. PartialArmTemplates files are no longer generated. However, because there can be many old files (thousands) in the PartialArmTemplates folder, this may result in many requests being made to GitHub on publish and the rate limit being hit.
For more help with troubleshooting, try the following resources:
* [Data Factory feature requests](/answers/topics/azure-data-factory.html) * [Azure videos](https://azure.microsoft.com/resources/videos/index/?sort=newest&services=data-factory) * [Stack overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
-* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
+* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
data-factory Connect Data Factory To Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connect-data-factory-to-azure-purview.md
Title: Connect a Data Factory to Azure Purview
-description: Learn about how to connect a Data Factory to Azure Purview
+ Title: Connect a Data Factory to Microsoft Purview
+description: Learn about how to connect a Data Factory to Microsoft Purview
Last updated 10/25/2021
-# Connect Data Factory to Azure Purview (Preview)
+# Connect Data Factory to Microsoft Purview (Preview)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
-[Azure Purview](../purview/overview.md) is a unified data governance service that helps you manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. You can connect your data factory to Azure Purview. That connection allows you to use Azure Purview for capturing lineage data, and to discover and explore Azure Purview assets.
+[Microsoft Purview](../purview/overview.md) is a unified data governance service that helps you manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. You can connect your data factory to Microsoft Purview. That connection allows you to use Microsoft Purview for capturing lineage data, and to discover and explore Microsoft Purview assets.
-## Connect Data Factory to Azure Purview
+## Connect Data Factory to Microsoft Purview
-You have two options to connect data factory to Azure Purview:
+You have two options to connect data factory to Microsoft Purview:
-- [Connect to Azure Purview account in Data Factory](#connect-to-azure-purview-account-in-data-factory)-- [Register Data Factory in Azure Purview](#register-data-factory-in-azure-purview)
+- [Connect to Microsoft Purview account in Data Factory](#connect-to-microsoft-purview-account-in-data-factory)
+- [Register Data Factory in Microsoft Purview](#register-data-factory-in-microsoft-purview)
-### Connect to Azure Purview account in Data Factory
+### Connect to Microsoft Purview account in Data Factory
-You need to have **Owner** or **Contributor** role on your data factory to connect to an Azure Purview account.
+You need to have **Owner** or **Contributor** role on your data factory to connect to a Microsoft Purview account.
To establish the connection on Data Factory authoring UI:
-1. In the ADF authoring UI, go to **Manage** -> **Azure Purview**, and select **Connect to an Azure Purview account**.
+1. In the ADF authoring UI, go to **Manage** -> **Microsoft Purview**, and select **Connect to a Microsoft Purview account**.
- :::image type="content" source="./media/data-factory-purview/register-purview-account.png" alt-text="Screenshot for registering an Azure Purview account.":::
+ :::image type="content" source="./media/data-factory-purview/register-purview-account.png" alt-text="Screenshot for registering a Microsoft Purview account.":::
2. Choose **From Azure subscription** or **Enter manually**. **From Azure subscription**, you can select the account that you have access to.
-3. Once connected, you can see the name of the Azure Purview account in the tab **Azure Purview account**.
+3. Once connected, you can see the name of the Microsoft Purview account in the tab **Microsoft Purview account**.
-If your Azure Purview account is protected by firewall, create the managed private endpoints for Azure Purview. Learn more about how to let Data Factory [access a secured Azure Purview account](how-to-access-secured-purview-account.md). You can either do it during the initial connection or edit an existing connection later.
+If your Microsoft Purview account is protected by firewall, create the managed private endpoints for Microsoft Purview. Learn more about how to let Data Factory [access a secured Microsoft Purview account](how-to-access-secured-purview-account.md). You can either do it during the initial connection or edit an existing connection later.
-The Azure Purview connection information is stored in the data factory resource like the following. To establish the connection programmatically, you can update the data factory and add the `purviewConfiguration` settings. When you want to push lineage from SSIS activities, also add `catalogUri` tag additionally.
+The Microsoft Purview connection information is stored in the data factory resource like the following. To establish the connection programmatically, you can update the data factory and add the `purviewConfiguration` settings. When you want to push lineage from SSIS activities, also add the `catalogUri` tag.
```json {
The Azure Purview connection information is stored in the data factory resource
} ```
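A hedged sketch of what that stored configuration can look like (subscription, resource group, and account names are placeholders; only the `purviewConfiguration` setting and `catalogUri` tag come from the text above):

```json
{
  "name": "<factoryName>",
  "type": "Microsoft.DataFactory/factories",
  "properties": {
    "purviewConfiguration": {
      "purviewResourceId": "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Purview/accounts/<purviewAccountName>"
    }
  },
  "tags": {
    "catalogUri": "<purviewAccountName>.purview.azure.com/catalog"
  }
}
```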
-### Register Data Factory in Azure Purview
+### Register Data Factory in Microsoft Purview
-For how to register Data Factory in Azure Purview, see [How to connect Azure Data Factory and Azure Purview](../purview/how-to-link-azure-data-factory.md).
+For how to register Data Factory in Microsoft Purview, see [How to connect Azure Data Factory and Microsoft Purview](../purview/how-to-link-azure-data-factory.md).
## Set up authentication
-Data factory's managed identity is used to authenticate lineage push operations from data factory to Azure Purview.
+Data factory's managed identity is used to authenticate lineage push operations from data factory to Microsoft Purview.
-Grant the data factory's managed identity **Data Curator** role on your Azure Purview **root collection**. Learn more about [Access control in Azure Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+Grant the data factory's managed identity **Data Curator** role on your Microsoft Purview **root collection**. Learn more about [Access control in Microsoft Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
-When connecting data factory to Azure Purview on authoring UI, ADF tries to add such role assignment automatically. If you have **Collection admins** role on the Azure Purview root collection and have access to Azure Purview account from your network, this operation is done successfully.
+When connecting data factory to Microsoft Purview on authoring UI, ADF tries to add such role assignment automatically. If you have **Collection admins** role on the Microsoft Purview root collection and have access to Microsoft Purview account from your network, this operation is done successfully.
-## Monitor Azure Purview connection
+## Monitor Microsoft Purview connection
-Once you connect the data factory to an Azure Purview account, you see the following page with details on the enabled integration capabilities.
+Once you connect the data factory to a Microsoft Purview account, you see the following page with details on the enabled integration capabilities.
For **Data Lineage - Pipeline**, you may see one of below status: -- **Connected**: The data factory is successfully connected to the Azure Purview account. Note this indicates data factory is associated with an Azure Purview account and has permission to push lineage to it. If your Azure Purview account is protected by firewall, you also need to make sure the integration runtime used to execute the activities and conduct lineage push can reach the Azure Purview account. Learn more from [Access a secured Azure Purview account from Azure Data Factory](how-to-access-secured-purview-account.md).-- **Disconnected**: The data factory cannot push lineage to Azure Purview because Azure Purview Data Curator role is not granted to data factory's managed identity. To fix this issue, go to your Azure Purview account to check the role assignments, and manually grant the role as needed. Learn more from [Set up authentication](#set-up-authentication) section.
+- **Connected**: The data factory is successfully connected to the Microsoft Purview account. Note that this indicates the data factory is associated with a Microsoft Purview account and has permission to push lineage to it. If your Microsoft Purview account is protected by firewall, you also need to make sure the integration runtime used to execute the activities and conduct lineage push can reach the Microsoft Purview account. Learn more from [Access a secured Microsoft Purview account from Azure Data Factory](how-to-access-secured-purview-account.md).
+- **Disconnected**: The data factory cannot push lineage to Microsoft Purview because the Microsoft Purview Data Curator role is not granted to the data factory's managed identity. To fix this issue, go to your Microsoft Purview account to check the role assignments, and manually grant the role as needed. Learn more in the [Set up authentication](#set-up-authentication) section.
- **Unknown**: Data Factory cannot check the status. Possible reasons are:
- - Cannot reach the Azure Purview account from your current network because the account is protected by firewall. You can launch the ADF UI from a private network that has connectivity to your Azure Purview account instead.
- - You don't have permission to check role assignments on the Azure Purview account. You can contact the Azure Purview account admin to check the role assignments for you. Learn about the needed Azure Purview role from [Set up authentication](#set-up-authentication) section.
+ - Cannot reach the Microsoft Purview account from your current network because the account is protected by a firewall. You can launch the ADF UI from a private network that has connectivity to your Microsoft Purview account instead.
+ - You don't have permission to check role assignments on the Microsoft Purview account. You can contact the Microsoft Purview account admin to check the role assignments for you. Learn about the needed Microsoft Purview role in the [Set up authentication](#set-up-authentication) section.
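The three connection statuses described above follow a simple decision order, which can be sketched in Python. This is an illustrative model only; the actual check is performed by the ADF service, and the function and parameter names here are hypothetical:

```python
def purview_connection_status(can_check_roles: bool,
                              account_reachable: bool,
                              curator_role_granted: bool) -> str:
    """Illustrative model of the 'Data Lineage - Pipeline' status.

    - "Unknown": the status cannot be evaluated, either because the
      Purview account is unreachable (firewall) or because the caller
      lacks permission to read role assignments.
    - "Connected"/"Disconnected": determined by whether the factory's
      managed identity holds the Purview Data Curator role.
    """
    if not account_reachable or not can_check_roles:
        return "Unknown"
    return "Connected" if curator_role_granted else "Disconnected"
```

The ordering matters: reachability and permission problems mask the role check, which is why "Unknown" does not necessarily mean the role is missing.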
-## Report lineage data to Azure Purview
+## Report lineage data to Microsoft Purview
-Once you connect the data factory to an Azure Purview account, when you execute pipelines, Data Factory push lineage information to the Azure Purview account. For detailed supported capabilities, see [Supported Azure Data Factory activities](../purview/how-to-link-azure-data-factory.md#supported-azure-data-factory-activities). For an end to end walkthrough, refer to [Tutorial: Push Data Factory lineage data to Azure Purview](tutorial-push-lineage-to-purview.md).
+Once you connect the data factory to a Microsoft Purview account, Data Factory pushes lineage information to the Microsoft Purview account when you execute pipelines. For detailed supported capabilities, see [Supported Azure Data Factory activities](../purview/how-to-link-azure-data-factory.md#supported-azure-data-factory-activities). For an end-to-end walkthrough, refer to [Tutorial: Push Data Factory lineage data to Microsoft Purview](tutorial-push-lineage-to-purview.md).
-## Discover and explore data using Azure Purview
+## Discover and explore data using Microsoft Purview
-Once you connect the data factory to an Azure Purview account, you can use the search bar at the top center of Data Factory authoring UI to search for data and perform actions. Learn more from [Discover and explore data in ADF using Azure Purview](how-to-discover-explore-purview-data.md).
+Once you connect the data factory to a Microsoft Purview account, you can use the search bar at the top center of Data Factory authoring UI to search for data and perform actions. Learn more from [Discover and explore data in ADF using Microsoft Purview](how-to-discover-explore-purview-data.md).
## Next steps
-[Tutorial: Push Data Factory lineage data to Azure Purview](tutorial-push-lineage-to-purview.md)
+[Tutorial: Push Data Factory lineage data to Microsoft Purview](tutorial-push-lineage-to-purview.md)
-[Discover and explore data in ADF using Azure Purview](how-to-discover-explore-purview-data.md)
+[Discover and explore data in ADF using Microsoft Purview](how-to-discover-explore-purview-data.md)
-[Access a secured Azure Purview account](how-to-access-secured-purview-account.md)
+[Access a secured Microsoft Purview account](how-to-access-secured-purview-account.md)
data-factory Connector Dynamics Crm Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-dynamics-crm-office-365.md
Previously updated : 01/10/2022 Last updated : 04/12/2022 # Copy and transform data in Dynamics 365 (Microsoft Dataverse) or Dynamics CRM using Azure Data Factory or Azure Synapse Analytics
To copy data from Dynamics, the copy activity **source** section supports the fo
> [!IMPORTANT] >- When you copy data from Dynamics, explicit column mapping from Dynamics to sink is optional. But we highly recommend the mapping to ensure a deterministic copy result.
->- When the service imports a schema in the authoring UI, it infers the schema. It does so by sampling the top rows from the Dynamics query result to initialize the source column list. In that case, columns with no values in the top rows are omitted. The same behavior applies to copy executions if there is no explicit mapping. You can review and add more columns into the mapping, which are honored during copy runtime.
+>- When the service imports a schema in the authoring UI, it infers the schema. It does so by sampling the top rows from the Dynamics query result to initialize the source column list. In that case, columns with no values in the top rows are omitted. The same behavior also applies to data preview and copy executions if there is no explicit mapping. You can review and add more columns into the mapping, which are honored during copy runtime.
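The sampling rule described above can be modeled in a few lines of Python. This is a sketch of the documented inference behavior, not the service's actual implementation, and the sample size is an arbitrary placeholder:

```python
def infer_columns(rows, sample_size=10):
    """Infer a source column list by sampling the top rows of a query result.

    Mimics the documented rule: a column appears in the inferred schema only
    if at least one sampled row has a value for it; columns empty in every
    sampled row are omitted.
    """
    inferred = []
    for row in rows[:sample_size]:
        for column, value in row.items():
            if value is not None and column not in inferred:
                inferred.append(column)
    return inferred
```

This is also why adding an explicit mapping is recommended: a sparsely populated column can be silently dropped by sampling alone.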
#### Example
data-factory Control Flow Lookup Activity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/control-flow-lookup-activity.md
Previously updated : 09/09/2021 Last updated : 04/06/2022 # Lookup activity in Azure Data Factory and Azure Synapse Analytics
Note the following:
- The Lookup activity can return up to **5000 rows**; if the result set contains more records, the first 5000 rows will be returned. - The Lookup activity output supports up to **4 MB** in size, activity will fail if the size exceeds the limit. - The longest duration for Lookup activity before timeout is **24 hours**.-- When you use query or stored procedure to lookup data, make sure to return one and exact one result set. Otherwise, Lookup activity fails.+
+> [!Note]
+> When you use a query or stored procedure to look up data, make sure it returns exactly one result set. Otherwise, the Lookup activity fails.
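The documented Lookup limits (5,000 rows, 4 MB output, exactly one result set) can be expressed as a small pre-flight check. The following Python sketch is illustrative only; the truncation and failure behavior is enforced by the service, not by user code:

```python
import json

MAX_ROWS = 5000
MAX_OUTPUT_BYTES = 4 * 1024 * 1024  # 4 MB output limit

def validate_lookup_result(result_sets):
    """Apply the documented Lookup activity limits to a list of result sets.

    Raises if anything other than exactly one result set came back, keeps
    only the first 5,000 rows, and fails if the serialized output exceeds
    the 4 MB limit.
    """
    if len(result_sets) != 1:
        raise ValueError(f"Lookup requires exactly one result set, got {len(result_sets)}")
    rows = result_sets[0][:MAX_ROWS]  # rows beyond 5,000 are dropped
    if len(json.dumps(rows).encode("utf-8")) > MAX_OUTPUT_BYTES:
        raise ValueError("Lookup output exceeds the 4 MB limit")
    return rows
```

Note that the row limit truncates silently while the size limit fails the activity, matching the two behaviors described above.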
The following data sources are supported for Lookup activity.
data-factory Data Factory Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/data-factory-private-link.md
Finally, you must create the private endpoint in your data factory.
## Restrict access for data factory resources using private link
-If you want to restrict access for data factory resources in your subscriptions by private link, please follow [Use portal to create private link for managing Azure resources](https://docs.microsoft.com/azure/azure-resource-manager/management/create-private-link-access-portal?source=docs)
+If you want to restrict access for data factory resources in your subscriptions by private link, please follow [Use portal to create private link for managing Azure resources](../azure-resource-manager/management/create-private-link-access-portal.md?source=docs)
## Known issue You are unable to access each other's PaaS resources when both sides are exposed to Private Link and a private endpoint. This is a known limitation of Private Link and private endpoints.
For example, if A is using a private link to access the portal of data factory A
- [Create a data factory by using the Azure Data Factory UI](quickstart-create-data-factory-portal.md) - [Introduction to Azure Data Factory](introduction.md)-- [Visual authoring in Azure Data Factory](author-visually.md)
+- [Visual authoring in Azure Data Factory](author-visually.md)
data-factory Data Factory Tutorials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/data-factory-tutorials.md
Below is a list of tutorials to help explain and walk through a series of Data F
## Data lineage
-[Azure Purview](turorial-push-lineage-to-purview.md)
+[Microsoft Purview](turorial-push-lineage-to-purview.md)
## Next steps Learn more about Data Factory [pipelines](concepts-pipelines-activities.md) and [data flows](concepts-data-flow-overview.md).
data-factory How To Access Secured Purview Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/how-to-access-secured-purview-account.md
Title: Access a secured Azure Purview account
-description: Learn about how to access a firewall protected Azure Purview account through private endpoints from Azure Data Factory
+ Title: Access a secured Microsoft Purview account
+description: Learn about how to access a firewall protected Microsoft Purview account through private endpoints from Azure Data Factory
Last updated 09/02/2021
-# Access a secured Azure Purview account from Azure Data Factory
+# Access a secured Microsoft Purview account from Azure Data Factory
-This article describes how to access a secured Azure Purview account from Azure Data Factory for different integration scenarios.
+This article describes how to access a secured Microsoft Purview account from Azure Data Factory for different integration scenarios.
-## Azure Purview private endpoint deployment scenarios
+## Microsoft Purview private endpoint deployment scenarios
-You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Azure Purview accounts to allow secure access from a virtual network (VNet) to the catalog over a Private Link. Azure Purview provides different types of private points for various access need: *account* private endpoint, *portal* private endpoint, and *ingestion* private endpoints. Learn more from [Azure Purview private endpoints conceptual overview](../purview/catalog-private-link.md#conceptual-overview).
+You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Microsoft Purview accounts to allow secure access from a virtual network (VNet) to the catalog over a Private Link. Microsoft Purview provides different types of private endpoints for various access needs: *account* private endpoint, *portal* private endpoint, and *ingestion* private endpoints. Learn more from [Microsoft Purview private endpoints conceptual overview](../purview/catalog-private-link.md#conceptual-overview).
-If your Azure Purview account is protected by firewall and denies public access, make sure you follow below checklist to set up the private endpoints so Data Factory can successfully connect to Azure Purview.
+If your Microsoft Purview account is protected by a firewall and denies public access, make sure you follow the checklist below to set up the private endpoints so Data Factory can successfully connect to Microsoft Purview.
-| Scenario | Required Azure Purview private endpoints |
+| Scenario | Required Microsoft Purview private endpoints |
| | |
-| [Run pipeline and report lineage to Azure Purview](tutorial-push-lineage-to-purview.md) | For Data Factory pipeline to push lineage to Azure Purview, Azure Purview ***account*** and ***ingestion*** private endpoints are required. <br>- When using **Azure Integration Runtime**, follow the steps in [Managed private endpoints for Azure Purview](#managed-private-endpoints-for-azure-purview) section to create managed private endpoints in the Data Factory managed virtual network.<br>- When using **Self-hosted Integration Runtime**, follow the steps in [this section](../purview/catalog-private-link-end-to-end.md#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-azure-purview-accounts) to create the *account* and *ingestion* private endpoints in your integration runtime's virtual network. |
-| [Discover and explore data using Azure Purview on ADF UI](how-to-discover-explore-purview-data.md) | To use the search bar at the top center of Data Factory authoring UI to search for Azure Purview data and perform actions, you need to create Azure Purview ***account*** and ***portal*** private endpoints in the virtual network that you launch the Data Factory Studio. Follow the steps in [Enable *account* and *portal* private endpoint](../purview/catalog-private-link-account-portal.md#option-2enable-account-and-portal-private-endpoint-on-existing-azure-purview-accounts). |
+| [Run pipeline and report lineage to Microsoft Purview](tutorial-push-lineage-to-purview.md) | For a Data Factory pipeline to push lineage to Microsoft Purview, Microsoft Purview ***account*** and ***ingestion*** private endpoints are required. <br>- When using **Azure Integration Runtime**, follow the steps in the [Managed private endpoints for Microsoft Purview](#managed-private-endpoints-for-microsoft-purview) section to create managed private endpoints in the Data Factory managed virtual network.<br>- When using **Self-hosted Integration Runtime**, follow the steps in [this section](../purview/catalog-private-link-end-to-end.md#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-microsoft-purview-accounts) to create the *account* and *ingestion* private endpoints in your integration runtime's virtual network. |
+| [Discover and explore data using Microsoft Purview on ADF UI](how-to-discover-explore-purview-data.md) | To use the search bar at the top center of the Data Factory authoring UI to search for Microsoft Purview data and perform actions, you need to create Microsoft Purview ***account*** and ***portal*** private endpoints in the virtual network from which you launch the Data Factory Studio. Follow the steps in [Enable *account* and *portal* private endpoint](../purview/catalog-private-link-account-portal.md#option-2enable-account-and-portal-private-endpoint-on-existing-microsoft-purview-accounts). |
-## Managed private endpoints for Azure Purview
+## Managed private endpoints for Microsoft Purview
-[Managed private endpoints](managed-virtual-network-private-endpoint.md#managed-private-endpoints) are private endpoints created in the Azure Data Factory Managed Virtual Network establishing a private link to Azure resources. When you run pipeline and report lineage to a firewall protected Azure Purview account, create an Azure Integration Runtime with "Virtual network configuration" option enabled, then create the Azure Purview ***account*** and ***ingestion*** managed private endpoints as follows.
+[Managed private endpoints](managed-virtual-network-private-endpoint.md#managed-private-endpoints) are private endpoints created in the Azure Data Factory Managed Virtual Network establishing a private link to Azure resources. When you run pipelines and report lineage to a firewall-protected Microsoft Purview account, create an Azure Integration Runtime with the "Virtual network configuration" option enabled, then create the Microsoft Purview ***account*** and ***ingestion*** managed private endpoints as follows.
### Create managed private endpoints
-To create managed private endpoints for Azure Purview on Data Factory authoring UI:
+To create managed private endpoints for Microsoft Purview on Data Factory authoring UI:
-1. Go to **Manage** -> **Azure Purview**, and click **Edit** to edit your existing connected Azure Purview account or click **Connect to an Azure Purview account** to connect to a new Azure Purview account.
+1. Go to **Manage** -> **Microsoft Purview**, and click **Edit** to edit your existing connected Microsoft Purview account or click **Connect to a Microsoft Purview account** to connect to a new Microsoft Purview account.
2. Select **Yes** for **Create managed private endpoints**. You need to have at least one Azure Integration Runtime with "Virtual network configuration" option enabled in the data factory to see this option.
-3. Click **+ Create all** button to batch create the needed Azure Purview private endpoints, including the ***account*** private endpoint and the ***ingestion*** private endpoints for the Azure Purview managed resources - Blob storage, Queue storage, and Event Hubs namespace. You need to have at least **Reader** role on your Azure Purview account for Data Factory to retrieve the Azure Purview managed resources' information.
+3. Click the **+ Create all** button to batch create the needed Microsoft Purview private endpoints, including the ***account*** private endpoint and the ***ingestion*** private endpoints for the Microsoft Purview managed resources - Blob storage, Queue storage, and Event Hubs namespace. You need at least the **Reader** role on your Microsoft Purview account for Data Factory to retrieve the Microsoft Purview managed resources' information.
- :::image type="content" source="./media/how-to-access-secured-purview-account/purview-create-all-managed-private-endpoints.png" alt-text="Create managed private endpoint for your connected Azure Purview account.":::
+ :::image type="content" source="./media/how-to-access-secured-purview-account/purview-create-all-managed-private-endpoints.png" alt-text="Create managed private endpoint for your connected Microsoft Purview account.":::
4. On the next page, specify a name for the private endpoint. The name is also used, with suffixes, to generate names for the ingestion private endpoints.
- :::image type="content" source="./media/how-to-access-secured-purview-account/name-purview-private-endpoints.png" alt-text="Name the managed private endpoints for your connected Azure Purview account.":::
+ :::image type="content" source="./media/how-to-access-secured-purview-account/name-purview-private-endpoints.png" alt-text="Name the managed private endpoints for your connected Microsoft Purview account.":::
-5. Click **Create** to create the private endpoints. After creation, 4 private endpoint requests will be generated that must [get approved by an owner of Azure Purview](#approve-private-endpoint-connections).
+5. Click **Create** to create the private endpoints. After creation, 4 private endpoint requests will be generated that must [get approved by an owner of Microsoft Purview](#approve-private-endpoint-connections).
-Such batch managed private endpoint creation is provided on the Azure Purview UI only. If you want to create the managed private endpoints programmatically, you need to create those PEs individually. You can find Azure Purview managed resources' information from Azure portal -> your Azure Purview account -> Managed resources.
+Such batch managed private endpoint creation is provided on the Microsoft Purview UI only. If you want to create the managed private endpoints programmatically, you need to create those private endpoints individually. You can find the Microsoft Purview managed resources' information from Azure portal -> your Microsoft Purview account -> Managed resources.
### Approve private endpoint connections
-After you create the managed private endpoints for Azure Purview, you see "Pending" state first. The Azure Purview owner need to approve the private endpoint connections for each resource.
+After you create the managed private endpoints for Microsoft Purview, they are in a "Pending" state at first. The Microsoft Purview owner needs to approve the private endpoint connections for each resource.
-If you have permission to approve the Azure Purview private endpoint connection, from Data Factory UI:
+If you have permission to approve the Microsoft Purview private endpoint connection, from Data Factory UI:
-1. Go to **Manage** -> **Azure Purview** -> **Edit**
+1. Go to **Manage** -> **Microsoft Purview** -> **Edit**
2. In the private endpoint list, click the **Edit** (pencil) button next to each private endpoint name 3. Click **Manage approvals in Azure portal** which will bring you to the resource. 4. On the given resource, go to **Networking** -> **Private endpoint connection** to approve it. The private endpoint is named as `data_factory_name.your_defined_private_endpoint_name` with description as "Requested by data_factory_name". 5. Repeat this operation for all private endpoints.
-If you don't have permission to approve the Azure Purview private endpoint connection, ask the Azure Purview account owner to do as follows.
+If you don't have permission to approve the Microsoft Purview private endpoint connection, ask the Microsoft Purview account owner to do as follows.
-- For *account* private endpoint, go to Azure portal -> your Azure Purview account -> Networking -> Private endpoint connection to approve.-- For *ingestion* private endpoints, go to Azure portal -> your Azure Purview account -> Managed resources, click into the Storage account and Event Hubs namespace respectively, and approve the private endpoint connection in Networking -> Private endpoint connection page.
+- For *account* private endpoint, go to Azure portal -> your Microsoft Purview account -> Networking -> Private endpoint connection to approve.
+- For *ingestion* private endpoints, go to Azure portal -> your Microsoft Purview account -> Managed resources, click into the Storage account and Event Hubs namespace respectively, and approve the private endpoint connection in Networking -> Private endpoint connection page.
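Whichever route is used, each approval ultimately sets the connection's `privateLinkServiceConnectionState` to `Approved` on the target resource. The following Python sketch builds the generic ARM request body for that operation; the field names follow the common ARM private endpoint connection schema, but this is an assumption to verify against the current API version, and the caller must still supply the resource ID, API version, and credentials:

```python
def approval_body(description="Approved for ADF managed private endpoint"):
    """Build the ARM PUT body that approves a private endpoint connection.

    Mirrors the generic ARM shape for privateEndpointConnections; it is a
    sketch, not a complete client.
    """
    return {
        "properties": {
            "privateLinkServiceConnectionState": {
                "status": "Approved",
                "description": description,
            }
        }
    }
```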
### Monitor managed private endpoints
-You can monitor the created managed private endpoints for Azure Purview at two places:
+You can monitor the created managed private endpoints for Microsoft Purview at two places:
-- Go to **Manage** -> **Azure Purview** -> **Edit** to open your existing connected Azure Purview account. To see all the relevant private endpoints, you need to have at least **Reader** role on your Azure Purview account for Data Factory to retrieve the Azure Purview managed resources' information. Otherwise, you only see *account* private endpoint with warning.-- Go to **Manage** -> **Managed private endpoints** where you see all the managed private endpoints created under the data factory. If you have at least **Reader** role on your Azure Purview account, you see Azure Purview relevant private endpoints being grouped together. Otherwise, they show up separately in the list.
+- Go to **Manage** -> **Microsoft Purview** -> **Edit** to open your existing connected Microsoft Purview account. To see all the relevant private endpoints, you need at least the **Reader** role on your Microsoft Purview account for Data Factory to retrieve the Microsoft Purview managed resources' information. Otherwise, you only see the *account* private endpoint with a warning.
+- Go to **Manage** -> **Managed private endpoints** where you see all the managed private endpoints created under the data factory. If you have at least **Reader** role on your Microsoft Purview account, you see Microsoft Purview relevant private endpoints being grouped together. Otherwise, they show up separately in the list.
## Next steps -- [Connect Data Factory to Azure Purview](connect-data-factory-to-azure-purview.md)-- [Tutorial: Push Data Factory lineage data to Azure Purview](tutorial-push-lineage-to-purview.md)-- [Discover and explore data in ADF using Azure Purview](how-to-discover-explore-purview-data.md)
+- [Connect Data Factory to Microsoft Purview](connect-data-factory-to-azure-purview.md)
+- [Tutorial: Push Data Factory lineage data to Microsoft Purview](tutorial-push-lineage-to-purview.md)
+- [Discover and explore data in ADF using Microsoft Purview](how-to-discover-explore-purview-data.md)
data-factory How To Discover Explore Purview Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/how-to-discover-explore-purview-data.md
Title: Discover and explore data in ADF using Azure Purview
-description: Learn how to discover, explore data in Azure Data Factory using Azure Purview
+ Title: Discover and explore data in ADF using Microsoft Purview
+description: Learn how to discover, explore data in Azure Data Factory using Microsoft Purview
Last updated 08/10/2021
-# Discover and explore data in ADF using Azure Purview
+# Discover and explore data in ADF using Microsoft Purview
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
-In this article, you will register an Azure Purview Account to a Data Factory. That connection allows you to discover Azure Purview assets and interact with them through ADF capabilities.
+In this article, you will register a Microsoft Purview Account to a Data Factory. That connection allows you to discover Microsoft Purview assets and interact with them through ADF capabilities.
You can perform the following tasks in ADF: -- Use the search box at the top to find Azure Purview assets based on keywords
+- Use the search box at the top to find Microsoft Purview assets based on keywords
- Understand the data based on metadata, lineage, annotations - Connect those data to your data factory with linked services or datasets ## Prerequisites -- [Azure Purview account](../purview/create-catalog-portal.md)
+- [Microsoft Purview account](../purview/create-catalog-portal.md)
- [Data Factory](./quickstart-create-data-factory-portal.md) -- [Connect an Azure Purview Account into Data Factory](./connect-data-factory-to-azure-purview.md)
+- [Connect a Microsoft Purview Account into Data Factory](./connect-data-factory-to-azure-purview.md)
-## Using Azure Purview in Data Factory
+## Using Microsoft Purview in Data Factory
-The use Azure Purview in Data Factory requires you to have access to that Azure Purview account. Data Factory passes-through your Azure Purview permission. As an example, if you have a curator permission role, you will be able to edit metadata scanned by Azure Purview.
+Using Microsoft Purview in Data Factory requires you to have access to that Microsoft Purview account. Data Factory passes through your Microsoft Purview permissions. For example, if you have a curator permission role, you will be able to edit metadata scanned by Microsoft Purview.
### Data discovery: search datasets
-To discover data registered and scanned by Azure Purview, you can use the Search bar at the top center of Data Factory portal. Make sure that you select Azure Purview to search for all of your organization data.
+To discover data registered and scanned by Microsoft Purview, you can use the Search bar at the top center of the Data Factory portal. Make sure that you select Microsoft Purview to search for all of your organization's data.
:::image type="content" source="./media/data-factory-purview/search-dataset.png" alt-text="Screenshot for performing over datasets."::: ### Actions that you can perform over datasets with Data Factory resources
-You can directly create Linked Service, Dataset, or dataflow over the data you search by Azure Purview.
+You can directly create Linked Service, Dataset, or dataflow over the data you search by Microsoft Purview.
## Next steps
-[Tutorial: Push Data Factory lineage data to Azure Purview](turorial-push-lineage-to-purview.md)
+[Tutorial: Push Data Factory lineage data to Microsoft Purview](turorial-push-lineage-to-purview.md)
-[Connect an Azure Purview Account into Data Factory](connect-data-factory-to-azure-purview.md)
+[Connect a Microsoft Purview Account into Data Factory](connect-data-factory-to-azure-purview.md)
-[How to Search Data in Azure Purview Data Catalog](../purview/how-to-search-catalog.md)
+[How to Search Data in Microsoft Purview Data Catalog](../purview/how-to-search-catalog.md)
data-factory Managed Virtual Network Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/managed-virtual-network-private-endpoint.md
The following data sources have native Private Endpoint support and can be conne
- Azure Key Vault - Azure Machine Learning - Azure Private Link Service-- Azure Purview
+- Microsoft Purview
- Azure SQL Database - Azure SQL Managed Instance - (public preview) - Azure Synapse Analytics
data-factory Tutorial Push Lineage To Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-push-lineage-to-purview.md
Title: Push Data Factory lineage data to Azure Purview
-description: Learn about how to push Data Factory lineage data to Azure Purview
+ Title: Push Data Factory lineage data to Microsoft Purview
+description: Learn about how to push Data Factory lineage data to Microsoft Purview
Last updated 08/10/2021
-# Push Data Factory lineage data to Azure Purview
+# Push Data Factory lineage data to Microsoft Purview
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
-In this tutorial, you'll use the Data Factory user interface (UI) to create a pipeline that run activities and report lineage data to Azure Purview account. Then you can view all the lineage information in your Azure Purview account.
+In this tutorial, you'll use the Data Factory user interface (UI) to create a pipeline that runs activities and reports lineage data to a Microsoft Purview account. Then you can view all the lineage information in your Microsoft Purview account.
Currently, lineage is supported for Copy, Data Flow, and Execute SSIS activities. Learn more details on the supported capabilities from [Supported Azure Data Factory activities](../purview/how-to-link-azure-data-factory.md#supported-azure-data-factory-activities).
Currently, lineage is supported for Copy, Data Flow, and Execute SSIS activities
* **Azure subscription**. If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin. * **Azure Data Factory**. If you don't have an Azure Data Factory, see [Create an Azure Data Factory](./quickstart-create-data-factory-portal.md).
-* **Azure Purview account**. The Azure Purview account captures all lineage data generated by data factory. If you don't have an Azure Purview account, see [Create an Azure Purview](../purview/create-catalog-portal.md).
+* **Microsoft Purview account**. The Microsoft Purview account captures all lineage data generated by the data factory. If you don't have a Microsoft Purview account, see [Create a Microsoft Purview account](../purview/create-catalog-portal.md).
-## Run pipeline and push lineage data to Azure Purview
+## Run pipeline and push lineage data to Microsoft Purview
-### Step 1: Connect Data Factory to your Azure Purview account
+### Step 1: Connect Data Factory to your Microsoft Purview account
-You can establish the connection between Data Factory and Azure Purview account by following the steps in [Connect Data Factory to Azure Purview](connect-data-factory-to-azure-purview.md).
+You can establish the connection between Data Factory and Microsoft Purview account by following the steps in [Connect Data Factory to Microsoft Purview](connect-data-factory-to-azure-purview.md).
### Step 2: Run pipeline in Data Factory
After you run the pipeline, in the [pipeline monitoring view](monitor-visually.m
:::image type="content" source="./media/data-factory-purview/monitor-lineage-reporting-status.png" alt-text="Monitor the lineage reporting status in pipeline monitoring view.":::
-### Step 4: View lineage information in your Azure Purview account
+### Step 4: View lineage information in your Microsoft Purview account
-On Azure Purview UI, you can browse assets and choose type "Azure Data Factory". You can also search the Data Catalog using keywords.
+On Microsoft Purview UI, you can browse assets and choose type "Azure Data Factory". You can also search the Data Catalog using keywords.
On the activity asset, click the **Lineage** tab to see all the lineage information. - Copy activity:
- :::image type="content" source="./media/data-factory-purview/copy-lineage.png" alt-text="Screenshot of the Copy activity lineage in Azure Purview." lightbox="./media/data-factory-purview/copy-lineage.png":::
+ :::image type="content" source="./media/data-factory-purview/copy-lineage.png" alt-text="Screenshot of the Copy activity lineage in Microsoft Purview." lightbox="./media/data-factory-purview/copy-lineage.png":::
- Data Flow activity:
- :::image type="content" source="./media/data-factory-purview/dataflow-lineage.png" alt-text="Screenshot of the Data Flow lineage in Azure Purview." lightbox="./media/data-factory-purview/dataflow-lineage.png":::
+ :::image type="content" source="./media/data-factory-purview/dataflow-lineage.png" alt-text="Screenshot of the Data Flow lineage in Microsoft Purview." lightbox="./media/data-factory-purview/dataflow-lineage.png":::
> [!NOTE]
> For the lineage of Dataflow activity, we only support source and sink. The lineage for Dataflow transformation is not supported yet.
- Execute SSIS Package activity:
- :::image type="content" source="./media/data-factory-purview/ssis-lineage.png" alt-text="Screenshot of the Execute SSIS lineage in Azure Purview." lightbox="./media/data-factory-purview/ssis-lineage.png":::
+ :::image type="content" source="./media/data-factory-purview/ssis-lineage.png" alt-text="Screenshot of the Execute SSIS lineage in Microsoft Purview." lightbox="./media/data-factory-purview/ssis-lineage.png":::
> [!NOTE]
> For the lineage of Execute SSIS Package activity, we only support source and destination. The lineage for transformation is not supported yet.
[Catalog lineage user guide](../purview/catalog-lineage-user-guide.md)
-[Connect Data Factory to Azure Purview](connect-data-factory-to-azure-purview.md)
+[Connect Data Factory to Microsoft Purview](connect-data-factory-to-azure-purview.md)
databox Data Box Troubleshoot Share Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox/data-box-troubleshoot-share-access.md
Previously updated : 08/23/2021 Last updated : 04/15/2022
The failed connection attempts may include background processes, such as retries
**Suggested resolution.** To connect to an SMB share after a share account lockout, follow these steps:
-1. Verify the SMB credentials for the share. In the local web UI of your device, go to **Connect and copy**, and select **SMB** for the share. You'll see the following dialog box.
+1. If the dashboard status indicates the device is locked, unlock the device from the top command bar and retry the connection.
+
+ :::image type="content" source="media/data-box-troubleshoot-share-access/dashboard-locked.png" alt-text="Screenshot of the dashboard locked status.":::
+
+1. If you are still unable to connect to an SMB share after unlocking your device, verify the SMB credentials for the share. In the local web UI of your device, go to **Connect and copy**, and select **SMB** for the share. You'll see the following dialog box.
![Screenshot of Access Share And Copy Data screen for an SMB share on a Data Box. Copy icons for the account, username, and password are highlighted.](media/data-box-troubleshoot-share-access/get-share-credentials-01.png)
defender-for-cloud Information Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/information-protection.md
Title: Prioritize security actions by data sensitivity - Microsoft Defender for Cloud
-description: Use Azure Purview's data sensitivity classifications in Microsoft Defender for Cloud
+description: Use Microsoft Purview's data sensitivity classifications in Microsoft Defender for Cloud
Last updated 11/09/2021
[!INCLUDE [Banner for top of topics](./includes/banner.md)]
-[Azure Purview](../purview/overview.md), Microsoft's data governance service, provides rich insights into the *sensitivity of your data*. With automated data discovery, sensitive data classification, and end-to-end data lineage, Azure Purview helps organizations manage and govern data in hybrid and multi-cloud environments.
+[Microsoft Purview](../purview/overview.md), Microsoft's data governance service, provides rich insights into the *sensitivity of your data*. With automated data discovery, sensitive data classification, and end-to-end data lineage, Microsoft Purview helps organizations manage and govern data in hybrid and multi-cloud environments.
-Microsoft Defender for Cloud customers using Azure Purview can benefit from an additional vital layer of metadata in alerts and recommendations: information about any potentially sensitive data involved. This knowledge helps solve the triage challenge and ensures security professionals can focus their attention on threats to sensitive data.
+Microsoft Defender for Cloud customers using Microsoft Purview can benefit from an additional vital layer of metadata in alerts and recommendations: information about any potentially sensitive data involved. This knowledge helps solve the triage challenge and ensures security professionals can focus their attention on threats to sensitive data.
-This page explains the integration of Azure Purview's data sensitivity classification labels within Defender for Cloud.
+This page explains the integration of Microsoft Purview's data sensitivity classification labels within Defender for Cloud.
## Availability

|Aspect|Details|
|-|:-|
|Release state:|Preview.<br>[!INCLUDE [Legalese](../../includes/defender-for-cloud-preview-legal-text.md)]|
-|Pricing:|You'll need an Azure Purview account to create the data sensitivity classifications and run the scans. Viewing the scan results and using the output is free for Defender for Cloud users|
+|Pricing:|You'll need a Microsoft Purview account to create the data sensitivity classifications and run the scans. Viewing the scan results and using the output is free for Defender for Cloud users|
|Required roles and permissions:|**Security admin** and **Security contributor**|
|Clouds:|:::image type="icon" source="./media/icons/yes-icon.png"::: Commercial clouds<br>:::image type="icon" source="./media/icons/no-icon.png"::: Azure Government<br>:::image type="icon" source="./media/icons/no-icon.png"::: Azure China 21Vianet (**Partial**: Subset of alerts and vulnerability assessment for SQL servers. Behavioral threat protections aren't available.)|
Defender for Cloud includes two mechanisms to help prioritize recommendations an
However, where possible, you'd want to focus the security team's efforts on risks to the organization's **data**. If two recommendations have equal impact on your secure score, but one relates to a resource with sensitive data, ideally you'd include that knowledge when determining prioritization.
-Azure Purview's data sensitivity classifications and data sensitivity labels provide that knowledge.
+Microsoft Purview's data sensitivity classifications and data sensitivity labels provide that knowledge.
## Discover resources with sensitive data
-To provide the vital information about discovered sensitive data, and help ensure you have that information when you need it, Defender for Cloud displays information from Azure Purview in multiple locations.
+To provide the vital information about discovered sensitive data, and help ensure you have that information when you need it, Defender for Cloud displays information from Microsoft Purview in multiple locations.
> [!TIP]
-> If a resource is scanned by multiple Azure Purview accounts, the information shown in Defender for Cloud relates to the most recent scan.
+> If a resource is scanned by multiple Microsoft Purview accounts, the information shown in Defender for Cloud relates to the most recent scan.
### Alerts and recommendations pages
This vital additional layer of metadata helps solve the triage challenge and ens
### Inventory filters
-The [asset inventory page](asset-inventory.md) has a collection of powerful filters to group your resources with outstanding alerts and recommendations according to the criteria relevant for any scenario. These filters include **Data sensitivity classifications** and **Data sensitivity labels**. Use these filters to evaluate the security posture of resources on which Azure Purview has discovered sensitive data.
+The [asset inventory page](asset-inventory.md) has a collection of powerful filters to group your resources with outstanding alerts and recommendations according to the criteria relevant for any scenario. These filters include **Data sensitivity classifications** and **Data sensitivity labels**. Use these filters to evaluate the security posture of resources on which Microsoft Purview has discovered sensitive data.
:::image type="content" source="./media/information-protection/information-protection-inventory-filters.png" alt-text="Screenshot of information protection filters in Microsoft Defender for Cloud's asset inventory page." lightbox="./media/information-protection/information-protection-inventory-filters.png":::
When you select a single resource - whether from an alert, recommendation, or th
The resource health page provides a snapshot view of the overall health of a single resource. You can review detailed information about the resource and all recommendations that apply to that resource. Also, if you're using any of the Microsoft Defender plans, you can see outstanding security alerts for that specific resource too.
-When reviewing the health of a specific resource, you'll see the Azure Purview information on this page and can use it determine what data has been discovered on this resource alongside the Azure Purview account used to scan the resource.
+When reviewing the health of a specific resource, you'll see the Microsoft Purview information on this page and can use it to determine what data has been discovered on this resource alongside the Microsoft Purview account used to scan the resource.
### Overview tile
-The dedicated **Information protection** tile in Defender for Cloud’s [overview dashboard](overview-page.md) shows Azure Purview’s coverage. It also shows the resource types with the most sensitive data discovered.
+The dedicated **Information protection** tile in Defender for Cloud’s [overview dashboard](overview-page.md) shows Microsoft Purview’s coverage. It also shows the resource types with the most sensitive data discovered.
-A graph shows the number of recommendations and alerts by classified resource types. The tile also includes a link to Azure Purview to scan additional resources. Select the tile to see classified resources in Defender for Cloud’s asset inventory page.
+A graph shows the number of recommendations and alerts by classified resource types. The tile also includes a link to Microsoft Purview to scan additional resources. Select the tile to see classified resources in Defender for Cloud’s asset inventory page.
:::image type="content" source="./media/information-protection/overview-dashboard-information-protection.png" alt-text="Screenshot of the information protection tile in Microsoft Defender for Cloud's overview dashboard." lightbox="./media/information-protection/overview-dashboard-information-protection.png":::
A graph shows the number of recommendations and alerts by classified resource ty
For related information, see:
-- [What is Azure Purview?](../purview/overview.md)
-- [Azure Purview's supported data sources and file types](../purview/sources-and-scans.md) and [supported data stores](../purview/purview-connector-overview.md)
-- [Azure Purview deployment best practices](../purview/deployment-best-practices.md)
-- [How to label to your data in Azure Purview](../purview/how-to-automatically-label-your-content.md)
+- [What is Microsoft Purview?](../purview/overview.md)
+- [Microsoft Purview's supported data sources and file types](../purview/sources-and-scans.md) and [supported data stores](../purview/purview-connector-overview.md)
+- [Microsoft Purview deployment best practices](../purview/deployment-best-practices.md)
+- [How to label your data in Microsoft Purview](../purview/how-to-automatically-label-your-content.md)
defender-for-cloud Overview Page https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/overview-page.md
In the center of the page are the **feature tiles**, each linking to a high prof
- **Regulatory compliance** - Defender for Cloud provides insights into your compliance posture based on continuous assessments of your Azure environment. Defender for Cloud analyzes risk factors in your environment according to security best practices. These assessments are mapped to compliance controls from a supported set of standards. [Learn more](regulatory-compliance-dashboard.md).
- **Firewall Manager** - This tile shows the status of your hubs and networks from [Azure Firewall Manager](../firewall-manager/overview.md).
- **Inventory** - The asset inventory page of Microsoft Defender for Cloud provides a single page for viewing the security posture of the resources you've connected to Microsoft Defender for Cloud. All resources with unresolved security recommendations are shown in the inventory. If you've enabled the integration with Microsoft Defender for Endpoint and enabled Microsoft Defender for Servers, you'll also have access to a software inventory. The tile on the overview page shows you at a glance the total healthy and unhealthy resources (for the currently selected subscriptions). [Learn more](asset-inventory.md).
-- **Information protection** - A graph on this tile shows the resource types that have been scanned by [Azure Purview](../purview/overview.md), found to contain sensitive data, and have outstanding recommendations and alerts. Follow the **scan** link to access the Azure Purview accounts and configure new scans, or select any other part of the tile to open the [asset inventory](asset-inventory.md) and view your resources according to your Azure Purview data sensitivity classifications. [Learn more](information-protection.md).
+- **Information protection** - A graph on this tile shows the resource types that have been scanned by [Microsoft Purview](../purview/overview.md), found to contain sensitive data, and have outstanding recommendations and alerts. Follow the **scan** link to access the Microsoft Purview accounts and configure new scans, or select any other part of the tile to open the [asset inventory](asset-inventory.md) and view your resources according to your Microsoft Purview data sensitivity classifications. [Learn more](information-protection.md).
### Insights
defender-for-cloud Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/release-notes.md
Our Ignite release includes:
- [Azure Security Center and Azure Defender become Microsoft Defender for Cloud](#azure-security-center-and-azure-defender-become-microsoft-defender-for-cloud)
- [Native CSPM for AWS and threat protection for Amazon EKS, and AWS EC2](#native-cspm-for-aws-and-threat-protection-for-amazon-eks-and-aws-ec2)
-- [Prioritize security actions by data sensitivity (powered by Azure Purview) (in preview)](#prioritize-security-actions-by-data-sensitivity-powered-by-azure-purview-in-preview)
+- [Prioritize security actions by data sensitivity (powered by Microsoft Purview) (in preview)](#prioritize-security-actions-by-data-sensitivity-powered-by-microsoft-purview-in-preview)
- [Expanded security control assessments with Azure Security Benchmark v3](#expanded-security-control-assessments-with-azure-security-benchmark-v3)
- [Microsoft Sentinel connector's optional bi-directional alert synchronization released for general availability (GA)](#microsoft-sentinel-connectors-optional-bi-directional-alert-synchronization-released-for-general-availability-ga)
- [New recommendation to push Azure Kubernetes Service (AKS) logs to Sentinel](#new-recommendation-to-push-azure-kubernetes-service-aks-logs-to-sentinel)
When you've added your AWS accounts, Defender for Cloud protects your AWS resour
Learn more about [connecting your AWS accounts to Microsoft Defender for Cloud](quickstart-onboard-aws.md).
-### Prioritize security actions by data sensitivity (powered by Azure Purview) (in preview)
+### Prioritize security actions by data sensitivity (powered by Microsoft Purview) (in preview)
Data resources remain a popular target for threat actors. So it's crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments.
-To address this challenge, Microsoft Defender for Cloud now integrates sensitivity information from [Azure Purview](../purview/overview.md). Azure Purview is a unified data governance service that provides rich insights into the sensitivity of your data within multi-cloud, and on-premises workloads.
+To address this challenge, Microsoft Defender for Cloud now integrates sensitivity information from [Microsoft Purview](../purview/overview.md). Microsoft Purview is a unified data governance service that provides rich insights into the sensitivity of your data within multi-cloud and on-premises workloads.
-The integration with Azure Purview extends your security visibility in Defender for Cloud from the infrastructure level down to the data, enabling an entirely new way to prioritize resources and security activities for your security teams.
+The integration with Microsoft Purview extends your security visibility in Defender for Cloud from the infrastructure level down to the data, enabling an entirely new way to prioritize resources and security activities for your security teams.
Learn more in [Prioritize security actions by data sensitivity](information-protection.md).
defender-for-iot Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/architecture.md
Specifically for OT networks, OT network sensors also provide the following anal
Defender for IoT provides hybrid network support using the following management options:
-- **The Azure portal**. Use the Azure portal as a single pane of glass view all data ingested from your devices via network sensors. The Azure portal provides extra value, such as [workbooks](workbooks.md), [connections to Microsoft Sentinel](/azure/sentinel/iot-solution?toc=%2Fazure%2Fdefender-for-iot%2Forganizations%2Ftoc.json&bc=%2Fazure%2Fdefender-for-iot%2Fbreadcrumb%2Ftoc.json&tabs=use-out-of-the-box-analytics-rules-recommended), and more.
+- **The Azure portal**. Use the Azure portal as a single pane of glass to view all data ingested from your devices via network sensors. The Azure portal provides extra value, such as [workbooks](workbooks.md), [connections to Microsoft Sentinel](../../sentinel/iot-solution.md?bc=%2fazure%2fdefender-for-iot%2fbreadcrumb%2ftoc.json&tabs=use-out-of-the-box-analytics-rules-recommended&toc=%2fazure%2fdefender-for-iot%2forganizations%2ftoc.json), and more.
Also use the Azure portal to obtain new appliances and software updates, onboard and maintain your sensors in Defender for IoT, and update threat intelligence packages.
For more information, see:
- [Frequently asked questions](resources-frequently-asked-questions.md)
- [Sensor connection methods](architecture-connections.md)
-- [Connect your sensors to Microsoft Defender for IoT](connect-sensors.md)-
+- [Connect your sensors to Microsoft Defender for IoT](connect-sensors.md)
defender-for-iot Connect Sensors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/connect-sensors.md
Attach the gateway to the `GatewaySubnet` subnet you created [earlier](#step-2-d
For more information, see:
-- [About VPN gateways](/azure/vpn-gateway/vpn-gateway-about-vpngateways)
-- [Connect a virtual network to an ExpressRoute circuit using the portal](/azure/expressroute/expressroute-howto-linkvnet-portal-resource-manager)
-- [Modify local network gateway settings using the Azure portal](/azure/vpn-gateway/vpn-gateway-modify-local-network-gateway-portal)
+- [About VPN gateways](../../vpn-gateway/vpn-gateway-about-vpngateways.md)
+- [Connect a virtual network to an ExpressRoute circuit using the portal](../../expressroute/expressroute-howto-linkvnet-portal-resource-manager.md)
+- [Modify local network gateway settings using the Azure portal](../../vpn-gateway/vpn-gateway-modify-local-network-gateway-portal.md)
### Step 4: Define network security groups
For more information, see:
Define an Azure virtual machine scale set to create and manage a group of load-balanced virtual machines, where you can automatically increase or decrease the number of virtual machines as needed.
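The scale in/out behavior described above can be sketched as a simple threshold rule. This is a toy illustration of the concept, not the actual scale-set autoscale engine; the function name, thresholds, and bounds are all assumptions chosen for the example:

```python
def desired_instance_count(current: int, avg_cpu: float,
                           min_count: int = 2, max_count: int = 10,
                           scale_out_at: float = 75.0,
                           scale_in_at: float = 25.0) -> int:
    """Illustrative threshold rule: add an instance when average CPU is
    high, remove one when it is low, and stay within [min_count, max_count].
    All thresholds here are hypothetical, not Azure defaults."""
    if avg_cpu >= scale_out_at:
        return min(current + 1, max_count)
    if avg_cpu <= scale_in_at:
        return max(current - 1, min_count)
    return current
```

Real scale sets express this as autoscale rules on metrics such as `Percentage CPU`, but the bounded increase/decrease logic is the same idea.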
-Use the following procedure to create a scale set to use with your sensor connection. For more information, see [What are virtual machine scale sets?](/azure/virtual-machine-scale-sets/overview)
+Use the following procedure to create a scale set to use with your sensor connection. For more information, see [What are virtual machine scale sets?](../../virtual-machine-scale-sets/overview.md)
1. Create a scale set with the following parameter definitions:
Use the following procedure to create a scale set to use with your sensor connec
Azure Load Balancer is a layer-4 load balancer that distributes incoming traffic among healthy virtual machine instances using a hash-based distribution algorithm.
-For more information, see the [Azure Load Balancer documentation](/azure/load-balancer/load-balancer-overview).
+For more information, see the [Azure Load Balancer documentation](../../load-balancer/load-balancer-overview.md).
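To make the hash-based distribution idea concrete, here is a minimal sketch of how a layer-4 balancer can map a connection's 5-tuple onto healthy backends. This is an illustration of the general technique only, not Azure Load Balancer's actual algorithm; the function and backend names are hypothetical:

```python
import hashlib

def pick_backend(src_ip, src_port, dst_ip, dst_port, protocol, backends):
    # Hash the 5-tuple so every packet of the same flow maps to the
    # same backend, while distinct flows spread across all backends.
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}-{protocol}".encode()
    digest = hashlib.sha256(key).digest()
    index = int.from_bytes(digest[:8], "big") % len(backends)
    return backends[index]

backends = ["sensor-vm-0", "sensor-vm-1", "sensor-vm-2"]
# Deterministic: repeating the same flow yields the same backend.
a = pick_backend("10.0.0.5", 50123, "10.1.0.4", 443, "TCP", backends)
b = pick_backend("10.0.0.5", 50123, "10.1.0.4", 443, "TCP", backends)
assert a == b
```

Because the choice depends only on the flow's 5-tuple and the current backend list, no per-connection state is needed to keep a flow pinned to one instance.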
To create an Azure load balancer for your sensor connection:
While you'll need to migrate your connections before the [legacy version reaches
## Next steps
-For more information, see [Sensor connection methods](architecture-connections.md).
+For more information, see [Sensor connection methods](architecture-connections.md).
defender-for-iot Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/overview.md
Microsoft Defender for IoT is a unified security solution for identifying IoT an
**For end-user organizations**, Microsoft Defender for IoT provides an agentless, network-layer monitoring that integrates smoothly with industrial equipment and SOC tools. You can deploy Microsoft Defender for IoT in Azure-connected and hybrid environments or completely on-premises.
-**For IoT device builders**, Microsoft Defender for IoT also offers a lightweight, micro-agent that supports standard IoT operating systems, such as Linux and RTOS. The Microsoft Defender device builder agent helps you ensure that security is built into your IoT/OT projects, from the cloud. For more information, see [Microsoft Defender for IoT for device builders documentation](/azure/defender-for-iot/device-builders/overview).
+**For IoT device builders**, Microsoft Defender for IoT also offers a lightweight, micro-agent that supports standard IoT operating systems, such as Linux and RTOS. The Microsoft Defender device builder agent helps you ensure that security is built into your IoT/OT projects, from the cloud. For more information, see [Microsoft Defender for IoT for device builders documentation](../device-builders/overview.md).
## Agentless device monitoring
For more information, see:
- [OT threat monitoring in enterprise SOCs](concept-sentinel-integration.md)
- [Microsoft Defender for IoT architecture](architecture.md)
-- [Quickstart: Get started with Defender for IoT](getting-started.md)
+- [Quickstart: Get started with Defender for IoT](getting-started.md)
defender-for-iot Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/release-notes.md
For more information, see [Manage your device inventory from the Azure portal](h
### Use Azure Monitor workbooks with Microsoft Defender for IoT (Public preview)
-[Azure Monitor workbooks](/azure/azure-monitor/visualize/workbooks-overview) provide graphs and dashboards that visually reflect your data, and are now available directly in Microsoft Defender for IoT with data from [Azure Resource Graph](/azure/governance/resource-graph/).
+[Azure Monitor workbooks](../../azure-monitor/visualize/workbooks-overview.md) provide graphs and dashboards that visually reflect your data, and are now available directly in Microsoft Defender for IoT with data from [Azure Resource Graph](../../governance/resource-graph/index.yml).
In the Azure portal, use the new Defender for IoT **Workbooks** page to view workbooks created by Microsoft and provided out-of-the-box, or create custom workbooks of your own.
Unicode characters are now supported when working with sensor certificate passph
## Next steps
-[Getting started with Defender for IoT](getting-started.md)
+[Getting started with Defender for IoT](getting-started.md)
defender-for-iot Workbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/workbooks.md
Learn more about viewing dashboards and reports on the sensor console:
Learn more about Azure Monitor workbooks and Azure Resource Graph:
-- [Azure Resource Graph documentation](/azure/governance/resource-graph/)
-- [Azure Monitor workbook documentation](/azure/azure-monitor/visualize/workbooks-overview)
-- [Kusto Query Language (KQL) documentation](/azure/data-explorer/kusto/query/)
+- [Azure Resource Graph documentation](../../governance/resource-graph/index.yml)
+- [Azure Monitor workbook documentation](../../azure-monitor/visualize/workbooks-overview.md)
+- [Kusto Query Language (KQL) documentation](/azure/data-explorer/kusto/query/)
devtest-labs Deliver Proof Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/deliver-proof-concept.md
Learn about Azure and DevTest Labs by using the following resources:
### Enroll all users in Azure AD
-For management, such as adding users or adding lab owners, all lab users must belong to the [Azure Active Directory (Azure AD)](https://azure.microsoft.com/services/active-directory) tenant for the Azure subscription the pilot uses. Many enterprises set up [hybrid identity](/azure/active-directory/hybrid/whatis-hybrid-identity) to enable users to use their on-premises identities in the cloud. You don't need a hybrid identity for a DevTest Labs proof of concept.
+For management, such as adding users or adding lab owners, all lab users must belong to the [Azure Active Directory (Azure AD)](https://azure.microsoft.com/services/active-directory) tenant for the Azure subscription the pilot uses. Many enterprises set up [hybrid identity](../active-directory/hybrid/whatis-hybrid-identity.md) to enable users to use their on-premises identities in the cloud. You don't need a hybrid identity for a DevTest Labs proof of concept.
## Scope the proof of concept
A full DevTest Labs solution includes some important planning and design decisio
### Subscription topology
-The enterprise-level requirements for resources in Azure can extend beyond the [available quotas within a single subscription](/azure/azure-resource-manager/management/azure-subscription-service-limits). You might need several Azure subscriptions, or you might need to make service requests to increase initial subscription limits. For more information, see [Scalability considerations](devtest-lab-reference-architecture.md#scalability-considerations).
+The enterprise-level requirements for resources in Azure can extend beyond the [available quotas within a single subscription](../azure-resource-manager/management/azure-subscription-service-limits.md). You might need several Azure subscriptions, or you might need to make service requests to increase initial subscription limits. For more information, see [Scalability considerations](devtest-lab-reference-architecture.md#scalability-considerations).
It's important to decide how to distribute resources across subscriptions before final, full-scale rollout, because it's difficult to move resources to another subscription later. For example, you can't move a lab to another subscription after it's created. The [Subscription decision guide](/azure/architecture/cloud-adoption/decision-guides/subscriptions) is a valuable planning resource.

### Network topology
-The [default network infrastructure](/azure/app-service/networking-features) that DevTest Labs automatically creates might not meet requirements and constraints for enterprise users. For example, enterprises often use:
+The [default network infrastructure](../app-service/networking-features.md) that DevTest Labs automatically creates might not meet requirements and constraints for enterprise users. For example, enterprises often use:
- [Azure ExpressRoute-connected virtual networks](/azure/architecture/reference-architectures/hybrid-networking) for connecting on-premises networks to Azure.
-- [Peered virtual networks](/azure/virtual-network/virtual-network-peering-overview) in a [hub-spoke configuration](/azure/architecture/reference-architectures/hybrid-networking/hub-spoke) for connecting virtual networks across subscriptions.
-- [Forced tunneling](/azure/vpn-gateway/vpn-gateway-forced-tunneling-rm) to limit traffic to on-premises networks.
+- [Peered virtual networks](../virtual-network/virtual-network-peering-overview.md) in a [hub-spoke configuration](/azure/architecture/reference-architectures/hybrid-networking/hub-spoke) for connecting virtual networks across subscriptions.
+- [Forced tunneling](../vpn-gateway/vpn-gateway-forced-tunneling-rm.md) to limit traffic to on-premises networks.
For more information, see [Networking components](devtest-lab-reference-architecture.md#networking-components).
The solution has the following requirements:
## Next steps

- [Scale up a DevTest Labs deployment](devtest-lab-guidance-orchestrate-implementation.md)
-- [Orchestrate DevTest Labs implementation](devtest-lab-guidance-orchestrate-implementation.md)
+- [Orchestrate DevTest Labs implementation](devtest-lab-guidance-orchestrate-implementation.md)
devtest-labs Devtest Lab Add Devtest User https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-add-devtest-user.md
To add a member:
[!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
-You can add a DevTest Labs User to a lab by using the following Azure PowerShell script. The script requires the user to be in the Azure Active Directory (Azure AD). For information about adding an external user to Azure AD as a guest, see [Add a new guest user](/azure/active-directory/fundamentals/add-users-azure-active-directory#add-a-new-guest-user). If the user isn't in Azure AD, use the portal procedure instead.
+You can add a DevTest Labs User to a lab by using the following Azure PowerShell script. The script requires the user to be in Azure Active Directory (Azure AD). For information about adding an external user to Azure AD as a guest, see [Add a new guest user](../active-directory/fundamentals/add-users-azure-active-directory.md#add-a-new-guest-user). If the user isn't in Azure AD, use the portal procedure instead.
In the following script, update the parameter values under the `# Values to change` comment. You can get the `subscriptionId`, `labResourceGroup`, and `labName` values from the lab's main page in the Azure portal.
New-AzRoleAssignment -ObjectId $adObject.Id -RoleDefinitionName 'DevTest Labs Us
## Next steps

- [Customize permissions with custom roles](devtest-lab-grant-user-permissions-to-specific-lab-policies.md)
-- [Automate adding lab users](automate-add-lab-user.md)
+- [Automate adding lab users](automate-add-lab-user.md)
devtest-labs Devtest Lab Attach Detach Data Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-attach-detach-data-disk.md
Last updated 03/29/2022
# Attach or detach a data disk for a lab virtual machine in Azure DevTest Labs
-This article explains how to attach and detach a lab virtual machine (VM) data disk in Azure DevTest Labs. You can create, attach, detach, and reattach [data disks](/azure/virtual-machines/managed-disks-overview) for lab VMs that you own. This functionality is useful for managing storage or software separately from individual VMs.
+This article explains how to attach and detach a lab virtual machine (VM) data disk in Azure DevTest Labs. You can create, attach, detach, and reattach [data disks](../virtual-machines/managed-disks-overview.md) for lab VMs that you own. This functionality is useful for managing storage or software separately from individual VMs.
## Prerequisites
-To attach or detach a data disk, you need to own the lab VM, and the VM must be running. The VM size determines how many data disks you can attach. For more information, see [Sizes for virtual machines](/azure/virtual-machines/sizes).
+To attach or detach a data disk, you need to own the lab VM, and the VM must be running. The VM size determines how many data disks you can attach. For more information, see [Sizes for virtual machines](../virtual-machines/sizes.md).
## Create and attach a new data disk
Follow these steps to create and attach a new managed data disk for a DevTest La
1. Fill out the **Attach new disk** form as follows: - For **Name**, enter a unique name.
- - For **Disk type**, select a [disk type](/azure/virtual-machines/disks-types) from the drop-down list.
+ - For **Disk type**, select a [disk type](../virtual-machines/disks-types.md) from the drop-down list.
- For **Size (GiB)**, enter a size in gigabytes. 1. Select **OK**.
You can also delete a detached data disk, by selecting **Delete** from the conte
## Next steps
-For information about transferring data disks for claimable lab VMs, see [Transfer the data disk](devtest-lab-add-claimable-vm.md#transfer-the-data-disk).
+For information about transferring data disks for claimable lab VMs, see [Transfer the data disk](devtest-lab-add-claimable-vm.md#transfer-the-data-disk).
devtest-labs Devtest Lab Reference Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-reference-architecture.md
On-premises, a [remote desktop gateway](/windows-server/remote/remote-desktop-se
### Networking components
-In this architecture, [Azure Active Directory (Azure AD)](/azure/active-directory/fundamentals/active-directory-whatis) provides identity and access management across all networks. Lab VMs usually have a local administrative account for access. If there's an Azure AD, on-premises, or [Azure AD Domain Services](../active-directory-domain-services/overview.md) domain available, you can join lab VMs to the domain. Users can then use their domain-based identities to connect to the VMs.
+In this architecture, [Azure Active Directory (Azure AD)](../active-directory/fundamentals/active-directory-whatis.md) provides identity and access management across all networks. Lab VMs usually have a local administrative account for access. If there's an Azure AD, on-premises, or [Azure AD Domain Services](../active-directory-domain-services/overview.md) domain available, you can join lab VMs to the domain. Users can then use their domain-based identities to connect to the VMs.
[Azure networking topology](../networking/fundamentals/networking-overview.md) controls how lab resources access and communicate with on-premises networks and the internet. This architecture shows a common way that enterprises network DevTest Labs. The labs connect with [peered virtual networks](../virtual-network/virtual-network-peering-overview.md) in a [hub-spoke configuration](/azure/architecture/reference-architectures/hybrid-networking/hub-spoke), through the ExpressRoute or site-to-site VPN connection, to the on-premises network.
DevTest Labs automatically benefits from built-in Azure security features. To re
Another security consideration is the permission level you grant to lab users. Lab owners use Azure role-based access control (Azure RBAC) to assign roles to users and set resource and access-level permissions. The most common DevTest Labs permissions are Owner, Contributor, and User. You can also create and assign [custom roles](devtest-lab-grant-user-permissions-to-specific-lab-policies.md). For more information, see [Add owners and users in Azure DevTest Labs](devtest-lab-add-devtest-user.md). ## Next steps
-See the next article in this series: [Deliver a proof of concept](deliver-proof-concept.md).
+See the next article in this series: [Deliver a proof of concept](deliver-proof-concept.md).
devtest-labs Devtest Lab Troubleshoot Apply Artifacts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-troubleshoot-apply-artifacts.md
To troubleshoot connectivity issues to the Azure Storage account:
- Check for added network security groups (NSGs). If a subscription policy was added to automatically configure NSGs in all virtual networks, it would affect the virtual network used for creating lab VMs. -- Verify NSG rules. Use [IP flow verify](../network-watcher/diagnose-vm-network-traffic-filtering-problem.md#use-ip-flow-verify) to determine whether an NSG rule is blocking traffic to or from a VM. You can also review effective security group rules to ensure that an inbound **Allow** NSG rule exists. For more information, see [Using effective security rules to troubleshoot VM traffic flow](/azure/virtual-network/diagnose-network-traffic-filter-problem).
+- Verify NSG rules. Use [IP flow verify](../network-watcher/diagnose-vm-network-traffic-filtering-problem.md#use-ip-flow-verify) to determine whether an NSG rule is blocking traffic to or from a VM. You can also review effective security group rules to ensure that an inbound **Allow** NSG rule exists. For more information, see [Using effective security rules to troubleshoot VM traffic flow](../virtual-network/diagnose-network-traffic-filter-problem.md).
- Check the lab's default storage account. The default storage account is the first storage account created when the lab was created. The name usually starts with the letter "a" and ends with a multi-digit number, such as a\<labname>#.
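As a quick illustration of that naming convention, here's a minimal Python sketch that flags storage account names matching the usual default pattern (the regex is an assumption inferred from the description above, not an official naming rule):

```python
import re

# Assumed pattern, inferred from the description above: the default lab storage
# account name starts with "a" and ends with a multi-digit number, e.g. "amylab6354".
DEFAULT_STORAGE_PATTERN = re.compile(r"^a[a-z0-9]*[0-9]+$")

def looks_like_default_lab_storage(name: str) -> bool:
    """Return True if the name matches the usual default storage account pattern."""
    return bool(DEFAULT_STORAGE_PATTERN.match(name))

print(looks_like_default_lab_storage("amylab6354"))   # True
print(looks_like_default_lab_storage("customstore"))  # False
```

A check like this can help pick out the lab's default account when a subscription contains many storage accounts.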
To troubleshoot connectivity issues to the Azure Storage account:
1. On the storage account **Overview** page, select **Firewalls and virtual networks** in the left navigation. 1. Ensure that **Firewalls and virtual networks** is set to **All networks**. Or, if the **Selected networks** option is selected, make sure the lab's virtual networks used to create VMs are added to the list.
-For in-depth troubleshooting, see [Configure Azure Storage firewalls and virtual networks](/azure/storage/common/storage-network-security).
+For in-depth troubleshooting, see [Configure Azure Storage firewalls and virtual networks](../storage/common/storage-network-security.md).
## Troubleshoot artifact failures from the lab VM
You can connect to the lab VM where the artifact failed, and investigate the iss
1. Open and inspect the *STATUS* file to view the error.
-For instructions on finding the log files on a **Linux** VM, see [Use the Azure Custom Script Extension Version 2 with Linux virtual machines](/azure/virtual-machines/extensions/custom-script-linux#troubleshooting).
+For instructions on finding the log files on a **Linux** VM, see [Use the Azure Custom Script Extension Version 2 with Linux virtual machines](../virtual-machines/extensions/custom-script-linux.md#troubleshooting).
### Check the VM Agent
-Ensure that the [Azure Virtual Machine Agent (VM Agent)](/azure/virtual-machines/extensions/agent-windows) is installed and ready.
+Ensure that the [Azure Virtual Machine Agent (VM Agent)](../virtual-machines/extensions/agent-windows.md) is installed and ready.
-When the VM first starts, or when the CSE first installs to serve the request to apply artifacts, the VM might need to either upgrade the VM Agent or wait for the VM Agent to initialize. The VM Agent might depend on services that take a long time to initialize. For further troubleshooting, see [Azure Virtual Machine Agent overview](/azure/virtual-machines/extensions/agent-windows).
+When the VM first starts, or when the CSE first installs to serve the request to apply artifacts, the VM might need to either upgrade the VM Agent or wait for the VM Agent to initialize. The VM Agent might depend on services that take a long time to initialize. For further troubleshooting, see [Azure Virtual Machine Agent overview](../virtual-machines/extensions/agent-windows.md).
To verify if the artifact appeared to stop responding because of the VM Agent:
To verify if the artifact appeared to stop responding because of the VM Agent:
In the previous example, the VM Agent took 10 minutes and 20 seconds to start. The cause was the OOBE service taking a long time to start.
-For general information about Azure extensions, see [Azure virtual machine extensions and features](/azure/virtual-machines/extensions/overview).
+For general information about Azure extensions, see [Azure virtual machine extensions and features](../virtual-machines/extensions/overview.md).
### Investigate script issues
If you need more help, try one of the following support channels:
- Contact the Azure DevTest Labs experts on the [MSDN Azure and Stack Overflow forums](https://azure.microsoft.com/support/forums/). - Get answers from Azure experts through [Azure Forums](https://azure.microsoft.com/support/forums). - Connect with [@AzureSupport](https://twitter.com/azuresupport), the official Microsoft Azure account for improving customer experience. Azure Support connects the Azure community to answers, support, and experts.-- Go to the [Azure support site](https://azure.microsoft.com/support/options) and select **Get Support** to file an Azure support incident.
+- Go to the [Azure support site](https://azure.microsoft.com/support/options) and select **Get Support** to file an Azure support incident.
devtest-labs Devtest Lab Vm Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-vm-powershell.md
This article shows you how to create an Azure DevTest Labs virtual machine (VM)
You need the following prerequisites to work through this article: - Access to a lab in DevTest Labs. [Create a lab](devtest-lab-create-lab.md), or use an existing lab.-- Azure PowerShell. [Install Azure PowerShell](/powershell/azure/install-az-ps), or [use Azure Cloud Shell](/azure/cloud-shell/quickstart-powershell) in the Azure portal.
+- Azure PowerShell. [Install Azure PowerShell](/powershell/azure/install-az-ps), or [use Azure Cloud Shell](../cloud-shell/quickstart-powershell.md) in the Azure portal.
## PowerShell VM creation script
Set-AzResource -ResourceId $VmResourceId -Properties $VmProperties -Force
## Next steps
-[Az.DevTestLabs PowerShell reference](/powershell/module/az.devtestlabs/)
+[Az.DevTestLabs PowerShell reference](/powershell/module/az.devtestlabs/)
devtest-labs Encrypt Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/encrypt-storage.md
Azure Storage encrypts lab data with a Microsoft-managed key. Optionally, you ca
For more information and instructions on configuring customer-managed keys for Azure Storage encryption, see: -- [Use customer-managed keys with Azure Key Vault to manage Azure Storage encryption](/azure/storage/common/customer-managed-keys-overview)-- [Configure encryption with customer-managed keys stored in Azure Key Vault](/azure/storage/common/customer-managed-keys-configure-key-vault)
+- [Use customer-managed keys with Azure Key Vault to manage Azure Storage encryption](../storage/common/customer-managed-keys-overview.md)
+- [Configure encryption with customer-managed keys stored in Azure Key Vault](../storage/common/customer-managed-keys-configure-key-vault.md)
## Next steps
-For more information about managing Azure Storage, see [Optimize costs by automatically managing the data lifecycle](../storage/blobs/lifecycle-management-overview.md).
-
+For more information about managing Azure Storage, see [Optimize costs by automatically managing the data lifecycle](../storage/blobs/lifecycle-management-overview.md).
devtest-labs Network Isolation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/network-isolation.md
Last updated 03/21/2022
This article walks you through creating a network-isolated lab in Azure DevTest Labs.
-By default, Azure DevTest Labs creates a new [Azure virtual network](/azure/virtual-network/virtual-networks-overview) for each lab. The virtual network acts as a security boundary to isolate lab resources from the public internet. To ensure lab resources follow organizational networking policies, you can use several other networking options:
+By default, Azure DevTest Labs creates a new [Azure virtual network](../virtual-network/virtual-networks-overview.md) for each lab. The virtual network acts as a security boundary to isolate lab resources from the public internet. To ensure lab resources follow organizational networking policies, you can use several other networking options:
- Isolate all lab [virtual machines (VMs)](devtest-lab-configure-vnet.md) and [environments](connect-environment-lab-virtual-network.md) in a pre-existing virtual network that you select. - Join an Azure virtual network to an on-premises network, to securely connect to on-premises resources. For more information, see [DevTest Labs enterprise reference architecture: Connectivity components](devtest-lab-reference-architecture.md#connectivity-components).
If you enabled network isolation for a virtual network other than the default, c
Azure Storage now allows inbound connections from the added virtual network, which enables the lab to operate successfully in a network isolated mode.
-You can automate these steps with PowerShell or Azure CLI to configure network isolation for multiple labs. For more information, see [Configure Azure Storage firewalls and virtual networks](/azure/storage/common/storage-network-security).
+You can automate these steps with PowerShell or Azure CLI to configure network isolation for multiple labs. For more information, see [Configure Azure Storage firewalls and virtual networks](../storage/common/storage-network-security.md).
### Configure the endpoint for the lab key vault
Here are some things to remember when using a lab in a network isolated mode:
The lab owner must explicitly enable access to a network isolated lab's storage account from an allowed endpoint. Actions like uploading a VHD to the storage account for creating custom images require this access. You can enable access by creating a lab VM, and securely accessing the lab's storage account from that VM.
-For more information, see [Connect to a storage account using an Azure Private Endpoint](/azure/private-link/tutorial-private-endpoint-storage-portal).
+For more information, see [Connect to a storage account using an Azure Private Endpoint](../private-link/tutorial-private-endpoint-storage-portal.md).
### Provide storage account to export lab usage data
For more information, see [Export or delete personal data from Azure DevTest Lab
Enabling the key vault service endpoint affects only the firewall. Make sure to configure the appropriate key vault access permissions in the key vault **Access policies** section.
-For more information, see [Assign a Key Vault access policy](/azure/key-vault/general/assign-access-policy).
+For more information, see [Assign a Key Vault access policy](../key-vault/general/assign-access-policy.md).
## Next steps - [Azure Resource Manager (ARM) templates in Azure DevTest Labs](devtest-lab-use-arm-and-powershell-for-lab-resources.md) - [Manage Azure DevTest Labs storage accounts](encrypt-storage.md)-- [Store secrets in a key vault in Azure DevTest Labs](devtest-lab-store-secrets-in-key-vault.md)
+- [Store secrets in a key vault in Azure DevTest Labs](devtest-lab-store-secrets-in-key-vault.md)
devtest-labs Start Machines Use Automation Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/start-machines-use-automation-runbooks.md
The DevTest Labs [autostart](devtest-lab-set-lab-policy.md#set-autostart) featur
- [Create and apply a tag](devtest-lab-add-tag.md) called **StartupOrder** to all lab VMs with an appropriate startup value, 0 through 10. Designate any machines that don't need starting as -1. -- Create an Azure Automation account by following instructions in [Create a standalone Azure Automation account](/azure/automation/automation-create-standalone-account). Choose the **Run As Accounts** option when you create the account.
+- Create an Azure Automation account by following instructions in [Create a standalone Azure Automation account](../automation/automation-create-standalone-account.md). Choose the **Run As Accounts** option when you create the account.
## Create the PowerShell runbook
While ($current -le 10) {
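The loop above walks startup-order values 0 through 10; the grouping logic can be sketched in Python (an illustration only — the actual runbook is PowerShell, and the function and VM names here are hypothetical):

```python
from collections import defaultdict

def startup_waves(vms):
    """Group VM names into start-up waves ordered by their StartupOrder tag.

    vms maps a VM name to its StartupOrder tag (0 through 10); -1 means
    "don't start". Returns a list of waves; VMs in the same wave start together.
    """
    waves = defaultdict(list)
    for name, order in vms.items():
        if 0 <= order <= 10:          # skip machines tagged -1
            waves[order].append(name)
    return [sorted(waves[order]) for order in range(11) if waves[order]]

example = {"dc01": 0, "sql01": 1, "web01": 2, "web02": 2, "scratch": -1}
print(startup_waves(example))  # [['dc01'], ['sql01'], ['web01', 'web02']]
```

Each wave would be started and confirmed running before the runbook moves to the next order value.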
- [What is Azure Automation?](/azure/automation/automation-intro) - [Start up lab virtual machines automatically](devtest-lab-auto-startup-vm.md)-- [Use command-line tools to start and stop Azure DevTest Labs virtual machines](use-command-line-start-stop-virtual-machines.md)
+- [Use command-line tools to start and stop Azure DevTest Labs virtual machines](use-command-line-start-stop-virtual-machines.md)
devtest-labs Test App Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/test-app-azure.md
This article shows how to set up an application for testing from an Azure DevTes
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - A Windows-based [DevTest Labs VM](devtest-lab-add-vm.md) to use for testing the app. - [Visual Studio](https://visualstudio.microsoft.com/free-developer-offers/) installed on a different workstation.-- A [file share](/azure/storage/files/storage-how-to-create-file-share) created in your lab's [Azure Storage Account](encrypt-storage.md).-- The [file share mounted](/azure/storage/files/storage-how-to-use-files-windows#mount-the-azure-file-share) to your Visual Studio workstation, and to the lab VM you want to use for testing.
+- A [file share](../storage/files/storage-how-to-create-file-share.md) created in your lab's [Azure Storage Account](encrypt-storage.md).
+- The [file share mounted](../storage/files/storage-how-to-use-files-windows.md#mount-the-azure-file-share) to your Visual Studio workstation, and to the lab VM you want to use for testing.
## Publish your app from Visual Studio
See the following articles to learn how to use VMs in a lab.
- [Add a VM to a lab](devtest-lab-add-vm.md) - [Restart a lab VM](devtest-lab-restart-vm.md)-- [Resize a lab VM](devtest-lab-resize-vm.md)
+- [Resize a lab VM](devtest-lab-resize-vm.md)
devtest-labs Tutorial Create Custom Lab https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/tutorial-create-custom-lab.md
In the [next tutorial](tutorial-use-custom-lab.md), lab users, such as developer
## Prerequisite -- To create a lab, you need at least [Contributor](/azure/role-based-access-control/built-in-roles#contributor) role in an Azure subscription. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- To create a lab, you need at least [Contributor](../role-based-access-control/built-in-roles.md#contributor) role in an Azure subscription. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- To add users to a lab, you must have [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner) role in the subscription the lab is in.
+- To add users to a lab, you must have [User Access Administrator](../role-based-access-control/built-in-roles.md#user-access-administrator) or [Owner](../role-based-access-control/built-in-roles.md#owner) role in the subscription the lab is in.
## Create a lab
From the lab **Overview** page, you can select **Claimable virtual machines** in
## Add a user to the DevTest Labs User role
-To add users to a lab, you must be a [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner) of the subscription the lab is in. For more information, see [Add lab owners, contributors, and users in Azure DevTest Labs](devtest-lab-add-devtest-user.md).
+To add users to a lab, you must be a [User Access Administrator](../role-based-access-control/built-in-roles.md#user-access-administrator) or [Owner](../role-based-access-control/built-in-roles.md#owner) of the subscription the lab is in. For more information, see [Add lab owners, contributors, and users in Azure DevTest Labs](devtest-lab-add-devtest-user.md).
1. On the lab's **Overview** page, under **Settings**, select **Configuration and policies**.
If you created a resource group for the lab, you can now delete that resource gr
To learn how to access the lab and VMs as a lab user, go on to the next tutorial: > [!div class="nextstepaction"]
-> [Tutorial: Access the lab](tutorial-use-custom-lab.md)
+> [Tutorial: Access the lab](tutorial-use-custom-lab.md)
devtest-labs Tutorial Use Custom Lab https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/tutorial-use-custom-lab.md
In this tutorial, you learn how to:
## Prerequisites
-You need at least [DevTest Labs User](/azure/role-based-access-control/built-in-roles#devtest-labs-user) access to the lab created in [Tutorial: Set up a lab in Azure DevTest Labs](tutorial-create-custom-lab.md), or to another lab that has a claimable VM.
+You need at least [DevTest Labs User](../role-based-access-control/built-in-roles.md#devtest-labs-user) access to the lab created in [Tutorial: Set up a lab in Azure DevTest Labs](tutorial-create-custom-lab.md), or to another lab that has a claimable VM.
The owner or administrator of the lab can give you the URL to access the lab in the Azure portal, and the username and password to access the lab VM.
To connect to a Windows machine through Remote Desktop Protocol (RDP), follow th
:::image type="content" source="./media/tutorial-use-custom-lab/remote-computer-verification.png" alt-text="Screenshot of remote computer verification.":::
-Once you connect to the VM, you can use it to do your work. You have [Owner](/azure/role-based-access-control/built-in-roles#owner) role on all lab VMs you claim or create, unless you unclaim them.
+Once you connect to the VM, you can use it to do your work. You have [Owner](../role-based-access-control/built-in-roles.md#owner) role on all lab VMs you claim or create, unless you unclaim them.
## Unclaim a lab VM
When you're done using a VM, you can delete it. Or, the lab owner can delete the
## Next steps
-In this tutorial, you learned how to claim and connect to claimable VMs in Azure DevTest Labs. To create your own lab VMs, see [Create lab virtual machines in Azure DevTest Labs](devtest-lab-add-vm.md).
+In this tutorial, you learned how to claim and connect to claimable VMs in Azure DevTest Labs. To create your own lab VMs, see [Create lab virtual machines in Azure DevTest Labs](devtest-lab-add-vm.md).
devtest-labs Use Paas Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/use-paas-services.md
When you create an environment, DevTest Labs can replace the `$(LabSubnetId)` to
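For example, an environment's ARM template can declare a parameter whose default value is the `$(LabSubnetId)` token, which DevTest Labs substitutes at deployment time (the parameter name here is illustrative):

```json
{
  "parameters": {
    "subnetId": {
      "type": "string",
      "defaultValue": "$(LabSubnetId)",
      "metadata": {
        "description": "Replaced by DevTest Labs with the lab subnet's resource ID"
      }
    }
  }
}
```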
### Use nested templates
-DevTest Labs supports [nested ARM templates](/azure/azure-resource-manager/templates/linked-templates). To use `_artifactsLocation` and `_artifactsLocationSasToken` tokens to create a URI to a nested ARM template, see [Deploy DevTest Labs environments by using nested templates](deploy-nested-template-environments.md). For more information, see the **Deployment artifacts** section of the [Azure Resource Manager Best Practices Guide](https://github.com/Azure/azure-quickstart-templates/blob/master/1-CONTRIBUTION-GUIDE/best-practices.md#deployment-artifacts-nested-templates-scripts).
+DevTest Labs supports [nested ARM templates](../azure-resource-manager/templates/linked-templates.md). To use `_artifactsLocation` and `_artifactsLocationSasToken` tokens to create a URI to a nested ARM template, see [Deploy DevTest Labs environments by using nested templates](deploy-nested-template-environments.md). For more information, see the **Deployment artifacts** section of the [Azure Resource Manager Best Practices Guide](https://github.com/Azure/azure-quickstart-templates/blob/master/1-CONTRIBUTION-GUIDE/best-practices.md#deployment-artifacts-nested-templates-scripts).
## Next steps - [Use ARM templates to create DevTest Labs environments](devtest-lab-create-environment-from-arm.md) - [Create an environment with a self-contained Service Fabric cluster in Azure DevTest Labs](create-environment-service-fabric-cluster.md) - [Connect an environment to your lab's virtual network in Azure DevTest Labs](connect-environment-lab-virtual-network.md)-- [Integrate environments into your Azure DevOps CI/CD pipelines](integrate-environments-devops-pipeline.md)-
+- [Integrate environments into your Azure DevOps CI/CD pipelines](integrate-environments-devops-pipeline.md)
digital-twins How To Use Data History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-use-data-history.md
This article shows how to set up a working data history connection between Azure
* an [Event Hubs](../event-hubs/event-hubs-about.md) namespace containing an event hub * an [Azure Data Explorer](/azure/data-explorer/data-explorer-overview) cluster containing a database
-It also contains a sample twin graph and telemetry scenario that you can use to see the historized twin updates in Azure Data Explorer.
+It also contains a sample twin graph that you can use to see the historized twin property updates in Azure Data Explorer.
>[!NOTE] >You can also work with data history using the [2021-06-30-preview](https://github.com/Azure/azure-rest-api-specs/tree/main/specification/digitaltwins/data-plane/Microsoft.DigitalTwins/preview/2021-06-30-preview) version of the rest APIs. That process isn't shown in this article.
After setting up the data history connection, you can optionally remove the role
Now that your data history connection is set up, you can test it with data from your digital twins.
-If you already have twins in your Azure Digital Twins instance that are receiving telemetry updates, you can skip this section and visualize the results using your own resources.
+If you already have twins in your Azure Digital Twins instance that are receiving property updates, you can skip this section and visualize the results using your own resources.
-Otherwise, continue through this section to set up a sample graph containing twins that can receive telemetry updates.
+Otherwise, continue through this section to set up a sample graph containing twins that receive twin property updates.
-You can set up a sample graph for this scenario using the **Azure Digital Twins Data Simulator**. The Azure Digital Twins Data Simulator continuously pushes telemetry to several twins in an Azure Digital Twins instance.
+You can set up a sample graph for this scenario using the **Azure Digital Twins Data Simulator**. The Azure Digital Twins Data Simulator continuously pushes property updates to several twins in an Azure Digital Twins instance.
### Create a sample graph
-You can use the **Azure Digital Twins Data Simulator** to provision a sample twin graph and push telemetry data to it. The twin graph created here models pasteurization processes for a dairy company.
+You can use the **Azure Digital Twins Data Simulator** to provision a sample twin graph and push property updates to it. The twin graph created here models pasteurization processes for a dairy company.
Start by opening the [Azure Digital Twins Data Simulator](https://explorer.digitaltwins.azure.net/tools/data-pusher) web application in your browser.
event-grid Transition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/edge/transition.md
Title: Transition from Event Grid on Azure IoT Edge to Azure IoT Edge
-description: This article explains transition from Event Grid on Azure IoT Edge to Azure IoT Edge MQTT Broker or IoT Hub message routing.
+description: This article explains the transition from Event Grid on Azure IoT Edge to the Azure IoT Edge Hub module in the Azure IoT Edge runtime.
Previously updated : 02/16/2022 Last updated : 04/13/2022
On March 31, 2023, Event Grid on Azure IoT Edge will be retired, so make sure to transition to IoT Edge native capabilities prior to that date.
-## Why are we retiring?
-There are multiple reasons for deciding to retire Event Grid on IoT Edge, which is currently in Preview, in March 2023.
+## Why are we retiring?
-- Event Grid has been evolving in the cloud native space to provide more robust capabilities not only in Azure but also in on-prem scenarios with [Kubernetes with Azure Arc](../kubernetes/overview.md).-- We've seen an increase of adoption of MQTT brokers in the IoT space, this adoption has been the motivation to allow IoT Edge team to build a new native MQTT broker that provides a better integration for pub/sub messaging scenarios. With the new MQTT broker provided natively on IoT Edge, you'll be able to connect to this broker, publish, and subscribe to messages over user-defined topics, and use IoT Hub messaging primitives. The IoT Edge MQTT broker is built in the IoT Edge hub.
+There's one major reason for retiring Event Grid on IoT Edge, currently in Preview, in March 2023: Event Grid has been evolving in the cloud-native space to provide more robust capabilities, not only in Azure but also in on-premises scenarios with [Kubernetes with Azure Arc](../kubernetes/overview.md).
-Here's the list of the features that will be removed with the retirement of Event Grid on Azure IoT Edge and a list of the new IoT Edge native capabilities.
-
-| Event Grid on Azure IoT Edge | MQTT broker on Azure IoT Edge |
+| Event Grid on Azure IoT Edge | Azure IoT Edge Hub |
| - | -- |
-| - Publishing and subscribing to events locally/cloud<br/>- Forwarding events to Event Grid<br/>- Forwarding events to IoT Hub<br/>- React to Blob Storage events locally | - Connectivity to IoT Edge hub<br/>- Publish and subscribe on user-defined topics<br/>- Publish and subscribe on IoT Hub topics<br/>- Publish and subscribe between MQTT brokers |
-
+| - Publishing and subscribing to events locally/cloud<br/>- Forwarding events to Event Grid<br/>- Forwarding events to IoT Hub<br/>- React to Blob Storage events locally | - Connectivity to Azure IoT Hub<br/>- Route messages between modules or devices locally<br/>- Offline support<br/>- Message filtering |
## How to transition to Azure IoT Edge features
The following table highlights the key differences during this transition.
| Event Grid on Azure IoT Edge | Azure IoT Edge | | | -- |
-| Publish, subscribe and forward events locally or cloud | You can use Azure IoT Edge MQTT broker to publish and subscribe messages. To learn how to connect to this broker, publish and subscribe to messages over user-defined topics, and use IoT Hub messaging primitives, see [publish and subscribe with Azure IoT Edge](../../iot-edge/how-to-publish-subscribe.md). The IoT Edge MQTT broker is built in the IoT Edge hub. For more information, see [the brokering capabilities of the IoT Edge hub](../../iot-edge/iot-edge-runtime.md). </br> </br> If you're subscribing to IoT Hub, it's possible to create an event to publish to Event Grid if you need. For details, see [Azure IoT Hub and Event Grid](../../iot-hub/iot-hub-event-grid.md). |
-| Forward events to IoT Hub | You can use IoT Hub message routing to send device-cloud messages to different endpoints. For details, see [Understand Azure IoT Hub message routing](../../iot-hub/iot-hub-devguide-messages-d2c.md). |
-| React to Blob Storage events on IoT Edge (Preview) | You can use Azure Function Apps to react to blob storage events on cloud when a blob is created or updated. For more information, see [Azure Blob storage trigger for Azure Functions](../../azure-functions/functions-bindings-storage-blob-trigger.md) and [Tutorial: Deploy Azure Functions as modules - Azure IoT Edge](../../iot-edge/tutorial-deploy-function.md). Blob triggers in IoT Edge blob storage module aren't supported. |
+| Publish, subscribe and forward events locally or cloud | Use the message routing feature in IoT Edge Hub to facilitate local and cloud communication. It enables device-to-module, module-to-module, and device-to-device communications by brokering messages to keep devices and modules independent from each other. To learn more, see [using routing for IoT Edge hub](../../iot-edge/iot-edge-runtime.md#using-routing). </br> </br> If you're subscribing to IoT Hub, it's possible to create an event to publish to Event Grid if you need. For details, see [Azure IoT Hub and Event Grid](../../iot-hub/iot-hub-event-grid.md). |
+| Forward events to IoT Hub | Use IoT Edge Hub to optimize connections to send messages to the cloud with offline support. For details, see [IoT Edge Hub cloud communication](../../iot-edge/iot-edge-runtime.md#using-routing). |
+| React to Blob Storage events on IoT Edge (Preview) | You can use Azure Function Apps to react to blob storage events on cloud when a blob is created or updated. For more information, see [Azure Blob storage trigger for Azure Functions](../../azure-functions/functions-bindings-storage-blob-trigger.md) and [Tutorial: Deploy Azure Functions as modules - Azure IoT Edge](../../iot-edge/tutorial-deploy-function.md). Blob triggers in IoT Edge blob storage module aren't supported. |
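The IoT Edge hub routes referenced in the updated rows above are declared in the `$edgeHub` desired properties of a deployment manifest. A minimal sketch of that section as a Python dict follows; the module names (`SimulatedTemperatureSensor`, `filter`) are illustrative assumptions, while the `FROM … INTO` route grammar is IoT Edge hub's own.

```python
# Sketch of the $edgeHub routes section of an IoT Edge deployment manifest.
# Module names are hypothetical; the route syntax is IoT Edge hub's.
edge_hub_props = {
    "$edgeHub": {
        "properties.desired": {
            "schemaVersion": "1.2",
            "routes": {
                # module-to-cloud: forward every module output upstream to IoT Hub
                "sensorToCloud": "FROM /messages/modules/SimulatedTemperatureSensor/outputs/* INTO $upstream",
                # module-to-module: broker messages locally between two modules
                "sensorToFilter": (
                    "FROM /messages/modules/SimulatedTemperatureSensor/outputs/temperature "
                    'INTO BrokeredEndpoint("/modules/filter/inputs/input1")'
                ),
            },
            # offline support: queue undelivered messages for up to 2 hours
            "storeAndForwardConfiguration": {"timeToLiveSecs": 7200},
        }
    }
}

routes = edge_hub_props["$edgeHub"]["properties.desired"]["routes"]
print(len(routes))  # 2
```

Because routing lives in the edge hub's twin, changing a route is a configuration update rather than a code change on any module.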
firewall Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/overview.md
Previously updated : 03/30/2022 Last updated : 04/19/2022
# Customer intent: As an administrator, I want to evaluate Azure Firewall so I can determine if I want to use it.
Azure Firewall Standard has the following known issues:
|NAT rules with ports between 64000 and 65535 are unsupported|Azure Firewall allows any port in the 1-65535 range in network and application rules, however NAT rules only support ports in the 1-63999 range.|This is a current limitation.|
|Configuration updates may take five minutes on average|An Azure Firewall configuration update can take three to five minutes on average, and parallel updates aren't supported.|A fix is being investigated.|
|Azure Firewall uses SNI TLS headers to filter HTTPS and MSSQL traffic|If browser or server software doesn't support the Server Name Indicator (SNI) extension, you can't connect through Azure Firewall.|If browser or server software doesn't support SNI, then you may be able to control the connection using a network rule instead of an application rule. See [Server Name Indication](https://wikipedia.org/wiki/Server_Name_Indication) for software that supports SNI.|
-|Start/Stop doesn't work with a firewall configured in forced-tunnel mode|Start/stop doesn't work with Azure firewall configured in forced-tunnel mode. Attempting to start Azure Firewall with forced tunneling configured results in the following error:<br><br>*Set-AzFirewall: AzureFirewall FW-xx management IP configuration cannot be added to an existing firewall. Redeploy with a management IP configuration if you want to use forced tunneling support.<br>StatusCode: 400<br>ReasonPhrase: Bad Request*|Under investigation.<br><br>As a workaround, you can delete the existing firewall and create a new one with the same parameters.|
|Can't add firewall policy tags using the portal or Azure Resource Manager (ARM) templates|Azure Firewall Policy has a patch support limitation that prevents you from adding a tag using the Azure portal or ARM templates. The following error is generated: *Could not save the tags for the resource*.|A fix is being investigated. Or, you can use the Azure PowerShell cmdlet `Set-AzFirewallPolicy` to update tags.|
|IPv6 not currently supported|If you add an IPv6 address to a rule, the firewall fails.|Use only IPv4 addresses. IPv6 support is under investigation.|
|Updating multiple IP Groups fails with conflict error.|When you update two or more IP Groups attached to the same firewall, one of the resources goes into a failed state.|This is a known issue/limitation. <br><br>When you update an IP Group, it triggers an update on all firewalls that the IPGroup is attached to. If an update to a second IP Group is started while the firewall is still in the *Updating* state, then the IPGroup update fails.<br><br>To avoid the failure, IP Groups attached to the same firewall must be updated one at a time. Allow enough time between updates to allow the firewall to get out of the *Updating* state.|
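The NAT port-range limitation in the known-issues table lends itself to a quick pre-deployment check. A minimal sketch in plain Python (not an Azure API): network and application rules accept destination ports 1-65535, while NAT (DNAT) rules stop at 63999.

```python
# Illustrative helper reflecting the documented Azure Firewall port ranges.
NETWORK_RULE_PORTS = range(1, 65536)  # 1-65535 for network/application rules
NAT_RULE_PORTS = range(1, 64000)      # 1-63999 for NAT (DNAT) rules

def nat_port_supported(port: int) -> bool:
    """Return True if a NAT rule may use this destination port."""
    return port in NAT_RULE_PORTS

print(nat_port_supported(443))    # True
print(nat_port_supported(64000))  # False: 64000-65535 is unsupported for NAT
```

Running such a check before submitting a rule collection avoids a deployment failure on an out-of-range port.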
governance Built In Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/built-in-policies.md
side of the page. Otherwise, use <kbd>Ctrl</kbd>-<kbd>F</kbd> (Windows) or <kbd>
[!INCLUDE [azure-policy-reference-policies-azure-edge-hardware-center](../../../../includes/policy/reference/bycat/policies-azure-edge-hardware-center.md)]
-## Azure Purview
+## Microsoft Purview
[!INCLUDE [azure-policy-reference-policies-azure-purview](../../../../includes/policy/reference/bycat/policies-azure-purview.md)]
governance Policy Devops Pipelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/tutorials/policy-devops-pipelines.md
For more information, see [What is Azure Pipelines?](/azure/devops/pipelines/get
and [Create your first pipeline](/azure/devops/pipelines/create-first-pipeline).

## Prepare
-1. Create an [Azure Policy](/azure/governance/policy/tutorials/create-and-manage) in the Azure portal.
- There are several [predefined sample policies](/azure/governance/policy/samples/)
+1. Create an [Azure Policy](./create-and-manage.md) in the Azure portal.
+ There are several [predefined sample policies](../samples/index.md)
that can be applied to a management group, subscription, and resource group.
1. In Azure DevOps, create a release pipeline that contains at least one stage, or open an existing release pipeline.
and [Create your first pipeline](/azure/devops/pipelines/create-first-pipeline).
To learn more about the structure of policy definitions, look at this article:

> [!div class="nextstepaction"]
-> [Azure Policy definition structure](../concepts/definition-structure.md)
+> [Azure Policy definition structure](../concepts/definition-structure.md)
governance Supported Tables Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/resource-graph/reference/supported-tables-resources.md
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.powerplatform/enterprisepolicies
- microsoft.projectbabylon/accounts
- microsoft.providerhubdevtest/regionalstresstests
-- Microsoft.Purview/Accounts (Azure Purview accounts)
+- Microsoft.Purview/Accounts (Microsoft Purview accounts)
- Microsoft.Quantum/Workspaces (Quantum Workspaces)
- Microsoft.RecommendationsService/accounts (Intelligent Recommendations Accounts)
- Microsoft.RecommendationsService/accounts/modeling (Modeling)
hdinsight Apache Spark Jupyter Notebook Kernels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/spark/apache-spark-jupyter-notebook-kernels.md
description: Learn about the PySpark, PySpark3, and Spark kernels for Jupyter No
Previously updated : 04/24/2020 Last updated : 04/18/2022

# Kernels for Jupyter Notebook on Apache Spark clusters in Azure HDInsight

HDInsight Spark clusters provide kernels that you can use with the Jupyter Notebook on [Apache Spark](./apache-spark-overview.md) for testing your applications. A kernel is a program that runs and interprets your code. The three kernels are:

-- **PySpark** - for applications written in Python2.
+- **PySpark** - for applications written in Python2. (Applicable only for Spark 2.4 version clusters)
- **PySpark3** - for applications written in Python3.
- **Spark** - for applications written in Scala.
An Apache Spark cluster in HDInsight. For instructions, see [Create Apache Spark
:::image type="content" source="./media/apache-spark-jupyter-notebook-kernels/kernel-jupyter-notebook-on-spark.png " alt-text="Kernels for Jupyter Notebook on Spark" border="true":::
+ > [!NOTE]
+ > For Spark 3.1, only **PySpark3** or **Spark** will be available.
+ >
+ :::image type="content" source="./media/apache-spark-jupyter-notebook-kernels/kernel-jupyter-notebook-on-spark-for-hdi-4-0.png " alt-text="Kernels for Jupyter Notebook on Spark HDI4.0" border="true":::
+
+ 4. A notebook opens with the kernel you selected.

## Benefits of using the kernels
healthcare-apis Azure Active Directory Identity Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/azure-active-directory-identity-configuration.md
Title: Azure Active Directory identity configuration for Azure API for FHIR description: Learn the principles of identity, authentication, and authorization for Azure FHIR servers. -+ Last updated 02/15/2022-+ # Azure Active Directory identity configuration for Azure API for FHIR
healthcare-apis Azure Api Fhir Access Token Validation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/azure-api-fhir-access-token-validation.md
Title: Azure API for FHIR access token validation description: Walks through token validation and gives tips on how to troubleshoot access issues -+ Last updated 02/15/2022-+ # Azure API for FHIR access token validation
healthcare-apis Azure Api For Fhir Additional Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/azure-api-for-fhir-additional-settings.md
--++ Last updated 02/15/2022
healthcare-apis Carin Implementation Guide Blue Button Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/carin-implementation-guide-blue-button-tutorial.md
--++ Last updated 02/15/2022
healthcare-apis Centers For Medicare Tutorial Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/centers-for-medicare-tutorial-introduction.md
--++ Last updated 02/15/2022
healthcare-apis Copy To Synapse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/copy-to-synapse.md
In this article, you'll learn three ways to copy data from Azure API for FHIR to
> [!Note]
> [FHIR to Synapse Sync Agent](https://github.com/microsoft/FHIR-Analytics-Pipelines/blob/main/FhirToDataLake/docs/Deployment.md) is an open source tool released under MIT license, and is not covered by the Microsoft SLA for Azure services.
-The **FHIR to Synapse Sync Agent** is a Microsoft OSS project released under MIT License. It's an Azure function that extracts data from a FHIR server using FHIR Resource APIs, converts it to hierarchical Parquet files, and writes it to Azure Data Lake in near real time. This also contains a script to create external tables and views in [Synapse Serverless SQL pool](https://docs.microsoft.com/azure/synapse-analytics/sql/on-demand-workspace-overview) pointing to the Parquet files.
+The **FHIR to Synapse Sync Agent** is a Microsoft OSS project released under MIT License. It's an Azure function that extracts data from a FHIR server using FHIR Resource APIs, converts it to hierarchical Parquet files, and writes it to Azure Data Lake in near real time. This also contains a script to create external tables and views in [Synapse Serverless SQL pool](../../synapse-analytics/sql/on-demand-workspace-overview.md) pointing to the Parquet files.
This solution enables you to query against the entire FHIR data with tools such as Synapse Studio, SSMS, and Power BI. You can also access the Parquet files directly from a Synapse Spark pool. You should consider this solution if you want to access all of your FHIR data in near real time, and want to defer custom transformation to downstream systems.
healthcare-apis Davinci Pdex Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/davinci-pdex-tutorial.md
--++ Last updated 02/15/2022
healthcare-apis Davinci Plan Net https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/davinci-plan-net.md
-+ Last updated 02/15/2022
healthcare-apis Enable Diagnostic Logging https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/enable-diagnostic-logging.md
--++ Last updated 02/15/2022
healthcare-apis Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/export-data.md
Title: Executing the export by invoking $export command on Azure API for FHIR description: This article describes how to export FHIR data using $export for Azure API for FHIR-+ Last updated 02/15/2022-+ # How to export FHIR data in Azure API for FHIR
healthcare-apis Fhir App Registration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/fhir-app-registration.md
-+ Last updated 02/15/2022
healthcare-apis Fhir Features Supported https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/fhir-features-supported.md
Title: Supported FHIR features in Azure - Azure API for FHIR description: This article explains which features of the FHIR specification that are implemented in Azure API for FHIR -+
healthcare-apis How To Do Custom Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/how-to-do-custom-search.md
Last updated 02/15/2022-+ # Defining custom search parameters for Azure API for FHIR
healthcare-apis How To Run A Reindex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/how-to-run-a-reindex.md
Last updated 02/15/2022-+ # Running a reindex job in Azure API for FHIR
healthcare-apis Overview Of Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/overview-of-search.md
Last updated 02/15/2022-+ # Overview of search in Azure API for FHIR
healthcare-apis Patient Everything https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/patient-everything.md
Title: Use patient-everything in Azure API for FHIR description: This article explains how to use the Patient-everything operation in the Azure API for FHIR. -+ Last updated 02/15/2022-+ # Patient-everything in FHIR
healthcare-apis Register Confidential Azure Ad Client App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/register-confidential-azure-ad-client-app.md
Last updated 02/15/2022-+ # Register a confidential client application in Azure Active Directory for Azure API for FHIR
healthcare-apis Register Public Azure Ad Client App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/register-public-azure-ad-client-app.md
Last updated 03/21/2022-+ # Register a public client application in Azure Active Directory for Azure API for FHIR
healthcare-apis Register Resource Azure Ad Client App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/register-resource-azure-ad-client-app.md
Last updated 02/15/2022-+
healthcare-apis Register Service Azure Ad Client App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/register-service-azure-ad-client-app.md
Last updated 03/21/2022-+ # Register a service client application in Azure Active Directory for Azure API for FHIR
healthcare-apis Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/release-notes.md
Title: Azure API for FHIR monthly releases description: This article provides details about the Azure API for FHIR monthly features and enhancements. -+
healthcare-apis Search Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/search-samples.md
Last updated 02/15/2022-+ # FHIR search examples for Azure API for FHIR
healthcare-apis Store Profiles In Fhir https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/store-profiles-in-fhir.md
Last updated 02/15/2022-+ # Store profiles in Azure API for FHIR
healthcare-apis Tutorial Member Match https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/tutorial-member-match.md
--++ Last updated 02/15/2022
healthcare-apis Tutorial Web App Fhir Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/tutorial-web-app-fhir-server.md
--++ Last updated 02/15/2022
healthcare-apis Tutorial Web App Public App Reg https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/tutorial-web-app-public-app-reg.md
--++ Last updated 03/22/2022
healthcare-apis Tutorial Web App Test Postman https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/tutorial-web-app-test-postman.md
--++ Last updated 02/15/2022
healthcare-apis Tutorial Web App Write Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/tutorial-web-app-write-web-app.md
--++ Last updated 02/15/2022
healthcare-apis Validation Against Profiles https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/validation-against-profiles.md
Title: Validate FHIR resources against profiles in Azure API for FHIR description: This article describes how to validate FHIR resources against profiles in Azure API for FHIR.-+ Last updated 02/15/2022-+ # Validate FHIR resources against profiles in Azure API for FHIR
healthcare-apis Azure Active Directory Identity Configuration Old https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/azure-active-directory-identity-configuration-old.md
Title: Azure Active Directory identity configuration for Azure Health Data Services for FHIR service description: Learn the principles of identity, authentication, and authorization for FHIR service -+ Last updated 03/01/2022-+ # Azure Active Directory identity configuration for FHIR service
healthcare-apis Carin Implementation Guide Blue Button Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/carin-implementation-guide-blue-button-tutorial.md
--++ Last updated 03/01/2022
healthcare-apis Centers For Medicare Tutorial Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/centers-for-medicare-tutorial-introduction.md
--++ Last updated 03/01/2022
healthcare-apis Configure Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/configure-export-data.md
Last updated 03/01/2022-+ # Configure export settings and set up a storage account
healthcare-apis Configure Import Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/configure-import-data.md
The final step is to set the import configuration of the FHIR service, which con
> [!NOTE]
> If you haven't assigned storage access permissions to the FHIR service, the import operations ($import) will fail.
-To specify the Azure Storage account, you need to use [Rest API](https://docs.microsoft.com/rest/api/healthcareapis/services/create-or-update) to update the FHIR service.
+To specify the Azure Storage account, you need to use [Rest API](/rest/api/healthcareapis/services/create-or-update) to update the FHIR service.
To get the request URL and body, browse to the Azure portal of your FHIR service. Select **Overview**, and then **JSON View**.
In this article, you've learned the FHIR service supports $import operation and
>[Configure export settings and set up a storage account](configure-export-data.md)

>[!div class="nextstepaction"]
->[Copy data from FHIR service to Azure Synapse Analytics](copy-to-synapse.md)
+>[Copy data from FHIR service to Azure Synapse Analytics](copy-to-synapse.md)
healthcare-apis Copy To Synapse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/copy-to-synapse.md
In this article, you'll learn three ways to copy data from the FHIR service in
> [!Note]
> [FHIR to Synapse Sync Agent](https://github.com/microsoft/FHIR-Analytics-Pipelines/blob/main/FhirToDataLake/docs/Deployment.md) is an open source tool released under MIT license, and is not covered by the Microsoft SLA for Azure services.
-The **FHIR to Synapse Sync Agent** is a Microsoft OSS project released under MIT License. It's an Azure function that extracts data from a FHIR server using FHIR Resource APIs, converts it to hierarchical Parquet files, and writes it to Azure Data Lake in near real time. This also contains a script to create external tables and views in [Synapse Serverless SQL pool](https://docs.microsoft.com/azure/synapse-analytics/sql/on-demand-workspace-overview) pointing to the Parquet files.
+The **FHIR to Synapse Sync Agent** is a Microsoft OSS project released under MIT License. It's an Azure function that extracts data from a FHIR server using FHIR Resource APIs, converts it to hierarchical Parquet files, and writes it to Azure Data Lake in near real time. This also contains a script to create external tables and views in [Synapse Serverless SQL pool](../../synapse-analytics/sql/on-demand-workspace-overview.md) pointing to the Parquet files.
This solution enables you to query against the entire FHIR data with tools such as Synapse Studio, SSMS, and Power BI. You can also access the Parquet files directly from a Synapse Spark pool. You should consider this solution if you want to access all of your FHIR data in near real time, and want to defer custom transformation to downstream systems.
Follow the OSS [documentation](https://github.com/microsoft/FHIR-Analytics-Pipel
> [!Note]
> [FHIR to CDM pipeline generator](https://github.com/microsoft/FHIR-Analytics-Pipelines/blob/main/FhirToCdm/docs/fhir-to-cdm.md) is an open source tool released under MIT license, and is not covered by the Microsoft SLA for Azure services.
-The **FHIR to CDM pipeline generator** is a Microsoft OSS project released under MIT License. It's a tool to generate an ADF pipeline for copying a snapshot of data from a FHIR server using $export API, transforming it to csv format, and writing to a [CDM folder](https://docs.microsoft.com/common-data-model/data-lake) in Azure Data Lake Storage Gen 2. The tool requires a user-created configuration file containing instructions to project and flatten FHIR Resources and fields into tables. You can also follow the instructions for creating a downstream pipeline in Synapse workspace to move data from CDM folder to Synapse dedicated SQL pool.
+The **FHIR to CDM pipeline generator** is a Microsoft OSS project released under MIT License. It's a tool to generate an ADF pipeline for copying a snapshot of data from a FHIR server using $export API, transforming it to csv format, and writing to a [CDM folder](/common-data-model/data-lake) in Azure Data Lake Storage Gen 2. The tool requires a user-created configuration file containing instructions to project and flatten FHIR Resources and fields into tables. You can also follow the instructions for creating a downstream pipeline in Synapse workspace to move data from CDM folder to Synapse dedicated SQL pool.
This solution enables you to transform the data into tabular format as it gets written to CDM folder. You should consider this solution if you want to transform FHIR data into a custom schema after it's extracted from the FHIR server.
In this article, you learned three different ways to copy your FHIR data into Sy
Next, you can learn how to de-identify your FHIR data while exporting it to Synapse in order to protect PHI.

>[!div class="nextstepaction"]
->[Exporting de-identified data](./de-identified-export.md)
--------
+>[Exporting de-identified data](./de-identified-export.md)
healthcare-apis Davinci Drug Formulary Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/davinci-drug-formulary-tutorial.md
-+ Last updated 03/01/2022
healthcare-apis Davinci Pdex Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/davinci-pdex-tutorial.md
--++ Last updated 03/01/2022
healthcare-apis Davinci Plan Net https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/davinci-plan-net.md
-+ Last updated 03/01/2022
healthcare-apis Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/export-data.md
Last updated 02/15/2022-+ # How to export FHIR data
healthcare-apis Fhir Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/fhir-faq.md
Title: FAQs about FHIR service in Azure Health Data Services description: Get answers to frequently asked questions about FHIR service, such as the storage location of data behind FHIR APIs and version support. -+ Last updated 03/01/2022-+
healthcare-apis Fhir Features Supported https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/fhir-features-supported.md
Title: Supported FHIR features in FHIR service description: This article explains which features of the FHIR specification that are implemented in Azure Health Data Services -+
healthcare-apis Fhir Service Access Token Validation Old https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/fhir-service-access-token-validation-old.md
Title: FHIR service access token validation description: Access token validation procedure and troubleshooting guide for FHIR service -+ Last updated 03/01/2022-+ # FHIR service access token validation
healthcare-apis How To Do Custom Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/how-to-do-custom-search.md
Last updated 03/01/2022-+ # Defining custom search parameters
healthcare-apis How To Run A Reindex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/how-to-run-a-reindex.md
Last updated 03/01/2022-+ # Running a reindex job
healthcare-apis Overview Of Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/overview-of-search.md
Title: Overview of FHIR search in Azure Health Data Services description: This article describes an overview of FHIR search that is implemented in Azure Health Data Services-+ Last updated 03/01/2022-+ # Overview of FHIR search
healthcare-apis Patient Everything https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/patient-everything.md
Title: Patient-everything - Azure Health Data Services description: This article explains how to use the Patient-everything operation. -+ Last updated 03/01/2022-+ # Using Patient-everything in FHIR service
healthcare-apis Search Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/search-samples.md
Last updated 03/01/2022-+ # FHIR search examples
healthcare-apis Store Profiles In Fhir https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/store-profiles-in-fhir.md
Last updated 03/01/2022-+ # Store profiles in FHIR service
healthcare-apis Tutorial Member Match https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/tutorial-member-match.md
--++ Last updated 03/01/2022
healthcare-apis Validation Against Profiles https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/validation-against-profiles.md
Title: Validate FHIR resources against profiles in Azure Health Data Services description: This article describes how to validate FHIR resources against profiles in the FHIR service.-+ Last updated 03/01/2022-+ # Validate FHIR resources against profiles in Azure Health Data Services
industrial-iot Tutorial Deploy Industrial Iot Platform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/industrial-iot/tutorial-deploy-industrial-iot-platform.md
The deployment script allows you to select which set of components to deploy.
- [Storage](https://azure.microsoft.com/product-categories/storage/) for Event Hubs checkpointing
- Standard dependencies: Minimum +
  - [SignalR Service](https://azure.microsoft.com/services/signalr-service/) used to scale out asynchronous API notifications, Azure AD app registrations,
- - [Device Provisioning Service](https://docs.microsoft.com/azure/iot-dps/) used for deploying and provisioning the simulation gateways
+ - [Device Provisioning Service](../iot-dps/index.yml) used for deploying and provisioning the simulation gateways
- [Time Series Insights](https://azure.microsoft.com/services/time-series-insights/)
- Workbook, Log Analytics, [Application Insights](https://azure.microsoft.com/services/monitor/) for operations monitoring
- Micro
References:
Now that you have deployed the IIoT Platform, you can learn how to customize the configuration of the components:

> [!div class="nextstepaction"]
-> [Customize the configuration of the components](tutorial-configure-industrial-iot-components.md)
+> [Customize the configuration of the components](tutorial-configure-industrial-iot-components.md)
iot-central Concepts Private Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/concepts-private-endpoints.md
The standard IoT Central endpoints for device connectivity are accessible using
Use private endpoints to limit and secure device connectivity to your IoT Central application and only allow access through your private virtual network.
-Private endpoints use private IP addresses from a virtual network address space to connect your devices privately to your IoT Central application. Network traffic between devices on the virtual network and the IoT platform traverses the virtual network and a private link on the [Microsoft backbone network](/azure/networking/microsoft-global-network), eliminating exposure on the public internet.
+Private endpoints use private IP addresses from a virtual network address space to connect your devices privately to your IoT Central application. Network traffic between devices on the virtual network and the IoT platform traverses the virtual network and a private link on the [Microsoft backbone network](../../networking/microsoft-global-network.md), eliminating exposure on the public internet.
To learn more about Azure Virtual Networks, see:

-- [Azure Virtual Networks](/azure/virtual-network/virtual-networks-overview)
-- [Azure private endpoints](/azure/private-link/private-endpoint-overview)
-- [Azure private links](/azure/private-link/private-link-overview)
+- [Azure Virtual Networks](../../virtual-network/virtual-networks-overview.md)
+- [Azure private endpoints](../../private-link/private-endpoint-overview.md)
+- [Azure private links](../../private-link/private-link-overview.md)
Private endpoints in your IoT Central application enable you to:

- Secure your cluster by configuring the firewall to block all device connections on the public endpoint.
- Increase security for the virtual network by enabling you to block exfiltration of data from the virtual network.
-- Securely connect devices to IoT Central from on-premises networks that connect to the virtual network by using a [VPN gateway](/azure/vpn-gateway/vpn-gateway-about-vpngateways) or [ExpressRoute](/azure/expressroute) private peering.
+- Securely connect devices to IoT Central from on-premises networks that connect to the virtual network by using a [VPN gateway](../../vpn-gateway/vpn-gateway-about-vpngateways.md) or [ExpressRoute](../../expressroute/index.yml) private peering.
The use of private endpoints in IoT Central is appropriate for devices connected to an on-premises network. You shouldn't use private endpoints for devices deployed in a wide-area network such as the internet.
Use the following information to help determine the total number of IP addresses
| Azure reserved addresses | 5 |
| Total | 11-107 |
-To learn more, see the Azure [Azure Virtual Network FAQ](/azure/virtual-network/virtual-networks-faq).
+To learn more, see the [Azure Virtual Network FAQ](../../virtual-network/virtual-networks-faq.md).
> [!NOTE] > The minimum size for the subnet is `/28` (14 usable IP addresses). For use with an IoT Central private endpoint `/24` is recommended, which helps with extreme workloads.
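The subnet-sizing note above can be reproduced with Python's standard `ipaddress` module; the helper name is illustrative. A `/28` subnet contains 16 addresses, and with the 5 addresses Azure reserves (per the table above), 11 remain for endpoints; the recommended `/24` leaves 251.

```python
import ipaddress

AZURE_RESERVED = 5  # per the table above: Azure reserves 5 addresses per subnet

def addresses_left(cidr: str) -> int:
    """Addresses remaining for endpoints after Azure's reserved addresses."""
    return ipaddress.ip_network(cidr).num_addresses - AZURE_RESERVED

print(addresses_left("10.0.0.0/28"))  # 11 - the bare minimum subnet
print(addresses_left("10.0.0.0/24"))  # 251 - the recommended size, with headroom
```

Comparing that headroom against the 11-107 addresses the table estimates shows why `/24` is the safer choice for heavy workloads.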
To learn more, see the Azure [Azure Virtual Network FAQ](/azure/virtual-network/
Now that you've learned about using private endpoints to connect device to your application, here's the suggested next step: > [!div class="nextstepaction"]
-> [Create a private endpoint for Azure IoT Central application](howto-create-private-endpoint.md).
+> [Create a private endpoint for Azure IoT Central application](howto-create-private-endpoint.md).
iot-central Concepts Telemetry Properties Commands https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/concepts-telemetry-properties-commands.md
IoT Central expects a response from the device to writable property updates. The
| -- | -- | -- | | `'ac': 200` | Completed | The property change operation was successfully completed. | | `'ac': 202` or `'ac': 201` | Pending | The property change operation is pending or in progress |
+| `'ac': 203` | Pending | The property change operation was initiated by the device |
| `'ac': 4xx` | Error | The requested property change wasn't valid or had an error | | `'ac': 5xx` | Error | The device experienced an unexpected error when processing the requested change. |
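The acknowledgment codes above are returned to IoT Central inside the device's reported-property payload. As an illustrative sketch (the property name `targetTemperature` and the values are hypothetical, not taken from this article), the payload shape looks like:

```python
# Sketch of a writable-property acknowledgment payload.
# "ac" is the acknowledgment code from the table above, "av" echoes the
# desired-property version being acknowledged, "ad" is an optional note.

def build_property_ack(name, value, ack_code, ack_version, description=""):
    """Return the reported-property payload that acknowledges a
    writable property update from IoT Central."""
    return {
        name: {
            "value": value,       # the value the device applied
            "ac": ack_code,       # e.g. 200 = completed, 202 = pending
            "av": ack_version,    # desired-property version being acked
            "ad": description,    # optional human-readable description
        }
    }

ack = build_property_ack("targetTemperature", 21.5, 200, 3, "completed")
print(ack["targetTemperature"]["ac"])  # → 200
```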
iot-central Howto Create Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-create-private-endpoint.md
You can connect your devices to your IoT Central application by using a private endpoint in an Azure Virtual Network.
-Private endpoints use private IP addresses from a virtual network address space to connect your devices privately to your IoT Central application. Network traffic between devices on the virtual network and the IoT platform traverses the virtual network and a private link on the [Microsoft backbone network](/azure/networking/microsoft-global-network), eliminating exposure on the public internet. This article shows you how to create a private endpoint for your IoT Central application.
+Private endpoints use private IP addresses from a virtual network address space to connect your devices privately to your IoT Central application. Network traffic between devices on the virtual network and the IoT platform traverses the virtual network and a private link on the [Microsoft backbone network](../../networking/microsoft-global-network.md), eliminating exposure on the public internet. This article shows you how to create a private endpoint for your IoT Central application.
## Prerequisites - An active Azure subscription. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free) before you begin. - An IoT Central application. To learn more, see [Create an IoT Central application](howto-create-iot-central-application.md).-- A virtual network in your Azure subscription. To learn more, see [Create a virtual network](/azure/virtual-network/quick-create-portal).
+- A virtual network in your Azure subscription. To learn more, see [Create a virtual network](../../virtual-network/quick-create-portal.md).
## Create a private endpoint There are several ways to create a private endpoint for IoT Central application: -- [Use the Azure portal to create a private endpoint resource directly](/azure/private-link/create-private-endpoint-portal). Use this option if you don't have access to the IoT Central application that needs the private endpoint.
+- [Use the Azure portal to create a private endpoint resource directly](../../private-link/create-private-endpoint-portal.md). Use this option if you don't have access to the IoT Central application that needs the private endpoint.
- Create private endpoint on an existing IoT Central application To create a private endpoint on an existing IoT Central application:
DNS configuration can be overwritten if you create or delete multiple private en
### Other troubleshooting tips
-If after trying all these checks you're still experiencing an issue, try the [private endpoint troubleshooting guide](/azure/private-link/troubleshoot-private-endpoint-connectivity).
+If after trying all these checks you're still experiencing an issue, try the [private endpoint troubleshooting guide](../../private-link/troubleshoot-private-endpoint-connectivity.md).
If all the checks are successful and your devices still can't establish a connection to IoT Central, contact the corporate security team responsible for firewalls and networking in general. Potential reasons for failure include:
If all the checks are successful and your devices still can't establish a connec
Now that you've learned how to create a private endpoint for your application, here's the suggested next step: > [!div class="nextstepaction"]
-> [Administer your application](howto-administer.md)
+> [Administer your application](howto-administer.md)
iot-develop Concepts Azure Rtos Security Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-develop/concepts-azure-rtos-security-practices.md
Use cloud resources to record and analyze device failures remotely. Aggregate er
**Application**: Make use of logging libraries and your cloud service's client SDK to push error logs to the cloud where they can be stored and analyzed safely without using valuable device storage space. Integration with [Microsoft Defender for IoT](https://azure.microsoft.com/services/azure-defender-for-iot/) provides this functionality and more. Microsoft Defender for IoT provides agentless monitoring of devices in an IoT solution. Monitoring can be enhanced by including the [Microsoft Defender for IOT micro-agent for Azure RTOS](../defender-for-iot/device-builders/iot-security-azure-rtos.md) on your device. For more information, see the [Runtime security monitoring and threat detection](#runtime-security-monitoring-and-threat-detection) recommendation.
-Microsoft Defender for IoT provides agentless monitoring of devices in an IoT solution. Monitoring can be enhanced by including the [Microsoft Defender for IOT micro-agent for Azure RTOS](/azure/defender-for-iot/device-builders/iot-security-azure-rtos) on your device. For more information, see the [Runtime security monitoring and threat detection](#runtime-security-monitoring-and-threat-detection) recommendation.
+Microsoft Defender for IoT provides agentless monitoring of devices in an IoT solution. Monitoring can be enhanced by including the [Microsoft Defender for IOT micro-agent for Azure RTOS](../defender-for-iot/device-builders/iot-security-azure-rtos.md) on your device. For more information, see the [Runtime security monitoring and threat detection](#runtime-security-monitoring-and-threat-detection) recommendation.
### Disable unused protocols and features
Whether you're using Azure RTOS in combination with Azure Sphere or not, the Mic
- [Common Criteria](https://www.commoncriteriaportal.org/) is an international agreement that provides standardized guidelines and an authorized laboratory program to evaluate products for IT security. Certification provides a level of confidence in the security posture of applications using devices that were evaluated by using the program guidelines. - [Security Evaluation Standard for IoT Platforms (SESIP)](https://globalplatform.org/sesip/) is a standardized methodology for evaluating the security of connected IoT products and components. - [ISO 27000 family](https://www.iso.org/isoiec-27001-information-security.html) is a collection of standards regarding the management and security of information assets. The standards provide baseline guarantees about the security of digital information in certified products.-- [FIPS 140-2/3](https://csrc.nist.gov/publications/detail/fips/140/3/final) is a US government program that standardizes cryptographic algorithms and implementations used in US government and military applications. Along with documented standards, certified laboratories provide FIPS certification to guarantee specific cryptographic implementations adhere to regulations.
+- [FIPS 140-2/3](https://csrc.nist.gov/publications/detail/fips/140/3/final) is a US government program that standardizes cryptographic algorithms and implementations used in US government and military applications. Along with documented standards, certified laboratories provide FIPS certification to guarantee specific cryptographic implementations adhere to regulations.
iot-dps How To Manage Enrollments https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/how-to-manage-enrollments.md
To create a symmetric key individual enrollment:
| **Mechanism** | Select *Symmetric Key* | | **Auto Generate Keys** |Check this box. | | **Registration ID** | Type in a unique registration ID.|
- | **IoT Hub Device ID** | This ID will represent your device. It must follow the rules for a device ID. For more information, see [Device identity properties](../iot-hub/iot-hub-devguide-identity-registry. If the device ID is left unspecified, then the registration ID will be used.|
+ | **IoT Hub Device ID** | This ID will represent your device. It must follow the rules for a device ID. For more information, see [Device identity properties](../iot-hub/iot-hub-devguide-identity-registry.md). If the device ID is left unspecified, then the registration ID will be used.|
| **Select how you want to assign devices to hubs** |Select *Static configuration* so that you can assign to a specific hub| | **Select the IoT hubs this group can be assigned to** |Select one of your hubs.|
iot-dps How To Provision Multitenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/how-to-provision-multitenant.md
For simplicity, this article uses [Symmetric key attestation](concepts-symmetric
8. Repeat Steps 5 through 7 for the second IoT hub that you created for the *westgus* location.
-9. Select the two IoT Hubs you created in the **Select the IoT hubs this group c an be assigned to** drop down.
+9. Select the two IoT Hubs you created in the **Select the IoT hubs this group can be assigned to** drop down.
:::image type="content" source="./media/how-to-provision-multitenant/enrollment-regional-hub-group.png" alt-text="Select the linked IoT hubs.":::
To delete the resource group by name:
* To learn more about deprovisioning, see > [!div class="nextstepaction"]
-> [How to deprovision devices that were previously auto-provisioned](how-to-unprovision-devices.md)
+> [How to deprovision devices that were previously auto-provisioned](how-to-unprovision-devices.md)
iot-edge How To Configure Proxy Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/how-to-configure-proxy-support.md
This step takes place once on the IoT Edge device during initial device setup.
1. Open the config file on your IoT Edge device: `/etc/aziot/config.toml`. The configuration file is protected, so you need administrative privileges to access it. On Linux systems, use the `sudo` command before opening the file in your preferred text editor.
-2. In the config file, find the `[agent]` section, which contains all the configuration information for the edgeAgent module to use on startup. The IoT Edge agent definition includes an `[agent.env]` subsection where you can add environment variables.
+2. In the config file, find the `[agent]` section, which contains all the configuration information for the edgeAgent module to use on startup. Make sure the `[agent]` section is uncommented, or add it if it isn't already in `config.toml`. The IoT Edge agent definition includes an `[agent.env]` subsection where you can add environment variables.
3. Add the **https_proxy** parameter to the environment variables section, and set your proxy URL as its value. ```toml
+ [agent]
+ name = "edgeAgent"
+ type = "docker"
+
[agent.env] # "RuntimeLogLevel" = "debug" # "UpstreamProtocol" = "AmqpWs"
key-vault Overview Vnet Service Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/general/overview-vnet-service-endpoints.md
Here's a list of trusted services that are allowed to access a key vault if the
|Azure Container Registry|[Registry encryption using customer-managed keys](../../container-registry/container-registry-customer-managed-keys.md) |Azure Application Gateway |[Using Key Vault certificates for HTTPS-enabled listeners](../../application-gateway/key-vault-certs.md) |Azure Front Door|[Using Key Vault certificates for HTTPS](../../frontdoor/front-door-custom-domain-https.md#prepare-your-azure-key-vault-account-and-certificate)
-|Azure Purview|[Using credentials for source authentication in Azure Purview](../../purview/manage-credentials.md)
+|Microsoft Purview|[Using credentials for source authentication in Microsoft Purview](../../purview/manage-credentials.md)
|Azure Machine Learning|[Secure Azure Machine Learning in a virtual network](../../machine-learning/how-to-secure-workspace-vnet.md)| > [!NOTE]
load-balancer Load Balancer Standard Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/load-balancer-standard-availability-zones.md
Now that you understand the zone-related properties for Standard Load Balancer,
### Tolerance to zone failure - A **zone redundant** frontend can serve a zonal resource in any zone with a single IP address. The IP can survive one or more zone failures as long as at least one zone remains healthy within the region.-- A **zonal** frontend is a reduction of the service to a single zone and shares fate with the respective zone. If the zone your deployment is in goes down, your deployment will not survive this failure.
+- A **zonal** frontend is a reduction of the service to a single zone and shares fate with the respective zone. If the zone your deployment is in goes down, your load balancer will not survive this failure.
Members in the backend pool of a load balancer are normally associated with a single zone (e.g. zonal virtual machines). A common design for production workloads would be to have multiple zonal resources (e.g. virtual machines from zone 1, 2, and 3) in the backend of a load balancer with a zone-redundant frontend.
logic-apps Create Single Tenant Workflows Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-single-tenant-workflows-azure-portal.md
ms.suite: integration Previously updated : 03/02/2022 Last updated : 04/15/2022 #Customer intent: As a developer, I want to create an automated integration workflow that runs in single-tenant Azure Logic Apps using the Azure portal.
In this example, the workflow runs when the Request trigger receives an inbound
1. After the details pane opens, on the **Parameters** tab, find the **HTTP POST URL** property. To copy the generated URL, select the **Copy Url** (copy file icon), and save the URL somewhere else for now. The URL follows this format:
- `http://<logic-app-name>.azurewebsites.net:443/api/<workflow-name>/triggers/manual/invoke?api-version=2020-05-01-preview&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<shared-access-signature>`
+ `http://<logic-app-name>.azurewebsites.net:443/api/<workflow-name>/triggers/manual/invoke?api-version=2020-05-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<shared-access-signature>`
![Screenshot that shows the designer with the Request trigger and endpoint URL in the "HTTP POST URL" property.](./media/create-single-tenant-workflows-azure-portal/find-request-trigger-url.png) For this example, the URL looks like this:
- `https://fabrikam-workflows.azurewebsites.net:443/api/Fabrikam-Stateful-Workflow/triggers/manual/invoke?api-version=2020-05-01-preview&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=xxxxxXXXXxxxxxXXXXxxxXXXXxxxxXXXX`
+ `https://fabrikam-workflows.azurewebsites.net:443/api/Fabrikam-Stateful-Workflow/triggers/manual/invoke?api-version=2020-05-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=xxxxxXXXXxxxxxXXXXxxxXXXXxxxxXXXX`
> [!TIP] > You can also find the endpoint URL on your logic app's **Overview** pane in the **Workflow URL** property.
For a stateful workflow, after each workflow run, you can view the run history,
| Run status | Description | ||-| | **Aborted** | The run stopped or didn't finish due to external problems, for example, a system outage or lapsed Azure subscription. |
- | **Cancelled** | The run was triggered and started but received a cancel request. |
+ | **Canceled** | The run was triggered and started but received a cancel request. |
| **Failed** | At least one action in the run failed. No subsequent actions in the workflow were set up to handle the failure. | | **Running** | The run was triggered and is in progress, but this status can also appear for a run that is throttled due to [action limits](logic-apps-limits-and-config.md) or the [current pricing plan](https://azure.microsoft.com/pricing/details/logic-apps/). <p><p>**Tip**: If you set up [diagnostics logging](monitor-logic-apps-log-analytics.md), you can get information about any throttle events that happen. | | **Succeeded** | The run succeeded. If any action failed, a subsequent action in the workflow handled that failure. |
For a stateful workflow, after each workflow run, you can view the run history,
| Action status | Description | ||-| | **Aborted** | The action stopped or didn't finish due to external problems, for example, a system outage or lapsed Azure subscription. |
- | **Cancelled** | The action was running but received a cancel request. |
+ | **Canceled** | The action was running but received a cancel request. |
| **Failed** | The action failed. | | **Running** | The action is currently running. | | **Skipped** | The action was skipped because its `runAfter` conditions weren't met, for example, a preceding action failed. Each action has a `runAfter` object where you can set up conditions that must be met before the current action can run. |
For a stateful workflow, after each workflow run, you can view the run history,
||| [aborted-icon]: ./media/create-single-tenant-workflows-azure-portal/aborted.png
- [cancelled-icon]: ./media/create-single-tenant-workflows-azure-portal/cancelled.png
+ [canceled-icon]: ./media/create-single-tenant-workflows-azure-portal/cancelled.png
[failed-icon]: ./media/create-single-tenant-workflows-azure-portal/failed.png [running-icon]: ./media/create-single-tenant-workflows-azure-portal/running.png [skipped-icon]: ./media/create-single-tenant-workflows-azure-portal/skipped.png
logic-apps Create Single Tenant Workflows Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-single-tenant-workflows-visual-studio-code.md
ms.suite: integration Previously updated : 01/28/2022 Last updated : 04/15/2022
To test your logic app, follow these steps to start a debugging session, and fin
1. Find the **Callback URL** value, which looks similar to this URL for the example Request trigger:
- `http://localhost:7071/api/<workflow-name>/triggers/manual/invoke?api-version=2020-05-01-preview&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<shared-access-signature>`
+ `http://localhost:7071/api/<workflow-name>/triggers/manual/invoke?api-version=2020-05-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<shared-access-signature>`
![Screenshot that shows your workflow's overview page with callback URL](./media/create-single-tenant-workflows-visual-studio-code/find-callback-url.png)
To test your logic app, follow these steps to start a debugging session, and fin
| Run status | Description | ||-| | **Aborted** | The run stopped or didn't finish due to external problems, for example, a system outage or lapsed Azure subscription. |
- | **Cancelled** | The run was triggered and started but received a cancellation request. |
+ | **Canceled** | The run was triggered and started but received a cancellation request. |
| **Failed** | At least one action in the run failed. No subsequent actions in the workflow were set up to handle the failure. | | **Running** | The run was triggered and is in progress, but this status can also appear for a run that is throttled due to [action limits](logic-apps-limits-and-config.md) or the [current pricing plan](https://azure.microsoft.com/pricing/details/logic-apps/). <p><p>**Tip**: If you set up [diagnostics logging](monitor-logic-apps-log-analytics.md), you can get information about any throttle events that happen. | | **Succeeded** | The run succeeded. If any action failed, a subsequent action in the workflow handled that failure. |
To test your logic app, follow these steps to start a debugging session, and fin
| Action status | Description | ||-| | **Aborted** | The action stopped or didn't finish due to external problems, for example, a system outage or lapsed Azure subscription. |
- | **Cancelled** | The action was running but received a request to cancel. |
+ | **Canceled** | The action was running but received a request to cancel. |
| **Failed** | The action failed. | | **Running** | The action is currently running. | | **Skipped** | The action was skipped because the immediately preceding action failed. An action has a `runAfter` condition that requires that the preceding action finishes successfully before the current action can run. |
machine-learning Reference Yaml Core Syntax https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/reference-yaml-core-syntax.md
To reference an Azure ML resource (such as compute), you can use either of the f
* Shorthand syntax: `azureml:<resource_name>` * Longhand syntax, which includes the ARM resource ID of the resource: ```
-azureml:/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>/compute/<compute-name>
+azureml:/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>/computes/<compute-name>
``` ## Azure ML data reference URI
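For illustration, a hypothetical job YAML fragment showing both reference styles side by side (`cpu-cluster` and the bracketed placeholders are stand-ins, not real resources):

```yaml
# Shorthand reference to a compute in the current workspace:
compute: azureml:cpu-cluster

# Longhand reference using the full ARM resource ID
# (replace every <placeholder> with your own values):
# compute: azureml:/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>/computes/<compute-name>
```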
managed-grafana How To Api Calls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-api-calls.md
Title: 'How to call Grafana APIs in your automation: Azure Managed Grafana Preview'
-description: Learn how to call Grafana APIs in your automation with Azure Active Directory (Azure AD) and an Azure service principal
+ Title: 'Call Grafana APIs programmatically'
+
+description: Learn how to call Grafana APIs programmatically with Azure Active Directory (Azure AD) and an Azure service principal
Previously updated : 3/31/2022 Last updated : 4/18/2022
-# How to call Grafana APIs in your automation within Azure Managed Grafana Preview
+# How to call Grafana APIs programmatically
In this article, you'll learn how to call Grafana APIs within Azure Managed Grafana Preview using a service principal.
In this article, you'll learn how to call Grafana APIs within Azure Managed Graf
Sign in to the Azure portal at [https://portal.azure.com/](https://portal.azure.com/) with your Azure account.
-## Assign roles to the service principal of your application and of your Azure Managed Grafana workspace
+## Assign roles to the service principal of your application and of your Azure Managed Grafana Preview workspace
1. Start by [Creating an Azure AD application and service principal that can access resources](../active-directory/develop/howto-create-service-principal-portal.md). This guide takes you through creating an application and assigning a role to its service principal. For simplicity, use an application located in the same Azure Active Directory (Azure AD) tenant as your Grafana workspace. 1. Assign the role of your choice to the service principal for your Grafana resource. Refer to [How to share a Managed Grafana workspace](how-to-share-grafana-workspace.md) to learn how to grant access to a Grafana instance. Instead of selecting a user, select **Service principal**.
curl -X GET \
https://<grafana-url>/api/user ```
-Replace `<access-token>` with the access token retrieved in the previous step and replace `<grafana-url>` with the URL of your Grafana instance. For example `https://grafanaworkspace-abcd.cuse.grafana.azure.com`. This URL is displayed in the Azure platform, in the **Overview** page of your Managed Grafana workspace.
+Replace `<access-token>` with the access token retrieved in the previous step and replace `<grafana-url>` with the URL of your Grafana instance. For example `https://grafanaworkspace-abcd.cuse.grafana.azure.com`. This URL is displayed in the Azure platform, in the **Overview** page of your Managed Grafana workspace.
:::image type="content" source="media/managed-grafana-how-to-api-endpoint.png" alt-text="Screenshot of the Azure platform. Endpoint displayed in the Overview page.":::
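The curl call above can also be composed from a script. A minimal sketch that only builds the request URL and `Authorization` header (the workspace URL is the example from the article, `<access-token>` remains a placeholder, and no network call is made here):

```python
# Sketch: compose the same GET request as the curl example above.

def build_grafana_request(grafana_url, access_token):
    """Return (url, headers) for the Grafana 'current user' API call."""
    url = f"{grafana_url.rstrip('/')}/api/user"
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = build_grafana_request(
    "https://grafanaworkspace-abcd.cuse.grafana.azure.com",
    "<access-token>",  # placeholder; substitute the token from the previous step
)
print(url)  # → https://grafanaworkspace-abcd.cuse.grafana.azure.com/api/user
```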
marketplace Dynamics 365 Business Central Offer Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/dynamics-365-business-central-offer-setup.md
Previously updated : 03/28/2022 Last updated : 04/18/2022 # Create a Dynamics 365 Business Central offer
Review [Plan a Dynamics 365 offer](marketplace-dynamics-365.md). It explains the
1. In the dialog box that appears, enter an **Offer ID**. This is a unique identifier for each offer in your account. - This ID is visible to customers in the web address for the offer and in Azure Resource Manager templates, if applicable.
- - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces, and is limited to 50 characters. For example, if your Publisher ID is `testpublisherid` and you enter **test-offer-1**, the offer web address will be `https://appsource.microsoft.com/product/dynamics-365/testpublisherid.test-offer-1`.
+ - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces. The combined length of the Publisher ID and Offer ID is limited to 40 characters. For example, if your Publisher ID is `testpublisherid` and you enter **test-offer-1**, the offer web address will be `https://appsource.microsoft.com/product/dynamics-365/testpublisherid.test-offer-1`. In this case, the segment "testpublisherid.test-offer-1" is 28 characters long, which is within the 40-character limit.
- The Offer ID can't be changed after you select **Create**. 1. Enter an **Offer alias**. This is the name used for the offer in Partner Center.
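The character arithmetic in the example above can be sanity-checked in a couple of lines (values taken directly from the example):

```python
# Verify the combined Publisher ID + Offer ID length against the 40-character limit.
publisher_id = "testpublisherid"
offer_id = "test-offer-1"

combined = f"{publisher_id}.{offer_id}"
print(combined, len(combined))  # → testpublisherid.test-offer-1 28
assert len(combined) <= 40      # within the 40-character limit
```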
marketplace Dynamics 365 Customer Engage Offer Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/dynamics-365-customer-engage-offer-setup.md
Previously updated : 03/28/2022 Last updated : 04/18/2022 # Create a Dynamics 365 apps on Dataverse and Power Apps offer
Review [Plan a Dynamics 365 offer](marketplace-dynamics-365.md). It will explain
1. Enter an **Offer ID**. This is a unique identifier for each offer in your account. - This ID is visible to customers in the web address for the offer and in Azure Resource Manager templates, if applicable.
- - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces, and is limited to 50 characters. For example, if your Publisher ID is `testpublisherid` and you enter **test-offer-1**, the offer web address will be `https://appsource.microsoft.com/product/dynamics-365/testpublisherid.test-offer-1`.
+ - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces. The combined length of the Publisher ID and Offer ID is limited to 40 characters. For example, if your Publisher ID is `testpublisherid` and you enter **test-offer-1**, the offer web address will be `https://appsource.microsoft.com/product/dynamics-365/testpublisherid.test-offer-1`. In this case, the segment "testpublisherid.test-offer-1" is 28 characters long, which is within the 40-character limit.
- The Offer ID can't be changed after you select **Create**. 1. Enter an **Offer alias**. This is the name used for the offer in Partner Center.
marketplace Dynamics 365 Operations Offer Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/dynamics-365-operations-offer-setup.md
Previously updated : 03/28/2022 Last updated : 04/18/2022 # Create a Dynamics 365 Operations Apps offer
Review [Plan a Dynamics 365 offer](marketplace-dynamics-365.md). It will explain
1. Enter an **Offer ID**. This is a unique identifier for each offer in your account. - This ID is visible to customers in the web address for the offer and in Azure Resource Manager templates, if applicable.
- - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces, and is limited to 50 characters. For example, if you enter **test-offer-1**, the offer web address will be `https://azuremarketplace.microsoft.com/marketplace/../test-offer-1`.
+ - Use only lowercase letters and numbers. The ID can include hyphens and underscores, but no spaces, and is limited to 40 characters. For example, if you enter **test-offer-1**, the offer web address will be `https://azuremarketplace.microsoft.com/marketplace/../test-offer-1`.
- The Offer ID can't be changed after you select **Create**. 1. Enter an **Offer alias**. This is the name used for the offer in Partner Center.
marketplace Review Publish Offer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/review-publish-offer.md
You can review your offer status on the **Overview** tab of the commercial marke
| Live | Offer is live in the marketplace and can be seen and acquired by customers. | | Pending Stop distribution | Publisher selected "Stop distribution" on an offer or plan, but the action has not yet been completed. | | Not available in the marketplace | A previously published offer in the marketplace has been removed. |
-|
## Validation and publishing steps
After all pages are complete and you have entered applicable testing notes, sel
| [Preview creation](#preview-creation-phase) | The listing page for your offer preview is available to anyone who has the preview link. If your offer will be sold through Microsoft (transactable), only the audience you specified on the **Preview audience** page of your offer can purchase and access the offer for testing. | | [Publisher sign-off](#publisher-sign-off-phase) | We send you an email with a request for you to preview and approve your offer. | | [Publish](#publish-phase) | We run a series of steps to verify that the preview offer is published live to the commercial marketplace. |
-|||
For more information about validation in Azure Marketplace, see [Azure Marketplace listing guidelines](marketplace-criteria-content-validation.md).
On the **Offer overview** page, you will see preview links under the **Go live**
After you approve your preview, select **Go live** to publish your offer live to the commercial marketplace.
-If you want to make changes after previewing the offer, you can edit and resubmit your publication request. If your offer is already live and available to the public in the marketplace, any updates you make won't go live until you select **Go live*. For more information, see [Update an existing offer in the commercial marketplace](update-existing-offer.md)
+If you want to make changes after previewing the offer, you can edit and resubmit your publication request. If your offer is already live and available to the public in the marketplace, any updates you make won't go live until you select *Go live*. For more information, see [Update an existing offer in the commercial marketplace](update-existing-offer.md).
## Publish phase
open-datasets Dataset 1000 Genomes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-1000-genomes.md
Title: 1000 Genomes- description: Learn how to use the 1000 Genomes dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Bing Covid 19 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-bing-covid-19.md
Title: Bing COVID-19- description: Learn how to use the Bing COVID-19 dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Boston Safety https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-boston-safety.md
Title: Boston Safety Data- description: Learn how to use the Boston Safety Data dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-catalog.md
Title: Datasets in Azure Open Datasets- description: Explore the datasets in Azure Open Datasets. -- Last updated 04/16/2021 # Azure Open Datasets
open-datasets Dataset Chicago Safety https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-chicago-safety.md
Title: Chicago Safety Data- description: Learn how to use the Chicago Safety Data dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Clinvar Annotations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-clinvar-annotations.md
Title: ClinVar Annotations- description: Learn how to use the ClinVar Annotations dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Covid 19 Data Lake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-covid-19-data-lake.md
Title: COVID-19 Data Lake- description: Learn how to use the COVID-19 Data Lake in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Covid 19 Open Research https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-covid-19-open-research.md
Title: COVID-19 Open Research Dataset- description: Learn how to use the COVID-19 Open Research Dataset dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Covid Tracking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-covid-tracking.md
Title: COVID Tracking Project- description: Learn how to use the COVID tracking project dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Diabetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-diabetes.md
Title: Diabetes dataset- description: Learn how to use the diabetes dataset in Azure Open Datasets. -- Last updated 04/16/2021 # Diabetes dataset
open-datasets Dataset Ecdc Covid Cases https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-ecdc-covid-cases.md
Title: European Centre for Disease Prevention and Control (ECDC) COVID-19 Cases- description: Learn how to use the European Centre for Disease Prevention and Control (ECDC) COVID-19 Cases dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Encode https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-encode.md
Title: ENCODE- description: Learn how to use the ENCODE dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Gatk Resource Bundle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-gatk-resource-bundle.md
Title: GATK Resource Bundle- description: Learn how to use the GATK Resource Bundle dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Genomics Data Lake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-genomics-data-lake.md
Title: Genomics Data Lake- description: Learn how to use the Genomics Data Lake in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Gnomad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-gnomad.md
Title: Genome Aggregation Database (gnomAD)- description: Learn how to use the Genome Aggregation Database (gnomAD) dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Human Reference Genomes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-human-reference-genomes.md
Title: Human Reference Genomes- description: Learn how to use the Human Reference Genomes dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Illumina Platinum Genomes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-illumina-platinum-genomes.md
Title: Illumina Platinum Genomes- description: Learn how to use the Illumina Platinum Genomes dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Microsoft News https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-microsoft-news.md
Title: Microsoft News Recommendation Dataset- description: Learn how to use the Microsoft News Recommendation dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Mnist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-mnist.md
Title: MNIST database of handwritten digits- description: Learn how to use the MNIST database of handwritten digits dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset New York City Safety https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-new-york-city-safety.md
Title: New York City Safety Data- description: Learn how to use the New York City Safety dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Oj Sales Simulated https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-oj-sales-simulated.md
Title: OJ Sales Simulated - description: Learn how to use the OJ Sales Simulated dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Open Cravat https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-open-cravat.md
Title: OpenCravat- description: Learn how to use the OpenCravat dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Open Speech Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-open-speech-text.md
Title: Russian Open Speech To Text- description: Learn how to use the Russian Open Speech To Text dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Oxford Covid Government Response Tracker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-oxford-covid-government-response-tracker.md
Title: Oxford COVID-19 Government Response Tracker- description: Learn how to use the Oxford COVID-19 Government Response Tracker dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Public Holidays https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-public-holidays.md
Title: Public Holidays- description: Learn how to use the Public Holidays dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset San Francisco Safety https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-san-francisco-safety.md
Title: San Francisco Safety Data- description: Learn how to use the San Francisco Safety dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Seattle Safety https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-seattle-safety.md
Title: Seattle Safety Data- description: Learn how to use the Seattle Safety dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Snpeff https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-snpeff.md
Title: SnpEff- description: Learn how to use the SnpEff dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Tartanair Simulation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-tartanair-simulation.md
Title: TartanAir AirSim dataset- description: Learn how to use the TartanAir dataset in Azure Open Datasets. -- Last updated 04/16/2021 # TartanAir: AirSim simulation dataset for simultaneous localization and mapping (SLAM)
open-datasets Dataset Taxi For Hire Vehicle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-taxi-for-hire-vehicle.md
Title: NYC Taxi and Limousine for-hire vehicle dataset- description: Learn how to use the NYC Taxi and Limousine for-hire vehicle (VHF) dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Taxi Green https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-taxi-green.md
Title: NYC Taxi and Limousine green dataset- description: Learn how to use the NYC Taxi and Limousine green dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Taxi Yellow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-taxi-yellow.md
Title: NYC Taxi and Limousine yellow dataset- description: Learn how to use the NYC Taxi and Limousine yellow dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Consumer Price Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-consumer-price-index.md
Title: US Consumer Price Index- description: Learn how to use the US Consumer Price Index dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Labor Force https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-labor-force.md
Title: US Labor Force Statistics- description: Learn how to use the US Labor Force Statistics dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Local Unemployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-local-unemployment.md
Title: US Local Area Unemployment Statistics- description: Learn how to use the US Local Area Unemployment Statistics dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us National Employment Earnings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-national-employment-earnings.md
Title: US National Employment Hours and Earnings- description: Learn how to use the US National Employment Hours and Earnings dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Population County https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-population-county.md
Title: US Population by County- description: Learn how to use the US Population by County dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Population Zip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-population-zip.md
Title: US Population by ZIP code- description: Learn how to use the US Population by ZIP code dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Producer Price Index Commodities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-producer-price-index-commodities.md
Title: US Producer Price Index - Commodities- description: Learn how to use the US Producer Price Index - Commodities dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us Producer Price Index Industry https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-producer-price-index-industry.md
Title: US Producer Price Index industry- description: Learn how to use the US Producer Price Index industry dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets Dataset Us State Employment Earnings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/dataset-us-state-employment-earnings.md
Title: US State Employment Hours and Earnings- description: Learn how to use the US State Employment Hours and Earnings dataset in Azure Open Datasets. -- Last updated 04/16/2021
open-datasets How To Create Azure Machine Learning Dataset From Open Dataset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/how-to-create-azure-machine-learning-dataset-from-open-dataset.md
Title: Create datasets with Azure Open Datasets- description: Learn how to create an Azure Machine Learning dataset from Azure Open Datasets.
open-datasets Overview What Are Open Datasets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/overview-what-are-open-datasets.md
Title: What are open datasets? Curated public datasets- description: Learn about Azure Open Datasets, curated datasets from the public domain such as weather, census, holidays, and location to enrich predictive solutions.
open-datasets Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/open-datasets/samples.md
Title: Example Jupyter notebooks using NOAA data- description: Use example Jupyter notebooks for Azure Open Datasets to learn how to load open datasets and use them to enrich demo data. Techniques include use of Spark and Pandas to process data.
postgresql Quickstart Create Postgresql Server Database Using Arm Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/quickstart-create-postgresql-server-database-using-arm-template.md
On the **Deploy Azure Database for PostgreSQL with VNet** page:
* **Sku Size MB**: the storage size, in megabytes, of the Azure Database for PostgreSQL server (default *51200*).
* **Sku Tier**: the deployment tier, such as *Basic*, *GeneralPurpose* (the default), or *MemoryOptimized*.
* **Sku Family**: *Gen4* or *Gen5* (the default), which indicates hardware generation for server deployment.
- * **Postgresql Version**: the version of PostgreSQL server to deploy, such as *9.5*, *9.6*, *10*, or *11* (the default).
+ * **PostgreSQL Version**: the version of PostgreSQL server to deploy, such as *9.5*, *9.6*, *10*, or *11* (the default).
* **Backup Retention Days**: the desired period for geo-redundant backup retention, in days (default *7*).
* **Geo Redundant Backup**: *Enabled* or *Disabled* (the default), depending on geo-disaster recovery (Geo-DR) requirements.
* **Virtual Network Name**: the name of the virtual network (default *azure_postgresql_vnet*).
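The template parameters and defaults described above can be collected into a single deployment-parameters object before submitting the deployment. A minimal Python sketch, assuming illustrative key names (they are not necessarily the template's actual parameter names):

```python
# Hypothetical parameter set mirroring the defaults listed above.
# Key names are assumptions for illustration, not the template's real keys.
defaults = {
    "skuSizeMB": 51200,            # storage size in megabytes
    "skuTier": "GeneralPurpose",   # Basic | GeneralPurpose | MemoryOptimized
    "skuFamily": "Gen5",           # Gen4 | Gen5 (hardware generation)
    "postgresqlVersion": "11",     # 9.5 | 9.6 | 10 | 11
    "backupRetentionDays": 7,      # geo-redundant backup retention, in days
    "geoRedundantBackup": "Disabled",
    "virtualNetworkName": "azure_postgresql_vnet",
}

# Override only what differs from the defaults for a given deployment.
params = {**defaults, "skuTier": "MemoryOptimized", "backupRetentionDays": 14}
```

Merging overrides onto the defaults this way keeps the full parameter set explicit in one place.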
private-link Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/availability.md
The following tables list the Private Link services and the regions where they'r
| | -| | -|
| Azure Automation | All public regions<br/> All Government regions | | GA </br> [Learn how to create a private endpoint for Azure Automation.](../automation/how-to/private-link-security.md)|
|Azure Backup | All public regions<br/> All Government regions | | GA <br/> [Learn how to create a private endpoint for Azure Backup.](../backup/private-endpoints.md) |
-|Azure Purview | Southeast Asia, Australia East, Brazil South, North Europe, West Europe, Canada Central, East US, East US 2, EAST US 2 EUAP, South Central US, West Central US, West US 2, Central India, UK South | [Select for known limitations](../purview/catalog-private-link-troubleshoot.md#known-limitations) | GA <br/> [Learn how to create a private endpoint for Azure Purview.](../purview/catalog-private-link.md) |
+|Microsoft Purview | Southeast Asia, Australia East, Brazil South, North Europe, West Europe, Canada Central, East US, East US 2, EAST US 2 EUAP, South Central US, West Central US, West US 2, Central India, UK South | [Select for known limitations](../purview/catalog-private-link-troubleshoot.md#known-limitations) | GA <br/> [Learn how to create a private endpoint for Microsoft Purview.](../purview/catalog-private-link.md) |
### Security
private-link Private Endpoint Dns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/private-endpoint-dns.md
For Azure services, use the recommended zone names as described in the following
| Azure Data Factory (Microsoft.DataFactory/factories) / portal | privatelink.adf.azure.com | adf.azure.com |
| Azure Cache for Redis (Microsoft.Cache/Redis) / redisCache | privatelink.redis.cache.windows.net | redis.cache.windows.net |
| Azure Cache for Redis Enterprise (Microsoft.Cache/RedisEnterprise) / redisEnterprise | privatelink.redisenterprise.cache.azure.net | redisenterprise.cache.azure.net |
-| Azure Purview (Microsoft.Purview) / account | privatelink.purview.azure.com | purview.azure.com |
-| Azure Purview (Microsoft.Purview) / portal| privatelink.purviewstudio.azure.com | purview.azure.com |
+| Microsoft Purview (Microsoft.Purview) / account | privatelink.purview.azure.com | purview.azure.com |
+| Microsoft Purview (Microsoft.Purview) / portal| privatelink.purviewstudio.azure.com | purview.azure.com |
| Azure Digital Twins (Microsoft.DigitalTwins) / digitalTwinsInstances | privatelink.digitaltwins.azure.net | digitaltwins.azure.net |
| Azure HDInsight (Microsoft.HDInsight) | privatelink.azurehdinsight.net | azurehdinsight.net |
| Azure Arc (Microsoft.HybridCompute) / hybridcompute | privatelink.his.arc.azure.com<br />privatelink.guestconfiguration.azure.com | his.arc.azure.com<br />guestconfiguration.azure.com |
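The table above is essentially a lookup from a service's subresource to its recommended private DNS zone. A minimal Python sketch of that lookup, using a few of the rows shown (the dictionary and key format are illustrative assumptions, not a published API):

```python
# Subset of the recommended private DNS zone names from the table above.
# Keys combine resource type and subresource purely for illustration.
PRIVATE_DNS_ZONES = {
    "Microsoft.DataFactory/factories/portal": "privatelink.adf.azure.com",
    "Microsoft.Cache/Redis/redisCache": "privatelink.redis.cache.windows.net",
    "Microsoft.Purview/account": "privatelink.purview.azure.com",
    "Microsoft.Purview/portal": "privatelink.purviewstudio.azure.com",
}

def recommended_zone(resource: str) -> str:
    """Return the recommended private DNS zone for a service subresource."""
    return PRIVATE_DNS_ZONES[resource]
```

A tool that provisions private endpoints could consult such a map to create the matching `privatelink.*` zone alongside each endpoint.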
private-link Private Endpoint Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/private-endpoint-overview.md
A private-link resource is the destination target of a specified private endpoin
| Azure Database for PostgreSQL - Single server | Microsoft.DBforPostgreSQL/servers | postgresqlServer |
| Azure Device Provisioning Service | Microsoft.Devices/provisioningServices | iotDps |
| Azure IoT Hub | Microsoft.Devices/IotHubs | iotHub |
+| Azure IoT Central | Microsoft.IoTCentral/IoTApps | IoTApps |
| Azure Digital Twins | Microsoft.DigitalTwins/digitalTwinsInstances | digitaltwinsinstance |
| Azure Event Grid | Microsoft.EventGrid/domains | domain |
| Azure Event Grid | Microsoft.EventGrid/topics | topic |
| Application Gateway | Microsoft.Network/applicationgateways | application gateway |
| Private Link service (your own service) | Microsoft.Network/privateLinkServices | empty |
| Power BI | Microsoft.PowerBI/privateLinkServicesForPowerBI | Power BI |
-| Azure Purview | Microsoft.Purview/accounts | account |
-| Azure Purview | Microsoft.Purview/accounts | portal |
+| Microsoft Purview | Microsoft.Purview/accounts | account |
+| Microsoft Purview | Microsoft.Purview/accounts | portal |
| Azure Backup | Microsoft.RecoveryServices/vaults | vault |
| Azure Relay | Microsoft.Relay/namespaces | namespace |
| Azure Cognitive Search | Microsoft.Search/searchServices | search service |
| Azure Synapse Analytics | Microsoft.Synapse/workspaces | SQL, SqlOnDemand, Dev |
| Azure App Service | Microsoft.Web/hostingEnvironments | hosting environment |
| Azure App Service | Microsoft.Web/sites | sites |
-| Azure Static Web Apps | Microsoft.Web/staticSites | staticSite |
+| Azure Static Web Apps | Microsoft.Web/staticSites | staticSites |
> [!NOTE]
> You can create private endpoints only on a General Purpose v2 (GPv2) storage account.
purview Abap Functions Deployment Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/abap-functions-deployment-guide.md
Title: SAP ABAP function module deployment guide - Azure Purview
+ Title: SAP ABAP function module deployment guide - Microsoft Purview
description: This article outlines the steps to deploy the ABAP function module in your SAP server.
Last updated 03/05/2022
# SAP ABAP function module deployment guide
-When you scan [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), and [SAP BW](register-scan-sap-bw.md) sources in Azure Purview, you need to create the dependent ABAP function module in your SAP server. Azure Purview invokes this function module to extract the metadata from your SAP system during scan.
+When you scan [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), and [SAP BW](register-scan-sap-bw.md) sources in Microsoft Purview, you need to create the dependent ABAP function module in your SAP server. Microsoft Purview invokes this function module to extract the metadata from your SAP system during scan.
This article describes the steps required to deploy this module.
## Prerequisites
-Download the SAP ABAP function module source code from Azure Purview Studio. After you register a source for [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), or [SAP BW](register-scan-sap-bw.md), you can find a download link on top as shown in the following image. You can also see the link when you create a new scan or edit a scan.
+Download the SAP ABAP function module source code from Microsoft Purview Studio. After you register a source for [SAP ECC](register-scan-sapecc-source.md), [SAP S/4HANA](register-scan-saps4hana-source.md), or [SAP BW](register-scan-sap-bw.md), you can find a download link on top as shown in the following image. You can also see the link when you create a new scan or edit a scan.
## Deploy the module
After the module is created, specify the following information:
1. Go to the **Source code** tab. There are two ways to deploy code for the function:
- 1. On the main menu, upload the text file you downloaded from Azure Purview Studio as described in [Prerequisites](#prerequisites). To do so, select **Utilities** > **More Utilities** > **Upload/Download** > **Upload**.
+ 1. On the main menu, upload the text file you downloaded from Microsoft Purview Studio as described in [Prerequisites](#prerequisites). To do so, select **Utilities** > **More Utilities** > **Upload/Download** > **Upload**.
1. Alternatively, open the file and copy and paste the contents in the **Source code** area.
purview Apply Classifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/apply-classifications.md
Last updated 09/27/2021
-# Apply classifications on assets in Azure Purview
+# Apply classifications on assets in Microsoft Purview
This article discusses how to apply classifications on assets.

## Introduction
-Classifications can be system or custom types. System classifications are present in Azure Purview by default. Custom classifications can be created based on a regular expression pattern. Classifications can be applied to assets either automatically or manually.
+Classifications can be system or custom types. System classifications are present in Microsoft Purview by default. Custom classifications can be created based on a regular expression pattern. Classifications can be applied to assets either automatically or manually.
This document explains how to apply classifications to your data.
- Set up a scan on your data sources.

## Apply classifications
-In Azure Purview, you can apply system or custom classifications on a file, table, or column asset. This article describes the steps to manually apply classifications on your assets.
+In Microsoft Purview, you can apply system or custom classifications on a file, table, or column asset. This article describes the steps to manually apply classifications on your assets.
### Apply classification to a file asset
-Azure Purview can scan and automatically classify documentation files. For example, if you have a file named *multiple.docx* and it has a National ID number in its content, Azure Purview adds the classification **EU National Identification Number** to the file asset's detail page.
+Microsoft Purview can scan and automatically classify documentation files. For example, if you have a file named *multiple.docx* and it has a National ID number in its content, Microsoft Purview adds the classification **EU National Identification Number** to the file asset's detail page.
In some scenarios, you might want to manually add classifications to your file asset. If you have multiple files that are grouped into a resource set, add classifications at the resource set level.
Follow these steps to add a custom or system classification to a partition resou
### Apply classification to a table asset
-When Azure Purview scans your data sources, it doesn't automatically assign classifications to table assets. If you want your table asset to have a classification, you must add it manually.
+When Microsoft Purview scans your data sources, it doesn't automatically assign classifications to table assets. If you want your table asset to have a classification, you must add it manually.
To add a classification to a table asset:
1. Select **Save** to save the classifications.
-1. On the **Overview** page, verify that Azure Purview added your new classifications.
+1. On the **Overview** page, verify that Microsoft Purview added your new classifications.
:::image type="content" source="./media/apply-classifications/verify-classifications-added-to-table.png" alt-text="Screenshot showing how to verify that classifications were added to a table asset.":::

### Add classification to a column asset
-Azure Purview automatically scans and adds classifications to all column assets. However, if you want to change the classification, you can do so at the column level.
+Microsoft Purview automatically scans and adds classifications to all column assets. However, if you want to change the classification, you can do so at the column level.
To add a classification to a column:
purview Asset Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/asset-insights.md
Title: Asset insights on your data in Azure Purview
-description: This how-to guide describes how to view and use Azure Purview Insights asset reporting on your data.
+ Title: Asset insights on your data in Microsoft Purview
+description: This how-to guide describes how to view and use Microsoft Purview Insights asset reporting on your data.
Last updated 09/27/2021
-# Asset insights on your data in Azure Purview
+# Asset insights on your data in Microsoft Purview
-This how-to guide describes how to access, view, and filter Azure Purview Asset insight reports for your data.
+This how-to guide describes how to access, view, and filter Microsoft Purview Asset insight reports for your data.
> [!IMPORTANT]
-> Azure Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
In this how-to guide, you'll learn how to:

> [!div class="checklist"]
-> * View insights from your Azure Purview account.
+> * View insights from your Microsoft Purview account.
> * Get a bird's eye view of your data.
> * Drill down for more asset count details.

## Prerequisites
-Before getting started with Azure Purview insights, make sure that you've completed the following steps:
+Before getting started with Microsoft Purview insights, make sure that you've completed the following steps:
* Set up your Azure resources and populate the account with data.
* Set up and complete a scan on the source type.
-For more information, see [Manage data sources in Azure Purview](manage-data-sources.md).
+For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md).
-## Use Azure Purview Asset Insights
+## Use Microsoft Purview Asset Insights
-In Azure Purview, you can register and scan source types. Once the scan is complete, you can view the asset distribution in Asset Insights, which tells you the state of your data estate by classification and resource sets. It also tells you if there is any change in data size.
+In Microsoft Purview, you can register and scan source types. Once the scan is complete, you can view the asset distribution in Asset Insights, which tells you the state of your data estate by classification and resource sets. It also tells you if there is any change in data size.
> [!NOTE]
> After you have scanned your source types, give Asset Insights 3-8 hours to reflect the new assets. The delay may be due to high traffic in the deployment region or the size of your workload. For further information, please contact the field support team.
-1. Navigate to your Azure Purview account in the Azure portal.
+1. Navigate to your Microsoft Purview account in the Azure portal.
-1. On the **Overview** page, in the **Get Started** section, select the **Open Azure Purview Studio** tile.
+1. On the **Overview** page, in the **Get Started** section, select the **Open Microsoft Purview Studio** tile.
- :::image type="content" source="./media/asset-insights/portal-access.png" alt-text="Launch Azure Purview from the Azure portal":::
+ :::image type="content" source="./media/asset-insights/portal-access.png" alt-text="Launch Microsoft Purview from the Azure portal":::
-1. On the Azure Purview **Home** page, select **Insights** on the left menu.
+1. On the Microsoft Purview **Home** page, select **Insights** on the left menu.
:::image type="content" source="./media/asset-insights/view-insights.png" alt-text="View your insights in the Azure portal":::
-1. In the **Insights** area, select **Assets** to display the Azure Purview **Asset insights** report.
+1. In the **Insights** area, select **Assets** to display the Microsoft Purview **Asset insights** report.
### View Asset Insights
The second graph in file-based source types is ***Files not associated with a re
## Next steps
-Learn more about Azure Purview insight reports with
+Learn more about Microsoft Purview insight reports with
- [Classification insights](./classification-insights.md)
- [Glossary insights](glossary-insights.md)
purview Azure Purview Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/azure-purview-connector-overview.md
Title: Azure Purview supported data sources and file types
-description: This article provides details about supported data sources, file types, and functionalities in Azure Purview.
+ Title: Microsoft Purview supported data sources and file types
+description: This article provides details about supported data sources, file types, and functionalities in Microsoft Purview.
# Supported data sources and file types
-This article discusses currently supported data sources, file types, and scanning concepts in Azure Purview.
+This article discusses currently supported data sources, file types, and scanning concepts in Microsoft Purview.
-## Azure Purview data sources
+## Microsoft Purview data sources
The table below shows the supported capabilities for each data source. Select the data source, or the feature, to learn more.
\* Besides the lineage on assets within the data source, lineage is also supported if the dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or a [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).

> [!NOTE]
-> Currently, Azure Purview can't scan an asset that has `/`, `\`, or `#` in its name. To scope your scan and avoid scanning assets that have those characters in the asset name, use the example in [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md#creating-the-scan).
+> Currently, Microsoft Purview can't scan an asset that has `/`, `\`, or `#` in its name. To scope your scan and avoid scanning assets that have those characters in the asset name, use the example in [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md#creating-the-scan).
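The naming restriction in the note above amounts to a simple pre-check on asset names before scoping a scan. A minimal Python sketch (the function name is illustrative, not a Purview API):

```python
# Characters the note above says Purview currently can't scan in asset names.
UNSUPPORTED_CHARS = "/\\#"

def is_scannable(asset_name: str) -> bool:
    """Return False if the asset name contains /, \\, or #."""
    return not any(ch in asset_name for ch in UNSUPPORTED_CHARS)

print(is_scannable("sales_2021"))   # → True
print(is_scannable("sales#2021"))   # → False
```

Running a check like this over candidate asset names lets you exclude the offending ones from the scan scope up front.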
## Scan regions
-The following is a list of all the Azure data source (data center) regions where the Azure Purview scanner runs. If your Azure data source is in a region outside of this list, the scanner will run in the region of your Azure Purview instance.
+The following is a list of all the Azure data source (data center) regions where the Microsoft Purview scanner runs. If your Azure data source is in a region outside of this list, the scanner will run in the region of your Microsoft Purview instance.
-### Azure Purview scanner regions
+### Microsoft Purview scanner regions
- Australia East
- Australia Southeast
The following file types are supported for scanning, for schema extraction, and
- Structured file formats supported by extension: AVRO, ORC, PARQUET, CSV, JSON, PSV, SSV, TSV, TXT, XML, GZIP
> [!Note]
- > * Azure Purview scanner only supports schema extraction for the structured file types listed above.
- > * For AVRO, ORC, and PARQUET file types, Azure Purview scanner does not support schema extraction for files that contain complex data types (for example, MAP, LIST, STRUCT).
- > * Azure Purview scanner supports scanning snappy compressed PARQUET types for schema extraction and classification.
+ > * Microsoft Purview scanner only supports schema extraction for the structured file types listed above.
+ > * For AVRO, ORC, and PARQUET file types, Microsoft Purview scanner does not support schema extraction for files that contain complex data types (for example, MAP, LIST, STRUCT).
+ > * Microsoft Purview scanner supports scanning snappy compressed PARQUET types for schema extraction and classification.
> * For GZIP file types, the GZIP must be mapped to a single csv file within. Gzip files are subject to System and Custom Classification rules. We currently don't support scanning a gzip file mapped to multiple files within, or any file type other than csv.
> * For delimited file types (CSV, PSV, SSV, TSV, TXT), we do not support data type detection. The data type will be listed as "string" for all columns.
- Document file formats supported by extension: DOC, DOCM, DOCX, DOT, ODP, ODS, ODT, PDF, POT, PPS, PPSX, PPT, PPTM, PPTX, XLC, XLS, XLSB, XLSM, XLSX, XLT
-- Azure Purview also supports custom file extensions and custom parsers.
+- Microsoft Purview also supports custom file extensions and custom parsers.
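To make the extension rules above concrete, here is a hedged sketch (not part of Purview) that checks a file name against the structured formats listed — only those get schema extraction:

```python
# Extension lists transcribed from the rules above; the helper itself is illustrative.
STRUCTURED = {"AVRO", "ORC", "PARQUET", "CSV", "JSON", "PSV", "SSV", "TSV", "TXT", "XML", "GZIP"}
DOCUMENT = {"DOC", "DOCM", "DOCX", "DOT", "ODP", "ODS", "ODT", "PDF", "POT", "PPS",
            "PPSX", "PPT", "PPTM", "PPTX", "XLC", "XLS", "XLSB", "XLSM", "XLSX", "XLT"}

def schema_extraction_supported(file_name: str) -> bool:
    """Schema extraction applies only to the structured file types listed above."""
    ext = file_name.rsplit(".", 1)[-1].upper()
    return ext in STRUCTURED

print(schema_extraction_supported("orders.parquet"))  # True
print(schema_extraction_supported("report.docx"))     # False: document type, scanned but no schema
```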
## Nested data
Nested data, or nested schema parsing, is not supported in SQL. A column with ne
## Sampling within a file
-In Azure Purview terminology,
+In Microsoft Purview terminology,
- L1 scan: Extracts basic information and metadata like file name, size, and fully qualified name
- L2 scan: Extracts schema for structured file types and database tables
- L3 scan: Extracts schema where applicable and subjects the sampled file to system and custom classification rules
-For all structured file formats, Azure Purview scanner samples files in the following way:
+For all structured file formats, Microsoft Purview scanner samples files in the following way:
- For structured file types, it samples the top 128 rows in each column or the first 1 MB, whichever is lower.
- For document file formats, it samples the first 20 MB of each file.
- - If a document file is larger than 20 MB, then it is not subject to a deep scan (subject to classification). In that case, Azure Purview captures only basic meta data like file name and fully qualified name.
+  - If a document file is larger than 20 MB, then it is not subject to a deep scan (subject to classification). In that case, Microsoft Purview captures only basic metadata like file name and fully qualified name.
- For **tabular data sources (SQL, Cosmos DB)**, it samples the top 128 rows.
## Resource set file sampling
-A folder or group of partition files is detected as a *resource set* in Azure Purview, if it matches with a system resource set policy or a customer defined resource set policy. If a resource set is detected, then Azure Purview will sample each folder that it contains. Learn more about resource sets [here](concept-resource-sets.md).
+A folder or group of partition files is detected as a *resource set* in Microsoft Purview, if it matches with a system resource set policy or a customer defined resource set policy. If a resource set is detected, then Microsoft Purview will sample each folder that it contains. Learn more about resource sets [here](concept-resource-sets.md).
File sampling for resource sets by file types:
## Classification
-All 206 system classification rules apply to structured file formats. Only the MCE classification rules apply to document file types (Not the data scan native regex patterns, bloom filter-based detection). For more information on supported classifications, see [Supported classifications in Azure Purview](supported-classifications.md).
+All 206 system classification rules apply to structured file formats. Only the MCE classification rules apply to document file types (not the data scan native regex patterns or bloom filter-based detection). For more information on supported classifications, see [Supported classifications in Microsoft Purview](supported-classifications.md).
## Next steps
- [Register and scan Azure Blob storage source](register-scan-azure-blob-storage-source.md)
-- [Scans and ingestion in Azure Purview](concept-scans-and-ingestion.md)
-- [Manage data sources in Azure Purview](manage-data-sources.md)
+- [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md)
+- [Manage data sources in Microsoft Purview](manage-data-sources.md)
purview Catalog Asset Details https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-asset-details.md
Last updated 02/24/2022
-# View, edit and delete assets in Azure Purview catalog
+# View, edit and delete assets in Microsoft Purview catalog
This article discusses how you can view your assets and their relevant details. It also describes how you can edit and delete assets from your catalog.
## Prerequisites
- Set up your data sources and scan the assets into your catalog.
-- *Or* Use the Azure Purview Atlas APIs to ingest assets into the catalog.
+- *Or* Use the Microsoft Purview Atlas APIs to ingest assets into the catalog.
## Viewing asset details
-You can discover your assets in the Azure Purview data catalog by either:
+You can discover your assets in the Microsoft Purview data catalog by either:
- [Browsing the data catalog](how-to-browse-catalog.md) - [Searching the data catalog](how-to-search-catalog.md)
If you edit an asset by adding a description, asset level classification, glossa
If you make some column level updates, like adding a description, column level classification, or glossary term, then subsequent scans will also update the asset schema (new columns and classifications will be detected by the scanner in subsequent scan runs).
-Even on edited assets, after a scan Azure Purview will reflect the truth of the source system. For example: if you edit a column and it's deleted from the source, it will be deleted from your asset in Azure Purview.
+Even on edited assets, after a scan Microsoft Purview will reflect the truth of the source system. For example: if you edit a column and it's deleted from the source, it will be deleted from your asset in Microsoft Purview.
>[!NOTE]
-> If you update the **name or data type of a column** in an Azure Purview asset, later scans **will not** update the asset schema. New columns and classifications **will not** be detected.
+> If you update the **name or data type of a column** in a Microsoft Purview asset, later scans **will not** update the asset schema. New columns and classifications **will not** be detected.
## Deleting assets
You can delete an asset by selecting the delete icon under the name of the asset
### Delete behavior explained
-Any asset you delete using the delete button is permanently deleted in Azure Purview. However, if you run a **full scan** on the source from which the asset was ingested into the catalog, then the asset is reingested and you can discover it using the Azure Purview catalog.
+Any asset you delete using the delete button is permanently deleted in Microsoft Purview. However, if you run a **full scan** on the source from which the asset was ingested into the catalog, then the asset is reingested and you can discover it using the Microsoft Purview catalog.
-If you have a scheduled scan (weekly or monthly) on the source, the **deleted asset will not get re-ingested** into the catalog unless the asset is modified by an end user since the previous run of the scan. For example, if a SQL table was deleted from Azure Purview, but after the table was deleted a user added a new column to the table in SQL, at the next scan the asset will be rescanned and ingested into the catalog.
+If you have a scheduled scan (weekly or monthly) on the source, the **deleted asset will not get re-ingested** into the catalog unless the asset is modified by an end user since the previous run of the scan. For example, if a SQL table was deleted from Microsoft Purview, but after the table was deleted a user added a new column to the table in SQL, at the next scan the asset will be rescanned and ingested into the catalog.
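The re-ingestion rule described above reduces to a simple predicate: a deleted asset comes back on a scheduled scan only if the source object changed after the previous scan run. A hedged sketch, where `last_modified` and `previous_scan` are hypothetical timestamps:

```python
from datetime import datetime

def will_reingest(last_modified: datetime, previous_scan: datetime) -> bool:
    """Illustrative rule only: a deleted asset is re-ingested by a scheduled
    scan when the source object was modified after the previous scan run."""
    return last_modified > previous_scan

# Table deleted from the catalog, then a column was added in SQL afterwards:
print(will_reingest(datetime(2022, 3, 10), datetime(2022, 3, 1)))  # True
# No source-side changes since the previous scan:
print(will_reingest(datetime(2022, 2, 20), datetime(2022, 3, 1)))  # False
```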
-If you delete an asset, only that asset is deleted. Azure Purview does not currently support cascaded deletes. For example, if you delete a storage account asset in your catalog - the containers, folders and files within them are not deleted.
+If you delete an asset, only that asset is deleted. Microsoft Purview does not currently support cascaded deletes. For example, if you delete a storage account asset in your catalog, the containers, folders, and files within it are not deleted.
## Next steps
-- [Browse the Azure Purview Data catalog](how-to-browse-catalog.md)
-- [Search the Azure Purview Data Catalog](how-to-search-catalog.md)
+- [Browse the Microsoft Purview Data catalog](how-to-browse-catalog.md)
+- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview Catalog Conditional Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-conditional-access.md
Title: Configure Azure AD Conditional Access for Azure Purview
-description: This article describes steps how to configure Azure AD Conditional Access for Azure Purview
+ Title: Configure Azure AD Conditional Access for Microsoft Purview
+description: This article describes steps how to configure Azure AD Conditional Access for Microsoft Purview
Last updated 01/14/2022
-# Customer intent: As an identity and security admin, I want to set up Azure Active Directory Conditional Access for Azure Purview, for secure access.
+# Customer intent: As an identity and security admin, I want to set up Azure Active Directory Conditional Access for Microsoft Purview, for secure access.
-# Conditional Access with Azure Purview
+# Conditional Access with Microsoft Purview
-[Azure Purview](./overview.md) supports Microsoft Conditional Access.
+[Microsoft Purview](./overview.md) supports Microsoft Conditional Access.
-The following steps show how to configure Azure Purview to enforce a Conditional Access policy.
+The following steps show how to configure Microsoft Purview to enforce a Conditional Access policy.
## Prerequisites
-- When multi-factor authentication is enabled, to sign in to Azure Purview Studio, you must perform multi-factor authentication.
+- When multi-factor authentication is enabled, to sign in to Microsoft Purview Studio, you must perform multi-factor authentication.
## Configure conditional access
The following steps show how to configure Azure Purview to enforce a Conditional
:::image type="content" source="media/catalog-conditional-access/select-users-and-groups.png" alt-text="Screenshot that shows User and Group selection"lightbox="media/catalog-conditional-access/select-users-and-groups.png":::
-1. Select **Cloud apps**, select **Select apps**. You see all apps available for Conditional Access. Select **Azure Purview**, at the bottom select **Select**, and then select **Done**.
+1. Select **Cloud apps**, select **Select apps**. You see all apps available for Conditional Access. Select **Microsoft Purview**, at the bottom select **Select**, and then select **Done**.
:::image type="content" source="media/catalog-conditional-access/select-azure-purview.png" alt-text="Screenshot that shows Applications selection"lightbox="media/catalog-conditional-access/select-azure-purview.png":::
The following steps show how to configure Azure Purview to enforce a Conditional
## Next steps
-- [Use Azure Purview Studio](./use-purview-studio.md)
+- [Use Microsoft Purview Studio](./use-purview-studio.md)
purview Catalog Lineage User Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-lineage-user-guide.md
Title: Data Catalog lineage user guide
-description: This article provides an overview of the catalog lineage feature of Azure Purview.
+description: This article provides an overview of the catalog lineage feature of Microsoft Purview.
Last updated 01/20/2022
-# Azure Purview Data Catalog lineage user guide
+# Microsoft Purview Data Catalog lineage user guide
-This article provides an overview of the data lineage features in Azure Purview Data Catalog.
+This article provides an overview of the data lineage features in Microsoft Purview Data Catalog.
## Background
-One of the platform features of Azure Purview is the ability to show the lineage between datasets created by data processes. Systems like Data Factory, Data Share, and Power BI capture the lineage of data as it moves. Custom lineage reporting is also supported via Atlas hooks and REST API.
+One of the platform features of Microsoft Purview is the ability to show the lineage between datasets created by data processes. Systems like Data Factory, Data Share, and Power BI capture the lineage of data as it moves. Custom lineage reporting is also supported via Atlas hooks and REST API.
## Lineage collection
- Metadata collected in Azure Purview from enterprise data systems are stitched across to show an end to end data lineage. Data systems that collect lineage into Azure Purview are broadly categorized into following three types:
+ Metadata collected in Microsoft Purview from enterprise data systems is stitched together to show end-to-end data lineage. Data systems that collect lineage into Microsoft Purview are broadly categorized into the following three types:
- [Data processing systems](#data-processing-systems)
- [Data storage systems](#data-storage-systems)
One of the platform features of Azure Purview is the ability to show the lineage
Each system supports a different level of lineage scope. Check the sections below, or your system's individual lineage article, to confirm the scope of lineage currently available.
### Data processing systems
-Data integration and ETL tools can push lineage into Azure Purview at execution time. Tools such as Data Factory, Data Share, Synapse, Azure Databricks, and so on, belong to this category of data processing systems. The data processing systems reference datasets as source from different databases and storage solutions to create target datasets. The list of data processing systems currently integrated with Azure Purview for lineage are listed in below table.
+Data integration and ETL tools can push lineage into Microsoft Purview at execution time. Tools such as Data Factory, Data Share, Synapse, and Azure Databricks belong to this category of data processing systems. These systems reference datasets as sources from different databases and storage solutions to create target datasets. The data processing systems currently integrated with Microsoft Purview for lineage are listed in the table below.
| Data processing system | Supported scope |
| - | - |
Data integration and ETL tools can push lineage into Azure Purview at execution
| Azure Data Share | [Share snapshot](how-to-link-azure-data-share.md) |
### Data storage systems
-Databases & storage solutions such as Oracle, Teradata, and SAP have query engines to transform data using scripting language. Data lineage from views/stored procedures/etc are collected into Azure Purview and stitched with lineage from other systems. Lineage is supported for the following data sources via Azure Purview data scan. Learn more about the supported lineage scenarios from the respective article.
+Databases and storage solutions such as Oracle, Teradata, and SAP have query engines that transform data using a scripting language. Data lineage from views, stored procedures, and similar objects is collected into Microsoft Purview and stitched with lineage from other systems. Lineage is supported for the following data sources via Microsoft Purview data scans. Learn more about the supported lineage scenarios in the respective articles.
|**Category**| **Data source** |
|---|---|
Databases & storage solutions such as Oracle, Teradata, and SAP have query engin
|| [SAP S/4HANA](register-scan-saps4hana-source.md) |
### Data analytics and reporting systems
-Data analytics and reporting systems like Azure ML and Power BI report lineage into Azure Purview. These systems will use the datasets from storage systems and process through their meta model to create BI Dashboards, ML experiments and so on.
+Data analytics and reporting systems like Azure ML and Power BI report lineage into Microsoft Purview. These systems will use the datasets from storage systems and process through their meta model to create BI Dashboards, ML experiments and so on.
| Data analytics & reporting system | Supported scope |
| - | - |
Data analytics and reporting systems like Azure ML and Power BI report lineage i
> [!VIDEO https://www.microsoft.com/videoplayer/embed/RWxTAK]
-Lineage in Azure Purview includes datasets and processes. Datasets are also referred to as nodes while processes can be also called edges:
+Lineage in Microsoft Purview includes datasets and processes. Datasets are also referred to as nodes while processes can be also called edges:
-* **Dataset (Node)**: A dataset (structured or unstructured) provided as an input to a process. For example, a SQL Table, Azure blob, and files (such as .csv and .xml), are all considered datasets. In the lineage section of Azure Purview, datasets are represented by rectangular boxes.
+* **Dataset (Node)**: A dataset (structured or unstructured) provided as an input to a process. For example, a SQL Table, Azure blob, and files (such as .csv and .xml), are all considered datasets. In the lineage section of Microsoft Purview, datasets are represented by rectangular boxes.
-* **Process (Edge)**: An activity or transformation performed on a dataset is called a process. For example, ADF Copy activity, Data Share snapshot and so on. In the lineage section of Azure Purview, processes are represented by round-edged boxes.
+* **Process (Edge)**: An activity or transformation performed on a dataset is called a process. For example, ADF Copy activity, Data Share snapshot and so on. In the lineage section of Microsoft Purview, processes are represented by round-edged boxes.
-To access lineage information for an asset in Azure Purview, follow the steps:
+To access lineage information for an asset in Microsoft Purview, follow the steps:
-1. In the Azure portal, go to the [Azure Purview accounts page](https://aka.ms/purviewportal).
+1. In the Azure portal, go to the [Microsoft Purview accounts page](https://aka.ms/purviewportal).
-1. Select your Azure Purview account from the list, and then select **Open Azure Purview Studio** from the **Overview** page.
+1. Select your Microsoft Purview account from the list, and then select **Open Microsoft Purview Studio** from the **Overview** page.
-1. On the Azure Purview Studio **Home** page, search for a dataset name or the process name such as ADF Copy or Data Flow activity. And then press Enter.
+1. On the Microsoft Purview Studio **Home** page, search for a dataset name or a process name, such as ADF Copy or Data Flow activity, and then press Enter.
1. From the search results, select the asset and select its **Lineage** tab.
To access lineage information for an asset in Azure Purview, follow the steps:
## Asset-level lineage
-Azure Purview supports asset level lineage for the datasets and processes. To see the asset level lineage go to the **Lineage** tab of the current asset in the catalog. Select the current dataset asset node. By default the list of columns belonging to the data appears in the left pane.
+Microsoft Purview supports asset-level lineage for datasets and processes. To see the asset-level lineage, go to the **Lineage** tab of the current asset in the catalog and select the current dataset asset node. By default, the list of columns belonging to the data appears in the left pane.
:::image type="content" source="./media/catalog-lineage-user-guide/view-columns-from-lineage.png" alt-text="Screenshot showing how to select View columns in the lineage page" border="true":::
To see column-level lineage of a dataset, go to the **Lineage** tab of the curre
:::image type="content" source="./media/catalog-lineage-user-guide/use-toggle-to-filter-nodes.png" alt-text="Screenshot showing how to use the toggle to filter the list of nodes on the lineage page." lightbox="./media/catalog-lineage-user-guide/use-toggle-to-filter-nodes.png"::: ## Process column lineage
-Data process can take one or more input datasets to produce one or more outputs. In Azure Purview, column level lineage is available for process nodes.
+A data process can take one or more input datasets to produce one or more outputs. In Microsoft Purview, column-level lineage is available for process nodes.
1. Switch between input and output datasets from a drop-down in the columns panel.
2. Select columns from one or more tables to see the lineage flowing from the input dataset to the corresponding output dataset.
purview Catalog Managed Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-managed-vnet.md
Title: Managed Virtual Network and managed private endpoints
-description: This article describes Managed Virtual Network and managed private endpoints in Azure Purview.
+description: This article describes Managed Virtual Network and managed private endpoints in Microsoft Purview.
Last updated 03/17/2022
-# Customer intent: As a Azure Purview admin, I want to set up Managed Virtual Network and managed private endpoints for my Azure Purview account.
+# Customer intent: As a Microsoft Purview admin, I want to set up Managed Virtual Network and managed private endpoints for my Microsoft Purview account.
-# Use a Managed VNet with your Azure Purview account
+# Use a Managed VNet with your Microsoft Purview account
> [!IMPORTANT]
-> Azure Purview Managed Vnet, VNet Integration Runtime, and managed private endpoint connections are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Managed Vnet, VNet Integration Runtime, and managed private endpoint connections are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
> [!IMPORTANT]
-> Currently, Managed Virtual Network and managed private endpoints are available for Azure Purview accounts that are deployed in the following regions:
+> Currently, Managed Virtual Network and managed private endpoints are available for Microsoft Purview accounts that are deployed in the following regions:
> - Australia East
> - Canada Central
> - East US
## Conceptual overview
-This article describes how to configure Managed Virtual Network and managed private endpoints for Azure Purview.
+This article describes how to configure Managed Virtual Network and managed private endpoints for Microsoft Purview.
### Supported regions
-Currently, Managed Virtual Network and managed private endpoints are available for Azure Purview accounts that are deployed in the following regions:
+Currently, Managed Virtual Network and managed private endpoints are available for Microsoft Purview accounts that are deployed in the following regions:
> - Australia East
> - Canada Central
> - East US
Currently, Managed Virtual Network and managed private endpoints are available f
### Supported data sources
-Currently, the following data sources are supported to have a managed private endpoint and can be scanned using Managed VNet Runtime in Azure Purview:
+Currently, the following data sources are supported to have a managed private endpoint and can be scanned using Managed VNet Runtime in Microsoft Purview:
- Azure Blob Storage
- Azure Data Lake Storage Gen 2
Currently, the following data sources are supported to have a managed private en
Additionally, you can deploy managed private endpoints for your Azure Key Vault resources if you need to run scans using any authentication option other than Managed Identities, such as SQL Authentication or Account Key.
> [!IMPORTANT]
-> If you are planning to scan Azure Synapse workspaces using Managed Virtual Network, you are also required to [configure Azure Synapse workspace firewall access](register-scan-synapse-workspace.md#set-up-azure-synapse-workspace-firewall-access) to enable **Allow Azure services and resources to access this workspace**. Currently, we do not support setting up scans for an Azure Synapse workspace from Azure Purview Studio, if you cannot enable **Allow Azure services and resources to access this workspace** on your Azure Synapse workspaces. If you cannot enable the firewall:
-> - You can use [Azure Purview Rest API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
+> If you are planning to scan Azure Synapse workspaces using Managed Virtual Network, you are also required to [configure Azure Synapse workspace firewall access](register-scan-synapse-workspace.md#set-up-azure-synapse-workspace-firewall-access) to enable **Allow Azure services and resources to access this workspace**. Currently, we do not support setting up scans for an Azure Synapse workspace from Microsoft Purview Studio, if you cannot enable **Allow Azure services and resources to access this workspace** on your Azure Synapse workspaces. If you cannot enable the firewall:
+> - You can use [Microsoft Purview Rest API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
> - You must use **SQL Authentication** as the authentication mechanism.
### Managed Virtual Network
-A Managed Virtual Network in Azure Purview is a virtual network which is deployed and managed by Azure inside the same region as Azure Purview account to allow scanning Azure data sources inside a managed network, without having to deploy and manage any self-hosted integration runtime virtual machines by the customer in Azure.
+A Managed Virtual Network in Microsoft Purview is a virtual network that is deployed and managed by Azure in the same region as the Microsoft Purview account, allowing Azure data sources to be scanned inside a managed network without the customer having to deploy and manage any self-hosted integration runtime virtual machines in Azure.
-You can deploy an Azure Managed Integration Runtime within an Azure Purview Managed Virtual Network. From there, the Managed VNet Runtime will leverage private endpoints to securely connect to and scan supported data sources.
+You can deploy an Azure Managed Integration Runtime within a Microsoft Purview Managed Virtual Network. From there, the Managed VNet Runtime will leverage private endpoints to securely connect to and scan supported data sources.
Creating a Managed VNet Runtime within a Managed Virtual Network ensures that the data integration process is isolated and secure. Benefits of using a Managed Virtual Network:
-- With a Managed Virtual Network, you can offload the burden of managing the Virtual Network to Azure Purview. You don't need to create and manage VNets or subnets for the Azure Integration Runtime to use for scanning Azure data sources.
+- With a Managed Virtual Network, you can offload the burden of managing the Virtual Network to Microsoft Purview. You don't need to create and manage VNets or subnets for Azure Integration Runtime to use for scanning Azure data sources.
- It doesn't require deep Azure networking knowledge to do data integrations securely. Using a Managed Virtual Network is much simpler for data engineers.
- A Managed Virtual Network, along with managed private endpoints, protects against data exfiltration.
> [!IMPORTANT]
-> Currently, the Managed Virtual Network is only supported in the same region as Azure Purview account region.
+> Currently, the Managed Virtual Network is only supported in the same region as Microsoft Purview account region.
> [!Note]
> You cannot switch a global Azure integration runtime or self-hosted integration runtime to a Managed VNet Runtime and vice versa.
-A Managed VNet is created for your Azure Purview account when you create a Managed VNet Runtime for the first time in your Azure Purview account. You can't view or manage the Managed VNets.
+A Managed VNet is created for your Microsoft Purview account when you create a Managed VNet Runtime for the first time in your Microsoft Purview account. You can't view or manage the Managed VNets.
### Managed private endpoints
-Managed private endpoints are private endpoints created in the Azure Purview Managed Virtual Network establishing a private link to Azure Purview and Azure resources. Azure Purview manages these private endpoints on your behalf.
+Managed private endpoints are private endpoints created in the Microsoft Purview Managed Virtual Network establishing a private link to Microsoft Purview and Azure resources. Microsoft Purview manages these private endpoints on your behalf.
-Azure Purview supports private links. Private link enables you to access Azure (PaaS) services (such as Azure Storage, Azure Cosmos DB, Azure Synapse Analytics).
+Microsoft Purview supports private links. Private link enables you to access Azure (PaaS) services (such as Azure Storage, Azure Cosmos DB, Azure Synapse Analytics).
When you use a private link, traffic between your data sources and Managed Virtual Network traverses entirely over the Microsoft backbone network. Private Link protects against data exfiltration risks. You establish a private link to a resource by creating a private endpoint.
Private endpoint uses a private IP address in the Managed Virtual Network to eff
> To reduce administrative overhead, it's recommended that you create managed private endpoints to scan all supported Azure data sources.
> [!WARNING]
-> If an Azure PaaS data store (Blob, Azure Data Lake Storage Gen2, Azure Synapse Analytics) has a private endpoint already created against it, and even if it allows access from all networks, Azure Purview would only be able to access it using a managed private endpoint. If a private endpoint does not already exist, you must create one in such scenarios.
+> If an Azure PaaS data store (Blob, Azure Data Lake Storage Gen2, Azure Synapse Analytics) has a private endpoint already created against it, and even if it allows access from all networks, Microsoft Purview would only be able to access it using a managed private endpoint. If a private endpoint does not already exist, you must create one in such scenarios.
-A private endpoint connection is created in a "Pending" state when you create a managed private endpoint in Azure Purview. An approval workflow is initiated. The private link resource owner is responsible to approve or reject the connection.
+A private endpoint connection is created in a "Pending" state when you create a managed private endpoint in Microsoft Purview. An approval workflow is initiated. The private link resource owner is responsible to approve or reject the connection.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-approval.png" alt-text="approval for managed private endpoint":::
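The pending-then-approve workflow described above can be sketched as a small state machine. This is an illustrative model only, not a Purview or Azure SDK API; the class name, methods, and resource name are assumptions:

```python
# Illustrative model of a managed private endpoint connection's approval workflow.
# A connection is created in the "Pending" state; the private link resource owner
# then either approves or rejects it. Not a real Purview/Azure SDK class.

class ManagedPrivateEndpointConnection:
    def __init__(self, target_resource):
        self.target_resource = target_resource
        self.state = "Pending"  # created in "Pending", awaiting the owner's action

    def approve(self):
        if self.state != "Pending":
            raise ValueError(f"cannot approve a connection in state {self.state}")
        self.state = "Approved"

    def reject(self):
        if self.state != "Pending":
            raise ValueError(f"cannot reject a connection in state {self.state}")
        self.state = "Rejected"

conn = ManagedPrivateEndpointConnection("contoso-datalake")  # hypothetical resource
assert conn.state == "Pending"
conn.approve()  # the resource owner approves the connection
assert conn.state == "Approved"
```

Once a connection leaves "Pending", no further transitions are allowed, which mirrors the one-shot approval step in the portal.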
Interactive authoring capabilities are used for functionalities like testing a connection, browsing folder and table lists, getting schemas, and previewing data.
### Prerequisites
-Before deploying a Managed VNet and Managed VNet Runtime for an Azure Purview account, ensure you meet the following prerequisites:
+Before deploying a Managed VNet and Managed VNet Runtime for a Microsoft Purview account, ensure you meet the following prerequisites:
-1. An Azure Purview account deployed in one of the [supported regions](#supported-regions).
-2. From Azure Purview roles, you must be a data curator at root collection level in your Azure Purview account.
-3. From Azure RBAC roles, you must be contributor on the Azure Purview account and data source to approve private links.
+1. A Microsoft Purview account deployed in one of the [supported regions](#supported-regions).
+2. From Microsoft Purview roles, you must be a data curator at root collection level in your Microsoft Purview account.
+3. From Azure RBAC roles, you must be contributor on the Microsoft Purview account and data source to approve private links.
### Deploy Managed VNet Runtimes

> [!NOTE]
> The following guide shows how to register and scan an Azure Data Lake Storage Gen 2 using the Managed VNet Runtime.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_.
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_.
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-azure-portal.png" alt-text="Screenshot that shows the Azure Purview account":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-azure-portal.png" alt-text="Screenshot that shows the Microsoft Purview account":::
-2. **Open Azure Purview Studio** and navigate to the **Data Map --> Integration runtimes**.
+2. **Open Microsoft Purview Studio** and navigate to the **Data Map --> Integration runtimes**.
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-vnet.png" alt-text="Screenshot that shows Azure Purview Data Map menus":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-vnet.png" alt-text="Screenshot that shows Microsoft Purview Data Map menus":::
3. From the **Integration runtimes** page, select the **+ New** icon to create a new runtime. Select Azure and then select **Continue**.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-ir-region.png" alt-text="Screenshot that shows to create a Managed VNet Runtime":::
-5. Deploying the Managed VNet Runtime for the first time triggers multiple workflows in Azure Purview Studio for creating managed private endpoints for Azure Purview and its Managed Storage Account. Click on each workflow to approve the private endpoint for the corresponding Azure resource.
+5. Deploying the Managed VNet Runtime for the first time triggers multiple workflows in Microsoft Purview Studio for creating managed private endpoints for Microsoft Purview and its Managed Storage Account. Click on each workflow to approve the private endpoint for the corresponding Azure resource.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-ir-workflows.png" alt-text="Screenshot that shows deployment of a Managed VNet Runtime":::
-6. In Azure portal, from your Azure Purview account resource blade, approve the managed private endpoint. From Managed storage account blade approve the managed private endpoints for blob and queue
+6. In the Azure portal, from your Microsoft Purview account resource blade, approve the managed private endpoint. From the Managed storage account blade, approve the managed private endpoints for blob and queue.
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Azure Purview":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Microsoft Purview":::
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview-approved.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Azure Purview - approved":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-purview-approved.png" alt-text="Screenshot that shows how to approve a managed private endpoint for Microsoft Purview - approved":::
:::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-managed-storage.png" alt-text="Screenshot that shows how to approve a managed private endpoint for managed storage account":::
7. From **Management**, select **Managed private endpoints** to validate that all managed private endpoints are successfully deployed and approved. All private endpoints must be approved.
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list.png" alt-text="Screenshot that shows managed private endpoints in Azure Purview":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list.png" alt-text="Screenshot that shows managed private endpoints in Microsoft Purview":::
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-approved.png" alt-text="Screenshot that shows managed private endpoints in Azure Purview - approved ":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-approved.png" alt-text="Screenshot that shows managed private endpoints in Microsoft Purview - approved ":::
### Deploy managed private endpoints for data sources
To scan any data sources using the Managed VNet Runtime, you need to deploy and approve a managed private endpoint for each data source.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-data-source-pe-azure-approved.png" alt-text="Screenshot that shows approved private endpoint for data sources in Azure portal":::
-7. Inside Azure Purview Studio, the managed private endpoint must be shown as approved as well.
+7. Inside Microsoft Purview Studio, the managed private endpoint must be shown as approved as well.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-2.png" alt-text="Screenshot that shows managed private endpoints including data sources' in purview studio":::

### Register and scan a data source using Managed VNet Runtime

#### Register data source
-It is important to register the data source in Azure Purview prior to setting up a scan for the data source. Follow these steps to register data source if you haven't yet registered it.
+It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source. Follow these steps to register the data source if you haven't yet registered it.
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. Select **Data Map** on the left menu.
1. Select **Register**.
2. On **Register sources**, select your data source.
3. In the **Select a collection** box, select a collection.
4. Select **Register** to register the data sources.
-For more information, see [Manage data sources in Azure Purview](manage-data-sources.md).
+For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md).
#### Scan data source
-You can use any of the following options to scan data sources using Azure Purview Managed VNet Runtime:
+You can use any of the following options to scan data sources using Microsoft Purview Managed VNet Runtime:
-- [Using Managed Identity](#scan-using-managed-identity) (Recommended) - As soon as the Azure Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Azure Purview system-assigned managed identity (SAMI) to perform the scans.
+- [Using Managed Identity](#scan-using-managed-identity) (Recommended) - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in the Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview system-assigned managed identity (SAMI) to perform the scans.
- [Using other authentication options](#scan-using-other-authentication-options):
- - Account Key or SQL Authentication- Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Azure Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
+ - Account Key or SQL Authentication- Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Microsoft Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
- Service Principal - In this method, you can create a new service principal or use an existing one in your Azure Active Directory tenant.

##### Scan using Managed Identity
-To scan a data source using a Managed VNet Runtime and Azure Purview managed identity perform these steps:
+To scan a data source using a Managed VNet Runtime and the Microsoft Purview managed identity, perform these steps:
-1. Select the **Data Map** tab on the left pane in the Azure Purview Studio.
+1. Select the **Data Map** tab on the left pane in the Microsoft Purview Studio.
1. Select the data source that you registered.
##### Scan using other authentication options
-You can also use other supported options to scan data sources using Azure Purview Managed Runtime. This requires setting up a private connection to Azure Key Vault where the secret is stored.
+You can also use other supported options to scan data sources using Microsoft Purview Managed Runtime. This requires setting up a private connection to Azure Key Vault where the secret is stored.
To set up a scan using Account Key or SQL Authentication follow these steps:
-1. [Grant Azure Purview access to your Azure Key Vault](manage-credentials.md#grant-azure-purview-access-to-your-azure-key-vault).
+1. [Grant Microsoft Purview access to your Azure Key Vault](manage-credentials.md#grant-microsoft-purview-access-to-your-azure-key-vault).
-2. [Create a new credential in Azure Purview](manage-credentials.md#create-a-new-credential).
+2. [Create a new credential in Microsoft Purview](manage-credentials.md#create-a-new-credential).
3. Navigate to **Management**, and select **Managed private endpoints**.
6. Provide a name for the managed private endpoint, select the Azure subscription and the Azure Key Vault from the drop down lists. Select **create**.
- :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-create.png" alt-text="Screenshot that shows how to create a managed private endpoint for Azure Key Vault in Azure Purview Studio":::
+ :::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-create.png" alt-text="Screenshot that shows how to create a managed private endpoint for Azure Key Vault in Microsoft Purview Studio":::
7. From the list of managed private endpoints, click on the newly created managed private endpoint for your Azure Key Vault and then click on **Manage approvals in the Azure portal**, to approve the private endpoint in Azure portal.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-key-vault-az-approved.png" alt-text="Screenshot that shows approved private endpoint for Azure Key Vault in Azure portal":::
-9. Inside Azure Purview Studio, the managed private endpoint must be shown as approved as well.
+9. Inside Microsoft Purview Studio, the managed private endpoint must be shown as approved as well.
:::image type="content" source="media/catalog-managed-vnet/purview-managed-pe-list-3.png" alt-text="Screenshot that shows managed private endpoints including Azure Key Vault in purview studio":::
-10. Select the **Data Map** tab on the left pane in the Azure Purview Studio.
+10. Select the **Data Map** tab on the left pane in the Microsoft Purview Studio.
11. Select the data source that you registered.
## Next steps

-- [Manage data sources in Azure Purview](manage-data-sources.md)
+- [Manage data sources in Microsoft Purview](manage-data-sources.md)
purview Catalog Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-permissions.md
Title: Understand access and permissions
-description: This article gives an overview permission, access control, and collections in Azure Purview. Role-based access control (RBAC) is managed within Azure Purview itself, so this guide will cover the basics to secure your information.
+description: This article gives an overview of permissions, access control, and collections in Microsoft Purview. Role-based access control (RBAC) is managed within Microsoft Purview itself, so this guide will cover the basics to secure your information.
Last updated 03/09/2022
-# Access control in Azure Purview
+# Access control in Microsoft Purview
-Azure Purview uses **Collections** to organize and manage access across its sources, assets, and other artifacts. This article describes collections and access management in your Azure Purview account.
+Microsoft Purview uses **Collections** to organize and manage access across its sources, assets, and other artifacts. This article describes collections and access management in your Microsoft Purview account.
## Collections
-A collection is a tool Azure Purview uses to group assets, sources, and other artifacts into a hierarchy for discoverability and to manage access control. All accesses to Azure Purview's resources are managed from collections in the Azure Purview account itself.
+A collection is a tool Microsoft Purview uses to group assets, sources, and other artifacts into a hierarchy for discoverability and to manage access control. All accesses to Microsoft Purview's resources are managed from collections in the Microsoft Purview account itself.
> [!NOTE]
> As of November 8th, 2021, ***Insights*** is accessible to Data Curators. Data Readers do not have access to Insights.

## Roles
-Azure Purview uses a set of predefined roles to control who can access what within the account. These roles are currently:
+Microsoft Purview uses a set of predefined roles to control who can access what within the account. These roles are currently:
-- **Collection administrator** - a role for users that will need to assign roles to other users in Azure Purview or manage collections. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
+- **Collection administrator** - a role for users that will need to assign roles to other users in Microsoft Purview or manage collections. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
- **Data curators** - a role that provides access to the data catalog to manage assets, configure custom classifications, set up glossary terms, and view insights. Data curators can create, read, modify, move, and delete assets. They can also apply annotations to assets.
- **Data readers** - a role that provides read-only access to data assets, classifications, classification rules, collections, and glossary terms.
- **Data source administrator** - a role that allows a user to manage data sources and scans. If a user is granted only the **Data source admin** role on a given data source, they can run new scans using an existing scan rule. To create new scan rules, the user must also be granted either the **Data reader** or **Data curator** role.
-- **Policy author (Preview)** - a role that allows a user to view, update, and delete Azure Purview policies through the policy management app within Azure Purview.
-- **Workflow administrator** - a role that allows a user to access the workflow authoring page in the Azure Purview studio, and publish workflows on collections where they have access permissions. Workflow administrator only has access to authoring, and so will need at least Data reader permission on a collection to be able to access the Purview Studio.
+- **Policy author (Preview)** - a role that allows a user to view, update, and delete Microsoft Purview policies through the policy management app within Microsoft Purview.
+- **Workflow administrator** - a role that allows a user to access the workflow authoring page in the Microsoft Purview studio, and publish workflows on collections where they have access permissions. Workflow administrator only has access to authoring, and so will need at least Data reader permission on a collection to be able to access the Purview Studio.
> [!NOTE]
-> At this time, Azure Purview Policy author role is not sufficient to create policies. The Azure Purview Data source admin role is also required.
+> At this time, Microsoft Purview Policy author role is not sufficient to create policies. The Microsoft Purview Data source admin role is also required.
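The read/write split among the predefined roles above can be sketched as a simple capability lookup. The role names come from this article, but the boolean capability granularity below is an illustrative simplification, not an official permission matrix:

```python
# Rough capability map for the predefined Purview roles described above.
# The capability names are illustrative assumptions; consult the role table
# in the article for the authoritative breakdown.

ROLE_CAPABILITIES = {
    "Collection administrator": {"assign_roles", "manage_collections"},
    "Data curator": {"read_catalog", "edit_assets", "view_insights"},
    "Data reader": {"read_catalog"},
    "Data source administrator": {"manage_sources", "run_scans"},
    "Policy author (Preview)": {"manage_policies"},
    "Workflow administrator": {"author_workflows"},
}

def can(role, capability):
    """Return True if the given role grants the given capability."""
    return capability in ROLE_CAPABILITIES.get(role, set())

assert can("Data reader", "read_catalog")
assert not can("Data reader", "edit_assets")  # readers have read-only access
```

A lookup like this makes it easy to see, for example, why a Policy author also needs the Data source admin role to actually publish policies: neither role alone carries all the needed capabilities.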
## Who should be assigned to what role?
|I need to edit information about assets, assign classifications, associate them with glossary entries, and so on.|Data curator|
|I need to edit the glossary or set up new classification definitions|Data curator|
|I need to view Insights to understand the governance posture of my data estate|Data curator|
-|My application's Service Principal needs to push data to Azure Purview|Data curator|
-|I need to set up scans via the Azure Purview Studio|Data curator on the collection **or** data curator **and** data source administrator where the source is registered.|
-|I need to enable a Service Principal or group to set up and monitor scans in Azure Purview without allowing them to access the catalog's information |Data source administrator|
-|I need to put users into roles in Azure Purview | Collection administrator |
+|My application's Service Principal needs to push data to Microsoft Purview|Data curator|
+|I need to set up scans via the Microsoft Purview Studio|Data curator on the collection **or** data curator **and** data source administrator where the source is registered.|
+|I need to enable a Service Principal or group to set up and monitor scans in Microsoft Purview without allowing them to access the catalog's information |Data source administrator|
+|I need to put users into roles in Microsoft Purview | Collection administrator |
|I need to create and publish access policies | Data source administrator and policy author |
-|I need to create workflows for my Azure Purview account | Workflow administrator |
+|I need to create workflows for my Microsoft Purview account | Workflow administrator |
>[!NOTE]
> **\*Data source administrator permissions on Policies** - Data source administrators are also able to publish data policies.
-## Understand how to use Azure Purview's roles and collections
+## Understand how to use Microsoft Purview's roles and collections
-All access control is managed in Azure Purview's collections. Azure Purview's collections can be found in the [Azure Purview Studio](https://web.purview.azure.com/resource/). Open your Azure Purview account in the [Azure portal](https://portal.azure.com) and select the Azure Purview Studio tile on the Overview page. From there, navigate to the data map on the left menu, and then select the 'Collections' tab.
+All access control is managed in Microsoft Purview's collections. Microsoft Purview's collections can be found in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/). Open your Microsoft Purview account in the [Azure portal](https://portal.azure.com) and select the Microsoft Purview Studio tile on the Overview page. From there, navigate to the data map on the left menu, and then select the 'Collections' tab.
-When an Azure Purview account is created, it starts with a root collection that has the same name as the Azure Purview account itself. The creator of the Azure Purview account is automatically added as a Collection Admin, Data Source Admin, Data Curator, and Data Reader on this root collection, and can edit and manage this collection.
+When a Microsoft Purview account is created, it starts with a root collection that has the same name as the Microsoft Purview account itself. The creator of the Microsoft Purview account is automatically added as a Collection Admin, Data Source Admin, Data Curator, and Data Reader on this root collection, and can edit and manage this collection.
-Sources, assets, and objects can be added directly to this root collection, but so can other collections. Adding collections will give you more control over who has access to data across your Azure Purview account.
+Sources, assets, and objects can be added directly to this root collection, but so can other collections. Adding collections will give you more control over who has access to data across your Microsoft Purview account.
-All other users can only access information within the Azure Purview account if they, or a group they're in, are given one of the above roles. This means, when you create an Azure Purview account, no one but the creator can access or use its APIs until they're [added to one or more of the above roles in a collection](how-to-create-and-manage-collections.md#add-role-assignments).
+All other users can only access information within the Microsoft Purview account if they, or a group they're in, are given one of the above roles. This means, when you create a Microsoft Purview account, no one but the creator can access or use its APIs until they're [added to one or more of the above roles in a collection](how-to-create-and-manage-collections.md#add-role-assignments).
Users can only be added to a collection by a collection admin, or through permissions inheritance. The permissions of a parent collection are automatically inherited by its subcollections. However, you can choose to [restrict permission inheritance](how-to-create-and-manage-collections.md#restrict-inheritance) on any collection. If you do this, its subcollections will no longer inherit permissions from the parent and will need to be added directly, though collection admins that are automatically inherited from a parent collection can't be removed.
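The inheritance rules in the paragraph above (subcollections inherit parent assignments, restriction cuts off everything except inherited collection admins) can be sketched as a small model. The class, role strings, and names here are illustrative assumptions, not a Purview API:

```python
# Sketch of collection permission inheritance in Purview's collection hierarchy.
# Subcollections inherit the parent's role assignments unless inheritance is
# restricted; inherited collection admins cannot be removed. Illustrative only.

class Collection:
    def __init__(self, name, parent=None, restrict_inheritance=False):
        self.name = name
        self.parent = parent
        self.restrict_inheritance = restrict_inheritance
        self.direct = {}  # role -> set of users assigned directly on this collection

    def assign(self, role, user):
        self.direct.setdefault(role, set()).add(user)

    def effective(self, role):
        """Users holding the role here, directly or via inheritance."""
        users = set(self.direct.get(role, set()))
        if self.parent and (not self.restrict_inheritance or role == "Collection admin"):
            users |= self.parent.effective(role)  # admins always flow down
        return users

root = Collection("Contoso")
root.assign("Collection admin", "alice")
root.assign("Data reader", "readers-group")
revenue = Collection("Revenue", parent=root, restrict_inheritance=True)
revenue.assign("Data reader", "bob")

assert "readers-group" in root.effective("Data reader")
assert revenue.effective("Data reader") == {"bob"}       # inheritance restricted
assert "alice" in revenue.effective("Collection admin")  # admins can't be removed
```

This mirrors the Revenue example later in the article: restricting inheritance on a sensitive subcollection removes the broadly granted reader group while the root collection admin retains access.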
-You can assign Azure Purview roles to users, security groups and service principals from your Azure Active Directory that is associated with your purview account's subscription.
+You can assign Microsoft Purview roles to users, security groups, and service principals from the Azure Active Directory tenant that is associated with your Purview account's subscription.
## Assign permissions to your users
-After creating an Azure Purview account, the first thing to do is create collections and assign users to roles within those collections.
+After creating a Microsoft Purview account, the first thing to do is create collections and assign users to roles within those collections.
> [!NOTE]
-> If you created your Azure Purview account using a service principal, to be able to access the Azure Purview Studio and assign permissions to users, you will need to grant a user collection admin permissions on the root collection.
+> If you created your Microsoft Purview account using a service principal, to be able to access the Microsoft Purview Studio and assign permissions to users, you will need to grant a user collection admin permissions on the root collection.
> You can use [this Azure CLI command](/cli/azure/purview/account#az-purview-account-add-root-collection-admin):
>
> ```azurecli
-> az purview account add-root-collection-admin --account-name [Azure Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
+> az purview account add-root-collection-admin --account-name [Microsoft Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
> ```
> The object-id is optional. For more information and an example, see the [CLI command reference page](/cli/azure/purview/account#az-purview-account-add-root-collection-admin).

### Create collections
-Collections can be customized for structure of the sources in your Azure Purview account, and can act like organized storage bins for these resources. When you're thinking about the collections you might need, consider how your users will access or discover information. Are your sources broken up by departments? Are there specialized groups within those departments that will only need to discover some assets? Are there some sources that should be discoverable by all your users?
+Collections can be customized to the structure of the sources in your Microsoft Purview account, and can act like organized storage bins for these resources. When you're thinking about the collections you might need, consider how your users will access or discover information. Are your sources broken up by departments? Are there specialized groups within those departments that will only need to discover some assets? Are there some sources that should be discoverable by all your users?
This will inform the collections and subcollections you may need to most effectively organize your data map.
Now that we have a base understanding of collections, permissions, and how they work together, let's look at an example.
This is one way an organization might structure their data: Starting with their root collection (Contoso, in this example), collections are organized into regions, and then into departments and subdepartments. Data sources and assets can be added to any one of these collections to organize data resources by these regions and departments, and to manage access control along those lines. There's one subdepartment, Revenue, that has strict access guidelines, so permissions will need to be tightly managed.
-The [data reader role](#roles) can access information within the catalog, but not manage or edit it. So for our example above, adding the Data Reader permission to a group on the root collection and allowing inheritance will give all users in that group reader permissions on Azure Purview sources and assets. This makes these resources discoverable, but not editable, by everyone in that group. [Restricting inheritance](how-to-create-and-manage-collections.md#restrict-inheritance) on the Revenue group will control access to those assets. Users who need access to revenue information can be added separately to the Revenue collection.
+The [data reader role](#roles) can access information within the catalog, but not manage or edit it. So for our example above, adding the Data Reader permission to a group on the root collection and allowing inheritance will give all users in that group reader permissions on Microsoft Purview sources and assets. This makes these resources discoverable, but not editable, by everyone in that group. [Restricting inheritance](how-to-create-and-manage-collections.md#restrict-inheritance) on the Revenue group will control access to those assets. Users who need access to revenue information can be added separately to the Revenue collection.
Similarly with the Data Curator and Data Source Admin roles, permissions for those groups will start at the collection where they're assigned and trickle down to subcollections that haven't restricted inheritance. Below we have assigned permissions for several groups at collection levels in the Americas subcollection.

:::image type="content" source="./media/catalog-permissions/collection-permissions-example.png" alt-text="Chart showing a sample collections hierarchy broken up by region and department showing permissions distribution." border="true":::

### Add users to roles
-Role assignment is managed through the collections. Only a user with the [collection admin role](#roles) can grant permissions to other users on that collection. When new permissions need to be added, a collection admin will access the [Azure Purview Studio](https://web.purview.azure.com/resource/), navigate to data map, then the collections tab, and select the collection where a user needs to be added. From the Role Assignments tab they'll be able to add and manage users who need permissions.
+Role assignment is managed through the collections. Only a user with the [collection admin role](#roles) can grant permissions to other users on that collection. When new permissions need to be added, a collection admin will access the [Microsoft Purview Studio](https://web.purview.azure.com/resource/), navigate to data map, then the collections tab, and select the collection where a user needs to be added. From the Role Assignments tab they'll be able to add and manage users who need permissions.
For full instructions, see our [how-to guide for adding role assignments](how-to-create-and-manage-collections.md#add-role-assignments).

## Next steps
-Now that you have a base understanding of collections, and access control, follow the guides below to create and manage those collections, or get started with registering sources into your Azure Purview Resource.
+Now that you have a base understanding of collections, and access control, follow the guides below to create and manage those collections, or get started with registering sources into your Microsoft Purview Resource.
- [How to create and manage collections](how-to-create-and-manage-collections.md)
-- [Azure Purview supported data sources](azure-purview-connector-overview.md)
+- [Microsoft Purview supported data sources](azure-purview-connector-overview.md)
purview Catalog Private Link Account Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-account-portal.md
Title: Connect privately and securely to your Azure Purview account
-description: This article describes how you can set up a private endpoint to connect to your Azure Purview account from restricted network.
+ Title: Connect privately and securely to your Microsoft Purview account
+description: This article describes how you can set up a private endpoint to connect to your Microsoft Purview account from restricted network.
Last updated 09/27/2021
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account for secure access.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account for secure access.
-# Connect privately and securely to your Azure Purview account
-In this guide, you will learn how to deploy private endpoints for your Azure Purview account to allow you to connect to your Azure Purview account only from VNets and private networks. To achieve this goal, you need to deploy _account_ and _portal_ private endpoints for your Azure Purview account.
+# Connect privately and securely to your Microsoft Purview account
+In this guide, you will learn how to deploy private endpoints for your Microsoft Purview account to allow you to connect to your Microsoft Purview account only from VNets and private networks. To achieve this goal, you need to deploy _account_ and _portal_ private endpoints for your Microsoft Purview account.
-The Azure Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Azure Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
+The Microsoft Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-The Azure Purview _portal_ private endpoint is required to enable connectivity to [Azure Purview Studio](https://web.purview.azure.com/resource/) using a private network.
+The Microsoft Purview _portal_ private endpoint is required to enable connectivity to [Microsoft Purview Studio](https://web.purview.azure.com/resource/) using a private network.
> [!NOTE] > If you only create _account_ and _portal_ private endpoints, you won't be able to run any scans. To enable scanning on a private network, you will need to [also create an ingestion private endpoint](catalog-private-link-end-to-end.md).
- :::image type="content" source="media/catalog-private-link/purview-private-link-account-portal.png" alt-text="Diagram that shows Azure Purview and Private Link architecture.":::
+ :::image type="content" source="media/catalog-private-link/purview-private-link-account-portal.png" alt-text="Diagram that shows Microsoft Purview and Private Link architecture.":::
To learn more about the Azure Private Link service, see [private links and private endpoints](../private-link/private-endpoint-overview.md). ## Deployment checklist
-Using one of the deployment options from this guide, you can deploy a new Azure Purview account with _account_ and _portal_ private endpoints or you can choose to deploy these private endpoints for an existing Azure Purview account:
+Using one of the deployment options from this guide, you can deploy a new Microsoft Purview account with _account_ and _portal_ private endpoints or you can choose to deploy these private endpoints for an existing Microsoft Purview account:
-1. Choose an appropriate Azure virtual network and a subnet to deploy Azure Purview private endpoints. Select one of the following options:
+1. Choose an appropriate Azure virtual network and a subnet to deploy Microsoft Purview private endpoints. Select one of the following options:
- Deploy a [new virtual network](../virtual-network/quick-create-portal.md) in your Azure subscription. - Locate an existing Azure virtual network and a subnet in your Azure subscription.
-2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so Azure Purview account and web portal can be accessible through private IP addresses. You can use any of the following options:
+2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so that the Microsoft Purview account and web portal are accessible through private IP addresses. You can use any of the following options:
- Deploy new Azure DNS zones using the steps explained further in this guide. - Add required DNS records to existing Azure DNS zones using the steps explained further in this guide. - After completing the steps in this guide, add required DNS A records in your existing DNS servers manually.
-3. Deploy a [new Azure Purview account](#option-1deploy-a-new-azure-purview-account-with-account-and-portal-private-endpoints) with account and portal private endpoints, or deploy account and portal private endpoints for an [existing Azure Purview account](#option-2enable-account-and-portal-private-endpoint-on-existing-azure-purview-accounts).
+3. Deploy a [new Microsoft Purview account](#option-1deploy-a-new-microsoft-purview-account-with-account-and-portal-private-endpoints) with account and portal private endpoints, or deploy account and portal private endpoints for an [existing Microsoft Purview account](#option-2enable-account-and-portal-private-endpoint-on-existing-microsoft-purview-accounts).
4. [Enable access to Azure Active Directory](#enable-access-to-azure-active-directory) if your private network has network security group rules set to deny for all public internet traffic. 5. After completing this guide, adjust DNS configurations if needed.
-6. Validate your network and name resolution from management machine to Azure Purview.
+6. Validate your network and name resolution from the management machine to Microsoft Purview.
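If you choose the Azure DNS zone options in the checklist above, the zone names matter. As a reference sketch (verify the current names against the linked DNS name-resolution guide), the account and portal private endpoints resolve through these private DNS zones:

```
privatelink.purview.azure.com        # account sub-resource
privatelink.purviewstudio.azure.com  # portal sub-resource
```

If you manage your own DNS servers instead, add A records for your account in these namespaces that point at the private endpoint IP addresses.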
-## Option 1 - Deploy a new Azure Purview account with _account_ and _portal_ private endpoints
+## Option 1 - Deploy a new Microsoft Purview account with _account_ and _portal_ private endpoints
-1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Azure Purview accounts** page. Select **+ Create** to create a new Azure Purview account.
+1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Microsoft Purview accounts** page. Select **+ Create** to create a new Microsoft Purview account.
2. Fill in the basic information, and on the **Networking** tab, set the connectivity method to **Private endpoint**. Set enable private endpoint to **Account and Portal only**.
-3. Under **Account and portal** select **+ Add** to add a private endpoint for your Azure Purview account.
+3. Under **Account and portal** select **+ Add** to add a private endpoint for your Microsoft Purview account.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-account-portal.png" alt-text="Screenshot that shows create private endpoint for account and portal page selections.":::
-4. On the **Create a private endpoint** page, for **Azure Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
+4. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-account.png" alt-text="Screenshot that shows create account private endpoint page.":::
Using one of the deployment options from this guide, you can deploy a new Azure
5. Select **OK**.
-6. In **Create Azure Purview account** wizard, select **+Add** again to add _portal_ private endpoint.
+6. In the **Create Microsoft Purview account** wizard, select **+Add** again to add the _portal_ private endpoint.
-7. On the **Create a private endpoint** page, for **Azure Purview sub-resource**,choose your location, provide a name for _portal_ private endpoint and select **portal**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
+7. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for the _portal_ private endpoint, and select **portal**. Under **networking**, select your virtual network and subnet, and optionally select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-portal.png" alt-text="Screenshot that shows create portal private endpoint page.":::
Using one of the deployment options from this guide, you can deploy a new Azure
10. When you see the "Validation passed" message, select **Create**.
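For script-based deployments, the portal steps in Option 1 can be approximated with the Azure CLI. This is a hedged sketch, not the article's procedure: it assumes the `purview` CLI extension is installed, and every angle-bracketed name is a placeholder you must replace with your own values:

```azurecli
# Look up the resource ID of an existing Microsoft Purview account
# (requires: az extension add --name purview).
purviewId=$(az purview account show \
  --name <purview-account> \
  --resource-group <resource-group> \
  --query id --output tsv)

# Account private endpoint (--group-id account).
az network private-endpoint create \
  --name <account-pe-name> \
  --resource-group <resource-group> \
  --vnet-name <vnet-name> \
  --subnet <subnet-name> \
  --private-connection-resource-id "$purviewId" \
  --group-id account \
  --connection-name <account-connection-name>

# Portal private endpoint (--group-id portal). Create the account
# endpoint first; it is a prerequisite for the portal endpoint.
az network private-endpoint create \
  --name <portal-pe-name> \
  --resource-group <resource-group> \
  --vnet-name <vnet-name> \
  --subnet <subnet-name> \
  --private-connection-resource-id "$purviewId" \
  --group-id portal \
  --connection-name <portal-connection-name>
```

Private DNS zone integration is not shown here; configure it separately, or create the DNS records manually as described in the deployment checklist.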
-## Option 2 - Enable _account_ and _portal_ private endpoint on existing Azure Purview accounts
+## Option 2 - Enable _account_ and _portal_ private endpoint on existing Microsoft Purview accounts
-There are two ways you can add Azure Purview _account_ and _portal_ private endpoints for an existing Azure Purview account:
+There are two ways you can add Microsoft Purview _account_ and _portal_ private endpoints for an existing Microsoft Purview account:
-- Use the Azure portal (Azure Purview account).
+- Use the Azure portal (Microsoft Purview account).
- Use the Private Link Center.
-### Use the Azure portal (Azure Purview account)
+### Use the Azure portal (Microsoft Purview account)
-1. Go to the [Azure portal](https://portal.azure.com), and then select your Azure Purview account, and under **Settings** select **Networking**, and then select **Private endpoint connections**.
+1. Go to the [Azure portal](https://portal.azure.com), and then select your Microsoft Purview account, and under **Settings** select **Networking**, and then select **Private endpoint connections**.
:::image type="content" source="media/catalog-private-link/purview-pe-add-to-existing.png" alt-text="Screenshot that shows creating an account private endpoint.":::
There are two ways you can add Azure Purview _account_ and _portal_ private endp
4. On the **Resource** tab, for **Resource type**, select **Microsoft.Purview/accounts**.
-5. For **Resource**, select the Azure Purview account, and for **Target sub-resource**, select **account**.
+5. For **Resource**, select the Microsoft Purview account, and for **Target sub-resource**, select **account**.
6. On the **Configuration** tab, select the virtual network and optionally, select Azure Private DNS zone to create a new Azure DNS Zone.
There are two ways you can add Azure Purview _account_ and _portal_ private endp
:::image type="content" source="media/catalog-private-link/private-link-center.png" alt-text="Screenshot that shows creating private endpoints from the Private Link Center.":::
-1. For **Resource**, select the already created Azure Purview account. For **Target sub-resource**, select **account**.
+1. For **Resource**, select the already created Microsoft Purview account. For **Target sub-resource**, select **account**.
1. On the **Configuration** tab, select the virtual network and private DNS zone. Go to the summary page, and select **Create** to create the account private endpoint.
There are two ways you can add Azure Purview _account_ and _portal_ private endp
## Enable access to Azure Active Directory > [!NOTE]
-> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Azure Purview portal and the Azure Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
+> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Microsoft Purview portal and the Microsoft Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
-These instructions are provided for accessing Azure Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
+These instructions are provided for accessing Microsoft Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
1. Go to your VM in the Azure portal, and under **Settings**, select **Networking**. Then select **Outbound port rules**, **Add outbound port rule**.
These instructions are provided for accessing Azure Purview securely from an Azu
:::image type="content" source="media/catalog-private-link/aadcdn-rule.png" alt-text="Screenshot that shows the Azure A D Content Delivery Network rule.":::
-1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Azure Purview portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
+1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Microsoft Purview portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
Check the networking requests on the browser **Networking** tab to see which domain's requests are getting blocked, redo the previous step to get its IP, and add outbound port rules in the network security group to allow requests for that IP. If possible, add the URL and IP to the VM's host file to fix the DNS resolution. If you know the exact sign-in domain's IP ranges, you can also directly add them into networking rules.
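To make the hosts-file workaround above concrete, the entries take the usual hosts-file format. The IP addresses below are placeholders, not real values; resolve the blocked domain from the browser's **Networking** tab first:

```
# /etc/hosts (Linux) or C:\Windows\System32\drivers\etc\hosts (Windows)
<sign-in-domain-ip>   live.com
<sign-in-domain-ip>   msft.sts.microsoft.com
```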
-1. Now your Azure AD sign-in should be successful. The Azure Purview portal will load successfully, but listing all the Azure Purview accounts won't work because it can only access a specific Azure Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Azure Purview account that you successfully set up a private endpoint for.
+1. Now your Azure AD sign-in should be successful. The Microsoft Purview portal will load successfully, but listing all the Microsoft Purview accounts won't work because it can only access a specific Microsoft Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Microsoft Purview account that you successfully set up a private endpoint for.
## Next steps - [Verify resolution for private endpoints](./catalog-private-link-name-resolution.md)-- [Manage data sources in Azure Purview](./manage-data-sources.md)-- [Troubleshooting private endpoint configuration for your Azure Purview account](./catalog-private-link-troubleshoot.md)
+- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
+- [Troubleshooting private endpoint configuration for your Microsoft Purview account](./catalog-private-link-troubleshoot.md)
purview Catalog Private Link End To End https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-end-to-end.md
Title: Connect to your Azure Purview and scan data sources privately and securely
-description: This article describes how you can set up a private endpoint to connect to your Azure Purview account and scan data sources from restricted network for an end to end isolation
+ Title: Connect to your Microsoft Purview and scan data sources privately and securely
+description: This article describes how you can set up a private endpoint to connect to your Microsoft Purview account and scan data sources from a restricted network for end-to-end isolation
Last updated 01/12/2022
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account to access purview account and scan data sources from restricted network.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account so I can access the account and scan data sources from a restricted network.
-# Connect to your Azure Purview and scan data sources privately and securely
+# Connect to your Microsoft Purview and scan data sources privately and securely
-In this guide, you will learn how to deploy _account_, _portal_ and _ingestion_ private endpoints for your Azure Purview account to access purview account and scan data sources using a self-hosted integration runtime securely and privately, thereby enabling end-to-end network isolation.
+In this guide, you will learn how to deploy _account_, _portal_, and _ingestion_ private endpoints for your Microsoft Purview account so that you can access the account and scan data sources through a self-hosted integration runtime securely and privately, enabling end-to-end network isolation.
-The Azure Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Azure Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
+The Microsoft Purview _account_ private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-The Azure Purview _portal_ private endpoint is required to enable connectivity to [Azure Purview Studio](https://web.purview.azure.com/resource/) using a private network.
+The Microsoft Purview _portal_ private endpoint is required to enable connectivity to [Microsoft Purview Studio](https://web.purview.azure.com/resource/) using a private network.
-Azure Purview can scan data sources in Azure or an on-premises environment by using _ingestion_ private endpoints. Three private endpoint resources are required to be deployed and linked to Azure Purview managed resources when ingestion private endpoint is deployed:
+Microsoft Purview can scan data sources in Azure or an on-premises environment by using _ingestion_ private endpoints. Three private endpoint resources must be deployed and linked to Microsoft Purview managed resources when the ingestion private endpoint is deployed:
+ - Blob private endpoint is linked to a Microsoft Purview managed storage account.
+ - Queue private endpoint is linked to a Microsoft Purview managed storage account.
+ - Namespace private endpoint is linked to a Microsoft Purview managed Event Hubs namespace.
- :::image type="content" source="media/catalog-private-link/purview-private-link-architecture.png" alt-text="Diagram that shows Azure Purview and Private Link architecture.":::
+ :::image type="content" source="media/catalog-private-link/purview-private-link-architecture.png" alt-text="Diagram that shows Microsoft Purview and Private Link architecture.":::
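Because the ingestion private endpoints above target Purview's managed resources, name resolution for them uses the storage and Event Hubs private DNS zones rather than the Purview zones. A typical mapping (confirm against the DNS name-resolution guide linked later in this article) is:

```
privatelink.blob.core.windows.net     # blob endpoint      -> managed storage account
privatelink.queue.core.windows.net    # queue endpoint     -> managed storage account
privatelink.servicebus.windows.net    # namespace endpoint -> managed Event Hubs namespace
```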
## Deployment checklist
-Using one of the deployment options explained further in this guide, you can deploy a new Azure Purview account with _account_, _portal_ and _ingestion_ private endpoints or you can choose to deploy these private endpoints for an existing Azure Purview account:
+Using one of the deployment options explained further in this guide, you can deploy a new Microsoft Purview account with _account_, _portal_ and _ingestion_ private endpoints or you can choose to deploy these private endpoints for an existing Microsoft Purview account:
-1. Choose an appropriate Azure virtual network and a subnet to deploy Azure Purview private endpoints. Select one of the following options:
+1. Choose an appropriate Azure virtual network and a subnet to deploy Microsoft Purview private endpoints. Select one of the following options:
- Deploy a [new virtual network](../virtual-network/quick-create-portal.md) in your Azure subscription. - Locate an existing Azure virtual network and a subnet in your Azure subscription.
-2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so you can access Azure Purview account and scan data sources using private network. You can use any of the following options:
+2. Define an appropriate [DNS name resolution method](./catalog-private-link-name-resolution.md#deployment-options), so you can access the Microsoft Purview account and scan data sources over a private network. You can use any of the following options:
- Deploy new Azure DNS zones using the steps explained further in this guide. - Add required DNS records to existing Azure DNS zones using the steps explained further in this guide. - After completing the steps in this guide, add required DNS A records in your existing DNS servers manually.
-3. Deploy a [new Azure Purview account](#option-1deploy-a-new-azure-purview-account-with-account-portal-and-ingestion-private-endpoints) with account, portal and ingestion private endpoints, or deploy private endpoints for an [existing Azure Purview account](#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-azure-purview-accounts).
+3. Deploy a [new Microsoft Purview account](#option-1deploy-a-new-microsoft-purview-account-with-account-portal-and-ingestion-private-endpoints) with account, portal and ingestion private endpoints, or deploy private endpoints for an [existing Microsoft Purview account](#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-microsoft-purview-accounts).
4. [Enable access to Azure Active Directory](#enable-access-to-azure-active-directory) if your private network has network security group rules set to deny for all public internet traffic.
-5. Deploy and register [Self-hosted integration runtime](#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources) inside the same VNet or a peered VNet where Azure Purview account and ingestion private endpoints are deployed.
+5. Deploy and register [Self-hosted integration runtime](#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources) inside the same VNet or a peered VNet where Microsoft Purview account and ingestion private endpoints are deployed.
6. After completing this guide, adjust DNS configurations if needed.
-7. Validate your network and name resolution between management machine, self-hosted IR VM and data sources to Azure Purview.
+7. Validate your network and name resolution from the management machine, self-hosted IR VM, and data sources to Microsoft Purview.
-## Option 1 - Deploy a new Azure Purview account with _account_, _portal_ and _ingestion_ private endpoints
+## Option 1 - Deploy a new Microsoft Purview account with _account_, _portal_ and _ingestion_ private endpoints
-1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Azure Purview accounts** page. Select **+ Create** to create a new Azure Purview account.
+1. Go to the [Azure portal](https://portal.azure.com), and then go to the **Microsoft Purview accounts** page. Select **+ Create** to create a new Microsoft Purview account.
2. Fill in the basic information, and on the **Networking** tab, set the connectivity method to **Private endpoint**. Set enable private endpoint to **Account, Portal and ingestion**.
-3. Under **Account and portal** select **+ Add** to add a private endpoint for your Azure Purview account.
+3. Under **Account and portal** select **+ Add** to add a private endpoint for your Microsoft Purview account.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-end-to-end.png" alt-text="Screenshot that shows create private endpoint end-to-end page selections.":::
-4. On the **Create a private endpoint** page, for **Azure Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
+4. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for _account_ private endpoint and select **account**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-account.png" alt-text="Screenshot that shows create account private endpoint page.":::
Using one of the deployment options explained further in this guide, you can dep
6. In the **Account and portal** wizard, select **+Add** again to add the _portal_ private endpoint.
-7. On the **Create a private endpoint** page, for **Azure Purview sub-resource**,choose your location, provide a name for _portal_ private endpoint and select **portal**. Under **networking**, select your virtual network and subnet, and optionally, select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
+7. On the **Create a private endpoint** page, for **Microsoft Purview sub-resource**, choose your location, provide a name for the _portal_ private endpoint, and select **portal**. Under **networking**, select your virtual network and subnet, and optionally select **Integrate with private DNS zone** to create a new Azure Private DNS zone.
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-portal.png" alt-text="Screenshot that shows create portal private endpoint page.":::
Using one of the deployment options explained further in this guide, you can dep
:::image type="content" source="media/catalog-private-link/purview-pe-deploy-ingestion.png" alt-text="Screenshot that shows create private endpoint overview page."::: > [!IMPORTANT]
- > It is important to select correct Azure Private DNS Zones to allow correct name resolution between Azure Purview and data sources. You can also use your existing Azure Private DNS Zones or create DNS records in your DNS Servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md).
+ > It is important to select the correct Azure Private DNS zones to allow correct name resolution between Microsoft Purview and data sources. You can also use your existing Azure Private DNS zones or create DNS records in your DNS servers manually later. For more information, see [Configure DNS Name Resolution for private endpoints](./catalog-private-link-name-resolution.md).
11. Select **Review + Create**. On the **Review + Create** page, Azure validates your configuration. 12. When you see the "Validation passed" message, select **Create**.
-## Option 2 - Enable _account_, _portal_ and _ingestion_ private endpoint on existing Azure Purview accounts
+## Option 2 - Enable _account_, _portal_ and _ingestion_ private endpoint on existing Microsoft Purview accounts
-1. Go to the [Azure portal](https://portal.azure.com), and then select your Azure Purview account, and under **Settings** select **Networking**, and then select **Private endpoint connections**.
+1. Go to the [Azure portal](https://portal.azure.com), and then select your Microsoft Purview account, and under **Settings** select **Networking**, and then select **Private endpoint connections**.
:::image type="content" source="media/catalog-private-link/purview-pe-add-to-existing.png" alt-text="Screenshot that shows creating an account private endpoint.":::
Using one of the deployment options explained further in this guide, you can dep
4. On the **Resource** tab, for **Resource type**, select **Microsoft.Purview/accounts**.
-5. For **Resource**, select the Azure Purview account, and for **Target sub-resource**, select **account**.
+5. For **Resource**, select the Microsoft Purview account, and for **Target sub-resource**, select **account**.
6. On the **Configuration** tab, select the virtual network and optionally, select Azure Private DNS zone to create a new Azure DNS Zone.
Using one of the deployment options explained further in this guide, you can dep
8. Follow the same steps when you select **portal** for **Target sub-resource**.
-9. From your Azure Purview account, under **Settings** select **Networking**, and then select **Ingestion private endpoint connections**.
+9. From your Microsoft Purview account, under **Settings** select **Networking**, and then select **Ingestion private endpoint connections**.
10. Under Ingestion private endpoint connections, select **+ New** to create a new ingestion private endpoint.
Using one of the deployment options explained further in this guide, you can dep
## Enable access to Azure Active Directory > [!NOTE]
-> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Azure Purview portal and the Azure Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
+> If your VM, VPN gateway, or VNet Peering gateway has public internet access, it can access the Microsoft Purview portal and the Microsoft Purview account enabled with private endpoints. For this reason, you don't have to follow the rest of the instructions. If your private network has network security group rules set to deny all public internet traffic, you'll need to add some rules to enable Azure Active Directory (Azure AD) access. Follow the instructions to do so.
-These instructions are provided for accessing Azure Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
+These instructions are provided for accessing Microsoft Purview securely from an Azure VM. Similar steps must be followed if you're using VPN or other VNet Peering gateways.
1. Go to your VM in the Azure portal, and under **Settings**, select **Networking**. Then select **Outbound port rules** > **Add outbound port rule**.
These instructions are provided for accessing Azure Purview securely from an Azu
:::image type="content" source="media/catalog-private-link/aadcdn-rule.png" alt-text="Screenshot that shows the Azure A D Content Delivery Network rule.":::
-1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Azure Purview portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
+1. After the new rule is created, go back to the VM and try to sign in by using your Azure AD credentials again. If sign-in succeeds, then the Microsoft Purview portal is ready to use. But in some cases, Azure AD redirects to other domains to sign in based on a customer's account type. For example, for a live.com account, Azure AD redirects to live.com to sign in, and then those requests are blocked again. For Microsoft employee accounts, Azure AD accesses msft.sts.microsoft.com for sign-in information.
Check the networking requests on the browser **Networking** tab to see which domain's requests are getting blocked, redo the previous step to get its IP, and add outbound port rules in the network security group to allow requests for that IP. If possible, add the URL and IP to the VM's host file to fix the DNS resolution. If you know the exact sign-in domain's IP ranges, you can also directly add them into networking rules.
-1. Now your Azure AD sign-in should be successful. The Azure Purview portal will load successfully, but listing all the Azure Purview accounts won't work because it can only access a specific Azure Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Azure Purview account that you successfully set up a private endpoint for.
+1. Now your Azure AD sign-in should be successful. The Microsoft Purview portal will load successfully, but listing all the Microsoft Purview accounts won't work because it can only access a specific Microsoft Purview account. Enter `web.purview.azure.com/resource/{PurviewAccountName}` to directly visit the Microsoft Purview account that you successfully set up a private endpoint for.
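The direct-access URL pattern above can be sketched as a small helper. This is an illustration only, not part of any Purview SDK; the `https://` scheme is an assumption based on standard portal access.

```python
def purview_direct_url(account_name: str) -> str:
    """Build the direct portal URL for a specific Purview account.

    Illustrative sketch of the web.purview.azure.com/resource/{name}
    pattern described above; the https:// scheme is assumed.
    """
    return f"https://web.purview.azure.com/resource/{account_name}"
```

For example, `purview_direct_url("Contoso-Purview")` produces the URL you would enter in the browser on the VM.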
## Deploy self-hosted integration runtime (IR) and scan your data sources.
-Once you deploy ingestion private endpoints for your Azure Purview, you need to setup and register at least one self-hosted integration runtime (IR):
+Once you deploy ingestion private endpoints for your Microsoft Purview account, you need to set up and register at least one self-hosted integration runtime (IR):
- All on-premises source types like Microsoft SQL Server, Oracle, SAP, and others are currently supported only via self-hosted IR-based scans. The self-hosted IR must run within your private network and then be peered with your virtual network in Azure.
-- For all Azure source types like Azure Blob Storage and Azure SQL Database, you must explicitly choose to run the scan by using a self-hosted integration runtime that is deployed in the same VNet or a peered VNet where Azure Purview account and ingestion private endpoints are deployed.
+- For all Azure source types like Azure Blob Storage and Azure SQL Database, you must explicitly choose to run the scan by using a self-hosted integration runtime that is deployed in the same VNet or a peered VNet where Microsoft Purview account and ingestion private endpoints are deployed.
Follow the steps in [Create and manage a self-hosted integration runtime](manage-integration-runtimes.md) to set up a self-hosted IR. Then set up your scan on the Azure source by choosing that self-hosted IR in the **Connect via integration runtime** dropdown list to ensure network isolation.
## Firewalls to restrict public access
-To cut off access to the Azure Purview account completely from the public internet, follow these steps. This setting applies to both private endpoint and ingestion private endpoint connections.
+To cut off access to the Microsoft Purview account completely from the public internet, follow these steps. This setting applies to both private endpoint and ingestion private endpoint connections.
-1. Go to the Azure Purview account from the Azure portal, and under **Settings** > **Networking**, select **Private endpoint connections**.
+1. Go to the Microsoft Purview account from the Azure portal, and under **Settings** > **Networking**, select **Private endpoint connections**.
1. Go to the **Firewall** tab, and ensure that the toggle is set to **Deny**.
## Next steps

- [Verify resolution for private endpoints](./catalog-private-link-name-resolution.md)
-- [Manage data sources in Azure Purview](./manage-data-sources.md)
-- [Troubleshooting private endpoint configuration for your Azure Purview account](./catalog-private-link-troubleshoot.md)
+- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
+- [Troubleshooting private endpoint configuration for your Microsoft Purview account](./catalog-private-link-troubleshoot.md)
purview Catalog Private Link Faqs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-faqs.md
Title: Azure Purview private endpoints frequently asked questions (FAQ)
-description: This article answers frequently asked questions about Azure Purview private endpoints.
+ Title: Microsoft Purview private endpoints frequently asked questions (FAQ)
+description: This article answers frequently asked questions about Microsoft Purview private endpoints.
Last updated 05/11/2021
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account for secure access.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account for secure access.
-# FAQ about Azure Purview private endpoints
+# FAQ about Microsoft Purview private endpoints
-This article answers common questions that customers and field teams often ask about Azure Purview network configurations by using [Azure Private Link](../private-link/private-link-overview.md). It's intended to clarify questions about Azure Purview firewall settings, private endpoints, DNS configuration, and related configurations.
+This article answers common questions that customers and field teams often ask about Microsoft Purview network configurations by using [Azure Private Link](../private-link/private-link-overview.md). It's intended to clarify questions about Microsoft Purview firewall settings, private endpoints, DNS configuration, and related configurations.
-To set up Azure Purview by using Private Link, see [Use private endpoints for your Azure Purview account](./catalog-private-link.md).
+To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
## Common questions

Check out the answers to the following common questions.
-### What's the purpose of deploying the Azure Purview account private endpoint?
+### What's the purpose of deploying the Microsoft Purview account private endpoint?
-The Azure Purview account private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the account. This private endpoint is also a prerequisite for the portal private endpoint.
+The Microsoft Purview account private endpoint is used to add another layer of security by enabling scenarios where only client calls that originate from within the virtual network are allowed to access the account. This private endpoint is also a prerequisite for the portal private endpoint.
-### What's the purpose of deploying the Azure Purview portal private endpoint?
+### What's the purpose of deploying the Microsoft Purview portal private endpoint?
-The Azure Purview portal private endpoint provides private connectivity to Azure Purview Studio.
+The Microsoft Purview portal private endpoint provides private connectivity to Microsoft Purview Studio.
-### What's the purpose of deploying the Azure Purview ingestion private endpoints?
+### What's the purpose of deploying the Microsoft Purview ingestion private endpoints?
-Azure Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints. Three other private endpoint resources are deployed and linked to Azure Purview managed resources when ingestion private endpoints are created:
+Microsoft Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints. Three other private endpoint resources are deployed and linked to Microsoft Purview managed resources when ingestion private endpoints are created:
-- **Blob** is linked to an Azure Purview managed storage account.
-- **Queue** is linked to an Azure Purview managed storage account.
-- **namespace** is linked to an Azure Purview managed event hub namespace.
+- **Blob** is linked to a Microsoft Purview managed storage account.
+- **Queue** is linked to a Microsoft Purview managed storage account.
+- **namespace** is linked to a Microsoft Purview managed event hub namespace.
-### Can I scan data through a public endpoint if a private endpoint is enabled on my Azure Purview account?
+### Can I scan data through a public endpoint if a private endpoint is enabled on my Microsoft Purview account?
-Yes. Data sources that aren't connected through a private endpoint can be scanned by using a public endpoint while Azure Purview is configured to use a private endpoint.
+Yes. Data sources that aren't connected through a private endpoint can be scanned by using a public endpoint while Microsoft Purview is configured to use a private endpoint.
### Can I scan data through a service endpoint if a private endpoint is enabled?
-Yes. Data sources that aren't connected through a private endpoint can be scanned by using a service endpoint while Azure Purview is configured to use a private endpoint.
+Yes. Data sources that aren't connected through a private endpoint can be scanned by using a service endpoint while Microsoft Purview is configured to use a private endpoint.
Make sure you enable **Allow trusted Microsoft services** to access the resources inside the service endpoint configuration of the data source resource in Azure. For example, if you're going to scan Azure Blob Storage in which the firewalls and virtual networks settings are set to **selected networks**, make sure the **Allow trusted Microsoft services to access this storage account** checkbox is selected as an exception.
-### Can I access Azure Purview Studio from a public network if Public network access is set to Deny in Azure Purview account networking?
+### Can I access Microsoft Purview Studio from a public network if Public network access is set to Deny in Microsoft Purview account networking?
-No. Connecting to Azure Purview from a public endpoint where **Public network access** is set to **Deny** results in the following error message:
+No. Connecting to Microsoft Purview from a public endpoint where **Public network access** is set to **Deny** results in the following error message:
-"Not authorized to access this Azure Purview account. This Azure Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Azure Purview account's private endpoint."
+"Not authorized to access this Microsoft Purview account. This Microsoft Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint."
-In this case, to open Azure Purview Studio, either use a machine that's deployed in the same virtual network as the Azure Purview portal private endpoint or use a VM that's connected to your CorpNet in which hybrid connectivity is allowed.
+In this case, to open Microsoft Purview Studio, either use a machine that's deployed in the same virtual network as the Microsoft Purview portal private endpoint or use a VM that's connected to your CorpNet in which hybrid connectivity is allowed.
-### Is it possible to restrict access to the Azure Purview managed storage account and event hub namespace (for private endpoint ingestion only) but keep portal access enabled for users across the web?
+### Is it possible to restrict access to the Microsoft Purview managed storage account and event hub namespace (for private endpoint ingestion only) but keep portal access enabled for users across the web?
-No. When you set **Public network access** to **Deny**, access to the Azure Purview managed storage account and event hub namespace is automatically set for private endpoint ingestion only. When you set **Public network access** to **Allow**, access to the Azure Purview managed storage account and event hub namespace is automatically set for **All Networks**. You can't modify the private endpoint ingestion manually for the managed storage account or event hub namespace manually.
+No. When you set **Public network access** to **Deny**, access to the Microsoft Purview managed storage account and event hub namespace is automatically set for private endpoint ingestion only. When you set **Public network access** to **Allow**, access to the Microsoft Purview managed storage account and event hub namespace is automatically set for **All Networks**. You can't manually modify the private endpoint ingestion for the managed storage account or event hub namespace.
### If public network access is set to Allow, does it mean the managed storage account and event hub namespace can be publicly accessible?
-No. As protected resources, access to the Azure Purview managed storage account and event hub namespace is restricted to Azure Purview only. These resources are deployed with a deny assignment to all principals, which prevents any applications, users, or groups from gaining access to them.
+No. As protected resources, access to the Microsoft Purview managed storage account and event hub namespace is restricted to Microsoft Purview only. These resources are deployed with a deny assignment to all principals, which prevents any applications, users, or groups from gaining access to them.
To read more about Azure deny assignment, see [Understand Azure deny assignments](../role-based-access-control/deny-assignments.md).
Azure Key Vault or Service Principal.
-### What private DNS zones are required for Azure Purview for a private endpoint?
+### What private DNS zones are required for Microsoft Purview for a private endpoint?
-For Azure Purview _account_ and _portal_ private endpoints:
+For Microsoft Purview _account_ and _portal_ private endpoints:
- `privatelink.purview.azure.com`
-For Azure Purview _ingestion_ private endpoints:
+For Microsoft Purview _ingestion_ private endpoints:
- `privatelink.blob.core.windows.net`
- `privatelink.queue.core.windows.net`
- `privatelink.servicebus.windows.net`
-### Do I have to use a dedicated virtual network and dedicated subnet when I deploy Azure Purview private endpoints?
+### Do I have to use a dedicated virtual network and dedicated subnet when I deploy Microsoft Purview private endpoints?
-No. However, `PrivateEndpointNetworkPolicies` must be disabled in the destination subnet before you deploy the private endpoints. Consider deploying Azure Purview into a virtual network that has network connectivity to data source virtual networks through VNet Peering and access to an on-premises network if you plan to scan data sources cross-premises.
+No. However, `PrivateEndpointNetworkPolicies` must be disabled in the destination subnet before you deploy the private endpoints. Consider deploying Microsoft Purview into a virtual network that has network connectivity to data source virtual networks through VNet Peering and access to an on-premises network if you plan to scan data sources cross-premises.
Read more about [Disable network policies for private endpoints](../private-link/disable-private-endpoint-network-policy.md).
-### Can I deploy Azure Purview private endpoints and use existing private DNS zones in my subscription to register the A records?
+### Can I deploy Microsoft Purview private endpoints and use existing private DNS zones in my subscription to register the A records?
-Yes. Your private endpoint DNS zones can be centralized in a hub or data management subscription for all internal DNS zones required for Azure Purview and all data source records. We recommend this method to allow Azure Purview to resolve data sources by using their private endpoint internal IP addresses.
+Yes. Your private endpoint DNS zones can be centralized in a hub or data management subscription for all internal DNS zones required for Microsoft Purview and all data source records. We recommend this method to allow Microsoft Purview to resolve data sources by using their private endpoint internal IP addresses.
You're also required to set up a [virtual network link](../dns/private-dns-virtual-network-links.md) for virtual networks for the existing private DNS zone.
No. You have to deploy and register a self-hosted integration runtime to scan data by using private connectivity. Azure Key Vault or Service Principal must be used as the authentication method to data sources.
-### What are the outbound ports and firewall requirements for virtual machines with self-hosted integration runtime for Azure Purview when you use a private endpoint?
+### What are the outbound ports and firewall requirements for virtual machines with self-hosted integration runtime for Microsoft Purview when you use a private endpoint?
-The VMs in which self-hosted integration runtime is deployed must have outbound access to Azure endpoints and an Azure Purview private IP address through port 443.
+The VMs in which self-hosted integration runtime is deployed must have outbound access to Azure endpoints and a Microsoft Purview private IP address through port 443.
### Do I need to enable outbound internet access from the virtual machine running self-hosted integration runtime if a private endpoint is enabled?
-No. However, it's expected that the virtual machine running self-hosted integration runtime can connect to your instance of Azure Purview through an internal IP address by using port 443. Use common troubleshooting tools for name resolution and connectivity testing, such as nslookup.exe and Test-NetConnection.
+No. However, it's expected that the virtual machine running self-hosted integration runtime can connect to your instance of Microsoft Purview through an internal IP address by using port 443. Use common troubleshooting tools for name resolution and connectivity testing, such as nslookup.exe and Test-NetConnection.
-### Why do I receive the following error message when I try to launch Azure Purview Studio from my machine?
+### Why do I receive the following error message when I try to launch Microsoft Purview Studio from my machine?
-"This Azure Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Azure Purview account's private endpoint."
+"This Microsoft Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint."
-It's likely your Azure Purview account is deployed by using Private Link and public access is disabled on your Azure Purview account. As a result, you have to browse Azure Purview Studio from a virtual machine that has internal network connectivity to Azure Purview.
+It's likely your Microsoft Purview account is deployed by using Private Link and public access is disabled on your Microsoft Purview account. As a result, you have to browse Microsoft Purview Studio from a virtual machine that has internal network connectivity to Microsoft Purview.
If you're connecting from a VM behind a hybrid network or using a jump machine connected to your virtual network, use common troubleshooting tools for name resolution and connectivity testing, such as nslookup.exe and Test-NetConnection.
-1. Validate if you can resolve the following addresses through your Azure Purview account's private IP addresses.
+1. Validate if you can resolve the following addresses through your Microsoft Purview account's private IP addresses.
- `Web.Purview.Azure.com`
- `<YourPurviewAccountName>.Purview.Azure.com`
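The resolution check above can be sketched in Python: the key signal that the private endpoint (rather than the public endpoint) answered is that the name resolves to a private-range IP address. This is a minimal sketch for illustration; it assumes you run it from the VM and that the OS resolver is configured as described in this article.

```python
import ipaddress
import socket

def is_private(ip: str) -> bool:
    # Private (RFC 1918 and similar) ranges indicate the private
    # endpoint answered the lookup.
    return ipaddress.ip_address(ip).is_private

def resolves_to_private_endpoint(hostname: str) -> bool:
    # Resolve the name with the OS resolver and classify the result.
    ip = socket.gethostbyname(hostname)
    return is_private(ip)

# Example (run from a VM inside the virtual network):
# resolves_to_private_endpoint("web.purview.azure.com")
```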
-1. Verify network connectivity to your Azure Purview account by using the following PowerShell command:
+1. Verify network connectivity to your Microsoft Purview account by using the following PowerShell command:
```powershell
Test-NetConnection -ComputerName <YourPurviewAccountName>.Purview.Azure.com -Port 443
```
For more information about DNS settings for private endpoints, see [Azure privat
## Next steps
-To set up Azure Purview by using Private Link, see [Use private endpoints for your Azure Purview account](./catalog-private-link.md).
+To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
purview Catalog Private Link Name Resolution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-name-resolution.md
Title: Configure DNS Name Resolution for private endpoints
-description: This article describes an overview of how you can use a private end point for your Azure Purview account
+description: This article provides an overview of how you can use a private endpoint for your Microsoft Purview account
Last updated 01/21/2022
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account, for secure access.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
-# Configure and verify DNS Name Resolution for Azure Purview private endpoints
+# Configure and verify DNS Name Resolution for Microsoft Purview private endpoints
## Conceptual overview
-Accurate name resolution is a critical requirement when setting up private endpoints for your Azure Purview accounts.
+Accurate name resolution is a critical requirement when setting up private endpoints for your Microsoft Purview accounts.
-You may require enabling internal name resolution in your DNS settings to resolve the private endpoint IP addresses to the fully qualified domain name (FQDN) from data sources and your management machine to Azure Purview account and self-hosted integration runtime, depending on scenarios that you are deploying.
+You may need to enable internal name resolution in your DNS settings so that, depending on the scenarios you deploy, data sources and your management machine can resolve the fully qualified domain names (FQDNs) of the Microsoft Purview account and self-hosted integration runtime to the private endpoint IP addresses.
-The following example shows Azure Purview DNS name resolution from outside the virtual network or when an Azure private endpoint is not configured.
+The following example shows Microsoft Purview DNS name resolution from outside the virtual network or when an Azure private endpoint is not configured.
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-external.png" alt-text="Screenshot that shows Azure Purview name resolution from outside CorpNet.":::
+ :::image type="content" source="media/catalog-private-link/purview-name-resolution-external.png" alt-text="Screenshot that shows Microsoft Purview name resolution from outside CorpNet.":::
-The following example shows Azure Purview DNS name resolution from inside the virtual network.
+The following example shows Microsoft Purview DNS name resolution from inside the virtual network.
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-private-link.png" alt-text="Screenshot that shows Azure Purview name resolution from inside CorpNet.":::
+ :::image type="content" source="media/catalog-private-link/purview-name-resolution-private-link.png" alt-text="Screenshot that shows Microsoft Purview name resolution from inside CorpNet.":::
## Deployment options
-Use any of the following options to set up internal name resolution when using private endpoints for your Azure Purview account:
+Use any of the following options to set up internal name resolution when using private endpoints for your Microsoft Purview account:
- [Deploy new Azure Private DNS Zones](#option-1deploy-new-azure-private-dns-zones) in your Azure environment as part of private endpoint deployment. (Default option)
- [Use existing Azure Private DNS Zones](#option-2use-existing-azure-private-dns-zones). Use this option if you are using a private endpoint in a hub-and-spoke model from a different subscription or even within the same subscription.
## Option 1 - Deploy new Azure Private DNS Zones

### Deploy new Azure Private DNS Zones
-To enable internal name resolution, you can deploy the required Azure DNS Zones inside your Azure subscription where Azure Purview account is deployed.
+To enable internal name resolution, you can deploy the required Azure DNS Zones inside your Azure subscription where Microsoft Purview account is deployed.
:::image type="content" source="media/catalog-private-link/purview-pe-dns-zones.png" alt-text="Screenshot that shows DNS Zones.":::
-When you create ingestion, portal and account private endpoints, the DNS CNAME resource records for Azure Purview is automatically updated to an alias in few subdomains with the prefix `privatelink`:
+When you create ingestion, portal, and account private endpoints, the DNS CNAME resource records for Microsoft Purview are automatically updated to an alias in a few subdomains with the prefix `privatelink`:
-- By default, during the deployment of _account_ private endpoint for your Azure Purview account, we also create a [private DNS zone](../dns/private-dns-overview.md) that corresponds to the `privatelink` subdomain for Azure Purview as `privatelink.purview.azure.com` including DNS A resource records for the private endpoints.
+- By default, during the deployment of _account_ private endpoint for your Microsoft Purview account, we also create a [private DNS zone](../dns/private-dns-overview.md) that corresponds to the `privatelink` subdomain for Microsoft Purview as `privatelink.purview.azure.com` including DNS A resource records for the private endpoints.
-- During the deployment of _portal_ private endpoint for your Azure Purview account, we also create a new private DNS zone that corresponds to the `privatelink` subdomain for Azure Purview as `privatelink.purviewstudio.azure.com` including DNS A resource records for _Web_.
+- During the deployment of _portal_ private endpoint for your Microsoft Purview account, we also create a new private DNS zone that corresponds to the `privatelink` subdomain for Microsoft Purview as `privatelink.purviewstudio.azure.com` including DNS A resource records for _Web_.
- If you enable ingestion private endpoints, additional DNS zones are required for managed resources.
-The following table shows an example of Azure Private DNS zones and DNS A Records that are deployed as part of configuration of private endpoint for an Azure Purview account if you enable _Private DNS integration_ during the deployment:
+The following table shows an example of Azure Private DNS zones and DNS A Records that are deployed as part of configuration of private endpoint for a Microsoft Purview account if you enable _Private DNS integration_ during the deployment:
|Private endpoint |Private endpoint associated to |DNS Zone (new) |A Record (example) |
|---|---|---|---|
-|Account |Azure Purview |`privatelink.purview.azure.com` |Contoso-Purview |
-|Portal |Azure Purview |`privatelink.purviewstudio.azure.com` |Web |
-|Ingestion |Azure Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Azure Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Azure Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
+|Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
+|Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
+|Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
+|Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
+|Ingestion |Microsoft Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
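The zone mapping in the table above can be expressed as a small lookup, which is handy for scripting DNS zone validation. The zone names are copied from the table; the endpoint-type keys are hypothetical labels introduced for this illustration only.

```python
# Private DNS zone required for each Purview private endpoint type,
# as listed in the table above (illustrative lookup; the keys are
# labels chosen for this example, not official identifiers).
PRIVATE_DNS_ZONES = {
    "account": "privatelink.purview.azure.com",
    "portal": "privatelink.purviewstudio.azure.com",
    "ingestion-blob": "privatelink.blob.core.windows.net",
    "ingestion-queue": "privatelink.queue.core.windows.net",
    "ingestion-eventhub": "privatelink.servicebus.windows.net",
}

def required_zone(endpoint_type: str) -> str:
    """Return the private DNS zone a given endpoint type needs."""
    return PRIVATE_DNS_ZONES[endpoint_type]
```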
### Validate virtual network links on Azure Private DNS Zones
For more information, see [Azure private endpoint DNS configuration](../private-
### Verify internal name resolution
-When you resolve the Azure Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Azure Purview. When resolved from the virtual network hosting the private endpoint, the Azure Purview endpoint URL resolves to the private endpoint's IP address.
+When you resolve the Microsoft Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Microsoft Purview. When resolved from the virtual network hosting the private endpoint, the Microsoft Purview endpoint URL resolves to the private endpoint's IP address.
-As an example, if an Azure Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
+As an example, if a Microsoft Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
| Name | Type | Value |
| ---- | ---- | ----- |
| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Azure Purview public endpoint\> |
-| \<Azure Purview public endpoint\> | A | \<Azure Purview public IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Azure Purview Studio public endpoint\> |
+| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Microsoft Purview public endpoint\> |
+| \<Microsoft Purview public endpoint\> | A | \<Microsoft Purview public IP address\> |
+| `Web.purview.azure.com` | CNAME | \<Microsoft Purview Studio public endpoint\> |
The DNS resource records for Contoso-Purview, when resolved in the virtual network hosting the private endpoint, will be:

| Name | Type | Value |
| ---- | ---- | ----- |
| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Azure Purview account private endpoint IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Azure Purview portal private endpoint IP address\> |
+| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Microsoft Purview account private endpoint IP address\> |
+| `Web.purview.azure.com` | CNAME | \<Microsoft Purview portal private endpoint IP address\> |
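The two record sets above differ only in where the `privatelink` alias terminates. As a sketch, the expected external-resolution chain for an account name can be generated like this (illustrative only; names are lowercased, as resolvers treat DNS names case-insensitively):

```python
def external_resolution_chain(account: str):
    """Expected CNAME chain when resolving from OUTSIDE the virtual
    network, per the first table above (illustrative sketch only;
    the final A record varies by region, so it is left as a
    placeholder)."""
    name = account.lower()
    return [
        (f"{name}.purview.azure.com", "CNAME",
         f"{name}.privatelink.purview.azure.com"),
        (f"{name}.privatelink.purview.azure.com", "CNAME",
         "<public endpoint>"),
    ]
```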
## Option 2 - Use existing Azure Private DNS Zones
During the deployment of Azure purview private endpoints, you can choose _Privat
This scenario also applies if your organization uses a central or hub subscription for all Azure Private DNS Zones.
-The following list shows the required Azure DNS zones and A records for Azure Purview private endpoints:
+The following list shows the required Azure DNS zones and A records for Microsoft Purview private endpoints:
> [!NOTE]
-> Update all names with `Contoso-Purview`,`scaneastusabcd1234` and `atlas-12345678-1234-1234-abcd-123456789abc` with corresponding Azure resources name in your environment. For example, instead of `scaneastusabcd1234` use the name of your Azure Purview managed storage account.
+> Replace the names `Contoso-Purview`, `scaneastusabcd1234`, and `atlas-12345678-1234-1234-abcd-123456789abc` with the names of the corresponding Azure resources in your environment. For example, instead of `scaneastusabcd1234`, use the name of your Microsoft Purview managed storage account.
|Private endpoint |Private endpoint associated to |DNS Zone (existing) |A Record (example) |
|---|---|---|---|
-|Account |Azure Purview |`privatelink.purview.azure.com` |Contoso-Purview |
-|Portal |Azure Purview |`privatelink.purviewstudio.azure.com` |Web |
-|Ingestion |Azure Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Azure Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
-|Ingestion |Azure Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
+|Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
+|Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
+|Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
+|Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
+|Ingestion |Microsoft Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
- :::image type="content" source="media/catalog-private-link/purview-name-resolution-diagram.png" alt-text="Diagram that shows Azure Purview name resolution"lightbox="media/catalog-private-link/purview-name-resolution-diagram.png":::
+ :::image type="content" source="media/catalog-private-link/purview-name-resolution-diagram.png" alt-text="Diagram that shows Microsoft Purview name resolution" lightbox="media/catalog-private-link/purview-name-resolution-diagram.png":::
For more information, see [Virtual network workloads without custom DNS server](../private-link/private-endpoint-dns.md#virtual-network-workloads-without-custom-dns-server) and [On-premises workloads using a DNS forwarder](../private-link/private-endpoint-dns.md#on-premises-workloads-using-a-dns-forwarder) scenarios in [Azure Private Endpoint DNS configuration](../private-link/private-endpoint-dns.md).
Additionally, it is required to validate your DNS configurations on Azure virtual
### Verify internal name resolution
-When you resolve the Azure Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Azure Purview. When resolved from the virtual network hosting the private endpoint, the Azure Purview endpoint URL resolves to the private endpoint's IP address.
+When you resolve the Microsoft Purview endpoint URL from outside the virtual network with the private endpoint, it resolves to the public endpoint of Microsoft Purview. When resolved from the virtual network hosting the private endpoint, the Microsoft Purview endpoint URL resolves to the private endpoint's IP address.
-As an example, if an Azure Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
+As an example, if a Microsoft Purview account name is 'Contoso-Purview', when it is resolved from outside the virtual network that hosts the private endpoint, it will be:
| Name | Type | Value |
| - | -- | - |
| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Azure Purview public endpoint\> |
-| \<Azure Purview public endpoint\> | A | \<Azure Purview public IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Azure Purview Studio public endpoint\> |
+| `Contoso-Purview.privatelink.purview.azure.com` | CNAME | \<Microsoft Purview public endpoint\> |
+| \<Microsoft Purview public endpoint\> | A | \<Microsoft Purview public IP address\> |
+| `Web.purview.azure.com` | CNAME | \<Microsoft Purview Studio public endpoint\> |
The DNS resource records for Contoso-Purview, when resolved in the virtual network hosting the private endpoint, will be:

| Name | Type | Value |
| - | -- | - |
| `Contoso-Purview.purview.azure.com` | CNAME | `Contoso-Purview.privatelink.purview.azure.com` |
-| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Azure Purview account private endpoint IP address\> |
-| `Web.purview.azure.com` | CNAME | \<Azure Purview portal private endpoint IP address\> |
+| `Contoso-Purview.privatelink.purview.azure.com` | A | \<Microsoft Purview account private endpoint IP address\> |
+| `Web.purview.azure.com` | CNAME | \<Microsoft Purview portal private endpoint IP address\> |
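A quick way to confirm which of the two resolution paths above a client is actually using is to resolve the endpoint and check whether the answer is a private address. The following Python sketch is one way to do that; the `resolves_privately` helper and the `Contoso-Purview` account name are illustrative, not part of any Purview tooling:

```python
import ipaddress
import socket

def resolves_privately(fqdn: str, port: int = 443) -> bool:
    """Resolve fqdn and report whether every returned address is private,
    which indicates the name resolves through the private endpoint."""
    infos = socket.getaddrinfo(fqdn, port, proto=socket.IPPROTO_TCP)
    addresses = {info[4][0] for info in infos}
    return all(ipaddress.ip_address(addr).is_private for addr in addresses)

# Run from inside the virtual network (expect True) and outside (expect False):
# resolves_privately("Contoso-Purview.purview.azure.com")
```

Running the check from both locations lets you verify that split-horizon DNS is behaving as described in the two tables above.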
## Option 3 - Use your own DNS Servers

If you do not use DNS forwarders and instead you manage A records directly in your on-premises DNS servers to resolve the endpoints through their private IP addresses, you might need to create the following A records in your DNS servers.

> [!NOTE]
-> Update all names with `Contoso-Purview`,`scaneastusabcd1234` and `atlas-12345678-1234-1234-abcd-123456789abc` with corresponding Azure resources name in your environment. For example, instead of `scaneastusabcd1234` use the name of your Azure Purview managed storage account.
+> Replace `Contoso-Purview`, `scaneastusabcd1234`, and `atlas-12345678-1234-1234-abcd-123456789abc` with the corresponding Azure resource names in your environment. For example, instead of `scaneastusabcd1234`, use the name of your Microsoft Purview managed storage account.
| Name | Type | Value |
| - | -- | - |
-| `web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview> |
-| `scaneastusabcd1234.blob.core.windows.net` | A | \<blob-ingestion private endpoint IP address of Azure Purview> |
-| `scaneastusabcd1234.queue.core.windows.net` | A | \<queue-ingestion private endpoint IP address of Azure Purview> |
-| `atlas-12345678-1234-1234-abcd-123456789abc.servicebus.windows.net`| A | \<namespace-ingestion private endpoint IP address of Azure Purview> |
-| `Contoso-Purview.Purview.azure.com` | A | \<account private endpoint IP address of Azure Purview> |
-| `Contoso-Purview.scan.Purview.azure.com` | A | \<account private endpoint IP address of Azure Purview> |
-| `Contoso-Purview.catalog.Purview.azure.com` | A | \<account private endpoint IP address of Azure Purview\> |
-| `Contoso-Purview.proxy.purview.azure.com` | A | \<account private endpoint IP address of Azure Purview\> |
-| `Contoso-Purview.guardian.purview.azure.com` | A | \<account private endpoint IP address of Azure Purview\> |
-| `gateway.purview.azure.com` | A | \<account private endpoint IP address of Azure Purview\> |
-| `manifest.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `cdn.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `hub.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `catalog.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `cseo.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `datascan.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `datashare.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `datasource.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `policy.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
-| `sensitivity.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Azure Purview\> |
+| `web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview> |
+| `scaneastusabcd1234.blob.core.windows.net` | A | \<blob-ingestion private endpoint IP address of Microsoft Purview> |
+| `scaneastusabcd1234.queue.core.windows.net` | A | \<queue-ingestion private endpoint IP address of Microsoft Purview> |
+| `atlas-12345678-1234-1234-abcd-123456789abc.servicebus.windows.net`| A | \<namespace-ingestion private endpoint IP address of Microsoft Purview> |
+| `Contoso-Purview.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview> |
+| `Contoso-Purview.scan.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview> |
+| `Contoso-Purview.catalog.Purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
+| `Contoso-Purview.proxy.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
+| `Contoso-Purview.guardian.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
+| `gateway.purview.azure.com` | A | \<account private endpoint IP address of Microsoft Purview\> |
+| `manifest.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `cdn.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `hub.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `catalog.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `cseo.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `datascan.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `datashare.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `datasource.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `policy.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
+| `sensitivity.prod.ext.web.purview.azure.com` | A | \<portal private endpoint IP address of Microsoft Purview\> |
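If you maintain these records by hand, a small script can generate the zone entries from a single mapping, which helps keep a long list like the one above consistent. This is only a sketch: the IP addresses, the `Contoso-Purview` name, and the `zone_lines` helper are placeholders, not part of any Purview tooling.

```python
# Placeholder private endpoint IPs - replace with the NIC IP addresses of
# your account and portal private endpoints.
ACCOUNT_IP = "10.1.0.4"
PORTAL_IP = "10.1.0.5"

# A subset of the A records listed above, mapped to the endpoint serving them.
records = {
    "Contoso-Purview.purview.azure.com": ACCOUNT_IP,
    "gateway.purview.azure.com": ACCOUNT_IP,
    "web.purview.azure.com": PORTAL_IP,
    "manifest.prod.ext.web.purview.azure.com": PORTAL_IP,
}

def zone_lines(records: dict) -> list:
    """Render BIND-style A records, one per line."""
    return [f"{name}. IN A {ip}" for name, ip in sorted(records.items())]

print("\n".join(zone_lines(records)))
```

The output can be pasted into (or diffed against) your on-premises zone files so that all endpoints stay in sync when a private endpoint IP changes.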
## Verify and test DNS name resolution and connectivity
|Private endpoint |Private endpoint associated to |DNS Zone |A Record (example) |
|---|---|---|---|
- |Account |Azure Purview |`privatelink.purview.azure.com` |Contoso-Purview |
- |Portal |Azure Purview |`privatelink.purviewstudio.azure.com` |Web |
- |Ingestion |Azure Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
- |Ingestion |Azure Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
- |Ingestion |Azure Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
+ |Account |Microsoft Purview |`privatelink.purview.azure.com` |Contoso-Purview |
+ |Portal |Microsoft Purview |`privatelink.purviewstudio.azure.com` |Web |
+ |Ingestion |Microsoft Purview managed Storage Account - Blob |`privatelink.blob.core.windows.net` |scaneastusabcd1234 |
+ |Ingestion |Microsoft Purview managed Storage Account - Queue |`privatelink.queue.core.windows.net` |scaneastusabcd1234 |
+ |Ingestion |Microsoft Purview managed Storage Account - Event Hub |`privatelink.servicebus.windows.net` |atlas-12345678-1234-1234-abcd-123456789abc |
2. Create [Virtual network links](../dns/private-dns-virtual-network-links.md) in your Azure Private DNS Zones for your Azure Virtual Networks to allow internal name resolution.
-3. From your management PC and self-hosted integration runtime VM, test name resolution and network connectivity to your Azure Purview account using tools such as Nslookup.exe and PowerShell
+3. From your management PC and self-hosted integration runtime VM, test name resolution and network connectivity to your Microsoft Purview account using tools such as Nslookup.exe and PowerShell.
To test name resolution, you need to resolve the following FQDNs through their private IP addresses. (Instead of `Contoso-Purview`, `scaneastusabcd1234`, or `atlas-12345678-1234-1234-abcd-123456789abc`, use the hostnames associated with your Purview account and its managed resources.)
You must resolve each endpoint by their private endpoint and obtain TcpTestSucce
## Next steps
-- [Troubleshooting private endpoint configuration for your Azure Purview account](catalog-private-link-troubleshoot.md)
-- [Manage data sources in Azure Purview](./manage-data-sources.md)
+- [Troubleshooting private endpoint configuration for your Microsoft Purview account](catalog-private-link-troubleshoot.md)
+- [Manage data sources in Microsoft Purview](./manage-data-sources.md)
purview Catalog Private Link Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link-troubleshoot.md
Title: Troubleshooting private endpoint configuration for Azure Purview accounts
-description: This article describes how to troubleshoot problems with your Azure Purview account related to private endpoints configurations
+ Title: Troubleshooting private endpoint configuration for Microsoft Purview accounts
+description: This article describes how to troubleshoot problems with your Microsoft Purview account related to private endpoint configurations
Last updated 01/12/2022
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account, for secure access.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
-# Troubleshooting private endpoint configuration for Azure Purview accounts
+# Troubleshooting private endpoint configuration for Microsoft Purview accounts
-This guide summarizes known limitations related to using private endpoints for Azure Purview and provides a list of steps and solutions for troubleshooting some of the most common relevant issues.
+This guide summarizes known limitations related to using private endpoints for Microsoft Purview and provides a list of steps and solutions for troubleshooting some of the most common relevant issues.
## Known limitations
- We currently do not support ingestion private endpoints that work with your AWS sources.
- Scanning Azure Multiple Sources using self-hosted integration runtime is not supported.
- Using Azure integration runtime to scan data sources behind private endpoint is not supported.
-- Using Azure portal, the ingestion private endpoints can be created via the Azure Purview portal experience described in the preceding steps. They can't be created from the Private Link Center.
-- Creating DNS A records for ingestion private endpoints inside existing Azure DNS Zones, while the Azure Private DNS Zones are located in a different subscription than the private endpoints is not supported via the Azure Purview portal experience. A records can be added manually in the destination DNS Zones in the other subscription.
-- Self-hosted integration runtime machine must be deployed in the same VNet or a peered VNet where Azure Purview account and ingestion private endpoints are deployed.
+- Using Azure portal, the ingestion private endpoints can be created via the Microsoft Purview portal experience described in the preceding steps. They can't be created from the Private Link Center.
+- Creating DNS A records for ingestion private endpoints inside existing Azure DNS Zones, while the Azure Private DNS Zones are located in a different subscription than the private endpoints is not supported via the Microsoft Purview portal experience. A records can be added manually in the destination DNS Zones in the other subscription.
+- Self-hosted integration runtime machine must be deployed in the same VNet or a peered VNet where Microsoft Purview account and ingestion private endpoints are deployed.
- We currently do not support scanning a cross-tenant Power BI tenant, which has a private endpoint configured with public access blocked.
- For limitations related to the Private Link service, see [Azure Private Link limits](../azure-resource-manager/management/azure-subscription-service-limits.md#private-link-limits).

## Recommended troubleshooting steps
-1. Once you deploy private endpoints for your Azure Purview account, review your Azure environment to make sure private endpoint resources are deployed successfully. Depending on your scenario, one or more of the following Azure private endpoints must be deployed in your Azure subscription:
+1. Once you deploy private endpoints for your Microsoft Purview account, review your Azure environment to make sure private endpoint resources are deployed successfully. Depending on your scenario, one or more of the following Azure private endpoints must be deployed in your Azure subscription:
|Private endpoint |Private endpoint assigned to | Example|
|---|---|---|
- |Account |Azure Purview Account |mypurview-private-account |
- |Portal |Azure Purview Account |mypurview-private-portal |
+ |Account |Microsoft Purview Account |mypurview-private-account |
+ |Portal |Microsoft Purview Account |mypurview-private-portal |
|Ingestion |Managed Storage Account (Blob) |mypurview-ingestion-blob |
|Ingestion |Managed Storage Account (Queue) |mypurview-ingestion-queue |
|Ingestion |Managed Event Hubs Namespace |mypurview-ingestion-namespace |

2. If portal private endpoint is deployed, make sure you also deploy account private endpoint.
-3. If portal private endpoint is deployed, and public network access is set to deny in your Azure Purview account, make sure you launch [Azure Purview Studio](https://web.purview.azure.com/resource/) from internal network.
+3. If portal private endpoint is deployed, and public network access is set to deny in your Microsoft Purview account, make sure you launch [Microsoft Purview Studio](https://web.purview.azure.com/resource/) from internal network.
<br>
- To verify the correct name resolution, you can use the **NSlookup.exe** command-line tool to query `web.purview.azure.com`. The result must return a private IP address that belongs to the portal private endpoint.
- To verify network connectivity, you can use any network test tool to test outbound connectivity to the `web.purview.azure.com` endpoint on port **443**. The connection must be successful.

3. If Azure Private DNS Zones are used, make sure the required Azure DNS Zones are deployed and there is a DNS (A) record for each private endpoint.
-4. Test network connectivity and name resolution from management machine to Azure Purview endpoint and purview web url. If account and portal private endpoints are deployed, the endpoints must be resolved through private IP addresses.
+4. Test network connectivity and name resolution from management machine to Microsoft Purview endpoint and purview web url. If account and portal private endpoints are deployed, the endpoints must be resolved through private IP addresses.
```powershell
TcpTestSucceeded : True
```
-5. If you have created your Azure Purview account after 18 August 2021, make sure you download and install the latest version of self-hosted integration runtime from [Microsoft download center](https://www.microsoft.com/download/details.aspx?id=39717).
+5. If you have created your Microsoft Purview account after 18 August 2021, make sure you download and install the latest version of self-hosted integration runtime from [Microsoft download center](https://www.microsoft.com/download/details.aspx?id=39717).
-6. From self-hosted integration runtime VM, test network connectivity and name resolution to Azure Purview endpoint.
+6. From self-hosted integration runtime VM, test network connectivity and name resolution to Microsoft Purview endpoint.
-7. From self-hosted integration runtime, test network connectivity and name resolution to Azure Purview managed resources such as blob queue and Event Hub through port 443 and private IP addresses. (Replace the managed storage account and Event Hubs namespace with corresponding managed resource name assigned to your Azure Purview account).
+7. From self-hosted integration runtime, test network connectivity and name resolution to Microsoft Purview managed resources such as blob queue and Event Hub through port 443 and private IP addresses. (Replace the managed storage account and Event Hubs namespace with corresponding managed resource name assigned to your Microsoft Purview account).
```powershell
Test-NetConnection -ComputerName scansoutdeastasiaocvseab.blob.core.windows.net -Port 443
TcpTestSucceeded : True
```
-8. From the network where data source is located, test network connectivity and name resolution to Azure Purview endpoint and managed resources endpoints.
+8. From the network where data source is located, test network connectivity and name resolution to Microsoft Purview endpoint and managed resources endpoints.
-9. If data sources are located in on-premises network, review your DNS forwarder configuration. Test name resolution from within the same network where data sources are located to self-hosted integration runtime, Azure Purview endpoints and managed resources. It is expected to obtain a valid private IP address from DNS query for each endpoint.
+9. If data sources are located in on-premises network, review your DNS forwarder configuration. Test name resolution from within the same network where data sources are located to self-hosted integration runtime, Microsoft Purview endpoints and managed resources. It is expected to obtain a valid private IP address from DNS query for each endpoint.
For more information, see [Virtual network workloads without custom DNS server](../private-link/private-endpoint-dns.md#virtual-network-workloads-without-custom-dns-server) and [On-premises workloads using a DNS forwarder](../private-link/private-endpoint-dns.md#on-premises-workloads-using-a-dns-forwarder) scenarios in [Azure Private Endpoint DNS configuration](../private-link/private-endpoint-dns.md).

10. If the management machine and self-hosted integration runtime VMs are deployed in an on-premises network and you have set up a DNS forwarder in your environment, verify the DNS and network settings in your environment.
-11. If ingestion private endpoint is used, make sure self-hosted integration runtime is registered successfully inside Azure Purview account and shows as running both inside the self-hosted integration runtime VM and in the [Azure Purview Studio](https://web.purview.azure.com/resource/) .
+11. If ingestion private endpoint is used, make sure self-hosted integration runtime is registered successfully inside Microsoft Purview account and shows as running both inside the self-hosted integration runtime VM and in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
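The connectivity checks in the steps above can also be scripted outside PowerShell. This minimal Python sketch approximates what `Test-NetConnection` reports; the `tcp_reachable` helper and the endpoint name in the example are illustrative, not part of any Purview tooling:

```python
import socket

def tcp_reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Resolve host and attempt a TCP handshake, similar to Test-NetConnection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical managed storage account name):
# tcp_reachable("scaneastusabcd1234.blob.core.windows.net")
```

A `False` result from inside the virtual network points at a name resolution or network security problem on the path to the private endpoint.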
## Common errors and messages
You may receive the following error message when running a scan:
`Internal system error. Please contact support with correlationId:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx System Error, contact support.`

### Cause
-This can be an indication of issues related to connectivity or name resolution between the VM running self-hosted integration runtime and Azure Purview's managed resources storage account or Event Hub.
+This can be an indication of issues related to connectivity or name resolution between the VM running self-hosted integration runtime and Microsoft Purview's managed resources storage account or Event Hub.
### Resolution

Validate name resolution and network connectivity between the VM running the self-hosted integration runtime and the Microsoft Purview managed resources (storage account and Event Hub).
Upgrade self-hosted integration runtime to 5.9.7885.3.
### Issue
-Azure Purview account with private endpoint deployment failed with Azure Policy validation error during the deployment.
+Microsoft Purview account with private endpoint deployment failed with Azure Policy validation error during the deployment.
### Cause

This error suggests that there may be an existing Azure Policy Assignment on your Azure subscription that is preventing the deployment of any of the required Azure resources.
Review your existing Azure Policy Assignments and make sure deployment of the fo
> [!NOTE]
> Depending on your scenario, you may need to deploy one or more of the following Azure resource types:
-> - Azure Purview (Microsoft.Purview/Accounts)
+> - Microsoft Purview (Microsoft.Purview/Accounts)
> - Private Endpoint (Microsoft.Network/privateEndpoints)
> - Private DNS Zones (Microsoft.Network/privateDnsZones)
> - Event Hub Name Space (Microsoft.EventHub/namespaces)
### Issue
-Not authorized to access this Azure Purview account. This Azure Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Azure Purview account's private endpoint.
+Not authorized to access this Microsoft Purview account. This Microsoft Purview account is behind a private endpoint. Please access the account from a client in the same virtual network (VNet) that has been configured for the Microsoft Purview account's private endpoint.
### Cause
-User is trying to connect to Azure Purview from a public endpoint or using Azure Purview public endpoints where **Public network access** is set to **Deny**.
+User is trying to connect to Microsoft Purview from a public endpoint or using Microsoft Purview public endpoints where **Public network access** is set to **Deny**.
### Resolution
-In this case, to open Azure Purview Studio, either use a machine that is deployed in the same virtual network as the Azure Purview portal private endpoint or use a VM that is connected to your CorpNet in which hybrid connectivity is allowed.
+In this case, to open Microsoft Purview Studio, either use a machine that is deployed in the same virtual network as the Microsoft Purview portal private endpoint or use a VM that is connected to your CorpNet in which hybrid connectivity is allowed.
### Issue

You may receive the following error message when scanning a SQL server using a self-hosted integration runtime:
purview Catalog Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/catalog-private-link.md
Title: Use private endpoints for secure access to Azure Purview
-description: This article describes a high level overview of how you can use a private end point for your Azure Purview account
+ Title: Use private endpoints for secure access to Microsoft Purview
+description: This article describes a high-level overview of how you can use a private endpoint for your Microsoft Purview account
Last updated 01/10/2022
-# Customer intent: As an Azure Purview admin, I want to set up private endpoints for my Azure Purview account, for secure access.
+# Customer intent: As a Microsoft Purview admin, I want to set up private endpoints for my Microsoft Purview account, for secure access.
-# Use private endpoints for your Azure Purview account
+# Use private endpoints for your Microsoft Purview account
-This article describes how to configure private endpoints for Azure Purview.
+This article describes how to configure private endpoints for Microsoft Purview.
## Conceptual Overview
-You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Azure Purview accounts to allow users on a virtual network (VNet) to securely access the catalog over a Private Link. A private endpoint uses an IP address from the VNet address space for your Azure Purview account. Network traffic between the clients on the VNet and the Azure Purview account traverses over the VNet and a private link on the Microsoft backbone network.
+You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Microsoft Purview accounts to allow users on a virtual network (VNet) to securely access the catalog over a Private Link. A private endpoint uses an IP address from the VNet address space for your Microsoft Purview account. Network traffic between the clients on the VNet and the Microsoft Purview account traverses over the VNet and a private link on the Microsoft backbone network.
-You can deploy Azure Purview _account_ private endpoint, to allow only client calls to Azure Purview that originate from within the private network.
+You can deploy a Microsoft Purview _account_ private endpoint to allow only client calls to Microsoft Purview that originate from within the private network.
-To connect to Azure Purview Studio using a private network connectivity, you can deploy _portal_ private endpoint.
+To connect to Microsoft Purview Studio using private network connectivity, you can deploy a _portal_ private endpoint.
-You can deploy _ingestion_ private endpoints if you need to scan Azure IaaS and PaaS data sources inside Azure virtual networks and on-premises data sources through a private connection. This method ensures network isolation for your metadata flowing from the data sources to Azure Purview Data Map.
+You can deploy _ingestion_ private endpoints if you need to scan Azure IaaS and PaaS data sources inside Azure virtual networks and on-premises data sources through a private connection. This method ensures network isolation for your metadata flowing from the data sources to Microsoft Purview Data Map.
## Prerequisites
-Before deploying private endpoints for Azure Purview account, ensure you meet the following prerequisites:
+Before deploying private endpoints for your Microsoft Purview account, ensure you meet the following prerequisites:
1. An Azure account with an active subscription. [Create an account for free.](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) <br>
2. An existing Azure Virtual network. Deploy a new [Azure virtual network](../virtual-network/quick-create-portal.md) if you do not have one. <br>
-## Azure Purview private endpoint deployment scenarios
+## Microsoft Purview private endpoint deployment scenarios
-Use the following recommended checklist to perform deployment of Azure Purview account with private endpoints:
+Use the following recommended checklist to perform deployment of Microsoft Purview account with private endpoints:
|Scenario |Objectives |
|---|---|
-|**Scenario 1** - [Connect to your Azure Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md) |You need to restrict access to your Azure Purview account only via a private endpoint, including access to Azure Purview Studio, Atlas APIs and scan data sources in on-premises and Azure behind a virtual network using self-hosted integration runtime ensuring end to end network isolation. (Deploy _account_, _portal_ and _ingestion_ private endpoints.) |
-|**Scenario 2** - [Connect privately and securely to your Azure Purview account](./catalog-private-link-account-portal.md) | You need to enable access to your Azure Purview account, including access to _Azure Purview Studio_ and Atlas API through private endpoints. (Deploy _account_ and _portal_ private endpoints). |
-|**Scenario 3** - [Scan data source securely using Managed Virtual Network](./catalog-managed-vnet.md) | You need to scan Azure data sources securely, without having to manage a virtual network or a self-hosted integration runtime VM. (Deploy managed private endpoint for Azure Purview, managed storage account and Azure data sources). |
+|**Scenario 1** - [Connect to your Microsoft Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md) |You need to restrict access to your Microsoft Purview account only via a private endpoint, including access to Microsoft Purview Studio, Atlas APIs and scan data sources in on-premises and Azure behind a virtual network using self-hosted integration runtime ensuring end to end network isolation. (Deploy _account_, _portal_ and _ingestion_ private endpoints.) |
+|**Scenario 2** - [Connect privately and securely to your Microsoft Purview account](./catalog-private-link-account-portal.md) | You need to enable access to your Microsoft Purview account, including access to _Microsoft Purview Studio_ and Atlas API through private endpoints. (Deploy _account_ and _portal_ private endpoints). |
+|**Scenario 3** - [Scan data source securely using Managed Virtual Network](./catalog-managed-vnet.md) | You need to scan Azure data sources securely, without having to manage a virtual network or a self-hosted integration runtime VM. (Deploy managed private endpoint for Microsoft Purview, managed storage account and Azure data sources). |
## Support matrix for scanning data sources through _ingestion_ private endpoint
-For scenarios where _ingestion_ private endpoint is used in your Azure Purview account, and public access on your data sources is disabled, Azure Purview can scan the following data sources that are behind a private endpoint:
+For scenarios where _ingestion_ private endpoint is used in your Microsoft Purview account, and public access on your data sources is disabled, Microsoft Purview can scan the following data sources that are behind a private endpoint:
|Data source behind a private endpoint |Integration runtime type |Credential type |
|---|---|---|
## Frequently Asked Questions
-For FAQs related to private endpoint deployments in Azure Purview, see [FAQ about Azure Purview private endpoints](./catalog-private-link-faqs.md).
+For FAQs related to private endpoint deployments in Microsoft Purview, see [FAQ about Microsoft Purview private endpoints](./catalog-private-link-faqs.md).
## Troubleshooting guide
-For troubleshooting private endpoint configuration for Azure Purview accounts, see [Troubleshooting private endpoint configuration for Azure Purview accounts](./catalog-private-link-troubleshoot.md).
+For troubleshooting private endpoint configuration for Microsoft Purview accounts, see [Troubleshooting private endpoint configuration for Microsoft Purview accounts](./catalog-private-link-troubleshoot.md).
## Known limitations
-To view list of current limitations related to Azure Purview private endpoints, see [Azure Purview private endpoints known limitations](./catalog-private-link-troubleshoot.md#known-limitations).
+To view a list of current limitations related to Microsoft Purview private endpoints, see [Microsoft Purview private endpoints known limitations](./catalog-private-link-troubleshoot.md#known-limitations).
## Next steps

- [Deploy end to end private networking](./catalog-private-link-end-to-end.md)
-- [Deploy private networking for the Azure Purview Studio](./catalog-private-link-account-portal.md)
+- [Deploy private networking for the Microsoft Purview Studio](./catalog-private-link-account-portal.md)
purview Classification Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/classification-insights.md
Title: Classification reporting on your data in Azure Purview using Azure Purview Insights
-description: This how-to guide describes how to view and use Azure Purview classification reporting on your data.
+ Title: Classification reporting on your data in Microsoft Purview using Microsoft Purview Insights
+description: This how-to guide describes how to view and use Microsoft Purview classification reporting on your data.
Last updated 09/27/2021
-# Customer intent: As a security officer, I need to understand how to use Azure Purview Insights to learn about sensitive data identified and classified and labeled during scanning.
+# Customer intent: As a security officer, I need to understand how to use Microsoft Purview Insights to learn about sensitive data identified and classified and labeled during scanning.
-# Classification insights about your data from Azure Purview
+# Classification insights about your data from Microsoft Purview
-This how-to guide describes how to access, view, and filter Azure Purview Classification insight reports for your data.
+This how-to guide describes how to access, view, and filter Microsoft Purview Classification insight reports for your data.
> [!IMPORTANT]
-> Azure Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
Supported data sources include: Azure Blob Storage, Azure Data Lake Storage (ADLS) GEN 1, Azure Data Lake Storage (ADLS) GEN 2, Azure Cosmos DB (SQL API), Azure Synapse Analytics (formerly SQL DW), Azure SQL Database, Azure SQL Managed Instance, SQL Server, Amazon S3 buckets, Amazon RDS databases (public preview), and Power BI.

In this how-to guide, you'll learn how to:

> [!div class="checklist"]
-> - Launch your Azure Purview account from Azure
+> - Launch your Microsoft Purview account from Azure
> - View classification insights on your data
> - Drill down for more classification details on your data

## Prerequisites
-Before getting started with Azure Purview insights, make sure that you've completed the following steps:
+Before getting started with Microsoft Purview insights, make sure that you've completed the following steps:
- Set up your Azure resources and populated the relevant accounts with test data
-- Set up and completed a scan on the test data in each data source. For more information, see [Manage data sources in Azure Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
+- Set up and completed a scan on the test data in each data source. For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
-- Signed in to Azure Purview with account with a [Data Reader or Data Curator role](catalog-permissions.md#roles).
+- Signed in to Microsoft Purview with an account that has a [Data Reader or Data Curator role](catalog-permissions.md#roles).
-For more information, see [Manage data sources in Azure Purview](manage-data-sources.md).
+For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md).
-## Use Azure Purview classification insights
+## Use Microsoft Purview classification insights
-In Azure Purview, classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
+In Microsoft Purview, classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
-Azure Purview uses the same sensitive information types as Microsoft 365, allowing you to stretch your existing security policies and protection across your entire data estate.
+Microsoft Purview uses the same sensitive information types as Microsoft 365, allowing you to stretch your existing security policies and protection across your entire data estate.
> [!NOTE]
> After you have scanned your source types, give **Classification** Insights a couple of hours to reflect the new assets.

**To view classification insights:**
-1. Go to the **Azure Purview** [instance screen in the Azure portal](https://aka.ms/purviewportal) and select your Azure Purview account.
+1. Go to the **Microsoft Purview** [instance screen in the Azure portal](https://aka.ms/purviewportal) and select your Microsoft Purview account.
-1. On the **Overview** page, in the **Get Started** section, select the **Azure Purview Studio** tile.
+1. On the **Overview** page, in the **Get Started** section, select the **Microsoft Purview Studio** tile.
-1. In Azure Purview, select the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Insights** area.
+1. In Microsoft Purview, select the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Insights** area.
-1. In the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Classification** to display the Azure Purview **Classification insights** report.
+1. In the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Classification** to display the Microsoft Purview **Classification insights** report.
:::image type="content" source="./media/insights/select-classification-labeling.png" alt-text="Classification insights report" lightbox="media/insights/select-classification-labeling.png":::
Do any of the following to learn more:
|**Sort the grid** |Select a column header to sort the grid by that column. |
|**Edit columns** | To display more or fewer columns in your grid, select **Edit Columns** :::image type="icon" source="media/insights/ico-columns.png" border="false":::, and then select the columns you want to view or change the order. |
|**Drill down further** | To drill down to a specific classification, select a name in the **Classification** column to view the **Classification by source** report. <br><br>This report displays data for the selected classification, including the source name, source type, subscription ID, and the numbers of classified files and tables. |
-|**Browse assets** | To browse through the assets found with a specific classification or source, select a classification or source, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
+|**Browse assets** | To browse through the assets found with a specific classification or source, select a classification or source, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
| | |

## Next steps
-Learn more about Azure Purview insight reports
+Learn more about Microsoft Purview insight reports:
> [!div class="nextstepaction"] > [Glossary insights](glossary-insights.md)
purview Concept Asset Normalization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-asset-normalization.md
Title: Asset normalization
-description: Learn how Azure Purview prevents duplicate assets in your data map through asset normalization
+description: Learn how Microsoft Purview prevents duplicate assets in your data map through asset normalization
# Asset normalization
-When ingesting assets into the Azure Purview data map, different sources updating the same data asset may send similar, but slightly different qualified names. While these qualified names represent the same asset, slight differences such as an extra character or different capitalization may cause these assets on the surface to appear different. To avoid storing duplicate entries and causing confusion when consuming the data catalog, Azure Purview applies normalization during ingestion to ensure all fully qualified names of the same entity type are in the same format.
+When ingesting assets into the Microsoft Purview data map, different sources updating the same data asset may send similar, but slightly different qualified names. While these qualified names represent the same asset, slight differences such as an extra character or different capitalization may cause these assets on the surface to appear different. To avoid storing duplicate entries and causing confusion when consuming the data catalog, Microsoft Purview applies normalization during ingestion to ensure all fully qualified names of the same entity type are in the same format.
For example, you scan in an Azure Blob with the qualified name `https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet`. This blob is also consumed by an Azure Data Factory pipeline which will then add lineage information to the asset. The ADF pipeline may be configured to read the file as `https://myAccount.file.core.windows.net//myshare/folderA/folderB/my-file.parquet`. While the qualified name is different, this ADF pipeline is consuming the same piece of data. Normalization ensures that all the metadata from both Azure Blob Storage and Azure Data Factory is visible on a single asset, `https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet`.

## Normalization rules
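As a rough illustration of this behavior, a simplified normalizer might lowercase the scheme and host, collapse duplicated path slashes, and trim a trailing slash. The function below is a hypothetical sketch (its name, rule set, and order of operations are assumptions for illustration only), not Purview's actual ingestion code:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_qualified_name(qualified_name: str) -> str:
    """Hypothetical sketch of qualified-name normalization."""
    parts = urlsplit(qualified_name)
    # Lowercase the scheme and host so capitalization differences
    # don't create duplicate assets.
    scheme = parts.scheme.lower()
    netloc = parts.netloc.lower()
    # Collapse repeated slashes in the path ("//myshare" -> "/myshare")
    # and drop any trailing slash.
    segments = [s for s in parts.path.split("/") if s]
    path = "/" + "/".join(segments) if segments else ""
    return urlunsplit((scheme, netloc, path, parts.query, parts.fragment))

# Both variants from the example above resolve to the same single asset:
blob_name = normalize_qualified_name(
    "https://myaccount.file.core.windows.net/myshare/folderA/folderB/my-file.parquet")
adf_name = normalize_qualified_name(
    "https://myAccount.file.core.windows.net//myshare/folderA/folderB/my-file.parquet")
assert blob_name == adf_name
```

The same sketch also covers the trailing-slash case shown later in this article: `normalize_qualified_name("https://myaccount.core.windows.net/")` returns `https://myaccount.core.windows.net`.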
-Below are the normalization rules applied by Azure Purview.
+Below are the normalization rules applied by Microsoft Purview.
### Encode curly brackets

Applies to: All Assets
Before: `https://myaccount.core.windows.net/`
After: `https://myaccount.core.windows.net`

## Next steps
-[Scan in an Azure Blob Storage](register-scan-azure-blob-storage-source.md) account into the Azure Purview data map.
+[Scan in an Azure Blob Storage](register-scan-azure-blob-storage-source.md) account into the Microsoft Purview data map.
purview Concept Best Practices Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-accounts.md
Title: Azure Purview accounts architecture and best practices
-description: This article provides examples of Azure Purview accounts architectures and describes best practices.
+ Title: Microsoft Purview accounts architecture and best practices
+description: This article provides examples of Microsoft Purview accounts architectures and describes best practices.
Last updated 10/12/2021
-# Azure Purview accounts architectures and best practices
+# Microsoft Purview accounts architectures and best practices
-Azure Purview is a unified data governance solution. You deploy an Azure Purview account to centrally manage data governance across your data estate, spanning both cloud and on-prem environments. To use Azure Purview as your centralized data governance solution, you need to deploy one or more Azure Purview accounts inside your Azure subscription. We recommend keeping the number of Azure Purview instances as minimum, however, in some cases more Azure Purview instances are needed to fulfill business security and compliance requirements.
+Microsoft Purview is a unified data governance solution. You deploy a Microsoft Purview account to centrally manage data governance across your data estate, spanning both cloud and on-premises environments. To use Microsoft Purview as your centralized data governance solution, you need to deploy one or more Microsoft Purview accounts inside your Azure subscription. We recommend keeping the number of Microsoft Purview instances to a minimum; however, in some cases more Microsoft Purview instances are needed to fulfill business security and compliance requirements.
-## Single Azure Purview account
+## Single Microsoft Purview account
-Consider deploying minimum number of Azure Purview accounts for the entire organization. This approach takes maximum advantage of the "network effects" where the value of the platform increases exponentially as a function of the data that resides inside the platform.
+Consider deploying the minimum number of Microsoft Purview accounts for the entire organization. This approach takes maximum advantage of the "network effects" where the value of the platform increases exponentially as a function of the data that resides inside the platform.
-Use [Azure Purview collections hierarchy](./concept-best-practices-collections.md) to lay out your organization's data management structure inside a single Azure Purview account. In this scenario, one Azure Purview account is deployed in an Azure subscription. Data sources from one or more Azure subscriptions can be registered and scanned inside the Azure Purview. You can also register and scan data sources from your on-premises or multi-cloud environments.
+Use the [Microsoft Purview collections hierarchy](./concept-best-practices-collections.md) to lay out your organization's data management structure inside a single Microsoft Purview account. In this scenario, one Microsoft Purview account is deployed in an Azure subscription. Data sources from one or more Azure subscriptions can be registered and scanned inside the Microsoft Purview account. You can also register and scan data sources from your on-premises or multi-cloud environments.
-## Multiple Azure Purview accounts
+## Multiple Microsoft Purview accounts
-Some organizations may require setting up multiple Azure Purview accounts. Review the following scenarios as few examples when defining your Azure Purview accounts architecture:ΓÇ»
+Some organizations may require setting up multiple Microsoft Purview accounts. Review the following scenarios as a few examples to consider when defining your Microsoft Purview accounts architecture:
### Testing new features
-It is recommended to create a new instance of Azure Purview account when testing scan configurations or classifications in isolated environments. For some scenarios, there is a "versioning" feature in some areas of the platform such as glossary, however, it would be easier to have a "disposable" instance of Azure Purview to freely test expected functionality and then plan to roll out the feature into the production instance.
+It is recommended to create a new Microsoft Purview account when testing scan configurations or classifications in isolated environments. For some scenarios, there is a "versioning" feature in some areas of the platform, such as the glossary; however, it is easier to have a "disposable" instance of Microsoft Purview to freely test expected functionality, and then plan to roll out the feature into the production instance.
-Additionally, consider using a test Azure Purview account when you cannot perform a rollback. For example, currently you cannot remove a glossary term attribute from an Azure Purview instance once it is added to your Azure Purview account. In this case, it is recommended using a test Azure Purview account first.
+Additionally, consider using a test Microsoft Purview account when you cannot perform a rollback. For example, currently you cannot remove a glossary term attribute from a Microsoft Purview instance once it is added to your Microsoft Purview account. In this case, it is recommended to use a test Microsoft Purview account first.
### Isolating Production and non-production environments
-Consider deploying separate instances of Azure Purview accounts for development, testing and production environments, specially when you have separate instances of data for each environment.
+Consider deploying separate instances of Microsoft Purview accounts for development, testing, and production environments, especially when you have separate instances of data for each environment.
-In this scenario, production and non-production data sources can be registered and scanned inside their corresponding Azure Purview instances.
+In this scenario, production and non-production data sources can be registered and scanned inside their corresponding Microsoft Purview instances.
-Optionally, you can register a data source in more than one Azure Purview instance, if needed.
+Optionally, you can register a data source in more than one Microsoft Purview instance, if needed.
### Fulfilling compliance requirements
-When you scan data sources in Azure Purview, information related to your metadata is ingested and stored inside your Azure Purview Data Map in the Azure region where your Azure Purview account is deployed. Consider deploying separate instances of Azure Purview if you have specific regulatory and compliance requirements that include even having metadata in a specific geographical location.
+When you scan data sources in Microsoft Purview, information related to your metadata is ingested and stored inside your Microsoft Purview Data Map in the Azure region where your Microsoft Purview account is deployed. Consider deploying separate instances of Microsoft Purview if you have specific regulatory and compliance requirements that include even having metadata in a specific geographical location.
-If your organization has data in multiple geographies and you must keep metadata in the same region as the actual data, you have to deploy multiple Azure Purview instances, one for each geography. In this case, data sources from each regions should be registered and scanned in the Azure Purview account that corresponds to the data source region or geography.
+If your organization has data in multiple geographies and you must keep metadata in the same region as the actual data, you have to deploy multiple Microsoft Purview instances, one for each geography. In this case, data sources from each region should be registered and scanned in the Microsoft Purview account that corresponds to the data source region or geography.
### Having Data sources distributed across multiple tenants
-Currently, Azure Purview doesn't support multi-tenancy. If you have Azure data sources distributed across multiple Azure subscriptions under different Azure Active Directory tenants, it is recommended deploying separate Azure Purview accounts under each tenant.
+Currently, Microsoft Purview doesn't support multi-tenancy. If you have Azure data sources distributed across multiple Azure subscriptions under different Azure Active Directory tenants, it is recommended to deploy separate Microsoft Purview accounts under each tenant.
-An exception applies to VM-based data sources and Power BI tenants.For more information about how to scan and register a cross tenant Power BI in a single Azure Purview account, see, [Register and scan a cross-tenant Power BI](./register-scan-power-bi-tenant.md).
+An exception applies to VM-based data sources and Power BI tenants. For more information about how to scan and register a cross-tenant Power BI in a single Microsoft Purview account, see [Register and scan a cross-tenant Power BI](./register-scan-power-bi-tenant.md).
### Billing model
-Review [Azure Purview Pricing model](https://azure.microsoft.com/pricing/details/azure-purview) when defining budgeting model and designing Azure Purview architecture for your organization. One billing is generated for a single Azure Purview account in the subscription where Azure Purview account is deployed. This model also applies to other Azure Purview costs such as scanning and classifying metadata inside Azure Purview Data Map.
+Review the [Microsoft Purview pricing model](https://azure.microsoft.com/pricing/details/azure-purview) when defining the budgeting model and designing the Microsoft Purview architecture for your organization. One bill is generated for a single Microsoft Purview account in the subscription where the Microsoft Purview account is deployed. This model also applies to other Microsoft Purview costs, such as scanning and classifying metadata inside the Microsoft Purview Data Map.
-Some organizations often have many business units (BUs) that operate separately, and, in some cases, they don't even share billing with each other. In those cases, the organization will end up creating an Azure Purview instance for each BU. This model is not ideal, however, may be necessary, especially because Business Units are often not willing to share Azure billing.
+Some organizations have many business units (BUs) that operate separately and, in some cases, don't even share billing with each other. In those cases, the organization will end up creating a Microsoft Purview instance for each BU. This model is not ideal; however, it may be necessary, especially because business units are often not willing to share Azure billing.
For more information about cloud computing cost models in chargeback and showback models, see [What is cloud accounting?](/azure/cloud-adoption-framework/strategy/cloud-accounting).

## Additional considerations and recommendations

-- Keep the number of Azure Purview accounts low for simplified administrative overhead. If you plan building multiple Azure Purview accounts, you may require creating and managing additional scans, access control model, credentials, and runtimes across your Azure Purview accounts. Additionally, you may need to manage classifications and glossary terms for each Azure Purview account.
+- Keep the number of Microsoft Purview accounts low for simplified administrative overhead. If you plan to build multiple Microsoft Purview accounts, you may need to create and manage additional scans, access control models, credentials, and runtimes across your Microsoft Purview accounts. Additionally, you may need to manage classifications and glossary terms for each Microsoft Purview account.
-- Review your budgeting and financial requirements. If possible, use chargeback or showback model when using Azure services and divide the cost of Azure Purview across the organization to keep the number of Azure Purview accounts minimum.
+- Review your budgeting and financial requirements. If possible, use a chargeback or showback model when using Azure services, and divide the cost of Microsoft Purview across the organization to keep the number of Microsoft Purview accounts to a minimum.
-- Use [Azure Purview collections](concept-best-practices-collections.md) to define metadata access control inside Azure Purview Data Map for your organization's business users, data management and governance teams. For more information, see [Access control in Azure Purview](./catalog-permissions.md).
+- Use [Microsoft Purview collections](concept-best-practices-collections.md) to define metadata access control inside Microsoft Purview Data Map for your organization's business users, data management and governance teams. For more information, see [Access control in Microsoft Purview](./catalog-permissions.md).
-- Review [Azure Purview limits](./how-to-manage-quotas.md#azure-purview-limits) before deploying any new Azure Purview accounts. Currently, the default limit of Azure Purview accounts per region, per tenant (all subscriptions combined) is 3. You may need to contact Microsoft support to increase this limit in your subscription or tenant before deploying extra instances of Azure Purview.ΓÇ»
+- Review [Microsoft Purview limits](./how-to-manage-quotas.md#microsoft-purview-limits) before deploying any new Microsoft Purview accounts. Currently, the default limit of Microsoft Purview accounts per region, per tenant (all subscriptions combined) is 3. You may need to contact Microsoft support to increase this limit in your subscription or tenant before deploying extra instances of Microsoft Purview.
-- Review [Azure Purview prerequisites](./create-catalog-portal.md#prerequisites) before deploying any new Azure Purview accounts in your environment.
+- Review [Microsoft Purview prerequisites](./create-catalog-portal.md#prerequisites) before deploying any new Microsoft Purview accounts in your environment.
## Next steps

-- [Create an Azure Purview account](./create-catalog-portal.md)
+- [Create a Microsoft Purview account](./create-catalog-portal.md)
purview Concept Best Practices Asset Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-asset-lifecycle.md
Title: Azure Purview asset management processes
-description: This article provides process and best practice guidance to effectively manage the lifecycle of assets in the Azure Purview catalog
+ Title: Microsoft Purview asset management processes
+description: This article provides process and best practice guidance to effectively manage the lifecycle of assets in the Microsoft Purview catalog
Last updated 01/06/2022
# Business processes for managing data effectively
-As data and content has a lifecycle that requires active management (for example, acquisition - processing - disposal) assets in the Azure Purview data catalog need active management in a similar way. "Assets" in the catalog include the technical metadata that describes collection, lineage and scan information. Metadata describing the business structure of data such as glossary, classifications and ownership also needs to be managed.
+As data and content have a lifecycle that requires active management (for example, acquisition, processing, and disposal), assets in the Microsoft Purview data catalog need active management in a similar way. "Assets" in the catalog include the technical metadata that describes collection, lineage, and scan information. Metadata describing the business structure of data, such as glossary, classifications, and ownership, also needs to be managed.
To manage data assets, responsible people in the organization must understand how and when to apply data governance processes and manage workflows.
-## Why do you need business processes for managing assets in Azure Purview?
+## Why do you need business processes for managing assets in Microsoft Purview?
-An organization employing Azure Purview should define processes and people structure to manage the lifecycle of assets and ensure data is valuable to users of the catalog. Metadata in the catalog must be maintained to be able to manage data at scale for discovery, quality, security and privacy.
+An organization employing Microsoft Purview should define processes and people structure to manage the lifecycle of assets and ensure data is valuable to users of the catalog. Metadata in the catalog must be maintained to be able to manage data at scale for discovery, quality, security and privacy.
### Benefits

-- Agreed definition and structure of data is required for the Azure Purview data catalog to provide effective data search and protection functionality at scale across organizations' data estates.
+- Agreed definition and structure of data is required for the Microsoft Purview data catalog to provide effective data search and protection functionality at scale across organizations' data estates.
- Defining and using processes for asset lifecycle management is key to maintaining accurate asset metadata, which will improve usability of the catalog and the ability to protect relevant data.
- Business users looking for data will be more likely to use the catalog to search for data when it is maintained using data governance processes.
-### Best practice processes that should be considered when starting the data governance journey with Azure Purview:
+### Best practice processes that should be considered when starting the data governance journey with Microsoft Purview:
- **Capture and maintain assets** - Understand how to initially structure and record assets in the catalog for management
- **Glossary and Classification management** - Understand how to effectively manage the catalog metadata needed to apply and maintain a business glossary
-- **Moving and deleting assets** - Managing collections and assets by understanding how to move assets from one collection to another or delete asset metadata from Azure Purview
+- **Moving and deleting assets** - Managing collections and assets by understanding how to move assets from one collection to another or delete asset metadata from Microsoft Purview
## Data curator organizational personas
-The [Data Curator](catalog-permissions.md) role in Azure Purview controls read/write permission to assets within a collection group. To support the data governance processes, the Data Curator role has been granted to separate data governance personas in the organization:
+The [Data Curator](catalog-permissions.md) role in Microsoft Purview controls read/write permission to assets within a collection group. To support the data governance processes, the Data Curator role has been granted to separate data governance personas in the organization:
> [!Note]
-> The 4 **personas** listed are suggested read/write users, and would all be assigned Data Curator role in Azure Purview.
+> The 4 **personas** listed are suggested read/write users, and would all be assigned the Data Curator role in Microsoft Purview.
- Data Owner or Data Expert:
## 1. Capture and maintain assets
-This process describes the high-level steps and suggested roles to capture and maintain assets in the Azure Purview data catalog.
+This process describes the high-level steps and suggested roles to capture and maintain assets in the Microsoft Purview data catalog.
:::image type="content" source="media/concept-best-practices/assets-capturing-asset-metadata.png" alt-text="Business Process 1 - Capturing and Maintaining Assets." lightbox="media/concept-best-practices/assets-capturing-asset-metadata.png" border="true":::
| Process Step | Guidance |
| | -- |
-| 1 | [Azure Purview collections architecture and best practices](concept-best-practices-collections.md) |
+| 1 | [Microsoft Purview collections architecture and best practices](concept-best-practices-collections.md) |
| 2 | [How to create and manage collections](how-to-create-and-manage-collections.md)
-| 3 & 4 | [Understand Azure Purview access and permissions](catalog-permissions.md)
-| 5 | [Azure Purview supported sources](purview-connector-overview.md) <br> [Azure Purview private endpoint networking](catalog-private-link.md) |
+| 3 & 4 | [Understand Microsoft Purview access and permissions](catalog-permissions.md)
+| 5 | [Microsoft Purview supported sources](purview-connector-overview.md) <br> [Microsoft Purview private endpoint networking](catalog-private-link.md) |
| 6 | [How to manage multi-cloud data sources](manage-data-sources.md)
-| 7 | [Best practices for scanning data sources in Azure Purview](concept-best-practices-scanning.md)
+| 7 | [Best practices for scanning data sources in Microsoft Purview](concept-best-practices-scanning.md)
| 8, 9 & 10 | [Search the data catalog](how-to-search-catalog.md) <br> [Browse the data catalog](how-to-browse-catalog.md)

## 2. Glossary and classification maintenance
-This process describes the high-level steps and roles to manage and define the business glossary and classifications metadata to enrich the Azure Purview data catalog.
+This process describes the high-level steps and roles to manage and define the business glossary and classifications metadata to enrich the Microsoft Purview data catalog.
:::image type="content" source="media/concept-best-practices/assets-maintaining-glossary-and-classifications.png" alt-text="Business Process 2 - Maintaining glossary and classifications" lightbox="media/concept-best-practices/assets-maintaining-glossary-and-classifications.png" border="true":::
| Process Step | Guidance |
| --- | --- |
-| 1 & 2 | [Understand Azure Purview access and permissions](catalog-permissions.md) |
+| 1 & 2 | [Understand Microsoft Purview access and permissions](catalog-permissions.md) |
| 3 | [Create custom classifications and classification rules](create-a-custom-classification-and-classification-rule.md)
| 4 | [Create a scan rule set](create-a-scan-rule-set.md)
| 5 & 6 | [Apply classifications to assets](apply-classifications.md)
| 12 & 13 | [Browse the Data Catalog](how-to-browse-catalog.md)

> [!Note]
-> It is not currently possible to edit glossary term attributes (for example, Status) in bulk using the Azure Purview UI, but it is possible to export the glossary in bulk, edit in Excel and re-import with amendments.
+> It is not currently possible to edit glossary term attributes (for example, Status) in bulk using the Microsoft Purview UI, but it is possible to export the glossary in bulk, edit in Excel and re-import with amendments.
## 3. Moving assets between collections
-This process describes the high-level steps and roles to move assets between collections using the Azure Purview portal.
+This process describes the high-level steps and roles to move assets between collections using the Microsoft Purview portal.
:::image type="content" source="media/concept-best-practices/assets-moving-assets-between-collections.png" alt-text="Business Process 3 - Moving assets between collections" lightbox="media/concept-best-practices/assets-moving-assets-between-collections.png" border="true":::
| Process Step | Guidance |
| --- | --- |
-| 1 & 2 | [Azure Purview collections architecture and best practice](concept-best-practices-collections.md) |
+| 1 & 2 | [Microsoft Purview collections architecture and best practice](concept-best-practices-collections.md) |
| 3 | [Create a collection](quickstart-create-collection.md)
| 4 | [Understand access and permissions](catalog-permissions.md)
| 5 | [How to manage collections](how-to-create-and-manage-collections.md#add-assets-to-collections)
| 6 | [Check collection permissions](how-to-create-and-manage-collections.md#prerequisites)
-| 7 | [Browse the Azure Purview Catalog](how-to-browse-catalog.md)
+| 7 | [Browse the Microsoft Purview Catalog](how-to-browse-catalog.md)
> [!Note]
-> It is not currently possible to bulk move assets from one collection to another using the Azure Purview portal.
+> It is not currently possible to bulk move assets from one collection to another using the Microsoft Purview portal.
## 4. Deleting asset metadata
-This process describes the high-level steps and roles to delete asset metadata from the data catalog using the Azure Purview portal.
+This process describes the high-level steps and roles to delete asset metadata from the data catalog using the Microsoft Purview portal.
Asset Metadata may need to be deleted manually for many reasons:
> [!Note]
> Before deleting assets, please refer to the how-to guide to review considerations: [How to delete assets](catalog-asset-details.md#deleting-assets)

### Process Guidance
| 6 | [Scanning best practices](concept-best-practices-scanning.md)

> [!Note]
-> - Deleting a collection, registered source or scan from Azure Purview does not delete all associated asset metadata.
-> - It is not possible to bulk delete asset metadata using the Azure Purview Portal
+> - Deleting a collection, registered source or scan from Microsoft Purview does not delete all associated asset metadata.
+> - It is not possible to bulk delete asset metadata using the Microsoft Purview Portal
> - Deleting the asset metadata does not delete all associated lineage or other relationship data (for example, glossary or classification assignments) about the asset from the data map. The asset information and relationships will no longer be visible in the portal.

## Next steps

-- [Azure Purview accounts architectures and best practices](concept-best-practices-accounts.md)
-- [Azure Purview collections architectures and best practices](concept-best-practices-collections.md)
-- [Azure Purview glossary best practices](concept-best-practices-glossary.md)
-- [Azure Purview classifications best practices](concept-best-practices-classification.md)
+- [Microsoft Purview accounts architectures and best practices](concept-best-practices-accounts.md)
+- [Microsoft Purview collections architectures and best practices](concept-best-practices-collections.md)
+- [Microsoft Purview glossary best practices](concept-best-practices-glossary.md)
+- [Microsoft Purview classifications best practices](concept-best-practices-classification.md)
purview Concept Best Practices Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-automation.md
- Title: Azure Purview automation best practices
-description: This article provides an overview of Azure Purview automation tools and guidance on what to use when.
+ Title: Microsoft Purview automation best practices
+description: This article provides an overview of Microsoft Purview automation tools and guidance on what to use when.
Last updated 11/23/2021
-# Azure Purview automation best practices
+# Microsoft Purview automation best practices
-While Azure Purview provides an out of the box user experience with Azure Purview Studio, not all tasks are suited to the point-and-click nature of the graphical user experience.
+While Microsoft Purview provides an out of the box user experience with Microsoft Purview Studio, not all tasks are suited to the point-and-click nature of the graphical user experience.
For example:
* Triggering a scan to run as part of an automated process.
* Monitoring for metadata changes in real time.
* Building your own custom user experience.
-Azure Purview provides several tools in which we can use to interact with the underlying platform, in an automated, and programmatic fashion. Because of the open nature of the Azure Purview service, we can automate different aspects, from the control plane, made accessible via Azure Resource Manager, to Azure Purview's multiple data planes (catalog, scanning, administration, and more).
+Microsoft Purview provides several tools that we can use to interact with the underlying platform in an automated, programmatic fashion. Because of the open nature of the Microsoft Purview service, we can automate different aspects, from the control plane, made accessible via Azure Resource Manager, to Microsoft Purview's multiple data planes (catalog, scanning, administration, and more).
This article provides a summary of the options available, and guidance on what to use when.
To implement infrastructure as code, we can build [ARM templates](../azure-resource-manager/templates/overview.md) using JSON or [Bicep](../azure-resource-manager/bicep/overview.md), or open-source alternatives such as [Terraform](/azure/developer/terraform/overview).

When to use?
-* Scenarios that require repeated Azure Purview deployments, templates ensure Azure Purview along with any other dependent resources are deployed in a consistent manner.
-* When coupled with [deployment scripts](../azure-resource-manager/templates/deployment-script-template.md), templated solutions can traverse the control and data planes, enabling the deployment of end-to-end solutions. For example, create an Azure Purview account, register sources, trigger scans.
+* Scenarios that require repeated Microsoft Purview deployments, templates ensure Microsoft Purview along with any other dependent resources are deployed in a consistent manner.
+* When coupled with [deployment scripts](../azure-resource-manager/templates/deployment-script-template.md), templated solutions can traverse the control and data planes, enabling the deployment of end-to-end solutions. For example, create a Microsoft Purview account, register sources, trigger scans.
## Command Line
-Azure CLI and Azure PowerShell are command-line tools that enable you to manage Azure resources such as Azure Purview. While the list of commands will grow over time, only a subset of Azure Purview control plane operations is currently available. For an up-to-date list of commands currently available, check out the documentation ([Azure CLI](/cli/azure/purview) | [Azure PowerShell](/powershell/module/az.purview)).
+Azure CLI and Azure PowerShell are command-line tools that enable you to manage Azure resources such as Microsoft Purview. While the list of commands will grow over time, only a subset of Microsoft Purview control plane operations is currently available. For an up-to-date list of commands currently available, check out the documentation ([Azure CLI](/cli/azure/purview) | [Azure PowerShell](/powershell/module/az.purview)).
-* **Azure CLI** - A cross-platform tool that allows the execution of commands through a terminal using interactive command-line prompts or a script. Azure CLI has a **purview extension** that allows for the management of Azure Purview accounts. For example, `az purview account`.
-* **Azure PowerShell** - A cross-platform task automation program, consisting of a set of cmdlets for managing Azure resources. Azure PowerShell has a module called **Az.Purview** that allows for the management of Azure Purview accounts. For example, `Get-AzPurviewAccount`.
+* **Azure CLI** - A cross-platform tool that allows the execution of commands through a terminal using interactive command-line prompts or a script. Azure CLI has a **purview extension** that allows for the management of Microsoft Purview accounts. For example, `az purview account`.
+* **Azure PowerShell** - A cross-platform task automation program, consisting of a set of cmdlets for managing Azure resources. Azure PowerShell has a module called **Az.Purview** that allows for the management of Microsoft Purview accounts. For example, `Get-AzPurviewAccount`.
When to use?
* Best suited for ad-hoc tasks and quick exploratory operations.

## API
-REST APIs are HTTP endpoints that surface different methods (`POST`, `GET`, `PUT`, `DELETE`), triggering actions such as create, read, update, or delete (CRUD). Azure Purview exposes a large portion of the Azure Purview platform via multiple [service endpoints](/rest/api/purview/).
+REST APIs are HTTP endpoints that surface different methods (`POST`, `GET`, `PUT`, `DELETE`), triggering actions such as create, read, update, or delete (CRUD). Microsoft Purview exposes a large portion of the Microsoft Purview platform via multiple [service endpoints](/rest/api/purview/).
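As a sketch of what calling these endpoints involves, the snippet below assembles (but does not send) a catalog search query. The account name, endpoint path, and `api-version` value are illustrative assumptions; check the service endpoint reference for the exact routes.

```python
# Minimal sketch: assembling a Microsoft Purview data-plane REST call.
# The account name, path, and api-version below are illustrative assumptions.
import json

def build_search_request(account_name: str, keywords: str, limit: int = 10):
    """Return the URL and JSON body for a catalog search query (not sent)."""
    url = (f"https://{account_name}.purview.azure.com"
           "/catalog/api/search/query?api-version=2022-03-01-preview")
    body = {"keywords": keywords, "limit": limit}
    return url, json.dumps(body)

url, body = build_search_request("contoso-purview", "customer")
print(url)
```

The returned URL and body would be passed to any HTTP client, together with an Azure AD bearer token for authentication.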
When to use?
* Required operations not available via Azure CLI, Azure PowerShell, or native client libraries.
* Custom application development or process automation.

## Streaming (Atlas Kafka)
-Each Azure Purview account comes with an optional fully managed event hub, accessible via the Atlas Kafka endpoint found via the Azure portal > Azure Purview Account > Properties. Azure Purview events can be monitored by consuming messages from the event hub. External systems can also use the event hub to publish events to Azure Purview as they occur.
-* **Consume Events** - Azure Purview will send notifications about metadata changes to Kafka topic **ATLAS_ENTITIES**. Applications interested in metadata changes can monitor for these notifications. Supported operations include: `ENTITY_CREATE`, `ENTITY_UPDATE`, `ENTITY_DELETE`, `CLASSIFICATION_ADD`, `CLASSIFICATION_UPDATE`, `CLASSIFICATION_DELETE`.
-* **Publish Events** - Azure Purview can be notified of metadata changes via notifications to Kafka topic **ATLAS_HOOK**. Supported operations include: `ENTITY_CREATE_V2`, `ENTITY_PARTIAL_UPDATE_V2`, `ENTITY_FULL_UPDATE_V2`, `ENTITY_DELETE_V2`.
+Each Microsoft Purview account comes with an optional fully managed event hub, accessible via the Atlas Kafka endpoint found via the Azure portal > Microsoft Purview Account > Properties. Microsoft Purview events can be monitored by consuming messages from the event hub. External systems can also use the event hub to publish events to Microsoft Purview as they occur.
+* **Consume Events** - Microsoft Purview will send notifications about metadata changes to Kafka topic **ATLAS_ENTITIES**. Applications interested in metadata changes can monitor for these notifications. Supported operations include: `ENTITY_CREATE`, `ENTITY_UPDATE`, `ENTITY_DELETE`, `CLASSIFICATION_ADD`, `CLASSIFICATION_UPDATE`, `CLASSIFICATION_DELETE`.
+* **Publish Events** - Microsoft Purview can be notified of metadata changes via notifications to Kafka topic **ATLAS_HOOK**. Supported operations include: `ENTITY_CREATE_V2`, `ENTITY_PARTIAL_UPDATE_V2`, `ENTITY_FULL_UPDATE_V2`, `ENTITY_DELETE_V2`.
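To illustrate the consume side, here is a minimal sketch that routes a message from the **ATLAS_ENTITIES** topic by its operation type. The message shape used here is an assumption for illustration; real Atlas notifications carry more fields.

```python
# Hedged sketch: routing Atlas Kafka notifications consumed from the
# ATLAS_ENTITIES topic. The message shape is an illustrative assumption.
import json

SUPPORTED_OPS = {"ENTITY_CREATE", "ENTITY_UPDATE", "ENTITY_DELETE",
                 "CLASSIFICATION_ADD", "CLASSIFICATION_UPDATE",
                 "CLASSIFICATION_DELETE"}

def route_notification(raw: str) -> str:
    """Return the operation type of a notification, or 'IGNORED'."""
    msg = json.loads(raw)
    op = msg.get("message", {}).get("operationType", "")
    return op if op in SUPPORTED_OPS else "IGNORED"

sample = json.dumps({"message": {"operationType": "ENTITY_UPDATE",
                                 "entity": {"typeName": "azure_sql_table"}}})
print(route_notification(sample))  # ENTITY_UPDATE
```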
When to use?
* Applications or processes that need to publish or consume Apache Atlas events in real time.

## Streaming (Diagnostic Logs)
-Azure Purview can send platform logs and metrics via "Diagnostic settings" to one or more destinations (Log Analytics Workspace, Storage Account, or Azure Event Hubs). [Available metrics](./how-to-monitor-with-azure-monitor.md#available-metrics) include `Data Map Capacity Units`, `Data Map Storage Size`, `Scan Canceled`, `Scan Completed`, `Scan Failed`, and `Scan Time Taken`.
+Microsoft Purview can send platform logs and metrics via "Diagnostic settings" to one or more destinations (Log Analytics Workspace, Storage Account, or Azure Event Hubs). [Available metrics](./how-to-monitor-with-azure-monitor.md#available-metrics) include `Data Map Capacity Units`, `Data Map Storage Size`, `Scan Canceled`, `Scan Completed`, `Scan Failed`, and `Scan Time Taken`.
-Once configured, Azure Purview automatically sends these events to the destination as a JSON payload. From there, application subscribers that need to consume and act on these events can do so with the option of orchestrating downstream logic.
+Once configured, Microsoft Purview automatically sends these events to the destination as a JSON payload. From there, application subscribers that need to consume and act on these events can do so with the option of orchestrating downstream logic.
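As an illustration of acting on that payload, the sketch below filters `Scan Failed` records out of a diagnostic batch. The record shape (a top-level `records` array with an `operationName` field) is an assumption for the example; consult the resource-log schema for the exact fields.

```python
# Hedged sketch: filtering "Scan Failed" events out of a diagnostic-settings
# JSON payload. The record shape here is an illustrative assumption.
import json

def failed_scans(payload: str):
    """Return the records whose operationName marks a failed scan."""
    records = json.loads(payload).get("records", [])
    return [r for r in records if r.get("operationName") == "Scan Failed"]

payload = json.dumps({"records": [
    {"operationName": "Scan Completed", "resourceId": "/subscriptions/.../contoso"},
    {"operationName": "Scan Failed", "resourceId": "/subscriptions/.../contoso"},
]})
print(len(failed_scans(payload)))  # 1
```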
When to use?
* Applications or processes that need to consume diagnostic events in real time.

## SDK
-Microsoft provides Azure SDKs to programmatically manage and interact with Azure services. Azure Purview client libraries are available in several languages (.NET, Java, JavaScript, and Python), designed to be consistent, approachable, and idiomatic.
+Microsoft provides Azure SDKs to programmatically manage and interact with Azure services. Microsoft Purview client libraries are available in several languages (.NET, Java, JavaScript, and Python), designed to be consistent, approachable, and idiomatic.
When to use?
* Recommended over the REST API, as the native client libraries (where available) follow standard programming language conventions in line with the target language and will feel natural to the developer.
* **azure-mgmt-purview** - [Docs](/python/api/azure-mgmt-purview/?view=azure-python&preserve-view=true) | [PyPi](https://pypi.org/project/azure-mgmt-purview/)

## Next steps
-* [Azure Purview REST API](/rest/api/purview)
+* [Microsoft Purview REST API](/rest/api/purview)
purview Concept Best Practices Classification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-classification.md
- Title: Azure Purview classification best practices
-description: This article provides best practices for classification in Azure Purview.
+ Title: Microsoft Purview classification best practices
+description: This article provides best practices for classification in Microsoft Purview.
Last updated 11/18/2021
-# Azure Purview classification best practices
+# Microsoft Purview classification best practices
-Data classification, in the context of Azure Purview, is a way of categorizing data assets by assigning unique logical labels or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on.
+Data classification, in the context of Microsoft Purview, is a way of categorizing data assets by assigning unique logical labels or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on.
To learn more about classification, see [Classification](concept-classification.md).
When you create and configure the classification rules for a custom classification:
* Select the appropriate classification name for which the classification rule is to be created.
-* Azure Purview supports the following two methods for creating custom classification rules:
+* Microsoft Purview supports the following two methods for creating custom classification rules:
* Use the **Regular expression** (regex) method if you can consistently express the data element by using a regular expression pattern or you can generate the pattern by using a data file. Ensure that the sample data reflects the population.
* Use the **Dictionary** method only if the list of values in the dictionary file represents all possible values of data to be classified and is expected to conform to a given set of data (considering future values as well).
* Configure the regex pattern for the data to be classified. Ensure that the regex pattern is generic enough to cater to the data being classified.
- * Azure Purview also provides a feature to generate a suggested regex pattern. After you upload a sample data file, select one of the suggested patterns, and then select **Add to patterns** to use the suggested data and column patterns. You can modify the suggested patterns, or you can type your own patterns without having to upload a file.
+ * Microsoft Purview also provides a feature to generate a suggested regex pattern. After you upload a sample data file, select one of the suggested patterns, and then select **Add to patterns** to use the suggested data and column patterns. You can modify the suggested patterns, or you can type your own patterns without having to upload a file.
* You can also configure the column name pattern for the column to be classified, to minimize false positives.
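To make the regex and column-name pattern idea concrete, here is a local sketch of how such a rule might behave, using an example SWIFT/BIC-shaped data pattern. The patterns and match threshold are illustrative assumptions, not the service's exact matching logic.

```python
# Illustrative sketch of a custom regex classification rule: a data pattern
# plus a column-name pattern to reduce false positives. The patterns and
# the 60% match threshold are example assumptions.
import re

DATA_PATTERN = re.compile(r"^[A-Z]{6}[A-Z0-9]{2}([A-Z0-9]{3})?$")  # SWIFT/BIC shape
COLUMN_PATTERN = re.compile(r"swift|bic", re.IGNORECASE)

def column_matches(column_name: str, values) -> bool:
    """Classify a column only if its name and enough of its data match."""
    if not COLUMN_PATTERN.search(column_name):
        return False
    hits = sum(1 for v in values if DATA_PATTERN.match(v))
    return hits / max(len(values), 1) >= 0.6  # example threshold

print(column_matches("swift_code", ["DEUTDEFF", "CHASUS33XXX", "n/a"]))  # True
```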
Here are some considerations to bear in mind as you're defining classifications:
* Set priorities and develop a plan to achieve the security and compliance needs of an organization.
* Describe the phases in the data preparation processes (raw zone, landing zone, and so on) and assign the classifications to specific assets to mark the phase in the process.
-* With Azure Purview, you can assign classifications at the asset or column level automatically by including relevant classifications in the scan rule, or you can assign them manually after you ingest the metadata into Azure Purview.
-* For automatic assignment, see [Supported data stores in Azure Purview](./azure-purview-connector-overview.md).
-* Before you scan your data sources in Azure Purview, it is important to understand your data and configure the appropriate scan rule set for it (for example, by selecting relevant system classification, custom classifications, or a combination of both), because it could affect your scan performance. For more information, see [Supported classifications in Azure Purview](./supported-classifications.md).
-* The Azure Purview scanner applies data sampling rules for deep scans (subject to classification) for both system and custom classifications. The sampling rule is based on the type of data sources. For more information, see the "Sampling within a file" section in [Supported data sources and file types in Azure Purview](./sources-and-scans.md#sampling-within-a-file).
+* With Microsoft Purview, you can assign classifications at the asset or column level automatically by including relevant classifications in the scan rule, or you can assign them manually after you ingest the metadata into Microsoft Purview.
+* For automatic assignment, see [Supported data stores in Microsoft Purview](./azure-purview-connector-overview.md).
+* Before you scan your data sources in Microsoft Purview, it is important to understand your data and configure the appropriate scan rule set for it (for example, by selecting relevant system classification, custom classifications, or a combination of both), because it could affect your scan performance. For more information, see [Supported classifications in Microsoft Purview](./supported-classifications.md).
+* The Microsoft Purview scanner applies data sampling rules for deep scans (subject to classification) for both system and custom classifications. The sampling rule is based on the type of data sources. For more information, see the "Sampling within a file" section in [Supported data sources and file types in Microsoft Purview](./sources-and-scans.md#sampling-within-a-file).
> [!Note]
> **Distinct data threshold**: This is the total number of distinct data values that need to be found in a column before the scanner runs the data pattern on it. Distinct data threshold has nothing to do with pattern matching but it is a pre-requisite for pattern matching. System classification rules require there to be at least 8 distinct values in each column to subject them to classification. The system requires this value to make sure that the column contains enough data for the scanner to accurately classify it. For example, a column that contains multiple rows that all contain the value 1 won't be classified. Columns that contain one row with a value and the rest of the rows have null values also won't get classified. If you specify multiple patterns, this value applies to each of them.
-* The sampling rules apply to resource sets as well. For more information, see the "Resource set file sampling" section in [Supported data sources and file types in Azure Purview](./sources-and-scans.md#resource-set-file-sampling).
+* The sampling rules apply to resource sets as well. For more information, see the "Resource set file sampling" section in [Supported data sources and file types in Microsoft Purview](./sources-and-scans.md#resource-set-file-sampling).
* Custom classifications can't be applied on document type assets using custom classification rules. Classifications for such types can be applied manually only.
* Custom classifications aren't included in any default scan rules. Therefore, if automatic assignment of custom classifications is expected, you must deploy and use a custom scan rule that includes the custom classification to run the scan.
-* If you apply classifications manually from Azure Purview Studio, such classifications are retained in subsequent scans.
+* If you apply classifications manually from Microsoft Purview Studio, such classifications are retained in subsequent scans.
* Subsequent scans won't remove any classifications from assets, if they were detected previously, even if the classification rules are inapplicable.
-* For *encrypted source* data assets, Azure Purview picks only file names, fully qualified names, schema details for structured file types, and database tables. For classification to work, decrypt the encrypted data before you run scans.
+* For *encrypted source* data assets, Microsoft Purview picks only file names, fully qualified names, schema details for structured file types, and database tables. For classification to work, decrypt the encrypted data before you run scans.
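The distinct-data-threshold precheck described in the note above can be sketched locally: a column is only subjected to pattern matching once it holds at least 8 distinct non-null values.

```python
# Sketch of the distinct-data-threshold precheck: at least 8 distinct
# non-null values are required before pattern matching runs on a column.
DISTINCT_THRESHOLD = 8

def eligible_for_classification(values) -> bool:
    distinct = {v for v in values if v is not None}
    return len(distinct) >= DISTINCT_THRESHOLD

print(eligible_for_classification([1] * 20))                 # False: one distinct value
print(eligible_for_classification(list(range(8)) + [None]))  # True: 8 distinct values
```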
## Next steps
purview Concept Best Practices Collections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-collections.md
- Title: Azure Purview collections architecture and best practices
-description: This article provides examples of Azure Purview collections architectures and describes best practices.
+ Title: Microsoft Purview collections architecture and best practices
+description: This article provides examples of Microsoft Purview collections architectures and describes best practices.
Last updated 09/27/2021
-# Azure Purview collections architectures and best practices
+# Microsoft Purview collections architectures and best practices
-At the core of Azure Purview, the data map is a platform as a service (PaaS) component that keeps an up-to-date map of assets and their metadata across your data estate. To hydrate the data map, you need to register and scan your data sources. In an organization, there might be thousands of sources of data that are managed and governed by either centralized or decentralized teams.
+At the core of Microsoft Purview, the data map is a platform as a service (PaaS) component that keeps an up-to-date map of assets and their metadata across your data estate. To hydrate the data map, you need to register and scan your data sources. In an organization, there might be thousands of sources of data that are managed and governed by either centralized or decentralized teams.
-[Collections](./how-to-create-and-manage-collections.md) in Azure Purview support organizational mapping of metadata. By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on how your organization plans to use Azure Purview to govern your landscape.
+[Collections](./how-to-create-and-manage-collections.md) in Microsoft Purview support organizational mapping of metadata. By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on how your organization plans to use Microsoft Purview to govern your landscape.
-A collection also provides a security boundary for your metadata in the data map. Access to collections, data sources, and metadata is set up and maintained based on the collections hierarchy in Azure Purview, following a least-privilege model:
+A collection also provides a security boundary for your metadata in the data map. Access to collections, data sources, and metadata is set up and maintained based on the collections hierarchy in Microsoft Purview, following a least-privilege model:
- Users have the minimum amount of access they need to do their jobs.
- Users don't have access to sensitive data that they don't need.
-## Why do you need to define collections and an authorization model for your Azure Purview account?
+## Why do you need to define collections and an authorization model for your Microsoft Purview account?
-Consider deploying collections in Azure Purview to fulfill the following requirements:
+Consider deploying collections in Microsoft Purview to fulfill the following requirements:
- Organize data sources, distribute assets, and run scans based on your business requirements, geographical distribution of data, and data management teams, departments, or business functions.
### Design recommendations

-- Review the [Azure Purview account best practices](./deployment-best-practices.md#determine-the-number-of-azure-purview-instances) and define the adequate number of Azure Purview accounts required in your organization before you plan the collection structure.
+- Review the [Microsoft Purview account best practices](./deployment-best-practices.md#determine-the-number-of-microsoft-purview-instances) and define the adequate number of Microsoft Purview accounts required in your organization before you plan the collection structure.
- We recommend that you design your collection architecture based on the security requirements and data management and governance structure of your organization. Review the recommended [collections archetypes](#collections-archetypes) in this article.

- For future scalability, we recommend that you create a top-level collection for your organization below the root collection. Assign relevant roles at the top-level collection instead of at the root collection.

-- Consider security and access management as part of your design decision-making process when you build collections in Azure Purview.
+- Consider security and access management as part of your design decision-making process when you build collections in Microsoft Purview.
-- Each collection has a name attribute and a friendly name attribute. If you use [Azure Purview Studio](https://web.purview.azure.com/resource/) to deploy a collection, the system automatically assigns a random six-letter name to the collection to avoid duplication. To reduce complexity, avoid using duplicated friendly names across your collections, especially in the same level.
+- Each collection has a name attribute and a friendly name attribute. If you use [Microsoft Purview Studio](https://web.purview.azure.com/resource/) to deploy a collection, the system automatically assigns a random six-letter name to the collection to avoid duplication. To reduce complexity, avoid using duplicated friendly names across your collections, especially in the same level.
- When you can, avoid duplicating your organizational structure into a deeply nested collection hierarchy. If you can't avoid doing so, be sure to use different names for every collection in the hierarchy to make the collections easy to distinguish.
### Design considerations

-- Each Azure Purview account is created with a default _root collection_. The root collection name is the same as your Azure Purview account name. The root collection can't be removed. To change the root collection's friendly name, you can change the friendly name of your Azure Purview account from Azure Purview Management center.
+- Each Microsoft Purview account is created with a default _root collection_. The root collection name is the same as your Microsoft Purview account name. The root collection can't be removed. To change the root collection's friendly name, you can change the friendly name of your Microsoft Purview account from Microsoft Purview Management center.
- Collections can hold data sources, scans, assets, and role assignments.
- Data sources, scans, and assets can belong to only one collection.
-- A collections hierarchy in an Azure Purview can support as many as 256 collections, with a maximum of eight levels of depth. This doesn't include the root collection.
+- A collections hierarchy in a Microsoft Purview can support as many as 256 collections, with a maximum of eight levels of depth. This doesn't include the root collection.
-- By design, you can't register data sources multiple times in a single Azure Purview account. This architecture helps to avoid the risk of assigning different levels of access control to a single data source. If multiple teams consume the metadata of a single data source, you can register and manage the data source in a parent collection. You can then create corresponding scans under each subcollection so that relevant assets appear under each child collection.
+- By design, you can't register data sources multiple times in a single Microsoft Purview account. This architecture helps to avoid the risk of assigning different levels of access control to a single data source. If multiple teams consume the metadata of a single data source, you can register and manage the data source in a parent collection. You can then create corresponding scans under each subcollection so that relevant assets appear under each child collection.
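The register-once, scan-per-collection pattern above can be automated. The following minimal sketch only builds the request; the collections data-plane route and api-version are assumptions based on the Purview REST surface, not taken from this article, and the account name is a placeholder:

```python
import json

# Hypothetical Purview data-plane endpoint (replace with your account's endpoint).
ENDPOINT = "https://<account-name>.purview.azure.com"
API_VERSION = "2019-11-01-preview"  # assumed collections API version

def collection_request(name, friendly_name, parent=None):
    """Build the URL and JSON body for a PUT that creates/updates a collection."""
    body = {"friendlyName": friendly_name}
    if parent:
        # Child collections reference their parent by name, not friendly name.
        body["parentCollection"] = {"referenceName": parent}
    url = f"{ENDPOINT}/collections/{name}?api-version={API_VERSION}"
    return url, json.dumps(body)

# Register a data source once in a parent collection, then scan per team below it.
url, body = collection_request("finance", "Finance", parent="contoso-root")
print(url)
print(body)
```

A real call would PUT this body with a bearer token; the sketch stops at building the request so it stays independent of credentials.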
- Lineage connections and artifacts are attached to the root collection even if the data sources are registered at lower-level collections.
- You can delete a collection if it does not have any assets, associated scans, data sources or child collections.
-- Data sources, scans, and assets must belong to a collection if they exist in the Azure Purview data map.
+- Data sources, scans, and assets must belong to a collection if they exist in the Microsoft Purview data map.
<!-- - Moving data sources across collections is allowed if the user is granted the Data Source Admin role for the source and destination collections.
## Define an authorization model
-Azure Purview data-plane roles are managed in Azure Purview. After you deploy an Azure Purview account, the creator of the Azure Purview account is automatically assigned the following roles at the root collection. You can use [Azure Purview Studio](https://web.purview.azure.com/resource/) or a programmatic method to directly assign and manage roles in Azure Purview.
+Microsoft Purview data-plane roles are managed in Microsoft Purview. After you deploy a Microsoft Purview account, the creator of the Microsoft Purview account is automatically assigned the following roles at the root collection. You can use [Microsoft Purview Studio](https://web.purview.azure.com/resource/) or a programmatic method to directly assign and manage roles in Microsoft Purview.
- - **Collection Admins** can edit Azure Purview collections and their details and add subcollections. They can also add users to other Azure Purview roles on collections where they're admins.
+ - **Collection Admins** can edit Microsoft Purview collections and their details and add subcollections. They can also add users to other Microsoft Purview roles on collections where they're admins.
- **Data Source Admins** can manage data sources and data scans.
- **Data Curators** can create, read, modify, and delete catalog data assets and establish relationships between assets.
- **Data Readers** can access but not modify catalog data assets.

### Design recommendations

-- Consider implementing [emergency access](/azure/active-directory/users-groups-roles/directory-emergency-access) or a break-glass strategy for the Collection Admin role at your Azure Purview root collection level to avoid Azure Purview account-level lockouts. Document the process for using emergency accounts.
+- Consider implementing [emergency access](/azure/active-directory/users-groups-roles/directory-emergency-access) or a break-glass strategy for the Collection Admin role at your Microsoft Purview root collection level to avoid Microsoft Purview account-level lockouts. Document the process for using emergency accounts.
> [!NOTE]
- > In certain scenarios, you might need to use an emergency account to sign in to Azure Purview. You might need this type of account to fix organization-level access problems when nobody else can sign in to Azure Purview or when other admins can't accomplish certain operations because of corporate authentication problems. We strongly recommended that you follow Microsoft best practices around implementing [emergency access accounts](/azure/active-directory/users-groups-roles/directory-emergency-access) by using cloud-only users.
+ > In certain scenarios, you might need to use an emergency account to sign in to Microsoft Purview. You might need this type of account to fix organization-level access problems when nobody else can sign in to Microsoft Purview or when other admins can't accomplish certain operations because of corporate authentication problems. We strongly recommend that you follow Microsoft best practices around implementing [emergency access accounts](/azure/active-directory/users-groups-roles/directory-emergency-access) by using cloud-only users.
>
- > Follow the instructions in [this article](./concept-account-upgrade.md#what-happens-when-your-upgraded-account-doesnt-have-a-collection-admin) to recover access to your Azure Purview root collection if your previous Collection Admin is unavailable.
+ > Follow the instructions in [this article](./concept-account-upgrade.md#what-happens-when-your-upgraded-account-doesnt-have-a-collection-admin) to recover access to your Microsoft Purview root collection if your previous Collection Admin is unavailable.
- Minimize the number of root Collection Admins. Assign a maximum of three Collection Admin users at the root collection, including the SPN and your break-glass accounts. Assign your Collection Admin roles to the top-level collection or to subcollections instead.
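As an illustration of the "programmatic method" for role management mentioned above, here is a hedged sketch that only constructs metadata-policy endpoint URLs; the `policystore` route and api-version are assumptions about the Purview data-plane API, and the policy id is a placeholder:

```python
# Hypothetical Purview data-plane endpoint (replace with your account's endpoint).
ENDPOINT = "https://<account-name>.purview.azure.com"
API_VERSION = "2021-07-01"  # assumed policystore api-version

def metadata_policy_url(policy_id=""):
    """URL to list all metadata policies, or to GET/PUT a single policy document.
    Role assignments per collection live inside these policy documents."""
    base = f"{ENDPOINT}/policystore/metadataPolicies"
    suffix = f"/{policy_id}" if policy_id else ""
    return f"{base}{suffix}?api-version={API_VERSION}"

print(metadata_policy_url())            # list policies (one per collection)
print(metadata_policy_url("policy-1"))  # read/update one policy document
```

In practice you would GET a policy, edit its role membership, and PUT it back with a bearer token; the sketch stops at URL construction.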
### Design considerations

-- Azure Purview access management has moved into data plane. Azure Resource Manager roles aren't used anymore, so you should use Azure Purview to assign roles.
+- Microsoft Purview access management has moved into the data plane. Azure Resource Manager roles aren't used anymore, so you should use Microsoft Purview to assign roles.
-- In Azure Purview, you can assign roles to users, security groups, and service principals (including managed identities) from Azure Active Directory (Azure AD) on the same Azure AD tenant where the Azure Purview account is deployed.
+- In Microsoft Purview, you can assign roles to users, security groups, and service principals (including managed identities) from Azure Active Directory (Azure AD) on the same Azure AD tenant where the Microsoft Purview account is deployed.
-- You must first add guest accounts to your Azure AD tenant as B2B users before you can assign Azure Purview roles to external users.
+- You must first add guest accounts to your Azure AD tenant as B2B users before you can assign Microsoft Purview roles to external users.
- By default, Collection Admins don't have access to read or modify assets. But they can elevate their access and add themselves to more roles.
- For Azure Data Factory connection: to connect to Azure Data Factory, you have to be a Collection Admin for the root collection.
-- If you need to connect to Azure Data Factory for lineage, grant the Data Curator role to the data factory's managed identity at your Azure Purview root collection level. When you connect Data Factory to Azure Purview in the authoring UI, Data Factory tries to add these role assignments automatically. If you have the Collection Admin role on the Azure Purview root collection, this operation will work.
+- If you need to connect to Azure Data Factory for lineage, grant the Data Curator role to the data factory's managed identity at your Microsoft Purview root collection level. When you connect Data Factory to Microsoft Purview in the authoring UI, Data Factory tries to add these role assignments automatically. If you have the Collection Admin role on the Microsoft Purview root collection, this operation will work.
## Collections archetypes
-You can deploy your Azure Purview collection based on centralized, decentralized, or hybrid data management and governance models. Base this decision on your business and security requirements.
+You can deploy your Microsoft Purview collection based on centralized, decentralized, or hybrid data management and governance models. Base this decision on your business and security requirements.
### Example 1: Single-region organization
Organization-level shared data sources are registered and scanned in the Hub-Sha
The department-level shared data sources are registered and scanned in the department collections.

### Example 2: Multiregion organization
The collection hierarchy consists of these verticals:
- Departments (a delegated collection for each department)
- Teams or projects (further segregation based on teams or projects)
-In this scenario, each region has a subcollection of its own under the top-level collection in the Azure Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets also appear in the subcollection hierarchy for the region.
+In this scenario, each region has a subcollection of its own under the top-level collection in the Microsoft Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets also appear in the subcollection hierarchy for the region.
If you have centralized data management and governance teams, you can grant them access from the top-level collection. When you do, they gain oversight for the entire data estate in the data map. Optionally, the centralized team can register and scan any shared data sources.
The department-level shared data sources are registered and scanned in the department collections.

### Example 3: Multiregion, data transformation
The collection hierarchy consists of these verticals:
Data scientists and data engineers can have the Data Curators role on their corresponding zones so they can curate metadata. Data Reader access to the curated zone can be granted to all data personas and business users.

### Example 4: Multiregion, business functions
The collection hierarchy consists of these verticals:
- Geographic locations (mid-level collections based on geographic locations where data sources and data owners are located)
- Major business functions or clients (further segregation based on functions or clients)
-Each region has a subcollection of its own under the top-level collection in the Azure Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets are added to the subcollection hierarchy for the region.
+Each region has a subcollection of its own under the top-level collection in the Microsoft Purview account. Data sources are registered and scanned in the corresponding subcollections in their own geographic locations. So assets are added to the subcollection hierarchy for the region.
If you have centralized data management and governance teams, you can grant them access from the top-level collection. When you do, they gain oversight for the entire data estate in the data map. Optionally, the centralized team can register and scan any shared data sources.

Region-based data management and governance teams can obtain access from their corresponding collections at a lower level. Each business unit has its own subcollection.

## Access management options
If you want to implement data democratization across an entire organization, ass
If you need to restrict access to metadata search and discovery in your organization, assign Data Reader and Data Curator roles at the specific collection level. For example, you could restrict US employees so they can read data only at the US collection level and not in the LATAM collection.
-You can apply a combination of these two scenarios in your Azure Purview data map if total data democratization is required with a few exceptions for some collections. You can assign Azure Purview roles at the top-level collection and restrict inheritance to the specific child collections.
+You can apply a combination of these two scenarios in your Microsoft Purview data map if total data democratization is required with a few exceptions for some collections. You can assign Microsoft Purview roles at the top-level collection and restrict inheritance to the specific child collections.
Assign the Collection Admin role to the centralized data security and management team at the top-level collection. Delegate further collection management of lower-level collections to corresponding teams.

## Next steps

-- [Create a collection and assign permissions in Azure Purview](./quickstart-create-collection.md)
-- [Create and manage collections in Azure Purview](./how-to-create-and-manage-collections.md)
-- [Access control in Azure Purview](./catalog-permissions.md)
+- [Create a collection and assign permissions in Microsoft Purview](./quickstart-create-collection.md)
+- [Create and manage collections in Microsoft Purview](./how-to-create-and-manage-collections.md)
+- [Access control in Microsoft Purview](./catalog-permissions.md)
purview Concept Best Practices Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-glossary.md
- Title: Azure Purview glossary best practices
-description: This article provides examples of Azure Purview glossary best practices.
+ Title: Microsoft Purview glossary best practices
+description: This article provides examples of Microsoft Purview glossary best practices.
Last updated 12/15/2021
-# Azure Purview glossary best practices
+# Microsoft Purview glossary best practices
The business glossary is a definition of terms specific to a domain of knowledge that is commonly used, communicated, and shared in organizations as they conduct business. A common business glossary (for example, business language) is significant because it is critical to improving an organization's overall business productivity and performance. In most organizations, you will observe that business language is codified based on business dealings associated with:
It is important that your organization's business language and discourse align to a common business glossary to ensure your organization's data assets are properly applied in conducting business at speed and with agility as rapidly changing business needs arise.
-This article is intended to provide guidance around how to establish and govern your organizations A to Z business glossary of commonly used terms and it is aimed to provide more guidance to ensure you are able to focus on promoting a shared understanding for business data governance ownership. Adopting these considerations and recommendations will help your organization achieve success with Azure Purview.
+This article is intended to provide guidance around how to establish and govern your organization's A to Z business glossary of commonly used terms, and it aims to ensure you are able to focus on promoting a shared understanding for business data governance ownership. Adopting these considerations and recommendations will help your organization achieve success with Microsoft Purview.
The adoption of the business glossary by your organization will depend on how you promote consistent use of the glossary, enabling your organization to better understand the meanings of the business terms they come across daily while running the business.

## Why is a common business glossary needed?
You will also observe when there are language barriers, in which, most organizat
## Recommendations for implementing new glossary terms
-Creating terms is necessary to build the business vocabulary and apply it to assets within Azure Purview. When a new Azure Purview account is created, by default, there are no built-in terms in the account.
+Creating terms is necessary to build the business vocabulary and apply it to assets within Microsoft Purview. When a new Microsoft Purview account is created, by default, there are no built-in terms in the account.
This creation process should follow strict naming standards to ensure that the glossary does not contain duplicate or competing terms.

- Establish a strict hierarchy for all business terms across your organization.
- The hierarchy could consist of a business domain such as: Finance, Marketing, Sales, HR, etc.
-- Implement naming standards for all glossary terms that include case sensitivity. In Azure Purview terms are case-sensitive.
+- Implement naming standards for all glossary terms that include case sensitivity. In Microsoft Purview terms are case-sensitive.
- Always use the provided search glossary terms feature before adding a new term. This will help you avoid adding duplicate terms to the glossary.
-- Avoid deploying terms with duplicated names. In Azure Purview, terms with the same name can exist under different parent terms. This can lead to confusion and should be well thought out before building your business glossary to avoid duplicated terms.
+- Avoid deploying terms with duplicated names. In Microsoft Purview, terms with the same name can exist under different parent terms. This can lead to confusion and should be well thought out before building your business glossary to avoid duplicated terms.
-Glossary terms in Azure Purview are case sensitive and allow white space. The following shows a poorly executed example of implementing glossary terms and demonstrates the confusion caused:
+Glossary terms in Microsoft Purview are case sensitive and allow white space. The following shows a poorly executed example of implementing glossary terms and demonstrates the confusion caused:
:::image type="content" source="media/concept-best-practices/glossary-duplicated-term-search.png" alt-text="Screenshot that shows searching duplicated glossary terms.":::
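Because terms are case-sensitive and allow white space, a pre-import check can catch names that differ only in case or spacing before they become confusing duplicates. A small illustrative sketch; the term names below are hypothetical:

```python
from collections import defaultdict

def find_confusable(terms):
    """Group proposed term names that differ only by case or surrounding
    whitespace. Purview treats them as distinct terms, which confuses users."""
    groups = defaultdict(list)
    for t in terms:
        groups[t.strip().lower()].append(t)
    # Keep only the groups that actually collide.
    return {k: v for k, v in groups.items() if len(v) > 1}

proposed = ["Net Revenue", "net revenue", "Net revenue ", "Cost Center"]
print(find_confusable(proposed))
```

Running a check like this against your planned glossary before import is a cheap way to enforce the naming standards recommended above.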
As a best practice, it is always best to: Plan, search, and strictly follow standard
## Recommendations for deploying glossary term templates
-When building new term templates in Azure Purview, review the following considerations:
+When building new term templates in Microsoft Purview, review the following considerations:
- Term templates are used to add custom attributes to glossary terms.
-- By default, Azure Purview offers several [out-of-the-box term attributes](./concept-business-glossary.md#custom-attributes) such as Name, Nick Name, Status, Definition, Acronym, Resources, Related terms, Synonyms, Stewards, Experts, and Parent term, which are found in the "System Default" template.
+- By default, Microsoft Purview offers several [out-of-the-box term attributes](./concept-business-glossary.md#custom-attributes) such as Name, Nick Name, Status, Definition, Acronym, Resources, Related terms, Synonyms, Stewards, Experts, and Parent term, which are found in the "System Default" template.
- Default attributes cannot be edited or deleted.
- Custom attributes extend beyond default attributes, allowing the data curators to add more descriptive details to each term to completely describe the term in the organization.
-- As a reminder, Azure Purview stores only meta-data. Attributes should describe the meta-data; not the data itself.
+- As a reminder, Microsoft Purview stores only meta-data. Attributes should describe the meta-data; not the data itself.
- Keep definitions simple. If there are custom metrics or formulas, these could easily be added as an attribute.
- A term must include at least the default attributes. When building new glossary terms, if you use custom templates, the other attributes that are included in the custom template are expected for the given term.
- Terms may be imported with the "System default" or custom template.
- When importing terms, use the sample .CSV file to guide you. This can save hours of frustration.
-- When importing terms from a .CSV file, be sure that terms already existing in Azure Purview are intended to be updated. When using the import feature, Azure Purview will overwrite existing terms.
+- When importing terms from a .CSV file, be sure that terms already existing in Microsoft Purview are intended to be updated. When using the import feature, Microsoft Purview will overwrite existing terms.
- Before importing terms, test the import in a lab environment to ensure that no unexpected results occur, such as duplicate terms.
- The email address for Stewards and Experts should be the primary address of the user from the Azure Active Directory group. Alternate email, user principal name, and non-Azure Active Directory emails are not yet supported.
- Glossary terms provide four statuses: Draft, Approved, Expired, and Alert. Draft is not officially implemented, Approved is official and approved for production, Expired means the term should no longer be used, and Alert means the term needs more attention.
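A hedged sketch of preparing such an import file with Python's standard `csv` module. The column headers below are assumptions that mirror the default attributes named earlier; always start from the product's own sample .CSV, and note that the Steward/Expert addresses must be the users' primary Azure AD addresses:

```python
import csv
import io

# Assumed headers mirroring a subset of the default term attributes;
# the real sample .CSV from the product may differ.
HEADERS = ["Name", "Status", "Definition", "Acronym",
           "Experts", "Stewards", "Parent term"]

terms = [
    {"Name": "Net Revenue", "Status": "Approved",
     "Definition": "Gross revenue minus returns and discounts.",
     "Acronym": "NR",
     "Experts": "finance-expert@contoso.com",     # primary Azure AD address
     "Stewards": "data-steward@contoso.com",      # primary Azure AD address
     "Parent term": "Finance"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=HEADERS)
writer.writeheader()
writer.writerows(terms)
print(buf.getvalue())
```

Generating the file programmatically makes it easier to enforce one naming standard and test the import in a lab environment first, as recommended above.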
For more information, see [Create, import, and export glossary terms](./how-to-c
## Recommendations for exporting glossary terms
-Exporting terms may be useful in Azure Purview account to account, Backup, or Disaster Recovery scenarios. Exporting terms in Azure Purview Studio must be done one term template at a time. Choosing terms from multiple templates will disable the "Export terms" button. As a best practice, using the "Term template" filter before bulk selecting will make the export process quick.
+Exporting terms may be useful in Microsoft Purview account to account, Backup, or Disaster Recovery scenarios. Exporting terms in Microsoft Purview Studio must be done one term template at a time. Choosing terms from multiple templates will disable the "Export terms" button. As a best practice, using the "Term template" filter before bulk selecting will make the export process quick.
## Glossary Management
- While classifications and sensitivity labels are applied to assets automatically by the system based on classification rules, glossary terms are not applied automatically.
- Similar to classifications, glossary terms can be mapped to assets at the asset level or schema level.
-- In Azure Purview, terms can be added to assets in different ways:
- - Manually, using Azure Purview Studio.
- - Using Bulk Edit mode to update up to 25 assets, using Azure Purview Studio.
+- In Microsoft Purview, terms can be added to assets in different ways:
+ - Manually, using Microsoft Purview Studio.
+ - Using Bulk Edit mode to update up to 25 assets, using Microsoft Purview Studio.
  - Curated Code using the Atlas API.
- Use Bulk Edit Mode when assigning terms manually. This feature allows a curator to assign glossary terms, owners, experts, classifications, and certified status in bulk based on selected items from a search result. Multiple searches can be chained by selecting objects in the results. The Bulk Edit will apply to all selected objects. Be sure to clear the selections after the bulk edit has been performed.
- Other bulk edit operations can be performed by using the Atlas API. An example would be using the API to add descriptions or other custom properties to assets in bulk programmatically.
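For the Atlas API path, bulk-assigning a term to assets can be sketched as follows. The `assignedEntities` route reflects the Atlas v2 glossary API; the endpoint, GUIDs, and token handling are placeholders, and the actual request is shown only as a comment:

```python
import json

# Hypothetical Purview catalog endpoint (replace with your account's endpoint).
ENDPOINT = "https://<account-name>.purview.azure.com/catalog/api"
TERM_GUID = "<term-guid>"  # placeholder for the glossary term's GUID

# Each entry in the body points at one asset (entity) by its GUID.
asset_guids = ["guid-1", "guid-2"]
payload = [{"guid": g} for g in asset_guids]
url = f"{ENDPOINT}/atlas/v2/glossary/terms/{TERM_GUID}/assignedEntities"

print(url)
print(json.dumps(payload))
# A real call would POST this with a bearer token, e.g.:
# requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
```

The same pattern extends to other bulk operations mentioned above, such as updating descriptions or custom properties on many assets programmatically.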
purview Concept Best Practices Lineage Azure Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-lineage-azure-data-factory.md
- Title: Azure Purview Data Lineage best practices
-description: This article provides best practices for data Lineage various data sources in Azure Purview.
+ Title: Microsoft Purview Data Lineage best practices
+description: This article provides best practices for data lineage across various data sources in Microsoft Purview.
Last updated 10/25/2021
-# Azure Purview Data Lineage best practices
+# Microsoft Purview Data Lineage best practices
-Data Lineage is broadly understood as the lifecycle that spans the data's origin, and where it moves over time across the data estate. Azure Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
+Data Lineage is broadly understood as the lifecycle that spans the data's origin, and where it moves over time across the data estate. Microsoft Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
* Completely raw data staged from various platforms
* Transformed and prepared data
* Data used by visualization platforms
Data lineage is the process of describing what data exists, where it is
:::image type="content" source="./media/how-to-link-azure-data-factory/data-factory-connection.png" alt-text="Screen shot showing a data factory connection list." lightbox="./media/how-to-link-azure-data-factory/data-factory-connection.png":::
-* Each Data Factory instance can connect to only one Azure Purview account. You can establish new connection in another Azure Purview account, but this will turn existing connection to disconnected.
+* Each Data Factory instance can connect to only one Microsoft Purview account. You can establish a new connection in another Microsoft Purview account, but doing so will change the existing connection to a disconnected state.
:::image type="content" source="./media/how-to-link-azure-data-factory/warning-for-disconnect-factory.png" alt-text="Screenshot showing warning to disconnect Azure Data Factory.":::
-* Data factory's managed identity is used to authenticate lineage in Azure Purview account, the data factory's managed identity Data Curator role on Azure Purview root collection is required.
+* The data factory's managed identity is used to authenticate lineage in the Microsoft Purview account, so the data factory's managed identity requires the Data Curator role on the Microsoft Purview root collection.
* No more than 10 data factories are supported at once. If you want to add more than 10 data factories at once, please file a support ticket.

### Azure Data Factory activities
-* Azure Purview captures runtime lineage from the following Azure Data Factory activities:
+* Microsoft Purview captures runtime lineage from the following Azure Data Factory activities:
  * [Copy activity](../data-factory/copy-activity-overview.md)
  * [Data Flow activity](../data-factory/concepts-data-flow-overview.md)
  * [Execute SSIS Package activity](../data-factory/how-to-invoke-ssis-package-ssis-activity.md)
-* Azure Purview drops lineage if the source or destination uses an unsupported data storage system.
+* Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
  * Supported data sources in copy activity are listed in the **Copy activity support** section of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
  * Supported data sources in data flow activity are listed in the **Data Flow support** section of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
  * Supported data sources in SSIS are listed in the **SSIS execute package activity support** section of [Lineage from SQL Server Integration Services](how-to-lineage-sql-server-integration-services.md)
-* Azure Purview cannot capture lineage if Azure Data Factory copy activity use copy activity features listed in **Limitations on copy activity lineage** of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
+* Microsoft Purview cannot capture lineage if the Azure Data Factory copy activity uses the copy activity features listed in **Limitations on copy activity lineage** of [Connect to Azure Data Factory](how-to-link-azure-data-factory.md)
-* For the lineage of Dataflow activity, Azure Purview only support source and sink. The lineage for Dataflow transformation is not supported yet.
+* For the lineage of the Data Flow activity, Microsoft Purview supports only sources and sinks. Lineage for Data Flow transformations is not supported yet.
-* Data flow lineage doesn't integrate with Azure Purview resource set.
+* Data flow lineage doesn't integrate with Microsoft Purview resource set.
**Resource set example 1**
* For the lineage of Execute SSIS Package activity, we only support source and destination. The lineage for transformation is not supported yet.
- :::image type="content" source="./media/concept-best-practices-lineage/ssis-lineage.png" alt-text="Screenshot of the Execute SSIS lineage in Azure Purview." lightbox="./media/concept-best-practices-lineage/ssis-lineage.png":::
+ :::image type="content" source="./media/concept-best-practices-lineage/ssis-lineage.png" alt-text="Screenshot of the Execute SSIS lineage in Microsoft Purview." lightbox="./media/concept-best-practices-lineage/ssis-lineage.png":::
-* Please refer the following step-by-step guide to [push Azure Data Factory lineage in Azure Purview](../data-factory/tutorial-push-lineage-to-purview.md).
+* Refer to the following step-by-step guide to [push Azure Data Factory lineage in Microsoft Purview](../data-factory/tutorial-push-lineage-to-purview.md).
## Next steps

- [Manage data sources](./manage-data-sources.md)
purview Concept Best Practices Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-migration.md
- Title: Azure Purview migration best practices
+ Title: Microsoft Purview migration best practices
description: This article provides steps to perform backup and recovery for migration best practices.
Last updated 12/09/2021
-# Azure Purview backup and recovery for migration best practices
+# Microsoft Purview backup and recovery for migration best practices
-This article provides guidance on backup and recovery strategy when your organization has Azure Purview in production deployment. You can also use this general guideline to implement account migration. The scope of this article is to cover [manual BCDR methods](disaster-recovery.md) where you could automate using APIs. There is some key information to consider upfront:
+This article provides guidance on backup and recovery strategy when your organization has Microsoft Purview in production deployment. You can also use this general guideline to implement account migration. The scope of this article is to cover [manual BCDR methods](disaster-recovery.md) where you could automate using APIs. There is some key information to consider upfront:
- It is not advisable to back up "scanned" assets' details. You should only back up the curated data such as mapping of classifications and glossaries on assets. The only case when you need to back up assets' details is when you have custom assets via custom `typeDef`.
- The backed-up asset count should be fewer than 100,000 assets. The main driver is that you have to use the search query API to get the assets, which has a limitation of 100,000 assets returned. However, if you are able to segment the search query to get a smaller number of assets per API call, it is possible to back up more than 100,000 assets.
-- The goal is to perform one time migration. If you wish to continuously "sync" assets between two accounts, there are other steps which will not be covered in detail by this article. You have to use [Azure Purview's Event Hub to subscribe and create entities to another account](manage-kafka-dotnet.md). However, Event Hub only has Atlas information. Azure Purview has added other capabilities such as **glossaries** and **contacts** which won't be available via Event Hub.
+- The goal is to perform a one-time migration. If you wish to continuously "sync" assets between two accounts, there are other steps that won't be covered in detail in this article. You have to use [Microsoft Purview's Event Hub to subscribe and create entities in another account](manage-kafka-dotnet.md). However, Event Hub only has Atlas information. Microsoft Purview has added other capabilities, such as **glossaries** and **contacts**, that won't be available via Event Hub.
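One way to work within the 100,000-asset search limit mentioned above is to page through results in smaller batches. Below is a hedged sketch of that paging loop; `post_query` stands in for an authenticated call to the search query API (the payload fields `keywords`, `limit`, and `offset` follow the Atlas search shape but are assumptions here), so an in-memory fake catalog keeps the sketch runnable offline.

```python
def export_assets(post_query, keywords="*", page_size=1000, hard_limit=100_000):
    """Collect search hits page by page until the service returns a short page."""
    assets, offset = [], 0
    while offset < hard_limit:
        batch = post_query({"keywords": keywords,
                            "limit": page_size,
                            "offset": offset})["value"]
        assets.extend(batch)
        if len(batch) < page_size:
            break  # last page reached
        offset += page_size
    return assets

# Offline stand-in for the service: 2,500 fake assets served in pages.
fake_catalog = [{"id": f"guid-{i}"} for i in range(2500)]

def fake_post_query(body):
    start = body["offset"]
    return {"value": fake_catalog[start:start + body["limit"]]}

print(len(export_assets(fake_post_query)))  # 2500
```

To back up more than 100,000 assets, you would run this loop once per query segment (for example, per collection or per asset type) rather than with a single `*` query.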
## Identify key requirements
-Most of enterprise organizations have critical requirement for Azure Purview for capabilities such as Backup, Business Continuity, and Disaster Recovery (BCDR). To get into more details of this requirement, you need to differentiate between Backup, High Availability (HA), and Disaster recovery (DR).
+Most enterprise organizations have critical requirements for Microsoft Purview capabilities such as Backup, Business Continuity, and Disaster Recovery (BCDR). To get into more detail on these requirements, you need to differentiate between Backup, High Availability (HA), and Disaster Recovery (DR).
While they are similar, HA will keep the service operational if there is a hardware fault, for example, but it would not protect you if someone accidentally or deliberately deleted all the records in your database. For that, you may need to restore the service from a backup.

### Backup
-You may need to create regular backups from an Azure Purview account and use a backup in case a piece of data or configuration is accidentally or deliberately deleted from the Azure Purview account by the users.
+You may need to create regular backups from a Microsoft Purview account and use a backup in case a piece of data or configuration is accidentally or deliberately deleted from the Microsoft Purview account by the users.
-The backup should allow saving a point in time copy of the following configurations from the Azure Purview account:
+The backup should allow saving a point-in-time copy of the following configurations from the Microsoft Purview account:
* Account information (for example, Friendly name)
* Collection structure and role assignments
There are three main requirements to take into consideration:
* **Recovery Level Objective (RLO)** – This defines the granularity of the data being restored. It could be a SQL server, a set of databases, tables, records, etc.

### High availability
-In computing, the term availability is used to describe the period of time when a service is available, and the time required by a system to respond to a request made by a user. For Azure Purview, high availability means ensuring that Azure Purview instances are available in the case of a problem that is local to a data center or single region in the cloud region.
+In computing, the term availability is used to describe the period of time when a service is available, and the time required by a system to respond to a request made by a user. For Microsoft Purview, high availability means ensuring that Microsoft Purview instances are available in the case of a problem that is local to a data center or a single cloud region.
#### Measuring availability

Availability is often expressed as a percentage indicating how much uptime is expected from a particular system or component in a given period of time, where a value of 100% would indicate that the system never fails.
These values are calculated based on several factors, including both scheduled a
> [!NOTE] > Azure data center outages are rare, but can last anywhere from a few minutes to hours.
-> Information about Azure Purview availability is available at [SLA for Azure Purview](https://azure.microsoft.com/support/legal/sla/purview/v1_0/).
-> Azure Purview ensures no data loss but a recovery from outages may require you to restart your workflows such as scans.
+> Information about Microsoft Purview availability is available at [SLA for Microsoft Purview](https://azure.microsoft.com/support/legal/sla/purview/v1_0/).
+> Microsoft Purview ensures no data loss but a recovery from outages may require you to restart your workflows such as scans.
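As a quick illustration of what an availability percentage means in practice, the sketch below converts an SLA figure into the downtime it permits per 30-day month. The percentages shown are illustrative examples, not Purview's published SLA figures.

```python
def allowed_downtime_minutes(availability_pct: float,
                             period_minutes: int = 30 * 24 * 60) -> float:
    """Maximum downtime (minutes) permitted by an availability percentage
    over a 30-day month."""
    return period_minutes * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% availability -> {allowed_downtime_minutes(sla):.1f} min/month")
```

For example, 99.9% availability still allows roughly 43 minutes of downtime in a 30-day month, which is why recovery steps such as restarting scans matter even under a strong SLA.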
### Resiliency

The ability of a system to recover from failures and continue to function. It's not about avoiding failures but responding to failures in a way that avoids downtime or data loss.
Business continuity means continuing your business in the event of a disaster, planning for recovery, and ensuring that your data map is highly available.

### Disaster recovery
-Organizations need a failover mechanism for their Azure Purview instances, so when the primary region experiences a catastrophic event like an earthquake or flood, the business must be prepared to have its systems come online elsewhere.
+Organizations need a failover mechanism for their Microsoft Purview instances, so when the primary region experiences a catastrophic event like an earthquake or flood, the business must be prepared to have its systems come online elsewhere.
> [!Note]
-> Not all Azure mirrored regions support deploying Azure Purview accounts. For example, For a DR scenario, you cannot choose to deploy a new Azure Purview account in Canada East if the primary region is Canada Central. Even with Customers managed DR, not all customer may able to trigger a DR.
+> Not all Azure mirrored regions support deploying Microsoft Purview accounts. For example, in a DR scenario, you cannot choose to deploy a new Microsoft Purview account in Canada East if the primary region is Canada Central. Even with customer-managed DR, not all customers may be able to trigger a DR.
## Implementation steps
-This section provides high level guidance on required tasks to copy assets, glossaries, classifications & relationships across regions or subscriptions either using the Azure Purview Studio or the REST APIs. The approach is to perform the tasks as programmatically as possible at scale.
+This section provides high-level guidance on the required tasks to copy assets, glossaries, classifications, and relationships across regions or subscriptions, either by using the Microsoft Purview Studio or the REST APIs. The approach is to perform the tasks as programmatically as possible, at scale.
### High-level task outline

1. Create the new account
1. Assign contacts to assets

### Create the new account
-You will need to create a new Azure Purview account by following below instruction:
-* [Quickstart: Create an Azure Purview account in the Azure portal](create-catalog-portal.md)
+You will need to create a new Microsoft Purview account by following the instructions below:
+* [Quickstart: Create a Microsoft Purview account in the Azure portal](create-catalog-portal.md)
It's crucial to plan ahead on configuration items that you cannot change later:

* Account name
* Managed resource group name

### Migrate configuration items
-Below steps are referring to [Azure Purview API documentation](/rest/api/purview/) so that you can programmatically stand up the backup account quickly:
+The steps below refer to the [Microsoft Purview API documentation](/rest/api/purview/) so that you can programmatically stand up the backup account quickly:
|Task|Description|
|-|--|
|**Data sources**|Call the [Get all data sources API](/rest/api/purview/scanningdataplane/scans/list-by-data-source) to list data sources with details. You also have to get the triggers by calling the [Get trigger API](/rest/api/purview/scanningdataplane/triggers/get-trigger). There is also a [Create data sources API](/rest/api/purview/scanningdataplane/data-sources/create-or-update) if you need to re-create the sources in bulk in the new account.|
|**Credentials**|Create and maintain credentials used while scanning. There is no API to extract credentials, so this must be redone in the new account.|
|**Self-hosted integration runtime (SHIR)**|Get a list of SHIRs and get updated keys from the new account, then update the SHIRs. This must be done [manually inside the SHIRs' hosts](manage-integration-runtimes.md#create-a-self-hosted-integration-runtime).|
-|**ADF connections**|Currently an ADF can be connected to one Azure Purview at a time. You must disconnect ADF from failed Azure Purview account and reconnect it to the new account later.|
+|**ADF connections**|Currently, an Azure Data Factory (ADF) instance can be connected to only one Microsoft Purview account at a time. You must disconnect ADF from the failed Microsoft Purview account and reconnect it to the new account later.|
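The data source re-creation from the table above can be sketched as a copy loop between the two accounts. `old_get` and `new_put` below stand in for authenticated calls to the list and create data source endpoints; the `/datasources` paths and payload fields are assumptions for illustration, and injected stubs keep the sketch runnable offline. Credentials still have to be re-entered manually, since the API doesn't export them.

```python
def copy_data_sources(old_get, new_put):
    """Re-create each data source definition from the old account in the new one."""
    created = []
    for source in old_get("/datasources")["value"]:
        name = source["name"]
        # NOTE: credentials are not returned by the API and must be re-entered manually.
        new_put(f"/datasources/{name}", source)
        created.append(name)
    return created

# Offline stand-ins for the two accounts (names and kinds are made up):
def fake_old_get(path):
    return {"value": [{"name": "AzureSqlDb-Prod", "kind": "AzureSqlDatabase"},
                      {"name": "AdlsGen2-Raw", "kind": "AdlsGen2"}]}

new_account = {}
def fake_new_put(path, body):
    new_account[path] = body

print(copy_data_sources(fake_old_get, fake_new_put))
```

The same driver pattern (list from the old account, create in the new) applies to triggers and scan definitions as well.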
### Run scans
This will populate all assets with default `typedef`. There are several reasons
* When you rerun the scans, you will get all relationships and asset details up to date.
-* Azure Purview comes out with new features regularly so you can benefit from other features when running new scans.
+Microsoft Purview comes out with new features regularly, so you can benefit from other features when running new scans.
-Running the scans is the most effective way to get all assets of data sources that Azure Purview is already supporting.
+Running the scans is the most effective way to get all assets of the data sources that Microsoft Purview already supports.
### Migrate custom typedefs and custom assets
To complete the asset migration, you must remap the relationships. There are thr
1. Call the [relationship API](/rest/api/purview/catalogdataplane/relationship/get) to get relationship information between entities by its `guid`
-1. Prepare the relationship payload so that there is no hard reference to old `guids` in the old Azure Purview accounts. You need to update those `guids` to the new account's `guids`.
+1. Prepare the relationship payload so that there is no hard reference to old `guids` from the old Microsoft Purview account. You need to update those `guids` to the new account's `guids`.
1. Finally, [Create a new relationship between entities](/rest/api/purview/catalogdataplane/relationship/create)
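The remapping in step 2 can be sketched as a pure transformation over the relationship payload. The payload below is a simplified Atlas-style relationship, not the full API schema, and `guid_map` (old guid to new guid) is assumed to have been built while recreating entities in the new account; all guid values are made up.

```python
def remap_guids(payload, guid_map):
    """Recursively replace any 'guid' value using the old->new mapping;
    unknown guids are left untouched."""
    if isinstance(payload, dict):
        return {k: (guid_map.get(v, v) if k == "guid" else remap_guids(v, guid_map))
                for k, v in payload.items()}
    if isinstance(payload, list):
        return [remap_guids(item, guid_map) for item in payload]
    return payload

# Illustrative relationship extracted from the old account:
old_rel = {
    "typeName": "dataset_process_inputs",
    "end1": {"guid": "old-111", "typeName": "azure_sql_table"},
    "end2": {"guid": "old-222", "typeName": "process"},
}
guid_map = {"old-111": "new-aaa", "old-222": "new-bbb"}

print(remap_guids(old_rel, guid_map)["end1"]["guid"])  # new-aaa
```

The remapped payload is then ready to post to the create-relationship endpoint in step 3.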
> [!Note]
> Before migrating terms, you need to migrate the term templates. This step should already be covered in the custom `typedef` migration.
-#### Using Azure Purview Portal
-The quickest way to migrate glossary terms is to [export terms to a .csv file](how-to-create-import-export-glossary.md). You can do this using the Azure Purview Studio.
+#### Using Microsoft Purview Portal
+The quickest way to migrate glossary terms is to [export terms to a .csv file](how-to-create-import-export-glossary.md). You can do this using the Microsoft Purview Studio.
-#### Using Azure Purview API
+#### Using Microsoft Purview API
To automate glossary migration, you first need to get the glossary `guid` (`glossaryGuid`) via the [List Glossaries API](/rest/api/purview/catalogdataplane/glossary/list-glossaries). The `glossaryGuid` is the top/root-level glossary `guid`. The sample response below will provide the `guid` to use for subsequent API calls:
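Assuming the List Glossaries response is shaped like the illustrative JSON below (an array of glossary objects in the Atlas glossary model; the guid value here is made up), picking out the root `glossaryGuid` for follow-up term calls is a one-liner:

```python
import json

# Truncated, illustrative List Glossaries response (the guid is made up):
sample_response = json.loads("""
[
  {"guid": "095d4e20-1111-2222-3333-444455556666",
   "name": "Glossary",
   "qualifiedName": "Glossary"}
]
""")

# The first (root-level) glossary's guid drives the subsequent term API calls.
glossary_guid = sample_response[0]["guid"]
print(glossary_guid)
```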
If you have extracted asset information from previous steps, the contact details
To assign contacts to assets, you need a list of `guids` and to identify all `objectid` values of the contacts. You can automate this process by iterating through all assets and reassigning contacts to them using the [Create Or Update Entities API](/rest/api/purview/catalogdataplane/entity/create-or-update-entities).

## Next steps

-- [Create an Azure Purview account](./create-catalog-portal.md)
+- [Create a Microsoft Purview account](./create-catalog-portal.md)
purview Concept Best Practices Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-network.md
Title: Azure Purview network architecture and best practices
-description: This article provides examples of Azure Purview network architecture options and describes best practices.
+ Title: Microsoft Purview network architecture and best practices
+description: This article provides examples of Microsoft Purview network architecture options and describes best practices.
Last updated 03/04/2022
-# Azure Purview network architecture and best practices
+# Microsoft Purview network architecture and best practices
-Azure Purview is a platform as a service (PaaS) solution for data governance. Azure Purview accounts have public endpoints that are accessible through the internet to connect to the service. However, all endpoints are secured through Azure Active Directory (Azure AD) logins and role-based access control (RBAC).
+Microsoft Purview is a platform as a service (PaaS) solution for data governance. Microsoft Purview accounts have public endpoints that are accessible through the internet to connect to the service. However, all endpoints are secured through Azure Active Directory (Azure AD) logins and role-based access control (RBAC).
-For an added layer of security, you can create private endpoints for your Azure Purview account. You then get a private IP address from your virtual network in Azure to the Azure Purview account and its managed resources. This address will restrict all traffic between your virtual network and the Azure Purview account to a private link for user interaction with the APIs and Azure Purview Studio, or for scanning and ingestion.
+For an added layer of security, you can create private endpoints for your Microsoft Purview account. You then get a private IP address from your virtual network in Azure to the Microsoft Purview account and its managed resources. This address will restrict all traffic between your virtual network and the Microsoft Purview account to a private link for user interaction with the APIs and Microsoft Purview Studio, or for scanning and ingestion.
-Currently, the Azure Purview firewall provides access control for the public endpoint of your purview account. You can use the firewall to allow all access or to block all access through the public endpoint when using private endpoints.
+Currently, the Microsoft Purview firewall provides access control for the public endpoint of your Microsoft Purview account. You can use the firewall to allow all access, or to block all access through the public endpoint when using private endpoints.
-Based on your network, connectivity, and security requirements, you can set up and maintain Azure Purview accounts to access underlying services or ingestion. Use this best practices guide to define and prepare your network environment so you can access Azure Purview and scan data sources from various locations in your network or cloud.
+Based on your network, connectivity, and security requirements, you can set up and maintain Microsoft Purview accounts to access underlying services or ingestion. Use this best practices guide to define and prepare your network environment so you can access Microsoft Purview and scan data sources from various locations in your network or cloud.
This guide covers the following network options:

- Use [Azure public endpoints](#option-1-use-public-endpoints).
- Use [private endpoints](#option-2-use-private-endpoints).
-- Use [private endpoints and allow public access on the same Azure Purview account](#option-3-use-both-private-and-public-endpoints).
+- Use [private endpoints and allow public access on the same Microsoft Purview account](#option-3-use-both-private-and-public-endpoints).
-This guide describes a few of the most common network architecture scenarios for Azure Purview. Though you're not limited to those scenarios, keep in mind the [limitations](#current-limitations) of the service when you're planning networking for your Azure Purview accounts.
+This guide describes a few of the most common network architecture scenarios for Microsoft Purview. Though you're not limited to those scenarios, keep in mind the [limitations](#current-limitations) of the service when you're planning networking for your Microsoft Purview accounts.
## Prerequisites

To understand which network option is the most suitable for your environment, we suggest that you perform the following actions first:

-- Review your network topology and security requirements before registering and scanning any data sources in Azure Purview. For more information, see [Define an Azure network topology](/azure/cloud-adoption-framework/ready/azure-best-practices/define-an-azure-network-topology).
+- Review your network topology and security requirements before registering and scanning any data sources in Microsoft Purview. For more information, see [Define an Azure network topology](/azure/cloud-adoption-framework/ready/azure-best-practices/define-an-azure-network-topology).
- Define your [network connectivity model for PaaS services](/azure/cloud-adoption-framework/ready/azure-best-practices/connectivity-to-azure-paas-services).

## Option 1: Use public endpoints
-By default, you can use Azure Purview accounts through public endpoints accessible through the internet. Allow public networks in your Azure Purview account if you have the following requirements:
+By default, you can use Microsoft Purview accounts through public endpoints accessible through the internet. Allow public networks in your Microsoft Purview account if you have the following requirements:
-- No private connectivity is required when scanning or connecting to Azure Purview endpoints.
+- No private connectivity is required when scanning or connecting to Microsoft Purview endpoints.
- All data sources are SaaS applications only.
- All data sources have a public endpoint that's accessible through the internet.
-- Business users require access to an Azure Purview account and Azure Purview Studio through the internet.
+- Business users require access to a Microsoft Purview account and Microsoft Purview Studio through the internet.
### Integration runtime options
-To scan data sources while the Azure Purview account firewall is set to allow public access, you can use both the Azure integration runtime and a [self-hosted integration runtime](./manage-integration-runtimes.md). How you use them depends on the [supportability of your data sources](manage-data-sources.md).
+To scan data sources while the Microsoft Purview account firewall is set to allow public access, you can use both the Azure integration runtime and a [self-hosted integration runtime](./manage-integration-runtimes.md). How you use them depends on the [supportability of your data sources](manage-data-sources.md).
Here are some best practices:
- To scan multiple Azure data sources, use a public network and the Azure integration runtime. The following steps show the communication flow at a high level when you're using the Azure integration runtime to scan a data source in Azure:
- :::image type="content" source="media/concept-best-practices/network-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Azure Purview, the Azure runtime, and data sources."lightbox="media/concept-best-practices/network-azure-runtime.png":::
+ :::image type="content" source="media/concept-best-practices/network-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, the Azure runtime, and data sources." lightbox="media/concept-best-practices/network-azure-runtime.png":::
- 1. A manual or automatic scan is initiated from the Azure Purview data map through the Azure integration runtime.
+ 1. A manual or automatic scan is initiated from the Microsoft Purview data map through the Azure integration runtime.
2. The Azure integration runtime connects to the data source to extract metadata.
- 3. Metadata is queued in Azure Purview managed storage and stored in Azure Blob Storage.
+ 3. Metadata is queued in Microsoft Purview managed storage and stored in Azure Blob Storage.
- 4. Metadata is sent to the Azure Purview data map.
+ 4. Metadata is sent to the Microsoft Purview data map.
- Scanning on-premises and VM-based data sources always requires using a self-hosted integration runtime. The Azure integration runtime is not supported for these data sources. The following steps show the communication flow at a high level when you're using a self-hosted integration runtime to scan a data source:
- :::image type="content" source="media/concept-best-practices/network-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Azure Purview, a self-hosted runtime, and data sources."lightbox="media/concept-best-practices/network-self-hosted-runtime.png":::
+ :::image type="content" source="media/concept-best-practices/network-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, a self-hosted runtime, and data sources." lightbox="media/concept-best-practices/network-self-hosted-runtime.png":::
1. A manual or automatic scan is triggered. Azure Purview connects to Azure Key Vault to retrieve the credential to access a data source.
- 2. The scan is initiated from the Azure Purview data map through a self-hosted integration runtime.
+ 2. The scan is initiated from the Microsoft Purview data map through a self-hosted integration runtime.
3. The self-hosted integration runtime service from the VM connects to the data source to extract metadata.
- 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Azure Purview managed storage and then stored in Azure Blob Storage.
+ 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage.
- 5. Metadata is sent to the Azure Purview data map.
+ 5. Metadata is sent to the Microsoft Purview data map.
### Authentication options
-When you're scanning a data source in Azure Purview, you need to provide a credential. Azure Purview can then read the metadata of the assets by using the Azure integration runtime in the destination data source. When you're using a public network, authentication options and requirements vary based on the following factors:
+When you're scanning a data source in Microsoft Purview, you need to provide a credential. Microsoft Purview can then read the metadata of the assets by using the Azure integration runtime in the destination data source. When you're using a public network, authentication options and requirements vary based on the following factors:
-- **Data source type**. For example, if the data source is Azure SQL Database, you need to use SQL authentication with db_datareader access to each database. This can be a user-managed identity or an Azure Purview managed identity. Or it can be a service principal in Azure Active Directory added to SQL Database as db_datareader.
+- **Data source type**. For example, if the data source is Azure SQL Database, you need to use SQL authentication with db_datareader access to each database. This can be a user-managed identity or a Microsoft Purview managed identity. Or it can be a service principal in Azure Active Directory added to SQL Database as db_datareader.
- If the data source is Azure Blob Storage, you can use an Azure Purview managed identity or a service principal in Azure Active Directory added as a Blob Storage Data Reader role on the Azure storage account. Or simply use the storage account's key.
+ If the data source is Azure Blob Storage, you can use a Microsoft Purview managed identity or a service principal in Azure Active Directory added as a Blob Storage Data Reader role on the Azure storage account. Or simply use the storage account's key.
-- **Authentication type**. We recommend that you use an Azure Purview managed identity to scan Azure data sources when possible, to reduce administrative overhead. For any other authentication types, you need to [set up credentials for source authentication inside Azure Purview](manage-credentials.md):
+- **Authentication type**. We recommend that you use a Microsoft Purview managed identity to scan Azure data sources when possible, to reduce administrative overhead. For any other authentication types, you need to [set up credentials for source authentication inside Microsoft Purview](manage-credentials.md):
1. Generate a secret inside an Azure key vault.
- 1. Register the key vault inside Azure Purview.
- 1. Inside Azure Purview, create a new credential by using the secret saved in the key vault.
+ 1. Register the key vault inside Microsoft Purview.
+ 1. Inside Microsoft Purview, create a new credential by using the secret saved in the key vault.
-- **Runtime type that's used in the scan**. Currently, you can't use an Azure Purview managed identity with a self-hosted integration runtime.
+- **Runtime type that's used in the scan**. Currently, you can't use a Microsoft Purview managed identity with a self-hosted integration runtime.
### Additional considerations
When you're scanning a data source in Azure Purview, you need to provide a crede
## Option 2: Use private endpoints
-You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Azure Purview accounts. This option is useful if you need to do either of the following:
+You can use [Azure private endpoints](../private-link/private-endpoint-overview.md) for your Microsoft Purview accounts. This option is useful if you need to do either of the following:
- Scan Azure infrastructure as a service (IaaS) and PaaS data sources inside Azure virtual networks and on-premises data sources through a private connection.
-- Allow users on a virtual network to securely access Azure Purview over [Azure Private Link](../private-link/private-link-overview.md).
+- Allow users on a virtual network to securely access Microsoft Purview over [Azure Private Link](../private-link/private-link-overview.md).
-Similar to other PaaS solutions, Azure Purview does not support deploying directly into a virtual network. So you can't use certain networking features with the offering's resources, such as network security groups, route tables, or other network-dependent appliances such as Azure Firewall. Instead, you can use private endpoints that can be enabled on your virtual network. You can then disable public internet access to securely connect to Azure Purview.
+Similar to other PaaS solutions, Microsoft Purview does not support deploying directly into a virtual network. So you can't use certain networking features with the offering's resources, such as network security groups, route tables, or other network-dependent appliances such as Azure Firewall. Instead, you can use private endpoints that can be enabled on your virtual network. You can then disable public internet access to securely connect to Microsoft Purview.
-You must use private endpoints for your Azure Purview account if you have any of the following requirements:
+You must use private endpoints for your Microsoft Purview account if you have any of the following requirements:
-- You need to have end-to-end network isolation for Azure Purview accounts and data sources.
+- You need to have end-to-end network isolation for Microsoft Purview accounts and data sources.
-- You need to [block public access](./catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to your Azure Purview accounts.
+- You need to [block public access](./catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to your Microsoft Purview accounts.
- Your PaaS data sources are deployed with private endpoints, and you've blocked all access through the public endpoint.
### Design considerations

-- To connect to your Azure Purview account privately and securely, you need to deploy an account and a portal private endpoint. For example, this deployment is necessary if you intend to connect to Azure Purview through the API or use Azure Purview Studio.
+- To connect to your Microsoft Purview account privately and securely, you need to deploy an account and a portal private endpoint. For example, this deployment is necessary if you intend to connect to Microsoft Purview through the API or use Microsoft Purview Studio.
-- If you need to connect to Azure Purview Studio by using private endpoints, you have to deploy both account and portal private endpoints.
+- If you need to connect to Microsoft Purview Studio by using private endpoints, you have to deploy both account and portal private endpoints.
-- To scan data sources through private connectivity, you need to configure at least one account and one ingestion private endpoint for Azure Purview. You must configure scans by using a self-hosted integration runtime through an authentication method other than an Azure Purview managed identity.
+- To scan data sources through private connectivity, you need to configure at least one account and one ingestion private endpoint for Microsoft Purview. You must configure scans by using a self-hosted integration runtime through an authentication method other than a Microsoft Purview managed identity.
- Review [Support matrix for scanning data sources through an ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint) before you set up any scans.
-- Review [DNS requirements](catalog-private-link-name-resolution.md#deployment-options). If you're using a custom DNS server on your network, clients must be able to resolve the fully qualified domain name (FQDN) for the Azure Purview account endpoints to the private endpoint's IP address.
+- Review [DNS requirements](catalog-private-link-name-resolution.md#deployment-options). If you're using a custom DNS server on your network, clients must be able to resolve the fully qualified domain name (FQDN) for the Microsoft Purview account endpoints to the private endpoint's IP address.
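A quick way to sanity-check the custom DNS setup is to confirm that the address the account FQDN resolves to is actually private. The helper below only classifies an IP address; the actual lookup of the FQDN (for example via `socket.gethostbyname`) is left out so the sketch stays runnable offline, and the sample addresses are illustrative.

```python
import ipaddress

def resolves_privately(ip: str) -> bool:
    """True if the resolved address is in a private (RFC 1918 / ULA) range,
    as expected when the private endpoint's record is being served."""
    return ipaddress.ip_address(ip).is_private

print(resolves_privately("10.1.0.4"))   # True  -> likely the private endpoint
print(resolves_privately("20.54.1.1"))  # False -> still resolving to a public IP
```

If the FQDN still resolves to a public address from inside the virtual network, the private DNS zone or conditional forwarder configuration needs to be revisited.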
### Integration runtime options

-- If your data sources are in Azure, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same or a peered virtual network where Azure Purview ingestion private endpoints are deployed. The Azure integration runtime won't work with ingestion private endpoints.
+- If your data sources are in Azure, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same or a peered virtual network where Microsoft Purview ingestion private endpoints are deployed. The Azure integration runtime won't work with ingestion private endpoints.
- To scan on-premises data sources, you can also install a self-hosted integration runtime either on an on-premises Windows machine or on a VM inside an Azure virtual network.
-- When you're using private endpoints with Azure Purview, you need to allow network connectivity from data sources to the self-hosted integration VM on the Azure virtual network where Azure Purview private endpoints are deployed.
+- When you're using private endpoints with Microsoft Purview, you need to allow network connectivity from data sources to the self-hosted integration VM on the Azure virtual network where Microsoft Purview private endpoints are deployed.
- We recommend allowing automatic upgrade of the self-hosted integration runtime. Make sure you open required outbound rules in your Azure virtual network or on your corporate firewall to allow automatic upgrade. For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).

### Authentication options

-- You can't use an Azure Purview managed identity to scan data sources through ingestion private endpoints. Use a service principal, an account key, or SQL authentication, based on data source type.
+- You can't use a Microsoft Purview managed identity to scan data sources through ingestion private endpoints. Use a service principal, an account key, or SQL authentication, based on data source type.
-- Make sure that your credentials are stored in an Azure key vault and registered inside Azure Purview.
+- Make sure that your credentials are stored in an Azure key vault and registered inside Microsoft Purview.
-- You must create a credential in Azure Purview based on each secret that you create in the Azure key vault. You need to assign, at minimum, _get_ and _list_ access for secrets for Azure Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Azure Purview account.
+- You must create a credential in Microsoft Purview based on each secret that you create in the Azure key vault. You need to assign, at minimum, _get_ and _list_ access for secrets for Microsoft Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Microsoft Purview account.
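As one hedged sketch of that Key Vault assignment using the Azure CLI: the account name `contoso-purview`, resource group `purview-rg`, and key vault `contoso-kv` are placeholders, and `az purview account show` assumes the `purview` CLI extension is installed. Only the _get_ and _list_ secret permissions are granted, per the least-privilege guidance above.

```shell
# Look up the object ID of the Purview account's managed identity.
# Names here are illustrative; requires the "purview" CLI extension.
PURVIEW_MI=$(az purview account show \
  --name contoso-purview \
  --resource-group purview-rg \
  --query identity.principalId -o tsv)

# Grant only "get" and "list" on secrets in the key vault that
# stores the scan credentials (service principal secret, account
# key, or SQL password).
az keyvault set-policy \
  --name contoso-kv \
  --object-id "$PURVIEW_MI" \
  --secret-permissions get list
```

If the key vault uses Azure RBAC instead of access policies, assign an equivalent secrets-read role rather than setting a policy.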
### Current limitations - Scanning multiple Azure sources by using the entire subscription or resource group through ingestion private endpoints and a self-hosted integration runtime is not supported when you're using private endpoints for ingestion. Instead, you can register and scan data sources individually. -- For limitations related to Azure Purview private endpoints, see [Known limitations](catalog-private-link-troubleshoot.md#known-limitations).
+- For limitations related to Microsoft Purview private endpoints, see [Known limitations](catalog-private-link-troubleshoot.md#known-limitations).
- For limitations related to the Private Link service, see [Azure Private Link limits](../azure-resource-manager/management/azure-subscription-service-limits.md#private-link-limits).
You must use private endpoints for your Azure Purview account if you have any of
#### Single virtual network, single region
-In this scenario, all Azure data sources, self-hosted integration runtime VMs, and Azure Purview private endpoints are deployed in the same virtual network in an Azure subscription.
+In this scenario, all Azure data sources, self-hosted integration runtime VMs, and Microsoft Purview private endpoints are deployed in the same virtual network in an Azure subscription.
-If on-premises data sources exist, connectivity is provided through a site-to-site VPN or Azure ExpressRoute connectivity to an Azure virtual network where Azure Purview private endpoints are deployed.
+If on-premises data sources exist, connectivity is provided through a site-to-site VPN or Azure ExpressRoute connectivity to an Azure virtual network where Microsoft Purview private endpoints are deployed.
This architecture is suitable mainly for small organizations or for development, testing, and proof-of-concept scenarios. #### Single region, multiple virtual networks
Many customers build their network infrastructure in Azure by using the hub-and-
In hub-and-spoke network architectures, your organization's data governance team can be provided with an Azure subscription that includes a virtual network (hub). All data services can be located in a few other subscriptions connected to the hub virtual network through a virtual network peering or a site-to-site VPN connection.
-In a hub-and-spoke architecture, you can deploy Azure Purview and one or more self-hosted integration runtime VMs in the hub subscription and virtual network. You can register and scan data sources from other virtual networks from multiple subscriptions in the same region.
+In a hub-and-spoke architecture, you can deploy Microsoft Purview and one or more self-hosted integration runtime VMs in the hub subscription and virtual network. You can register and scan data sources from other virtual networks from multiple subscriptions in the same region.
The self-hosted integration runtime VMs can be deployed inside the same Azure virtual network or a peered virtual network where the account and ingestion private endpoints are deployed. You can optionally deploy an additional self-hosted integration runtime in the spoke virtual networks.
If your data sources are distributed across multiple Azure regions in one or mor
For performance and cost optimization, we highly recommend deploying one or more self-hosted integration runtime VMs in each region where data sources are located. ### DNS configuration with private endpoints
-#### Name resolution for multiple Azure Purview accounts
+#### Name resolution for multiple Microsoft Purview accounts
-It is recommended to follow these recommendations, if your organization needs to deploy and maintain multiple Azure Purview accounts using private endpoints:
+Follow these recommendations if your organization needs to deploy and maintain multiple Microsoft Purview accounts using private endpoints:
-1. Deploy at least one _account_ private endpoint for each Azure Purview account.
-2. Deploy at least one set of _ingestion_ private endpoints for each Azure Purview account.
-3. Deploy one _portal_ private endpoint for one of the Azure Purview accounts in your Azure environments. Create one DNS A record for _portal_ private endpoint to resolve `web.purview.azure.com`. The _portal_ private endpoint can be used by all purview accounts in the same Azure virtual network or virtual networks connected through VNet peering.
+1. Deploy at least one _account_ private endpoint for each Microsoft Purview account.
+2. Deploy at least one set of _ingestion_ private endpoints for each Microsoft Purview account.
+3. Deploy one _portal_ private endpoint for one of the Microsoft Purview accounts in your Azure environments. Create one DNS A record for the _portal_ private endpoint to resolve `web.purview.azure.com`. The _portal_ private endpoint can be used by all Microsoft Purview accounts in the same Azure virtual network or in virtual networks connected through VNet peering.
-This scenario also applies if multiple Azure Purview accounts are deployed across multiple subscriptions and multiple VNets that are connected through VNet peering. _Portal_ private endpoint mainly renders static assets related to Azure Purview Studio, thus, it is independent of Azure Purview account, therefore, only one _portal_ private endpoint is needed to visit all Azure Purview accounts in the Azure environment if VNets are connected.
+This scenario also applies if multiple Microsoft Purview accounts are deployed across multiple subscriptions and multiple VNets that are connected through VNet peering. The _portal_ private endpoint mainly renders static assets for Microsoft Purview Studio and is independent of any specific Microsoft Purview account. Therefore, only one _portal_ private endpoint is needed to reach all Microsoft Purview accounts in the Azure environment, as long as the VNets are connected.
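As a sketch of the DNS A record described in step 3, assuming the portal private endpoint already exists and a private DNS zone following the documented Purview portal pattern (`privatelink.purviewstudio.azure.com`) is linked to your VNets; the resource group and IP address are placeholders you must replace with the portal private endpoint's NIC address:

```shell
# Create the "web" A record in the portal private DNS zone so that
# web.purview.azure.com resolves to the portal private endpoint.
# 10.1.0.5 is a placeholder for the private endpoint's IP address.
az network private-dns record-set a add-record \
  --resource-group purview-rg \
  --zone-name "privatelink.purviewstudio.azure.com" \
  --record-set-name web \
  --ipv4-address 10.1.0.5
```

Because the peered VNets share this private DNS zone link, the single record serves every Microsoft Purview account in the connected environment.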
> [!NOTE]
-> You may need to deploy separate _portal_ private endpoints for each Azure Purview account in the scenarios where Azure Purview accounts are deployed in isolated network segmentations.
-> Azure Purview _portal_ is static contents for all customers without any customer information. Optionally, you can use public network, (without portal private endpoint) to launch `web.purview.azure.com` if your end users are allowed to launch the Internet.
+> You may need to deploy separate _portal_ private endpoints for each Microsoft Purview account in scenarios where Microsoft Purview accounts are deployed in isolated network segments.
+> The Microsoft Purview _portal_ serves the same static content for all customers and contains no customer information. Optionally, if your end users are allowed to access the internet, they can reach `web.purview.azure.com` over the public network without a portal private endpoint.
## Option 3: Use both private and public endpoints
You might choose an option in which a subset of your data sources uses private e
If you need to scan some data sources by using an ingestion private endpoint and some data sources by using public endpoints or a service endpoint, you can:
-1. Use private endpoints for your Azure Purview account.
-1. Set **Public network access** to **allow** on your Azure Purview account.
+1. Use private endpoints for your Microsoft Purview account.
+1. Set **Public network access** to **allow** on your Microsoft Purview account.
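Besides the portal, this setting can be scripted. The following is an assumption-laden sketch: it presumes the `purview` CLI extension exposes a `--public-network-access` parameter on `az purview account update` (verify against your installed extension version), and the account and resource group names are placeholders.

```shell
# Allow public network access on the account so that sources without
# private connectivity can still be scanned over public endpoints.
# Parameter name is an assumption; check "az purview account update -h".
az purview account update \
  --name contoso-purview \
  --resource-group purview-rg \
  --public-network-access Enabled
```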
### Integration runtime options -- To scan an Azure data source that's configured with a private endpoint, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same or a peered virtual network where Azure Purview account and ingestion private endpoints are deployed.
+- To scan an Azure data source that's configured with a private endpoint, you need to set up and use a self-hosted integration runtime on a Windows virtual machine that's deployed inside the same or a peered virtual network where Microsoft Purview account and ingestion private endpoints are deployed.
- When you're using a private endpoint with Azure Purview, you need to allow network connectivity from data sources to a self-hosted integration VM on the Azure virtual network where Azure Purview private endpoints are deployed.
+ When you're using a private endpoint with Microsoft Purview, you need to allow network connectivity from data sources to a self-hosted integration VM on the Azure virtual network where Microsoft Purview private endpoints are deployed.
- To scan an Azure data source that's configured to allow a public endpoint, you can use the Azure integration runtime.
If you need to scan some data sources by using an ingestion private endpoint and
- If you use an ingestion private endpoint to scan an Azure data source that's configured with a private endpoint:
- - You can't use an Azure Purview managed identity. Instead, use a service principal, an account key, or SQL authentication, based on the data source type.
+ - You can't use a Microsoft Purview managed identity. Instead, use a service principal, an account key, or SQL authentication, based on the data source type.
- - Make sure that your credentials are stored in an Azure key vault and registered inside Azure Purview.
+ - Make sure that your credentials are stored in an Azure key vault and registered inside Microsoft Purview.
- - You must create a credential in Azure Purview based on each secret that you create in Azure Key Vault. At minimum, assign _get_ and _list_ access for secrets for Azure Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Azure Purview account.
+ - You must create a credential in Microsoft Purview based on each secret that you create in Azure Key Vault. At minimum, assign _get_ and _list_ access for secrets for Microsoft Purview on the Key Vault resource in Azure. Otherwise, the credentials won't work in the Microsoft Purview account.
## Self-hosted integration runtime network and proxy recommendations For scanning data sources across your on-premises and Azure networks, you may need to deploy and use one or multiple [self-hosted integration runtime virtual machines](manage-integration-runtimes.md) inside an Azure VNet or an on-premises network, for any of the scenarios mentioned earlier in this document. -- To simplify management, when possible, use Azure runtime and [Azure Purview Managed runtime](catalog-managed-vnet.md) to scan Azure data sources.
+- To simplify management, when possible, use Azure runtime and [Microsoft Purview Managed runtime](catalog-managed-vnet.md) to scan Azure data sources.
-- The Self-hosted integration runtime service can communicate with Azure Purview through public or private network over port 443. For more information see, [self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
+- The self-hosted integration runtime service can communicate with Microsoft Purview through a public or private network over port 443. For more information, see [self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
-- One self-hosted integration runtime VM can be used to scan one or multiple data sources in Azure Purview, however, self-hosted integration runtime must be only registered for Azure Purview and cannot be used for Azure Data Factory or Azure Synapse at the same time.
+- One self-hosted integration runtime VM can be used to scan one or multiple data sources in Microsoft Purview. However, the self-hosted integration runtime must be registered only for Microsoft Purview; it can't be used for Azure Data Factory or Azure Synapse at the same time.
-- You can register and use one or multiple self-hosted integration runtime in one Azure Purview account. It is recommended to place at least one self-hosted integration runtime VM in each region or on-premises network where your data sources reside.
+- You can register and use one or multiple self-hosted integration runtimes in one Microsoft Purview account. It is recommended to place at least one self-hosted integration runtime VM in each region or on-premises network where your data sources reside.
- It is recommended to define a baseline for required capacity for each self-hosted integration runtime VM and scale the VM capacity based on demand. -- It is recommended to setup network connection between self-hosted integration runtime VMs and Azure Purview and its managed resources through private network, when possible.
+- When possible, set up the network connection between self-hosted integration runtime VMs and Microsoft Purview and its managed resources through a private network.
- Allow outbound connectivity to download.microsoft.com, if auto-update is enabled. - The self-hosted integration runtime service does not require outbound internet connectivity, if self-hosted integration runtime VMs are deployed in an Azure VNet or in the on-premises network that is connected to Azure through an ExpressRoute or Site to Site VPN connection. In this case, the scan and metadata ingestion process can be done through private network. -- Self-hosted integration runtime can communicate Azure Purview and its managed resources directly or through [a proxy server](manage-integration-runtimes.md#proxy-server-considerations). Avoid using proxy settings if self-hosted integration runtime VM is inside an Azure VNet or connected through ExpressRoute or Site to Site VPN connection.
+- The self-hosted integration runtime can communicate with Microsoft Purview and its managed resources directly or through [a proxy server](manage-integration-runtimes.md#proxy-server-considerations). Avoid using proxy settings if the self-hosted integration runtime VM is inside an Azure VNet or connected through an ExpressRoute or site-to-site VPN connection.
- Review supported scenarios, if you need to use self-hosted integration runtime with [proxy setting](manage-integration-runtimes.md#proxy-server-considerations). ## Next steps-- [Use private endpoints for secure access to Azure Purview](./catalog-private-link.md)
+- [Use private endpoints for secure access to Microsoft Purview](./catalog-private-link.md)
purview Concept Best Practices Scanning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-scanning.md
Title: Best practices for scanning data sources in Azure Purview
-description: This article provides best practices for registering and scanning various data sources in Azure Purview.
+ Title: Best practices for scanning data sources in Microsoft Purview
+description: This article provides best practices for registering and scanning various data sources in Microsoft Purview.
Last updated 10/08/2021
-# Azure Purview scanning best practices
+# Microsoft Purview scanning best practices
-Azure Purview supports automated scanning of on-premises, multicloud, and software as a service (SaaS) data sources.
+Microsoft Purview supports automated scanning of on-premises, multicloud, and software as a service (SaaS) data sources.
Running a *scan* invokes the process to ingest metadata from the registered data sources. The metadata curated at the end of the scan and curation process includes technical metadata. This metadata can include data asset names such as table names or file names, file size, columns, and data lineage. Schema details are also captured for structured data sources. A relational database management system is an example of this type of source.
-The curation process applies automated classification labels on the schema attributes based on the scan rule set configured. Sensitivity labels are applied if your Azure Purview account is connected to the Microsoft 365 Security and Compliance Center.
+The curation process applies automated classification labels on the schema attributes based on the scan rule set configured. Sensitivity labels are applied if your Microsoft Purview account is connected to the Microsoft 365 Security and Compliance Center.
## Why do you need best practices to manage data sources?
The following design considerations and recommendations help you register a sour
### Design considerations - Use collections to create the hierarchy that aligns with the organization's strategy, like geographical, business function, or source of data. The hierarchy defines the data sources to be registered and scanned.-- By design, you can't register data sources multiple times in the same Azure Purview account. This architecture helps to avoid the risk of assigning different access control to the same data source.
+- By design, you can't register data sources multiple times in the same Microsoft Purview account. This architecture helps to avoid the risk of assigning different access control to the same data source.
### Design recommendations - If the metadata of the same data source is consumed by multiple teams, you can register and manage the data source at a parent collection. Then you can create corresponding scans under each subcollection. In this way, relevant assets appear under each child collection. Sources without parents are grouped in a dotted box in the map view. No arrows link them to parents.
- :::image type="content" source="media/concept-best-practices/scanning-parent-child.png" alt-text="Screenshot that shows Azure Purview with data source registered at parent collection.":::
+ :::image type="content" source="media/concept-best-practices/scanning-parent-child.png" alt-text="Screenshot that shows Microsoft Purview with data source registered at parent collection.":::
- Use the **Azure Multiple** option if you need to register multiple sources, such as Azure subscriptions or resource groups, in the cloud. For more information, see the following documentation:
- * [Scan multiple sources in Azure Purview](./register-scan-azure-multiple-sources.md)
+ * [Scan multiple sources in Microsoft Purview](./register-scan-azure-multiple-sources.md)
* [Check data source readiness at scale](./tutorial-data-sources-readiness.md)
- * [Configure access to data sources for Azure Purview MSI at scale](./tutorial-msi-configuration.md)
+ * [Configure access to data sources for Microsoft Purview MSI at scale](./tutorial-msi-configuration.md)
- After a data source is registered, you might scan the same source multiple times, in case the same source is being used differently by various teams or business units.
After you register your source in the relevant [collection](./how-to-create-and-
:::image type="content" source="media/concept-best-practices/scanning-create-custom-scan-rule-set.png" alt-text="Screenshot that shows the option to select relevant classification rules when you create the custom scan rule set."::: > [!NOTE]
- > When you scan a storage account, Azure Purview uses a set of defined patterns to determine if a group of assets forms a resource set. You can use resource set pattern rules to customize or override how Azure Purview detects which assets are grouped as resource sets. The rules also determine how the assets are displayed within the catalog.
+ > When you scan a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets forms a resource set. You can use resource set pattern rules to customize or override how Microsoft Purview detects which assets are grouped as resource sets. The rules also determine how the assets are displayed within the catalog.
> For more information, see [Create resource set pattern rules](./how-to-resource-set-pattern-rules.md). > This feature has cost considerations. For information, see the [pricing page](https://azure.microsoft.com/pricing/details/azure-purview/). 1. Set up a scan for the registered data sources.
- - **Scan name**: By default, Azure Purview uses the naming convention **SCAN-[A-Z][a-z][a-z]**, which isn't helpful when you're trying to identify a scan that you've run. Be sure to use a meaningful naming convention. For instance, you could name the scan _environment-source-frequency-time_ as DEVODS-Daily-0200. This name represents a daily scan at 0200 hours.
+ - **Scan name**: By default, Microsoft Purview uses the naming convention **SCAN-[A-Z][a-z][a-z]**, which isn't helpful when you're trying to identify a scan that you've run. Be sure to use a meaningful naming convention. For instance, you could name the scan _environment-source-frequency-time_ as DEVODS-Daily-0200. This name represents a daily scan at 0200 hours.
- - **Authentication**: Azure Purview offers various authentication methods for scanning data sources, depending on the type of source. It could be Azure cloud or on-premises or third-party sources. Follow the least-privilege principle for the authentication method in this order of preference:
- - Azure Purview MSI - Managed Service Identity (for example, for Azure Data Lake Storage Gen2 sources)
+ - **Authentication**: Microsoft Purview offers various authentication methods for scanning data sources, depending on the type of source, whether it's in the Azure cloud, on-premises, or a third-party source. Follow the least-privilege principle for the authentication method in this order of preference:
+ - Microsoft Purview MSI - Managed Service Identity (for example, for Azure Data Lake Storage Gen2 sources)
- User-assigned managed identity - Service principal - SQL authentication (for example, for on-premises or Azure SQL sources)
After you register your source in the relevant [collection](./how-to-create-and-
- When you use SHIR, make sure that the memory is sufficient for the data source being scanned. For example, when you use SHIR for scanning an SAP source, if you see "out of memory error": - Ensure the SHIR machine has enough memory. The recommended amount is 128 GB. - In the scan setting, set the maximum memory available as some appropriate value, for example, 100.
- - For more information, see the prerequisites in [Scan to and manage SAP ECC Azure Purview](./register-scan-sapecc-source.md#create-and-run-scan).
+ - For more information, see the prerequisites in [Scan to and manage SAP ECC Microsoft Purview](./register-scan-sapecc-source.md#create-and-run-scan).
- **Scope scan** - When you set up the scope for the scan, select only the assets that are relevant at a granular level or parent level. This practice ensures that the scan cost is optimal and performance is efficient. All future assets under a certain parent will be automatically selected if the parent is fully or partially checked.
After you register your source in the relevant [collection](./how-to-create-and-
- **Scan rule set** - When you select the scan rule set, make sure to configure the relevant system or custom scan rule set that was created earlier.
- - You can create custom filetypes and fill in the details accordingly. Currently, Azure Purview supports only one character in Custom Delimiter. If you use custom delimiters, such as ~, in your actual data, you need to create a new scan rule set.
+ - You can create custom file types and fill in the details accordingly. Currently, Microsoft Purview supports only one character as a custom delimiter. If you use custom delimiters, such as ~, in your actual data, you need to create a new scan rule set.
:::image type="content" source="media/concept-best-practices/scanning-scan-rule-set.png" alt-text="Screenshot that shows the scan rule set selection while configuring the scan.":::
After you register your source in the relevant [collection](./how-to-create-and-
### Points to note -- If a field or column, table, or a file is removed from the source system after the scan was executed, it will only be reflected (removed) in Azure Purview after the next scheduled full or incremental scan.-- An asset can be deleted from an Azure Purview catalog by using the **Delete** icon under the name of the asset. This action won't remove the object in the source. If you run a full scan on the same source, it would get reingested in the catalog. If you've scheduled a weekly or monthly scan instead (incremental), the deleted asset won't be picked unless the object is modified at the source. An example is if a column is added or removed from the table.-- To understand the behavior of subsequent scans after *manually* editing a data asset or an underlying schema through Azure Purview Studio, see [Catalog asset details](./catalog-asset-details.md#scans-on-edited-assets).
+- If a field, column, table, or file is removed from the source system after the scan was executed, it will be reflected (removed) in Microsoft Purview only after the next scheduled full or incremental scan.
+- An asset can be deleted from a Microsoft Purview catalog by using the **Delete** icon under the name of the asset. This action won't remove the object in the source. If you run a full scan on the same source, it would get reingested in the catalog. If you've scheduled a weekly or monthly scan instead (incremental), the deleted asset won't be picked unless the object is modified at the source. An example is if a column is added or removed from the table.
+- To understand the behavior of subsequent scans after *manually* editing a data asset or an underlying schema through Microsoft Purview Studio, see [Catalog asset details](./catalog-asset-details.md#scans-on-edited-assets).
- For more information, see the tutorial on [how to view, edit, and delete assets](./catalog-asset-details.md). ## Next steps
purview Concept Best Practices Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-security.md
Title: Azure Purview security best practices
-description: This article provides Azure Purview best practices.
+ Title: Microsoft Purview security best practices
+description: This article provides Microsoft Purview best practices.
Last updated 12/05/2021
-# Azure Purview security best practices
+# Microsoft Purview security best practices
-This article provides best practices for common security requirements in Azure Purview. The security strategy described follows the layered defense-in-depth approach.
+This article provides best practices for common security requirements in Microsoft Purview. The security strategy described follows the layered defense-in-depth approach.
Before applying these recommendations to your environment, you should consult your security team as some may not be applicable to your security requirements. ## Network security
-Azure Purview is a Platform as a Service (PaaS) solution in Azure. You can enable the following network security capabilities for your Azure Purview accounts:
+Microsoft Purview is a Platform as a Service (PaaS) solution in Azure. You can enable the following network security capabilities for your Microsoft Purview accounts:
- Enable [end-to-end network isolation](catalog-private-link-end-to-end.md) using Private Link Service.-- Use [Azure Purview Firewall](catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to disable Public access.-- Deploy [Network Security Group (NSG) rules](#use-network-security-groups) for subnets where Azure data sources private endpoints, Azure Purview private endpoints and self-hosted runtime VMs are deployed.-- Implement Azure Purview with private endpoints managed by a Network Virtual Appliance, such as [Azure Firewall](../firewall/overview.md) for network inspection and network filtering.
+- Use [Microsoft Purview Firewall](catalog-private-link-end-to-end.md#firewalls-to-restrict-public-access) to disable Public access.
+- Deploy [Network Security Group (NSG) rules](#use-network-security-groups) for subnets where Azure data sources private endpoints, Microsoft Purview private endpoints and self-hosted runtime VMs are deployed.
+- Implement Microsoft Purview with private endpoints managed by a Network Virtual Appliance, such as [Azure Firewall](../firewall/overview.md) for network inspection and network filtering.
For more information, see [Best practices related to connectivity to Azure PaaS Services](/azure/cloud-adoption-framework/ready/azure-best-practices/connectivity-to-azure-paas-services).
-### Deploy private endpoints for Azure Purview accounts
+### Deploy private endpoints for Microsoft Purview accounts
-If you need to use Azure Purview from inside your private network, it is recommended to use Azure Private Link Service with your Azure Purview accounts for partial or [end-to-end isolation](catalog-private-link-end-to-end.md) to connect to Azure Purview Studio, access Azure Purview endpoints and to scan data sources.
+If you need to use Microsoft Purview from inside your private network, it is recommended to use Azure Private Link Service with your Microsoft Purview accounts for partial or [end-to-end isolation](catalog-private-link-end-to-end.md) to connect to Microsoft Purview Studio, access Microsoft Purview endpoints and to scan data sources.
-The Azure Purview _account_ private endpoint is used to add another layer of security, so only client calls that are originated from within the virtual network are allowed to access the Azure Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
+The Microsoft Purview _account_ private endpoint is used to add another layer of security, so only client calls that are originated from within the virtual network are allowed to access the Microsoft Purview account. This private endpoint is also a prerequisite for the portal private endpoint.
-The Azure Purview _portal_ private endpoint is required to enable connectivity to Azure Purview Studio using a private network.
+The Microsoft Purview _portal_ private endpoint is required to enable connectivity to Microsoft Purview Studio using a private network.
-Azure Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints.
+Microsoft Purview can scan data sources in Azure or an on-premises environment by using ingestion private endpoints.
- For scanning Azure _platform as a service_ data sources, review [Support matrix for scanning data sources through ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint).-- If you are deploying Azure Purview with end-to-end network isolation, to scan Azure data sources, these data sources must be also configured with private endpoints.
+- If you are deploying Microsoft Purview with end-to-end network isolation, to scan Azure data sources, these data sources must be also configured with private endpoints.
- Review [known limitations](catalog-private-link-troubleshoot.md).
-For more information, see [Azure Purview network architecture and best practices](concept-best-practices-network.md).
+For more information, see [Microsoft Purview network architecture and best practices](concept-best-practices-network.md).
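The _account_ and _portal_ private endpoints described above can be sketched with the Azure CLI as follows. Resource, VNet, and subnet names are placeholders; the `--group-id` values `account` and `portal` are the Purview sub-resources, and the account endpoint is created first because it's a prerequisite for the portal endpoint.

```shell
# Resolve the Purview account's resource ID (requires the "purview"
# CLI extension; names are illustrative).
PURVIEW_ID=$(az purview account show \
  --name contoso-purview --resource-group purview-rg \
  --query id -o tsv)

# Account private endpoint: restricts API calls to the virtual network.
az network private-endpoint create \
  --name contoso-purview-account-pe \
  --resource-group purview-rg \
  --vnet-name hub-vnet --subnet pe-subnet \
  --private-connection-resource-id "$PURVIEW_ID" \
  --group-id account \
  --connection-name purview-account-conn

# Portal private endpoint: enables Microsoft Purview Studio over
# the private network.
az network private-endpoint create \
  --name contoso-purview-portal-pe \
  --resource-group purview-rg \
  --vnet-name hub-vnet --subnet pe-subnet \
  --private-connection-resource-id "$PURVIEW_ID" \
  --group-id portal \
  --connection-name purview-portal-conn
```

Ingestion private endpoints target the account's managed resources and are typically created through the Microsoft Purview ingestion private endpoint experience rather than individual CLI calls.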
-### Block public access using Azure Purview firewall
+### Block public access using Microsoft Purview firewall
-You can disable Azure Purview Public access to cut off access to the Azure Purview account completely from the public internet. In this case, you should consider the following requirements:
+You can disable public access to your Microsoft Purview account to cut it off completely from the public internet. In this case, you should consider the following requirements:
-- Azure Purview must be deployed based on [end-to-end network isolation scenario](catalog-private-link-end-to-end.md).-- To access Azure Purview Studio and Azure Purview endpoints, you need to use a management machine that is connected to private network to access Azure Purview through private network.
+- Microsoft Purview must be deployed based on [end-to-end network isolation scenario](catalog-private-link-end-to-end.md).
+- To access Microsoft Purview Studio and Microsoft Purview endpoints, you need to use a management machine that is connected to the private network.
- Review [known limitations](catalog-private-link-troubleshoot.md).
- To scan Azure platform as a service data sources, review [Support matrix for scanning data sources through ingestion private endpoint](catalog-private-link.md#support-matrix-for-scanning-data-sources-through-ingestion-private-endpoint).
- Azure data sources must be also configured with private endpoints.
For more information, see [Firewalls to restrict public access](catalog-private-
You can use an Azure network security group to filter network traffic to and from Azure resources in an Azure virtual network. A network security group contains [security rules](../virtual-network/network-security-groups-overview.md#security-rules) that allow or deny inbound network traffic to, or outbound network traffic from, several types of Azure resources. For each rule, you can specify source and destination, port, and protocol.
-Network Security Groups can be applied to network interface or Azure virtual networks subnets, where Azure Purview private endpoints, self-hosted integration runtime VMs and Azure data sources are deployed.
+Network Security Groups can be applied to network interfaces or Azure virtual network subnets, where Microsoft Purview private endpoints, self-hosted integration runtime VMs, and Azure data sources are deployed.
For more information, see [apply NSG rules for private endpoints](../private-link/disable-private-endpoint-network-policy.md).
-The following NSG rules are required on **data sources** for Azure Purview scanning:
+The following NSG rules are required on **data sources** for Microsoft Purview scanning:
|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
|---------|---------|---------|---------|---------|---------|---------|
|Inbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Data Sources private IP addresses or Subnets | 443 | Any | Allow |
-The following NSG rules are required on from the **management machines** to access Azure Purview Studio:
+The following NSG rules are required from the **management machines** to access Microsoft Purview Studio:
|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
|---------|---------|---------|---------|---------|---------|---------|
-|Outbound | Management machines' private IP addresses or subnets | * | Azure Purview account and portal private endpoint IP addresses or subnets | 443 | Any | Allow |
+|Outbound | Management machines' private IP addresses or subnets | * | Microsoft Purview account and portal private endpoint IP addresses or subnets | 443 | Any | Allow |
|Outbound | Management machines' private IP addresses or subnets | * | Service tag: `AzureCloud` | 443 | Any | Allow |
-The following NSG rules are required on **self-hosted integration runtime VMs** for Azure Purview scanning and metadata ingestion:
+The following NSG rules are required on **self-hosted integration runtime VMs** for Microsoft Purview scanning and metadata ingestion:
> [!IMPORTANT]
> Consider adding additional rules with relevant Service Tags, based on your data source types.
|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
|---------|---------|---------|---------|---------|---------|---------|
|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Data Sources private IP addresses or subnets | 443 | Any | Allow |
-|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Azure Purview account and ingestion private endpoint IP addresses or Subnets | 443 | Any | Allow |
+|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or Subnets | 443 | Any | Allow |
|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `Servicebus` | 443 | Any | Allow |
|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `Storage` | 443 | Any | Allow |
|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `AzureActiveDirectory` | 443 | Any | Allow |
|Outbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Service tag: `KeyVault` | 443 | Any | Allow |
-The following NSG rules are required on for **Azure Purview account, portal and ingestion private endpoints**:
+The following NSG rules are required for the **Microsoft Purview account, portal and ingestion private endpoints**:
|Direction |Source |Source port range |Destination |Destination port |Protocol |Action |
|---------|---------|---------|---------|---------|---------|---------|
-|Inbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Azure Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
-|Inbound | Management machines' private IP addresses or subnets | * | Azure Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
+|Inbound | Self-hosted integration runtime VMs' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
+|Inbound | Management machines' private IP addresses or subnets | * | Microsoft Purview account and ingestion private endpoint IP addresses or subnets | 443 | Any | Allow |
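The rules in the tables above amount to a short allow-list of flows. As a rough illustration only, the sketch below models them as data with a lookup helper (the subnet names and the `is_allowed` function are hypothetical, and real NSGs evaluate rules by priority rather than simple membership):

```python
# Hypothetical model of the documented NSG Allow rules.
# Each tuple: (direction, source, destination, destination port).
RULES = {
    ("Inbound",  "shir-subnet", "datasource-subnet", 443),
    ("Outbound", "mgmt-subnet", "purview-pe-subnet", 443),
    ("Outbound", "mgmt-subnet", "AzureCloud",        443),
    ("Outbound", "shir-subnet", "datasource-subnet", 443),
    ("Outbound", "shir-subnet", "purview-pe-subnet", 443),
    ("Inbound",  "shir-subnet", "purview-pe-subnet", 443),
    ("Inbound",  "mgmt-subnet", "purview-pe-subnet", 443),
}

def is_allowed(direction, source, destination, port):
    """Return True if a flow matches one of the documented Allow rules."""
    return (direction, source, destination, port) in RULES

# A self-hosted integration runtime scanning a data source over 443 is allowed;
# the same flow over port 80 is not covered by any rule.
print(is_allowed("Outbound", "shir-subnet", "datasource-subnet", 443))  # True
print(is_allowed("Outbound", "shir-subnet", "datasource-subnet", 80))   # False
```

Note that all documented flows use destination port 443; anything else should be denied by your default rules.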
For more information, see [Self-hosted integration runtime networking requirements](manage-integration-runtimes.md#networking-requirements).
Identity and Access Management provides the basis of a large percentage of security assurance. It enables access based on identity authentication and authorization controls in cloud services. These controls protect data and resources and decide which requests should be permitted.
-Related to roles and access management in Azure Purview, you can apply the following security best practices:
+Related to roles and access management in Microsoft Purview, you can apply the following security best practices:
-- Define roles and responsibilities to manage Azure Purview in control plane and data plane:
- - Define roles and tasks required to deploy and manage Azure Purview inside an Azure subscription.
- - Define roles and task needed to perform data management and governance using Azure Purview.
+- Define roles and responsibilities to manage Microsoft Purview in control plane and data plane:
+ - Define roles and tasks required to deploy and manage Microsoft Purview inside an Azure subscription.
+ - Define roles and tasks needed to perform data management and governance using Microsoft Purview.
- Assign roles to Azure Active Directory groups instead of assigning roles to individual users.
- Use Azure [Active Directory Entitlement Management](../active-directory/governance/entitlement-management-overview.md) to map user access to Azure AD groups using Access Packages.
-- Enforce multi-factor authentication for Azure Purview users, especially, for users with privileged roles such as collection admins, data source admins or data curators.
+- Enforce multi-factor authentication for Microsoft Purview users, especially for users with privileged roles such as collection admins, data source admins, or data curators.
-### Manage an Azure Purview account in control plane and data plane
+### Manage a Microsoft Purview account in control plane and data plane
-Control plane refers to all operations related to Azure deployment and management of Azure Purview inside Azure Resource Manager.
+Control plane refers to all operations related to Azure deployment and management of Microsoft Purview inside Azure Resource Manager.
-Data plane refers to all operations, related to interacting with Azure Purview inside Data Map and Data Catalog.
+Data plane refers to all operations related to interacting with Microsoft Purview inside Data Map and Data Catalog.
-You can assign control plane and data plane roles to users, security groups and service principals from your Azure Active Directory tenant that is associated to Azure Purview instance's Azure subscription.
+You can assign control plane and data plane roles to users, security groups, and service principals from the Azure Active Directory tenant that is associated with the Microsoft Purview instance's Azure subscription.
Examples of control plane operations and data plane operations:

|Task |Scope |Recommended role |What roles to use? |
|---------|---------|---------|---------|
-|Deploy an Azure Purview account | Control plane | Azure subscription owner or contributor | Azure RBAC roles |
-|Set up a Private Endpoint for Azure Purview | Control plane | Contributor  | Azure RBAC roles |
-|Delete an Azure Purview account | Control plane | Contributor  | Azure RBAC roles |
-|View Azure Purview metrics to get current capacity units | Control plane | Reader | Azure RBAC roles |
-|Create a collection | Data plane | Collection Admin | Azure Purview roles |
-|Register a data source | Data plane | Collection Admin | Azure Purview roles |
-|Scan a SQL Server | Data plane | Data source admin and data reader or data curator | Azure Purview roles |
-|Search inside Azure Purview Data Catalog | Data plane | Data source admin and data reader or data curator | Azure Purview roles |
+|Deploy a Microsoft Purview account | Control plane | Azure subscription owner or contributor | Azure RBAC roles |
+|Set up a Private Endpoint for Microsoft Purview | Control plane | Contributor  | Azure RBAC roles |
+|Delete a Microsoft Purview account | Control plane | Contributor  | Azure RBAC roles |
+|View Microsoft Purview metrics to get current capacity units | Control plane | Reader | Azure RBAC roles |
+|Create a collection | Data plane | Collection Admin | Microsoft Purview roles |
+|Register a data source | Data plane | Collection Admin | Microsoft Purview roles |
+|Scan a SQL Server | Data plane | Data source admin and data reader or data curator | Microsoft Purview roles |
+|Search inside Microsoft Purview Data Catalog | Data plane | Data source admin and data reader or data curator | Microsoft Purview roles |
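The key distinction in the table above is which role system authorizes a given task: Azure RBAC for the control plane, Microsoft Purview roles for the data plane. A minimal sketch of that mapping (the task names below are illustrative, not an Azure API):

```python
# Hypothetical lookup of which role system governs each example task.
PLANE_BY_TASK = {
    "deploy_account":          ("control", "Azure RBAC roles"),
    "set_up_private_endpoint": ("control", "Azure RBAC roles"),
    "delete_account":          ("control", "Azure RBAC roles"),
    "view_metrics":            ("control", "Azure RBAC roles"),
    "create_collection":       ("data", "Microsoft Purview roles"),
    "register_data_source":    ("data", "Microsoft Purview roles"),
    "scan_sql_server":         ("data", "Microsoft Purview roles"),
    "search_data_catalog":     ("data", "Microsoft Purview roles"),
}

def role_system(task):
    """Return the role system (Azure RBAC or Purview roles) for a task."""
    plane, system = PLANE_BY_TASK[task]
    return system

print(role_system("delete_account"))     # Azure RBAC roles
print(role_system("create_collection"))  # Microsoft Purview roles
```

The practical takeaway: an Azure subscription Owner can delete the account but still has no catalog access until granted a Purview role on a collection.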
-Azure Purview plane roles are defined and managed inside Azure Purview instance in Azure Purview collections. For more information, see [Access control in Azure Purview](catalog-permissions.md#roles).
+Microsoft Purview data plane roles are defined and managed in collections inside the Microsoft Purview instance. For more information, see [Access control in Microsoft Purview](catalog-permissions.md#roles).
Follow [Azure role-based access recommendations](../role-based-access-control/best-practices.md) for Azure control plane tasks.

### Authentication and authorization
-To gain access to Azure Purview, users must be authenticated and authorized. Authentication is the process of proving the user is who they claim to be. Authorization refers to controlling access inside Azure Purview assigned on collections.
+To gain access to Microsoft Purview, users must be authenticated and authorized. Authentication is the process of proving the user is who they claim to be. Authorization refers to controlling access inside Microsoft Purview, which is assigned on collections.
-We use Azure Active Directory to provide authentication and authorization mechanisms for Azure Purview inside Collections. You can assign Azure Purview roles to the following security principals from your Azure Active Directory tenant that is associated with Azure subscription where your Azure Purview instance is hosted:
+We use Azure Active Directory to provide authentication and authorization mechanisms for Microsoft Purview inside Collections. You can assign Microsoft Purview roles to the following security principals from the Azure Active Directory tenant that is associated with the Azure subscription where your Microsoft Purview instance is hosted:
- Users and guest users (if they are already added into your Azure AD tenant)
- Security groups
- Managed Identities
- Service Principals
-Azure Purview fine-grained roles can be assigned to a flexible Collections hierarchy inside the Azure Purview instance.
+Microsoft Purview fine-grained roles can be assigned to a flexible Collections hierarchy inside the Microsoft Purview instance.
### Define Least Privilege model

As a general rule, restricting access based on the [need to know](https://en.wikipedia.org/wiki/Need_to_know) and [least privilege](https://en.wikipedia.org/wiki/Principle_of_least_privilege) security principles is imperative for organizations that want to enforce security policies for data access.
-In Azure Purview, data sources, assets and scans can be organized using [Azure Purview Collections](quickstart-create-collection.md). Collections are hierarchical grouping of metadata in Azure Purview, but at the same time they provide a mechanism to manage access across Azure Purview. Roles in Azure Purview can be assigned to a collection based on your collection's hierarchy.
+In Microsoft Purview, data sources, assets, and scans can be organized using [Microsoft Purview Collections](quickstart-create-collection.md). Collections are hierarchical groupings of metadata in Microsoft Purview, but at the same time they provide a mechanism to manage access across Microsoft Purview. Roles in Microsoft Purview can be assigned to a collection based on your collection's hierarchy.
-Use [Azure Purview collections](concept-best-practices-collections.md#define-a-collection-hierarchy) to implement your organization's metadata hierarchy for centralized or delegated management and governance hierarchy based on least privileged model.
+Use [Microsoft Purview collections](concept-best-practices-collections.md#define-a-collection-hierarchy) to implement your organization's metadata hierarchy for centralized or delegated management, and a governance hierarchy based on the least-privilege model.
-Follow least privilege access model when assigning roles inside Azure Purview collections by segregating duties within your team and grant only the amount of access to users that they need to perform their jobs.
+Follow the least-privilege access model when assigning roles inside Microsoft Purview collections: segregate duties within your team and grant users only the access they need to perform their jobs.
-For more information how to assign least privilege access model in Azure Purview, based on Azure Purview collection hierarchy, see [Access control in Azure Purview](catalog-permissions.md#assign-permissions-to-your-users).
+For more information about how to apply a least-privilege access model in Microsoft Purview, based on the Microsoft Purview collection hierarchy, see [Access control in Microsoft Purview](catalog-permissions.md#assign-permissions-to-your-users).
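One way to picture least-privilege assignment across a collection hierarchy is role inheritance down the tree: assign a role at the lowest collection that covers a user's duties. The sketch below is purely illustrative (the hierarchy, user names, and the inheritance helper are hypothetical; see the linked access-control article for the actual rules):

```python
# Hypothetical collection tree: child -> parent (None marks the root).
HIERARCHY = {
    "Root": None,
    "Finance": "Root",
    "Finance-EU": "Finance",
    "HR": "Root",
}

# Role assignments made at specific collections.
ASSIGNMENTS = {
    ("Root", "alice"): "Collection Admin",
    ("Finance", "bob"): "Data Curator",
}

def effective_roles(collection, user):
    """Collect roles assigned on this collection or any ancestor."""
    roles = []
    node = collection
    while node is not None:
        role = ASSIGNMENTS.get((node, user))
        if role:
            roles.append(role)
        node = HIERARCHY[node]
    return roles

# Bob's Data Curator role on Finance flows down to Finance-EU,
# but gives him nothing on the unrelated HR branch.
print(effective_roles("Finance-EU", "bob"))  # ['Data Curator']
print(effective_roles("HR", "bob"))          # []
```

Assigning Bob at **Finance** rather than **Root** is exactly the least-privilege choice the guidance above recommends.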
### Lower exposure of privileged accounts

Securing privileged access is a critical first step to protecting business assets. Minimizing the number of people who have access to secure information or resources reduces the chance of a malicious user getting access, or an authorized user inadvertently affecting a sensitive resource.
-Reduce the number of users with write access inside your Azure Purview instance. Keep the number of collection admins and data curator roles minimum at root collection.
+Reduce the number of users with write access inside your Microsoft Purview instance. Keep the number of collection admin and data curator role assignments at the root collection to a minimum.
### Use multi-factor authentication and conditional access

[Azure Active Directory Multi-Factor Authentication](../active-directory/authentication/concept-mfa-howitworks.md) provides another layer of security and authentication. For more security, we recommend enforcing [conditional access policies](../active-directory/conditional-access/overview.md) for all privileged accounts.
-By using Azure Active Directory Conditional Access policies, apply Azure AD Multi-Factor Authentication at sign-in for all individual users who are assigned to Azure Purview roles with modify access inside your Azure Purview instances: Collection Admin, Data Source Admin, Data Curator.
+By using Azure Active Directory Conditional Access policies, apply Azure AD Multi-Factor Authentication at sign-in for all individual users who are assigned Microsoft Purview roles with modify access inside your Microsoft Purview instances: Collection Admin, Data Source Admin, Data Curator.
Enable multi-factor authentication for your admin accounts and ensure that admin account users have registered for MFA.
-You can define your Conditional Access policies by selecting Azure Purview as a Cloud App.
+You can define your Conditional Access policies by selecting Microsoft Purview as a Cloud App.
-### Prevent accidental deletion of Azure Purview accounts
+### Prevent accidental deletion of Microsoft Purview accounts
In Azure, you can apply [resource locks](../azure-resource-manager/management/lock-resources.md) to an Azure subscription, a resource group, or a resource to prevent accidental deletion or modification for critical resources.
-Enable Azure resource lock for your Azure Purview accounts to prevent accidental deletion of Azure Purview instances in your Azure subscriptions.
+Enable Azure resource lock for your Microsoft Purview accounts to prevent accidental deletion of Microsoft Purview instances in your Azure subscriptions.
-Adding a `CanNotDelete` or `ReadOnly` lock to Azure Purview account does not prevent deletion or modification operations inside Azure Purview data plane, however, it prevents any operations in control plane, such as deleting the Azure Purview account, deploying a private endpoint or configuration of diagnostic settings.
+Adding a `CanNotDelete` or `ReadOnly` lock to a Microsoft Purview account does not prevent deletion or modification operations inside the Microsoft Purview data plane; however, it prevents any operations in the control plane, such as deleting the Microsoft Purview account, deploying a private endpoint, or configuring diagnostic settings.
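The lock scope just described can be summarized as a small decision rule: locks act on the control plane only. A hypothetical sketch (operation names are illustrative, not an Azure API):

```python
# Hypothetical model of Azure resource-lock scope for a Purview account:
# locks never reach the data plane; ReadOnly blocks any control-plane
# change; CanNotDelete blocks only control-plane deletes.
def is_blocked(operation, plane, lock):
    """Return True if the given lock stops the operation."""
    if plane == "data":            # e.g., editing assets in the catalog
        return False
    if lock == "ReadOnly":
        return operation != "read"
    if lock == "CanNotDelete":
        return operation == "delete"
    return False

print(is_blocked("delete", "control", "CanNotDelete"))  # True
print(is_blocked("delete", "data", "CanNotDelete"))     # False
print(is_blocked("update", "control", "ReadOnly"))      # True
```

This is why a locked account can still have its Data Map edited: deleting an asset is a data-plane operation, while deleting the account itself is control plane.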
For more information, see [Understand scope of locks](../azure-resource-manager/management/lock-resources.md#understand-scope-of-locks).
-Resource locks can be assigned to Azure Purview resource groups or resources, however, you cannot assign an Azure resource lock to Azure Purview Managed resources or managed Resource Group.
+Resource locks can be assigned to Microsoft Purview resource groups or resources; however, you cannot assign an Azure resource lock to Microsoft Purview managed resources or the managed resource group.
### Implement a break glass strategy
-Plan for a break glass strategy for your Azure Active Directory tenant, Azure subscription and Azure Purview accounts to prevent tenant-wide account lockout.
+Plan for a break glass strategy for your Azure Active Directory tenant, Azure subscription and Microsoft Purview accounts to prevent tenant-wide account lockout.
For more information about Azure AD and Azure emergency access planning, see [Manage emergency access accounts in Azure AD](../active-directory/roles/security-emergency-access.md).
-For more information about Azure Purview break glass strategy, see [Azure Purview collections best practices and design recommendations](concept-best-practices-collections.md#design-recommendations).
+For more information about Microsoft Purview break glass strategy, see [Microsoft Purview collections best practices and design recommendations](concept-best-practices-collections.md#design-recommendations).
## Threat protection and preventing data exfiltration
-Azure Purview provides rich insights into the sensitivity of your data, which makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. To address this challenge, we're announcing the integration between Microsoft Defender for Cloud and Azure Purview in public preview.
+Microsoft Purview provides rich insights into the sensitivity of your data, which makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. To address this challenge, we're announcing the integration between Microsoft Defender for Cloud and Microsoft Purview in public preview.
### Integrate with Microsoft 365 and Microsoft Defender for Cloud
-Often, one of the biggest challenges for security organization in a company is to identify and protect assets based on their criticality and sensitivity. Microsoft recently [announced integration between Azure Purview and Microsoft Defender for Cloud in Public Preview](https://techcommunity.microsoft.com/t5/azure-purview-blog/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954) to help overcome these challenges.
+Often, one of the biggest challenges for a company's security organization is to identify and protect assets based on their criticality and sensitivity. Microsoft recently [announced integration between Microsoft Purview and Microsoft Defender for Cloud in Public Preview](https://techcommunity.microsoft.com/t5/azure-purview-blog/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954) to help overcome these challenges.
-If you have extended your Microsoft 365 sensitivity labels for assets and database columns in Azure Purview, you can keep track of highly valuable assets using Microsoft Defender for Cloud from inventory, alerts and recommendations based on assets detected sensitivity labels.
+If you have extended your Microsoft 365 sensitivity labels to assets and database columns in Microsoft Purview, you can keep track of highly valuable assets using Microsoft Defender for Cloud through inventory, alerts, and recommendations based on the assets' detected sensitivity labels.
- For recommendations, we've provided **security controls** to help you understand how important each recommendation is to your overall security posture. Defender for Cloud includes a **secure score** value for each control to help you prioritize your security work. Learn more in [Security controls and their recommendations](../defender-for-cloud/secure-score-security-controls.md#security-controls-and-their-recommendations).
- For alerts, we've assigned **severity labels** to each alert to help you prioritize the order in which you attend to each alert. Learn more in [How are alerts classified?](../defender-for-cloud/alerts-overview.md#how-are-alerts-classified).
-For more information, see [Integrate Azure Purview with Azure security products](how-to-integrate-with-azure-security-products.md).
+For more information, see [Integrate Microsoft Purview with Azure security products](how-to-integrate-with-azure-security-products.md).
## Information Protection

### Secure metadata extraction and storage
-Azure Purview is a data governance solution in cloud. You can register and scan different data sources from various data systems from your on-premises, Azure, or multi-cloud environments into Azure Purview. While data source is registered and scanned in Azure Purview, the actual data and data sources stay in their original locations, only metadata is extracted from data sources and stored in Azure Purview Data Map, which means you do not need to move data out of the region or their original location to extract the metadata into Azure Purview.
+Microsoft Purview is a data governance solution in the cloud. You can register and scan data sources from various data systems in your on-premises, Azure, or multi-cloud environments into Microsoft Purview. When a data source is registered and scanned in Microsoft Purview, the actual data and data sources stay in their original locations; only metadata is extracted from data sources and stored in the Microsoft Purview Data Map. This means you do not need to move data out of its region or original location to extract the metadata into Microsoft Purview.
-When an Azure Purview account is deployed, in addition, a managed resource group is also deployed in your Azure subscription. A managed Azure Storage Account and a Managed Event Hubs are deployed inside this resource group. The managed storage account is used to ingest metadata from data sources during the scan. Since these resources are consumed by the Azure Purview they cannot be accessed by any other users or principals, except the Azure Purview account. This is because an Azure role-based access control (RBAC) deny assignment is added automatically for all principals to this resource group at the time of Azure Purview account deployment, preventing any CRUD operations on these resources if they are not initiated from Azure Purview.
+When a Microsoft Purview account is deployed, a managed resource group is also deployed in your Azure subscription. A managed Azure Storage account and a managed Event Hubs namespace are deployed inside this resource group. The managed storage account is used to ingest metadata from data sources during the scan. Because these resources are consumed by Microsoft Purview, they cannot be accessed by any other users or principals, except the Microsoft Purview account. This is because an Azure role-based access control (RBAC) deny assignment is added automatically for all principals to this resource group at the time of Microsoft Purview account deployment, preventing any CRUD operations on these resources if they are not initiated from Microsoft Purview.
### Where is metadata stored?
-Azure Purview extracts only the metadata from different data source systems into [Azure Purview Data Map](concept-elastic-data-map.md) during the scanning process.
+Microsoft Purview extracts only the metadata from different data source systems into [Microsoft Purview Data Map](concept-elastic-data-map.md) during the scanning process.
-You can deploy an Azure Purview account inside your Azure subscription in any [supported Azure regions](https://azure.microsoft.com/global-infrastructure/services/?products=purview&regions=all).
+You can deploy a Microsoft Purview account inside your Azure subscription in any [supported Azure regions](https://azure.microsoft.com/global-infrastructure/services/?products=purview&regions=all).
-All metadata is stored inside Data Map inside your Azure Purview instance. This means the metadata is stored in the same region as your Azure Purview instance.
+All metadata is stored inside Data Map inside your Microsoft Purview instance. This means the metadata is stored in the same region as your Microsoft Purview instance.
### How is metadata extracted from data sources?
-Azure Purview allows you to use any of the following options to extract metadata from data sources:
+Microsoft Purview allows you to use any of the following options to extract metadata from data sources:
- **Azure runtime**. Metadata is extracted and processed inside the same region as your data sources.
- :::image type="content" source="media/concept-best-practices/security-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Azure Purview, the Azure runtime, and data sources."lightbox="media/concept-best-practices/security-azure-runtime.png":::
+ :::image type="content" source="media/concept-best-practices/security-azure-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, the Azure runtime, and data sources." lightbox="media/concept-best-practices/security-azure-runtime.png":::
- 1. A manual or automatic scan is initiated from the Azure Purview data map through the Azure integration runtime.
+ 1. A manual or automatic scan is initiated from the Microsoft Purview data map through the Azure integration runtime.
2. The Azure integration runtime connects to the data source to extract metadata.
- 3. Metadata is queued in Azure Purview managed storage and stored in Azure Blob Storage.
+ 3. Metadata is queued in Microsoft Purview managed storage and stored in Azure Blob Storage.
- 4. Metadata is sent to the Azure Purview data map.
+ 4. Metadata is sent to the Microsoft Purview data map.
-- **Self-hosted integration runtime**. Metadata is extracted and processed by self-hosted integration runtime inside self-hosted integration runtime VMs' memory before they are sent to Azure Purview Data Map. In this case, customers have to deploy and manage one or more self-hosted integration runtime Windows-based virtual machines inside their Azure subscriptions or on-premises environments. Scanning on-premises and VM-based data sources always requires using a self-hosted integration runtime. The Azure integration runtime is not supported for these data sources. The following steps show the communication flow at a high level when you're using a self-hosted integration runtime to scan a data source.
+- **Self-hosted integration runtime**. Metadata is extracted and processed by the self-hosted integration runtime inside the self-hosted integration runtime VMs' memory before it is sent to the Microsoft Purview Data Map. In this case, customers have to deploy and manage one or more self-hosted integration runtime Windows-based virtual machines inside their Azure subscriptions or on-premises environments. Scanning on-premises and VM-based data sources always requires using a self-hosted integration runtime. The Azure integration runtime is not supported for these data sources. The following steps show the communication flow at a high level when you're using a self-hosted integration runtime to scan a data source.
- :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Azure Purview, a self-hosted runtime, and data sources."lightbox="media/concept-best-practices/security-self-hosted-runtime.png":::
+ :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, a self-hosted runtime, and data sources." lightbox="media/concept-best-practices/security-self-hosted-runtime.png":::
1. A manual or automatic scan is triggered. Microsoft Purview connects to Azure Key Vault to retrieve the credential to access a data source.
- 2. The scan is initiated from the Azure Purview data map through a self-hosted integration runtime.
+ 2. The scan is initiated from the Microsoft Purview data map through a self-hosted integration runtime.
3. The self-hosted integration runtime service from the VM connects to the data source to extract metadata.
- 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Azure Purview managed storage and then stored in Azure Blob Storage.
+ 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage.
- 5. Metadata is sent to the Azure Purview data map.
+ 5. Metadata is sent to the Microsoft Purview data map.
- If you need to extract metadata from data sources with sensitive data that cannot leave the boundary of your on-premises network, it is highly recommended to deploy the self-hosted integration runtime VM inside your corporate network, where data sources are located, to extract and process metadata in on-premises, and send only metadata to Azure Purview.
+ If you need to extract metadata from data sources with sensitive data that cannot leave the boundary of your on-premises network, it is highly recommended to deploy the self-hosted integration runtime VM inside your corporate network, where data sources are located, to extract and process metadata on-premises and send only metadata to Microsoft Purview.
- :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime-on-premises.png" alt-text="Screenshot that shows the connection flow between Azure Purview, an on-premises self-hosted runtime, and data sources in on-premises network."lightbox="media/concept-best-practices/security-self-hosted-runtime-on-premises.png":::
+ :::image type="content" source="media/concept-best-practices/security-self-hosted-runtime-on-premises.png" alt-text="Screenshot that shows the connection flow between Microsoft Purview, an on-premises self-hosted runtime, and data sources in an on-premises network." lightbox="media/concept-best-practices/security-self-hosted-runtime-on-premises.png":::
1. A manual or automatic scan is triggered. Azure Purview connects to Azure Key Vault to retrieve the credential to access a data source.
- 2. The scan is initiated from the Azure Purview data map through a self-hosted integration runtime.
+ 2. The scan is initiated from the Microsoft Purview data map through a self-hosted integration runtime.
3. The self-hosted integration runtime service from the VM connects to the data source to extract metadata.
- 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Azure Purview managed storage and then stored in Azure Blob Storage. Actual data never leaves the boundary of your network.
+ 4. Metadata is processed in VM memory for the self-hosted integration runtime. Metadata is queued in Microsoft Purview managed storage and then stored in Azure Blob Storage. Actual data never leaves the boundary of your network.
- 5. Metadata is sent to the Azure Purview data map.
+ 5. Metadata is sent to the Microsoft Purview data map.
### Information protection and encryption
-Azure offers many mechanisms for keeping data private in rest and as it moves from one location to another. For Azure Purview, data is encrypted at rest using Microsoft-managed keys and when data is in transit, using Transport Layer Security (TLS) v1.2 or greater.
+Azure offers many mechanisms for keeping data private at rest and as it moves from one location to another. For Microsoft Purview, data at rest is encrypted using Microsoft-managed keys, and data in transit is encrypted using Transport Layer Security (TLS) v1.2 or greater.
#### Transport Layer Security (Encryption-in-transit)
-Data in transit (also known as data in motion) is always encrypted in Azure Purview.
+Data in transit (also known as data in motion) is always encrypted in Microsoft Purview.
-To add another layer of security in addition to access controls, Azure Purview secures customer data by encrypting data in motion with Transport Layer Security (TLS) and protect data in transit against 'out of band' attacks (such as traffic capture). It uses encryption to make sure attackers can't easily read or modify the data.
+To add another layer of security in addition to access controls, Microsoft Purview secures customer data by encrypting data in motion with Transport Layer Security (TLS) and protects data in transit against 'out of band' attacks (such as traffic capture). It uses encryption to make sure attackers can't easily read or modify the data.
-Azure Purview supports data encryption in transit with Transport Layer Security (TLS) v1.2 or greater.
+Microsoft Purview supports data encryption in transit with Transport Layer Security (TLS) v1.2 or greater.
For more information, see [Encrypt sensitive information in transit](/security/benchmark/azure/baselines/purview-security-baseline#dp-4-encrypt-sensitive-information-in-transit).
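The TLS 1.2-or-greater floor described above can also be enforced from the client side. A minimal illustrative sketch with Python's standard `ssl` module — this shows the client-side setting only and is not Purview code:

```python
import ssl

# Client-side floor: refuse anything below TLS 1.2, matching the
# service-side requirement described above.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0 / 1.1

print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
# ctx can then be passed to, e.g., http.client.HTTPSConnection(host, context=ctx)
```

The negotiated protocol version is then whatever the server and client agree on, but never below the floor set here.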
Data at rest includes information that resides in persistent storage on physical media, in any digital format. The media can include files on magnetic or optical media, archived data, and data backups inside Azure regions.
-To add another layer of security in addition to access controls, Azure Purview encrypts data at rest to protect against 'out of band' attacks (such as accessing underlying storage).
+To add another layer of security in addition to access controls, Microsoft Purview encrypts data at rest to protect against 'out of band' attacks (such as accessing underlying storage).
It uses encryption with Microsoft-managed keys. This practice helps make sure attackers can't easily read or modify the data. For more information, see [Encrypt sensitive data at rest](/security/benchmark/azure/baselines/purview-security-baseline#dp-5-encrypt-sensitive-data-at-rest).

## Credential management
-To extract metadata from a data source system into Azure Purview Data Map, it is required to register and scan the data source systems in Azure Purview Data Map. To automate this process, we have made available [connectors](azure-purview-connector-overview.md) for different data source systems in Azure Purview to simplify the registration and scanning process.
+To extract metadata from a data source system into the Microsoft Purview Data Map, you must register and scan the data source systems in the Microsoft Purview Data Map. To automate this process, we have made [connectors](azure-purview-connector-overview.md) available for different data source systems in Microsoft Purview to simplify the registration and scanning process.
-To connect to a data source Azure Purview requires a credential with read-only access to the data source system.
+To connect to a data source, Microsoft Purview requires a credential with read-only access to the data source system.
It is recommended to prioritize the following credential options for scanning, when possible:
-1. Azure Purview Managed Identity
+1. Microsoft Purview Managed Identity
2. User Assigned Managed Identity
3. Service Principals
4. Other options such as Account key, SQL Authentication, etc.
-If you use any options rather than managed identities, all credentials must be stored and protected inside an [Azure key vault](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account). Azure Purview requires get/list access to secret on the Azure Key Vault resource.
+If you use any options other than managed identities, all credentials must be stored and protected inside an [Azure key vault](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account). Microsoft Purview requires get/list access to secrets on the Azure Key Vault resource.
As a general rule, you can use the following options to set up integration runtime and credentials to scan data source systems:

|Scenario |Runtime option |Supported Credentials |
|---|---|---|
-|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside public network | Option 1: Azure Runtime | Azure Purview Managed Identity, Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
+|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside public network | Option 1: Azure Runtime | Microsoft Purview Managed Identity, Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside public network | Option 2: Self-hosted integration runtime | Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
|Data source is an Azure Platform as a Service, such as Azure Data Lake Storage Gen 2 or Azure SQL inside private network using Azure Private Link Service | Self-hosted integration runtime | Service Principal or Access Key / SQL Authentication (depending on Azure data source type) |
|Data source is inside an Azure IaaS VM such as SQL Server | Self-hosted integration runtime deployed in Azure | SQL Authentication or Basic Authentication (depending on Azure data source type) |
|Data source is inside an on-premises system such as SQL Server or Oracle | Self-hosted integration runtime deployed in Azure or in the on-premises network | SQL Authentication or Basic Authentication (depending on Azure data source type) |
|Multi-cloud | Azure runtime or self-hosted integration runtime based on data source types | Supported credential options vary based on data source types |
-|Power BI tenant | Azure Runtime | Azure Purview Managed Identity |
+|Power BI tenant | Azure Runtime | Microsoft Purview Managed Identity |
Use [this guide](azure-purview-connector-overview.md) to read more about each source and its supported authentication options.

## Other recommendations
-### Define required number of Azure Purview accounts for your organization
+### Define required number of Microsoft Purview accounts for your organization
-As part of security planning for implementation of Azure Purview in your organization, review your business and security requirements to define [how many Azure Purview accounts are needed](concept-best-practices-accounts.md) in your organization. various factors may impact the decision, such as [multi-tenancy](/azure/cloud-adoption-framework/ready/enterprise-scale/enterprise-enrollment-and-azure-ad-tenants#define-azure-ad-tenants) billing or compliance requirements.
+As part of security planning for implementation of Microsoft Purview in your organization, review your business and security requirements to define [how many Microsoft Purview accounts are needed](concept-best-practices-accounts.md) in your organization. Various factors may affect the decision, such as [multi-tenancy](/azure/cloud-adoption-framework/ready/enterprise-scale/enterprise-enrollment-and-azure-ad-tenants#define-azure-ad-tenants), billing, or compliance requirements.
### Apply security best practices for Self-hosted runtime VMs
-Consider securing the deployment and management of self-hosted integration runtime VMs in Azure or your on-premises environment, if self-hosted integration runtime is used to scan data sources in Azure Purview.
+Consider securing the deployment and management of self-hosted integration runtime VMs in Azure or your on-premises environment if a self-hosted integration runtime is used to scan data sources in Microsoft Purview.
For self-hosted integration runtime VMs deployed as virtual machines in Azure, follow [security best practices recommendations for Windows virtual machines](../virtual-machines/security-recommendations.md).
- Lock down inbound traffic to your VMs using Network Security Groups and [Azure Defender access Just-in-Time](../defender-for-cloud/just-in-time-access-usage.md).
- Install antivirus or antimalware.
- Deploy Azure Defender to get insights around any potential anomaly on the VMs.
-- Limit the amount of software in the self-hosted integration runtime VMs. Although it is not a mandatory requirement to have a dedicated VM for a self-hosted runtime for Azure Purview, we highly suggest using dedicated VMs especially for production environments.
+- Limit the amount of software in the self-hosted integration runtime VMs. Although it is not a mandatory requirement to have a dedicated VM for a self-hosted runtime for Microsoft Purview, we highly suggest using dedicated VMs especially for production environments.
- Monitor the VMs using [Azure Monitor for VMs](../azure-monitor/vm/vminsights-overview.md). By using the Log Analytics agent, you can capture content such as performance metrics to adjust required capacity for your VMs.
- By integrating virtual machines with Microsoft Defender for Cloud, you can prevent, detect, and respond to threats.
- Keep your machines current. You can enable Automatic Windows Update or use [Update Management in Azure Automation](../automation/update-management/overview.md) to manage operating-system-level updates for the OS.
- Optionally, you can plan to enable Azure Backup for your self-hosted integration runtime VMs to speed up recovery of a self-hosted integration runtime VM if there is a VM-level disaster.

## Next steps
-- [Azure Purview accounts architectures and best practices](concept-best-practices-accounts.md)
-- [Azure Purview network architecture and best practices](concept-best-practices-network.md)
-- [Credentials for source authentication in Azure Purview](manage-credentials.md)
+- [Microsoft Purview accounts architectures and best practices](concept-best-practices-accounts.md)
+- [Microsoft Purview network architecture and best practices](concept-best-practices-network.md)
+- [Credentials for source authentication in Microsoft Purview](manage-credentials.md)
purview Concept Best Practices Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-best-practices-sensitivity-labels.md
Title: Best practices for applying sensitivity labels in Azure Purview
-description: This article provides best practices for applying sensitivity labels in Azure Purview.
+ Title: Best practices for applying sensitivity labels in Microsoft Purview
+description: This article provides best practices for applying sensitivity labels in Microsoft Purview.
# Labeling best practices
-Azure Purview supports labeling structured and unstructured data stored across various data sources. Labeling data within Azure Purview allows users to easily find data that matches predefined autolabeling rules that were configured in the Microsoft 365 Security and Compliance Center. Azure Purview extends the use of Microsoft 365 sensitivity labels to assets stored in infrastructure cloud locations and structured data sources.
+Microsoft Purview supports labeling structured and unstructured data stored across various data sources. Labeling data within Microsoft Purview allows users to easily find data that matches predefined autolabeling rules that were configured in the Microsoft 365 Security and Compliance Center. Microsoft Purview extends the use of Microsoft 365 sensitivity labels to assets stored in infrastructure cloud locations and structured data sources.
-## Protect personal data with custom sensitivity labels for Azure Purview
+## Protect personal data with custom sensitivity labels for Microsoft Purview
Storing and processing personal data is subject to special protection. Labeling personal data is crucial to help you identify sensitive information. You can use the detection and labeling tasks for personal data in different stages of your workflows. Because personal data is ubiquitous and fluid in your organization, you need to define identification rules for building policies that suit your individual situation.
-## Why do you need to use labeling within Azure Purview?
+## Why do you need to use labeling within Microsoft Purview?
-With Azure Purview, you can extend your organization's investment in Microsoft 365 sensitivity labels to assets that are stored in files and database columns within Azure, multicloud, and on-premises locations. These locations are defined in [supported data sources](./create-sensitivity-label.md#supported-data-sources).
-When you apply sensitivity labels to your content, you can keep your data secure by stating how sensitive certain data is in your organization. Azure Purview also abstracts the data itself, so you can use labels to track the type of data, without exposing sensitive data on another platform.
+With Microsoft Purview, you can extend your organization's investment in Microsoft 365 sensitivity labels to assets that are stored in files and database columns within Azure, multicloud, and on-premises locations. These locations are defined in [supported data sources](./create-sensitivity-label.md#supported-data-sources).
+When you apply sensitivity labels to your content, you can keep your data secure by stating how sensitive certain data is in your organization. Microsoft Purview also abstracts the data itself, so you can use labels to track the type of data, without exposing sensitive data on another platform.
-## Azure Purview labeling best practices and considerations
+## Microsoft Purview labeling best practices and considerations
The following sections walk you through the process of implementing labeling for your assets.

### Get started

-- To enable sensitivity labeling in Azure Purview, follow the steps in [Automatically apply sensitivity labels to your data in Azure Purview](./how-to-automatically-label-your-content.md).
-- To find information on required licensing and helpful answers to other questions, see [Sensitivity labels in Azure Purview FAQ](./sensitivity-labels-frequently-asked-questions.yml).
+- To enable sensitivity labeling in Microsoft Purview, follow the steps in [Automatically apply sensitivity labels to your data in Microsoft Purview](./how-to-automatically-label-your-content.md).
+- To find information on required licensing and helpful answers to other questions, see [Sensitivity labels in Microsoft Purview FAQ](./sensitivity-labels-frequently-asked-questions.yml).
### Label considerations

-- If you already have Microsoft 365 sensitivity labels in use in your environment, continue to use your existing labels. Don't make duplicate or more labels for Azure Purview. This approach allows you to maximize the investment you've already made in the Microsoft 365 compliance space. It also ensures consistent labeling across your data estate.
+- If you already have Microsoft 365 sensitivity labels in use in your environment, continue to use your existing labels. Don't create duplicate or additional labels for Microsoft Purview. This approach allows you to maximize the investment you've already made in the Microsoft 365 compliance space. It also ensures consistent labeling across your data estate.
- If you haven't created Microsoft 365 sensitivity labels, review the documentation to [get started with sensitivity labels](/microsoft-365/compliance/get-started-with-sensitivity-labels). Creating a classification schema is a tenant-wide operation. Discuss it thoroughly before you enable it within your organization.

### Label recommendations

-- When you configure sensitivity labels for Azure Purview, you might define autolabeling rules for files, database columns, or both within the label properties. Azure Purview labels files within the Azure Purview data map. When the autolabeling rule is configured, Azure Purview automatically applies the label or recommends that the label is applied.
+- When you configure sensitivity labels for Microsoft Purview, you might define autolabeling rules for files, database columns, or both within the label properties. Microsoft Purview labels files within the Microsoft Purview data map. When the autolabeling rule is configured, Microsoft Purview automatically applies the label or recommends that the label is applied.
> [!WARNING]
> If you haven't configured autolabeling for files and emails on your sensitivity labels, users might be affected within your Office and Microsoft 365 environment. You can test autolabeling on database columns without affecting users.

-- If you're defining new autolabeling rules for files when you configure labels for Azure Purview, make sure that you have the condition for applying the label set appropriately.
+- If you're defining new autolabeling rules for files when you configure labels for Microsoft Purview, make sure that you have the condition for applying the label set appropriately.
- You can set the detection criteria to **All of these** or **Any of these** in the upper right of the autolabeling for files and emails page of the label properties.
- The default setting for detection criteria is **All of these**. This setting means that the asset must contain all the specified sensitive information types for the label to be applied. While the default setting might be valid in some instances, many customers want to use **Any of these**. Then if at least one of the specified sensitive information types is found, the label is applied.

:::image type="content" source="media/concept-best-practices/label-detection-criteria.png" alt-text="Screenshot that shows detection criteria for a label.":::

> [!NOTE]
- > Microsoft 365 trainable classifiers aren't used by Azure Purview.
+ > Microsoft 365 trainable classifiers aren't used by Microsoft Purview.
- Maintain consistency in labeling across your data estate. If you use autolabeling rules for files, use the same sensitive information types for autolabeling database columns.
- [Define your sensitivity labels via Microsoft Information Protection to identify your personal data at a central place](/microsoft-365/compliance/information-protection).
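The **All of these** / **Any of these** detection criteria described above amount to an all-versus-any check over the sensitive information types detected in an asset. A toy sketch of that decision logic — the function and the type names are illustrative only, not a Purview or Microsoft 365 API:

```python
def label_applies(required_types, detected_types, mode="all"):
    """Decide whether an autolabeling rule matches an asset.

    mode="all" -> every required sensitive information type must be detected
    mode="any" -> a single detected required type is enough
    """
    required = set(required_types)
    detected = set(detected_types)
    if mode == "all":
        return required <= detected          # all required types present
    return bool(required & detected)         # at least one required type present

# A rule requiring both a credit card number and a person's name,
# against an asset where only a credit card number was detected:
detected = {"Credit Card Number"}
print(label_applies({"Credit Card Number", "Person's Name"}, detected, "all"))  # False
print(label_applies({"Credit Card Number", "Person's Name"}, detected, "any"))  # True
```

This is why switching a rule to **Any of these** typically causes the label to be applied to more assets.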
- [Force labeling by using autolabel functionality](./how-to-automatically-label-your-content.md).
- Build groups of sensitivity labels and store them as a dedicated sensitivity label policy. For example, store all required sensitivity labels for regulatory rules by using the same sensitivity label policy to publish.
- Capture all test cases for your labels. Test your label policies with all applications you want to secure.
-- Promote sensitivity label policies to Azure Purview.
-- Run test scans from Azure Purview on different data sources like hybrid cloud and on-premises to identify sensitivity labels.
-- Gather and consider insights, for example, by using Azure Purview Insights. Use alerting mechanisms to mitigate potential breaches of regulations.
+- Promote sensitivity label policies to Microsoft Purview.
+- Run test scans from Microsoft Purview on different data sources like hybrid cloud and on-premises to identify sensitivity labels.
+- Gather and consider insights, for example, by using Microsoft Purview Insights. Use alerting mechanisms to mitigate potential breaches of regulations.
-By using sensitivity labels with Azure Purview, you can extend Microsoft Information Protection beyond the border of your Microsoft data estate to your on-premises, hybrid cloud, multicloud, and software as a service (SaaS) scenarios.
+By using sensitivity labels with Microsoft Purview, you can extend Microsoft Information Protection beyond the border of your Microsoft data estate to your on-premises, hybrid cloud, multicloud, and software as a service (SaaS) scenarios.
## Next steps

- [Get started with sensitivity labels](/microsoft-365/compliance/get-started-with-sensitivity-labels).
-- [Automatically apply sensitivity labels to your data in Azure Purview](how-to-automatically-label-your-content.md).
+- [Automatically apply sensitivity labels to your data in Microsoft Purview](how-to-automatically-label-your-content.md).
purview Concept Business Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-business-glossary.md
Title: Understand business glossary features in Azure Purview
-description: This article explains what business glossary is in Azure Purview.
+ Title: Understand business glossary features in Microsoft Purview
+description: This article explains what business glossary is in Microsoft Purview.
Last updated 09/27/2021
-# Understand business glossary features in Azure Purview
+# Understand business glossary features in Microsoft Purview
-This article provides an overview of the business glossary feature in Azure Purview.
+This article provides an overview of the business glossary feature in Microsoft Purview.
## Business glossary
The same term can also imply multiple business objects. It is important that eac
## Custom attributes
-Azure Purview supports eight out-of-the-box attributes for any business glossary term:
+Microsoft Purview supports eight out-of-the-box attributes for any business glossary term:
- Name
- Definition
- Data stewards
- Related terms
- Resources
-These attributes cannot be edited or deleted. However, these attributes are not sufficient to completely define a term in an organization. To solve this problem, Azure Purview provides a feature where you can define custom attributes for your glossary.
+These attributes cannot be edited or deleted. However, these attributes are not sufficient to completely define a term in an organization. To solve this problem, Microsoft Purview provides a feature where you can define custom attributes for your glossary.
## Term templates
Classifications are annotations that can be assigned to entities. The flexibilit
- understanding the nature of data stored in the data assets
- defining access control policies
-Azure Purview has more than 200 system classifiers today and you can define your own classifiers in catalog. As part of the scanning process, we automatically detect these classifications and apply them to data assets and schemas. However, you can override them at any point of time. The human overrides are never replaced by automated scans.
+Microsoft Purview has more than 200 system classifiers today, and you can define your own classifiers in the catalog. As part of the scanning process, we automatically detect these classifications and apply them to data assets and schemas. However, you can override them at any point in time. Human overrides are never replaced by automated scans.
### Sensitivity labels
-Sensitivity labels are a type of annotation that allows you to classify and protect your organization's data, without hindering productivity and collaboration. Sensitivity labels are used to identify the categories of classification types within your organizational data, and group the policies that you wish to apply to each category. Azure Purview makes use of the same sensitive information types as Microsoft 365, which allows you to stretch your existing security policies and protection across your entire content and data estate. The same labels can be shared across Microsoft Office products and data assets in Azure Purview.
+Sensitivity labels are a type of annotation that allows you to classify and protect your organization's data, without hindering productivity and collaboration. Sensitivity labels are used to identify the categories of classification types within your organizational data, and group the policies that you wish to apply to each category. Microsoft Purview makes use of the same sensitive information types as Microsoft 365, which allows you to stretch your existing security policies and protection across your entire content and data estate. The same labels can be shared across Microsoft Office products and data assets in Microsoft Purview.
## Next steps

- [Manage Term Templates](how-to-manage-term-templates.md)
-- [Browse the data catalog in Azure Purview](how-to-browse-catalog.md)
+- [Browse the data catalog in Microsoft Purview](how-to-browse-catalog.md)
purview Concept Classification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-classification.md
Title: Understand data classification feature in Azure Purview
-description: This article explains the concept of data classification in Azure Purview.
+ Title: Understand data classification feature in Microsoft Purview
+description: This article explains the concept of data classification in Microsoft Purview.
Last updated 01/04/2022
-# Data Classification in Azure Purview
+# Data Classification in Microsoft Purview
-Data classification, in the context of Azure Purview, is a way of categorizing data assets by assigning unique logical tags or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on.
+Data classification, in the context of Microsoft Purview, is a way of categorizing data assets by assigning unique logical tags or classes to the data assets. Classification is based on the business context of the data. For example, you might classify assets by *Passport Number*, *Driver's License Number*, *Credit Card Number*, *SWIFT Code*, *Person's Name*, and so on.
When you classify data assets, you make them easier to understand, search, and govern. Classifying data assets also helps you understand the risks associated with them. This in turn can help you implement measures to protect sensitive or important data from ungoverned proliferation and unauthorized access across the data estate.
-Azure Purview provides an automated classification capability while you scan your data sources. You get more than 200+ built-in system classifications and the ability to create custom classifications for your data. You can classify assets automatically when they're configured as part of a scan, or you can edit them manually in Azure Purview Studio after they're scanned and ingested.
+Microsoft Purview provides an automated classification capability while you scan your data sources. You get more than 200 built-in system classifications and the ability to create custom classifications for your data. You can classify assets automatically when they're configured as part of a scan, or you can edit them manually in Microsoft Purview Studio after they're scanned and ingested.
## Use of classification
As shown in the following image, it's possible to apply classifications at both
## Types of classification
-Azure Purview supports both system and custom classifications.
+Microsoft Purview supports both system and custom classifications.
-* **System classifications**: Azure Purview supports 200+ system classifications out of the box. For the entire list of available system classifications, see [Supported classifications in Azure Purview](./supported-classifications.md).
+* **System classifications**: Microsoft Purview supports 200+ system classifications out of the box. For the entire list of available system classifications, see [Supported classifications in Microsoft Purview](./supported-classifications.md).
In the example in the preceding image, *Person's Name* is a system classification.
Custom classification rules can be based on a *regular expression* pattern or *d
Let's say that the *Employee ID* column follows the EMPLOYEE{GUID} pattern (for example, EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55). You can create your own custom classification by using a regular expression, such as `\^Employee\[A-Za-z0-9\]{8}-\[A-Za-z0-9\]{4}-\[A-Za-z0-9\]{4}-\[A-Za-z0-9\]{4}-\[A-Za-z0-9\]{12}\$`.

> [!NOTE]
-> Sensitivity labels are different from classifications. Sensitivity labels categorize assets in the context of data security and privacy, such as *Highly Confidential*, *Restricted*, *Public*, and so on. To use sensitivity labels in Azure Purview, you'll need at least one Microsoft 365 license or account within the same Azure Active Directory (Azure AD) tenant as your Azure Purview account. For more information about the differences between sensitivity labels and classifications, see [Sensitivity labels in Azure Purview FAQ](sensitivity-labels-frequently-asked-questions.yml#what-is-the-difference-between-classifications-and-sensitivity-labels-in-azure-purview).
+> Sensitivity labels are different from classifications. Sensitivity labels categorize assets in the context of data security and privacy, such as *Highly Confidential*, *Restricted*, *Public*, and so on. To use sensitivity labels in Microsoft Purview, you'll need at least one Microsoft 365 license or account within the same Azure Active Directory (Azure AD) tenant as your Microsoft Purview account. For more information about the differences between sensitivity labels and classifications, see [Sensitivity labels in Microsoft Purview FAQ](sensitivity-labels-frequently-asked-questions.yml#what-is-the-difference-between-classifications-and-sensitivity-labels-in-microsoft-purview).
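Stripped of its escaping, the custom classification rule above is an ordinary regular expression, so it can be sanity-checked locally before you create the rule. A sketch with plain Python `re` (not a Purview API); the `re.IGNORECASE` flag is an assumption added here to cover both the `Employee` spelling in the pattern and the `EMPLOYEE` spelling in the example value:

```python
import re

# Unescaped form of the pattern from the text: "EMPLOYEE" followed by a GUID.
pattern = re.compile(
    r"^EMPLOYEE[A-Za-z0-9]{8}-[A-Za-z0-9]{4}-[A-Za-z0-9]{4}"
    r"-[A-Za-z0-9]{4}-[A-Za-z0-9]{12}$",
    re.IGNORECASE,
)

samples = [
    "EMPLOYEE9c55c474-9996-420c-a285-0d0fc23f1f55",  # well-formed, matches
    "EMPLOYEE9c55c474",                              # incomplete GUID, no match
]
for value in samples:
    print(value, bool(pattern.match(value)))
```

Testing a handful of positive and negative sample values like this helps avoid a rule that over- or under-matches once it runs against a real column.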
## Next steps
* [Read about classification best practices](concept-best-practices-classification.md)
* [Create custom classifications](create-a-custom-classification-and-classification-rule.md)
* [Apply classifications](apply-classifications.md)
-* [Use the Azure Purview Studio](use-azure-purview-studio.md)
+* [Use the Microsoft Purview Studio](use-azure-purview-studio.md)
purview Concept Data Lineage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-data-lineage.md
Title: Data lineage in Azure Purview
+ Title: Data lineage in Microsoft Purview
description: Describes the concepts for data lineage.
Last updated 09/27/2021
-# Data lineage in Azure Purview Data Catalog client
+# Data lineage in Microsoft Purview Data Catalog client
-This article provides an overview of data lineage in Azure Purview Data Catalog. It also details how data systems can integrate with the catalog to capture lineage of data. Azure Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
+This article provides an overview of data lineage in Microsoft Purview Data Catalog. It also details how data systems can integrate with the catalog to capture lineage of data. Microsoft Purview can capture lineage for data in different parts of your organization's data estate, and at different levels of preparation including:
- Completely raw data staged from various platforms - Transformed and prepared data
This article provides an overview of data lineage in Azure Purview Data Catalog.
Data lineage is broadly understood as the lifecycle that spans the data's origin, and where it moves over time across the data estate. It is used for different kinds of backwards-looking scenarios such as troubleshooting, tracing root cause in data pipelines, and debugging. Lineage is also used for data quality analysis, compliance, and "what if" scenarios often referred to as impact analysis. Lineage is represented visually to show data moving from source to destination including how the data was transformed. Given the complexity of most enterprise data environments, these views can be hard to understand without doing some consolidation or masking of peripheral data points.
-## Lineage experience in Azure Purview Data Catalog
+## Lineage experience in Microsoft Purview Data Catalog
-Azure Purview Data Catalog will connect with other data processing, storage, and analytics systems to extract lineage information. The information is combined to represent a generic, scenario-specific lineage experience in the Catalog.
+Microsoft Purview Data Catalog will connect with other data processing, storage, and analytics systems to extract lineage information. The information is combined to represent a generic, scenario-specific lineage experience in the Catalog.
:::image type="content" source="media/concept-lineage/lineage-end-end.png" alt-text="end-end lineage showing data copied from blob store all the way to Power BI dashboard":::
The following example is a typical use case of data moving across multiple syste
## Lineage granularity
-The following section covers the details about the granularity of which the lineage information is gathered by Azure Purview. This granularity can vary based on the data systems supported in Azure Purview.
+The following section covers the details about the granularity of which the lineage information is gathered by Microsoft Purview. This granularity can vary based on the data systems supported in Microsoft Purview.
### Entity level lineage: Source(s) > Process > Target(s)
To support root cause analysis and data quality scenarios, we capture the execut
## Summary
-Lineage is a critical feature of the Azure Purview Data Catalog to support quality, trust, and audit scenarios. The goal of a data catalog is to build a robust framework where all the data systems within your environment can naturally connect and report lineage. Once the metadata is available, the data catalog can bring together the metadata provided by data systems to power data governance use cases.
+Lineage is a critical feature of the Microsoft Purview Data Catalog to support quality, trust, and audit scenarios. The goal of a data catalog is to build a robust framework where all the data systems within your environment can naturally connect and report lineage. Once the metadata is available, the data catalog can bring together the metadata provided by data systems to power data governance use cases.
## Next steps
-* [Quickstart: Create an Azure Purview account in the Azure portal](create-catalog-portal.md)
-* [Quickstart: Create an Azure Purview account using Azure PowerShell/Azure CLI](create-catalog-powershell.md)
-* [Use the Azure Purview Studio](use-azure-purview-studio.md)
+* [Quickstart: Create a Microsoft Purview account in the Azure portal](create-catalog-portal.md)
+* [Quickstart: Create a Microsoft Purview account using Azure PowerShell/Azure CLI](create-catalog-powershell.md)
+* [Use the Microsoft Purview Studio](use-azure-purview-studio.md)
purview Concept Data Owner Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-data-owner-policies.md
Title: Azure Purview data owner policies concepts
-description: Understand Azure Purview data owner policies
+ Title: Microsoft Purview data owner policies concepts
+description: Understand Microsoft Purview data owner policies
Last updated 03/20/2022
-# Concepts for Azure Purview data owner policies
+# Concepts for Microsoft Purview data owner policies
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-This article discusses concepts related to managing access to data sources in your data estate from within Azure Purview Studio.
+This article discusses concepts related to managing access to data sources in your data estate from within Microsoft Purview Studio.
> [!Note]
-> This capability is different from access control for Azure Purview itself, which is described in [Access control in Azure Purview](catalog-permissions.md).
+> This capability is different from access control for Microsoft Purview itself, which is described in [Access control in Microsoft Purview](catalog-permissions.md).
## Overview
-Access policies in Azure Purview enable you to manage access to different data systems across your entire data estate. For example:
+Access policies in Microsoft Purview enable you to manage access to different data systems across your entire data estate. For example:
-A user needs read access to an Azure Storage account that has been registered in Azure Purview. You can grant this access directly in Azure Purview by creating a data access policy through the **Policy management** app in Azure Purview Studio.
+A user needs read access to an Azure Storage account that has been registered in Microsoft Purview. You can grant this access directly in Microsoft Purview by creating a data access policy through the **Policy management** app in Microsoft Purview Studio.
Data access policies can be enforced through Purview on data systems that have been registered for policy.
-## Azure Purview policy concepts
+## Microsoft Purview policy concepts
-### Azure Purview policy
+### Microsoft Purview policy
A **policy** is a named collection of policy statements. When a policy is published to one or more data systems under Purview's governance, it's then enforced by them. A policy definition includes a policy name, description, and a list of one or more policy statements.
The data resource specified in a policy statement is hierarchical by default. Th
### Policy combining algorithm
-Azure Purview can have different policy statements that refer to the same data asset. When evaluating a decision for data access, Azure Purview combines all the applicable policies and provides a consolidated decision. The combining strategy picks the most restrictive policy.
+Microsoft Purview can have different policy statements that refer to the same data asset. When evaluating a decision for data access, Microsoft Purview combines all the applicable policies and provides a consolidated decision. The combining strategy picks the most restrictive policy.
For example, let’s assume two different policies on an Azure Storage container *FinData* as follows, Policy 1 - *Allow Read on Data Asset /subscription/…./containers/FinData
Policy 2 - *Deny Read on Data Asset /subscription/…./containers/FinData
To group Finance-contractors* Then let's assume that user 'user1', who is part of two groups:
-*Finance-analyst* and *Finance-contractors*, executes a call to blob read API. Since both policies will be applicable, Azure Purview will choose the most restrictive one, which is *Deny* of *Read*. Thus, the access request will be denied.
+*Finance-analyst* and *Finance-contractors*, executes a call to blob read API. Since both policies will be applicable, Microsoft Purview will choose the most restrictive one, which is *Deny* of *Read*. Thus, the access request will be denied.
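The "most restrictive wins" combining strategy described above can be sketched as follows. This is a simplified illustration, not Purview's actual implementation; the function name and the policy data shape are hypothetical:

```python
def combine_decisions(applicable_policies):
    """Consolidate applicable policy statements into one access decision.

    Mirrors the combining algorithm described above: any applicable Deny
    overrides any number of Allows, and with no applicable policy at all
    the request is denied by default.
    """
    effects = [p["effect"] for p in applicable_policies]
    if not effects or "Deny" in effects:
        return "Deny"
    return "Allow"

# user1 belongs to both Finance-analyst and Finance-contractors, so both
# policies on the FinData container apply to the blob read call:
policies = [
    {"effect": "Allow", "action": "Read", "group": "Finance-analyst"},
    {"effect": "Deny",  "action": "Read", "group": "Finance-contractors"},
]
print(combine_decisions(policies))  # Deny
```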
> [!Note]
> Currently, the only supported effect is **Allow**.
## Policy publishing
-A newly created policy exists in the draft mode state, only visible in Azure Purview. The act of publishing initiates enforcement of a policy in the specified data systems. It's an asynchronous action that can take between 5 minutes and 2 hours to be effective, depending on the enforcement code in the underlying data sources. For more information, consult the tutorials related to each data source
+A newly created policy exists in the draft mode state, only visible in Microsoft Purview. The act of publishing initiates enforcement of a policy in the specified data systems. It's an asynchronous action that can take between 5 minutes and 2 hours to be effective, depending on the enforcement code in the underlying data sources. For more information, consult the tutorials related to each data source
A policy published to a data source could contain references to an asset belonging to a different data source. Such references will be ignored since the asset in question does not exist in the data source where the policy is applied.
## Next steps
-Check the tutorials on how to create policies in Azure Purview that work on specific data systems such as Azure Storage:
+Check the tutorials on how to create policies in Microsoft Purview that work on specific data systems such as Azure Storage:
* [Access provisioning by data owner to Azure Storage datasets](how-to-data-owner-policies-storage.md)
-* [Enable Azure Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
purview Concept Default Purview Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-default-purview-account.md
Last updated 12/01/2021
-# Default Azure Purview Account
+# Default Microsoft Purview Account
-In general, our guidance is to have a single Azure Purview account for entire customer's data estate. However, there are cases in which customers would like to have multiple Azure Purview accounts in their organization. The top reasons for different Azure Purview accounts are listed below:
+In general, our guidance is to have a single Microsoft Purview account for a customer's entire data estate. However, there are cases in which customers would like to have multiple Microsoft Purview accounts in their organization. The top reasons for separate Microsoft Purview accounts are listed below:
* Testing new configurations - Customers want to create multiple catalogs for testing out configurations such as scan or classification rules before moving the configuration to a higher environment like pre-production or production.
* Storing test/pre-production/production data separately - Customers want to create different catalogs for different kinds of data stored in different environments.
-* Conglomerates - Conglomerates often have many business units (BUs) that operate separately to the extent that they won't even share billing with each other. Hence, this might require the conglomerates to create different Azure Purview accounts for different BUs.
+* Conglomerates - Conglomerates often have many business units (BUs) that operate separately to the extent that they won't even share billing with each other. Hence, this might require the conglomerates to create different Microsoft Purview accounts for different BUs.
-* Compliance - There are some strict compliance regulations, which treat even metadata as sensitive and require it to be in a particular geography. For the same reason customers might end up with multiple Azure Purview accounts per region.
+* Compliance - There are some strict compliance regulations, which treat even metadata as sensitive and require it to be in a particular geography. For the same reason customers might end up with multiple Microsoft Purview accounts per region.
-Having multiple Azure Purview accounts in a tenant now poses the challenge of which Azure Purview account should all other services like PBI, Synapse connect to. A PBI admin or Synapse Admin who is given the responsibility of pairing their PBI tenant or Synapse account with right Azure Purview account. This is where default Azure Purview account will help our customers. Azure global administrator (or tenant admin) can designate an Azure Purview account as default Azure Purview account at tenant level. At any point in time a tenant can have only 0 or 1 default accounts. Once this is set PBI Admin or Synapse Admin or any user in your organization has clear understanding that this account is the "right" one, discover the same and all other services should connect to this one.
+Having multiple Microsoft Purview accounts in a tenant poses the challenge of deciding which Microsoft Purview account all other services, like PBI and Synapse, should connect to. A PBI admin or Synapse admin is given the responsibility of pairing their PBI tenant or Synapse account with the right Microsoft Purview account. This is where a default Microsoft Purview account helps our customers. An Azure global administrator (or tenant admin) can designate a Microsoft Purview account as the default Microsoft Purview account at the tenant level. At any point in time, a tenant can have only zero or one default account. Once this is set, a PBI admin, Synapse admin, or any user in your organization can clearly identify this account as the "right" one, discover it, and connect all other services to it.
## Manage default account for tenant
Having multiple Azure Purview accounts in a tenant now poses the challenge of wh
* Setting up the wrong default account can have security implications, so only the Azure global administrator at the tenant level (tenant admin) can set the default account flag to 'Yes'.
-* Changing the default account is a two-step process. First you need to change the flag as 'No' to the current default Azure Purview account and then set the flag as 'Yes' to the new Azure Purview account.
+* Changing the default account is a two-step process. First you need to change the flag as 'No' to the current default Microsoft Purview account and then set the flag as 'Yes' to the new Microsoft Purview account.
-* Setting up default account is a control plane operation and hence Azure Purview studio will not have any changes if an account is defined as default. However, in the studio you can see the account name is appended with "(default)" for the default Azure Purview account.
+* Setting up the default account is a control plane operation, and hence Microsoft Purview Studio will not have any changes if an account is defined as default. However, in the studio you can see that the account name is appended with "(default)" for the default Microsoft Purview account.
## Next steps
-- [Create an Azure Purview account](create-catalog-portal.md)
-- [Azure Purview Pricing](https://azure.microsoft.com/pricing/details/azure-purview/)
+- [Create a Microsoft Purview account](create-catalog-portal.md)
+- [Microsoft Purview Pricing](https://azure.microsoft.com/pricing/details/azure-purview/)
purview Concept Elastic Data Map https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-elastic-data-map.md
Title: Elastic data map
-description: This article explains the concepts of the Elastic Data Map in Azure Purview
+description: This article explains the concepts of the Elastic Data Map in Microsoft Purview
Last updated 03/21/2022
-# Elastic data map in Azure Purview
+# Elastic data map in Microsoft Purview
-The Azure Purview data map provides the foundation for data discovery and data governance. It captures metadata about data present in analytics, software-as-a-service (SaaS), and operation systems in hybrid, on-premises, and multi-cloud environments. The Azure Purview data map stays up to date with its built-in scanning and classification system.
+The Microsoft Purview data map provides the foundation for data discovery and data governance. It captures metadata about data present in analytics, software-as-a-service (SaaS), and operation systems in hybrid, on-premises, and multi-cloud environments. The Microsoft Purview data map stays up to date with its built-in scanning and classification system.
-All Azure Purview accounts have a data map that elastically grow starting at one capacity unit. They scale up and down based on request load and metadata stored within the data map.
+All Microsoft Purview accounts have a data map that elastically grows, starting at one capacity unit. It scales up and down based on request load and the metadata stored within the data map.
## Data map capacity unit
-The elastic data map has two components, metadata storage and operation throughput, represented as a capacity unit (CU). All Azure Purview accounts, by default, start with one capacity unit and elastically grow based on usage. Each data Map capacity unit includes a throughput of 25 operations/sec and 10 GB of metadata storage limit.
+The elastic data map has two components, metadata storage and operation throughput, represented as a capacity unit (CU). All Microsoft Purview accounts, by default, start with one capacity unit and elastically grow based on usage. Each Data Map capacity unit includes a throughput of 25 operations/sec and a 10-GB metadata storage limit.
### Operations
-Operations are the throughput measure of the Azure Purview Data Map. They include any Create, Read, Write, Update, and Delete operations on metadata stored in the Data Map. Some examples of operations are listed below:
+Operations are the throughput measure of the Microsoft Purview Data Map. They include any Create, Read, Write, Update, and Delete operations on metadata stored in the Data Map. Some examples of operations are listed below:
- Create an asset in Data Map
- Add a relationship to an asset such as owner, steward, parent, lineage, etc.
Operations are the throughput measure of the Azure Purview Data Map. They includ
Storage is the second component of Data Map and includes technical, business, operational, and semantic metadata.
-The technical metadata includes schema, data type, columns, and so on, that are discovered from Azure Purview [scanning](concept-scans-and-ingestion.md). The business metadata includes automated (for example, promoted from Power BI datasets, or descriptions from SQL tables) and manual tagging of descriptions, glossary terms, and so on. Examples of semantic metadata include the collection mapping to data sources, or classifications. The operational metadata includes Data factory copy and data flow activity run status, and runs time.
+The technical metadata includes schema, data type, columns, and so on, that are discovered from Microsoft Purview [scanning](concept-scans-and-ingestion.md). The business metadata includes automated (for example, promoted from Power BI datasets, or descriptions from SQL tables) and manual tagging of descriptions, glossary terms, and so on. Examples of semantic metadata include the collection mapping to data sources, or classifications. The operational metadata includes Data Factory copy and data flow activity run status and run times.
## Work with elastic data map
The technical metadata includes schema, data type, columns, and so on, that are
## Scenario
-Claudia is an Azure admin at Contoso who wants to provision a new Azure Purview account from Azure portal. While provisioning, she doesn't know the required size of Azure Purview Data Map to support the future state of the platform. However, she knows that the Azure Purview Data Map is billed by Capacity Units, which are affected by storage and operations throughput. She wants to provision the smallest Data Map to keep the cost low and grow the Data Map size elastically based on consumption.
+Claudia is an Azure admin at Contoso who wants to provision a new Microsoft Purview account from Azure portal. While provisioning, she doesn't know the required size of Microsoft Purview Data Map to support the future state of the platform. However, she knows that the Microsoft Purview Data Map is billed by Capacity Units, which are affected by storage and operations throughput. She wants to provision the smallest Data Map to keep the cost low and grow the Data Map size elastically based on consumption.
-Claudia can create an Azure Purview account with the default Data Map size of 1 capacity unit that can automatically scale up and down. The autoscaling feature also allows for capacity to be tuned based on intermittent or planned data bursts during specific periods. Claudia follows the next steps in provisioning experience to set up network configuration and completes the provisioning.
+Claudia can create a Microsoft Purview account with the default Data Map size of 1 capacity unit that can automatically scale up and down. The autoscaling feature also allows for capacity to be tuned based on intermittent or planned data bursts during specific periods. Claudia follows the next steps in provisioning experience to set up network configuration and completes the provisioning.
-In the Azure monitor metrics page, Claudia can see the consumption of the Data Map storage and operations throughput. She can further set up an alert when the storage or operations throughput reaches a certain limit to monitor the consumption and billing of the new Azure Purview account.
+In the Azure monitor metrics page, Claudia can see the consumption of the Data Map storage and operations throughput. She can further set up an alert when the storage or operations throughput reaches a certain limit to monitor the consumption and billing of the new Microsoft Purview account.
## Data map billing
-Customers are billed for one capacity unit (25 ops/sec and 10 GB) and extra billing is based on the consumption of each extra capacity unit rolled up to the hour. The Data Map operations scale in the increments of 25 operations/sec and metadata storage scales in the increments of 10 GB size. Azure Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). However, to get the next level of elasticity window, a support ticket needs to be created.
+Customers are billed for one capacity unit (25 ops/sec and 10 GB) and extra billing is based on the consumption of each extra capacity unit rolled up to the hour. The Data Map operations scale in the increments of 25 operations/sec and metadata storage scales in the increments of 10 GB size. Microsoft Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). However, to get the next level of elasticity window, a support ticket needs to be created.
Data Map capacity units come with a cap on operations throughput and storage. If storage exceeds the current capacity unit, customers are charged for the next capacity unit even if the operations throughput isn't used. The table below shows the Data Map capacity unit ranges. Contact support if the Data Map capacity unit goes beyond 100 capacity units.
Data Map capacity units come with a cap on operations throughput and storage. If
### Billing examples
-- Azure Purview Data Map's operation throughput for the given hour is less than or equal to 25 Ops/Sec and storage size is 1 GB. Customers are billed for one capacity unit.
+- Microsoft Purview Data Map's operation throughput for the given hour is less than or equal to 25 Ops/Sec and storage size is 1 GB. Customers are billed for one capacity unit.
-- Azure Purview Data Map's operation throughput for the given hour is less than or equal to 25 Ops/Sec and storage size is 15 GB. Customers are billed for two capacity units.
+- Microsoft Purview Data Map's operation throughput for the given hour is less than or equal to 25 Ops/Sec and storage size is 15 GB. Customers are billed for two capacity units.
-- Azure Purview Data Map's operation throughput for the given hour is 50 Ops/Sec and storage size is 15 GB. Customers are billed for two capacity units.
+- Microsoft Purview Data Map's operation throughput for the given hour is 50 Ops/Sec and storage size is 15 GB. Customers are billed for two capacity units.
-- Azure Purview Data Map's operation throughput for the given hour is 50 Ops/Sec and storage size is 25 GB. Customers are billed for three capacity units.
+- Microsoft Purview Data Map's operation throughput for the given hour is 50 Ops/Sec and storage size is 25 GB. Customers are billed for three capacity units.
-- Azure Purview Data Map's operation throughput for the given hour is 250 Ops/Sec and storage size is 15 GB. Customers are billed for ten capacity units.
+- Microsoft Purview Data Map's operation throughput for the given hour is 250 Ops/Sec and storage size is 15 GB. Customers are billed for ten capacity units.
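The billing examples above all follow one rule: the hourly bill is whichever dimension (operations at 25 ops/sec per CU, or storage at 10 GB per CU) requires more capacity units, with a minimum of one. A small sketch of that arithmetic; the helper name is hypothetical, and the minimum of one CU is inferred from the examples in this article:

```python
import math

def data_map_capacity_units(ops_per_sec: float, storage_gb: float) -> int:
    """Estimate billed Data Map capacity units for one hour.

    Each CU covers 25 operations/sec and 10 GB of metadata storage; the
    billed amount is the larger of the two requirements, at least 1 CU.
    """
    ops_cus = math.ceil(ops_per_sec / 25)
    storage_cus = math.ceil(storage_gb / 10)
    return max(1, ops_cus, storage_cus)

# The article's billing examples:
print(data_map_capacity_units(25, 1))    # 1
print(data_map_capacity_units(25, 15))   # 2
print(data_map_capacity_units(50, 15))   # 2
print(data_map_capacity_units(50, 25))   # 3
print(data_map_capacity_units(250, 15))  # 10
```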
### Detailed billing example
Based on the Data Map operations/second and metadata storage consumption in this
:::image type="content" source="./media/concept-elastic-data-map/billing-capacity-hours.png" alt-text="Table depicting number of CU hours over time.":::
>[!Important]
->Azure Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). To get the next level of the elasticity window, a support ticket needs to be created.
+>Microsoft Purview Data Map can automatically scale up and down within the elasticity window ([check current limits](how-to-manage-quotas.md)). To get the next level of the elasticity window, a support ticket needs to be created.
## Increase operations throughput limit
-The default limit for maximum operations per second is 10 capacity units. If you are working with a very large Azure Purview environment and require a higher throughput, you can request a larger capacity of elasticity window by [creating a quota request](how-to-manage-quotas.md#request-quota-increase). Select "Data map capacity unit" as the quota type and provide as much relevant information as you can about your environment and the additional capacity you would like to request.
+The default limit for maximum operations per second is 10 capacity units. If you are working with a very large Microsoft Purview environment and require a higher throughput, you can request a larger capacity of elasticity window by [creating a quota request](how-to-manage-quotas.md#request-quota-increase). Select "Data map capacity unit" as the quota type and provide as much relevant information as you can about your environment and the additional capacity you would like to request.
> [!IMPORTANT] > There's no default limit for metadata storage. As you add more metadata to your data map, it will elastically increase.
Increasing the operations throughput limit will also increase the minimum number
The metrics _data map capacity units_ and the _data map storage size_ can be monitored in order to understand the data estate size and the billing.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
2. Click on **Overview** and scroll down to observe the **Monitoring** section for _Data Map Capacity Units_ and _Data Map Storage Size_ metrics over different time periods
The metrics _data map capacity units_ and the _data map storage size_ can be mon
## Summary
-With elastic Data Map, Azure Purview provides low-cost barrier for customers to start their data governance journey.
-Azure Purview DataMap can grow elastically with pay as you go model starting from as small as 1 Capacity unit.
+With the elastic Data Map, Microsoft Purview provides a low-cost entry point for customers to start their data governance journey.
+The Microsoft Purview Data Map can grow elastically with a pay-as-you-go model, starting from as small as 1 capacity unit.
Customers don't need to worry about choosing the correct Data Map size for their data estate at provision time, or about dealing with platform migrations in the future due to size limits.
## Next Steps
-- [Create an Azure Purview account](create-catalog-portal.md)
-- [Azure Purview Pricing](https://azure.microsoft.com/pricing/details/azure-purview/)
+- [Create a Microsoft Purview account](create-catalog-portal.md)
+- [Microsoft Purview Pricing](https://azure.microsoft.com/pricing/details/azure-purview/)
purview Concept Guidelines Pricing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-guidelines-pricing.md
Title: Azure Purview pricing guidelines
-description: This article provides a guideline towards understanding the various components in Azure Purview pricing.
+ Title: Microsoft Purview pricing guidelines
+description: This article provides a guideline towards understanding the various components in Microsoft Purview pricing.
Last updated 04/06/2022
-# Azure Purview pricing
+# Microsoft Purview pricing
-Azure Purview enables a unified governance experience by providing a single pane of glass for managing data governance by enabling automated scanning and classifying data at scale.
+Microsoft Purview enables a unified governance experience by providing a single pane of glass for managing data governance by enabling automated scanning and classifying data at scale.
-## Why do you need to understand the components of the Azure Purview pricing?
+## Why do you need to understand the components of the Microsoft Purview pricing?
-- While the pricing for Azure Purview is on a subscription-based **Pay-As-You-Go** model, there are various dimensions that you can consider while budgeting for Azure Purview
-- This guideline is intended to help you plan the budgeting for Azure Purview by providing a view on the various control factors that impact the budget
+- While the pricing for Microsoft Purview is on a subscription-based **Pay-As-You-Go** model, there are various dimensions that you can consider while budgeting for Microsoft Purview
+- This guideline is intended to help you plan the budgeting for Microsoft Purview by providing a view on the various control factors that impact the budget
## Factors impacting Azure Pricing
-There are **direct** and **indirect** costs that need to be considered while planning the Azure Purview budgeting and cost management.
+There are **direct** and **indirect** costs that need to be considered while planning the Microsoft Purview budgeting and cost management.
### Direct costs
-Direct costs impacting Azure Purview pricing are based on the following three dimensions:
+Direct costs impacting Microsoft Purview pricing are based on the following three dimensions:
- **Elastic data map**
- **Automated scanning & classification**
- **Advanced resource sets**
#### Elastic data map
-- The **Data map** is the foundation of the Azure Purview architecture and so needs to be up to date with asset information in the data estate at any given point
+- The **Data map** is the foundation of the Microsoft Purview architecture and so needs to be up to date with asset information in the data estate at any given point
- The data map is charged in terms of **Capacity Unit** (CU). The data map is provisioned at one CU if the catalog is storing up to 10 GB of metadata storage and serves up to 25 data map operations/sec
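As a rough budgeting aid, the sizing rule above (one CU per 10 GB of metadata storage and per 25 operations/sec) can be sketched as a simple calculation. This is an illustrative simplification for planning only, not the actual elastic data map billing logic:

```python
import math

def required_capacity_units(metadata_gb: float, ops_per_sec: float) -> int:
    """Estimate elastic data map capacity units (CUs).

    Illustrative only: assumes one CU covers up to 10 GB of metadata
    storage and up to 25 data map operations/sec, as stated above.
    """
    return max(1,
               math.ceil(metadata_gb / 10),
               math.ceil(ops_per_sec / 25))

# A catalog with 8 GB of metadata serving 20 ops/sec fits in one CU.
print(required_capacity_units(8, 20))
```

Because the data map is elastic, the dominant dimension (metadata size or request throughput) drives the provisioned capacity, which is why the sketch takes the maximum of the two.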
Direct costs impacting Azure Purview pricing are based on the following three di
#### Automated scanning, classification and ingestion
-There are two major automated processes that can trigger ingestion of metadata into Azure Purview:
+There are two major automated processes that can trigger ingestion of metadata into Microsoft Purview:
1. Automatic scans using native [connectors](azure-purview-connector-overview.md). This process includes three main steps:
 - Metadata scan
 - Automatic classification
- - Ingestion of metadata into Azure Purview
+ - Ingestion of metadata into Microsoft Purview
2. Automated ingestion using Azure Data Factory and/or Azure Synapse pipelines. This process includes:
- - Ingestion of metadata and lineage into Azure Purview if Azure Purview account is connected to any Azure Data Factory or Azure Synapse pipelines.
+ - Ingestion of metadata and lineage into Microsoft Purview if Microsoft Purview account is connected to any Azure Data Factory or Azure Synapse pipelines.
##### 1. Automatic scans using native connectors
- A **full scan** processes all assets within a selected scope of a data source, whereas an **incremental scan** detects and processes assets that have been created, modified, or deleted since the previous successful scan
There are two major automated processes that can trigger ingestion of metadata i
#### Advanced resource sets

-- Azure Purview uses **resource sets** to address the challenge of mapping large numbers of data assets to a single logical resource by providing the ability to scan all the files in the data lake and find patterns (GUID, localization patterns, etc.) to group them as a single asset in the data map
+- Microsoft Purview uses **resource sets** to address the challenge of mapping large numbers of data assets to a single logical resource by providing the ability to scan all the files in the data lake and find patterns (GUID, localization patterns, etc.) to group them as a single asset in the data map
- **Advanced Resource Set** is an optional feature that allows customers to get enriched, computed resource set information such as Total Size, Partition Count, etc., and enables the customization of resource set grouping via pattern rules. If the Advanced Resource Set feature is not enabled, your data catalog will still contain resource set assets, but without the aggregated properties. There will be no "Resource Set" meter billed to the customer in this case.
-- Use the basic resource set feature, before switching on the Advanced Resource Sets in Azure Purview to verify if requirements are met
+- Use the basic resource set feature, before switching on the Advanced Resource Sets in Microsoft Purview to verify if requirements are met
- Consider turning on Advanced Resource Sets if:
- - your data lakes schema is constantly changing, and you are looking for additional value beyond the basic Resource Set feature to enable Azure Purview to compute parameters such as #partitions, size of the data estate, etc., as a service
+ - your data lake's schema is constantly changing, and you are looking for additional value beyond the basic Resource Set feature to enable Microsoft Purview to compute parameters such as #partitions, size of the data estate, etc., as a service
- there is a need to customize how resource set assets get grouped
- It is important to note that billing for Advanced Resource Sets is based on the compute used by the offline tier to aggregate resource set information and is dependent on the size/number of resource sets in your catalog
There are two major automated processes that can trigger ingestion of metadata i
### Indirect costs
-Indirect costs impacting Azure Purview pricing to be considered are:
+Indirect costs impacting Microsoft Purview pricing to be considered are:
- [Managed resources](https://azure.microsoft.com/pricing/details/azure-purview/)
- - When an Azure Purview account is provisioned, a storage account and event hub queue are created within the subscription in order to cater to secured scanning, which may be charged separately
+ - When a Microsoft Purview account is provisioned, a storage account and event hub queue are created within the subscription in order to cater to secured scanning, which may be charged separately
- [Azure private endpoint](./catalog-private-link.md)
- - Azure private end points are used for Azure Purview accounts where it is required for users on a virtual network (VNet) to securely access the catalog over a private link
+ - Azure private endpoints are used for Microsoft Purview accounts where users on a virtual network (VNet) need to securely access the catalog over a private link
- The prerequisites for setting up private endpoints could result in extra costs
- [Self-hosted integration runtime related costs](./manage-integration-runtimes.md)
 - Self-hosted integration runtime requires infrastructure, which results in extra costs
- - It is required to deploy and register Self-hosted integration runtime (SHIR) inside the same virtual network where Azure Purview ingestion private endpoints are deployed
+ - The self-hosted integration runtime (SHIR) must be deployed and registered inside the same virtual network where the Microsoft Purview ingestion private endpoints are deployed
- [Additional memory requirements for scanning](./register-scan-sapecc-source.md#create-and-run-scan)
 - Certain data sources such as SAP require additional memory on the SHIR machine for scanning
Indirect costs impacting Azure Purview pricing to be considered are:
- Plan virtual machine sizing in order to distribute the scanning workload across VMs to optimize the v-cores utilized while running scans
- [Microsoft 365 license](./create-sensitivity-label.md)
- - Microsoft Information Protection (MIP) sensitivity labels can be automatically applied to your Azure assets in Azure Purview.
+ - Microsoft Information Protection (MIP) sensitivity labels can be automatically applied to your Azure assets in Microsoft Purview.
- MIP sensitivity labels are created and managed in the Microsoft 365 Security and Compliance Center.
- - To create sensitivity labels for use in Azure Purview, you must have an active Microsoft 365 license, which offers the benefit of automatic labeling. For the full list of licenses, see the Sensitivity labels in Azure Purview FAQ.
+ - To create sensitivity labels for use in Microsoft Purview, you must have an active Microsoft 365 license, which offers the benefit of automatic labeling. For the full list of licenses, see the Sensitivity labels in Microsoft Purview FAQ.
- [Azure Alerts](../azure-monitor/alerts/alerts-overview.md)
 - Azure Alerts can notify customers of issues found with infrastructure or applications using the monitoring data in Azure Monitor
Indirect costs impacting Azure Purview pricing to be considered are:
## Next steps
-- [Azure Purview pricing page](https://azure.microsoft.com/pricing/details/azure-purview/)
+- [Microsoft Purview pricing page](https://azure.microsoft.com/pricing/details/azure-purview/)
purview Concept Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-insights.md
Title: Understand Insights reports in Azure Purview
-description: This article explains what Insights are in Azure Purview.
+ Title: Understand Insights reports in Microsoft Purview
+description: This article explains what Insights are in Microsoft Purview.
Last updated 12/02/2020
-# Understand Insights in Azure Purview
+# Understand Insights in Microsoft Purview
-This article provides an overview of the Insights feature in Azure Purview.
+This article provides an overview of the Insights feature in Microsoft Purview.
-Insights are one of the key pillars of Azure Purview. The feature provides customers, a single pane of glass view into their catalog and further aims to provide specific insights to the data source administrators, business users, data stewards, data officer and, security administrators. Currently, Azure Purview has the following Insights reports that will be available to customers during Insight's public preview.
+Insights are one of the key pillars of Microsoft Purview. The feature provides customers a single-pane-of-glass view into their catalog, and further aims to provide specific insights to data source administrators, business users, data stewards, data officers, and security administrators. Currently, Microsoft Purview has the following Insights reports that will be available to customers during Insight's public preview.
> [!IMPORTANT]
-> Azure Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Asset Insights
This report summarizes top items that a Data Steward needs to focus on, to creat
This report provides details about where classified data is located, the classifications found during a scan, and a drill-down to the classified files themselves. It enables Stewards, Curators and Security Administrators to understand the types of information found in their organization's data estate.
-In Azure Purview, classifications are similar to subject tags, and are used to mark and identify content of a specific type in your data estate.
+In Microsoft Purview, classifications are similar to subject tags, and are used to mark and identify content of a specific type in your data estate.
Use the Classification Insights report to identify content with specific classifications and understand required actions, such as adding additional security to the repositories, or moving content to a more secure location.
-For more information, see [Classification insights about your data from Azure Purview](classification-insights.md).
+For more information, see [Classification insights about your data from Microsoft Purview](classification-insights.md).
## Sensitivity Labeling Insights

This report provides details about the sensitivity labels found during a scan, as well as a drill-down to the labeled files themselves. It enables security administrators to ensure the security of information found in their organization's data estate.
-In Azure Purview, sensitivity labels are used to identify classification type categories, as well as the group security policies that you want to apply to each category.
+In Microsoft Purview, sensitivity labels are used to identify classification type categories, as well as the group security policies that you want to apply to each category.
Use the Labeling Insights report to identify the sensitivity labels found in your content and understand required actions, such as managing access to specific repositories or files.
-For more information, see [Sensitivity label insights about your data in Azure Purview](sensitivity-insights.md).
+For more information, see [Sensitivity label insights about your data in Microsoft Purview](sensitivity-insights.md).
## Next steps
purview Concept Resource Sets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-resource-sets.md
Title: Understanding resource sets
-description: This article explains what resource sets are and how Azure Purview creates them.
+description: This article explains what resource sets are and how Microsoft Purview creates them.
Last updated 09/24/2021
# Understanding resource sets
-This article helps you understand how Azure Purview uses resource sets to map data assets to logical resources.
+This article helps you understand how Microsoft Purview uses resource sets to map data assets to logical resources.
## Background info
-At-scale data processing systems typically store a single table in storage as multiple files. In the Azure Purview data catalog, this concept is represented by using resource sets. A resource set is a single object in the catalog that represents a large number of assets in storage.
+At-scale data processing systems typically store a single table in storage as multiple files. In the Microsoft Purview data catalog, this concept is represented by using resource sets. A resource set is a single object in the catalog that represents a large number of assets in storage.
For example, suppose your Spark cluster has persisted a DataFrame into an Azure Data Lake Storage (ADLS) Gen2 data source. Although in Spark the table looks like a single logical resource, on the disk there are likely thousands of Parquet files, each of which represents a partition of the total DataFrame's contents. IoT data and web log data have the same challenge. Imagine you have a sensor that outputs log files several times a second. It won't take long until you have hundreds of thousands of log files from that single sensor.
-## How Azure Purview detects resource sets
+## How Microsoft Purview detects resource sets
-Azure Purview supports detecting resource sets in Azure Blob Storage, ADLS Gen1, ADLS Gen2, Azure Files, and Amazon S3.
+Microsoft Purview supports detecting resource sets in Azure Blob Storage, ADLS Gen1, ADLS Gen2, Azure Files, and Amazon S3.
-Azure Purview automatically detects resource sets when scanning. This feature looks at all of the data that's ingested via scanning and compares it to a set of defined patterns.
+Microsoft Purview automatically detects resource sets when scanning. This feature looks at all of the data that's ingested via scanning and compares it to a set of defined patterns.
-For example, suppose you scan a data source whose URL is `https://myaccount.blob.core.windows.net/mycontainer/machinesets/23/foo.parquet`. Azure Purview looks at the path segments and determines if they match any built-in patterns. It has built-in patterns for GUIDs, numbers, date formats, localization codes (for example, en-us), and so on. In this case, the number pattern matches *23*. Azure Purview assumes that this file is part of a resource set named `https://myaccount.blob.core.windows.net/mycontainer/machinesets/{N}/foo.parquet`.
+For example, suppose you scan a data source whose URL is `https://myaccount.blob.core.windows.net/mycontainer/machinesets/23/foo.parquet`. Microsoft Purview looks at the path segments and determines if they match any built-in patterns. It has built-in patterns for GUIDs, numbers, date formats, localization codes (for example, en-us), and so on. In this case, the number pattern matches *23*. Microsoft Purview assumes that this file is part of a resource set named `https://myaccount.blob.core.windows.net/mycontainer/machinesets/{N}/foo.parquet`.
-Or, for a URL like `https://myaccount.blob.core.windows.net/mycontainer/weblogs/en_au/23.json`, Azure Purview matches both the localization pattern and the number pattern, producing a resource set named `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`.
+Or, for a URL like `https://myaccount.blob.core.windows.net/mycontainer/weblogs/en_au/23.json`, Microsoft Purview matches both the localization pattern and the number pattern, producing a resource set named `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`.
-Using this strategy, Azure Purview would map the following resources to the same resource set, `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`:
+Using this strategy, Microsoft Purview would map the following resources to the same resource set, `https://myaccount.blob.core.windows.net/mycontainer/weblogs/{LOC}/{N}.json`:
- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/1004.json`
- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/234.json`
- `https://myaccount.blob.core.windows.net/mycontainer/weblogs/de_Ch/23434.json`
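The segment-matching strategy described above can be sketched as follows. This is hypothetical code, not Purview's implementation; the pattern list is a simplified stand-in for the built-in GUID, locale, and number patterns:

```python
import re

# Illustrative built-in patterns: GUIDs, locale codes, and numbers.
PATTERNS = [
    (re.compile(r"^[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}$"), "{GUID}"),
    (re.compile(r"^[a-z]{2}[-_][a-zA-Z]{2}$"), "{LOC}"),  # e.g. en_us, cy_gb
    (re.compile(r"^\d+$"), "{N}"),
]

def normalize_segment(segment: str) -> str:
    # Match the file-name stem separately so "23.json" becomes "{N}.json".
    stem, dot, ext = segment.partition(".")
    for pattern, placeholder in PATTERNS:
        if pattern.match(stem):
            return placeholder + dot + ext
    return segment

def resource_set_name(url: str) -> str:
    scheme, sep, rest = url.partition("://")
    host, *path = rest.split("/")
    return scheme + sep + "/".join([host] + [normalize_segment(s) for s in path])

urls = [
    "https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/1004.json",
    "https://myaccount.blob.core.windows.net/mycontainer/weblogs/cy_gb/234.json",
    "https://myaccount.blob.core.windows.net/mycontainer/weblogs/de_Ch/23434.json",
]
# All three example URLs collapse to .../weblogs/{LOC}/{N}.json
print({resource_set_name(u) for u in urls})
```

Because every path normalizes to the same template, the three files group into a single resource set asset in the catalog.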
-### File types that Azure Purview will not detect as resource sets
+### File types that Microsoft Purview will not detect as resource sets
-Azure Purview intentionally doesn't try to classify most document file types like Word, Excel, or PDF as Resource Sets. The exception is CSV format since that is a common partitioned file format.
+Microsoft Purview intentionally doesn't try to classify most document file types like Word, Excel, or PDF as Resource Sets. The exception is CSV format since that is a common partitioned file format.
-## How Azure Purview scans resource sets
+## How Microsoft Purview scans resource sets
-When Azure Purview detects resources that it thinks are part of a resource set, it switches from a full scan to a sample scan. A sample scan opens only a subset of the files that it thinks are in the resource set. For each file it opens, it uses its schema and runs its classifiers. Azure Purview then finds the newest resource among the opened resources and uses that resource's schema and classifications in the entry for the entire resource set in the catalog.
+When Microsoft Purview detects resources that it thinks are part of a resource set, it switches from a full scan to a sample scan. A sample scan opens only a subset of the files that it thinks are in the resource set. For each file it opens, it uses its schema and runs its classifiers. Microsoft Purview then finds the newest resource among the opened resources and uses that resource's schema and classifications in the entry for the entire resource set in the catalog.
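The "newest resource wins" rule described above can be sketched as follows (a hypothetical simplification; the dictionary field names are illustrative, not a real Purview data structure):

```python
from datetime import datetime, timezone

def representative_asset(resource_set_files):
    """Return the most recently modified file in a resource set.

    In the simplified model above, its schema and classifications are
    recorded for the whole resource set entry in the catalog.
    """
    return max(resource_set_files, key=lambda f: f["last_modified"])

files = [
    {"path": "sets/1/part-0.parquet",
     "last_modified": datetime(2021, 1, 5, tzinfo=timezone.utc)},
    {"path": "sets/2/part-0.parquet",
     "last_modified": datetime(2021, 9, 24, tzinfo=timezone.utc)},
]
print(representative_asset(files)["path"])  # the newest file's path
```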
## Advanced resource sets
-By default, Azure Purview determines the schema and classifications for resource sets based upon the [resource set file sampling rules](sources-and-scans.md#resource-set-file-sampling). Azure Purview can customize and further enrich your resource set assets through the **Advanced Resource Sets** capability. When Advanced Resource Sets are enabled, Azure Purview run extra aggregations to compute the following information about resource set assets:
+By default, Microsoft Purview determines the schema and classifications for resource sets based upon the [resource set file sampling rules](sources-and-scans.md#resource-set-file-sampling). Microsoft Purview can customize and further enrich your resource set assets through the **Advanced Resource Sets** capability. When Advanced Resource Sets are enabled, Microsoft Purview runs extra aggregations to compute the following information about resource set assets:
- Most up-to-date schema and classifications to accurately reflect schema drift from changing metadata.
- A sample path from a file that comprises the resource set.
These properties can be found on the asset details page of the resource set.
:::image type="content" source="media/concept-resource-sets/resource-set-properties.png" alt-text="The properties computed when advanced resource sets is on" border="true":::
-Enabling advanced resource sets also allows for the creation of [resource set pattern rules](how-to-resource-set-pattern-rules.md) that customize how Azure Purview groups resource sets during scanning.
+Enabling advanced resource sets also allows for the creation of [resource set pattern rules](how-to-resource-set-pattern-rules.md) that customize how Microsoft Purview groups resource sets during scanning.
### Turning on advanced resource sets
-Advanced resource sets is off by default in all new Azure Purview instances. Advanced resource sets can be enabled from **Account information** in the management hub.
+Advanced resource sets is off by default in all new Microsoft Purview instances. Advanced resource sets can be enabled from **Account information** in the management hub.
:::image type="content" source="media/concept-resource-sets/advanced-resource-set-toggle.png" alt-text="Turn on Advanced resource set." border="true":::
-After enabling advanced resource sets, the additional enrichments will occur on all newly ingested assets. The Azure Purview team recommends waiting an hour before scanning in new data lake data after toggling on the feature.
+After enabling advanced resource sets, the additional enrichments will occur on all newly ingested assets. The Microsoft Purview team recommends waiting an hour before scanning in new data lake data after toggling on the feature.
> [!IMPORTANT]
> Enabling advanced resource sets will impact the refresh rate of asset and classification insights. When advanced resource sets is on, asset and classification insights will only update twice a day.

## Built-in resource set patterns
-Azure Purview supports the following resource set patterns. These patterns can appear as a name in a directory or as part of a file name.
+Microsoft Purview supports the following resource set patterns. These patterns can appear as a name in a directory or as part of a file name.
### Regex-based patterns

| Pattern Name | Display Name | Description |
Azure Purview supports the following resource set patterns. These patterns can a
| Date(yyyy/mm/dd)InPath | {Year}/{Month}/{Day} | Year/month/day pattern spanning multiple folders |
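As an illustration, the `Date(yyyy/mm/dd)InPath` pattern spanning multiple folders could be matched with a regular expression along these lines (a sketch only, not the actual matcher):

```python
import re

# Year/month/day spread across three consecutive path segments.
DATE_IN_PATH = re.compile(r"/(\d{4})/(\d{2})/(\d{2})(?=/)")

def normalize_date_path(path: str) -> str:
    """Replace a yyyy/mm/dd folder run with {Year}/{Month}/{Day}."""
    return DATE_IN_PATH.sub("/{Year}/{Month}/{Day}", path)

print(normalize_date_path("/logs/2021/09/24/events.json"))
# -> /logs/{Year}/{Month}/{Day}/events.json
```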
-## How resource sets are displayed in the Azure Purview data catalog
+## How resource sets are displayed in the Microsoft Purview data catalog
-When Azure Purview matches a group of assets into a resource set, it attempts to extract the most useful information to use as a display name in the catalog. Some examples of the default naming convention applied:
+When Microsoft Purview matches a group of assets into a resource set, it attempts to extract the most useful information to use as a display name in the catalog. Some examples of the default naming convention applied:
### Example 1
Display name: "data"
## Customizing resource set grouping using pattern rules
-When scanning a storage account, Azure Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Azure Purview's resource set grouping may not accurately reflect your data estate. These issues can include:
+When scanning a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Microsoft Purview's resource set grouping may not accurately reflect your data estate. These issues can include:
- Incorrectly marking an asset as a resource set
- Putting an asset into the wrong resource set
- Incorrectly marking an asset as not being a resource set
-To customize or override how Azure Purview detects which assets are grouped as resource sets and how they are displayed within the catalog, you can define pattern rules in the management center. For step-by-step instructions and syntax, please see [resource set pattern rules](how-to-resource-set-pattern-rules.md).
+To customize or override how Microsoft Purview detects which assets are grouped as resource sets and how they are displayed within the catalog, you can define pattern rules in the management center. For step-by-step instructions and syntax, please see [resource set pattern rules](how-to-resource-set-pattern-rules.md).
## Next steps
-To get started with Azure Purview, see [Quickstart: Create an Azure Purview account](create-catalog-portal.md).
+To get started with Microsoft Purview, see [Quickstart: Create a Microsoft Purview account](create-catalog-portal.md).
purview Concept Scans And Ingestion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-scans-and-ingestion.md
Title: Scans and ingestion
-description: This article explains scans and ingestion in Azure Purview.
+description: This article explains scans and ingestion in Microsoft Purview.
Last updated 08/18/2021
-# Scans and ingestion in Azure Purview
+# Scans and ingestion in Microsoft Purview
-This article provides an overview of the Scanning and Ingestion features in Azure Purview. These features connect your Azure Purview account to your sources to populate the data map and data catalog so you can begin exploring and managing your data through Azure Purview.
+This article provides an overview of the Scanning and Ingestion features in Microsoft Purview. These features connect your Microsoft Purview account to your sources to populate the data map and data catalog so you can begin exploring and managing your data through Microsoft Purview.
## Scanning
-After data sources are [registered](manage-data-sources.md) in your Azure Purview account, the next step is to scan the data sources. The scanning process establishes a connection to the data source and captures technical metadata like names, file size, columns, and so on. It also extracts schema for structured data sources, applies classifications on schemas, and [applies sensitivity labels if your Azure Purview account is connected to a Microsoft 365 Security and Compliance Center (SCC)](create-sensitivity-label.md). The scanning process can be triggered to run immediately or can be scheduled to run on a periodic basis to keep your Azure Purview account up to date.
+After data sources are [registered](manage-data-sources.md) in your Microsoft Purview account, the next step is to scan the data sources. The scanning process establishes a connection to the data source and captures technical metadata like names, file size, columns, and so on. It also extracts schema for structured data sources, applies classifications on schemas, and [applies sensitivity labels if your Microsoft Purview account is connected to a Microsoft 365 Security and Compliance Center (SCC)](create-sensitivity-label.md). The scanning process can be triggered to run immediately or can be scheduled to run on a periodic basis to keep your Microsoft Purview account up to date.
For each scan there are customizations you can apply so that you're only scanning your sources for the information you need. ### Choose an authentication method for your scans
-Azure Purview is secure by default. No passwords or secrets are stored directly in Azure Purview, so you'll need to choose an authentication method for your sources. There are four possible ways to authenticate your Azure Purview account, but not all methods are supported for each data source.
+Microsoft Purview is secure by default. No passwords or secrets are stored directly in Microsoft Purview, so you'll need to choose an authentication method for your sources. There are four possible ways to authenticate your Microsoft Purview account, but not all methods are supported for each data source.
- Managed Identity
- Service Principal
- SQL Authentication
- Account Key or Basic Authentication
-Whenever possible, a Managed Identity is the preferred authentication method because it eliminates the need for storing and managing credentials for individual data sources. This can greatly reduce the time you and your team spend setting up and troubleshooting authentication for scans. When you enable a managed identity for your Azure Purview account, an identity is created in Azure Active Directory and is tied to the lifecycle of your account.
+Whenever possible, a Managed Identity is the preferred authentication method because it eliminates the need for storing and managing credentials for individual data sources. This can greatly reduce the time you and your team spend setting up and troubleshooting authentication for scans. When you enable a managed identity for your Microsoft Purview account, an identity is created in Azure Active Directory and is tied to the lifecycle of your account.
### Scope your scan
There are [system scan rule sets](create-a-scan-rule-set.md#system-scan-rule-set
### Schedule your scan
-Azure Purview gives you a choice of scanning weekly or monthly at a specific time you choose. Weekly scans may be appropriate for data sources with structures that are actively under development or frequently change. Monthly scanning is more appropriate for data sources that change infrequently. A good best practice is to work with the administrator of the source you want to scan to identify a time when compute demands on the source are low.
+Microsoft Purview gives you a choice of scanning weekly or monthly at a specific time you choose. Weekly scans may be appropriate for data sources with structures that are actively under development or frequently change. Monthly scanning is more appropriate for data sources that change infrequently. A best practice is to work with the administrator of the source you want to scan to identify a time when compute demands on the source are low.
### How scans detect deleted assets
-An Azure Purview catalog is only aware of the state of a data store when it runs a scan. For the catalog to know if a file, table, or container was deleted, it compares the last scan output against the current scan output. For example, suppose that the last time you scanned an Azure Data Lake Storage Gen2 account, it included a folder named *folder1*. When the same account is scanned again, *folder1* is missing. Therefore, the catalog assumes the folder has been deleted.
+A Microsoft Purview catalog is only aware of the state of a data store when it runs a scan. For the catalog to know if a file, table, or container was deleted, it compares the last scan output against the current scan output. For example, suppose that the last time you scanned an Azure Data Lake Storage Gen2 account, it included a folder named *folder1*. When the same account is scanned again, *folder1* is missing. Therefore, the catalog assumes the folder has been deleted.
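The comparison described above amounts to a set difference between the two scan outputs. A minimal sketch (illustrative only, not the catalog's actual logic):

```python
def detect_deletions(previous_scan: set, current_scan: set) -> set:
    """Assets present in the last scan output but missing from the
    current one are assumed to have been deleted."""
    return previous_scan - current_scan

previous = {"folder1", "folder2", "file1.csv"}
current = {"folder2", "file1.csv"}
print(detect_deletions(previous, current))  # folder1 is assumed deleted
```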
#### Detecting deleted files
When you enumerate large data stores like Data Lake Storage Gen2, there are mult
## Ingestion
-The technical metadata or classifications identified by the scanning process are then sent to Ingestion. The ingestion process is responsible for populating the data map and is managed by Azure Purview. Ingestion analyses the input from scan, [applies resource set patterns](concept-resource-sets.md#how-azure-purview-detects-resource-sets), populates available [lineage](concept-data-lineage.md) information, and then loads the data map automatically. Assets/schemas can be discovered or curated only after ingestion is complete. So, if your scan is completed but you haven't seen your assets in the data map or catalog, you'll need to wait for the ingestion process to finish.
+The technical metadata or classifications identified by the scanning process are then sent to Ingestion. The ingestion process is responsible for populating the data map and is managed by Microsoft Purview. Ingestion analyzes the input from the scan, [applies resource set patterns](concept-resource-sets.md#how-microsoft-purview-detects-resource-sets), populates available [lineage](concept-data-lineage.md) information, and then loads the data map automatically. Assets/schemas can be discovered or curated only after ingestion is complete. So, if your scan is completed but you haven't seen your assets in the data map or catalog, you'll need to wait for the ingestion process to finish.
## Next steps
For more information, or for specific instructions for scanning sources, follow
* To understand resource sets, see our [resource sets article](concept-resource-sets.md).
* [How to register and scan an Azure SQL Database](register-scan-azure-sql-database.md#creating-the-scan)
-* [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+* [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
purview Concept Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-self-service-data-access-policy.md
Title: Azure Purview Self-service access concepts
-description: Understand what self-service access and data discovery are in Azure Purview, and explore how users can take advantage of it.
+ Title: Microsoft Purview Self-service access concepts
+description: Understand what self-service access and data discovery are in Microsoft Purview, and explore how users can take advantage of it.
Last updated 03/10/2022
-# Azure Purview Self-service data discovery and access (Preview)
+# Microsoft Purview Self-service data discovery and access (Preview)
-This article helps you understand Azure Purview Self-service data access policy.
+This article helps you understand Microsoft Purview Self-service data access policy.
> [!IMPORTANT]
-> Azure Purview Self-service data access policy is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Self-service data access policy is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Important limitations
The self-service data access policy is only supported when the prerequisites men
## Overview
-Azure Purview Self-service data access workflow allows data consumer to request access to data when browsing or searching for data. Once the data access request is approved, a policy gets auto-generated to grant access to the requestor provided the data source is enabled for data use governance. Currently, self-service data access policy is supported for storage accounts, containers, folders, and files.
+Microsoft Purview Self-service data access workflow allows data consumers to request access to data when browsing or searching for data. Once the data access request is approved, a policy gets auto-generated to grant access to the requestor, provided the data source is enabled for data use governance. Currently, self-service data access policy is supported for storage accounts, containers, folders, and files.
-A **workflow admin** will need to map a self-service data access workflow to a collection. Collection is logical grouping of data sources that are registered within Azure Purview. **Only data source(s) that are registered** for data use governance will have self-service policies auto-generated.
+A **workflow admin** will need to map a self-service data access workflow to a collection. A collection is a logical grouping of data sources that are registered within Microsoft Purview. **Only data source(s) that are registered** for data use governance will have self-service policies auto-generated.
## Terminology * **Data consumer** is anyone who uses the data. For example, a data analyst accessing marketing data for customer segmentation. Data consumer and data requestor will be used interchangeably within this document.
-* **Collection** is logical grouping of data sources that are registered within Azure Purview.
+* **Collection** is a logical grouping of data sources that are registered within Microsoft Purview.
* **Self-service data access workflow** is the workflow that is initiated when a data consumer requests access to data. * **Approver** is either a security group or Azure AD users that can approve self-service access requests.
-## How to use Azure Purview self-service data access policy
+## How to use Microsoft Purview self-service data access policy
-Azure Purview allows organizations to catalog metadata about all registered data assets. It allows data consumers to search for or browse to the required data asset.
+Microsoft Purview allows organizations to catalog metadata about all registered data assets. It allows data consumers to search for or browse to the required data asset.
With self-service data access workflow, data consumers can not only find data assets but also request access to the data assets. When the data consumer requests access to a data asset, the associated self-service data access workflow is triggered.
-A default self-service data access workflow template is provided with every Azure Purview account.The default template can be amended to add more approvers and/or set the approver's email address. For more details refer [Create and enable self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md).
+A default self-service data access workflow template is provided with every Microsoft Purview account. The default template can be amended to add more approvers and/or set the approver's email address. For more details, see [Create and enable self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md).
Whenever a data consumer requests access to a dataset, the notification is sent to the workflow approver(s). The approver(s) can view the request and approve it either from the Microsoft Purview portal or from within the email notification. When the request is approved, a policy is auto-generated and applied against the respective data source. Self-service data access policy gets auto-generated only if the data source is registered for **data use governance**. The prerequisites mentioned within the [data use governance](./how-to-enable-data-use-governance.md#prerequisites) have to be satisfied.
purview Concept Workflow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-workflow.md
Title: Workflows in Azure Purview
-description: This article describes workflows in Azure Purview, the roles they play, and who can create and manage them.
+ Title: Workflows in Microsoft Purview
+description: This article describes workflows in Microsoft Purview, the roles they play, and who can create and manage them.
Last updated 03/09/2022
-# Workflows in Azure Purview
+# Workflows in Microsoft Purview
[!INCLUDE [Region Notice](./includes/workflow-regions.md)]
-Workflows are automated, repeatable business processes that users can create within Azure Purview to validate and orchestrate CUD (create, update, delete) operations on their data entities. Enabling these processes allow organizations to track changes, enforce policy compliance, and ensure quality data across their data landscape.
+Workflows are automated, repeatable business processes that users can create within Microsoft Purview to validate and orchestrate CUD (create, update, delete) operations on their data entities. Enabling these processes allows organizations to track changes, enforce policy compliance, and ensure quality data across their data landscape.
-Since the workflows are created and managed within Azure Purview, manual change monitoring or approval are no longer required to ensure quality updates to the data catalog.
+Since the workflows are created and managed within Microsoft Purview, manual change monitoring or approval are no longer required to ensure quality updates to the data catalog.
## What are workflows?
Currently, there are two kinds of workflows:
* **Data governance** - for data policy, access governance, and loss prevention. [Scoped](#workflow-scope) at the collection level. * **Data catalog** - to manage approvals for CUD (create, update, delete) operations for glossary terms. [Scoped](#workflow-scope) at the glossary level.
-These workflows can be built from pre-established [workflow templates](#workflow-templates) provided in the Azure Purview studio, but are fully customizable using the available workflow connectors.
+These workflows can be built from pre-established [workflow templates](#workflow-templates) provided in the Microsoft Purview studio, but are fully customizable using the available workflow connectors.
## Workflow templates
-For all the different types of user defined workflows enabled and available for your use, Azure Purview provides templates to help [workflow administrators](#who-can-manage-workflows) create workflows without needing to build them from scratch. The templates are built into the authoring experience and automatically populate based on the workflow being created, so there's no need to search for them.
+For all the different types of user-defined workflows enabled and available for your use, Microsoft Purview provides templates to help [workflow administrators](#who-can-manage-workflows) create workflows without needing to build them from scratch. The templates are built into the authoring experience and automatically populate based on the workflow being created, so there's no need to search for them.
Templates are available to launch the workflow authoring experience. However, a workflow admin can customize the template to meet the requirements in their organization. ## Workflow connectors
-Workflow connectors are a common set of actions applicable across all workflows. They can be used in any workflow in Azure Purview to create processes customized to your organization. Currently, the available connectors are:
+Workflow connectors are a common set of actions applicable across all workflows. They can be used in any workflow in Microsoft Purview to create processes customized to your organization. Currently, the available connectors are:
- **Approval connector** – Generates approval requests and assigns them to individual users or Microsoft Azure Active Directory groups.
- Azure Purview workflow approval connector currently supports two types of approval types:
+ Microsoft Purview workflow approval connector currently supports two approval types:
* First to Respond – This implies that the first approver's outcome (Approve/Reject) is considered final. * Everyone must approve – This implies everyone identified as an approver must approve the request for the request to be considered approved. If one approver rejects the request, regardless of other approvers, the request is rejected.
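The difference between the two approval types can be illustrated with a small sketch. This is not Purview code — just a hypothetical evaluator showing how each mode would resolve a set of approver responses:

```python
def resolve_request(responses, mode):
    """Illustrative only: resolve an approval request from approver responses.

    responses: list of (approver, outcome) tuples in the order received,
               where outcome is "Approve" or "Reject".
    mode: "first_to_respond" or "everyone_must_approve".
    Returns True if the request is approved.
    """
    if mode == "first_to_respond":
        # The first approver's outcome is final.
        return responses[0][1] == "Approve"
    if mode == "everyone_must_approve":
        # A single rejection rejects the request, regardless of other approvers.
        return all(outcome == "Approve" for _, outcome in responses)
    raise ValueError(f"unknown mode: {mode}")

responses = [("alice", "Approve"), ("bob", "Reject")]
print(resolve_request(responses, "first_to_respond"))       # True
print(resolve_request(responses, "everyone_must_approve"))  # False
```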
Workflow connectors are a common set of actions applicable across all workflows.
Once a workflow is created and enabled, it can be bound to a particular scope. This gives you the flexibility to run different workflows for different areas/departments in your organization.
-Data governance workflows are scoped to collections, and can be bound to the root collection to govern the whole Azure Purview catalog, or any subcollection.
+Data governance workflows are scoped to collections, and can be bound to the root collection to govern the whole Microsoft Purview catalog, or any subcollection.
Data catalog workflows are scoped to the glossary and can be bound to the entire glossary, any single term, or any parent term to manage child-terms.
A Workflow admin defined for any collection can create approval workflows for th
## Next steps
-Now that you understand what workflows are, you can follow these guides to use them in your Azure Purview account:
+Now that you understand what workflows are, you can follow these guides to use them in your Microsoft Purview account:
- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md) - [Approval workflow for business terms](how-to-workflow-business-terms-approval.md)
purview Create A Custom Classification And Classification Rule https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-a-custom-classification-and-classification-rule.md
Title: Create a custom classification and classification rule
-description: Learn how to create custom classifications to define data types in your data estate that are unique to your organization in Azure Purview.
+description: Learn how to create custom classifications to define data types in your data estate that are unique to your organization in Microsoft Purview.
Last updated 09/27/2021
-# Custom classifications in Azure Purview
+# Custom classifications in Microsoft Purview
This article describes how you can create custom classifications to define data types in your data estate that are unique to your organization. It also describes the creation of custom classification rules that let you find specified data throughout your data estate. ## Default system classifications
-The Azure Purview Data Catalog provides a large set of default system classifications that represent typical personal data types that you might have in your data estate. For the entire list of available system classifications, see [Supported classifications in Azure Purview](supported-classifications.md).
+The Microsoft Purview Data Catalog provides a large set of default system classifications that represent typical personal data types that you might have in your data estate. For the entire list of available system classifications, see [Supported classifications in Microsoft Purview](supported-classifications.md).
:::image type="content" source="media/create-a-custom-classification-and-classification-rule/classification.png" alt-text="select classification" border="true":::
You also have the ability to create custom classifications, if any of the defaul
> Our [data sampling rules](sources-and-scans.md#sampling-within-a-file) are applied to both system and custom classifications. > [!NOTE]
-> Azure Purview custom classifications are applied only to structured data sources like SQL and CosmosDB, and to structured file types like CSV, JSON, and Parquet. Custom classification isn't applied to unstructured data file types like DOC, PDF, and XLSX.
+> Microsoft Purview custom classifications are applied only to structured data sources like SQL and CosmosDB, and to structured file types like CSV, JSON, and Parquet. Custom classification isn't applied to unstructured data file types like DOC, PDF, and XLSX.
## Steps to create a custom classification
purview Create A Scan Rule Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-a-scan-rule-set.md
Title: Create a scan rule set
-description: Create a scan rule set in Azure Purview to quickly scan data sources in your organization.
+description: Create a scan rule set in Microsoft Purview to quickly scan data sources in your organization.
Last updated 09/27/2021
# Create a scan rule set
-In an Azure Purview catalog, you can create scan rule sets to enable you to quickly scan data sources in your organization.
+In a Microsoft Purview catalog, you can create scan rule sets to enable you to quickly scan data sources in your organization.
A scan rule set is a container for grouping a set of scan rules together so that you can easily associate them with a scan. For example, you might create a default scan rule set for each of your data source types, and then use these scan rule sets by default for all scans within your company. You might also want users with the right permissions to create other scan rule sets with different configurations based on business need.
A scan rule set is a container for grouping a set of scan rules together so that
To create a scan rule set:
-1. From your Azure [Azure Purview Studio](https://web.purview.azure.com/resource/), select **Data Map**.
+1. From your [Microsoft Purview Studio](https://web.purview.azure.com/resource/), select **Data Map**.
1. Select **Scan rule sets** from the left pane, and then select **New**.
To create a scan rule set:
## Create a custom file type
-Azure Purview supports adding a custom extension and defining a custom column delimiter in a scan rule set.
+Microsoft Purview supports adding a custom extension and defining a custom column delimiter in a scan rule set.
To create a custom file type:
To create a custom file type:
## Ignore patterns
-Azure Purview supports defining regular expressions (regex) to exclude assets during scanning. During scanning, Azure Purview will compare the asset's URL against these regular expressions. All assets matching any of the regexes mentioned will be ignored while scanning.
+Microsoft Purview supports defining regular expressions (regex) to exclude assets during scanning. During scanning, Microsoft Purview will compare the asset's URL against these regular expressions. All assets matching any of the regexes mentioned will be ignored while scanning.
The **Ignore patterns** blade pre-populates one regex for Spark transaction files. You can remove the pre-existing pattern if it is not required. You can define up to 10 ignore patterns.
In the above example:
Here are some more tips you can use to ignore patterns: -- While processing the regex, Azure Purview will add $ to the regex by default.-- A good way to understand what url the scanning agent will compare with your regular expression is to browse through the Azure Purview data catalog, find the asset you want to ignore in the future, and see its fully qualified name (FQN) in the **Overview** tab.
+- While processing the regex, Microsoft Purview will add $ to the regex by default.
+- A good way to understand what URL the scanning agent will compare with your regular expression is to browse through the Microsoft Purview data catalog, find the asset you want to ignore in the future, and see its fully qualified name (FQN) in the **Overview** tab.
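Since the service appends `$` to each pattern and compares it against the asset's URL, the matching behavior can be sketched locally. This is illustrative Python, not Purview code, and the sample FQNs and patterns are hypothetical:

```python
import re

def is_ignored(asset_url, ignore_patterns):
    """Illustrative sketch of ignore-pattern matching: a $ is appended to
    each regex, and an asset is ignored if its URL matches any pattern."""
    return any(re.search(pattern + "$", asset_url) for pattern in ignore_patterns)

# Hypothetical fully qualified names (FQNs) as shown on an asset's Overview tab.
patterns = [r".*/_delta_log/.*", r".*\.tmp"]
print(is_ignored(
    "https://myaccount.dfs.core.windows.net/container/table/_delta_log/000.json",
    patterns))  # True: matches the first pattern
print(is_ignored(
    "https://myaccount.dfs.core.windows.net/container/table/part-0001.parquet",
    patterns))  # False: matches neither pattern
```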
:::image type="content" source="./media/create-a-scan-rule-set/fully-qualified-name.png" alt-text="Screenshot showing the fully qualified name on an asset's overview tab."::: ## System scan rule sets
-System scan rule sets are Microsoft-defined scan rule sets that are automatically created for each Azure Purview catalog. Each system scan rule set is associated with a specific data source type. When you create a scan, you can associate it with a system scan rule set. Every time Microsoft makes an update to these system rule sets, you can update them in your catalog, and apply the update to all the associated scans.
+System scan rule sets are Microsoft-defined scan rule sets that are automatically created for each Microsoft Purview catalog. Each system scan rule set is associated with a specific data source type. When you create a scan, you can associate it with a system scan rule set. Every time Microsoft makes an update to these system rule sets, you can update them in your catalog, and apply the update to all the associated scans.
1. To view the list of system scan rule sets, select **Scan rule sets** in the **Management Center** and choose the **System** tab.
purview Create Azure Purview Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-azure-purview-dotnet.md
Title: 'Quickstart: Create Azure Purview Account using .NET SDK'
-description: Create an Azure Purview Account using .NET SDK.
+ Title: 'Quickstart: Create Microsoft Purview Account using .NET SDK'
+description: Create a Microsoft Purview Account using .NET SDK.
Last updated 09/27/2021
-# Quickstart: Create an Azure Purview account using .NET SDK
+# Quickstart: Create a Microsoft Purview account using .NET SDK
-In this quickstart, you'll use the [.NET SDK](/dotnet/api/overview/azure/purviewresourceprovider) to create an Azure Purview account.
+In this quickstart, you'll use the [.NET SDK](/dotnet/api/overview/azure/purviewresourceprovider) to create a Microsoft Purview account.
-Azure Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Azure Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end linage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
+Microsoft Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Microsoft Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
-For more information about Azure Purview, [see our overview page](overview.md). For more information about deploying Azure Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
+For more information about Microsoft Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
[!INCLUDE [purview-quickstart-prerequisites](includes/purview-quickstart-prerequisites.md)]
Next, create a C# .NET console application in Visual Studio:
Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory ```
-## Create an Azure Purview client
+## Create a Microsoft Purview client
1. Open **Program.cs**, include the following statements to add references to namespaces.
Next, create a C# .NET console application in Visual Studio:
using Microsoft.IdentityModel.Clients.ActiveDirectory; ```
-2. Add the following code to the **Main** method that sets the variables. Replace the placeholders with your own values. For a list of Azure regions in which Azure Purview is currently available, search on **Azure Purview** and select the regions that interest you on the following page: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/).
+2. Add the following code to the **Main** method that sets the variables. Replace the placeholders with your own values. For a list of Azure regions in which Microsoft Purview is currently available, search on **Microsoft Purview** and select the regions that interest you on the following page: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/).
```csharp // Set variables
Next, create a C# .NET console application in Visual Studio:
"<specify the name of purview account to create. It must be globally unique.>"; ```
-3. Add the following code to the **Main** method that creates an instance of **PurviewManagementClient** class. You use this object to create an Azure Purview Account.
+3. Add the following code to the **Main** method that creates an instance of **PurviewManagementClient** class. You use this object to create a Microsoft Purview Account.
```csharp // Authenticate and create a purview management client
Next, create a C# .NET console application in Visual Studio:
}; ```
-## Create an Azure Purview account
+## Create a Microsoft Purview account
-Add the following code to the **Main** method that creates a **Azure Purview Account**.
+Add the following code to the **Main** method that creates a **Microsoft Purview Account**.
```csharp // Create a purview Account
-Console.WriteLine("Creating Azure Purview Account " + purviewAccountName + "...");
+Console.WriteLine("Creating Microsoft Purview Account " + purviewAccountName + "...");
Account account = new Account() { Location = region,
Console.ReadKey();
Build and start the application, then verify the execution.
-The console prints the progress of creating Azure Purview Account.
+The console prints the progress of creating Microsoft Purview Account.
### Sample output ```json
-Creating Azure Purview Account testpurview...
+Creating Microsoft Purview Account testpurview...
Succeeded { "sku": {
Press any key to exit...
## Verify the output
-Go to the **Azure Purview accounts** page in the [Azure portal](https://portal.azure.com) and verify the account created using the above code.
+Go to the **Microsoft Purview accounts** page in the [Azure portal](https://portal.azure.com) and verify the account created using the above code.
-## Delete Azure Purview account
+## Delete Microsoft Purview account
-To programmatically delete an Azure Purview Account, add the following lines of code to the program:
+To programmatically delete a Microsoft Purview Account, add the following lines of code to the program:
```csharp
-Console.WriteLine("Deleting the Azure Purview Account");
+Console.WriteLine("Deleting the Microsoft Purview Account");
client.Accounts.Delete(resourceGroup, purviewAccountName); ```
-## Check if Azure Purview account name is available
+## Check if Microsoft Purview account name is available
To check availability of a purview account, use the following code:
CheckNameAvailabilityRequest checkNameAvailabilityRequest = newCheckNameAvailabi
Name = purviewAccountName, Type = "Microsoft.Purview/accounts" };
-Console.WriteLine("Check Azure Purview account name");
+Console.WriteLine("Check Microsoft Purview account name");
Console.WriteLine(client.Accounts.CheckNameAvailability(checkNameAvailabilityRequest).NameAvailable); ```
The above code will print 'True' if the name is available and 'False' if the nam
## Next steps
-The code in this tutorial creates a purview account, deletes a purview account and checks for name availability of purview account. You can now download the .NET SDK and learn about other resource provider actions you can perform for an Azure Purview account.
+The code in this tutorial creates a Purview account, deletes it, and checks name availability for a Purview account. You can now download the .NET SDK and learn about other resource provider actions you can perform for a Microsoft Purview account.
-Follow these next articles to learn how to navigate the Azure Purview Studio, create a collection, and grant access to Azure Purview.
+Follow these next articles to learn how to navigate the Microsoft Purview Studio, create a collection, and grant access to Microsoft Purview.
-* [How to use the Azure Purview Studio](use-azure-purview-studio.md)
+* [How to use the Microsoft Purview Studio](use-azure-purview-studio.md)
* [Create a collection](quickstart-create-collection.md)
-* [Add users to your Azure Purview account](catalog-permissions.md)
+* [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Create Azure Purview Portal Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-azure-purview-portal-faq.md
Title: Create an exception to deploy Azure Purview
-description: This article describes how to create an exception to deploy Azure Purview while leaving existing Azure policies in place to maintain security.
+ Title: Create an exception to deploy Microsoft Purview
+description: This article describes how to create an exception to deploy Microsoft Purview while leaving existing Azure policies in place to maintain security.
Last updated 08/26/2021
-# Create an exception to deploy Azure Purview
+# Create an exception to deploy Microsoft Purview
-Many subscriptions have [Azure Policies](../governance/policy/overview.md) in place that restrict the creation of some resources. This is to maintain subscription security and cleanliness. However, Azure Purview accounts deploy two other Azure resources when they're created: an Azure Storage account, and an Event Hubs namespace. When you [create Azure Purview Account](create-catalog-portal.md), these resources will be deployed. They'll be managed by Azure, so you don't need to maintain them, but you'll need to deploy them. Existing policies may block this deployment, and you may receive an error when attempting to create an Azure Purview account.
+Many subscriptions have [Azure Policies](../governance/policy/overview.md) in place that restrict the creation of some resources. This is to maintain subscription security and cleanliness. However, Microsoft Purview accounts deploy two other Azure resources when they're created: an Azure Storage account, and an Event Hubs namespace. When you [create a Microsoft Purview account](create-catalog-portal.md), these resources will be deployed. They'll be managed by Azure, so you don't need to maintain them, but you'll need to deploy them. Existing policies may block this deployment, and you may receive an error when attempting to create a Microsoft Purview account.
To maintain your policies in your subscription, but still allow the creation of these managed resources, you can create an exception.
-## Create an Azure policy exception for Azure Purview
+## Create an Azure policy exception for Microsoft Purview
1. Navigate to the [Azure portal](https://portal.azure.com) and search for **Policy**
To maintain your policies in your subscription, but still allow the creation of
``` > [!Note]
- > The tag could be anything beside `resourceBypass` and it's up to you to define value when creating Azure Purview in later steps as long as the policy can detect the tag.
+ > The tag could be anything besides `resourceBypass`, and it's up to you to define the value when creating Microsoft Purview in later steps, as long as the policy can detect the tag.
:::image type="content" source="media/create-catalog-portal/policy-definition.png" alt-text="Screenshot showing how to create policy definition.":::
To maintain your policies in your subscription, but still allow the creation of
> [!Note] > If you have **Azure Policy** and need to add exception as in **Prerequisites**, you need to add the correct tag. For example, you can add `resourceBypass` tag:
-> :::image type="content" source="media/create-catalog-portal/add-purview-tag.png" alt-text="Add tag to Azure Purview account.":::
+> :::image type="content" source="media/create-catalog-portal/add-purview-tag.png" alt-text="Add tag to Microsoft Purview account.":::
## Next steps
-To set up Azure Purview by using Private Link, see [Use private endpoints for your Azure Purview account](./catalog-private-link.md).
+To set up Microsoft Purview by using Private Link, see [Use private endpoints for your Microsoft Purview account](./catalog-private-link.md).
purview Create Azure Purview Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-azure-purview-python.md
Title: 'Quickstart: Create an Azure Purview account using Python'
-description: Create an Azure Purview account using Python.
+ Title: 'Quickstart: Create a Microsoft Purview account using Python'
+description: Create a Microsoft Purview account using Python.
Last updated 09/27/2021
-# Quickstart: Create an Azure Purview account using Python
+# Quickstart: Create a Microsoft Purview account using Python
-In this quickstart, youΓÇÖll create an Azure Purview account programatically using Python. [Python reference for Azure Purview](/python/api/azure-mgmt-purview/) is available, but this article will take you through all the steps needed to create an account with Python.
+In this quickstart, you'll create a Microsoft Purview account programmatically using Python. [Python reference for Microsoft Purview](/python/api/azure-mgmt-purview/) is available, but this article will take you through all the steps needed to create an account with Python.
-Azure Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Azure Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end linage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
+Microsoft Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Microsoft Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
-For more information about Azure Purview, [see our overview page](overview.md). For more information about deploying Azure Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
+For more information about Microsoft Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
[!INCLUDE [purview-quickstart-prerequisites](includes/purview-quickstart-prerequisites.md)]
pip install azure-mgmt-resource
```
-3. To install the Python package for Azure Purview, run the following command:
+3. To install the Python package for Microsoft Purview, run the following command:
```python
pip install azure-mgmt-purview
```
- The [Python SDK for Azure Purview](https://github.com/Azure/azure-sdk-for-python) supports Python 2.7, 3.3, 3.4, 3.5, 3.6 and 3.7.
+ The [Python SDK for Microsoft Purview](https://github.com/Azure/azure-sdk-for-python) supports Python 2.7, 3.3, 3.4, 3.5, 3.6, and 3.7.
4. To install the Python package for Azure Identity authentication, run the following command:
# The purview name. It must be globally unique.
purview_name = '<purview account name>'
- # Location name, where Azure Purview account must be created.
+ # Location name, where Microsoft Purview account must be created.
location = '<location name>'

# Specify your Active Directory client ID, client secret, and tenant ID
try:
    pa = (purview_client.accounts.begin_create_or_update(rg_name, purview_name, purview_resource)).result()
- print("location:", pa.location, " Azure Purview Account Name: ", pa.name, " Id: " , pa.id ," tags: " , pa.tags)
+ print("location:", pa.location, " Microsoft Purview Account Name: ", pa.name, " Id: " , pa.id ," tags: " , pa.tags)
except:
    print("Error")

print_item(pa)
pa = (purview_client.accounts.get(rg_name, purview_name))
print(getattr(pa,'provisioning_state'))
if getattr(pa,'provisioning_state') == "Failed" :
- print("Error in creating Azure Purview account")
+ print("Error in creating Microsoft Purview account")
    break
time.sleep(30)
```
Here's the full Python code:
try:
    pa = (purview_client.accounts.begin_create_or_update(rg_name, purview_name, purview_resource)).result()
- print("location:", pa.location, " Azure Purview Account Name: ", purview_name, " Id: " , pa.id ," tags: " , pa.tags)
+ print("location:", pa.location, " Microsoft Purview Account Name: ", purview_name, " Id: " , pa.id ," tags: " , pa.tags)
except:
    print("Error in submitting job to create account")

print_item(pa)
pa = (purview_client.accounts.get(rg_name, purview_name))
print(getattr(pa,'provisioning_state'))
if getattr(pa,'provisioning_state') == "Failed" :
- print("Error in creating Azure Purview account")
+ print("Error in creating Microsoft Purview account")
    break
time.sleep(30)
main()
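The fragments above boil down to a create-then-poll loop: request the account, then read `provisioning_state` until it reaches a terminal value. Here is a minimal, self-contained sketch of that loop; `FakeAccounts` is a hypothetical stub standing in for the quickstart's `purview_client.accounts`, so the logic can run without an Azure subscription:

```python
import time

class FakeAccount:
    """Stand-in for the account model returned by the management client."""
    def __init__(self, state):
        self.provisioning_state = state

class FakeAccounts:
    """Hypothetical stub: state advances Creating -> Creating -> Succeeded."""
    def __init__(self):
        self._states = iter(["Creating", "Creating", "Succeeded"])

    def get(self, rg_name, purview_name):
        return FakeAccount(next(self._states))

def wait_for_provisioning(accounts, rg_name, purview_name, poll_seconds=0):
    # Poll until the account leaves its transient state; the quickstart
    # sleeps 30 seconds per iteration, shortened here so the sketch runs fast.
    while True:
        pa = accounts.get(rg_name, purview_name)
        state = getattr(pa, "provisioning_state")
        print(state)
        if state in ("Succeeded", "Failed"):
            return state
        time.sleep(poll_seconds)

result = wait_for_provisioning(FakeAccounts(), "Demo_Catalog", "purviewpython7")
```

With the real SDK client, the loop body is the same except that `get` issues a REST call, which is why the 30-second sleep matters in practice.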
## Run the code
-Build and start the application. The console prints the progress of Azure Purview account creation. Wait until it's completed.
+Build and start the application. The console prints the progress of Microsoft Purview account creation. Wait until it's completed.
Here's the sample output:

```console
-location: southcentralus Azure Purview Account Name: purviewpython7 Id: /subscriptions/8c2c7b23-848d-40fe-b817-690d79ad9dfd/resourceGroups/Demo_Catalog/providers/Microsoft.Purview/accounts/purviewpython7 tags: None
+location: southcentralus Microsoft Purview Account Name: purviewpython7 Id: /subscriptions/8c2c7b23-848d-40fe-b817-690d79ad9dfd/resourceGroups/Demo_Catalog/providers/Microsoft.Purview/accounts/purviewpython7 tags: None
Creating
Creating
Succeeded
Succeeded
## Verify the output
-Go to the **Azure Purview accounts** page in the Azure portal and verify the account created using the above code.
+Go to the **Microsoft Purview accounts** page in the Azure portal and verify the account created with the code above.
-## Delete Azure Purview account
+## Delete Microsoft Purview account
To delete the Microsoft Purview account, add the following code to the program, then run it:
pa = purview_client.accounts.begin_delete(rg_name, purview_name).result()
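The delete call returns a long-running-operation poller whose `result()` blocks until the deletion finishes. A minimal sketch of that contract, using hypothetical stubs (`FakePoller`, `FakeAccounts`) rather than the real SDK types:

```python
class FakePoller:
    """Stand-in for the LRO poller returned by begin_delete."""
    def __init__(self, value):
        self._value = value

    def result(self):
        # In the real SDK this blocks until the long-running delete completes.
        return self._value

class FakeAccounts:
    """Hypothetical stub tracking (resource group, account name) pairs."""
    def __init__(self):
        self._accounts = {("myResourceGroup", "yourPurviewAccountName")}

    def begin_delete(self, rg_name, purview_name):
        self._accounts.discard((rg_name, purview_name))
        return FakePoller(None)

    def exists(self, rg_name, purview_name):
        return (rg_name, purview_name) in self._accounts

accounts = FakeAccounts()
accounts.begin_delete("myResourceGroup", "yourPurviewAccountName").result()
print("deleted:", not accounts.exists("myResourceGroup", "yourPurviewAccountName"))
```

The resource group and account names here are placeholders; substitute the `rg_name` and `purview_name` values used earlier in the quickstart.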
## Next steps
-The code in this tutorial creates a purview account and deletes a purview account. You can now download the Python SDK and learn about other resource provider actions you can perform for an Azure Purview account.
+The code in this tutorial creates and then deletes a Microsoft Purview account. You can now download the Python SDK and learn about other resource provider actions you can perform for a Microsoft Purview account.
-Follow these next articles to learn how to navigate the Azure Purview Studio, create a collection, and grant access to Azure Purview.
+Follow these next articles to learn how to navigate the Microsoft Purview Studio, create a collection, and grant access to Microsoft Purview.
-* [How to use the Azure Purview Studio](use-azure-purview-studio.md)
+* [How to use the Microsoft Purview Studio](use-azure-purview-studio.md)
* [Create a collection](quickstart-create-collection.md)
-* [Add users to your Azure Purview account](catalog-permissions.md)
+* [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Create Catalog Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-catalog-portal.md
Title: 'Quickstart: Create an Azure Purview account in the Azure portal'
-description: This Quickstart describes how to create an Azure Purview account and configure permissions to begin using it.
+ Title: 'Quickstart: Create a Microsoft Purview account in the Azure portal'
+description: This Quickstart describes how to create a Microsoft Purview account and configure permissions to begin using it.
Last updated 11/15/2021
-#Customer intent: As a data steward, I want create a new Azure Purview Account so that I can scan and classify my data.
+#Customer intent: As a data steward, I want to create a new Microsoft Purview account so that I can scan and classify my data.
-# Quickstart: Create an Azure Purview account in the Azure portal
+# Quickstart: Create a Microsoft Purview account in the Azure portal
-This quickstart describes the steps to create an Azure Purview account in the Azure portal and get started on the process of classifying, securing, and discovering your data in Azure Purview!
+This quickstart describes the steps to create a Microsoft Purview account in the Azure portal and get started on the process of classifying, securing, and discovering your data in Microsoft Purview!
-Azure Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Azure Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end linage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
+Microsoft Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Microsoft Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-For more information about Azure Purview, [see our overview page](overview.md). For more information about deploying Azure Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
+For more information about Microsoft Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
[!INCLUDE [purview-quickstart-prerequisites](includes/purview-quickstart-prerequisites.md)]
-## Create an Azure Purview account
+## Create a Microsoft Purview account
-1. Go to the **Azure Purview accounts** page in the [Azure portal](https://portal.azure.com).
+1. Go to the **Microsoft Purview accounts** page in the [Azure portal](https://portal.azure.com).
:::image type="content" source="media/create-catalog-portal/purview-accounts-page.png" alt-text="Screenshot showing the purview accounts page in the Azure portal":::
-1. Select **Create** to create a new Azure Purview account.
+1. Select **Create** to create a new Microsoft Purview account.
- :::image type="content" source="media/create-catalog-portal/select-create.png" alt-text="Screenshot with the create button highlighted an Azure Purview in the Azure portal.":::
+ :::image type="content" source="media/create-catalog-portal/select-create.png" alt-text="Screenshot with the create button highlighted for Microsoft Purview in the Azure portal.":::
- Or instead, you can go to the marketplace, search for **Azure Purview**, and select **Create**.
+ Or instead, you can go to the marketplace, search for **Microsoft Purview**, and select **Create**.
- :::image type="content" source="media/create-catalog-portal/search-marketplace.png" alt-text="Screenshot showing Azure Purview in the Azure Marketplace, with the create button highlighted.":::
+ :::image type="content" source="media/create-catalog-portal/search-marketplace.png" alt-text="Screenshot showing Microsoft Purview in the Azure Marketplace, with the create button highlighted.":::
-1. On the new Create Azure Purview account page, under the **Basics** tab, select the Azure subscription where you want to create your Azure Purview account.
+1. On the new Create Microsoft Purview account page, under the **Basics** tab, select the Azure subscription where you want to create your Microsoft Purview account.
-1. Select an existing **resource group** or create a new one to hold your Azure Purview account.
+1. Select an existing **resource group** or create a new one to hold your Microsoft Purview account.
To learn more about resource groups, see our article on [using resource groups to manage your Azure resources](../azure-resource-manager/management/manage-resource-groups-portal.md#what-is-a-resource-group).
-1. Enter a **Azure Purview account name**. Spaces and symbols aren't allowed.
- The name of the Azure Purview account must be globally unique. If you see the following error, change the name of Azure Purview account and try creating again.
+1. Enter a **Microsoft Purview account name**. Spaces and symbols aren't allowed.
+ The name of the Microsoft Purview account must be globally unique. If you see the following error, change the name of the Microsoft Purview account and try creating it again.
- :::image type="content" source="media/create-catalog-portal/name-error.png" alt-text="Screenshot showing the Create Azure Purview account screen with an account name that is already in use, and the error message highlighted.":::
+ :::image type="content" source="media/create-catalog-portal/name-error.png" alt-text="Screenshot showing the Create Microsoft Purview account screen with an account name that is already in use, and the error message highlighted.":::
1. Choose a **location**.
- The list shows only locations that support Azure Purview. The location you choose will be the region where your Azure Purview account and meta data will be stored. Sources can be housed in other regions.
+ The list shows only locations that support Microsoft Purview. The location you choose will be the region where your Microsoft Purview account and metadata will be stored. Sources can be housed in other regions.
> [!Note]
- > Azure Purview does not support moving accounts across regions, so be sure to deploy to the correction region. You can find out more information about this in [move operation support for resources](../azure-resource-manager/management/move-support-resources.md).
+ > Microsoft Purview does not support moving accounts across regions, so be sure to deploy to the correct region. You can find more information in [move operation support for resources](../azure-resource-manager/management/move-support-resources.md).
-1. Select **Review & Create**, and then select **Create**. It takes a few minutes to complete the creation. The newly created Azure Purview account instance will appear in the list on your **Azure Purview accounts** page.
+1. Select **Review & Create**, and then select **Create**. It takes a few minutes to complete the creation. The newly created Microsoft Purview account instance will appear in the list on your **Microsoft Purview accounts** page.
- :::image type="content" source="media/create-catalog-portal/create-resource.png" alt-text="Screenshot showing the Create Azure Purview account screen with the Review + Create button highlighted":::
+ :::image type="content" source="media/create-catalog-portal/create-resource.png" alt-text="Screenshot showing the Create Microsoft Purview account screen with the Review + Create button highlighted":::
-## Open Azure Purview Studio
+## Open Microsoft Purview Studio
-After your Azure Purview account is created, you'll use the Azure Purview Studio to access and manage it. There are two ways to open Azure Purview Studio:
+After your Microsoft Purview account is created, you'll use the Microsoft Purview Studio to access and manage it. There are two ways to open Microsoft Purview Studio:
-* Open your Azure Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Azure Purview Studio" tile on the overview page.
- :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Azure Purview account overview page, with the Azure Purview Studio tile highlighted.":::
+* Open your Microsoft Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Microsoft Purview Studio" tile on the overview page.
+ :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Microsoft Purview account overview page, with the Microsoft Purview Studio tile highlighted.":::
-* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Azure Purview account, and sign in to your workspace.
+* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Microsoft Purview account, and sign in to your workspace.
## Next steps
-In this quickstart, you learned how to create an Azure Purview account and how to access it through the Azure Purview Studio.
+In this quickstart, you learned how to create a Microsoft Purview account and how to access it through the Microsoft Purview Studio.
-Next, you can create a user-assigned managed identity (UAMI) that will enable your new Azure Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
+Next, you can create a user-assigned managed identity (UAMI) that will enable your new Microsoft Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
To create a UAMI, follow our [guide to create a user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity).
-Follow these next articles to learn how to navigate the Azure Purview Studio, create a collection, and grant access to Azure Purview:
+Follow these next articles to learn how to navigate the Microsoft Purview Studio, create a collection, and grant access to Microsoft Purview:
-* [Using the Azure Purview Studio](use-azure-purview-studio.md)
+* [Using the Microsoft Purview Studio](use-azure-purview-studio.md)
* [Create a collection](quickstart-create-collection.md)
-* [Add users to your Azure Purview account](catalog-permissions.md)
+* [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Create Catalog Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-catalog-powershell.md
Title: 'Quickstart: Create an Azure Purview account with PowerShell/Azure CLI'
-description: This Quickstart describes how to create an Azure Purview account using Azure PowerShell/Azure CLI.
+ Title: 'Quickstart: Create a Microsoft Purview account with PowerShell/Azure CLI'
+description: This Quickstart describes how to create a Microsoft Purview account using Azure PowerShell/Azure CLI.
Last updated 10/28/2021
ms.devlang: azurecli
-#Customer intent: As a data steward, I want create a new Azure Purview Account so that I can scan and classify my data.
+#Customer intent: As a data steward, I want to create a new Microsoft Purview account so that I can scan and classify my data.
-# Quickstart: Create an Azure Purview account using Azure PowerShell/Azure CLI
+# Quickstart: Create a Microsoft Purview account using Azure PowerShell/Azure CLI
-In this Quickstart, you'll create an Azure Purview account using Azure PowerShell/Azure CLI. [PowerShell reference for Azure Purview](/powershell/module/az.purview/) is available, but this article will take you through all the steps needed to create an account with PowerShell.
+In this Quickstart, you'll create a Microsoft Purview account using Azure PowerShell/Azure CLI. [PowerShell reference for Microsoft Purview](/powershell/module/az.purview/) is available, but this article will take you through all the steps needed to create an account with PowerShell.
-Azure Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Azure Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end linage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure right use of your data.
+Microsoft Purview is a data governance service that helps you manage and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Microsoft Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-For more information about Azure Purview, [see our overview page](overview.md). For more information about deploying Azure Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
+For more information about Microsoft Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
[!INCLUDE [purview-quickstart-prerequisites](includes/purview-quickstart-prerequisites.md)]
Install either Azure PowerShell or Azure CLI in your client machine to deploy the template: [Command-line deployment](../azure-resource-manager/templates/template-tutorial-create-first-template.md?tabs=azure-cli#command-line-deployment)
-## Create an Azure Purview account
+## Create a Microsoft Purview account
1. Sign in with your Azure credential
-1. Create a resource group for your Azure Purview account. You can skip this step if you already have one:
+1. Create a resource group for your Microsoft Purview account. You can skip this step if you already have one:
# [PowerShell](#tab/azure-powershell)
-1. Create or Deploy the Azure Purview account
+1. Create or Deploy the Microsoft Purview account
# [PowerShell](#tab/azure-powershell)
- Use the [New-AzPurviewAccount](/powershell/module/az.purview/new-azpurviewaccount) cmdlet to create the Azure Purview account:
+ Use the [New-AzPurviewAccount](/powershell/module/az.purview/new-azpurviewaccount) cmdlet to create the Microsoft Purview account:
```azurepowershell New-AzPurviewAccount -Name yourPurviewAccountName -ResourceGroupName myResourceGroup -Location eastus -IdentityType SystemAssigned -SkuCapacity 4 -SkuName Standard -PublicNetworkAccess Enabled
# [Azure CLI](#tab/azure-cli)
- 1. Create an Azure Purview template file such as `purviewtemplate.json`. You can update `name`, `location`, and `capacity` (`4` or `16`):
+ 1. Create a Microsoft Purview template file such as `purviewtemplate.json`. You can update `name`, `location`, and `capacity` (`4` or `16`):
```json {
} ```
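The body of `purviewtemplate.json` is elided in the hunk above. A minimal template along these lines should work; the resource type, API version, and property values are assumptions pieced together from the `New-AzPurviewAccount` parameters shown earlier, so verify them against the current ARM reference for `Microsoft.Purview/accounts` before deploying:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Purview/accounts",
      "apiVersion": "2021-07-01",
      "name": "yourPurviewAccountName",
      "location": "eastus",
      "identity": { "type": "SystemAssigned" },
      "sku": { "name": "Standard", "capacity": 4 }
    }
  ]
}
```

Update `name`, `location`, and `capacity` (`4` or `16`) as the step above describes.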
- 1. Deploy Azure Purview template
+ 1. Deploy Microsoft Purview template
To run this deployment command, you must have the [latest version](/cli/azure/install-azure-cli) of Azure CLI.
1. The deployment command returns results. Look for `ProvisioningState` to see whether the deployment succeeded.
-1. If you deployed the Azure Purview account using a service principal, instead of a user account, you will also need to run the below command in the Azure CLI:
+1. If you deployed the Microsoft Purview account using a service principal instead of a user account, you will also need to run the following command in the Azure CLI:
```azurecli
- az purview account add-root-collection-admin --account-name [Azure Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
+ az purview account add-root-collection-admin --account-name [Microsoft Purview Account Name] --resource-group [Resource Group Name] --object-id [User Object Id]
```
- This command will grant the user account [collection admin](catalog-permissions.md#roles) permissions on the root collection in your Azure Purview account. This allows the user to access the Azure Purview Studio and add permission for other users. For more information about permissions in Azure Purview, see our [permissions guide](catalog-permissions.md). For more information about collections, see our [manage collections article](how-to-create-and-manage-collections.md).
+ This command will grant the user account [collection admin](catalog-permissions.md#roles) permissions on the root collection in your Microsoft Purview account. This allows the user to access the Microsoft Purview Studio and add permissions for other users. For more information about permissions in Microsoft Purview, see our [permissions guide](catalog-permissions.md). For more information about collections, see our [manage collections article](how-to-create-and-manage-collections.md).
## Next steps
-In this quickstart, you learned how to create an Azure Purview account.
+In this quickstart, you learned how to create a Microsoft Purview account.
-Follow these next articles to learn how to navigate the Azure Purview Studio, create a collection, and grant access to Azure Purview.
+Follow these next articles to learn how to navigate the Microsoft Purview Studio, create a collection, and grant access to Microsoft Purview.
-* [How to use the Azure Purview Studio](use-azure-purview-studio.md)
-* [Add users to your Azure Purview account](catalog-permissions.md)
+* [How to use the Microsoft Purview Studio](use-azure-purview-studio.md)
+* [Add users to your Microsoft Purview account](catalog-permissions.md)
* [Create a collection](quickstart-create-collection.md)
purview Create Sensitivity Label https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/create-sensitivity-label.md
Title: Labeling in Azure Purview
-description: Start utilizing sensitivity labels and classifications to enhance your Azure Purview assets
+ Title: Labeling in Microsoft Purview
+description: Start utilizing sensitivity labels and classifications to enhance your Microsoft Purview assets
Last updated 09/27/2021
-# Labeling in Azure Purview
+# Labeling in Microsoft Purview
> [!IMPORTANT]
-> Azure Purview Sensitivity Labels are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Sensitivity Labels are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
> To get work done, people in your organization collaborate with others both inside and outside the organization. Data doesn't always stay in your cloud; it often roams everywhere, across devices, apps, and services. When your data roams, you still want it to be secure in a way that meets your organization's business and compliance policies.<br/>
Applying sensitivity labels to your content enables you to keep your data secure
For example, applying a sensitivity label 'highly confidential' to a document that contains social security numbers and credit card numbers helps you identify the sensitivity of the document without knowing the actual data in the document.
-## Benefits of labeling in Azure Purview
+## Benefits of labeling in Microsoft Purview
-Azure Purview allows you to apply sensitivity labels to assets, enabling you to classify and protect your data.
+Microsoft Purview allows you to apply sensitivity labels to assets, enabling you to classify and protect your data.
-* **Label travels with the data:** The sensitivity labels created in Microsoft 365 can also be extended to Azure Purview, SharePoint, Teams, Power BI, and SQL. When you apply a label on an office document and then scan it in Azure Purview, the label will flow to Azure Purview. While the label is applied to the actual file in M365, it is only added as metadata in the Azure Purview catalog. While there are differences in how a label is applied to an asset across various services/applications, labels travel with the data and is recognized by all the services you extend it to.
-* **Overview of your data estate:** Azure Purview provides insights into your data through pre-canned reports. When you scan data in Azure Purview, we hydrate the reports with information on what assets you have, scan history, classifications found in your data, labels applied, glossary terms, etc.
+* **Label travels with the data:** The sensitivity labels created in Microsoft 365 can also be extended to Microsoft Purview, SharePoint, Teams, Power BI, and SQL. When you apply a label to an Office document and then scan it in Microsoft Purview, the label will flow to Microsoft Purview. While the label is applied to the actual file in Microsoft 365, it is only added as metadata in the Microsoft Purview catalog. While there are differences in how a label is applied to an asset across various services and applications, labels travel with the data and are recognized by all the services you extend them to.
+* **Overview of your data estate:** Microsoft Purview provides insights into your data through pre-canned reports. When you scan data in Microsoft Purview, we hydrate the reports with information on what assets you have, scan history, classifications found in your data, labels applied, glossary terms, etc.
* **Automatic labeling:** Labels can be applied automatically based on sensitivity of the data. When an asset is scanned for sensitive data, autolabeling rules are used to decide which sensitivity label to apply. You can create autolabeling rules for each sensitivity label, defining which classification/sensitive information type constitutes a label.
* **Apply labels to files and database columns:** Labels can be applied to files in storage like Azure Data Lake, Azure Files, etc., and to schematized data like columns in Azure SQL DB, Cosmos DB, etc.

Sensitivity labels are tags that you can apply on assets to classify and protect your data. Learn more about [sensitivity labels here](/microsoft-365/compliance/create-sensitivity-labels).
-## How to apply labels to assets in Azure Purview
+## How to apply labels to assets in Microsoft Purview
-Being able to apply labels to your asset in Azure Purview requires you to perform the following steps:
+Being able to apply labels to your asset in Microsoft Purview requires you to perform the following steps:
-1. [Create or extend existing sensitivity labels to Azure Purview](how-to-automatically-label-your-content.md), in the Microsoft 365 compliance center. Creating sensitivity labels include autolabeling rules that tell us which label should be applied based on the classifications found in your data.
-1. [Register and scan your asset](how-to-automatically-label-your-content.md#scan-your-data-to-apply-sensitivity-labels-automatically) in Azure Purview.
-1. Azure Purview applies classifications: When you schedule a scan on an asset, Azure Purview scans the type of data in your asset and applies classifications to it in the data catalog. Application of classifications is done automatically by Azure Purview, there is no action for you.
-1. Azure Purview applies labels: Once classifications are found on an asset, Azure Purview will apply labels to the assets depending on autolabeling rules. Application of labels is done automatically by Azure Purview, there is no action for you as long as you have created labels with autolabeling rules in step 1.
+1. [Create or extend existing sensitivity labels to Microsoft Purview](how-to-automatically-label-your-content.md), in the Microsoft 365 compliance center. Creating sensitivity labels includes defining autolabeling rules that tell us which label should be applied based on the classifications found in your data.
+1. [Register and scan your asset](how-to-automatically-label-your-content.md#scan-your-data-to-apply-sensitivity-labels-automatically) in Microsoft Purview.
+1. Microsoft Purview applies classifications: When you schedule a scan on an asset, Microsoft Purview scans the type of data in your asset and applies classifications to it in the data catalog. Classifications are applied automatically by Microsoft Purview; no action is needed from you.
+1. Microsoft Purview applies labels: Once classifications are found on an asset, Microsoft Purview will apply labels to the assets depending on autolabeling rules. Labels are applied automatically by Microsoft Purview; no action is needed from you as long as you have created labels with autolabeling rules in step 1.
> [!NOTE]
> Autolabeling rules are conditions that you specify, stating when a particular label should be applied. When these conditions are met, the label is automatically assigned to the data. When you create your labels, make sure to define autolabeling rules for both files and database columns to apply your labels automatically with each scan.
Being able to apply labels to your asset in Azure Purview requires you to perfor
## Supported data sources
-Sensitivity labels are supported in Azure Purview for the following data sources:
+Sensitivity labels are supported in Microsoft Purview for the following data sources:
|Data type |Sources |
|||
Sensitivity labels are supported in Azure Purview for the following data sources
## Labeling for SQL databases
-In addition to Azure Purview labeling for schematized data assets, Microsoft also supports labeling for SQL database columns using the SQL data classification in [SQL Server Management Studio (SSMS)](/sql/ssms/sql-server-management-studio-ssms). While Azure Purview uses the global [sensitivity labels](/microsoft-365/compliance/sensitivity-labels), SSMS only uses labels defined locally.
+In addition to Microsoft Purview labeling for schematized data assets, Microsoft also supports labeling for SQL database columns using the SQL data classification in [SQL Server Management Studio (SSMS)](/sql/ssms/sql-server-management-studio-ssms). While Microsoft Purview uses the global [sensitivity labels](/microsoft-365/compliance/sensitivity-labels), SSMS only uses labels defined locally.
-Labeling in Azure Purview and labeling in SSMS are separate processes that do not currently interact with each other. Therefore, **labels applied in SSMS are not shown in Azure Purview, and vice versa**. We recommend Azure Purview for labeling SQL databases, as it uses global MIP labels that can be applied across multiple platforms.
+Labeling in Microsoft Purview and labeling in SSMS are separate processes that do not currently interact with each other. Therefore, **labels applied in SSMS are not shown in Microsoft Purview, and vice versa**. We recommend Microsoft Purview for labeling SQL databases, as it uses global MIP labels that can be applied across multiple platforms.
For more information, see the [SQL data discovery and classification documentation](/sql/relational-databases/security/sql-data-discovery-and-classification).
purview Deployment Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/deployment-best-practices.md
Title: 'Deployment best practices'
-description: This article provides best practices for deploying Azure Purview. Azure Purview enables any user to register, discover, understand, and consume data sources.
+description: This article provides best practices for deploying Microsoft Purview. Microsoft Purview enables any user to register, discover, understand, and consume data sources.
Last updated 11/23/2020
-# Azure Purview deployment best practices
+# Microsoft Purview deployment best practices
-This article identifies common tasks that can help you deploy Azure Purview into production. These tasks can be completed in phases, over the course of a month or more. Even organizations who have already deployed Azure Purview can use this guide to ensure they're getting the most out of their investment.
+This article identifies common tasks that can help you deploy Microsoft Purview into production. These tasks can be completed in phases, over the course of a month or more. Even organizations that have already deployed Microsoft Purview can use this guide to ensure they're getting the most out of their investment.
-A well-planned deployment of a data governance platform (such as Azure Purview), can give the following benefits:
+A well-planned deployment of a data governance platform (such as Microsoft Purview) can give the following benefits:
- Better data discovery
- Improved analytic collaboration
A well-planned deployment of a data governance platform (such as Azure Purview),
## Prerequisites
- Access to Microsoft Azure with a development or production subscription
-- Ability to create Azure resources including Azure Purview
+- Ability to create Azure resources including Microsoft Purview
- Access to data sources such as Azure Data Lake Storage or Azure SQL in test, development, or production environments
  - For Data Lake Storage, the required role to scan is Reader Role
  - For SQL, the identity must be able to query tables for sampling of classifications
The general approach is to break down those overarching objectives into various
Once your organization agrees on the high-level objectives and goals, there will be many questions from multiple groups. It's crucial to gather these questions in order to craft a plan to address all of the concerns. Some example questions that you may run into during the initial phase:
1. What are the main organization data sources and data systems?
-2. For data sources that are not supported yet by Azure Purview, what are my options?
-3. How many Azure Purview instances do we need?
+2. For data sources that are not supported yet by Microsoft Purview, what are my options?
+3. How many Microsoft Purview instances do we need?
4. Who are the users?
5. Who can scan new data sources?
-6. Who can modify content inside of Azure Purview?
-7. What process can I use to improve the data quality in Azure Purview?
+6. Who can modify content inside of Microsoft Purview?
+7. What process can I use to improve the data quality in Microsoft Purview?
8. How to bootstrap the platform with existing critical assets, glossary terms, and contacts?
9. How to integrate with existing systems?
10. How to gather feedback and build a sustainable process?
While you might not have the answer to most of these questions right away, it ca
## Include the right stakeholders
-To ensure the success of implementing Azure Purview for the entire enterprise, it's important to involve the right stakeholders. Only a few people are involved in the initial phase. However, as the scope expands, you will require additional personas to contribute to the project and provide feedback.
+To ensure the success of implementing Microsoft Purview for the entire enterprise, it's important to involve the right stakeholders. Only a few people are involved in the initial phase. However, as the scope expands, you will require additional personas to contribute to the project and provide feedback.
Some key stakeholders that you may want to include:

|Persona|Roles|
|||
-|**Chief Data Officer**|The CDO oversees a range of functions that may include data management, data quality, master data management, data science, business intelligence, and creating data strategy. They can be the sponsor of the Azure Purview implementation project.|
+|**Chief Data Officer**|The CDO oversees a range of functions that may include data management, data quality, master data management, data science, business intelligence, and creating data strategy. They can be the sponsor of the Microsoft Purview implementation project.|
|**Domain/Business Owner**|A business person who influences usage of tools and has budget control|
|**Data Analyst**|Able to frame a business problem and analyze data to help leaders make business decisions|
|**Data Architect**|Design databases for mission-critical line-of-business apps along with designing and implementing data security|
Some key stakeholders that you may want to include:
|**Data Scientist**|Build analytical models and set up data products to be accessed by APIs|
|**DB Admin**|Own, track, and resolve database-related incidents and requests within service-level agreements (SLAs); May set up data pipelines|
|**DevOps**|Line-of-Business application development and implementation; may include writing scripts and orchestration capabilities|
-|**Data Security Specialist**|Assess overall network and data security, which involves data coming in and out of Azure Purview|
+|**Data Security Specialist**|Assess overall network and data security, which involves data coming in and out of Microsoft Purview|
## Identify key scenarios
-Azure Purview can be used to centrally manage data governance across an organization's data estate spanning cloud and on-premises environments. To have a successful implementation, you must identify key scenarios that are critical to the business. These scenarios can cross business unit boundaries or impact multiple user personas either upstream or downstream.
+Microsoft Purview can be used to centrally manage data governance across an organization's data estate spanning cloud and on-premises environments. To have a successful implementation, you must identify key scenarios that are critical to the business. These scenarios can cross business unit boundaries or impact multiple user personas either upstream or downstream.
These scenarios can be written up in various ways, but you should include at least these five dimensions:
1. Persona – Who are the users?
2. Source system – What are the data sources such as Azure Data Lake Storage Gen2 or Azure SQL Database?
3. Impact Area – What is the category of this scenario?
-4. Detail scenarios – How the users use Azure Purview to solve problems?
+4. Detail scenarios – How do the users use Microsoft Purview to solve problems?
5. Expected outcome – What are the success criteria?

The scenarios must be specific, actionable, and executable with measurable results. Some example scenarios that you can use:
The scenarios must be specific, actionable, and executable with measurable resul
|Discover business-critical assets|I need to have a search engine that can search through all metadata in the catalog. I should be able to search using technical term, business term with either simple or complex search using wildcard.|Business Analyst, Data Scientist, Data Engineer, Data Admin|
|Track data to understand its origin and troubleshoot data issues|I need to have data lineage to track data in reports, predictions, or models back to its original source and understand the changes and where the data has resided through the data life cycle. This scenario needs to support prioritized data pipelines Azure Data Factory and Databricks.|Data Engineer, Data Scientist|
|Enrich metadata on critical data assets|I need to enrich the data set in the catalog with technical metadata that is generated automatically. Classification and labeling are some examples.|Data Engineer, Domain/Business Owner|
-|Govern data assets with friendly user experience|I need to have a Business glossary for business-specific metadata. The business users can use Azure Purview for self-service scenarios to annotate their data and enable the data to be discovered easily via search.|Domain/Business Owner, Business Analyst, Data Scientist, Data Engineer|
+|Govern data assets with friendly user experience|I need to have a Business glossary for business-specific metadata. The business users can use Microsoft Purview for self-service scenarios to annotate their data and enable the data to be discovered easily via search.|Domain/Business Owner, Business Analyst, Data Scientist, Data Engineer|
## Deployment models
-If you have only one small group using Azure Purview with basic consumption use cases, the approach could be as simple as having one Azure Purview instance to service the entire group. However, you may also wonder whether your organization needs more than one Azure Purview instance. And if using multiple Azure Purview instances, how can employees promote the assets from one stage to another.
+If you have only one small group using Microsoft Purview with basic consumption use cases, the approach could be as simple as having one Microsoft Purview instance to service the entire group. However, you may also wonder whether your organization needs more than one Microsoft Purview instance. And if you use multiple Microsoft Purview instances, how can employees promote the assets from one stage to another?
-### Determine the number of Azure Purview instances
+### Determine the number of Microsoft Purview instances
-In most cases, there should only be one Azure Purview account for the entire organization. This approach takes maximum advantage of the "network effects" where the value of the platform increases exponentially as a function of the data that resides inside the platform.
+In most cases, there should only be one Microsoft Purview account for the entire organization. This approach takes maximum advantage of the "network effects" where the value of the platform increases exponentially as a function of the data that resides inside the platform.
However, there are exceptions to this pattern:
1. **Testing new configurations** – Organizations may want to create multiple instances for testing out scan configurations or classifications in isolated environments. Although there is a "versioning" feature in some areas of the platform such as glossary, it would be easier to have a "disposable" instance to freely test.
2. **Separating Test, Pre-production and Production** – Organizations want to create different platforms for different kinds of data stored in different environments. It is not recommended as those kinds of data are different content types. You could use glossary term at the top hierarchy level or category to segregate content types.
-3. **Conglomerates and federated model** – Conglomerates often have many business units (BUs) that operate separately, and, in some cases, they won't even share billing with each other. In those cases, the organization will end up creating an Azure Purview instance for each BU. This model is not ideal, but may be necessary, especially because BUs are often not willing to share billing.
-4. **Compliance** – There are some strict compliance regimes, which treat even metadata as sensitive and require it to be in a specific geography. If a company has multiple geographies, the only solution is to have multiple Azure Purview instances, one for each geography.
+3. **Conglomerates and federated model** – Conglomerates often have many business units (BUs) that operate separately, and, in some cases, they won't even share billing with each other. In those cases, the organization will end up creating a Microsoft Purview instance for each BU. This model is not ideal, but may be necessary, especially because BUs are often not willing to share billing.
+4. **Compliance** – There are some strict compliance regimes, which treat even metadata as sensitive and require it to be in a specific geography. If a company has multiple geographies, the only solution is to have multiple Microsoft Purview instances, one for each geography.
### Create a process to move to production
-Some organizations may decide to keep things simple by working with a single production version of Azure Purview. They probably don't need to go beyond discovery, search, and browse scenarios. If some assets have incorrect glossary terms, it's quite forgiving to let people self-correct. However, most organizations that want to deploy Azure Purview across various business units will want to have some form of process and control.
+Some organizations may decide to keep things simple by working with a single production version of Microsoft Purview. They probably don't need to go beyond discovery, search, and browse scenarios. If some assets have incorrect glossary terms, it's quite forgiving to let people self-correct. However, most organizations that want to deploy Microsoft Purview across various business units will want to have some form of process and control.
-Another important aspect to include in your production process is how classifications and labels can be migrated. Azure Purview has over 90 system classifiers. You can apply system or custom classifications on file, table, or column assets. Classifications are like subject tags and are used to mark and identify content of a specific type found within your data estate during scanning. Sensitivity labels are used to identify the categories of classification types within your organizational data, and then group the policies you wish to apply to each category. It makes use of the same sensitive information types as Microsoft 365, allowing you to stretch your existing security policies and protection across your entire content and data estate. It can scan and automatically classify documents. For example, if you have a file named multiple.docx and it has a National ID number in its content, Azure Purview will add classification such as EU National Identification Number in the Asset Detail page.
+Another important aspect to include in your production process is how classifications and labels can be migrated. Microsoft Purview has over 90 system classifiers. You can apply system or custom classifications on file, table, or column assets. Classifications are like subject tags and are used to mark and identify content of a specific type found within your data estate during scanning. Sensitivity labels are used to identify the categories of classification types within your organizational data, and then group the policies you wish to apply to each category. It makes use of the same sensitive information types as Microsoft 365, allowing you to stretch your existing security policies and protection across your entire content and data estate. It can scan and automatically classify documents. For example, if you have a file named multiple.docx and it has a National ID number in its content, Microsoft Purview will add a classification such as EU National Identification Number in the Asset Detail page.
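As a rough illustration of how scanning-time classification works, the sketch below maps patterns to classification names and reports which ones match a text sample. The regexes are deliberately simplified stand-ins for demonstration, not Microsoft Purview's actual classifier logic.

```python
# Toy classifier sketch: map regex patterns to classification names and
# report which ones match a piece of sampled text. The patterns here are
# simplified stand-ins, not Purview's real classification rules.
import re

CLASSIFIERS = {
    # Assumed, simplified shape: a 9-12 digit run standing in for a national ID.
    "EU National Identification Number": re.compile(r"\b\d{9,12}\b"),
    "Email Address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(sample_text: str) -> list:
    """Return the classification names whose patterns match the sample."""
    return [name for name, pattern in CLASSIFIERS.items() if pattern.search(sample_text)]

print(classify("Contact jane@contoso.com, ID 123456789"))
```

A real scanner applies many such detectors to sampled rows and files, then attaches the matching classifications to the asset in the catalog.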
-In Azure Purview, there are several areas where the Catalog Administrators need to ensure consistency and maintenance best practices over its life cycle:
+In Microsoft Purview, there are several areas where the Catalog Administrators need to ensure consistency and follow maintenance best practices over the catalog's life cycle:
-* **Data assets** – Data sources will need to be rescanned across environments. It's not recommended to scan only in development and then regenerate them using APIs in Production. The main reason is that the Azure Purview scanners do a lot more "wiring" behind the scenes on the data assets, which could be complex to move them to a different Azure Purview instance. It's much easier to just add the same data source in production and scan the sources again. The general best practice is to have documentation of all scans, connections, and authentication mechanisms being used.
+* **Data assets** – Data sources will need to be rescanned across environments. It's not recommended to scan only in development and then regenerate them using APIs in Production. The main reason is that the Microsoft Purview scanners do a lot more "wiring" behind the scenes on the data assets, which could be complex to move them to a different Microsoft Purview instance. It's much easier to just add the same data source in production and scan the sources again. The general best practice is to have documentation of all scans, connections, and authentication mechanisms being used.
* **Scan rule sets** – This is your collection of rules assigned to specific scan such as file type and classifications to detect. If you don't have that many scan rule sets, it's possible to just re-create them manually again via Production. This will require an internal process and good documentation. However, if your rule sets change on a daily or weekly basis, this could be addressed by exploring the REST API route.
* **Custom classifications** – Your classifications may not also change on a regular basis. During the initial phase of deployment, it may take some time to understand various requirements to come up with custom classifications. However, once settled, this will require little change. So the recommendation here is to manually migrate any custom classifications over or use the REST API.
* **Glossary** – It's possible to export and import glossary terms via the UX. For automation scenarios, you can also use the REST API.
-* **Resource set pattern policies** – This functionality is very advanced for any typical organizations to apply. In some cases, your Azure Data Lake Storage has folder naming conventions and specific structure that may cause problems for Azure Purview to generate the resource set. Your business unit may also want to change the resource set construction with additional customizations to fit the business needs. For this scenario, it's best to keep track of all changes via REST API, and document the changes through external versioning platform.
-* **Role assignment** – This is where you control who has access to Azure Purview and which permissions they have. Azure Purview also has REST API to support export and import of users and roles but this is not Atlas API-compatible. The recommendation is to assign an Azure Security Group and manage the group membership instead.
+* **Resource set pattern policies** – This functionality is very advanced for any typical organization to apply. In some cases, your Azure Data Lake Storage has folder naming conventions and a specific structure that may cause problems for Microsoft Purview to generate the resource set. Your business unit may also want to change the resource set construction with additional customizations to fit the business needs. For this scenario, it's best to keep track of all changes via the REST API, and document the changes through an external versioning platform.
+* **Role assignment** – This is where you control who has access to Microsoft Purview and which permissions they have. Microsoft Purview also has a REST API to support export and import of users and roles, but this is not Atlas API-compatible. The recommendation is to assign an Azure Security Group and manage the group membership instead.
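Several of the areas above point to the REST API route. As one minimal sketch of that route, the snippet below reads glossary terms through the catalog's Atlas v2 glossary endpoint. The account name and bearer token are placeholders; a real token would come from Azure AD (for example, via azure-identity's `DefaultAzureCredential`).

```python
# Sketch: reading glossaries through the Purview catalog's Atlas v2
# glossary endpoint. Account name and token are placeholders.
import json
import urllib.request

def glossary_url(account_name: str) -> str:
    """Build the Atlas v2 glossary endpoint URL for a Purview account."""
    return f"https://{account_name}.purview.azure.com/catalog/api/atlas/v2/glossary"

def list_glossaries(account_name: str, token: str) -> list:
    """GET all glossaries and return the parsed JSON response."""
    request = urllib.request.Request(
        glossary_url(account_name),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

if __name__ == "__main__":
    # Replace with a real account name and AAD token before calling list_glossaries.
    print(glossary_url("contoso-purview"))
```

The same endpoint family supports term export and import, which is one way to keep glossaries consistent across Test, Pre-production, and Production instances.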
-### Plan and implement different integration points with Azure Purview
+### Plan and implement different integration points with Microsoft Purview
-It's likely that a mature organization already has an existing data catalog. The key question is whether to continue to use the existing technology and sync with Azure Purview or not. To handle syncing with existing products in an organization, Azure Purview provides Atlas REST APIs. Atlas APIs provide a powerful and flexible mechanism handling both push and pull scenarios. Information can be published to Azure Purview using Atlas APIs for bootstrapping or to push latest updates from another system into Azure Purview. The information available in Azure Purview can also be read using Atlas APIs and then synced back to existing products.
+It's likely that a mature organization already has an existing data catalog. The key question is whether to continue to use the existing technology and sync with Microsoft Purview or not. To handle syncing with existing products in an organization, Microsoft Purview provides Atlas REST APIs. Atlas APIs provide a powerful and flexible mechanism handling both push and pull scenarios. Information can be published to Microsoft Purview using Atlas APIs for bootstrapping or to push latest updates from another system into Microsoft Purview. The information available in Microsoft Purview can also be read using Atlas APIs and then synced back to existing products.
-For other integration scenarios such as ticketing, custom user interface, and orchestration you can use Atlas APIs and Kafka endpoints. In general, there are four integration points with Azure Purview:
+For other integration scenarios such as ticketing, custom user interface, and orchestration, you can use Atlas APIs and Kafka endpoints. In general, there are four integration points with Microsoft Purview:
-* **Data Asset** – This enables Azure Purview to scan a store's assets in order to enumerate what those assets are and collect any readily available metadata about them. So for SQL this could be a list of DBs, tables, stored procedures, views and config data about them kept in places like `sys.tables`. For something like Azure Data Factory (ADF) this could be enumerating all the pipelines and getting data on when they were created, last run, current state.
-* **Lineage** – This enables Azure Purview to collect information from an analysis/data mutation system on how data is moving around. For something like Spark this could be gathering information from the execution of a notebook to see what data the notebook ingested, how it transformed it and where it outputted it. For something like SQL, it could be analyzing query logs to reverse engineer what mutation operations were executed and what they did. We support both push and pull based lineage depending on the needs.
-* **Classification** – This enables Azure Purview to take physical samples from data sources and run them through our classification system. The classification system figures out the semantics of a piece of data. For example, we may know that a file is a Parquet file and has three columns and the third one is a string. But the classifiers we run on the samples will tell us that the string is a name, address, or phone number. Lighting up this integration point means that we have defined how Azure Purview can open up objects like notebooks, pipelines, parquet files, tables, and containers.
-* **Embedded Experience** – Products that have a "studio" like experience (such as ADF, Synapse, SQL Studio, PBI, and Dynamics) usually want to enable users to discover data they want to interact with and also find places to output data. Azure Purview's catalog can help to accelerate these experiences by providing an embedding experience. This experience can occur at the API or the UX level at the partner's option. By embedding a call to Azure Purview, the organization can take advantage of Azure Purview's map of the data estate to find data assets, see lineage, check schemas, look at ratings, contacts, etc.
+* **Data Asset** – This enables Microsoft Purview to scan a store's assets in order to enumerate what those assets are and collect any readily available metadata about them. So for SQL this could be a list of DBs, tables, stored procedures, views and config data about them kept in places like `sys.tables`. For something like Azure Data Factory (ADF) this could be enumerating all the pipelines and getting data on when they were created, last run, current state.
+* **Lineage** – This enables Microsoft Purview to collect information from an analysis/data mutation system on how data is moving around. For something like Spark this could be gathering information from the execution of a notebook to see what data the notebook ingested, how it transformed it and where it outputted it. For something like SQL, it could be analyzing query logs to reverse engineer what mutation operations were executed and what they did. We support both push and pull based lineage depending on the needs.
+* **Classification** – This enables Microsoft Purview to take physical samples from data sources and run them through our classification system. The classification system figures out the semantics of a piece of data. For example, we may know that a file is a Parquet file and has three columns and the third one is a string. But the classifiers we run on the samples will tell us that the string is a name, address, or phone number. Lighting up this integration point means that we have defined how Microsoft Purview can open up objects like notebooks, pipelines, parquet files, tables, and containers.
+* **Embedded Experience** – Products that have a "studio" like experience (such as ADF, Synapse, SQL Studio, PBI, and Dynamics) usually want to enable users to discover data they want to interact with and also find places to output data. Microsoft Purview's catalog can help to accelerate these experiences by providing an embedding experience. This experience can occur at the API or the UX level at the partner's option. By embedding a call to Microsoft Purview, the organization can take advantage of Microsoft Purview's map of the data estate to find data assets, see lineage, check schemas, look at ratings, contacts, etc.
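To make the push direction concrete, here is a minimal sketch of the body an Atlas v2 create-or-update entity call expects. The type name and qualified name are illustrative placeholders, not taken from a specific typedef.

```python
# Sketch: the minimal body for an Atlas v2 "create or update entity" call,
# as used when pushing an asset into the catalog from another system.
# Type name and qualified name below are illustrative placeholders.
def make_entity_payload(type_name: str, qualified_name: str, display_name: str) -> dict:
    """Build the minimal Atlas v2 entity body for a create-or-update POST."""
    return {
        "entity": {
            "typeName": type_name,
            "attributes": {
                "qualifiedName": qualified_name,
                "name": display_name,
            },
        }
    }

# POSTing this body to
#   https://{account}.purview.azure.com/catalog/api/atlas/v2/entity
# with an Azure AD bearer token would create or update the asset.
payload = make_entity_payload(
    "azure_sql_table",
    "mssql://contoso.database.windows.net/sales/dbo/Orders",
    "Orders",
)
print(payload["entity"]["attributes"]["name"])
```

The pull direction is the mirror image: GET the same entity resource by GUID or qualified name and sync the attributes back into the existing catalog.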
## Phase 1: Pilot
-In this phase, Azure Purview must be created and configured for a very small set of users. Usually, it is just a group of 2-3 people working together to run through end-to-end scenarios. They are considered the advocates of Azure Purview in their organization. The main goal of this phase is to ensure key functionalities can be met and the right stakeholders are aware of the project.
+In this phase, Microsoft Purview must be created and configured for a very small set of users. Usually, it is just a group of 2-3 people working together to run through end-to-end scenarios. They are considered the advocates of Microsoft Purview in their organization. The main goal of this phase is to ensure key functionalities can be met and the right stakeholders are aware of the project.
### Tasks to complete

|Task|Detail|Duration|
||||
|Gather & agree on requirements|Discussion with all stakeholders to gather a full set of requirements. Different personas must participate to agree on a subset of requirements to complete for each phase of the project.|1 Week|
-|Navigating Azure Purview|Understand how to use Azure Purview from the home page.|1 Day|
+|Navigating Microsoft Purview|Understand how to use Microsoft Purview from the home page.|1 Day|
|Configure ADF for lineage|Identify key pipelines and data assets. Gather all information required to connect to an internal ADF account.|1 Day|
|Scan a data source such as Azure Data Lake Storage|Add the data source and set up a scan. Ensure the scan successfully detects all assets.|2 Days|
-|Search and browse|Allow end users to access Azure Purview and perform end-to-end search and browse scenarios.|1 Day|
+|Search and browse|Allow end users to access Microsoft Purview and perform end-to-end search and browse scenarios.|1 Day|
### Acceptance criteria
-* Azure Purview account is created successfully in organization subscription under the organization tenant.
-* A small group of users with multiple roles can access Azure Purview.
-* Azure Purview is configured to scan at least one data source.
-* Users should be able to extract key values of Azure Purview such as:
+* Microsoft Purview account is created successfully in the organization's subscription under the organization's tenant.
+* A small group of users with multiple roles can access Microsoft Purview.
+* Microsoft Purview is configured to scan at least one data source.
+* Users should be able to extract key values of Microsoft Purview such as:
  * Search and browse
  * Lineage
* Users should be able to assign asset ownership in the asset page.
In this phase, Azure Purview must be created and configured for a very small set
## Phase 2: Minimum viable product
-Once you have the agreed requirements and participated business units to onboard Azure Purview, the next step is to work on a Minimum Viable Product (MVP) release. In this phase, you will expand the usage of Azure Purview to more users who will have additional needs horizontally and vertically. There will be key scenarios that must be met horizontally for all users such as glossary terms, search, and browse. There will also be in-depth requirements vertically for each business unit or group to cover specific end-to-end scenarios such as lineage from Azure Data Lake Storage to Azure Synapse DW to Power BI.
+Once you have the agreed requirements and participating business units to onboard Microsoft Purview, the next step is to work on a Minimum Viable Product (MVP) release. In this phase, you will expand the usage of Microsoft Purview to more users who will have additional needs horizontally and vertically. There will be key scenarios that must be met horizontally for all users such as glossary terms, search, and browse. There will also be in-depth requirements vertically for each business unit or group to cover specific end-to-end scenarios such as lineage from Azure Data Lake Storage to Azure Synapse DW to Power BI.
### Tasks to complete

|Task|Detail|Duration|
||||
|[Scan Azure Synapse Analytics](register-scan-azure-synapse-analytics.md)|Start to onboard your database sources and scan them to populate key assets|2 Days|
-|[Create custom classifications and rules](create-a-custom-classification-and-classification-rule.md)|Once your assets are scanned, your users may realize that there are additional use cases for more classification beside the default classifications from Azure Purview.|2-4 Weeks|
+|[Create custom classifications and rules](create-a-custom-classification-and-classification-rule.md)|Once your assets are scanned, your users may realize that there are additional use cases for more classification beside the default classifications from Microsoft Purview.|2-4 Weeks|
|[Scan Power BI](register-scan-power-bi-tenant.md)|If your organization uses Power BI, you can scan Power BI in order to gather all data assets being used by Data Scientists or Data Analysts which have requirements to include lineage from the storage layer.|1-2 Weeks|
-|[Import glossary terms](how-to-create-import-export-glossary.md)|In most cases, your organization may already develop a collection of glossary terms and term assignment to assets. This will require an import process into Azure Purview via .csv file.|1 Week|
+|[Import glossary terms](how-to-create-import-export-glossary.md)|In most cases, your organization may already develop a collection of glossary terms and term assignment to assets. This will require an import process into Microsoft Purview via .csv file.|1 Week|
|Add contacts to assets|For top assets, you may want to establish a process to either allow other personas to assign contacts or import via REST APIs.|1 Week|
|Add sensitive labels and scan|This might be optional for some organizations, depending on the usage of Labeling from Microsoft 365.|1-2 Weeks|
-|Get classification and sensitive insights|For reporting and insight in Azure Purview, you can access this functionality to get various reports and provide presentation to management.|1 Day|
-|Onboard additional users using Azure Purview managed users|This step will require the Azure Purview Admin to work with the Azure Active Directory Admin to establish new Security Groups to grant access to Azure Purview.|1 Week|
+|Get classification and sensitive insights|For reporting and insight in Microsoft Purview, you can access this functionality to get various reports and provide presentation to management.|1 Day|
+|Onboard additional users using Microsoft Purview managed users|This step will require the Microsoft Purview Admin to work with the Azure Active Directory Admin to establish new Security Groups to grant access to Microsoft Purview.|1 Week|
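The glossary-import task in the table above expects a .csv file. A minimal sketch of generating one follows; the column names here are illustrative assumptions only — export the glossary template from your own account to get the authoritative header row before importing:

```python
import csv
import io

# Illustrative columns -- verify against the template exported from
# your own Microsoft Purview account before a real import.
COLUMNS = ["Name", "Status", "Definition", "Experts", "Stewards"]

terms = [
    {"Name": "Customer ID", "Status": "Approved",
     "Definition": "Unique identifier for a customer record.",
     "Experts": "", "Stewards": ""},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()          # header row first
writer.writerows(terms)       # one row per glossary term
print(buf.getvalue())
```

Writing the file with the `csv` module (rather than string concatenation) keeps quoting correct when definitions contain commas.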
### Acceptance criteria
-* Successfully onboard a larger group of users to Azure Purview (50+)
+* Successfully onboard a larger group of users to Microsoft Purview (50+)
* Scan business critical data sources
* Import and assign all critical glossary terms
* Successfully test important labeling on key assets
Once you have the agreed requirements and participated business units to onboard
## Phase 3: Pre-production
-Once the MVP phase has passed, it's time to plan for the pre-production milestone. Your organization may decide to have a separate instance of Azure Purview for pre-production and production, or keep the same instance but restrict access. Also in this phase, you may want to include scanning on on-premises data sources such as SQL Server. If there is any gap in data sources not supported by Azure Purview, it is time to explore the Atlas API to understand additional options.
+Once the MVP phase has passed, it's time to plan for the pre-production milestone. Your organization may decide to have a separate instance of Microsoft Purview for pre-production and production, or keep the same instance but restrict access. Also in this phase, you may want to include scanning on on-premises data sources such as SQL Server. If there is any gap in data sources not supported by Microsoft Purview, it is time to explore the Atlas API to understand additional options.
### Tasks to complete
|Task|Detail|Duration|
|---|---|---|
|Refine your scan using scan rule set|Your organization will have a lot of data sources for pre-production. It's important to pre-define key criteria for scanning so that classifications and file extensions can be applied consistently across the board.|1-2 Days|
|Assess region availability for scan|Depending on the region of the data sources and organizational requirements on compliance and security, you may want to consider what regions must be available for scanning.|1 Day|
-|Understand firewall concept when scanning|This step requires some exploration of how the organization configures its firewall and how Azure Purview can authenticate itself to access the data sources for scanning.|1 Day|
+|Understand firewall concept when scanning|This step requires some exploration of how the organization configures its firewall and how Microsoft Purview can authenticate itself to access the data sources for scanning.|1 Day|
|Understand Private Link concept when scanning|If your organization uses Private Link, you must lay out the foundation of network security to include Private Link as a part of the requirements.|1 Day|
|[Scan on-premises SQL Server](register-scan-on-premises-sql-server.md)|This is optional if you have on-premises SQL Server. The scan will require setting up [Self-hosted Integration Runtime](manage-integration-runtimes.md) and adding SQL Server as a data source.|1-2 Weeks|
-|Use Azure Purview REST API for integration scenarios|If you have requirements to integrate Azure Purview with other third-party technologies such as an orchestration or ticketing system, you may want to explore the REST API area.|1-4 Weeks|
-|Understand Azure Purview pricing|This step will provide the organization with important financial information to make decisions.|1-5 Days|
+|Use Microsoft Purview REST API for integration scenarios|If you have requirements to integrate Microsoft Purview with other third-party technologies such as an orchestration or ticketing system, you may want to explore the REST API area.|1-4 Weeks|
+|Understand Microsoft Purview pricing|This step will provide the organization with important financial information to make decisions.|1-5 Days|
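For the REST API exploration task above, a sketch of building an Atlas-style catalog search request follows. The endpoint path, api-version, and body shape are assumptions to illustrate the pattern — verify them against the current Microsoft Purview REST API reference before use; `<Account_Name>` is a placeholder you must supply:

```python
import json

# Placeholder account name -- replace with your Purview account.
account = "<Account_Name>"

# Assumed data-plane search path; confirm path and api-version
# against the official Purview REST API reference.
endpoint = (
    f"https://{account}.purview.azure.com"
    "/catalog/api/search/query?api-version=2021-05-01-preview"
)

# Illustrative search body: keyword query limited to 10 results.
body = {
    "keywords": "customer",
    "limit": 10,
}

print(endpoint)
print(json.dumps(body))
```

Constructing and logging the request locally like this is a cheap way to validate payloads before wiring the call into an orchestration or ticketing integration.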
### Acceptance criteria
Additional hardening steps can be taken:
## Moving tenants
-If your Azure Subscription moves tenants while you have an Azure Purview account, there are some steps you should follow after the move.
+If your Azure Subscription moves tenants while you have a Microsoft Purview account, there are some steps you should follow after the move.
-Currently your Azure Purview account's system assigned and user assigned managed identities will be cleared during the move to the new tenant. This is because your Azure tenant houses all authentication information, so these need to be updated for your Azure Purview account in the new tenant.
+Currently your Microsoft Purview account's system assigned and user assigned managed identities will be cleared during the move to the new tenant. This is because your Azure tenant houses all authentication information, so these need to be updated for your Microsoft Purview account in the new tenant.
After the move, follow the below steps to clear the old identities, and create new ones:
> [!IMPORTANT]
> Be sure to replace these values in the below commands:
> - \<Subscription_Id>: Your Azure Subscription ID
- > - \<Resource_Group_Name>: Name of the resource group where your Azure Purview account is housed.
- > - \<Account_Name>: Your Azure Purview account name
+ > - \<Resource_Group_Name>: Name of the resource group where your Microsoft Purview account is housed.
+ > - \<Account_Name>: Your Microsoft Purview account name
> - \<Access_Token>: The token from the first two steps.

```bash
curl 'https://management.azure.com/subscriptions/<Subscription_Id>/resourceGroups/<Resource_Group_Name>/providers/Microsoft.Purview/accounts/<Account_Name>?api-version=2021-07-01' -X PATCH -d '{"identity":{"type":"SystemAssigned"}}' -H "Content-Type: application/json" -H "Authorization:Bearer <Access_Token>"
```
-1. If you had a user assigned managed identity (UAMI), to enable one on your new tenant, register your UAMI in Azure Purview as you did originally by following [the steps from the manage credentials article](manage-credentials.md#create-a-user-assigned-managed-identity).
+1. If you had a user assigned managed identity (UAMI), to enable one on your new tenant, register your UAMI in Microsoft Purview as you did originally by following [the steps from the manage credentials article](manage-credentials.md#create-a-user-assigned-managed-identity).
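The same PATCH call shown in the curl command can be scripted. The sketch below only *builds* the request without sending it; the angle-bracket values are the same placeholders as above and must be filled in before the (commented-out) send line is used:

```python
import json
import urllib.request

# Placeholders -- substitute your own values, as in the curl example.
subscription = "<Subscription_Id>"
resource_group = "<Resource_Group_Name>"
account = "<Account_Name>"
token = "<Access_Token>"

url = (
    "https://management.azure.com/subscriptions/" + subscription
    + "/resourceGroups/" + resource_group
    + "/providers/Microsoft.Purview/accounts/" + account
    + "?api-version=2021-07-01"
)

# PATCH body that re-enables the system-assigned managed identity.
req = urllib.request.Request(
    url,
    data=json.dumps({"identity": {"type": "SystemAssigned"}}).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer " + token},
    method="PATCH",
)

# urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.get_method(), req.full_url)
```

Building the request object first makes it easy to inspect the URL and body before touching the live ARM endpoint.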
## Next steps
purview Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/disaster-recovery.md
Title: Disaster recovery for Azure Purview
-description: Learn how to configure a disaster recovery environment for Azure Purview.
+ Title: Disaster recovery for Microsoft Purview
+description: Learn how to configure a disaster recovery environment for Microsoft Purview.
Last updated 04/23/2021
-# Disaster recovery for Azure Purview
+# Disaster recovery for Microsoft Purview
-This article explains how to configure a disaster recovery environment for Azure Purview. Azure data center outages are rare, but can last anywhere from a few minutes to hours. Data Center outages can cause disruption to environments that are being relied on for data governance. By following the steps detailed in this article, you can continue to govern your data in the event of a data center outage for the primary region of your Azure Purview account.
+This article explains how to configure a disaster recovery environment for Microsoft Purview. Azure data center outages are rare, but can last anywhere from a few minutes to hours. Data Center outages can cause disruption to environments that are being relied on for data governance. By following the steps detailed in this article, you can continue to govern your data in the event of a data center outage for the primary region of your Microsoft Purview account.
-## Achieve business continuity for Azure Purview
+## Achieve business continuity for Microsoft Purview
-Business continuity and disaster recovery (BCDR) in an Azure Purview instance refers to the mechanisms, policies, and procedures that enable your business to protect against data loss and continue operating in the face of disruption, particularly to its scanning, catalog, and insights tiers. This page explains how to configure a disaster recovery environment for Azure Purview.
+Business continuity and disaster recovery (BCDR) in a Microsoft Purview instance refers to the mechanisms, policies, and procedures that enable your business to protect against data loss and continue operating in the face of disruption, particularly to its scanning, catalog, and insights tiers. This page explains how to configure a disaster recovery environment for Microsoft Purview.
-Today, Azure Purview does not support automated BCDR. Until that support is added, you are responsible for backup and restore activities. You can manually create a secondary Azure Purview account as a warm standby instance in another region.
+Today, Microsoft Purview does not support automated BCDR. Until that support is added, you are responsible for backup and restore activities. You can manually create a secondary Microsoft Purview account as a warm standby instance in another region.
The following steps show how you can achieve disaster recovery manually:
-1. Once the primary Azure Purview account is created in a certain region, you must provision one or more secondary Azure Purview accounts in separate regions from the Azure portal.
+1. Once the primary Microsoft Purview account is created in a certain region, you must provision one or more secondary Microsoft Purview accounts in separate regions from the Azure portal.
-2. All activities performed on the primary Azure Purview account must be carried out on the secondary Azure Purview accounts as well. This includes:
+2. All activities performed on the primary Microsoft Purview account must be carried out on the secondary Microsoft Purview accounts as well. This includes:
- Maintain Account information - Create and maintain custom Scan rule sets, Classifications, and Classification rules
As you plan your manual BCDR plan, keep the following points in mind:

-- You will be charged for primary and secondary Azure Purview accounts.
+- You will be charged for primary and secondary Microsoft Purview accounts.
-- The primary and secondary Azure Purview accounts cannot be configured to the same Azure Data Factory, Azure Data Share and Synapse Analytics accounts, if applicable. As a result, the lineage from Azure Data Factory and Azure Data Share cannot be seen in the secondary Azure Purview accounts. Also, the Synapse Analytics workspace associated with the primary Azure Purview account cannot be associated with secondary Azure Purview accounts. This is a limitation today and will be addressed when automated BCDR is supported.
+- The primary and secondary Microsoft Purview accounts cannot be configured to the same Azure Data Factory, Azure Data Share and Synapse Analytics accounts, if applicable. As a result, the lineage from Azure Data Factory and Azure Data Share cannot be seen in the secondary Microsoft Purview accounts. Also, the Synapse Analytics workspace associated with the primary Microsoft Purview account cannot be associated with secondary Microsoft Purview accounts. This is a limitation today and will be addressed when automated BCDR is supported.
-- The integration runtimes are specific to an Azure Purview account. Hence, if scans must run in primary and secondary Azure Purview accounts in-parallel, multiple self-hosted integration runtimes must be maintained. This limitation will also be addressed when automated BCDR is supported.
+- The integration runtimes are specific to a Microsoft Purview account. Hence, if scans must run in primary and secondary Microsoft Purview accounts in-parallel, multiple self-hosted integration runtimes must be maintained. This limitation will also be addressed when automated BCDR is supported.
-- Parallel execution of scans from both primary and secondary Azure Purview accounts on the same source can affect the performance of the source. This can cause scan durations to vary across the Azure Purview accounts.
+- Parallel execution of scans from both primary and secondary Microsoft Purview accounts on the same source can affect the performance of the source. This can cause scan durations to vary across the Microsoft Purview accounts.
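Because manual BCDR means every change to the primary account must also be applied to each secondary account, it can help to funnel changes through a single fan-out step. This is a hypothetical sketch of that pattern — `apply_change` is a stand-in for whatever REST call or portal action actually performs the change, not a real Purview API:

```python
# Hypothetical warm-standby fan-out: apply each manual change to every
# account so primary and secondary stay in sync.
ACCOUNTS = ["purview-primary", "purview-secondary"]

def apply_change(account: str, change: str) -> str:
    # Stand-in for the real operation (e.g., creating a custom
    # scan rule set or classification rule on the given account).
    return f"{change} -> {account}"

def fan_out(change: str) -> list:
    # Apply the same change to every configured account.
    return [apply_change(a, change) for a in ACCOUNTS]

results = fan_out("create-classification-rule")
print(results)
```

The design point is simply that no change should ever be applied to one account without being recorded and replayed against the others.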
## Related information
## Next steps
-To get started with Azure Purview, see [Create an Azure Purview account](create-catalog-portal.md).
+To get started with Microsoft Purview, see [Create a Microsoft Purview account](create-catalog-portal.md).
purview Glossary Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/glossary-insights.md
Title: Glossary report on your data using Azure Purview Insights
-description: This how-to guide describes how to view and use Azure Purview Insights glossary reporting on your data.
+ Title: Glossary report on your data using Microsoft Purview Insights
+description: This how-to guide describes how to view and use Microsoft Purview Insights glossary reporting on your data.
Last updated 09/27/2021
-# Glossary insights on your data in Azure Purview
+# Glossary insights on your data in Microsoft Purview
-This how-to guide describes how to access, view, and filter Azure Purview Glossary insight reports for your data.
+This how-to guide describes how to access, view, and filter Microsoft Purview Glossary insight reports for your data.
> [!IMPORTANT]
-> Azure Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
In this how-to guide, you'll learn how to:

> [!div class="checklist"]
-> - Go to Insights from your Azure Purview account
+> - Go to Insights from your Microsoft Purview account
> - Get a bird's eye view of your data

## Prerequisites
-Before getting started with Azure Purview insights, make sure that you've completed the following steps:
+Before getting started with Microsoft Purview insights, make sure that you've completed the following steps:
- Set up your Azure resources and populate the account with data
- Set up a glossary and attach assets to glossary terms
-For more information, see [Manage data sources in Azure Purview](manage-data-sources.md).
+For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md).
-## Use Azure Purview Glossary Insights
+## Use Microsoft Purview Glossary Insights
-In Azure Purview, you can create glossary terms and attach them to assets. Later, you can view the glossary distribution in Glossary Insights. This tells you the state of your glossary by terms attached to assets. It also tells you terms by status and distribution of roles by number of users.
+In Microsoft Purview, you can create glossary terms and attach them to assets. Later, you can view the glossary distribution in Glossary Insights. This tells you the state of your glossary by terms attached to assets. It also tells you terms by status and distribution of roles by number of users.
**To view Glossary Insights:**
-1. Go to the **Azure Purview** [instance screen in the Azure portal](https://aka.ms/purviewportal) and select your Azure Purview account.
+1. Go to the **Microsoft Purview** [instance screen in the Azure portal](https://aka.ms/purviewportal) and select your Microsoft Purview account.
-1. On the **Overview** page, in the **Get Started** section, select **Open Azure Purview Studio** account tile.
+1. On the **Overview** page, in the **Get Started** section, select **Open Microsoft Purview Studio** account tile.
- :::image type="content" source="./media/glossary-insights/portal-access.png" alt-text="Launch Azure Purview from the Azure portal":::
+ :::image type="content" source="./media/glossary-insights/portal-access.png" alt-text="Launch Microsoft Purview from the Azure portal":::
-1. On the Azure Purview **Home** page, select **Insights** on the left menu.
+1. On the Microsoft Purview **Home** page, select **Insights** on the left menu.
:::image type="content" source="./media/glossary-insights/view-insights.png" alt-text="View your insights in the Azure portal":::
-1. In the **Insights** area, select **Glossary** to display the Azure Purview **Glossary insights** report.
+1. In the **Insights** area, select **Glossary** to display the Microsoft Purview **Glossary insights** report.
**Glossary Insights** provides you, as a business user, valuable information to maintain a well-defined glossary for your organization.
-1. The report starts with **High-level KPIs** that show ***Total terms*** in your Azure Purview account, ***Approved terms without assets***, and ***Expired terms with assets***. Each of these values will help you identify the health of your Glossary.
+1. The report starts with **High-level KPIs** that show ***Total terms*** in your Microsoft Purview account, ***Approved terms without assets***, and ***Expired terms with assets***. Each of these values will help you identify the health of your Glossary.
:::image type="content" source="./media/glossary-insights/glossary-kpi.png" alt-text="View glossary insights KPI":::
purview How To Automatically Label Your Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-automatically-label-your-content.md
Title: How to automatically apply sensitivity labels to your data in Azure Purview
+ Title: How to automatically apply sensitivity labels to your data in Microsoft Purview
description: Learn how to create sensitivity labels and automatically apply them to your data during a scan.
Last updated 09/27/2021
-# How to automatically apply sensitivity labels to your data in Azure Purview
+# How to automatically apply sensitivity labels to your data in Microsoft Purview
-## Create or extend existing sensitivity labels to Azure Purview
+## Create or extend existing sensitivity labels to Microsoft Purview
> [!IMPORTANT]
-> Azure Purview Sensitivity Labels are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Sensitivity Labels are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
>
-If you don't already have sensitivity labels, you'll need to create them and make them available for Azure Purview. Existing sensitivity labels can also be modified to make them available for Azure Purview.
+If you don't already have sensitivity labels, you'll need to create them and make them available for Microsoft Purview. Existing sensitivity labels can also be modified to make them available for Microsoft Purview.
### Step 1: Licensing requirements
-Sensitivity labels are created and managed in the Microsoft 365 compliance center. To create sensitivity labels for use in Azure Purview, you must have an active Microsoft 365 license which offers the benefit of automatically applying sensitivity labels.
+Sensitivity labels are created and managed in the Microsoft 365 compliance center. To create sensitivity labels for use in Microsoft Purview, you must have an active Microsoft 365 license which offers the benefit of automatically applying sensitivity labels.
-For the full list of licenses, see the [Sensitivity labels in Azure Purview FAQ](sensitivity-labels-frequently-asked-questions.yml). If you do not already have the required license, you can sign up for a trial of [Microsoft 365 E5](https://www.microsoft.com/microsoft-365/business/compliance-solutions#midpagectaregion).
+For the full list of licenses, see the [Sensitivity labels in Microsoft Purview FAQ](sensitivity-labels-frequently-asked-questions.yml). If you do not already have the required license, you can sign up for a trial of [Microsoft 365 E5](https://www.microsoft.com/microsoft-365/business/compliance-solutions#midpagectaregion).
-### Step 2: Consent to use sensitivity labels in Azure Purview
+### Step 2: Consent to use sensitivity labels in Microsoft Purview
-The following steps extend your sensitivity labels and enable them to be available for use in Azure Purview, where you can apply sensitivity labels to files and database columns.
+The following steps extend your sensitivity labels and enable them to be available for use in Microsoft Purview, where you can apply sensitivity labels to files and database columns.
1. In Microsoft 365, navigate to the **Information Protection** page.<br /> If you've recently provisioned your subscription for Information Protection, it may take a few hours for the **Information Protection** page to display.
-1. In the **Extend labeling to assets in Azure Purview** area, select the **Turn on** button, and then select **Yes** in the confirmation dialog that appears.
+1. In the **Extend labeling to assets in Microsoft Purview** area, select the **Turn on** button, and then select **Yes** in the confirmation dialog that appears.
For example:

> [!TIP]
->If you don't see the button, and you're not sure if consent has been granted to extend labeling to assets in Azure Purview, see [this FAQ](sensitivity-labels-frequently-asked-questions.yml#how-can-i-determine-if-consent-has-been-granted-to-extend-labeling-to-azure-purview) item on how to determine the status.
+>If you don't see the button, and you're not sure if consent has been granted to extend labeling to assets in Microsoft Purview, see [this FAQ](sensitivity-labels-frequently-asked-questions.yml#how-can-i-determine-if-consent-has-been-granted-to-extend-labeling-to-microsoft-purview) item on how to determine the status.
>
-After you've extended labeling to assets in Azure Purview, all published sensitivity labels are available for use in Azure Purview.
+After you've extended labeling to assets in Microsoft Purview, all published sensitivity labels are available for use in Microsoft Purview.
### Step 3: Create or modify existing label to automatically label content
For example:
### Step 4: Publish labels
-Once you create a label, you will need to Scan your data in Azure Purview to automatically apply the labels you've created, based on the autolabeling rules you've defined.
+Once you create a label, you will need to Scan your data in Microsoft Purview to automatically apply the labels you've created, based on the autolabeling rules you've defined.
## Scan your data to apply sensitivity labels automatically
-Scan your data in Azure Purview to automatically apply the labels you've created, based on the autolabeling rules you've defined. Allow up to 24 hours for sensitivity label changes to reflect in Azure Purview.
+Scan your data in Microsoft Purview to automatically apply the labels you've created, based on the autolabeling rules you've defined. Allow up to 24 hours for sensitivity label changes to reflect in Microsoft Purview.
-For more information on how to set up scans on various assets in Azure Purview, see:
+For more information on how to set up scans on various assets in Microsoft Purview, see:
|Source |Reference |
|---|---|
## View labels on assets in the catalog
-Once you've defined autolabeling rules for your labels in Microsoft 365 and scanned your data in Azure Purview, labels are automatically applied to your assets.
+Once you've defined autolabeling rules for your labels in Microsoft 365 and scanned your data in Microsoft Purview, labels are automatically applied to your assets.
-**To view the labels applied to your assets in the Azure Purview Catalog:**
+**To view the labels applied to your assets in the Microsoft Purview Catalog:**
-In the Azure Purview Catalog, use the **Label** filtering options to show assets with specific labels only. For example:
+In the Microsoft Purview Catalog, use the **Label** filtering options to show assets with specific labels only. For example:
:::image type="content" source="media/how-to-automatically-label-your-content/filter-search-results-small.png" alt-text="Search for assets by label" lightbox="media/how-to-automatically-label-your-content/filter-search-results.png":::
For example:
## View Insight reports for the classifications and sensitivity labels
-To find insights on your classified and labeled data in Azure Purview, use the **Classification** and **Sensitivity labeling** reports.
+To find insights on your classified and labeled data in Microsoft Purview, use the **Classification** and **Sensitivity labeling** reports.
> [!div class="nextstepaction"]
> [Classification insights](./classification-insights.md)
> [Sensitivity label insights](sensitivity-insights.md)

> [!div class="nextstepaction"]
-> [Overview of Labeling in Azure Purview](create-sensitivity-label.md)
+> [Overview of Labeling in Microsoft Purview](create-sensitivity-label.md)
> [!div class="nextstepaction"] > [Labeling Frequently Asked Questions](sensitivity-labels-frequently-asked-questions.yml)
purview How To Browse Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-browse-catalog.md
Title: 'How to: browse the Data Catalog'
-description: This article gives an overview of how to browse the Azure Purview data catalog by asset type
+description: This article gives an overview of how to browse the Microsoft Purview data catalog by asset type
Last updated 10/01/2021
-# Browse the Azure Purview data catalog
+# Browse the Microsoft Purview data catalog
-Searching a data catalog is a great tool for data discovery if a data consumer knows what they are looking for, but often users don't know exactly how their data estate is structured. The Azure Purview data catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog.
+Searching a data catalog is a great tool for data discovery if a data consumer knows what they are looking for, but often users don't know exactly how their data estate is structured. The Microsoft Purview data catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog.
To access the browse experience, select "Browse assets" from the data catalog home page.

## Browse by collection
Browse by collection allows you to explore the different collections you are a d
:::image type="content" source="media/how-to-browse-catalog/browse-by-collection.png" alt-text="Screenshot showing the browse by collection page" border="true":::
-Once a collection is selected, you will get a list of assets in that collection with the facets and filters available in search. As a collection can have thousands of assets, browse uses the Azure Purview search relevance engine to boost the most important assets to the top.
+Once a collection is selected, you will get a list of assets in that collection with the facets and filters available in search. As a collection can have thousands of assets, browse uses the Microsoft Purview search relevance engine to boost the most important assets to the top.
:::image type="content" source="media/how-to-browse-catalog/browse-collection-results.png" alt-text="Screenshot showing the browse by collection results" border="true":::
A native browsing experience with hierarchical namespace is provided for each co
- [How to create, import, and export glossary terms](how-to-create-import-export-glossary.md)
- [How to manage term templates for business glossary](how-to-manage-term-templates.md)
-- [How to search the Azure Purview data catalog](how-to-search-catalog.md)
+- [How to search the Microsoft Purview data catalog](how-to-search-catalog.md)
purview How To Bulk Edit Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-bulk-edit-assets.md
Title: How to bulk edit assets to tag classifications, glossary terms and modify contacts
-description: Learn bulk edit assets in Azure Purview.
+description: Learn bulk edit assets in Microsoft Purview.
This article describes how to tag glossary terms, classifications, owners and ex
## Select assets to bulk edit
-1. Use Azure Purview search or browse to discover assets you wish to edit.
+1. Use Microsoft Purview search or browse to discover assets you wish to edit.
1. In the search results, if you focus on an asset a checkbox appears.
purview How To Certify Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-certify-assets.md
Title: Asset certification in the Azure Purview data catalog
-description: How to certify assets in the Azure Purview data catalog
+ Title: Asset certification in the Microsoft Purview data catalog
+description: How to certify assets in the Microsoft Purview data catalog
Last updated 02/24/2022
-# Asset certification in the Azure Purview data catalog
+# Asset certification in the Microsoft Purview data catalog
-As an Azure Purview data catalog grows in size, it becomes important for data consumers to understand what assets they can trust. Data consumers must know if an asset meet their organization's quality standards and can be regarded as reliable. Azure Purview allows data stewards to manually endorse assets to indicate that they're ready to use across an organization or business unit. This article describes how data stewards can certify assets and data consumers can view certification labels.
+As a Microsoft Purview data catalog grows in size, it becomes important for data consumers to understand what assets they can trust. Data consumers must know whether an asset meets their organization's quality standards and can be regarded as reliable. Microsoft Purview allows data stewards to manually endorse assets to indicate that they're ready to use across an organization or business unit. This article describes how data stewards can certify assets and how data consumers can view certification labels.
## How to certify an asset
To certify an asset, you must be a **data curator** for the collection containin
:::image type="content" source="media/how-to-certify-assets/view-certified-asset.png" alt-text="An asset with a certified label" border="true":::

> [!NOTE]
-> PowerBI assets can only be [certified in a PowerBI workspace](/power-bi/collaborate-share/service-endorse-content). PowerBI endorsement labels are displayed in Azure Purview's search and browse experiences.
+> PowerBI assets can only be [certified in a PowerBI workspace](/power-bi/collaborate-share/service-endorse-content). PowerBI endorsement labels are displayed in Microsoft Purview's search and browse experiences.
### Certify assets in bulk
-You can use the Azure Purview [bulk edit experience](how-to-bulk-edit-assets.md) to certify multiple assets at once.
+You can use the Microsoft Purview [bulk edit experience](how-to-bulk-edit-assets.md) to certify multiple assets at once.
1. After searching or browsing the data catalog, select the checkbox next to the assets you wish to certify.
When searching or browsing the data catalog, you'll see a certification label on an
## Next steps
-Discover your assets in the Azure Purview data catalog by either:
+Discover your assets in the Microsoft Purview data catalog by either:
- [Browsing the data catalog](how-to-browse-catalog.md)
- [Searching the data catalog](how-to-search-catalog.md)
purview How To Create And Manage Collections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-create-and-manage-collections.md
Title: How to create and manage collections
-description: This article explains how to create and manage collections within Azure Purview.
+description: This article explains how to create and manage collections within Microsoft Purview.
Last updated 01/24/2022
-# Create and manage collections in Azure Purview
+# Create and manage collections in Microsoft Purview
-Collections in Azure Purview can be used to organize assets and sources by your business's flow. They are also the tool used to manage access across Azure Purview. This guide will take you through the creation and management of these collections, as well as cover steps about how to register sources and add assets into your collections.
+Collections in Microsoft Purview can be used to organize assets and sources by your business's flow. They are also the tool used to manage access across Microsoft Purview. This guide will take you through the creation and management of these collections, as well as cover steps about how to register sources and add assets into your collections.
## Prerequisites
Collections in Azure Purview can be used to organize assets and sources by your
* Your own [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
### Check permissions
-In order to create and manage collections in Azure Purview, you will need to be a **Collection Admin** within Azure Purview. We can check these permissions in the [Azure Purview Studio](https://web.purview.azure.com/resource/). You can find Studio in the overview page of the Azure Purview account in [Azure portal](https://portal.azure.com).
+In order to create and manage collections in Microsoft Purview, you will need to be a **Collection Admin** within Microsoft Purview. We can check these permissions in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/). You can find Studio in the overview page of the Microsoft Purview account in [Azure portal](https://portal.azure.com).
1. Select Data Map > Collections from the left pane to open collection management page.
- :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
-1. Select your root collection. This is the top collection in your collection list and will have the same name as your Azure Purview account. In the following example, it's called Contoso Azure Purview. Alternatively, if collections already exist you can select any collection where you want to create a subcollection.
+1. Select your root collection. This is the top collection in your collection list and will have the same name as your Microsoft Purview account. In the following example, it's called Contoso Microsoft Purview. Alternatively, if collections already exist you can select any collection where you want to create a subcollection.
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-root-collection.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/select-root-collection.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
1. Select **Role assignments** in the collection window.
- :::image type="content" source="./media/how-to-create-and-manage-collections/role-assignments.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/role-assignments.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the Azure Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant your permission.
+1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the Microsoft Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant your permission.
- :::image type="content" source="./media/how-to-create-and-manage-collections/collection-admins.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/collection-admins.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
## Collection management
You'll need to be a collection admin in order to create a collection. If you are
1. Select Data Map > Collections from the left pane to open collection management page.
- :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the Collections tab selected and open." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/find-collections.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the Collections tab selected and open." border="true":::
1. Select **+ Add a collection**. Again, note that only [collection admins](#check-permissions) can manage collections.
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-add-a-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the new collection window, with the 'Add a collection' button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/select-add-a-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the new collection window, with the 'Add a collection' button highlighted." border="true":::
1. In the right panel, enter the collection name and description. If needed, you can also add users or groups as collection admins to the new collection.
1. Select **Create**.
- :::image type="content" source="./media/how-to-create-and-manage-collections/create-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/create-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
1. The new collection's information will reflect on the page.
- :::image type="content" source="./media/how-to-create-and-manage-collections/created-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the newly created collection window." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/created-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the newly created collection window." border="true":::
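For automation, a subcollection can also be created through the Purview account data plane. This is only a sketch of the request body a script might PUT; the endpoint shape (`/account/collections/{name}`) and field names such as `parentCollection` are assumptions to verify against the current REST API reference.

```python
# Hypothetical sketch of the JSON body for creating a collection under a
# parent (assumed: PUT https://{account}.purview.azure.com/account/collections/{name}).
# "friendlyName" and "parentCollection.referenceName" are assumed field names.

def build_collection_body(friendly_name: str, description: str,
                          parent_reference_name: str) -> dict:
    """Return the request body for creating a collection under a parent."""
    return {
        "friendlyName": friendly_name,
        "description": description,
        "parentCollection": {"referenceName": parent_reference_name},
    }

finance = build_collection_body("Finance", "Finance team assets", "contosopurview")
```

The parent reference would be the root collection (same name as the account) or any existing collection where you are a collection admin.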
### Edit a collection

1. Select **Edit** either from the collection detail page, or from the collection's dropdown menu.
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-collection.png" alt-text="Screenshot of Azure Purview studio window, open to collection window, with the 'edit' button highlighted both in the selected collection window, and under the ellipsis button next to the name of the collection." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/edit-collection.png" alt-text="Screenshot of Microsoft Purview studio window, open to collection window, with the 'edit' button highlighted both in the selected collection window, and under the ellipsis button next to the name of the collection." border="true":::
1. Currently, the collection description and collection admins can be edited. Make any changes, then select **Save** to save your changes.
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-description.png" alt-text="Screenshot of Azure Purview studio window with the edit collection window open, a description added to the collection, and the save button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/edit-description.png" alt-text="Screenshot of Microsoft Purview studio window with the edit collection window open, a description added to the collection, and the save button highlighted." border="true":::
### View Collections

1. Select the triangle icon beside the collection's name to expand or collapse the collection hierarchy. Select the collection names to navigate.
- :::image type="content" source="./media/how-to-create-and-manage-collections/subcollection-menu.png" alt-text="Screenshot of Azure Purview studio collection window, with the button next to the collection name highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/subcollection-menu.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the button next to the collection name highlighted." border="true":::
1. Type in the filter box at the top of the list to filter collections.
- :::image type="content" source="./media/how-to-create-and-manage-collections/filter-collections.png" alt-text="Screenshot of Azure Purview studio collection window, with the filter above the collections highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/filter-collections.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the filter above the collections highlighted." border="true":::
1. Select **Refresh** in Root collection's contextual menu to reload the collection list.
- :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-collections.png" alt-text="Screenshot of Azure Purview studio collection window, with the button next to the Resource name selected, and the refresh button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-collections.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the button next to the Resource name selected, and the refresh button highlighted." border="true":::
1. Select **Refresh** in collection detail page to reload the single collection.
- :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-single-collection.png" alt-text="Screenshot of Azure Purview studio collection window, with the refresh button under the collection window highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/refresh-single-collection.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the refresh button under the collection window highlighted." border="true":::
### Delete a collection
You'll need to be a collection admin in order to delete a collection. If you are
1. Select **Delete** from the collection detail page.
- :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collections.png" alt-text="Screenshot of Azure Purview studio window to delete a collection" border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collections.png" alt-text="Screenshot of Microsoft Purview studio window to delete a collection" border="true":::
2. Select **Confirm** when prompted, **Are you sure you want to delete this collection?**
- :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collection-confirmation.png" alt-text="Screenshot of Azure Purview studio window showing confirmation message to delete a collection" border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/delete-collection-confirmation.png" alt-text="Screenshot of Microsoft Purview studio window showing confirmation message to delete a collection" border="true":::
-3. Verify deletion of the collection from your Azure Purview Data Map.
+3. Verify deletion of the collection from your Microsoft Purview Data Map.
## Add roles and restrict access through collections
-Since permissions are managed through collections in Azure Purview, it is important to understand the roles and what permissions they will give your users. A user granted permissions on a collection will have access to sources and assets associated with that collection, and inherit permissions to subcollections. Inheritance [can be restricted](#restrict-inheritance), but is allowed by default.
+Since permissions are managed through collections in Microsoft Purview, it is important to understand the roles and what permissions they will give your users. A user granted permissions on a collection will have access to sources and assets associated with that collection, and inherit permissions to subcollections. Inheritance [can be restricted](#restrict-inheritance), but is allowed by default.
The following guide will discuss the roles, how to manage them, and permissions inheritance.
The following guide will discuss the roles, how to manage them, and permissions
All assigned roles apply to sources, assets, and other objects within the collection where the role is applied.
-* **Collection admins** can edit the collection, its details, and add subcollections. They can also add data curators, data readers, and other Azure Purview roles to a collection scope. Collection admins that are automatically inherited from a parent collection can't be removed.
+* **Collection admins** can edit the collection, its details, and add subcollections. They can also add data curators, data readers, and other Microsoft Purview roles to a collection scope. Collection admins that are automatically inherited from a parent collection can't be removed.
* **Data source admins** can manage data sources and data scans. They can also enter the policy management app to view and publish policies.
* **Data curators** can perform create, read, modify, and delete actions on catalog data objects and establish relationships between objects. They can also enter the policy management app to view policies.
* **Data readers** can access but not modify catalog data objects.
All assigned roles apply to sources, assets, and other objects within the collec
1. Select the **Role assignments** tab to see all the roles in a collection. Only a collection admin can manage role assignments.
- :::image type="content" source="./media/how-to-create-and-manage-collections/select-role-assignments.png" alt-text="Screenshot of Azure Purview studio collection window, with the role assignments tab highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/select-role-assignments.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the role assignments tab highlighted." border="true":::
1. Select **Edit role assignments** or the person icon to edit each role member.
- :::image type="content" source="./media/how-to-create-and-manage-collections/edit-role-assignments.png" alt-text="Screenshot of Azure Purview studio collection window, with the edit role assignments dropdown list selected." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/edit-role-assignments.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the edit role assignments dropdown list selected." border="true":::
1. Type in the textbox to search for users you want to add to the role member. Select **X** to remove members you don't want to add.
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-user-permissions.png" alt-text="Screenshot of Azure Purview studio collection admin window with the search bar highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/search-user-permissions.png" alt-text="Screenshot of Microsoft Purview studio collection admin window with the search bar highlighted." border="true":::
1. Select **OK** to save your changes, and you will see the new users reflected in the role assignments list.
All assigned roles apply to sources, assets, and other objects within the collec
1. Select **X** button next to a user's name to remove a role assignment.
- :::image type="content" source="./media/how-to-create-and-manage-collections/remove-role-assignment.png" alt-text="Screenshot of Azure Purview studio collection window, with the role assignments tab selected, and the x button beside one of the names highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/remove-role-assignment.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the role assignments tab selected, and the x button beside one of the names highlighted." border="true":::
1. Select **Confirm** if you're sure to remove the user.
All assigned roles apply to sources, assets, and other objects within the collec
### Restrict inheritance
-Collection permissions are inherited automatically from the parent collection. For example, any permissions on the root collection (the collection at the top of the list that has the same name as your Azure Purview account), will be inherited by all collections below it. You can restrict inheritance from a parent collection at any time, using the restrict inherited permissions option.
+Collection permissions are inherited automatically from the parent collection. For example, any permissions on the root collection (the collection at the top of the list that has the same name as your Microsoft Purview account), will be inherited by all collections below it. You can restrict inheritance from a parent collection at any time, using the restrict inherited permissions option.
Once you restrict inheritance, you will need to add users directly to the restricted collection to grant them access.

1. Navigate to the collection where you want to restrict inheritance and select the **Role assignments** tab.
1. Select **Restrict inherited permissions** and select **Restrict access** in the popup dialog to remove inherited permissions from this collection and any subcollections. Note that collection admin permissions won't be affected.
- :::image type="content" source="./media/how-to-create-and-manage-collections/restrict-access-inheritance.png" alt-text="Screenshot of Azure Purview studio collection window, with the role assignments tab selected, and the restrict inherited permissions slide button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/restrict-access-inheritance.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the role assignments tab selected, and the restrict inherited permissions slide button highlighted." border="true":::
1. After restriction, inherited members are removed from the roles except for collection admin.
1. Select the **Restrict inherited permissions** toggle button again to revert.
- :::image type="content" source="./media/how-to-create-and-manage-collections/remove-restriction.png" alt-text="Screenshot of Azure Purview studio collection window, with the role assignments tab selected, and the unrestrict inherited permissions slide button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/remove-restriction.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the role assignments tab selected, and the unrestrict inherited permissions slide button highlighted." border="true":::
## Register source to a collection

1. Select **Register** or the register icon on the collection node to register a data source. Only a data source admin can register sources.
- :::image type="content" source="./media/how-to-create-and-manage-collections/register-by-collection.png" alt-text="Screenshot of the data map Azure Purview studio window with the register button highlighted both at the top of the page and under a collection."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/register-by-collection.png" alt-text="Screenshot of the data map Microsoft Purview studio window with the register button highlighted both at the top of the page and under a collection." border="true":::
1. Fill in the data source name and other source information. The bottom of the form lists all the collections where you have scan permission. You can select one collection. All assets under this source will belong to the collection you select.
Once you restrict inheritance, you will need to add users directly to the restri
1. The created data source will be put under the selected collection. Select **View details** to see the data source.
- :::image type="content" source="./media/how-to-create-and-manage-collections/see-registered-source.png" alt-text="Screenshot of the data map Azure Purview studio window with the newly added source card highlighted."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/see-registered-source.png" alt-text="Screenshot of the data map Microsoft Purview studio window with the newly added source card highlighted." border="true":::
1. Select **New scan** to create a scan under the data source.
- :::image type="content" source="./media/how-to-create-and-manage-collections/new-scan.png" alt-text="Screenshot of a source Azure Purview studio window with the new scan button highlighted."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/new-scan.png" alt-text="Screenshot of a source Microsoft Purview studio window with the new scan button highlighted." border="true":::
1. Similarly, at the bottom of the form, you can select a collection, and all assets scanned will be included in the collection. The collections listed here are restricted to subcollections of the data source collection.
The collections listed here are restricted to subcollections of the data source
1. Back in the collection window, you will see the data sources linked to the collection on the sources card.
- :::image type="content" source="./media/how-to-create-and-manage-collections/source-under-collection.png" alt-text="Screenshot of the data map Azure Purview studio window with the newly added source card highlighted in the map."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/source-under-collection.png" alt-text="Screenshot of the data map Microsoft Purview studio window with the newly added source card highlighted in the map." border="true":::
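Source registration can also be scripted against the Purview scanning endpoint. The sketch below only assembles the request body; the endpoint (`/scan/datasources/{sourceName}`), the `kind` values, and the collection reference shape are assumptions to verify against the current scanning REST API reference.

```python
# Hypothetical sketch: the body a script might PUT to register a source under
# a collection (assumed: PUT https://{account}.purview.azure.com/scan/datasources/{sourceName}).
# The "kind" string and nested "collection" reference are assumed shapes.

def build_datasource_body(kind: str, endpoint: str,
                          collection_reference: str) -> dict:
    """Return a registration body; the collection reference decides which
    collection owns the assets scanned from this source."""
    return {
        "kind": kind,  # e.g. a storage or database source type
        "properties": {
            "endpoint": endpoint,
            "collection": {
                "referenceName": collection_reference,
                "type": "CollectionReference",
            },
        },
    }

reg = build_datasource_body("AdlsGen2",
                            "https://contosolake.dfs.core.windows.net",
                            "Finance")
```

As in the UI flow, the chosen collection constrains where scanned assets land; scans created under the source can only target that collection's subtree.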
## Add assets to collections
Assets and sources are also associated with collections. During a scan, if the s
1. Check the collection information in asset details. You can find collection information in the **Collection path** section on right-top corner of the asset details page.
- :::image type="content" source="./media/how-to-create-and-manage-collections/collection-path.png" alt-text="Screenshot of Azure Purview studio asset window, with the collection path highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/collection-path.png" alt-text="Screenshot of Microsoft Purview studio asset window, with the collection path highlighted." border="true":::
1. Permissions in asset details page:
   1. Check the collection-based permission model by following the [add roles and restricting access on collections guide above](#add-roles-and-restrict-access-through-collections).
- 1. If you don't have read permission on a collection, the assets under that collection will not be listed in search results. If you get the direct URL of one asset and open it, you will see the no access page. Contact your Azure Purview admin to grant you the access. You can select the **Refresh** button to check the permission again.
+ 1. If you don't have read permission on a collection, the assets under that collection will not be listed in search results. If you get the direct URL of one asset and open it, you will see the no access page. Contact your Microsoft Purview admin to grant you the access. You can select the **Refresh** button to check the permission again.
- :::image type="content" source="./media/how-to-create-and-manage-collections/no-access.png" alt-text="Screenshot of Azure Purview studio asset window where the user has no permissions, and has no access to information or options." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/no-access.png" alt-text="Screenshot of Microsoft Purview studio asset window where the user has no permissions, and has no access to information or options." border="true":::
   1. If you have read permission on a collection but don't have write permission, you can browse the asset details page, but the following operations are disabled:
      * Edit the asset. The **Edit** button will be disabled.
Assets and sources are also associated with collections. During a scan, if the s
      * Move asset to another collection. The ellipsis button on the right-top corner of the Collection path section will be hidden.
   1. The assets in the **Hierarchy** section are also affected by permissions. Assets without read permission will be grayed out.
- :::image type="content" source="./media/how-to-create-and-manage-collections/hierarchy-permissions.png" alt-text="Screenshot of Azure Purview studio hierarchy window where the user has only read permissions, and has no access to options." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/hierarchy-permissions.png" alt-text="Screenshot of Microsoft Purview studio hierarchy window where the user has only read permissions, and has no access to options." border="true":::
### Move asset to another collection

1. Select the ellipsis button on the right-top corner of the Collection path section.
- :::image type="content" source="./media/how-to-create-and-manage-collections/move-asset.png" alt-text="Screenshot of Azure Purview studio asset window with the collection path highlighted and the ellipsis button next to collection path selected." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/move-asset.png" alt-text="Screenshot of Microsoft Purview studio asset window with the collection path highlighted and the ellipsis button next to collection path selected." border="true":::
1. Select the **Move to another collection** button.
1. In the right side panel, choose the target collection you want to move to. You can only see the collections where you have write permissions. The asset can also only be added to the subcollections of the data source collection.
- :::image type="content" source="./media/how-to-create-and-manage-collections/move-select-collection.png" alt-text="Screenshot of Azure Purview studio pop-up window with the select a collection dropdown menu highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/move-select-collection.png" alt-text="Screenshot of Microsoft Purview studio pop-up window with the select a collection dropdown menu highlighted." border="true":::
1. Select the **Move** button on the bottom of the window to move the asset.
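Moving assets can also be done in bulk from a script by listing their GUIDs. This is only a sketch of the request body; the endpoint (assumed: `POST /catalog/api/collections/{collectionName}/entity/moveHere`) and the `entityGuids` field name are assumptions to verify against the current catalog REST API reference.

```python
# Hypothetical sketch: body for moving a batch of assets into a target
# collection. The "entityGuids" field name is an assumption to verify.

def build_move_body(entity_guids) -> dict:
    """Return the body listing asset GUIDs to move into the target collection,
    with duplicates removed while preserving order."""
    deduped = list(dict.fromkeys(entity_guids))
    return {"entityGuids": deduped}

move = build_move_body(["guid-a", "guid-b", "guid-a"])
```

The same write-permission rule applies as in the UI: the caller needs write access on the target collection, and targets are limited to the data source collection's subtree.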
Assets and sources are also associated with collections. During a scan, if the s
### Search by collection
-1. In Azure Purview, the search bar is located at the top of the Azure Purview studio UX.
+1. In Microsoft Purview, the search bar is located at the top of the Microsoft Purview studio UX.
- :::image type="content" source="./media/how-to-create-and-manage-collections/purview-search-bar.png" alt-text="Screenshot showing the location of the Azure Purview search bar" border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/purview-search-bar.png" alt-text="Screenshot showing the location of the Microsoft Purview search bar" border="true":::
1. When you select the search bar, you can see your recent search history and recently accessed assets. Select **View all** to see all of the recently viewed assets. :::image type="content" source="./media/how-to-create-and-manage-collections/search-no-keywords.png" alt-text="Screenshot showing the search bar before any keywords have been entered" border="true":::
-1. Enter in keywords that help identify your asset such as its name, data type, classifications, and glossary terms. As you enter in keywords relating to your desired asset, Azure Purview displays suggestions on what to search and potential asset matches. To complete your search, select **View search results** or press **Enter**.
+1. Enter keywords that help identify your asset, such as its name, data type, classifications, and glossary terms. As you enter keywords relating to your desired asset, Microsoft Purview displays suggestions on what to search and potential asset matches. To complete your search, select **View search results** or press **Enter**.
:::image type="content" source="./media/how-to-create-and-manage-collections/search-keywords.png" alt-text="Screenshot showing the search bar as a user enters in keywords" border="true":::
Assets and sources are also associated with collections. During a scan, if the s
1. You can browse data assets, by selecting the **Browse assets** on the homepage.
- :::image type="content" source="./media/how-to-create-and-manage-collections/browse-by-collection.png" alt-text="Screenshot of the catalog Azure Purview studio window with the browse assets button highlighted." border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/browse-by-collection.png" alt-text="Screenshot of the catalog Microsoft Purview studio window with the browse assets button highlighted." border="true":::
1. On the Browse asset page, select **By collection** pivot. Collections are listed with hierarchical table view. To further explore assets in each collection, select the corresponding collection name.
- :::image type="content" source="./media/how-to-create-and-manage-collections/by-collection-view.png" alt-text="Screenshot of the asset Azure Purview studio window with the by collection tab selected."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/by-collection-view.png" alt-text="Screenshot of the asset Microsoft Purview studio window with the by collection tab selected."border="true":::
1. On the next page, the search results of the assets under selected collection will be shown. You can narrow the results by selecting the facet filters. Or you can see the assets under other collections by selecting the sub/related collection names.
- :::image type="content" source="./media/how-to-create-and-manage-collections/search-results-by-collection.png" alt-text="Screenshot of the catalog Azure Purview studio window with the by collection tab selected."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/search-results-by-collection.png" alt-text="Screenshot of the catalog Microsoft Purview studio window with the by collection tab selected."border="true":::
1. To view the details of an asset, select the asset name in the search result. Or you can check the assets and bulk edit them.
- :::image type="content" source="./media/how-to-create-and-manage-collections/view-asset-details.png" alt-text="Screenshot of the catalog Azure Purview studio window with the by collection tab selected and asset check boxes highlighted."border="true":::
+ :::image type="content" source="./media/how-to-create-and-manage-collections/view-asset-details.png" alt-text="Screenshot of the catalog Microsoft Purview studio window with the by collection tab selected and asset check boxes highlighted."border="true":::
## Next steps
purview How To Create Import Export Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-create-import-export-glossary.md
Title: How to create, import, export, and manage glossary terms
-description: Learn how to create, import, export, and manage business glossary terms in Azure Purview.
+description: Learn how to create, import, export, and manage business glossary terms in Microsoft Purview.
Last updated 03/09/2022
# How to create, import, and export glossary terms
-This article describes how to work with the business glossary in Azure Purview. Steps are provided to create a business glossary term in Azure Purview data catalog, and import and export glossary terms using .csv files.
+This article describes how to work with the business glossary in Microsoft Purview. Steps are provided to create a business glossary term in Microsoft Purview data catalog, and import and export glossary terms using .csv files.
## Create a new term
To create a new glossary term, follow these steps:
## Import terms into the glossary
-The Azure Purview Data Catalog provides a template .csv file for you to import your terms into your Glossary.
+The Microsoft Purview Data Catalog provides a template .csv file for you to import your terms into your Glossary.
You can import terms into the catalog. Duplicate terms in the file will overwrite the existing terms.
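Because duplicate terms in the import file overwrite existing terms, it can help to check the file for repeated names before uploading. A minimal sketch, assuming the template .csv has a `Name` column (the column name here is an assumption; match it to the header row of the template file you download):

```python
import csv
from collections import Counter

def duplicate_term_names(path):
    """Return glossary term names that appear more than once in an import CSV.

    Assumes the template file has a "Name" column; adjust to the actual header.
    """
    with open(path, newline="", encoding="utf-8") as f:
        names = [row["Name"].strip() for row in csv.DictReader(f) if row.get("Name")]
    # Counter preserves first-seen order, so results follow file order.
    return [name for name, count in Counter(names).items() if count > 1]
```

Running this before an import surfaces which terms would be overwritten, so you can rename or drop them deliberately instead of discovering the overwrite afterward.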
If [workflows](concept-workflow.md) are enabled on a term, then any creates, upd
- **Deletion** - when a delete approval workflow is enabled on the parent term, you'll see **Submit for approval** instead of **Delete** when deleting the term. Selecting **Submit for approval** will trigger the workflow. However, the term won't be deleted from catalog until all the approvals are met. -- **Importing terms** - when an import approval workflow enabled for Azure Purview's glossary, you'll see **Submit for approval** instead of **OK** in the Import window when importing terms via csv. Selecting **Submit for approval** will trigger the workflow. However, the terms in the file won't be updated in catalog until all the approvals are met.
+- **Importing terms** - when an import approval workflow is enabled for Microsoft Purview's glossary, you'll see **Submit for approval** instead of **OK** in the Import window when importing terms via .csv. Selecting **Submit for approval** will trigger the workflow. However, the terms in the file won't be updated in the catalog until all the approvals are met.
:::image type="content" source="media/how-to-create-import-export-glossary/create-submit-for-approval.png" alt-text="Screenshot of the create new term window. The parent term requires approval, so the available buttons at the bottom of the page are 'Submit for approval' and 'Cancel'." border="true":::
purview How To Data Owner Policies Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-data-owner-policies-resource-group.md
# Resource group and subscription access provisioning by data owner (preview) [!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-[Policies](concept-data-owner-policies.md) allow you to enable access to data sources that have been registered for *Data use governance* in Azure Purview.
+[Policies](concept-data-owner-policies.md) allow you to enable access to data sources that have been registered for *Data use governance* in Microsoft Purview.
You can also [register an entire resource group or subscription](register-scan-azure-multiple-sources.md), and create a single policy that will manage access to **all** data sources in that resource group or subscription. That single policy will cover all existing data sources and any data sources that are created afterwards. This article describes how this is done.
This article describes how this is done.
[!INCLUDE [Access policies generic configuration](./includes/access-policies-configuration-generic.md)] ### Register the subscription or resource group for data use governance
-The subscription or resource group needs to be registered with Azure Purview to later define access policies.
+The subscription or resource group needs to be registered with Microsoft Purview to later define access policies.
To register your resource, follow the **Prerequisites** and **Register** sections of this guide: -- [Register multiple sources in Azure Purview](register-scan-azure-multiple-sources.md#prerequisites)
+- [Register multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md#prerequisites)
-After you've registered your resources, you'll need to enable data use governance. Data use governance affects the security of your data, as it delegates to certain users to manage access to data resources from within Azure Purview.
+After you've registered your resources, you'll need to enable data use governance. Data use governance affects the security of your data, as it delegates to certain users to manage access to data resources from within Microsoft Purview.
To ensure you securely enable data use governance, and follow best practices, follow this guide to enable data use governance for your resource group or subscription:
Execute the steps in the [data-owner policy authoring tutorial](how-to-data-owne
- Creating a policy at subscription or resource group level will enable the Subjects to access Azure Storage system containers, for example, *$logs*. If this is undesired, first scan the data source and then create finer-grained policies for each (that is, at container or subcontainer level). ### Limits
-The limit for Azure Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
+The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
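A back-of-the-envelope check of the figures quoted above (100 MB per subscription, roughly 5,000 policies) gives the implied average size budget per policy:

```python
# Illustrative arithmetic only, using the two figures quoted in the docs.
LIMIT_BYTES = 100 * 1024 * 1024   # 100 MB enforcement limit per subscription
APPROX_POLICIES = 5000            # rough policy count quoted for that limit

avg_policy_bytes = LIMIT_BYTES // APPROX_POLICIES
print(avg_policy_bytes)  # 20971 bytes, i.e. roughly 20 KB per policy
```

In other words, the quoted numbers assume an average policy footprint of about 20 KB; unusually large policies would lower the effective count.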
## Next steps Check the blog, demo, and related tutorials:
-* [Concepts for Azure Purview data owner policies](./concept-data-owner-policies.md)
+* [Concepts for Microsoft Purview data owner policies](./concept-data-owner-policies.md)
* [Data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md) * [Blog: resource group-level governance can significantly reduce effort](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-resource-group-level-governance-can/ba-p/3096314) * [Demo of data owner access policies for Azure Storage](/video/media/8ce7c554-0d48-430f-8f63-edf94946947c/purview-policy-storage-dataowner-scenario_mid.mp4)
purview How To Data Owner Policies Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-data-owner-policies-storage.md
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-[Access policies](concept-data-owner-policies.md) allow you to enable access to data sources that have been registered for *Data use governance* in Azure Purview.
-This article describes how a data owner can delegate in Azure Purview management of access to Azure Storage datasets. Currently, these two Azure Storage sources are supported:
+[Access policies](concept-data-owner-policies.md) allow you to enable access to data sources that have been registered for *Data use governance* in Microsoft Purview.
+This article describes how a data owner can use Microsoft Purview to delegate management of access to Azure Storage datasets. Currently, these two Azure Storage sources are supported:
+ - Blob storage - Azure Data Lake Storage (ADLS) Gen2
This article describes how a data owner can delegate in Azure Purview management
## Configuration [!INCLUDE [Access policies generic configuration](./includes/access-policies-configuration-generic.md)]
-### Register the data sources in Azure Purview for Data use governance
-The Azure Storage resources need to be registered first with Azure Purview to later define access policies.
+### Register the data sources in Microsoft Purview for Data use governance
+The Azure Storage resources need to be registered first with Microsoft Purview to later define access policies.
To register your resources, follow the **Prerequisites** and **Register** sections of these guides: -- [Register and scan Azure Storage Blob - Azure Purview](register-scan-azure-blob-storage-source.md#prerequisites)
+- [Register and scan Azure Storage Blob - Microsoft Purview](register-scan-azure-blob-storage-source.md#prerequisites)
-- [Register and scan Azure Data Lake Storage (ADLS) Gen2 - Azure Purview](register-scan-adls-gen2.md#prerequisites)
+- [Register and scan Azure Data Lake Storage (ADLS) Gen2 - Microsoft Purview](register-scan-adls-gen2.md#prerequisites)
-After you've registered your resources, you'll need to enable *Data use governance*. Data use governance can affect the security of your data, as it delegates to certain Azure Purview roles to manage access to data sources that have been registered. Secure practices related to *Data use governance* are described in this guide:
+After you've registered your resources, you'll need to enable *Data use governance*. Data use governance can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to data sources that have been registered. Secure practices related to *Data use governance* are described in this guide:
- [How to enable data use governance](./how-to-enable-data-use-governance.md)
Execute the steps in the [data-owner policy authoring tutorial](how-to-data-owne
### Limits-- The limit for Azure Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
+- The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
### Known issues **Known issues** related to Policy creation-- Do not create policy statements based on Azure Purview resource sets. Even if displayed in Azure Purview policy authoring UI, they are not yet enforced. Learn more about [resource sets](concept-resource-sets.md).
+- Do not create policy statements based on Microsoft Purview resource sets. Even if displayed in Microsoft Purview policy authoring UI, they are not yet enforced. Learn more about [resource sets](concept-resource-sets.md).
### Policy action mapping
-This section contains a reference of how actions in Azure Purview data policies map to specific actions in Azure Storage.
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure Storage.
-| **Azure Purview policy action** | **Data source specific actions** |
+| **Microsoft Purview policy action** | **Data source specific actions** |
||--| ||| | *Read* |<sub>Microsoft.Storage/storageAccounts/blobServices/containers/read |
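Only the *Read* row of the mapping table survives in this excerpt. As an illustration (not an exhaustive mapping; the full table in the source article has further rows), the visible mapping can be expressed as a simple lookup:

```python
# Mapping from Microsoft Purview policy actions to data-source-specific actions.
# Only the *Read* row is visible in this excerpt; other actions in the full
# table are deliberately omitted rather than guessed.
POLICY_ACTION_MAP = {
    "Read": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
    ],
}

def storage_actions(policy_action):
    """Return the Azure Storage actions implied by a Purview policy action."""
    return POLICY_ACTION_MAP.get(policy_action, [])
```

A lookup like this is useful when auditing what a published *Read* policy actually grants at the storage layer.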
This section contains a reference of how actions in Azure Purview data policies
Check the blog, demo, and related tutorials: * [Demo of access policy for Azure Storage](/video/media/8ce7c554-0d48-430f-8f63-edf94946947c/purview-policy-storage-dataowner-scenario_mid.mp4)
-* [Concepts for Azure Purview data owner policies](./concept-data-owner-policies.md)
-* [Enable Azure Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
-* [Blog: What's New in Azure Purview at Microsoft Ignite 2021](https://techcommunity.microsoft.com/t5/azure-purview/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954)
+* [Concepts for Microsoft Purview data owner policies](./concept-data-owner-policies.md)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+* [Blog: What's New in Microsoft Purview at Microsoft Ignite 2021](https://techcommunity.microsoft.com/t5/azure-purview/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954)
* [Blog: Accessing data when folder level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-folder-level-permission/ba-p/3109583) * [Blog: Accessing data when file level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-file-level-permission/ba-p/3102166)
purview How To Data Owner Policy Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-data-owner-policy-authoring-generic.md
Title: Authoring and publishing data owner access policies
-description: Step-by-step guide on how a data owner can author and publish access policies in Azure Purview
+description: Step-by-step guide on how a data owner can author and publish access policies in Microsoft Purview
Last updated 4/18/2022
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-Access policies allow a data owner to delegate in Azure Purview access management to a data source. These policies can be authored directly in Azure Purview Studio, and after publishing, they get enforced by the data source. This tutorial describes how to create, update, and publish access policies in Azure Purview Studio.
+Access policies allow a data owner to delegate access management for a data source from within Microsoft Purview. These policies can be authored directly in the Microsoft Purview studio and, after publishing, they get enforced by the data source. This tutorial describes how to create, update, and publish access policies in the Microsoft Purview studio.
## Prerequisites [!INCLUDE [Access policies generic pre-requisites](./includes/access-policies-prerequisites-generic.md)]
Access policies allow a data owner to delegate in Azure Purview access managemen
### Data source configuration
-Before authoring data policies in Azure Purview Studio, you'll need to configure the data sources so that they can enforce those policies.
+Before authoring data policies in Microsoft Purview Studio, you'll need to configure the data sources so that they can enforce those policies.
-1. Follow any policy-specific prerequisites for your source. Check the [Azure Purview supported data sources table](azure-purview-connector-overview.md#azure-purview-data-sources) and select the link in the **Access Policy** column for sources where access policies are available. Follow any steps listed in the Access policy or Prerequisites sections.
-1. Register the data source in Azure Purview. Follow the **Prerequisites** and **Register** sections of the [source pages](azure-purview-connector-overview.md) for your resources.
+1. Follow any policy-specific prerequisites for your source. Check the [Microsoft Purview supported data sources table](azure-purview-connector-overview.md#microsoft-purview-data-sources) and select the link in the **Access Policy** column for sources where access policies are available. Follow any steps listed in the Access policy or Prerequisites sections.
+1. Register the data source in Microsoft Purview. Follow the **Prerequisites** and **Register** sections of the [source pages](azure-purview-connector-overview.md) for your resources.
1. [Enable the data use governance toggle on the data source](how-to-enable-data-use-governance.md#enable-data-use-governance). Additional permissions for this step are described in the linked document. ## Create a new policy
-This section describes the steps to create a new policy in Azure Purview.
+This section describes the steps to create a new policy in Microsoft Purview.
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**. 1. Select the **New Policy** button in the policy page.
- :::image type="content" source="./media/access-policies-common/policy-onboard-guide-1.png" alt-text="Data owner can access the Policy functionality in Azure Purview when it wants to create policies.":::
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-1.png" alt-text="Data owners can access the Policy functionality in Microsoft Purview when they want to create policies.":::
1. The new policy page will appear. Enter the policy **Name** and **Description**.
A newly created policy is in the **draft** state. The process of publishing asso
The steps to publish a policy are as follows:
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
- :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owner can access the Policy functionality in Azure Purview when it wants to update a policy by selecting 'Data policies'.":::
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owners can access the Policy functionality in Microsoft Purview when they want to update a policy by selecting 'Data policies'.":::
-1. The Policy portal will present the list of existing policies in Azure Purview. Locate the policy that needs to be published. Select the **Publish** button on the right top corner of the page.
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be published. Select the **Publish** button in the top-right corner of the page.
:::image type="content" source="./media/access-policies-common/publish-policy.png" alt-text="Data owner can publish a policy.":::
The steps to publish a policy are as follows:
## Update or delete a policy
-Steps to update or delete a policy in Azure Purview are as follows.
+Steps to update or delete a policy in Microsoft Purview are as follows.
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
- :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owner can access the Policy functionality in Azure Purview when it wants to update a policy.":::
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owners can access the Policy functionality in Microsoft Purview when they want to update a policy.":::
-1. The Policy portal will present the list of existing policies in Azure Purview. Select the policy that needs to be updated.
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Select the policy that needs to be updated.
1. The policy details page will appear, including Edit and Delete options. Select the **Edit** button, which brings up the policy statement builder. Now, any parts of the statements in this policy can be updated. To delete the policy, use the **Delete** button.
Steps to update or delete a policy in Azure Purview are as follows.
For specific guides on creating policies, you can follow these tutorials: -- [Enable Azure Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)-- [Enable Azure Purview data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md)
+- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md)
purview How To Delete Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-delete-self-service-data-access-policy.md
Last updated 03/22/2022
# How to delete self-service data access policies
-In an Azure Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [data use governance enabled](how-to-enable-data-use-governance.md), a self-service policy is generated when a data access request is approved.
+In a Microsoft Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [data use governance enabled](how-to-enable-data-use-governance.md), a self-service policy is generated when a data access request is approved.
This article describes how to delete self-service data access policies that have been auto-generated by approved access requests.
This article describes how to delete self-service data access policies that have
Self-service policies must exist to be deleted. To enable and create self-service policies, follow these articles:
-1. [Enable Data Use Governance](how-to-enable-data-use-governance.md) - this will allow Azure Purview to create policies for your sources.
-1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Azure Purview](how-to-request-access.md).
+1. [Enable Data Use Governance](how-to-enable-data-use-governance.md) - this will allow Microsoft Purview to create policies for your sources.
+1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Microsoft Purview](how-to-request-access.md).
1. [Approve a self-service data access request](how-to-workflow-manage-requests-approvals.md#approvals) - after approving a request, if your workflow from the previous step includes the ability to create a self-service data policy, your policy will be created and will be viewable. ## Permission
Only users with **Policy Admin** privilege can delete self-service data access p
## Steps to delete self-service data access policies
-1. Open the Azure portal and launch the [Azure Purview Studio](https://web.purview.azure.com/resource/). The Azure Purview studio can be launched as shown below or by using the [url directly](https://web.purview.azure.com/resource/).
+1. Open the Azure portal and launch the [Microsoft Purview Studio](https://web.purview.azure.com/resource/). The Microsoft Purview studio can be launched as shown below or by using the [URL directly](https://web.purview.azure.com/resource/).
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-launch-pic-1.png" alt-text="Screenshot showing an Azure Purview account open in the Azure portal, with the Azure Purview studio button highlighted.":::
+ :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-launch-pic-1.png" alt-text="Screenshot showing a Microsoft Purview account open in the Azure portal, with the Microsoft Purview studio button highlighted.":::
1. Select the policy management tab to launch the self-service access policies.
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Azure Purview studio with the leftmost menu open, and the Data policy page option highlighted.":::
+ :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Microsoft Purview studio with the leftmost menu open, and the Data policy page option highlighted.":::
1. Open the self-service access policies tab.
- :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Azure Purview studio open to the Data policy page with self-service access policies highlighted.":::
+ :::image type="content" source="./media/how-to-delete-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Microsoft Purview studio open to the Data policy page with self-service access policies highlighted.":::
1. Here you'll see all your policies. Select the policies that need to be deleted. The policies can be sorted and filtered by any of the displayed columns to improve your search.
purview How To Enable Data Use Governance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-enable-data-use-governance.md
Title: Enabling data use governance on your Azure Purview sources
+ Title: Enabling data use governance on your Microsoft Purview sources
description: Step-by-step guide on how to enable data use access for your registered sources.
Last updated 4/18/2022
-# Enable data use governance on your Azure Purview sources
+# Enable data use governance on your Microsoft Purview sources
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-*Data use governance* (DUG) is an option (enabled/disabled) that gets displayed when registering a data source in Azure Purview. Its purpose is to make that data source available in the policy authoring experience of Azure Purview Studio. In other words, access policies can only be written on data sources that have been previously registered with the DUG toggle set to enable.
+*Data use governance* (DUG) is a toggle (enabled/disabled) that is displayed when registering a data source in Microsoft Purview. Its purpose is to make that data source available in the policy authoring experience of the Microsoft Purview studio. In other words, access policies can only be written on data sources that have previously been registered with the DUG toggle set to **Enabled**.
## Prerequisites [!INCLUDE [Access policies generic configuration](./includes/access-policies-configuration-generic.md)] ## Enable Data use governance
-To enable *Data use governance* for a resource, the resource will first need to be registered in Azure Purview.
+To enable *Data use governance* for a resource, the resource will first need to be registered in Microsoft Purview.
To register a resource, follow the **Prerequisites** and **Register** sections of the [source pages](azure-purview-connector-overview.md) for your resources. Once you have your resource registered, follow the rest of the steps to enable an individual resource for *Data use governance*.
-1. Go to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Go to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the **Data map** tab in the left menu.
Once you have your resource registered, follow the rest of the steps to enable a
## Disable Data use governance
-To disable data use governance for a source, resource group, or subscription, a user needs to either be a resource IAM **Owner** or an Azure Purview **Data source admin**. Once you have those permissions follow these steps:
+To disable data use governance for a source, resource group, or subscription, a user needs to be either a resource IAM **Owner** or a Microsoft Purview **Data source admin**. Once you have one of those roles, follow these steps:
-1. Go to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Go to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the **Data map** tab in the left menu.
To disable data use governance for a source, resource group, or subscription, a
1. Set the **Data use governance** toggle to **Disabled**. ## Additional considerations related to Data use governance-- Make sure you write down the **Name** you use when registering in Azure Purview. You will need it when you publish a policy. The recommended practice is to make the registered name exactly the same as the endpoint name.+
+- Make sure you write down the **Name** you use when registering in Microsoft Purview. You will need it when you publish a policy. The recommended practice is to make the registered name exactly the same as the endpoint name.
+- To disable a source for *Data use governance*, remove it first from being bound (i.e. published) in any policy.
+- While a user needs to have both the data source *Owner* and Microsoft Purview *Data source admin* roles to enable a source for *Data use governance*, either of those roles can independently disable it.
-- To disable a source for *Data use governance*, remove it first from being bound (i.e., published) in any policy.
-- While user needs to have both data source *Owner* and Azure Purview *Data source admin* to enable a source for *Data use governance*, either of those roles can independently disable it.
- Disabling *Data use governance* for a subscription will disable it also for all assets registered in that subscription.

> [!WARNING]
> **Known issues** related to source registration
-> - Moving data sources to a different resource group or subscription is not yet supported. If want to do that, de-register the data source in Azure Purview before moving it and then register it again after that happens.
+> - Moving data sources to a different resource group or subscription is not yet supported. If you want to do that, de-register the data source in Microsoft Purview before moving it and then register it again after that happens.
> - Once a subscription gets disabled for *Data use governance* any underlying assets that are enabled for *Data use governance* will be disabled, which is the right behavior. However, policy statements based on those assets will still be allowed after that.

## Data use governance best practices

-- We highly encourage registering data sources for *Data use governance* and managing all associated access policies in a single Azure Purview account.
-- Should you have multiple Azure Purview accounts, be aware that **all** data sources belonging to a subscription must be registered for *Data use governance* in a single Azure Purview account. That Azure Purview account can be in any subscription in the tenant. The *Data use governance* toggle will become greyed out when there are invalid configurations. Some examples of valid and invalid configurations follow in the diagram below:
- - **Case 1** shows a valid configuration where a Storage account is registered in an Azure Purview account in the same subscription.
- - **Case 2** shows a valid configuration where a Storage account is registered in an Azure Purview account in a different subscription.
- - **Case 3** shows an invalid configuration arising because Storage accounts S3SA1 and S3SA2 both belong to Subscription 3, but are registered to different Azure Purview accounts. In that case, the *Data use governance* toggle will only enable in the Azure Purview account that wins and registers a data source in that subscription first. The toggle will then be greyed out for the other data source.
-- If the *Data use governance* toggle is greyed out and cannot be enabled, hover over it to know the name of the Azure Purview account that has registered the data resource first.
+- We highly encourage registering data sources for *Data use governance* and managing all associated access policies in a single Microsoft Purview account.
+- Should you have multiple Microsoft Purview accounts, be aware that **all** data sources belonging to a subscription must be registered for *Data use governance* in a single Microsoft Purview account. That Microsoft Purview account can be in any subscription in the tenant. The *Data use governance* toggle will become greyed out when there are invalid configurations. Some examples of valid and invalid configurations follow in the diagram below:
+ - **Case 1** shows a valid configuration where a Storage account is registered in a Microsoft Purview account in the same subscription.
+ - **Case 2** shows a valid configuration where a Storage account is registered in a Microsoft Purview account in a different subscription.
+ - **Case 3** shows an invalid configuration arising because Storage accounts S3SA1 and S3SA2 both belong to Subscription 3, but are registered to different Microsoft Purview accounts. In that case, the *Data use governance* toggle will only enable in the Microsoft Purview account that wins and registers a data source in that subscription first. The toggle will then be greyed out for the other data source.
+- If the *Data use governance* toggle is greyed out and cannot be enabled, hover over it to know the name of the Microsoft Purview account that has registered the data resource first.
-![Diagram shows valid and invalid configurations when using multiple Azure Purview accounts to manage policies.](./media/access-policies-common/valid-and-invalid-configurations.png)
+![Diagram shows valid and invalid configurations when using multiple Microsoft Purview accounts to manage policies.](./media/access-policies-common/valid-and-invalid-configurations.png)
## Next steps

- [Create data owner policies for your resources](how-to-data-owner-policy-authoring-generic.md)
-- [Enable Azure Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
-- [Enable Azure Purview data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md)
+- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md)
purview How To Integrate With Azure Security Products https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-integrate-with-azure-security-products.md
Title: Integrate with Azure security products
-description: This article describes how to connect Azure security services and Azure Purview to get enriched security experiences.
+description: This article describes how to connect Azure security services and Microsoft Purview to get enriched security experiences.
Last updated 01/23/2022
-# Integrate Azure Purview with Azure security products
+# Integrate Microsoft Purview with Azure security products
-This document explains the steps required for connecting an Azure Purview account with various Azure security products to enrich security experiences with data classifications and sensitivity labels.
+This document explains the steps required for connecting a Microsoft Purview account with various Azure security products to enrich security experiences with data classifications and sensitivity labels.
## Microsoft Defender for Cloud
-Azure Purview provides rich insights into the sensitivity of your data. This makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. The integration with Azure Purview expands visibility into the data layer, enabling security teams to prioritize resources that contain sensitive data.
+Microsoft Purview provides rich insights into the sensitivity of your data. This makes it valuable to security teams using Microsoft Defender for Cloud to manage the organization's security posture and protect against threats to their workloads. Data resources remain a popular target for malicious actors, making it crucial for security teams to identify, prioritize, and secure sensitive data resources across their cloud environments. The integration with Microsoft Purview expands visibility into the data layer, enabling security teams to prioritize resources that contain sensitive data.
-To take advantage of this [enrichment in Microsoft Defender for Cloud](../security-center/information-protection.md), no additional steps are needed in Azure Purview. Start exploring the security enrichments with Microsoft Defender for Cloud's [Inventory page](https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/25) where you can see the list of data sources with classifications and sensitivity labels.
+To take advantage of this [enrichment in Microsoft Defender for Cloud](../security-center/information-protection.md), no additional steps are needed in Microsoft Purview. Start exploring the security enrichments with Microsoft Defender for Cloud's [Inventory page](https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/25) where you can see the list of data sources with classifications and sensitivity labels.
### Supported data sources

The integration supports data sources in Azure and AWS; sensitive data discovered in these resources is shared with Microsoft Defender for Cloud:
The integration supports data sources in Azure and AWS; sensitive data discovere
1. Data sensitivity information is currently not shared for sources hosted inside virtual machines - like SAP, Erwin, and Teradata.
2. Data sensitivity information is currently not shared for Amazon RDS.
3. Data sensitivity information is currently not shared for Azure PaaS data sources registered using a connection string.
-5. Unregistering the data source in Azure Purview doesn't remove the data sensitivity enrichment in Microsoft Defender for Cloud.
-6. Deleting the Azure Purview account will persist the data sensitivity enrichment for 30 days in Microsoft Defender for Cloud.
-7. Custom classifications defined in the Microsoft 365 Compliance Center or in Azure Purview are not shared with Microsoft Defender for Cloud.
+5. Unregistering the data source in Microsoft Purview doesn't remove the data sensitivity enrichment in Microsoft Defender for Cloud.
+6. Deleting the Microsoft Purview account will persist the data sensitivity enrichment for 30 days in Microsoft Defender for Cloud.
+7. Custom classifications defined in the Microsoft 365 Compliance Center or in Microsoft Purview are not shared with Microsoft Defender for Cloud.
### FAQ
-#### **Why don't I see the AWS data source I have scanned with Azure Purview in Microsoft Defender for Cloud?**
+#### **Why don't I see the AWS data source I have scanned with Microsoft Purview in Microsoft Defender for Cloud?**
Data sources must be onboarded to Microsoft Defender for Cloud as well. Learn more about how to [connect your AWS accounts](../security-center/quickstart-onboard-aws.md) and see your AWS data sources in Microsoft Defender for Cloud.

#### **Why don't I see sensitivity labels in Microsoft Defender for Cloud?**
-Assets must first be labeled in Azure Purview, before the labels are shown in Microsoft Defender for Cloud. Check if you have the [prerequisites of sensitivity labels](./how-to-automatically-label-your-content.md) in place. Once your scan the data, the labels will show up in Azure Purview and then automatically in Microsoft Defender for Cloud.
+Assets must first be labeled in Microsoft Purview before the labels are shown in Microsoft Defender for Cloud. Check if you have the [prerequisites of sensitivity labels](./how-to-automatically-label-your-content.md) in place. Once you scan the data, the labels will show up in Microsoft Purview and then automatically in Microsoft Defender for Cloud.
## Microsoft Sentinel

Microsoft Sentinel is a scalable, cloud-native solution for both security information and event management (SIEM) and security orchestration, automation, and response (SOAR). Microsoft Sentinel delivers intelligent security analytics and threat intelligence across the enterprise, providing a single solution for attack detection, threat visibility, proactive hunting, and threat response.
-Integrate Azure Purview with Microsoft Sentinel to gain visibility into where on your network sensitive information is stored, in a way that helps you prioritize at-risk data for protection, and understand the most critical incidents and threats to investigate in Microsoft Sentinel.
+Integrate Microsoft Purview with Microsoft Sentinel to gain visibility into where on your network sensitive information is stored, in a way that helps you prioritize at-risk data for protection, and understand the most critical incidents and threats to investigate in Microsoft Sentinel.
-1. Start by ingesting your Azure Purview logs into Microsoft Sentinel through a data source.
-1. Then use a Microsoft Sentinel workbook to view data such as assets scanned, classifications found, and labels applied by Azure Purview.
+1. Start by ingesting your Microsoft Purview logs into Microsoft Sentinel through a data source.
+1. Then use a Microsoft Sentinel workbook to view data such as assets scanned, classifications found, and labels applied by Microsoft Purview.
1. Use analytics rules to create alerts for changes within data sensitivity.
-Customize the Azure Purview workbook and analytics rules to best suit the needs of your organization, and combine Azure Purview logs with data ingested from other sources to create enriched insights within Microsoft Sentinel.
+Customize the Microsoft Purview workbook and analytics rules to best suit the needs of your organization, and combine Microsoft Purview logs with data ingested from other sources to create enriched insights within Microsoft Sentinel.
-For more information, see [Tutorial: Integrate Microsoft Sentinel and Azure Purview](../sentinel/purview-solution.md).
+For more information, see [Tutorial: Integrate Microsoft Sentinel and Microsoft Purview](../sentinel/purview-solution.md).
## Next steps

-- [Experiences in Microsoft Defender for Cloud enriched using sensitivity from Azure Purview](../security-center/information-protection.md)
+- [Experiences in Microsoft Defender for Cloud enriched using sensitivity from Microsoft Purview](../security-center/information-protection.md)
purview How To Lineage Azure Synapse Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-azure-synapse-analytics.md
Title: Metadata and lineage from Azure Synapse Analytics
-description: This article describes how to connect Azure Synapse Analytics and Azure Purview to track data lineage.
+description: This article describes how to connect Azure Synapse Analytics and Microsoft Purview to track data lineage.
Last updated 09/27/2021
-# How to get lineage from Azure Synapse Analytics into Azure Purview
+# How to get lineage from Azure Synapse Analytics into Microsoft Purview
-This document explains the steps required for connecting an Azure Synapse workspace with an Azure Purview account to track data lineage. The document also gets into the details of the coverage scope and supported lineage capabilities.
+This document explains the steps required for connecting an Azure Synapse workspace with a Microsoft Purview account to track data lineage. The document also gets into the details of the coverage scope and supported lineage capabilities.
## Supported Azure Synapse capabilities
-Currently, Azure Purview captures runtime lineage from the following Azure Synapse pipeline activities:
+Currently, Microsoft Purview captures runtime lineage from the following Azure Synapse pipeline activities:
- [Copy Data](../data-factory/copy-activity-overview.md?context=/azure/synapse-analytics/context/context)
- [Data Flow](../data-factory/concepts-data-flow-overview.md?context=/azure/synapse-analytics/context/context)

> [!IMPORTANT]
-> Azure Purview drops lineage if the source or destination uses an unsupported data storage system.
+> Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
[!INCLUDE[azure-synapse-supported-activity-lineage-capabilities](includes/data-factory-common-supported-capabilities.md)]
-## Access secured Azure Purview account
+## Access secured Microsoft Purview account
-If your Azure Purview account is protected by firewall, learn how to let Azure Synapse [access a secured Azure Purview account](../synapse-analytics/catalog-and-governance/how-to-access-secured-purview-account.md) through Azure Purview private endpoints.
+If your Microsoft Purview account is protected by a firewall, learn how to let Azure Synapse [access a secured Microsoft Purview account](../synapse-analytics/catalog-and-governance/how-to-access-secured-purview-account.md) through Microsoft Purview private endpoints.
-## Bring Azure Synapse lineage into Azure Purview
+## Bring Azure Synapse lineage into Microsoft Purview
-### Step 1: Connect Azure Synapse workspace to your Azure Purview account
+### Step 1: Connect Azure Synapse workspace to your Microsoft Purview account
-You can connect an Azure Synapse workspace to Azure Purview, and the connection enables Azure Synapse to push lineage information to Azure Purview. Follow the steps in [Connect Synapse workspace to Azure Purview](../synapse-analytics/catalog-and-governance/quickstart-connect-azure-purview.md). Multiple Azure Synapse workspaces can connect to a single Azure Purview account for holistic lineage tracking.
+You can connect an Azure Synapse workspace to Microsoft Purview, and the connection enables Azure Synapse to push lineage information to Microsoft Purview. Follow the steps in [Connect Synapse workspace to Microsoft Purview](../synapse-analytics/catalog-and-governance/quickstart-connect-azure-purview.md). Multiple Azure Synapse workspaces can connect to a single Microsoft Purview account for holistic lineage tracking.
### Step 2: Run pipeline in Azure Synapse workspace
After you run the Azure Synapse pipeline, in the Synapse pipeline monitoring vie
:::image type="content" source="../data-factory/media/data-factory-purview/monitor-lineage-reporting-status.png" alt-text="Monitor the lineage reporting status in pipeline monitoring view.":::
-### Step 4: View lineage information in your Azure Purview account
+### Step 4: View lineage information in your Microsoft Purview account
-In your Azure Purview account, you can browse assets and choose type "Azure Synapse Analytics". You can also search the Data Catalog using keywords.
+In your Microsoft Purview account, you can browse assets and choose type "Azure Synapse Analytics". You can also search the Data Catalog using keywords.
Select the Synapse account -> pipeline -> activity, you can view the lineage information.

## Next steps
purview How To Lineage Powerbi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-powerbi.md
Last updated 03/30/2021
-# How to get lineage from Power BI into Azure Purview
+# How to get lineage from Power BI into Microsoft Purview
-This article elaborates on the data lineage aspects of Power BI source in Azure Purview. The prerequisite to see data lineage in Azure Purview for Power BI is to [scan your Power BI.](../purview/register-scan-power-bi-tenant.md)
+This article elaborates on the data lineage aspects of the Power BI source in Microsoft Purview. The prerequisite to see data lineage in Microsoft Purview for Power BI is to [scan your Power BI.](../purview/register-scan-power-bi-tenant.md)
## Common scenarios
-1. After the Power BI source is scanned, data consumers can perform root cause analysis of a report or dashboard from Azure Purview. For any data discrepancy in a report, users can easily identify the upstream datasets and contact their owners if necessary.
+1. After the Power BI source is scanned, data consumers can perform root cause analysis of a report or dashboard from Microsoft Purview. For any data discrepancy in a report, users can easily identify the upstream datasets and contact their owners if necessary.
2. Data producers can see the downstream reports or dashboards consuming their dataset. Before making any changes to their datasets, the data owners can make informed decisions.
3. Users can search by name, endorsement status, sensitivity label, owner, description, and other business facets to return the relevant Power BI artifacts.
-## Power BI artifacts in Azure Purview
+## Power BI artifacts in Microsoft Purview
-Once the [scan of your Power BI](../purview/register-scan-power-bi-tenant.md) is complete, following Power BI artifacts will be inventoried in Azure Purview
+Once the [scan of your Power BI](../purview/register-scan-power-bi-tenant.md) is complete, the following Power BI artifacts will be inventoried in Microsoft Purview:
* Capacity
* Workspaces
The workspace artifacts will show lineage of Dataflow -> Dataset -> Report -> Da
> * Column lineage and transformations inside of PowerBI Datasets is currently not supported
> * Limited information is currently shown for the Data sources from which the PowerBI Dataflow or PowerBI Dataset is created. E.g.: for a SQL Server source of a PowerBI dataset, only the server name is captured.
-## Lineage of Power BI artifacts in Azure Purview
+## Lineage of Power BI artifacts in Microsoft Purview
Users can search for the Power BI artifact by name, description, or other details to see relevant results. Under the asset overview & properties tab the basic details such as description, classification and other information are shown. Under the lineage tab, asset relationships are shown with the upstream and downstream dependencies.
Users can search for the Power BI artifact by name, description, or other detail
## Next steps

-- [Learn about Data lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Learn about Data lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Link Azure Data Factory to push automated lineage](how-to-link-azure-data-factory.md)
purview How To Lineage Spark Atlas Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-spark-atlas-connector.md
Last updated 04/28/2021
# How to use Apache Atlas connector to collect Spark lineage
-Apache Atlas Spark Connector is a hook to track Spark SQL/DataFrame data movements and push metadata changes to Azure Purview Atlas endpoint.
+Apache Atlas Spark Connector is a hook to track Spark SQL/DataFrame data movements and push metadata changes to the Microsoft Purview Atlas endpoint.
## Supported scenarios
This connector supports following tracking:
3. DataFrame movements that have inputs and outputs.

This connector relies on a query listener to retrieve the query and examine the impacts. It will correlate with other systems like Hive and HDFS to track the life cycle of data in Atlas.
-Since Azure Purview supports Atlas API and Atlas native hook, the connector can report lineage to Azure Purview after configured with Spark. The connector could be configured per job or configured as the cluster default setting.
+Since Microsoft Purview supports the Atlas API and Atlas native hook, the connector can report lineage to Microsoft Purview after being configured with Spark. The connector could be configured per job or configured as the cluster default setting.
## Configuration requirement
The following steps are documented based on DataBricks as an example:
   e. Put the package where the Spark cluster can access it. For a DataBricks cluster, the package can be uploaded to a dbfs folder, such as /FileStore/jars.

2. Prepare Connector config
- 1. Get Kafka Endpoint and credential in Azure portal of the Azure Purview Account
- 1. Provide your account with *"Azure Purview Data Curator"* permission
+ 1. Get Kafka Endpoint and credential in Azure portal of the Microsoft Purview Account
+ 1. Provide your account with *"Microsoft Purview Data Curator"* permission
:::image type="content" source="./media/how-to-lineage-spark-atlas-connector/assign-purview-data-curator-role.png" alt-text="Screenshot showing data curator role assignment" lightbox="./media/how-to-lineage-spark-atlas-connector/assign-purview-data-curator-role.png":::
The following steps are documented based on DataBricks as an example:
   atlas.kafka.sasl.mechanism=PLAIN
   atlas.kafka.security.protocol=SASL_SSL
   atlas.kafka.bootstrap.servers=atlas-46c097e6-899a-44aa-9a30-6ccd0b2a2a91.servicebus.windows.net:9093
- atlas.kafka.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<connection string got from your Azure Purview account>";
+ atlas.kafka.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<connection string got from your Microsoft Purview account>";
   ```

   c. Make sure the atlas configuration file is in the Driver's classpath generated in [step 1 Generate package section above](../purview/how-to-lineage-spark-atlas-connector.md#step-1-prepare-spark-atlas-connector-package). In cluster mode, ship this config file to the remote Driver: *--files atlas-application.properties*
-### Step 2. Prepare your Azure Purview account
+### Step 2. Prepare your Microsoft Purview account
After the Atlas Spark model definition is successfully created, follow the steps below:

1. Get the Spark type definition from GitHub: https://github.com/apache/atlas/blob/release-2.1.0-rc3/addons/models/1000-Hadoop/1100-spark_model.json
2. Assign role:
- 1. Navigate to your Azure Purview account and select Access control (IAM)
- 1. Add Users and grant your service principal *Azure Purview Data source administrator* role
+ 1. Navigate to your Microsoft Purview account and select Access control (IAM)
+ 1. Add Users and grant your service principal *Microsoft Purview Data source administrator* role
3. Get auth token:
   1. Open "postman" or similar tools
   1. Use the service principal used in previous step to get the bearer token:
After the Atlas Spark model definition is successfully created, follow below ste
:::image type="content" source="./media/how-to-lineage-spark-atlas-connector/postman-examples.png" alt-text="Screenshot showing postman example" lightbox="./media/how-to-lineage-spark-atlas-connector/postman-examples.png":::
-4. Post Spark Atlas model definition to Azure Purview Account:
- 1. Get Atlas Endpoint of the Azure Purview account from properties section of Azure portal.
- 1. Post Spark type definition into the Azure Purview account:
+4. Post Spark Atlas model definition to Microsoft Purview Account:
+ 1. Get Atlas Endpoint of the Microsoft Purview account from properties section of Azure portal.
+ 1. Post Spark type definition into the Microsoft Purview account:
   * Post: {{endpoint}}/api/atlas/v2/types/typedefs
   * Use the generated access token
   * Body: choose raw and copy all content from GitHub https://github.com/apache/atlas/blob/release-2.1.0-rc3/addons/models/1000-Hadoop/1100-spark_model.json
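The token-and-post steps above can be sketched in Python. This is a minimal illustration, not the documented procedure: the tenant ID, client ID, secret, and account endpoint are placeholders you substitute with your own values, and the `https://purview.azure.net` token resource is an assumption about the audience to verify for your tenant.

```python
# Hypothetical sketch: acquire a bearer token with the service principal
# (OAuth2 client-credentials flow) and POST the Spark typedefs to the
# account's Atlas endpoint. All IDs and endpoints below are placeholders.
import json
import urllib.parse
import urllib.request


def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the client-credentials token request (URL + form body)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://purview.azure.net",  # assumed token audience
    }).encode()
    return url, body


def typedefs_url(atlas_endpoint: str) -> str:
    """Atlas v2 path that accepts the 1100-spark_model.json payload."""
    return atlas_endpoint.rstrip("/") + "/api/atlas/v2/types/typedefs"


def post_typedefs(atlas_endpoint: str, bearer: str, spark_model: dict) -> int:
    """POST the Spark type definitions using the generated access token."""
    req = urllib.request.Request(
        typedefs_url(atlas_endpoint),
        data=json.dumps(spark_model).encode(),
        headers={"Authorization": f"Bearer {bearer}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call
        return resp.status
```

The same request shapes map directly onto the postman steps: the first call replaces the "get auth token" step, the second the typedefs POST.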
spark-submit --class com.microsoft.SparkAtlasTest --master yarn --deploy-mode --
2. Below instructions are for the Cluster Setting: the connector jar and listener settings should be put in the Spark cluster's *conf/spark-defaults.conf*. Spark-submit will read the options in *conf/spark-defaults.conf* and pass them to your application.
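As an illustration of that cluster default setting, a *conf/spark-defaults.conf* could register the connector's listeners for every job. The jar path and listener class names below are assumptions based on the open-source spark-atlas-connector build; verify them against the package you generated in step 1.

```
# Hypothetical cluster-wide setup (verify paths/class names for your build)
spark.driver.extraClassPath /FileStore/jars/spark-atlas-connector-assembly.jar
spark.extraListeners com.hortonworks.spark.atlas.SparkAtlasEventTracker
spark.sql.queryExecutionListeners com.hortonworks.spark.atlas.SparkAtlasEventTracker
spark.sql.streaming.streamingQueryListeners com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker
```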
-### Step 5. Run and Check lineage in Azure Purview account
-Kick off The Spark job and check the lineage info in your Azure Purview account.
+### Step 5. Run and Check lineage in Microsoft Purview account
+Kick off the Spark job and check the lineage info in your Microsoft Purview account.
:::image type="content" source="./media/how-to-lineage-spark-atlas-connector/purview-with-spark-lineage.png" alt-text="Screenshot showing purview with spark lineage" lightbox="./media/how-to-lineage-spark-atlas-connector/purview-with-spark-lineage.png":::
Kick off The Spark job and check the lineage info in your Azure Purview account.
## Next steps

-- [Learn about Data lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Learn about Data lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Link Azure Data Factory to push automated lineage](how-to-link-azure-data-factory.md)
purview How To Lineage Sql Server Integration Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-lineage-sql-server-integration-services.md
Last updated 06/30/2021
-# How to get lineage from SQL Server Integration Services (SSIS) into Azure Purview
+# How to get lineage from SQL Server Integration Services (SSIS) into Microsoft Purview
-This article elaborates on the data lineage aspects of SQL Server Integration Services (SSIS) in Azure Purview.
+This article elaborates on the data lineage aspects of SQL Server Integration Services (SSIS) in Microsoft Purview.
## Prerequisites
On premises SSIS lineage extraction is not supported yet.
| Azure Synapse Analytics \* | Yes |
| SQL Server \* | Yes |
-*\* Azure Purview currently doesn't support query or stored procedure for lineage or scanning. Lineage is limited to table and view sources only.*
+*\* Microsoft Purview currently doesn't support query or stored procedure for lineage or scanning. Lineage is limited to table and view sources only.*
-## How to bring SSIS lineage into Azure Purview
+## How to bring SSIS lineage into Microsoft Purview
-### Step 1. [Connect a Data Factory to Azure Purview](how-to-link-azure-data-factory.md)
+### Step 1. [Connect a Data Factory to Microsoft Purview](how-to-link-azure-data-factory.md)
### Step 2. Trigger SSIS activity execution in Azure Data Factory
You can [run SSIS package with Execute SSIS Package activity](../data-factory/ho
Once Execute SSIS Package activity finishes the execution, you can check lineage report status from the activity output in [Data Factory activity monitor](../data-factory/monitor-visually.md#monitor-activity-runs).

:::image type="content" source="media/how-to-lineage-sql-server-integration-services/activity-report-lineage-status.png" alt-text="ssis-status":::
-### Step 3. Browse lineage Information in your Azure Purview account
+### Step 3. Browse lineage Information in your Microsoft Purview account
- You can browse the Data Catalog by choosing asset type "SQL Server Integration Services".
Once Execute SSIS Package activity finishes the execution, you can check lineage
## Next steps

- [Lift and shift SQL Server Integration Services workloads to the cloud](/sql/integration-services/lift-shift/ssis-azure-lift-shift-ssis-packages-overview)
-- [Learn about Data lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Learn about Data lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Link Azure Data Factory to push automated lineage](how-to-link-azure-data-factory.md)
purview How To Link Azure Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-link-azure-data-factory.md
Title: Connect to Azure Data Factory
-description: This article describes how to connect Azure Data Factory and Azure Purview to track data lineage.
+description: This article describes how to connect Azure Data Factory and Microsoft Purview to track data lineage.
Last updated 11/01/2021
-# How to connect Azure Data Factory and Azure Purview
+# How to connect Azure Data Factory and Microsoft Purview
-This document explains the steps required for connecting an Azure Data Factory account with an Azure Purview account to track data lineage. The document also gets into the details of the coverage scope and supported lineage patterns.
+This document explains the steps required for connecting an Azure Data Factory account with a Microsoft Purview account to track data lineage. The document also gets into the details of the coverage scope and supported lineage patterns.
## View existing Data Factory connections
-Multiple Azure Data Factories can connect to a single Azure Purview to push lineage information. The current limit allows you to connect up 10 Data Factory accounts at a time from the Azure Purview management center. To show the list of Data Factory accounts connected to your Azure Purview account, do the following:
+Multiple Azure Data Factories can connect to a single Microsoft Purview to push lineage information. The current limit allows you to connect up to 10 Data Factory accounts at a time from the Microsoft Purview management center. To show the list of Data Factory accounts connected to your Microsoft Purview account, do the following:
1. Select **Management** on the left navigation pane.
2. Under **Lineage connections**, select **Data Factory**.
Multiple Azure Data Factories can connect to a single Azure Purview to push line
4. Notice the various values for connection **Status**:
- - **Connected**: The data factory is connected to the Azure Purview account.
+ - **Connected**: The data factory is connected to the Microsoft Purview account.
   - **Disconnected**: The data factory has access to the catalog, but it's connected to another catalog. As a result, data lineage won't be reported to the catalog automatically.
   - **CannotAccess**: The current user doesn't have access to the data factory, so the connection status is unknown.
Multiple Azure Data Factories can connect to a single Azure Purview to push line
> > Also, it requires the users to be the data factory's "Owner" or "Contributor".
-Follow the steps below to connect an existing data factory to your Azure Purview account. You can also [connect Data Factory to Azure Purview account from ADF](../data-factory/connect-data-factory-to-azure-purview.md).
+Follow the steps below to connect an existing data factory to your Microsoft Purview account. You can also [connect Data Factory to Microsoft Purview account from ADF](../data-factory/connect-data-factory-to-azure-purview.md).
1. Select **Management** on the left navigation pane.
2. Under **Lineage connections**, select **Data Factory**.
Follow the steps below to connect an existing data factory to your Azure Purview
:::image type="content" source="./media/how-to-link-azure-data-factory/connect-data-factory.png" alt-text="Screenshot showing how to connect Azure Data Factory." lightbox="./media/how-to-link-azure-data-factory/connect-data-factory.png":::
- Some Data Factory instances might be disabled if the data factory is already connected to the current Azure Purview account, or the data factory doesn't have a managed identity.
+ Some Data Factory instances might be disabled if the data factory is already connected to the current Microsoft Purview account, or the data factory doesn't have a managed identity.
- A warning message will be displayed if any of the selected Data Factories are already connected to other Azure Purview account. By selecting OK, the Data Factory connection with the other Azure Purview account will be disconnected. No additional confirmations are required.
+ A warning message will be displayed if any of the selected Data Factories are already connected to another Microsoft Purview account. By selecting OK, the Data Factory connection with the other Microsoft Purview account will be disconnected. No additional confirmation is required.
:::image type="content" source="./media/how-to-link-azure-data-factory/warning-for-disconnect-factory.png" alt-text="Screenshot showing warning to disconnect Azure Data Factory.":::
Follow the steps below to connect an existing data factory to your Azure Purview
### How authentication works
-Data factory's managed identity is used to authenticate lineage push operations from data factory to Azure Purview. When connecting data factory to Azure Purview on UI, it adds the role assignment automatically.
+Data factory's managed identity is used to authenticate lineage push operations from the data factory to Microsoft Purview. When you connect a data factory to Microsoft Purview in the UI, the role assignment is added automatically.
-Grant the data factory's managed identity **Data Curator** role on Azure Purview **root collection**. Learn more about [Access control in Azure Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+Grant the data factory's managed identity **Data Curator** role on Microsoft Purview **root collection**. Learn more about [Access control in Microsoft Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
### Remove data factory connections
To remove a data factory connection, do the following:
## Supported Azure Data Factory activities
-Azure Purview captures runtime lineage from the following Azure Data Factory activities:
+Microsoft Purview captures runtime lineage from the following Azure Data Factory activities:
- [Copy Data](../data-factory/copy-activity-overview.md)
- [Data Flow](../data-factory/concepts-data-flow-overview.md)
- [Execute SSIS Package](../data-factory/how-to-invoke-ssis-package-ssis-activity.md)

> [!IMPORTANT]
-> Azure Purview drops lineage if the source or destination uses an unsupported data storage system.
+> Microsoft Purview drops lineage if the source or destination uses an unsupported data storage system.
-The integration between Data Factory and Azure Purview supports only a subset of the data systems that Data Factory supports, as described in the following sections.
+The integration between Data Factory and Microsoft Purview supports only a subset of the data systems that Data Factory supports, as described in the following sections.
[!INCLUDE[data-factory-supported-lineage-capabilities](includes/data-factory-common-supported-capabilities.md)]
The integration between Data Factory and Azure Purview supports only a subset of
Refer to [supported data stores](how-to-lineage-sql-server-integration-services.md#supported-data-stores).
-## Access secured Azure Purview account
+## Access secured Microsoft Purview account
-If your Azure Purview account is protected by firewall, learn how to let Data Factory [access a secured Azure Purview account](../data-factory/how-to-access-secured-purview-account.md) through Azure Purview private endpoints.
+If your Microsoft Purview account is protected by a firewall, learn how to let Data Factory [access a secured Microsoft Purview account](../data-factory/how-to-access-secured-purview-account.md) through Microsoft Purview private endpoints.
-## Bring Data Factory lineage into Azure Purview
+## Bring Data Factory lineage into Microsoft Purview
-For an end to end walkthrough, follow the [Tutorial: Push Data Factory lineage data to Azure Purview](../data-factory/turorial-push-lineage-to-purview.md).
+For an end-to-end walkthrough, follow the [Tutorial: Push Data Factory lineage data to Microsoft Purview](../data-factory/turorial-push-lineage-to-purview.md).
## Supported lineage patterns
-There are several patterns of lineage that Azure Purview supports. The generated lineage data is based on the type of source and sink used in the Data Factory activities. Although Data Factory supports over 80 source and sinks, Azure Purview supports only a subset, as listed in [Supported Azure Data Factory activities](#supported-azure-data-factory-activities).
+There are several patterns of lineage that Microsoft Purview supports. The generated lineage data is based on the type of source and sink used in the Data Factory activities. Although Data Factory supports over 80 sources and sinks, Microsoft Purview supports only a subset, as listed in [Supported Azure Data Factory activities](#supported-azure-data-factory-activities).
To configure Data Factory to send lineage information, see [Get started with lineage](catalog-lineage-user-guide.md#get-started-with-lineage).
An example of this pattern would be the following:
### Data movement with 1:1 lineage and wildcard support
-Another common scenario for capturing lineage, is using a wildcard to copy files from a single input dataset to a single output dataset. The wildcard allows the copy activity to match multiple files for copying using a common portion of the file name. Azure Purview captures file-level lineage for each individual file copied by the corresponding copy activity.
+Another common scenario for capturing lineage is using a wildcard to copy files from a single input dataset to a single output dataset. The wildcard allows the copy activity to match multiple files for copying using a common portion of the file name. Microsoft Purview captures file-level lineage for each individual file copied by the corresponding copy activity.
An example of this pattern would be the following:
An example of this pattern would be the following:
### Data movement with n:1 lineage
-You can use Data Flow activities to perform data operations like merge, join, and so on. More than one source dataset can be used to produce a target dataset. In this example, Azure Purview captures file-level lineage for individual input files to a SQL table that is part of a Data Flow activity.
+You can use Data Flow activities to perform data operations like merge, join, and so on. More than one source dataset can be used to produce a target dataset. In this example, Microsoft Purview captures file-level lineage for individual input files to a SQL table that is part of a Data Flow activity.
An example of this pattern would be the following:
An example of this pattern would be the following:
### Lineage for resource sets
-A resource set is a logical object in the catalog that represents many partition files in the underlying storage. For more information, see [Understanding Resource sets](concept-resource-sets.md). When Azure Purview captures lineage from the Azure Data Factory, it applies the rules to normalize the individual partition files and create a single logical object.
+A resource set is a logical object in the catalog that represents many partition files in the underlying storage. For more information, see [Understanding Resource sets](concept-resource-sets.md). When Microsoft Purview captures lineage from Azure Data Factory, it applies rules to normalize the individual partition files and create a single logical object.
In the following example, an Azure Data Lake Gen2 resource set is produced from an Azure Blob:
In the following example, an Azure Data Lake Gen2 resource set is produced from
## Next steps
-[Tutorial: Push Data Factory lineage data to Azure Purview](../data-factory/turorial-push-lineage-to-purview.md)
+[Tutorial: Push Data Factory lineage data to Microsoft Purview](../data-factory/turorial-push-lineage-to-purview.md)
[Catalog lineage user guide](catalog-lineage-user-guide.md)
purview How To Link Azure Data Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-link-azure-data-share.md
Title: Connect to Azure Data Share
-description: This article describes how to connect an Azure Data Share account with Azure Purview to search assets and track data lineage.
+description: This article describes how to connect an Azure Data Share account with Microsoft Purview to search assets and track data lineage.
Last updated 03/14/2022
-# How to connect Azure Data Share and Azure Purview
+# How to connect Azure Data Share and Microsoft Purview
-This article discusses how to connect your [Azure Data Share](../data-share/overview.md) account with Azure Purview and govern the shared datasets (both outgoing and incoming) in your data estate. Various data governance personas can discover and track lineage of data across boundaries like organizations, departments and even data centers.
+This article discusses how to connect your [Azure Data Share](../data-share/overview.md) account with Microsoft Purview and govern the shared datasets (both outgoing and incoming) in your data estate. Various data governance personas can discover and track lineage of data across boundaries like organizations, departments and even data centers.
## Common Scenarios
A report has incorrect information because of upstream data issues from an exter
Data producers want to know who will be impacted upon making a change to their dataset. Using lineage, a data producer can easily understand the impact of the downstream internal or external partners who are consuming data using Azure Data Share.
-## Azure Data Share and Azure Purview connected experience
+## Azure Data Share and Microsoft Purview connected experience
-To connect your Azure Data Share and Azure Purview account, do the following:
+To connect your Azure Data Share and Microsoft Purview account, do the following:
-1. Create an Azure Purview account. All the Data Share lineage information will be collected by an Azure Purview account. You can use an existing one or create a new Azure Purview account.
+1. Create a Microsoft Purview account. All the Data Share lineage information will be collected by this account. You can use an existing Microsoft Purview account or create a new one.
-1. Connect your Azure Data Share to your Azure Purview account.
+1. Connect your Azure Data Share to your Microsoft Purview account.
- 1. In the Azure Purview portal, you can go to **Management Center** and connect your Azure Data Share under the **External connections** section.
- 1. Select **+ New** on the top bar, find your Azure Data Share in the pop-up side bar and add the Data Share account. Run a snapshot job after connecting your Data Share to Azure Purview account, so that the Data Share assets and lineage information is visible in Azure Purview.
+ 1. In the Microsoft Purview portal, you can go to **Management Center** and connect your Azure Data Share under the **External connections** section.
+ 1. Select **+ New** on the top bar, find your Azure Data Share in the pop-up side bar, and add the Data Share account. Run a snapshot job after connecting your Data Share to your Microsoft Purview account, so that the Data Share assets and lineage information are visible in Microsoft Purview.
   :::image type="content" source="media/how-to-link-azure-data-share/connect-to-data-share.png" alt-text="Management center to link Azure Data Share":::

1. Execute your snapshot in Azure Data Share.
- - Once the Azure Data share connection is established with Azure Purview, you can execute a snapshot for your existing shares.
+ - Once the Azure Data share connection is established with Microsoft Purview, you can execute a snapshot for your existing shares.
   - If you don't have any existing shares, go to the Azure Data Share portal to [share your data](../data-share/share-your-data.md) [and subscribe to a data share](../data-share/subscribe-to-data-share.md).
- - Once the share snapshot is complete, you can view associated Data Share assets and lineage in Azure Purview.
+ - Once the share snapshot is complete, you can view associated Data Share assets and lineage in Microsoft Purview.
-1. Discover Data Share accounts and share information in your Azure Purview account.
+1. Discover Data Share accounts and share information in your Microsoft Purview account.
- - In the home page of Azure Purview account, select **Browse by asset type** and select the **Azure Data Share** tile. You can search for an account name, share name, share snapshot, or partner organization. Otherwise apply filters on the Search result page for account name, share type (sent vs received shares).
+ - On the home page of the Microsoft Purview account, select **Browse by asset type** and select the **Azure Data Share** tile. You can search for an account name, share name, share snapshot, or partner organization. Otherwise, apply filters on the search result page for account name and share type (sent vs. received shares).
   :::image type="content" source="media/how-to-link-azure-data-share/azure-data-share-search-result-page.png" alt-text="Azure Data share in Search result page":::

   >[!Important]
- >For Data Share assets to show in Azure Purview, a snapshot job must be run after you connect your Data Share to Azure Purview.
+ >For Data Share assets to show in Microsoft Purview, a snapshot job must be run after you connect your Data Share to Microsoft Purview.
1. Track lineage of datasets shared with Azure Data Share.
- - From the Azure Purview search result page, choose the Data share snapshot (received/sent) and select the **Lineage** tab, to see a lineage graph with upstream and downstream dependencies.
+ - From the Microsoft Purview search result page, choose the Data share snapshot (received/sent) and select the **Lineage** tab, to see a lineage graph with upstream and downstream dependencies.
:::image type="content" source="media/how-to-link-azure-data-share/azure-data-share-lineage.png" alt-text="Lineage of Datasets shared using Azure Data Share":::
purview How To Manage Quotas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-manage-quotas.md
Title: Manage resources and quotas-
-description: Learn about the quotas and limits on resources for Azure Purview and how to request quota increases.
+
+description: Learn about the quotas and limits on resources for Microsoft Purview and how to request quota increases.
Last updated 03/21/2022
-# Manage and increase quotas for resources with Azure Purview
+# Manage and increase quotas for resources with Microsoft Purview
-This article highlights the limits that currently exist in the Azure Purview service. These limits are also known as quotas.
+This article highlights the limits that currently exist in the Microsoft Purview service. These limits are also known as quotas.
-## Azure Purview limits
+## Microsoft Purview limits
|**Resource**| **Default Limit** |**Maximum Limit**|
||||
-|Azure Purview accounts per region, per tenant (all subscriptions combined)|3|Contact Support|
+|Microsoft Purview accounts per region, per tenant (all subscriptions combined)|3|Contact Support|
|Data Map throughput^ <br><small>There's no default limit on the data map metadata storage</small>| 10 capacity units <br><small>250 operations per second</small> | 100 capacity units <br><small>2,500 operations per second</small> |
|vCores available for scanning, per account*|160|160|
|Concurrent scans per Purview account. The limit is based on the type of data sources scanned*|5 | 10 |
This article highlights the limits that currently exist in the Azure Purview ser
## Request quota increase
-Use the following steps to create a new support request from the Azure portal to increase quota for Azure Purview. You can create a quota request for Azure Purview accounts in a subscription, accounts in a tenant and the data map throughput of a specific account.
+Use the following steps to create a new support request from the Azure portal to increase quota for Microsoft Purview. You can create a quota request for Microsoft Purview accounts in a subscription, accounts in a tenant, and the data map throughput of a specific account.
1. On the [Azure portal](https://portal.azure.com) menu, select **Help + support**.
Use the following steps to create a new support request from the Azure portal to
1. For **Subscription**, select the subscription whose quota you want to increase.
-1. For **Quota type**, select Azure Purview. Then select **Next**.
+1. For **Quota type**, select Microsoft Purview. Then select **Next**.
   :::image type="content" source="./media/how-to-manage-quotas/enter-support-details.png" alt-text="Screenshot showing how to enter support information" border="true":::

1. In the **Details** window, select **Enter details** to enter additional information.
1. Choose your **Quota type**, scope (either location or account), and what you wish the new limit to be.
- :::image type="content" source="./media/how-to-manage-quotas/enter-quota-amount.png" alt-text="Screenshot showing how to enter quota amount for Azure Purview accounts per subscription" border="true":::
+ :::image type="content" source="./media/how-to-manage-quotas/enter-quota-amount.png" alt-text="Screenshot showing how to enter quota amount for Microsoft Purview accounts per subscription" border="true":::
1. Enter the rest of the required support information. Review and create the support request.

## Next steps

> [!div class="nextstepaction"]
->[Concept: Elastic Data Map in Azure Purview](concept-elastic-data-map.md)
+>[Concept: Elastic Data Map in Microsoft Purview](concept-elastic-data-map.md)
> [!div class="nextstepaction"]
->[Tutorial: Scan data with Azure Purview](tutorial-scan-data.md)
+>[Tutorial: Scan data with Microsoft Purview](tutorial-scan-data.md)
> [!div class="nextstepaction"]
>[Tutorial: Navigate the home page and search for an asset](tutorial-asset-search.md)
purview How To Manage Term Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-manage-term-templates.md
Title: How to manage term templates for business glossary
-description: Learn how to manage term templates for business glossary in an Azure Purview data catalog.
+description: Learn how to manage term templates for business glossary in a Microsoft Purview data catalog.
Last updated 4/12/2022
# How to manage term templates for business glossary
-Azure Purview allows you to create a glossary of terms that are important for enriching your data. Each new term added to your Azure Purview Data Catalog Glossary is based on a term template that determines the fields for the term. This article describes how to create a term template and custom attributes that can be associated to glossary terms.
+Microsoft Purview allows you to create a glossary of terms that are important for enriching your data. Each new term added to your Microsoft Purview Data Catalog Glossary is based on a term template that determines the fields for the term. This article describes how to create a term template and custom attributes that can be associated to glossary terms.
## Manage term templates and custom attributes
purview How To Monitor Scan Runs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-monitor-scan-runs.md
Title: Monitor scan runs in Azure Purview
-description: This guide describes how to monitor the scan runs in Azure Purview.
+ Title: Monitor scan runs in Microsoft Purview
+description: This guide describes how to monitor the scan runs in Microsoft Purview.
Last updated 04/04/2022
-# Monitor scan runs in Azure Purview
+# Monitor scan runs in Microsoft Purview
-In Azure Purview, you can register and scan various types of data sources, and you can view the scan status over time. This article outlines how to monitor and get a bird's eye view of your scan runs in Azure Purview.
+In Microsoft Purview, you can register and scan various types of data sources, and you can view the scan status over time. This article outlines how to monitor and get a bird's eye view of your scan runs in Microsoft Purview.
> [!IMPORTANT]
> The monitoring experience is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.

## Monitor scan runs
-1. Go to your Azure Purview account -> open **Azure Purview Studio** -> **Data map** -> **Monitoring**.
+1. Go to your Microsoft Purview account -> open **Microsoft Purview Studio** -> **Data map** -> **Monitoring**.
1. The high-level KPIs show total scan runs within a period. The time period defaults to the last 30 days; you can also select the last seven days. Based on the time filter selected, you can see the distribution of successful, failed, and canceled scan runs by week or by day in the graph.
In Azure Purview, you can register and scan various types of data sources, and y
## Next steps
-* [Azure Purview supported data sources and file types](azure-purview-connector-overview.md)
+* [Microsoft Purview supported data sources and file types](azure-purview-connector-overview.md)
* [Manage data sources](manage-data-sources.md)
* [Scan and ingestion](concept-scans-and-ingestion.md)
purview How To Monitor With Azure Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-monitor-with-azure-monitor.md
Title: How to monitor Azure Purview
-description: Learn how to configure Azure Purview metrics, alerts, and diagnostic settings by using Azure Monitor.
+ Title: How to monitor Microsoft Purview
+description: Learn how to configure Microsoft Purview metrics, alerts, and diagnostic settings by using Azure Monitor.
Last updated 04/07/2022
-# Azure Purview metrics in Azure Monitor
+# Microsoft Purview metrics in Azure Monitor
-This article describes how to configure metrics, alerts, and diagnostic settings for Azure Purview using Azure Monitor.
+This article describes how to configure metrics, alerts, and diagnostic settings for Microsoft Purview using Azure Monitor.
-## Monitor Azure Purview
+## Monitor Microsoft Purview
-Azure Purview admins can use Azure Monitor to track the operational state of Azure Purview account. Metrics are collected to provide data points for you to track potential problems, troubleshoot, and improve the reliability of the Azure Purview account. The metrics are sent to Azure monitor for events occurring in Azure Purview.
+Microsoft Purview admins can use Azure Monitor to track the operational state of a Microsoft Purview account. Metrics are collected to provide data points for you to track potential problems, troubleshoot, and improve the reliability of the Microsoft Purview account. The metrics are sent to Azure Monitor for events occurring in Microsoft Purview.
## Aggregated metrics
-The metrics can be accessed from the Azure portal for an Azure Purview account. Access to the metrics are controlled by the role assignment of Azure Purview account. Users need to be part of the "Monitoring Reader" role in Azure Purview to see the metrics. Check out [Monitoring Reader Role permissions](../azure-monitor/roles-permissions-security.md#built-in-monitoring-roles) to learn more about the roles access levels.
+The metrics can be accessed from the Azure portal for a Microsoft Purview account. Access to the metrics is controlled by the role assignment of the Microsoft Purview account. Users need to be part of the "Monitoring Reader" role in Microsoft Purview to see the metrics. Check out [Monitoring Reader Role permissions](../azure-monitor/roles-permissions-security.md#built-in-monitoring-roles) to learn more about the roles' access levels.
-The person who created the Azure Purview account automatically gets permissions to view metrics. If anyone else wants to see metrics, add them to the **Monitoring Reader** role, by following these steps:
+The person who created the Microsoft Purview account automatically gets permissions to view metrics. If anyone else wants to see metrics, add them to the **Monitoring Reader** role, by following these steps:
### Add a user to the Monitoring Reader role
-To add a user to the **Monitoring Reader** role, the owner of Azure Purview account or the Subscription owner can follow these steps:
+To add a user to the **Monitoring Reader** role, the owner of the Microsoft Purview account or the subscription owner can follow these steps:
-1. Go to the [Azure portal](https://portal.azure.com) and search for the Azure Purview account name.
+1. Go to the [Azure portal](https://portal.azure.com) and search for the Microsoft Purview account name.
1. Select **Access control (IAM)**.
To add a user to the **Monitoring Reader** role, the owner of Azure Purview acco
## Metrics visualization
-Users in the **Monitoring Reader** role can see the aggregated metrics and diagnostic logs sent to Azure Monitor. The metrics are listed in the Azure portal for the corresponding Azure Purview account. In the Azure portal, select the Metrics section to see the list of all available metrics.
+Users in the **Monitoring Reader** role can see the aggregated metrics and diagnostic logs sent to Azure Monitor. The metrics are listed in the Azure portal for the corresponding Microsoft Purview account. In the Azure portal, select the Metrics section to see the list of all available metrics.
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/purview-metrics.png" alt-text="Screenshot showing available Azure Purview metrics section." lightbox="./media/how-to-monitor-with-azure-monitor/purview-metrics.png":::
+ :::image type="content" source="./media/how-to-monitor-with-azure-monitor/purview-metrics.png" alt-text="Screenshot showing available Microsoft Purview metrics section." lightbox="./media/how-to-monitor-with-azure-monitor/purview-metrics.png":::
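The same metrics can also be pulled outside the portal. The following is a minimal sketch, not taken from the article: it uses the generic `az monitor metrics` commands, which accept any Azure resource ID, and the placeholder subscription, resource group, and account names are assumptions. List the metric definitions first rather than relying on a hard-coded metric name.

```shell
# Hedged sketch: discover and query metrics for a Purview account.
# The resource ID below is a placeholder -- substitute your own values.
RESOURCE_ID="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Purview/accounts/<purview-account>"

# List the metric names the account actually exposes.
az monitor metrics list-definitions --resource "$RESOURCE_ID" \
  --query "[].name.value" -o tsv

# Query one of the returned names over one-hour intervals.
az monitor metrics list --resource "$RESOURCE_ID" \
  --metric "<metric-name-from-above>" --interval PT1H
```

Because the role checks described above also apply to the CLI, the caller needs at least **Monitoring Reader** on the account for these commands to succeed.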
-Azure Purview users can also access the metrics page directly from the management center of the Azure Purview account. Select Azure Monitor in the main page of Azure Purview management center to launch Azure portal.
+Microsoft Purview users can also access the metrics page directly from the management center of the Microsoft Purview account. Select Azure Monitor in the main page of the Microsoft Purview management center to launch the Azure portal.
- :::image type="content" source="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png" alt-text="Screenshot to launch Azure Purview metrics from management center." lightbox="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png":::
+ :::image type="content" source="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png" alt-text="Screenshot to launch Microsoft Purview metrics from management center." lightbox="./media/how-to-monitor-with-azure-monitor/launch-metrics-from-management.png":::
### Available metrics
The following table contains the list of metrics available to explore in the Azu
## Diagnostic Logs to Azure Storage account
-Raw telemetry events are emitted to Azure Monitor. Events can be logged to a customer storage account of choice for further analysis. Exporting of logs is done via the Diagnostic settings for the Azure Purview account on the Azure portal.
+Raw telemetry events are emitted to Azure Monitor. Events can be logged to a customer storage account of choice for further analysis. Exporting of logs is done via the Diagnostic settings for the Microsoft Purview account on the Azure portal.
-Follow the steps to create a Diagnostic setting for your Azure Purview account.
+Follow the steps to create a Diagnostic setting for your Microsoft Purview account.
1. Create a new diagnostic setting to collect platform logs and metrics by following this article: [Create diagnostic settings to send platform logs and metrics to different destinations](../azure-monitor/essentials/diagnostic-settings.md). Select only Azure Storage account as the destination.
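The portal step above can be sketched with the Azure CLI as well. This is a hedged example, not from the article: the diagnostic-setting name, the placeholder resource IDs, and the `ScanStatusLogEvent` log category are assumptions you should verify against the categories your own account exposes.

```shell
# Hedged sketch: route Purview platform logs to a storage account.
# All names and IDs below are placeholders -- verify the log category
# against `az monitor diagnostic-settings categories list` for your account.
az monitor diagnostic-settings create \
  --name "purview-to-storage" \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Purview/accounts/<purview-account>" \
  --storage-account "<storage-account-resource-id>" \
  --logs '[{"category": "ScanStatusLogEvent", "enabled": true}]'
```

Once the setting is active, the raw events described below land in the chosen storage account for further analysis.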
The event tracks the scan life cycle. A scan operation follows progress through
"level": "<The log severity level. Possible values are: |Informational |Error >",
- "location": "<The location of the Azure Purview account>",
+ "location": "<The location of the Microsoft Purview account>",
}
```
The Sample log for an event instance is shown in the below section.
## Next steps
-[Elastic data map in Azure Purview](concept-elastic-data-map.md)
+[Elastic data map in Microsoft Purview](concept-elastic-data-map.md)
purview How To Request Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-request-access.md
Title: How to request access to a data source in Azure Purview.
-description: This article describes how a user can request access to a data source from within Azure Purview.
+ Title: How to request access to a data source in Microsoft Purview.
+description: This article describes how a user can request access to a data source from within Microsoft Purview.
[!INCLUDE [Region Notice](./includes/workflow-regions.md)]
-If you discover a data asset in the catalog that you would like access to, you can request access directly through Azure Purview.
+If you discover a data asset in the catalog that you would like access to, you can request access directly through Microsoft Purview.
The request triggers a workflow that asks the owners of the data resource to grant you access to that data source. This article outlines how to make an access request.
-1. To find a data asset, use Azure Purview's [search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) functionality.
+1. To find a data asset, use Microsoft Purview's [search](how-to-search-catalog.md) or [browse](how-to-browse-catalog.md) functionality.
- :::image type="content" source="./media/how-to-request-access/search-or-browse.png" alt-text="Screenshot of the Azure Purview studio, with the search bar and browse buttons highlighted.":::
+ :::image type="content" source="./media/how-to-request-access/search-or-browse.png" alt-text="Screenshot of the Microsoft Purview studio, with the search bar and browse buttons highlighted.":::
1. Select the asset to go to asset details.
This article outlines how to make an access request.
## Next steps -- [What are Azure Purview workflows](concept-workflow.md)
+- [What are Microsoft Purview workflows](concept-workflow.md)
- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md) - [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md)
purview How To Resource Set Pattern Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-resource-set-pattern-rules.md
Last updated 09/27/2021
# Create resource set pattern rules
-At-scale data processing systems typically store a single table in storage as multiple files. This concept is represented in Azure Purview by using resource sets. A resource set is a single object in the data catalog that represents a large number of assets in storage. To learn more, see [Understanding resource sets](concept-resource-sets.md).
+At-scale data processing systems typically store a single table in storage as multiple files. This concept is represented in Microsoft Purview by using resource sets. A resource set is a single object in the data catalog that represents a large number of assets in storage. To learn more, see [Understanding resource sets](concept-resource-sets.md).
-When scanning a storage account, Azure Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Azure Purview's resource set grouping may not accurately reflect your data estate. Resource set pattern rules allow you to customize or override how Azure Purview detects which assets are grouped as resource sets and how they are displayed within the catalog.
+When scanning a storage account, Microsoft Purview uses a set of defined patterns to determine if a group of assets is a resource set. In some cases, Microsoft Purview's resource set grouping may not accurately reflect your data estate. Resource set pattern rules allow you to customize or override how Microsoft Purview detects which assets are grouped as resource sets and how they are displayed within the catalog.
Pattern rules are currently supported in the following source types: - Azure Data Lake Storage Gen2
purview How To Search Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-search-catalog.md
Title: 'How to: search the Data Catalog'
-description: This article gives an overview of how to search the Azure Purview data catalog.
+description: This article gives an overview of how to search the Microsoft Purview data catalog.
Last updated 04/11/2022
-# Search the Azure Purview Data Catalog
+# Search the Microsoft Purview Data Catalog
-After data is scanned and ingested into the Azure Purview data map, data consumers need to easily find the data needed for their analytics or governance workloads. Data discovery can be time consuming because you may not know where to find the data that you want. Even after finding the data, you may have doubts about whether you can trust the data and take a dependency on it.
+After data is scanned and ingested into the Microsoft Purview data map, data consumers need to easily find the data needed for their analytics or governance workloads. Data discovery can be time consuming because you may not know where to find the data that you want. Even after finding the data, you may have doubts about whether you can trust the data and take a dependency on it.
-The goal of search in Azure Purview is to speed up the process of quickly finding the data that matters. This article outlines how to search the Azure Purview data catalog to quickly find the data you're looking for.
+The goal of search in Microsoft Purview is to speed up the process of quickly finding the data that matters. This article outlines how to search the Microsoft Purview data catalog to quickly find the data you're looking for.
## Searching the catalog
-The search bar can be quickly accessed from the top bar of the Azure Purview Studio UX. In the data catalog home page, the search bar is in the center of the screen.
+The search bar can be quickly accessed from the top bar of the Microsoft Purview Studio UX. In the data catalog home page, the search bar is in the center of the screen.
Once you click on the search bar, you'll be presented with your search history and the items recently accessed in the data catalog. This allows you to quickly pick up from previous data exploration that was already done. :::image type="content" source="./media/how-to-search-catalog/search-no-keywords.png" alt-text="Screenshot showing the search bar before any keywords have been entered" border="true":::
-Enter in keywords that help narrow down your search such as name, data type, classifications, and glossary terms. As you enter in search keywords, Azure Purview dynamically suggests results and searches that may fit your needs. To complete your search, click on "View search results" or press "Enter".
+Enter keywords that help narrow down your search, such as a name, data type, classification, or glossary term. As you type, Microsoft Purview dynamically suggests results and searches that may fit your needs. To complete your search, select "View search results" or press "Enter".
:::image type="content" source="./media/how-to-search-catalog/search-keywords.png" alt-text="Screenshot showing the search bar as a user enters in keywords" border="true":::
-Once you enter in your search, Azure Purview returns a list of data assets and glossary terms a user is a data reader for to that matched to the keywords entered in.
+Once you enter your search, Microsoft Purview returns a list of the data assets and glossary terms that you have data reader access to and that match the keywords you entered.
-The Azure Purview relevance engine sorts through all the matches and ranks them based on what it believes their usefulness is to a user. For example, a data consumer is likely more interested in a table curated by a data steward that matches on multiple keywords than an unannotated folder. Many factors determine an asset's relevance score and the Azure Purview search team is constantly tuning the relevance engine to ensure the top search results have value to you.
+The Microsoft Purview relevance engine sorts through all the matches and ranks them based on what it believes their usefulness is to a user. For example, a data consumer is likely more interested in a table curated by a data steward that matches on multiple keywords than an unannotated folder. Many factors determine an asset's relevance score, and the Microsoft Purview search team is constantly tuning the relevance engine to ensure the top search results have value to you.
If the top results don't include the assets you're looking for, you can use the facets on the left-hand side to filter down by business metadata such as glossary terms, classifications, and the containing collection. If you're interested in a particular data source type such as Azure Data Lake Storage Gen2 or Azure SQL Database, you can use a pill filter to narrow down your search.
From the search results page, you can select an asset to view details such as sc
:::image type="content" source="./media/how-to-search-catalog/search-view-asset.png" alt-text="Screenshot showing the asset details page" border="true":::
-## Searching Azure Purview in connected services
+## Searching Microsoft Purview in connected services
-Once you register your Azure Purview instance to an Azure Data Factory or an Azure Synapse Analytics workspace, you can search the Azure Purview data catalog directly from those services. To learn more, see [Discover data in ADF using Azure Purview](../data-factory/how-to-discover-explore-purview-data.md) and [Discover data in Synapse using Azure Purview](../synapse-analytics/catalog-and-governance/how-to-discover-connect-analyze-azure-purview.md).
+Once you register your Microsoft Purview instance to an Azure Data Factory or an Azure Synapse Analytics workspace, you can search the Microsoft Purview data catalog directly from those services. To learn more, see [Discover data in ADF using Microsoft Purview](../data-factory/how-to-discover-explore-purview-data.md) and [Discover data in Synapse using Microsoft Purview](../synapse-analytics/catalog-and-governance/how-to-discover-connect-analyze-azure-purview.md).
## Bulk edit search results
-If you're looking to make changes to multiple assets returned by search, Azure Purview lets you modify glossary terms, classifications, and contacts in bulk. To learn more, see the [bulk edit assets](how-to-bulk-edit-assets.md) guide.
+If you're looking to make changes to multiple assets returned by search, Microsoft Purview lets you modify glossary terms, classifications, and contacts in bulk. To learn more, see the [bulk edit assets](how-to-bulk-edit-assets.md) guide.
## Browse the data catalog
-While searching is great if you know what you're looking for, there are times where data consumers wish to explore the data available to them. The Azure Purview data catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog. For more information, see [browse the data catalog](how-to-browse-catalog.md).
+While searching is great if you know what you're looking for, there are times where data consumers wish to explore the data available to them. The Microsoft Purview data catalog offers a browse experience that enables users to explore what data is available to them either by collection or through traversing the hierarchy of each data source in the catalog. For more information, see [browse the data catalog](how-to-browse-catalog.md).
## Search query syntax
-All search queries consist of keywords and operators. A keyword is a something that would be part of an asset's properties. Potential keywords can be a classification, glossary term, asset description, or an asset name. A keyword can be just a part of the property you're looking to match to. Use keywords and the operators to ensure Azure Purview returns the assets you're looking for.
+All search queries consist of keywords and operators. A keyword is something that would be part of an asset's properties. Potential keywords can be a classification, glossary term, asset description, or an asset name. A keyword can be just a part of the property you're looking to match. Use keywords and operators to ensure Microsoft Purview returns the assets you're looking for.
Certain characters including spaces, dashes, and commas are interpreted as delimiters. Searching a string like `hive-database` is the same as searching two keywords `hive database`.
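The delimiter behavior described above can be illustrated with a small sketch. This is not Purview's actual tokenizer; the delimiter set (spaces, dashes, commas) is taken from the text, and the real engine may treat more characters this way.

```python
import re

# Delimiters named in the text: spaces, dashes, and commas.
# (Assumed set for illustration only.)
DELIMITERS = r"[\s,\-]+"

def tokenize(query):
    """Split a search query into keywords on delimiter characters."""
    return [token for token in re.split(DELIMITERS, query) if token]

# `hive-database` yields the same keywords as `hive database`.
print(tokenize("hive-database"))                               # ['hive', 'database']
print(tokenize("hive database") == tokenize("hive-database"))  # True
```

Because runs of delimiters collapse into one split, `hive, database` and `hive-database` also produce identical keyword lists.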
purview How To View Self Service Data Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-view-self-service-data-access-policy.md
Last updated 03/22/2022
# How to view self-service data access policies
-In an Azure Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [data use governance enabled](how-to-enable-data-use-governance.md), a self-service policy is generated when a data access request is approved.
+In a Microsoft Purview catalog, you can now [request access](how-to-request-access.md) to data assets. If policies are currently available for the data source type and the data source has [data use governance enabled](how-to-enable-data-use-governance.md), a self-service policy is generated when a data access request is approved.
This article describes how to view self-service data access policies that have been auto-generated by approved access requests.
This article describes how to view self-service data access policies that have b
Self-service policies must exist for them to be viewed. To enable and create self-service policies, follow these articles:
-1. [Enable Data Use Governance](how-to-enable-data-use-governance.md) - this will allow Azure Purview to create policies for your sources.
-1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Azure Purview](how-to-request-access.md).
+1. [Enable Data Use Governance](how-to-enable-data-use-governance.md) - this will allow Microsoft Purview to create policies for your sources.
+1. [Create a self-service data access workflow](./how-to-workflow-self-service-data-access-hybrid.md) - this will enable [users to request access to data sources from within Microsoft Purview](how-to-request-access.md).
1. [Approve a self-service data access request](how-to-workflow-manage-requests-approvals.md#approvals) - after approving a request, if your workflow from the previous step includes the ability to create a self-service data policy, your policy will be created and will be viewable. ## Permission
-Only the creator of your Azure Purview account, or users with [**Policy Admin**](catalog-permissions.md#roles) permissions can view self-service data access policies.
+Only the creator of your Microsoft Purview account, or users with [**Policy Admin**](catalog-permissions.md#roles) permissions can view self-service data access policies.
-If you need to add or request permissions, follow the [Azure Purview permissions documentation](catalog-permissions.md#add-users-to-roles).
+If you need to add or request permissions, follow the [Microsoft Purview permissions documentation](catalog-permissions.md#add-users-to-roles).
## Steps to view self-service data access policies
-1. Open the Azure portal and launch the [Azure Purview Studio](https://web.purview.azure.com/resource/). The Azure Purview studio can be launched as shown below or by using the [url directly](https://web.purview.azure.com/resource/).
+1. Open the Azure portal and launch the [Microsoft Purview Studio](https://web.purview.azure.com/resource/). The Microsoft Purview studio can be launched as shown below or by using the [url directly](https://web.purview.azure.com/resource/).
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-launch-pic-1.png" alt-text="Screenshot showing an Azure Purview account open in the Azure portal, with the Azure Purview studio button highlighted.":::
+ :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-launch-pic-1.png" alt-text="Screenshot showing a Microsoft Purview account open in the Azure portal, with the Microsoft Purview studio button highlighted.":::
1. Select the policy management tab to launch the self-service access policies.
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Azure Purview studio with the leftmost menu open, and the Data policy page option highlighted.":::
+ :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-2.png" alt-text="Screenshot of the Microsoft Purview studio with the leftmost menu open, and the Data policy page option highlighted.":::
1. Open the self-service access policies tab.
- :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Azure Purview studio open to the Data policy page with self-service access policies highlighted.":::
+ :::image type="content" source="./media/how-to-view-self-service-data-access-policy/Purview-Studio-self-service-tab-pic-3.png" alt-text="Screenshot of the Microsoft Purview studio open to the Data policy page with self-service access policies highlighted.":::
1. Here you'll see all your policies. The policies can be sorted and filtered by any of the displayed columns to improve your search.
purview How To Workflow Business Terms Approval https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-business-terms-approval.md
Title: Business terms approval workflow
-description: This article describes how to create and manage workflows to approve business terms in Azure Purview.
+description: This article describes how to create and manage workflows to approve business terms in Microsoft Purview.
This guide will take you through the creation and management of approval workflo
## Create and enable a new approval workflow for business terms
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
:::image type="content" source="./media/how-to-workflow-business-terms-approval/workflow-section.png" alt-text="Screenshot showing the management center left menu with the new workflow section highlighted.":::
This guide will take you through the creation and management of approval workflo
:::image type="content" source="./media/how-to-workflow-business-terms-approval/select-data-catalog.png" alt-text="Screenshot showing the new workflows menu, with Data Catalog selected.":::
-1. In the next screen, you'll see all the templates provided by Azure Purview to create a workflow. Select the template using which you want to start your authoring experiences and select **Continue**. Each of these templates specifies the kind of action that will trigger the workflow. In the screenshot below we've selected **Create glossary term**. The four different templates available for business glossary are:
+1. In the next screen, you'll see all the templates provided by Microsoft Purview to create a workflow. Select the template you want to use to start your authoring experience, and then select **Continue**. Each of these templates specifies the kind of action that will trigger the workflow. In the screenshot below we've selected **Create glossary term**. The four different templates available for business glossary are:
* Create glossary term - when a term is created, approval will be requested. * Update glossary term - when a term is updated, approval will be requested. * Delete glossary term - when a term is deleted, approval will be requested.
This guide will take you through the creation and management of approval workflo
:::image type="content" source="./media/how-to-workflow-business-terms-approval/select-okay.png" alt-text="Screenshot showing the apply workflow window, showing a list of items that the workflow can be applied to. At the bottom of the window, the O K button is selected."::: >[!NOTE]
- > - The Azure Purview workflow engine will always resolve to the closest workflow that the term hierarchy path is associated with. In case a direct binding is not found, it will traverse up in the tree to find the workflow associated with the closest parent in the glossary tree.
+ > - The Microsoft Purview workflow engine will always resolve to the closest workflow that the term hierarchy path is associated with. In case a direct binding is not found, it will traverse up in the tree to find the workflow associated with the closest parent in the glossary tree.
> - Import terms can only be bound to root glossary path as the .CSV can contain terms from different hierarchy paths. 1. By default, the workflow will be enabled. To disable, toggle the Enable button in the top menu.
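The closest-parent resolution described in the note above can be sketched as a walk up the hierarchy path. This is an illustrative model only, not the actual workflow engine; the path format and binding names are assumptions.

```python
def resolve_workflow(term_path, bindings):
    """Return the workflow bound to term_path or its closest ancestor.

    bindings maps glossary hierarchy paths (e.g. "Glossary/Finance")
    to workflow names. If no direct binding exists, traverse up the
    tree until a bound parent is found, or return None.
    """
    segments = term_path.split("/")
    while segments:
        candidate = "/".join(segments)
        if candidate in bindings:
            return bindings[candidate]
        segments.pop()  # traverse up to the parent
    return None

bindings = {"Glossary": "default-approval", "Glossary/Finance": "finance-approval"}
# Direct binding found:
print(resolve_workflow("Glossary/Finance", bindings))     # finance-approval
# No binding for HR, so the root glossary workflow applies:
print(resolve_workflow("Glossary/HR/Payroll", bindings))  # default-approval
```

This also shows why imported terms bind at the root glossary path: a single .CSV can contain terms from many hierarchy paths, and the root is the one ancestor they all share.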
To delete a workflow, select the workflow and then select **Delete** in the top
For more information about workflows, see these articles: -- [What are Azure Purview workflows](concept-workflow.md)
+- [What are Microsoft Purview workflows](concept-workflow.md)
- [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md) - [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Workflow Manage Requests Approvals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-requests-approvals.md
Title: Manage workflow requests and approvals
-description: This article outlines how to manage requests and approvals generated by a workflow in Azure Purview.
+description: This article outlines how to manage requests and approvals generated by a workflow in Microsoft Purview.
[!INCLUDE [Region Notice](./includes/workflow-regions.md)]
-This article outlines how to manage requests and approvals that generated by a [workflow](concept-workflow.md) in Azure Purview.
+This article outlines how to manage requests and approvals that are generated by a [workflow](concept-workflow.md) in Microsoft Purview.
-To view requests you've made or request for approvals that have been sent to you by a workflow instance, navigate to management center in the [Azure Purview Studio](https://web.purview.azure.com/resource/), and select **Requests and Approvals**.
+To view requests you've made, or requests for approval that have been sent to you by a workflow instance, navigate to the management center in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/), and select **Requests and Approvals**.
:::image type="content" source="./media/how-to-workflow-manage-requests-approval/select-requests-and-approvals.png" alt-text="Screenshot showing management center navigation table with the requests and approvals button highlighted.":::
Select an approval or task to see details and responses from all approvals or ta
Purview approvals and task connectors have built-in email capabilities. Every time an approval or task action is triggered in a workflow, an email is sent to all the users who need to act on it.
-Users can respond by selecting the links in the email, or by navigating to the Azure Purview studio and viewing their pending tasks.
+Users can respond by selecting the links in the email, or by navigating to the Microsoft Purview studio and viewing their pending tasks.
## Next steps -- [What are Azure Purview workflows](concept-workflow.md)
+- [What are Microsoft Purview workflows](concept-workflow.md)
- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md) - [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md) - [Manage workflow runs](how-to-workflow-manage-runs.md)
purview How To Workflow Manage Runs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-runs.md
This article outlines how to manage workflows that are already running.
-1. To view workflow runs you triggered, sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/), select the Management center, and select **Workflow runs**.
+1. To view workflow runs you triggered, sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/), select the Management center, and select **Workflow runs**.
- :::image type="content" source="./media/how-to-workflow-manage-runs/select-workflow-runs.png" alt-text="Screenshot of the management menu in the Azure Purview studio. The Workflow runs tab is highlighted.":::
+ :::image type="content" source="./media/how-to-workflow-manage-runs/select-workflow-runs.png" alt-text="Screenshot of the management menu in the Microsoft Purview studio. The Workflow runs tab is highlighted.":::
1. You'll be presented with the list of workflow runs and their statuses.
This article outlines how to manage workflows that are already running.
## Next steps -- [What are Azure Purview workflows](concept-workflow.md)
+- [What are Microsoft Purview workflows](concept-workflow.md)
- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md) - [Self-service data access workflow for hybrid data estates](how-to-workflow-self-service-data-access-hybrid.md) - [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview How To Workflow Self Service Data Access Hybrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-self-service-data-access-hybrid.md
Title: Self-service hybrid data access workflows
-description: This article describes how to create and manage hybrid self-service data access workflows in Azure Purview.
+description: This article describes how to create and manage hybrid self-service data access workflows in Microsoft Purview.
This guide will take you through the creation and management of self-service dat
## Create and enable self-service data access workflow
-1. Sign in to [Azure Purview Studio](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
+1. Sign in to [Microsoft Purview Studio](https://web.purview.azure.com/resource/) and select the Management center. You'll see three new icons in the table of contents.
:::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-section.png" alt-text="Screenshot showing the management center left menu with the new workflow section highlighted.":::
This guide will take you through the creation and management of self-service dat
:::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/workflow-authoring-select-new.png" alt-text="Screenshot showing the authoring workflows page, with the + New button highlighted.":::
-1. You'll be presented with different categories workflows creatable in Azure Purview. To create **an access request workflow** Select **Governance** and select **Continue**.
+1. You'll be presented with the different categories of workflows that can be created in Microsoft Purview. To create an **access request workflow**, select **Governance** and then select **Continue**.
:::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/select-governance.png" alt-text="Screenshot showing the new workflow window, with the Governance option selected.":::
-1. In the next screen, you'll see all the templates provided by Azure Purview to create a self-service data access workflow. Select the template **Data access request** and select **Continue**.
+1. In the next screen, you'll see all the templates provided by Microsoft Purview to create a self-service data access workflow. Select the template **Data access request** and select **Continue**.
:::image type="content" source="./media/how-to-workflow-self-service-data-access-hybrid/select-data-access-request.png" alt-text="Screenshot showing the new workflow window, with the Data access request option selected.":::
To delete a workflow, select the workflow and then select **Delete**.
For more information about workflows, see these articles: -- [What are Azure Purview workflows](concept-workflow.md)
+- [What are Microsoft Purview workflows](concept-workflow.md)
- [Approval workflow for business terms](how-to-workflow-business-terms-approval.md) - [Manage workflow requests and approvals](how-to-workflow-manage-requests-approvals.md)
purview Manage Credentials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-credentials.md
Title: Create and manage credentials for scans
-description: Learn about the steps to create and manage credentials in Azure Purview.
+description: Learn about the steps to create and manage credentials in Microsoft Purview.
Last updated 02/16/2022
-# Credentials for source authentication in Azure Purview
+# Credentials for source authentication in Microsoft Purview
-This article describes how you can create credentials in Azure Purview. These saved credentials let you quickly reuse and apply saved authentication information to your data source scans.
+This article describes how you can create credentials in Microsoft Purview. These saved credentials let you quickly reuse and apply saved authentication information to your data source scans.
## Prerequisites
This article describes how you can create credentials in Azure Purview. These sa
## Introduction
-A credential is authentication information that Azure Purview can use to authenticate to your registered data sources. A credential object can be created for various types of authentication scenarios, such as Basic Authentication requiring username/password. Credential capture specific information required to authenticate, based on the chosen type of authentication method. Credentials use your existing Azure Key Vaults secrets for retrieving sensitive authentication information during the Credential creation process.
+A credential is authentication information that Microsoft Purview can use to authenticate to your registered data sources. A credential object can be created for various types of authentication scenarios, such as Basic Authentication requiring a username and password. Credentials capture the specific information required to authenticate, based on the chosen authentication method. Credentials use your existing Azure Key Vault secrets to retrieve sensitive authentication information during the credential creation process.
-In Azure Purview, there are few options to use as authentication method to scan data sources such as the following options:
+In Microsoft Purview, there are several options you can use as the authentication method to scan data sources, including the following:
-- [Azure Purview system-assigned managed identity](#use-azure-purview-system-assigned-managed-identity-to-set-up-scans)
+- [Microsoft Purview system-assigned managed identity](#use-microsoft-purview-system-assigned-managed-identity-to-set-up-scans)
- [User-assigned managed identity](#create-a-user-assigned-managed-identity) (preview)-- Account Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-azure-purview-account))-- SQL Authentication (using [Key Vault](#create-azure-key-vaults-connections-in-your-azure-purview-account))-- Service Principal (using [Key Vault](#create-azure-key-vaults-connections-in-your-azure-purview-account))-- Consumer Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-azure-purview-account))
+- Account Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
+- SQL Authentication (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
+- Service Principal (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
+- Consumer Key (using [Key Vault](#create-azure-key-vaults-connections-in-your-microsoft-purview-account))
Before creating any credentials, consider your data source types and networking requirements to decide which authentication method you need for your scenario. Review the following decision tree to find which credential is most suitable: :::image type="content" source="media/manage-credentials/manage-credentials-decision-tree-small.png" alt-text="Manage credentials decision tree" lightbox="media/manage-credentials/manage-credentials-decision-tree.png":::
-## Use Azure Purview system-assigned managed identity to set up scans
+## Use Microsoft Purview system-assigned managed identity to set up scans
-If you're using the Azure Purview system-assigned managed identity (SAMI) to set up scans, you won't need to create a credential and link your key vault to Azure Purview to store them. For detailed instructions on adding the Azure Purview SAMI to have access to scan your data sources, refer to the data source-specific authentication sections below:
+If you're using the Microsoft Purview system-assigned managed identity (SAMI) to set up scans, you won't need to create a credential and link your key vault to Microsoft Purview to store them. For detailed instructions on adding the Microsoft Purview SAMI to have access to scan your data sources, refer to the data source-specific authentication sections below:
- [Azure Blob Storage](register-scan-azure-blob-storage-source.md#authentication-for-a-scan)
- [Azure Data Lake Storage Gen1](register-scan-adls-gen1.md#authentication-for-a-scan)
If you're using the Azure Purview system-assigned managed identity (SAMI) to set
- [Azure Synapse Workspace](register-scan-synapse-workspace.md#authentication-for-registration)
- [Azure Synapse dedicated SQL pools (formerly SQL DW)](register-scan-azure-synapse-analytics.md#authentication-for-registration)
-## Grant Azure Purview access to your Azure Key Vault
+## Grant Microsoft Purview access to your Azure Key Vault
-To give Azure Purview access to your Azure Key Vault, there are two things you'll need to confirm:
+To give Microsoft Purview access to your Azure Key Vault, there are two things you'll need to confirm:
- [Firewall access to the Azure Key Vault](#firewall-access-to-azure-key-vault)
-- [Azure Purview permissions on the Azure Key Vault](#azure-purview-permissions-on-the-azure-key-vault)
+- [Microsoft Purview permissions on the Azure Key Vault](#microsoft-purview-permissions-on-the-azure-key-vault)
### Firewall access to Azure Key Vault
-If your Azure Key Vault has disabled public network access, you have two options to allow access for Azure Purview.
+If your Azure Key Vault has disabled public network access, you have two options to allow access for Microsoft Purview.
- [Trusted Microsoft services](#trusted-microsoft-services)
- [Private endpoint connections](#private-endpoint-connections)

#### Trusted Microsoft services
-Azure Purview is listed as one of [Azure Key Vault's trusted services](../key-vault/general/overview-vnet-service-endpoints.md#trusted-services), so if public network access is disabled on your Azure Key Vault you can enable access only to trusted Microsoft services, and Azure Purview will be included.
+Microsoft Purview is listed as one of [Azure Key Vault's trusted services](../key-vault/general/overview-vnet-service-endpoints.md#trusted-services), so if public network access is disabled on your Azure Key Vault you can enable access only to trusted Microsoft services, and Microsoft Purview will be included.
You can enable this setting in your Azure Key Vault under the **Networking** tab.
At the bottom of the page, under Exception, enable the **Allow trusted Microsoft
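If you prefer the CLI over the portal steps above, the same network settings can be sketched with a command along these lines (the vault and resource group names are hypothetical placeholders; verify the effect against your own network requirements before applying):

```shell
# Deny public access to the vault, but let trusted Microsoft services
# (which include Purview) bypass the firewall. Names are placeholders.
az keyvault update \
  --name contoso-vault \
  --resource-group contoso-rg \
  --default-action Deny \
  --bypass AzureServices
```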
To connect to Azure Key Vault with private endpoints, follow [Azure Key Vault's private endpoint documentation](../key-vault/general/private-link-service.md).
-### Azure Purview permissions on the Azure Key Vault
+### Microsoft Purview permissions on the Azure Key Vault
Currently, Azure Key Vault supports two permission models:

- [Option 1 - Access Policies](#option-1assign-access-using-key-vault-access-policy)
- [Option 2 - Role-based Access Control](#option-2assign-access-using-key-vault-azure-role-based-access-control)
-Before assigning access to the Azure Purview system-assigned managed identity (SAMI), first identify your Azure Key Vault permission model from Key Vault resource **Access Policies** in the menu. Follow steps below based on relevant the permission model.
+Before assigning access to the Microsoft Purview system-assigned managed identity (SAMI), first identify your Azure Key Vault permission model from the **Access Policies** menu of the Key Vault resource. Then follow the steps below based on the relevant permission model.
:::image type="content" source="media/manage-credentials/akv-permission-model.png" alt-text="Azure Key Vault Permission Model":::
Follow these steps only if permission model in your Azure Key Vault resource is
3. Select **Add Access Policy**.
- :::image type="content" source="media/manage-credentials/add-msi-to-akv-2.png" alt-text="Add Azure Purview managed identity to AKV":::
+ :::image type="content" source="media/manage-credentials/add-msi-to-akv-2.png" alt-text="Add Microsoft Purview managed identity to AKV":::
4. In the **Secrets permissions** dropdown, select **Get** and **List** permissions.
-5. For **Select principal**, choose the Azure Purview system managed identity. You can search for the Azure Purview SAMI using either the Azure Purview instance name **or** the managed identity application ID. We don't currently support compound identities (managed identity name + application ID).
+5. For **Select principal**, choose the Microsoft Purview system managed identity. You can search for the Microsoft Purview SAMI using either the Microsoft Purview instance name **or** the managed identity application ID. We don't currently support compound identities (managed identity name + application ID).
:::image type="content" source="media/manage-credentials/add-access-policy.png" alt-text="Add access policy":::
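As an alternative to the portal steps above, an access policy granting the Purview SAMI **Get** and **List** on secrets can be sketched with the Azure CLI. The object ID placeholder is the SAMI's Azure AD object (principal) ID, which you can retrieve with `az resource show --ids <purview-resource-id> --query identity.principalId`; the vault name is hypothetical:

```shell
# Grant the Purview system-assigned managed identity Get/List on secrets.
# <purview-sami-object-id> is the SAMI's Azure AD object (principal) ID.
az keyvault set-policy \
  --name contoso-vault \
  --object-id <purview-sami-object-id> \
  --secret-permissions get list
```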
Follow these steps only if permission model in your Azure Key Vault resource is
3. Select **+ Add**.
-4. Set the **Role** to **Key Vault Secrets User** and enter your Azure Purview account name under **Select** input box. Then, select Save to give this role assignment to your Azure Purview account.
+4. Set the **Role** to **Key Vault Secrets User** and enter your Microsoft Purview account name in the **Select** input box. Then select **Save** to give this role assignment to your Microsoft Purview account.
:::image type="content" source="media/manage-credentials/akv-add-rbac.png" alt-text="Azure Key Vault RBAC":::
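The same RBAC assignment can be sketched with the Azure CLI (all IDs and names below are placeholders; the role name **Key Vault Secrets User** comes from the step above):

```shell
# Assign the Key Vault Secrets User role to the Purview account's
# managed identity, scoped to the vault. IDs shown are placeholders.
az role assignment create \
  --role "Key Vault Secrets User" \
  --assignee-object-id <purview-sami-object-id> \
  --assignee-principal-type ServicePrincipal \
  --scope "/subscriptions/<sub-id>/resourceGroups/contoso-rg/providers/Microsoft.KeyVault/vaults/contoso-vault"
```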
-## Create Azure Key Vaults connections in your Azure Purview account
+## Create Azure Key Vaults connections in your Microsoft Purview account
-Before you can create a Credential, first associate one or more of your existing Azure Key Vault instances with your Azure Purview account.
+Before you can create a Credential, first associate one or more of your existing Azure Key Vault instances with your Microsoft Purview account.
-1. From the [Azure portal](https://portal.azure.com), select your Azure Purview account and open the [Azure Purview Studio](https://web.purview.azure.com/resource/). Navigate to the **Management Center** in the studio and then navigate to **credentials**.
+1. From the [Azure portal](https://portal.azure.com), select your Microsoft Purview account and open the [Microsoft Purview Studio](https://web.purview.azure.com/resource/). Go to the **Management Center** in the studio, and then go to **Credentials**.
2. From the **Credentials** page, select **Manage Key Vault connections**.
Before you can create a Credential, first associate one or more of your existing
4. Provide the required information, then select **Create**.
-5. Confirm that your Key Vault has been successfully associated with your Azure Purview account as shown in this example:
+5. Confirm that your Key Vault has been successfully associated with your Microsoft Purview account as shown in this example:
:::image type="content" source="media/manage-credentials/view-kv-connections.png" alt-text="View Azure Key Vault connections to confirm.":::

## Create a new credential
-These credential types are supported in Azure Purview:
+These credential types are supported in Microsoft Purview:
- Basic authentication: You add the **password** as a secret in key vault.
- Service Principal: You add the **service principal key** as a secret in key vault.
These credential types are supported in Azure Purview:
- Consumer Key: For Salesforce data sources, you can add the **password** and the **consumer secret** in key vault.
- User-assigned managed identity (preview): You can add user-assigned managed identity credentials. For more information, see the [create a user-assigned managed identity section](#create-a-user-assigned-managed-identity) below.
-For more information, see [Add a secret to Key Vault](../key-vault/secrets/quick-create-portal.md#add-a-secret-to-key-vault) and [Create a new AWS role for Azure Purview](register-scan-amazon-s3.md#create-a-new-aws-role-for-azure-purview).
+For more information, see [Add a secret to Key Vault](../key-vault/secrets/quick-create-portal.md#add-a-secret-to-key-vault) and [Create a new AWS role for Microsoft Purview](register-scan-amazon-s3.md#create-a-new-aws-role-for-microsoft-purview).
After storing your secrets in the key vault:
-1. In Azure Purview, go to the Credentials page.
+1. In Microsoft Purview, go to the Credentials page.
2. Create your new Credential by selecting **+ New**.
After storing your secrets in the key vault:
User-assigned managed identities (UAMI) enable Azure resources to authenticate directly with other resources using Azure Active Directory (Azure AD) authentication, without the need to manage those credentials. They allow you to authenticate and assign access just like you would with a system assigned managed identity, Azure AD user, Azure AD group, or service principal. User-assigned managed identities are created as their own resource (rather than being connected to a pre-existing resource). For more information about managed identities, see the [managed identities for Azure resources documentation](../active-directory/managed-identities-azure-resources/overview.md).
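If you'd rather create the identity itself outside the portal, a minimal CLI sketch follows (hypothetical identity and resource group names; attaching the identity to the Purview account still follows the portal steps below):

```shell
# Create a standalone user-assigned managed identity...
az identity create \
  --name purview-scan-uami \
  --resource-group contoso-rg

# ...then note its client ID and principal ID for later role assignments.
az identity show \
  --name purview-scan-uami \
  --resource-group contoso-rg \
  --query "{clientId:clientId, principalId:principalId}"
```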
-The following steps will show you how to create a UAMI for Azure Purview to use.
+The following steps will show you how to create a UAMI for Microsoft Purview to use.
### Supported data sources for UAMI
The following steps will show you how to create a UAMI for Azure Purview to use.
### Create a user-assigned managed identity
-1. In the [Azure portal](https://portal.azure.com/) navigate to your Azure Purview account.
+1. In the [Azure portal](https://portal.azure.com/) navigate to your Microsoft Purview account.
1. In the **Managed identities** section on the left menu, select the **+ Add** button to add user-assigned managed identities.

   :::image type="content" source="media/manage-credentials/create-new-managed-identity.png" alt-text="Screenshot showing managed identity screen in the Azure portal with user-assigned and add highlighted.":::
-1. After finishing the setup, go back to your Azure Purview account in the Azure portal. If the managed identity is successfully deployed, you'll see the Azure Purview account's status as **Succeeded**.
+1. After finishing the setup, go back to your Microsoft Purview account in the Azure portal. If the managed identity is successfully deployed, you'll see the Microsoft Purview account's status as **Succeeded**.
- :::image type="content" source="media/manage-credentials/status-successful.png" alt-text="Screenshot the Azure Purview account in the Azure portal with Status highlighted under the overview tab and essentials menu.":::
+ :::image type="content" source="media/manage-credentials/status-successful.png" alt-text="Screenshot of the Microsoft Purview account in the Azure portal with Status highlighted under the overview tab and essentials menu.":::
-1. Once the managed identity is successfully deployed, navigate to the [Azure Purview Studio](https://web.purview.azure.com/), by selecting the **Open Azure Purview Studio** button.
+1. Once the managed identity is successfully deployed, navigate to the [Microsoft Purview Studio](https://web.purview.azure.com/), by selecting the **Open Microsoft Purview Studio** button.
-1. In the [Azure Purview Studio](https://web.purview.azure.com/), navigate to the Management Center in the studio and then navigate to the Credentials section.
+1. In the [Microsoft Purview Studio](https://web.purview.azure.com/), navigate to the Management Center in the studio and then navigate to the Credentials section.
1. Create a user-assigned managed identity by selecting **+New**.
1. Select the Managed identity authentication method, and select your user-assigned managed identity from the drop-down menu.
The following steps will show you how to create a UAMI for Azure Purview to use.
:::image type="content" source="media/manage-credentials/new-user-assigned-managed-identity-credential.png" alt-text="Screenshot showing the new managed identity creation tile, with the Learn More link highlighted.":::

>[!NOTE]
- > If the portal was open during creation of your user assigned managed identity, you'll need to refresh the Azure Purview web portal to load the settings finished in the Azure portal.
+ > If the portal was open during creation of your user-assigned managed identity, you'll need to refresh the Microsoft Purview web portal to load the settings you configured in the Azure portal.
1. After all the information is filled in, select **Create**.
purview Manage Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-data-sources.md
Title: How to manage multi-cloud data sources
-description: Learn how to register new data sources, manage collections of data sources, and view sources in Azure Purview.
+description: Learn how to register new data sources, manage collections of data sources, and view sources in Microsoft Purview.
Last updated 09/27/2021
-# Manage data sources in Azure Purview
+# Manage data sources in Microsoft Purview
-In this article, you learn how to register new data sources, manage collections of data sources, and view sources in Azure Purview.
+In this article, you learn how to register new data sources, manage collections of data sources, and view sources in Microsoft Purview.
## Register a new source

Use the following steps to register a new source.
-1. Open [Azure Purview Studio](https://web.purview.azure.com/resource/), navigate to the **Data Map**, **Sources**, and select **Register**.
+1. Open [Microsoft Purview Studio](https://web.purview.azure.com/resource/), navigate to the **Data Map**, **Sources**, and select **Register**.
- :::image type="content" source="media/manage-data-sources/purview-studio.png" alt-text="Azure Purview Studio":::
+ :::image type="content" source="media/manage-data-sources/purview-studio.png" alt-text="Microsoft Purview Studio":::
1. Select a source type. This example uses Azure Blob Storage. Select **Continue**.
Use the following steps to register a new source.
## View sources
-You can view all registered sources on the **Data Map** tab of Azure Purview Studio. There are two view types: map view and list view.
+You can view all registered sources on the **Data Map** tab of Microsoft Purview Studio. There are two view types: map view and list view.
### Map view

In Map view, you can see all of your sources and collections. In the following image, there is one Azure Blob Storage source. From each source tile, you can edit the source, start a new scan, or view source details.

### Table view

In the table view, you can see a sortable list of sources. Hover over the source for options to edit, begin a new scan, or delete.

## Manage collections
-You can group your data sources into collections. To create a new collection, select **+ New collection** on the *Sources* page of Azure Purview Studio. Give the collection a name and select *None* as the Parent. The new collection appears in the map view.
+You can group your data sources into collections. To create a new collection, select **+ New collection** on the *Sources* page of Microsoft Purview Studio. Give the collection a name and select *None* as the Parent. The new collection appears in the map view.
To add sources to a collection, select the **Edit** pencil on the source and choose a collection from the **Select a collection** drop-down menu.

To create a hierarchy of collections, assign higher-level collections as a parent to lower-level collections. In the following image, *Fabrikam* is a parent to the *Finance* collection, which contains an Azure Blob Storage data source. You can collapse or expand collections by selecting the circle attached to the arrow between levels.

You can remove sources from a hierarchy by selecting *None* for the parent. Unparented sources are grouped in a dotted box in the map view with no arrows linking them to parents.
purview Manage Integration Runtimes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-integration-runtimes.md
Title: Create and manage Integration Runtimes
-description: This article explains the steps to create and manage Integration Runtimes in Azure Purview.
+description: This article explains the steps to create and manage Integration Runtimes in Microsoft Purview.
Last updated 04/13/2022
# Create and manage a self-hosted integration runtime
-The integration runtime (IR) is the compute infrastructure that Azure Purview uses to power data scan across different network environments.
+The integration runtime (IR) is the compute infrastructure that Microsoft Purview uses to power data scans across different network environments.
A self-hosted integration runtime (SHIR) can be used to scan data sources in an on-premises network or a virtual network. The installation of a self-hosted integration runtime needs an on-premises machine or a virtual machine inside a private network. This article describes how to create and manage a self-hosted integration runtime.

> [!NOTE]
-> The Azure Purview Integration Runtime cannot be shared with an Azure Synapse Analytics or Azure Data Factory Integration Runtime on the same machine. It needs to be installed on a separated machine.
+> The Microsoft Purview Integration Runtime cannot be shared with an Azure Synapse Analytics or Azure Data Factory Integration Runtime on the same machine. It needs to be installed on a separate machine.
## Prerequisites
Installation of the self-hosted integration runtime on a domain controller isn't
### Considerations for using a self-hosted IR

- You can use a single self-hosted integration runtime for scanning multiple data sources.
-- You can install only one instance of self-hosted integration runtime on any single machine. If you have two Azure Purview accounts that need to scan on-premises data sources, install the self-hosted IR on two machines, one for each Azure Purview account.
+- You can install only one instance of self-hosted integration runtime on any single machine. If you have two Microsoft Purview accounts that need to scan on-premises data sources, install the self-hosted IR on two machines, one for each Microsoft Purview account.
- The self-hosted integration runtime doesn't need to be on the same machine as the data source, unless specifically called out as a prerequisite in the respective source article. Having the self-hosted integration runtime close to the data source reduces the time for the self-hosted integration runtime to connect to the data source.

## Setting up a self-hosted integration runtime
To create and set up a self-hosted integration runtime, use the following proced
### Create a self-hosted integration runtime
-1. On the home page of the [Azure Purview Studio](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
+1. On the home page of the [Microsoft Purview Studio](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
2. Under **Sources and scanning** on the left pane, select **Integration runtimes**, and then select **+ New**.
If you move your cursor over the icon or message in the notification area, you c
Your self-hosted integration runtime machine needs to connect to several resources to work correctly:
-* The Azure Purview services used to manage the self-hosted integration runtime.
+* The Microsoft Purview services used to manage the self-hosted integration runtime.
* The data sources you want to scan using the self-hosted integration runtime.
-* The managed Storage account and Event Hubs resource created by Azure Purview. Azure Purview uses these resources to ingest the results of the scan, among many other things, so the self-hosted integration runtime need to be able to connect with these resources.
+* The managed Storage account and Event Hubs resource created by Microsoft Purview. Microsoft Purview uses these resources to ingest the results of the scan, among many other things, so the self-hosted integration runtime needs to be able to connect to these resources.
* The Azure Key Vault used to store credentials.

There are two firewalls to consider:
There are two firewalls to consider:
Here are the domains and outbound ports that you need to allow at both **corporate and Windows/machine firewalls**.

> [!TIP]
-> For domains listed with '\<managed_storage_account>' and '\<managed_Event_Hub_resource>', add the name of the managed resources associated with your Azure Purview account. You can find them from Azure portal -> your Azure Purview account -> Managed resources tab.
+> For domains listed with '\<managed_storage_account>' and '\<managed_Event_Hub_resource>', add the names of the managed resources associated with your Microsoft Purview account. You can find them in the Azure portal -> your Microsoft Purview account -> **Managed resources** tab.
| Domain names | Outbound ports | Description |
| -- | -- | - |
-| `*.frontend.clouddatahub.net` | 443 | Required to connect to the Azure Purview service. Currently wildcard is required as there's no dedicated resource. |
-| `*.servicebus.windows.net` | 443 | Required for setting up scan on Azure Purview Studio. This endpoint is used for interactive authoring from UI, for example, test connection, browse folder list and table list to scope scan. Currently wildcard is required as there's no dedicated resource. |
-| `<purview_account>.purview.azure.com` | 443 | Required to connect to Azure Purview service. |
-| `<managed_storage_account>.blob.core.windows.net` | 443 | Required to connect to the Azure Purview managed Azure Blob storage account. |
-| `<managed_storage_account>.queue.core.windows.net` | 443 | Required to connect to the Azure Purview managed Azure Queue storage account. |
+| `*.frontend.clouddatahub.net` | 443 | Required to connect to the Microsoft Purview service. Currently wildcard is required as there's no dedicated resource. |
+| `*.servicebus.windows.net` | 443 | Required for setting up scan on Microsoft Purview Studio. This endpoint is used for interactive authoring from UI, for example, test connection, browse folder list and table list to scope scan. Currently wildcard is required as there's no dedicated resource. |
+| `<purview_account>.purview.azure.com` | 443 | Required to connect to Microsoft Purview service. |
+| `<managed_storage_account>.blob.core.windows.net` | 443 | Required to connect to the Microsoft Purview managed Azure Blob storage account. |
+| `<managed_storage_account>.queue.core.windows.net` | 443 | Required to connect to the Microsoft Purview managed Azure Queue storage account. |
| `download.microsoft.com` | 443 | Required to download the self-hosted integration runtime updates. If you have disabled auto-update, you can skip configuring this domain. |
| `login.windows.net`<br>`login.microsoftonline.com` | 443 | Required to sign in to the Azure Active Directory. |

> [!NOTE]
-> As currently Azure Relay doesn't support service tag, you have to use service tag AzureCloud or Internet in NSG rules for the communication to Azure Relay. For the communication to Azure Purview.
+> Because Azure Relay doesn't currently support service tags, you have to use the AzureCloud or Internet service tag in NSG rules for the communication to Azure Relay and to Microsoft Purview.
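To spot-check that the SHIR machine can actually reach the required endpoints over port 443, a quick sketch from its shell might look like the following (the Purview host below is a placeholder; substitute your own account name, and repeat for each domain in the table):

```shell
# Print the HTTP status code for each endpoint; any response at all
# confirms outbound 443 connectivity. Replace <purview_account>.
curl -sS -o /dev/null -w "%{http_code}\n" "https://<purview_account>.purview.azure.com"
curl -sS -o /dev/null -w "%{http_code}\n" "https://download.microsoft.com"
```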
Depending on the sources you want to scan, you also need to allow other domains and outbound ports for other Azure or external sources. A few examples are provided here:
When configured, the self-hosted integration runtime uses the proxy server to co
:::image type="content" source="media/manage-integration-runtimes/set-http-proxy.png" alt-text="Set the proxy":::
-There are two supported configuration options by Azure Purview:
+Microsoft Purview supports two configuration options:
- **Do not use proxy**: The self-hosted integration runtime doesn't explicitly use any proxy to connect to cloud services.
- **Use system proxy**: The self-hosted integration runtime uses the proxy setting that is configured in the executable's configuration files. If no proxy is specified in these files, the self-hosted integration runtime connects to the services directly without going through a proxy.

> [!IMPORTANT]
>
-> Currently, **custom proxy** is not supported in Azure Purview. In addition, system proxy is supported when scanning Azure data sources and SQL Server; scanning other sources doesn't support proxy.
+> Currently, **custom proxy** is not supported in Microsoft Purview. In addition, system proxy is supported when scanning Azure data sources and SQL Server; scanning other sources doesn't support proxy.
The integration runtime host service restarts automatically after you save the updated proxy settings.
You also need to make sure that Microsoft Azure is in your company's allowlist.
### Possible symptoms for issues related to the firewall and proxy server
-If you see error messages like the following ones, the likely reason is improper configuration of the firewall or proxy server. Such configuration prevents the self-hosted integration runtime from connecting to Azure Purview services. To ensure that your firewall and proxy server are properly configured, refer to the previous section.
+If you see error messages like the following ones, the likely reason is improper configuration of the firewall or proxy server. Such configuration prevents the self-hosted integration runtime from connecting to Microsoft Purview services. To ensure that your firewall and proxy server are properly configured, refer to the previous section.
- When you try to register the self-hosted integration runtime, you receive the following error message: "Failed to register this Integration Runtime node! Confirm that the Authentication key is valid and the integration service host service is running on this machine."
- When you open Integration Runtime Configuration Manager, you see a status of **Disconnected** or **Connecting**. When you view Windows event logs, under **Event Viewer** > **Application and Services Logs** > **Microsoft Integration Runtime**, you see error messages like this one:
If you see error messages like the following ones, the likely reason is improper
## Java Runtime Environment Installation
-If you scan Parquet files using the self-hosted integration runtime with Azure Purview, you'll need to install either the Java Runtime Environment or OpenJDK on your self-hosted IR machine.
+If you scan Parquet files using the self-hosted integration runtime with Microsoft Purview, you'll need to install either the Java Runtime Environment or OpenJDK on your self-hosted IR machine.
When scanning Parquet files using the self-hosted IR, the service locates the Java runtime by first checking the registry *`(SOFTWARE\JavaSoft\Java Runtime Environment\{Current Version}\JavaHome)`* for the JRE; if it's not found, it then checks the system variable *`JAVA_HOME`* for OpenJDK.
When scanning Parquet files using the self-hosted IR, the service locates the Ja
## Next steps

-- [Azure Purview network architecture and best practices](concept-best-practices-network.md)
+- [Microsoft Purview network architecture and best practices](concept-best-practices-network.md)
-- [Use private endpoints with Azure Purview](catalog-private-link.md)
+- [Use private endpoints with Microsoft Purview](catalog-private-link.md)
purview Manage Kafka Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-kafka-dotnet.md
Title: Publish messages to and process messages from Azure Purview's Atlas Kafka topics via Event Hubs using .NET
-description: This article provides a walkthrough to create a .NET Core application that sends/receives events to/from Azure Purview's Apache Atlas Kafka topics by using the latest Azure.Messaging.EventHubs package.
+ Title: Publish messages to and process messages from Microsoft Purview's Atlas Kafka topics via Event Hubs using .NET
+description: This article provides a walkthrough to create a .NET Core application that sends/receives events to/from Microsoft Purview's Apache Atlas Kafka topics by using the latest Azure.Messaging.EventHubs package.
Last updated 09/27/2021
-# Publish messages to and process messages from Azure Purview's Atlas Kafka topics via Event Hubs using .NET
-This quickstart shows how to send events to and receive events from Azure Purview's Atlas Kafka topics via event hub using the **Azure.Messaging.EventHubs** .NET library.
+# Publish messages to and process messages from Microsoft Purview's Atlas Kafka topics via Event Hubs using .NET
+This quickstart shows how to send events to and receive events from Microsoft Purview's Atlas Kafka topics via event hub using the **Azure.Messaging.EventHubs** .NET library.
> [!IMPORTANT]
-> A managed event hub is created as part of Azure Purview account creation, see [Azure Purview account creation](create-catalog-portal.md). You can publish messages to the event hub kafka topic ATLAS_HOOK and Azure Purview will consume and process it. Azure Purview will notify entity changes to event hub kafka topic ATLAS_ENTITIES and user can consume and process it.This quickstart uses the new **Azure.Messaging.EventHubs** library.
+> A managed event hub is created as part of Microsoft Purview account creation; see [Microsoft Purview account creation](create-catalog-portal.md). You can publish messages to the event hub Kafka topic ATLAS_HOOK, and Microsoft Purview will consume and process them. Microsoft Purview notifies entity changes to the event hub Kafka topic ATLAS_ENTITIES, which users can consume and process. This quickstart uses the new **Azure.Messaging.EventHubs** library.
## Prerequisites
To complete this quickstart, you need the following prerequisites:
- **Microsoft Azure subscription**. To use Azure services, including Azure Event Hubs, you need a subscription. If you don't have an existing Azure account, you can sign up for a [free trial](https://azure.microsoft.com/free/) or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com).
- **Microsoft Visual Studio 2019**. The Azure Event Hubs client library makes use of new features that were introduced in C# 8.0. You can still use the library with previous C# language versions, but the new syntax won't be available. To make use of the full syntax, it is recommended that you compile with the [.NET Core SDK](https://dotnet.microsoft.com/download) 3.0 or higher and [language version](/dotnet/csharp/language-reference/configure-language-version#override-a-default) set to `latest`. If you're using Visual Studio, versions before Visual Studio 2019 aren't compatible with the tools needed to build C# 8.0 projects. Visual Studio 2019, including the free Community edition, can be downloaded [here](https://visualstudio.microsoft.com/vs/).
-## Publish messages to Azure Purview
-This section shows you how to create a .NET Core console application to send events to an Azure Purview via event hub kafka topic **ATLAS_HOOK**.
+## Publish messages to Microsoft Purview
+This section shows you how to create a .NET Core console application to send events to Microsoft Purview via the event hub Kafka topic **ATLAS_HOOK**.
## Create a Visual Studio project
Next, create a C# .NET console application in Visual Studio:
    private const string eventHubName = "<EVENT HUB NAME>";
    ```
- You can get event hub namespace associated with purview account by looking at Atlas kafka endpoint primary/secondary connection strings in properties tab of Azure Purview account.
+ You can get the Event Hubs namespace associated with your Microsoft Purview account from the Atlas Kafka endpoint primary/secondary connection strings on the **Properties** tab of the Microsoft Purview account.
:::image type="content" source="media/manage-eventhub-kafka-dotnet/properties.png" alt-text="Event Hub Namespace":::
- The event hub name should be **ATLAS_HOOK** for sending messages to Azure Purview.
+ The event hub name should be **ATLAS_HOOK** for sending messages to Microsoft Purview.
-3. Replace the `Main` method with the following `async Main` method and add an `async ProduceMessage` to push messages into Azure Purview. See the code comments for details.
+3. Replace the `Main` method with the following `async Main` method and add an async `ProduceMessage` method to push messages into Microsoft Purview. See the code comments for details.
```csharp
static async Task Main()
Next, create a C# .NET console application in Visual Studio:
```
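Putting the publish steps above together, here is a minimal end-to-end sketch. It is a sketch rather than the full sample: the placeholder strings must be filled in from your account's **Properties** tab, the message payload is a placeholder for your actual Atlas hook JSON, and it assumes the **Azure.Messaging.EventHubs** NuGet package is installed.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

class Producer
{
    // Placeholders: copy these values from the Properties tab of your account.
    private const string connectionString = "<EVENT HUBS NAMESPACE CONNECTION STRING>";
    private const string eventHubName = "ATLAS_HOOK";

    static async Task Main()
    {
        await using var producer = new EventHubProducerClient(connectionString, eventHubName);

        // Batch up one Atlas hook message; replace the placeholder with your
        // actual ENTITY_CREATE_V2 (or similar) JSON payload.
        string message = "<YOUR ATLAS HOOK JSON MESSAGE>";
        using EventDataBatch batch = await producer.CreateBatchAsync();
        if (!batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(message))))
            throw new Exception("Message too large to fit in a single batch.");

        // Send the batch to the ATLAS_HOOK topic for Microsoft Purview to consume.
        await producer.SendAsync(batch);
        Console.WriteLine("Published a batch of 1 message to ATLAS_HOOK.");
    }
}
```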
-## Consume messages from Azure Purview
-This section shows how to write a .NET Core console application that receives messages from an event hub using an event processor. You need to use ATLAS_ENTITIES event hub to receive messages from Azure Purview.The event processor simplifies receiving events from event hubs by managing persistent checkpoints and parallel receptions from those event hubs.
+## Consume messages from Microsoft Purview
+This section shows how to write a .NET Core console application that receives messages from an event hub by using an event processor. Use the ATLAS_ENTITIES event hub to receive messages from Microsoft Purview. The event processor simplifies receiving events from event hubs by managing persistent checkpoints and parallel receptions from those event hubs.
> [!WARNING]
> If you run this code on Azure Stack Hub, you will experience runtime errors unless you target a specific Storage API version. That's because the Event Hubs SDK uses the latest available Azure Storage API available in Azure that may not be available on your Azure Stack Hub platform. Azure Stack Hub may support a different version of Storage Blob SDK than those typically available on Azure. If you are using Azure Blob Storage as a checkpoint store, check the [supported Azure Storage API version for your Azure Stack Hub build](/azure-stack/user/azure-stack-acs-differences?#api-version) and target that version in your code.
In this quickstart, you use Azure Storage as the checkpoint store. Follow these
private const string blobContainerName = "<BLOB CONTAINER NAME>";
```
- You can get event hub namespace associated with purview account by looking at Atlas kafka endpoint primary/secondary connection strings in properties tab of Azure Purview account.
+ You can get the Event Hubs namespace associated with your Microsoft Purview account from the Atlas Kafka endpoint primary/secondary connection strings on the **Properties** tab of the Microsoft Purview account.
:::image type="content" source="media/manage-eventhub-kafka-dotnet/properties.png" alt-text="Event Hub Namespace":::
- The event hub name should be **ATLAS_ENTITIES** for sending messages to Azure Purview.
+ The event hub name should be **ATLAS_ENTITIES** for receiving messages from Microsoft Purview.
3. Replace the `Main` method with the following `async Main` method. See the code comments for details.
In this quickstart, you use Azure Storage as the checkpoint store. Follow these
> For the complete source code with more informational comments, see [this file on GitHub](https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/eventhub/Azure.Messaging.EventHubs.Processor/samples/Sample01_HelloWorld.md).
6. Run the receiver application.
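The consume steps above can be sketched as follows. This is a minimal, illustrative sketch — the placeholders are yours to fill in, the 30-second receive window is arbitrary, and it assumes the **Azure.Messaging.EventHubs.Processor** and **Azure.Storage.Blobs** NuGet packages are installed.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Processor;
using Azure.Storage.Blobs;

class Receiver
{
    private const string ehConnectionString = "<EVENT HUBS NAMESPACE CONNECTION STRING>";
    private const string eventHubName = "ATLAS_ENTITIES";
    private const string blobConnectionString = "<AZURE STORAGE CONNECTION STRING>";
    private const string blobContainerName = "<BLOB CONTAINER NAME>";

    static async Task Main()
    {
        // The blob container persists checkpoints so processing can resume
        // where it left off across restarts.
        var storageClient = new BlobContainerClient(blobConnectionString, blobContainerName);
        var processor = new EventProcessorClient(
            storageClient,
            EventHubConsumerClient.DefaultConsumerGroupName,
            ehConnectionString,
            eventHubName);

        processor.ProcessEventAsync += async args =>
        {
            // Each event body is an Atlas notification JSON payload.
            Console.WriteLine($"Received: {Encoding.UTF8.GetString(args.Data.Body.ToArray())}");
            await args.UpdateCheckpointAsync(args.CancellationToken);
        };
        processor.ProcessErrorAsync += args =>
        {
            Console.Error.WriteLine($"Error in partition {args.PartitionId}: {args.Exception.Message}");
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
        await Task.Delay(TimeSpan.FromSeconds(30));   // receive for a short window
        await processor.StopProcessingAsync();
    }
}
```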
-### Sample Message received from Azure Purview
+### Sample Message received from Microsoft Purview
```json
{
In this quickstart, you use Azure Storage as the checkpoint store. Follow these
```
> [!IMPORTANT]
-> Atlas currently supports the following operation types: **ENTITY_CREATE_V2**, **ENTITY_PARTIAL_UPDATE_V2**, **ENTITY_FULL_UPDATE_V2**, **ENTITY_DELETE_V2**. Pushing messages to Azure Purview is currently enabled by default. If the scenario involves reading from Azure Purview contact us as it needs to be allow-listed. (provide subscription id and name of Azure Purview account).
+> Atlas currently supports the following operation types: **ENTITY_CREATE_V2**, **ENTITY_PARTIAL_UPDATE_V2**, **ENTITY_FULL_UPDATE_V2**, **ENTITY_DELETE_V2**. Pushing messages to Microsoft Purview is currently enabled by default. If your scenario involves reading from Microsoft Purview, contact us so that it can be allow-listed (provide the subscription ID and the name of your Microsoft Purview account).
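To make the operation types above concrete, a hook message for **ENTITY_CREATE_V2** has roughly the following envelope. This is an illustrative sketch only: the exact field set is defined by Apache Atlas, and the entity type and attribute values shown here are hypothetical placeholders — consult the Apache Atlas notification documentation for the authoritative schema.

```json
{
  "message": {
    "type": "ENTITY_CREATE_V2",
    "user": "admin",
    "entities": {
      "entities": [
        {
          "typeName": "azure_sql_table",
          "attributes": {
            "qualifiedName": "mssql://example.database.windows.net/db/schema/table",
            "name": "table"
          }
        }
      ]
    }
  }
}
```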
## Next steps
purview Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/overview.md
Title: Introduction to Azure Purview
-description: This article provides an overview of Azure Purview, including its features and the problems it addresses. Azure Purview enables any user to register, discover, understand, and consume data sources.
+ Title: Introduction to Microsoft Purview
+description: This article provides an overview of Microsoft Purview, including its features and the problems it addresses. Microsoft Purview enables any user to register, discover, understand, and consume data sources.
Last updated 12/06/2021
-# What is Azure Purview?
+# What is Microsoft Purview?
-Azure Purview is a unified data governance service that helps you manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. Create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data curators to manage and secure your data estate. Empower data consumers to find valuable, trustworthy data.
+Microsoft Purview is a unified data governance service that helps you manage and govern your on-premises, multi-cloud, and software-as-a-service (SaaS) data. Create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data curators to manage and secure your data estate. Empower data consumers to find valuable, trustworthy data.
-Azure Purview automates data discovery by providing data scanning and classification as a service for assets across your data estate. Metadata and descriptions of discovered data assets are integrated into a holistic map of your data estate. Atop this map, there are purpose-built apps that create environments for data discovery, access management, and insights about your data landscape.
+Microsoft Purview automates data discovery by providing data scanning and classification as a service for assets across your data estate. Metadata and descriptions of discovered data assets are integrated into a holistic map of your data estate. Atop this map, there are purpose-built apps that create environments for data discovery, access management, and insights about your data landscape.
|App |Description |
|-|--|
Azure Purview automates data discovery by providing data scanning and classifica
## Data Map
-Azure Purview Data Map provides the foundation for data discovery and effective data governance. Azure Purview Data Map is a cloud native PaaS service that captures metadata about enterprise data present in analytics and operation systems on-premises and cloud. Azure Purview Data Map is automatically kept up to date with built-in automated scanning and classification system. Business users can configure and use the Azure Purview Data Map through an intuitive UI and developers can programmatically interact with the Data Map using open-source Apache Atlas 2.0 APIs.
-Azure Purview Data Map powers the Azure Purview Data Catalog and Azure Purview data insights as unified experiences within the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+Microsoft Purview Data Map provides the foundation for data discovery and effective data governance. Microsoft Purview Data Map is a cloud-native PaaS service that captures metadata about enterprise data present in analytics and operational systems, both on-premises and in the cloud. The Data Map is automatically kept up to date with a built-in, automated scanning and classification system. Business users can configure and use the Microsoft Purview Data Map through an intuitive UI, and developers can programmatically interact with the Data Map by using open-source Apache Atlas 2.0 APIs.
+Microsoft Purview Data Map powers the Microsoft Purview Data Catalog and Microsoft Purview data insights as unified experiences within the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
For more information, see our [introduction to Data Map](concept-elastic-data-map.md).
## Data Catalog
-With the Azure Purview Data Catalog, business and technical users alike can quickly & easily find relevant data using a search experience with filters based on various lenses like glossary terms, classifications, sensitivity labels and more. For subject matter experts, data stewards and officers, the Azure Purview Data Catalog provides data curation features like business glossary management and ability to automate tagging of data assets with glossary terms. Data consumers and producers can also visually trace the lineage of data assets starting from the operational systems on-premises, through movement, transformation & enrichment with various data storage & processing systems in the cloud to consumption in an analytics system like Power BI.
+With the Microsoft Purview Data Catalog, business and technical users alike can quickly and easily find relevant data by using a search experience with filters based on various lenses like glossary terms, classifications, sensitivity labels, and more. For subject matter experts, data stewards, and officers, the Microsoft Purview Data Catalog provides data curation features like business glossary management and the ability to automate tagging of data assets with glossary terms. Data consumers and producers can also visually trace the lineage of data assets, starting from the operational systems on-premises, through movement, transformation, and enrichment with various data storage and processing systems in the cloud, to consumption in an analytics system like Power BI.
For more information, see our [introduction to search using Data Catalog](how-to-search-catalog.md).
## Data Insights
-With the Azure Purview data insights, data officers and security officers can get a bird's-eye view and at a glance understand what data is actively scanned, where sensitive data is, and how it moves.
+With the Microsoft Purview data insights, data officers and security officers can get a bird's-eye view and at a glance understand what data is actively scanned, where sensitive data is, and how it moves.
For more information, see our [introduction to Data Insights](concept-insights.md).
Users who are responsible for ensuring the security of their organization's data
* Understanding the risk levels in your organization's data requires diving deep into your content, looking for keywords, RegEx patterns, and sensitive data types. Sensitive data types can include Credit Card numbers, Social Security numbers, or Bank Account numbers, to name a few. You constantly monitor all data sources for sensitive content, as even the smallest amount of data loss can be critical to your organization.
* Ensuring that your organization continues to comply with corporate security policies is a challenging task as your content grows and changes, and as those requirements and policies are updated for changing digital realities. Security administrators are often tasked with ensuring data security in the quickest time possible.
-## Azure Purview advantages
+## Microsoft Purview advantages
-Azure Purview is designed to address the issues mentioned in the previous sections and to help enterprises get the most value from their existing information assets. The catalog makes data sources easily discoverable and understandable by the users who manage the data.
+Microsoft Purview is designed to address the issues mentioned in the previous sections and to help enterprises get the most value from their existing information assets. The catalog makes data sources easily discoverable and understandable by the users who manage the data.
-Azure Purview provides a cloud-based service into which you can register data sources. During registration, the data remains in its existing location, but a copy of its metadata is added to Azure Purview, along with a reference to the data source location. The metadata is also indexed to make each data source easily discoverable via search and understandable to the users who discover it.
+Microsoft Purview provides a cloud-based service into which you can register data sources. During registration, the data remains in its existing location, but a copy of its metadata is added to Microsoft Purview, along with a reference to the data source location. The metadata is also indexed to make each data source easily discoverable via search and understandable to the users who discover it.
After you register a data source, you can then enrich its metadata. Either the user who registered the data source or another user in the enterprise adds the metadata. Any user can annotate a data source by providing descriptions, tags, or other metadata for requesting data source access. This descriptive metadata supplements the structural metadata, such as column names and data types, that's registered from the data source.
At the same time, users can contribute to the catalog by tagging, documenting, a
## Next steps
-To get started with Azure Purview, see [Create an Azure Purview account](create-catalog-portal.md).
+To get started with Microsoft Purview, see [Create a Microsoft Purview account](create-catalog-portal.md).
purview Quickstart ARM Create Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-ARM-create-azure-purview.md
Title: 'Quickstart: Create an Azure Purview account using an ARM Template'
-description: This Quickstart describes how to create an Azure Purview account using an ARM Template.
+ Title: 'Quickstart: Create a Microsoft Purview account using an ARM Template'
+description: This Quickstart describes how to create a Microsoft Purview account using an ARM Template.
Last updated 04/05/2022
-# Quickstart: Create an Azure Purview account using an ARM template
+# Quickstart: Create a Microsoft Purview account using an ARM template
-This quickstart describes the steps to deploy an Azure Purview account using an Azure Resource Manager (ARM) template.
+This quickstart describes the steps to deploy a Microsoft Purview account using an Azure Resource Manager (ARM) template.
-After you have created an Azure Purview account you can begin registering your data sources and using Azure Purview to understand and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Azure Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end data linage. Data consumers are able to discover data across your organization and data administrators are able to audit, secure, and ensure right use of your data.
+After you have created a Microsoft Purview account, you can begin registering your data sources and using Microsoft Purview to understand and govern your data landscape. By connecting to data across your on-premises, multi-cloud, and software-as-a-service (SaaS) sources, Microsoft Purview creates an up-to-date map of your information. It identifies and classifies sensitive data, and provides end-to-end data lineage. Data consumers are able to discover data across your organization, and data administrators are able to audit, secure, and ensure the right use of your data.
-For more information about Azure Purview, [see our overview page](overview.md). For more information about deploying Azure Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
+For more information about Microsoft Purview, [see our overview page](overview.md). For more information about deploying Microsoft Purview across your organization, [see our deployment best practices](deployment-best-practices.md).
-To deploy an Azure Purview account to your subscription using an ARM template, follow the guide below.
+To deploy a Microsoft Purview account to your subscription using an ARM template, follow the guide below.
[!INCLUDE [purview-quickstart-prerequisites](includes/purview-quickstart-prerequisites.md)]
## Deploy a custom template
If your environment meets the prerequisites and you're familiar with using ARM templates, select the **Deploy to Azure** button. The template will open in the Azure portal where you can customize values and deploy.
-The template will deploy an Azure Purview account into a new or existing resource group in your subscription.
+The template will deploy a Microsoft Purview account into a new or existing resource group in your subscription.
[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.azurepurview%2Fazure-purview-deployment%2Fazuredeploy.json)
The following resources are defined in the template:
The template performs the following tasks:
-* Creates an Azure Purview account in the specified resource group.
+* Creates a Microsoft Purview account in the specified resource group.
-## Open Azure Purview Studio
+## Open Microsoft Purview Studio
-After your Azure Purview account is created, you'll use the Azure Purview Studio to access and manage it. There are two ways to open Azure Purview Studio:
+After your Microsoft Purview account is created, you'll use the Microsoft Purview Studio to access and manage it. There are two ways to open Microsoft Purview Studio:
-* Open your Azure Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Azure Purview Studio" tile on the overview page.
- :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Azure Purview account overview page, with the Azure Purview Studio tile highlighted.":::
+* Open your Microsoft Purview account in the [Azure portal](https://portal.azure.com). Select the "Open Microsoft Purview Studio" tile on the overview page.
+ :::image type="content" source="media/create-catalog-portal/open-purview-studio.png" alt-text="Screenshot showing the Microsoft Purview account overview page, with the Microsoft Purview Studio tile highlighted.":::
-* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Azure Purview account, and sign in to your workspace.
+* Alternatively, you can browse to [https://web.purview.azure.com](https://web.purview.azure.com), select your Microsoft Purview account, and sign in to your workspace.
## Get started with your Purview resource
Write-Host "Press [ENTER] to continue..."
## Next steps
-In this quickstart, you learned how to create an Azure Purview account and how to access it through the Azure Purview Studio.
+In this quickstart, you learned how to create a Microsoft Purview account and how to access it through the Microsoft Purview Studio.
-Next, you can create a user-assigned managed identity (UAMI) that will enable your new Azure Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
+Next, you can create a user-assigned managed identity (UAMI) that will enable your new Microsoft Purview account to authenticate directly with resources using Azure Active Directory (Azure AD) authentication.
To create a UAMI, follow our [guide to create a user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity).
-Follow these next articles to learn how to navigate the Azure Purview Studio, create a collection, and grant access to Azure Purview:
+Follow these next articles to learn how to navigate the Microsoft Purview Studio, create a collection, and grant access to Microsoft Purview:
> [!div class="nextstepaction"]
-> [Using the Azure Purview Studio](use-azure-purview-studio.md)
+> [Using the Microsoft Purview Studio](use-azure-purview-studio.md)
> [Create a collection](quickstart-create-collection.md)
-> [Add users to your Azure Purview account](catalog-permissions.md)
+> [Add users to your Microsoft Purview account](catalog-permissions.md)
purview Quickstart Create Collection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-create-collection.md
Title: 'Quickstart: Create a collection'
-description: Collections are used for access control, and asset organization in Azure Purview. This article describes how to create a collection and add permissions, register sources, and register assets to collections.
+description: Collections are used for access control, and asset organization in Microsoft Purview. This article describes how to create a collection and add permissions, register sources, and register assets to collections.
Last updated 11/04/2021
-# Quickstart: Create a collection and assign permissions in Azure Purview
+# Quickstart: Create a collection and assign permissions in Microsoft Purview
-Collections are Azure Purview's tool to manage ownership and access control across assets, sources, and information. They also organize your sources and assets into categories that are customized to match your management experience with your data. This guide will take you through setting up your first collection and collection admin to prepare your Azure Purview environment for your organization.
+Collections are Microsoft Purview's tool to manage ownership and access control across assets, sources, and information. They also organize your sources and assets into categories that are customized to match your management experience with your data. This guide will take you through setting up your first collection and collection admin to prepare your Microsoft Purview environment for your organization.
## Prerequisites
Collections are Azure Purview's tool to manage ownership and access control acro
* Your own [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
## Check permissions
-In order to create and manage collections in Azure Purview, you will need to be a **Collection Admin** within Azure Purview. We can check these permissions in the [Azure Purview Studio](use-azure-purview-studio.md). You can find the studio by going to your Azure Purview account in the [Azure portal](https://portal.azure.com), and selecting the **Open Azure Purview Studio** tile on the overview page.
+In order to create and manage collections in Microsoft Purview, you will need to be a **Collection Admin** within Microsoft Purview. We can check these permissions in the [Microsoft Purview Studio](use-azure-purview-studio.md). You can find the studio by going to your Microsoft Purview account in the [Azure portal](https://portal.azure.com), and selecting the **Open Microsoft Purview Studio** tile on the overview page.
1. Select Data Map > Collections from the left pane to open collection management page.
- :::image type="content" source="./media/quickstart-create-collection/find-collections.png" alt-text="Screenshot of Azure Purview studio opened to the Data Map, with the Collections tab selected." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/find-collections.png" alt-text="Screenshot of Microsoft Purview studio opened to the Data Map, with the Collections tab selected." border="true":::
-1. Select your root collection. This is the top collection in your collection list and will have the same name as your Azure Purview account. In our example below, it's called Contoso Azure Purview.
+1. Select your root collection. This is the top collection in your collection list and will have the same name as your Microsoft Purview account. In our example below, it's called Contoso Microsoft Purview.
- :::image type="content" source="./media/quickstart-create-collection/select-root-collection.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/select-root-collection.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
1. Select role assignments in the collection window.
- :::image type="content" source="./media/quickstart-create-collection/role-assignments.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/role-assignments.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-1. To create a collection, you will need to be in the collection admin list under role assignments. If you created the Azure Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
+1. To create a collection, you will need to be in the collection admin list under role assignments. If you created the Microsoft Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
- :::image type="content" source="./media/quickstart-create-collection/collection-admins.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/collection-admins.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
## Create a collection in the portal
-To create your collection, we'll start in the [Azure Purview Studio](use-azure-purview-studio.md). You can find the studio by going to your Azure Purview account in the Azure portal and selecting the **Open Azure Purview Studio** tile on the overview page.
+To create your collection, we'll start in the [Microsoft Purview Studio](use-azure-purview-studio.md). You can find the studio by going to your Microsoft Purview account in the Azure portal and selecting the **Open Microsoft Purview Studio** tile on the overview page.
1. Select Data Map > Collections from the left pane to open collection management page.
- :::image type="content" source="./media/quickstart-create-collection/find-collections-2.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/find-collections-2.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
1. Select **+ Add a collection**.
- :::image type="content" source="./media/quickstart-create-collection/select-add-collection.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the Collections tab selected and Add a Collection highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/select-add-collection.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the Collections tab selected and Add a Collection highlighted." border="true":::
1. In the right panel, enter the collection name, description, and search for users to add them as collection admins.
- :::image type="content" source="./media/quickstart-create-collection/create-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/create-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
1. Select **Create**. The collection information will reflect on the page.
- :::image type="content" source="./media/quickstart-create-collection/created-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the newly created collection window." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/created-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the newly created collection window." border="true":::
## Assign permissions to collection
-Now that you have a collection, you can assign permissions to this collection to manage your users access to Azure Purview.
+Now that you have a collection, you can assign permissions to this collection to manage your users' access to Microsoft Purview.
### Roles
All assigned roles apply to sources, assets, and other objects within the collec
1. Select **Role assignments** tab to see all the roles in a collection.
- :::image type="content" source="./media/quickstart-create-collection/select-role-assignments.png" alt-text="Screenshot of Azure Purview studio collection window, with the role assignments tab highlighted." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/select-role-assignments.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the role assignments tab highlighted." border="true":::
1. Select **Edit role assignments** or the person icon to edit each role member.
- :::image type="content" source="./media/quickstart-create-collection/edit-role-assignments.png" alt-text="Screenshot of Azure Purview studio collection window, with the edit role assignments dropdown list selected." border="true":::
+ :::image type="content" source="./media/quickstart-create-collection/edit-role-assignments.png" alt-text="Screenshot of Microsoft Purview studio collection window, with the edit role assignments dropdown list selected." border="true":::
1. Type in the textbox to search for users you want to add to the role member. Select **OK** to save the change.
purview Reference Azure Purview Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/reference-azure-purview-glossary.md
Title: Azure Purview product glossary
-description: A glossary defining the terminology used throughout Azure Purview
+ Title: Microsoft Purview product glossary
+description: A glossary defining the terminology used throughout Microsoft Purview
Last updated 04/14/2022
-# Azure Purview product glossary
+# Microsoft Purview product glossary
-Below is a glossary of terminology used throughout Azure Purview.
+Below is a glossary of terminology used throughout Microsoft Purview.
## Advanced resource sets
-A set of features activated at the Azure Purview instance level that, when enabled, enrich resource set assets by computing additional aggregations on the metadata to provide information such as partition counts, total size, and schema counts. Resource set pattern rules are also included.
+A set of features activated at the Microsoft Purview instance level that, when enabled, enrich resource set assets by computing additional aggregations on the metadata to provide information such as partition counts, total size, and schema counts. Resource set pattern rules are also included.
## Annotation
-Information that is associated with data assets in Azure Purview, for example, glossary terms and classifications. After they are applied, annotations can be used within Search to aid in the discovery of the data assets.
+Information that is associated with data assets in Microsoft Purview, for example, glossary terms and classifications. After they are applied, annotations can be used within Search to aid in the discovery of the data assets.
## Approved The state given to any request that has been accepted as satisfactory by the designated individual or group who has authority to change the state of the request. ## Asset
-Any single object that is stored within an Azure Purview data catalog.
+Any single object that is stored within a Microsoft Purview data catalog.
> [!NOTE] > A single object in the catalog could potentially represent many objects in storage, for example, a resource set is an asset but it's made up of many partition files in storage. ## Azure Information Protection
A cloud solution that supports labeling of documents and emails to classify and
## Business glossary A searchable list of specialized terms that an organization uses to describe key business words and their definitions. Using a business glossary can provide consistent data usage across the organization. ## Capacity unit
-A measure of data map usage. All Azure Purview data maps include one capacity unit by default, which provides up to 2GB of metadata storage and has a throughput of 25 data map operations/second.
+A measure of data map usage. All Microsoft Purview data maps include one capacity unit by default, which provides up to 2GB of metadata storage and has a throughput of 25 data map operations/second.
## Classification report A report that shows key classification details about the scanned data. ## Classification
A type of annotation used to identify an attribute of an asset or a column such
## Classification rule A classification rule is a set of conditions that determine how scanned data should be classified when content matches the specified pattern. ## Classified asset
-An asset where Azure Purview extracts schema and applies classifications during an automated scan. The scan rule set determines which assets get classified. If the asset is considered a candidate for classification and no classifications are applied during scan time, an asset is still considered a classified asset.
+An asset where Microsoft Purview extracts schema and applies classifications during an automated scan. The scan rule set determines which assets get classified. If the asset is considered a candidate for classification and no classifications are applied during scan time, an asset is still considered a classified asset.
## Collection An organization-defined grouping of assets, terms, annotations, and sources. Collections allow for easier fine-grained access control and discoverability of assets within a data catalog. ## Collection admin
-A role that can assign roles in Azure Purview. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
+A role that can assign roles in Microsoft Purview. Collection admins can add users to roles on collections where they're admins. They can also edit collections, their details, and add subcollections.
## Column pattern A regular expression included in a classification rule that represents the column names that you want to match. ## Contact
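To make the "Column pattern" definition above concrete: a column pattern is simply a regular expression evaluated against column names. A minimal sketch (the pattern and column names here are made up for illustration, not taken from Purview):

```python
import re

# Illustrative column pattern: matches names like "employee_id" or "EmployeeID".
column_pattern = re.compile(r"(?i)employee[_ ]?id")

columns = ["EmployeeID", "hire_date", "employee_id", "salary"]
# Keep only the columns whose full name matches the pattern.
matched = [c for c in columns if column_pattern.fullmatch(c)]
```

A classification rule would apply its classification to the matched columns only.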
An operation that manages resources in your subscription, such as role-based acc
## Credential A verification of identity or tool used in an access control system. Credentials can be used to authenticate an individual or group to grant access to a data asset. ## Data catalog
-Azure Purview features that enable customers to view and manage the metadata for assets in your data estate.
+Microsoft Purview features that enable customers to view and manage the metadata for assets in your data estate.
## Data curator A role that provides access to the data catalog to manage assets, configure custom classifications, set up glossary terms, and view insights. Data curators can create, read, modify, move, and delete assets. They can also apply annotations to assets. ## Data map
-Azure Purview features that enable customers to manage their data estate, such as scanning, lineage, and movement.
+A metadata repository that is the foundation of Microsoft Purview. The data map is a graph that describes assets across a data estate and is populated through scans and other data ingestion processes. This graph helps organizations understand and govern their data by providing rich descriptions of assets, representing data lineage, classifying assets, storing relationships between assets, and housing information at both the technical and semantic layers. The data map is an open platform that can be interacted with and accessed through Apache Atlas APIs or the Microsoft Purview Governance Portal.
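Since the data map can be accessed through Apache Atlas APIs, a request against it is ordinary REST. As a minimal sketch (the account name is hypothetical, and the endpoint path is an assumption based on the Atlas v2 API surface; a real call also needs an Azure AD bearer token), a basic-search URL might be built like this:

```python
def atlas_search_url(account_name: str, keywords: str, limit: int = 10) -> str:
    """Build a basic-search URL against a Purview account's Atlas v2
    catalog endpoint. Endpoint shape is assumed for illustration."""
    base = f"https://{account_name}.purview.azure.com/catalog/api/atlas/v2"
    return f"{base}/search/basic?query={keywords}&limit={limit}"

url = atlas_search_url("contoso-purview", "sales")
```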
## Data map operation A create, read, update, or delete action performed on an entity in the data map. For example, creating an asset in the data map is considered a data map operation. ## Data owner
An individual or group responsible for managing a data asset.
## Data pattern A regular expression that represents the data that is stored in a data field. For example, a data pattern for employee ID could be Employee{GUID}. ## Data plane operation
-An operation within a specific Azure Purview instance, such as editing an asset or creating a glossary term. Each instance has predefined roles, such as "data reader" and "data curator" that control which data plane operations a user can perform.
+An operation within a specific Microsoft Purview instance, such as editing an asset or creating a glossary term. Each instance has predefined roles, such as "data reader" and "data curator" that control which data plane operations a user can perform.
## Data reader A role that provides read-only access to data assets, classifications, classification rules, collections, glossary terms, and insights. ## Data source admin
-A role that can manage data sources and scans. A user in the Data source admin role doesn't have access to Azure Purview studio. Combining this role with the Data reader or Data curator roles at any collection scope provides Azure Purview studio access.
+A role that can manage data sources and scans. A user in the Data source admin role doesn't have access to Microsoft Purview studio. Combining this role with the Data reader or Data curator roles at any collection scope provides Microsoft Purview studio access.
## Data steward An individual or group responsible for maintaining nomenclature, data quality standards, security controls, compliance requirements, and rules for the associated object. ## Data dictionary A list of canonical names of database columns and their corresponding data types. It is often used to describe the format and structure of a database, and the relationship between its elements. ## Discovered asset
-An asset that Azure Purview identifies in a data source during the scanning process. The number of discovered assets includes all files or tables before resource set grouping.
+An asset that Microsoft Purview identifies in a data source during the scanning process. The number of discovered assets includes all files or tables before resource set grouping.
## Distinct match threshold The total number of distinct data values that need to be found in a column before the scanner runs the data pattern on it. For example, a distinct match threshold of eight for employee ID requires that there are at least eight unique data values among the sampled values in the column that match the data pattern set for employee ID. ## Expert
An entry in the Business glossary that defines a concept specific to an organiza
## Incremental scan A scan that detects and processes assets that have been created, modified, or deleted since the previous successful scan. To run an incremental scan, at least one full scan must be completed on the source. ## Ingested asset
-An asset that has been scanned, classified (when applicable), and added to the Azure Purview data map. Ingested assets are discoverable and consumable within the data catalog through automated scanning or external connections, such as Azure Data Factory and Azure Synapse.
+An asset that has been scanned, classified (when applicable), and added to the Microsoft Purview data map. Ingested assets are discoverable and consumable within the data catalog through automated scanning or external connections, such as Azure Data Factory and Azure Synapse.
## Insights
-An area within Azure Purview where you can view reports that summarize information about your data.
+An area within Microsoft Purview where you can view reports that summarize information about your data.
## Integration runtime The compute infrastructure used to scan in a data source. ## Lineage How data transforms and flows as it moves from its origin to its destination. Understanding this flow across the data estate helps organizations see the history of their data, and aid in troubleshooting or impact analysis. ## Management
-An area within Azure Purview where you can manage connections, users, roles, and credentials. Also referred to as "Management center."
+An area within Microsoft Purview where you can manage connections, users, roles, and credentials. Also referred to as "Management center."
## Minimum match threshold The minimum percentage of matches among the distinct data values in a column that must be found by the scanner for a classification to be applied.
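The two thresholds defined above (distinct match threshold and minimum match threshold) combine into a simple decision rule. A sketch of that logic (the default values of 8 and 60% are illustrative assumptions, not Purview's actual defaults):

```python
import re

def classification_applies(sampled_values, data_pattern,
                           distinct_match_threshold=8,
                           minimum_match_threshold=60.0):
    """Return True if a classification should be applied to a column.

    1. At least `distinct_match_threshold` distinct sampled values
       must match the data pattern.
    2. The share of matching values among all distinct sampled values
       must reach `minimum_match_threshold` percent.
    """
    distinct = set(sampled_values)
    matching = {v for v in distinct if re.fullmatch(data_pattern, v)}
    if len(matching) < distinct_match_threshold:
        return False
    return 100.0 * len(matching) / len(distinct) >= minimum_match_threshold
```

For example, eight distinct values matching an employee-ID pattern out of ten distinct values gives an 80% match rate, so both thresholds are satisfied.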
Data that is in a data center controlled by a customer, for example, not in the
## Owner An individual or group in charge of managing a data asset. ## Pattern rule
-A configuration that overrides how Azure Purview groups assets as resource sets and displays them within the catalog.
-## Azure Purview instance
-A single Azure Purview account.
+A configuration that overrides how Microsoft Purview groups assets as resource sets and displays them within the catalog.
+## Microsoft Purview instance
+A single Microsoft Purview account.
## Registered source
-A source that has been added to an Azure Purview instance and is now managed as a part of the Data catalog.
+A source that has been added to a Microsoft Purview instance and is now managed as a part of the Data catalog.
## Related terms Glossary terms that are linked to other terms within the organization. ## Resource set
-A single asset that represents many partitioned files or objects in storage. For example, Azure Purview stores partitioned Apache Spark output as a single resource set instead of unique assets for each individual file.
+A single asset that represents many partitioned files or objects in storage. For example, Microsoft Purview stores partitioned Apache Spark output as a single resource set instead of unique assets for each individual file.
## Role
-Permissions assigned to a user within an Azure Purview instance. Roles, such as Azure Purview Data Curator or Azure Purview Data Reader, determine what can be done within the product.
+Permissions assigned to a user within a Microsoft Purview instance. Roles, such as Microsoft Purview Data Curator or Microsoft Purview Data Reader, determine what can be done within the product.
## Root collection
-A system-generated collection that has the same friendly name as the Azure Purview account. All assets belong to the root collection by default.
+A system-generated collection that has the same friendly name as the Microsoft Purview account. All assets belong to the root collection by default.
## Scan
-An Azure Purview process that examines a source or set of sources and ingests its metadata into the data catalog. Scans can be run manually or on a schedule using a scan trigger.
+A Microsoft Purview process that discovers and examines metadata in a source or set of sources to populate the data map. A scan automatically connects to a source, extracts metadata, captures lineage, and applies classifications. Scans can be run manually or on a schedule.
## Scan rule set A set of rules that define which data types and classifications a scan ingests into a catalog. ## Scan trigger
The scoring of data assets that determine the order search results are returned.
## Self-hosted integration runtime An integration runtime installed on an on-premises machine or virtual machine inside a private network that is used to connect to data on-premises or in a private network. ## Sensitivity label
-Annotations that classify and protect an organization's data. Azure Purview integrates with Microsoft Information Protection for creation of sensitivity labels.
+Annotations that classify and protect an organization's data. Microsoft Purview integrates with Microsoft Information Protection for creation of sensitivity labels.
## Sensitivity label report A summary of which sensitivity labels are applied across the data estate. ## Service A product that provides standalone functionality and is available to customers by subscription or license. ## Source
-A system where data is stored. Sources can be hosted in various places such as a cloud or on-premises. You register and scan sources so that you can manage them in Azure Purview.
+A system where data is stored. Sources can be hosted in various places such as a cloud or on-premises. You register and scan sources so that you can manage them in Microsoft Purview.
## Source type
-A categorization of the registered sources used in an Azure Purview instance, for example, Azure SQL Database, Azure Blob Storage, Amazon S3, or SAP ECC.
+A categorization of the registered sources used in a Microsoft Purview instance, for example, Azure SQL Database, Azure Blob Storage, Amazon S3, or SAP ECC.
## Steward An individual who defines the standards for a glossary term. They are responsible for maintaining quality standards, nomenclature, and rules for the assigned entity. ## Term template
An automated process that coordinates the creation and modification of catalog e
## Next steps
-To get started with Azure Purview, see [Quickstart: Create an Azure Purview account](create-catalog-portal.md).
+To get started with Microsoft Purview, see [Quickstart: Create a Microsoft Purview account](create-catalog-portal.md).
purview Register Scan Adls Gen1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-adls-gen1.md
Title: 'Register and scan Azure Data Lake Storage (ADLS) Gen1'
-description: This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Azure Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen 1 source
+description: This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen 1 source
Last updated 11/10/2021
-# Connect to Azure Data Lake Gen1 in Azure Purview
+# Connect to Azure Data Lake Gen1 in Microsoft Purview
-This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Azure Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen1 source.
+This article outlines the process to register an Azure Data Lake Storage Gen1 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen1 source.
> [!Note] > Azure Data Lake Storage Gen2 is now generally available. We recommend that you start using it today. For more information, see the [product page](https://azure.microsoft.com/services/storage/data-lake-storage/).
This article outlines the process to register an Azure Data Lake Storage Gen1 da
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
This section will enable you to register the ADLS Gen1 data source and set up an
### Steps to register
-It is important to register the data source in Azure Purview prior to setting up a scan for the data source.
+It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-purview-acct.png" alt-text="Screenshot that shows the Azure Purview account used to register the data source":::
+ :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-purview-acct.png" alt-text="Screenshot that shows the Microsoft Purview account used to register the data source":::
-1. **Open Azure Purview Studio** and navigate to the **Data Map --> Sources**
+1. **Open Microsoft Purview Studio** and navigate to the **Data Map --> Sources**
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-open-purview-studio.png" alt-text="Screenshot that shows the link to open Azure Purview Studio":::
+ :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview Studio":::
:::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
The following options are supported:
> [!Note] > If you have firewall enabled for the storage account, you must use managed identity authentication method when setting up a scan.
-* **System-assigned managed identity (Recommended)** - As soon as the Azure Purview Account is created, a system **Managed Identity** is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Azure Purview SAMI to perform the scans.
+* **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview Account is created, a system **Managed Identity** is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform the scans.
-* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity is a credential resource that can be used to allow Azure Purview to authenticate against Azure Active Directory. For more information, you can see our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
+* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
* **Service Principal** - In this method, you can create a new or use an existing service principal in your Azure Active Directory tenant.
The following options are supported:
#### Using system or user-assigned managed identity for scanning
-It is important to give your Azure Purview account the permission to scan the ADLS Gen1 data source. You can add the system managed identity, or user-assigned managed identity at the Subscription, Resource Group, or Resource level, depending on what you want it to have scan permissions on.
+It is important to give your Microsoft Purview account the permission to scan the ADLS Gen1 data source. You can add the system managed identity, or user-assigned managed identity at the Subscription, Resource Group, or Resource level, depending on what you want it to have scan permissions on.
> [!Note] > You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
It is important to give your Azure Purview account the permission to scan the AD
:::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-storage-access.png" alt-text="Screenshot that shows the Data explorer for the storage account":::
-1. Choose **Select** and add the _Azure Purview Name_ (which is the system managed identity) or the _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_(preview), that has already been registered in Azure Purview, in the **Select user or group** menu.
+1. Choose **Select** and add the _Microsoft Purview Name_ (which is the system managed identity) or the _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_(preview), that has already been registered in Microsoft Purview, in the **Select user or group** menu.
1. Select **Read** and **Execute** permissions. Make sure to choose **This folder and all children**, and **An access permission entry and a default permission entry** in the Add options as shown in the below screenshot. Select **OK**
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Azure Purview account":::
+ :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
> [!Tip] > An **access permission entry** is a permission entry on _current_ files and folders. A **default permission entry** is a permission entry that will be _inherited_ by new files and folders.
It is important to give your service principal the permission to scan the ADLS G
### Creating the scan
-1. Open your **Azure Purview account** and select the **Open Azure Purview Studio**
+1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview Studio**
- :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-purview-acct.png" alt-text="Screenshot that shows the Open Azure Purview Studio":::
+ :::image type="content" source="media/register-scan-adls-gen1/register-adls-gen1-purview-acct.png" alt-text="Screenshot that shows the Open Microsoft Purview Studio":::
1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
Scans can be managed or run again on completion.
> [!NOTE] > * Deleting your scan does not delete catalog assets created from previous scans.
- > * The asset will no longer be updated with schema changes if your source table has changed and you re-scan the source table after editing the description in the schema tab of Azure Purview.
+ > * The asset will no longer be updated with schema changes if your source table has changed and you re-scan the source table after editing the description in the schema tab of Microsoft Purview.
1. You can _run an incremental scan_ or a _full scan_ again.
Scans can be managed or run again on completion.
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Adls Gen2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-adls-gen2.md
Title: 'Register and scan Azure Data Lake Storage (ADLS) Gen2'
-description: This article outlines the process to register an Azure Data Lake Storage Gen2 data source in Azure Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen2 source
+description: This article outlines the process to register an Azure Data Lake Storage Gen2 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen2 source
Last updated 01/24/2022
-# Connect to Azure Data Lake Gen2 in Azure Purview
+# Connect to Azure Data Lake Gen2 in Microsoft Purview
-This article outlines the process to register an Azure Data Lake Storage Gen2 data source in Azure Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen2 source
+This article outlines the process to register an Azure Data Lake Storage Gen2 data source in Microsoft Purview including instructions to authenticate and interact with the Azure Data Lake Storage Gen2 source
## Supported capabilities
This article outlines the process to register an Azure Data Lake Storage Gen2 da
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
This section will enable you to register the ADLS Gen2 data source and set up an
### Steps to register
-It is important to register the data source in Azure Purview prior to setting up a scan for the data source.
+It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-purview-acct.png" alt-text="Screenshot that shows the Azure Purview account used to register the data source":::
+ :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-purview-acct.png" alt-text="Screenshot that shows the Microsoft Purview account used to register the data source":::
-1. **Open Azure Purview Studio** and navigate to the **Data Map --> Sources**
+1. **Open Microsoft Purview Studio** and navigate to the **Data Map --> Sources**
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-open-purview-studio.png" alt-text="Screenshot that shows the link to open Azure Purview Studio":::
+ :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview Studio":::
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
The following options are supported:
> [!Note] > If you have firewall enabled for the storage account, you must use managed identity authentication method when setting up a scan.
-* **System-assigned managed identity (Recommended)** - As soon as the Azure Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Azure Purview system-assigned managed identity (SAMI) to perform the scans.
+* **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview system-assigned managed identity (SAMI) to perform the scans.
-* **User-assigned managed identity** (preview) - Similar to a system managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Azure Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
+* **User-assigned managed identity** (preview) - Similar to a system managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-* **Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Azure Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
+* **Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Microsoft Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
> [!Note]
 - > If you use this option, you need to deploy an _Azure key vault_ resource in your subscription and assign _Azure Purview account's_ SAMI with required access permission to secrets inside _Azure key vault_.
 + > If you use this option, you need to deploy an _Azure key vault_ resource in your subscription and assign _Microsoft Purview account's_ SAMI with required access permission to secrets inside _Azure key vault_.
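As a sketch of the Account Key option described above, the storage account key can be stored as a Key Vault secret and the Purview system-assigned managed identity granted read access to it. The vault and secret names below are hypothetical placeholders, and the `az` commands require an authenticated Azure CLI session, so they are shown commented out:

```shell
# Hypothetical names; substitute your own.
VAULT_NAME="contoso-vault"
SECRET_NAME="adls-gen2-account-key"

# Store the storage account key as a secret (requires Azure CLI login):
# az keyvault secret set --vault-name "$VAULT_NAME" \
#   --name "$SECRET_NAME" --value "<storage-account-key>"

# Allow the Purview system-assigned managed identity to read secrets:
# az keyvault set-policy --name "$VAULT_NAME" \
#   --object-id "<purview-sami-object-id>" \
#   --secret-permissions get list

echo "$VAULT_NAME/$SECRET_NAME"
```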
* **Service Principal** - In this method, you can create a new or use an existing service principal in your Azure Active Directory tenant.
The following options are supported:
#### Using a system or user assigned managed identity for scanning
-It is important to give your Azure Purview account or user-assigned managed identity (UAMI) the permission to scan the ADLS Gen2 data source. You can add your Azure Purview account's system-assigned managed identity (which has the same name as your Azure Purview account) or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permissions are needed.
+It is important to give your Microsoft Purview account or user-assigned managed identity (UAMI) the permission to scan the ADLS Gen2 data source. You can add your Microsoft Purview account's system-assigned managed identity (which has the same name as your Microsoft Purview account) or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permissions are needed.
> [!Note]
> You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
It is important to give your Azure Purview account or user-assigned managed iden
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Azure Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under the **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under the **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
- :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Azure Purview account":::
+ :::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
> [!Note]
> For more details, please see steps in [Authorize access to blobs and queues using Azure Active Directory](../storage/blobs/authorize-access-azure-active-directory.md)
When authentication method selected is **Account Key**, you need to get your acc
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-secret.png" alt-text="Screenshot that shows the key vault option to create a secret":::
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan

#### Using Service Principal for scanning
It is important to give your service principal the permission to scan the ADLS G
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ under **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ under **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-sp-permission.png" alt-text="Screenshot that shows the details to provide storage account permissions to the service principal":::

### Create the scan
-1. Open your **Azure Purview account** and select the **Open Azure Purview Studio**
+1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview Studio**
1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
1. Select the **New Scan** icon under the **ADLS Gen2 data source** registered earlier
It is important to give your service principal the permission to scan the ADLS G
## Access policy
-Access policies allow data owners to manage access to datasets from Azure Purview. Owners can monitor and manage data use from within the Azure Purview Studio, without directly modifying the storage account where the data is housed.
+Access policies allow data owners to manage access to datasets from Microsoft Purview. Owners can monitor and manage data use from within the Microsoft Purview Studio, without directly modifying the storage account where the data is housed.
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
To create an access policy for Azure Data Lake Storage Gen 2, follow the guideli
### Enable data use governance
-Data use governance is an option on your Azure Purview sources that will allow you to manage access for that source from within Azure Purview.
+Data use governance is an option on your Microsoft Purview sources that will allow you to manage access for that source from within Microsoft Purview.
To enable data use governance, follow [the data use governance guide](how-to-enable-data-use-governance.md#enable-data-use-governance).

### Create an access policy
Or you can follow the [generic guide for creating data access policies](how-to-d
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Amazon Rds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-amazon-rds.md
Title: Amazon RDS Multi-cloud scanning connector for Azure Purview
+ Title: Amazon RDS Multi-cloud scanning connector for Microsoft Purview
description: This how-to guide describes details of how to scan Amazon RDS databases, including both Microsoft SQL and PostgreSQL data.
Last updated 10/18/2021
-# Customer intent: As a security officer, I need to understand how to use the Azure Purview connector for Amazon RDS service to set up, configure, and scan my Amazon RDS databases.
+# Customer intent: As a security officer, I need to understand how to use the Microsoft Purview connector for Amazon RDS service to set up, configure, and scan my Amazon RDS databases.
-# Amazon RDS Multi-Cloud Scanning Connector for Azure Purview (Public preview)
+# Amazon RDS Multi-Cloud Scanning Connector for Microsoft Purview (Public preview)
-The Multi-Cloud Scanning Connector for Azure Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services, in addition to Azure storage services.
+The Multi-Cloud Scanning Connector for Microsoft Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services, in addition to Azure storage services.
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-This article describes how to use Azure Purview to scan your structured data currently stored in Amazon RDS, including both Microsoft SQL and PostgreSQL databases, and discover what types of sensitive information exists in your data. You'll also learn how to identify the Amazon RDS databases where the data is currently stored for easy information protection and data compliance.
+This article describes how to use Microsoft Purview to scan your structured data currently stored in Amazon RDS, including both Microsoft SQL and PostgreSQL databases, and discover what types of sensitive information exists in your data. You'll also learn how to identify the Amazon RDS databases where the data is currently stored for easy information protection and data compliance.
-For this service, use Azure Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connectors for Azure Purview will run. The Multi-Cloud Scanning Connectors for Azure Purview use this access to your Amazon RDS databases to read your data, and then reports the scanning results, including only the metadata and classification, back to Azure. Use the Azure Purview classification and labeling reports to analyze and review your data scan results.
+For this service, use Microsoft Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connectors for Microsoft Purview will run. The Multi-Cloud Scanning Connectors for Microsoft Purview use this access to your Amazon RDS databases to read your data, and then reports the scanning results, including only the metadata and classification, back to Azure. Use the Microsoft Purview classification and labeling reports to analyze and review your data scan results.
> [!IMPORTANT]
-> The Multi-Cloud Scanning Connectors for Azure Purview are separate add-ons to Azure Purview. The terms and conditions for the Multi-Cloud Scanning Connectors for Azure Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
+> The Multi-Cloud Scanning Connectors for Microsoft Purview are separate add-ons to Microsoft Purview. The terms and conditions for the Multi-Cloud Scanning Connectors for Microsoft Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
>
-## Azure Purview scope for Amazon RDS
+## Microsoft Purview scope for Amazon RDS
-- **Supported database engines**: Amazon RDS structured data storage supports multiple database engines. Azure Purview supports Amazon RDS with/based on Microsoft SQL and PostgreSQL.
+- **Supported database engines**: Amazon RDS structured data storage supports multiple database engines. Microsoft Purview supports Amazon RDS with/based on Microsoft SQL and PostgreSQL.
- **Maximum columns supported**: Scanning RDS tables with more than 300 columns is not supported.
-- **Public access support**: Azure Purview supports scanning only with VPC Private Link in AWS, and does not include public access scanning.
+- **Public access support**: Microsoft Purview supports scanning only with VPC Private Link in AWS, and does not include public access scanning.
-- **Supported regions**: Azure Purview only supports Amazon RDS databases that are located in the following AWS regions:
+- **Supported regions**: Microsoft Purview only supports Amazon RDS databases that are located in the following AWS regions:
  - US East (Ohio)
  - US East (N. Virginia)
For this service, use Azure Purview to provide a Microsoft account with secure a
For more information, see:

-- [Manage and increase quotas for resources with Azure Purview](how-to-manage-quotas.md)
-- [Supported data sources and file types in Azure Purview](sources-and-scans.md)
-- [Use private endpoints for your Azure Purview account](catalog-private-link.md)
+- [Manage and increase quotas for resources with Microsoft Purview](how-to-manage-quotas.md)
+- [Supported data sources and file types in Microsoft Purview](sources-and-scans.md)
+- [Use private endpoints for your Microsoft Purview account](catalog-private-link.md)
## Prerequisites
-Ensure that you've performed the following prerequisites before adding your Amazon RDS database as Azure Purview data sources and scanning your RDS data.
+Ensure that you've performed the following prerequisites before adding your Amazon RDS database as Microsoft Purview data sources and scanning your RDS data.
> [!div class="checklist"]
-> * You need to be an Azure Purview Data Source Admin.
-> * You need an Azure Purview account. [Create an Azure Purview account instance](create-catalog-portal.md), if you don't yet have one.
+> * You need to be a Microsoft Purview Data Source Admin.
+> * You need a Microsoft Purview account. [Create a Microsoft Purview account instance](create-catalog-portal.md), if you don't yet have one.
> * You need an Amazon RDS PostgreSQL or Microsoft SQL database, with data.
-## Configure AWS to allow Azure Purview to connect to your RDS VPC
+## Configure AWS to allow Microsoft Purview to connect to your RDS VPC
-Azure Purview supports scanning only when your database is hosted in a virtual private cloud (VPC), where your RDS database can only be accessed from within the same VPC.
+Microsoft Purview supports scanning only when your database is hosted in a virtual private cloud (VPC), where your RDS database can only be accessed from within the same VPC.
-The Azure Multi-Cloud Scanning Connectors for Azure Purview service run in a separate, Microsoft account in AWS. To scan your RDS databases, the Microsoft AWS account needs to be able to access your RDS databases in your VPC. To allow this access, you'll need to configure [AWS PrivateLink](https://aws.amazon.com/privatelink/) between the RDS VPC (in the customer account) and the VPC where the Multi-Cloud Scanning Connectors for Azure Purview run (in the Microsoft account).
+The Azure Multi-Cloud Scanning Connectors for Microsoft Purview service run in a separate, Microsoft account in AWS. To scan your RDS databases, the Microsoft AWS account needs to be able to access your RDS databases in your VPC. To allow this access, you'll need to configure [AWS PrivateLink](https://aws.amazon.com/privatelink/) between the RDS VPC (in the customer account) and the VPC where the Multi-Cloud Scanning Connectors for Microsoft Purview run (in the Microsoft account).
-The following diagram shows the components in both your customer account and Microsoft account. Highlighted in yellow are the components you'll need to create to enable connectivity from the RDS VPC in your account to the VPC where the Multi-Cloud Scanning Connectors for Azure Purview run in the Microsoft account.
+The following diagram shows the components in both your customer account and Microsoft account. Highlighted in yellow are the components you'll need to create to enable connectivity from the RDS VPC in your account to the VPC where the Multi-Cloud Scanning Connectors for Microsoft Purview run in the Microsoft account.
> [!IMPORTANT]
The following diagram shows the components in both your customer account and Mic
### Configure AWS PrivateLink using a CloudFormation template
-The following procedure describes how to use an AWS CloudFormation template to configure AWS PrivateLink, allowing Azure Purview to connect to your RDS VPC. This procedure is performed in AWS and is intended for an AWS admin.
+The following procedure describes how to use an AWS CloudFormation template to configure AWS PrivateLink, allowing Microsoft Purview to connect to your RDS VPC. This procedure is performed in AWS and is intended for an AWS admin.
This CloudFormation template is available for download from the [Azure GitHub repository](https://github.com/Azure/Azure-Purview-Starter-Kit/tree/main/Amazon/AWS/RDS), and will help you create a target group, load balancer, and endpoint service.

- **If you have multiple RDS servers in the same VPC**, perform this procedure once, [specifying all RDS server IP addresses and ports](#parameters). In this case, the CloudFormation output will include different ports for each RDS server.
- When [registering these RDS servers as data sources in Azure Purview](#register-an-amazon-rds-data-source), use the ports included in the output instead of the real RDS server ports.
+ When [registering these RDS servers as data sources in Microsoft Purview](#register-an-amazon-rds-data-source), use the ports included in the output instead of the real RDS server ports.
- **If you have RDS servers in multiple VPCs**, perform this procedure for each of the VPCs.
This CloudFormation template is available for download from the [Azure GitHub re
|Name |Description |
|||
- |**Endpoint & port** | Enter the resolved IP address of the RDS endpoint URL and port. For example: `192.168.1.1:5432` <br><br>- **If an RDS proxy is configured**, use the IP address of the read/write endpoint of the proxy for the relevant database. We recommend using an RDS proxy when working with Azure Purview, as the IP address is static.<br><br>- **If you have multiple endpoints behind the same VPC**, enter up to 10, comma-separated endpoints. In this case, a single load balancer is created to the VPC, allowing a connection from the Amazon RDS Multi-Cloud Scanning Connector for Azure Purview in AWS to all RDS endpoints in the VPC. |
+ |**Endpoint & port** | Enter the resolved IP address of the RDS endpoint URL and port. For example: `192.168.1.1:5432` <br><br>- **If an RDS proxy is configured**, use the IP address of the read/write endpoint of the proxy for the relevant database. We recommend using an RDS proxy when working with Microsoft Purview, as the IP address is static.<br><br>- **If you have multiple endpoints behind the same VPC**, enter up to 10, comma-separated endpoints. In this case, a single load balancer is created to the VPC, allowing a connection from the Amazon RDS Multi-Cloud Scanning Connector for Microsoft Purview in AWS to all RDS endpoints in the VPC. |
|**Networking** | Enter your VPC ID |
|**VPC IPv4 CIDR** | Enter the value of your VPC's CIDR. You can find this value by selecting the VPC link on your RDS database page. For example: `192.168.0.0/16` |
|**Subnets** |Select all the subnets that are associated with your VPC. |
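A side note on the **Endpoint & port** parameter above: the template expects the resolved IP address of the RDS endpoint, not its DNS name. A minimal sketch of resolving it, assuming the hostname is reachable (the hostname and port below are stand-ins, not values from this article):

```python
import socket

def resolve_endpoint(hostname: str, port: int) -> str:
    """Resolve a database endpoint's DNS name to the "IP:port" string
    expected by the template's Endpoint & port parameter."""
    ip = socket.gethostbyname(hostname)  # IPv4 resolution
    return f"{ip}:{port}"

# Substitute your instance's endpoint URL and port; "localhost" is a stand-in.
print(resolve_endpoint("localhost", 5432))  # → 127.0.0.1:5432
```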
This CloudFormation template is available for download from the [Azure GitHub re
- **Resources**: Shows the newly created target group, load balancer, and endpoint service
- **Outputs**: Displays the **ServiceName** value, and the IP address and port of the RDS servers
- If you have multiple RDS servers configured, a different port is displayed. In this case, use the port shown here instead of the actual RDS server port when [registering your RDS database](#register-an-amazon-rds-data-source) as Azure Purview data source.
+ If you have multiple RDS servers configured, a different port is displayed. In this case, use the port shown here instead of the actual RDS server port when [registering your RDS database](#register-an-amazon-rds-data-source) as Microsoft Purview data source.
1. In the **Outputs** tab, copy the **ServiceName** key value to the clipboard.
- You'll use the value of the **ServiceName** key in the Azure Purview portal, when [registering your RDS database](#register-an-amazon-rds-data-source) as Azure Purview data source. There, enter the **ServiceName** key in the **Connect to private network via endpoint service** field.
+ You'll use the value of the **ServiceName** key in the Microsoft Purview portal, when [registering your RDS database](#register-an-amazon-rds-data-source) as Microsoft Purview data source. There, enter the **ServiceName** key in the **Connect to private network via endpoint service** field.
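Relatedly, the **VPC IPv4 CIDR** parameter from the parameters table can be sanity-checked against the endpoint IPs before creating the stack. A minimal sketch using Python's standard `ipaddress` module, with sample values mirroring the examples shown earlier (`192.168.1.1`, `192.168.0.0/16`):

```python
import ipaddress

def in_vpc(endpoint_ip: str, vpc_cidr: str) -> bool:
    """Check that an RDS endpoint IP falls inside the VPC's IPv4 CIDR."""
    return ipaddress.ip_address(endpoint_ip) in ipaddress.ip_network(vpc_cidr)

# Sample values mirror the article's examples; substitute your own.
print(in_vpc("192.168.1.1", "192.168.0.0/16"))  # → True
```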
## Register an Amazon RDS data source
-**To add your Amazon RDS server as an Azure Purview data source**:
+**To add your Amazon RDS server as a Microsoft Purview data source**:
-1. In Azure Purview, navigate to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png).
+1. In Microsoft Purview, navigate to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png).
1. On the **Sources** page, select **Register.** On the **Register sources** page that appears on the right, select the **Database** tab, and then select **Amazon RDS (PostgreSQL)** or **Amazon RDS (SQL)**.
This CloudFormation template is available for download from the [Azure GitHub re
|**Server name** | Enter the name of your RDS database in the following syntax: `<instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com` <br><br>We recommend that you copy this URL from the Amazon RDS portal, and make sure that the URL includes the AWS region. |
|**Port** | Enter the port used to connect to the RDS database:<br><br> - PostgreSQL: `5432`<br> - Microsoft SQL: `1433`<br><br> If you've [configured AWS PrivateLink using a CloudFormation template](#configure-aws-privatelink-using-a-cloudformation-template) and have multiple RDS servers in the same VPC, use the ports listed in the CloudFormation **Outputs** tab instead of the real RDS server ports. |
|**Connect to private network via endpoint service** | Enter the **ServiceName** key value obtained at the end of the [previous procedure](#configure-aws-privatelink-using-a-cloudformation-template). <br><br>If you've prepared your RDS database manually, use the **Service Name** value obtained at the end of [Step 5: Create an endpoint service](#step-5-create-an-endpoint-service). |
- |**Collection** (optional) | Select a collection to add your data source to. For more information, see [Manage data sources in Azure Purview (Preview)](manage-data-sources.md). |
+ |**Collection** (optional) | Select a collection to add your data source to. For more information, see [Manage data sources in Microsoft Purview (Preview)](manage-data-sources.md). |
| | |

1. Select **Register** when you're ready to continue.
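The server-name syntax above is easy to get wrong when registering many databases. As a hypothetical pre-flight check (the regex below simply mirrors the `<instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com` syntax from the table; the instance name is made up for illustration):

```python
import re

# Default ports per engine, as listed in the table above.
DEFAULT_PORTS = {"postgresql": 5432, "sqlserver": 1433}

# Mirrors the documented syntax:
# <instance identifier>.<xxxxxxxxxxxx>.<region>.rds.amazonaws.com
RDS_NAME = re.compile(r"^[A-Za-z0-9-]+\.[A-Za-z0-9]+\.[a-z0-9-]+\.rds\.amazonaws\.com$")

def looks_like_rds_endpoint(name: str) -> bool:
    """Rough check that a server name follows the expected RDS syntax."""
    return bool(RDS_NAME.match(name))

# Hypothetical instance name, used purely for illustration.
print(looks_like_rds_endpoint("mydb.abc123xyz.us-east-1.rds.amazonaws.com"))  # → True
```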
Your RDS data source appears in the Sources map or list. For example:
:::image type="content" source="media/register-scan-amazon-rds/amazon-rds-in-sources.png" alt-text="Screenshot of an Amazon RDS data source on the Sources page.":::
-## Create Azure Purview credentials for your RDS scan
+## Create Microsoft Purview credentials for your RDS scan
Credentials supported for Amazon RDS data sources include username/password authentication only, with a password stored in an Azure KeyVault secret.
-### Create a secret for your RDS credentials to use in Azure Purview
+### Create a secret for your RDS credentials to use in Microsoft Purview
1. Add your password to an Azure KeyVault as a secret. For more information, see [Set and retrieve a secret from Key Vault using Azure portal](../key-vault/secrets/quick-create-portal.md).
1. Add an access policy to your KeyVault with **Get** and **List** permissions. For example:
- :::image type="content" source="media/register-scan-amazon-rds/keyvault-for-rds.png" alt-text="Screenshot of an access policy for RDS in Azure Purview.":::
+ :::image type="content" source="media/register-scan-amazon-rds/keyvault-for-rds.png" alt-text="Screenshot of an access policy for RDS in Microsoft Purview.":::
- When defining the principal for the policy, select your Azure Purview account. For example:
+ When defining the principal for the policy, select your Microsoft Purview account. For example:
- :::image type="content" source="media/register-scan-amazon-rds/select-purview-as-principal.png" alt-text="Screenshot of selecting your Azure Purview account as Principal.":::
+ :::image type="content" source="media/register-scan-amazon-rds/select-purview-as-principal.png" alt-text="Screenshot of selecting your Microsoft Purview account as Principal.":::
Select **Save** to save your Access Policy update. For more information, see [Assign an Azure Key Vault access policy](/azure/key-vault/general/assign-access-policy-portal).
-1. In Azure Purview, add a KeyVault connection to connect the KeyVault with your RDS secret to Azure Purview. For more information, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+1. In Microsoft Purview, add a KeyVault connection to connect the KeyVault with your RDS secret to Microsoft Purview. For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
-### Create your Azure Purview credential object for RDS
+### Create your Microsoft Purview credential object for RDS
-In Azure Purview, create a credentials object to use when scanning your Amazon RDS account.
+In Microsoft Purview, create a credentials object to use when scanning your Amazon RDS account.
-1. In the Azure Purview **Management** area, select **Security and access** > **Credentials** > **New**.
+1. In the Microsoft Purview **Management** area, select **Security and access** > **Credentials** > **New**.
1. Select **SQL authentication** as the authentication method. Then, enter details for the Key Vault where your RDS credentials are stored, including the names of your Key Vault and secret.
In Azure Purview, create a credentials object to use when scanning your Amazon R
:::image type="content" source="media/register-scan-amazon-rds/new-credential-for-rds.png" alt-text="Screenshot of a new credential for RDS.":::
-For more information, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
## Scan an Amazon RDS database
-To configure an Azure Purview scan for your RDS database:
+To configure a Microsoft Purview scan for your RDS database:
-1. From the Azure Purview **Sources** page, select the Amazon RDS data source to scan.
+1. From the Microsoft Purview **Sources** page, select the Amazon RDS data source to scan.
1. Select :::image type="icon" source="media/register-scan-amazon-s3/new-scan-button.png" border="false"::: **New scan** to start defining your scan. In the pane that opens on the right, enter the following details, and then select **Continue**.

   - **Name**: Enter a meaningful name for your scan.
- - **Database name**: Enter the name of the database you want to scan. You'll need to find the names available from outside Azure Purview, and create a separate scan for each database in the registered RDS server.
- - **Credential**: Select the credential you created earlier for the Multi-Cloud Scanning Connectors for Azure Purview to access the RDS database.
+ - **Database name**: Enter the name of the database you want to scan. You'll need to find the names available from outside Microsoft Purview, and create a separate scan for each database in the registered RDS server.
+ - **Credential**: Select the credential you created earlier for the Multi-Cloud Scanning Connectors for Microsoft Purview to access the RDS database.
1. On the **Select a scan rule set** pane, select the scan rule set you want to use, or create a new one. For more information, see [Create a scan rule set](create-a-scan-rule-set.md).
While you run your scan, select **Refresh** to monitor the scan progress.
## Explore scanning results
-After an Azure Purview scan is complete on your Amazon RDS databases, drill down in the Azure Purview **Data Map** area to view the scan history. Select a data source to view its details, and then select the **Scans** tab to view any currently running or completed scans.
+After a Microsoft Purview scan is complete on your Amazon RDS databases, drill down in the Microsoft Purview **Data Map** area to view the scan history. Select a data source to view its details, and then select the **Scans** tab to view any currently running or completed scans.
-Use the other areas of Azure Purview to find out details about the content in your data estate, including your Amazon RDS databases:
+Use the other areas of Microsoft Purview to find out details about the content in your data estate, including your Amazon RDS databases:
-- **Explore RDS data in the catalog**. The Azure Purview catalog shows a unified view across all source types, and RDS scanning results are displayed in a similar way to Azure SQL. You can browse the catalog using filters or browse the assets and navigate through the hierarchy. For more information, see:
+- **Explore RDS data in the catalog**. The Microsoft Purview catalog shows a unified view across all source types, and RDS scanning results are displayed in a similar way to Azure SQL. You can browse the catalog using filters or browse the assets and navigate through the hierarchy. For more information, see:
- - [Tutorial: Browse assets in Azure Purview (preview) and view their lineage](tutorial-browse-and-view-lineage.md)
- - [Search the Azure Purview Data Catalog](how-to-search-catalog.md)
+ - [Tutorial: Browse assets in Microsoft Purview (preview) and view their lineage](tutorial-browse-and-view-lineage.md)
+ - [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
  - [Register and scan an Azure SQL Database](register-scan-azure-sql-database.md)

- **View Insight reports** to view statistics for the classification, sensitivity labels, file types, and more details about your content.
- All Azure Purview Insight reports include the Amazon RDS scanning results, along with the rest of the results from your Azure data sources. When relevant, an **Amazon RDS** asset type is added to the report filtering options.
+ All Microsoft Purview Insight reports include the Amazon RDS scanning results, along with the rest of the results from your Azure data sources. When relevant, an **Amazon RDS** asset type is added to the report filtering options.
- For more information, see the [Understand Insights in Azure Purview](concept-insights.md).
+ For more information, see the [Understand Insights in Microsoft Purview](concept-insights.md).
-- **View RDS data in other Azure Purview features**, such as the **Scans** and **Glossary** areas. For more information, see:
+- **View RDS data in other Microsoft Purview features**, such as the **Scans** and **Glossary** areas. For more information, see:
- [Create a scan rule set](create-a-scan-rule-set.md)
- - [Tutorial: Create and import glossary terms in Azure Purview (preview)](tutorial-import-create-glossary-terms.md)
+ - [Tutorial: Create and import glossary terms in Microsoft Purview (preview)](tutorial-import-create-glossary-terms.md)
## Configure AWS PrivateLink manually (advanced)
-This procedure describes the manual steps required for preparing your RDS database in a VPC to connect to Azure Purview.
+This procedure describes the manual steps required for preparing your RDS database in a VPC to connect to Microsoft Purview.
By default, we recommend that you use a CloudFormation template instead, as described earlier in this article. For more information, see [Configure AWS PrivateLink using a CloudFormation template](#configure-aws-privatelink-using-a-cloudformation-template).
After the [Load Balancer is created](#step-4-create-a-load-balancer) and its Sta
> For more information, see [modify-vpc-endpoint-service-permissions — AWS CLI 2.2.7 Command Reference (amazonaws.com)](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/modify-vpc-endpoint-service-permissions.html).
>
-<a name="service-name"></a>**To copy the service name for use in Azure Purview**:
+<a name="service-name"></a>**To copy the service name for use in Microsoft Purview**:
-After you've created your endpoint service, you can copy the **Service name** value in the Azure Purview portal, when [registering your RDS database](#register-an-amazon-rds-data-source) as Azure Purview data source.
+After you've created your endpoint service, you can copy the **Service name** value in the Microsoft Purview portal, when [registering your RDS database](#register-an-amazon-rds-data-source) as Microsoft Purview data source.
Locate the **Service name** on the **Details** tab for your selected endpoint service.
Locate the **Service name** on the **Details** tab for your selected endpoint se
## Troubleshoot your VPC connection
-This section describes common errors that may occur when configuring your VPC connection with Azure Purview, and how to troubleshoot and resolve them.
+This section describes common errors that may occur when configuring your VPC connection with Microsoft Purview, and how to troubleshoot and resolve them.
### Invalid VPC service name
-If an error of `Invalid VPC service name` or `Invalid endpoint service` appears in Azure Purview, use the following steps to troubleshoot:
+If an error of `Invalid VPC service name` or `Invalid endpoint service` appears in Microsoft Purview, use the following steps to troubleshoot:
1. Make sure that your VPC service name is correct. For example:
If an error of `Invalid VPC service name` or `Invalid endpoint service` appears
For more information, see [Step 5: Create an endpoint service](#step-5-create-an-endpoint-service).
-1. Make sure that your RDS database is listed in one of the supported regions. For more information, see [Azure Purview scope for Amazon RDS](#azure-purview-scope-for-amazon-rds).
+1. Make sure that your RDS database is listed in one of the supported regions. For more information, see [Microsoft Purview scope for Amazon RDS](#microsoft-purview-scope-for-amazon-rds).
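One quick check for the service-name step above: an AWS endpoint service name normally follows the pattern `com.amazonaws.vpce.<region>.vpce-svc-<id>`. A minimal sketch (the concrete name below is a hypothetical placeholder):

```python
import re

# Rough shape of an AWS VPC endpoint service name; the example value in the
# test is a hypothetical placeholder, not a value from this article.
SERVICE_NAME_RE = re.compile(r"^com\.amazonaws\.vpce\.[a-z0-9-]+\.vpce-svc-[0-9a-f]+$")

def looks_like_service_name(name: str) -> bool:
    return SERVICE_NAME_RE.fullmatch(name) is not None

print(looks_like_service_name("com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0"))
```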
### Invalid availability zone
-If an error of `Invalid Availability Zone` appears in Azure Purview, make sure that your RDS is defined for at least one of the following three regions:
+If an error of `Invalid Availability Zone` appears in Microsoft Purview, make sure that your RDS is defined for at least one of the following three regions:
- **us-east-1a**
- **us-east-1b**
For more information, see the [AWS documentation](https://docs.aws.amazon.com/el
### RDS errors
-The following errors may appear in Azure Purview:
+The following errors may appear in Microsoft Purview:
- `Unknown database`. In this case, the database defined doesn't exist. Check to see that the configured database name is correct
## Next steps
-Learn more about Azure Purview Insight reports:
+Learn more about Microsoft Purview Insight reports:
> [!div class="nextstepaction"]
-> [Understand Insights in Azure Purview](concept-insights.md)
+> [Understand Insights in Microsoft Purview](concept-insights.md)
purview Register Scan Amazon S3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-amazon-s3.md
Title: Amazon S3 multi-cloud scanning connector for Azure Purview
-description: This how-to guide describes details of how to scan Amazon S3 buckets in Azure Purview.
+ Title: Amazon S3 multi-cloud scanning connector for Microsoft Purview
+description: This how-to guide describes details of how to scan Amazon S3 buckets in Microsoft Purview.
Last updated 12/07/2021
-# Customer intent: As a security officer, I need to understand how to use the Azure Purview connector for Amazon S3 service to set up, configure, and scan my Amazon S3 buckets.
+# Customer intent: As a security officer, I need to understand how to use the Microsoft Purview connector for Amazon S3 service to set up, configure, and scan my Amazon S3 buckets.
-# Amazon S3 Multi-Cloud Scanning Connector for Azure Purview
+# Amazon S3 Multi-Cloud Scanning Connector for Microsoft Purview
-The Multi-Cloud Scanning Connector for Azure Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services in addition to Azure storage services.
+The Multi-Cloud Scanning Connector for Microsoft Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services in addition to Azure storage services.
-This article describes how to use Azure Purview to scan your unstructured data currently stored in Amazon S3 standard buckets, and discover what types of sensitive information exists in your data. This how-to guide also describes how to identify the Amazon S3 Buckets where the data is currently stored for easy information protection and data compliance.
+This article describes how to use Microsoft Purview to scan your unstructured data currently stored in Amazon S3 standard buckets, and discover what types of sensitive information exists in your data. This how-to guide also describes how to identify the Amazon S3 Buckets where the data is currently stored for easy information protection and data compliance.
-For this service, use Azure Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connector for Azure Purview will run. The Multi-Cloud Scanning Connector for Azure Purview uses this access to your Amazon S3 buckets to read your data, and then reports the scanning results, including only the metadata and classification, back to Azure. Use the Azure Purview classification and labeling reports to analyze and review your data scan results.
+For this service, use Microsoft Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connector for Microsoft Purview will run. The Multi-Cloud Scanning Connector for Microsoft Purview uses this access to your Amazon S3 buckets to read your data, and then reports the scanning results, including only the metadata and classification, back to Azure. Use the Microsoft Purview classification and labeling reports to analyze and review your data scan results.
> [!IMPORTANT]
-> The Multi-Cloud Scanning Connector for Azure Purview is a separate add-on to Azure Purview. The terms and conditions for the Multi-Cloud Scanning Connector for Azure Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
+> The Multi-Cloud Scanning Connector for Microsoft Purview is a separate add-on to Microsoft Purview. The terms and conditions for the Multi-Cloud Scanning Connector for Microsoft Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/.
>

## Supported capabilities
\** Lineage is supported if dataset is used as a source/sink in [Data Factory Copy activity](how-to-link-azure-data-factory.md)
-## Azure Purview scope for Amazon S3
+## Microsoft Purview scope for Amazon S3
We currently do not support ingestion private endpoints that work with your AWS sources.
-For more information about Azure Purview limits, see:
+For more information about Microsoft Purview limits, see:
-- [Manage and increase quotas for resources with Azure Purview](how-to-manage-quotas.md)-- [Supported data sources and file types in Azure Purview](sources-and-scans.md)
+- [Manage and increase quotas for resources with Microsoft Purview](how-to-manage-quotas.md)
+- [Supported data sources and file types in Microsoft Purview](sources-and-scans.md)
### Storage and scanning regions
-The Azure Purview connector for the Amazon S3 service is currently deployed in specific regions only. The following table maps the regions where you data is stored to the region where it would be scanned by Azure Purview.
+The Microsoft Purview connector for the Amazon S3 service is currently deployed in specific regions only. The following table maps the regions where your data is stored to the region where it would be scanned by Microsoft Purview.
> [!IMPORTANT]
> Customers will be charged for all related data transfer charges according to the region of their bucket.
## Prerequisites
-Ensure that you've performed the following prerequisites before adding your Amazon S3 buckets as Azure Purview data sources and scanning your S3 data.
+Ensure that you've performed the following prerequisites before adding your Amazon S3 buckets as Microsoft Purview data sources and scanning your S3 data.
> [!div class="checklist"]
-> * You need to be an Azure Purview Data Source Admin.
-> * [Create an Azure Purview account](#create-an-azure-purview-account) if you don't yet have one
-> * [Create a new AWS role for use with Azure Purview](#create-a-new-aws-role-for-azure-purview)
-> * [Create an Azure Purview credential for your AWS bucket scan](#create-an-azure-purview-credential-for-your-aws-s3-scan)
+> * You need to be a Microsoft Purview Data Source Admin.
+> * [Create a Microsoft Purview account](#create-a-microsoft-purview-account) if you don't yet have one
+> * [Create a new AWS role for use with Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview)
+> * [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan)
> * [Configure scanning for encrypted Amazon S3 buckets](#configure-scanning-for-encrypted-amazon-s3-buckets), if relevant
> * Make sure that your bucket policy does not block the connection. For more information, see [Bucket policy requirements](#confirm-your-bucket-policy-access) and [SCP policy requirements](#confirm-your-scp-policy-access). For these items, you may need to consult with an AWS expert to ensure that your policies allow required access.
-> * When adding your buckets as Azure Purview resources, you'll need the values of your [AWS ARN](#retrieve-your-new-role-arn), [bucket name](#retrieve-your-amazon-s3-bucket-name), and sometimes your [AWS account ID](#locate-your-aws-account-id).
+> * When adding your buckets as Microsoft Purview resources, you'll need the values of your [AWS ARN](#retrieve-your-new-role-arn), [bucket name](#retrieve-your-amazon-s3-bucket-name), and sometimes your [AWS account ID](#locate-your-aws-account-id).
-### Create an Azure Purview account
+### Create a Microsoft Purview account
-- **If you already have an Azure Purview account,** you can continue with the configurations required for AWS S3 support. Start with [Create an Azure Purview credential for your AWS bucket scan](#create-an-azure-purview-credential-for-your-aws-s3-scan).
+- **If you already have a Microsoft Purview account,** you can continue with the configurations required for AWS S3 support. Start with [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan).
-- **If you need to create an Azure Purview account,** follow the instructions in [Create an Azure Purview account instance](create-catalog-portal.md). After creating your account, return here to complete configuration and begin using Azure Purview connector for Amazon S3.
+- **If you need to create a Microsoft Purview account,** follow the instructions in [Create a Microsoft Purview account instance](create-catalog-portal.md). After creating your account, return here to complete configuration and begin using Microsoft Purview connector for Amazon S3.
-### Create a new AWS role for Azure Purview
+### Create a new AWS role for Microsoft Purview
-The Azure Purview scanner is deployed in a Microsoft account in AWS. To allow the Azure Purview scanner to read your S3 data, you must create a dedicated role in the AWS portal, in the IAM area, to be used by the scanner.
+The Microsoft Purview scanner is deployed in a Microsoft account in AWS. To allow the Microsoft Purview scanner to read your S3 data, you must create a dedicated role in the AWS portal, in the IAM area, to be used by the scanner.
-This procedure describes how to create the AWS role, with the required Microsoft Account ID and External ID from Azure Purview, and then enter the Role ARN value in Azure Purview.
+This procedure describes how to create the AWS role, with the required Microsoft Account ID and External ID from Microsoft Purview, and then enter the Role ARN value in Microsoft Purview.
**To locate your Microsoft Account ID and External ID**:
-1. In Azure Purview, go to the **Management Center** > **Security and access** > **Credentials**.
+1. In Microsoft Purview, go to the **Management Center** > **Security and access** > **Credentials**.
1. Select **New** to create a new credential.
[ ![Locate your Microsoft account ID and External ID values.](./media/register-scan-amazon-s3/locate-account-id-external-id.png) ](./media/register-scan-amazon-s3/locate-account-id-external-id.png#lightbox)
-**To create your AWS role for Azure Purview**:
+**To create your AWS role for Microsoft Purview**:
1. Open your **Amazon Web Services** console, and under **Security, Identity, and Compliance**, select **IAM**.
- [Confirm your bucket policy access](#confirm-your-bucket-policy-access)
- [Confirm your SCP policy access](#confirm-your-scp-policy-access)
-### Create an Azure Purview credential for your AWS S3 scan
+### Create a Microsoft Purview credential for your AWS S3 scan
-This procedure describes how to create a new Azure Purview credential to use when scanning your AWS buckets.
+This procedure describes how to create a new Microsoft Purview credential to use when scanning your AWS buckets.
> [!TIP]
-> If you're continuing directly on from [Create a new AWS role for Azure Purview](#create-a-new-aws-role-for-azure-purview), you may already have the **New credential** pane open in Azure Purview.
+> If you're continuing directly on from [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview), you may already have the **New credential** pane open in Microsoft Purview.
>
> You can also create a new credential in the middle of the process, while [configuring your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets). In that case, in the **Credential** field, select **New**.
>
-1. In Azure Purview, go to the **Management Center**, and under **Security and access**, select **Credentials**.
+1. In Microsoft Purview, go to the **Management Center**, and under **Security and access**, select **Credentials**.
-1. Select **New**, and in the **New credential** pane that appears on the right, use the following fields to create your Azure Purview credential:
+1. Select **New**, and in the **New credential** pane that appears on the right, use the following fields to create your Microsoft Purview credential:
|Field |Description |
|||
|**Name** |Enter a meaningful name for this credential. |
|**Description** |Enter an optional description for this credential, such as `Used to scan the tutorial S3 buckets` |
|**Authentication method** |Select **Role ARN**, since you're using a role ARN to access your bucket. |
- |**Role ARN** | Once you've [created your Amazon IAM role](#create-a-new-aws-role-for-azure-purview), navigate to your role in the AWS IAM area, copy the **Role ARN** value, and enter it here. For example: `arn:aws:iam::181328463391:role/S3Role`. <br><br>For more information, see [Retrieve your new Role ARN](#retrieve-your-new-role-arn). |
+ |**Role ARN** | Once you've [created your Amazon IAM role](#create-a-new-aws-role-for-microsoft-purview), navigate to your role in the AWS IAM area, copy the **Role ARN** value, and enter it here. For example: `arn:aws:iam::181328463391:role/S3Role`. <br><br>For more information, see [Retrieve your new Role ARN](#retrieve-your-new-role-arn). |
| | |
- The **Microsoft account ID** and the **External ID** values are used when [creating your Role ARN in AWS.](#create-a-new-aws-role-for-azure-purview).
+ The **Microsoft account ID** and the **External ID** values are used when [creating your Role ARN in AWS](#create-a-new-aws-role-for-microsoft-purview).
1. Select **Create** when you're done to finish creating the credential.
-For more information about Azure Purview credentials, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+For more information about Microsoft Purview credentials, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
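An IAM role ARN has the shape `arn:aws:iam::<account-id>:role/<role-name>`; a minimal sketch of pulling the pieces apart, using the example ARN from the table above:

```python
def parse_role_arn(arn: str) -> dict:
    # An IAM role ARN looks like: arn:aws:iam::<account-id>:role/<role-name>
    parts = arn.split(":")
    if len(parts) != 6 or not parts[5].startswith("role/"):
        raise ValueError(f"not an IAM role ARN: {arn}")
    return {"account_id": parts[4], "role_name": parts[5].split("/", 1)[1]}

print(parse_role_arn("arn:aws:iam::181328463391:role/S3Role"))
```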
### Configure scanning for encrypted Amazon S3 buckets
AWS buckets support multiple encryption types. For buckets that use **AWS-KMS**
1. Attach your new policy to the role you added for scanning.
- 1. Navigate back to the **IAM** > **Roles** page, and select the role you added [earlier](#create-a-new-aws-role-for-azure-purview).
+ 1. Navigate back to the **IAM** > **Roles** page, and select the role you added [earlier](#create-a-new-aws-role-for-microsoft-purview).
1. On the **Permissions** tab, select **Attach policies**.
Make sure that the S3 bucket [policy](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-iam-policies.html) does not block the connection:

1. In AWS, navigate to your S3 bucket, and then select the **Permissions** tab > **Bucket policy**.
-1. Check the policy details to make sure that it doesn't block the connection from the Azure Purview scanner service.
+1. Check the policy details to make sure that it doesn't block the connection from the Microsoft Purview scanner service.
### Confirm your SCP policy access
For example, your SCP policy might block read API calls to the [AWS Region](#sto
- Required API calls, which must be allowed by your SCP policy, include: `AssumeRole`, `GetBucketLocation`, `GetObject`, `ListBucket`, `GetBucketPublicAccessBlock`.
- Your SCP policy must also allow calls to the **us-east-1** AWS Region, which is the default Region for API calls. For more information, see the [AWS documentation](https://docs.aws.amazon.com/general/latest/gr/rande.html).
-Follow the [SCP documentation](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps_create.html), review your organization's SCP policies, and make sure all the [permissions required for the Azure Purview scanner](#create-a-new-aws-role-for-azure-purview) are available.
+Follow the [SCP documentation](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps_create.html), review your organization's SCP policies, and make sure all the [permissions required for the Microsoft Purview scanner](#create-a-new-aws-role-for-microsoft-purview) are available.
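Hedged sketch: written as IAM-style actions (the `sts:`/`s3:` service prefixes are my mapping of the API calls named above, not wording from this article), a statement covering the required calls might look like:

```python
# The API calls the SCP must not block, expressed as IAM-style actions.
# The service prefixes are an assumption for illustration.
required_actions = [
    "sts:AssumeRole",
    "s3:GetBucketLocation",
    "s3:GetObject",
    "s3:ListBucket",
    "s3:GetBucketPublicAccessBlock",
]
statement = {"Effect": "Allow", "Action": required_actions, "Resource": "*"}
print(statement)
```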
### Retrieve your new Role ARN
-You'll need to record your AWS Role ARN and copy it in to Azure Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets).
+You'll need to record your AWS Role ARN and copy it into Microsoft Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets).
**To retrieve your role ARN:**
-1. In the AWS **Identity and Access Management (IAM)** > **Roles** area, search for and select the new role you [created for Azure Purview](#create-an-azure-purview-credential-for-your-aws-s3-scan).
+1. In the AWS **Identity and Access Management (IAM)** > **Roles** area, search for and select the new role you [created for Microsoft Purview](#create-a-microsoft-purview-credential-for-your-aws-s3-scan).
1. On the role's **Summary** page, select the **Copy to clipboard** button to the right of the **Role ARN** value.

   ![Copy the role ARN value to the clipboard.](./media/register-scan-amazon-s3/aws-copy-role-purview.png)
-In Azure Purview, you can edit your credential for AWS S3, and paste the retrieved role in the **Role ARN** field. For more information, see [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
+In Microsoft Purview, you can edit your credential for AWS S3, and paste the retrieved role in the **Role ARN** field. For more information, see [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
### Retrieve your Amazon S3 bucket name
-You'll need the name of your Amazon S3 bucket to copy it in to Azure Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets)
+You'll need the name of your Amazon S3 bucket to copy it into Microsoft Purview when [creating a scan for your Amazon S3 bucket](#create-a-scan-for-one-or-more-amazon-s3-buckets).
**To retrieve your bucket name:**
![Retrieve and copy the S3 bucket URL.](./media/register-scan-amazon-s3/retrieve-bucket-url-amazon.png)
- Paste your bucket name in a secure file, and add an `s3://` prefix to it to create the value you'll need to enter when configuring your bucket as an Azure Purview account.
+ Paste your bucket name in a secure file, and add an `s3://` prefix to it to create the value you'll need to enter when configuring your bucket as a Microsoft Purview account.
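That prefixing step can be sketched as follows (a path containing a sub-folder is rejected, since only the bucket root is supported):

```python
def bucket_url(bucket_name: str) -> str:
    # Only the root level of a bucket is supported as a data source,
    # so reject anything that already contains a path separator.
    if "/" in bucket_name:
        raise ValueError("use the bucket root only, without sub-folders")
    return f"s3://{bucket_name}"

print(bucket_url("purview-tutorial-bucket"))
```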
For example: `s3://purview-tutorial-bucket`

> [!TIP]
-> Only the root level of your bucket is supported as an Azure Purview data source. For example, the following URL, which includes a sub-folder is *not* supported: `s3://purview-tutorial-bucket/view-data`
+> Only the root level of your bucket is supported as a Microsoft Purview data source. For example, the following URL, which includes a sub-folder is *not* supported: `s3://purview-tutorial-bucket/view-data`
>
> However, if you configure a scan for a specific S3 bucket, you can select one or more specific folders for your scan. For more information, see the step to [scope your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets).
>

### Locate your AWS account ID
-You'll need your AWS account ID to register your AWS account as an Azure Purview data source, together with all of its buckets.
+You'll need your AWS account ID to register your AWS account as a Microsoft Purview data source, together with all of its buckets.
Your AWS account ID is the ID you use to log in to the AWS console. You can also find it once you're logged in on the IAM dashboard, on the left under the navigation options, and at the top, as the numerical part of your sign-in URL:
For example:
![Retrieve your AWS account ID.](./media/register-scan-amazon-s3/aws-locate-account-id.png)
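As described above, the account ID is the numerical part of the sign-in URL; a sketch of extracting it (the URL below is a hypothetical example; AWS account IDs are 12-digit numbers):

```python
import re

def account_id_from_signin_url(url: str) -> str:
    # The sign-in URL embeds the 12-digit account ID as its leading
    # host label, e.g. https://<account-id>.signin.aws.amazon.com/console
    match = re.search(r"(\d{12})\.signin\.aws\.amazon\.com", url)
    if not match:
        raise ValueError("no account ID found in URL")
    return match.group(1)

print(account_id_from_signin_url("https://123456789012.signin.aws.amazon.com/console"))
```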
-## Add a single Amazon S3 bucket as an Azure Purview account
+## Add a single Amazon S3 bucket as a Microsoft Purview account
-Use this procedure if you only have a single S3 bucket that you want to register to Azure Purview as a data source, or if you have multiple buckets in your AWS account, but do not want to register all of them to Azure Purview.
+Use this procedure if you only have a single S3 bucket that you want to register to Microsoft Purview as a data source, or if you have multiple buckets in your AWS account, but do not want to register all of them to Microsoft Purview.
**To add your bucket**:
-1. In Azure Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon S3** > **Continue**.
+1. In Microsoft Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon S3** > **Continue**.
- ![Add an Amazon AWS bucket as an Azure Purview data source.](./media/register-scan-amazon-s3/add-s3-datasource-to-purview.png)
+ ![Add an Amazon AWS bucket as a Microsoft Purview data source.](./media/register-scan-amazon-s3/add-s3-datasource-to-purview.png)
> [!TIP]
> If you have multiple [collections](manage-data-sources.md#manage-collections) and want to add your Amazon S3 to a specific collection, select the **Map view** at the top right, and then select the **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) button inside your collection.
|||
|**Name** |Enter a meaningful name, or use the default provided. |
|**Bucket URL** | Enter your AWS bucket URL, using the following syntax: `s3://<bucketName>` <br><br>**Note**: Make sure to use only the root level of your bucket. For more information, see [Retrieve your Amazon S3 bucket name](#retrieve-your-amazon-s3-bucket-name). |
- |**Select a collection** |If you selected to register a data source from within a collection, that collection already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Azure Purview collections, see [Manage data sources in Azure Purview](manage-data-sources.md#manage-collections).|
+ |**Select a collection** |If you selected to register a data source from within a collection, that collection is already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Microsoft Purview collections, see [Manage data sources in Microsoft Purview](manage-data-sources.md#manage-collections).|
| | |

When you're done, select **Finish** to complete the registration. Continue with [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
-## Add an AWS account as an Azure Purview account
+## Add an AWS account as a Microsoft Purview account
-Use this procedure if you have multiple S3 buckets in your Amazon account, and you want to register all of them as Azure Purview data sources.
+Use this procedure if you have multiple S3 buckets in your Amazon account, and you want to register all of them as Microsoft Purview data sources.
When [configuring your scan](#create-a-scan-for-one-or-more-amazon-s3-buckets), you'll be able to select the specific buckets you want to scan, if you don't want to scan all of them together.

**To add your Amazon account**:
-1. In Azure Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon accounts** > **Continue**.
+1. In Microsoft Purview, go to the **Data Map** page, and select **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) > **Amazon accounts** > **Continue**.
- ![Add an Amazon account as an Azure Purview data source.](./media/register-scan-amazon-s3/add-s3-account-to-purview.png)
+ ![Add an Amazon account as a Microsoft Purview data source.](./media/register-scan-amazon-s3/add-s3-account-to-purview.png)
> [!TIP]
> If you have multiple [collections](manage-data-sources.md#manage-collections) and want to add your Amazon S3 to a specific collection, select the **Map view** at the top right, and then select the **Register** ![Register icon.](./media/register-scan-amazon-s3/register-button.png) button inside your collection.
|||
|**Name** |Enter a meaningful name, or use the default provided. |
|**AWS account ID** | Enter your AWS account ID. For more information, see [Locate your AWS account ID](#locate-your-aws-account-id)|
- |**Select a collection** |If you selected to register a data source from within a collection, that collection already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Azure Purview collections, see [Manage data sources in Azure Purview](manage-data-sources.md#manage-collections).|
+ |**Select a collection** |If you selected to register a data source from within a collection, that collection is already listed. <br><br>Select a different collection as needed, **None** to assign no collection, or **New** to create a new collection now. <br><br>For more information about Microsoft Purview collections, see [Manage data sources in Microsoft Purview](manage-data-sources.md#manage-collections).|
| | |

When you're done, select **Finish** to complete the registration.
Continue with [Create a scan for one or more Amazon S3 buckets](#create-a-scan-for-one-or-more-amazon-s3-buckets).
## Create a scan for one or more Amazon S3 buckets
-Once you've added your buckets as Azure Purview data sources, you can configure a scan to run at scheduled intervals or immediately.
+Once you've added your buckets as Microsoft Purview data sources, you can configure a scan to run at scheduled intervals or immediately.
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/), and then do one of the following:
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/), and then do one of the following:
- In the **Map view**, select **New scan** ![New scan icon.](./media/register-scan-amazon-s3/new-scan-button.png) in your data source box.
- In the **List view**, hover over the row for your data source, and select **New scan** ![New scan icon.](./media/register-scan-amazon-s3/new-scan-button.png).
|Field |Description |
|||
|**Name** | Enter a meaningful name for your scan or use the default. |
- |**Type** |Displayed only if you've added your AWS account, with all buckets included. <br><br>Current options include only **All** > **Amazon S3**. Stay tuned for more options to select as Azure Purview's support matrix expands. |
- |**Credential** | Select an Azure Purview credential with your role ARN. <br><br>**Tip**: If you want to create a new credential at this time, select **New**. For more information, see [Create an Azure Purview credential for your AWS bucket scan](#create-an-azure-purview-credential-for-your-aws-s3-scan). |
+ |**Type** |Displayed only if you've added your AWS account, with all buckets included. <br><br>Current options include only **All** > **Amazon S3**. Stay tuned for more options to select as Microsoft Purview's support matrix expands. |
+ |**Credential** | Select a Microsoft Purview credential with your role ARN. <br><br>**Tip**: If you want to create a new credential at this time, select **New**. For more information, see [Create a Microsoft Purview credential for your AWS bucket scan](#create-a-microsoft-purview-credential-for-your-aws-s3-scan). |
| **Amazon S3** | Displayed only if you've added your AWS account, with all buckets included. <br><br>Select one or more buckets to scan, or **Select all** to scan all the buckets in your account. |
| | |
- Azure Purview automatically checks that the role ARN is valid, and that the buckets and objects within the buckets are accessible, and then continues if the connection succeeds.
+ Microsoft Purview automatically checks that the role ARN is valid, and that the buckets and objects within the buckets are accessible, and then continues if the connection succeeds.
> [!TIP]
> To enter different values and test the connection yourself before continuing, select **Test connection** at the bottom right before selecting **Continue**.
> Once started, scanning can take up to 24 hours to complete. You'll be able to review your **Insight Reports** and search the catalog 24 hours after you started each scan.
>
-For more information, see [Explore Azure Purview scanning results](#explore-azure-purview-scanning-results).
+For more information, see [Explore Microsoft Purview scanning results](#explore-microsoft-purview-scanning-results).
-## Explore Azure Purview scanning results
+## Explore Microsoft Purview scanning results
-Once an Azure Purview scan is complete on your Amazon S3 buckets, drill down in the Azure Purview **Data Map** area to view the scan history.
+Once a Microsoft Purview scan is complete on your Amazon S3 buckets, drill down in the Microsoft Purview **Data Map** area to view the scan history.
Select a data source to view its details, and then select the **Scans** tab to view any currently running or completed scans. If you've added an AWS account with multiple buckets, the scan history for each bucket is shown under the account.
For example:
![Show the AWS S3 bucket scans under your AWS account source.](./media/register-scan-amazon-s3/account-scan-history.png)
-Use the other areas of Azure Purview to find out details about the content in your data estate, including your Amazon S3 buckets:
+Use the other areas of Microsoft Purview to find out details about the content in your data estate, including your Amazon S3 buckets:
-- **Search the Azure Purview data catalog,** and filter for a specific bucket. For example:
+- **Search the Microsoft Purview data catalog,** and filter for a specific bucket. For example:
![Search the catalog for AWS S3 assets.](./media/register-scan-amazon-s3/search-catalog-screen-aws.png)

- **View Insight reports** to view statistics for the classification, sensitivity labels, file types, and more details about your content.
- All Azure Purview Insight reports include the Amazon S3 scanning results, along with the rest of the results from your Azure data sources. When relevant, an additional **Amazon S3** asset type was added to the report filtering options.
+ All Microsoft Purview Insight reports include the Amazon S3 scanning results, along with the rest of the results from your Azure data sources. When relevant, an additional **Amazon S3** asset type was added to the report filtering options.
- For more information, see the [Understand Insights in Azure Purview](concept-insights.md).
+ For more information, see the [Understand Insights in Microsoft Purview](concept-insights.md).
## Minimum permissions for your AWS policy
-The default procedure for [creating an AWS role for Azure Purview](#create-a-new-aws-role-for-azure-purview) to use when scanning your S3 buckets uses the **AmazonS3ReadOnlyAccess** policy.
+The default procedure for [creating an AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) to use when scanning your S3 buckets uses the **AmazonS3ReadOnlyAccess** policy.
The **AmazonS3ReadOnlyAccess** policy provides minimum permissions required for scanning your S3 buckets, and may include other permissions as well.
Make sure to define your resource with a wildcard. For example:
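The example itself did not survive extraction; as a sketch only, a minimal read-only policy whose `Resource` uses a wildcard might look like the following (the bucket name and the exact action list are illustrative placeholders, not the documented minimum):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    }
  ]
}
```

Note the two resource entries: the bare bucket ARN covers bucket-level actions such as `s3:ListBucket`, while the `/*` wildcard covers object-level actions such as `s3:GetObject`.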
## Troubleshooting
-Scanning Amazon S3 resources requires [creating a role in AWS IAM](#create-a-new-aws-role-for-azure-purview) to allow the Azure Purview scanner service running in a Microsoft account in AWS to read the data.
+Scanning Amazon S3 resources requires [creating a role in AWS IAM](#create-a-new-aws-role-for-microsoft-purview) to allow the Microsoft Purview scanner service running in a Microsoft account in AWS to read the data.
Configuration errors in the role can lead to connection failure. This section describes some examples of connection failures that may occur while setting up the scan, and the troubleshooting guidelines for each case. If all of the items described in the following sections are properly configured, and scanning S3 buckets still fails with errors, contact Microsoft support.

> [!NOTE]
-> For policy access issues, make sure that neither your bucket policy, nor your SCP policy are blocking access to your S3 bucket from Azure Purview.
+> For policy access issues, make sure that neither your bucket policy, nor your SCP policy are blocking access to your S3 bucket from Microsoft Purview.
> For more information, see [Confirm your bucket policy access](#confirm-your-bucket-policy-access) and [Confirm your SCP policy access](#confirm-your-scp-policy-access).
Make sure that the AWS role has **KMS Decrypt** permissions. For more informatio
Make sure that the AWS role has the correct external ID:

1. In the AWS IAM area, select the **Role > Trust relationships** tab.
-1. Follow the steps in [Create a new AWS role for Azure Purview](#create-a-new-aws-role-for-azure-purview) again to verify your details.
+1. Follow the steps in [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) again to verify your details.
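As a hedged illustration of what that **Trust relationships** tab shows, a trust policy that scopes role assumption by external ID generally has this shape (both the account ID and external ID below are placeholders, not the actual values from the procedure):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<microsoft-account-id>:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<your-external-id>" }
      }
    }
  ]
}
```

If the `sts:ExternalId` value here does not match the external ID supplied during scan setup, the `AssumeRole` call fails and the scan cannot connect.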
### Error found with the role ARN
This is a general error that indicates an issue when using the Role ARN. For exa
- Make sure that the AWS role has the required permissions to read the selected S3 bucket. Required permissions include `AmazonS3ReadOnlyAccess` or the [minimum read permissions](#minimum-permissions-for-your-aws-policy), and `KMS Decrypt` for encrypted buckets.
-- Make sure that the AWS role has the correct Microsoft account ID. In the AWS IAM area, select the **Role > Trust relationships** tab and then follow the steps in [Create a new AWS role for Azure Purview](#create-a-new-aws-role-for-azure-purview) again to verify your details.
+- Make sure that the AWS role has the correct Microsoft account ID. In the AWS IAM area, select the **Role > Trust relationships** tab and then follow the steps in [Create a new AWS role for Microsoft Purview](#create-a-new-aws-role-for-microsoft-purview) again to verify your details.
For more information, see [Cannot find the specified bucket](#cannot-find-the-specified-bucket),
For more information, see [Cannot find the specified bucket](#cannot-find-the-sp
Make sure that the S3 bucket URL is properly defined:

1. In AWS, navigate to your S3 bucket, and copy the bucket name.
-1. In Azure Purview, edit the Amazon S3 data source, and update the bucket URL to include your copied bucket name, using the following syntax: `s3://<BucketName>`
+1. In Microsoft Purview, edit the Amazon S3 data source, and update the bucket URL to include your copied bucket name, using the following syntax: `s3://<BucketName>`
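Purely as an illustration of the `s3://<BucketName>` syntax above (this helper is hypothetical and not part of any Purview tooling), normalizing a copied bucket name into the expected form:

```python
def purview_s3_source_url(bucket_name: str) -> str:
    """Normalize a copied bucket name into the s3://<BucketName> form.

    Strips stray slashes and an existing s3:// prefix so the result
    always matches the syntax the data source editor expects.
    """
    name = bucket_name.strip().strip("/")
    if name.startswith("s3://"):
        name = name[len("s3://"):]
    return f"s3://{name}"

print(purview_s3_source_url("my-bucket"))  # s3://my-bucket
```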
## Next steps
-Learn more about Azure Purview Insight reports:
+Learn more about Microsoft Purview Insight reports:
> [!div class="nextstepaction"]
-> [Understand Insights in Azure Purview](concept-insights.md)
+> [Understand Insights in Microsoft Purview](concept-insights.md)
purview Register Scan Azure Blob Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-blob-storage-source.md
Title: 'Register and scan Azure Blob Storage'
-description: This article outlines the process to register an Azure Blob Storage data source in Azure Purview including instructions to authenticate and interact with the Azure Blob Storage Gen2 source
+description: This article outlines the process to register an Azure Blob Storage data source in Microsoft Purview including instructions to authenticate and interact with the Azure Blob Storage Gen2 source
Last updated 01/24/2022
-# Connect to Azure Blob storage in Azure Purview
+# Connect to Azure Blob storage in Microsoft Purview
-This article outlines the process to register an Azure Blob Storage account in Azure Purview including instructions to authenticate and interact with the Azure Blob Storage source
+This article outlines the process to register an Azure Blob Storage account in Microsoft Purview including instructions to authenticate and interact with the Azure Blob Storage source
## Supported capabilities
For file types such as csv, tsv, psv, ssv, the schema is extracted when the foll
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
This section will enable you to register the Azure Blob storage account and set
### Steps to register
-It is important to register the data source in Azure Purview prior to setting up a scan for the data source.
+It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-purview-acct.png" alt-text="Screenshot that shows the Azure Purview account used to register the data source":::
+ :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-purview-acct.png" alt-text="Screenshot that shows the Microsoft Purview account used to register the data source":::
-1. **Open Azure Purview Studio** and navigate to the **Data Map --> Sources**
+1. **Open Microsoft Purview Studio** and navigate to the **Data Map --> Sources**
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-open-purview-studio.png" alt-text="Screenshot that shows the link to open Azure Purview Studio":::
+ :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-open-purview-studio.png" alt-text="Screenshot that shows the link to open Microsoft Purview Studio":::
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sources.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
The following options are supported:
> [!Note]
> If you have firewall enabled for the storage account, you must use managed identity authentication method when setting up a scan.

-- **System-assigned managed identity (Recommended)** - As soon as the Azure Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Azure Purview SAMI to perform the scans.
+- **System-assigned managed identity (Recommended)** - As soon as the Microsoft Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to perform the scans.
-- **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Azure Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
+- **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
-- **Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Azure Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
+- **Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Microsoft Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password, or a password.
> [!Note]
- > If you use this option, you need to deploy an _Azure key vault_ resource in your subscription and assign _Azure Purview account's_ SAMI with required access permission to secrets inside _Azure key vault_.
+ > If you use this option, you need to deploy an _Azure key vault_ resource in your subscription and assign _Microsoft Purview account's_ SAMI with required access permission to secrets inside _Azure key vault_.
- **Service Principal** - In this method, you can create a new or use an existing service principal in your Azure Active Directory tenant.

#### Using a system or user assigned managed identity for scanning
-It is important to give your Azure Purview account the permission to scan the Azure Blob data source. You can add access for the SAMI or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permission is needed.
+It is important to give your Microsoft Purview account the permission to scan the Azure Blob data source. You can add access for the SAMI or UAMI at the Subscription, Resource Group, or Resource level, depending on what level scan permission is needed.
> [!NOTE]
> If you have firewall enabled for the storage account, you must use **managed identity** authentication method when setting up a scan.
It is important to give your Azure Purview account the permission to scan the Az
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Azure Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
- :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Azure Purview account":::
+ :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
1. Go into your Azure Blob storage account in [Azure portal](https://portal.azure.com)
1. Navigate to **Security + networking > Networking**
When authentication method selected is **Account Key**, you need to get your acc
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan

#### Using Service Principal for scanning
It is important to give your service principal the permission to scan the Azure
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ under **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Storage Blob Data Reader** and enter your _service principal_ under **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
    :::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-sp-permission.png" alt-text="Screenshot that shows the details to provide storage account permissions to the service principal":::

### Creating the scan
-1. Open your **Azure Purview account** and select the **Open Azure Purview Studio**
+1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview Studio**
1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
1. Select the **New Scan** icon under the **Azure Blob data source** registered earlier
It is important to give your service principal the permission to scan the Azure
#### If using a system or user assigned managed identity
-Provide a **Name** for the scan, select the Azure Purview accounts SAMI or UAMI under **Credential**, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**
+Provide a **Name** for the scan, select the Microsoft Purview accounts SAMI or UAMI under **Credential**, choose the appropriate collection for the scan, and select **Test connection**. On a successful connection, select **Continue**
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-managed-identity.png" alt-text="Screenshot that shows the managed identity option to run the scan":::
Scans can be managed or run again on completion
## Access policy
-Access policies allow data owners to manage access to datasets from Azure Purview. Owners can monitor and manage data use from within the Azure Purview Studio, without directly modifying the storage account where the data is housed.
+Access policies allow data owners to manage access to datasets from Microsoft Purview. Owners can monitor and manage data use from within the Microsoft Purview Studio, without directly modifying the storage account where the data is housed.
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
To create an access policy for an Azure Storage account, follow the guidelines b
### Enable data use governance
-Data use governance is an option on your Azure Purview sources that will allow you to manage access for that source from within Azure Purview.
+Data use governance is an option on your Microsoft Purview sources that will allow you to manage access for that source from within Microsoft Purview.
To enable data use governance, follow [the data use governance guide](how-to-enable-data-use-governance.md#enable-data-use-governance).

### Create an access policy
Or you can follow the [generic guide for creating data access policies](how-to-d
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-* [Data insights in Azure Purview](concept-insights.md)
-* [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+* [Data insights in Microsoft Purview](concept-insights.md)
+* [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
* [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Cosmos Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-cosmos-database.md
Title: 'Register and scan Azure Cosmos Database (SQL API)'
-description: This article outlines the process to register an Azure Cosmos data source (SQL API) in Azure Purview including instructions to authenticate and interact with the Azure Cosmos database
+description: This article outlines the process to register an Azure Cosmos data source (SQL API) in Microsoft Purview including instructions to authenticate and interact with the Azure Cosmos database
Last updated 11/02/2021
-# Connect to Azure Cosmos database (SQL API) in Azure Purview
+# Connect to Azure Cosmos database (SQL API) in Microsoft Purview
-This article outlines the process to register an Azure Cosmos database (SQL API) in Azure Purview including instructions to authenticate and interact with the Azure Cosmos database source
+This article outlines the process to register an Azure Cosmos database (SQL API) in Microsoft Purview including instructions to authenticate and interact with the Azure Cosmos database source
## Supported capabilities
This article outlines the process to register an Azure Cosmos database (SQL API)
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
This section will enable you to register the Azure Cosmos database (SQL API) and
### Steps to register
-It is important to register the data source in Azure Purview prior to setting up a scan for the data source.
+It is important to register the data source in Microsoft Purview prior to setting up a scan for the data source.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
- :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-purview-acct.png" alt-text="Screenshot that shows the Azure Purview account used to register the data source":::
+ :::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-purview-acct.png" alt-text="Screenshot that shows the Microsoft Purview account used to register the data source":::
-1. **Open Azure Purview Studio** and navigate to the **Data Map --> Collections**
+1. **Open Microsoft Purview Studio** and navigate to the **Data Map --> Collections**
:::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-open-purview-studio.png" alt-text="Screenshot that navigates to the Sources link in the Data Map":::
In order to have access to scan the data source, an authentication method in the
There is only one way to set up authentication for Azure Cosmos Database:
-**Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Azure Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password or a password.
+**Account Key** - Secrets can be created inside an Azure Key Vault to store credentials in order to enable access for Microsoft Purview to scan data sources securely using the secrets. A secret can be a storage account key, SQL login password or a password.
> [!Note]
-> You need to deploy an _Azure key vault_ resource in your subscription and assign _Azure Purview account's_ MSI with required access permission to secrets inside _Azure key vault_.
+> You need to deploy an _Azure key vault_ resource in your subscription and assign _Microsoft Purview account's_ MSI with required access permission to secrets inside _Azure key vault_.
#### Using Account Key for scanning
You need to get your access key and store in the key vault:
:::image type="content" source="media/register-scan-azure-cosmos-database/register-cosmos-db-key-vault-options.png" alt-text="Screenshot that shows the key vault option to enter the secret values":::
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan.

### Creating the scan
-1. Open your **Azure Purview account** and select the **Open Azure Purview Studio**
+1. Open your **Microsoft Purview account** and select the **Open Microsoft Purview Studio**
1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
1. Select the **New Scan** icon under the **Azure Cosmos database** registered earlier
Scans can be managed or run again on completion.
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Data Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-data-explorer.md
Title: 'Connect to and manage Azure Data Explorer'
-description: This guide describes how to connect to Azure Data Explorer in Azure Purview, and use Azure Purview's features to scan and manage your Azure Data Explorer source.
+description: This guide describes how to connect to Azure Data Explorer in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Data Explorer source.
Last updated 12/03/2021
-# Connect to and manage Azure Data Explorer in Azure Purview
+# Connect to and manage Azure Data Explorer in Microsoft Purview
-This article outlines how to register Azure Data Explorer, and how to authenticate and interact with Azure Data Explorer in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Azure Data Explorer, and how to authenticate and interact with Azure Data Explorer in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register Azure Data Explorer, and how to authentica
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register Azure Data Explorer in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Azure Data Explorer in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
It is required to get the Service Principal's application ID and secret:
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan

#### Granting the Service Principal access to your Azure data explorer instance
It is required to get the Service Principal's application ID and secret:
### System or user assigned managed identity to register
-* **System-assigned managed identity** - As soon as the Azure Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. It has the same name as your Azure Purview account.
+* **System-assigned managed identity** - As soon as the Microsoft Purview Account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant. It has the same name as your Microsoft Purview account.
-* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Azure Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
+* **User-assigned managed identity** (preview) - Similar to a system-managed identity, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. For more information, you can see our [User-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
To register using either of these managed identities, follow these steps:
To register using either of these managed identities, follow these steps:
To register a new Azure Data Explorer (Kusto) account in your data catalog, follow these steps:
-1. Navigate to your Azure Purview account
+1. Navigate to your Microsoft Purview account
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On **Register sources**, select **Azure Data Explorer**
Follow the steps below to scan Azure Data Explorer to automatically identify ass
To create and run a new scan, follow these steps:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the Azure Data Explorer source that you registered.
To create and run a new scan, follow these steps:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Files Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-files-storage-source.md
Title: Connect to and manage Azure Files
-description: This guide describes how to connect to Azure Files in Azure Purview, and use Azure Purview's features to scan and manage your Azure Files source.
+description: This guide describes how to connect to Azure Files in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Files source.
Last updated 11/02/2021
-# Connect to and manage Azure Files in Azure Purview
+# Connect to and manage Azure Files in Microsoft Purview
-This article outlines how to register Azure Files, and how to authenticate and interact with Azure Files in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Azure Files, and how to authenticate and interact with Azure Files in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
For file types such as csv, tsv, psv, ssv, the schema is extracted when the foll
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register Azure Files in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Azure Files in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
When authentication method selected is **Account Key**, you need to get your acc
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *key* from your storage account
1. Select **Create** to complete
-1. If your key vault isn't connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault isn't connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan

### Steps to register

To register a new Azure Files account in your data catalog, follow these steps:
-1. Navigate to your Azure Purview Data Studio.
+1. Navigate to your Microsoft Purview Data Studio.
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On **Register sources**, select **Azure Files**
Follow the steps below to scan Azure Files to automatically identify assets and
To create and run a new scan, follow these steps:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the Azure Files source that you registered.
To create and run a new scan, follow these steps:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Multiple Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-multiple-sources.md
Title: Connect to and manage multiple Azure sources
-description: This guide describes how to connect to multiple Azure sources in Azure Purview at once, and use Azure Purview's features to scan and manage your sources.
+description: This guide describes how to connect to multiple Azure sources in Microsoft Purview at once, and use Microsoft Purview's features to scan and manage your sources.
Last updated 11/02/2021
-# Connect to and manage multiple Azure sources in Azure Purview
+# Connect to and manage multiple Azure sources in Microsoft Purview
-This article outlines how to register multiple Azure sources and how to authenticate and interact with them in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register multiple Azure sources and how to authenticate and interact with them in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register multiple Azure sources and how to authenti
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register multiple Azure sources in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register multiple Azure sources in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Prerequisites for registration
-Azure Purview needs permissions to be able to list resources under a subscription or resource group.
+Microsoft Purview needs permissions to be able to list resources under a subscription or resource group.
[!INCLUDE [Permissions to list resources](./includes/authentication-to-enumerate-resources.md)]
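The permission described above is typically granted by assigning the **Reader** role to the Purview account's managed identity at the subscription or resource group scope. A minimal Azure CLI sketch, assuming placeholder values for the managed identity object ID and subscription ID:

```shell
# Grant the Purview account's managed identity Reader at subscription scope
# so it can enumerate resources (storage accounts, SQL databases, and so on).
# Both values below are placeholders to be replaced with your own.
az role assignment create \
  --assignee "<purview-managed-identity-object-id>" \
  --role "Reader" \
  --scope "/subscriptions/<subscription-id>"
```

For a narrower scan, the `--scope` can instead target a single resource group (`/subscriptions/<subscription-id>/resourceGroups/<resource-group>`).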
To learn how to add permissions on each resource type within a subscription or r
### Steps to register
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. Select **Data Map** on the left menu.
1. Select **Register**.
1. On **Register sources**, select **Azure (multiple)**.
Follow the steps below to scan multiple Azure sources to automatically identify
To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in the Azure Purview Studio.
+1. Select the **Data Map** tab on the left pane in the Microsoft Purview Studio.
1. Select the data source that you registered.
1. Select **View details** > **+ New scan**, or use the **Scan** quick-action icon on the source tile.
1. For **Name**, fill in the name.
To create and run a new scan, do the following:
- If you leave the option as **All**, then future resources of that type will also be scanned in future scan runs.
- If you select specific storage accounts or SQL databases, then future resources of that type created within this subscription or resource group will not be included for scans, unless the scan is explicitly edited in the future.
-1. Select **Test connection**. This will first test access to check if you've applied the Azure Purview MSI file as a reader on the subscription or resource group. If you get an error message, follow [these instructions](#prerequisites-for-registration) to resolve it. Then it will test your authentication and connection to each of your selected sources and generate a report. The number of sources selected will impact the time it takes to generate this report. If failed on some resources, hovering over the **X** icon will display the detailed error message.
+1. Select **Test connection**. This will first test access to check if you've applied the Microsoft Purview MSI file as a reader on the subscription or resource group. If you get an error message, follow [these instructions](#prerequisites-for-registration) to resolve it. Then it will test your authentication and connection to each of your selected sources and generate a report. The number of sources selected will impact the time it takes to generate this report. If the test fails for some resources, hovering over the **X** icon will display the detailed error message.
:::image type="content" source="media/register-scan-azure-multiple-sources/test-connection.png" alt-text="Screenshot showing the scan set up slider, with the Test Connection button highlighted.":::

:::image type="content" source="media/register-scan-azure-multiple-sources/test-connection-report.png" alt-text="Screenshot showing an example test connection report, with some connections passing and some failing. Hovering over one of the failed connections shows a detailed error report.":::
To manage a scan, do the following:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Mysql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-mysql-database.md
Title: 'Connect to and manage Azure Database for MySQL'
-description: This guide describes how to connect to Azure Database for MySQL in Azure Purview, and use Azure Purview's features to scan and manage your Azure Database for MySQL source.
+description: This guide describes how to connect to Azure Database for MySQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Database for MySQL source.
Last updated 11/02/2021
-# Connect to and manage Azure Database for MySQL in Azure Purview
+# Connect to and manage Azure Database for MySQL in Microsoft Purview
-This article outlines how to register a database in Azure Database for MySQL, and how to authenticate and interact with Azure Database for MySQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register a database in Azure Database for MySQL, and how to authenticate and interact with Azure Database for MySQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register a database in Azure Database for MySQL, an
||||||||
| [Yes](#register) | [Yes](#scan)| [Yes*](#scan) | [Yes](#scan) | [Yes](#scan) | No | No** |
-\* Azure Purview relies on UPDATE_TIME metadata from Azure Database for MySQL for incremental scans. In some cases, this field might not persist in the database and a full scan is performed. For more information, see [The INFORMATION_SCHEMA TABLES Table](https://dev.mysql.com/doc/refman/5.7/en/information-schema-tables-table.html) for MySQL.
+\* Microsoft Purview relies on UPDATE_TIME metadata from Azure Database for MySQL for incremental scans. In some cases, this field might not persist in the database and a full scan is performed. For more information, see [The INFORMATION_SCHEMA TABLES Table](https://dev.mysql.com/doc/refman/5.7/en/information-schema-tables-table.html) for MySQL.
\** Lineage is supported if dataset is used as a source/sink in [Data Factory Copy activity](how-to-link-azure-data-factory.md)

> [!Important]
-> Azure Purview only supports single server deployment option for Azure Database for MySQL.
+> Microsoft Purview only supports single server deployment option for Azure Database for MySQL.
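To check whether the incremental-scan metadata mentioned above is available in your database, you can query `UPDATE_TIME` directly; this is a hedged sketch using the `mysql` command-line client, with placeholder server, user, and database names:

```shell
# Inspect UPDATE_TIME for your tables. If the column is NULL for a table,
# the scanner cannot detect changes and a full scan is performed instead.
# Server, user, and database names below are placeholders.
mysql -h "<server-name>.mysql.database.azure.com" -u "<user>" -p \
  -e "SELECT TABLE_NAME, UPDATE_TIME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = '<database>';"
```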
## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register an Azure Database for MySQL in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register an Azure Database for MySQL in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
Follow the instructions in [CREATE DATABASES AND USERS](../mysql/howto-create-us
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your Azure Database for MySQL
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) of type SQL authentication using the **username** and **password** to set up your scan.

### Steps to register

To register a new Azure Database for MySQL in your data catalog, do the following:
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
Follow the steps below to scan Azure Database for MySQL to automatically identif
To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the Azure Database for MySQL source that you registered.
To create and run a new scan, do the following:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-postgresql.md
Title: 'Connect to and manage an Azure Database for PostgreSQL'
-description: This guide describes how to connect to an Azure Database for PostgreSQL single server in Azure Purview, and use Azure Purview's features to scan and manage your Azure Database for PostgreSQL source.
+description: This guide describes how to connect to an Azure Database for PostgreSQL single server in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Database for PostgreSQL source.
Last updated 11/02/2021
-# Connect to and manage an Azure Database for PostgreSQL in Azure Purview
+# Connect to and manage an Azure Database for PostgreSQL in Microsoft Purview
-This article outlines how to register an Azure Database for PostgreSQL deployed with single server deployment option, as well as how to authenticate and interact with an Azure Database for PostgreSQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register an Azure Database for PostgreSQL deployed with single server deployment option, as well as how to authenticate and interact with an Azure Database for PostgreSQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register an Azure Database for PostgreSQL deployed
\** Lineage is supported if dataset is used as a source/sink in [Data Factory Copy activity](how-to-link-azure-data-factory.md)

> [!Important]
-> Azure Purview only supports single server deployment option for Azure Database for PostgreSQL.
+> Microsoft Purview only supports single server deployment option for Azure Database for PostgreSQL.
> Versions 8.x to 12.x

## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register an Azure Database for PostgreSQL in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register an Azure Database for PostgreSQL in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
Connecting to an Azure Database for PostgreSQL database requires the fully quali
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your Azure PostgreSQL Database
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) of type SQL authentication using the **username** and **password** to set up your scan

### Steps to register

To register a new Azure Database for PostgreSQL in your data catalog, do the following:
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
Follow the steps below to scan an Azure Database for PostgreSQL database to auto
To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the Azure Database for PostgreSQL source that you registered.
To create and run a new scan, do the following:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Sql Database Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-sql-database-managed-instance.md
Title: 'Connect to and manage Azure SQL Database Managed Instance'
-description: This guide describes how to connect to Azure SQL Database Managed Instance in Azure Purview, and use Azure Purview's features to scan and manage your Azure SQL Database Managed Instance source.
+description: This guide describes how to connect to Azure SQL Database Managed Instance in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure SQL Database Managed Instance source.
Last updated 11/02/2021
-# Connect to and manage an Azure SQL Database Managed Instance in Azure Purview
+# Connect to and manage an Azure SQL Database Managed Instance in Microsoft Purview
-This article outlines how to register and Azure SQL Database Managed Instance, as well as how to authenticate and interact with the Azure SQL Database Managed Instance in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md)
+This article outlines how to register an Azure SQL Database Managed Instance, as well as how to authenticate and interact with the Azure SQL Database Managed Instance in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register and Azure SQL Database Managed Instance, a
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* [Configure public endpoint in Azure SQL Managed Instance](../azure-sql/managed-instance/public-endpoint-configure.md)

> [!Note]
- > We now support scanning Azure SQL Database Managed Instances over the private connection using Azure Purview ingestion private endpoints and a self-hosted integration runtime VM.
- > For more information related to prerequisites, see [Connect to your Azure Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md)
+ > We now support scanning Azure SQL Database Managed Instances over the private connection using Microsoft Purview ingestion private endpoints and a self-hosted integration runtime VM.
+ > For more information related to prerequisites, see [Connect to your Microsoft Purview and scan data sources privately and securely](./catalog-private-link-end-to-end.md)
## Register
-This section describes how to register an Azure SQL Database Managed Instance in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register an Azure SQL Database Managed Instance in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
-If you need to create new authentication, you need to [authorize database access to SQL Database Managed Instance](../azure-sql/database/logins-create-manage.md). There are three authentication methods that Azure Purview supports today:
+If you need to create new authentication, you need to [authorize database access to SQL Database Managed Instance](../azure-sql/database/logins-create-manage.md). There are three authentication methods that Microsoft Purview supports today:
- [System or user assigned managed identity](#system-or-user-assigned-managed-identity-to-register)
- [Service Principal](#service-principal-to-register)
If you need to create new authentication, you need to [authorize database access
#### System or user assigned managed identity to register
-You can use either your Azure Purview system-assigned managed identity (SAMI), or a [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Azure Purview, like you would for any other user, group, or service principal. The Azure Purview system-assigned managed identity is created automatically when the account is created and has the same name as your Azure Purview account. A user-assigned managed identity is a resource that can be created independently. To create one you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
+You can use either your Microsoft Purview system-assigned managed identity (SAMI), or a [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Microsoft Purview, like you would for any other user, group, or service principal. The Microsoft Purview system-assigned managed identity is created automatically when the account is created and has the same name as your Microsoft Purview account. A user-assigned managed identity is a resource that can be created independently. To create one you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity).
You can find your managed identity Object ID in the Azure portal by following these steps:
-For Azure Purview account's system-assigned managed identity:
-1. Open the Azure portal, and navigate to your Azure Purview account.
+For Microsoft Purview account's system-assigned managed identity:
+1. Open the Azure portal, and navigate to your Microsoft Purview account.
1. Select the **Properties** tab on the left side menu.
1. Select the **Managed identity object ID** value and copy it.

For user-assigned managed identity (preview):
-1. Open the Azure portal, and navigate to your Azure Purview account.
+1. Open the Azure portal, and navigate to your Microsoft Purview account.
1. Select the **Managed identities** tab on the left side menu
1. Select the user assigned managed identities, select the intended identity to view the details.
1. The object (principal) ID is displayed in the overview essential section.
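The system-assigned identity's object ID can also be read from the command line instead of the portal; a minimal Azure CLI sketch, assuming placeholder resource group and account names:

```shell
# Read the system-assigned managed identity's principal (object) ID
# directly from the Purview account resource. Names are placeholders.
az resource show \
  --resource-group "<resource-group>" \
  --name "<purview-account-name>" \
  --resource-type "Microsoft.Purview/accounts" \
  --query identity.principalId --output tsv
```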
Either managed identity will need permission to get metadata for the database, s
#### Service Principal to register
-There are several steps to allow Azure Purview to use service principal to scan your Azure SQL Database Managed Instance.
+There are several steps to allow Microsoft Purview to use a service principal to scan your Azure SQL Database Managed Instance.
#### Create or use an existing service principal
The service principal must have permission to get metadata for the database, sch
- Create an Azure AD user in Azure SQL Database Managed Instance by following the prerequisites and tutorial on [Create contained users mapped to Azure AD identities](../azure-sql/database/authentication-aad-configure.md?tabs=azure-powershell#create-contained-users-mapped-to-azure-ad-identities)
- Assign `db_datareader` permission to the identity.
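The two steps above can be sketched in T-SQL, run here through `sqlcmd` with Azure AD authentication (`-G`); the endpoint, database, and service principal display name are placeholder assumptions, and the session must be signed in as an Azure AD admin of the instance:

```shell
# Create a contained user mapped to the service principal and grant it
# read access to metadata via db_datareader. All names are placeholders.
sqlcmd -S "<managed-instance>.public.<dns-zone>.database.windows.net,3342" \
  -d "<database>" -G -Q "
CREATE USER [<service-principal-display-name>] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [<service-principal-display-name>];"
```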
-#### Add service principal to key vault and Azure Purview's credential
+#### Add service principal to key vault and Microsoft Purview's credential
It is required to get the service principal's application ID and secret:
It is required to get the service principal's application ID and secret:
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan.

#### SQL authentication to register

> [!Note]
-> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It takes about **15 minutes** after granting permission, the Azure Purview account should have the appropriate permissions to be able to scan the resource(s).
+> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It takes about **15 minutes** after granting permission before the Microsoft Purview account has the appropriate permissions to scan the resource(s).
You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azuresqldb-current&preserve-view=true#examples-1) to create a login for Azure SQL Database Managed Instance if you don't have this login available. You will need **username** and **password** for the next steps.
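A minimal sketch of that login creation through `sqlcmd` against the instance's public endpoint; the server address, admin credentials, login name `purview_reader`, and password are all placeholder assumptions, not values from this article:

```shell
# Create a SQL login for Purview to use with SQL authentication.
# Run against the master database with an admin login; all values
# below are placeholders to be replaced with your own.
sqlcmd -S "<managed-instance>.public.<dns-zone>.database.windows.net,3342" \
  -d master -U "<admin-user>" -P "<admin-password>" \
  -Q "CREATE LOGIN purview_reader WITH PASSWORD = '<strong-password>';"
```

The resulting **username** and **password** are what you store in the key vault secret in the steps that follow.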
You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-l
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your Azure SQL Database Managed Instance
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the **username** and **password** to set up your scan.

### Steps to register
-1. Navigate to your [Azure Purview Studio](https://web.purview.azure.com/resource/)
+1. Navigate to your [Microsoft Purview Studio](https://web.purview.azure.com/resource/)
1. Select **Data Map** on the left navigation.
Follow the steps below to scan an Azure SQL Database Managed Instance to automat
To create and run a new scan, complete the following steps:
-1. Select the **Data Map** tab on the left pane in the Azure Purview Studio.
+1. Select the **Data Map** tab on the left pane in the Microsoft Purview Studio.
1. Select the Azure SQL Database Managed Instance source that you registered.
To create and run a new scan, complete the following steps:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-sql-database.md
Title: 'Register and scan Azure SQL DB'
-description: This article outlines the process to register an Azure SQL database in Azure Purview including instructions to authenticate and interact with the Azure SQL DB source
+description: This article outlines the process to register an Azure SQL database in Microsoft Purview including instructions to authenticate and interact with the Azure SQL DB source
Last updated 11/10/2021
-# Connect to Azure SQL Database in Azure Purview
+# Connect to Azure SQL Database in Microsoft Purview
-This article outlines the process to register an Azure SQL data source in Azure Purview including instructions to authenticate and interact with the Azure SQL database source
+This article outlines the process to register an Azure SQL data source in Microsoft Purview including instructions to authenticate and interact with the Azure SQL database source
## Supported capabilities
This article outlines the process to register an Azure SQL data source in Azure
* Data lineage extraction is currently supported only for Stored procedure runs
-When scanning Azure SQL Database, Azure Purview supports:
+When scanning Azure SQL Database, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can further scope the scan after providing the databas
### Known limitations
-* Azure Purview doesn't support over 300 columns in the Schema tab and it will show "Additional-Columns-Truncated" if there are more than 300 columns.
+* Microsoft Purview doesn't support more than 300 columns in the Schema tab; if there are more than 300 columns, it will show "Additional-Columns-Truncated".
* Column level lineage is currently not supported in the lineage tab. However, the columnMapping attribute in the properties tab of Azure SQL Stored Procedure Run captures column lineage in plain text.
* Stored procedures with dynamic SQL, run from remote data integration tools like Azure Data Factory, are currently not supported.
* Data lineage extraction is currently not supported for Functions or Triggers.
When setting up scan, you can further scope the scan after providing the databas
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
This section will enable you to register the Azure SQL DB data source and set up
### Steps to register
-It's important to register the data source in Azure Purview before setting up a scan.
+It's important to register the data source in Microsoft Purview before setting up a scan.
-1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Azure Purview accounts** page and select your _Purview account_
+1. Go to the [Azure portal](https://portal.azure.com), and navigate to the **Microsoft Purview accounts** page and select your _Purview account_
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-purview-acct.png" alt-text="Screenshot that shows the Azure Purview account used to register the data source.":::
+ :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-purview-acct.png" alt-text="Screenshot that shows the Microsoft Purview account used to register the data source.":::
-1. **Open Azure Purview Studio** and navigate to the **Data Map**
+1. **Open Microsoft Purview Studio** and navigate to the **Data Map**
:::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-open-purview-studio.png" alt-text="Screenshot that navigates to the Sources link in the Data Map.":::
If your database server has a firewall enabled, you'll need to update the firewa
#### Allow Azure Connections
-Enabling Azure connections will allow Azure Purview to reach and connect the server without updating the firewall itself. You can follow the How-to guide for [Connections from inside Azure](../azure-sql/database/firewall-configure.md#connections-from-inside-azure).
+Enabling Azure connections will allow Microsoft Purview to reach and connect to the server without updating the firewall itself. You can follow the how-to guide for [Connections from inside Azure](../azure-sql/database/firewall-configure.md#connections-from-inside-azure).
1. Navigate to your database account
1. Select the server name in the **Overview** page
The following options are supported:
* **SQL Authentication**
-* **System-assigned managed identity** - As soon as the Azure Purview account is created, a system-assigned managed identity (SAMI) is created automatically in Azure AD tenant, and has the same name as your Azure Purview account. Depending on the type of resource, specific RBAC role assignments are required for the Azure Purview SAMI to be able to scan.
+* **System-assigned managed identity** - As soon as the Microsoft Purview account is created, a system-assigned managed identity (SAMI) is created automatically in the Azure AD tenant, and has the same name as your Microsoft Purview account. Depending on the type of resource, specific RBAC role assignments are required for the Microsoft Purview SAMI to be able to scan.
-* **User-assigned managed identity** (preview) - Similar to a SAMI, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Azure Purview to authenticate against Azure Active Directory. Depending on the type of resource, specific RBAC role assignments are required when using a UAMI credential to run scans.
+* **User-assigned managed identity** (preview) - Similar to a SAMI, a user-assigned managed identity (UAMI) is a credential resource that can be used to allow Microsoft Purview to authenticate against Azure Active Directory. Depending on the type of resource, specific RBAC role assignments are required when using a UAMI credential to run scans.
* **Service Principal**- In this method, you can create a new or use an existing service principal in your Azure Active Directory tenant.
Select your method of authentication from the tabs below for steps to authentica
# [SQL authentication](#tab/sql-authentication)

> [!Note]
-> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It takes about **15 minutes** after granting permission, the Azure Purview account should have the appropriate permissions to be able to scan the resource(s).
+> Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It can take about **15 minutes** after granting permission for the Microsoft Purview account to have the appropriate permissions to scan the resource(s).
-1. You'll need a SQL login with at least `db_datareader` permissions to be able to access the information Azure Purview needs to scan the database. You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azuresqldb-current&preserve-view=true#examples-1) to create a sign-in for Azure SQL Database. You'll need to save the **username** and **password** for the next steps.
+1. You'll need a SQL login with at least `db_datareader` permissions to be able to access the information Microsoft Purview needs to scan the database. You can follow the instructions in [CREATE LOGIN](/sql/t-sql/statements/create-login-transact-sql?view=azuresqldb-current&preserve-view=true#examples-1) to create a sign-in for Azure SQL Database. You'll need to save the **username** and **password** for the next steps.
1. Navigate to your key vault in the Azure portal.
Select your method of authentication from the tabs below for steps to authentica
1. Select **Create** to complete
-1. If your key vault isn't connected to Azure Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan.
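Taken together, the SQL authentication steps above amount to creating a read-only login for the scanner. A minimal T-SQL sketch, assuming an illustrative login name `purview-reader` (not named in the article):

```sql
-- In the master database: create the login (name and password are illustrative)
CREATE LOGIN [purview-reader] WITH PASSWORD = '<use-a-strong-password>';

-- In each database to be scanned: map a user to the login and grant read access
CREATE USER [purview-reader] FOR LOGIN [purview-reader];
ALTER ROLE db_datareader ADD MEMBER [purview-reader];
```

The `db_datareader` role is the minimum the article calls for; broader roles are only needed for lineage extraction, covered later.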
Select your method of authentication from the tabs below for steps to authentica
The managed identity needs permission to get metadata for the database, schemas, and tables. It must also be authorized to query the tables to sample for classification.
- If you haven't already, [configure Azure AD authentication with Azure SQL](../azure-sql/database/authentication-aad-configure.md)
-- Create Azure AD user in Azure SQL Database with the exact Azure Purview's managed identity by following tutorial on [create the user in Azure SQL Database](../azure-sql/database/authentication-aad-service-principal-tutorial.md#create-the-service-principal-user-in-azure-sql-database). Assign proper permission (for example: `db_datareader`) to the identity. Example SQL syntax to create user and grant permission:
+- Create an Azure AD user in Azure SQL Database for the Microsoft Purview managed identity by following the tutorial on [create the user in Azure SQL Database](../azure-sql/database/authentication-aad-service-principal-tutorial.md#create-the-service-principal-user-in-azure-sql-database). Assign the proper permission (for example: `db_datareader`) to the identity. Example SQL syntax to create the user and grant permission:
```sql
CREATE USER [Username] FROM EXTERNAL PROVIDER
```

> [!Note]
- > The `Username` is your Azure Purview's managed identity name. You can read more about [fixed-database roles and their capabilities](/sql/relational-databases/security/authentication-access/database-level-roles#fixed-database-roles).
+ > The `Username` is your Microsoft Purview's managed identity name. You can read more about [fixed-database roles and their capabilities](/sql/relational-databases/security/authentication-access/database-level-roles#fixed-database-roles).
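Assuming a Purview account named `purview-account` (a hypothetical name for illustration), the create-and-grant pair referenced above typically looks like:

```sql
-- Create a database user mapped to the Purview managed identity in Azure AD
CREATE USER [purview-account] FROM EXTERNAL PROVIDER;
-- Grant read access so the identity can scan metadata and sample data
ALTER ROLE db_datareader ADD MEMBER [purview-account];
```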
##### Configure Portal Authentication
-It's important to give your Azure Purview account's system-managed identity or [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) the permission to scan the Azure SQL DB. You can add the SAMI or UAMI at the Subscription, Resource Group, or Resource level, depending on the breadth of the scan.
+It's important to give your Microsoft Purview account's system-managed identity or [user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) the permission to scan the Azure SQL DB. You can add the SAMI or UAMI at the Subscription, Resource Group, or Resource level, depending on the breadth of the scan.
> [!Note]
> You need to be an owner of the subscription to be able to add a managed identity on an Azure resource.
It's important to give your Azure Purview account's system-managed identity or [
:::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-sql-ds.png" alt-text="Screenshot that shows the Azure SQL database.":::
-1. Set the **Role** to **Reader** and enter your _Azure Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Reader** and enter your _Microsoft Purview account name_ or _[user-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity)_ under **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
- :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-access-managed-identity.png" alt-text="Screenshot that shows the details to assign permissions for the Azure Purview account.":::
+ :::image type="content" source="media/register-scan-azure-sql-database/register-scan-azure-sql-db-access-managed-identity.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account.":::
# [Service principal](#tab/service-principal)
The service principal needs permission to get metadata for the database, schemas
:::image type="content" source="media/register-scan-azure-sql-database/select-create.png" alt-text="Screenshot that shows the Key Vault Create a secret menu, with the Create button highlighted.":::
-1. If your key vault isn't connected to Azure Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Then, [create a new credential](manage-credentials.md#create-a-new-credential).
The service principal needs permission to get metadata for the database, schemas
### Creating the scan
-1. Open your **Azure Purview account** and select the **Open Azure Purview Studio**
+1. Open your **Microsoft Purview account** and select **Open Microsoft Purview Studio**
1. Navigate to the **Data map** --> **Sources** to view the collection hierarchy
1. Select the **New Scan** icon under the **Azure SQL DB** registered earlier
Scans can be managed or run again on completion
## Lineage (Preview)
-Azure Purview supports lineage from Azure SQL Database. At the time of setting up a scan, enable lineage extraction toggle button to extract lineage.
+Microsoft Purview supports lineage from Azure SQL Database. When setting up a scan, enable the lineage extraction toggle to extract lineage.
### Prerequisites for setting up scan with Lineage extraction
-1. Follow steps under [authentication for a scan using Managed Identity](#authentication-for-a-scan) section to authorize Azure Purview scan your Azure SQL DataBase
+1. Follow the steps under the [authentication for a scan using Managed Identity](#authentication-for-a-scan) section to authorize Microsoft Purview to scan your Azure SQL Database
2. Sign in to Azure SQL Database with an Azure AD account and assign the proper permission (for example: `db_owner`) to the Purview managed identity. Use the example SQL syntax below to create the user and grant permission, replacing 'purview-account' with your account name:
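The example block itself is elided from this diff; a hedged sketch of the intent, using the 'purview-account' placeholder named above:

```sql
-- Create a user for the Purview managed identity and grant db_owner
-- (the broader role is what the lineage-extraction step above asks for)
CREATE USER [purview-account] FROM EXTERNAL PROVIDER;
GO
EXEC sp_addrolemember 'db_owner', [purview-account];
GO
```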
You can [browse data catalog](how-to-browse-catalog.md) or [search data catalog]
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Azure Synapse Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-synapse-analytics.md
Title: 'Connect to and manage dedicated SQL pools (formerly SQL DW)'
-description: This guide describes how to connect to dedicated SQL pools (formerly SQL DW) in Azure Purview, and use Azure Purview's features to scan and manage your dedicated SQL pools source.
+description: This guide describes how to connect to dedicated SQL pools (formerly SQL DW) in Microsoft Purview, and use Microsoft Purview's features to scan and manage your dedicated SQL pools source.
Last updated 11/10/2021
-# Connect to and manage dedicated SQL pools in Azure Purview
+# Connect to and manage dedicated SQL pools in Microsoft Purview
-This article outlines how to register dedicated SQL pools (formerly SQL DW), and how to authenticate and interact with dedicated SQL pools in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md)
+This article outlines how to register dedicated SQL pools (formerly SQL DW), and how to authenticate and interact with dedicated SQL pools in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md)
> [!NOTE]
> If you are looking to register and scan a dedicated SQL database within a Synapse workspace, you must follow instructions [here](register-scan-synapse-workspace.md).
This article outlines how to register dedicated SQL pools (formerly SQL DW), and
### Known limitations
-* Azure Purview doesn't support over 300 columns in the Schema tab and it will show "Additional-Columns-Truncated".
+* Microsoft Purview doesn't support more than 300 columns in the Schema tab; if there are more, it will show "Additional-Columns-Truncated".
## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register dedicated SQL pools in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register dedicated SQL pools in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
There are three ways to set up authentication:
- [SQL authentication](#sql-authentication-to-register)

> [!Note]
- > Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It takes about 15 minutes after granting permission, the Azure Purview account should have the appropriate permissions to be able to scan the resource(s).
+ > Only the server-level principal login (created by the provisioning process) or members of the `loginmanager` database role in the master database can create new logins. It can take about 15 minutes after granting permission for the Microsoft Purview account to have the appropriate permissions to scan the resource(s).
#### System or user assigned managed identity to register
-You can use either your Azure Purview system-assigned managed identity (SAMI), or a [User-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Azure Purview, like you would for any other user, group, or service principal. The Azure Purview SAMI is created automatically when the account is created. A UAMI is a resource that can be created independently, and to create one you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity). Create an Azure AD user in the dedicated SQL pool using your managed identity object name by following the prerequisites and tutorial on [Create Azure AD users using Azure AD applications](../azure-sql/database/authentication-aad-service-principal-tutorial.md).
+You can use either your Microsoft Purview system-assigned managed identity (SAMI), or a [User-assigned managed identity](manage-credentials.md#create-a-user-assigned-managed-identity) (UAMI) to authenticate. Both options allow you to assign authentication directly to Microsoft Purview, like you would for any other user, group, or service principal. The Microsoft Purview SAMI is created automatically when the account is created. A UAMI is a resource that can be created independently, and to create one you can follow our [user-assigned managed identity guide](manage-credentials.md#create-a-user-assigned-managed-identity). Create an Azure AD user in the dedicated SQL pool using your managed identity object name by following the prerequisites and tutorial on [Create Azure AD users using Azure AD applications](../azure-sql/database/authentication-aad-service-principal-tutorial.md).
Example SQL syntax to create user and grant permission:
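The example block is elided from this diff; for a dedicated SQL pool the grant is conventionally written with `sp_addrolemember` (the identity name here is illustrative):

```sql
-- Map the managed identity to a database user, then grant read access
CREATE USER [PurviewAccountName] FROM EXTERNAL PROVIDER;
EXEC sp_addrolemember 'db_datareader', [PurviewAccountName];
```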
It is required to get the Service Principal's application ID and secret:
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** of your choice and **Value** as the **Client secret** from your Service Principal
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the Service Principal to set up your scan.

##### Granting the Service Principal access
GO
```

> [!Note]
-> Azure Purview will need the **Application (client) ID** and the **client secret** in order to scan.
+> Microsoft Purview will need the **Application (client) ID** and the **client secret** in order to scan.
#### SQL authentication to register
When authentication method selected is **SQL Authentication**, you need to get y
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* for your SQL login
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the key to set up your scan.

### Steps to register
-To register a new SQL dedicated pool in Azure Purview, complete the following steps:
+To register a new SQL dedicated pool in Microsoft Purview, complete the following steps:
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On **Register sources**, select **Azure Dedicated SQL Pool (formerly SQL DW)**.
Follow the steps below to scan dedicated SQL pools to automatically identify ass
To create and run a new scan, complete the following steps:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the SQL dedicated pool source that you registered.
To create and run a new scan, complete the following steps:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Cassandra Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-cassandra-source.md
Title: Connect to and manage Cassandra
-description: This guide describes how to connect to Cassandra in Azure Purview, and use Azure Purview's features to scan and manage your Cassandra source.
+description: This guide describes how to connect to Cassandra in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Cassandra source.
Last updated 03/05/2022
-# Connect to and manage Cassandra in Azure Purview (Preview)
+# Connect to and manage Cassandra in Microsoft Purview (Preview)
-This article outlines how to register Cassandra, and how to authenticate and interact with Cassandra in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Cassandra, and how to authenticate and interact with Cassandra in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register Cassandra, and how to authenticate and int
The supported Cassandra server versions are 3.*x* or 4.*x*.
-When scanning Cassandra source, Azure Purview supports:
+When scanning Cassandra source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire Cassandra instance, or sc
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
**If your data store is not publicly accessible** (if your data store limits access from an on-premises network, a private network, or specific IPs, etc.), you need to configure a self-hosted integration runtime to connect to it:
When setting up scan, you can choose to scan an entire Cassandra instance, or sc
## Register
-This section describes how to register Cassandra in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Cassandra in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new Cassandra server in your data catalog:
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. Select **Data Map** on the left pane.
1. Select **Register**.
1. On the **Register sources** screen, select **Cassandra**, and then select **Continue**:
To create and run a new scan:
* In the **User name** box, provide the name of the user you're making the connection for.
* In the key vault's secret, save the password of the Cassandra user you're making the connection for.
- For more information, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+ For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
1. **Keyspaces**: Specify a list of Cassandra keyspaces to import. Multiple keyspaces must be separated with semicolons. For example, keyspace1; keyspace2. When the list is empty, all available keyspaces are imported.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Db2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-db2.md
Title: Connect to and manage Db2
-description: This guide describes how to connect to Db2 in Azure Purview, and use Azure Purview's features to scan and manage your Db2 source.
+description: This guide describes how to connect to Db2 in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Db2 source.
Last updated 01/20/2022
-# Connect to and manage Db2 in Azure Purview (Preview)
+# Connect to and manage Db2 in Microsoft Purview (Preview)
-This article outlines how to register Db2, and how to authenticate and interact with Db2 in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Db2, and how to authenticate and interact with Db2 in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register Db2, and how to authenticate and interact
The supported IBM Db2 versions are Db2 for LUW 9.7 to 11.x. Db2 for z/OS (mainframe) and iSeries (AS/400) aren't currently supported.
-When scanning IBM Db2 source, Azure Purview supports:
+When scanning IBM Db2 source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire Db2 database, or scope th
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimum supported self-hosted integration runtime version is 5.12.7984.1.
When setting up scan, you can choose to scan an entire Db2 database, or scope th
> [!Note]
> The driver should be accessible to all accounts in the VM. Do not install it in a user account.
-* The Db2 user must have the CONNECT permission. Azure Purview connects to the syscat tables in IBM Db2 environment when importing metadata.
+* The Db2 user must have the CONNECT permission. Microsoft Purview connects to the syscat tables in IBM Db2 environment when importing metadata.
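On the Db2 side, the CONNECT requirement above can be granted with standard Db2 for LUW SQL (the user name `purview_scan` is illustrative, not from the article):

```sql
-- Db2 for LUW: allow the scanning user to connect to the database
GRANT CONNECT ON DATABASE TO USER purview_scan;
```

The syscat catalog views that Purview reads are SELECT-able by PUBLIC in a default Db2 configuration, so CONNECT is typically the only explicit grant needed.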
## Register
-This section describes how to register Db2 in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Db2 in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new Db2 source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On Register sources, select **Db2**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Erwin Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-erwin-source.md
Title: Connect to and manage erwin Mart servers
-description: This guide describes how to connect to erwin Mart servers in Azure Purview, and use Azure Purview's features to scan and manage your erwin Mart server source.
+description: This guide describes how to connect to erwin Mart servers in Microsoft Purview, and use Microsoft Purview's features to scan and manage your erwin Mart server source.
Last updated 01/20/2022
-# Connect to and manage erwin Mart servers in Azure Purview (Preview)
+# Connect to and manage erwin Mart servers in Microsoft Purview (Preview)
-This article outlines how to register erwin Mart servers, and how to authenticate and interact with erwin Mart Servers in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register erwin Mart servers, and how to authenticate and interact with erwin Mart Servers in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register erwin Mart servers, and how to authenticat
The supported erwin Mart versions are 9.x to 2021.
-When scanning erwin Mart source, Azure Purview supports:
+When scanning an erwin Mart source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire erwin Mart server, or sco
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
When setting up scan, you can choose to scan an entire erwin Mart server, or sco
## Register
-This section describes how to register erwin Mart servers in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register erwin Mart servers in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
The only supported authentication for an erwin Mart source is **Server Authentication** in the form of username and password.

### Steps to register
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/).
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On Register sources, select **erwin**. Select **Continue.**
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Google Bigquery Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-google-bigquery-source.md
Title: Connect to and manage Google BigQuery projects
-description: This guide describes how to connect to Google BigQuery projects in Azure Purview, and use Azure Purview's features to scan and manage your Google BigQuery source.
+description: This guide describes how to connect to Google BigQuery projects in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Google BigQuery source.
Last updated 01/20/2022
-# Connect to and manage Google BigQuery projects in Azure Purview (Preview)
+# Connect to and manage Google BigQuery projects in Microsoft Purview (Preview)
-This article outlines how to register Google BigQuery projects, and how to authenticate and interact with Google BigQuery in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Google BigQuery projects, and how to authenticate and interact with Google BigQuery in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register Google BigQuery projects, and how to authe
||||||||
| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| [Yes](#lineage)|
-When scanning Google BigQuery source, Azure Purview supports:
+When scanning a Google BigQuery source, Microsoft Purview supports:
- Extracting technical metadata including:
When scanning Google BigQuery source, Azure Purview supports:
When setting up scan, you can choose to scan an entire Google BigQuery project, or scope the scan to a subset of datasets matching the given name(s) or name pattern(s).

>[!NOTE]
-> Currently, Azure Purview only supports scanning Google BigQuery datasets in US multi-regional location. If the specified dataset is in other location e.g. us-east1 or EU, you will observe scan completes but no assets shown up in Azure Purview.
+> Currently, Microsoft Purview only supports scanning Google BigQuery datasets in the US multi-regional location. If the specified dataset is in another location, for example us-east1 or EU, the scan completes but no assets show up in Microsoft Purview.
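The dataset scoping described above can be pictured as a glob-style name filter. The sketch below is purely illustrative (the function name `in_scope` is invented here and is not part of Purview); it only shows what "matching the given name(s) or name pattern(s)" means:

```python
import fnmatch

def in_scope(dataset_names, patterns):
    """Keep only the datasets whose name matches at least one glob pattern."""
    return [d for d in dataset_names if any(fnmatch.fnmatchcase(d, p) for p in patterns)]

# Scope a scan to datasets whose names start with "sales_".
print(in_scope(["sales_2021", "sales_2022", "hr"], ["sales_*"]))
# → ['sales_2021', 'sales_2022']
```

A dataset outside every pattern (like `hr` above) is simply skipped by the scan.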
## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
When setting up scan, you can choose to scan an entire Google BigQuery project,
## Register
-This section describes how to register a Google BigQuery project in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register a Google BigQuery project in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register.**
1. On Register sources, select **Google BigQuery**. Select **Continue.**
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Hive Metastore Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-hive-metastore-source.md
Title: Connect to and manage Hive Metastore databases
-description: This guide describes how to connect to Hive Metastore databases in Azure Purview, and how to use Azure Purview to scan and manage your Hive Metastore database source.
+description: This guide describes how to connect to Hive Metastore databases in Microsoft Purview, and how to use Microsoft Purview to scan and manage your Hive Metastore database source.
Last updated 02/25/2022
-# Connect to and manage Hive Metastore databases in Azure Purview
+# Connect to and manage Hive Metastore databases in Microsoft Purview
-This article outlines how to register Hive Metastore databases, and how to authenticate and interact with Hive Metastore databases in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Hive Metastore databases, and how to authenticate and interact with Hive Metastore databases in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register Hive Metastore databases, and how to authe
The supported Hive versions are 2.x to 3.x. The supported platforms are Apache Hadoop, Cloudera, Hortonworks, and Azure Databricks (versions 8.0 and later).
-When scanning Hive metastore source, Azure Purview supports:
+When scanning a Hive metastore source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire Hive metastore database,
* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* You must have an active [Azure Purview account](create-catalog-portal.md).
+* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Azure Purview Studio. For more information about permissions, see [Access control in Azure Purview](catalog-permissions.md).
+* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Microsoft Purview Studio. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md).
When setting up scan, you can choose to scan an entire Hive metastore database,
## Register
-This section describes how to register a Hive Metastore database in Azure Purview by using [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register a Hive Metastore database in Microsoft Purview by using [Microsoft Purview Studio](https://web.purview.azure.com/).
The only supported authentication for a Hive Metastore database is Basic Authentication.
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. Select **Data Map** on the left pane.
The only supported authentication for a Hive Metastore database is Basic Authent
1. On the **Register sources (Hive Metastore)** screen, do the following:
- 1. For **Name**, enter a name that Azure Purview will list as the data source.
+ 1. For **Name**, enter a name that Microsoft Purview will list as the data source.
1. For **Hive Cluster URL**, enter a value that you get from the Ambari URL or the Azure Databricks workspace URL. For example, enter **hive.azurehdinsight.net** or **adb-19255636414785.5.azuredatabricks.net**.
The only supported authentication for a Hive Metastore database is Basic Authent
## Scan
-Use the following steps to scan Hive Metastore databases to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Azure Purview](concept-scans-and-ingestion.md).
+Use the following steps to scan Hive Metastore databases to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
1. In the Management Center, select integration runtimes. Make sure that a self-hosted integration runtime is set up. If it isn't set up, use the steps in [Create and manage a self-hosted integration runtime](./manage-integration-runtimes.md).
Use the following steps to scan Hive Metastore databases to automatically identi
* Provide the Metastore username in the appropriate box.
* Store the Metastore password in the secret key.
- For more information, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+ For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
**Azure Databricks usage**: Go to your Azure Databricks cluster, select **Apps**, and then select **Launch Web Terminal**. Run the command `cat /databricks/hive/conf/hive-site.xml`.
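The `hive-site.xml` printed by that command contains the metastore connection settings as `<property>` name/value pairs. A minimal sketch of reading them with Python's standard library (the XML content and host name below are hypothetical examples, not output from a real cluster):

```python
import xml.etree.ElementTree as ET

# Hypothetical hive-site.xml content; in practice this is the output of
# `cat /databricks/hive/conf/hive-site.xml` on the cluster.
HIVE_SITE = """\
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
</configuration>
"""

def metastore_settings(xml_text):
    """Map each <property> name to its value."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value") for p in root.findall("property")}

settings = metastore_settings(HIVE_SITE)
print(settings["javax.jdo.option.ConnectionURL"])
# → jdbc:mysql://metastore-host:3306/metastore
```

The `ConnectionURL` and `ConnectionUserName` values are what you supply when setting up the scan credential.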
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, use the following guides to learn more about Azure Purview and your data:
+Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Looker Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-looker-source.md
Title: Connect to and manage Looker
-description: This guide describes how to connect to Looker in Azure Purview, and use Azure Purview's features to scan and manage your Looker source.
+description: This guide describes how to connect to Looker in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Looker source.
Last updated 03/05/2022
-# Connect to and manage Looker in Azure Purview (Preview)
+# Connect to and manage Looker in Microsoft Purview (Preview)
-This article outlines how to register Looker, and how to authenticate and interact with Looker in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Looker, and how to authenticate and interact with Looker in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register Looker, and how to authenticate and intera
The supported Looker server version is 7.2.
-When scanning Looker source, Azure Purview supports:
+When scanning a Looker source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire Looker server, or scope t
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
If your data store is publicly accessible, you can use the managed Azure integration runtime for scanning without additional settings. Otherwise, if your data store limits access from an on-premises network, a private network, or specific IPs, you need to configure a self-hosted integration runtime to connect to it:
If your data store is publicly accessible, you can use the managed Azure integra
## Register
-This section describes how to register Looker in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Looker in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
An API3 key is required to connect to the Looker server. The API3 key consists i
To register a new Looker server in your data catalog, do the following:
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register.**
1. On Register sources, select **Looker**. Select **Continue.**
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-mongodb.md
Title: Connect to and manage MongoDB
-description: This guide describes how to connect to MongoDB in Azure Purview, and use Azure Purview's features to scan and manage your MongoDB source.
+description: This guide describes how to connect to MongoDB in Microsoft Purview, and use Microsoft Purview's features to scan and manage your MongoDB source.
Last updated 04/12/2022
-# Connect to and manage MongoDB in Azure Purview (Preview)
+# Connect to and manage MongoDB in Microsoft Purview (Preview)
-This article outlines how to register MongoDB, and how to authenticate and interact with MongoDB in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register MongoDB, and how to authenticate and interact with MongoDB in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register MongoDB, and how to authenticate and inter
The supported MongoDB versions are 2.6 to 5.1.
-When scanning MongoDB source, Azure Purview supports extracting technical metadata including:
+When scanning a MongoDB source, Microsoft Purview supports extracting technical metadata including:
- Server - Databases
When setting up scan, you can choose to scan one or more MongoDB database(s) ent
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimum supported Self-hosted Integration Runtime version is 5.16.8093.1.
When setting up scan, you can choose to scan one or more MongoDB database(s) ent
## Register
-This section describes how to register MongoDB in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register MongoDB in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new MongoDB source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On Register sources, select **MongoDB**. Select **Continue**.
To create and run a new scan, do the following:
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-mysql.md
Title: Connect to and manage MySQL
-description: This guide describes how to connect to MySQL in Azure Purview, and use Azure Purview's features to scan and manage your MySQL source.
+description: This guide describes how to connect to MySQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your MySQL source.
Last updated 03/05/2022
-# Connect to and manage MySQL in Azure Purview (Preview)
+# Connect to and manage MySQL in Microsoft Purview (Preview)
-This article outlines how to register MySQL, and how to authenticate and interact with MySQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register MySQL, and how to authenticate and interact with MySQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register MySQL, and how to authenticate and interac
The supported MySQL server versions are 5.7 to 8.x.
-When scanning MySQL source, Azure Purview supports:
+When scanning a MySQL source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire MySQL server, or scope th
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See [Microsoft Purview Permissions page](catalog-permissions.md) for details.
**If your data store is not publicly accessible** (for example, if it limits access from an on-premises network, a private network, or specific IPs), you need to configure a self-hosted integration runtime to connect to it:
The MySQL user must have the SELECT, SHOW VIEW and EXECUTE permissions for each
## Register
-This section describes how to register MySQL in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register MySQL in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new MySQL source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On Register sources, select **MySQL**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan On Premises Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-on-premises-sql-server.md
Title: Connect to and manage on-premises SQL server instances
-description: This guide describes how to connect to on-premises SQL server instances in Azure Purview, and use Azure Purview's features to scan and manage your on-premises SQL server source.
+description: This guide describes how to connect to on-premises SQL server instances in Microsoft Purview, and use Microsoft Purview's features to scan and manage your on-premises SQL server source.
Last updated 11/02/2021
-# Connect to and manage an on-premises SQL server instance in Azure Purview
+# Connect to and manage an on-premises SQL server instance in Microsoft Purview
-This article outlines how to register on-premises SQL server instances, and how to authenticate and interact with an on-premises SQL server instance in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register on-premises SQL server instances, and how to authenticate and interact with an on-premises SQL server instance in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register on-premises SQL server instances, and how
The supported SQL Server versions are 2005 and above. SQL Server Express LocalDB is not supported.
-When scanning on-premises SQL server, Azure Purview supports:
+When scanning an on-premises SQL server, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to specify the database name to scan one da
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).

## Register
-This section describes how to register an on-premises SQL server instance in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register an on-premises SQL server instance in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
A change to the Server Authentication will require a restart of the SQL Server I
If you would like to create a new login and user to be able to scan your SQL server, follow the steps below:
-The account must have access to the **master** database. This is because the `sys.databases` is in the master database. The Azure Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
+The account must have access to the **master** database. This is because `sys.databases` is in the master database. The Microsoft Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
> [!Note]
> All the steps below can be executed using the code provided [here](https://github.com/Azure/Purview-Samples/blob/master/TSQL-Code-Permissions/grant-access-to-on-prem-sql-databases.sql)
The account must have access to the **master** database. This is because the `sy
:::image type="content" source="media/register-scan-on-premises-sql-server/change-password.png" alt-text="change password.":::
-##### Storing your SQL login password in a key vault and creating a credential in Azure Purview
+##### Storing your SQL login password in a key vault and creating a credential in Microsoft Purview
1. Navigate to your key vault in the Azure portal
1. Select **Settings > Secrets**
1. Select **+ Generate/Import** and enter the **Name** and **Value** as the *password* from your SQL server login
1. Select **Create** to complete
-1. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+1. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
1. Finally, [create a new credential](manage-credentials.md#create-a-new-credential) using the **username** and **password** to set up your scan. Make sure the right authentication method is selected when creating a new credential. If SQL Authentication is applied, select "SQL authentication" as the authentication method. If Windows Authentication is applied, then select "Windows authentication".

### Steps to register
-1. Navigate to your Azure Purview account
+1. Navigate to your Microsoft Purview account
1. Under Sources and scanning in the left navigation, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it is not set up, follow the steps mentioned [here](manage-integration-runtimes.md) to create a self-hosted integration runtime for scanning on an on-premises or Azure VM that has access to your on-premises network.
Follow the steps below to scan on-premises SQL server instances to automatically
To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the SQL Server source that you registered.
To create and run a new scan, do the following:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Oracle Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-oracle-source.md
Title: Connect to and manage Oracle
-description: This guide describes how to connect to Oracle in Azure Purview, and use Azure Purview's features to scan and manage your Oracle source.
+description: This guide describes how to connect to Oracle in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Oracle source.
Last updated 03/28/2022
-# Connect to and manage Oracle in Azure Purview
+# Connect to and manage Oracle in Microsoft Purview
-This article outlines how to register Oracle, and how to authenticate and interact with Oracle in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Oracle, and how to authenticate and interact with Oracle in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register Oracle, and how to authenticate and intera
The supported Oracle server versions are 6i to 19c. Proxy server isn't supported when scanning Oracle source.
-When scanning Oracle source, Azure Purview supports:
+When scanning Oracle source, Microsoft Purview supports:
- Extracting technical metadata including:
Currently, the Oracle service name isn't captured in the metadata or hierarchy.
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
Currently, the Oracle service name isn't captured in the metadata or hierarchy.
## Register
-This section describes how to register Oracle in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Oracle in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Prerequisites for registration
The only supported authentication for an Oracle source is **Basic authentication
To register a new Oracle source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On Register sources, select **Oracle**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-postgresql.md
Title: Connect to and manage PostgreSQL
-description: This guide describes how to connect to PostgreSQL in Azure Purview, and use Azure Purview's features to scan and manage your PostgreSQL source.
+description: This guide describes how to connect to PostgreSQL in Microsoft Purview, and use Microsoft Purview's features to scan and manage your PostgreSQL source.
Last updated 03/05/2022
-# Connect to and manage PostgreSQL in Azure Purview (Preview)
+# Connect to and manage PostgreSQL in Microsoft Purview (Preview)
-This article outlines how to register PostgreSQL, and how to authenticate and interact with PostgreSQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register PostgreSQL, and how to authenticate and interact with PostgreSQL in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register PostgreSQL, and how to authenticate and in
The supported PostgreSQL server versions are 8.4 to 12.x.
-When scanning PostgreSQL source, Azure Purview supports:
+When scanning PostgreSQL source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan an entire PostgreSQL database, or s
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
**If your data store is not publicly accessible** (if your data store limits access from on-premises network, private network or specific IPs, etc.) you need to configure a self-hosted integration runtime to connect to it:
The PostgreSQL user must have read access to system tables in order to access ad
## Register
-This section describes how to register PostgreSQL in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register PostgreSQL in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new PostgreSQL source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On Register sources, select **PostgreSQL**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Power Bi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-power-bi-tenant.md
Title: Connect to and manage a Power BI tenant
-description: This guide describes how to connect to a Power BI tenant in Azure Purview, and use Azure Purview's features to scan and manage your Power BI tenant source.
+description: This guide describes how to connect to a Power BI tenant in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Power BI tenant source.
Last updated 04/08/2022
-# Connect to and manage a Power BI tenant in Azure Purview
+# Connect to and manage a Power BI tenant in Microsoft Purview
-This article outlines how to register a Power BI tenant, and how to authenticate and interact with the tenant in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register a Power BI tenant, and how to authenticate and interact with the tenant in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register a Power BI tenant, and how to authenticate
### Supported scenarios for Power BI scans
-|**Azure Purview public access allowed/denied** |**Power BI public access allowed /denied** | **Power BI tenant same/cross** | **Runtime option** |
+|**Microsoft Purview public access allowed/denied** |**Power BI public access allowed /denied** | **Power BI tenant same/cross** | **Runtime option** |
|---|---|---|---|
|Allowed |Allowed |Same tenant |[Azure Runtime & Managed Identity](#authenticate-to-power-bi-tenant-managed-identity-only) |
|Allowed |Allowed |Same tenant |[Self-hosted runtime & Delegated authentication](#scan-same-tenant-using-self-hosted-ir-and-delegated-authentication) |
This article outlines how to register a Power BI tenant, and how to authenticate
### Known limitations

-- If Azure Purview or Power BI tenant is protected behind a private endpoint, Self-hosted runtime is the only option to scan
+- If Microsoft Purview or Power BI tenant is protected behind a private endpoint, Self-hosted runtime is the only option to scan
- Delegated authentication is the only supported authentication option if self-hosted integration runtime is used during the scan
- For cross-tenant scenario, delegated authentication is only supported option for scanning.
-- You can create only one scan for a Power BI data source that is registered in your Azure Purview account
+- You can create only one scan for a Power BI data source that is registered in your Microsoft Purview account
- If Power BI dataset schema is not shown after scan, it is due to one of the current limitations with [Power BI Metadata scanner](/power-bi/admin/service-admin-metadata-scanning)

## Prerequisites

- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- An active [Azure Purview account](create-catalog-portal.md).
+- An active [Microsoft Purview account](create-catalog-portal.md).
-- You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+- You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
- If delegated auth is used:
  - Make sure proper [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to Power BI admin user that is used for the scan.
In Azure Active Directory Tenant, where Power BI tenant is located:
:::image type="content" source="./media/setup-power-bi-scan-PowerShell/security-group.png" alt-text="Screenshot of security group type.":::
-4. Add your Azure Purview managed identity to this security group. Select **Members**, then select **+ Add members**.
+4. Add your Microsoft Purview managed identity to this security group. Select **Members**, then select **+ Add members**.
:::image type="content" source="./media/setup-power-bi-scan-PowerShell/add-group-member.png" alt-text="Screenshot of how to add the catalog's managed instance to group.":::
-5. Search for your Azure Purview managed identity and select it.
+5. Search for your Microsoft Purview managed identity and select it.
:::image type="content" source="./media/setup-power-bi-scan-PowerShell/add-catalog-to-group-by-search.png" alt-text="Screenshot showing how to add catalog by searching for its name.":::
In Azure Active Directory Tenant, where Power BI tenant is located:
:::image type="content" source="./media/setup-power-bi-scan-PowerShell/allow-service-principals-power-bi-admin.png" alt-text="Image showing how to allow service principals to get read-only Power BI admin API permissions.":::
-5. Select **Admin API settings** > **Enhance admin APIs responses with detailed metadata** > Enable the toggle to allow Azure Purview Data Map automatically discover the detailed metadata of Power BI datasets as part of its scans.
+5. Select **Admin API settings** > **Enhance admin APIs responses with detailed metadata** > Enable the toggle to allow the Microsoft Purview Data Map to automatically discover the detailed metadata of Power BI datasets as part of its scans.
> [!IMPORTANT]
> After you update the Admin API settings on your Power BI tenant, wait around 15 minutes before registering a scan and test connection.
In Azure Active Directory Tenant, where Power BI tenant is located:
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-sub-artifacts.png" alt-text="Image showing the Power BI admin portal config to enable subartifact scan.":::

> [!Caution]
- > When you allow the security group you created (that has your Azure Purview managed identity as a member) to use read-only Power BI admin APIs, you also allow it to access the metadata (e.g. dashboard and report names, owners, descriptions, etc.) for all of your Power BI artifacts in this tenant. Once the metadata has been pulled into the Azure Purview, Azure Purview's permissions, not Power BI permissions, determine who can see that metadata.
+ > When you allow the security group you created (that has your Microsoft Purview managed identity as a member) to use read-only Power BI admin APIs, you also allow it to access the metadata (e.g. dashboard and report names, owners, descriptions, etc.) for all of your Power BI artifacts in this tenant. Once the metadata has been pulled into the Microsoft Purview, Microsoft Purview's permissions, not Power BI permissions, determine who can see that metadata.
> [!Note]
- > You can remove the security group from your developer settings, but the metadata previously extracted won't be removed from the Azure Purview account. You can delete it separately, if you wish.
+ > You can remove the security group from your developer settings, but the metadata previously extracted won't be removed from the Microsoft Purview account. You can delete it separately, if you wish.
### Register same Power BI tenant
-This section describes how to register a Power BI tenant in Azure Purview for same-tenant scenario.
+This section describes how to register a Power BI tenant in Microsoft Purview for same-tenant scenario.
1. Select the **Data Map** on the left navigation.
This section describes how to register a Power BI tenant in Azure Purview for sa
#### Scan same tenant using Azure IR and Managed Identity
-This is a suitable scenario, if both Azure Purview and Power BI tenant are configured to allow public access in the network settings.
+This is a suitable scenario, if both Microsoft Purview and Power BI tenant are configured to allow public access in the network settings.
To create and run a new scan, do the following:
-1. In the Azure Purview Studio, navigate to the **Data map** in the left menu.
+1. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu.
1. Navigate to **Sources**.
To create and run a new scan, do the following:
3. Select **Test Connection** before continuing to next steps. If **Test Connection** failed, select **View Report** to see the detailed status and troubleshoot the problem.
   1. Access - Failed status means the user authentication failed. Scans using managed identity will always pass because no user authentication required.
- 2. Assets (+ lineage) - Failed status means the Azure Purview - Power BI authorization has failed. Make sure the Azure Purview managed identity is added to the security group associated in Power BI admin portal.
+ 2. Assets (+ lineage) - Failed status means the Microsoft Purview - Power BI authorization has failed. Make sure the Microsoft Purview managed identity is added to the security group associated in Power BI admin portal.
   3. Detailed metadata (Enhanced) - Failed status means the Power BI admin portal is disabled for the following setting - **Enhance admin APIs responses with detailed metadata**

   :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-test-connection-status-report.png" alt-text="Screenshot of test connection status report page.":::

4. Set up a scan trigger. Your options are **Recurring**, and **Once**.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Azure Purview scan scheduler.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
5. On **Review new scan**, select **Save and run** to launch your scan.
To create and run a new scan, do the following:
#### Scan same tenant using Self-hosted IR and Delegated authentication
-This scenario can be used when Azure Purview and Power BI tenant or both, are configured to use private endpoint and deny public access. Additionally, this option is also applicable if Azure Purview and Power BI tenant are configured to allow public access.
+This scenario can be used when Microsoft Purview and Power BI tenant or both, are configured to use private endpoint and deny public access. Additionally, this option is also applicable if Microsoft Purview and Power BI tenant are configured to allow public access.
> [!IMPORTANT]
-> Additional configuration may be required for your Power BI tenant and Azure Purview account, if you are planning to scan Power BI tenant through private network where either Azure Purview account, Power BI tenant or both are configured with private endpoint with public access denied.
+> Additional configuration may be required for your Power BI tenant and Microsoft Purview account, if you are planning to scan Power BI tenant through private network where either Microsoft Purview account, Power BI tenant or both are configured with private endpoint with public access denied.
>
> For more information related to Power BI network, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links).
>
-> For more information about Azure Purview network settings, see [Use private endpoints for your Azure Purview account](catalog-private-link.md).
+> For more information about Microsoft Purview network settings, see [Use private endpoints for your Microsoft Purview account](catalog-private-link.md).
To create and run a new scan, do the following:
To create and run a new scan, do the following:
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot how to generate an Azure Key Vault secret.":::
-5. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+5. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
6. Create an App Registration in your Azure Active Directory tenant. Provide a web URL in the **Redirect URI**. Take note of the Client ID (App ID).
To create and run a new scan, do the following:
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-delegated-permissions.png" alt-text="Screenshot of delegated permissions for Power BI Service and Microsoft Graph.":::
-8. In the Azure Purview Studio, navigate to the **Data map** in the left menu.
+8. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu.
9. Navigate to **Sources**.
To create and run a new scan, do the following:
16. Select **Test Connection** before continuing to next steps. If **Test Connection** failed, select **View Report** to see the detailed status and troubleshoot the problem.
   1. Access - Failed status means the user authentication failed. Scans using managed identity will always pass because no user authentication required.
- 2. Assets (+ lineage) - Failed status means the Azure Purview - Power BI authorization has failed. Make sure the Azure Purview managed identity is added to the security group associated in Power BI admin portal.
+ 2. Assets (+ lineage) - Failed status means the Microsoft Purview - Power BI authorization has failed. Make sure the Microsoft Purview managed identity is added to the security group associated in Power BI admin portal.
   3. Detailed metadata (Enhanced) - Failed status means the Power BI admin portal is disabled for the following setting - **Enhance admin APIs responses with detailed metadata**

   :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-test-connection-status-report.png" alt-text="Screenshot of test connection status report page.":::

17. Set up a scan trigger. Your options are **Recurring**, and **Once**.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Azure Purview scan scheduler.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
18. On **Review new scan**, select **Save and run** to launch your scan.
To create and run a new scan, do the following:
1. Give your Power BI instance a friendly name. The name must be between 3-63 characters long and must contain only letters, numbers, underscores, and hyphens. Spaces aren't allowed.
-1. Edit the Tenant ID field to replace with cross Power BI tenant you want to register and scan. By default, Power BI tenant ID that exists in the same Azure Active Directory as Azure Purview will be populated.
+1. Edit the Tenant ID field to replace with cross Power BI tenant you want to register and scan. By default, Power BI tenant ID that exists in the same Azure Active Directory as Microsoft Purview will be populated.
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/register-cross-tenant.png" alt-text="Image showing the registration experience for cross tenant Power BI":::
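The friendly-name rule above (3-63 characters; letters, numbers, underscores, and hyphens; no spaces) can be checked with a short validation sketch; the function name is illustrative, not part of any Purview API:

```python
import re

# Friendly-name rule from the registration step: 3-63 characters,
# letters, numbers, underscores, and hyphens only; spaces are not allowed.
_FRIENDLY_NAME_RE = re.compile(r"^[A-Za-z0-9_-]{3,63}$")

def is_valid_friendly_name(name: str) -> bool:
    return _FRIENDLY_NAME_RE.fullmatch(name) is not None
```

For example, `powerbi-cross-tenant_01` passes, while a name containing a space or fewer than three characters does not.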
To create and run a new scan using Azure runtime, perform the following steps:
2. Assign proper Power BI license to the user.
-2. Navigate to your Azure key vault in the tenant where Azure Purview is created.
+2. Navigate to your Azure key vault in the tenant where Microsoft Purview is created.
3. Select **Settings** > **Secrets** and select **+ Generate/Import**.
To create and run a new scan using Azure runtime, perform the following steps:
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot how to generate an Azure Key Vault secret.":::
-6. If your key vault is not connected to Azure Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+6. If your key vault is not connected to Microsoft Purview yet, you will need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
7. Create an App Registration in your Azure Active Directory tenant where Power BI is located. Provide a web URL in the **Redirect URI**. Take note of the Client ID (App ID).
To create and run a new scan using Azure runtime, perform the following steps:
11. Under **Advanced settings**, enable **Allow Public client flows**.
-12. In the Azure Purview Studio, navigate to the **Data map** in the left menu. Navigate to **Sources**.
+12. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu. Navigate to **Sources**.
13. Select the registered Power BI source from cross tenant.
To create and run a new scan using Azure runtime, perform the following steps:
If **Test Connection** failed, select **View Report** to see the detailed status and troubleshoot the problem:
   1. Access - Failed status means the user authentication failed: Validate if the username and password are correct, and review whether the Credential contains the correct Client (App) ID from the App Registration.
- 2. Assets (+ lineage) - Failed status means the Azure Purview - Power BI authorization has failed. Make sure the user is added to Power BI Administrator role and has proper Power BI license assigned to.
+ 2. Assets (+ lineage) - Failed status means the Microsoft Purview - Power BI authorization has failed. Make sure the user is added to Power BI Administrator role and has proper Power BI license assigned to.
   3. Detailed metadata (Enhanced) - Failed status means the Power BI admin portal is disabled for the following setting - **Enhance admin APIs responses with detailed metadata**

20. Set up a scan trigger. Your options are **Recurring**, and **Once**.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Azure Purview scan scheduler.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
18. On **Review new scan**, select **Save and run** to launch your scan.
If delegated auth is used:
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-salesforce.md
Title: Connect to and manage Salesforce
-description: This guide describes how to connect to Salesforce in Azure Purview, and use Azure Purview's features to scan and manage your Salesforce source.
+description: This guide describes how to connect to Salesforce in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Salesforce source.
Last updated 03/05/2022
-# Connect to and manage Salesforce in Azure Purview (Preview)
+# Connect to and manage Salesforce in Microsoft Purview (Preview)
-This article outlines how to register Salesforce, and how to authenticate and interact with Salesforce in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Salesforce, and how to authenticate and interact with Salesforce in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register Salesforce, and how to authenticate and in
|---|---|---|---|---|---|---|
| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No|
-When scanning Salesforce source, Azure Purview supports extracting technical metadata including:
+When scanning Salesforce source, Microsoft Purview supports extracting technical metadata including:
- Organization - Objects including the fields, foreign keys, and unique_constraints
When setting up scan, you can choose to scan an entire Salesforce organization,
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
You can use the fully managed Azure integration runtime for scan - make sure to provide the security token to authenticate to Salesforce, learn more from the credential configuration in [Scan](#scan) section. Otherwise, if you want the scan to be initiated from a Salesforce trusted IP range for your organization, you can configure a self-hosted integration runtime to connect to it:
For Standard Objects, ensure that the "Documents" section has the Read permissio
## Register
-This section describes how to register Salesforce in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Salesforce in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new Salesforce source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**
1. On Register sources, select **Salesforce**. Select **Continue**.
On the **Register sources (Salesforce)** screen, do the following:
Follow the steps below to scan Salesforce to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-Azure Purview uses Salesforce REST API version 41.0 to extract metadata, including REST requests like 'Describe Global' URI (/v41.0/sobjects/),'sObject Basic Information' URI (/v41.0/sobjects/sObject/), and 'SOQL Query' URI (/v41.0/query?).
+Microsoft Purview uses Salesforce REST API version 41.0 to extract metadata, including REST requests like 'Describe Global' URI (/v41.0/sobjects/),'sObject Basic Information' URI (/v41.0/sobjects/sObject/), and 'SOQL Query' URI (/v41.0/query?).
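The endpoints listed above can be sketched as URL builders. This is a minimal sketch: the `/services/data` prefix and the instance URL follow the standard Salesforce REST layout and are assumptions, not values taken from this article:

```python
from urllib.parse import quote

# Endpoint shapes from the paragraph above (REST API version 41.0).
def describe_global_uri(instance: str) -> str:
    # 'Describe Global' URI (/v41.0/sobjects/)
    return f"{instance.rstrip('/')}/services/data/v41.0/sobjects/"

def sobject_basic_info_uri(instance: str, sobject: str) -> str:
    # 'sObject Basic Information' URI (/v41.0/sobjects/sObject/)
    return f"{instance.rstrip('/')}/services/data/v41.0/sobjects/{sobject}/"

def soql_query_uri(instance: str, soql: str) -> str:
    # 'SOQL Query' URI (/v41.0/query?)
    return f"{instance.rstrip('/')}/services/data/v41.0/query?q={quote(soql)}"
```

For example, `describe_global_uri("https://example.my.salesforce.com")` yields the Describe Global endpoint for a hypothetical instance.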
### Authentication for a scan
To create and run a new scan, do the following:
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Sap Bw https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sap-bw.md
Title: Connect to and manage an SAP Business Warehouse
-description: This guide describes how to connect to SAP Business Warehouse in Azure Purview, and use Azure Purview's features to scan and manage your SAP BW source.
+description: This guide describes how to connect to SAP Business Warehouse in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP BW source.
Last updated 03/05/2022
-# Connect to and manage SAP Business Warehouse in Azure Purview (Preview)
+# Connect to and manage SAP Business Warehouse in Microsoft Purview (Preview)
-This article outlines how to register SAP Business Warehouse (BW), and how to authenticate and interact with SAP BW in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register SAP Business Warehouse (BW), and how to authenticate and interact with SAP BW in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register SAP Business Warehouse (BW), and how to au
The supported SAP BW versions are 7.3 to 7.5. SAP BW/4HANA isn't supported.
-When scanning SAP BW source, Azure Purview supports extracting technical metadata including:
+When scanning SAP BW source, Microsoft Purview supports extracting technical metadata including:
- Instance
- InfoArea
When scanning SAP BW source, Azure Purview supports extracting technical metadat
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview resource](create-catalog-portal.md).
+* An active [Microsoft Purview resource](create-catalog-portal.md).
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Azure Purview Studio. For more information about permissions, see [Access control in Azure Purview](catalog-permissions.md).
+* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Microsoft Purview Studio. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.15.8079.1.
When scanning SAP BW source, Azure Purview supports extracting technical metadat
## Register
-This section describes how to register SAP BW in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register SAP BW in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
The only supported authentication for SAP BW source is **Basic authentication**.
### Steps to register
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. In **Register sources**, select **SAP BW** > **Continue**.
Follow the steps below to scan SAP BW to automatically identify assets and class
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
- [Search Data Catalog](how-to-search-catalog.md)
-- [Data insights in Azure Purview](concept-insights.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
- [Supported data sources and file types](azure-purview-connector-overview.md)
purview Register Scan Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sap-hana.md
Title: Connect to and manage SAP HANA
-description: This guide describes how to connect to SAP HANA in Azure Purview, and how to use Azure Purview to scan and manage your SAP HANA source.
+description: This guide describes how to connect to SAP HANA in Microsoft Purview, and how to use Microsoft Purview to scan and manage your SAP HANA source.
Last updated 01/11/2022
-# Connect to and manage SAP HANA in Azure Purview (Preview)
+# Connect to and manage SAP HANA in Microsoft Purview (Preview)
-This article outlines how to register SAP HANA, and how to authenticate and interact with SAP HANA in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register SAP HANA, and how to authenticate and interact with SAP HANA in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register SAP HANA, and how to authenticate and inte
||||||||
| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| No |
-When scanning SAP HANA source, Azure Purview supports extracting technical metadata including:
+When scanning SAP HANA source, Microsoft Purview supports extracting technical metadata including:
- Server
- Databases
When setting up scan, you can choose to scan an entire SAP HANA database, or sco
* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* You must have an active [Azure Purview account](create-catalog-portal.md).
+* You must have an active [Microsoft Purview account](create-catalog-portal.md).
-* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Azure Purview Studio. For more information about permissions, see [Access control in Azure Purview](catalog-permissions.md).
+* You need Data Source Administrator and Data Reader permissions to register a source and manage it in Microsoft Purview Studio. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md). The minimal supported Self-hosted Integration Runtime version is 5.13.8013.1.
When setting up scan, you can choose to scan an entire SAP HANA database, or sco
### Required permissions for scan
-Azure Purview supports basic authentication (username and password) for scanning SAP HANA.
+Microsoft Purview supports basic authentication (username and password) for scanning SAP HANA.
The SAP HANA user you specified must have the permission to select metadata of the schemas you want to import.
GRANT SELECT ON SCHEMA _SYS_BIC TO <user>;
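For example, a minimal sketch of setting up a dedicated read-only scan user (the user name, password, and the `MY_SCHEMA` schema are hypothetical placeholders; only the `_SYS_BIC` grant comes from the article itself):

```sql
-- Hypothetical example: a dedicated read-only user for the scan.
CREATE USER purview_scan PASSWORD "<StrongPassword1>" NO FORCE_FIRST_PASSWORD_CHANGE;

-- Allow the user to read catalog metadata.
GRANT CATALOG READ TO purview_scan;

-- Grant read access on each schema you want to import, for example:
GRANT SELECT ON SCHEMA _SYS_BIC TO purview_scan;
GRANT SELECT ON SCHEMA MY_SCHEMA TO purview_scan;
```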
## Register
+This section describes how to register SAP HANA in Microsoft Purview by using [Microsoft Purview Studio](https://web.purview.azure.com/).
+This section describes how to register a SAP HANA in Microsoft Purview by using [Microsoft Purview Studio](https://web.purview.azure.com/).
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. Select **Data Map** on the left pane.
This section describes how to register a SAP HANA in Azure Purview by using [Azu
1. On the **Register sources (SAP HANA)** screen, do the following:
- 1. For **Name**, enter a name that Azure Purview will list as the data source.
+ 1. For **Name**, enter a name that Microsoft Purview will list as the data source.
1. For **Server**, enter the host name or IP address used to connect to a SAP HANA source. For example, `MyDatabaseServer.com` or `192.169.1.2`.
This section describes how to register a SAP HANA in Azure Purview by using [Azu
## Scan
-Use the following steps to scan SAP HANA databases to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Azure Purview](concept-scans-and-ingestion.md).
+Use the following steps to scan SAP HANA databases to automatically identify assets and classify your data. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).
### Authentication for a scan
The supported authentication type for a SAP HANA source is **Basic authenticatio
* Provide the user name used to connect to the database server in the User name input field.
* Store the user password used to connect to the database server in the secret key.
- For more information, see [Credentials for source authentication in Azure Purview](manage-credentials.md).
+ For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
1. **Database**: Specify the name of the database instance to import.
The supported authentication type for a SAP HANA source is **Basic authenticatio
## Next steps
-Now that you've registered your source, use the following guides to learn more about Azure Purview and your data:
+Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search the data catalog](how-to-search-catalog.md)
purview Register Scan Sapecc Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-sapecc-source.md
Title: Connect to and manage an SAP ECC source
-description: This guide describes how to connect to SAP ECC in Azure Purview, and use Azure Purview's features to scan and manage your SAP ECC source.
+description: This guide describes how to connect to SAP ECC in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP ECC source.
Last updated 01/20/2022
-# Connect to and manage SAP ECC in Azure Purview
+# Connect to and manage SAP ECC in Microsoft Purview
-This article outlines how to register SAP ECC, and how to authenticate and interact with SAP ECC in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register SAP ECC, and how to authenticate and interact with SAP ECC in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register SAP ECC, and how to authenticate and inter
\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-When scanning SAP ECC source, Azure Purview supports:
+When scanning SAP ECC source, Microsoft Purview supports:
- Extracting technical metadata including:
When scanning SAP ECC source, Azure Purview supports:
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
When scanning SAP ECC source, Azure Purview supports:
## Register
-This section describes how to register SAP ECC in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register SAP ECC in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
The only supported authentication for SAP ECC source is **Basic authentication**
### Steps to register
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On **Register sources**, select **SAP ECC**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Saps4hana Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-saps4hana-source.md
Title: Connect to and manage an SAP S/4HANA source
-description: This guide describes how to connect to SAP S/4HANA in Azure Purview, and use Azure Purview's features to scan and manage your SAP S/4HANA source.
+description: This guide describes how to connect to SAP S/4HANA in Microsoft Purview, and use Microsoft Purview's features to scan and manage your SAP S/4HANA source.
Last updated 01/20/2022
-# Connect to and manage SAP S/4HANA in Azure Purview
+# Connect to and manage SAP S/4HANA in Microsoft Purview
-This article outlines how to register SAP S/4HANA, and how to authenticate and interact with SAP S/4HANA in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register SAP S/4HANA, and how to authenticate and interact with SAP S/4HANA in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
This article outlines how to register SAP S/4HANA, and how to authenticate and i
\* *Besides the lineage on assets within the data source, lineage is also supported if dataset is used as a source/sink in [Data Factory](how-to-link-azure-data-factory.md) or [Synapse pipeline](how-to-lineage-azure-synapse-analytics.md).*
-When scanning SAP S/4HANA source, Azure Purview supports:
+When scanning SAP S/4HANA source, Microsoft Purview supports:
- Extracting technical metadata including:
When scanning SAP S/4HANA source, Azure Purview supports:
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
When scanning SAP S/4HANA source, Azure Purview supports:
## Register
-This section describes how to register SAP S/4HANA in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register SAP S/4HANA in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
The only supported authentication for SAP S/4HANA source is **Basic authenticati
### Steps to register
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On **Register sources**, select **SAP S/4HANA**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Snowflake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-snowflake.md
Title: Connect to and manage Snowflake
-description: This guide describes how to connect to Snowflake in Azure Purview, and use Azure Purview's features to scan and manage your Snowflake source.
+description: This guide describes how to connect to Snowflake in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Snowflake source.
Last updated 03/05/2022
-# Connect to and manage Snowflake in Azure Purview (Preview)
+# Connect to and manage Snowflake in Microsoft Purview (Preview)
-This article outlines how to register Snowflake, and how to authenticate and interact with Snowflake in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Snowflake, and how to authenticate and interact with Snowflake in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
This article outlines how to register Snowflake, and how to authenticate and int
||||||||
| [Yes](#register)| [Yes](#scan)| No | [Yes](#scan) | No | No| [Yes](#lineage) |
-When scanning Snowflake source, Azure Purview supports:
+When scanning Snowflake source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up scan, you can choose to scan one or more Snowflake database(s) e
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
**If your data store is not publicly accessible** (if your data store limits access from an on-premises network, a private network, or specific IPs, etc.), you need to configure a self-hosted integration runtime to connect to it:
When setting up scan, you can choose to scan one or more Snowflake database(s) e
### Required permissions for scan
-Azure Purview supports basic authentication (username and password) for scanning Snowflake. The default role of the given user will be used to perform the scan. The Snowflake user must have usage rights on a warehouse and the database(s) to be scanned, and read access to system tables in order to access advanced metadata.
+Microsoft Purview supports basic authentication (username and password) for scanning Snowflake. The default role of the given user will be used to perform the scan. The Snowflake user must have usage rights on a warehouse and the database(s) to be scanned, and read access to system tables in order to access advanced metadata.
-Here's a sample walkthrough to create a user specifically for Azure Purview scan and set up the permissions. If you choose to use an existing user, make sure it has adequate rights to the warehouse and database objects.
+Here's a sample walkthrough to create a user specifically for Microsoft Purview scan and set up the permissions. If you choose to use an existing user, make sure it has adequate rights to the warehouse and database objects.
1. Set up a `purview_reader` role. You need _ACCOUNTADMIN_ rights to do this.

```sql
USE ROLE ACCOUNTADMIN;
- --create role to allow read only access - this will later be assigned to the Azure Purview user
+ --create role to allow read only access - this will later be assigned to the Microsoft Purview user
CREATE OR REPLACE ROLE purview_reader;
--make sysadmin the parent role
GRANT ROLE purview_reader TO ROLE sysadmin;
```
-2. Create a warehouse for Azure Purview to use and grant rights.
+2. Create a warehouse for Microsoft Purview to use and grant rights.
```sql
--create warehouse - account admin required
Here's a sample walkthrough to create a user specifically for Azure Purview scan
GRANT USAGE ON WAREHOUSE purview_wh TO ROLE purview_reader;
```
-3. Create a user `purview` for Azure Purview scan.
+3. Create a user `purview` for Microsoft Purview scan.
```sql
CREATE OR REPLACE USER purview
Here's a sample walkthrough to create a user specifically for Azure Purview scan
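The user creation step is truncated in this excerpt. As a hedged sketch of how the setup typically finishes, the new `purview` user is attached to the walkthrough's `purview_reader` role and `purview_wh` warehouse as defaults (statements assumed, not quoted from the article):

```sql
-- Sketch: attach the reader role to the scan user and make it the default,
-- so the scan runs with the expected role and warehouse.
GRANT ROLE purview_reader TO USER purview;
ALTER USER purview SET DEFAULT_ROLE = purview_reader;
ALTER USER purview SET DEFAULT_WAREHOUSE = purview_wh;
```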
## Register
-This section describes how to register Snowflake in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Snowflake in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register

To register a new Snowflake source in your data catalog, do the following:
-1. Navigate to your Azure Purview account in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your Microsoft Purview account in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select **Data Map** on the left navigation.
1. Select **Register**.
1. On **Register sources**, select **Snowflake**. Select **Continue**.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
- Check your account identifier in the source registration step. Don't include the `https://` part at the front.
- Make sure the warehouse name and database name are in capital case on the scan setup page.
- Check your key vault. Make sure there are no typos in the password.
-- Check the credential you set up in Azure Purview. The user you specify must have a default role with the necessary access rights to both the warehouse and the database you're trying to scan. See [Required permissions for scan](#required-permissions-for-scan). USE `DESCRIBE USER;` to verify the default role of the user you've specified for Azure Purview.
+- Check the credential you set up in Microsoft Purview. The user you specify must have a default role with the necessary access rights to both the warehouse and the database you're trying to scan. See [Required permissions for scan](#required-permissions-for-scan). Use `DESCRIBE USER <username>;` to verify the default role of the user you've specified for Microsoft Purview.
- Use Query History in Snowflake to see if any activity is coming across.
  - If there's a problem with the account identifier or password, you won't see any activity.
  - If there's a problem with the default role, you should at least see a `USE WAREHOUSE . . .` statement.
- - You can use the [QUERY_HISTORY_BY_USER table function](https://docs.snowflake.com/en/sql-reference/functions/query_history.html) to identify what role is being used by the connection. Setting up a dedicated Azure Purview user will make troubleshooting easier.
+ - You can use the [QUERY_HISTORY_BY_USER table function](https://docs.snowflake.com/en/sql-reference/functions/query_history.html) to identify what role is being used by the connection. Setting up a dedicated Microsoft Purview user will make troubleshooting easier.
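As a sketch of such a troubleshooting query, assuming the dedicated scan user is named `PURVIEW` (a placeholder):

```sql
-- Inspect recent queries issued by the scan user to see which role and
-- warehouse the connection actually used, and any errors it hit.
SELECT query_text, role_name, warehouse_name, error_message
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY_BY_USER(USER_NAME => 'PURVIEW'))
ORDER BY start_time DESC;
```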
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)
-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Synapse Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-synapse-workspace.md
Title: Connect to and manage Azure Synapse Analytics workspaces
-description: This guide describes how to connect to Azure Synapse Analytics workspaces in Azure Purview, and use Azure Purview's features to scan and manage your Azure Synapse Analytics workspace source.
+description: This guide describes how to connect to Azure Synapse Analytics workspaces in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Azure Synapse Analytics workspace source.
Last updated 03/14/2022
-# Connect to and manage Azure Synapse Analytics workspaces in Azure Purview
+# Connect to and manage Azure Synapse Analytics workspaces in Microsoft Purview
-This article outlines how to register Azure Synapse Analytics workspaces and how to authenticate and interact with Azure Synapse Analytics workspaces in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Azure Synapse Analytics workspaces and how to authenticate and interact with Azure Synapse Analytics workspaces in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
Required. Add any relevant/source-specific prerequisites for connecting with thi
* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You will need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
## Register
-This section describes how to register Azure Synapse Analytics workspaces in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Azure Synapse Analytics workspaces in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Authentication for registration
-Only a user with at least a *Reader* role on the Azure Synapse workspace and who is also *data source administrators* in Azure Purview can register an Azure Synapse workspace.
+Only a user with at least a *Reader* role on the Azure Synapse workspace who is also a *data source administrator* in Microsoft Purview can register an Azure Synapse workspace.
### Steps to register
-1. Go to your Azure Purview account.
+1. Go to your Microsoft Purview account.
1. On the left pane, select **Sources**.
1. Select **Register**.
1. Under **Register sources**, select **Azure Synapse Analytics (multiple)**.
1. Select **Continue**.
- :::image type="content" source="media/register-scan-synapse-workspace/register-synapse-source.png" alt-text="Screenshot of a selection of sources in Azure Purview, including Azure Synapse Analytics.":::
+ :::image type="content" source="media/register-scan-synapse-workspace/register-synapse-source.png" alt-text="Screenshot of a selection of sources in Microsoft Purview, including Azure Synapse Analytics.":::
1. On the **Register sources (Azure Synapse Analytics)** page, do the following:
Only a user with at least a *Reader* role on the Azure Synapse workspace and who
Follow the steps below to scan Azure Synapse Analytics workspaces to automatically identify assets and classify your data. For more information about scanning in general, see our [introduction to scans and ingestion](concept-scans-and-ingestion.md).
-You will first need to set up authentication for enumerating for either your [dedicated](#authentication-for-enumerating-dedicated-sql-database-resources) or [serverless](#authentication-for-enumerating-serverless-sql-database-resources) resources. This will allow Azure Purview to enumerate your workspace assets and perform scans.
+You will first need to set up authentication for enumerating for either your [dedicated](#authentication-for-enumerating-dedicated-sql-database-resources) or [serverless](#authentication-for-enumerating-serverless-sql-database-resources) resources. This will allow Microsoft Purview to enumerate your workspace assets and perform scans.
Then, you will need to [apply permissions to scan the contents of the workspace](#apply-permissions-to-scan-the-contents-of-the-workspace).
Then, you will need to [apply permissions to scan the contents of the workspace]
> You must be an *owner* or *user access administrator* to add a role on the resource.

1. Select the **Add** button.
-1. Set the **Reader** role and enter your Azure Purview account name, which represents its managed service identity (MSI).
+1. Set the **Reader** role and enter your Microsoft Purview account name, which represents its managed service identity (MSI).
1. Select **Save** to finish assigning the role.

> [!NOTE]
-> If you're planning to register and scan multiple Azure Synapse workspaces in your Azure Purview account, you can also assign the role from a higher level, such as a resource group or a subscription.
+> If you're planning to register and scan multiple Azure Synapse workspaces in your Microsoft Purview account, you can also assign the role from a higher level, such as a resource group or a subscription.
### Authentication for enumerating serverless SQL database resources
-There are three places you will need to set authentication to allow Azure Purview to enumerate your serverless SQL database resources: The Azure Synapse workspace, the associated storage, and the Azure Synapse serverless databases. The steps below will set permissions for all three.
+There are three places where you will need to set up authentication so that Microsoft Purview can enumerate your serverless SQL database resources: the Azure Synapse workspace, the associated storage, and the Azure Synapse serverless databases. The steps below set permissions for all three.
#### Azure Synapse workspace
> You must be an *owner* or *user access administrator* to add a role on the resource. 1. Select the **Add** button.
-1. Set the **Reader** role and enter your Azure Purview account name, which represents its managed service identity (MSI).
+1. Set the **Reader** role and enter your Microsoft Purview account name, which represents its managed service identity (MSI).
1. Select **Save** to finish assigning the role. #### Storage account
> [!NOTE] > You must be an *owner* or *user access administrator* to add a role in the **Resource group** or **Subscription** fields. 1. Select the **Add** button.
-1. Set the **Storage blob data reader** role and enter your Azure Purview account name (which represents its MSI) in the **Select** box.
+1. Set the **Storage blob data reader** role and enter your Microsoft Purview account name (which represents its MSI) in the **Select** box.
1. Select **Save** to finish assigning the role. #### Azure Synapse serverless database
1. Go to your Azure Synapse workspace and open the Synapse Studio. 1. Select the **Data** tab on the left menu. 1. Select the ellipsis (**...**) next to one of your databases, and then start a new SQL script.
-1. Add the Azure Purview account MSI (represented by the account name) on the serverless SQL databases. You do so by running the following command in your SQL script:
+1. Add the Microsoft Purview account MSI (represented by the account name) on the serverless SQL databases. You do so by running the following command in your SQL script:
```sql
CREATE LOGIN [PurviewAccountName] FROM EXTERNAL PROVIDER;
```
You can set up authentication for an Azure Synapse source in either of two ways:
> [!NOTE] > To run the commands in the following procedure, you must be an *Azure Synapse administrator* on the workspace. For more information about Azure Synapse Analytics permissions, see: [Set up access control for your Azure Synapse workspace](../synapse-analytics/security/how-to-set-up-access-control.md).
-1. Add the Azure Purview account MSI (represented by the account name) as **db_datareader** on the dedicated SQL database. You do so by running the following command in your SQL script:
+1. Add the Microsoft Purview account MSI (represented by the account name) as **db_datareader** on the dedicated SQL database. You do so by running the following command in your SQL script:
```sql
CREATE USER [PurviewAccountName] FROM EXTERNAL PROVIDER
```
1. Go to your Azure Synapse workspace. 1. Go to the **Data** section, and select one of your SQL databases. 1. Select the ellipsis (**...**) next to your database, and then start a new SQL script.
-1. Add the Azure Purview account MSI (represented by the account name) as **db_datareader** on the serverless SQL databases. You do so by running the following command in your SQL script:
+1. Add the Microsoft Purview account MSI (represented by the account name) as **db_datareader** on the serverless SQL databases. You do so by running the following command in your SQL script:
```sql
CREATE USER [PurviewAccountName] FOR LOGIN [PurviewAccountName];
ALTER ROLE db_datareader ADD MEMBER [PurviewAccountName];
```
#### Grant permission to use credentials for external tables
-If the Azure Synapse workspace has any external tables, the Azure Purview managed identity must be given References permission on the external table scoped credentials. With the References permission, Azure Purview can read data from external tables.
+If the Azure Synapse workspace has any external tables, the Microsoft Purview managed identity must be given References permission on the external table scoped credentials. With the References permission, Microsoft Purview can read data from external tables.
```sql
GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[scoped_credential] TO [PurviewAccountName];
```
#### Use a service principal for dedicated SQL databases > [!NOTE]
-> You must first set up a new *credential* of type *Service Principal* by following the instructions in [Credentials for source authentication in Azure Purview](manage-credentials.md).
+> You must first set up a new *credential* of type *Service Principal* by following the instructions in [Credentials for source authentication in Microsoft Purview](manage-credentials.md).
1. Go to your **Azure Synapse workspace**. 1. Go to the **Data** section, and then look for one of your dedicated SQL databases.
1. Select **Save**. > [!IMPORTANT]
-> Currently, we do not support setting up scans for an Azure Synapse workspace from Azure Purview Studio, if you cannot enable **Allow Azure services and resources to access this workspace** on your Azure Synapse workspaces. In this case:
-> - You can use [Azure Purview Rest API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
+> Currently, we do not support setting up scans for an Azure Synapse workspace from Microsoft Purview Studio if you cannot enable **Allow Azure services and resources to access this workspace** on your Azure Synapse workspaces. In this case:
+> - You can use [Microsoft Purview Rest API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
> - You must use **SQL Auth** as the authentication mechanism. ### Create and run scan To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the data source that you registered.
## Next steps
-Now that you have registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you have registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Register Scan Teradata Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-teradata-source.md
Title: Connect to and manage Teradata
-description: This guide describes how to connect to Teradata in Azure Purview, and use Azure Purview's features to scan and manage your Teradata source.
+description: This guide describes how to connect to Teradata in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Teradata source.
Last updated 03/14/2022
-# Connect to and manage Teradata in Azure Purview
+# Connect to and manage Teradata in Microsoft Purview
-This article outlines how to register Teradata, and how to authenticate and interact with Teradata in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
+This article outlines how to register Teradata, and how to authenticate and interact with Teradata in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
## Supported capabilities
The supported Teradata database versions are 12.x to 17.x.
-When scanning Teradata source, Azure Purview supports:
+When scanning a Teradata source, Microsoft Purview supports:
- Extracting technical metadata including:
When setting up a scan, you can choose to scan an entire Teradata server, or scope
### Required permissions for scan
-Azure Purview supports basic authentication (username and password) for scanning Teradata. The Teradata user must have read access to system tables in order to access advanced metadata.
+Microsoft Purview supports basic authentication (username and password) for scanning Teradata. The Teradata user must have read access to system tables in order to access advanced metadata.
-To retrieve data types of view columns, Azure Purview issues a prepare statement for `select * from <view>` for each of the view queries and parse the metadata that contains the data type details for better performance. It requires the SELECT data permission on views. If the permission is missing, view column data types will be skipped.
+To retrieve the data types of view columns, Microsoft Purview issues a prepare statement for `select * from <view>` for each view query and parses the metadata that contains the data type details, for better performance. This requires the SELECT data permission on the views. If the permission is missing, view column data types will be skipped.
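The metadata-only technique described above can be sketched with Python's DB-API: executing the view query and reading `cursor.description` yields the column metadata without fetching any rows. The sketch below uses the standard-library `sqlite3` module purely as a stand-in for a Teradata connection (sqlite3 populates only the column names; a real driver also exposes type codes), and the table and view names are hypothetical.

```python
import sqlite3

# Stand-in database; a real scan would prepare the statement against Teradata.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, placed_at TEXT)")
conn.execute("CREATE VIEW v_orders AS SELECT id, amount FROM orders")

# Execute the view query but fetch no rows: the cursor's description
# already carries the per-column metadata the scanner needs.
cur = conn.execute("SELECT * FROM v_orders")
columns = [d[0] for d in cur.description]
print(columns)  # ['id', 'amount']
```

Because no rows are fetched, the cost of the query is limited to preparing it, which is why the scanner takes this route for performance.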
## Prerequisites * An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An active [Azure Purview account](create-catalog-portal.md).
+* An active [Microsoft Purview account](create-catalog-portal.md).
-* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Azure Purview Studio. See our [Azure Purview Permissions page](catalog-permissions.md) for details.
+* You'll need to be a Data Source Administrator and Data Reader to register a source and manage it in the Microsoft Purview Studio. See our [Microsoft Purview Permissions page](catalog-permissions.md) for details.
* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [the create and configure a self-hosted integration runtime guide](manage-integration-runtimes.md).
## Register
-This section describes how to register Teradata in Azure Purview using the [Azure Purview Studio](https://web.purview.azure.com/).
+This section describes how to register Teradata in Microsoft Purview using the [Microsoft Purview Studio](https://web.purview.azure.com/).
### Steps to register
-1. Navigate to your Azure Purview account.
+1. Navigate to your Microsoft Purview account.
1. Select **Data Map** on the left navigation. 1. Select **Register** 1. On Register sources, select **Teradata**. Select **Continue**
Follow the steps below to scan Teradata to automatically identify assets and cla
1. In the Management Center, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it isn't set up, use the steps mentioned [here](./manage-integration-runtimes.md) to set up a self-hosted integration runtime
-1. Select the **Data Map** tab on the left pane in the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Select the **Data Map** tab on the left pane in the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the registered Teradata source.
Go to the asset -> lineage tab, you can see the asset relationship when applicab
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Azure Purview and your data.
+Now that you've registered your source, follow the guides below to learn more about Microsoft Purview and your data.
-- [Data insights in Azure Purview](concept-insights.md)-- [Lineage in Azure Purview](catalog-lineage-user-guide.md)
+- [Data insights in Microsoft Purview](concept-insights.md)
+- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search Data Catalog](how-to-search-catalog.md)
purview Sensitivity Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/sensitivity-insights.md
Title: Sensitivity label reporting on your data in Azure Purview using Azure Purview Insights
-description: This how-to guide describes how to view and use Azure Purview Sensitivity label reporting on your data.
+ Title: Sensitivity label reporting on your data in Microsoft Purview using Microsoft Purview Insights
+description: This how-to guide describes how to view and use Microsoft Purview Sensitivity label reporting on your data.
Last updated 09/27/2021
-# Customer intent: As a security officer, I need to understand how to use Azure Purview Insights to learn about sensitive data identified and classified and labeled during scanning.
+# Customer intent: As a security officer, I need to understand how to use Microsoft Purview Insights to learn about sensitive data identified and classified and labeled during scanning.
-# Sensitivity label insights about your data in Azure Purview
+# Sensitivity label insights about your data in Microsoft Purview
This how-to guide describes how to access, view, and filter security insights provided by sensitivity labels applied to your data. > [!IMPORTANT]
-> Azure Purview Sensitivity Label Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> Microsoft Purview Sensitivity Label Insights are currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
Supported data sources include: Azure Blob Storage, Azure Data Lake Storage (ADLS) GEN 1, Azure Data Lake Storage (ADLS) GEN 2, SQL Server, Azure SQL Database, Azure SQL Managed Instance, Amazon S3 buckets, Amazon RDS databases (public preview), Power BI. In this how-to guide, you'll learn how to: > [!div class="checklist"]
-> - Launch your Azure Purview account from Azure.
+> - Launch your Microsoft Purview account from Azure.
> - View sensitivity labeling insights on your data > - Drill down for more sensitivity labeling details on your data ## Prerequisites
-Before getting started with Azure Purview insights, make sure that you've completed the following steps:
+Before getting started with Microsoft Purview insights, make sure that you've completed the following steps:
- Set up your Azure resources and populated the relevant accounts with test data -- [Extended Microsoft 365 sensitivity labels to assets in Azure Purview](create-sensitivity-label.md), and created or selected the labels you want to apply to your data.
+- [Extended Microsoft 365 sensitivity labels to assets in Microsoft Purview](create-sensitivity-label.md), and created or selected the labels you want to apply to your data.
-- Set up and completed a scan on the test data in each data source. For more information, see [Manage data sources in Azure Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
+- Set up and completed a scan on the test data in each data source. For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Create a scan rule set](create-a-scan-rule-set.md).
-- Signed in to Azure Purview with account with a [Data Reader or Data Curator role](catalog-permissions.md#roles).
+- Signed in to Microsoft Purview with an account that has a [Data Reader or Data Curator role](catalog-permissions.md#roles).
-For more information, see [Manage data sources in Azure Purview](manage-data-sources.md) and [Automatically label your data in Azure Purview](create-sensitivity-label.md).
+For more information, see [Manage data sources in Microsoft Purview](manage-data-sources.md) and [Automatically label your data in Microsoft Purview](create-sensitivity-label.md).
-## Use Azure Purview Sensitivity labeling insights
+## Use Microsoft Purview Sensitivity labeling insights
-In Azure Purview, classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
+In Microsoft Purview, classifications are similar to subject tags, and are used to mark and identify data of a specific type that's found within your data estate during scanning.
Sensitivity labels enable you to state how sensitive certain data is in your organization. For example, a specific project name might be highly confidential within your organization, while that same term is not confidential to other organizations.
Classifications are matched directly, such as a social security number, which ha
In contrast, sensitivity labels are applied when one or more classifications and conditions are found together. In this context, [conditions](/microsoft-365/compliance/apply-sensitivity-label-automatically) refer to all the parameters that you can define for unstructured data, such as **proximity to another classification**, and **% confidence**.
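As an illustration only (not Purview's implementation), a label rule of this shape can be modeled as a classification match combined with a confidence condition; the classification names, confidence values, and `apply_label` helper below are all hypothetical.

```python
def apply_label(matches, required_classification, min_confidence):
    """Toy condition model: label when the required classification
    was found with at least the required confidence."""
    return any(
        m["classification"] == required_classification
        and m["confidence"] >= min_confidence
        for m in matches
    )

# Hypothetical scan results for one column.
matches = [
    {"classification": "U.S. Social Security Number", "confidence": 0.85},
    {"classification": "Person Name", "confidence": 0.40},
]
print(apply_label(matches, "U.S. Social Security Number", 0.75))  # True
```

A real rule can combine several such conditions (for example, proximity of one classification to another) before the label is applied.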
-Azure Purview uses the same classifications, also known as [sensitive information types](/microsoft-365/compliance/sensitive-information-type-entity-definitions), as Microsoft 365. This enables you to extend your existing sensitivity labels across your Azure Purview assets.
+Microsoft Purview uses the same classifications, also known as [sensitive information types](/microsoft-365/compliance/sensitive-information-type-entity-definitions), as Microsoft 365. This enables you to extend your existing sensitivity labels across your Microsoft Purview assets.
> [!NOTE] > After you have scanned your source types, give **Sensitivity labeling** Insights a couple of hours to reflect the new assets. **To view sensitivity labeling insights:**
-1. Go to the **Azure Purview** home page.
+1. Go to the **Microsoft Purview** home page.
-1. On the **Overview** page, in the **Get Started** section, select the **Launch Azure Purview account** tile.
+1. On the **Overview** page, in the **Get Started** section, select the **Launch Microsoft Purview account** tile.
-1. In Azure Purview, select the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Insights** area.
+1. In Microsoft Purview, select the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: menu item on the left to access your **Insights** area.
-1. In the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Sensitivity labels** to display the Azure Purview **Sensitivity labeling insights** report.
+1. In the **Insights** :::image type="icon" source="media/insights/ico-insights.png" border="false"::: area, select **Sensitivity labels** to display the Microsoft Purview **Sensitivity labeling insights** report.
> [!NOTE]
- > If this report is empty, you may not have extended your sensitivity labels to Azure Purview. For more information, see [Automatically label your data in Azure Purview](create-sensitivity-label.md).
+ > If this report is empty, you may not have extended your sensitivity labels to Microsoft Purview. For more information, see [Automatically label your data in Microsoft Purview](create-sensitivity-label.md).
:::image type="content" source="media/insights/sensitivity-labeling-insights-small.png" alt-text="Sensitivity labeling insights":::
||| |**Overview of sources with sensitivity labels** |Displays tiles that provide: <br>- The number of subscriptions found in your data. <br>- The number of unique sensitivity labels applied on your data <br>- The number of sources with sensitivity labels applied <br>- The number of files and tables found with sensitivity labels applied| |**Top sources with labeled data (last 30 days)** | Shows the trend, over the past 30 days, of the number of sources with sensitivity labels applied. |
- |**Top labels applied across sources** |Shows the top labels applied across all of your Azure Purview data resources. |
+ |**Top labels applied across sources** |Shows the top labels applied across all of your Microsoft Purview data resources. |
|**Top labels applied on files** |Shows the top sensitivity labels applied to files in your data. | |**Top labels applied on tables** | Shows the top sensitivity labels applied to database tables in your data. | | **Labeling activity** | Displays separate graphs for files and tables, each showing the number of files or tables labeled over the selected time frame. <br>**Default**: 30 days<br>Select the **Time** filter above the graphs to select a different time frame to display. |
Do any of the following to learn more:
|**Sort the grid** |Select a column header to sort the grid by that column. | |**Edit columns** | To display more or fewer columns in your grid, select **Edit Columns** :::image type="icon" source="media/insights/ico-columns.png" border="false":::, and then select the columns you want to view or change the order. <br><br>Select a column header to sort the grid by that column. | |**Drill down further** | To drill down to a specific label, select a name in the **Sensitivity label** column to view the **Label by source** report. <br><br>This report displays data for the selected label, including the source name, source type, subscription ID, and the numbers of classified files and tables. |
-|**Browse assets** | To browse through the assets found with a specific label or source, select one or more labels or sources, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
+|**Browse assets** | To browse through the assets found with a specific label or source, select one or more labels or sources, depending on the report you're viewing, and then select **Browse assets** :::image type="icon" source="medi). |
| | | ## Sensitivity label integration with Microsoft 365 compliance
-Close integration with [Microsoft Information Protection](/microsoft-365/compliance/information-protection) offered in Microsoft 365 means that Azure Purview enables direct ways to extend visibility into your data estate, and classify and label your data.
+Close integration with [Microsoft Information Protection](/microsoft-365/compliance/information-protection) offered in Microsoft 365 means that Microsoft Purview enables direct ways to extend visibility into your data estate, and classify and label your data.
-For your Microsoft 365 sensitivity labels to be extended to your assets in Azure Purview, you must actively turn on Information Protection for Azure Purview, in the Microsoft 365 compliance center.
+For your Microsoft 365 sensitivity labels to be extended to your assets in Microsoft Purview, you must actively turn on Information Protection for Microsoft Purview in the Microsoft 365 compliance center.
-For more information, see [Automatically label your data in Azure Purview](create-sensitivity-label.md).
+For more information, see [Automatically label your data in Microsoft Purview](create-sensitivity-label.md).
## Next steps
-Learn more about these Azure Purview insight reports:
+Learn more about these Microsoft Purview insight reports:
- [Glossary insights](glossary-insights.md) - [Classification insights](./classification-insights.md)
purview Supported Browsers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/supported-browsers.md
Title: Supported browsers
-description: This article provides the list of supported browsers for Azure Purview.
+description: This article provides the list of supported browsers for Microsoft Purview.
Last updated 11/18/2020
# Supported Browsers
-Azure Purview supports the following browsers. We recommend that you use the most up-to-date browser that's compatible with your operating system.
+Microsoft Purview supports the following browsers. We recommend that you use the most up-to-date browser that's compatible with your operating system.
* Microsoft Edge (latest version) * Safari (latest version, Mac only)
## Chrome Incognito mode
- Chrome Incognito blocking 3rd party cookies must be disabled for Azure Purview Studio to work.
+ Blocking third-party cookies must be disabled in Chrome Incognito mode for Microsoft Purview Studio to work.
:::image type="content" source="./media/supported-browsers/incognito-chrome.png" alt-text="Screenshot showing chrome."::: ## Chromium Edge InPrivate mode
-Chromium Edge InPrivate using Strict Tracking Prevention must be disabled for Azure Purview Studio to work.
+Strict Tracking Prevention must be disabled in Chromium Edge InPrivate mode for Microsoft Purview Studio to work.
:::image type="content" source="./media/supported-browsers/incognito-edge.png" alt-text="Screenshot showing edge.":::
purview Supported Classifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/supported-classifications.md
Title: List of supported classifications
-description: This page lists the supported system classifications in Azure Purview.
+description: This page lists the supported system classifications in Microsoft Purview.
Last updated 09/27/2021
#Customer intent: As a data steward or catalog administrator, I need to understand what's supported under classifications.
-# System classifications in Azure Purview
+# System classifications in Microsoft Purview
-This article lists the supported system classifications in Azure Purview. To learn more about classification, see [Classification](concept-classification.md).
+This article lists the supported system classifications in Microsoft Purview. To learn more about classification, see [Classification](concept-classification.md).
-Azure Purview classifies data by [RegEx](https://wikipedia.org/wiki/Regular_expression) and [Bloom Filter](https://wikipedia.org/wiki/Bloom_filter). The following lists describe the format, pattern, and keywords for the Azure Purview defined system classifications. Each classification name is prefixed by *MICROSOFT*.
+Microsoft Purview classifies data by [RegEx](https://wikipedia.org/wiki/Regular_expression) and [Bloom Filter](https://wikipedia.org/wiki/Bloom_filter). The following lists describe the format, pattern, and keywords for the Microsoft Purview defined system classifications. Each classification name is prefixed by *MICROSOFT*.
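To make the two mechanisms concrete (this sketch is illustrative only, not Purview's actual implementation), a regex-style classification is a pattern match on each value, while a Bloom-filter-style classification is a probabilistic membership test against a precomputed dictionary. The pattern, filter size, and sample values below are made up for the example.

```python
import hashlib
import re

# Regex-style classification: a toy U.S. ZIP code pattern.
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

# Bloom-filter-style classification: k hash probes into a fixed bit array.
SIZE, HASHES = 1024, 3

def _positions(value: str):
    for i in range(HASHES):
        digest = hashlib.sha256(f"{i}:{value}".encode()).hexdigest()
        yield int(digest, 16) % SIZE

def build_filter(values):
    bits = [False] * SIZE
    for v in values:
        for p in _positions(v):
            bits[p] = True
    return bits

def probably_contains(bits, value):
    # May report false positives, never false negatives.
    return all(bits[p] for p in _positions(value))

names = build_filter(["alice", "bob", "carol"])
print(ZIP_RE.fullmatch("98052") is not None)  # True
print(probably_contains(names, "alice"))      # True
```

A Bloom filter keeps large dictionaries (such as name lists) compact at the cost of an occasional false positive, which is why dictionary-backed classifications use it.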
> [!Note]
-> Azure Purview can classify both structured (CSV, TSV, JSON, SQL Table etc.) as well as unstructured data (DOC, PDF, TXT etc.). However, there are certain classifications that are only applicable to structured data. Here is the list of classifications that Azure Purview doesn't apply on unstructured data - City Name, Country Name, Date Of Birth, Email, Ethnic Group, GeoLocation, Person Name, U.S. Phone Number, U.S. States, U.S. ZipCode
+> Microsoft Purview can classify both structured data (CSV, TSV, JSON, SQL tables, etc.) and unstructured data (DOC, PDF, TXT, etc.). However, certain classifications apply only to structured data. Here is the list of classifications that Microsoft Purview doesn't apply to unstructured data: City Name, Country Name, Date Of Birth, Email, Ethnic Group, GeoLocation, Person Name, U.S. Phone Number, U.S. States, U.S. ZipCode
> [!Note] > **Minimum match threshold**: The minimum percentage of data value matches in a column that the scanner must find for the classification to be applied. For system classifications, the minimum match threshold is set at 60% and cannot be changed. For custom classifications, this value is configurable.
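The threshold rule described in the note above amounts to a simple ratio check over a column's values. The `meets_threshold` helper and sample values below are hypothetical, a minimal sketch of the idea rather than the scanner's actual logic.

```python
def meets_threshold(column_values, is_match, minimum=0.60):
    """Apply a classification only if at least `minimum` of the
    column's values match the classifier (60% for system rules)."""
    if not column_values:
        return False
    ratio = sum(1 for v in column_values if is_match(v)) / len(column_values)
    return ratio >= minimum

emails = ["a@contoso.com", "b@contoso.com", "not-an-email", "c@contoso.com"]
print(meets_threshold(emails, lambda v: "@" in v))  # True: 3/4 = 75% >= 60%
```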
The Person Name bloom filter has been prepared using the two datasets below.
- [Popular Baby Names (from SSN), using all years 1880-2019 (98-K entries)](https://www.ssa.gov/oact/babynames/limits.html) > [!NOTE]
-> Azure Purview classifies columns only when the data contains first/last names. Azure Purview doesn't classify columns that contain full names.
+> Microsoft Purview classifies columns only when the data contains first/last names. Microsoft Purview doesn't classify columns that contain full names.
## RegEx Classifications
purview Troubleshoot Connections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/troubleshoot-connections.md
Title: Troubleshoot your connections in Azure Purview
-description: This article explains the steps to troubleshoot your connections in Azure Purview.
+ Title: Troubleshoot your connections in Microsoft Purview
+description: This article explains the steps to troubleshoot your connections in Microsoft Purview.
Last updated 09/27/2021
-# Troubleshoot your connections in Azure Purview
+# Troubleshoot your connections in Microsoft Purview
-This article describes how to troubleshoot connection errors while setting up scans on data sources in Azure Purview.
+This article describes how to troubleshoot connection errors while setting up scans on data sources in Microsoft Purview.
## Permission the credential on the data source
There are specific instructions for each [source type](azure-purview-connector-o
> [!IMPORTANT] > Verify that you have followed all prerequisite and authentication steps for the source you are connecting to.
-> You can find all available sources listed in the [Azure Purview supported sources article](azure-purview-connector-overview.md).
+> You can find all available sources listed in the [Microsoft Purview supported sources article](azure-purview-connector-overview.md).
-## Verifying Azure Role-based Access Control to enumerate Azure resources in Azure Purview Studio
+## Verifying Azure Role-based Access Control to enumerate Azure resources in Microsoft Purview Studio
### Registering single Azure data source
-To register a single data source in Azure Purview, such as an Azure Blog Storage or an Azure SQL Database, you must be granted at least **Reader** role on the resource or inherited from higher scope such as resource group or subscription. Some Azure RBAC roles, such as Security Admin, don't have read access to view Azure resources in control plane.
+To register a single data source in Microsoft Purview, such as an Azure Blob Storage account or an Azure SQL Database, you must be granted at least the **Reader** role on the resource, or inherit it from a higher scope such as the resource group or subscription. Some Azure RBAC roles, such as Security Admin, don't have read access to view Azure resources in the control plane.
Verify this by following the steps below:
-1. From the [Azure portal](https://portal.azure.com), navigate to the resource that you're trying to register in Azure Purview. If you can view the resource, it's likely, that you already have at least reader role on the resource.
+1. From the [Azure portal](https://portal.azure.com), navigate to the resource that you're trying to register in Microsoft Purview. If you can view the resource, it's likely that you already have at least the Reader role on the resource.
2. Select **Access control (IAM)** > **Role Assignments**.
-3. Search by name or email address of the user who is trying to register data sources in Azure Purview.
+3. Search by name or email address of the user who is trying to register data sources in Microsoft Purview.
4. Verify if any role assignments, such as Reader, exist in the list or add a new role assignment if needed.

### Scanning multiple Azure data sources
Verify this by following the steps below:
1. From the [Azure portal](https://portal.azure.com), navigate to the subscription or the resource group.
2. Select **Access Control (IAM)** from the left menu.
3. Select **+Add**.
-4. In the **Select input** box, select the **Reader** role and enter your Azure Purview account name (which represents its MSI name).
+4. In the **Select input** box, select the **Reader** role and enter your Microsoft Purview account name (which represents its MSI name).
5. Select **Save** to finish the role assignment.
-6. Repeat the steps above to add the identity of the user who is trying to create a new scan for multiple data sources in Azure Purview.
+6. Repeat the steps above to add the identity of the user who is trying to create a new scan for multiple data sources in Microsoft Purview.
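Under the hood, each of the role assignments above maps to a single call against the Azure Resource Manager role-assignments API. A minimal sketch of the request that grants the **Reader** role at a scope — the scope and principal ID below are hypothetical placeholders, while the Reader role definition GUID is the well-known built-in one:

```python
import json
import uuid

# Built-in Azure "Reader" role definition ID (well-known GUID).
READER_ROLE_ID = "acdd72a7-3385-48ef-bd42-f606fba81ae7"

def build_role_assignment_request(scope: str, principal_id: str):
    """Build the ARM PUT URL and body that grant a principal the Reader role at a scope."""
    # Role assignment names are GUIDs; derive one deterministically for this sketch.
    assignment_name = str(uuid.uuid5(uuid.NAMESPACE_URL, scope + principal_id))
    url = (
        f"https://management.azure.com{scope}"
        f"/providers/Microsoft.Authorization/roleAssignments/{assignment_name}"
        "?api-version=2022-04-01"
    )
    body = {
        "properties": {
            "roleDefinitionId": f"{scope}/providers/Microsoft.Authorization/roleDefinitions/{READER_ROLE_ID}",
            "principalId": principal_id,  # object ID of the Purview managed identity
        }
    }
    return url, body

# Hypothetical scope and principal ID, for illustration only.
scope = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/purview-rg"
url, body = build_role_assignment_request(scope, "11111111-1111-1111-1111-111111111111")
print(url)
print(json.dumps(body, indent=2))
```

The portal's **Add role assignment** flow issues an equivalent PUT on your behalf; repeating it for the user identity mirrors step 6 above.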
## Scanning data sources using Private Link
If public endpoint is restricted on your data sources, to scan Azure data source
For more information about setting up a self-hosted integration runtime, see [Ingestion private endpoints and scanning sources](catalog-private-link-ingestion.md#deploy-self-hosted-integration-runtime-ir-and-scan-your-data-sources)
-For more information how to create a new credential in Azure Purview, see [Credentials for source authentication in Azure Purview](manage-credentials.md#create-azure-key-vaults-connections-in-your-azure-purview-account)
+For more information about how to create a new credential in Microsoft Purview, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
## Storing your credential in your key vault and using the right secret name and version
Verify this by following the steps below:
1. Select the secret you're using to authenticate against your data source for scans.
1. Select the version that you intend to use and verify that the password or account key is correct by selecting **Show Secret Value**.
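A secret version is addressed by its Key Vault identifier URL, which is what a scan credential ultimately resolves; omitting the version segment points at the latest version. A small sketch of that URL format, using hypothetical vault, secret, and version values:

```python
def secret_identifier(vault_name: str, secret_name: str, version: str = "") -> str:
    """Compose a Key Vault secret identifier; no version means the latest version."""
    base = f"https://{vault_name}.vault.azure.net/secrets/{secret_name}"
    return f"{base}/{version}" if version else base

# Hypothetical vault, secret, and version, for illustration only.
print(secret_identifier("contoso-kv", "sql-scan-password"))
print(secret_identifier("contoso-kv", "sql-scan-password", "0123456789abcdef0123456789abcdef"))
```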
-## Verify permissions for the Azure Purview managed identity on your Azure Key Vault
+## Verify permissions for the Microsoft Purview managed identity on your Azure Key Vault
-Verify that the correct permissions have been configured for the Azure Purview managed identity to access your Azure Key Vault.
+Verify that the correct permissions have been configured for the Microsoft Purview managed identity to access your Azure Key Vault.
To verify this, do the following steps:

1. Navigate to your key vault and to the **Access policies** section.
-1. Verify that your Azure Purview managed identity shows under the _Current access policies_ section with at least **Get** and **List** permissions on Secrets
+1. Verify that your Microsoft Purview managed identity shows under the _Current access policies_ section with at least **Get** and **List** permissions on Secrets
:::image type="content" source="./media/troubleshoot-connections/verify-minimum-permissions.png" alt-text="Image showing dropdown selection of both Get and List permission options":::
-If you don't see your Azure Purview managed identity listed, then follow the steps in [Create and manage credentials for scans](manage-credentials.md) to add it.
+If you don't see your Microsoft Purview managed identity listed, then follow the steps in [Create and manage credentials for scans](manage-credentials.md) to add it.
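The "at least **Get** and **List**" check above can also be expressed programmatically. A sketch that inspects a Key Vault's access-policy list (the dictionary shape mirrors the ARM `accessPolicies` property; the object IDs below are hypothetical sample data):

```python
# Check that a given object ID has at least "get" and "list" secret permissions
# in a Key Vault's access-policy list.
def has_secret_get_list(access_policies, object_id: str) -> bool:
    for policy in access_policies:
        if policy.get("objectId") == object_id:
            secrets = {p.lower() for p in policy.get("permissions", {}).get("secrets", [])}
            return {"get", "list"} <= secrets  # both permissions must be present
    return False  # identity not listed at all

# Hypothetical access policies, shaped like the ARM "accessPolicies" property.
policies = [
    {"objectId": "aaaa-managed-identity", "permissions": {"secrets": ["Get", "List"]}},
    {"objectId": "bbbb-other-user", "permissions": {"secrets": ["Get"]}},
]
print(has_secret_get_list(policies, "aaaa-managed-identity"))  # True
print(has_secret_get_list(policies, "bbbb-other-user"))        # False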
## Next steps

-- [Browse the Azure Purview Data catalog](how-to-browse-catalog.md)
-- [Search the Azure Purview Data Catalog](how-to-search-catalog.md)
+- [Browse the Microsoft Purview Data catalog](how-to-browse-catalog.md)
+- [Search the Microsoft Purview Data Catalog](how-to-search-catalog.md)
purview Tutorial Azure Purview Checklist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-azure-purview-checklist.md
Title: Learn about prerequisites to successfully deploy an Azure Purview account
-description: This tutorial lists prerequisites to deploy an Azure Purview account.
+ Title: Learn about prerequisites to successfully deploy a Microsoft Purview account
+description: This tutorial lists prerequisites to deploy a Microsoft Purview account.
Last updated 03/15/2022
-# Customer Intent: As a Data and Data Security administrator, I want to deploy Azure Purview as a unified data governance solution.
+# Customer Intent: As a Data and Data Security administrator, I want to deploy Microsoft Purview as a unified data governance solution.
-# Azure Purview deployment checklist
+# Microsoft Purview deployment checklist
-This article lists prerequisites that help you get started quickly on Azure Purview planning and deployment.
+This article lists prerequisites that help you get started quickly on Microsoft Purview planning and deployment.
|No. |Prerequisite / Action |Required permission |Additional guidance and recommendations |
|:|:|:|:|
-|1 | Azure Active Directory Tenant |N/A |An [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md) should be associated with your subscription. <ul><li>*Global Administrator* or *Information Protection Administrator* role is required, if you plan to [extend Microsoft 365 Sensitivity Labels to Azure Purview for files and db columns](create-sensitivity-label.md)</li><li> *Global Administrator* or *Power BI Administrator* role is required, if you're planning to [scan Power BI tenants](register-scan-power-bi-tenant.md).</li></ul> |
-|2 |An active Azure Subscription |*Subscription Owner* |An Azure subscription is needed to deploy Azure Purview and its managed resources. If you don't have an Azure subscription, create a [free subscription](https://azure.microsoft.com/free/) before you begin. |
-|3 |Define whether you plan to deploy an Azure Purview with managed Event Hub | N/A |A managed Event Hub is created as part of Azure Purview account creation, see Azure Purview account creation. You can publish messages to the Event Hub kafka topic ATLAS_HOOK and Azure Purview will consume and process it. Azure Purview will notify entity changes to Event Hub kafka topic ATLAS_ENTITIES and user can consume and process it. |
-|4 |Register the following resource providers: <ul><li>Microsoft.Storage</li><li>Microsoft.EventHub (optional)</li><li>Microsoft.Purview</li></ul> |*Subscription Owner* or custom role to register Azure resource providers (_/register/action_) | [Register required Azure Resource Providers](/azure/azure-resource-manager/management/resource-providers-and-types) in the Azure Subscription that is designated for Azure Purview Account. Review [Azure resource provider operations](../role-based-access-control/resource-provider-operations.md). |
-|5 |Update Azure Policy to allow deployment of the following resources in your Azure subscription: <ul><li>Azure Purview</li><li>Azure Storage</li><li>Azure Event Hub (optional)</li></ul> |*Subscription Owner* |Use this step if an existing Azure Policy prevents deploying such Azure resources. If a blocking policy exists and needs to remain in place, please follow our [Azure Purview exception tag guide](create-azure-purview-portal-faq.md) and follow the steps to create an exception for Azure Purview accounts. |
-|6 | Define your network security requirements. | Network and Security architects. |<ul><li> Review [Azure Purview network architecture and best practices](concept-best-practices-network.md) to define what scenario is more relevant to your network requirements. </li><li>If private network is needed, use [Azure Purview Managed IR](catalog-managed-vnet.md) to scan Azure data sources when possible to reduce complexity and administrative overhead. </li></ul> |
-|7 |An Azure Virtual Network and Subnet(s) for Azure Purview private endpoints. | *Network Contributor* to create or update Azure VNet. |Use this step if you're planning to deploy [private endpoint connectivity with Azure Purview](catalog-private-link.md): <ul><li>Private endpoints for **Ingestion**.</li><li>Private endpoint for Azure Purview **Account**.</li><li>Private endpoint for Azure Purview **Portal**.</li></ul> <br> Deploy [Azure Virtual Network](../virtual-network/quick-create-portal.md) if you need one. |
+|1 | Azure Active Directory Tenant |N/A |An [Azure Active Directory tenant](../active-directory/fundamentals/active-directory-access-create-new-tenant.md) should be associated with your subscription. <ul><li>*Global Administrator* or *Information Protection Administrator* role is required, if you plan to [extend Microsoft 365 Sensitivity Labels to Microsoft Purview for files and db columns](create-sensitivity-label.md)</li><li> *Global Administrator* or *Power BI Administrator* role is required, if you're planning to [scan Power BI tenants](register-scan-power-bi-tenant.md).</li></ul> |
+|2 |An active Azure Subscription |*Subscription Owner* |An Azure subscription is needed to deploy Microsoft Purview and its managed resources. If you don't have an Azure subscription, create a [free subscription](https://azure.microsoft.com/free/) before you begin. |
+|3 |Define whether you plan to deploy a Microsoft Purview account with a managed Event Hub | N/A |A managed Event Hub is created as part of Microsoft Purview account creation. You can publish messages to the Event Hub Kafka topic ATLAS_HOOK, and Microsoft Purview will consume and process them. Microsoft Purview will notify entity changes to the Event Hub Kafka topic ATLAS_ENTITIES, which users can consume and process. |
+|4 |Register the following resource providers: <ul><li>Microsoft.Storage</li><li>Microsoft.EventHub (optional)</li><li>Microsoft.Purview</li></ul> |*Subscription Owner* or custom role to register Azure resource providers (_/register/action_) | [Register required Azure Resource Providers](/azure/azure-resource-manager/management/resource-providers-and-types) in the Azure Subscription that is designated for Microsoft Purview Account. Review [Azure resource provider operations](../role-based-access-control/resource-provider-operations.md). |
+|5 |Update Azure Policy to allow deployment of the following resources in your Azure subscription: <ul><li>Microsoft Purview</li><li>Azure Storage</li><li>Azure Event Hub (optional)</li></ul> |*Subscription Owner* |Use this step if an existing Azure Policy prevents deploying such Azure resources. If a blocking policy exists and needs to remain in place, please follow our [Microsoft Purview exception tag guide](create-azure-purview-portal-faq.md) and follow the steps to create an exception for Microsoft Purview accounts. |
+|6 | Define your network security requirements. | Network and Security architects. |<ul><li> Review [Microsoft Purview network architecture and best practices](concept-best-practices-network.md) to define what scenario is more relevant to your network requirements. </li><li>If private network is needed, use [Microsoft Purview Managed IR](catalog-managed-vnet.md) to scan Azure data sources when possible to reduce complexity and administrative overhead. </li></ul> |
+|7 |An Azure Virtual Network and Subnet(s) for Microsoft Purview private endpoints. | *Network Contributor* to create or update Azure VNet. |Use this step if you're planning to deploy [private endpoint connectivity with Microsoft Purview](catalog-private-link.md): <ul><li>Private endpoints for **Ingestion**.</li><li>Private endpoint for Microsoft Purview **Account**.</li><li>Private endpoint for Microsoft Purview **Portal**.</li></ul> <br> Deploy [Azure Virtual Network](../virtual-network/quick-create-portal.md) if you need one. |
|8 |Deploy private endpoint for Azure data sources. |*Network Contributor* to set up private endpoints for each data source. |Perform this step, if you're planning to use [Private Endpoint for Ingestion](catalog-private-link-end-to-end.md). |
-|9 |Define whether to deploy new or use existing Azure Private DNS Zones. |Required [Azure Private DNS Zones](catalog-private-link-name-resolution.md) can be created automatically during Purview Account deployment using Subscription Owner / Contributor role |Use this step if you're planning to use Private Endpoint connectivity with Azure Purview. Required DNS Zones for Private Endpoint: <ul><li>privatelink.purview.azure.com</li><li>privatelink.purviewstudio.azure.com</li><li>privatelink.blob.core.windows.net</li><li>privatelink.queue.core.windows.net</li><li>privatelink.servicebus.windows.net</li></ul> |
-|10 |A management machine in your CorpNet or inside Azure VNet to launch Azure Purview Studio. |N/A |Use this step if you're planning to set **Allow Public Network** to **deny** on your Azure Purview Account. |
-|11 |Deploy an Azure Purview Account |Subscription Owner / Contributor |Purview account is deployed with 1 Capacity Unit and will scale up based [on demand](concept-elastic-data-map.md). |
-|12 |Deploy a Managed Integration Runtime and Managed private endpoints for Azure data sources. |*Data source admin* to setup Managed VNet inside Azure Purview. <br> *Network Contributor* to approve managed private endpoint for each Azure data source. |Perform this step if you're planning to use [Managed VNet](catalog-managed-vnet.md). within your Azure Purview account for scanning purposes. |
+|9 |Define whether to deploy new or use existing Azure Private DNS Zones. |Required [Azure Private DNS Zones](catalog-private-link-name-resolution.md) can be created automatically during Purview Account deployment using Subscription Owner / Contributor role |Use this step if you're planning to use Private Endpoint connectivity with Microsoft Purview. Required DNS Zones for Private Endpoint: <ul><li>privatelink.purview.azure.com</li><li>privatelink.purviewstudio.azure.com</li><li>privatelink.blob.core.windows.net</li><li>privatelink.queue.core.windows.net</li><li>privatelink.servicebus.windows.net</li></ul> |
+|10 |A management machine in your CorpNet or inside Azure VNet to launch Microsoft Purview Studio. |N/A |Use this step if you're planning to set **Allow Public Network** to **deny** on your Microsoft Purview Account. |
+|11 |Deploy a Microsoft Purview Account |Subscription Owner / Contributor |Purview account is deployed with 1 Capacity Unit and will scale up based [on demand](concept-elastic-data-map.md). |
+|12 |Deploy a Managed Integration Runtime and Managed private endpoints for Azure data sources. |*Data source admin* to set up a Managed VNet inside Microsoft Purview. <br> *Network Contributor* to approve managed private endpoint for each Azure data source. |Perform this step if you're planning to use [Managed VNet](catalog-managed-vnet.md) within your Microsoft Purview account for scanning purposes. |
|13 |Deploy Self-hosted integration runtime VMs inside your network. |Azure: *Virtual Machine Contributor* <br> On-prem: Application owner |Use this step if you're planning to perform any scans using [Self-hosted Integration Runtime](manage-integration-runtimes.md). |
-|14 |Create a Self-hosted integration runtime inside Azure Purview. |Data curator <br> VM Administrator or application owner |Use this step if you're planning to use Self-hosted Integration Runtime instead of Managed Integration Runtime or Azure Integration Runtime. <br><br> <br> [download](https://www.microsoft.com/en-us/download/details.aspx?id=39717) |
+|14 |Create a Self-hosted integration runtime inside Microsoft Purview. |Data curator <br> VM Administrator or application owner |Use this step if you're planning to use a Self-hosted Integration Runtime instead of a Managed Integration Runtime or Azure Integration Runtime. <br><br> [Download the Self-hosted Integration Runtime](https://www.microsoft.com/en-us/download/details.aspx?id=39717) |
|15 |Register your Self-hosted integration runtime | Virtual machine administrator |Use this step if you have **on-premises** or **VM-based data sources** (e.g. SQL Server). <br> Use this step if you are using **Private Endpoint** to scan **any** data sources. |
-|16 |Grant Azure RBAC **Reader** role to **Azure Purview MSI** at data sources' Subscriptions |*Subscription owner* or *User Access Administrator* |Use this step if you're planning to register [multiple](register-scan-azure-multiple-sources.md) or **any** of the following data sources: <ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md)</li><li>[Azure Data Lake Storage Gen1](register-scan-adls-gen1.md)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md)</li><li>[Azure SQL Database](register-scan-azure-sql-database.md)</li><li>[Azure SQL Database Managed Instance](register-scan-azure-sql-database-managed-instance.md)</li><li>[Azure Synapse Analytics](register-scan-synapse-workspace.md)</li></ul> |
-|17 |Grant Azure RBAC **Storage Blob Data Reader** role to **Azure Purview MSI** at data sources Subscriptions. |*Subscription owner* or *User Access Administrator* | **Skip** this step if you are using Private Endpoint to connect to data sources. Use this step if you have these data sources:<ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li></ul> |
+|16 |Grant Azure RBAC **Reader** role to **Microsoft Purview MSI** at data sources' Subscriptions |*Subscription owner* or *User Access Administrator* |Use this step if you're planning to register [multiple](register-scan-azure-multiple-sources.md) or **any** of the following data sources: <ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md)</li><li>[Azure Data Lake Storage Gen1](register-scan-adls-gen1.md)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md)</li><li>[Azure SQL Database](register-scan-azure-sql-database.md)</li><li>[Azure SQL Database Managed Instance](register-scan-azure-sql-database-managed-instance.md)</li><li>[Azure Synapse Analytics](register-scan-synapse-workspace.md)</li></ul> |
+|17 |Grant Azure RBAC **Storage Blob Data Reader** role to **Microsoft Purview MSI** at data sources Subscriptions. |*Subscription owner* or *User Access Administrator* | **Skip** this step if you are using Private Endpoint to connect to data sources. Use this step if you have these data sources:<ul><li>[Azure Blob Storage](register-scan-azure-blob-storage-source.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li><li>[Azure Data Lake Storage Gen2](register-scan-adls-gen2.md#using-a-system-or-user-assigned-managed-identity-for-scanning)</li></ul> |
|18 |Enable network connectivity to allow AzureServices to access data sources: <br> e.g. Enable "**Allow trusted Microsoft services to access this storage account**". |*Owner* or *Contributor* at Data source |Use this step if **Service Endpoint** is used in your data sources. (Don't use this step if Private Endpoint is used) |
|19 |Enable **Azure Active Directory Authentication** on **Azure SQL Servers**, **Azure SQL Database Managed Instance** and **Azure Synapse Analytics** |Azure SQL Server Contributor |Use this step if you have **Azure SQL DB** or **Azure SQL Database Managed Instance** or **Azure Synapse Analytics** as data source. **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
-|20 |Grant **Azure Purview MSI** account with **db_datareader** role to Azure SQL databases and Azure SQL Database Managed Instance databases |Azure SQL Administrator |Use this step if you have **Azure SQL DB** or **Azure SQL Database Managed Instance** as data source. **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
+|20 |Grant **Microsoft Purview MSI** account with **db_datareader** role to Azure SQL databases and Azure SQL Database Managed Instance databases |Azure SQL Administrator |Use this step if you have **Azure SQL DB** or **Azure SQL Database Managed Instance** as data source. **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
|21 |Grant Azure RBAC **Storage Blob Data Reader** to **Synapse SQL Server** for staging Storage Accounts |Owner or User Access Administrator at data source |Use this step if you have **Azure Synapse Analytics** as data sources. **Skip** this step if you are using Private Endpoint to connect to data sources. |
-|22 |Grant Azure RBAC **Reader** role to **Azure Purview MSI** at **Synapse workspace** resources |Owner or User Access Administrator at data source |Use this step if you have **Azure Synapse Analytics** as data sources. **Skip** this step if you are using Private Endpoint to connect to data sources. |
+|22 |Grant Azure RBAC **Reader** role to **Microsoft Purview MSI** at **Synapse workspace** resources |Owner or User Access Administrator at data source |Use this step if you have **Azure Synapse Analytics** as data sources. **Skip** this step if you are using Private Endpoint to connect to data sources. |
|23 |Grant Azure **Purview MSI account** with **db_datareader** role |Azure SQL Administrator |Use this step if you have **Azure Synapse Analytics (Dedicated SQL databases)**. <br> **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
-|24 |Grant **Azure Purview MSI** account with **sysadmin** role |Azure SQL Administrator |Use this step if you have Azure Synapse Analytics (Serverless SQL databases). **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
+|24 |Grant **Microsoft Purview MSI** account with **sysadmin** role |Azure SQL Administrator |Use this step if you have Azure Synapse Analytics (Serverless SQL databases). **Skip** this step if you are using **Private Endpoint** to connect to data sources. |
|25 |Create an app registration or service principal inside your Azure Active Directory tenant | Azure Active Directory *Global Administrator* or *Application Administrator* | Use this step if you're planning to perform a scan on a data source using Delegated Auth or [Service Principal](create-service-principal-azure.md).|
|26 |Create an **Azure Key Vault** and a **Secret** to save data source credentials or service principal secret. |*Contributor* or *Key Vault Administrator* |Use this step if you have **on-premises** or **VM-based data sources** (e.g. SQL Server). <br> Use this step if you are using **ingestion private endpoints** to scan a data source. |
-|27 |Grant Key **Vault Access Policy** to Azure Purview MSI: **Secret: get/list** |*Key Vault Administrator* |Use this step if you have **on-premises** / **VM-based data sources** (e.g. SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Vault Access Policy](../key-vault/general/assign-access-policy.md). |
-|28 |Grant **Key Vault RBAC role** Key Vault Secrets User to Azure Purview MSI. | *Owner* or *User Access Administrator* |Use this step if you have **on-premises** or **VM-based data sources** (e.g. SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Azure role-based access control](../key-vault/general/rbac-guide.md). |
-|29 | Create a new connection to Azure Key Vault from Azure Purview Studio | *Data source admin* | Use this step if you are planing to use any of the following [authentication options](manage-credentials.md#create-a-new-credential) to scan a data source in Azure Purview: <ul><li>Account key</li><li>Basic Authentication</li><li>Delegated Auth</li><li>SQL Authentication</li><li>Service Principal</li><li>Consumer Key</li></ul>
-|30 |Deploy a private endpoint for Power BI tenant |*Power BI Administrator* <br> *Network contributor* |Use this step if you're planning to register a Power BI tenant as data source and your Azure Purview account is set to **deny public access**. <br> For more information, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links). |
-|31 |Connect Azure Data Factory to Azure Purview from Azure Data Factory Portal. **Manage** -> **Azure Purview**. Select **Connect to a Purview account**. <br> Validate if Azure resource tag **catalogUri** exists in ADF Azure resource. |Azure Data Factory Contributor / Data curator |Use this step if you have **Azure Data Factory**. |
-|32 |Verify if you have at least one **Microsoft 365 required license** in your Azure Active Directory tenant to use sensitivity labels in Azure Purview. |Azure Active Directory *Global Reader* |Perform this step if you're planning in extending **Sensitivity Labels from Microsoft 365 to Azure Purview** <br> For more information, see [licensing requirements to use sensitivity labels on files and database columns in Azure Purview](sensitivity-labels-frequently-asked-questions.yml) |
-|33 |Consent "**Extend labeling to assets in Azure Purview**" |Compliance Administrator <br> Azure Information Protection Administrator |Use this step if you are interested in extending Sensitivity Labels from Microsoft 365 to Azure Purview. <br> Use this step if you are interested in extending **Sensitivity Labels** from Microsoft 365 to Azure Purview. |
-|34 |Create new collections and assign roles in Azure Purview |*Collection admin* | [Create a collection and assign permissions in Azure Purview](./quickstart-create-collection.md). |
-|36 |Register and scan Data Sources in Azure Purview |*Data Source admin* <br> *Data Reader* or *Data Curator* | For more information, see [supported data sources and file types](azure-purview-connector-overview.md) |
-|35 |Grant access to data roles in the organization |*Collection admin* |Provide access to other teams to use Azure Purview: <ul><li> Data curator</li><li>Data reader</li><li>Collection admin</li><li>Data source admin</li><li>Policy Author</li><li>Workflow admin</li></ul> <br> For more information, see [Access control in Azure Purview](catalog-permissions.md). |
+|27 |Grant a **Key Vault Access Policy** to Microsoft Purview MSI: **Secret: get/list** |*Key Vault Administrator* |Use this step if you have **on-premises** / **VM-based data sources** (e.g. SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Vault Access Policy](../key-vault/general/assign-access-policy.md). |
+|28 |Grant **Key Vault RBAC role** Key Vault Secrets User to Microsoft Purview MSI. | *Owner* or *User Access Administrator* |Use this step if you have **on-premises** or **VM-based data sources** (e.g. SQL Server) <br> Use this step if **Key Vault Permission Model** is set to [Azure role-based access control](../key-vault/general/rbac-guide.md). |
+|29 | Create a new connection to Azure Key Vault from Microsoft Purview Studio | *Data source admin* | Use this step if you are planning to use any of the following [authentication options](manage-credentials.md#create-a-new-credential) to scan a data source in Microsoft Purview: <ul><li>Account key</li><li>Basic Authentication</li><li>Delegated Auth</li><li>SQL Authentication</li><li>Service Principal</li><li>Consumer Key</li></ul>
+|30 |Deploy a private endpoint for Power BI tenant |*Power BI Administrator* <br> *Network contributor* |Use this step if you're planning to register a Power BI tenant as data source and your Microsoft Purview account is set to **deny public access**. <br> For more information, see [How to configure private endpoints for accessing Power BI](/power-bi/enterprise/service-security-private-links). |
+|31 |Connect Azure Data Factory to Microsoft Purview from Azure Data Factory Portal. **Manage** -> **Microsoft Purview**. Select **Connect to a Purview account**. <br> Validate if Azure resource tag **catalogUri** exists in ADF Azure resource. |Azure Data Factory Contributor / Data curator |Use this step if you have **Azure Data Factory**. |
+|32 |Verify if you have at least one **Microsoft 365 required license** in your Azure Active Directory tenant to use sensitivity labels in Microsoft Purview. |Azure Active Directory *Global Reader* |Perform this step if you're planning to extend **Sensitivity Labels from Microsoft 365 to Microsoft Purview** <br> For more information, see [licensing requirements to use sensitivity labels on files and database columns in Microsoft Purview](sensitivity-labels-frequently-asked-questions.yml) |
+|33 |Consent "**Extend labeling to assets in Microsoft Purview**" |Compliance Administrator <br> Azure Information Protection Administrator |Use this step if you are interested in extending **Sensitivity Labels** from Microsoft 365 to Microsoft Purview. |
+|34 |Create new collections and assign roles in Microsoft Purview |*Collection admin* | [Create a collection and assign permissions in Microsoft Purview](./quickstart-create-collection.md). |
+|35 |Register and scan Data Sources in Microsoft Purview |*Data Source admin* <br> *Data Reader* or *Data Curator* | For more information, see [supported data sources and file types](azure-purview-connector-overview.md) |
+|36 |Grant access to data roles in the organization |*Collection admin* |Provide access to other teams to use Microsoft Purview: <ul><li> Data curator</li><li>Data reader</li><li>Collection admin</li><li>Data source admin</li><li>Policy Author</li><li>Workflow admin</li></ul> <br> For more information, see [Access control in Microsoft Purview](catalog-permissions.md). |
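Row 3 of the checklist mentions that Microsoft Purview publishes entity changes to the Event Hub Kafka topic ATLAS_ENTITIES for consumers to process. A sketch of handling one such notification, using an illustrative payload — the exact fields on a real message may differ, but Apache Atlas v2 entity notifications carry an operation type plus the affected entity with its `typeName` and attributes:

```python
import json

# Illustrative ATLAS_ENTITIES notification payload (assumed shape, for the sketch).
raw = json.dumps({
    "message": {
        "type": "ENTITY_NOTIFICATION_V2",
        "operationType": "ENTITY_CREATE",
        "entity": {
            "typeName": "azure_sql_table",
            "attributes": {
                "qualifiedName": "mssql://contoso.database.windows.net/db/dbo/orders",
                "name": "orders",
            },
        },
    }
})

def summarize(event_json: str) -> str:
    """Extract a one-line summary (operation, type, entity name) from a notification."""
    msg = json.loads(event_json)["message"]
    entity = msg["entity"]
    return f'{msg["operationType"]}: {entity["typeName"]} "{entity["attributes"]["name"]}"'

print(summarize(raw))  # ENTITY_CREATE: azure_sql_table "orders"
```

In practice the raw bytes would arrive through an Event Hubs or Kafka consumer subscribed to ATLAS_ENTITIES; the parsing step stays the same.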
## Next steps

-- [Review Azure Purview deployment best practices](./deployment-best-practices.md)
+- [Review Microsoft Purview deployment best practices](./deployment-best-practices.md)
purview Tutorial Azure Purview Tools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-azure-purview-tools.md
Title: Learn about Azure Purview open-source tools and utilities
-description: This tutorial lists various tools and utilities available in Azure Purview and discusses their usage.
+ Title: Learn about Microsoft Purview open-source tools and utilities
+description: This tutorial lists various tools and utilities available in Microsoft Purview and discusses their usage.
Last updated 10/10/2021
-# Customer Intent: As an Azure Purview administrator, I want to kickstart and be up and running with Azure Purview service in a matter of minutes; additionally, I want to perform and set up automations, batch-mode API executions and scripts that help me run Azure Purview smoothly and effectively for the long-term on a regular basis.
+# Customer Intent: As a Microsoft Purview administrator, I want to kickstart and be up and running with Microsoft Purview service in a matter of minutes; additionally, I want to perform and set up automations, batch-mode API executions and scripts that help me run Microsoft Purview smoothly and effectively for the long-term on a regular basis.
-# Azure Purview open-source tools and utilities
+# Microsoft Purview open-source tools and utilities
-This article lists several open-source tools and utilities (command-line, python, and PowerShell interfaces) that help you get started quickly on Azure Purview service in a matter of minutes! These tools have been authored & developed by collective effort of the Azure Purview Product Group and the open-source community. The objective of such tools is to make learning, starting up, regular usage, and long-term adoption of Azure Purview breezy and super fast.
+This article lists several open-source tools and utilities (command-line, Python, and PowerShell interfaces) that help you get started with the Microsoft Purview service in a matter of minutes. These tools were authored and developed through the collective effort of the Microsoft Purview product group and the open-source community. Their objective is to make learning, setup, regular usage, and long-term adoption of Microsoft Purview fast and easy.
### Intended audience -- Azure Purview community including customers, developers, ISVs, partners, evangelists, and enthusiasts.
+- Microsoft Purview community including customers, developers, ISVs, partners, evangelists, and enthusiasts.
-- Azure Purview catalog is based on [Apache Atlas](https://atlas.apache.org/) and extends full support for Apache Atlas APIs. We welcome Apache Atlas community, enthusiasts, and developers to wholeheartedly build on and evangelize Azure Purview.
+- The Microsoft Purview catalog is based on [Apache Atlas](https://atlas.apache.org/) and extends full support for the Apache Atlas APIs. We welcome the Apache Atlas community, enthusiasts, and developers to build on and evangelize Microsoft Purview.
-### Azure Purview customer journey stages
+### Microsoft Purview customer journey stages
-- *Azure Purview Learners*: Learners who are starting fresh with Azure Purview service and are keen to understand and explore how a multi-cloud unified data governance solution works. A section of learners includes users who want to compare and contrast Azure Purview with other competing solutions in the data governance market and try it before adopting for long-term usage.
+- *Microsoft Purview Learners*: Learners who are starting fresh with the Microsoft Purview service and are keen to understand and explore how a multi-cloud unified data governance solution works. A section of learners includes users who want to compare and contrast Microsoft Purview with competing solutions in the data governance market and try it before adopting it for long-term usage.
-- *Azure Purview Innovators*: Innovators who are keen to understand existing and latest features, ideate, and conceptualize features upcoming on Azure Purview. They are adept at building and developing solutions for customers, and have futuristic forward-looking ideas for the next-gen cutting-edge data governance product.
+- *Microsoft Purview Innovators*: Innovators who are keen to understand existing and latest features, ideate, and conceptualize features upcoming on Microsoft Purview. They are adept at building and developing solutions for customers, and have futuristic forward-looking ideas for the next-gen cutting-edge data governance product.
-- *Azure Purview Enthusiasts/Evangelists*: Enthusiasts who are a combination of Learners and Innovators. They have developed solid understanding and knowledge of Azure Purview, hence, are upbeat about adoption of Azure Purview. They can help evangelize Azure Purview as a service and educate several other Azure Purview users and probable customers across the globe.
+- *Microsoft Purview Enthusiasts/Evangelists*: Enthusiasts who are a combination of Learners and Innovators. They have developed a solid understanding and knowledge of Microsoft Purview and are therefore upbeat about its adoption. They can help evangelize Microsoft Purview as a service and educate other Microsoft Purview users and prospective customers across the globe.
-- *Azure Purview Adopters*: Adopters who have migrated from starting up and exploring Azure Purview and are smoothly using Azure Purview for more than a few months.
+- *Microsoft Purview Adopters*: Adopters who have moved past starting up and exploring Microsoft Purview and have been using it smoothly for more than a few months.
-- *Azure Purview Long-Term Regular Users*: Long-term users who have been using Azure Purview for more than one year and are now confident and comfortable using most advanced Azure Purview use cases on the Azure portal and Azure Purview Studio; furthermore they have near perfect knowledge and awareness of the Azure Purview REST APIs and the other use cases supported via Azure Purview APIs.
+- *Microsoft Purview Long-Term Regular Users*: Long-term users who have been using Microsoft Purview for more than one year and are now confident and comfortable with the most advanced Microsoft Purview use cases in the Azure portal and Microsoft Purview Studio. They also have near-perfect knowledge and awareness of the Microsoft Purview REST APIs and of the other use cases supported via those APIs.
-## Azure Purview open-source tools and utilities list
+## Microsoft Purview open-source tools and utilities list
1. [Purview-API-via-PowerShell](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/README.md) - **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: This utility is based on and covers the entire set of [Azure Purview REST API Reference](/rest/api/purview/) Microsoft Docs. [Download & Install from PowerShell Gallery](https://aka.ms/purview-api-ps). It helps you execute all the documented Azure Purview REST APIs through a breezy fast and easy to use PowerShell interface. Use and automate Azure Purview APIs for regular and long-term usage via command-line and scripted methods. This is an alternative for customers looking to do bulk tasks in automated manner, batch-mode, or scheduled cron jobs; as against the GUI method of using the Azure portal and Azure Purview Studio. Detailed documentation, sample usage guide, self-help, and examples are available on [GitHub:Azure-Purview-API-PowerShell](https://github.com/Azure/Azure-Purview-API-PowerShell).
+ - **Description**: This utility is based on, and covers the entire set of, the [Microsoft Purview REST API reference](/rest/api/purview/) on Microsoft Docs. [Download and install it from the PowerShell Gallery](https://aka.ms/purview-api-ps). It helps you execute all the documented Microsoft Purview REST APIs through a fast and easy-to-use PowerShell interface. Use and automate the Microsoft Purview APIs for regular and long-term usage via command-line and scripted methods. This is an alternative for customers looking to run bulk tasks in an automated manner, in batch mode, or as scheduled cron jobs, as opposed to the GUI method of using the Azure portal and Microsoft Purview Studio. Detailed documentation, a sample usage guide, self-help, and examples are available on [GitHub:Azure-Purview-API-PowerShell](https://github.com/Azure/Azure-Purview-API-PowerShell).
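   The PowerShell module wraps the documented Purview REST endpoints; the same endpoints can be called directly from a script. Below is a minimal Python sketch, assuming a placeholder account name (`contoso`) and a bearer token acquired elsewhere via Azure AD; the endpoint shape follows the catalog (Apache Atlas) API surface:

   ```python
   import json
   import urllib.request

   def atlas_endpoint(account_name: str, path: str) -> str:
       """Build the catalog (Apache Atlas) endpoint URL for a Purview account."""
       return f"https://{account_name}.purview.azure.com/catalog/api/atlas/v2/{path.lstrip('/')}"

   def get_entity(account_name: str, guid: str, token: str) -> dict:
       """GET a catalog entity by GUID using a pre-acquired Azure AD bearer token."""
       req = urllib.request.Request(
           atlas_endpoint(account_name, f"entity/guid/{guid}"),
           headers={"Authorization": f"Bearer {token}"},
       )
       with urllib.request.urlopen(req) as resp:
           return json.load(resp)
   ```

   Token acquisition (for example, via a service principal) is intentionally out of scope here; the PowerShell module handles that step for you.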
1. [Purview-Starter-Kit](https://aka.ms/PurviewKickstart) - **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts*
- - **Description**: PowerShell script to perform initial setup of Azure Purview account. Useful for anyone looking to set up several fresh new Azure Purview account(s) in less than 5 minutes!
+ - **Description**: A PowerShell script that performs the initial setup of a Microsoft Purview account. Useful for anyone looking to set up one or more new Microsoft Purview accounts in less than five minutes!
-1. [Azure Purview Lab](https://aka.ms/purviewlab)
+1. [Microsoft Purview Lab](https://aka.ms/purviewlab)
- **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts*
- - **Description**: A hands-on-lab introducing the myriad features of Azure Purview and helping you learn the concepts in a practical and hands-on approach where you execute each step on your own by hand to develop the best possible understanding of Azure Purview.
+ - **Description**: A hands-on lab that introduces the myriad features of Microsoft Purview and helps you learn the concepts in a practical way: you execute each step yourself to develop the best possible understanding of Microsoft Purview.
-1. [Azure Purview CLI](https://aka.ms/purviewcli)
+1. [Microsoft Purview CLI](https://aka.ms/purviewcli)
- **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: Python-based tool to execute the Azure Purview APIs similar to [Purview-API-via-PowerShell](https://aka.ms/purview-api-ps) but has limited/lesser functionality than the PowerShell-based framework.
+ - **Description**: A Python-based tool to execute the Microsoft Purview APIs, similar to [Purview-API-via-PowerShell](https://aka.ms/purview-api-ps) but with less functionality than the PowerShell-based framework.
-1. [Azure Purview Demo](https://aka.ms/pvdemo)
+1. [Microsoft Purview Demo](https://aka.ms/pvdemo)
- **Recommended customer journey stages**: *Learners, Innovators, Enthusiasts*
- - **Description**: An Azure Resource Manager (ARM) template-based tool to automatically set up and deploy fresh new Azure Purview account quickly and securely at the issue of just one command. It is similar to [Purview-Starter-Kit](https://aka.ms/PurviewKickstart), the extra feature being it deploys a few more pre-configured data sources - Azure SQL Database, Azure Data Lake Storage Gen2 Account, Azure Data Factory, Azure Synapse Analytics Workspace
+ - **Description**: An Azure Resource Manager (ARM) template-based tool to automatically set up and deploy a new Microsoft Purview account quickly and securely with a single command. It is similar to [Purview-Starter-Kit](https://aka.ms/PurviewKickstart), but it additionally deploys a few pre-configured data sources: an Azure SQL Database, an Azure Data Lake Storage Gen2 account, an Azure Data Factory, and an Azure Synapse Analytics workspace.
-1. [PyApacheAtlas: Interface between Azure Purview and Apache Atlas](https://github.com/wjohnson/pyapacheatlas) using Atlas APIs
+1. [PyApacheAtlas: Interface between Microsoft Purview and Apache Atlas](https://github.com/wjohnson/pyapacheatlas) using Atlas APIs
- **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: A Python package to work with Azure Purview and Apache Atlas API. Supports bulk loading, custom lineage, and more from a Pythonic set of classes and Excel templates. The package supports programmatic interaction and an Excel template for low-code uploads.
+ - **Description**: A Python package to work with Microsoft Purview and the Apache Atlas API. It supports bulk loading, custom lineage, and more through a Pythonic set of classes for programmatic interaction and Excel templates for low-code uploads.
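   Bulk loading works by posting batches of Atlas-format entity JSON. As a rough illustration of the payload such tools assemble (the type name, server, and qualified name here are invented examples, not from any real catalog), a minimal sketch:

   ```python
   def make_entity(name: str, type_name: str, qualified_name: str, guid: int = -1) -> dict:
       """Build one Atlas-format entity dict for a bulk upload payload."""
       return {
           "typeName": type_name,
           "guid": str(guid),  # negative placeholder GUIDs are resolved server-side on upload
           "attributes": {"name": name, "qualifiedName": qualified_name},
       }

   # A batch payload in the shape the Atlas bulk-entity endpoint expects.
   batch = {"entities": [
       make_entity("sales", "azure_sql_table",
                   "mssql://contoso.database.windows.net/db/dbo/sales", guid=-1),
   ]}
   ```

   PyApacheAtlas's own `AtlasEntity` class produces this structure for you and adds validation on top.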
-1. [Azure Purview Event Hubs Notifications Reader](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/purview_atlas_eventhub_sample.py)
+1. [Microsoft Purview Event Hubs Notifications Reader](https://github.com/Azure/Azure-Purview-API-PowerShell/blob/main/purview_atlas_eventhub_sample.py)
- **Recommended customer journey stages**: *Innovators, Enthusiasts, Adopters, Long-Term Regular Users*
- - **Description**: This tool demonstrates how to read Azure Purview's Event Hubs and catch real-time Kafka notifications from the Event Hubs in Atlas Notifications (https://atlas.apache.org/2.0.0/Notifications.html) format. Further, it generates an excel sheet CSV of the entities and assets on the fly that are discovered live during a scan, and any other notifications of interest that Azure Purview generates.
+ - **Description**: This tool demonstrates how to read Microsoft Purview's Event Hubs and catch real-time Kafka notifications from the Event Hubs in the [Atlas notifications](https://atlas.apache.org/2.0.0/Notifications.html) format. It also generates, on the fly, a CSV file of the entities and assets discovered live during a scan, and of any other notifications of interest that Microsoft Purview generates.
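   The flattening step of that tool can be sketched with the standard library alone. The message shape below is a simplified illustration of an Atlas entity notification, not the full payload, and the blob URL is a made-up example:

   ```python
   import csv
   import io
   import json

   def notification_to_row(raw: str) -> list:
       """Flatten one Atlas-format notification message into a CSV row."""
       msg = json.loads(raw)["message"]
       entity = msg["entity"]
       return [msg["operationType"], entity["typeName"],
               entity["attributes"].get("qualifiedName", "")]

   # Illustrative notification as it might arrive from the Event Hubs endpoint.
   sample = json.dumps({"message": {
       "operationType": "ENTITY_CREATE",
       "entity": {"typeName": "azure_blob_path",
                  "attributes": {"qualifiedName": "https://contoso.blob.core.windows.net/data"}}}})

   buf = io.StringIO()
   writer = csv.writer(buf)
   writer.writerow(["operation", "type", "qualifiedName"])
   writer.writerow(notification_to_row(sample))
   ```

   In the real tool, the consuming loop reads messages from the Event Hubs Kafka endpoint instead of a hard-coded sample.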
## Feedback and disclaimer
purview Tutorial Data Owner Policies Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-data-owner-policies-storage.md
Last updated 04/08/2022
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-[Policies](concept-data-owner-policies.md) in Azure Purview allow you to enable access to data sources that have been registered to a collection. This tutorial describes how a data owner can use Azure Purview to enable access to datasets in Azure Storage through Azure Purview.
+[Policies](concept-data-owner-policies.md) in Microsoft Purview allow you to enable access to data sources that have been registered to a collection. This tutorial describes how a data owner can use Microsoft Purview to enable access to datasets in Azure Storage.
In this tutorial, you learn how to: > [!div class="checklist"] > * Prepare your Azure environment
-> * Configure permissions to allow Azure Purview to connect to your resources
+> * Configure permissions to allow Microsoft Purview to connect to your resources
> * Register your Azure Storage resource for data use governance > * Create and publish a policy for your resource group or subscription
In this tutorial, you learn how to:
[!INCLUDE [Access policies generic configuration](./includes/access-policies-configuration-generic.md)]
-### Register the data sources in Azure Purview for data use governance
+### Register the data sources in Microsoft Purview for data use governance
-Your Azure Storage account needs to be registered in Azure Purview to later define access policies, and during registration we'll enable data use governance. **Data use governance** is an available feature in Azure Purview that allows users to manage access to a resource from within Azure Purview. This allows you to centralize data discovery and access management, however it's a feature that directly impacts your data security.
+Your Azure Storage account needs to be registered in Microsoft Purview before you can define access policies, and during registration we'll enable data use governance. **Data use governance** is a feature in Microsoft Purview that allows users to manage access to a resource from within Microsoft Purview. This allows you to centralize data discovery and access management; however, it's a feature that directly impacts your data security.
> [!WARNING] > Before enabling data use governance for any of your resources, read through our [**data use governance article**](how-to-enable-data-use-governance.md).
To register your resource and enable data use governance, follow these steps:
:::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-access-control.png" alt-text="Screenshot that shows the access control for the storage account":::
-1. Set the **Role** to **Storage Blob Data Reader** and enter your _Azure Purview account name_ under the **Select** input box. Then, select **Save** to give this role assignment to your Azure Purview account.
+1. Set the **Role** to **Storage Blob Data Reader** and enter your _Microsoft Purview account name_ under the **Select** input box. Then, select **Save** to give this role assignment to your Microsoft Purview account.
- :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Azure Purview account":::
+ :::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-assign-permissions.png" alt-text="Screenshot that shows the details to assign permissions for the Microsoft Purview account":::
1. If you have a firewall enabled on your Storage account, follow these steps as well: 1. Go into your Azure Storage account in [Azure portal](https://portal.azure.com).
To register your resource and enable data use governance, follow these steps:
:::image type="content" source="media/tutorial-data-owner-policies-storage/register-blob-permission.png" alt-text="Screenshot that shows the exceptions to allow trusted Microsoft services to access the storage account.":::
-1. Once you have set up authentication for your storage account, go to the [Azure Purview Studio](https://web.purview.azure.com/).
+1. Once you have set up authentication for your storage account, go to the [Microsoft Purview Studio](https://web.purview.azure.com/).
1. Select **Data Map** on the left menu.
- :::image type="content" source="media/tutorial-data-owner-policies-storage/select-data-map.png" alt-text="Screenshot that shows the far left menu in the Azure Purview Studio open with Data Map highlighted.":::
+ :::image type="content" source="media/tutorial-data-owner-policies-storage/select-data-map.png" alt-text="Screenshot that shows the far left menu in the Microsoft Purview Studio open with Data Map highlighted.":::
1. Select **Register**.
- :::image type="content" source="media/tutorial-data-owner-policies-storage/select-register.png" alt-text="Screenshot that shows Azure Purview Studio Data Map sources, with the register button highlighted at the top.":::
+ :::image type="content" source="media/tutorial-data-owner-policies-storage/select-register.png" alt-text="Screenshot that shows Microsoft Purview Studio Data Map sources, with the register button highlighted at the top.":::
1. On **Register sources**, select **Azure Blob Storage**.
To register your resource and enable data use governance, follow these steps:
>If the data use governance toggle is greyed out and unable to be selected: > 1. Confirm you have followed all prerequisites to enable Data use governance across your resources. > 1. Confirm that you have selected a storage account to be registered.
- > 1. It may be that this resource is already registered in another Azure Purview account. Hover over it to know the name of the Azure Purview account that has registered the data resource.first. Only one Azure Purview account can register a resource for data use governance at at time.
+ > 1. It may be that this resource is already registered in another Microsoft Purview account. Hover over it to see the name of the Microsoft Purview account that has already registered the data resource. Only one Microsoft Purview account can register a resource for data use governance at a time.
- 1. Select **Register** to register the resource group or subscription with Azure Purview with data use governance enabled.
+ 1. Select **Register** to register the resource group or subscription in Microsoft Purview with data use governance enabled.
>[!TIP] > For more information about data use governance, including best practices or known issues, see our [data use governance article](how-to-enable-data-use-governance.md). ## Create a data owner policy
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**. 1. Select the **New Policy** button in the policy page.
- :::image type="content" source="./media/access-policies-common/policy-onboard-guide-1.png" alt-text="Data owner can access the Policy functionality in Azure Purview when it wants to create policies.":::
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-1.png" alt-text="Screenshot that shows how a data owner can access the policy functionality in Microsoft Purview to create policies.":::
1. The new policy page will appear. Enter the policy **Name** and **Description**.
To register your resource and enable data use governance, follow these steps:
## Publish a data owner policy
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
- :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Screenshot showing the Azure Purview studio with the leftmost menu open, Policy Management highlighted, and Data Policies selected on the next page.":::
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Screenshot showing the Microsoft Purview studio with the leftmost menu open, Policy Management highlighted, and Data Policies selected on the next page.":::
-1. The Policy portal will present the list of existing policies in Azure Purview. Locate the policy that needs to be published. Select the **Publish** button on the right top corner of the page.
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be published, and select the **Publish** button in the top-right corner of the page.
:::image type="content" source="./media/access-policies-common/publish-policy.png" alt-text="Screenshot showing the policy editing menu with the Publish button highlighted in the top right of the page.":::
To register your resource and enable data use governance, follow these steps:
## Clean up resources
-To delete a policy in Azure Purview, follow these steps:
+To delete a policy in Microsoft Purview, follow these steps:
-1. Sign in to the [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Sign in to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**. :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Screenshot showing the leftmost menu open, Policy Management highlighted, and Data Policies selected on the next page.":::
-1. The Policy portal will present the list of existing policies in Azure Purview. Select the policy that needs to be updated.
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Select the policy that needs to be updated.
1. The policy details page will appear, including Edit and Delete options. Select the **Edit** button, which brings up the policy statement builder. Now, any parts of the statements in this policy can be updated. To delete the policy, use the **Delete** button.
Check our demo and related tutorials:
> [!div class="nextstepaction"] > [Demo of access policy for Azure Storage](https://docs.microsoft.com/video/media/8ce7c554-0d48-430f-8f63-edf94946947c/purview-policy-storage-dataowner-scenario_mid.mp4)
-> [Concepts for Azure Purview data owner policies](./concept-data-owner-policies.md)
-> [Enable Azure Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+> [Concepts for Microsoft Purview data owner policies](./concept-data-owner-policies.md)
+> [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
purview Tutorial Data Sources Readiness https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-data-sources-readiness.md
Title: 'Check data source readiness at scale'
-description: In this tutorial, you'll verify the readiness of your Azure data sources before you register and scan them in Azure Purview.
+description: In this tutorial, you'll verify the readiness of your Azure data sources before you register and scan them in Microsoft Purview.
Last updated 09/27/2021
# Tutorial: Check data source readiness at scale
-To scan data sources, Azure Purview requires access to them. It uses credentials to obtain this access. A *credential* is the authentication information that Azure Purview can use to authenticate to your registered data sources. There are a few ways to set up the credentials for Azure Purview, including:
-- The managed identity assigned to the Azure Purview account.
+To scan data sources, Microsoft Purview requires access to them. It uses credentials to obtain this access. A *credential* is the authentication information that Microsoft Purview can use to authenticate to your registered data sources. There are a few ways to set up the credentials for Microsoft Purview, including:
+- The managed identity assigned to the Microsoft Purview account.
- Secrets stored in Azure Key Vault. - Service principals.
-In this two-part tutorial series, we'll help you verify and configure required Azure role assignments and network access for various Azure data sources across your Azure subscriptions at scale. You can then register and scan your Azure data sources in Azure Purview.
+In this two-part tutorial series, we'll help you verify and configure required Azure role assignments and network access for various Azure data sources across your Azure subscriptions at scale. You can then register and scan your Azure data sources in Microsoft Purview.
-Run the [Azure Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script after you deploy your Azure Purview account and before you register and scan your Azure data sources.
+Run the [Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script after you deploy your Microsoft Purview account and before you register and scan your Azure data sources.
In part 1 of this tutorial series, you'll:
In part 1 of this tutorial series, you'll:
> > * Locate your data sources and prepare a list of data source subscriptions. > * Run the readiness checklist script to find any missing role-based access control (RBAC) or network configurations across your data sources in Azure.
-> * In the output report, review missing network configurations and role assignments required by Azure Purview Managed Identity (MSI).
+> * In the output report, review missing network configurations and role assignments required by Microsoft Purview Managed Identity (MSI).
> * Share the report with data Azure subscription owners so they can take suggested actions. ## Prerequisites * Azure subscriptions where your data sources are located. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-* An [Azure Purview account](create-catalog-portal.md).
+* A [Microsoft Purview account](create-catalog-portal.md).
* An Azure Key Vault resource in each subscription that has data sources like Azure SQL Database, Azure Synapse Analytics, or Azure SQL Managed Instance.
-* The [Azure Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script.
+* The [Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script.
> [!NOTE]
-> The Azure Purview data sources readiness checklist is available only for Windows.
-> This readiness checklist script is currently supported for Azure Purview MSI.
+> The Microsoft Purview data sources readiness checklist is available only for Windows.
+> This readiness checklist script is currently supported only for the Microsoft Purview MSI.
## Prepare Azure subscriptions list for data sources
Before running the script, create a .csv file (for example, C:\temp\Subscription
Follow these steps to run the script from your Windows computer:
-1. [Download the Azure Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script to the location of your choice.
+1. [Download the Microsoft Purview data sources readiness checklist](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-Readiness) script to the location of your choice.
2. On your computer, enter **PowerShell** in the search box on the Windows taskbar. In the search list, select and hold (or right-click) **Windows PowerShell** and then select **Run as administrator**.
Before you run the PowerShell script to verify the readiness of data source subs
- `All` -- `PurviewAccount`: Your existing Azure Purview account resource name.
+- `PurviewAccount`: Your existing Microsoft Purview account resource name.
-- `PurviewSub`: Subscription ID where the Azure Purview account is deployed.
+- `PurviewSub`: Subscription ID where the Microsoft Purview account is deployed.
## Verify your permissions
Role or permission | Scope |
|-|--| | **Global Reader** | Azure AD tenant | | **Reader** | Azure subscriptions where your Azure data sources are located |
-| **Reader** | Subscription where your Azure Purview account was created |
+| **Reader** | Subscription where your Microsoft Purview account was created |
| **SQL Admin** (Azure AD Authentication) | Azure Synapse dedicated pools, Azure SQL Database instances, Azure SQL managed instances | | Access to your Azure key vault | Access to get/list key vault's secret or Azure Key Vault secret user |
You can choose all or any of these data sources as the input parameter when you
#### Azure Blob Storage (BlobStorage) -- RBAC. Check whether Azure Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.-- RBAC. Check whether Azure Purview MSI is assigned the **Reader** role on the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
- Service endpoint. Check whether service endpoint is on, and check whether **Allow trusted Microsoft services to access this storage account** is enabled. - Networking: Check whether private endpoint is created for storage and enabled for Blob Storage. #### Azure Data Lake Storage Gen2 (ADLSGen2) -- RBAC. Check whether Azure Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.-- RBAC. Check whether Azure Purview MSI is assigned the **Reader** role on the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
- Service endpoint. Check whether service endpoint is on, and check whether **Allow trusted Microsoft services to access this storage account** is enabled. - Networking: Check whether private endpoint is created for storage and enabled for Blob Storage. #### Azure Data Lake Storage Gen1 (ADLSGen1) - Networking. Check whether service endpoint is on, and check whether **Allow all Azure services to access this Data Lake Storage Gen1 account** is enabled.-- Permissions. Check whether Azure Purview MSI has Read/Execute permissions.
+- Permissions. Check whether Microsoft Purview MSI has Read/Execute permissions.
#### Azure SQL Database (AzureSQLDB)
You can choose all or any of these data sources as the input parameter when you
- Azure AD administration. Populate the Azure SQL Server Azure AD admin user or group. - SQL databases:
- - SQL role. Check whether Azure Purview MSI is assigned the **db_datareader** role.
+ - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
#### Azure SQL Managed Instance (AzureSQLMI)
You can choose all or any of these data sources as the input parameter when you
- Azure AD administration. Populate the Azure SQL Server Azure AD admin user or group. - SQL databases:
- - SQL role. Check whether Azure Purview MSI is assigned the **db_datareader** role.
+ - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
#### Azure Synapse (Synapse) dedicated pool -- RBAC. Check whether Azure Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.-- RBAC. Check whether Azure Purview MSI is assigned the **Reader** role on the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Storage Blob Data Reader** role in each of the subscriptions below the selected scope.
+- RBAC. Check whether Microsoft Purview MSI is assigned the **Reader** role on the selected scope.
- SQL Server instances (dedicated pools): - Network: Check whether public endpoint or private endpoint is enabled. - Firewall: Check whether **Allow Azure services and resources to access this server** is enabled.
You can choose all or any of these data sources as the input parameter when you
- Azure AD administration: Populate the Azure SQL Server Azure AD admin user or group. - SQL databases:
- - SQL role. Check whether Azure Purview MSI is assigned the **db_datareader** role.
+ - SQL role. Check whether Microsoft Purview MSI is assigned the **db_datareader** role.
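The RBAC checks in the checklist above can be sketched programmatically. The following Python helper is illustrative only: it scans role assignments in the shape returned by `az role assignment list` for the roles the readiness checklist expects; the MSI object ID and sample assignments are assumptions, not real values.

```python
# Hypothetical sketch: verify the RBAC roles the readiness checklist expects
# against role assignments exported from `az role assignment list`.
# The MSI object ID and sample assignments below are illustrative only.

REQUIRED_ROLES = {"Reader", "Storage Blob Data Reader"}

def missing_roles(assignments, msi_object_id):
    """Return the required roles not yet assigned to the given MSI."""
    held = {
        a["roleDefinitionName"]
        for a in assignments
        if a["principalId"] == msi_object_id
    }
    return sorted(REQUIRED_ROLES - held)

sample = [
    {"principalId": "1111-msi", "roleDefinitionName": "Reader"},
    {"principalId": "2222-user", "roleDefinitionName": "Owner"},
]
print(missing_roles(sample, "1111-msi"))  # Storage Blob Data Reader still missing
```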
## Next steps In this tutorial, you learned how to: > [!div class="checklist"] >
-> * Run the Azure Purview readiness checklist to check, at scale, whether your Azure subscriptions are missing configuration, before you register and scan them in Azure Purview.
+> * Run the Microsoft Purview readiness checklist to check, at scale, whether your Azure subscriptions are missing configuration, before you register and scan them in Microsoft Purview.
-Go to the next tutorial to learn how to identify the required access and set up required authentication and network rules for Azure Purview across Azure data sources:
+Go to the next tutorial to learn how to identify the required access and set up required authentication and network rules for Microsoft Purview across Azure data sources:
> [!div class="nextstepaction"]
-> [Configure access to data sources for Azure Purview MSI at scale](tutorial-msi-configuration.md)
+> [Configure access to data sources for Microsoft Purview MSI at scale](tutorial-msi-configuration.md)
purview Tutorial Metadata Policy Collections Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-metadata-policy-collections-apis.md
Title: Learn about Azure Purview collections metadata policy and roles APIs
-description: This tutorial discusses how to manage role-based access control over Azure Purview collections to users, groups, or service principals.
+ Title: Learn about Microsoft Purview collections metadata policy and roles APIs
+description: This tutorial discusses how to manage role-based access control over Microsoft Purview collections to users, groups, or service principals.
Last updated 09/24/2021
-# Customer intent: As an Azure Purview collection administrator, I want to manage collections and control access to each collection in the Azure Purview account by adding or removing users, groups, or service principals via the REST API interface.
+# Customer intent: As a Microsoft Purview collection administrator, I want to manage collections and control access to each collection in the Microsoft Purview account by adding or removing users, groups, or service principals via the REST API interface.
-# Tutorial: Use REST APIs to manage role-based access control on Azure Purview collections
+# Tutorial: Use REST APIs to manage role-based access control on Microsoft Purview collections
-In August 2021, access control in Azure Purview moved from Azure Identity & Access Management (IAM) (control plane) to [Azure Purview collections](how-to-create-and-manage-collections.md) (data plane). This change gives enterprise data curators and administrators more precise, granular access control on their data sources scanned by Azure Purview. The change also enables organizations to audit right access and right use of their data.
+In August 2021, access control in Microsoft Purview moved from Azure Identity & Access Management (IAM) (control plane) to [Microsoft Purview collections](how-to-create-and-manage-collections.md) (data plane). This change gives enterprise data curators and administrators more precise, granular access control on their data sources scanned by Microsoft Purview. The change also enables organizations to audit right access and right use of their data.
-This tutorial guides you through step-by-step usage of the Azure Purview Metadata Policy APIs to help you add users, groups, or service principals to a collection, and manage or remove their roles within that collection. REST APIs are an alternative method to using the Azure portal or Azure Purview Studio to achieve the same granular role-based access control.
+This tutorial guides you through step-by-step usage of the Microsoft Purview Metadata Policy APIs to help you add users, groups, or service principals to a collection, and manage or remove their roles within that collection. REST APIs are an alternative method to using the Azure portal or Microsoft Purview Studio to achieve the same granular role-based access control.
-For more information about the built-in roles in Azure Purview, see the [Azure Purview permissions guide](catalog-permissions.md#roles). The guide maps the roles to the level of access permissions that are granted to users.
+For more information about the built-in roles in Microsoft Purview, see the [Microsoft Purview permissions guide](catalog-permissions.md#roles). The guide maps the roles to the level of access permissions that are granted to users.
## Metadata Policy API Reference summary
-The following table gives an overview of the [Azure Purview Metadata Policy API Reference](/rest/api/purview/metadatapolicydataplane/Metadata-Policy).
+The following table gives an overview of the [Microsoft Purview Metadata Policy API Reference](/rest/api/purview/metadatapolicydataplane/Metadata-Policy).
> [!NOTE]
-> Replace {pv-acc-name} with the name of your Azure Purview account before running these APIs. For instance, if your Azure Purview account name is *FabrikamPurviewAccount*, your API endpoints will become *FabrikamPurviewAccount.purview.azure.com*. The "api-version" parameter is subject to change. Please refer the [Azure Purview Metadata policy REST API documentation](/rest/api/purview/metadatapolicydataplane/Metadata-Policy) for the latest "api-version" and the API signature.
+> Replace {pv-acc-name} with the name of your Microsoft Purview account before running these APIs. For instance, if your Microsoft Purview account name is *FabrikamPurviewAccount*, your API endpoints will become *FabrikamPurviewAccount.purview.azure.com*. The "api-version" parameter is subject to change. Please refer to the [Microsoft Purview Metadata policy REST API documentation](/rest/api/purview/metadatapolicydataplane/Metadata-Policy) for the latest "api-version" and the API signature.
| API&nbsp;function | REST&nbsp;method | API&nbsp;endpoint | Description | | :- | :- | :- | :- |
-| Read All Metadata Roles| GET| https://{pv-acc-name}.purview.azure.com /policystore/metadataroles?&api-version=2021-07-01| Reads all metadata roles from your Azure Purview account.|
-| Read Metadata Policy By Collection Name| GET| https://{pv-acc-name}.purview.azure.com /policystore/collections/{collectionName}/metadataPolicy?&api-version=2021-07-01| Reads the metadata policy by using a specified collection name (the 6-character random name that's generated by Azure Purview when it creates the policy).|
+| Read All Metadata Roles| GET| https://{pv-acc-name}.purview.azure.com /policystore/metadataroles?&api-version=2021-07-01| Reads all metadata roles from your Microsoft Purview account.|
+| Read Metadata Policy By Collection Name| GET| https://{pv-acc-name}.purview.azure.com /policystore/collections/{collectionName}/metadataPolicy?&api-version=2021-07-01| Reads the metadata policy by using a specified collection name (the 6-character random name that's generated by Microsoft Purview when it creates the policy).|
| Read Metadata Policy By PolicyID| GET| https://{pv-acc-name}.purview.azure.com /policystore/metadataPolicies/{policyId}?&api-version=2021-07-01| Reads the metadata policy by using a specified policy ID. The policy ID is in GUID format.|
-| Read All Metadata Policies| GET| https://{pv-acc-name}.purview.azure.com /policystore/metadataPolicies?&api-version=2021-07-01| Reads all metadata policies from your Azure Purview account. You can pick a certain policy to work with from the JSON output list that's generated by this API.|
+| Read All Metadata Policies| GET| https://{pv-acc-name}.purview.azure.com /policystore/metadataPolicies?&api-version=2021-07-01| Reads all metadata policies from your Microsoft Purview account. You can pick a certain policy to work with from the JSON output list that's generated by this API.|
| Update/PUT Metadata Policy| PUT| https://{pv-acc-name}.purview.azure.com /policystore/metadataPolicies/{policyId}?&api-version=2021-07-01| Updates the metadata policy by using a specified policy ID. The policy ID is in GUID format.| | | |
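As a sketch, the endpoints in the table above can be composed programmatically. The Python helper below is illustrative only: the account name is a placeholder, and the `api-version` value is subject to change, as the note above explains.

```python
# Minimal sketch: build Metadata Policy API endpoints for a Purview account.
# The account name is a placeholder; api-version is subject to change.
API_VERSION = "2021-07-01"

def metadata_policy_url(account, policy_id=None, collection=None):
    """Return the endpoint for the read-all, by-collection, or by-ID call."""
    base = f"https://{account}.purview.azure.com/policystore"
    if collection:
        return f"{base}/collections/{collection}/metadataPolicy?api-version={API_VERSION}"
    if policy_id:
        return f"{base}/metadataPolicies/{policy_id}?api-version={API_VERSION}"
    return f"{base}/metadataPolicies?api-version={API_VERSION}"

print(metadata_policy_url("FabrikamPurviewAccount"))
```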
-## Azure Purview catalog collections API reference summary
-The following table gives an overview of the Azure Purview collections APIs. For complete documentation about each API, select the API operation in the left column.
+## Microsoft Purview catalog collections API reference summary
+The following table gives an overview of the Microsoft Purview collections APIs. For complete documentation about each API, select the API operation in the left column.
| Operation | Description | | :- | :- |
The following table gives an overview of the Azure Purview collections APIs. For
| [List collections](/rest/api/purview/accountdataplane/collections/list-collections) | Lists the collections in the account.| -- If you're using the API, the service principal, user, or group that executes the API should have a [Collection Admin](how-to-create-and-manage-collections.md#check-permissions) role assigned in Azure Purview to execute this API successfully.
+- If you're using the API, the service principal, user, or group that executes the API should have a [Collection Admin](how-to-create-and-manage-collections.md#check-permissions) role assigned in Microsoft Purview to execute this API successfully.
-- For all Azure Purview APIs that require {collectionName}, you will need to use *"name"* (and not *"friendlyName"*). Replace {collectionName} with the actual six-character alphanumeric collection name string.
+- For all Microsoft Purview APIs that require {collectionName}, you will need to use *"name"* (and not *"friendlyName"*). Replace {collectionName} with the actual six-character alphanumeric collection name string.
> [!NOTE] > This name is different from the friendly display name you supplied when you created the collection. If you don't have {collectionName} handy, use the [List Collections API](/rest/api/purview/accountdataplane/collections/list-collections) to select the six-character collection name from the JSON output.
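To illustrate the note above, this sketch maps friendly display names to the six-character collection `"name"` values in a List Collections response. The sample payload is a hypothetical, abbreviated response, not the full API output.

```python
import json

# Illustrative sketch: map friendly display names to the six-character
# collection "name" values from a List Collections API response.
# The sample payload below is a hypothetical, abbreviated response.
sample_response = json.loads("""
{
  "value": [
    {"name": "abcde1", "friendlyName": "Finance"},
    {"name": "xyz789", "friendlyName": "Sales"}
  ]
}
""")

name_by_friendly = {
    c["friendlyName"]: c["name"] for c in sample_response["value"]
}
print(name_by_friendly["Finance"])  # abcde1 -- use this as {collectionName}
```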
Here are some of the important identifiers in the JSON output that's received fr
## Add or remove users from a collection or role
-Use Azure Purview REST APIs to add or remove a user, group, or service principal from a collection or role. Detailed API usage is provided along with sample JSON outputs. We highly recommend that you follow the instructions in the next sections sequentially for best understanding of the Azure Purview metadata policy APIs.
+Use Microsoft Purview REST APIs to add or remove a user, group, or service principal from a collection or role. Detailed API usage is provided along with sample JSON outputs. We highly recommend that you follow the instructions in the next sections sequentially for best understanding of the Microsoft Purview metadata policy APIs.
## Get all metadata roles
The default metadata roles are listed in the following table:
"properties": { "provisioningState": "Provisioned", "roleType": "BuiltIn",
- "friendlyName": "Azure Purview Reader",
+ "friendlyName": "Microsoft Purview Reader",
"cnfCondition": [ [ {
As described in the following two sections, both APIs serve the same purpose, an
GET https://{your_purview_account_name}.purview.azure.com/policystore/collections/{collectionName}/metadataPolicy?api-version=2021-07-01 ```
-1. The Azure Purview account name is {your_purview_account_name}. Replace it with your Azure Purview account name.
+1. The Microsoft Purview account name is {your_purview_account_name}. Replace it with your Microsoft Purview account name.
1. In the JSON output of the previous API, "Get All Metadata Policies," look for the following section:
GET https://{your_purview_account_name}.purview.azure.com/policystore/collection
GET https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version=2021-07-01 ```
-1. The Azure Purview account name is {your_purview_account_name}. Replace it with your Azure Purview account name.
+1. The Microsoft Purview account name is {your_purview_account_name}. Replace it with your Microsoft Purview account name.
1. In the JSON output of the previous API, "Get All Metadata Policies," look for the following section:
GET https://{your_purview_account_name}.purview.azure.com/policystore/metadataPo
PUT https://{your_purview_account_name}.purview.azure.com/policystore/metadataPolicies/{policyId}?api-version=2021-07-01 ```
-In this section, you update the policy JSON that you obtained in the preceding step by adding or removing a user, group, or service principal from the collection. You then push it to the Azure Purview service by using a PUT REST method.
+In this section, you update the policy JSON that you obtained in the preceding step by adding or removing a user, group, or service principal from the collection. You then push it to the Microsoft Purview service by using a PUT REST method.
Whether you're adding or removing a user, group, or service principal, you'll follow the same API process.
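As an illustrative sketch only, adding a principal's object ID to a role's membership list before the PUT could look like the code below. The `attributeRules`/`attributeValueIncludedIn` shape shown here is an assumption about the policy JSON, not an authoritative contract; always start from the JSON returned by the GET call for your own policy.

```python
# Hypothetical sketch: add a principal's object ID to a role's membership
# list inside a metadata policy JSON before PUTting it back.
# The attributeRules/attributeValueIncludedIn shape is an assumption here;
# always start from the JSON returned by the GET call for your own policy.

def add_principal(policy, role_rule_id, object_id):
    """Append object_id to every membership list of the matching role rule."""
    for rule in policy["properties"]["attributeRules"]:
        if rule["id"] != role_rule_id:
            continue
        for clause in rule["dnfCondition"]:
            for cond in clause:
                members = cond.get("attributeValueIncludedIn")
                if members is not None and object_id not in members:
                    members.append(object_id)
    return policy

policy = {"properties": {"attributeRules": [{
    "id": "purviewmetadatarole:reader:abcde1",
    "dnfCondition": [[{"attributeName": "principal.microsoft.id",
                       "attributeValueIncludedIn": ["1111"]}]],
}]}}
add_principal(policy, "purviewmetadatarole:reader:abcde1", "2222")
```

Removal would follow the same pattern with `members.remove(object_id)` instead of `append`.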
Whether you're adding or removing a user, group, or service principal, you'll fo
} ``` ## Add the Root Collection Administrator role
-By default, the user who created the Azure Purview account is the Root Collection Administrator (that is, the administrator of the topmost level of the collection hierarchy). However, in some cases, an organization needs to change the Root Collection Administrator by using the API. For instance, it's possible that the current Root Collection Administrator no longer exists in the organization. In such a case, the Azure portal might be inaccessible to anyone in the organization. For this reason, using the API to assign a new Root Collection Administrator and manage collection permissions becomes the only way to regain access to the Azure Purview account.
+By default, the user who created the Microsoft Purview account is the Root Collection Administrator (that is, the administrator of the topmost level of the collection hierarchy). However, in some cases, an organization needs to change the Root Collection Administrator by using the API. For instance, it's possible that the current Root Collection Administrator no longer exists in the organization. In such a case, the Azure portal might be inaccessible to anyone in the organization. For this reason, using the API to assign a new Root Collection Administrator and manage collection permissions becomes the only way to regain access to the Microsoft Purview account.
```ruby POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Purview/accounts/{accountName}/addRootCollectionAdmin?api-version=2021-07-01
To run the preceding command, you need only to pass the new Root Collection Admi
``` > [!NOTE]
-> Users who call this API must have Owner or User Account and Authentication (UAA) permissions on the Azure Purview account to execute a write action on the account.
+> Users who call this API must have Owner or User Account and Authentication (UAA) permissions on the Microsoft Purview account to execute a write action on the account.
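The addRootCollectionAdmin call above goes through the ARM endpoint rather than the account's data plane. A minimal sketch that composes that URL; the subscription ID, resource group, and account name are placeholders:

```python
# Sketch: compose the ARM endpoint for addRootCollectionAdmin.
# Subscription, resource group, and account values are placeholders.
def root_admin_url(subscription_id, resource_group, account):
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Purview/accounts/{account}"
        "/addRootCollectionAdmin?api-version=2021-07-01"
    )

print(root_admin_url("0000-0000", "purview-rg", "FabrikamPurviewAccount"))
```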
## Additional resources
-You may choose to execute Azure Purview REST APIs by using the [PowerShell utility](https://aka.ms/purview-api-ps). It can be readily installed from PowerShell Gallery. With this utility, you can execute all the same commands, but from Windows PowerShell.
+You may choose to execute Microsoft Purview REST APIs by using the [PowerShell utility](https://aka.ms/purview-api-ps). It can be readily installed from the PowerShell Gallery. With this utility, you can execute all the same commands, but from Windows PowerShell.
## Next steps
purview Tutorial Msi Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-msi-configuration.md
Title: 'Configure access to data sources for Azure Purview MSI at scale'
+ Title: 'Configure access to data sources for Microsoft Purview MSI at scale'
description: In this tutorial, you'll configure Azure MSI settings on your Azure data source subscriptions.
Last updated 09/27/2021 # Customer intent: As a data steward or catalog administrator, I need to onboard Azure data sources at scale before I register and scan them.
-# Tutorial: Configure access to data sources for Azure Purview MSI at scale
+# Tutorial: Configure access to data sources for Microsoft Purview MSI at scale
-To scan data sources, Azure Purview requires access to them. This tutorial is intended for Azure subscription owners and Azure Purview Data Source Administrators. It will help you identify required access and set up required authentication and network rules for Azure Purview across Azure data sources.
+To scan data sources, Microsoft Purview requires access to them. This tutorial is intended for Azure subscription owners and Microsoft Purview Data Source Administrators. It will help you identify required access and set up required authentication and network rules for Microsoft Purview across Azure data sources.
In part 2 of this tutorial series, you'll:
In part 2 of this tutorial series, you'll:
## Prerequisites * Azure subscriptions where your data sources are located. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-* An [Azure Purview account](create-catalog-portal.md).
+* A [Microsoft Purview account](create-catalog-portal.md).
* An Azure Key Vault resource in each subscription that has data sources like Azure SQL Database, Azure Synapse Analytics, or Azure SQL Managed Instance.
-* The [Azure Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script.
+* The [Microsoft Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script.
> [!NOTE]
-> The Azure Purview MSI Configuration script is available only for Windows.
-> This script is currently supported for Azure Purview Managed Identity (MSI).
+> The Microsoft Purview MSI Configuration script is available only for Windows.
+> This script is currently supported for Microsoft Purview Managed Identity (MSI).
> [!IMPORTANT] > We strongly recommend that you test and verify all the changes the script performs in your Azure environment before you deploy it into your production environment.
Before you run the script, create a .csv file (for example, "C:\temp\Subscriptio
Follow these steps to run the script from your Windows computer:
-1. [Download Azure Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script to the location of your choice.
+1. [Download Microsoft Purview MSI Configuration](https://github.com/Azure/Purview-Samples/tree/master/Data-Source-MSI-Configuration) script to the location of your choice.
2. On your computer, enter **PowerShell** in the search box on the Windows taskbar. In the search list, select and hold (or right-click) **Windows PowerShell** and then select **Run as administrator**.
Before you run the PowerShell script to verify the readiness of data source subs
- `All` -- `PurviewAccount`: Your existing Azure Purview account resource name.
+- `PurviewAccount`: Your existing Microsoft Purview account resource name.
-- `PurviewSub`: Subscription ID where the Azure Purview account is deployed.
+- `PurviewSub`: Subscription ID where the Microsoft Purview account is deployed.
## Verify your permissions
At a minimum, you need the following permissions to run the script in your Azure
Role | Scope | Why is it needed? | |-|--|--|
-| **Global Reader** | Azure AD tenant | To read Azure SQL Admin user group membership and Azure Purview MSI |
+| **Global Reader** | Azure AD tenant | To read Azure SQL Admin user group membership and Microsoft Purview MSI |
| **Global Administrator** | Azure AD tenant | To assign the **Directory Reader** role to Azure SQL managed instances |
-| **Contributor** | Subscription or resource group where your Azure Purview account is created | To read the Azure Purview account resource and create a Key Vault resource and secret |
+| **Contributor** | Subscription or resource group where your Microsoft Purview account is created | To read the Microsoft Purview account resource and create a Key Vault resource and secret |
| **Owner or User Access Administrator** | Management group or subscription where your Azure data sources are located | To assign RBAC | | **Contributor** | Management group or subscription where your Azure data sources are located | To set up network configuration |
-| **SQL Admin** (Azure AD Authentication) | Azure SQL Server instances or Azure SQL managed instances | To assign the **db_datareader** role to Azure Purview |
+| **SQL Admin** (Azure AD Authentication) | Azure SQL Server instances or Azure SQL managed instances | To assign the **db_datareader** role to Microsoft Purview |
| Access to your Azure key vault | Access to get/list Key Vault secret for Azure SQL Database, Azure SQL Managed Instance, or Azure Synapse authentication |
This script can help you automatically complete the following tasks:
#### Azure Blob Storage (BlobStorage) -- RBAC. Assign the Azure RBAC **Reader** role to Azure Purview MSI on the selected scope. Verify the assignment. -- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Azure Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
+- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
+- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
- Networking. Report whether private endpoint is created for storage and enabled for Blob Storage. - Service endpoint. If private endpoint is off, check whether service endpoint is on, and enable **Allow trusted Microsoft services to access this storage account**. #### Azure Data Lake Storage Gen2 (ADLSGen2) -- RBAC. Assign the Azure RBAC **Reader** role to Azure Purview MSI on the selected scope. Verify the assignment. -- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Azure Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
+- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
+- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
- Networking. Report whether private endpoint is created for storage and enabled for Blob Storage. - Service endpoint. If private endpoint is off, check whether service endpoint is on, and enable **Allow trusted Microsoft services to access this storage account**. #### Azure Data Lake Storage Gen1 (ADLSGen1) - Networking. Verify that service endpoint is on, and enable **Allow all Azure services to access this Data Lake Storage Gen1 account** on Data Lake Storage.-- Permissions. Assign Read/Execute access to Azure Purview MSI. Verify the access.
+- Permissions. Assign Read/Execute access to Microsoft Purview MSI. Verify the access.
#### Azure SQL Database (AzureSQLDB)
This script can help you automatically complete the following tasks:
- Azure AD administration. Enable Azure AD authentication for Azure SQL Database. - SQL databases:
- - SQL role. Assign the **db_datareader** role to Azure Purview MSI.
+ - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
#### Azure SQL Managed Instance (AzureSQLMI)
This script can help you automatically complete the following tasks:
- Azure AD administration. Enable Azure AD authentication for Azure SQL Managed Instance. - SQL databases:
- - SQL role. Assign the **db_datareader** role to Azure Purview MSI.
+ - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
#### Azure Synapse (Synapse) dedicated pool -- RBAC. Assign the Azure RBAC **Reader** role to Azure Purview MSI on the selected scope. Verify the assignment. -- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Azure Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
+- RBAC. Assign the Azure RBAC **Reader** role to Microsoft Purview MSI on the selected scope. Verify the assignment.
+- RBAC. Assign the Azure RBAC **Storage Blob Data Reader** role to Microsoft Purview MSI in each of the subscriptions below the selected scope. Verify the assignments.
- SQL Server instances (dedicated pools): - Network. Report whether public endpoint or private endpoint is on. - Firewall. If private endpoint is off, verify firewall rules and enable **Allow Azure services and resources to access this server**. - Azure AD administration. Enable Azure AD authentication for Azure SQL Database. - SQL databases:
- - SQL role. Assign the **db_datareader** role to Azure Purview MSI.
+ - SQL role. Assign the **db_datareader** role to Microsoft Purview MSI.
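The **db_datareader** assignment the script performs follows the standard Azure AD authentication pattern for Azure SQL. A sketch that generates the equivalent T-SQL a SQL admin would run per database; the account name is a placeholder:

```python
# Sketch: generate the T-SQL a SQL admin would run to grant the Purview
# MSI the db_datareader role in a database (account name is a placeholder).
def grant_db_datareader(purview_account):
    return (
        f"CREATE USER [{purview_account}] FROM EXTERNAL PROVIDER;\n"
        f"EXEC sp_addrolemember N'db_datareader', [{purview_account}];"
    )

print(grant_db_datareader("FabrikamPurviewAccount"))
```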
## Next steps In this tutorial, you learned how to: > [!div class="checklist"] >
-> * Identify required access and set up required authentication and network rules for Azure Purview across Azure data sources.
+> * Identify required access and set up required authentication and network rules for Microsoft Purview across Azure data sources.
-Go to the next tutorial to learn how to [Register and scan multiple sources in Azure Purview](register-scan-azure-multiple-sources.md).
+Go to the next tutorial to learn how to [Register and scan multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md).
purview Tutorial Purview Audit Logs Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-purview-audit-logs-diagnostics.md
Title: Enable and capture Azure Purview audit logs and time series activity history via Azure Diagnostics event hubs
-description: This tutorial lists the step-by-step configuration required to enable and capture Azure Purview audit logs and time series activity history via Azure Diagnostics event hubs.
+ Title: Enable and capture Microsoft Purview audit logs and time series activity history via Azure Diagnostics event hubs
+description: This tutorial lists the step-by-step configuration required to enable and capture Microsoft Purview audit logs and time series activity history via Azure Diagnostics event hubs.
Last updated 02/10/2022
-# Azure Purview: Audit logs, diagnostics, and activity history
+# Microsoft Purview: Audit logs, diagnostics, and activity history
-This tutorial lists the step-by-step configuration required to enable and capture Azure Purview audit and diagnostics logs via Azure Event Hubs.
+This tutorial lists the step-by-step configuration required to enable and capture Microsoft Purview audit and diagnostics logs via Azure Event Hubs.
-An Azure Purview administrator or Azure Purview data-source admin needs the ability to monitor audit and diagnostics logs captured from [Azure Purview](https://azure.microsoft.com/services/purview/#get-started). Audit and diagnostics information consists of the timestamped history of actions taken and changes made to an Azure Purview account by every user. Captured activity history includes actions in the [Azure Purview portal](https://ms.web.purview.azure.com) and outside the portal. Actions outside the portal include calling [Azure Purview REST APIs](/rest/api/purview/) to perform write operations.
+A Microsoft Purview administrator or Microsoft Purview data-source admin needs the ability to monitor audit and diagnostics logs captured from [Microsoft Purview](https://azure.microsoft.com/services/purview/#get-started). Audit and diagnostics information consists of the timestamped history of actions taken and changes made to a Microsoft Purview account by every user. Captured activity history includes actions in the [Microsoft Purview portal](https://ms.web.purview.azure.com) and outside the portal. Actions outside the portal include calling [Microsoft Purview REST APIs](/rest/api/purview/) to perform write operations.
-This tutorial takes you through the steps to enable audit logging on Azure Purview. It also shows you how to configure and capture streaming audit events from Azure Purview via Azure Diagnostics event hubs.
+This tutorial takes you through the steps to enable audit logging on Microsoft Purview. It also shows you how to configure and capture streaming audit events from Microsoft Purview via Azure Diagnostics event hubs.
-## Azure Purview audit events categories
+## Microsoft Purview audit events categories
-Some of the important categories of Azure Purview audit events that are currently available for capture and analysis are listed in the table.
+Some of the important categories of Microsoft Purview audit events that are currently available for capture and analysis are listed in the table.
-More types and categories of activity audit events will be added to Azure Purview.
+More types and categories of activity audit events will be added to Microsoft Purview.
| Category | Activity | Operation | |||--|
More types and categories of activity audit events will be added to Azure Purvie
| Management | Data source | Update | | Management | Data source | Delete |
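Once events flow into the event hub, consumers can filter captured records by the categories in the table above. The record shape in this sketch mimics Azure diagnostic log records but is illustrative, not an authoritative schema; the operation names are hypothetical.

```python
import json

# Hypothetical sketch: filter audit events captured from the event hub by
# category. The record shape below mimics Azure diagnostic log records but
# is illustrative, not an authoritative schema.
captured = json.loads("""
{"records": [
  {"category": "Audit", "operationName": "DataSource_Update",
   "time": "2022-02-10T00:00:00Z"},
  {"category": "AllMetrics", "operationName": "DataMapCapacityUnits",
   "time": "2022-02-10T00:01:00Z"}
]}
""")

audit_events = [r for r in captured["records"] if r["category"] == "Audit"]
print(len(audit_events))  # 1
```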
-## Enable Azure Purview audit and diagnostics
+## Enable Microsoft Purview audit and diagnostics
-The following sections walk you through the process of enabling Azure Purview audit and diagnostics.
+The following sections walk you through the process of enabling Microsoft Purview audit and diagnostics.
### Configure Event Hubs
For step-by-step explanations and manual setup:
- [Event Hubs: Use an ARM template to enable Event Hubs capture](../event-hubs/event-hubs-resource-manager-namespace-event-hub-enable-capture.md) - [Event Hubs: Enable capturing of events streaming manually by using the Azure portal](../event-hubs/event-hubs-capture-enable-through-portal.md)
-### Connect an Azure Purview account to Diagnostics event hubs
+### Connect a Microsoft Purview account to Diagnostics event hubs
-Now that Event Hubs is deployed and created, connect Azure Purview diagnostics audit logging to Event Hubs.
+Now that Event Hubs is deployed and created, connect Microsoft Purview diagnostics audit logging to Event Hubs.
-1. Go to your Azure Purview account home page. This page is where the overview information is displayed. It's not the Azure Purview Studio home page.
+1. Go to your Microsoft Purview account home page. This page is where the overview information is displayed. It's not the Microsoft Purview Studio home page.
1. On the left menu, select **Monitoring** > **Diagnostic settings**. :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-e.png" alt-text="Screenshot that shows selecting Diagnostic settings." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-e.png":::
-1. Select **Add diagnostic setting** or **Edit setting**. Adding more than one diagnostic setting row in the context of Azure Purview isn't recommended. In other words, if you already have a diagnostic setting row, don't select **Add diagnostic**. Select **Edit** instead.
+1. Select **Add diagnostic setting** or **Edit setting**. Adding more than one diagnostic setting row in the context of Microsoft Purview isn't recommended. In other words, if you already have a diagnostic setting row, don't select **Add diagnostic**. Select **Edit** instead.
:::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-f.png" alt-text="Screenshot that shows the Add or Edit Diagnostic settings screen." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-f.png":::
-1. Select the **audit** and **allLogs** checkboxes to enable collection of Azure Purview audit logs. Optionally, select **AllMetrics** if you also want to capture Data Map capacity units and Data Map size metrics of the Azure Purview account.
+1. Select the **audit** and **allLogs** checkboxes to enable collection of Microsoft Purview audit logs. Optionally, select **AllMetrics** if you also want to capture Data Map capacity units and Data Map size metrics of the Microsoft Purview account.
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png" alt-text="Screenshot that shows configuring Azure Purview Diagnostic settings and selecting diagnostic types." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png":::
+ :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png" alt-text="Screenshot that shows configuring Microsoft Purview Diagnostic settings and selecting diagnostic types." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-g.png":::
-Diagnostics configuration on the Azure Purview account is complete.
+Diagnostics configuration on the Microsoft Purview account is complete.
-Now that Azure Purview diagnostics audit logging configuration is complete, configure the data capture and data retention settings for Event Hubs.
+Now that Microsoft Purview diagnostics audit logging configuration is complete, configure the data capture and data retention settings for Event Hubs.
1. Go to the [Azure portal](https://portal.azure.com) home page, and search for the name of the Event Hubs namespace you created earlier.
Now that Azure Purview diagnostics audit logging configuration is complete, conf
:::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-i.png" alt-text="Screenshot that shows Event Hubs Properties message retention period." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-i.png":::
-1. At this stage, the Event Hubs configuration is complete. Azure Purview will start streaming all its audit history and diagnostics data to this event hub. You can now proceed to read, extract, and perform further analytics and operations on the captured diagnostics and audit events.
+1. At this stage, the Event Hubs configuration is complete. Microsoft Purview will start streaming all its audit history and diagnostics data to this event hub. You can now proceed to read, extract, and perform further analytics and operations on the captured diagnostics and audit events.
### Read captured audit events
-To analyze the captured audit and diagnostics log data from Azure Purview:
+To analyze the captured audit and diagnostics log data from Microsoft Purview:
-1. Go to **Process data** on the Event Hubs page to see a preview of the captured Azure Purview audit logs and diagnostics.
+1. Go to **Process data** on the Event Hubs page to see a preview of the captured Microsoft Purview audit logs and diagnostics.
:::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-d.png" alt-text="Screenshot that shows configuring Event Hubs Process data." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-d.png":::
To analyze the captured audit and diagnostics log data from Azure Purview:
1. Switch between the **Table** and **Raw** views of the JSON output.
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png" alt-text="Screenshot that shows exploring Azure Purview audit events on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png":::
+ :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png" alt-text="Screenshot that shows exploring Microsoft Purview audit events on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-a.png":::
1. Select **Download sample data** and analyze the results carefully.
- :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png" alt-text="Screenshot that shows Query and Process Azure Purview Audit data on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png":::
+ :::image type="content" source="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png" alt-text="Screenshot that shows Query and Process Microsoft Purview Audit data on Event Hubs." lightbox="./media/tutorial-purview-audit-logs-diagnostics/azure-purview-diagnostics-audit-eventhub-b.png":::
Now that you know how to gather this information, you can use automatic, scheduled scripts to extract, read, and perform further analytics on the Event Hubs audit and diagnostics data. You can even build your own utilities and custom code to extract business value from captured audit events.
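As a starting point for such a script, the sketch below parses a captured diagnostics payload and tallies audit events by operation. It is a minimal illustration only: the record layout (a top-level `records` array with `time`, `category`, `operationName`, and `resultType` fields) is an assumption modeled on common Azure Monitor resource-log shapes, so check it against the sample data you downloaded before relying on it.

```python
import json

# Hypothetical sample of one captured diagnostics payload. The field names
# here are assumptions for illustration; verify them against the sample
# data downloaded from your own event hub.
sample = json.dumps({
    "records": [
        {
            "time": "2022-04-20T01:12:23Z",
            "category": "audit",
            "operationName": "CreateOrUpdateEntity",
            "resultType": "Success",
        }
    ]
})

def summarize_audit_events(payload: str) -> dict:
    """Count captured audit records grouped by operation name."""
    counts = {}
    for record in json.loads(payload).get("records", []):
        op = record.get("operationName", "unknown")
        counts[op] = counts.get(op, 0) + 1
    return counts

print(summarize_audit_events(sample))  # → {'CreateOrUpdateEntity': 1}
```

The same grouping idea extends to filtering by `resultType` or bucketing by time window once the real schema is confirmed.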
While you're free to use any programming or scripting language of your choice to
## Next steps
-Enable diagnostic audit logging and kickstart your Azure Purview journey.
+Enable diagnostic audit logging and kickstart your Microsoft Purview journey.
> [!div class="nextstepaction"]
-> [Azure Purview: Automated new account setup](https://aka.ms/PurviewKickstart)
+> [Microsoft Purview: Automated new account setup](https://aka.ms/PurviewKickstart)
purview Tutorial Register Scan On Premises Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-register-scan-on-premises-sql-server.md
Title: 'Tutorial: Register and scan an on-premises SQL Server'
-description: This tutorial describes how to register an on-prem SQL Server to Azure Purview, and scan the server using a self-hosted IR.
+description: This tutorial describes how to register an on-prem SQL Server to Microsoft Purview, and scan the server using a self-hosted IR.
# Tutorial: Register and scan an on-premises SQL Server
-Azure Purview is designed to connect to data sources to help you manage sensitive data, simplify data discovery, and ensure right use. Azure Purview can connect to sources across your entire landscape, including multi-cloud and on-premises. For this scenario, you'll use a self-hosted integration runtime to connect to data on an on-premises SQL server. Then you'll use Azure Purview to scan and classify that data.
+Microsoft Purview is designed to connect to data sources to help you manage sensitive data, simplify data discovery, and ensure right use. Microsoft Purview can connect to sources across your entire landscape, including multi-cloud and on-premises. For this scenario, you'll use a self-hosted integration runtime to connect to data on an on-premises SQL server. Then you'll use Microsoft Purview to scan and classify that data.
In this tutorial, you'll learn how to: > [!div class="checklist"]
-> * Sign in to the Azure Purview Studio.
-> * Create a collection in Azure Purview.
+> * Sign in to the Microsoft Purview Studio.
+> * Create a collection in Microsoft Purview.
> * Create a self-hosted integration runtime. > * Store credentials in an Azure Key Vault.
-> * Register an on-premises SQL Server to Azure Purview.
+> * Register an on-premises SQL Server to Microsoft Purview.
> * Scan the SQL Server. > * Browse your data catalog to view assets in your SQL Server.
In this tutorial, you'll learn how to:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - An active [Azure Key Vault](../key-vault/general/quick-create-portal.md).-- An Azure Purview account. If you don't already have one, you can [follow our quickstart guide to create one](create-catalog-portal.md).
+- A Microsoft Purview account. If you don't already have one, you can [follow our quickstart guide to create one](create-catalog-portal.md).
- An [on-premises SQL Server](https://www.microsoft.com/sql-server/sql-server-downloads).
-## Sign in to Azure Purview Studio
+## Sign in to Microsoft Purview Studio
-To interact with Azure Purview, you'll connect to the [Azure Purview Studio](https://web.purview.azure.com/resource/) through the Azure portal. You can find the studio by going to your Azure Purview account in the [Azure portal](https://portal.azure.com), and selecting the **Open Azure Purview Studio** tile on the overview page.
+To interact with Microsoft Purview, you'll connect to the [Microsoft Purview Studio](https://web.purview.azure.com/resource/) through the Azure portal. You can find the studio by going to your Microsoft Purview account in the [Azure portal](https://portal.azure.com), and selecting the **Open Microsoft Purview Studio** tile on the overview page.
## Create a collection
-Collections in Azure Purview are used to organize assets and sources into a custom hierarchy for organization and discoverability. They're also the tool used to manage access across Azure Purview. In this tutorial, we'll create one collection to house your SQL Server source and all its assets. This tutorial won't cover information about assigning permissions to other users, so for more information you can follow our [Azure Purview permissions guide](catalog-permissions.md).
+Collections in Microsoft Purview are used to organize assets and sources into a custom hierarchy for organization and discoverability. They're also the tool used to manage access across Microsoft Purview. In this tutorial, we'll create one collection to house your SQL Server source and all its assets. This tutorial won't cover information about assigning permissions to other users, so for more information you can follow our [Microsoft Purview permissions guide](catalog-permissions.md).
### Check permissions
-To create and manage collections in Azure Purview, you'll need to be a **Collection Admin** within Azure Purview. We can check these permissions in the [Azure Purview Studio](use-azure-purview-studio.md).
+To create and manage collections in Microsoft Purview, you'll need to be a **Collection Admin** within Microsoft Purview. We can check these permissions in the [Microsoft Purview Studio](use-azure-purview-studio.md).
1. Select **Data Map > Collections** from the left pane to open the collection management page.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/find-collections.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/find-collections.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the Collections tab selected." border="true":::
-1. Select your root collection. The root collection is the top collection in your collection list and will have the same name as your Azure Purview account. In our example below, it is called Azure Purview Account.
+1. Select your root collection. The root collection is the top collection in your collection list and will have the same name as your Microsoft Purview account. In our example below, it is called Microsoft Purview Account.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-root-collection.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-root-collection.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the root collection highlighted." border="true":::
1. Select **Role assignments** in the collection window.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/role-assignments.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/role-assignments.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the role assignments tab highlighted." border="true":::
-1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the Azure Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
+1. To create a collection, you'll need to be in the collection admin list under role assignments. If you created the Microsoft Purview account, you should be listed as a collection admin under the root collection already. If not, you'll need to contact the collection admin to grant you permission.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/collection-admins.png" alt-text="Screenshot of Azure Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/collection-admins.png" alt-text="Screenshot of Microsoft Purview studio window, opened to the Data Map, with the collection admin section highlighted." border="true":::
### Create the collection 1. Select **+ Add a collection**. Again, only [collection admins](#check-permissions) can manage collections.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-add-a-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the new collection window, with the 'add a collection' buttons highlighted." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/select-add-a-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the new collection window, with the 'add a collection' buttons highlighted." border="true":::
1. In the right panel, enter the collection name and description. If needed you can also add users or groups as collection admins to the new collection. 1. Select **Create**.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/create-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/create-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the new collection window, with a display name and collection admins selected, and the create button highlighted." border="true":::
1. The new collection's information will reflect on the page.
- :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/created-collection.png" alt-text="Screenshot of Azure Purview studio window, showing the newly created collection window." border="true":::
+ :::image type="content" source="./media/tutorial-register-scan-on-premises-sql-server/created-collection.png" alt-text="Screenshot of Microsoft Purview studio window, showing the newly created collection window." border="true":::
## Create a self-hosted integration runtime
-The Self-Hosted Integration Runtime (SHIR) is the compute infrastructure used by Azure Purview to connect to on-premises data sources. The SHIR is downloaded and installed on a machine within the same network as the on-premises data source.
+The Self-Hosted Integration Runtime (SHIR) is the compute infrastructure used by Microsoft Purview to connect to on-premises data sources. The SHIR is downloaded and installed on a machine within the same network as the on-premises data source.
-This tutorial assumes the machine where you'll install your self-hosted integration runtime can make network connections to the internet. This connection allows the SHIR to communicate between your source and Azure Purview. If your machine has a restricted firewall, or if you would like to secure your firewall, look into the [network requirements for the self-hosted integration runtime](manage-integration-runtimes.md#networking-requirements).
+This tutorial assumes the machine where you'll install your self-hosted integration runtime can make network connections to the internet. This connection allows the SHIR to communicate between your source and Microsoft Purview. If your machine has a restricted firewall, or if you would like to secure your firewall, look into the [network requirements for the self-hosted integration runtime](manage-integration-runtimes.md#networking-requirements).
-1. On the home page of Azure Purview Studio, select **Data Map** from the left navigation pane.
+1. On the home page of Microsoft Purview Studio, select **Data Map** from the left navigation pane.
1. Under **Source management** on the left pane, select **Integration runtimes**, and then select **+ New**.
There is only one way to set up authentication for SQL server on-premises:
### SQL authentication
-The SQL account must have access to the **master** database. This is because the `sys.databases` is in the database. The Azure Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
+The SQL account must have access to the **master** database, because the `sys.databases` view resides there. The Microsoft Purview scanner needs to enumerate `sys.databases` in order to find all the SQL databases on the server.
#### Create a new login and user
If you would like to create a new login and user to be able to scan your SQL ser
:::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/create-credential-secret.png" alt-text="Add values to key vault credential."::: 1. Select **Create** to complete.
-1. In the [Azure Purview Studio](#sign-in-to-azure-purview-studio), navigate to the **Management** page in the left menu.
+1. In the [Microsoft Purview Studio](#sign-in-to-microsoft-purview-studio), navigate to the **Management** page in the left menu.
:::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-management.png" alt-text="Select Management page on left menu.":::
If you would like to create a new login and user to be able to scan your SQL ser
1. Provide the required information, then select **Create**.
-1. Confirm that your Key Vault has been successfully associated with your Azure Purview account as shown in this example:
+1. Confirm that your Key Vault has been successfully associated with your Microsoft Purview account as shown in this example:
:::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/view-kv-connections.png" alt-text="View Azure Key Vault connections to confirm.":::
If you would like to create a new login and user to be able to scan your SQL ser
## Register SQL Server
-1. Navigate to your Azure Purview account in the [Azure portal](https://portal.azure.com), and select the [Azure Purview Studio](#sign-in-to-azure-purview-studio).
+1. Navigate to your Microsoft Purview account in the [Azure portal](https://portal.azure.com), and select the [Microsoft Purview Studio](#sign-in-to-microsoft-purview-studio).
1. Under Sources and scanning in the left navigation, select **Integration runtimes**. Make sure a self-hosted integration runtime is set up. If it's not set up, follow the steps mentioned [here](manage-integration-runtimes.md) to create a self-hosted integration runtime for scanning on an on-premises or Azure VM that has access to your on-premises network.
If you would like to create a new login and user to be able to scan your SQL ser
To create and run a new scan, do the following:
-1. Select the **Data Map** tab on the left pane in the Azure Purview Studio.
+1. Select the **Data Map** tab on the left pane in the Microsoft Purview Studio.
1. Select the SQL Server source that you registered.
To create and run a new scan, do the following:
## Clean up resources
-If you're not going to continue to use this Azure Purview or SQL source moving forward, you can follow the steps below to delete the integration runtime, SQL credential, and purview resources.
+If you're not going to continue to use this Microsoft Purview account or SQL source moving forward, you can follow the steps below to delete the integration runtime, SQL credential, and Purview resources.
-### Remove SHIR from Azure Purview
+### Remove SHIR from Microsoft Purview
-1. On the home page of [Azure Purview Studio](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
+1. On the home page of [Microsoft Purview Studio](https://web.purview.azure.com/resource/), select **Data Map** from the left navigation pane.
1. Under **Source management** on the left pane, select **Integration runtimes**.
If you're not going to continue to use this Azure Purview or SQL source moving f
### Remove SQL credentials
-1. Go to the [Azure portal](https://portal.azure.com) and navigate to the Key Vault resource where you stored your Azure Purview credentials.
+1. Go to the [Azure portal](https://portal.azure.com) and navigate to the Key Vault resource where you stored your Microsoft Purview credentials.
1. Under **Settings** in the left menu, select **Secrets**
If you're not going to continue to use this Azure Purview or SQL source moving f
1. Select **Yes** to permanently delete the resource.
-### Delete Azure Purview account
+### Delete Microsoft Purview account
-If you would like to delete your Azure Purview account after completing this tutorial, follow these steps.
+If you would like to delete your Microsoft Purview account after completing this tutorial, follow these steps.
1. Go to the [Azure portal](https://portal.azure.com) and navigate to your purview account. 1. At the top of the page, select the **Delete** button.
- :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-delete.png" alt-text="Delete button on the Azure Purview account page in the Azure portal is selected.":::
+ :::image type="content" source="media/tutorial-register-scan-on-premises-sql-server/select-delete.png" alt-text="Delete button on the Microsoft Purview account page in the Azure portal is selected.":::
1. When the process is complete, you'll receive a notification in the Azure portal. ## Next steps > [!div class="nextstepaction"]
-> [Use Azure Purview REST APIs](tutorial-using-rest-apis.md)
+> [Use Microsoft Purview REST APIs](tutorial-using-rest-apis.md)
purview Tutorial Using Rest Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-using-rest-apis.md
Title: "How to use REST APIs for Azure Purview Data Planes"
-description: This tutorial describes how to use the Azure Purview REST APIs to access the contents of your Azure Purview.
+ Title: "How to use REST APIs for Microsoft Purview Data Planes"
+description: This tutorial describes how to use the Microsoft Purview REST APIs to access the contents of your Microsoft Purview.
Last updated 09/17/2021
-# Customer intent: I can call the Data plane REST APIs to perform CRUD operations on Azure Purview account.
+# Customer intent: I can call the Data plane REST APIs to perform CRUD operations on Microsoft Purview account.
# Tutorial: Use the REST APIs
-In this tutorial, you learn how to use the Azure Purview REST APIs. Anyone who wants to submit data to an Azure Purview, include Azure Purview as part of an automated process, or build their own user experience on the Azure Purview can use the REST APIs to do so.
+In this tutorial, you learn how to use the Microsoft Purview REST APIs. Anyone who wants to submit data to Microsoft Purview, include Microsoft Purview as part of an automated process, or build their own user experience on Microsoft Purview can use the REST APIs to do so.
If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin. ## Prerequisites
-* To get started, you must have an existing Azure Purview account. If you don't have a catalog, see the [quickstart for creating an Azure Purview account](create-catalog-portal.md).
+* To get started, you must have an existing Microsoft Purview account. If you don't have a catalog, see the [quickstart for creating a Microsoft Purview account](create-catalog-portal.md).
## Create a service principal (application)
its password. Here's how:
Once the service principal is created, you need to assign data plane roles of your Purview account to it. Follow the steps below to assign roles and establish trust between the service principal and the Purview account.
-1. Navigate to your [Azure Purview Studio](https://web.purview.azure.com/resource/).
+1. Navigate to your [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
1. Select the Data Map in the left menu. 1. Select Collections.
-1. Select the root collection in the collections menu. This will be the top collection in the list, and will have the same name as your Azure Purview account.
+1. Select the root collection in the collections menu. This will be the top collection in the list, and will have the same name as your Microsoft Purview account.
>[!NOTE] >You can also assign your service principal permission to any sub-collections, instead of the root collection. However, all APIs will be scoped to that collection (and sub-collections that inherit permissions), and users trying to call the API for another collection will get errors.
Once service principal is created, you need to assign Data plane roles of your p
1. Select the **Role** tab.
-1. Assign the following roles to the service principal created previously to access various data planes in Azure Purview. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
+1. Assign the following roles to the service principal created previously to access various data planes in Microsoft Purview. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
* Data Curator role to access Catalog Data plane. * Data Source Administrator role to access Scanning Data plane. * Collection Admin role to access Account Data Plane and Metadata policy Data Plane. > [!Note]
- > Only members of the Collection Admin role can assign data plane roles in Azure Purview. For more information about Azure Purview roles, see [Access Control in Azure Purview](./catalog-permissions.md).
+ > Only members of the Collection Admin role can assign data plane roles in Microsoft Purview. For more information about Microsoft Purview roles, see [Access Control in Microsoft Purview](./catalog-permissions.md).
## Get token You can send a POST request to the following URL to get access token.
https://login.microsoftonline.com/{your-tenant-id}/oauth2/token
The following parameters need to be passed to the above URL. -- **client_id**: client ID of the application registered in Azure Active directory and is assigned to a data plane role for the Azure Purview account.
+- **client_id**: client ID of the application registered in Azure Active Directory that is assigned a data plane role for the Microsoft Purview account.
- **client_secret**: client secret created for the above application. - **grant_type**: This should be 'client_credentials'. - **resource**: This should be 'https://purview.azure.net'
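The token request above can be sketched as follows. This is a minimal illustration using only the standard library to assemble the URL and form-encoded body; the helper name and the placeholder tenant, client, and secret values are made up for the example, and you would send the result with any HTTP client of your choice.

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Assemble the POST URL and form-encoded body for the
    client-credentials token request described above."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "resource": "https://purview.azure.net",
    })
    return url, body

# Placeholder values; substitute your own tenant and app registration.
url, body = build_token_request("my-tenant-id", "my-app-id", "my-secret")
```

POSTing `body` to `url` with a `Content-Type: application/x-www-form-urlencoded` header returns a JSON response whose `access_token` field carries the bearer token.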
Use the access token above to call the Data plane APIs.
> [!div class="nextstepaction"] > [Manage data sources](manage-data-sources.md)
-> [Azure Purview Data Plane REST APIs](/rest/api/purview/)
+> [Microsoft Purview Data Plane REST APIs](/rest/api/purview/)
purview Use Azure Purview Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/use-azure-purview-studio.md
Title: Use the Azure Purview Studio
-description: This article describes how to use Azure Purview Studio.
+ Title: Use the Microsoft Purview Studio
+description: This article describes how to use Microsoft Purview Studio.
Last updated 02/12/2022
-# Use Azure Purview Studio
+# Use Microsoft Purview Studio
-This article gives an overview of some of the main features of Azure Purview.
+This article gives an overview of some of the main features of Microsoft Purview.
## Prerequisites
-* An Active Azure Purview account is already created in Azure portal and the user has permissions to access [Azure Purview Studio](https://web.purview.azure.com/resource/).
+* An active Microsoft Purview account is already created in the Azure portal and the user has permissions to access [Microsoft Purview Studio](https://web.purview.azure.com/resource/).
-## Launch Azure Purview account
+## Launch Microsoft Purview account
-* To launch your Azure Purview account, go to Azure Purview accounts in Azure portal, select the account you want to launch and launch the account.
+* To launch your Microsoft Purview account, go to Microsoft Purview accounts in the Azure portal, select the account you want, and launch it.
- :::image type="content" source="./media/use-purview-studio/open-purview-studio.png" alt-text="Screenshot of Azure Purview window in Azure portal, with Azure Purview Studio button highlighted." border="true":::
+ :::image type="content" source="./media/use-purview-studio/open-purview-studio.png" alt-text="Screenshot of Microsoft Purview window in Azure portal, with Microsoft Purview Studio button highlighted." border="true":::
-* Another way to launch Azure Purview account is to go to `https://web.purview.azure.com`, select **Azure Active Directory** and an account name to launch the account.
+* Another way to launch your Microsoft Purview account is to go to `https://web.purview.azure.com`, select **Azure Active Directory** and an account name to launch the account.
## Home page
-**Home** is the starting page for the Azure Purview client.
+**Home** is the starting page for the Microsoft Purview client.
:::image type="content" source="./media/use-purview-studio/purview-homepage.png" alt-text="Screenshot of the homepage.":::
The following list summarizes the main features of **Home page**. Each number in
* For *data source admin* + *data reader*, the buttons are **Browse assets**, **View glossary**, and **Knowledge center**. > [!NOTE]
- > For more information about Azure Purview roles, see [Access control in Azure Purview](catalog-permissions.md).
+ > For more information about Microsoft Purview roles, see [Access control in Microsoft Purview](catalog-permissions.md).
5. The left navigation bar helps you locate the main pages of the application. 6. The **Recently accessed** tab shows a list of recently accessed data assets. For information about accessing assets, see [Search the Data Catalog](how-to-search-catalog.md) and [Browse by asset type](how-to-browse-catalog.md). **My items** tab is a list of data assets owned by the logged-on user.
-7. **Links** contains links to region status, documentation, pricing, overview, and Azure Purview status
+7. **Links** contains links to region status, documentation, pricing, overview, and Microsoft Purview status.
8. The top navigation bar contains information about release notes/updates, change purview account, notifications, help, and feedback sections. ## Knowledge center
-Knowledge center is where you can find all the videos and tutorials related to Azure Purview.
+Knowledge center is where you can find all the videos and tutorials related to Microsoft Purview.
## Localization
-Azure Purview is localized in 18 languages. To change the language used, go to the **Settings** from the top bar and select the desired language from the dropdown.
+Microsoft Purview is localized in 18 languages. To change the language used, go to the **Settings** from the top bar and select the desired language from the dropdown.
> [!NOTE] > Only generally available features are localized. Features still in preview are in English regardless of which language is selected.
Azure Purview is localized in 18 languages. To change the language used, go to t
## Guided tours
-Each UX in Azure Purview Studio will have guided tours to give overview of the page. To start the guided tour, select **help** on the top bar and select **guided tours**.
+Each UX in Microsoft Purview Studio has a guided tour that gives an overview of the page. To start the guided tour, select **help** on the top bar and select **guided tours**.
:::image type="content" source="./media/use-purview-studio/guided-tour.png" alt-text="Screenshot of the guided tour.":::
search Search Security Manage Encryption Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-security-manage-encryption-keys.md
This article walks you through the steps of setting up customer-managed key (CMK
+ CMK encryption occurs when an object is created. You can't encrypt objects that already exist.
-## CMK-qualified encryption
+## CMK encryption support
Objects that can be encrypted include indexes, synonym lists, indexers, data sources, and skillsets. Encrypted content is computationally expensive to decrypt, so only sensitive content is encrypted.
-Encryption is performed over the following objects:
+Encryption is performed over the following content:
+ All content within indexes and synonym lists, including descriptions.
search Search Security Rbac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-security-rbac.md
Built-in roles include generally available and preview roles.
| [Owner](../role-based-access-control/built-in-roles.md#owner) | (Generally available) Full access to the search resource, including the ability to assign Azure roles. Subscription administrators are members by default. | | [Contributor](../role-based-access-control/built-in-roles.md#contributor) | (Generally available) Same level of access as Owner, minus the ability to assign roles or change authorization options. | | [Reader](../role-based-access-control/built-in-roles.md#reader) | (Generally available) Limited access to partial service information. In the portal, the Reader role can access information in the service Overview page, in the Essentials section and under the Monitoring tab. All other tabs and pages are off limits. </br></br>This role has access to service information: resource group, service status, location, subscription name and ID, tags, URL, pricing tier, replicas, partitions, and search units. This role also has access to service metrics: search latency, percentage of throttled requests, average queries per second. </br></br>There is no access to API keys, role assignments, content (indexes or synonym maps), or content metrics (storage consumed, number of objects). |
-| [Search Service Contributor](../role-based-access-control/built-in-roles.md#search-service-contributor) | (Generally available) This role is equivalent to Contributor at the service-level. </br></br>(Preview) Provides full access to all actions on indexes, synonym maps, indexers, data sources, and skillsets through [`Microsoft.Search/searchServices/*`](../role-based-access-control/resource-provider-operations.md#microsoftsearch). This role is for search service administrators who need to fully manage the service. This role has been extended to include data plane operations. Data plane support is in preview. </br></br>Like Contributor, members of this role cannot make or manage role assignments or change authorization options. Your service must be enabled for the preview for data requests. |
-| [Search Index Data Contributor](../role-based-access-control/built-in-roles.md#search-index-data-contributor) | (Preview) Provides full access to content in all indexes on the search service. This role is for developers or index owners who need to import, refresh, or query the documents collection of an index. |
-| [Search Index Data Reader](../role-based-access-control/built-in-roles.md#search-index-data-reader) | (Preview) Provides read-only access to search indexes on the search service. This role is for apps and users who run queries. |
+| [Search Service Contributor](../role-based-access-control/built-in-roles.md#search-service-contributor) | (Generally available) This role is identical to the Contributor role for control plane operations. </br></br>(Preview) Provides full access to all data plane actions on indexes, synonym maps, indexers, data sources, and skillsets through [`Microsoft.Search/searchServices/*`](../role-based-access-control/resource-provider-operations.md#microsoftsearch). This role is for search service administrators who need to fully manage the service and its content. In preview, this role has been extended to include data plane operations. </br></br>Like Contributor, members of this role cannot make or manage role assignments or change authorization options. Your service must be enabled for the preview for data requests. |
+| [Search Index Data Contributor](../role-based-access-control/built-in-roles.md#search-index-data-contributor) | (Preview) Provides full data plane access to content in all indexes on the search service. This role is for developers or index owners who need to import, refresh, or query the documents collection of an index. |
+| [Search Index Data Reader](../role-based-access-control/built-in-roles.md#search-index-data-reader) | (Preview) Provides read-only data plane access to search indexes on the search service. This role is for apps and users who run queries. |
> [!NOTE] > Azure resources have the concept of [control plane and data plane](../azure-resource-manager/management/control-plane-and-data-plane.md) categories of operations. In Cognitive Search, "control plane" refers to any operation supported in the [Management REST API](/rest/api/searchmanagement/) or equivalent client libraries. The "data plane" refers to operations against the search service endpoint, such as indexing or queries, or any other operation specified in the [Search REST API](/rest/api/searchservice/) or equivalent client libraries. Most roles apply to just one plane. The exception is Search Service Contributor which supports actions across both.
security Feature Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/feature-availability.md
The following tables display the current Microsoft Sentinel feature availability
| - [Azure Active Directory](../../sentinel/connect-azure-active-directory.md) | GA | GA | | - [Azure ADIP](../../sentinel/data-connectors-reference.md#azure-active-directory-identity-protection) | GA | GA | | - [Azure DDoS Protection](../../sentinel/data-connectors-reference.md#azure-ddos-protection) | GA | GA |
-| - [Azure Purview](../../sentinel/data-connectors-reference.md#azure-purview) | Public Preview | Not Available |
+| - [Azure Purview](../../sentinel/data-connectors-reference.md#microsoft-purview) | Public Preview | Not Available |
| - [Microsoft Defender for Cloud](../../sentinel/connect-azure-security-center.md) | GA | GA | | - [Microsoft Defender for IoT](../../sentinel/data-connectors-reference.md#microsoft-defender-for-iot) | GA | GA | | - [Microsoft Insider Risk Management](/azure/sentinel/sentinel-solutions-catalog#domain-solutions) | Public Preview | Not Available |
security Pen Testing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/pen-testing.md
We donΓÇÖt perform penetration testing of your application for you, but we do un
As of June 15, 2017, Microsoft no longer requires pre-approval to conduct a penetration test against Azure resources. This process is only related to Microsoft Azure, and not applicable to any other Microsoft Cloud Service. >[!IMPORTANT]
->While notifying Microsoft of pen testing activities is no longer required customers must still comply with the [Microsoft Cloud Unified Penetration Testing Rules of Engagement](https://technet.microsoft.com/mt784683).
+>While notifying Microsoft of pen testing activities is no longer required, customers must still comply with the [Microsoft Cloud Unified Penetration Testing Rules of Engagement](https://www.microsoft.com/msrc/pentest-rules-of-engagement).
Standard tests you can perform include:
sentinel Ci Cd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/ci-cd.md
Each connection can support multiple types of custom content, including analytic
After the connection is created, a new workflow or pipeline is generated in your repository, and the content stored in your repository is deployed to your Microsoft Sentinel workspace.
-The deployment time may vary depending on the amount of content that you're deploying.
+The deployment time may vary depending on the volume of content that you're deploying.
### View the deployment status
After the deployment is complete:
- The connection details on the **Repositories** page are updated with the link to the connection's deployment logs. For example: :::image type="content" source="media/ci-cd/deployment-logs-link.png" alt-text="Screenshot of a GitHub repository connection's deployment logs.":::
+
+### Improve deployment performance with smart deployments
+
+Smart deployments is a back-end capability that improves deployment performance by tracking the modifications made to the content files of a connected repository and branch, using a CSV file within the `.sentinel` folder in your repository. By tracking the modifications made in each commit, your Microsoft Sentinel repository connection avoids redeploying any content that hasn't been modified since the last deployment to your Microsoft Sentinel workspace(s). This improves deployment performance and avoids unintentionally tampering with unchanged content in your workspace, such as resetting the dynamic schedules of your analytics rules by redeploying them.
+
+Smart deployments is enabled by default on newly created connections. If you prefer to have all of your source control content deployed every time a deployment is triggered, regardless of whether that content was modified, you can modify your workflow to disable smart deployments so that your connection deploys all content. See [Customize the deployment workflow](#customize-the-deployment-workflow) for more details.
+
+ > [!NOTE]
+ > This capability was launched in public preview on April 20th, 2022. Connections created prior to launch must be updated or recreated for smart deployments to be turned on.
+ >
### Customize the deployment workflow
The default content deployment deploys all of the relevant custom content from t
If the default configuration for your content deployment from GitHub or Azure DevOps does not meet all your requirements, you can modify the experience to fit your needs.
-For example, you may want to configure different deployment triggers, or deploy content only from a specific root folder.
+For example, you may want to turn off smart deployments, configure different deployment triggers, or deploy content only from a specific root folder.
Select one of the following tabs depending on your connection type:
Select one of the following tabs depending on your connection type:
... directory: '${{ github.workspace }}/SentinelContent' ```
+ - **To disable smart deployments**:
+ Navigate to the `jobs` section of your workflow. Switch the `smartDeployment` default value (typically on line 33) from `true` to `false`. Once this change is committed, smart deployments is turned off for this connection, and all future deployments redeploy all of the repository's relevant content files to the connected workspace(s).
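As a hedged sketch only, the committed change in the generated GitHub workflow might look like the fragment below. The `smartDeployment` input name comes from the instructions above; the action name and surrounding keys are hypothetical placeholders, not the exact generated file:

```yaml
# Illustrative fragment of the generated GitHub Actions workflow.
# Only the smartDeployment value reflects the documented change; the
# step and action names here are hypothetical placeholders.
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Deploy content to Microsoft Sentinel
        uses: Azure/sentinel-deploy-action@v1   # placeholder action name
        with:
          smartDeployment: false   # was: true — disables smart deployments
```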
For more information, see the [GitHub documentation](https://docs.github.com/en/actions/learn-github-actions/workflow-syntax-for-github-actions#onpushpull_requestpaths) on GitHub Actions and editing GitHub workflows.
For more information, see the [GitHub documentation](https://docs.github.com/en/
azureSubscription: `Sentinel_Deploy_ServiceConnection_0000000000000000` workingDirectory: `SentinelContent` ```
+
+ - **To disable smart deployments**:
+ Navigate to the `ScriptArguments` section of your pipeline. Switch the `smartDeployment` default value (typically on line 33) from `true` to `false`. Once this change is committed, smart deployments is turned off for this connection, and all future deployments redeploy all of the repository's relevant content files to the connected workspace(s).
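As a hedged sketch only, the corresponding change in the generated Azure DevOps pipeline might look like the fragment below. The `azureSubscription` and `workingDirectory` values come from the pipeline snippet above; the task name and argument syntax are hypothetical placeholders:

```yaml
# Illustrative fragment of the generated Azure DevOps pipeline.
# Only the smartDeployment value reflects the documented change; the
# task name and argument syntax are hypothetical placeholders.
steps:
  - task: AzurePowerShell@5                    # placeholder task name
    inputs:
      azureSubscription: 'Sentinel_Deploy_ServiceConnection_0000000000000000'
      workingDirectory: 'SentinelContent'
      ScriptArguments: '-smartDeployment $false'   # was: $true
```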
For more information, see the [Azure DevOps documentation](/azure/devops/pipelines/yaml-schema) on the Azure DevOps YAML schema.
sentinel Data Connectors Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/data-connectors-reference.md
For more information, see the Cognito Detect Syslog Guide, which can be download
| **Log Analytics table(s)** | [CommonSecurityLog](/azure/azure-monitor/reference/tables/commonsecuritylog) | | **DCR support** | [Workspace transformation DCR](../azure-monitor/logs/tutorial-ingestion-time-transformations.md) | | **Kusto function alias:** | AkamaiSIEMEvent |
-| **Kusto function URL:** | https://aka.ms/Sentinel-akamaisecurityevents-parser |
+| **Kusto function URL:** | https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/Akamai%20Security%20Events/Parsers/AkamaiSIEMEvent.txt |
| **Vendor documentation/<br>installation instructions** | [Configure Security Information and Event Management (SIEM) integration](https://developer.akamai.com/tools/integrations/siem)<br>[Set up a CEF connector](https://developer.akamai.com/tools/integrations/siem/siem-cef-connector). | | **Supported by** | [Akamai](https://www.akamai.com/us/en/support/) |
See [Microsoft Defender for Cloud](#microsoft-defender-for-cloud).
| **Supported by** | Microsoft |
-## Azure Purview
+## Microsoft Purview
| Connector attribute | Description | | | |
-| **Data ingestion method** | **Azure service-to-service integration: <br>[Diagnostic settings-based connections](connect-azure-windows-microsoft-services.md?tabs=AP#diagnostic-settings-based-connections)**<br><br>For more information, see [Tutorial: Integrate Microsoft Sentinel and Azure Purview](purview-solution.md). |
+| **Data ingestion method** | **Azure service-to-service integration: <br>[Diagnostic settings-based connections](connect-azure-windows-microsoft-services.md?tabs=AP#diagnostic-settings-based-connections)**<br><br>For more information, see [Tutorial: Integrate Microsoft Sentinel and Microsoft Purview](purview-solution.md). |
| **Log Analytics table(s)** | PurviewDataSensitivityLogs | | **DCR support** | Not currently supported | | **Supported by** | Microsoft |
sentinel Notebooks Hunt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/notebooks-hunt.md
Other resources:
- **Find more notebooks** in the [Microsoft Sentinel GitHub repository](https://github.com/Azure/Azure-Sentinel-Notebooks):
- - The [`Sample-Notebooks`](https://github.com/Azure/Azure-Sentinel-Notebooks/tree/master/Sample-Notebooks) directory includes sample notebooks that are saved with data that you can use to show intended output.
+ - The [`Example-Notebooks`](https://github.com/Azure/Azure-Sentinel-Notebooks/tree/master/tutorials-and-examples/example-notebooks) directory includes sample notebooks that are saved with data that you can use to show intended output.
- - The [`HowTos`](https://github.com/Azure/Azure-Sentinel-Notebooks/tree/master/HowTos) directory includes notebooks that describe concepts such as setting your default Python version, creating Microsoft Sentinel bookmarks from a notebook, and more.
+ - The [`HowTos`](https://github.com/Azure/Azure-Sentinel-Notebooks/tree/master/tutorials-and-examples/how-tos) directory includes notebooks that describe concepts such as setting your default Python version, creating Microsoft Sentinel bookmarks from a notebook, and more.
For more information, see:
For more information, see:
- [Webinar: Microsoft Sentinel notebooks fundamentals](https://www.youtube.com/watch?v=rewdNeX6H94) - [Proactively hunt for threats](hunting.md) - [Use bookmarks to save interesting information while hunting](bookmarks.md)-- [Jupyter, msticpy, and Microsoft Sentinel](https://msticpy.readthedocs.io/en/latest/getting_started/JupyterAndAzureSentinel.html)
+- [Jupyter, msticpy, and Microsoft Sentinel](https://msticpy.readthedocs.io/en/latest/getting_started/JupyterAndAzureSentinel.html)
sentinel Notebooks Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/notebooks-troubleshoot.md
If the *Runtime dependency of PyGObject is missing* error appears when you load
ModuleNotFoundError: No module named 'gi' ```
-1. Use the [aml-compute-setup.sh](https://github.com/Azure/Azure-Sentinel-Notebooks/blob/master/HowTos/aml-compute-setup.sh) script, located in the Microsoft Sentinel Notebooks GitHub repository, to automatically install the `pygobject` in all notebooks and Anaconda environments on the Compute instance.
+1. Use the [aml-compute-setup.sh](https://github.com/Azure/Azure-Sentinel-Notebooks/blob/master/tutorials-and-examples/how-tos/aml-compute-setup.sh) script, located in the Microsoft Sentinel Notebooks GitHub repository, to automatically install the `pygobject` in all notebooks and Anaconda environments on the Compute instance.
> [!TIP] > You can also fix this Warning by running the following code from a notebook:
If the *Runtime dependency of PyGObject is missing* error appears when you load
## Next steps
-We welcome feedback, suggestions, requests for features, contributed notebooks, bug reports or improvements and additions to existing notebooks. Go to the [Microsoft Sentinel GitHub repository](https://github.com/Azure/Azure-Sentinel) to create an issue or fork and upload a contribution.
+We welcome feedback, suggestions, feature requests, contributed notebooks, bug reports, and improvements and additions to existing notebooks. Go to the [Microsoft Sentinel GitHub repository](https://github.com/Azure/Azure-Sentinel) to create an issue, or fork the repository and upload a contribution.
sentinel Purview Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/purview-solution.md
Title: Integrate Microsoft Sentinel and Azure Purview | Microsoft Docs
-description: This tutorial describes how to use the Microsoft Sentinel data connector and solution for Azure Purview to enable data sensitivity insights, create rules to monitor when classifications have been detected, and get an overview about data found by Azure Purview, and where sensitive data resides in your organization.
+ Title: Integrate Microsoft Sentinel and Microsoft Purview | Microsoft Docs
+description: This tutorial describes how to use the Microsoft Sentinel data connector and solution for Microsoft Purview to enable data sensitivity insights, create rules to monitor when classifications have been detected, and get an overview of the data found by Microsoft Purview and where sensitive data resides in your organization.
Last updated 02/08/2022
-# Tutorial: Integrate Microsoft Sentinel and Azure Purview (Public Preview)
+# Tutorial: Integrate Microsoft Sentinel and Microsoft Purview (Public Preview)
> [!IMPORTANT] >
-> The *Azure Purview* solution is in **PREVIEW**. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+> The *Microsoft Purview* solution is in **PREVIEW**. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
>
-[Azure Purview](../purview/index.yml) provides organizations with visibility into where sensitive information is stored, helping prioritize at-risk data for protection.
+[Microsoft Purview](../purview/index.yml) provides organizations with visibility into where sensitive information is stored, helping prioritize at-risk data for protection.
-Integrate Azure Purview with Microsoft Sentinel to help narrow down the high volume of incidents and threats surfaced in Microsoft Sentinel, and understand the most critical areas to start.
+Integrate Microsoft Purview with Microsoft Sentinel to help narrow down the high volume of incidents and threats surfaced in Microsoft Sentinel, and understand the most critical areas to start.
-Start by ingesting your Azure Purview logs into Microsoft Sentinel through a data connector. Then use a Microsoft Sentinel workbook to view data such as assets scanned, classifications found, and labels applied by Azure Purview. Use analytics rules to create alerts for changes within data sensitivity.
+Start by ingesting your Microsoft Purview logs into Microsoft Sentinel through a data connector. Then use a Microsoft Sentinel workbook to view data such as assets scanned, classifications found, and labels applied by Microsoft Purview. Use analytics rules to create alerts for changes within data sensitivity.
-Customize the Azure Purview workbook and analytics rules to best suit the needs of your organization, and combine Azure Purview logs with data ingested from other sources to create enriched insights within Microsoft Sentinel.
+Customize the Microsoft Purview workbook and analytics rules to best suit the needs of your organization, and combine Microsoft Purview logs with data ingested from other sources to create enriched insights within Microsoft Sentinel.
In this tutorial, you: > [!div class="checklist"] >
-> * Install the Microsoft Sentinel solution for Azure Purview
-> * Enable your Azure Purview data connector
-> * Learn about the workbook and analytics rules deployed to your Microsoft Sentinel workspace with the Azure Purview solution
+> * Install the Microsoft Sentinel solution for Microsoft Purview
+> * Enable your Microsoft Purview data connector
+> * Learn about the workbook and analytics rules deployed to your Microsoft Sentinel workspace with the Microsoft Purview solution
## Prerequisites
-Before you start, make sure you have both a [Microsoft Sentinel workspace](quickstart-onboard.md) and [Azure Purview](../purview/create-catalog-portal.md) onboarded, and that your user has the following roles:
+Before you start, make sure you have both a [Microsoft Sentinel workspace](quickstart-onboard.md) and [Microsoft Purview](../purview/create-catalog-portal.md) onboarded, and that your user has the following roles:
-- **An Azure Purview account [Owner](../role-based-access-control/built-in-roles.md) or [Contributor](../role-based-access-control/built-in-roles.md) role**, to set up diagnostic settings and configure the data connector.
+- **A Microsoft Purview account [Owner](../role-based-access-control/built-in-roles.md) or [Contributor](../role-based-access-control/built-in-roles.md) role**, to set up diagnostic settings and configure the data connector.
- **A [Microsoft Sentinel Contributor](../role-based-access-control/built-in-roles.md#microsoft-sentinel-contributor) role**, with write permissions to enable data connector, view the workbook, and create analytic rules.
-## Install the Azure Purview solution
+## Install the Microsoft Purview solution
-The **Azure Purview** solution is a set of bundled content, including a data connector, workbook, and analytics rules configured specifically for Azure Purview data.
+The **Microsoft Purview** solution is a set of bundled content, including a data connector, workbook, and analytics rules configured specifically for Microsoft Purview data.
> [!TIP] > Microsoft Sentinel [solutions](sentinel-solutions.md) can help you onboard Microsoft Sentinel security content for a specific data connector using a single process. **To install the solution**
-1. In Microsoft Sentinel, under **Content management**, select **Content hub** and then locate the **Azure Purview** solution.
+1. In Microsoft Sentinel, under **Content management**, select **Content hub** and then locate the **Microsoft Purview** solution.
1. At the bottom right, select **View details**, and then **Create**. Select the subscription, resource group, and workspace where you want to install the solution, and then review the data connector and related security content that will be deployed.
The **Azure Purview** solution is a set of bundled content, including a data con
For more information, see [About Microsoft Sentinel content and solutions](sentinel-solutions.md) and [Centrally discover and deploy out-of-the-box content and solutions](sentinel-solutions-deploy.md).
-## Start ingesting Azure Purview data in Microsoft Sentinel
+## Start ingesting Microsoft Purview data in Microsoft Sentinel
-Configure diagnostic settings to have Azure Purview data sensitivity logs flow into Microsoft Sentinel, and then run an Azure Purview scan to start ingesting your data.
+Configure diagnostic settings to have Microsoft Purview data sensitivity logs flow into Microsoft Sentinel, and then run a Microsoft Purview scan to start ingesting your data.
Diagnostics settings send log events only after a full scan is run, or when a change is detected during an incremental scan. It typically takes about 10-15 minutes for the logs to start appearing in Microsoft Sentinel. > [!TIP]
+> Instructions for enabling your data connector are also available in Microsoft Sentinel, on the **Microsoft Purview** data connector page.
+> Instructions for enabling your data connector also available in Microsoft Sentinel, on the **Microsoft Purview** data connector page.
> **To enable data sensitivity logs to flow into Microsoft Sentinel**:
-1. Navigate to your Azure Purview account in the Azure portal and select **Diagnostic settings**.
+1. Navigate to your Microsoft Purview account in the Azure portal and select **Diagnostic settings**.
- :::image type="content" source="media/purview-solution/diagnostics-settings.png" alt-text="Screenshot of an Azure Purview account Diagnostics settings page.":::
+ :::image type="content" source="media/purview-solution/diagnostics-settings.png" alt-text="Screenshot of a Microsoft Purview account Diagnostics settings page.":::
-1. Select **+ Add diagnostic setting** and configure the new setting to send logs from Azure Purview to Microsoft Sentinel:
+1. Select **+ Add diagnostic setting** and configure the new setting to send logs from Microsoft Purview to Microsoft Sentinel:
- Enter a meaningful name for your setting. - Under **Logs**, select **DataSensitivityLogEvent**.
Diagnostics settings send log events only after a full scan is run, or when a ch
For more information, see [Connect Microsoft Sentinel to Azure, Windows, Microsoft, and Amazon services](connect-azure-windows-microsoft-services.md#diagnostic-settings-based-connections).
-**To run an Azure Purview scan and view data in Microsoft Sentinel**:
+**To run a Microsoft Purview scan and view data in Microsoft Sentinel**:
-1. In Azure Purview, run a full scan of your resources. For more information, see [Manage data sources in Azure Purview](../purview/manage-data-sources.md).
+1. In Microsoft Purview, run a full scan of your resources. For more information, see [Manage data sources in Microsoft Purview](../purview/manage-data-sources.md).
-1. After your Azure Purview scans have completed, go back to the Azure Purview data connector in Microsoft Sentinel and confirm that data has been received.
+1. After your Microsoft Purview scans have completed, go back to the Microsoft Purview data connector in Microsoft Sentinel and confirm that data has been received.
-## View recent data discovered by Azure Purview
+## View recent data discovered by Microsoft Purview
-The Azure Purview solution provides two analytics rule templates out-of-the-box that you can enable, including a generic rule and a customized rule.
+The Microsoft Purview solution provides two out-of-the-box analytics rule templates that you can enable: a generic rule and a customized rule.
-- The generic version, *Sensitive Data Discovered in the Last 24 Hours*, monitors for the detection of any classifications found across your data estate during an Azure Purview scan.
+- The generic version, *Sensitive Data Discovered in the Last 24 Hours*, monitors for the detection of any classifications found across your data estate during a Microsoft Purview scan.
- The customized version, *Sensitive Data Discovered in the Last 24 Hours - Customized*, monitors and generates alerts each time the specified classification, such as Social Security Number, has been detected.
-Use this procedure to customize the Azure Purview analytics rules' queries to detect assets with specific classification, sensitivity label, source region, and more. Combine the data generated with other data in Microsoft Sentinel to enrich your detections and alerts.
+Use this procedure to customize the Microsoft Purview analytics rules' queries to detect assets with specific classification, sensitivity label, source region, and more. Combine the data generated with other data in Microsoft Sentinel to enrich your detections and alerts.
> [!NOTE] > Microsoft Sentinel analytics rules are KQL queries that trigger alerts when suspicious activity has been detected. Customize and group your rules together to create incidents for your SOC team to investigate. >
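Analytics rule queries run against the `PurviewDataSensitivityLogs` table that the connector ingests. As a hedged illustration only (the table name appears in this article's connector reference, but the column names below are assumptions for the sketch), a customized rule query might look like:

```kusto
// Sketch of a customized analytics rule query. Column names such as
// Classification and SourceRegion are assumptions, not confirmed here.
PurviewDataSensitivityLogs
| where TimeGenerated > ago(24h)
| where Classification has "Social Security Number"
| project TimeGenerated, SourceRegion, SourcePath, Classification
```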
-### Modify the Azure Purview analytics rule templates
+### Modify the Microsoft Purview analytics rule templates
1. In Microsoft Sentinel, under **Configuration** select **Analytics** > **Active rules**, and search for a rule named **Sensitive Data Discovered in the Last 24 Hours - Customized**.
Use this procedure to customize the Azure Purview analytics rules' queries to de
For more information, see [Create custom analytics rules to detect threats](detect-threats-custom.md).
-### View Azure Purview data in Microsoft Sentinel workbooks
+### View Microsoft Purview data in Microsoft Sentinel workbooks
-In Microsoft Sentinel, under **Threat management**, select **Workbooks** > **My workbooks**, and locate the **Azure Purview** workbook deployed with the **Azure Purview** solution. Open the workbook and customize any parameters as needed.
+In Microsoft Sentinel, under **Threat management**, select **Workbooks** > **My workbooks**, and locate the **Microsoft Purview** workbook deployed with the **Microsoft Purview** solution. Open the workbook and customize any parameters as needed.
-The Azure Purview workbook displays the following tabs:
+The Microsoft Purview workbook displays the following tabs:
- **Overview**: Displays the regions and resource types where the data is located. - **Classifications**: Displays assets that contain specified classifications, like Credit Card Numbers. - **Sensitivity labels**: Displays the assets that have confidential labels, and the assets that currently have no labels.
-To drill down in the Azure Purview workbook:
+To drill down in the Microsoft Purview workbook:
- Select a specific data source to jump to that resource in Azure. - Select an asset path link to show more details, with all the data fields shared in the ingested logs. - Select a row in the **Data Source**, **Classification**, or **Sensitivity Label** tables to filter the Asset Level data as configured.
-### Investigate incidents triggered by Azure Purview events
+### Investigate incidents triggered by Microsoft Purview events
-When investigating incidents triggered by the Azure Purview analytics rules, find detailed information on the assets and classifications found in the incident's **Events**.
+When investigating incidents triggered by the Microsoft Purview analytics rules, find detailed information on the assets and classifications found in the incident's **Events**.
For example:
For more information, see:
- [Create custom analytics rules to detect threats](detect-threats-custom.md) - [Investigate incidents with Microsoft Sentinel](investigate-cases.md) - [About Microsoft Sentinel content and solutions](sentinel-solutions.md)-- [Centrally discover and deploy Microsoft Sentinel out-of-the-box content and solutions (Public preview)](sentinel-solutions-deploy.md)
+- [Centrally discover and deploy Microsoft Sentinel out-of-the-box content and solutions (Public preview)](sentinel-solutions-deploy.md)
sentinel Sentinel Solutions Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/sentinel-solutions-catalog.md
For more information, see [Centrally discover and deploy Microsoft Sentinel out-
|Name |Includes |Categories |Supported by | ||||| |**Azure Firewall Solution for Sentinel**| [Data connector](data-connectors-reference.md#azure-firewall), workbook, analytics rules, playbooks, hunting queries, custom Logic App connector |Security - Network Security, Networking | Community|
-| **Azure Purview** | [Data connector](data-connectors-reference.md#azure-purview), workbook, analytics rules <br><br>For more information, see [Tutorial: Integrate Microsoft Sentinel and Azure Purview](purview-solution.md). | Compliance, Security- Cloud Security, and Security- Information Protection | Microsoft |
+| **Microsoft Purview** | [Data connector](data-connectors-reference.md#microsoft-purview), workbook, analytics rules <br><br>For more information, see [Tutorial: Integrate Microsoft Sentinel and Microsoft Purview](purview-solution.md). | Compliance, Security - Cloud Security, Security - Information Protection | Microsoft |
|**Microsoft Sentinel for SQL PaaS** | [Data connector](data-connectors-reference.md#azure-sql-databases), workbook, analytics rules, playbooks, hunting queries | Application | Community | |**Microsoft Sentinel Training Lab** |Workbook, analytics rules, playbooks, hunting queries | Training and tutorials |Microsoft | |**Azure SQL** | [Data connector](data-connectors-reference.md#azure-sql-databases), workbook, analytics, playbooks, hunting queries | Application |Microsoft |
sentinel Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/whats-new.md
For more information, see:
- [New custom log ingestion and data transformation at ingestion time (Public preview)](#new-custom-log-ingestion-and-data-transformation-at-ingestion-time-public-preview) - [View MITRE support coverage (Public preview)](#view-mitre-support-coverage-public-preview)-- [View Azure Purview data in Microsoft Sentinel (Public preview)](#view-azure-purview-data-in-microsoft-sentinel-public-preview)
+- [View Microsoft Purview data in Microsoft Sentinel (Public preview)](#view-microsoft-purview-data-in-microsoft-sentinel-public-preview)
- [Manually run playbooks based on the incident trigger (Public preview)](#manually-run-playbooks-based-on-the-incident-trigger-public-preview) - [Search across long time spans in large datasets (public preview)](#search-across-long-time-spans-in-large-datasets-public-preview) - [Restore archived logs from search (public preview)](#restore-archived-logs-from-search-public-preview)
For example:
For more information, see [Understand security coverage by the MITRE ATT&CK® framework](mitre-coverage.md).
-### View Azure Purview data in Microsoft Sentinel (Public Preview)
+### View Microsoft Purview data in Microsoft Sentinel (Public Preview)
-Microsoft Sentinel now integrates directly with Azure Purview by providing an out-of-the-box solution.
+Microsoft Sentinel now integrates directly with Microsoft Purview by providing an out-of-the-box solution.
-The Azure Purview solution includes the Azure Purview data connector, related analytics rule templates, and a workbook that you can use to visualize sensitivity data detected by Azure Purview, together with other data ingested in Microsoft Sentinel.
+The Microsoft Purview solution includes the Microsoft Purview data connector, related analytics rule templates, and a workbook that you can use to visualize sensitivity data detected by Microsoft Purview, together with other data ingested in Microsoft Sentinel.
-For more information, see [Tutorial: Integrate Microsoft Sentinel and Azure Purview](purview-solution.md).
+For more information, see [Tutorial: Integrate Microsoft Sentinel and Microsoft Purview](purview-solution.md).
### Manually run playbooks based on the incident trigger (Public preview)
service-fabric How To Managed Cluster Stateless Node Type https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-fabric/how-to-managed-cluster-stateless-node-type.md
Title: Deploy a Service Fabric managed cluster with stateless node types description: Learn how to create and deploy stateless node types in Service Fabric managed clusters- Previously updated : 2/14/2022+ Last updated : 4/11/2022+++ # Deploy a Service Fabric managed cluster with stateless node types
-Service Fabric node types come with an inherent assumption that at some point of time, stateful services might be placed on the nodes. Stateless node types relax this assumption for a node type. Relaxing this assumption enables node stateless node types to benefit from faster scale-out operations by removing some of the restrictions on repair and maintenance operations.
+Service Fabric node types come with an inherent assumption that at some point in time, stateful services might be placed on the nodes. Stateless node types change this assumption for a node type, which allows the node type to benefit from features such as faster scale-out operations, support for automatic OS upgrades, Spot VMs, and scaling out to more than 100 nodes in a node type.
-* Primary node types cannot be configured to be stateless.
+* Primary node types can't be configured to be stateless.
* Stateless node types require an API version of **2021-05-01** or later.
-* This will automatically set the **multipleplacementgroup** property to **true** which you can [learn more here](how-to-managed-cluster-large-virtual-machine-scale-sets.md).
+This automatically sets the **multipleplacementgroup** property to **true**, which you can [learn more about here](how-to-managed-cluster-large-virtual-machine-scale-sets.md).
* This enables support for up to 1000 nodes for the given node type. * Stateless node types can utilize a VM SKU temporary disk.
-Sample templates are available: [Service Fabric Stateless Node types template](https://github.com/Azure-Samples/service-fabric-cluster-templates)
+## Enabling stateless node types in a Service Fabric managed cluster
+
+To set one or more node types as stateless in a node type resource, set the **isStateless** property to **true**. When deploying a Service Fabric cluster with stateless node types, the cluster must have at least one primary node type that isn't stateless.
-## Enable stateless node types in a Service Fabric managed cluster
-To set one or more node types as stateless in a node type resource, set the **isStateless** property to **true**. When deploying a Service Fabric cluster with stateless node types, it is required to have at least one primary node type, which is not stateless in the cluster.
+Sample templates are available: [Service Fabric Stateless Node types template](https://github.com/Azure-Samples/service-fabric-cluster-templates)
* The Service Fabric managed cluster resource apiVersion should be **2021-05-01** or later.
To set one or more node types as stateless in a node type resource, set the **is
} ```
-## Configure stateless node types with multiple Availability Zones
-To configure a Stateless node type spanning across multiple availability zones follow [Service Fabric clusters across availability zones](.\service-fabric-cross-availability-zones.md).
+## Enabling stateless node types using Spot VMs in a Service Fabric managed cluster (Preview)
+
+[Azure Spot Virtual Machines on scale sets](../virtual-machine-scale-sets/use-spot.md) let you take advantage of unused compute capacity at significant cost savings. When Azure needs the capacity back, the Azure infrastructure evicts Azure Spot Virtual Machine instances. Spot VM node types are therefore a good fit for workloads that can handle interruptions and don't need to complete within a specific time frame. Recommended workloads include development, testing, batch processing jobs, big data, and other large-scale stateless scenarios.
+
+To set one or more stateless node types to use Spot VMs, set both the **isStateless** and **IsSpotVM** properties to **true**. When deploying a Service Fabric cluster with stateless node types, the cluster must have at least one primary node type that isn't stateless. Stateless node types configured to use Spot VMs have the eviction policy set to **Delete**.
+
+Sample templates are available: [Service Fabric Stateless Node types template](https://github.com/Azure-Samples/service-fabric-cluster-templates)
+
+* The Service Fabric managed cluster resource apiVersion should be **2022-02-01-preview** or later.
+
+```json
+{
+ "apiVersion": "[variables('sfApiVersion')]",
+ "type": "Microsoft.ServiceFabric/managedclusters/nodetypes",
+ "name": "[concat(parameters('clusterName'), '/', parameters('nodeTypeName'))]",
+ "location": "[resourcegroup().location]",
+ "dependsOn": [
+ "[concat('Microsoft.ServiceFabric/managedclusters/', parameters('clusterName'))]"
+ ],
+ "properties": {
+ "isStateless": true,
+ "isPrimary": false,
+ "IsSpotVM": true,
+ "vmImagePublisher": "[parameters('vmImagePublisher')]",
+ "vmImageOffer": "[parameters('vmImageOffer')]",
+ "vmImageSku": "[parameters('vmImageSku')]",
+ "vmImageVersion": "[parameters('vmImageVersion')]",
+ "vmSize": "[parameters('nodeTypeSize')]",
+ "vmInstanceCount": "[parameters('nodeTypeVmInstanceCount')]",
+ "dataDiskSizeGB": "[parameters('nodeTypeDataDiskSizeGB')]"
+ }
+}
+```
+
+## Configure stateless node types for zone resiliency
To configure a stateless node type for zone resiliency, you must [configure managed cluster zone spanning](how-to-managed-cluster-availability-zones.md) at the cluster level.
>[!NOTE]
-> The zonal resiliency property must be set at the cluster level, and this property cannot be changed in place.
+> The zonal resiliency property must be set at the cluster level, and this property can't be changed in place.
## Temporary disk support Stateless node types can be configured to use temporary disk as the data disk instead of a Managed Disk. Using a temporary disk can reduce costs for stateless workloads. To configure a stateless node type to use the temporary disk set the **useTempDataDisk** property to **true**.
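As an illustrative sketch modeled on the Spot VM template shown in this article (the parameter names and apiVersion variable are carried over from that template and should be treated as assumptions), a stateless node type that uses the temporary disk might look like this:

```json
{
  "apiVersion": "[variables('sfApiVersion')]",
  "type": "Microsoft.ServiceFabric/managedclusters/nodetypes",
  "name": "[concat(parameters('clusterName'), '/', parameters('nodeTypeName'))]",
  "properties": {
    "isStateless": true,
    "isPrimary": false,
    "vmSize": "[parameters('nodeTypeSize')]",
    "vmInstanceCount": "[parameters('nodeTypeVmInstanceCount')]",
    "useTempDataDisk": true
  }
}
```

Note that no **dataDiskSizeGB** is specified here, because the temporary disk of the chosen VM SKU serves as the data disk.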
Stateless node types can be configured to use temporary disk as the data disk in
## Migrate to using stateless node types in a cluster
-For all migration scenarios, a new stateless node type needs to be added. Existing node type cannot be migrated to be stateless. You can add a new stateless node type to an existing Service Fabric managed cluster, and remove any original node types from the cluster.
+For all migration scenarios, a new stateless node type needs to be added. An existing node type can't be migrated to be stateless. You can add a new stateless node type to an existing Service Fabric managed cluster, and remove any original node types from the cluster.
## Next steps
service-fabric Service Fabric How To Specify Environment Variables https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-fabric/service-fabric-how-to-specify-environment-variables.md
In this example, you set an environment variable for a container. The article as
```xml <ServiceManifestImport>
- <ServiceManifestVersion="1.0.0" />
+ <ServiceManifestRef ServiceManifestName="Guest1Pkg" ServiceManifestVersion="1.0.0" />
<EnvironmentOverrides CodePackageRef="MyCode"> <EnvironmentVariable Name="MyEnvVariable" Value="OverrideValue"/> </EnvironmentOverrides>
sql-database Sql Database Import Purview Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sql-database/scripts/sql-database-import-purview-labels.md
Title: Classify your Azure SQL data using Azure Purview labels
-description: Import your classification from Azure Purview in your Azure SQL Database and Azure Synpase Analytics
+ Title: Classify your Azure SQL data using Microsoft Purview labels
+description: Import your classification from Microsoft Purview into your Azure SQL Database and Azure Synapse Analytics
Last updated 02/17/2021
-# Classify your Azure SQL data using Azure Purview labels
+# Classify your Azure SQL data using Microsoft Purview labels
[!INCLUDE[appliesto-sqldb-asa](../../azure-sql/includes/appliesto-sqldb-asa.md)]
-This document describes how to add Azure Purview labels in your Azure SQL Database and Azure Synapse Analytics (formerly SQL DW).
+This document describes how to add Microsoft Purview labels in your Azure SQL Database and Azure Synapse Analytics (formerly SQL DW).
## Create an application
This document describes how to add Azure Purview labels in your Azure SQL Databa
## Provide permissions to the application
-1. In your Azure portal, search for **Azure Purview accounts**.
-2. Select the Azure Purview account where your SQL databases and Synapse are classified.
+1. In your Azure portal, search for **Microsoft Purview accounts**.
+2. Select the Microsoft Purview account where your SQL databases and Synapse are classified.
3. Open **Access control (IAM)**, select **Add**. 4. Select **Add role assignment**.
-5. In the **Role** section, search for **Azure Purview Data Reader** and select it.
+5. In the **Role** section, search for **Microsoft Purview Data Reader** and select it.
6. In the **Select** section, search for the application you previously created, select it, and hit **Save**.
-## Extract the classification from Azure Purview
+## Extract the classification from Microsoft Purview
-1. Open your Azure Purview account, and in the Home page, search for your Azure SQL Database or Azure Synapse Analytics where you want to copy the labels.
+1. Open your Microsoft Purview account, and in the Home page, search for your Azure SQL Database or Azure Synapse Analytics where you want to copy the labels.
2. Copy the qualifiedName under **Properties**, and keep it for future use. 3. Open your PowerShell shell.
foreach ($referredEntity in $referredEntities.psobject.Properties.GetEnumerator(
$tableName = $Matches.tableName; $columnName = $Matches.columnName;
- Write-Output "ADD SENSITIVITY CLASSIFICATION TO ${schemaName}.${tableName}.${columnName} WITH (LABEL='Azure Purview Label', LABEL_ID='${labelId}');";
+ Write-Output "ADD SENSITIVITY CLASSIFICATION TO ${schemaName}.${tableName}.${columnName} WITH (LABEL='Microsoft Purview Label', LABEL_ID='${labelId}');";
} } ```
foreach ($referredEntity in $referredEntities.psobject.Properties.GetEnumerator(
$tableName = $Matches.tableName; $columnName = $Matches.columnName;
- Write-Output "ADD SENSITIVITY CLASSIFICATION TO ${schemaName}.${tableName}.${columnName} WITH (LABEL='Azure Purview Label', LABEL_ID='${labelId}');";
+ Write-Output "ADD SENSITIVITY CLASSIFICATION TO ${schemaName}.${tableName}.${columnName} WITH (LABEL='Microsoft Purview Label', LABEL_ID='${labelId}');";
} } ```
foreach ($referredEntity in $referredEntities.psobject.Properties.GetEnumerator(
For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-For more information on Azure Purview, see [Azure Purview documentation](../../purview/index.yml).
+For more information on Microsoft Purview, see [Microsoft Purview documentation](../../purview/index.yml).
storage Assign Azure Role Data Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/assign-azure-role-data-access.md
Previously updated : 02/14/2021 Last updated : 04/19/2022
Keep in mind the following points about Azure role assignments in Azure Storage:
- If you have set the appropriate allow permissions to access data via Azure AD but are unable to access the data (for example, you get an "AuthorizationPermissionMismatch" error), be sure to allow enough time for the permissions changes that you made in Azure AD to replicate, and be sure that you don't have any deny assignments that block your access. For more information, see [Understand Azure deny assignments](../../role-based-access-control/deny-assignments.md). > [!NOTE]
-> You also can make your own Azure custom roles to access blob data. For more information, see [Azure custom roles](../../role-based-access-control/custom-roles.md).
+> You can create custom Azure RBAC roles for granular access to blob data. For more information, see [Azure custom roles](../../role-based-access-control/custom-roles.md).
## Next steps
storage Lifecycle Management Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/lifecycle-management-overview.md
description: Use Azure Storage lifecycle management policies to create automated
Previously updated : 02/24/2022 Last updated : 04/18/2022
For more information about pricing, see [Block Blob pricing](https://azure.micro
## FAQ
-**I created a new policy, why do the actions not run immediately?**
+### I created a new policy. Why do the actions not run immediately?
The platform runs the lifecycle policy once a day. Once you configure a policy, it can take up to 24 hours for some actions to run for the first time.
-**If I update an existing policy, how long does it take for the actions to run?**
+### If I update an existing policy, how long does it take for the actions to run?
The updated policy takes up to 24 hours to go into effect. Once the policy is in effect, it can take up to 24 hours for the actions to run, so the policy actions may take up to 48 hours to complete. If you disable or delete a rule that uses enableAutoTierToHotFromCool, auto-tiering to the hot tier will still happen. For example, suppose a rule that includes enableAutoTierToHotFromCool is based on last access time. If the rule is disabled or deleted and a blob that is currently in the cool tier is then accessed, the blob moves back to the hot tier, because tiering on access is applied outside of lifecycle management. The blob won't then move from hot back to cool, because the lifecycle management rule is disabled or deleted. The only way to prevent auto-tiering to the hot tier is to turn off last access time tracking.
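For reference, a minimal sketch of a lifecycle rule that uses **enableAutoTierToHotFromCool** might look like the following (the rule name and the 30-day threshold are illustrative values):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "tier-on-last-access",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "enableAutoTierToHotFromCool": true,
            "tierToCool": {
              "daysAfterLastAccessTimeGreaterThan": 30
            }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ]
        }
      }
    }
  ]
}
```

Disabling or deleting a rule like this stops the `tierToCool` action, but not the on-access move back to hot, which is driven by last access time tracking.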
-**I manually rehydrated an archived blob, how do I prevent it from being moved back to the Archive tier temporarily?**
+### I manually rehydrated an archived blob. How do I prevent it from being moved back to the Archive tier temporarily?
+When a blob is moved from one access tier to another, its last modification time doesn't change. If you manually rehydrate an archived blob to the hot tier, it will be moved back to the archive tier by the lifecycle management engine. Temporarily disable the rule that affects this blob to prevent it from being archived again, and re-enable the rule when the blob can safely be moved back to the archive tier. You can also copy the blob to another location if it needs to stay in the hot or cool tier permanently.
-**The blob prefix match string did not apply your actions to the blobs that you expected it to**
+### The blob prefix match string did not apply the policy to the expected blobs
-The blob prefix match field of a policy is a full or partial blob path, which is used to match the blobs you want the policy actions to apply to. The path must start with the blob container name. If no prefix match is specified, then the policy will apply to all the blobs in the storage account. The prefix match string format is [container name]/[blob name], where the container name or blob name can be a full or partial container name.
-Here are some common misconceptions about the prefix match string:
-- A prefix match string of container1/ applies to all blobs in the blob container named container1. A prefix match string of container1 (note that there is no trailing / character in the prefix string) applies to all blobs in all containers where the blob container name starts with the string container1. This includes blob containers named container11, container1234, container1ab, and so on.-- A prefix match string of container1/sub1/ would apply to all blobs in the container with the name container1, whose blob names that start with the string sub1/ like container1/sub1/test.txt or container1/sub1/sub2/test.txt.-- Wildcard character * - This doesn't mean 'matches one or more occurrences of any character'. The asterisk character * is a valid character in a blob name in Azure Storage. If added in a rule, it means match the blobs with the asterisk in the blob name.-- Wildcard character ? - This doesn't mean 'match a single occurrence of any character'. The question mark character ? is a valid character in a blob name in Azure Storage. If added in a rule, it means match the blobs with a question mark in the blob name.-- prefixMatch with != - The prefixMatch rules only consider positive (=) logical comparisons. Therefore, negative (!=) logical comparisons are ignored.
+The blob prefix match field of a policy is a full or partial blob path, which is used to match the blobs you want the policy actions to apply to. The path must start with the container name. If no prefix match is specified, then the policy will apply to all the blobs in the storage account. The format of the prefix match string is `[container name]/[blob name]`.
+Keep in mind the following points about the prefix match string:
+
+- A prefix match string like *container1/* applies to all blobs in the container named *container1*. A prefix match string of *container1*, without the trailing forward slash character (/), applies to all blobs in all containers where the container name begins with the string *container1*. The prefix will match containers named *container11*, *container1234*, *container1ab*, and so on.
+- A prefix match string of *container1/sub1/* applies to all blobs in the container named *container1* that begin with the string *sub1/*. For example, the prefix will match blobs named *container1/sub1/test.txt* or *container1/sub1/sub2/test.txt*.
+- The asterisk character `*` is a valid character in a blob name. If the asterisk character is used in a prefix, then the prefix will match blobs with an asterisk in their names. The asterisk does not function as a wildcard character.
+- The question mark character `?` is a valid character in a blob name. If the question mark character is used in a prefix, then the prefix will match blobs with a question mark in their names. The question mark does not function as a wildcard character.
+- The prefix match considers only positive (=) logical comparisons. Negative (!=) logical comparisons are ignored.
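The prefix comparison described above is a plain string prefix test against the `[container name]/[blob name]` path, with no wildcard interpretation. As a minimal sketch (the helper function name is hypothetical, not part of any Azure SDK), the semantics can be expressed as:

```python
def prefix_matches(prefix: str, blob_path: str) -> bool:
    """Return True if a lifecycle policy prefixMatch string applies to a blob.

    blob_path has the form "container-name/blob-name". The comparison is a
    literal string prefix test: * and ? are ordinary characters, and there
    is no negation.
    """
    return blob_path.startswith(prefix)

# "container1/" matches only blobs inside the container named container1.
assert prefix_matches("container1/", "container1/sub1/test.txt")
assert not prefix_matches("container1/", "container11/test.txt")

# "container1" (no trailing slash) also matches container11, container1234, ...
assert prefix_matches("container1", "container11/test.txt")

# "*" is a literal character, not a wildcard.
assert not prefix_matches("container1/*.txt", "container1/test.txt")
assert prefix_matches("container1/*.txt", "container1/*.txt")
```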
+
+### Is there a way to identify the time at which the policy will be executing?
+
+There is no way to determine the time at which the policy will run, because it executes as a background scheduled process. However, the platform runs the policy once per day.
## Next steps
storage Object Replication Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/object-replication-overview.md
Previously updated : 09/02/2021 Last updated : 04/19/2022
This table shows how this feature is supported in your account and the impact on
<sup>1</sup> Data Lake Storage Gen2, Network File System (NFS) 3.0 protocol, and SSH File Transfer Protocol (SFTP) support all require a storage account with a hierarchical namespace enabled.
-<sup>2</sup> Feature is supported at the preview level.
+<sup>2</sup> Feature is supported in preview.
## Billing
storage Scalability Targets Premium Block Blobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/scalability-targets-premium-block-blobs.md
# Scalability targets for premium block blob storage accounts A premium-performance block blob storage account is optimized for applications that use smaller, kilobyte-range objects. It's ideal for applications that require high transaction rates or consistent low-latency storage. Premium performance block blob storage is designed to scale with your applications. If your scenario requires that you deploy applications that require hundreds of thousands of requests per second or petabytes of storage capacity, contact Microsoft by submitting a support request in the [Azure portal](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade).+
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
storage Scalability Targets Premium Page Blobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/scalability-targets-premium-page-blobs.md
[!INCLUDE [storage-scalability-intro-include](../../../includes/storage-scalability-intro-include.md)]
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
+ ## Scale targets for premium page blob accounts A premium-performance page blob storage account is optimized for read/write operations. This type of storage account backs an unmanaged disk for an Azure virtual machine.
storage Scalability Targets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/scalability-targets.md
[!INCLUDE [storage-scalability-intro-include](../../../includes/storage-scalability-intro-include.md)]
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
+ ## Scale targets for Blob storage [!INCLUDE [storage-blob-scale-targets](../../../includes/storage-blob-scale-targets.md)]
storage Storage Performance Checklist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-performance-checklist.md
Previously updated : 10/10/2019 Last updated : 04/19/2022
While parallelism can be great for performance, be careful about using unbounded
For best performance, always use the latest client libraries and tools provided by Microsoft. Azure Storage client libraries are available for a variety of languages. Azure Storage also supports PowerShell and Azure CLI. Microsoft actively develops these client libraries and tools with performance in mind, keeps them up-to-date with the latest service versions, and ensures that they handle many of the proven performance practices internally. > [!TIP]
-> The [ABFS driver](data-lake-storage-abfs-driver.md) was designed to overcome the inherent deficiencies of WASB. Favor using the ABFS driver over the WASB driver, as the ABFS driver is optimized specifically for big data analytics.
+> The [ABFS driver](data-lake-storage-abfs-driver.md) was designed to overcome the inherent deficiencies of WASB. Microsoft recommends using the ABFS driver over the WASB driver, as the ABFS driver is optimized specifically for big data analytics.
## Handle service errors
storage Sas Expiration Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/sas-expiration-policy.md
Previously updated : 02/17/2022 Last updated : 04/18/2022
When a SAS expiration policy is in effect for the storage account, the signed st
When you create a SAS expiration policy on a storage account, the policy applies to each type of SAS that is signed with the account key. The types of shared access signatures that are signed with the account key are the service SAS and the account SAS.
-To configure a SAS expiration policy for a storage account, use the Azure portal, PowerShell, or Azure CLI.
+> [!NOTE]
+> Before you can create a SAS expiration policy, you may need to rotate each of your account access keys at least once.
### [Azure portal](#tab/azure-portal)
To create a SAS expiration policy in the Azure portal, follow these steps:
1. Navigate to your storage account in the Azure portal. 1. Under **Settings**, select **Configuration**.
-1. Locate the setting for **Allow recommended upper limit for shared access signature (SAS) expiry interval**, and set it to **Enabled**. You must rotate both access keys at least once before you can set a recommended upper limit for SAS expiry interval else the option will come as disabled.
+1. Locate the setting for **Allow recommended upper limit for shared access signature (SAS) expiry interval**, and set it to **Enabled**. If the setting appears disabled, then you need to rotate both account access keys before you can set a recommended upper limit for SAS expiry interval.
1. Specify the recommended interval for any new shared access signatures that are created on resources in this storage account. :::image type="content" source="media/sas-expiration-policy/configure-sas-expiration-policy-portal.png" alt-text="Screenshot showing how to configure a SAS expiration policy in the Azure portal":::
storage Scalability Targets Resource Provider https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/scalability-targets-resource-provider.md
[!INCLUDE [storage-scalability-intro-include](../../../includes/storage-scalability-intro-include.md)]
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
+ ## Scale targets for the resource provider [!INCLUDE [azure-storage-limits-azure-resource-manager](../../../includes/azure-storage-limits-azure-resource-manager.md)]
storage Scalability Targets Standard Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/scalability-targets-standard-account.md
[!INCLUDE [storage-scalability-intro-include](../../../includes/storage-scalability-intro-include.md)]
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
+ ## Scale targets for standard storage accounts [!INCLUDE [azure-storage-account-limits-standard](../../../includes/azure-storage-account-limits-standard.md)]
storage Storage Account Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-account-overview.md
Previously updated : 01/24/2022 Last updated : 04/05/2022
The following table describes the types of storage accounts recommended by Micro
Legacy storage accounts are also supported. For more information, see [Legacy storage account types](#legacy-storage-account-types).
-You canΓÇÖt change a storage account to a different type after it's created. To move your data to a storage account of a different type, you must create a new account and copy the data to the new account.
+The service-level agreement (SLA) for Azure Storage accounts is available at [SLA for Storage Accounts](/support/legal/sla/storage/v1_5/).
+
+> [!NOTE]
+> You can't change a storage account to a different type after it's created. To move your data to a storage account of a different type, you must create a new account and copy the data to the new account.
## Storage account endpoints
storage Storage Network Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-network-security.md
You can use the same technique for an account that has the hierarchical namespac
| Azure Machine Learning Service | Microsoft.MachineLearningServices | Authorized Azure Machine Learning workspaces write experiment output, models, and logs to Blob storage and read the data. [Learn more](../../machine-learning/how-to-network-security-overview.md#secure-the-workspace-and-associated-resources). | | Azure Media Services | Microsoft.Media/mediaservices | Allows access to storage accounts through Media Services. | | Azure Migrate | Microsoft.Migrate/migrateprojects | Allows access to storage accounts through Azure Migrate. |
-| Azure Purview | Microsoft.Purview/accounts | Allows Azure Purview to access storage accounts. |
+| Microsoft Purview | Microsoft.Purview/accounts | Allows Microsoft Purview to access storage accounts. |
| Azure Remote Rendering | Microsoft.MixedReality/remoteRenderingAccounts | Allows access to storage accounts through Remote Rendering. | | Azure Site Recovery | Microsoft.RecoveryServices/vaults | Allows access to storage accounts through Site Recovery. | | Azure SQL Database | Microsoft.Sql | Allows [writing](../../azure-sql/database/audit-write-storage-account-behind-vnet-firewall.md) audit data to storage accounts behind firewall. |
storage Storage Redundancy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-redundancy.md
Previously updated : 11/30/2021 Last updated : 04/19/2022
The following table shows which types of storage accounts support ZRS in which regions.
| Storage account type | Supported regions | Supported services | |--|--|--| | General-purpose v2<sup>1</sup> | (Africa) South Africa North<br /> (Asia Pacific) Southeast Asia<br /> (Asia Pacific) Australia East<br /> (Asia Pacific) Japan East<br /> (Asia Pacific) Central India<br /> (Canada) Canada Central<br /> (Europe) North Europe<br /> (Europe) West Europe<br /> (Europe) France Central<br /> (Europe) Germany West Central<br /> (Europe) UK South<br /> (South America) Brazil South<br /> (US) Central US<br /> (US) East US<br /> (US) East US 2<br /> (US) South Central US<br /> (US) West US 2 | Block blobs<br /> Page blobs<sup>2</sup><br /> File shares (standard)<br /> Tables<br /> Queues<br /> |
-| Premium block blobs<sup>1</sup> | Asia Southeast<br /> Australia East<br /> Brazil South<br /> Europe North<br /> Europe West<br /> France Central <br /> Japan East<br /> UK South <br /> US East <br /> US East 2 <br /> US West 2| Premium block blobs only |
+| Premium block blobs<sup>1</sup> | (Asia Pacific) Southeast Asia<br /> (Asia Pacific) Australia East<br /> Brazil South<br /> Europe North<br /> Europe West<br /> France Central <br /> Japan East<br /> UK South <br /> US East <br /> US East 2 <br /> US West 2| Premium block blobs only |
| Premium file shares | Asia Southeast<br /> Australia East<br /> Brazil South<br /> Europe North<br /> Europe West<br /> France Central <br /> Japan East<br /> UK South <br /> US East <br /> US East 2 <br /> US West 2 | Premium file shares only | <sup>1</sup> The archive tier is not currently supported for ZRS accounts.<br />
storage Files Nfs Protocol https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/files-nfs-protocol.md
description: Learn about file shares hosted in Azure Files using the Network File System (NFS) protocol.
Previously updated : 11/16/2021 Last updated : 04/19/2022 # NFS file shares in Azure Files
-Azure Files offers two industry-standard protocols for mounting Azure file share: the [Server Message Block (SMB)](/windows/win32/fileio/microsoft-smb-protocol-and-cifs-protocol-overview) protocol and the [Network File System (NFS)](https://en.wikipedia.org/wiki/Network_File_System) protocol. Azure Files enables you to pick the file system protocol that is the best fit for your workload. Azure file shares don't support accessing an individual Azure file share with both the SMB and NFS protocols, although you can create SMB and NFS file shares within the same storage account. For all file shares, Azure Files offers enterprise-grade file shares that can scale up to meet your storage needs and can be accessed concurrently by thousands of clients.
+Azure Files offers two industry-standard file system protocols for mounting Azure file shares: the [Server Message Block (SMB)](/windows/win32/fileio/microsoft-smb-protocol-and-cifs-protocol-overview) protocol and the [Network File System (NFS)](https://en.wikipedia.org/wiki/Network_File_System) protocol, allowing you to pick the protocol that is the best fit for your workload. Azure file shares don't support accessing an individual Azure file share with both the SMB and NFS protocols, although you can create SMB and NFS file shares within the same storage account. Azure Files offers enterprise-grade file shares that can scale up to meet your storage needs and can be accessed concurrently by thousands of clients.
This article covers NFS Azure file shares. For information about SMB Azure file shares, see [SMB file shares in Azure Files](files-smb-protocol.md).
This article covers NFS Azure file shares. For information about SMB Azure file shares, see [SMB file shares in Azure Files](files-smb-protocol.md).
NFS file shares are often used in the following scenarios: - Backing storage for Linux/UNIX-based applications, such as line-of-business applications written using Linux or POSIX file system APIs (even if they don't require POSIX-compliance).-- Workloads that require POSIX-compliant file shares, case sensitivity, or Unix style permissions(UID/GID).
+- Workloads that require POSIX-compliant file shares, case sensitivity, or Unix style permissions (UID/GID).
- New application and service development, particularly if that application or service has a requirement for random IO and hierarchical storage. ## Features
NFS file shares are often used in the following scenarios:
## Security and networking All data stored in Azure Files is encrypted at rest using Azure storage service encryption (SSE). Storage service encryption works similarly to BitLocker on Windows: data is encrypted beneath the file system level. Because data is encrypted beneath the Azure file share's file system, as it's encoded to disk, you don't have to have access to the underlying key on the client to read or write to the Azure file share. Encryption at rest applies to both the SMB and NFS protocols.
-For encryption in transit, Azure provides a layer of encryption for all data in transit between Azure datacenters using [MACSec](https://en.wikipedia.org/wiki/IEEE_802.1AE). Through this, encryption exists when data is transferred between Azure datacenters. Unlike Azure Files using the SMB protocol, file shares using the NFS protocol do not offer user-based authentication. Authentication for NFS shares is based on the configured network security rules. Due to this, to ensure only secure connections are established to your NFS share, you must use either service endpoints or private endpoints. If you want to access shares from on-premises then, in addition to a private endpoint, you must setup a VPN or ExpressRoute. Requests that do not originate from the following sources will be rejected:
+For encryption in transit, Azure provides a layer of encryption for all data in transit between Azure datacenters using [MACsec](https://en.wikipedia.org/wiki/IEEE_802.1AE). Unlike Azure Files using the SMB protocol, file shares using the NFS protocol do not offer user-based authentication. Authentication for NFS shares is based on the configured network security rules. Due to this, to ensure only secure connections are established to your NFS share, you must use either service endpoints or private endpoints. If you want to access shares from on-premises, then you must set up a VPN or ExpressRoute in addition to a private endpoint. Requests that do not originate from the following sources will be rejected:
- [A private endpoint](storage-files-networking-overview.md#private-endpoints) - [Azure VPN Gateway](../../vpn-gateway/vpn-gateway-about-vpngateways.md)
For more details on the available networking options, see [Azure Files networking considerations](storage-files-networking-overview.md).
The following table shows the current level of support for Azure Storage features in accounts that have the NFS 4.1 feature enabled.
-The status of items that appear in this tables may change over time as support continues to expand.
+The status of items that appear in this table may change over time as support continues to expand.
| Storage feature | Supported for NFS shares | |--||
The status of items that appear in this table may change over time as support continues to expand.
[!INCLUDE [files-nfs-regional-availability](../../../includes/files-nfs-regional-availability.md)] ## Performance
-NFS Azure file shares are only offered on premium file shares, which stores data on solid-state drives (SSD). The IOPS and the throughput of NFS shares scale with the provisioned capacity. See the [provisioned model](understanding-billing.md#provisioned-model) section of the understanding billing article to understand the formulas for IOPS, IO bursting, and throughput. The average IO latencies are low-single-digit-millisecond for small IO size while average metadata latencies are high-single-digit-millisecond. Metadata heavy operations such as untar and workloads like WordPress may face additional latencies due to high number of open and close operations.
+NFS Azure file shares are only offered on premium file shares, which store data on solid-state drives (SSD). The IOPS and throughput of NFS shares scale with the provisioned capacity. See the [provisioned model](understanding-billing.md#provisioned-model) section of the **Understanding billing** article to understand the formulas for IOPS, IO bursting, and throughput. The average IO latencies are low-single-digit-millisecond for small IO size, while average metadata latencies are high-single-digit-millisecond. Metadata heavy operations such as untar and workloads like WordPress may face additional latencies due to the high number of open and close operations.
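As a rough illustration of the provisioned model described above, the sketch below derives share limits from provisioned capacity. The constants, caps, and scaling factors are assumed placeholders for illustration only, not the published Azure Files formulas; consult the provisioned model section linked above for the authoritative values.

```python
# Illustrative sketch of a capacity-driven performance model for premium
# file shares. All constants below are assumed placeholders, NOT the
# published Azure Files formulas -- see the provisioned model docs.

def share_performance(provisioned_gib: int) -> dict:
    """Estimate share limits from provisioned capacity (illustrative only)."""
    baseline_iops = min(3000 + provisioned_gib, 100_000)  # assumed: flat floor + 1 IOPS/GiB, capped
    burst_iops = min(baseline_iops * 3, 100_000)          # assumed: 3x burst, same cap
    throughput_mibps = 100 + provisioned_gib // 10        # assumed linear scaling
    return {
        "baseline_iops": baseline_iops,
        "burst_iops": burst_iops,
        "throughput_mibps": throughput_mibps,
    }

print(share_performance(1024))
```

The key takeaway the sketch encodes is that both IOPS and throughput are functions of provisioned size, so sizing a share up is also how you buy performance.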
## Workloads > [!IMPORTANT]
-> Before using NFS file shares for production, see the [Troubleshoot Azure NFS file shares](storage-troubleshooting-files-nfs.md) article for list of known issues.
+> Before using NFS file shares for production, see the [Troubleshoot Azure NFS file shares](storage-troubleshooting-files-nfs.md) article for a list of known issues.
NFS has been validated to work well with workloads such as SAP application layer, database backups, database replication, messaging queues, home directories for general purpose file servers, and content repositories for application workloads.
storage Storage Files Networking Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/storage-files-networking-overview.md
description: An overview of networking options for Azure Files.
Previously updated : 04/12/2022 Last updated : 04/19/2022 # Azure Files networking considerations
-You can access your Azure file shares over the public internet accessible endpoint, over one or more private endpoints on your network(s), or by caching your Azure file share on-premises with Azure File Sync (SMB file shares only). This article focuses on how to configure Azure Files for direct access over the public and/or private endpoints. To learn more about how to cache your Azure file share on-premises with Azure File Sync, see [Introduction to Azure File Sync](../file-sync/file-sync-introduction.md).
+You can access your Azure file shares over the internet-accessible public endpoint, over one or more private endpoints on your network(s), or by caching your Azure file share on-premises with Azure File Sync (SMB file shares only). This article focuses on how to configure Azure Files for direct access over public and/or private endpoints. To learn how to cache your Azure file share on-premises with Azure File Sync, see [Introduction to Azure File Sync](../file-sync/file-sync-introduction.md).
We recommend reading [Planning for an Azure Files deployment](storage-files-planning.md) prior to reading this conceptual guide.
Directly accessing the Azure file share often requires additional thought with r
- NFS file shares rely on network-level authentication and are therefore only accessible via restricted networks. Using an NFS file share always requires some level of networking configuration.
-Configuration of the public and private endpoints for Azure Files is done on the top-level management object for Azure Files, the Azure storage account. A storage account is a management construct that represents a shared pool of storage in which you can deploy multiple Azure file shares, as well as the storage resources for other Azure storage services, such as blob containers or queues.
+Configuring public and private endpoints for Azure Files is done on the top-level management object for Azure Files, the Azure storage account. A storage account is a management construct that represents a shared pool of storage in which you can deploy multiple Azure file shares, as well as the storage resources for other Azure storage services, such as blob containers or queues.
:::row::: :::column:::
Configuration of the public and private endpoints for Azure Files is done on the top-level management object for Azure Files, the Azure storage account.
| Premium file shares (FileStorage), LRS/ZRS | ![Yes](../media/icons/yes-icon.png) | ![Yes](../media/icons/yes-icon.png) | ## Secure transfer
-By default, Azure storage accounts require secure transfer, regardless of data access is done over the public or private endpoint. For Azure Files, the **require secure transfer** setting is enforced for all protocol access to the data stored on Azure file shares, inclusive of SMB, NFS, and FileREST. The **require secure transfer** setting may be disabled to allow unencrypted traffic. You may also see this mislabeled as "require secure transfer for REST API operations".
+By default, Azure storage accounts require secure transfer, regardless of whether data is accessed over the public or private endpoint. For Azure Files, the **require secure transfer** setting is enforced for all protocol access to the data stored on Azure file shares, including SMB, NFS, and FileREST. The **require secure transfer** setting may be disabled to allow unencrypted traffic. You may also see this setting mislabeled as "require secure transfer for REST API operations".
The SMB, NFS, and FileREST protocols have slightly different behavior with respect to the **require secure transfer** setting: -- When require secure transfer is enabled on a storage account, all SMB file shares in that storage account will require the SMB 3.x protocol with AES-128-CCM, AES-128-GCM, or AES-256-GCM encryption algorithms, depending on the available/required encryption negotiation between the SMB client and Azure Files. You can toggle which SMB encryption algorithms are allowed via the [SMB security settings](files-smb-protocol.md#smb-security-settings). Disabling the **require secure transfer** setting enables SMB 2.1 and SMB 3.x mounts without encryption.
+- When **require secure transfer** is enabled on a storage account, all SMB file shares in that storage account will require the SMB 3.x protocol with AES-128-CCM, AES-128-GCM, or AES-256-GCM encryption algorithms, depending on the available/required encryption negotiation between the SMB client and Azure Files. You can toggle which SMB encryption algorithms are allowed via the [SMB security settings](files-smb-protocol.md#smb-security-settings). Disabling the **require secure transfer** setting enables SMB 2.1 and SMB 3.x mounts without encryption.
- NFS file shares do not support an encryption mechanism, so in order to use the NFS protocol to access an Azure file share, you must disable **require secure transfer** for the storage account.
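The negotiation described above amounts to intersecting the algorithms the client offers with those the account allows and picking the strongest match. The sketch below is a hypothetical model of that selection logic only; the algorithm names match the setting values above, but the function and preference ordering are illustrative assumptions, and the real negotiation is performed between the SMB client and the Azure Files service.

```python
# Hypothetical sketch of SMB encryption negotiation: pick the strongest
# algorithm that both the client and the storage account allow.
# Preference order (strongest first) is an assumption for illustration.
PREFERENCE = ["AES-256-GCM", "AES-128-GCM", "AES-128-CCM"]

def negotiate(client_algos, account_algos, require_secure_transfer=True):
    common = set(client_algos) & set(account_algos)
    for algo in PREFERENCE:
        if algo in common:
            return algo
    # No common encryption algorithm: an unencrypted SMB 2.1/3.x mount is
    # only possible when the require secure transfer setting is disabled.
    if not require_secure_transfer:
        return "unencrypted"
    raise ValueError("mount rejected: no common encryption algorithm")

print(negotiate(["AES-128-GCM", "AES-128-CCM"], ["AES-256-GCM", "AES-128-GCM"]))
```

Toggling the allowed list via the SMB security settings corresponds to shrinking or growing `account_algos` in this model.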
The SMB, NFS, and FileREST protocols have slightly different behavior with respect to the **require secure transfer** setting:
## Public endpoint The public endpoint for the Azure file shares within a storage account is an internet-exposed endpoint. The public endpoint is the default endpoint for a storage account; however, it can be disabled if desired.
-The SMB, NFS, and the FileREST protocols can all use the public endpoint. However, each has slightly different rules for access:
+The SMB, NFS, and FileREST protocols can all use the public endpoint. However, each has slightly different rules for access:
-- SMB file shares are accessible from anywhere in the world via the storage account's public endpoint with SMB 3.x with encryption. This means that authenticated requests, such as requests authorized by a user's logon identity, can originate securely from inside or outside of Azure region. If SMB 2.1 or SMB 3.x without encryption is desired, two conditions must be met:
+- SMB file shares are accessible from anywhere in the world via the storage account's public endpoint with SMB 3.x with encryption. This means that authenticated requests, such as requests authorized by a user's logon identity, can originate securely from inside or outside of the Azure region. If SMB 2.1 or SMB 3.x without encryption is desired, two conditions must be met:
1. The storage account's **require secure transfer** setting must be disabled. 2. The request must originate from inside of the Azure region. As previously mentioned, encrypted SMB requests are allowed from anywhere, inside or outside of the Azure region. - NFS file shares are accessible from the storage account's public endpoint if and only if the storage account's public endpoint is restricted to specific virtual networks using *service endpoints*. See [public endpoint firewall settings](#public-endpoint-firewall-settings) for additional information on *service endpoints*. -- FileREST is accessible via the public endpoint. If secure transfer is required, only HTTPS requests are accepted. If secure transfer is disabled, HTTP requests are accepted by the public endpoint regardless of origin
+- FileREST is accessible via the public endpoint. If secure transfer is required, only HTTPS requests are accepted. If secure transfer is disabled, HTTP requests are accepted by the public endpoint regardless of origin.
### Public endpoint firewall settings The storage account firewall restricts access to the public endpoint for a storage account. Using the storage account firewall, you can restrict access to certain IP addresses/IP address ranges, to specific virtual networks, or disable the public endpoint entirely.
-When you restrict the traffic of the public endpoint to one or more virtual networks, you are using a capability of the virtual network called *service endpoints*. Requests directed to the service endpoint of Azure Files are still going to the storage account public IP address, however the networking layer is doing additional verification of the request to validate that it is coming from an authorized virtual network. SMB, NFS, and FileREST all support service endpoints, however, unlike SMB and FileREST, NFS file shares can only be access with the public endpoint through use of a service endpoint.
+When you restrict the traffic of the public endpoint to one or more virtual networks, you are using a capability of the virtual network called *service endpoints*. Requests directed to the service endpoint of Azure Files are still going to the storage account public IP address; however, the networking layer is doing additional verification of the request to validate that it is coming from an authorized virtual network. The SMB, NFS, and FileREST protocols all support service endpoints. Unlike SMB and FileREST, however, NFS file shares can only be accessed with the public endpoint through use of a *service endpoint*.
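The validation described above can be modeled as a simple rule check: a request passes if it arrives from an authorized virtual network (via a service endpoint) or, for SMB/FileREST, from an allowed IP range. This is a simplified sketch with made-up names; the real checks happen in the Azure networking layer.

```python
# Simplified sketch of storage-firewall request validation. The rule set
# (allowed IP ranges, allowed virtual networks) and all names are
# illustrative; real validation is done by the Azure networking layer.
import ipaddress

def is_request_allowed(source_ip, source_vnet, allowed_ranges, allowed_vnets,
                       protocol="SMB"):
    # NFS shares are reachable over the public endpoint only via a
    # service endpoint, i.e. from an authorized virtual network.
    if protocol == "NFS":
        return source_vnet in allowed_vnets
    if source_vnet in allowed_vnets:
        return True
    ip = ipaddress.ip_address(source_ip)
    return any(ip in ipaddress.ip_network(r) for r in allowed_ranges)

print(is_request_allowed("203.0.113.7", None, ["203.0.113.0/24"], {"vnet-prod"}))
```

Note how the NFS branch never consults the IP allowlist, mirroring the restriction that NFS over the public endpoint requires a service endpoint rather than an IP rule.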
To learn more about how to configure the storage account firewall, see [configure Azure storage firewalls and virtual networks](storage-files-networking-endpoints.md#restrict-access-to-the-public-endpoint-to-specific-virtual-networks).
Using private endpoints with Azure Files enables you to:
To create a private endpoint, see [Configuring private endpoints for Azure Files](storage-files-networking-endpoints.md#create-a-private-endpoint). ### Tunneling traffic over a virtual private network or ExpressRoute
-To make use of private endpoints to access SMB or NFS file shares from on-premises, you must establish a network tunnel between your on-premises network and Azure. A [virtual network](../../virtual-network/virtual-networks-overview.md), or VNet, is similar to a traditional network that you'd operate on-premises. Like an Azure storage account or an Azure VM, a VNet is an Azure resource that is deployed in a resource group.
+To use private endpoints to access SMB or NFS file shares from on-premises, you must establish a network tunnel between your on-premises network and Azure. A [virtual network](../../virtual-network/virtual-networks-overview.md), or VNet, is similar to a traditional on-premises network. Like an Azure storage account or an Azure VM, a VNet is an Azure resource that is deployed in a resource group.
Azure Files supports the following mechanisms to tunnel traffic between your on-premises workstations and servers and Azure SMB/NFS file shares: -- [Azure VPN Gateway](../../vpn-gateway/vpn-gateway-about-vpngateways.md): A VPN gateway is a specific type of virtual network gateway that is used to send encrypted traffic between an Azure virtual network and an alternate location (such as on-premises) over the internet. An Azure VPN Gateway is an Azure resource that can be deployed in a resource group along side of a storage account or other Azure resources. VPN gateways expose two different types of connections:
- - [Point-to-Site (P2S) VPN](../../vpn-gateway/point-to-site-about.md) gateway connections, which are VPN connections between Azure and an individual client. This solution is primarily useful for devices that are not part of your organization's on-premises network, such as telecommuters who want to be able to mount their Azure file share from home, a coffee shop, or hotel while on the road. To use a P2S VPN connection with Azure Files, a P2S VPN connection will need to be configured for each client that wants to connect. To simplify the deployment of a P2S VPN connection, see [Configure a Point-to-Site (P2S) VPN on Windows for use with Azure Files](storage-files-configure-p2s-vpn-windows.md) and [Configure a Point-to-Site (P2S) VPN on Linux for use with Azure Files](storage-files-configure-p2s-vpn-linux.md).
- - [Site-to-Site (S2S) VPN](../../vpn-gateway/design.md#s2smulti), which are VPN connections between Azure and your organization's network. A S2S VPN connection enables you to configure a VPN connection once, for a VPN server or device hosted on your organization's network, rather than doing for every client device that needs to access your Azure file share. To simplify the deployment of a S2S VPN connection, see [Configure a Site-to-Site (S2S) VPN for use with Azure Files](storage-files-configure-s2s-vpn.md).
+- [Azure VPN Gateway](../../vpn-gateway/vpn-gateway-about-vpngateways.md): A VPN gateway is a specific type of virtual network gateway that is used to send encrypted traffic between an Azure virtual network and an alternate location (such as on-premises) over the internet. An Azure VPN Gateway is an Azure resource that can be deployed in a resource group alongside a storage account or other Azure resources. VPN gateways expose two different types of connections:
+ - [Point-to-Site (P2S) VPN](../../vpn-gateway/point-to-site-about.md) gateway connections, which are VPN connections between Azure and an individual client. This solution is primarily useful for devices that are not part of your organization's on-premises network. A common use case is for telecommuters who want to be able to mount their Azure file share from home, a coffee shop, or hotel while on the road. To use a P2S VPN connection with Azure Files, you'll need to configure a P2S VPN connection for each client that wants to connect. To simplify the deployment of a P2S VPN connection, see [Configure a Point-to-Site (P2S) VPN on Windows for use with Azure Files](storage-files-configure-p2s-vpn-windows.md) and [Configure a Point-to-Site (P2S) VPN on Linux for use with Azure Files](storage-files-configure-p2s-vpn-linux.md).
+ - [Site-to-Site (S2S) VPN](../../vpn-gateway/design.md#s2smulti), which are VPN connections between Azure and your organization's network. An S2S VPN connection enables you to configure a VPN connection once for a VPN server or device hosted on your organization's network, rather than configuring a connection for every client device that needs to access your Azure file share. To simplify the deployment of an S2S VPN connection, see [Configure a Site-to-Site (S2S) VPN for use with Azure Files](storage-files-configure-s2s-vpn.md).
- [ExpressRoute](../../expressroute/expressroute-introduction.md), which enables you to create a defined route between Azure and your on-premises network that doesn't traverse the internet. Because ExpressRoute provides a dedicated path between your on-premises datacenter and Azure, ExpressRoute may be useful when network performance is a consideration. ExpressRoute is also a good option when your organization's policy or regulatory requirements require a deterministic path to your resources in the cloud. > [!Note]
-> Although we recommend using private endpoints to assist in extending your on-premises network into Azure, it is technically possible to route to the public endpoint over the VPN connection, however this requires hard-coding the IP address for the public endpoint for Azure storage cluster that serves your storage account. Since storage accounts may be moved between storage clusters at any time and new clusters are added and removed all the time, this requires regularly hard-coding all possible the Azure storage IP addresses into your routing rules.
+> Although we recommend using private endpoints to assist in extending your on-premises network into Azure, it is technically possible to route to the public endpoint over the VPN connection. However, this requires hard-coding the IP address for the public endpoint for the Azure storage cluster that serves your storage account. Because storage accounts may be moved between storage clusters at any time and new clusters are frequently added and removed, this requires regularly hard-coding all the possible Azure storage IP addresses into your routing rules.
### DNS configuration
-When you create a private endpoint, by default we also create a (or update an existing) private DNS zone corresponding to the `privatelink` subdomain. Strictly speaking, creating a private DNS zone is not required to use a private endpoint for your storage account, but it is highly recommended in general and explicitly required when mounting your Azure file share with an Active Directory user principal or accessing from the FileREST API.
+When you create a private endpoint, by default we also create a (or update an existing) private DNS zone corresponding to the `privatelink` subdomain. Strictly speaking, creating a private DNS zone is not required to use a private endpoint for your storage account. However, it is highly recommended in general and explicitly required when mounting your Azure file share with an Active Directory user principal or accessing it from the FileREST API.
> [!Note]
-> This article uses the storage account DNS suffix for the Azure Public regions, `core.windows.net`. This commentary also applies to Azure Sovereign clouds such as the Azure US Government cloud and the Azure China cloud - just substitute the the appropriate suffixes for your environment.
+> This article uses the storage account DNS suffix for the Azure Public regions, `core.windows.net`. This commentary also applies to Azure Sovereign clouds such as the Azure US Government cloud and the Azure China cloud; just substitute the appropriate suffixes for your environment.
-In your private DNS zone, we create an A record for `storageaccount.privatelink.file.core.windows.net` and a CNAME record for the regular name of the storage account, which follows the pattern `storageaccount.file.core.windows.net`. Since your Azure private DNS zone is connected to the virtual network containing the private endpoint, you can observe the DNS configuration when by calling the `Resolve-DnsName` cmdlet from PowerShell in an Azure VM (alternately `nslookup` in Windows and Linux):
+In your private DNS zone, we create an A record for `storageaccount.privatelink.file.core.windows.net` and a CNAME record for the regular name of the storage account, which follows the pattern `storageaccount.file.core.windows.net`. Because your Azure private DNS zone is connected to the virtual network containing the private endpoint, you can observe the DNS configuration by calling the `Resolve-DnsName` cmdlet from PowerShell in an Azure VM (alternately `nslookup` in Windows and Linux):
```powershell Resolve-DnsName -Name "storageaccount.file.core.windows.net"
IP4Address : 52.239.194.40
This reflects the fact that the storage account can expose both the public endpoint and one or more private endpoints. To ensure that the storage account name resolves to the private endpoint's private IP address, you must change the configuration on your on-premises DNS servers. This can be accomplished in several ways: -- Modifying the hosts file on your clients to make `storageaccount.file.core.windows.net` resolve to the desired private endpoint's private IP address. This is strongly discouraged for production environments, since you will need make these changes to every client that wants to mount your Azure file shares and changes to the storage account or private endpoint will not be automatically handled.-- Creating an A record for `storageaccount.file.core.windows.net` in your on-premises DNS servers. This has the advantage that clients in your on-premises environment will be able to automatically resolve the storage account without needing to configure each client, however this solution is similarly brittle to modifying the hosts file because changes are not reflected. Although this solution is brittle, it may be the best choice for some environments.
+- Modifying the *hosts* file on your clients to make `storageaccount.file.core.windows.net` resolve to the desired private endpoint's private IP address. This is strongly discouraged for production environments, because you will need to make these changes to every client that wants to mount your Azure file shares, and changes to the storage account or private endpoint will not be automatically handled.
+- Creating an A record for `storageaccount.file.core.windows.net` in your on-premises DNS servers. This has the advantage that clients in your on-premises environment will be able to automatically resolve the storage account without needing to configure each client. However, this solution is similarly brittle to modifying the *hosts* file because changes are not reflected. Although this solution is brittle, it may be the best choice for some environments.
- Forward the `core.windows.net` zone from your on-premises DNS servers to your Azure private DNS zone. The Azure private DNS host can be reached through a special IP address (`168.63.129.16`) that is only accessible inside virtual networks that are linked to the Azure private DNS zone. To work around this limitation, you can run additional DNS servers within your virtual network that will forward `core.windows.net` on to the Azure private DNS zone. To simplify this setup, we have provided PowerShell cmdlets that will auto-deploy DNS servers in your Azure virtual network and configure them as desired. To learn how to set up DNS forwarding, see [Configuring DNS with Azure Files](storage-files-networking-dns.md). ## SMB over QUIC Windows Server 2022 Azure Edition supports a new transport protocol called QUIC for the SMB server provided by the File Server role. QUIC is a replacement for TCP that is built on top of UDP, providing numerous advantages over TCP while still providing a reliable transport mechanism. Although there are multiple advantages to QUIC as a transport protocol, one key advantage for the SMB protocol is that all transport is done over port 443, which is widely open outbound to support HTTPS. This effectively means that SMB over QUIC offers an "SMB VPN" for file sharing over the public internet. Windows 11 ships with an SMB over QUIC-capable client.
-Unfortunately, Azure Files does not directly support SMB over QUIC, however, you can create a lightweight cache of your Azure file shares on a Windows Server 2022 Azure Edition VM using Azure File Sync. To learn more about this option, see [Deploy Azure File Sync](../file-sync/file-sync-deployment-guide.md) and [SMB over QUIC](/windows-server/storage/file-server/smb-over-quic).
+At this time, Azure Files does not directly support SMB over QUIC. However, you can create a lightweight cache of your Azure file shares on a Windows Server 2022 Azure Edition VM using Azure File Sync. To learn more about this option, see [Deploy Azure File Sync](../file-sync/file-sync-deployment-guide.md) and [SMB over QUIC](/windows-server/storage/file-server/smb-over-quic).
## See also - [Azure Files overview](storage-files-introduction.md)
storage Storage Files Planning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/storage-files-planning.md
description: Understand planning for an Azure Files deployment. You can either d
Previously updated : 04/12/2022 Last updated : 04/19/2022 # Planning for an Azure Files deployment
-[Azure Files](storage-files-introduction.md) can be deployed in two main ways: by directly mounting the serverless Azure file shares or by caching Azure file shares on-premises using Azure File Sync. Which deployment option you choose changes the things you need to consider as you plan for your deployment.
+[Azure Files](storage-files-introduction.md) can be deployed in two main ways: by directly mounting the serverless Azure file shares or by caching Azure file shares on-premises using Azure File Sync. Deployment considerations will differ based on which option you choose.
- **Direct mount of an Azure file share**: Because Azure Files provides either Server Message Block (SMB) or Network File System (NFS) access, you can mount Azure file shares on-premises or in the cloud using the standard SMB or NFS clients available in your OS. Because Azure file shares are serverless, deploying for production scenarios does not require managing a file server or NAS device. This means you don't have to apply software patches or swap out physical disks.
This article primarily addresses deployment considerations for deploying an Azure file share to be directly mounted by an on-premises or cloud client. To plan for an Azure File Sync deployment, see [Planning for an Azure File Sync deployment](../file-sync/file-sync-planning.md). ## Available protocols
-Azure Files offers two industry-standard protocols for mounting Azure file share: the [Server Message Block (SMB)](files-smb-protocol.md) protocol and the [Network File System (NFS)](files-nfs-protocol.md) protocol. Azure Files enables you to pick the file system protocol that is the best fit for your workload. Azure file shares do not support both the SMB and NFS protocols on the same file share, although you can create SMB and NFS Azure file shares within the same storage account. NFS 4.1 is currently only supported within new **FileStorage** storage account type (premium file shares only).
+Azure Files offers two industry-standard file system protocols for mounting Azure file shares: the [Server Message Block (SMB)](files-smb-protocol.md) protocol and the [Network File System (NFS)](files-nfs-protocol.md) protocol, allowing you to choose the protocol that is the best fit for your workload. Azure file shares do not support both the SMB and NFS protocols on the same file share, although you can create SMB and NFS Azure file shares within the same storage account. NFS 4.1 is currently only supported within the new **FileStorage** storage account type (premium file shares only).
With both SMB and NFS file shares, Azure Files offers enterprise-grade file shares that can scale up to meet your storage needs and can be accessed concurrently by thousands of clients.
With both SMB and NFS file shares, Azure Files offers enterprise-grade file shar
When deploying Azure file shares into storage accounts, we recommend: -- Only deploying Azure file shares into storage accounts with other Azure file shares. Although GPv2 storage accounts allow you to have mixed purpose storage accounts, since storage resources such as Azure file shares and blob containers share the storage account's limits, mixing resources together may make it more difficult to troubleshoot performance issues later on.
+- Only deploying Azure file shares into storage accounts with other Azure file shares. Although GPv2 storage accounts allow you to have mixed purpose storage accounts, because storage resources such as Azure file shares and blob containers share the storage account's limits, mixing resources together may make it more difficult to troubleshoot performance issues later on.
-- Paying attention to a storage account's IOPS limitations when deploying Azure file shares. Ideally, you would map file shares 1:1 with storage accounts, however this may not always be possible due to various limits and restrictions, both from your organization and from Azure. When it is not possible to have only one file share deployed in one storage account, consider which shares will be highly active and which shares will be less active to ensure that the hottest file shares don't get put in the same storage account together.
+- Paying attention to a storage account's IOPS limitations when deploying Azure file shares. Ideally, you would map file shares 1:1 with storage accounts. However, this may not always be possible due to various limits and restrictions, both from your organization and from Azure. When it is not possible to have only one file share deployed in one storage account, consider which shares will be highly active and which shares will be less active to ensure that the hottest file shares don't get put in the same storage account together.
-- Only deploy GPv2 and FileStorage accounts and upgrade GPv1 and classic storage accounts when you find them in your environment.
+- Only deploying GPv2 and FileStorage accounts and upgrading GPv1 and classic storage accounts when you find them in your environment.
## Identity
-To access an Azure file share, the user of the file share must be authenticated and have authorization to access the share. This is done based on the identity of the user accessing the file share. Azure Files integrates with three main identity providers:
-- **On-premises Active Directory Domain Services (AD DS, or on-premises AD DS)**: Azure storage accounts can be domain joined to a customer-owned, Active Directory Domain Services, just like a Windows Server file server or NAS device. You can deploy a domain controller on-premises, in an Azure VM, or even as a VM in another cloud provider; Azure Files is agnostic to where your domain controller is hosted. Once a storage account is domain-joined, the end user can mount a file share with the user account they signed into their PC with. AD-based authentication uses the Kerberos authentication protocol.
+To access an Azure file share, the user of the file share must be authenticated and authorized to access the share. This is done based on the identity of the user accessing the file share. Azure Files integrates with three main identity providers:
+- **On-premises Active Directory Domain Services (AD DS, or on-premises AD DS)**: Azure storage accounts can be domain joined to a customer-owned Active Directory Domain Services environment, just like a Windows Server file server or NAS device. You can deploy a domain controller on-premises, in an Azure VM, or even as a VM in another cloud provider; Azure Files is agnostic to where your domain controller is hosted. Once a storage account is domain-joined, the end user can mount a file share with the user account they signed into their PC with. AD-based authentication uses the Kerberos authentication protocol.
- **Azure Active Directory Domain Services (Azure AD DS)**: Azure AD DS provides a Microsoft-managed domain controller that can be used for Azure resources. Domain joining your storage account to Azure AD DS provides similar benefits to domain joining it to a customer-owned Active Directory. This deployment option is most useful for application lift-and-shift scenarios that require AD-based permissions. Since Azure AD DS provides AD-based authentication, this option also uses the Kerberos authentication protocol.-- **Azure storage account key**: Azure file shares may also be mounted with an Azure storage account key. To mount a file share this way, the storage account name is used as the username and the storage account key is used as a password. Using the storage account key to mount the Azure file share is effectively an administrator operation, since the mounted file share will have full permissions to all of the files and folders on the share, even if they have ACLs. When using the storage account key to mount over SMB, the NTLMv2 authentication protocol is used.
+- **Azure storage account key**: Azure file shares may also be mounted with an Azure storage account key. To mount a file share this way, the storage account name is used as the username and the storage account key is used as a password. Using the storage account key to mount the Azure file share is effectively an administrator operation, because the mounted file share will have full permissions to all of the files and folders on the share, even if they have ACLs. When using the storage account key to mount over SMB, the NTLMv2 authentication protocol is used.
For customers migrating from on-premises file servers, or creating new file shares in Azure Files intended to behave like Windows file servers or NAS appliances, domain joining your storage account to **Customer-owned Active Directory** is the recommended option. To learn more about domain joining your storage account to a customer-owned Active Directory, see [Azure Files Active Directory overview](storage-files-active-directory-overview.md).
Azure Files supports two different types of encryption: encryption in transit, w
By default, all Azure storage accounts have encryption in transit enabled. This means that when you mount a file share over SMB or access it via the FileREST protocol (such as through the Azure portal, PowerShell/CLI, or Azure SDKs), Azure Files will only allow the connection if it is made with SMB 3.x with encryption or HTTPS. Clients that do not support SMB 3.x or clients that support SMB 3.x but not SMB encryption will not be able to mount the Azure file share if encryption in transit is enabled. For more information about which operating systems support SMB 3.x with encryption, see our detailed documentation for [Windows](storage-how-to-use-files-windows.md), [macOS](storage-how-to-use-files-mac.md), and [Linux](storage-how-to-use-files-linux.md). All current versions of the PowerShell, CLI, and SDKs support HTTPS.
-You can disable encryption in transit for an Azure storage account. When encryption is disabled, Azure Files will also allow SMB 2.1, SMB 3.x without encryption, and unencrypted FileREST API calls over HTTP. The primary reason to disable encryption in transit is to support a legacy application that must be run on an older operating system, such as Windows Server 2008 R2 or older Linux distribution. Azure Files only allows SMB 2.1 connections within the same Azure region as the Azure file share; an SMB 2.1 client outside of the Azure region of the Azure file share, such as on-premises or in a different Azure region, will not be able to access the file share.
+You can disable encryption in transit for an Azure storage account. When encryption is disabled, Azure Files will also allow SMB 2.1 and SMB 3.x without encryption, and unencrypted FileREST API calls over HTTP. The primary reason to disable encryption in transit is to support a legacy application that must be run on an older operating system, such as Windows Server 2008 R2 or an older Linux distribution. Azure Files only allows SMB 2.1 connections within the same Azure region as the Azure file share; an SMB 2.1 client outside of the Azure region of the Azure file share, such as on-premises or in a different Azure region, will not be able to access the file share.
We strongly recommend ensuring encryption of data in-transit is enabled.
storage Understanding Billing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/files/understanding-billing.md
description: Learn how to interpret the provisioned and pay-as-you-go billing mo
Previously updated : 4/16/2022 Last updated : 4/19/2022
Azure Files provides two distinct billing models: provisioned and pay-as-you-go.
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/m5_-GsKv4-o" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe> :::column-end::: :::column:::
- This video is an interview discussing covering the basics of the Azure Files billing model including how to optimize Azure file shares to achieve the lowest costs possible and how to compare Azure Files to other file storage offerings on-premises and in the cloud.
+ This video is an interview covering the basics of the Azure Files billing model, including how to optimize Azure file shares to achieve the lowest costs possible and how to compare Azure Files to other file storage offerings on-premises and in the cloud.
:::column-end::: :::row-end:::
If you are migrating to Azure Files from on-premises or comparing Azure Files to
- **How do you pay for storage, IOPS, and bandwidth?** With Azure Files, the billing model you use depends on whether you are deploying [premium](#provisioned-model) or [standard](#pay-as-you-go-model) file shares. Most cloud solutions have models that align with the principles of either provisioned storage (price determinism, simplicity) or pay-as-you-go storage (pay only for what you actually use). Of particular interest for provisioned models are minimum provisioned share size, the provisioning unit, and the ability to increase and decrease provisioning. -- **Are there any methods to optimized storage costs?** With Azure Files, you can use [capacity reservations](#reserve-capacity) to achieve an up to 36% discount on storage. Other solutions may employ storage efficiency strategies like deduplication or compression to optionally optimized storage, but remember, these storage optimization strategies often have non-monetary costs, such as reducing performance. Azure Files capacity reservations have no side-effects on performance.
+- **Are there any methods to optimize storage costs?** With Azure Files, you can use [capacity reservations](#reserve-capacity) to achieve up to a 36% discount on storage. Other solutions may employ storage efficiency strategies like deduplication or compression to optimize storage, but remember, these storage optimization strategies often have non-monetary costs, such as reducing performance. Azure Files capacity reservations have no side effects on performance.
- **How do you achieve storage resiliency and redundancy?** With Azure Files, storage resiliency and redundancy are baked into the product offering. All tiers and redundancy levels ensure that data is highly available and at least three copies of your data are accessible. When considering other file storage options, consider whether storage resiliency and redundancy is built-in or something you must assemble yourself. - **What do you need to manage?** With Azure Files, the basic unit of management is a storage account. Other solutions may require additional management, such as operating system updates or virtual resource management (VMs, disks, network IP addresses, etc.). -- **What are the costs of value-added products, like backup, security, etc.?** Azure Files supports integrations with multiple first- and third-party [value-added services](#value-added-services), such as Azure Backup, Azure File Sync, and Azure Defender that provide backup, replication and caching, and additional security functionality for Azure Files. Value-added solutions on-premises or with other cloud storage solutions will have their own licensing and product costs, and should be considered consistently as part of the total cost of ownership for file storage.
+- **What are the costs of value-added products, like backup, security, etc.?** Azure Files supports integrations with multiple first- and third-party [value-added services](#value-added-services), such as Azure Backup, Azure File Sync, and Azure Defender, that provide backup, replication and caching, and additional security functionality for Azure Files. Value-added solutions, whether on-premises or in the cloud, will have their own licensing and product costs, and should be considered as part of the total cost of ownership for file storage.
## Reserve capacity Azure Files supports storage capacity reservations, which enable you to achieve a discount on storage by pre-committing to storage utilization. You should consider purchasing reserved instances for any production workload, or dev/test workloads with consistent footprints. When you purchase reserved capacity, your reservation must specify the following dimensions: - **Capacity size**: Capacity reservations can be for either 10 TiB or 100 TiB, with more significant discounts for purchasing a higher capacity reservation. You can purchase multiple reservations, including reservations of different capacity sizes to meet your workload requirements. For example, if your production deployment has 120 TiB of file shares, you could purchase one 100 TiB reservation and two 10 TiB reservations to meet the total capacity requirements.-- **Term**: Reservations can be purchased for either a one year or three year term, with more significant discounts for purchasing a longer reservation term.
+- **Term**: Reservations can be purchased for either a one-year or three-year term, with more significant discounts for purchasing a longer reservation term.
- **Tier**: The tier of Azure Files for the capacity reservation. Reservations for Azure Files currently are available for the premium, hot, and cool tiers. - **Location**: The Azure region for the capacity reservation. Capacity reservations are available in a subset of Azure regions. - **Redundancy**: The storage redundancy for the capacity reservation. Reservations are supported for all redundancies Azure Files supports, including LRS, ZRS, GRS, and GZRS.
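The 120 TiB example above can be expressed as a small calculation. The helper below is a hypothetical sketch for illustration, not an official sizing tool; it simply covers a capacity requirement with the two documented reservation sizes:

```python
import math

def reservation_mix(capacity_tib):
    """Cover a capacity requirement with 100 TiB and 10 TiB reservations.

    Greedy split: 100 TiB reservations first (they carry the larger
    discount), then round the remainder up to whole 10 TiB units.
    """
    n_100 = capacity_tib // 100
    remainder = capacity_tib - n_100 * 100
    n_10 = math.ceil(remainder / 10)
    return n_100, n_10

# The example from the text: 120 TiB of file shares
# -> one 100 TiB reservation and two 10 TiB reservations.
print(reservation_mix(120))  # (1, 2)
```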
Once you purchase a capacity reservation, it will automatically be consumed by y
For more information on how to purchase storage reservations, see [Optimize costs for Azure Files with reserved capacity](files-reserve-capacity.md). ## Provisioned model
-Azure Files uses a provisioned model for premium file shares. In a provisioned business model, you proactively specify to the Azure Files service what your storage requirements are, rather than being billed based on what you use. This is similar to buying hardware on-premises, in that when you provision an Azure file share with a certain amount of storage, you pay for that storage regardless of whether you use it or not, just like you don't start paying the costs of physical media on-premises when you start to use space. Unlike purchasing physical media on-premises, provisioned file shares can be dynamically scaled up or down depending on your storage and IO performance characteristics.
+Azure Files uses a provisioned model for premium file shares. In a provisioned business model, you proactively specify to the Azure Files service what your storage requirements are, rather than being billed based on what you use. This is similar to buying hardware on-premises in that when you provision an Azure file share with a certain amount of storage, you pay for that storage regardless of whether you use it or not, just as you pay for physical media when you buy it, not when you start using the space. Unlike purchasing physical media on-premises, provisioned file shares can be dynamically scaled up or down depending on your storage and IO performance characteristics.
The provisioned size of the file share can be increased at any time but can be decreased only after 24 hours since the last increase. After waiting for 24 hours without a quota increase, you can decrease the share quota as many times as you like, until you increase it again. IOPS/throughput scale changes will be effective within a few minutes after the provisioned size change.
-It is possible to decrease the size of your provisioned share below your used GiB. If you do this, you will not lose data but, you will still be billed for the size used and receive the performance (baseline IOPS, throughput, and burst IOPS) of the provisioned share, not the size used.
+It is possible to decrease the size of your provisioned share below your used GiB. If you do this, you will not lose data, but you will still be billed for the size used and receive the performance (baseline IOPS, throughput, and burst IOPS) of the provisioned share, not the size used.
### Provisioning method When you provision a premium file share, you specify how many GiBs your workload requires. Each GiB that you provision entitles you to additional IOPS and throughput on a fixed ratio. In addition to the baseline IOPS for which you are guaranteed, each premium file share supports bursting on a best effort basis. The formulas for IOPS and throughput are as follows:
The following table illustrates a few examples of these formulae for the provisi
| 51,200 | 54,200 | Up to 100,000 | 164,880,000 | 5,220 | | 102,400 | 100,000 | Up to 100,000 | 0 | 10,340 |
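A minimal sketch of these formulae, with coefficients inferred from the table rows above (the coefficients are an assumption for illustration, not an official formula reference; check the current scale-targets documentation before relying on them):

```python
def premium_share_performance(provisioned_gib):
    """Approximate premium file share performance targets.

    Assumed formulas, inferred from the table above:
      baseline IOPS    = min(3000 + 1 * GiB, 100000)
      burst IOPS       = min(max(3 * baseline, 4000), 100000)
      burst credits    = (burst IOPS - baseline IOPS) * 3600
      throughput MiB/s = 100 + 0.1 * GiB
    """
    baseline_iops = min(3_000 + provisioned_gib, 100_000)
    burst_iops = min(max(3 * baseline_iops, 4_000), 100_000)
    burst_credits = (burst_iops - baseline_iops) * 3_600
    throughput_mib_s = 100 + 0.1 * provisioned_gib
    return baseline_iops, burst_iops, burst_credits, throughput_mib_s

# Reproduces the 51,200 GiB table row: 54,200 baseline IOPS, burst up
# to 100,000, 164,880,000 burst credits, 5,220 MiB/s throughput.
print(premium_share_performance(51_200))  # (54200, 100000, 164880000, 5220.0)
```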
-Effective file share performance is subject to machine network limits, available network bandwidth, IO sizes, parallelism, among many other factors. For example, based on internal testing with 8 KiB read/write IO sizes, a single Windows virtual machine without SMB Multichannel enabled, *Standard F16s_v2*, connected to premium file share over SMB could achieve 20K read IOPS and 15K write IOPS. With 512 MiB read/write IO sizes, the same VM could achieve 1.1 GiB/s egress and 370 MiB/s ingress throughput. The same client can achieve up to \~3x performance if SMB Multichannel is enabled on the premium shares. To achieve maximum performance scale, [enable SMB Multichannel](files-smb-protocol.md#smb-multichannel) and spread the load across multiple VMs. Refer to [SMB Multichannel performance](storage-files-smb-multichannel-performance.md) and [troubleshooting guide](storage-troubleshooting-files-performance.md) for some common performance issues and workarounds.
+Effective file share performance is subject to machine network limits, available network bandwidth, IO sizes, and parallelism, among many other factors. For example, based on internal testing with 8 KiB read/write IO sizes, a single Windows virtual machine (*Standard F16s_v2*) without SMB Multichannel enabled, connected to a premium file share over SMB, could achieve 20K read IOPS and 15K write IOPS. With 512 MiB read/write IO sizes, the same VM could achieve 1.1 GiB/s egress and 370 MiB/s ingress throughput. The same client can achieve up to \~3x performance if SMB Multichannel is enabled on the premium shares. To achieve maximum performance scale, [enable SMB Multichannel](files-smb-protocol.md#smb-multichannel) and spread the load across multiple VMs. Refer to [SMB Multichannel performance](storage-files-smb-multichannel-performance.md) and the [troubleshooting guide](storage-troubleshooting-files-performance.md) for some common performance issues and workarounds.
### Bursting
-If your workload needs the extra performance to meet peak demand, your share can use burst credits to go above share baseline IOPS limit to offer share performance it needs to meet the demand. Premium file shares can burst their IOPS up to 4,000 or up to a factor of three, whichever is a higher value. Bursting is automated and operates based on a credit system. Bursting works on a best effort basis and the burst limit is not a guarantee.
+If your workload needs the extra performance to meet peak demand, your share can use burst credits to go above the share's baseline IOPS limit to give the share the performance it needs to meet the demand. Premium file shares can burst their IOPS up to 4,000 or up to a factor of three, whichever is a higher value. Bursting is automated and operates based on a credit system. Bursting works on a best effort basis, and the burst limit is not a guarantee.
-Credits accumulate in a burst bucket whenever traffic for your file share is below baseline IOPS. For example, a 100 GiB share has 500 baseline IOPS. If actual traffic on the share was 100 IOPS for a specific 1-second interval, then the 400 unused IOPS are credited to a burst bucket. Similarly, an idle 1 TiB share, accrues burst credit at 1,424 IOPS. These credits will then be used later when operations would exceed the baseline IOPS.
+Credits accumulate in a burst bucket whenever traffic for your file share is below baseline IOPS. For example, a 100 GiB share has 500 baseline IOPS. If actual traffic on the share was 100 IOPS for a specific 1-second interval, then the 400 unused IOPS are credited to a burst bucket. Similarly, an idle 1 TiB share accrues burst credit at 1,424 IOPS. These credits will then be used later when operations would exceed the baseline IOPS.
-Whenever a share exceeds the baseline IOPS and has credits in a burst bucket, it will burst up to the max allowed peak burst rate. Shares can continue to burst as long as credits are remaining but, this is based on the number of burst credits accrued. Each IO beyond baseline IOPS consumes one credit and once all credits are consumed the share would return to the baseline IOPS.
+Whenever a share exceeds the baseline IOPS and has credits in a burst bucket, it will burst up to the maximum allowed peak burst rate. Shares can continue to burst as long as credits are remaining, but this is based on the number of burst credits accrued. Each IO beyond baseline IOPS consumes one credit, and once all credits are consumed, the share would return to the baseline IOPS.
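The credit mechanics above can be sketched as a short simulation. It reuses the article's own example numbers (a 100 GiB share with 500 baseline IOPS, burstable to 4,000 IOPS); the bucket capacity, initial credits, and traffic pattern are hypothetical values chosen for illustration:

```python
def simulate_bucket(baseline_iops, burst_iops, bucket_capacity, traffic, credits=0):
    """Track burst credits second by second.

    traffic is a list of per-second IOPS demand. Below baseline, unused
    IOPS accrue as credits (up to the bucket capacity); above baseline,
    each IO beyond baseline consumes one credit, and the share can only
    exceed baseline while credits remain, up to the burst limit.
    """
    served = []
    for demand in traffic:
        if demand <= baseline_iops:
            credits = min(credits + (baseline_iops - demand), bucket_capacity)
            served.append(demand)
        else:
            wanted = min(demand, burst_iops) - baseline_iops
            burst = min(wanted, credits)
            credits -= burst
            served.append(baseline_iops + burst)
    return served, credits

# 100 GiB share: 500 baseline IOPS, burstable to 4,000 (per the text).
# The quiet second (100 IOPS) banks 400 credits, which are then spent
# to serve 900 IOPS in the busy second; served == [100, 900].
served, remaining = simulate_bucket(500, 4000, 10_000, [100, 900])
```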
Share credits have three states:
Share credits have three states:
New file shares start with the full number of credits in its burst bucket. Burst credits will not be accrued if the share IOPS fall below baseline IOPS due to throttling by the server. ## Pay-as-you-go model
-Azure Files uses a pay-as-you-go business model for standard file shares. In a pay-as-you-go business model, the amount you pay is determined by how much you actually use, rather than based on a provisioned amount. At a high level, you pay a cost for the amount of logical data stored, and then an additional set of transactions based on your usage of that data. A pay-as-you-go model can be cost-efficient, because you don't need to overprovision to account for future growth or performance requirements or deprovision if your workload is data footprint varies over time. On the other hand, a pay-as-you-go model can also be difficult to plan as part of a budgeting process, because the pay-as-you-go billing model is driven by end-user consumption.
+Azure Files uses a pay-as-you-go business model for standard file shares. In a pay-as-you-go business model, the amount you pay is determined by how much you actually use, rather than based on a provisioned amount. At a high level, you pay a cost for the amount of logical data stored, and then an additional set of transactions based on your usage of that data. A pay-as-you-go model can be cost-efficient, because you don't need to overprovision to account for future growth or performance requirements, or deprovision if your workload and data footprint vary over time. On the other hand, a pay-as-you-go model can also be difficult to plan as part of a budgeting process, because the pay-as-you-go billing model is driven by end-user consumption.
### Differences in standard tiers
-When you create a standard file share, you pick between the transaction optimized, hot, and cool tiers. All three tiers are stored on the exact same standard storage hardware. The main difference for these three tiers is their data at-rest storage prices, which are lower in cooler tiers, and the transaction prices, which are higher in the cooler tiers. This means:
+When you create a standard file share, you pick between the following tiers: transaction optimized, hot, and cool. All three tiers are stored on the exact same standard storage hardware. The main difference for these three tiers is their data at-rest storage prices, which are lower in cooler tiers, and the transaction prices, which are higher in the cooler tiers. This means:
- Transaction optimized, as the name implies, optimizes the price for high transaction workloads. Transaction optimized has the highest data at-rest storage price, but the lowest transaction prices. - Hot is for active workloads that do not involve a large number of transactions, and has a slightly lower data at-rest storage price, but slightly higher transaction prices as compared to transaction optimized. Think of it as the middle ground between the transaction optimized and cool tiers.
When you create a standard file share, you pick between the transaction optimize
If you put an infrequently accessed workload in the transaction optimized tier, you will pay almost nothing for the few times in a month that you make transactions against your share, but you will pay a high amount for the data storage costs. If you were to move this same share to the cool tier, you would still pay almost nothing for the transaction costs, simply because you are very infrequently making transactions for this workload, but the cool tier has a much cheaper data storage price. Selecting the appropriate tier for your use case allows you to considerably reduce your costs.
-Similarly, if you put a highly accessed workload in the cool tier, you will pay a lot more in transaction costs, but less for data storage costs. This can lead to a situation where the increased costs from the transaction prices increase outweigh the savings from the decreased data storage price, leading you to pay more money on cool than you would have on transaction optimized. It is possible for some usage levels that while the hot tier will be the most cost efficient tier, the cool tier will be more expensive than transaction optimized.
+Similarly, if you put a highly accessed workload in the cool tier, you will pay a lot more in transaction costs, but less for data storage costs. This can lead to a situation where the increased transaction costs outweigh the savings from the decreased data storage price, leading you to pay more for cool than you would have for transaction optimized. For some usage levels, it's possible that the hot tier will be the most cost efficient, and the cool tier will be more expensive than transaction optimized.
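The tier trade-off described above can be sketched numerically. The prices below are hypothetical placeholders (real prices vary by region and redundancy; use the pricing calculator for actual figures), chosen only to show how transaction volume flips which tier is cheapest:

```python
# Hypothetical per-month prices, for illustration only -- not real Azure prices.
TIERS = {
    #                        $/GiB stored   $/10k transactions
    "transaction optimized": (0.060,        0.015),
    "hot":                   (0.025,        0.060),
    "cool":                  (0.015,        0.150),
}

def monthly_cost(tier, stored_gib, transactions):
    """Storage cost plus transaction cost, per the pay-as-you-go model."""
    storage_price, txn_price = TIERS[tier]
    return stored_gib * storage_price + (transactions / 10_000) * txn_price

def cheapest_tier(stored_gib, transactions):
    return min(TIERS, key=lambda t: monthly_cost(t, stored_gib, transactions))

# An archival share (1 TiB, few transactions) favors cool, while a busy
# share (1 TiB, tens of millions of transactions) favors transaction
# optimized -- the same flip the text describes.
print(cheapest_tier(1024, 10_000))      # cool
print(cheapest_tier(1024, 50_000_000))  # transaction optimized
```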
Your workload and activity level will determine the most cost efficient tier for your standard file share. In practice, the best way to pick the most cost efficient tier involves looking at the actual resource consumption of the share (data stored, write transactions, etc.). ### Choosing a tier
-Regardless of how you migrate existing data into Azure Files, we recommend initially creating the file share in transaction optimized tier due to the large number of transactions incurred during migration. After your migration is complete and you've operated for a few days/weeks with regular usage, you can plug in your transaction counts into the [pricing calculator](https://azure.microsoft.com/pricing/calculator/) to figure out which tier is best suited for your workload.
+Regardless of how you migrate existing data into Azure Files, we recommend initially creating the file share in the transaction optimized tier due to the large number of transactions incurred during migration. After your migration is complete and you've operated for a few days/weeks with regular usage, you can plug your transaction counts into the [pricing calculator](https://azure.microsoft.com/pricing/calculator/) to figure out which tier is best suited for your workload.
Because standard file shares only show transaction information at the storage account level, using the storage metrics to estimate which tier is cheaper at the file share level is an imperfect science. If possible, we recommend deploying only one file share in each storage account to ensure full visibility into billing.
There are five basic transaction categories: write, list, read, other, and delete.
| Delete transactions | <ul><li>`DeleteShare`</li></ul> | <ul><li>`ClearRange`</li><li>`DeleteDirectory`</li><li>`DeleteFile`</li></ul> | > [!Note]
-> NFS 4.1 is only available for premium file shares, which use the provisioned billing model, transactions do not affect billing for premium file shares.
+> NFS 4.1 is only available for premium file shares, which use the provisioned billing model. Transactions do not affect billing for premium file shares.
## Provisioned/quota, logical size, and physical size Azure Files tracks three distinct quantities with respect to share capacity: -- **Provisioned size or quota**: With both premium and standard file shares, you specify the maximum size that the file share is allowed to grow to. In premium file shares, this value is called the provisioned size of the file share and whatever you amount you provision is what you pay for, regardless of how much you actually use. In standard file shares, this value is called quota and does not directly affect your bill. Provisioned size is a required field for premium file shares, while standard file shares will default if not directly specified to the maximum value supported by the storage account, either 5 TiB or 100 TiB, depending on the storage account type and settings.
+- **Provisioned size or quota**: With both premium and standard file shares, you specify the maximum size that the file share is allowed to grow to. In premium file shares, this value is called the provisioned size, and whatever amount you provision is what you pay for, regardless of how much you actually use. In standard file shares, this value is called quota and does not directly affect your bill. Provisioned size is a required field for premium file shares, while standard file shares will default, if not directly specified, to the maximum value supported by the storage account, either 5 TiB or 100 TiB, depending on the storage account type and settings.
-- **Logical size**: The logical size of a file share or of a particular file relates to how big the file is without considering how the file is actually stored, where additional optimizations may be applied. One way to think about this is that the logical size of the file is how many KiB/MiB/GiB will be transferred over the wire if you copy it to a different location. In both premium and standard file shares, the total logical size of the file share is what is used for enforcement against provisioned size/quota. In standard file shares, the logical size is the quantity used for the data at-rest usage billing. Logical size is referred to as "size" in the Windows properties dialog for a file/folder and as "content length" by Azure Files metrics.
+- **Logical size**: The logical size of a file share or file relates to how big it is without considering how it is actually stored, where additional optimizations may be applied. One way to think about this is that the logical size of the file is how many KiB/MiB/GiB will be transferred over the wire if you copy it to a different location. In both premium and standard file shares, the total logical size of the file share is what is used for enforcement against provisioned size/quota. In standard file shares, the logical size is the quantity used for the data at-rest usage billing. Logical size is referred to as "size" in the Windows properties dialog for a file/folder and as "content length" by Azure Files metrics.
- **Physical size**: The physical size of the file relates to the size of the file as encoded on disk. This may align with the file's logical size, or it may be smaller, depending on how the file has been written to by the operating system. A common reason for the logical size and physical size to be different is through the use of [sparse files](/windows/win32/fileio/sparse-files). The physical size of the files in the share is used for snapshot billing, although allocated ranges are shared between snapshots if they are unchanged (differential storage). To learn more about how snapshots are billed in Azure Files, see [Snapshots](#snapshots).
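The distinction between logical and physical size can be seen locally with a sparse file. This is a hedged sketch using POSIX file APIs, not Azure code; `st_blocks` reporting is filesystem-dependent, so the physical figure varies by platform:

```python
# Sketch: a sparse file whose logical size (what quota enforcement
# and at-rest billing use on standard shares) far exceeds its
# physical (allocated) size, which is what snapshots are billed on.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "sparse.bin")
with open(path, "wb") as f:
    f.seek(100 * 1024 * 1024 - 1)  # seek 100 MiB in without writing
    f.write(b"\0")                 # write a single byte at the end

logical = os.path.getsize(path)                          # reported size
physical = getattr(os.stat(path), "st_blocks", 0) * 512  # allocated bytes (POSIX)
print(logical)   # 104857600 (100 MiB)
print(physical)  # typically far less on a sparse-capable filesystem
```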
Like on-premises storage solutions which offer first- and third-party features/p
Costs are generally broken down into three buckets: -- **Licensing costs for the value-added service.** These may come in the form of a fixed cost per customer, end-user (sometimes referred to as a "head cost"), Azure file share or storage account, or in units of storage utilization, such as a fixed cost for every 500 GiB chunk of data in the file share.
+- **Licensing costs for the value-added service.** These may come in the form of a fixed cost per customer, end user (sometimes referred to as a "head cost"), Azure file share or storage account, or in units of storage utilization, such as a fixed cost for every 500 GiB chunk of data in the file share.
-- **Transaction costs for the value-added service.** Some value-added services have their own concept of transactions distinct from what Azure Files views as a transaction. These transactions will show up on your bill under the value-added service's charges; however, relate directly to how you use the value-added service with your file share.
+- **Transaction costs for the value-added service.** Some value-added services have their own concept of transactions distinct from what Azure Files views as a transaction. These transactions will show up on your bill under the value-added service's charges; however, they relate directly to how you use the value-added service with your file share.
-- **Azure Files costs for using a value-added service.** Azure Files does not directly charge customers costs for adding value-added services, but as part of adding value to the Azure file share, the value-added service might increase the costs that you see on your Azure file share. This is really easy to see with standard file shares, because standard file shares have a pay-as-you-go model with transaction charges. If the value-added service does transactions against the file share on your behalf, they will show up in your Azure Files transaction bill even though you didn't directly do those transactions yourself. This applies to premium file shares as well, although it may be less noticeable. Additional transactions against premium file shares from value-added services count against your provisioned IOPS numbers, meaning that value-added services may require provisioning additional storage to have enough IOPS or throughput available for your workload.
+- **Azure Files costs for using a value-added service.** Azure Files does not directly charge customers costs for adding value-added services, but as part of adding value to the Azure file share, the value-added service might increase the costs that you see on your Azure file share. This is easy to see with standard file shares, because standard file shares have a pay-as-you-go model with transaction charges. If the value-added service does transactions against the file share on your behalf, they will show up in your Azure Files transaction bill even though you didn't directly do those transactions yourself. This applies to premium file shares as well, although it may be less noticeable. Additional transactions against premium file shares from value-added services count against your provisioned IOPS numbers, meaning that value-added services may require provisioning additional storage to have enough IOPS or throughput available for your workload.
When computing the total cost of ownership for your file share, you should consider the costs of Azure Files and of all value-added services that you would like to use with Azure Files.
If you are migrating to Azure File Sync from StorSimple, see [Comparing the cost
### Azure Backup Azure Backup provides a serverless backup solution for Azure Files that seamlessly integrates with your file shares, as well as other value-added services such as Azure File Sync. Azure Backup for Azure Files is a snapshot-based backup solution, meaning that Azure Backup provides a scheduling mechanism for automatically taking snapshots on an administrator-defined schedule and a user-friendly interface for restoring deleted files/folders or the entire share to a particular point in time. To learn more about Azure Backup for Azure Files, see [About Azure file share backup](../../backup/azure-file-share-backup-overview.md?toc=/azure/storage/files/toc.json).
-When considering the costs of using Azure Backup to backup your Azure file shares, you should consider the following:
+When considering the costs of using Azure Backup to back up your Azure file shares, you should consider the following:
- **Protected instance licensing cost for Azure file share data.** Azure Backup charges a protected instance licensing cost per storage account containing backed up Azure file shares. A protected instance is defined as 250 GiB of Azure file share storage. Storage accounts containing less than 250 GiB of Azure file share storage are subject to a fractional protected instance cost. See [Azure Backup pricing](https://azure.microsoft.com/pricing/details/backup/) for more information (note that you must select *Azure Files* from the list of services Azure Backup can protect).
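A minimal sketch of the protected instance math described above, assuming a hypothetical per-instance price and simple proportional fractional billing (the actual Azure Backup rounding rules may differ; check the pricing page):

```python
# Hedged sketch of Azure Backup protected instance billing for
# Azure Files: one protected instance per 250 GiB of backed-up
# file share data in a storage account, with a fractional charge
# below 250 GiB. The price and the proportional rounding are
# assumptions for illustration only.

PROTECTED_INSTANCE_GIB = 250
PRICE_PER_INSTANCE = 5.00  # hypothetical $/month

def protected_instance_cost(backed_up_gib: float) -> float:
    instances = backed_up_gib / PROTECTED_INSTANCE_GIB
    return round(instances * PRICE_PER_INSTANCE, 2)

print(protected_instance_cost(125))   # 2.5  (half an instance)
print(protected_instance_cost(1000))  # 20.0 (four instances)
```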
synapse-analytics How To Access Secured Purview Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/catalog-and-governance/how-to-access-secured-purview-account.md
Title: Access a secured Azure Purview account
-description: Learn about how to access a a firewall protected Azure Purview account through private endpoints from Synapse
+ Title: Access a secured Microsoft Purview account
+description: Learn how to access a firewall-protected Microsoft Purview account through private endpoints from Synapse
-# Access a secured Azure Purview account from Azure Synapse Analytics
+# Access a secured Microsoft Purview account from Azure Synapse Analytics
-This article describes how to access a secured Azure Purview account from Azure Synapse Analytics for different integration scenarios.
+This article describes how to access a secured Microsoft Purview account from Azure Synapse Analytics for different integration scenarios.
-## Azure Purview private endpoint deployment scenarios
+## Microsoft Purview private endpoint deployment scenarios
-You can use [Azure private endpoints](../../private-link/private-endpoint-overview.md) for your Azure Purview accounts to allow secure access from a virtual network (VNet) to the catalog over a Private Link. Azure Purview provides different types of private points for various access need: *account* private endpoint, *portal* private endpoint, and *ingestion* private endpoints. Learn more from [Azure Purview private endpoints conceptual overview](../../purview/catalog-private-link.md#conceptual-overview).
+You can use [Azure private endpoints](../../private-link/private-endpoint-overview.md) for your Microsoft Purview accounts to allow secure access from a virtual network (VNet) to the catalog over a Private Link. Microsoft Purview provides different types of private endpoints for various access needs: *account* private endpoint, *portal* private endpoint, and *ingestion* private endpoints. Learn more in the [Microsoft Purview private endpoints conceptual overview](../../purview/catalog-private-link.md#conceptual-overview).
-If your Azure Purview account is protected by firewall and denies public access, make sure you follow below checklist to set up the private endpoints so Synapse can successfully connect to Azure Purview.
+If your Microsoft Purview account is protected by a firewall and denies public access, make sure you follow the checklist below to set up the private endpoints so that Synapse can successfully connect to Microsoft Purview.
-| Scenario | Required Azure Purview private endpoints |
+| Scenario | Required Microsoft Purview private endpoints |
| | |
-| [Run pipeline and report lineage to Azure Purview](../../purview/how-to-lineage-azure-synapse-analytics.md) | For Synapse pipeline to push lineage to Azure Purview, Azure Purview ***account*** and ***ingestion*** private endpoints are required. <br>- When using **Azure Integration Runtime**, follow the steps in [Managed private endpoints for Azure Purview](#managed-private-endpoints-for-azure-purview) section to create managed private endpoints in the Synapse managed virtual network.<br>- When using **Self-hosted Integration Runtime**, follow the steps in [this section](../../purview/catalog-private-link-end-to-end.md#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-azure-purview-accounts) to create the *account* and *ingestion* private endpoints in your integration runtime's virtual network. |
-| [Discover and explore data using Azure Purview on Synapse Studio](how-to-discover-connect-analyze-azure-purview.md) | To use the search bar at the top center of Synapse Studio to search for Azure Purview data and perform actions, you need to create Azure Purview ***account*** and ***portal*** private endpoints in the virtual network that you launch the Synapse Studio. Follow the steps in [Enable *account* and *portal* private endpoint](../../purview/catalog-private-link-account-portal.md#option-2enable-account-and-portal-private-endpoint-on-existing-azure-purview-accounts). |
+| [Run pipeline and report lineage to Microsoft Purview](../../purview/how-to-lineage-azure-synapse-analytics.md) | For Synapse pipeline to push lineage to Microsoft Purview, Microsoft Purview ***account*** and ***ingestion*** private endpoints are required. <br>- When using **Azure Integration Runtime**, follow the steps in [Managed private endpoints for Microsoft Purview](#managed-private-endpoints-for-microsoft-purview) section to create managed private endpoints in the Synapse managed virtual network.<br>- When using **Self-hosted Integration Runtime**, follow the steps in [this section](../../purview/catalog-private-link-end-to-end.md#option-2enable-account-portal-and-ingestion-private-endpoint-on-existing-microsoft-purview-accounts) to create the *account* and *ingestion* private endpoints in your integration runtime's virtual network. |
+| [Discover and explore data using Microsoft Purview on Synapse Studio](how-to-discover-connect-analyze-azure-purview.md) | To use the search bar at the top center of Synapse Studio to search for Microsoft Purview data and perform actions, you need to create Microsoft Purview ***account*** and ***portal*** private endpoints in the virtual network that you launch the Synapse Studio. Follow the steps in [Enable *account* and *portal* private endpoint](../../purview/catalog-private-link-account-portal.md#option-2enable-account-and-portal-private-endpoint-on-existing-microsoft-purview-accounts). |
-## Managed private endpoints for Azure Purview
+## Managed private endpoints for Microsoft Purview
-[Managed private endpoints](../security/synapse-workspace-managed-private-endpoints.md) are private endpoints created a Managed Virtual Network associated with your Azure Synapse workspace. When you run pipeline and report lineage to a firewall protected Azure Purview account, make sure your Synapse workspace is created with "Managed virtual network" option enabled, then create the Azure Purview ***account*** and ***ingestion*** managed private endpoints as follows.
+[Managed private endpoints](../security/synapse-workspace-managed-private-endpoints.md) are private endpoints created in a Managed Virtual Network associated with your Azure Synapse workspace. When you run pipelines and report lineage to a firewall-protected Microsoft Purview account, make sure your Synapse workspace is created with the "Managed virtual network" option enabled, then create the Microsoft Purview ***account*** and ***ingestion*** managed private endpoints as follows.
### Create managed private endpoints
-To create managed private endpoints for Azure Purview on Synapse Studio:
+To create managed private endpoints for Microsoft Purview on Synapse Studio:
-1. Go to **Manage** -> **Azure Purview**, and click **Edit** to edit your existing connected Azure Purview account or click **Connect to an Azure Purview account** to connect to a new Azure Purview account.
+1. Go to **Manage** -> **Microsoft Purview**, and click **Edit** to edit your existing connected Microsoft Purview account or click **Connect to a Microsoft Purview account** to connect to a new Microsoft Purview account.
2. Select **Yes** for **Create managed private endpoints**. You need to have "**workspaces/managedPrivateEndpoint/write**" permission, e.g. Synapse Administrator or Synapse Linked Data Manager role.
-3. Click **+ Create all** button to batch create the needed Azure Purview private endpoints, including the ***account*** private endpoint and the ***ingestion*** private endpoints for the Azure Purview managed resources - Blob storage, Queue storage, and Event Hubs namespace. You need to have at least **Reader** role on your Azure Purview account for Synapse to retrieve the Azure Purview managed resources' information.
+3. Click the **+ Create all** button to batch create the needed Microsoft Purview private endpoints, including the ***account*** private endpoint and the ***ingestion*** private endpoints for the Microsoft Purview managed resources - Blob storage, Queue storage, and Event Hubs namespace. You need to have at least the **Reader** role on your Microsoft Purview account for Synapse to retrieve the Microsoft Purview managed resources' information.
- :::image type="content" source="./media/purview-create-all-managed-private-endpoints.png" alt-text="Create managed private endpoint for your connected Azure Purview account.":::
+ :::image type="content" source="./media/purview-create-all-managed-private-endpoints.png" alt-text="Create managed private endpoint for your connected Microsoft Purview account.":::
4. On the next page, specify a name for the private endpoint. It will also be used, with a suffix, to generate the names for the ingestion private endpoints.
- :::image type="content" source="./media/name-purview-private-endpoints.png" alt-text="Name the managed private endpoints for your connected Azure Purview account.":::
+ :::image type="content" source="./media/name-purview-private-endpoints.png" alt-text="Name the managed private endpoints for your connected Microsoft Purview account.":::
-5. Click **Create** to create the private endpoints. After creation, 4 private endpoint requests will be generated that must [get approved by an owner of Azure Purview](#approve-private-endpoint-connections).
+5. Click **Create** to create the private endpoints. After creation, 4 private endpoint requests will be generated that must [get approved by an owner of Microsoft Purview](#approve-private-endpoint-connections).
-Such batch managed private endpoint creation is provided on the Synapse Studio only. If you want to create the managed private endpoints programmatically, you need to create those PEs individually. You can find Azure Purview managed resources' information from Azure portal -> your Azure Purview account -> Managed resources.
+This batch managed private endpoint creation is available in Synapse Studio only. If you want to create the managed private endpoints programmatically, you need to create those private endpoints individually. You can find the Microsoft Purview managed resources' information in the Azure portal -> your Microsoft Purview account -> Managed resources.
### Approve private endpoint connections
-After you create the managed private endpoints for Azure Purview, you see "Pending" state first. The Azure Purview owner need to approve the private endpoint connections for each resource.
+After you create the managed private endpoints for Microsoft Purview, they are initially in a "Pending" state. The Microsoft Purview owner needs to approve the private endpoint connections for each resource.
-If you have permission to approve the Azure Purview private endpoint connection, from Synapse Studio:
+If you have permission to approve the Microsoft Purview private endpoint connection, from Synapse Studio:
-1. Go to **Manage** -> **Azure Purview** -> **Edit**
+1. Go to **Manage** -> **Microsoft Purview** -> **Edit**
2. In the private endpoint list, click the **Edit** (pencil) button next to each private endpoint name 3. Click **Manage approvals in Azure portal** which will bring you to the resource. 4. On the given resource, go to **Networking** -> **Private endpoint connection** to approve it. The private endpoint is named as `data_factory_name.your_defined_private_endpoint_name` with description as "Requested by data_factory_name". 5. Repeat this operation for all private endpoints.
-If you don't have permission to approve the Azure Purview private endpoint connection, ask the Azure Purview account owner to do as follows.
+If you don't have permission to approve the Microsoft Purview private endpoint connection, ask the Microsoft Purview account owner to do the following.
-- For *account* private endpoint, go to Azure portal -> your Azure Purview account -> Networking -> Private endpoint connection to approve.-- For *ingestion* private endpoints, go to Azure portal -> your Azure Purview account -> Managed resources, click into the Storage account and Event Hubs namespace respectively, and approve the private endpoint connection in Networking -> Private endpoint connection page.
+- For *account* private endpoint, go to Azure portal -> your Microsoft Purview account -> Networking -> Private endpoint connection to approve.
+- For *ingestion* private endpoints, go to Azure portal -> your Microsoft Purview account -> Managed resources, click into the Storage account and Event Hubs namespace respectively, and approve the private endpoint connection in Networking -> Private endpoint connection page.
### Monitor managed private endpoints
-You can monitor the created managed private endpoints for Azure Purview at two places:
+You can monitor the created managed private endpoints for Microsoft Purview at two places:
-- Go to **Manage** -> **Azure Purview** -> **Edit** to open your existing connected Azure Purview account. To see all the relevant private endpoints, you need to have at least **Reader** role on your Azure Purview account for Synapse to retrieve the Azure Purview managed resources' information. Otherwise, you only see *account* private endpoint with warning.-- Go to **Manage** -> **Managed private endpoints** where you see all the managed private endpoints created under the Synapse workspace. If you have at least **Reader** role on your Azure Purview account, you see Azure Purview relevant private endpoints being grouped together. Otherwise, they show up separately in the list.
+- Go to **Manage** -> **Microsoft Purview** -> **Edit** to open your existing connected Microsoft Purview account. To see all the relevant private endpoints, you need to have at least **Reader** role on your Microsoft Purview account for Synapse to retrieve the Microsoft Purview managed resources' information. Otherwise, you only see *account* private endpoint with warning.
+- Go to **Manage** -> **Managed private endpoints** where you see all the managed private endpoints created under the Synapse workspace. If you have at least **Reader** role on your Microsoft Purview account, you see Microsoft Purview relevant private endpoints being grouped together. Otherwise, they show up separately in the list.
## Next steps -- [Connect Synapse workspace to Azure Purview](quickstart-connect-azure-purview.md)
+- [Connect Synapse workspace to Microsoft Purview](quickstart-connect-azure-purview.md)
- [Metadata and lineage from Azure Synapse Analytics](../../purview/how-to-lineage-azure-synapse-analytics.md)-- [Discover, connect and explore data in Synapse using Azure Purview](how-to-discover-connect-analyze-azure-purview.md)
+- [Discover, connect and explore data in Synapse using Microsoft Purview](how-to-discover-connect-analyze-azure-purview.md)
synapse-analytics How To Discover Connect Analyze Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/catalog-and-governance/how-to-discover-connect-analyze-azure-purview.md
Title: Discover, connect, and explore data in Synapse using Azure Purview
+ Title: Discover, connect, and explore data in Synapse using Microsoft Purview
description: Guide on how to discover, connect to, and explore data in Synapse
-# Discover, connect, and explore data in Synapse using Azure Purview
+# Discover, connect, and explore data in Synapse using Microsoft Purview
-In this document, you will learn the type of interactions that you can perform when registering an Azure Purview Account into Synapse.
+In this document, you will learn the types of interactions that you can perform when registering a Microsoft Purview account into Synapse.
## Prerequisites -- [Azure Azure Purview account](../../purview/create-catalog-portal.md)
+- [Microsoft Purview account](../../purview/create-catalog-portal.md)
- [Synapse workspace](../quickstart-create-workspace.md) -- [Connect an Azure Purview Account into Synapse](quickstart-connect-azure-purview.md)
+- [Connect a Microsoft Purview Account into Synapse](quickstart-connect-azure-purview.md)
-## Using Azure Purview in Synapse
+## Using Microsoft Purview in Synapse
-The use Azure Purview in Synapse requires you to have access to that Azure Purview account. Synapse passes-through your Azure Purview permission. As an example, if you have a curator permission role, you will be able to edit metadata scanned by Azure Purview.
+Using Microsoft Purview in Synapse requires you to have access to that Microsoft Purview account. Synapse passes through your Microsoft Purview permissions. For example, if you have a curator role, you will be able to edit metadata scanned by Microsoft Purview.
### Data discovery: search datasets
-To discover data registered and scanned by Azure Purview, you can use the Search bar at the top center of Synapse workspace. Make sure that you select Azure Purview to search for all of your organization data.
+To discover data registered and scanned by Microsoft Purview, you can use the search bar at the top center of the Synapse workspace. Make sure that you select Microsoft Purview to search for all of your organization's data.
-[![Search for Azure Purview assets](./media/purview-access.png)](./media/purview-access.png#lightbox)
+[![Search for Microsoft Purview assets](./media/purview-access.png)](./media/purview-access.png#lightbox)
-## Azure Purview actions
+## Microsoft Purview actions
-Here is a list of the Azure Purview features that are available in Synapse:
+Here is a list of the Microsoft Purview features that are available in Synapse:
- **Overview** of the metadata - View and edit **schema** of the metadata with classifications, glossary terms, data types, and descriptions - View **lineage** to understand dependencies and do impact analysis. For more information, see [lineage](../../purview/catalog-lineage-user-guide.md)
With **New data flow**, you can create an integration dataset that can be used a
##  Next steps -- [Register and scan Azure Synapse assets in Azure Purview](../../purview/register-scan-azure-synapse-analytics.md)-- [How to Search Data in Azure Purview Data Catalog](../../purview/how-to-search-catalog.md)
+- [Register and scan Azure Synapse assets in Microsoft Purview](../../purview/register-scan-azure-synapse-analytics.md)
+- [How to Search Data in Microsoft Purview Data Catalog](../../purview/how-to-search-catalog.md)
synapse-analytics Quickstart Connect Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/catalog-and-governance/quickstart-connect-azure-purview.md
Title: Connect Synapse workspace to Azure Purview
-description: Connect a Synapse workspace to an Azure Purview account.
+ Title: Connect Synapse workspace to Microsoft Purview
+description: Connect a Synapse workspace to a Microsoft Purview account.
-# QuickStart: Connect a Synapse workspace to an Azure Purview account
+# QuickStart: Connect a Synapse workspace to a Microsoft Purview account
-In this quickstart, you will register an Azure Purview Account to a Synapse workspace. That connection allows you to discover Azure Purview assets, interact with them through Synapse capabilities, and push lineage information to Azure Purview.
+In this quickstart, you will connect a Microsoft Purview account to a Synapse workspace. That connection allows you to discover Microsoft Purview assets, interact with them through Synapse capabilities, and push lineage information to Microsoft Purview.
You can perform the following tasks in Synapse:-- Use the search box at the top to find Azure Purview assets based on keywords
+- Use the search box at the top to find Microsoft Purview assets based on keywords
- Understand the data based on metadata, [lineage](../../purview/catalog-lineage-user-guide.md), annotations - Connect those data to your workspace with linked services or integration datasets - Analyze those datasets with Synapse Apache Spark, Synapse SQL, and Data Flow -- Execute pipelines and [push lineage information to Azure Purview](../../purview/how-to-lineage-azure-synapse-analytics.md)
+- Execute pipelines and [push lineage information to Microsoft Purview](../../purview/how-to-lineage-azure-synapse-analytics.md)
## Prerequisites -- [Azure Azure Purview account](../../purview/create-catalog-portal.md)
+- [Microsoft Purview account](../../purview/create-catalog-portal.md)
- [Synapse workspace](../quickstart-create-workspace.md)
-## Permissions for connecting an Azure Purview account
+## Permissions for connecting a Microsoft Purview account
-To connect an Azure Purview Account to a Synapse workspace, you need a **Contributor** role in Synapse workspace from Azure portal IAM and you need access to that Azure Purview Account. For more information, see [Azure Purview permissions](../../purview/catalog-permissions.md).
+To connect a Microsoft Purview account to a Synapse workspace, you need the **Contributor** role on the Synapse workspace, assigned from Azure portal IAM, and you need access to that Microsoft Purview account. For more information, see [Microsoft Purview permissions](../../purview/catalog-permissions.md).
-## Connect an Azure Purview account
+## Connect a Microsoft Purview account
-Follow the steps to connect an Azure Purview account:
+Follow the steps to connect a Microsoft Purview account:
1. Go to [https://web.azuresynapse.net](https://web.azuresynapse.net) and sign in to your Synapse workspace.
-2. Go to **Manage** -> **Azure Purview**, select **Connect to an Azure Purview account**.
+2. Go to **Manage** -> **Microsoft Purview**, select **Connect to a Microsoft Purview account**.
3. You can choose **From Azure subscription** or **Enter manually**. With **From Azure subscription**, you can select the account that you have access to.
-4. Once connected, you can see the name of the Azure Purview account in the tab **Azure Purview account**.
+4. Once connected, you can see the name of the Microsoft Purview account in the tab **Microsoft Purview account**.
-If your Azure Purview account is protected by firewall, create the managed private endpoints for Azure Purview. Learn more about how to let Azure Synapse [access a secured Azure Purview account](how-to-access-secured-purview-account.md). You can either do it during the initial connection or edit an existing connection later.
+If your Microsoft Purview account is protected by a firewall, create the managed private endpoints for Microsoft Purview. Learn more about how to let Azure Synapse [access a secured Microsoft Purview account](how-to-access-secured-purview-account.md). You can either do this during the initial connection or edit an existing connection later.
-The Azure Purview connection information is stored in the Synapse workspace resource like the following. To establish the connection programmatically, you can update the Synapse workspace and add the `purviewConfiguration` settings.
+The Microsoft Purview connection information is stored in the Synapse workspace resource like the following. To establish the connection programmatically, you can update the Synapse workspace and add the `purviewConfiguration` settings.
```json
{
  "name": "<workspace-name>",
  "type": "Microsoft.Synapse/workspaces",
  "properties": {
    "purviewConfiguration": {
      "purviewResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Purview/accounts/<purview-account-name>"
    }
  }
}
```
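As a rough illustration of making this update programmatically, the sketch below builds the ARM request URL and the `purviewConfiguration` body. The api-version value and placeholder resource names are assumptions for demonstration; the HTTP call itself (which needs an Azure AD bearer token) is deliberately omitted.

```python
# Hypothetical sketch: constructing the ARM PATCH request that adds the
# purviewConfiguration block to a Synapse workspace. Resource names and
# api-version are illustrative assumptions, not verified values.
import json

subscription = "<subscription-id>"
resource_group = "<resource-group>"
workspace = "<synapse-workspace-name>"

purview_account_id = (
    f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
    "/providers/Microsoft.Purview/accounts/<purview-account-name>"
)

url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
    f"/providers/Microsoft.Synapse/workspaces/{workspace}"
    "?api-version=2021-06-01"
)

body = {
    "properties": {
        "purviewConfiguration": {
            "purviewResourceId": purview_account_id
        }
    }
}

# An HTTP client with an Azure AD bearer token would PATCH `url` with `body`;
# the request itself is omitted here.
print(json.dumps(body, indent=2))
```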
## Set up authentication
-Synapse workspace's managed identity is used to authenticate lineage push operations from Synapse workspace to Azure Purview.
+Synapse workspace's managed identity is used to authenticate lineage push operations from Synapse workspace to Microsoft Purview.
-Grant the Synapse workspace's managed identity **Data Curator** role on your Azure Purview **root collection**. Learn more about [Access control in Azure Purview](../../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+Grant the Synapse workspace's managed identity **Data Curator** role on your Microsoft Purview **root collection**. Learn more about [Access control in Microsoft Purview](../../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
-When connecting Synapse workspace to Azure Purview in Synapse Studio, Synapse tries to add such role assignment automatically. If you have **Collection admins** role on the Azure Purview root collection and have access to Azure Purview account from your network, this operation is done successfully.
+When connecting a Synapse workspace to Microsoft Purview in Synapse Studio, Synapse tries to add this role assignment automatically. If you have the **Collection admins** role on the Microsoft Purview root collection and have access to the Microsoft Purview account from your network, this operation succeeds.
-## Monitor Azure Purview connection
+## Monitor Microsoft Purview connection
-Once you connect the Synapse workspace to an Azure Purview account, you see the following page with details on the enabled integration capabilities.
+Once you connect the Synapse workspace to a Microsoft Purview account, you see the following page with details on the enabled integration capabilities.
For **Data Lineage - Synapse Pipeline**, you may see one of the below statuses:
-- **Connected**: The Synapse workspace is successfully connected to the Azure Purview account. Note this indicates Synapse workspace is associated with an Azure Purview account and has permission to push lineage to it. If your Azure Purview account is protected by firewall, you also need to make sure the integration runtime used to execute the activities and conduct lineage push can reach the Azure Purview account. Learn more from [Access a secured Azure Purview account](how-to-access-secured-purview-account.md).
-- **Disconnected**: The Synapse workspace cannot push lineage to Azure Purview because Azure Purview Data Curator role is not granted to Synapse workspace's managed identity. To fix this issue, go to your Azure Purview account to check the role assignments, and manually grant the role as needed. Learn more from [Set up authentication](#set-up-authentication) section.
+- **Connected**: The Synapse workspace is successfully connected to the Microsoft Purview account. This indicates that the Synapse workspace is associated with a Microsoft Purview account and has permission to push lineage to it. If your Microsoft Purview account is protected by a firewall, you also need to make sure that the integration runtime used to execute the activities and push lineage can reach the Microsoft Purview account. Learn more from [Access a secured Microsoft Purview account](how-to-access-secured-purview-account.md).
+- **Disconnected**: The Synapse workspace cannot push lineage to Microsoft Purview because the Microsoft Purview Data Curator role is not granted to the Synapse workspace's managed identity. To fix this issue, go to your Microsoft Purview account to check the role assignments, and manually grant the role as needed. Learn more from the [Set up authentication](#set-up-authentication) section.
- **Unknown**: Azure Synapse cannot check the status. Possible reasons are:
- - Cannot reach the Azure Purview account from your current network because the account is protected by firewall. You can launch the Synapse Studio from a private network that has connectivity to your Azure Purview account instead.
- - You don't have permission to check role assignments on the Azure Purview account. You can contact the Azure Purview account admin to check the role assignments for you. Learn about the needed Azure Purview role from [Set up authentication](#set-up-authentication) section.
+ - Cannot reach the Microsoft Purview account from your current network because the account is protected by a firewall. You can launch Synapse Studio from a private network that has connectivity to your Microsoft Purview account instead.
+ - You don't have permission to check role assignments on the Microsoft Purview account. You can contact the Microsoft Purview account admin to check the role assignments for you. Learn about the needed Microsoft Purview role from the [Set up authentication](#set-up-authentication) section.
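The remediation for each connection status described above can be summarized in a small sketch (this helper is purely illustrative and not part of any product API):

```python
# A small illustrative helper summarizing the remediation for each connection
# status reported under "Data Lineage - Synapse Pipeline".
def lineage_status_action(status: str) -> str:
    actions = {
        "Connected": "No action needed; lineage can be pushed to Microsoft Purview.",
        "Disconnected": "Grant the workspace managed identity the Data Curator "
                        "role on the Microsoft Purview root collection.",
        "Unknown": "Check network reachability to the Purview account, or ask "
                   "an admin to verify the role assignments.",
    }
    return actions.get(status, f"Unrecognized status: {status}")

print(lineage_status_action("Disconnected"))
```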
-## Report lineage to Azure Purview
+## Report lineage to Microsoft Purview
-Once you connect the Synapse workspace to an Azure Purview account, when you execute pipelines, Synapse reports lineage information to the Azure Purview account. For detailed supported capabilities and an end to end walkthrough, see [Metadata and lineage from Azure Synapse Analytics](../../purview/how-to-lineage-azure-synapse-analytics.md).
+Once you connect the Synapse workspace to a Microsoft Purview account, Synapse reports lineage information to the Microsoft Purview account when you execute pipelines. For detailed supported capabilities and an end-to-end walkthrough, see [Metadata and lineage from Azure Synapse Analytics](../../purview/how-to-lineage-azure-synapse-analytics.md).
-## Discover and explore data using Azure Purview
+## Discover and explore data using Microsoft Purview
-Once you connect the Synapse workspace to an Azure Purview account, you can use the search bar at the top center of Synapse workspace to search for data and perform actions. Learn more from [Discover, connect and explore data in Synapse using Azure Purview](how-to-discover-connect-analyze-azure-purview.md).
+Once you connect the Synapse workspace to a Microsoft Purview account, you can use the search bar at the top center of Synapse workspace to search for data and perform actions. Learn more from [Discover, connect and explore data in Synapse using Microsoft Purview](how-to-discover-connect-analyze-azure-purview.md).
## Next steps
-[Discover, connect and explore data in Synapse using Azure Purview](how-to-discover-connect-analyze-azure-purview.md)
+[Discover, connect and explore data in Synapse using Microsoft Purview](how-to-discover-connect-analyze-azure-purview.md)
[Metadata and lineage from Azure Synapse Analytics](../../purview/how-to-lineage-azure-synapse-analytics.md)
-[Access a secured Azure Purview account](how-to-access-secured-purview-account.md)
+[Access a secured Microsoft Purview account](how-to-access-secured-purview-account.md)
-[Register and scan Azure Synapse assets in Azure Purview](../../purview/register-scan-azure-synapse-analytics.md)
+[Register and scan Azure Synapse assets in Microsoft Purview](../../purview/register-scan-azure-synapse-analytics.md)
-[Get lineage from Power BI into Azure Purview](../../purview/how-to-lineage-powerbi.md)
+[Get lineage from Power BI into Microsoft Purview](../../purview/how-to-lineage-powerbi.md)
-[Connect Azure Data Share and Azure Purview](../../purview/how-to-link-azure-data-share.md)
+[Connect Azure Data Share and Microsoft Purview](../../purview/how-to-link-azure-data-share.md)
synapse-analytics Create Empty Lake Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/database-designer/create-empty-lake-database.md
In this article, you'll learn how to create an empty [lake database](./concepts-
- Your database will be validated for errors before it's published. Any errors found will be shown in the notifications tab with instructions on how to remedy the error. ![Screenshot of the validation pane showing validation errors in the database](./media/create-empty-lake-database/validation-error.png)
- - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Azure Purview.
+ - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Microsoft Purview.
11. You've now created an empty lake database in Azure Synapse, and added tables to it using the **Custom** and **From data lake** options.
synapse-analytics Create Lake Database From Lake Database Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/database-designer/create-lake-database-from-lake-database-templates.md
In this article, you'll learn how to use the Azure Synapse database templates to
- Your database will be validated for errors before it's published. Any errors found will be shown in the notifications tab with instructions on how to remedy the error. ![Screenshot of the validation pane showing validation errors in the database](./media/create-lake-database-from-lake-database-template/validation-error.png)
- - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Azure Purview.
+ - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Microsoft Purview.
12. You've now created a lake database using a lake database template in Azure Synapse.
synapse-analytics Modify Lake Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/database-designer/modify-lake-database.md
In this article, you'll learn how to modify an existing [lake database](./concep
- Your database will be validated for errors before it's published. Any errors found will be shown in the notifications tab with instructions on how to remedy the error. ![Screenshot of the validation pane showing validation errors in the database](./media/create-lake-database-from-lake-database-template/validation-error.png)
- - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Azure Purview.
+ - Publishing will create your database schema in the Azure Synapse Metastore. After publishing, the database and table objects will be visible to other Azure services and allow the metadata from your database to flow into apps like Power BI or Microsoft Purview.
## Customize tables within a database
synapse-analytics Security White Paper Data Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/guidance/security-white-paper-data-protection.md
Once the data discovery process is complete, it provides classification recommen
Azure Synapse provides two options for data discovery and classification:
- [Data Discovery & Classification](../../azure-sql/database/data-discovery-and-classification-overview.md), which is built into Azure Synapse and dedicated SQL pool (formerly SQL DW).
-- [Azure Purview](https://azure.microsoft.com/services/purview/), which is a unified data governance solution that helps manage and govern on-premises, multicloud, and software-as-a-service (SaaS) data. It can automate data discovery, lineage identification, and data classification. By producing a unified map of data assets and their relationships, it makes data easily discoverable.
+- [Microsoft Purview](https://azure.microsoft.com/services/purview/), which is a unified data governance solution that helps manage and govern on-premises, multicloud, and software-as-a-service (SaaS) data. It can automate data discovery, lineage identification, and data classification. By producing a unified map of data assets and their relationships, it makes data easily discoverable.
> [!NOTE]
-> Azure Purview data discovery and classification is in public preview for Azure Synapse, dedicated SQL pool (formerly SQL DW), and serverless SQL pool. However, data lineage is currently not supported for Azure Synapse, dedicated SQL pool (formerly SQL DW), and serverless SQL pool. Apache Spark pool only supports [lineage tracking](../../purview/how-to-lineage-spark-atlas-connector.md).
+> Microsoft Purview data discovery and classification is in public preview for Azure Synapse, dedicated SQL pool (formerly SQL DW), and serverless SQL pool. However, data lineage is currently not supported for Azure Synapse, dedicated SQL pool (formerly SQL DW), and serverless SQL pool. Apache Spark pool only supports [lineage tracking](../../purview/how-to-lineage-spark-atlas-connector.md).
## Data encryption
synapse-analytics Data Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/partner/data-management.md
This article highlights Microsoft partner companies with data management tools a
| - | -- | -- |
| ![Aginity](./media/data-management/aginity-logo.png) |**Aginity**<br>Aginity is an analytics development tool. It puts the full power of Microsoft's Synapse platform in the hands of analysts and engineers. The rich and intuitive SQL development environment allows team members to connect to over a dozen industry-leading analytics platforms. It allows users to ingest data in a variety of formats, and quickly build complex business calculations to serve the results into Business Intelligence and Machine Learning use cases. The entire application is built around a central catalog which makes collaboration across the analytics team a reality, and the sophisticated management capabilities and fine-grained security make governance a breeze. |[Product page](https://www.aginity.com/databases/microsoft/)<br> |
| ![Alation](./media/data-management/alation-logo.png) |**Alation**<br>Alation's data catalog dramatically improves the productivity, increases the accuracy, and drives confident data-driven decision making for analysts. Alation's data catalog empowers everyone in your organization to find, understand, and govern data. |[Product page](https://www.alation.com/product/data-catalog/)<br> |
-| ![BI Builders (Xpert BI)](./media/data-integration/bibuilders-logo.png) |**BI Builders (Xpert BI)**<br> Xpert BI provides an intuitive and searchable catalog for the line-of-business user to find, trust, and understand data and reports. The solution covers the whole data platform including Azure Synapse Analytics, ADLS Gen 2, Azure SQL Database, Analysis Services and Power BI, and also data flows and data movement end-to-end. Data stewards can update descriptions and tag data to follow regulatory requirements. Xpert BI can be integrated via APIs to other catalogs such as Azure Purview. It supplements traditional data catalogs with a business user perspective. |[Product page](https://www.bi-builders.com/adding-automation-and-governance-to-azure-analytics/)<br>[Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps/bi-builders-as.xpert-bi-vm)<br>|
+| ![BI Builders (Xpert BI)](./media/data-integration/bibuilders-logo.png) |**BI Builders (Xpert BI)**<br> Xpert BI provides an intuitive and searchable catalog for the line-of-business user to find, trust, and understand data and reports. The solution covers the whole data platform including Azure Synapse Analytics, ADLS Gen 2, Azure SQL Database, Analysis Services and Power BI, and also data flows and data movement end-to-end. Data stewards can update descriptions and tag data to follow regulatory requirements. Xpert BI can be integrated via APIs to other catalogs such as Microsoft Purview. It supplements traditional data catalogs with a business user perspective. |[Product page](https://www.bi-builders.com/adding-automation-and-governance-to-azure-analytics/)<br>[Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps/bi-builders-as.xpert-bi-vm)<br>|
| ![Coffing Data Warehousing](./media/data-management/coffing-data-warehousing-logo.png) |**Coffing Data Warehousing**<br>Coffing Data Warehousing provides Nexus Chameleon, a tool with 10 years of design dedicated to querying systems. Nexus is available as a query tool for dedicated SQL pool in Azure Synapse Analytics. Use Nexus to query in-house and cloud computers and join data across different platforms. Point-Click-Report! |[Product page](https://coffingdw.com/software/nexus/)<br> |
| ![Inbrein](./media/data-management/inbrein-logo.png) |**Inbrein MicroERD**<br>Inbrein MicroERD provides the tools that you need to create a precise data model, reduce data redundancy, improve productivity, and observe standards. By using its UI, which was developed based on extensive user experiences, a modeler can work on DB models easily and conveniently. You can continuously enjoy new and improved functions of MicroERD through prompt functional improvements and updates. |Product page<br> |
| ![Infolibrarian](./media/data-management/infolibrarian-logo.png) |**Infolibrarian (Metadata Management Server)**<br>InfoLibrarian catalogs, stores, and manages metadata to help you solve key pain points of data management. Infolibrarian provides metadata management, data governance, and asset management solutions for managing and publishing metadata from a diverse set of tools and technologies. |[Product page](http://www.infolibcorp.com/metadata-management/software-tools)<br> [Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps/infolibrarian.infolibrarian-metadata-management-server)<br> |
synapse-analytics Quickstart Transform Data Using Spark Job Definition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md
On this panel, you can reference to the Spark job definition to run.
* Expand the Spark job definition list, you can choose an existing Apache Spark job definition. You can also create a new Apache Spark job definition by selecting the **New** button to reference the Spark job definition to be run.
-* You can add command-line arguments by clicking the **New** button. It should be noted that adding command-line arguments will override the command-line arguments defined by the Spark job definition. <br> *Sample: `abfss://…/path/to/shakespeare.txt` `abfss://…/path/to/result`* <br>
-
+* (Optional) You can fill in information for the Apache Spark job definition. If the following settings are empty, the settings of the Spark job definition itself are used to run the job; if the following settings are not empty, they replace the settings of the Spark job definition itself.
+
+ | Property | Description |
+ | -- | -- |
+ |Main definition file| The main file used for the job. Select a PY/JAR/ZIP file from your storage. You can select **Upload file** to upload the file to a storage account. <br> Sample: `abfss://…/path/to/wordcount.jar`|
+ |Main class name| The fully qualified identifier or the main class that is in the main definition file. <br> Sample: `WordCount`|
+ |Command-line arguments| You can add command-line arguments by clicking the **New** button. Note that adding command-line arguments will override the command-line arguments defined by the Spark job definition. <br> *Sample: `abfss://…/path/to/shakespeare.txt` `abfss://…/path/to/result`* <br> |
+ |Apache Spark pool| You can select Apache Spark pool from the list.|
+ |Dynamically allocate executors| This setting maps to the dynamic allocation property in Spark configuration for Spark Application executors allocation.|
+ |Min executors| Min number of executors to be allocated in the specified Spark pool for the job.|
+ |Max executors| Max number of executors to be allocated in the specified Spark pool for the job.|
+ |Driver size| Number of cores and amount of memory to be used for the driver in the specified Apache Spark pool for the job.|
+
+
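The override behavior described in the table above can be sketched as a simple merge: non-empty activity settings replace the job definition's own. The merge helper below is an illustrative assumption about the precedence rule, not the service's actual implementation; the `spark.dynamicAllocation.*` keys are standard Spark configuration properties.

```python
# Illustrative sketch of how non-empty pipeline-activity settings replace the
# Spark job definition's own settings (the merge logic is an assumption made
# for clarity, not the service's actual code).

def effective_spark_conf(job_definition: dict, activity_overrides: dict) -> dict:
    """Non-empty activity settings replace the job definition's settings."""
    merged = dict(job_definition)
    merged.update({k: v for k, v in activity_overrides.items() if v not in (None, "", [])})
    return merged

job_definition = {
    "file": "abfss://container@account.dfs.core.windows.net/path/to/wordcount.jar",
    "className": "WordCount",
    "spark.dynamicAllocation.enabled": "true",
    "spark.dynamicAllocation.minExecutors": "2",
    "spark.dynamicAllocation.maxExecutors": "8",
}

activity_overrides = {
    "className": "",                               # empty -> keep the definition's value
    "spark.dynamicAllocation.maxExecutors": "16",  # non-empty -> replaces it
}

conf = effective_spark_conf(job_definition, activity_overrides)
print(conf["className"], conf["spark.dynamicAllocation.maxExecutors"])
```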
![Spark job definition pipeline settings](media/quickstart-transform-data-using-spark-job-definition/spark-job-definition-pipline-settings.png)

* You can add dynamic content by clicking the **Add Dynamic Content** button or by pressing the shortcut key <kbd>Alt</kbd>+<kbd>Shift</kbd>+<kbd>D</kbd>. In the **Add Dynamic Content** page, you can use any combination of expressions, functions, and system variables to add to dynamic content.
synapse-analytics Sql Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/sql/sql-authentication.md
Last updated 03/07/2022
+
# SQL Authentication in Azure Synapse Analytics

Azure Synapse Analytics has two SQL form-factors that enable you to control your resource consumption. This article explains how the two form-factors control the user authentication.
SQL authorization enables legacy applications to connect to Azure Synapse SQL in
## Administrative accounts
-There are two administrative accounts (**Server admin** and **Active Directory admin**) that act as administrators. To identify these administrator accounts for your SQL server, open the Azure portal, and navigate to the Properties tab of your Synapse SQL.
+There are two administrative accounts (**SQL admin username** and **SQL Active Directory admin**) that act as administrators. To identify these administrator accounts for your SQL pools, open the Azure portal and navigate to the Properties tab of your Synapse workspace.
![SQL Server Admins](./media/sql-authentication/sql-admins.png)

-- **Server admin**
+- **SQL admin username**
When you create an Azure Synapse Analytics workspace, you must name a **Server admin login**. SQL server creates that account as a login in the `master` database. This account connects using SQL Server authentication (user name and password). Only one of these accounts can exist.

-- **Azure Active Directory admin**
+- **SQL Active Directory admin**
One Azure Active Directory account, either an individual or security group account, can also be configured as an administrator. It's optional to configure an Azure AD administrator, but an Azure AD administrator **must** be configured if you want to use Azure AD accounts to connect to Synapse SQL.

- The Azure Active Directory admin account controls access to dedicated SQL pools, while Synapse RBAC roles are used to control access to serverless pools, for example, the **Synapse Administrator** role. Changing the Azure Active Directory administrator account will only affect the account's access to dedicated SQL pools.
-The **Server admin** and **Azure AD admin** accounts have the following characteristics:
+The **SQL admin username** and **SQL Active Directory admin** accounts have the following characteristics:
- Are the only accounts that can automatically connect to any SQL Database on the server. (To connect to a user database, other accounts must either be the owner of the database, or have a user account in the user database.)
- These accounts enter user databases as the `dbo` user and they have all the permissions in the user databases. (The owner of a user database also enters the database as the `dbo` user.)
CREATE LOGIN Mary WITH PASSWORD = '<strong_password>';
CREATE LOGIN [Mary@domainname.net] FROM EXTERNAL PROVIDER;
```
-Once the login exists, you can create users in the individual databases within the serverless SQL pool endpoint and grant required permissions to these users. To create a user, you can use the following syntax:
+When the login exists, you can create users in the individual databases within the serverless SQL pool endpoint and grant required permissions to these users. To create a user, you can use the following syntax:
```sql
CREATE USER Mary FROM LOGIN Mary;
CREATE USER [mike@contoso.com] FROM EXTERNAL PROVIDER;
Once login and user are created, you can use the regular SQL Server syntax to grant rights.
-## [dedicated SQL pool](#tab/provisioned)
+## [Dedicated SQL pool](#tab/provisioned)
### Administrator access path
-When the server-level firewall is properly configured, the **SQL server admin** and the **Azure Active Directory admin** can connect using client tools such as SQL Server Management Studio or SQL Server Data Tools. Only the latest tools provide all the features and capabilities.
+When the workspace-level firewall is properly configured, the **SQL admin username** and the **SQL Active Directory admin** can connect using client tools such as SQL Server Management Studio or SQL Server Data Tools. Only the latest tools provide all the features and capabilities.
The following diagram shows a typical configuration for the two administrator accounts:
EXEC sp_addrolemember 'db_owner', 'Mary';
``` > [!NOTE]
-> One common reason to create a database user based on a server login is for users that need access to multiple databases. Since contained database users are individual entities, each database maintains its own user and its own password. This can cause overhead as the user must then remember each password for each database, and it can become untenable when having to change multiple passwords for many databases. However, when using SQL Server Logins and high availability (active geo-replication and failover groups), the SQL Server logins must be set manually at each server. Otherwise, the database user will no longer be mapped to the server login after a failover occurs, and will not be able to access the database post failover.
-
-For more information on configuring logins for geo-replication, see [Configure and manage Azure SQL Database security for geo-restore or failover](../../azure-sql/database/active-geo-replication-security-configure.md).
-
-### Configuring the database-level firewall
-
-As a best practice, non-administrator users should only have access through the firewall to the databases that they use. Instead of authorizing their IP addresses through the server-level firewall and giving them access to all databases, use the [sp_set_database_firewall_rule](/sql/relational-databases/system-stored-procedures/sp-set-database-firewall-rule-azure-sql-database?view=azure-sqldw-latest&preserve-view=true) statement to configure the database-level firewall. The database-level firewall cannot be configured by using the portal.
-
-### Non-administrator access path
-
-When the database-level firewall is properly configured, the database users can connect using client tools such as SQL Server Management Studio or SQL Server Data Tools. Only the latest tools provide all the features and capabilities. The following diagram shows a typical non-administrator access path.
-
-![Non-administrator access path](./media/sql-authentication/2sql-db-nonadmin-access.png)
+> One common reason to create a database user based on a server login is for users that need access to multiple databases. Since contained database users are individual entities, each database maintains its own user and its own password. This can cause overhead as the user must then remember each password for each database, and it can become untenable when having to change multiple passwords for many databases.
## Groups and roles
When managing logins and users in SQL Database, consider the following points:
- To connect to a user database, you must provide the name of the database in the connection string.
- Only the server-level principal login and the members of the **loginmanager** database role in the `master` database have permission to execute the `CREATE LOGIN`, `ALTER LOGIN`, and `DROP LOGIN` statements.
- When executing the `CREATE/ALTER/DROP LOGIN` and `CREATE/ALTER/DROP DATABASE` statements in an ADO.NET application, using parameterized commands isn't allowed. For more information, see [Commands and Parameters](/dotnet/framework/data/adonet/commands-and-parameters).
-- When executing the `CREATE/ALTER/DROP DATABASE` and `CREATE/ALTER/DROP LOGIN` statements, each of these statements must be the only statement in a Transact-SQL batch. Otherwise, an error occurs. For example, the following Transact-SQL checks whether the database exists. If it exists, a `DROP DATABASE` statement is called to remove the database. Because the `DROP DATABASE` statement is not the only statement in the batch, executing the following Transact-SQL statement results in an error.
-
- ```sql
- IF EXISTS (SELECT [name]
- FROM [sys].[databases]
- WHERE [name] = N'database_name')
- DROP DATABASE [database_name];
- GO
- ```
-
- Instead, use the following Transact-SQL statement:
-
- ```sql
- DROP DATABASE IF EXISTS [database_name]
- ```
- When executing the `CREATE USER` statement with the `FOR/FROM LOGIN` option, it must be the only statement in a Transact-SQL batch.
- When executing the `ALTER USER` statement with the `WITH LOGIN` option, it must be the only statement in a Transact-SQL batch.
- `CREATE/ALTER/DROP LOGIN` and `CREATE/ALTER/DROP USER` statements are not supported when Azure AD-only authentication is enabled for the Azure Synapse workspace.
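One client-side way to satisfy the "only statement in a Transact-SQL batch" rules above is to split a script on `GO` separators before sending each batch. The sketch below shows that batching step only (the database calls themselves are omitted); `GO` handling here is a simplified assumption that ignores edge cases like `GO n` repeat counts.

```python
# Sketch: splitting a T-SQL script on GO separators so that statements such as
# CREATE USER ... FROM LOGIN run as the only statement in their batch.
# GO is a client-side batch separator, recognized when alone on a line.
import re

def split_batches(script: str) -> list:
    batches = re.split(r"(?im)^\s*GO\s*$", script)
    return [b.strip() for b in batches if b.strip()]

script = """
CREATE LOGIN Mary WITH PASSWORD = '<strong_password>';
GO
CREATE USER Mary FROM LOGIN Mary;
GO
"""

# Each batch would be sent to the server separately.
for batch in split_batches(script):
    print(batch)
```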
synapse-analytics Synapse Notebook Activity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/synapse-notebook-activity.md
You can create a Synapse notebook activity directly from the Synapse pipeline ca
Drag and drop **Synapse notebook** under **Activities** onto the Synapse pipeline canvas. Select the Synapse notebook activity box and configure the notebook content for the current activity in the **settings**. You can select an existing notebook from the current workspace or add a new one.
+You can also select an Apache Spark pool in the settings. Note that the Apache Spark pool set here replaces the Apache Spark pool used in the notebook itself. If no Apache Spark pool is selected in the activity settings, the Apache Spark pool selected in the notebook is used to run it.
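The precedence described above reduces to a one-line rule; the helper below is a purely illustrative sketch (its name and signature are made up for this example):

```python
# Minimal sketch of the precedence rule: the pool chosen on the activity wins;
# otherwise the notebook's own pool is used.
from typing import Optional

def resolve_spark_pool(activity_pool: Optional[str], notebook_pool: str) -> str:
    return activity_pool or notebook_pool

print(resolve_spark_pool(None, "notebookpool"))        # notebookpool
print(resolve_spark_pool("activitypool", "notebookpool"))  # activitypool
```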
+
![screenshot-showing-create-notebook-activity](./media/synapse-notebook-activity/create-synapse-notebook-activity.png)

> [!NOTE]
synapse-analytics Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/whats-new-archive.md
The following updates are new to Azure Synapse Analytics this month.
### Governance
-* Synapse workspaces can now automatically push lineage data to Azure Purview [blog](https://techcommunity.microsoft.com/t5/azure-synapse-analytics/azure-synapse-analytics-october-update/ba-p/2875372#synapse-purview-lineage) [article](../purview/how-to-lineage-azure-synapse-analytics.md)
+* Synapse workspaces can now automatically push lineage data to Microsoft Purview [blog](https://techcommunity.microsoft.com/t5/azure-synapse-analytics/azure-synapse-analytics-october-update/ba-p/2875372#synapse-purview-lineage) [article](../purview/how-to-lineage-azure-synapse-analytics.md)
### Integrate
traffic-manager Traffic Manager Traffic View Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/traffic-manager/traffic-manager-traffic-view-overview.md
Title: Traffic View in Azure Traffic Manager
description: In this introduction, learn how Traffic Manager Traffic View works.
documentationcenter: traffic-manager
-Previously updated : 01/22/2021
+Last updated : 04/18/2022
Traffic View works by looking at the incoming queries received over the last seven
In the next step, Traffic Manager correlates the user base region to Azure region mapping with the network intelligence latency tables. This table is maintained for different end-user networks to understand the average latency experienced by users from those regions when connecting to Azure regions. All these calculations are then combined at a per local DNS resolver IP level before it's presented to you. You can consume the information in various ways.
-The frequency of Traffic view data update depends on multiple internal service variables. However, the data is updated once every 24 hours.
+The frequency of Traffic view data update depends on multiple internal service variables. However, the data is updated once every 48 hours.
>[!NOTE] >The latency described in Traffic View is a representative latency between the end user and the Azure regions to which they had connected to, and is not the DNS lookup latency. Traffic View makes a best effort estimate of the latency between the local DNS resolver and the Azure region the query was routed to, if there is insufficient data available then the latency returned will be null.
virtual-machines Disk Bursting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disk-bursting.md
Title: Managed disk bursting
description: Learn about disk bursting for Azure disks and Azure virtual machines. Previously updated : 11/09/2021 Last updated : 04/19/2022
The following scenarios can benefit greatly from bursting:
Currently, there are two managed disk types that can burst, [premium SSDs](disks-types.md#premium-ssds), and [standard SSDs](disks-types.md#standard-ssds). Other disk types cannot currently burst. There are two models of bursting for disks: - An on-demand bursting model, where the disk bursts whenever its needs exceed its current capacity. This model incurs additional charges anytime the disk bursts. On-demand bursting is only available for premium SSDs larger than 512 GiB.-- A credit-based model, where the disk will burst only if it has burst credits accumulated in its credit bucket. This model does not incur additional charges when the disk bursts. Credit-based bursting is only available for premium and standard SSDs 512 GiB and smaller.
+- A credit-based model, where the disk will burst only if it has burst credits accumulated in its credit bucket. This model does not incur additional charges when the disk bursts. Credit-based bursting is only available for premium SSDs 512 GiB and smaller, and standard SSDs 1024 GiB and smaller.
Azure [premium SSDs](disks-types.md#premium-ssds) can use either bursting model, but [standard SSDs](disks-types.md#standard-ssds) currently only offer credit-based bursting.
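To make the credit-based model concrete, here's a minimal token-bucket sketch of how burst credits accrue and drain; the class, numbers, and `tick` interface are illustrative assumptions, not the service's actual accounting:

```python
class BurstCreditBucket:
    """Illustrative token-bucket model of credit-based disk bursting.

    Credits (IOPS-seconds) accrue while the disk runs below its
    provisioned target and are spent while it bursts above it.
    """

    def __init__(self, provisioned_iops, max_burst_iops, bucket_capacity):
        self.provisioned_iops = provisioned_iops
        self.max_burst_iops = max_burst_iops
        self.capacity = bucket_capacity
        self.credits = bucket_capacity  # assume the disk starts with a full bucket

    def tick(self, requested_iops, seconds=1):
        """Serve one interval of demand; return the IOPS actually delivered."""
        if requested_iops <= self.provisioned_iops:
            # Below target: unused headroom accrues as credits (capped).
            earned = (self.provisioned_iops - requested_iops) * seconds
            self.credits = min(self.capacity, self.credits + earned)
            return requested_iops
        # Bursting: spend credits for IOPS above the provisioned target.
        extra = min(requested_iops, self.max_burst_iops) - self.provisioned_iops
        spent = min(self.credits, extra * seconds)
        self.credits -= spent
        return self.provisioned_iops + spent / seconds


disk = BurstCreditBucket(provisioned_iops=120, max_burst_iops=3500,
                         bucket_capacity=1000)
disk.tick(3500)  # burst: drains up to 1000 credits this second
```

With a full 1,000-credit bucket, one second of 3,500-IOPS demand is served at 1,120 IOPS (120 provisioned plus 1,000 spent credits), after which the disk falls back to its provisioned target until credits accrue again.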
Additionally, the [performance tier of managed disks can be changed](disks-chang
||||| | **Scenarios**|Ideal for short-term scaling (30 minutes or less).|Ideal for short-term scaling(Not time restricted).|Ideal if your workload would otherwise continually be running in burst.| |**Cost** |Free |Cost is variable, see the [Billing](#billing) section for details. |The cost of each performance tier is fixed, see [Managed Disks pricing](https://azure.microsoft.com/pricing/details/managed-disks/) for details. |
-|**Availability** |Only available for premium and standard SSDs 512 GiB and smaller. |Only available for premium SSDs larger than 512 GiB. |Available to all premium SSD sizes. |
+|**Availability** |Only available for premium SSDs 512 GiB and smaller, and standard SSDs 1024 GiB and smaller. |Only available for premium SSDs larger than 512 GiB. |Available to all premium SSD sizes. |
|**Enablement** |Enabled by default on eligible disks. |Must be enabled by user. |User must manually change their tier. | [!INCLUDE [managed-disks-bursting](../../includes/managed-disks-bursting-2.md)]
virtual-machines Share Gallery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/share-gallery.md
ms.devlang: azurecli
The Azure Compute Gallery, definitions, and versions are all resources; they can be shared using the built-in native Azure RBAC controls. Using Azure RBAC, you can share these resources with other users, service principals, and groups. You can even share access with individuals outside of the tenant they were created in. Once a user has access to the image or application version, they can deploy a VM or a Virtual Machine Scale Set.
-We recommend sharing at the gallery level for the best experience. We do not recommend sharing individual image or application versions. For more information about Azure RBAC, see [Assign Azure roles](../role-based-access-control/role-assignments-portal.md).
+We recommend sharing at the gallery level for the best experience and to avoid management overhead. We do not recommend sharing individual image or application versions. For more information about Azure RBAC, see [Assign Azure roles](../role-based-access-control/role-assignments-portal.md).
If the user is outside of your organization, they will get an email invitation to join the organization. The user needs to accept the invitation, then they will be able to see the gallery and all of the image definitions and versions in their list of resources.
virtual-machines Automation Configure Sap Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-sap-parameters.md
disks:
- { host: 'rh8scs01l84f', LUN: 0, type: 'sap' } ```
+### Oracle support
+
+From the v3.4 release, it is possible to deploy SAP on Azure systems in a Shared Home configuration using an Oracle database backend. For more information on running SAP on Oracle in Azure, see [Azure Virtual Machines Oracle DBMS deployment for SAP workload](dbms_guide_oracle.md).
+
+To install the Oracle backend using the SAP Deployment Automation Framework, you need to provide the following parameters:
+
+> [!div class="mx-tdCol2BreakAll "]
+> | Parameter | Description | Type |
+> | - | - | - |
+> | `platform` | The database backend, 'ORACLE' | Required |
+> | `ora_release` | The Oracle release version, for example 19 | Required |
+> | `ora_version` | The full Oracle version, for example 19.0.0 | Required |
+> | `oracle_sbp_patch` | The Oracle SBP patch file name | Required |
+
+### Shared Home support
+
+To configure shared home support for Oracle, you need to add a dictionary defining the SIDs to be deployed. You can do that by adding the `MULTI_SIDS` parameter, which contains a list of the SIDs and the SID details.
+
+```yaml
+MULTI_SIDS:
+- {sid: 'DE1', dbsid_uid: '3005', sidadm_uid: '2001', ascs_inst_no: '00', pas_inst_no: '00', app_inst_no: '00'}
+- {sid: 'QE1', dbsid_uid: '3006', sidadm_uid: '2002', ascs_inst_no: '01', pas_inst_no: '01', app_inst_no: '01'}
+```
+
+Each row must specify the following parameters.
+
+> [!div class="mx-tdCol2BreakAll "]
+> | Parameter | Description | Type |
+> | - | - | - |
+> | `sid` | The SID for the instance | Required |
+> | `dbsid_uid` | The UID for the DB admin user for the instance | Required |
+> | `sidadm_uid` | The UID for the SID admin user for the instance | Required |
+> | `ascs_inst_no` | The ASCS instance number for the instance | Required |
+> | `pas_inst_no` | The PAS instance number for the instance | Required |
+> | `app_inst_no` | The APP instance number for the instance | Required |
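Since each `MULTI_SIDS` row is a plain mapping, a quick pre-flight check can catch a missing key before a deployment run. This helper is a hypothetical sketch, not part of the framework:

```python
# Required keys for each MULTI_SIDS row (from the table above).
REQUIRED_KEYS = {"sid", "dbsid_uid", "sidadm_uid",
                 "ascs_inst_no", "pas_inst_no", "app_inst_no"}

def validate_multi_sids(multi_sids):
    """Return (index, missing-keys) pairs; an empty list means every
    row carries all required parameters."""
    problems = []
    for i, row in enumerate(multi_sids):
        missing = sorted(REQUIRED_KEYS - row.keys())
        if missing:
            problems.append((i, missing))
    return problems

rows = [
    {"sid": "DE1", "dbsid_uid": "3005", "sidadm_uid": "2001",
     "ascs_inst_no": "00", "pas_inst_no": "00", "app_inst_no": "00"},
    {"sid": "QE1", "dbsid_uid": "3006"},  # deliberately incomplete
]
print(validate_multi_sids(rows))
# → [(1, ['app_inst_no', 'ascs_inst_no', 'pas_inst_no', 'sidadm_uid'])]
```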
++ ## Next steps > [!div class="nextstepaction"]
virtual-machines Automation Deploy Control Plane https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-deploy-control-plane.md
New-SAPAutomationRegion -DeployerParameterfile .\DEPLOYER\MGMT-WEEU-DEP00-INFRAS
> Be sure to replace the sample value `<subscriptionID>` with your subscription ID. > Replace the `<appID>`, `<password>`, `<tenant>` values with the output values of the SPN creation +
+### Manually configure the deployer using Azure Bastion
+
+Connect to the deployer by following these steps:
+
+1. Sign in to the [Azure portal](https://portal.azure.com).
+
+1. Navigate to the resource group containing the deployer virtual machine.
+
+1. Connect to the virtual machine using Azure Bastion.
+
+1. The default username is *azureadm*
+
+1. Choose *SSH Private Key from Azure Key Vault*
+
+1. Select the subscription containing the control plane.
+
+1. Select the deployer key vault.
+
+1. From the list of secrets choose the secret ending with *-sshkey*.
+
+1. Connect to the virtual machine.
+
+Run the following script to configure the deployer.
+
+```bash
+mkdir -p ~/Azure_SAP_Automated_Deployment
+
+cd ~/Azure_SAP_Automated_Deployment
+
+git clone https://github.com/Azure/sap-automation.git
+
+cd sap-automation/deploy/scripts
+
+./configure_deployer.sh
+```
+
+The script will install Terraform and Ansible and configure the deployer.
+ ### Manually configure the deployer (deployments without public IP) If you deploy the deployer without a public IP, Terraform isn't able to configure the deployer Virtual Machine, as it won't be able to connect to it.
+> [!NOTE]
+>You need to connect to the deployer virtual machine from a computer that can reach the Azure virtual network.
+ Connect to the deployer by following these steps: 1. Sign in to the [Azure portal](https://portal.azure.com).
Connect to the deployer by following these steps:
1. Save the file. If you're prompted to **Save as type**, select **All files** if **SSH** isn't an option. For example, use `deployer.ssh`.
-1. Connect to the deployer VM through any SSH client such as VSCode. Use the public IP address you noted earlier, and the SSH key you downloaded. For instructions on how to connect to the Deployer using VSCode see [Connecting to Deployer using VSCode](automation-tools-configuration.md#configuring-visual-studio-code). If you're using PuTTY, convert the SSH key file first using PuTTYGen.
+1. Connect to the deployer VM through any SSH client such as VSCode. Use the private IP address of the deployer, and the SSH key you downloaded. For instructions on how to connect to the Deployer using VSCode see [Connecting to Deployer using VSCode](automation-tools-configuration.md#configuring-visual-studio-code). If you're using PuTTY, convert the SSH key file first using PuTTYGen.
> [!NOTE] >The default username is *azureadm*
Connect to the deployer by following these steps:
Configure the deployer using the following script:
-```cloudshell-interactive
+```bash
mkdir -p ~/Azure_SAP_Automated_Deployment cd ~/Azure_SAP_Automated_Deployment
virtual-machines Automation Naming Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-naming-module.md
Title: Configure custom naming module for the automation framework
+ Title: Configure custom naming for the automation framework
description: Explanation of how to implement custom naming conventions for the SAP deployment automation framework on Azure.
-# Configure custom naming module
+# Overview
The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses a standard naming convention for Azure [resource naming](automation-naming.md).
-The Terraform module `sap_namegenerator` defines the names of all resources that the automation framework deploys. The module is located at `/deploy/terraform/terraform-units/modules/sap_namegenerator/` in the repository.
+The Terraform module `sap_namegenerator` defines the names of all resources that the automation framework deploys. The module is located at `/deploy/terraform/terraform-units/modules/sap_namegenerator/` in the repository. The framework also supports providing your own names for some of the resources using the [parameter files](automation-configure-system.md).
-The framework also supports providing you own names for some of the resources using the [parameter files](automation-configure-system.md). If these capabilities are not enough you can also use custom naming logic by modifying the naming module used by the automation.
+If these capabilities aren't enough, you can also use custom naming logic, either by providing a custom json file containing the resource names or by modifying the naming module used by the automation.
+
+## Provide name overrides using a json file
+
+You can specify a custom naming json file in your tfvars parameter file using the 'name_override_file' parameter.
+
+The json file has sections for the different resource types.
+
+The deployment types are:
+
+- DEPLOYER (Control Plane)
+- SDU (SAP System Infrastructure)
+- VNET (Workload zone)
+
+### Key Vault names
+
+The names for the key vaults are defined in the "keyvault_names" structure. The example below lists the key vault names for a deployment in the "DEV" environment in West Europe.
+
+```json
+"keyvault_names": {
+ "DEPLOYER": {
+ "private_access": "DEVWEEUprvtABC",
+ "user_access": "DEVWEEUuserABC"
+ },
+ "SDU": {
+ "private_access": "DEVWEEUSAP01X00pABC",
+ "user_access": "DEVWEEUSAP01X00uABC"
+ },
+ "VNET": {
+ "private_access": "DEVWEEUSAP01prvtABC",
+ "user_access": "DEVWEEUSAP01userABC"
+ }
+ }
+```
+
+> [!NOTE]
+> These key vault names need to be unique across Azure. SDAF appends three random characters (ABC in the example) to the end of the key vault name to reduce the likelihood of name conflicts.
+
+The "private_access" names are currently not used.
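The suffixing behavior described in the note above can be sketched as follows; the function and the truncation logic are illustrative assumptions, not SDAF's actual code (Azure key vault names are limited to 24 characters):

```python
import random
import string

def keyvault_name(base, rng=random):
    """Append three random alphanumeric characters to reduce name
    collisions, truncating so the result stays within the 24-character
    Azure key vault name limit (illustrative logic only)."""
    suffix = "".join(rng.choices(string.ascii_uppercase + string.digits, k=3))
    return base[:21] + suffix

print(keyvault_name("DEVWEEUprvt"))  # e.g. DEVWEEUprvtQ7K
```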
+
+### Storage Account names
+
+The names for the storage accounts are defined in the "storageaccount_names" structure. The example below lists the storage account names for a deployment in the "DEV" environment in West Europe.
+
+```json
+"storageaccount_names": {
+ "DEPLOYER": "devweeudiagabc",
+ "LIBRARY": {
+ "library_storageaccount_name": "devweeusaplibabc",
+ "terraformstate_storageaccount_name": "devweeutfstateabc"
+ },
+ "SDU": "devweeusap01diagabc",
+ "VNET": {
+ "landscape_shared_transport_storage_account_name": "devweeusap01sharedabc",
+ "landscape_storageaccount_name": "devweeusap01diagabc",
+ "witness_storageaccount_name": "devweeusap01witnessabc"
+ }
+ }
+```
+
+> [!NOTE]
+> These storage account names need to be unique across Azure. SDAF appends three random characters (abc in the example) to the end of the storage account name to reduce the likelihood of name conflicts.
+
+### Virtual Machine names
+
+The names for the virtual machines are defined in the "virtualmachine_names" structure. Both the computer and the virtual machine names can be provided.
+
+The example below lists the virtual machine names for a deployment in the "DEV" environment in West Europe. The deployment has a database server, two application servers, a Central Services server and a web dispatcher.
+
+```json
+ "virtualmachine_names": {
+ "ANCHOR_COMPUTERNAME": [],
+ "ANCHOR_SECONDARY_DNSNAME": [],
+ "ANCHOR_VMNAME": [],
+ "ANYDB_COMPUTERNAME": [
+ "x00db00l0abc"
+ ],
+ "ANYDB_SECONDARY_DNSNAME": [
+ "x00dhdb00l0abc",
+ "x00dhdb00l1abc"
+ ],
+ "ANYDB_VMNAME": [
+ "x00db00l0abc"
+ ],
+ "APP_COMPUTERNAME": [
+ "x00app00labc",
+ "x00app01labc"
+ ],
+ "APP_SECONDARY_DNSNAME": [
+ "x00app00labc",
+ "x00app01labc"
+ ],
+ "APP_VMNAME": [
+ "x00app00labc",
+ "x00app01labc"
+ ],
+ "DEPLOYER": [
+ "devweeudeploy00"
+ ],
+ "HANA_COMPUTERNAME": [
+ "x00dhdb00l0af"
+ ],
+ "HANA_SECONDARY_DNSNAME": [
+ "x00dhdb00l0abc"
+ ],
+ "HANA_VMNAME": [
+ "x00dhdb00l0abc"
+ ],
+ "ISCSI_COMPUTERNAME": [
+ "devsap01weeuiscsi00"
+ ],
+ "OBSERVER_COMPUTERNAME": [
+ "x00observer00labc"
+ ],
+ "OBSERVER_VMNAME": [
+ "x00observer00labc"
+ ],
+ "SCS_COMPUTERNAME": [
+ "x00scs00labc"
+ ],
+ "SCS_SECONDARY_DNSNAME": [
+ "x00scs00labc"
+ ],
+ "SCS_VMNAME": [
+ "x00scs00labc"
+ ],
+ "WEB_COMPUTERNAME": [
+ "x00web00labc"
+ ],
+ "WEB_SECONDARY_DNSNAME": [
+ "x00web00labc"
+ ],
+ "WEB_VMNAME": [
+ "x00web00labc"
+ ]
+ }
+```
+
+## Configure custom naming module
There are multiple files within the module for naming resources:
The different resource names are identified by prefixes in the Terraform code.
- SAP landscape deployments use resource names with the prefix `vnet_` - SAP system deployments use resource names with the prefix `sdu_`
-The calculated names are returned in a data dictionary which is used by all the terraform modules.
+The calculated names are returned in a data dictionary, which is used by all the terraform modules.
## Using custom names
virtual-machines Automation Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-tutorial.md
Make sure you can connect to your deployer VM:
- Once connected to the deployer VM, you can now download the SAP software using the Bill of Materials (BOM).
+## Connect to deployer VM when not using a public IP
+
+For deployments without public IPs, direct connectivity over the internet isn't allowed. In these cases, you can use Azure Bastion or a jump box, or perform the next step from a computer that has connectivity to the Azure virtual network.
+
+The following example uses Azure Bastion.
+
+Connect to the deployer by following these steps:
+
+1. Sign in to the [Azure portal](https://portal.azure.com).
+
+1. Navigate to the resource group containing the deployer virtual machine.
+
+1. Connect to the virtual machine using Azure Bastion.
+
+1. The default username is *azureadm*
+
+1. Choose *SSH Private Key from Azure Key Vault*
+
+1. Select the subscription containing the control plane.
+
+1. Select the deployer key vault.
+
+1. From the list of secrets choose the secret ending with *-sshkey*.
+
+1. Connect to the virtual machine.
+
+Run the following script to configure the deployer.
+
+```bash
+mkdir -p ~/Azure_SAP_Automated_Deployment
+
+cd ~/Azure_SAP_Automated_Deployment
+
+git clone https://github.com/Azure/sap-automation.git
+
+cd sap-automation/deploy/scripts
+
+./configure_deployer.sh
+```
+
+The script will install Terraform and Ansible and configure the deployer.
> [!IMPORTANT] > The rest of the tasks need to be executed on the Deployer
virtual-network-manager Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network-manager/faq.md
Previously updated : 11/02/2021 Last updated : 4/18/2022
* North Central US
+* South Central US
+ * West US * West US 2
* East US 2
+* Canada Central
+ * North Europe * West Europe
-* France Central
+* UK South
+
+* Switzerland North
+
+* Southeast Asia
+
+* Japan East
+
+* Japan West
+
+* Australia East
+
+* Central India
+
+* All regions that have [Availability Zones](../availability-zones/az-overview.md#azure-regions-with-availability-zones), except France Central.
+
+> [!NOTE]
+> Even if an Azure Virtual Network Manager instance isn't available because all zones are down, configurations applied to resources will still persist.
+>
### What are common use cases for using Azure Virtual Network Manager?
virtual-network Accelerated Networking Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/accelerated-networking-overview.md
az vm list-skus \
If you're using a custom image and your image supports Accelerated Networking, make sure that you have the required drivers to work with Mellanox ConnectX-3, ConnectX-4 Lx, and ConnectX-5 NICs on Azure. Also, Accelerated Networking requires network configurations that exempt the configuration of the virtual functions (mlx4_en and mlx5_core drivers). In images that have cloud-init >=19.4, networking is correctly configured to support Accelerated Networking during provisioning. +
+Sample configuration drop-in for NetworkManager (RHEL, CentOS):
+```bash
+sudo mkdir -p /etc/NetworkManager/conf.d
+sudo tee /etc/NetworkManager/conf.d/99-azure-unmanaged-devices.conf <<EOF
+# Ignore SR-IOV interface on Azure, since it'll be transparently bonded
+# to the synthetic interface
+[keyfile]
+unmanaged-devices=driver:mlx4_core;driver:mlx5_core
+EOF
+```
+
+Sample configuration drop-in for networkd (Ubuntu, Debian, Flatcar):
+```bash
+sudo mkdir -p /etc/systemd/network
+sudo tee /etc/systemd/network/99-azure-unmanaged-devices.network <<EOF
+# Ignore SR-IOV interface on Azure, since it'll be transparently bonded
+# to the synthetic interface
+[Match]
+Driver=mlx4_en mlx5_en mlx4_core mlx5_core
+[Link]
+Unmanaged=yes
+EOF
+```
+ ### Regions Accelerated networking is available in all global Azure regions and Azure Government Cloud.
virtual-network Nat Gateway Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/nat-gateway/nat-gateway-resource.md
For UDP traffic, after a connection has closed, the port will be in hold down fo
| Timer | Description | Value | |||| | TCP idle timeout | TCP connections can go idle when no data is transmitted between either endpoint for a prolonged period of time. A timer can be configured from 4 minutes (default) to 120 minutes (2 hours) to time out a connection that has gone idle. Traffic on the flow will reset the idle timeout timer. | Configurable; 4 minutes (default) - 120 minutes |
+| UDP idle timeout | UDP connections can go idle when no data is transmitted between either endpoint for a prolonged period of time. UDP idle timeout timers are 4 minutes and are **not configurable**. Traffic on the flow will reset the idle timeout timer. | **Not configurable**; 4 minutes |
> [!NOTE] > These timer settings are subject to change. The values are provided to help with troubleshooting and you should not take a dependency on specific timers at this time.
For UDP traffic, after a connection has closed, the port will be in hold down fo
Design recommendations for configuring timers: -- In an idle connection scenario, NAT gateway holds onto SNAT ports until the connection idle times out. Because long idle timeout timers can unnecessarily increase the likelihood of SNAT port exhaustion, it isn't recommended to increase the idle timeout duration to longer than the default time of 4 minutes. If a flow never goes idle, then it will not be impacted by the idle timer.
+- In an idle connection scenario, NAT gateway holds onto SNAT ports until the connection idle times out. Because long idle timeout timers can unnecessarily increase the likelihood of SNAT port exhaustion, it isn't recommended to increase the TCP idle timeout duration to longer than the default time of 4 minutes. If a flow never goes idle, then it will not be impacted by the idle timer.
- TCP keepalives can be used to provide a pattern of refreshing long idle connections and endpoint liveness detection. TCP keepalives appear as duplicate ACKs to the endpoints, are low overhead, and invisible to the application layer.
+- Because UDP idle timeout timers are not configurable, UDP keepalives should be used to ensure that the idle timeout value isn't reached and that the connection is maintained. Unlike TCP connections, a UDP keepalive enabled on one side of the connection only applies to traffic flow in one direction. UDP keepalives must be enabled on both sides of the traffic flow in order to keep the traffic flow alive.
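As a concrete example of the keepalive guidance above, the sketch below enables TCP keepalives on a socket with an idle timer below NAT gateway's 4-minute default; the option names are Linux-specific and the values are illustrative, not recommendations:

```python
import socket

def enable_keepalive(sock, idle=180, interval=30, count=3):
    """Enable TCP keepalives so an idle flow is refreshed before a
    4-minute NAT idle timeout can expire (Linux socket options)."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)      # seconds idle before first probe
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)  # seconds between probes
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, count)       # probes before giving up

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
enable_keepalive(s)
```

Because the probes are generated by the kernel, they refresh the NAT gateway's idle timer without any change to the application layer.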
+ ## Limitations - Basic load balancers and basic public IP addresses aren't compatible with NAT. Use standard SKU load balancers and public IPs instead.
virtual-network Troubleshoot Nat https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/nat-gateway/troubleshoot-nat.md
The table below describes two common scenarios in which outbound connectivity ma
### TCP idle timeout timers set higher than the default value
-NAT gateway has a configurable TCP idle timeout timer that defaults to 4 minutes. If this setting is changed to a higher value, NAT gateway will hold on to flows longer and can create [additional pressure on SNAT port inventory](nat-gateway-resource.md#timers). The table below describes a common scenarion in which a high TCP idle timeout may be causing SNAT exhaustion and provides possible mitigation steps to take:
+The NAT gateway TCP idle timeout timer is set to 4 minutes by default but is configurable up to 120 minutes. If this setting is changed to a higher value than the default, NAT gateway will hold on to flows longer and can create [additional pressure on SNAT port inventory](nat-gateway-resource.md#timers). The table below describes a common scenario in which a high TCP idle timeout may be causing SNAT exhaustion and provides possible mitigation steps to take:
| Scenario | Evidence | Mitigation | ||||
As described in the [TCP timers](#tcp-idle-timeout-timers-set-higher-than-the-de
### UDP idle timeout
-Unlike TCP idle timeout timers for NAT gateway, UDP idle timeout timers are not configurable. The table below describes a common scenario encountered with connections dropping due to UDP traffic idle timing out and steps to take to mitigate the issue.
+UDP idle timeout timers are set to 4 minutes. Unlike TCP idle timeout timers for NAT gateway, UDP idle timeout timers are not configurable. The table below describes a common scenario encountered with connections dropping due to UDP traffic idle timing out and steps to take to mitigate the issue.
| Scenario | Evidence | Mitigation | ||||
To learn more about NAT gateway, see:
* [Virtual Network NAT](nat-overview.md) * [NAT gateway resource](nat-gateway-resource.md)
-* [Metrics and alerts for NAT gateway resources](nat-metrics.md).
+* [Metrics and alerts for NAT gateway resources](nat-metrics.md).
virtual-wan Virtual Wan Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-wan/virtual-wan-faq.md
Title: 'Azure Virtual WAN FAQ | Microsoft Docs'
description: See answers to frequently asked questions about Azure Virtual WAN networks, clients, gateways, devices, partners, and connections. - Previously updated : 12/07/2021+ Last updated : 04/19/2022 # Customer intent: As someone with a networking background, I want to read more details about Virtual WAN in a FAQ format.
### Is Azure Virtual WAN in GA?
-Yes, Azure Virtual WAN is Generally Available (GA). However, Virtual WAN consists of several features and scenarios. There are feature or scenarios within Virtual WAN where Microsoft applies the Preview tag. In those cases, the specific feature, or the scenario itself, is in Preview. If you do not use a specific preview feature, regular GA support applies. For more information about Preview support, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+Yes, Azure Virtual WAN is Generally Available (GA). However, Virtual WAN consists of several features and scenarios. There are feature or scenarios within Virtual WAN where Microsoft applies the Preview tag. In those cases, the specific feature, or the scenario itself, is in Preview. If you don't use a specific preview feature, regular GA support applies. For more information about Preview support, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+
+### Which locations and regions are available?
+
+For information, see [Available locations and regions](virtual-wan-locations-partners.md#locations).
### Does the user need to have hub and spoke with SD-WAN/VPN devices to use Azure Virtual WAN?
-Virtual WAN provides many functionalities built into a single pane of glass such as Site/Site-to-site VPN connectivity, User/P2S connectivity, ExpressRoute connectivity, Virtual Network connectivity, VPN ExpressRoute Interconnectivity, VNet-to-VNet transitive connectivity, Centralized Routing, Azure Firewall and Firewall Manager security, Monitoring, ExpressRoute Encryption, and many other capabilities. You do not have to have all of these use-cases to start using Virtual WAN. You can get started with just one use case.
+Virtual WAN provides many functionalities built into a single pane of glass such as site/site-to-site VPN connectivity, User/P2S connectivity, ExpressRoute connectivity, virtual network connectivity, VPN ExpressRoute Interconnectivity, VNet-to-VNet transitive connectivity, Centralized Routing, Azure Firewall and Firewall Manager security, Monitoring, ExpressRoute Encryption, and many other capabilities. You don't have to have all of these use-cases to start using Virtual WAN. You can get started with just one use case.
-The Virtual WAN architecture is a hub and spoke architecture with scale and performance built in where branches (VPN/SD-WAN devices), users (Azure VPN Clients, openVPN, or IKEv2 Clients), ExpressRoute circuits, Virtual Networks serve as spokes to virtual hub(s). All hubs are connected in full mesh in a Standard Virtual WAN making it easy for the user to use the Microsoft backbone for any-to-any (any spoke) connectivity. For hub and spoke with SD-WAN/VPN devices, users can either manually set it up in the Azure Virtual WAN portal or use the Virtual WAN Partner CPE (SD-WAN/VPN) to set up connectivity to Azure.
+The Virtual WAN architecture is a hub and spoke architecture with scale and performance built in where branches (VPN/SD-WAN devices), users (Azure VPN Clients, openVPN, or IKEv2 Clients), ExpressRoute circuits, virtual networks serve as spokes to virtual hub(s). All hubs are connected in full mesh in a Standard Virtual WAN making it easy for the user to use the Microsoft backbone for any-to-any (any spoke) connectivity. For hub and spoke with SD-WAN/VPN devices, users can either manually set it up in the Azure Virtual WAN portal or use the Virtual WAN Partner CPE (SD-WAN/VPN) to set up connectivity to Azure.
-Virtual WAN partners provide automation for connectivity, which is the ability to export the device info into Azure, download the Azure configuration and establish connectivity to the Azure Virtual WAN hub. For Point-to-site/User VPN connectivity, we support [Azure VPN client](https://go.microsoft.com/fwlink/?linkid=2117554), OpenVPN, or IKEv2 client.
+Virtual WAN partners provide automation for connectivity, which is the ability to export the device info into Azure, download the Azure configuration and establish connectivity to the Azure Virtual WAN hub. For point-to-site/User VPN connectivity, we support [Azure VPN client](https://go.microsoft.com/fwlink/?linkid=2117554), OpenVPN, or IKEv2 client.
### Can you disable fully meshed hubs in a Virtual WAN?
-Virtual WAN comes in two flavors: Basic and Standard. In Basic Virtual WAN, hubs are not meshed. In a Standard Virtual WAN, hubs are meshed and automatically connected when the virtual WAN is first set up. The user does not need to do anything specific. The user also does not have to disable or enable the functionality to obtain meshed hubs. Virtual WAN provides you many routing options to steer traffic between any spoke (VNet, VPN, or ExpressRoute). It provides the ease of fully meshed hubs, and also the flexibility of routing traffic per your needs.
+Virtual WAN comes in two flavors: Basic and Standard. In Basic Virtual WAN, hubs aren't meshed. In a Standard Virtual WAN, hubs are meshed and automatically connected when the virtual WAN is first set up. The user doesn't need to do anything specific. The user also doesn't have to disable or enable the functionality to obtain meshed hubs. Virtual WAN provides you many routing options to steer traffic between any spoke (VNet, VPN, or ExpressRoute). It provides the ease of fully meshed hubs, and also the flexibility of routing traffic per your needs.
### How are Availability Zones and resiliency handled in Virtual WAN? Virtual WAN is a collection of hubs and services made available inside the hub. The user can have as many Virtual WANs as they need. In a Virtual WAN hub, there are multiple services like VPN, ExpressRoute, etc. Each of these services is automatically deployed across Availability Zones (except Azure Firewall), if the region supports Availability Zones. If a region becomes an Availability Zone after the initial deployment in the hub, the user can recreate the gateways, which will trigger an Availability Zone deployment. All gateways are provisioned in a hub as active-active, meaning resiliency is built in within a hub. Users can connect to multiple hubs if they want resiliency across regions.
-Currently, Azure Firewall can be deployed to support Availability Zones using Azure Firewall Manager Portal, [PowerShell](/powershell/module/az.network/new-azfirewall#example-6--create-a-firewall-with-no-rules-and-with-availability-zones) or CLI. There is currently no way to configure an existing Firewall to be deployed across availability zones. You will need to delete and redeploy your Azure Firewall.
+Currently, Azure Firewall can be deployed to support Availability Zones using Azure Firewall Manager Portal, [PowerShell](/powershell/module/az.network/new-azfirewall#example-6--create-a-firewall-with-no-rules-and-with-availability-zones) or CLI. There is currently no way to configure an existing Firewall to be deployed across availability zones. You'll need to delete and redeploy your Azure Firewall.
-While the concept of Virtual WAN is global, the actual Virtual WAN resource is Resource Manager-based and deployed regionally. If the virtual WAN region itself were to have an issue, all hubs in that virtual WAN will continue to function as is, but the user will not be able to create new hubs until the virtual WAN region is available.
+While the concept of Virtual WAN is global, the actual Virtual WAN resource is Resource Manager-based and deployed regionally. If the virtual WAN region itself were to have an issue, all hubs in that virtual WAN will continue to function as is, but the user won't be able to create new hubs until the virtual WAN region is available.
-### What client does the Azure Virtual WAN User VPN (Point-to-site) support?
+### What client does the Azure Virtual WAN User VPN (point-to-site) support?
-Virtual WAN supports [Azure VPN client](https://go.microsoft.com/fwlink/?linkid=2117554), OpenVPN Client, or any IKEv2 client. Azure AD authentication is supported with Azure VPN Client.A minimum of Windows 10 client OS version 17763.0 or higher is required. OpenVPN client(s) can support certificate-based authentication. Once cert-based auth is selected on the gateway, you will see the.ovpn* file to download to your device. IKEv2 supports both certificate and RADIUS authentication.
+Virtual WAN supports [Azure VPN client](https://go.microsoft.com/fwlink/?linkid=2117554), OpenVPN Client, or any IKEv2 client. Azure AD authentication is supported with Azure VPN Client. A minimum of Windows 10 client OS version 17763.0 or higher is required. OpenVPN client(s) can support certificate-based authentication. Once cert-based auth is selected on the gateway, you'll see the *.ovpn file to download to your device. IKEv2 supports both certificate and RADIUS authentication.
-### For User VPN (Point-to-site)- why is the P2S client pool split into two routes?
+### For User VPN (point-to-site), why is the P2S client pool split into two routes?
Each gateway has two instances, the split happens so that each gateway instance can independently allocate client IPs for connected clients and traffic from the virtual network is routed back to the correct gateway instance to avoid inter-gateway instance hop.
There are two options to add DNS servers for the P2S clients. The first method i
# Specify custom dns servers for P2SVpnGateway VirtualHub while updating existing gateway
$P2SVpnGateway = Get-AzP2sVpnGateway -ResourceGroupName $rgName -Name $P2SvpnGatewayName
- $updatedP2SVpnGateway = Update-AzP2sVpnGateway -ResourceGroupName $rgName -Name $P2SvpnGatewayName -CustomDnsServer $customDnsServers
+ $updatedP2SVpnGateway = Update-AzP2sVpnGateway -ResourceGroupName $rgName -Name $P2SvpnGatewayName -CustomDnsServer $customDnsServers
# Re-generate Vpn profile either from PS/Portal for Vpn clients to have the specified dns servers
```
-2. Or, if you are using the Azure VPN Client for Windows 10, you can modify the downloaded profile XML file and add the **\<dnsservers>\<dnsserver> \</dnsserver>\</dnsservers>** tags before importing it.
+
+2. Or, if you're using the Azure VPN Client for Windows 10, you can modify the downloaded profile XML file and add the **\<dnsservers>\<dnsserver> \</dnsserver>\</dnsservers>** tags before importing it.
```xml
<azvpnprofile>
There are two options to add DNS servers for the P2S clients. The first method i
</azvpnprofile>
```
-### For User VPN (Point-to-site)- how many clients are supported?
+### For User VPN (point-to-site), how many clients are supported?
Each User VPN P2S gateway has two instances. Each instance supports up to a certain number of connections as the scale units change. Scale units 1-3 support 500 connections, scale units 4-6 support 1,000 connections, scale units 7-12 support 5,000 connections, and scale units 13-18 support up to 10,000 connections.
-For example, let's say the user chooses 1 scale unit. Each scale unit would imply an active-active gateway deployed and each of the instances (in this case 2) would support up to 500 connections. Since you can get 500 connections * 2 per gateway, it does not mean that you plan for 1000 instead of the 500 for this scale unit. Instances may need to be serviced during which connectivity for the extra 500 may be interrupted if you surpass the recommended connection count. Also, be sure to plan for downtime in case you decide to scale up or down on the scale unit, or change the point-to-site configuration on the VPN gateway.
+For example, let's say the user chooses 1 scale unit. Each scale unit implies an active-active gateway deployed, and each of the instances (in this case 2) supports up to 500 connections. Although you can get 500 connections * 2 per gateway, this doesn't mean that you plan for 1,000 instead of the 500 for this scale unit. Instances may need to be serviced, during which connectivity for the extra 500 connections may be interrupted if you surpass the recommended connection count. Also, be sure to plan for downtime in case you decide to scale up or down on the scale unit, or change the point-to-site configuration on the VPN gateway.
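As an informal sketch of the tiering above (the helper is illustrative only, not part of any Azure SDK), the point is to plan for the per-instance figure rather than doubling it:

```python
# Connections supported per P2S gateway instance by scale-unit tier,
# using the figures stated in this FAQ.
TIERS = [
    (1, 3, 500),      # scale units 1-3   -> 500 connections
    (4, 6, 1000),     # scale units 4-6   -> 1,000 connections
    (7, 12, 5000),    # scale units 7-12  -> 5,000 connections
    (13, 18, 10000),  # scale units 13-18 -> up to 10,000 connections
]

def connections_for(scale_unit):
    """Return the per-instance connection count to plan around."""
    for low, high, connections in TIERS:
        if low <= scale_unit <= high:
            return connections
    raise ValueError(f"unsupported scale unit: {scale_unit}")

print(connections_for(1))   # 500
print(connections_for(10))  # 5000
```

Planning against the per-instance value (not 2x) leaves headroom for maintenance, when one instance may be briefly unavailable.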
### What are Virtual WAN gateway scale units?
-A scale unit is a unit defined to pick an aggregate throughput of a gateway in Virtual hub. 1 scale unit of VPN = 500 Mbps. 1 scale unit of ExpressRoute = 2 Gbps. Example: 10 scale unit of VPN would imply 500 Mbps * 10 = 5 Gbps
+A scale unit is a unit defined to pick an aggregate throughput of a gateway in the virtual hub. 1 scale unit of VPN = 500 Mbps. 1 scale unit of ExpressRoute = 2 Gbps. Example: 10 scale units of VPN would imply 500 Mbps * 10 = 5 Gbps.
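A minimal sketch of the scale-unit arithmetic (the constants mirror the rates stated above; the function name is illustrative, not an Azure API):

```python
VPN_SCALE_UNIT_MBPS = 500            # 1 VPN scale unit = 500 Mbps
EXPRESSROUTE_SCALE_UNIT_MBPS = 2000  # 1 ExpressRoute scale unit = 2 Gbps

def aggregate_throughput_gbps(scale_units, mbps_per_unit):
    # Aggregate throughput is simply scale units * per-unit rate.
    return scale_units * mbps_per_unit / 1000

print(aggregate_throughput_gbps(10, VPN_SCALE_UNIT_MBPS))  # 5.0
```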
### What is the difference between an Azure virtual network gateway (VPN Gateway) and an Azure Virtual WAN VPN gateway?
-Virtual WAN provides large-scale site-to-site connectivity and is built for throughput, scalability, and ease of use. When you connect a site to a Virtual WAN VPN gateway, it is different from a regular virtual network gateway that uses a gateway type 'Site-to-site VPN'. When you want to connect remote users to Virtual WAN, you use a gateway type 'Point-to-site VPN'. The Point-to-site and Site-to-site VPN Gateways are separate entities in the Virtual WAN hub and must be individually deployed. Similarly, when you connect an ExpressRoute circuit to a Virtual WAN hub, it uses a different resource for the ExpressRoute gateway than the regular virtual network gateway that uses gateway type 'ExpressRoute'.
+Virtual WAN provides large-scale site-to-site connectivity and is built for throughput, scalability, and ease of use. When you connect a site to a Virtual WAN VPN gateway, it's different from a regular virtual network gateway that uses a gateway type 'site-to-site VPN'. When you want to connect remote users to Virtual WAN, you use a gateway type 'point-to-site VPN'. The point-to-site and site-to-site VPN gateways are separate entities in the Virtual WAN hub and must be individually deployed. Similarly, when you connect an ExpressRoute circuit to a Virtual WAN hub, it uses a different resource for the ExpressRoute gateway than the regular virtual network gateway that uses gateway type 'ExpressRoute'.
-Virtual WAN supports up to 20-Gbps aggregate throughput both for VPN and ExpressRoute. Virtual WAN also has automation for connectivity with an ecosystem of CPE branch device partners. CPE branch devices have built-in automation that autoprovisions and connects into Azure Virtual WAN. These devices are available from a growing ecosystem of SD-WAN and VPN partners. See the [Preferred Partner List](virtual-wan-locations-partners.md).
+Virtual WAN supports up to 20-Gbps aggregate throughput both for VPN and ExpressRoute. Virtual WAN also has automation for connectivity with an ecosystem of CPE branch device partners. CPE branch devices have built-in automation that autoprovisions and connects into Azure Virtual WAN. These devices are available from a growing ecosystem of SD-WAN and VPN partners. See the [Preferred partner list](virtual-wan-locations-partners.md).
### How is Virtual WAN different from an Azure virtual network gateway?
-A virtual network gateway VPN is limited to 30 tunnels. For connections, you should use Virtual WAN for large-scale VPN. You can connect up to 1,000 branch connections per Virtual Hub with aggregate of 20 Gbps per Hub. A connection is an active-active tunnel from the on-premises VPN device to the virtual hub. You can also have multiple virtual hubs per region, which means you can connect more than 1,000 branches to a single Azure Region by deploying multiple Virtual WAN hubs in that Azure Region, each with its own Site-to-site VPN gateway.
+A virtual network gateway VPN is limited to 30 tunnels. For connections, you should use Virtual WAN for large-scale VPN. You can connect up to 1,000 branch connections per virtual hub with an aggregate of 20 Gbps per hub. A connection is an active-active tunnel from the on-premises VPN device to the virtual hub. You can also have multiple virtual hubs per region, which means you can connect more than 1,000 branches to a single Azure region by deploying multiple Virtual WAN hubs in that Azure region, each with its own site-to-site VPN gateway.
-### What is the recommended algorithm and Packets per second per Site-to-site instance in Virtual WAN hub? How many tunnels is support per instance? What is the max throughput supported in a single tunnel?
+### What is the recommended algorithm and packets per second per site-to-site instance in a Virtual WAN hub? How many tunnels are supported per instance? What is the max throughput supported in a single tunnel?
-Virtual WAN supports 2 active Site-to-Site VPN Gateway instances in a virtual hub. This means there are 2 active-active set of VPN gateway instances in a virtual hub. During maintenance operations, each instance is upgraded one by one due to which a user may experience brief decrease in aggregate throughput of a VPN gateway.
+Virtual WAN supports 2 active site-to-site VPN gateway instances in a virtual hub, deployed as an active-active set. During maintenance operations, each instance is upgraded one by one, during which a user may experience a brief decrease in aggregate throughput of a VPN gateway.
While Virtual WAN VPN supports many algorithms, our recommendation is GCMAES256 for both IPsec encryption and integrity for optimal performance. AES256 and SHA256 are considered less performant, and therefore performance degradation such as latency and packet drops can be expected for similar algorithm types. Packets per second, or PPS, is a factor of the total number of packets and the throughput supported per instance. This is best understood with an example. Let's say a 1 scale unit 500-Mbps site-to-site VPN gateway instance is deployed in a virtual WAN hub. Assuming a packet size of 1500, expected PPS for that VPN gateway instance *at a minimum* = [(500 Mbps * 1024 * 1024) /8/1500] ~ 43690.
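As a quick check of the PPS arithmetic (an illustrative helper, not an Azure API; the stated ~43690 figure corresponds to a 1500-byte packet):

```python
def expected_pps(scale_units, packet_size_bytes=1500):
    # 1 VPN scale unit = 500 Mbps per instance. Convert Mbps -> bits/sec,
    # then to bytes/sec, then divide by packet size for packets/sec.
    throughput_bits_per_sec = scale_units * 500 * 1024 * 1024
    return throughput_bits_per_sec // 8 // packet_size_bytes

print(expected_pps(1))  # 43690
```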
-Virtual WAN has concepts of VPN connection, link connection and tunnels. A single VPN connection consists of link connections. Virtual WAN supports up to 4 link connections in a VPN connection. Each link connection consists of two IPsec tunnels that terminate in two instances of an active-active VPN gateway deployed in a virtual hub. The total number of tunnels that can terminate in a single active instance is 1000, which also implies that throughput for 1 instance will be available aggregated across all the tunnels connecting to that instance. Each tunnel also has certain throughput values. For GCM algorithm, a tunnel can support up to a maximum 1.25 Gbps. In cases of multiple tunnels connected to a lower value scale unit gateway, it is best to evaluate the need per tunnel and plan for a VPN gateway that is an aggregate value for throughput across all tunnels terminating in the VPN instance.
+Virtual WAN has concepts of VPN connection, link connection and tunnels. A single VPN connection consists of link connections. Virtual WAN supports up to 4 link connections in a VPN connection. Each link connection consists of two IPsec tunnels that terminate in two instances of an active-active VPN gateway deployed in a virtual hub. The total number of tunnels that can terminate in a single active instance is 1000, which also implies that throughput for 1 instance will be available aggregated across all the tunnels connecting to that instance. Each tunnel also has certain throughput values. For GCM algorithm, a tunnel can support up to a maximum 1.25 Gbps. In cases of multiple tunnels connected to a lower value scale unit gateway, it's best to evaluate the need per tunnel and plan for a VPN gateway that is an aggregate value for throughput across all tunnels terminating in the VPN instance.
### Which device providers (Virtual WAN partners) are supported?
No. You can use any VPN-capable device that adheres to the Azure requirements fo
### How do Virtual WAN partners automate connectivity with Azure Virtual WAN?
-Software-defined connectivity solutions typically manage their branch devices using a controller, or a device provisioning center. The controller can use Azure APIs to automate connectivity to the Azure Virtual WAN. The automation includes uploading branch information, downloading the Azure configuration, setting up IPsec tunnels into Azure Virtual hubs, and automatically setting up connectivity form the branch device to Azure Virtual WAN. When you have hundreds of branches, connecting using Virtual WAN CPE Partners is easy because the onboarding experience takes away the need to set up, configure, and manage large-scale IPsec connectivity. For more information, see [Virtual WAN partner automation](virtual-wan-configure-automation-providers.md).
+Software-defined connectivity solutions typically manage their branch devices using a controller, or a device provisioning center. The controller can use Azure APIs to automate connectivity to the Azure Virtual WAN. The automation includes uploading branch information, downloading the Azure configuration, setting up IPsec tunnels into Azure Virtual hubs, and automatically setting up connectivity from the branch device to Azure Virtual WAN. When you have hundreds of branches, connecting using Virtual WAN CPE Partners is easy because the onboarding experience takes away the need to set up, configure, and manage large-scale IPsec connectivity. For more information, see [Virtual WAN partner automation](virtual-wan-configure-automation-providers.md).
-### What if a device I am using is not in the Virtual WAN partner list? Can I still use it to connect to Azure Virtual WAN VPN?
+### What if a device I'm using isn't in the Virtual WAN partner list? Can I still use it to connect to Azure Virtual WAN VPN?
-Yes as long as the device supports IPsec IKEv1 or IKEv2. Virtual WAN partners automate connectivity from the device to Azure VPN end points. This implies automating steps such as 'branch information upload', 'IPsec and configuration' and 'connectivity'. Because your device is not from a Virtual WAN partner ecosystem, you will need to do the heavy lifting of manually taking the Azure configuration and updating your device to set up IPsec connectivity.
+Yes, as long as the device supports IPsec IKEv1 or IKEv2. Virtual WAN partners automate connectivity from the device to Azure VPN endpoints. This implies automating steps such as 'branch information upload', 'IPsec and configuration', and 'connectivity'. Because your device isn't from a Virtual WAN partner ecosystem, you'll need to do the heavy lifting of manually taking the Azure configuration and updating your device to set up IPsec connectivity.
-### How do new partners that are not listed in your launch partner list get onboarded?
+### How do new partners that aren't listed in your launch partner list get onboarded?
-All virtual WAN APIs are open API. You can go over the documentation [Virtual WAN partner automation](virtual-wan-configure-automation-providers.md) to assess technical feasibility. An ideal partner is one that has a device that can be provisioned for IKEv1 or IKEv2 IPsec connectivity. Once the company has completed the automation work for their CPE device based on the automation guidelines provided above, you can reach out to azurevirtualwan@microsoft.com to be listed here [Connectivity through partners](virtual-wan-locations-partners.md#partners). If you are a customer that would like a certain company solution to be listed as a Virtual WAN partner, have the company contact the Virtual WAN by sending an email to azurevirtualwan@microsoft.com.
+All virtual WAN APIs are open API. You can go over the [Virtual WAN partner automation](virtual-wan-configure-automation-providers.md) documentation to assess technical feasibility. An ideal partner is one that has a device that can be provisioned for IKEv1 or IKEv2 IPsec connectivity. Once the company has completed the automation work for their CPE device based on the automation guidelines provided above, you can reach out to azurevirtualwan@microsoft.com to be listed in [Connectivity through partners](virtual-wan-locations-partners.md#partners). If you're a customer that would like a certain company solution to be listed as a Virtual WAN partner, have the company contact the Virtual WAN team by sending an email to azurevirtualwan@microsoft.com.
### How is Virtual WAN supporting SD-WAN devices?
-Virtual WAN partners automate IPsec connectivity to Azure VPN end points. If the Virtual WAN partner is an SD-WAN provider, then it is implied that the SD-WAN controller manages automation and IPsec connectivity to Azure VPN end points. If the SD-WAN device requires its own end point instead of Azure VPN for any proprietary SD-WAN functionality, you can deploy the SD-WAN end point in an Azure VNet and coexist with Azure Virtual WAN.
+Virtual WAN partners automate IPsec connectivity to Azure VPN end points. If the Virtual WAN partner is an SD-WAN provider, then it's implied that the SD-WAN controller manages automation and IPsec connectivity to Azure VPN end points. If the SD-WAN device requires its own end point instead of Azure VPN for any proprietary SD-WAN functionality, you can deploy the SD-WAN end point in an Azure VNet and coexist with Azure Virtual WAN.
### How many VPN devices can connect to a single hub?
Up to 1,000 connections are supported per virtual hub. Each connection consists
### What is a branch connection to Azure Virtual WAN?
-A connection from a branch or VPN device into Azure Virtual WAN is a VPN connection that connects virtually the VPN Site and the Azure VPN Gateway in a virtual hub.
+A connection from a branch or VPN device into Azure Virtual WAN is a VPN connection that virtually connects the VPN site and the Azure VPN gateway in a virtual hub.
### What happens if the on-premises VPN device only has 1 tunnel to an Azure Virtual WAN VPN gateway?
-An Azure Virtual WAN connection is composed of 2 tunnels. A Virtual WAN VPN gateway is deployed in a virtual hub in active-active mode, which implies that there are separate tunnels from on-premises devices terminating on separate instances. This is the recommendation for all users. However, if the user chooses to only have 1 tunnel to one of the Virtual WAN VPN gateway instances, if for any reason (maintenance, patches etc.) the gateway instance is taken offline, the tunnel will be moved to the secondary active instance and the user may experience a reconnect. BGP sessions will not move across instances.
+An Azure Virtual WAN connection is composed of 2 tunnels. A Virtual WAN VPN gateway is deployed in a virtual hub in active-active mode, which implies that there are separate tunnels from on-premises devices terminating on separate instances. This is the recommendation for all users. However, if the user chooses to have only 1 tunnel to one of the Virtual WAN VPN gateway instances and that gateway instance is taken offline for any reason (maintenance, patches, etc.), the tunnel will be moved to the secondary active instance and the user may experience a reconnect. BGP sessions won't move across instances.
-### What happens during a Gateway Reset in a Virtual WAN VPN Gateway?
+### What happens during a Gateway Reset in a Virtual WAN VPN gateway?
-The Gateway Reset button should be used if your on-premises devices are all working as expected but Site to Site VPN connections in Azure are in a Disconnected state. Virtual WAN VPN Gateways are always deployed in an Active-Active state for high availability. This means there is always more than one instance deployed in a VPN Gateway at any point of time. When the Gateway Reset button is used, it reboots the instances in the VPN Gateway in a sequential manner, so your connections are not disrupted. There will be a brief gap as connections move from one instance to the other, but this gap should be less than a minute. Additionally, please note that resetting the gateways will not change your Public IPs.
+The Gateway Reset button should be used if your on-premises devices are all working as expected, but the site-to-site VPN connection in Azure is in a Disconnected state. Virtual WAN VPN gateways are always deployed in an Active-Active state for high availability. This means there's always more than one instance deployed in a VPN gateway at any point of time. When the Gateway Reset button is used, it reboots the instances in the VPN gateway in a sequential manner so your connections aren't disrupted. There will be a brief gap as connections move from one instance to the other, but this gap should be less than a minute. Additionally, note that resetting the gateways won't change your Public IPs.
### Can the on-premises VPN device connect to multiple hubs?
Yes. Traffic flow, when commencing, is from the on-premises device to the closest Microsoft network edge, and then to the virtual hub.
### Are there new Resource Manager resources available for Virtual WAN?
-
-Yes, Virtual WAN has new Resource Manager resources. For more information, please see the [Overview](virtual-wan-about.md).
+
+Yes, Virtual WAN has new Resource Manager resources. For more information, see the [Overview](virtual-wan-about.md).
### Can I deploy and use my favorite network virtual appliance (in an NVA VNet) with Azure Virtual WAN?
Yes, you can connect your favorite network virtual appliance (NVA) VNet to the A
### Can I create a Network Virtual Appliance inside the virtual hub?
-A Network Virtual Appliance (NVA) cannot be deployed inside a virtual hub. However, you can create it in a spoke VNet that is connected to the virtual hub and enable appropriate routing to direct traffic per your needs.
+A Network Virtual Appliance (NVA) can't be deployed inside a virtual hub. However, you can create it in a spoke VNet that is connected to the virtual hub and enable appropriate routing to direct traffic per your needs.
### Can a spoke VNet have a virtual network gateway?
-No. The spoke VNet cannot have a virtual network gateway if it is connected to the virtual hub.
+No. The spoke VNet can't have a virtual network gateway if it's connected to the virtual hub.
### Is there support for BGP in VPN connectivity?
In some scenarios, spoke VNets can also be directly peered with each other using
### Is branch-to-branch connectivity allowed in Virtual WAN?
-Yes, branch-to-branch connectivity is available in Virtual WAN. Branch is conceptually applicable to VPN Site, ExpressRoute circuits, or Point-to-Site/User VPN users. Enabling branch-to-branch is enabled by default and can be located in WAN **Configuration** settings. This lets VPN branches/users connect to other VPN branches and transit connectivity is also enabled between VPN and ExpressRoute users.
+Yes, branch-to-branch connectivity is available in Virtual WAN. Branch is conceptually applicable to VPN site, ExpressRoute circuits, or point-to-site/User VPN users. Branch-to-branch is enabled by default and can be located in the WAN **Configuration** settings. This lets VPN branches/users connect to other VPN branches, and transit connectivity is also enabled between VPN and ExpressRoute users.
### Does branch-to-branch traffic traverse through the Azure Virtual WAN?
Yes. Branch-to-branch traffic traverses through Azure Virtual WAN.
### Does Virtual WAN require ExpressRoute from each site?
-No. Virtual WAN does not require ExpressRoute from each site. Your sites may be connected to a provider network using an ExpressRoute circuit. For sites that are connected using ExpressRoute to a virtual hub and IPsec VPN into the same hub, virtual hub provides transit connectivity between the VPN and ExpressRoute user.
+No. Virtual WAN doesn't require ExpressRoute from each site. Your sites may be connected to a provider network using an ExpressRoute circuit. For sites that are connected using ExpressRoute to a virtual hub and IPsec VPN into the same hub, virtual hub provides transit connectivity between the VPN and ExpressRoute user.
### Is there a network throughput or connection limit when using Azure Virtual WAN?
-Network throughput is per service in a virtual WAN hub. In each hub, the VPN aggregate throughput is up to 20 Gbps, the ExpressRoute aggregate throughput is up to 20 Gbps, and the User VPN/Point-to-site VPN aggregate throughput is up to 20 Gbps. The router in virtual hub supports up to 50 Gbps for VNet-to-VNet traffic flows and assumes a total of 2000 VM workload across all VNets connected to a single virtual hub. This [limit](../azure-resource-manager/management/azure-subscription-service-limits.md#virtual-wan-limits) can be increased opening an online customer support request. For cost implication, see *Routing Infrastructure Unit* cost in the [Azure Virtual WAN Pricing](https://azure.microsoft.com/pricing/details/virtual-wan/) page.
+Network throughput is per service in a virtual WAN hub. In each hub, the VPN aggregate throughput is up to 20 Gbps, the ExpressRoute aggregate throughput is up to 20 Gbps, and the User VPN/point-to-site VPN aggregate throughput is up to 20 Gbps. The router in the virtual hub supports up to 50 Gbps for VNet-to-VNet traffic flows and assumes a total of 2000 VM workload across all VNets connected to a single virtual hub. This [limit](../azure-resource-manager/management/azure-subscription-service-limits.md#virtual-wan-limits) can be increased by opening an online customer support request. For cost implications, see *Routing Infrastructure Unit* cost in the [Azure Virtual WAN Pricing](https://azure.microsoft.com/pricing/details/virtual-wan/) page.
When VPN sites connect into a hub, they do so with connections. Virtual WAN supports up to 1000 connections or 2000 IPsec tunnels per virtual hub. When remote users connect into the virtual hub, they connect to the P2S VPN gateway, which supports up to 100,000 users depending on the scale unit (bandwidth) chosen for the P2S VPN gateway in the virtual hub.
On-premises device solutions can apply traffic policies to steer traffic across
### What is global transit architecture?
-For information about global transit architecture, see [Global transit network architecture and Virtual WAN](virtual-wan-global-transit-network-architecture.md).
+For information, see [Global transit network architecture and Virtual WAN](virtual-wan-global-transit-network-architecture.md).
### How is traffic routed on the Azure backbone?
The traffic follows the pattern: branch device ->ISP->Microsoft network edge->Mi
Yes. An internet connection and physical device that supports IPsec, preferably from our integrated [Virtual WAN partners](virtual-wan-locations-partners.md). Optionally, you can manually manage the configuration and connectivity to Azure from your preferred device.
-### How do I enable default route (0.0.0.0/0) for a connection (VPN, ExpressRoute, or Virtual Network)?
+### How do I enable default route (0.0.0.0/0) for a connection (VPN, ExpressRoute, or virtual network)?
-A virtual hub can propagate a learned default route to a virtual network/site-to-site VPN/ExpressRoute connection if the flag is 'Enabled' on the connection. This flag is visible when the user edits a virtual network connection, a VPN connection, or an ExpressRoute connection. By default, this flag is disabled when a site or an ExpressRoute circuit is connected to a hub. It is enabled by default when a virtual network connection is added to connect a VNet to a virtual hub.
+A virtual hub can propagate a learned default route to a virtual network/site-to-site VPN/ExpressRoute connection if the flag is 'Enabled' on the connection. This flag is visible when the user edits a virtual network connection, a VPN connection, or an ExpressRoute connection. By default, this flag is disabled when a site or an ExpressRoute circuit is connected to a hub. It's enabled by default when a virtual network connection is added to connect a VNet to a virtual hub.
-The default route does not originate in the Virtual WAN hub; the default route is propagated if it is already learned by the Virtual WAN hub as a result of deploying a firewall in the hub, or if another connected site has forced-tunneling enabled. A default route does not propagate between hubs (inter-hub).
+The default route doesn't originate in the Virtual WAN hub; the default route is propagated if it's already learned by the Virtual WAN hub as a result of deploying a firewall in the hub, or if another connected site has forced-tunneling enabled. A default route doesn't propagate between hubs (inter-hub).
### Is it possible to create multiple virtual WAN hubs in the same region?
-Yes. Customers can now create more than one hub in the same region for the same Azure Virtual WAN.
+Yes. Customers can now create more than one hub in the same region for the same Azure Virtual WAN.
### How does the virtual hub in a virtual WAN select the best path for a route from multiple hubs?
If a virtual hub learns the same route from multiple remote hubs, the order in w
### Does the Virtual WAN hub allow connectivity between ExpressRoute circuits?
-Transit between ER-to-ER is always via Global reach. Virtual hub gateways are deployed in DC or Azure regions. When two ExpressRoute circuits connect via Global reach, there is no need for the traffic to come all the way from the edge routers to the virtual hub DC.
+Transit between ER-to-ER is always via Global reach. Virtual hub gateways are deployed in DC or Azure regions. When two ExpressRoute circuits connect via Global reach, there's no need for the traffic to come all the way from the edge routers to the virtual hub DC.
### Is there a concept of weight in Azure Virtual WAN ExpressRoute circuits or VPN connections?
When multiple ExpressRoute circuits are connected to a virtual hub, routing weig
Yes. Virtual WAN prefers ExpressRoute over VPN for traffic egressing Azure.
-### When a Virtual WAN hub has an ExpressRoute circuit and a VPN Site connected to it, what would cause a VPN connection route to be preferred over ExpressRoute?
+### When a Virtual WAN hub has an ExpressRoute circuit and a VPN site connected to it, what would cause a VPN connection route to be preferred over ExpressRoute?
When an ExpressRoute circuit is connected to virtual hub, the Microsoft edge routers are the first node for communication between on-premises and Azure. These edge routers communicate with the Virtual WAN ExpressRoute gateways that, in turn, learn routes from the virtual hub router that controls all routes between any gateways in Virtual WAN. The Microsoft edge routers process virtual hub ExpressRoute routes with higher preference over routes learned from on-premises.
-For any reason, if the VPN connection becomes the primary medium for the virtual hub to learn routes from (e.g failover scenarios between ExpressRoute and VPN), unless the VPN Site has a longer AS Path length, the virtual hub will continue to share VPN learned routes with the ExpressRoute gateway. This causes the Microsoft edge routers to prefer VPN routes over on-premises routes.
+If, for any reason, the VPN connection becomes the primary medium for the virtual hub to learn routes from (for example, in failover scenarios between ExpressRoute and VPN), then unless the VPN site has a longer AS path length, the virtual hub will continue to share VPN-learned routes with the ExpressRoute gateway. This causes the Microsoft edge routers to prefer VPN routes over on-premises routes.
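The AS path behavior described above can be sketched in a few lines. This is an illustrative model of standard BGP path selection, not Virtual WAN's actual implementation; the ASNs and route dictionaries are hypothetical.

```python
# Illustrative sketch (not Virtual WAN's implementation): BGP prefers the
# route with the shortest AS path, so prepending the VPN site's ASN makes
# its advertisements less preferred than the ExpressRoute path.

def best_route(routes):
    """Pick the route with the shortest AS path; ties keep the first seen."""
    return min(routes, key=lambda r: len(r["as_path"]))

routes = [
    {"via": "ExpressRoute", "as_path": [65001]},
    # Hypothetical VPN site that prepends its ASN twice, lengthening its path
    {"via": "VPN", "as_path": [65002, 65002, 65002]},
]

print(best_route(routes)["via"])  # ExpressRoute
```

Without the prepending, both paths would be equally long and the VPN-learned route could win at the Microsoft edge routers, as the answer above explains.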
-### <a name="expressroute-bow-tie"></a>When two hubs (hub 1 and 2) are connected and there is an ExpressRoute circuit connected as a bow-tie to both the hubs, what is the path for a VNet connected to hub 1 to reach a VNet connected in hub 2?
+### <a name="expressroute-bow-tie"></a>When two hubs (hub 1 and 2) are connected and there's an ExpressRoute circuit connected as a bow-tie to both the hubs, what is the path for a VNet connected to hub 1 to reach a VNet connected in hub 2?
-The current behavior is to prefer the ExpressRoute circuit path over hub-to-hub for VNet-to-VNet connectivity. However, this is not encouraged in a Virtual WAN setup. To resolve this, you can do one of two things:
+The current behavior is to prefer the ExpressRoute circuit path over hub-to-hub for VNet-to-VNet connectivity. However, this isn't encouraged in a Virtual WAN setup. To resolve this, you can do one of two things:
- * Configure multiple ExpressRoute circuits (different providers) to connect to one hub and use the hub-to-hub connectivity provided by Virtual WAN for inter-region traffic flows.
+* Configure multiple ExpressRoute circuits (different providers) to connect to one hub and use the hub-to-hub connectivity provided by Virtual WAN for inter-region traffic flows.
- * Contact the product team to take part in the gated public preview. In this preview, traffic between the 2 hubs traverses through the Azure Virtual WAN router in each hub and uses a hub-to-hub path instead of the ExpressRoute path (which traverses through the Microsoft edge routers/MSEE). To use this feature during preview, email **previewpreferh2h@microsoft.com** with the Virtual WAN IDs, Subscription ID, and the Azure region. Expect a response within 48 business hours (Monday-Friday) with confirmation that the feature is enabled.
+* Contact the product team to take part in the gated public preview. In this preview, traffic between the two hubs traverses through the Azure Virtual WAN router in each hub and uses a hub-to-hub path instead of the ExpressRoute path (which traverses through the Microsoft edge routers/MSEE). To use this feature during preview, email **previewpreferh2h@microsoft.com** with the Virtual WAN IDs, Subscription ID, and the Azure region. Expect a response within 48 business hours (Monday-Friday) with confirmation that the feature is enabled.
### Can hubs be created in different resource group in Virtual WAN?
Yes. This option is currently available via PowerShell only. The Virtual WAN por
### What is the recommended hub address space during hub creation?
-The recommended Virtual WAN hub address space is /23. Virtual WAN hub assigns subnets to various gateways (ExpressRoute, Site-to-site VPN, Point-to-site VPN, Azure Firewall, Virtual hub Router). For scenarios where NVAs are deployed inside a virtual hub, a /28 is typically carved out for the NVA instances. However if the user were to provision multiple NVAs, a /27 subnet may be assigned. Therefore keeping a future architecture in mind, while Virtual WAN hubs are deployed with a minimum size of /24, the recommended hub address space at creation time for user to input is /23.
+The recommended Virtual WAN hub address space is /23. The Virtual WAN hub assigns subnets to various gateways (ExpressRoute, site-to-site VPN, point-to-site VPN, Azure Firewall, virtual hub router). For scenarios where NVAs are deployed inside a virtual hub, a /28 is typically carved out for the NVA instances. However, if the user provisions multiple NVAs, a /27 subnet may be assigned. Therefore, keeping future architecture in mind, while Virtual WAN hubs are deployed with a minimum size of /24, the recommended hub address space for users to enter at creation time is /23.
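The subnet arithmetic behind the /23 recommendation can be checked with Python's standard `ipaddress` module. This is just a sketch of the math; the 10.1.0.0/23 prefix is a hypothetical example, and the actual subnet layout inside a hub is managed by the service.

```python
# Sketch of why a /23 is recommended: the hub minimum is a /24, and a /23
# leaves a second /24 of headroom for NVA subnets (/28 per instance, /27
# for multiple NVAs) and future gateways. Prefix values are hypothetical.
import ipaddress

hub_space = ipaddress.ip_network("10.1.0.0/23")
hub, spare = hub_space.subnets(new_prefix=24)  # two /24s inside a /23

print(hub)    # 10.1.0.0/24 - minimum hub deployment size
print(spare)  # 10.1.1.0/24 - headroom for NVAs and growth
print(list(spare.subnets(new_prefix=27))[0])  # a /27 carved for multiple NVAs
```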
-### Can you resize or change the address prefixes of a spoke Virtual Network connected to the Virtual WAN Hub?
+### Can you resize or change the address prefixes of a spoke virtual network connected to the Virtual WAN hub?
-No. This is currently not possible. To change the address prefixes of a spoke Virtual Network, please remove the connection between the spoke Virtual Network and the Virtual WAN hub, modify the address spaces of the spoke Virtual Network, and then re-create the connection between the spoke Virtual Network and the Virtual WAN Hub.
+No. This is currently not possible. To change the address prefixes of a spoke virtual network, remove the connection between the spoke virtual network and the Virtual WAN hub, modify the address spaces of the spoke virtual network, and then re-create the connection between the spoke virtual network and the Virtual WAN hub.
### Is there support for IPv6 in Virtual WAN?
-IPv6 is not supported in the Virtual WAN hub and its gateways. If you have a VNet that has IPv4 and IPv6 support and you would like to connect the VNet to Virtual WAN, this scenario not currently supported.
+IPv6 isn't supported in the Virtual WAN hub and its gateways. If you have a VNet that has IPv4 and IPv6 support and you would like to connect the VNet to Virtual WAN, this scenario isn't currently supported.
-For the point-to-site User VPN scenario with internet breakout via Azure Firewall, you will likely have to turn off IPv6 connectivity on your client device to force traffic to the Virtual WAN hub. This is because modern devices, by default, use IPv6 addresses.
+For the point-to-site User VPN scenario with internet breakout via Azure Firewall, you'll likely have to turn off IPv6 connectivity on your client device to force traffic to the Virtual WAN hub. This is because modern devices, by default, use IPv6 addresses.
### What is the recommended API version to be used by scripts automating various Virtual WAN functionalities?
See [Basic and Standard Virtual WANs](virtual-wan-about.md#basicstandard). For p
### Does Virtual WAN store customer data?
-No. Virtual WAN does not store any customer data.
+No. Virtual WAN doesn't store any customer data.
### Are there any Managed Service Providers that can manage Virtual WAN for users as a service?
Yes. For a list of Managed Service Provider (MSP) solutions enabled via Azure Marketplace, see [Azure Marketplace offers by Azure Networking MSP partners](../networking/networking-partners-msp.md#msp).
-### How does Virtual WAN Hub routing differ from Azure Route Server in a VNet?
+### How does Virtual WAN hub routing differ from Azure Route Server in a VNet?
-Both Azure Virtual WAN hub and Azure Route Server provide Border Gateway Protocol (BGP) peering capabilities that can be utilized by NVAs (Network Virtual Appliance) to advertise IP addresses from the NVA to the user's Azure virtual networks. The deployment options differ in the sense that Azure Route Server is typically deployed by a self-managed customer hub VNet whereas Azure Virtual WAN provides a zero-touch fully meshed hub service to which customers connect their various spokes end points (Azure VNET, on-premise branches with Site-to-site VPN or SDWAN, remote users with Point-to-site/Remote User VPN and Private connections with ExpressRoute) and enjoy BGP Peering for NVAs deployed in spoke VNET along with other vWAN capabilities such as transit connectivity for VNet-to-VNet, transit connectivity between VPN and ExpressRoute, custom/advanced routing, custom route association and propagation, routing intent/policies for no hassle inter-region security, Secure Hub/Azure firewall etc. For more details about Virtual WAN BGP Peering, please see [How to peer BGP with a virtual hub](scenario-bgp-peering-hub.md).
+Both the Azure Virtual WAN hub and Azure Route Server provide Border Gateway Protocol (BGP) peering capabilities that network virtual appliances (NVAs) can use to advertise IP addresses from the NVA to the user's Azure virtual networks. The deployment options differ: Azure Route Server is typically deployed in a self-managed customer hub VNet, whereas Azure Virtual WAN provides a zero-touch, fully meshed hub service to which customers connect their various spoke endpoints (Azure VNets, on-premises branches with site-to-site VPN or SD-WAN, remote users with point-to-site/remote user VPN, and private connections with ExpressRoute). Customers get BGP peering for NVAs deployed in spoke VNets along with other Virtual WAN capabilities, such as transit connectivity for VNet-to-VNet, transit connectivity between VPN and ExpressRoute, custom/advanced routing, custom route association and propagation, routing intent/policies for no-hassle inter-region security, and Secure Hub/Azure Firewall. For more information about Virtual WAN BGP peering, see [How to peer BGP with a virtual hub](scenario-bgp-peering-hub.md).
-### If I am using a third-party security provider (Zscaler, iBoss or Checkpoint) to secure my internet traffic, why don't I see the VPN site associated to the third-party security provider in the Azure portal?
+### If I'm using a third-party security provider (Zscaler, iBoss or Checkpoint) to secure my internet traffic, why don't I see the VPN site associated to the third-party security provider in the Azure portal?
-When you choose to deploy a security partner provider to protect Internet access for your users, the third-party security provider creates a VPN site on your behalf. Because the third-party security provider is created automatically by the provider and is not a user-created VPN site, this VPN site will not show up in the Azure portal.
+When you choose to deploy a security partner provider to protect Internet access for your users, the third-party security provider creates a VPN site on your behalf. Because the third-party security provider is created automatically by the provider and isn't a user-created VPN site, this VPN site won't show up in the Azure portal.
For more information about the available third-party security provider options and how to set them up, see [Deploy a security partner provider](../firewall-manager/deploy-trusted-security-partner.md).
### Will BGP communities generated by on-premises be preserved in Virtual WAN?
Yes, BGP communities generated by on-premises will be preserved in Virtual WAN. You can use your own public ASNs or private ASNs for your on-premises networks. You can't use the ranges reserved by Azure or IANA:
- * ASNs reserved by Azure:
- * Public ASNs: 8074, 8075, 12076
- * Private ASNs: 65515, 65517, 65518, 65519, 65520
- * ASNs reserved by IANA: 23456, 64496-64511, 65535-65551
+
+* ASNs reserved by Azure:
+ * Public ASNs: 8074, 8075, 12076
+ * Private ASNs: 65515, 65517, 65518, 65519, 65520
+ * ASNs reserved by IANA: 23456, 64496-64511, 65535-65551
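The reserved ranges listed above are easy to encode as a quick validity check. The helper below is hypothetical (not an Azure API); it simply tests membership in the Azure- and IANA-reserved sets from the answer.

```python
# Hypothetical helper (not an Azure API) that checks whether an ASN falls
# inside the ranges reserved by Azure or IANA listed above.
AZURE_PUBLIC = {8074, 8075, 12076}
AZURE_PRIVATE = {65515, 65517, 65518, 65519, 65520}
IANA_RESERVED = {23456} | set(range(64496, 64512)) | set(range(65535, 65552))

def asn_is_usable(asn: int) -> bool:
    """Return True if the ASN isn't in an Azure- or IANA-reserved range."""
    return asn not in (AZURE_PUBLIC | AZURE_PRIVATE | IANA_RESERVED)

print(asn_is_usable(65010))  # True: a typical private ASN
print(asn_is_usable(65515))  # False: reserved by Azure
```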
### In Virtual WAN, what are the estimated performances by ExpressRoute gateway SKU?
Yes, BGP communities generated by on-premises will be preserved in Virtual WAN.
### Why am I seeing a message and button called "Update router to latest software version" in portal?
-The Virtual WAN team has been working on upgrading virtual routers from their current Cloud Services infrastructure to Virtual Machine Scale Sets (VMSS) based deployments. This will enable the virtual hub router to now be availability zone aware and have enhanced scaling out capabilities during high CPU usage. If you navigate to your Virtual WAN hub resource and see this message and button, then you can upgrade your router to the lastest version by clicking on the button. The Cloud Services infrastructure will be deprecated soon. If you would like to take advantage of new Virtual WAN features, such as [BGP peering with the hub](create-bgp-peering-hub-portal.md), you will have to update your virtual hub router via Azure Portal.
+The Virtual WAN team has been working on upgrading virtual routers from their current Cloud Services infrastructure to Virtual Machine Scale Sets (VMSS) based deployments. This upgrade enables the virtual hub router to be availability zone aware and to scale out during high CPU usage. If you navigate to your Virtual WAN hub resource and see this message and button, you can upgrade your router to the latest version by clicking the button. The Cloud Services infrastructure will be deprecated soon. If you would like to take advantage of new Virtual WAN features, such as [BGP peering with the hub](create-bgp-peering-hub-portal.md), you'll have to update your virtual hub router using the Azure portal.
-Note that you'll only be able to update your virtual hub router if all the resources (gateways/route tables/VNET connections) in your hub are in a succeeded state. Additionally, as this operation requires deployment of new VMSS based virtual hub routers, you'll face an expected downtime of 30 minutes per hub. Within a single Virtual WAN resource, hubs should be updated one at a time instead of updating multiple at the same time. When the Router Version says "Latest", then the hub is done updating. There will be no routing behavior changes after this update. If the update fails for any reason, your hub will be auto recovered to the old version to ensure there is still a working setup.
+You'll only be able to update your virtual hub router if all the resources (gateways/route tables/VNet connections) in your hub are in a succeeded state. Additionally, as this operation requires deployment of new VMSS based virtual hub routers, you'll face an expected downtime of 30 minutes per hub. Within a single Virtual WAN resource, hubs should be updated one at a time instead of updating multiple at the same time. When the Router Version says "Latest", the hub is done updating. There will be no routing behavior changes after this update. If the update fails for any reason, your hub will be auto recovered to the old version to ensure there is still a working setup.
### Is there a route limit for OpenVPN clients connecting to an Azure P2S VPN gateway?
-The route limit for OpenVPN clients is 1000.
+The route limit for OpenVPN clients is 1000.
### How is Virtual WAN SLA calculated?
-Virtual WAN is a networking-as-a-service platform that has a 99.95% SLA. However, Virtual WAN combines many different components such as Azure Firewall, Site-to-site VPN, ExpressRoute, Point-to-site VPN, and Virtual WAN Hub/Integrated Network Virtual Appliances.
+Virtual WAN is a networking-as-a-service platform that has a 99.95% SLA. However, Virtual WAN combines many different components such as Azure Firewall, site-to-site VPN, ExpressRoute, point-to-site VPN, and Virtual WAN Hub/Integrated Network Virtual Appliances.
-The SLA for each component is calculated individually. For example, if ExpressRoute has a 10 minute downtime, the availability of ExpressRoute would be calculated as (Maximum Available Minutes - downtime) / Maximum Available Minutes * 100.
+The SLA for each component is calculated individually. For example, if ExpressRoute has a 10 minute downtime, the availability of ExpressRoute would be calculated as (Maximum Available Minutes - downtime) / Maximum Available Minutes * 100.
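The per-component availability formula in the answer above works out as follows; the 30-day month (43,200 maximum available minutes) is an assumed example value.

```python
# The availability formula from the answer above, as a one-line sketch:
# (Maximum Available Minutes - downtime) / Maximum Available Minutes * 100
def availability(max_minutes: int, downtime: int) -> float:
    return (max_minutes - downtime) / max_minutes * 100

# Example: 10 minutes of ExpressRoute downtime in an assumed 30-day month
# (30 days * 24 hours * 60 minutes = 43,200 maximum available minutes).
print(round(availability(43_200, 10), 3))  # 99.977
```

So a single 10-minute ExpressRoute outage in that month still leaves the component above its individual SLA, even though each Virtual WAN component's SLA is tracked separately.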
## Next steps