Updates from: 10/20/2023 01:19:43
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Troubleshoot With Application Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/troubleshoot-with-application-insights.md
For more information about querying, see [Overview of log queries in Azure Monit
We recommend that you install the [Azure AD B2C extension](https://marketplace.visualstudio.com/items?itemName=AzureADB2CTools.aadb2c) for [VS Code](https://code.visualstudio.com/). With the Azure AD B2C extension, the logs are organized for you by policy name, correlation ID (Application Insights presents the first digit of the correlation ID), and log timestamp. This feature helps you find the relevant log based on the local timestamp and see the user journey as executed by Azure AD B2C. > [!NOTE]
-> The community has developed the vs code extension for Azure AD B2C to help identity developers. The extension is not supported by Microsoft, and is made available strictly as-is.
+> The community has developed the VS Code extension to help people implementing and maintaining Azure AD B2C solutions. The extension is not supported by Microsoft, and is made available strictly as-is.
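The grouping the extension performs can be approximated in a few lines. A minimal sketch, assuming hypothetical log records; the field names are illustrative, not the exact Application Insights schema:

```python
from collections import defaultdict

# Hypothetical log records of the kind an Application Insights query might
# return for B2C user journeys (illustrative field names only).
logs = [
    {"policy": "B2C_1A_signup_signin",
     "correlationId": "a1b2c3d4-0000-0000-0000-000000000000",
     "timestamp": "2023-10-19T08:16:00Z"},
    {"policy": "B2C_1A_signup_signin",
     "correlationId": "a1b2c3d4-0000-0000-0000-000000000000",
     "timestamp": "2023-10-19T08:15:00Z"},
]

def organize(entries):
    """Group entries by (policy name, first segment of the correlation ID)
    and sort each group by timestamp, roughly mirroring the extension's view."""
    grouped = defaultdict(list)
    for entry in entries:
        key = (entry["policy"], entry["correlationId"].split("-")[0])
        grouped[key].append(entry)
    for group in grouped.values():
        group.sort(key=lambda e: e["timestamp"])
    return dict(grouped)

journeys = organize(logs)
```

Each resulting group then reads as one user journey in timestamp order.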
### Set Application Insights API access
active-directory On Premises Scim Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/on-premises-scim-provisioning.md
# Microsoft Entra on-premises application provisioning to SCIM-enabled apps
-The Microsoft Entra provisioning service supports a [SCIM 2.0](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/provisioning-with-scim-getting-started/ba-p/880010) client that can be used to automatically provision users into cloud or on-premises applications. This article outlines how you can use the Microsoft Entra provisioning service to provision users into an on-premises application that's SCIM enabled. If you want to provision users into non-SCIM on-premises applications that use SQL as a data store, see the [Microsoft Entra ECMA Connector Host Generic SQL Connector tutorial](tutorial-ecma-sql-connector.md). If you want to provision users into cloud apps such as DropBox and Atlassian, review the app-specific [tutorials](../../active-directory/saas-apps/tutorial-list.md).
+The Microsoft Entra provisioning service supports a [SCIM 2.0](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/provisioning-with-scim-getting-started/ba-p/880010) client that can be used to automatically provision users into cloud or on-premises applications. This article outlines how you can use the Microsoft Entra provisioning service to provision users into an on-premises application that's SCIM enabled. If you want to provision users into non-SCIM on-premises applications that use SQL as a data store, see the [Microsoft Entra ECMA Connector Host Generic SQL Connector tutorial](tutorial-ecma-sql-connector.md). If you want to provision users into cloud apps such as DropBox and Atlassian, review the app-specific [tutorials](../saas-apps/tutorial-list.md).
![Diagram that shows SCIM architecture.](./media/on-premises-scim-provisioning/scim-4.png)
Once the agent is installed, no further configuration is necessary on-premises,
1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator). 1. Browse to **Identity** > **Applications** > **Enterprise applications**.
-1. Add the **On-premises SCIM app** from the [gallery](../../active-directory/manage-apps/add-application-portal.md).
+1. Add the **On-premises SCIM app** from the [gallery](../manage-apps/add-application-portal.md).
1. From the left-hand menu, navigate to the **Provisioning** option and select **Get started**. 1. Select **Automatic** from the dropdown list and expand the **On-Premises Connectivity** option. 1. Select the agent that you installed from the dropdown list and select **Assign Agent(s)**.
Once the agent is installed, no further configuration is necessary on-premises,
> If the test connection fails, you will see the request made. Please note that while the URL in the test connection error message is truncated, the actual request sent to the application contains the entire URL provided above. 1. Configure any [attribute mappings](customize-application-attributes.md) or [scoping](define-conditional-rules-for-provisioning-user-accounts.md) rules required for your application.
-1. Add users to scope by [assigning users and groups](../../active-directory/manage-apps/add-application-portal-assign-users.md) to the application.
+1. Add users to scope by [assigning users and groups](../manage-apps/add-application-portal-assign-users.md) to the application.
1. Test provisioning a few users [on demand](provision-on-demand.md). 1. Add more users into scope by assigning them to your application. 1. Go to the **Provisioning** pane, and select **Start provisioning**.
-1. Monitor using the [provisioning logs](../../active-directory/reports-monitoring/concept-provisioning-logs.md).
+1. Monitor using the [provisioning logs](../reports-monitoring/concept-provisioning-logs.md).
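For reference, the requests the provisioning service sends to the application's SCIM endpoint carry user resources in the SCIM 2.0 core schema. A minimal sketch of such a payload; the attributes shown are from the RFC 7643 core schema, but the actual set depends on your attribute mappings:

```python
import json

def scim_user(user_principal_name, display_name, active=True):
    """Build a minimal SCIM 2.0 user resource (RFC 7643 core schema), of the
    shape a provisioning client POSTs to an application's /Users endpoint."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_principal_name,
        "displayName": display_name,
        "active": active,
    }

# Example payload for a hypothetical user.
payload = json.dumps(scim_user("adele@contoso.com", "Adele Vance"))
```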
The following video provides an overview of on-premises provisioning.
active-directory User Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/user-provisioning.md
In Microsoft Entra ID, the term *app provisioning* refers to automatically creat
![Diagram that shows provisioning scenarios.](../governance/media/what-is-provisioning/provisioning.png)
-Microsoft Entra application provisioning refers to automatically creating user identities and roles in the applications that users need access to. In addition to creating user identities, automatic provisioning includes the maintenance and removal of user identities as status or roles change. Common scenarios include provisioning a Microsoft Entra user into SaaS applications like [Dropbox](../../active-directory/saas-apps/dropboxforbusiness-provisioning-tutorial.md), [Salesforce](../../active-directory/saas-apps/salesforce-provisioning-tutorial.md), [ServiceNow](../../active-directory/saas-apps/servicenow-provisioning-tutorial.md), and many more.
+Microsoft Entra application provisioning refers to automatically creating user identities and roles in the applications that users need access to. In addition to creating user identities, automatic provisioning includes the maintenance and removal of user identities as status or roles change. Common scenarios include provisioning a Microsoft Entra user into SaaS applications like [Dropbox](../saas-apps/dropboxforbusiness-provisioning-tutorial.md), [Salesforce](../saas-apps/salesforce-provisioning-tutorial.md), [ServiceNow](../saas-apps/servicenow-provisioning-tutorial.md), and many more.
Microsoft Entra ID also supports provisioning users into applications hosted on-premises or in a virtual machine, without having to open up any firewalls. The table below provides a mapping of protocols to connectors supported.
To help automate provisioning and deprovisioning, apps expose proprietary user a
To address these challenges, the System for Cross-domain Identity Management (SCIM) specification provides a common user schema to help users move into, out of, and around apps. SCIM is becoming the de facto standard for provisioning and, when used with federation standards like Security Assertions Markup Language (SAML) or OpenID Connect (OIDC), provides administrators an end-to-end standards-based solution for access management.
-For detailed guidance on developing a SCIM endpoint to automate the provisioning and deprovisioning of users and groups to an application, see [Build a SCIM endpoint and configure user provisioning](use-scim-to-provision-users-and-groups.md). Many applications integrate directly with Microsoft Entra ID. Some examples include Slack, Azure Databricks, and Snowflake. For these apps, skip the developer documentation and use the tutorials provided in [Tutorials for integrating SaaS applications with Microsoft Entra ID](../../active-directory/saas-apps/tutorial-list.md).
+For detailed guidance on developing a SCIM endpoint to automate the provisioning and deprovisioning of users and groups to an application, see [Build a SCIM endpoint and configure user provisioning](use-scim-to-provision-users-and-groups.md). Many applications integrate directly with Microsoft Entra ID. Some examples include Slack, Azure Databricks, and Snowflake. For these apps, skip the developer documentation and use the tutorials provided in [Tutorials for integrating SaaS applications with Microsoft Entra ID](../saas-apps/tutorial-list.md).
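As a sketch of what such an endpoint returns, a GET on /Users responds with a SCIM ListResponse envelope (RFC 7644). This shows the response shape only, not a full endpoint implementation:

```python
def scim_list_response(resources, start_index=1):
    """Wrap SCIM resources in the ListResponse envelope (RFC 7644) that a
    /Users endpoint returns for GET requests."""
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:ListResponse"],
        "totalResults": len(resources),
        "startIndex": start_index,
        "itemsPerPage": len(resources),
        "Resources": resources,
    }

# A hypothetical single-user result page.
page = scim_list_response([{"userName": "adele@contoso.com"}])
```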
## Manual vs. automatic provisioning Applications in the Microsoft Entra gallery support one of two provisioning modes: * **Manual** provisioning means there's no automatic Microsoft Entra provisioning connector for the app yet. You must create user accounts manually. Examples are adding users directly into the app's administrative portal or uploading a spreadsheet with user account details. Consult the documentation provided by the app, or contact the app developer to determine what mechanisms are available.
-* **Automatic** means that a Microsoft Entra provisioning connector is available this application. Follow the setup tutorial specific to setting up provisioning for the application. Find the app tutorials at [Tutorials for integrating SaaS applications with Microsoft Entra ID](../../active-directory/saas-apps/tutorial-list.md).
+* **Automatic** means that a Microsoft Entra provisioning connector is available for this application. Follow the tutorial specific to setting up provisioning for the application. Find the app tutorials at [Tutorials for integrating SaaS applications with Microsoft Entra ID](../saas-apps/tutorial-list.md).
The provisioning mode supported by an application is also visible on the **Provisioning** tab after you've added the application to your enterprise apps.
active-directory Application Proxy Integrate With Microsoft Cloud Application Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-proxy/application-proxy-integrate-with-microsoft-cloud-application-security.md
After adding your application to Microsoft Entra ID, use the steps in [Test the
## Deploy Conditional Access App Control
-To configure your application with the Conditional Access Application Control, follow the instructions in [Deploy Conditional Access Application Control for Microsoft Entra apps](/cloud-app-security/proxy-deployment-aad).
+To configure your application with Conditional Access Application Control, follow the instructions in [Deploy Conditional Access Application Control for Microsoft Entra apps](/defender-cloud-apps/proxy-deployment-aad).
## Test Conditional Access App Control
active-directory 2 Secure Access Current State https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/2-secure-access-current-state.md
If you use entitlement management, you can confine access packages to a subset o
With an inventory of external users and organizations, determine the access to grant to the users. You can use the Microsoft Graph API to determine Microsoft Entra group membership or application assignment.
-* [Working with groups in Microsoft Graph](/graph/api/resources/groups-overview?context=graph%2Fcontext&view=graph-rest-1.0&preserve-view=true)
+* [Working with groups in Microsoft Graph](/graph/api/resources/groups-overview?context=graph/context&view=graph-rest-1.0&preserve-view=true)
* [Applications API overview](/graph/applications-concept-overview?view=graph-rest-1.0&preserve-view=true) ### Enumerate application permissions
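The Microsoft Graph membership lookup mentioned above reduces to a single GET request. A sketch of building that request; the token and user ID are placeholders, and a real call needs an access token with appropriate directory read permissions:

```python
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def member_of_request(user_id, token):
    """Build the GET request for a user's group memberships
    (GET /users/{id}/memberOf). The token is a placeholder; acquire a real
    access token before sending the request."""
    req = urllib.request.Request(f"{GRAPH}/users/{user_id}/memberOf")
    req.add_header("Authorization", f"Bearer {token}")
    return req

# Hypothetical user and placeholder token.
req = member_of_request("adele@contoso.com", "<access-token>")
```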
active-directory 8 Secure Access Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/8-secure-access-sensitivity-labels.md
Use sensitivity labels to help control access to your content in Office 365 applications, and in containers like Microsoft Teams, Microsoft 365 Groups, and SharePoint sites. They protect content without hindering user collaboration. Use sensitivity labels to send organization-wide content across devices, apps, and services, while protecting data. Sensitivity labels help organizations meet compliance and security policies.
-See, [Learn about sensitivity labels](/microsoft-365/compliance/sensitivity-labels?view=o365-worldwide&preserve-view=true)
+See [Learn about sensitivity labels](/purview/sensitivity-labels?preserve-view=true&view=o365-worldwide)
## Before you begin
Enforce protection settings such as encryption, watermarks, and access restricti
Learn more:
-* [Restrict access to content by using sensitivity labels to apply encryption](/microsoft-365/compliance/encryption-sensitivity-labels?view=o365-worldwide&preserve-view=true)
-* [Use sensitivity labels to protect content in Microsoft Teams, Microsoft 365 Groups, and SharePoint sites](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites)
+* [Restrict access to content by using sensitivity labels to apply encryption](/purview/encryption-sensitivity-labels?preserve-view=true&view=o365-worldwide)
+* [Use sensitivity labels to protect content in Microsoft Teams, Microsoft 365 Groups, and SharePoint sites](/purview/sensitivity-labels-teams-groups-sites)
Sensitivity labels on containers can restrict access to the container, but content in the container doesn't inherit the label. For example, a user takes content from a protected site, downloads it, and then shares it without restrictions, unless the content had a sensitivity label.
As you plan the governance of external access to your content, consider content,
To define High, Medium, or Low Business Impact (HBI, MBI, LBI) for data, sites, and groups, consider the effect on your organization if the wrong content types are shared. * Credit card, passport, national/regional ID numbers
- * [Apply a sensitivity label to content automatically](/microsoft-365/compliance/apply-sensitivity-label-automatically?view=o365-worldwide&preserve-view=true)
+ * [Apply a sensitivity label to content automatically](/purview/apply-sensitivity-label-automatically?preserve-view=true&view=o365-worldwide)
* Content created by corporate officers: compliance, finance, executive, etc. * Strategic or financial data in libraries or sites.
A sensitivity label in a document or email is customizable, clear text, and pers
Determine the access criteria if Microsoft 365 Groups, Teams, or SharePoint sites are restricted with sensitivity labels. You can label content in containers or use automatic labeling for files in SharePoint, OneDrive, etc.
-Learn more: [Get started with sensitivity labels](/microsoft-365/compliance/get-started-with-sensitivity-labels?view=o365-worldwide&preserve-view=true)
+Learn more: [Get started with sensitivity labels](/purview/get-started-with-sensitivity-labels?preserve-view=true&view=o365-worldwide)
#### Sensitivity labels on containers
active-directory Multi Tenant Common Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/multi-tenant-common-considerations.md
Microsoft Teams has features to limit access based on user type. Changes to
The tenant switching mechanism for Microsoft Teams might require users to manually switch the context of their Teams client when working in Teams outside their home tenant.
-You can enable Teams users from another entire external domain to find, call, chat, and set up meetings with your users with Teams Federation. [Manage external meetings and chat with people and organizations using Microsoft identities](/microsoftteams/manage-external-access) describes how you can allow users in your organization to chat and meet with people outside the organization who are using Microsoft as an identity provider.
+You can enable Teams users from an entire external domain to find, call, chat, and set up meetings with your users by using Teams federation. [Manage external meetings and chat with people and organizations using Microsoft identities](/microsoftteams/trusted-organizations-external-meetings-chat) describes how you can allow users in your organization to chat and meet with people outside the organization who are using Microsoft as an identity provider.
### Licensing considerations for guest users in Teams
active-directory Ops Guide Auth https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/ops-guide-auth.md
If you're managing devices with MDM or Microsoft Intune, but not using device co
#### Device trust access policies recommended reading - [How To: Plan your Microsoft Entra hybrid join implementation](../devices/hybrid-join-plan.md)-- [Identity and device access configurations](/microsoft-365/enterprise/microsoft-365-policies-configurations)
+- [Identity and device access configurations](/microsoft-365/security/office-365-security/microsoft-365-policies-configurations)
### Windows Hello for Business
active-directory Ops Guide Iam https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/ops-guide-iam.md
As you review your list, you may find you need to either assign an owner for tas
### Identify and resolve synchronization issues
-Microsoft recommends you have a good baseline and understanding of the issues in your on-premises environment that can result in synchronization issues to the cloud. Since automated tools such as [IdFix](/office365/enterprise/prepare-directory-attributes-for-synch-with-idfix) and [Microsoft Entra Connect Health](../hybrid/connect/whatis-azure-ad-connect.md#why-use-azure-ad-connect-health) can generate a high volume of false positives, we recommend you identify synchronization errors that have been left unaddressed for more than 100 days by cleaning up those objects in error. Long term unresolved synchronization errors can generate support incidents. [Troubleshooting errors during synchronization](../hybrid/connect/tshoot-connect-sync-errors.md) provides an overview of different types of sync errors, some of the possible scenarios that cause those errors and potential ways to fix the errors.
+Microsoft recommends that you have a good baseline and understanding of the issues in your on-premises environment that can result in synchronization issues to the cloud. Because automated tools such as [IdFix](/microsoft-365/enterprise/set-up-directory-synchronization) and [Microsoft Entra Connect Health](../hybrid/connect/whatis-azure-ad-connect.md#why-use-azure-ad-connect-health) can generate a high volume of false positives, we recommend that you identify synchronization errors that have been left unaddressed for more than 100 days by cleaning up those objects in error. Long-term unresolved synchronization errors can generate support incidents. [Troubleshooting errors during synchronization](../hybrid/connect/tshoot-connect-sync-errors.md) provides an overview of different types of sync errors, some of the possible scenarios that cause those errors, and potential ways to fix the errors.
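The 100-day cleanup suggestion can be automated. A minimal sketch, assuming sync-error records exported with a `firstSeen` timestamp (the record shape and field name are illustrative):

```python
from datetime import datetime, timedelta, timezone

def stale_sync_errors(errors, days=100, now=None):
    """Return sync errors first seen more than `days` ago -- the
    long-unresolved objects the guidance suggests cleaning up."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    return [e for e in errors if e["firstSeen"] < cutoff]
```

Feed it records exported from your sync-error report and review the returned objects for cleanup.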
<a name='azure-ad-connect-sync-configuration'></a>
active-directory Protect M365 From On Premises Attacks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/protect-m365-from-on-premises-attacks.md
In Microsoft Entra ID, users who have privileged roles, such as administrators,
- Use cloud-only accounts for Microsoft Entra ID and Microsoft 365 privileged roles. -- Deploy privileged access devices for privileged access to manage Microsoft 365 and Microsoft Entra ID. See [Device roles and profiles](/security/compass/privileged-access-devices#device-roles-and-profiles).
+- Deploy privileged access devices for privileged access to manage Microsoft 365 and Microsoft Entra ID. See [Device roles and profiles](/security/privileged-access-workstations/privileged-access-devices#device-roles-and-profiles).
Deploy Microsoft Entra Privileged Identity Management (PIM) for just-in-time access to all human accounts that have privileged roles. Require strong authentication to activate roles. See [What is Microsoft Entra Privileged Identity Management](../privileged-identity-management/pim-configure.md).
active-directory Recoverability Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/recoverability-overview.md
Microsoft Graph APIs are highly customizable based on your organizational needs.
| Resource types| Reference links |
| - | - |
-| Users, groups, and other directory objects| [directoryObject API](/graph/api/resources/directoryObject) |
+| Users, groups, and other directory objects| [directoryObject API](/graph/api/resources/directoryobject) |
| Directory roles| [directoryRole API](/graph/api/resources/directoryrole) |
| Conditional Access policies| [Conditional Access policy API](/graph/api/resources/conditionalaccesspolicy) |
| Devices| [devices API](/graph/api/resources/device) |
active-directory Resilience B2c Developer Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/resilience-b2c-developer-best-practices.md
The Azure AD B2C directory service supports billions of authentications a day. I
### How to optimize directory reads and writes -- **Avoid write functions to the directory on sign-in**: Never execute a write on sign-in without a precondition (if clause) in your custom policies. One use case that requires a write on a sign-in is [just-in-time migration of user passwords](https://github.com/azure-ad-b2c/user-migration/tree/master/seamless-account-migration). Avoid any scenario that requires a write on every sign-in. [Preconditions](../../active-directory-b2c/userjourneys.md) in a user journey will look like this:
+- **Avoid write functions to the directory on sign-in**: Never execute a write on sign-in without a precondition (if clause) in your custom policies. One use case that requires a write on a sign-in is [just-in-time migration of user passwords](https://github.com/azure-ad-b2c/user-migration/tree/master/seamless-account-migration). Avoid any scenario that requires a write on every sign-in. [Preconditions](/azure/active-directory-b2c/userjourneys) in a user journey will look like this:
```xml <Precondition Type="ClaimEquals" ExecuteActionsIf="true">
The Azure AD B2C directory service supports billions of authentications a day. I
- Understand and plan your migration timeline. When planning to migrate users to Azure AD B2C using Microsoft Graph, consider the application and tenant limits to calculate the time needed to complete the migration of users. If you split your user creation job or script using two applications, you can use the per application limit. It would still need to remain below the per tenant threshold. - Understand the effects of your migration job on other applications. Consider the live traffic served by other relying applications to make sure you don't cause throttling at the tenant level and resource starvation for your live application. For more information, see the [Microsoft Graph throttling guidance](/graph/throttling). - Use a [load test sample](https://github.com/azure-ad-b2c/load-tests) to simulate sign-up and sign-in.
- - Learn more about [Azure Active Directory B2C service limits and restrictions](../../active-directory-b2c/service-limits.md?pivots=b2c-custom-policy).
+ - Learn more about [Azure Active Directory B2C service limits and restrictions](/azure/active-directory-b2c/service-limits?pivots=b2c-custom-policy).
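The timeline calculation suggested above reduces to simple arithmetic. A sketch, where the sustained request rate is a placeholder to derive from the Microsoft Graph throttling limits for your tenant and application:

```python
def migration_hours(user_count, requests_per_user=1, requests_per_sec=20):
    """Rough estimate of wall-clock hours to migrate users via Microsoft
    Graph at a sustained request rate. The default rate is a placeholder,
    not a documented Graph limit."""
    total_requests = user_count * requests_per_user
    return total_requests / requests_per_sec / 3600
```

For example, 72,000 users at one request each and 20 requests per second works out to about an hour; splitting the job across two applications halves the per-application time but must still stay below the per-tenant threshold.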
## Extend token lifetimes
-In an unlikely event, when the Azure AD B2C authentication service is unable to complete new sign-ups and sign-ins, you can still provide mitigation for users who are signed in. With [configuration](../../active-directory-b2c/configure-tokens.md), you can allow users that are already signed in to continue using the application without any perceived disruption until the user signs out from the application or the [session](../../active-directory-b2c/session-behavior.md) times out due to inactivity.
+In the unlikely event that the Azure AD B2C authentication service is unable to complete new sign-ups and sign-ins, you can still provide mitigation for users who are signed in. With [configuration](/azure/active-directory-b2c/configure-tokens), you can allow users who are already signed in to continue using the application without any perceived disruption until the user signs out from the application or the [session](/azure/active-directory-b2c/session-behavior) times out due to inactivity.
Your business requirements and desired end-user experience will dictate your frequency of token refresh for both web and single-page applications (SPAs).
active-directory Road To The Cloud Implement https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/road-to-the-cloud-implement.md
Client workstations are traditionally joined to Active Directory and managed via
* Manage workstations from the cloud by using unified endpoint management (UEM) solutions such as [Intune](/mem/intune/fundamentals/what-is-intune).
-[Windows Autopilot](/mem/autopilot/windows-autopilot) can help you establish a streamlined onboarding and device provisioning, which can enforce these directives.
+[Windows Autopilot](/autopilot/windows-autopilot) can help you establish streamlined onboarding and device provisioning, which can enforce these directives.
[Windows Local Administrator Password Solution](../devices/howto-manage-local-admin-passwords.md) (LAPS) enables a cloud-first solution to manage the passwords of local administrator accounts.
active-directory Road To The Cloud Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/road-to-the-cloud-migrate.md
You can integrate non-Windows workstations with Microsoft Entra ID to enhance th
* Plan to deploy [Platform SSO for macOS 13](https://techcommunity.microsoft.com/t5/microsoft-intune-blog/microsoft-simplifies-endpoint-manager-enrollment-for-apple/ba-p/3570319).
-* For Linux, you can [sign in to a Linux virtual machine (VM) by using Microsoft Entra credentials](../../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+* For Linux, you can [sign in to a Linux virtual machine (VM) by using Microsoft Entra credentials](../devices/howto-vm-sign-in-azure-ad-linux.md).
### Replace other Windows versions for workstations
active-directory Secure Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/secure-best-practices.md
When designing isolated environments, it's important to consider the following p
* **Use only modern authentication** - Applications deployed in isolated environments must use claims-based modern authentication (for example, SAML, OAuth2, and OpenID Connect) to use capabilities such as federation, Microsoft Entra B2B collaboration, delegation, and the consent framework. This way, legacy applications that depend on legacy authentication methods such as NT LAN Manager (NTLM) won't carry forward in isolated environments.
-* **Enforce strong authentication** - Strong authentication must always be used when accessing the isolated environment services and infrastructure. Whenever possible, [passwordless authentication](../authentication/concept-authentication-passwordless.md) such as [Windows for Business Hello](/windows/security/identity-protection/hello-for-business/hello-overview) or a [FIDO2 security keys](../authentication/howto-authentication-passwordless-security-key.md)) should be used.
+* **Enforce strong authentication** - Strong authentication must always be used when accessing the isolated environment services and infrastructure. Whenever possible, [passwordless authentication](../authentication/concept-authentication-passwordless.md) such as [Windows Hello for Business](/windows/security/identity-protection/hello-for-business/hello-overview) or [FIDO2 security keys](../authentication/howto-authentication-passwordless-security-key.md) should be used.
-* **Deploy secure workstations** - [Secure workstations](/security/compass/privileged-access-devices) provide the mechanism to ensure that the platform and the identity that platform represents is properly attested and secured against exploitation. Two other approaches to consider are:
+* **Deploy secure workstations** - [Secure workstations](/security/privileged-access-workstations/privileged-access-devices) provide the mechanism to ensure that the platform and the identity that platform represents is properly attested and secured against exploitation. Two other approaches to consider are:
* Use Windows 365 Cloud PCs (Cloud PC) with the Microsoft Graph API.
In addition to the guidance in the [Microsoft Entra general operations guide](./
### Privileged Accounts
-Provision accounts in the isolated environment for administrative personnel and IT teams who operate the environment. This enables you to add stronger security policies such as device-based access control for [secure workstations](/security/compass/privileged-access-deployment). As discussed in previous sections, nonproduction environments can potentially utilize Microsoft Entra B2B collaboration to onboard privileged accounts to the non-production tenants using the same posture and security controls designed for privileged access in their production environment.
+Provision accounts in the isolated environment for administrative personnel and IT teams who operate the environment. This enables you to add stronger security policies such as device-based access control for [secure workstations](/security/privileged-access-workstations/privileged-access-deployment). As discussed in previous sections, nonproduction environments can potentially utilize Microsoft Entra B2B collaboration to onboard privileged accounts to the non-production tenants using the same posture and security controls designed for privileged access in their production environment.
Cloud-only accounts are the simplest way to provision human identities in a Microsoft Entra tenant and it's a good fit for green field environments. However, if there's an existing on-premises infrastructure that corresponds to the isolated environment (for example, pre-production or management Active Directory forest), you could consider synchronizing identities from there. This holds especially true if the on-premises infrastructure described herein is used for IaaS solutions that require server access to manage the solution data plane. For more information on this scenario, see [Protecting Microsoft 365 from on-premises attacks](./protect-m365-from-on-premises-attacks.md). Synchronizing from isolated on-premises environments might also be needed if there are specific regulatory compliance requirements such as smart-card only authentication.
All human identities (local accounts and external identities provisioned through
#### Passwordless credentials
-A [passwordless solution](../authentication/concept-authentication-passwordless.md) is the best solution for ensuring the most convenient and secure method of authentication. Passwordless credentials such as [FIDO security keys](../authentication/howto-authentication-passwordless-security-key.md) and [Windows Hello for Business](/windows/security/identity-protection/hello-for-business/hello-overview) are recommended for human identities with privileged roles.
+A [passwordless solution](../authentication/concept-authentication-passwordless.md) is the best solution for ensuring the most convenient and secure method of authentication. Passwordless credentials such as [FIDO security keys](../authentication/howto-authentication-passwordless-security-key.md) and [Windows Hello for Business](/windows/security/identity-protection/hello-for-business/) are recommended for human identities with privileged roles.
#### Password protection
active-directory Secure Fundamentals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/secure-fundamentals.md
Non-production environments are commonly referred to as sandbox environments.
**Human identities** are user objects that generally represent people in an organization. These identities are either created and managed directly in Microsoft Entra ID or are synchronized from an on-premises Active Directory to Microsoft Entra ID for a given organization. These types of identities are referred to as **local identities**. There can also be user objects invited from a partner organization or a social identity provider using [Microsoft Entra B2B collaboration](../external-identities/what-is-b2b.md). In this content, we refer to these types of identity as **external identities**.
-**Non-human identities** include any identity not associated with a human. This type of identity is an object such as an application that requires an identity to run. In this content, we refer to this type of identity as a **workload identity**. Various terms are used to describe this type of identity, including [application objects and service principals](../../marketplace/manage-aad-apps.md).
+**Non-human identities** include any identity not associated with a human. This type of identity is an object such as an application that requires an identity to run. In this content, we refer to this type of identity as a **workload identity**. Various terms are used to describe this type of identity, including [application objects and service principals](/partner-center/marketplace/manage-aad-apps).
* **Application object**. A Microsoft Entra application is defined by its application object. The object resides in the Microsoft Entra tenant where the application is registered. The tenant is known as the application's "home" tenant.
Non-production environments are commonly referred to as sandbox environments.
* **Multi-tenant** applications allow identities from any Microsoft Entra tenant to authenticate.
-* **Service principal object**. Although there are [exceptions](../../marketplace/manage-aad-apps.md), application objects can be considered the *definition* of an application. Service principal objects can be considered an instance of an application. Service principals generally reference an application object, and one application object is referenced by multiple service principals across directories.
+* **Service principal object**. Although there are [exceptions](/partner-center/manage-aad-apps), application objects can be considered the *definition* of an application. Service principal objects can be considered an instance of an application. Service principals generally reference an application object, and one application object is referenced by multiple service principals across directories.
**Service principal objects** are also directory identities that can perform tasks independently from human intervention. The service principal defines the access policy and permissions for a user or application in the Microsoft Entra tenant. This mechanism enables core features such as authentication of the user or application during sign-in and authorization during resource access.
Some legacy scenarios required a human identity to be used in *non-human* scenar
* **Microsoft Entra Domain Joined**. Devices that are owned by the organization and joined to the organization's Microsoft Entra tenant. Typically a device purchased and managed by an organization that is joined to Microsoft Entra ID and managed by a service such as [Microsoft Intune](https://www.microsoft.com/microsoft-365/enterprise-mobility-security/microsoft-intune).
- * **Microsoft Entra registered**. Devices not owned by the organization, for example, a personal device, used to access company resources. Organizations may require the device be enrolled via [Mobile Device Management (MDM)](https://www.microsoft.com/itshowcase/mobile-device-management-at-microsoft), or enforced through [Mobile Application Management (MAM)](/office365/enterprise/office-365-client-support-mobile-application-management) without enrollment to access resources. This capability can be provided by a service such as Microsoft Intune.
+ * **Microsoft Entra registered**. Devices not owned by the organization, for example, a personal device, used to access company resources. Organizations may require the device be enrolled via [Mobile Device Management (MDM)](https://www.microsoft.com/itshowcase/mobile-device-management-at-microsoft), or enforced through [Mobile Application Management (MAM)](/mem/intune/apps/apps-supported-intune-apps) without enrollment to access resources. This capability can be provided by a service such as Microsoft Intune.
* **Group objects** contain objects for the purposes of assigning resource access, applying controls, or configuration. Group objects contain attributes that have the required information about the group including the name, description, group members, group owners, and the group type. Groups in Microsoft Entra ID take multiple forms based on an organization's requirements and can be mastered in Microsoft Entra ID or synchronized from on-premises Active Directory Domain Services (AD DS).
active-directory Secure Single Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/secure-single-tenant.md
In this diagram, there are nonproduction Azure resources and nonproduction insta
>[!NOTE]
>You cannot have more than one Microsoft 365 environment in a single Microsoft Entra tenant. However, you can have multiple Dynamics 365 environments in a single Microsoft Entra tenant.
-Another scenario for isolation within a single tenant could be separation between locations, subsidiary or implementation of tiered administration (according to the "[Enterprise Access Model](/security/compass/privileged-access-access-model)").
+Another scenario for isolation within a single tenant could be separation between locations or subsidiaries, or implementation of tiered administration (according to the "[Enterprise Access Model](/security/privileged-access-workstations/privileged-access-access-model)").
Azure RBAC role assignments allow scoped administration of Azure resources. Similarly, Microsoft Entra ID allows granular management of Microsoft Entra ID trusting applications through multiple capabilities such as Conditional Access, user and group filtering, administrative unit assignments and application assignments.
active-directory Security Operations Applications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/security-operations-applications.md
Many applications use credentials to authenticate in Microsoft Entra ID. Any oth
* Azure Monitor – [Microsoft Entra workbook to help you assess Solorigate risk - Microsoft Tech Community](https://techcommunity.microsoft.com/t5/azure-active-directory-identity/azure-ad-workbook-to-help-you-assess-solorigate-risk/ba-p/2010718)
-* Defender for Cloud Apps – [Defender for Cloud Apps anomaly detection alerts investigation guide](/cloud-app-security/investigate-anomaly-alerts)
+* Defender for Cloud Apps – [Defender for Cloud Apps anomaly detection alerts investigation guide](/defender-cloud-apps/investigate-anomaly-alerts)
* PowerShell - [Sample PowerShell script to find credential lifetime](https://github.com/madansr7/appCredAge).
active-directory Security Operations Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/security-operations-devices.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
+* [Microsoft 365 Audit logs](/purview/audit-solutions-overview)
* [Azure Key Vault logs](/azure/key-vault/general/logging?tabs=Vault)
From the Azure portal, you can view the Microsoft Entra audit logs and download
* **[Azure Event Hubs](/azure/event-hubs/event-hubs-about) integrated with a SIEM** - [Microsoft Entra logs can be integrated with other SIEMs](../reports-monitoring/howto-stream-logs-to-event-hub.md) such as Splunk, ArcSight, QRadar, and Sumo Logic via the Azure Event Hubs integration.
-* **[Microsoft Defender for Cloud Apps](/cloud-app-security/what-is-cloud-app-security)** – enables you to discover and manage apps, govern across apps and resources, and check your cloud apps' compliance.
+* **[Microsoft Defender for Cloud Apps](/defender-cloud-apps/what-is-defender-for-cloud-apps)** – enables you to discover and manage apps, govern across apps and resources, and check your cloud apps' compliance.
* **[Securing workload identities with Identity Protection Preview](../identity-protection/concept-workload-identity-risk.md)** - Used to detect risk on workload identities across sign-in behavior and offline indicators of compromise.
active-directory Security Operations Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/security-operations-infrastructure.md
The remainder of this article describes what to monitor and alert on. It is orga
In hybrid environments that contain both on-premises and cloud-based resources and accounts, the Active Directory infrastructure is a key part of the authentication stack. The stack is also a target for attacks, so it must be configured to maintain a secure environment and must be monitored properly. Examples of current attacks against your authentication infrastructure include password spray and Solorigate techniques. The following are links to articles we recommend:
-* [Securing privileged access overview](/security/compass/overview) – This article provides an overview of using Zero Trust techniques to create and maintain secure privileged access.
+* [Securing privileged access overview](/security/privileged-access-workstations/overview) – This article provides an overview of using Zero Trust techniques to create and maintain secure privileged access.
* [Microsoft Defender for Identity monitored domain activities](/defender-for-identity/monitored-activities) - This article provides a comprehensive list of activities to monitor and set alerts for.
To configure monitoring for Application Proxy, see [Troubleshoot Application Pro
| - | - | - | - | - |
| Kerberos errors| Medium | Various tools| Medium | Kerberos authentication error guidance under Kerberos errors on [Troubleshoot Application Proxy problems and error messages](../app-proxy/application-proxy-troubleshoot.md). |
| DC security issues| High| DC Security Audit logs| Event ID 4742(S): A computer account was changed<br>-and-<br>Flag – Trusted for Delegation<br>-or-<br>Flag – Trusted to Authenticate for Delegation| Investigate any flag change. |
-| Pass-the-ticket like attacks| High| | | Follow guidance in:<br>[Security principal reconnaissance (LDAP) (external ID 2038)](/defender-for-identity/reconnaissance-discovery-alerts)<br>[Tutorial: Compromised credential alerts](/defender-for-identity/credential-access-alerts)<br>[Understand and use Lateral Movement Paths with Microsoft Defender for Identity](/defender-for-identity/use-case-lateral-movement-path)<br>[Understanding entity profiles](/defender-for-identity/investigate-assets) |
+| Pass-the-ticket like attacks| High| | | Follow guidance in:<br>[Security principal reconnaissance (LDAP) (external ID 2038)](/defender-for-identity/reconnaissance-discovery-alerts)<br>[Tutorial: Compromised credential alerts](/defender-for-identity/credential-access-alerts)<br>[Understand and use Lateral Movement Paths with Microsoft Defender for Identity](/defender-for-identity/understand-lateral-movement-paths)<br>[Understanding entity profiles](/defender-for-identity/investigate-assets) |
### Legacy authentication settings

For multifactor authentication (MFA) to be effective, you also need to block legacy authentication. You then need to monitor your environment and alert on any use of legacy authentication. Legacy authentication protocols like POP, SMTP, IMAP, and MAPI can't enforce MFA. This makes these protocols the preferred entry points for attackers. For more information on tools that you can use to block legacy authentication, see [New tools to block legacy authentication in your organization](https://techcommunity.microsoft.com/t5/azure-active-directory-identity/new-tools-to-block-legacy-authentication-in-your-organization/ba-p/1225302).
-Legacy authentication is captured in the Microsoft Entra sign-in log as part of the detail of the event. You can use the Azure Monitor workbook to help with identifying legacy authentication usage. For more information, see [Sign-ins using legacy authentication](../reports-monitoring/howto-use-azure-monitor-workbooks.md), which is part of [How to use Azure Monitor Workbooks for Microsoft Entra reports](../reports-monitoring/howto-use-azure-monitor-workbooks.md). You can also use the Insecure protocols workbook for Microsoft Sentinel. For more information, see [Microsoft Sentinel Insecure Protocols Workbook Implementation Guide](https://techcommunity.microsoft.com/t5/azure-sentinel/azure-sentinel-insecure-protocols-workbook-implementation-guide/ba-p/1197564). Specific activities to monitor include:
+Legacy authentication is captured in the Microsoft Entra sign-in log as part of the detail of the event. You can use the Azure Monitor workbook to help with identifying legacy authentication usage. For more information, see [Sign-ins using legacy authentication](../reports-monitoring/howto-use-workbooks.md), which is part of [How to use Azure Monitor Workbooks for Microsoft Entra reports](../reports-monitoring/howto-use-workbooks.md). You can also use the Insecure protocols workbook for Microsoft Sentinel. For more information, see [Microsoft Sentinel Insecure Protocols Workbook Implementation Guide](https://techcommunity.microsoft.com/t5/azure-sentinel/azure-sentinel-insecure-protocols-workbook-implementation-guide/ba-p/1197564). Specific activities to monitor include:
| What to monitor| Risk level| Where| Filter/sub-filter| Notes |
| - | - | - | - | - |
active-directory Security Operations Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/security-operations-introduction.md
For more information, see [What is Identity Protection](../identity-protection/o
For the best results, we recommend that you monitor your domain controllers using Microsoft Defender for Identity. This approach enables the best detection and automation capabilities. Follow the guidance from these resources:

* [Microsoft Defender for Identity architecture](/defender-for-identity/architecture)
-* [Connect Microsoft Defender for Identity to Active Directory quickstart](/defender-for-identity/install-step2)
+* [Connect Microsoft Defender for Identity to Active Directory quickstart](/defender-for-identity/directory-service-accounts)
If you don't plan to use Microsoft Defender for Identity, monitor your domain controllers by one of these approaches:
active-directory Sync Directory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/sync-directory.md
Explore the following resources to learn more about directory synchronization wi
## Next steps
-* [What is hybrid identity with Microsoft Entra ID?](../../active-directory/hybrid/whatis-hybrid-identity.md) Microsoft's identity solutions span on-premises and cloud-based capabilities. Hybrid identity solutions create a common user identity for authentication and authorization to all resources, regardless of location.
+* [What is hybrid identity with Microsoft Entra ID?](../hybrid/whatis-hybrid-identity.md) Microsoft's identity solutions span on-premises and cloud-based capabilities. Hybrid identity solutions create a common user identity for authentication and authorization to all resources, regardless of location.
* [Install the Microsoft Entra Connect provisioning agent](../hybrid/cloud-sync/how-to-install.md) walks you through the installation process for the Microsoft Entra Connect provisioning agent and how to initially configure it in the Azure portal.
* [Microsoft Entra Connect cloud sync new agent configuration](../hybrid/cloud-sync/how-to-configure.md) guides you through configuring Microsoft Entra Connect cloud sync.
* [Microsoft Entra authentication and synchronization protocol overview](auth-sync-overview.md) describes integration with authentication and synchronization protocols. Authentication integrations enable you to use Microsoft Entra ID and its security and management features with little or no changes to your applications that use legacy authentication methods. Synchronization integrations enable you to sync user and group data to Microsoft Entra ID and then use Microsoft Entra management capabilities. Some sync patterns enable automated provisioning.
active-directory Sync Ldap https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/architecture/sync-ldap.md
Explore the following resources to learn more about LDAP synchronization with Mi
## Next steps
-* [What is hybrid identity with Microsoft Entra ID?](../../active-directory/hybrid/whatis-hybrid-identity.md) Microsoft's identity solutions span on-premises and cloud-based capabilities. Hybrid identity solutions create a common user identity for authentication and authorization to all resources, regardless of location.
+* [What is hybrid identity with Microsoft Entra ID?](../hybrid/whatis-hybrid-identity.md) Microsoft's identity solutions span on-premises and cloud-based capabilities. Hybrid identity solutions create a common user identity for authentication and authorization to all resources, regardless of location.
* [Microsoft Entra authentication and synchronization protocol overview](auth-sync-overview.md) describes integration with authentication and synchronization protocols. Authentication integrations enable you to use Microsoft Entra ID and its security and management features with little or no changes to your applications that use legacy authentication methods. Synchronization integrations enable you to sync user and group data to Microsoft Entra ID and then use Microsoft Entra management capabilities. Some sync patterns enable automated provisioning.
active-directory Concept Registration Mfa Sspr Combined https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/concept-registration-mfa-sspr-combined.md
Users can access manage mode by going to [Security info](https://aka.ms/mysecuri
## Key usage scenarios

### Update a password in MySignIns (preview)
-A user navigates to [Security info](https://aka.ms/mysecurityinfo). After signing in, the user can update their password. For more information about different authentication methods that you can require by using Conditional Access policies, see [How to secure the registration of security info](/azure/active-directory/conditional-access/howto-conditional-access-policy-registration). When finished, the user has the new password updated on the Security info page.
+A user navigates to [Security info](https://aka.ms/mysecurityinfo). After signing in, the user can update their password. For more information about different authentication methods that you can require by using Conditional Access policies, see [How to secure the registration of security info](../conditional-access/howto-conditional-access-policy-registration.md). When finished, the user has the new password updated on the Security info page.
### Protect Security info registration with Conditional Access
active-directory Concept Sspr Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/concept-sspr-policy.md
This guidance applies to other providers, such as Intune and Microsoft 365, whic
### Set or check the password policies by using PowerShell
-To get started, [download and install the Azure AD PowerShell module](/powershell/module/Azuread/) and [connect it to your Microsoft Entra tenant](/powershell/module/azuread/connect-azuread#examples).
+To get started, [download and install the Azure AD PowerShell module](/powershell/module/azuread/) and [connect it to your Microsoft Entra tenant](/powershell/module/azuread/connect-azuread#examples).
After the module is installed, use the following steps to complete each task as needed.
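As a sketch of the kind of task these steps cover (assuming the Azure AD PowerShell module is installed and you have permission to manage users; the UPN below is a placeholder), checking and disabling password expiration for a single user might look like:

```powershell
# Connect to the tenant (prompts for credentials)
Connect-AzureAD

# Check the current password policies for a user (placeholder UPN)
Get-AzureADUser -ObjectId "aprilr@contoso.com" |
    Select-Object UserPrincipalName, PasswordPolicies

# Set the user's password to never expire
Set-AzureADUser -ObjectId "aprilr@contoso.com" -PasswordPolicies "DisablePasswordExpiration"
```

To re-enable expiration later, set `-PasswordPolicies "None"` for the same user.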
active-directory How To Mfa Server Migration Utility https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/how-to-mfa-server-migration-utility.md
A few important points:
During the previous phases, you can remove users from the Staged Rollout folders to take them out of scope of Microsoft Entra multifactor authentication and route them back to your on-premises Azure MFA server for all MFA requests originating from Microsoft Entra ID.
-**Phase 3** requires moving all clients that authenticate to the on-premises MFA Server (VPNs, password managers, and so on) to Microsoft Entra federation via SAML/OAUTH. If modern authentication standards aren't supported, you're required to stand up NPS server(s) with the Microsoft Entra multifactor authentication extension installed. Once dependencies are migrated, users should no longer use the User portal on the MFA Server, but rather should manage their authentication methods in Microsoft Entra ID ([aka.ms/mfasetup](https://aka.ms/mfasetup)). Once users begin managing their authentication data in Microsoft Entra ID, those methods won't be synced back to MFA Server. If you roll back to the on-premises MFA Server after users have made changes to their Authentication Methods in Microsoft Entra ID, those changes will be lost. After user migrations are complete, change the [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0#federatedidpmfabehavior-values&preserve-view=true) domain federation setting. The change tells Microsoft Entra ID to no longer perform MFA on-premises and to perform _all_ MFA requests with Microsoft Entra multifactor authentication, regardless of group membership.
+**Phase 3** requires moving all clients that authenticate to the on-premises MFA Server (VPNs, password managers, and so on) to Microsoft Entra federation via SAML/OAUTH. If modern authentication standards aren't supported, you're required to stand up NPS server(s) with the Microsoft Entra multifactor authentication extension installed. Once dependencies are migrated, users should no longer use the User portal on the MFA Server, but rather should manage their authentication methods in Microsoft Entra ID ([aka.ms/mfasetup](https://aka.ms/mfasetup)). Once users begin managing their authentication data in Microsoft Entra ID, those methods won't be synced back to MFA Server. If you roll back to the on-premises MFA Server after users have made changes to their Authentication Methods in Microsoft Entra ID, those changes will be lost. After user migrations are complete, change the [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0&preserve-view=true#federatedidpmfabehavior-values) domain federation setting. The change tells Microsoft Entra ID to no longer perform MFA on-premises and to perform _all_ MFA requests with Microsoft Entra multifactor authentication, regardless of group membership.
The following sections explain the migration steps in more detail.
Using the data points you collected in [Authentication services](#authentication
### Update domain federation settings

Once you've completed user migrations and moved all of your [Authentication services](#authentication-services) off of MFA Server, it's time to update your domain federation settings. After the update, Microsoft Entra ID no longer sends MFA requests to your on-premises federation server.
-To configure Microsoft Entra ID to ignore MFA requests to your on-premises federation server, install the [Microsoft Graph PowerShell SDK](/powershell/microsoftgraph/installation?view=graph-powershell-1.0&preserve-view=true&viewFallbackFrom=graph-powershell-) and set [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0#federatedidpmfabehavior-values&preserve-view=true) to `rejectMfaByFederatedIdp`, as shown in the following example.
+To configure Microsoft Entra ID to ignore MFA requests to your on-premises federation server, install the [Microsoft Graph PowerShell SDK](/powershell/microsoftgraph/installation?view=graph-powershell-1.0&preserve-view=true&viewFallbackFrom=graph-powershell-) and set [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0&preserve-view=true#federatedidpmfabehavior-values) to `rejectMfaByFederatedIdp`, as shown in the following example.
#### Request <!-- {
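A minimal Graph PowerShell sketch of this change (the cmdlet names are from the Microsoft Graph PowerShell SDK; the domain name is a placeholder, and this assumes you're connected with sufficient privileges):

```powershell
# Connect with the permissions needed to manage domain federation settings
Connect-MgGraph -Scopes "Domain.ReadWrite.All"

# Look up the federation configuration for the federated domain (placeholder name)
$federation = Get-MgDomainFederationConfiguration -DomainId "contoso.com"

# Tell Microsoft Entra ID to perform all MFA itself, ignoring MFA claims
# from the on-premises federation server
Update-MgDomainFederationConfiguration -DomainId "contoso.com" `
    -InternalDomainFederationId $federation.Id `
    -FederatedIdpMfaBehavior "rejectMfaByFederatedIdp"
```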
If the upgrade had issues, follow these steps to roll back:
>Any changes since the backup was made will be lost, but should be minimal if the backup was made right before the upgrade and the upgrade was unsuccessful.

1. Run the installer for your previous version (for example, 8.0.x.x).
-1. Configure Microsoft Entra ID to accept MFA requests to your on-premises federation server. Use Graph PowerShell to set [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0#federatedidpmfabehavior-values&preserve-view=true) to `enforceMfaByFederatedIdp`, as shown in the following example.
+1. Configure Microsoft Entra ID to accept MFA requests to your on-premises federation server. Use Graph PowerShell to set [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0&preserve-view=true#federatedidpmfabehavior-values) to `enforceMfaByFederatedIdp`, as shown in the following example.
**Request** <!-- {
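Sketched with the Microsoft Graph PowerShell SDK (placeholder domain name; assumes an existing connection with `Domain.ReadWrite.All`), the rollback setting might look like:

```powershell
# Look up the federation configuration and route MFA back to the federated IdP
$federation = Get-MgDomainFederationConfiguration -DomainId "contoso.com"
Update-MgDomainFederationConfiguration -DomainId "contoso.com" `
    -InternalDomainFederationId $federation.Id `
    -FederatedIdpMfaBehavior "enforceMfaByFederatedIdp"
```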
active-directory How To Migrate Mfa Server To Mfa With Federation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/how-to-migrate-mfa-server-to-mfa-with-federation.md
This section covers final steps before migrating user MFA settings.
### Set federatedIdpMfaBehavior to enforceMfaByFederatedIdp
-For federated domains, MFA may be enforced by Microsoft Entra Conditional Access or by the on-premises federation provider. Each federated domain has a Microsoft Graph PowerShell security setting named **federatedIdpMfaBehavior**. You can set **federatedIdpMfaBehavior** to `enforceMfaByFederatedIdp` so Microsoft Entra ID accepts MFA that's performed by the federated identity provider. If the federated identity provider didn't perform MFA, Microsoft Entra ID redirects the request to the federated identity provider to perform MFA. For more information, see [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta#federatedidpmfabehavior-values&preserve-view=true).
+For federated domains, MFA may be enforced by Microsoft Entra Conditional Access or by the on-premises federation provider. Each federated domain has a Microsoft Graph PowerShell security setting named **federatedIdpMfaBehavior**. You can set **federatedIdpMfaBehavior** to `enforceMfaByFederatedIdp` so Microsoft Entra ID accepts MFA that's performed by the federated identity provider. If the federated identity provider didn't perform MFA, Microsoft Entra ID redirects the request to the federated identity provider to perform MFA. For more information, see [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta&preserve-view=true#federatedidpmfabehavior-values).
>[!NOTE]
>The **federatedIdpMfaBehavior** setting is a new version of the **SupportsMfa** property of the [New-MgDomainFederationConfiguration](/powershell/module/microsoft.graph.identity.directorymanagement/new-mgdomainfederationconfiguration) cmdlet.
active-directory Howto Mfa Adfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-mfa-adfs.md
If your organization is federated with Microsoft Entra ID, use Microsoft Entra multifactor authentication or Active Directory Federation Services (AD FS) to secure resources that are accessed by Microsoft Entra ID. Use the following procedures to secure Microsoft Entra resources with either Microsoft Entra multifactor authentication or Active Directory Federation Services.

>[!NOTE]
->Set the domain setting [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta#federatedidpmfabehavior-values&preserve-view=true) to `enforceMfaByFederatedIdp` (recommended) or **SupportsMFA** to `$True`. The **federatedIdpMfaBehavior** setting overrides **SupportsMFA** when both are set.
+>Set the domain setting [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta&preserve-view=true#federatedidpmfabehavior-values) to `enforceMfaByFederatedIdp` (recommended) or **SupportsMFA** to `$True`. The **federatedIdpMfaBehavior** setting overrides **SupportsMFA** when both are set.
<a name='secure-azure-ad-resources-using-ad-fs'></a>
active-directory Howto Mfa Mfasettings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-mfa-mfasettings.md
Previously updated : 09/15/2023 Last updated : 10/18/2023
When a user reports a MFA prompt as suspicious, the event shows up in the Sign-i
- To view fraud reports in the Audit logs, select **Identity** > **Monitoring & health** > **Audit logs**. The fraud report appears under Activity type **Fraud reported - user is blocked for MFA** or **Fraud reported - no action taken**, based on the tenant-level settings for fraud report.
+>[!NOTE]
+>A user is not reported as High Risk if they perform passwordless authentication.
+ ### Manage suspicious activity events

Once a user has reported a prompt as suspicious, the risk should be investigated and remediated with [Identity Protection](../identity-protection/howto-identity-protection-remediate-unblock.md).
To enable trusted IPs by using Conditional Access policies, complete the followi
`c:[Type== "https://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork"] => issue(claim = c);`
+ >[!NOTE]
+ >The **Skip multi-factor authentication for requests from federated users on my intranet** option will affect the Conditional Access evaluation for locations.
+ * **For requests from a specific range of public IPs**: To choose this option, enter the IP addresses in the text box, in CIDR notation.
   * For IP addresses that are in the range *xxx.xxx.xxx*.1 through *xxx.xxx.xxx*.254, use notation like ***xxx.xxx.xxx*.0/24**.
   * For a single IP address, use notation like ***xxx.xxx.xxx.xxx*/32**.
active-directory Howto Mfa Userstates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-mfa-userstates.md
After you enable users, notify them via email. Tell the users that a prompt is d
If your users were enabled using per-user enabled and enforced Microsoft Entra multifactor authentication, the following PowerShell can assist you in making the conversion to Conditional Access-based Microsoft Entra multifactor authentication.
-Run this PowerShell in an ISE window or save as a `.PS1` file to run locally. The operation can only be done by using the [MSOnline module](/powershell/module/msonline#msonline).
+Run this PowerShell in an ISE window or save as a `.PS1` file to run locally. The operation can only be done by using the [MSOnline module](/powershell/module/msonline/#msonline).
```PowerShell # Connect to tenant
active-directory Howto Registration Mfa Sspr Combined Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-registration-mfa-sspr-combined-troubleshoot.md
The following table lists all audit events generated by combined registration:
| Symptom | Troubleshooting steps |
| --- | --- |
-| I'm not seeing the methods I expected to see. | 1. Check if the user has a Microsoft Entra admin role. If yes, view the SSPR admin policy differences. <br> 2. Determine whether the user is being interrupted because of multifactor authentication registration enforcement or SSPR registration enforcement. See the [flowchart](../../active-directory/authentication/concept-registration-mfa-sspr-combined.md#combined-registration-modes) under "Combined registration modes" to determine which methods should be shown. <br> 3. Determine how recently the multifactor authentication or SSPR policy was changed. If the change was recent, it might take some time for the updated policy to propagate.|
+| I'm not seeing the methods I expected to see. | 1. Check if the user has a Microsoft Entra admin role. If yes, view the SSPR admin policy differences. <br> 2. Determine whether the user is being interrupted because of multifactor authentication registration enforcement or SSPR registration enforcement. See the [flowchart](../authentication/concept-registration-mfa-sspr-combined.md#combined-registration-modes) under "Combined registration modes" to determine which methods should be shown. <br> 3. Determine how recently the multifactor authentication or SSPR policy was changed. If the change was recent, it might take some time for the updated policy to propagate.|
## Troubleshooting manage mode
active-directory Troubleshoot Sspr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/troubleshoot-sspr.md
Use the following information to understand the problem and what needs to be cor
## Microsoft Entra forums
-If you have general questions about Microsoft Entra ID and self-service password reset, you can ask the community for assistance on the [Microsoft Q&A question page for Microsoft Entra ID](/answers/topics/azure-active-directory.html). Members of the community include engineers, product managers, MVPs, and fellow IT professionals.
+If you have general questions about Microsoft Entra ID and self-service password reset, you can ask the community for assistance on the [Microsoft Q&A question page for Microsoft Entra ID](/answers/tags/455/entra-id). Members of the community include engineers, product managers, MVPs, and fellow IT professionals.
## Contact Microsoft support
active-directory Howto Get Appsource Certified https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/azuread-dev/howto-get-appsource-certified.md
For more information about the AppSource trial experience, see [this video](http
## Get support
-For Azure AD integration, we use [Microsoft Q&A](/answers/products/) with the community to provide support.
+For Azure AD integration, we use [Microsoft Q&A](/answers/) with the community to provide support.
We highly recommend you ask your questions on Microsoft Q&A first and browse existing issues to see if someone has asked your question before. Make sure that your questions or comments are tagged with [`[azure-active-directory]`](/answers/topics/azure-active-directory.html).
Use the following comments section to provide feedback and help us refine and sh
[AAD-Dev-Guide]: v1-overview.md
[AAD-QuickStart-Web-Apps]: v1-overview.md#get-started
-<!--Image references-->
+<!--Image references-->
active-directory Block Legacy Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/block-legacy-authentication.md
The following messaging protocols support legacy authentication:
- Authenticated SMTP - Used to send authenticated email messages.
- Autodiscover - Used by Outlook and EAS clients to find and connect to mailboxes in Exchange Online.
- Exchange ActiveSync (EAS) - Used to connect to mailboxes in Exchange Online.
-- Exchange Online PowerShell - Used to connect to Exchange Online with remote PowerShell. If you block Basic authentication for Exchange Online PowerShell, you need to use the Exchange Online PowerShell Module to connect. For instructions, see [Connect to Exchange Online PowerShell using multifactor authentication](/powershell/exchange/exchange-online/connect-to-exchange-online-powershell/mfa-connect-to-exchange-online-powershell).
+- Exchange Online PowerShell - Used to connect to Exchange Online with remote PowerShell. If you block Basic authentication for Exchange Online PowerShell, you need to use the Exchange Online PowerShell Module to connect. For instructions, see [Connect to Exchange Online PowerShell using multifactor authentication](/powershell/exchange/connect-to-exchange-online-powershell).
- Exchange Web Services (EWS) - A programming interface that's used by Outlook, Outlook for Mac, and third-party apps.
- IMAP4 - Used by IMAP email clients.
- MAPI over HTTP (MAPI/HTTP) - Primary mailbox access protocol used by Outlook 2010 SP2 and later.
Before you can block legacy authentication in your directory, you need to first
1. Browse to **Identity** > **Monitoring & health** > **Sign-in logs**.
1. Add the **Client App** column if it isn't shown by clicking on **Columns** > **Client App**.
1. Select **Add filters** > **Client App** > choose all of the legacy authentication protocols and select **Apply**.
-1. If you've activated the [new sign-in activity reports preview](../reports-monitoring/concept-all-sign-ins.md), repeat the above steps also on the **User sign-ins (non-interactive)** tab.
+1. If you've activated the [new sign-in activity reports preview](../reports-monitoring/concept-sign-ins.md), repeat the above steps also on the **User sign-ins (non-interactive)** tab.
Filtering shows you sign-in attempts made by legacy authentication protocols. Clicking on each individual sign-in attempt shows you more details. The **Client App** field under the **Basic Info** tab indicates which legacy authentication protocol was used.
You can select all available grant controls for the **Other clients** condition;
- [Determine effect using Conditional Access report-only mode](howto-conditional-access-insights-reporting.md)
- If you aren't familiar with configuring Conditional Access policies yet, see [require MFA for specific apps with Microsoft Entra Conditional Access](../authentication/tutorial-enable-azure-mfa.md) for an example.
-- For more information about modern authentication support, see [How modern authentication works for Office client apps](/office365/enterprise/modern-auth-for-office-2013-and-2016)
+- For more information about modern authentication support, see [How modern authentication works for Office client apps](/microsoft-365/enterprise/modern-auth-for-office-2013-and-2016)
- [How to set up a multifunction device or application to send email using Microsoft 365](/exchange/mail-flow-best-practices/how-to-set-up-a-multifunction-device-or-application-to-send-email-using-microsoft-365-or-office-365)
- [Enable modern authentication in Exchange Online](/exchange/clients-and-mobile-in-exchange-online/enable-or-disable-modern-authentication-in-exchange-online)
-- [Enable Modern Authentication for Office 2013 on Windows devices](/office365/admin/security-and-compliance/enable-modern-authentication)
-- [How to configure Exchange Server on-premises to use Hybrid Modern Authentication](/office365/enterprise/configure-exchange-server-for-hybrid-modern-authentication)
-- [How to use Modern Authentication with Skype for Business](/skypeforbusiness/manage/authentication/use-adal)
+- [Enable Modern Authentication for Office 2013 on Windows devices](/microsoft-365/admin/)
+- [How to configure Exchange Server on-premises to use Hybrid Modern Authentication](/microsoft-365/enterprise/configure-exchange-server-for-hybrid-modern-authentication)
+- [How to use Modern Authentication with Skype for Business](/microsoft-365/enterprise/hybrid-modern-auth-overview)
active-directory Concept Conditional Access Conditions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-conditions.md
This setting has an effect on access attempts made from the following mobile app
| Mail/Calendar/People app, Outlook 2016, Outlook 2013 (with modern authentication)| Exchange Online | Windows 10 |
| MFA and location policy for apps. Device-based policies aren't supported.| Any My Apps app service | Android and iOS |
| Microsoft Teams Services - this client app controls all services that support Microsoft Teams and all its Client Apps - Windows Desktop, iOS, Android, WP, and web client | Microsoft Teams | Windows 10, Windows 8.1, Windows 7, iOS, Android, and macOS |
-| Office 2016 apps, Office 2013 (with modern authentication), [OneDrive sync client](/onedrive/enable-conditional-access) | SharePoint | Windows 8.1, Windows 7 |
+| Office 2016 apps, Office 2013 (with modern authentication), [OneDrive sync client](/sharepoint/enable-conditional-access) | SharePoint | Windows 8.1, Windows 7 |
| Office 2016 apps, Universal Office apps, Office 2013 (with modern authentication), [OneDrive sync client](/sharepoint/enable-conditional-access) | SharePoint Online | Windows 10 |
| Office 2016 (Word, Excel, PowerPoint, OneNote only). | SharePoint | macOS |
| Office 2019| SharePoint | Windows 10, macOS |
active-directory Concept Conditional Access Grant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-grant.md
Administrators can choose to enforce one or more controls when granting access.
- [Require multifactor authentication (Microsoft Entra multifactor authentication)](../authentication/concept-mfa-howitworks.md)
- [Require authentication strength](#require-authentication-strength)
-- [Require device to be marked as compliant (Microsoft Intune)](/intune/protect/device-compliance-get-started)
+- [Require device to be marked as compliant (Microsoft Intune)](/mem/intune/protect/device-compliance-get-started)
- [Require Microsoft Entra hybrid joined device](../devices/concept-hybrid-join.md)
- [Require approved client app](./howto-policy-approved-app-or-app-protection.md)
- [Require app protection policy](./howto-policy-approved-app-or-app-protection.md)
The **Require Microsoft Entra hybrid joined device** control:
### Require approved client app
-Organizations can require that an approved client app is used to access selected cloud apps. These approved client apps support [Intune app protection policies](/intune/app-protection-policy) independent of any mobile device management solution.
+Organizations can require that an approved client app is used to access selected cloud apps. These approved client apps support [Intune app protection policies](/mem/intune/apps/app-protection-policy) independent of any mobile device management solution.
To apply this grant control, the device must be registered in Microsoft Entra ID, which requires using a broker app. The broker app can be Microsoft Authenticator for iOS, or either Microsoft Authenticator or Microsoft Company Portal for Android devices. If a broker app isn't installed on the device when the user attempts to authenticate, the user is redirected to the appropriate app store to install the required broker app.
active-directory Concept Conditional Access Session https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-session.md
Organizations can use this control to require Microsoft Entra ID to pass device
For more information on the use and configuration of app-enforced restrictions, see the following articles:

- [Enabling limited access with SharePoint Online](/sharepoint/control-access-from-unmanaged-devices)
-- [Enabling limited access with Exchange Online](/microsoft-365/security/office-365-security/secure-email-recommended-policies?view=o365-worldwide#limit-access-to-exchange-online-from-outlook-on-the-web&preserve-view=true)
+- [Enabling limited access with Exchange Online](/microsoft-365/security/office-365-security/secure-email-recommended-policies?view=o365-worldwide&preserve-view=true#limit-access-to-exchange-online-from-outlook-on-the-web)
## Conditional Access application control
active-directory Howto Conditional Access Policy Compliant Device https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-compliant-device.md
Organizations that use the [Subscription Activation](/windows/deployment/windows
[Use report-only mode for Conditional Access to determine the results of new policy decisions.](concept-conditional-access-report-only.md)
-[Device compliance policies work with Microsoft Entra ID](/intune/device-compliance-get-started#device-compliance-policies-work-with-azure-ad)
+[Device compliance policies work with Microsoft Entra ID](/mem/intune/protect/device-compliance-get-started#device-compliance-policies-work-with-azure-ad)
active-directory What If Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/what-if-tool.md
You start an evaluation by clicking **What If**. The evaluation result provides
- Policies that will apply to your user or workload identity.
- Policies that don't apply to your user or workload identity.
-If [classic policies](policy-migration.md#classic-policies) exist for the selected cloud apps, an indicator is presented to you. By clicking the indicator, you're redirected to the classic policies page. On the classic policies page, you can migrate a classic policy or just disable it. You can return to your evaluation result by closing this page.
+If [classic policies](./policy-migration-mfa.md) exist for the selected cloud apps, an indicator is presented to you. By clicking the indicator, you're redirected to the classic policies page. On the classic policies page, you can migrate a classic policy or just disable it. You can return to your evaluation result by closing this page.
:::image type="content" source="media/what-if-tool/conditional-access-what-if-evaluation-result-example.png" alt-text="Screenshot of an example of the policy evaluation in the What If tool showing policies that would apply." lightbox="media/what-if-tool/conditional-access-what-if-evaluation-result-example.png":::
active-directory Authentication Flows App Scenarios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/authentication-flows-app-scenarios.md
Though we don't recommend that you use it, the [username/password flow](scenario
Using the username/password flow constrains your applications. For instance, applications can't sign in a user who needs to use multifactor authentication or the Conditional Access tool in Microsoft Entra ID. Your applications also don't benefit from single sign-on. Authentication with the username/password flow goes against the principles of modern authentication and is provided only for legacy reasons.
-In desktop apps, if you want the token cache to persist, you can customize the [token cache serialization](msal-net-token-cache-serialization.md). By implementing dual token cache serialization, you can use backward-compatible and forward-compatible token caches.
+In desktop apps, if you want the token cache to persist, you can customize the [token cache serialization](/entra/msal/dotnet/how-to/token-cache-serialization). By implementing dual token cache serialization, you can use backward-compatible and forward-compatible token caches.
For more information, see [Desktop app that calls web APIs](scenario-desktop-overview.md).
Some scenarios, like those that involve Conditional Access related to a device I
For more information, see [Mobile app that calls web APIs](scenario-mobile-overview.md).

> [!NOTE]
-> A mobile app that uses MSAL.iOS, MSAL.Android, or MSAL.NET on Xamarin can have app protection policies applied to it. For instance, the policies might prevent a user from copying protected text. The mobile app is managed by Intune and is recognized by Intune as a managed app. For more information, see [Microsoft Intune App SDK overview](/intune/app-sdk).
+> A mobile app that uses MSAL.iOS, MSAL.Android, or MSAL.NET on Xamarin can have app protection policies applied to it. For instance, the policies might prevent a user from copying protected text. The mobile app is managed by Intune and is recognized by Intune as a managed app. For more information, see [Microsoft Intune App SDK overview](/mem/intune/developer/app-sdk).
>
-> The [Intune App SDK](/intune/app-sdk-get-started) is separate from MSAL libraries and interacts with Microsoft Entra ID on its own.
+> The [Intune App SDK](/mem/intune/developer/app-sdk-get-started) is separate from MSAL libraries and interacts with Microsoft Entra ID on its own.
### Protected web API
For more information about authentication, see:
- [Authentication vs. authorization.](./authentication-vs-authorization.md)
- [Microsoft identity platform access tokens.](access-tokens.md)
-- [Securing access to IoT apps.](/azure/architecture/example-scenario/iot-aad/iot-aad#security)
+- [Securing access to IoT apps.](/azure/architecture/reference-architectures/iot#security)
active-directory Certificate Credentials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/certificate-credentials.md
If you're interested in using a JWT issued by another identity provider as a cre
## Assertion format
-To compute the assertion, you can use one of the many JWT libraries in the language of your choice - [MSAL supports this using `.WithCertificate()`](msal-net-client-assertions.md). The information is carried by the token in its **Header**, **Claims**, and **Signature**.
+To compute the assertion, you can use one of the many JWT libraries in the language of your choice - [MSAL supports this using `.WithCertificate()`](/entra/msal/dotnet/acquiring-tokens/msal-net-client-assertions). The information is carried by the token in its **Header**, **Claims**, and **Signature**.
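As a hedged sketch of the assertion format described above (the tenant ID, client ID, and thumbprint below are placeholders, and a real assertion must be RS256-signed with the certificate's private key, typically by a JWT library or MSAL rather than by hand), the unsigned portion of such a token can be assembled with standard base64url encoding:

```python
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    # JWT segments use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Placeholder values: substitute a real tenant ID, client (app) ID,
# and base64url-encoded certificate thumbprint.
header = {"alg": "RS256", "typ": "JWT", "x5t": "<cert-thumbprint>"}
now = int(time.time())
claims = {
    "aud": "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
    "iss": "<client-id>",
    "sub": "<client-id>",
    "jti": str(uuid.uuid4()),  # unique identifier for this assertion
    "nbf": now,
    "exp": now + 600,          # keep the assertion short-lived
}

signing_input = b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
# A library (e.g. MSAL's .WithCertificate()) appends the RS256 signature
# over signing_input as the third dot-separated segment.
print(signing_input)
```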
### Header
Client assertions can be used anywhere a client secret would be used. For exampl
## Next steps
-The [MSAL.NET library handles this scenario](msal-net-client-assertions.md) in a single line of code.
+The [MSAL.NET library handles this scenario](/entra/msal/dotnet/acquiring-tokens/web-apps-apis/confidential-client-assertions) in a single line of code.
The [.NET Core daemon console application using Microsoft identity platform](https://github.com/Azure-Samples/active-directory-dotnetcore-daemon-v2) code sample on GitHub shows how an application uses its own credentials for authentication. It also shows how you can [create a self-signed certificate](https://github.com/Azure-Samples/active-directory-dotnetcore-daemon-v2/tree/master/1-Call-MSGraph#optional-use-the-automation-script) using the `New-SelfSignedCertificate` PowerShell cmdlet. You can also use the [app creation scripts](https://github.com/Azure-Samples/active-directory-dotnetcore-daemon-v2/blob/master/1-Call-MSGraph/AppCreationScripts/AppCreationScripts.md) in the sample repo to create certificates, compute the thumbprint, and so on.
active-directory Developer Guide Conditional Access Authentication Context https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/developer-guide-conditional-access-authentication-context.md
The table below will show all corner cases where ACRS is added to the token's cl
## Next steps

- [Granular Conditional Access for sensitive data and actions (Blog)](https://techcommunity.microsoft.com/t5/azure-active-directory-identity/granular-conditional-access-for-sensitive-data-and-actions/ba-p/1751775)
-- [Zero trust with the Microsoft identity platform](/security/zero-trust/identity-developer)
+- [Zero trust with the Microsoft identity platform](/security/zero-trust/develop/identity)
- [Building Zero Trust ready apps with the Microsoft identity platform](/security/zero-trust/develop/identity)
- [Conditional Access authentication context](../conditional-access/concept-conditional-access-cloud-apps.md#authentication-context)
- [authenticationContextClassReference resource type - MS Graph](/graph/api/conditionalaccessroot-list-authenticationcontextclassreferences)
active-directory Howto Convert App To Be Multi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-convert-app-to-be-multi-tenant.md
To learn more about making API calls to Microsoft Entra ID and Microsoft 365 ser
[AAD-App-SP-Objects]:app-objects-and-service-principals.md
[AAD-Auth-Scenarios]:./authentication-vs-authorization.md
[AAD-Consent-Overview]:./application-consent-experience.md
-[AAD-Dev-Guide]:azure-ad-developers-guide.md
+[AAD-Dev-Guide]:./index.yml
[AAD-Integrating-Apps]:./quickstart-register-app.md
[AAD-Samples-MT]: /samples/browse/?products=azure-active-directory
[AAD-Why-To-Integrate]: ./how-to-integrate.md
active-directory Howto Create Service Principal Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-create-service-principal-portal.md
To configure access policies:
## Next steps

-- Learn how to use [Azure PowerShell](howto-authenticate-service-principal-powershell.md) or [Azure CLI](/cli/azure/create-an-azure-service-principal-azure-cli) to create a service principal.
+- Learn how to use [Azure PowerShell](howto-authenticate-service-principal-powershell.md) or [Azure CLI](/cli/azure/azure-cli-sp-tutorial-1) to create a service principal.
- To learn about specifying security policies, see [Azure role-based access control (Azure RBAC)](/azure/role-based-access-control/role-assignments-portal).
- For a list of available actions that can be granted or denied to users, see [Azure Resource Manager Resource Provider operations](/azure/role-based-access-control/resource-provider-operations).
- For information about working with app registrations by using **Microsoft Graph**, see the [Applications](/graph/api/resources/application) API reference.
active-directory Identity Platform Integration Checklist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/identity-platform-integration-checklist.md
Use the following checklist to ensure that your application is effectively integ
## Branding
-![checkbox](./media/integration-checklist/checkbox-two.svg) Adhere to the [Branding guidelines for applications](/azure/active-directory/develop/howto-add-branding-in-apps).
+![checkbox](./medi).
![checkbox](./medi). Make sure your name and logo are representative of your company/product so that users can make informed decisions. Ensure that you're not violating any trademarks.
Use the following checklist to ensure that your application is effectively integ
![checkbox](./medi). If you must hand-code for the authentication protocols, you should follow the [Microsoft SDL](https://www.microsoft.com/sdl/default.aspx) or similar development methodology. Pay close attention to the security considerations in the standards specifications for each protocol.
-![checkbox](./medi) apps.
+![checkbox](./medi) apps.
![checkbox](./media/integration-checklist/checkbox-two.svg) For mobile apps, configure each platform using the application registration experience. In order for your application to take advantage of the Microsoft Authenticator or Microsoft Company Portal for single sign-in, your app needs a "broker redirect URI" configured. This allows Microsoft to return control to your application after authentication. When configuring each platform, the app registration experience will guide you through the process. Use the quickstart to download a working example. On iOS, use brokers and system webview whenever possible.
-![checkbox](./medi).
+![checkbox](./media/integration-checklist/checkbox-two.svg) In web apps or web APIs, keep one token cache per account. For web apps, the token cache should be keyed by the account ID. For web APIs, the account should be keyed by the hash of the token used to call the API. MSAL.NET provides custom token cache serialization in the .NET Framework and .NET Core subplatforms. For security and performance reasons, our recommendation is to serialize one cache per user. For more information, read about [token cache serialization](/entra/msal/dotnet/how-to/token-cache-serialization).
![checkbox](./media/integration-checklist/checkbox-two.svg) If the data your app requires is available through [Microsoft Graph](https://developer.microsoft.com/graph), request permissions for this data using the Microsoft Graph endpoint rather than the individual API.
active-directory Jwt Claims Customization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/jwt-claims-customization.md
You can use the following functions to transform claims.
| **ExtractNumeric() - Suffix** | Returns the suffix numerical part of the string.<br/>For example, if the input's value is `BSimon_123`, then it returns `123`. |
| **IfEmpty()** | Outputs an attribute or constant if the input is null or empty.<br/>For example, if you want to output an attribute stored in an extension attribute if the employee ID for a given user is empty. To perform this function, configure the following values:<br/>Parameter 1(input): user.employeeid<br/>Parameter 2 (output): user.extensionattribute1<br/>Parameter 3 (output if there's no match): user.employeeid |
| **IfNotEmpty()** | Outputs an attribute or constant if the input isn't null or empty.<br/>For example, if you want to output an attribute stored in an extension attribute if the employee ID for a given user isn't empty. To perform this function, you configure the following values:<br/>Parameter 1(input): user.employeeid<br/>Parameter 2 (output): user.extensionattribute1 |
-| **Substring() - Fixed Length** (Preview)| Extracts parts of a string claim type, beginning at the character at the specified position, and returns the specified number of characters.<br/>SourceClaim - The claim source of the transform that should be executed.<br/>StartIndex - The zero-based starting character position of a substring in this instance.<br/>Length - The length in characters of the substring.<br/>For example:<br/>sourceClaim - PleaseExtractThisNow<br/>StartIndex - 6<br/>Length - 11<br/>Output: ExtractThis |
-| **Substring() - EndOfString** (Preview) | Extracts parts of a string claim type, beginning at the character at the specified position, and returns the rest of the claim from the specified start index. <br/>SourceClaim - The claim source of the transform.<br/>StartIndex - The zero-based starting character position of a substring in this instance.<br/>For example:<br/>sourceClaim - PleaseExtractThisNow<br/>StartIndex - 6<br/>Output: ExtractThisNow |
-| **RegexReplace()** (Preview) | RegexReplace() transformation accepts as input parameters:<br/>- Parameter 1: a user attribute as regex input<br/>- An option to trust the source as multivalued<br/>- Regex pattern<br/>- Replacement pattern. The replacement pattern may contain static text format along with a reference that points to regex output groups and more input parameters. |
+| **Substring() - Fixed Length** | Extracts parts of a string claim type, beginning at the character at the specified position, and returns the specified number of characters.<br/>SourceClaim - The claim source of the transform that should be executed.<br/>StartIndex - The zero-based starting character position of a substring in this instance.<br/>Length - The length in characters of the substring.<br/>For example:<br/>sourceClaim - PleaseExtractThisNow<br/>StartIndex - 6<br/>Length - 11<br/>Output: ExtractThis |
+| **Substring() - EndOfString** | Extracts parts of a string claim type, beginning at the character at the specified position, and returns the rest of the claim from the specified start index. <br/>SourceClaim - The claim source of the transform.<br/>StartIndex - The zero-based starting character position of a substring in this instance.<br/>For example:<br/>sourceClaim - PleaseExtractThisNow<br/>StartIndex - 6<br/>Output: ExtractThisNow |
+| **RegexReplace()** | RegexReplace() transformation accepts as input parameters:<br/>- Parameter 1: a user attribute as regex input<br/>- An option to trust the source as multivalued<br/>- Regex pattern<br/>- Replacement pattern. The replacement pattern may contain static text format along with a reference that points to regex output groups and more input parameters. |
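As an informal sketch (Python is used here purely to mirror the semantics of the transformations in the table, using the table's own example values, not to represent the service's implementation), the Substring and RegexReplace behaviors correspond to:

```python
import re

claim = "PleaseExtractThisNow"

# Substring() - Fixed Length: start index 6, length 11
print(claim[6:6 + 11])  # ExtractThis

# Substring() - EndOfString: everything from start index 6 onward
print(claim[6:])        # ExtractThisNow

# RegexReplace(): the replacement pattern may reference regex output groups.
# Hypothetical pattern applied to the table's BSimon_123 example value.
print(re.sub(r"_(\d+)$", r"-\1", "BSimon_123"))  # BSimon-123
```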
If you need other transformations, submit your idea in the [feedback forum in Microsoft Entra ID](https://feedback.azure.com/d365community/forum/22920db1-ad25-ec11-b6e6-000d3a4f0789) under the *SaaS application* category.
Authorization: Bearer {token}
```

## Configure a custom signing key using PowerShell
-Use PowerShell to [instantiate an MSAL Public Client Application](msal-net-initializing-client-applications.md#initializing-a-public-client-application-from-code) and use the [Authorization Code Grant](v2-oauth2-auth-code-flow.md) flow to obtain a delegated permission access token for Microsoft Graph. Use the access token to call Microsoft Graph and configure a custom signing key for the service principal. After you configure the custom signing key, your application code needs to [validate the token signing key](#validate-token-signing-key).
+Use PowerShell to [instantiate an MSAL Public Client Application](/entr) flow to obtain a delegated permission access token for Microsoft Graph. Use the access token to call Microsoft Graph and configure a custom signing key for the service principal. After you configure the custom signing key, your application code needs to [validate the token signing key](#validate-token-signing-key).
To run this script, you need:
active-directory Msal Acquire Cache Tokens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-acquire-cache-tokens.md
When your client requests an access token, Microsoft Entra ID also returns an au
## Next steps

Several of the platforms supported by MSAL have additional token cache-related information in the documentation for that platform's library. For example:

-- [Get a token from the token cache using MSAL.NET](msal-net-acquire-token-silently.md)
+- [Get a token from the token cache using MSAL.NET](/entra/msal/dotnet/acquiring-tokens/acquire-token-silently)
- [Single sign-on with MSAL.js](msal-js-sso.md)
- [Custom token cache serialization in MSAL for Python](/entra/msal/python/advanced/msal-python-token-cache-serialization)
- [Custom token cache serialization in MSAL for Java](/entra/msal/java/advanced/msal-java-token-cache-serialization)
active-directory Msal Client Application Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-client-application-configuration.md
The authority you specify in your code needs to be consistent with the **Support
The authority can be:

- A Microsoft Entra cloud authority.
-- An Azure AD B2C authority. See [B2C specifics](msal-net-b2c-considerations.md).
-- An Active Directory Federation Services (AD FS) authority. See [AD FS support](msal-net-adfs-support.md).
+- An Azure AD B2C authority. See [B2C specifics](/entra/msal/dotnet/acquiring-tokens/desktop-mobile/social-identities).
+- An Active Directory Federation Services (AD FS) authority. See [AD FS support](/entra/msal/dotnet/acquiring-tokens/desktop-mobile/adfs-support).
Microsoft Entra cloud authorities have two parts:
To help in debugging and authentication failure troubleshooting scenarios, the M
:::row:::
    :::column:::
- - [Logging in MSAL.NET](msal-logging-dotnet.md)
+ - [Logging in MSAL.NET](/entra/msal/dotnet/advanced/exceptions/msal-logging)
- [Logging in MSAL for Android](msal-logging-android.md)
- [Logging in MSAL.js](msal-logging-js.md)
:::column-end:::
To help in debugging and authentication failure troubleshooting scenarios, the M
## Next steps
-Learn about [instantiating client applications by using MSAL.NET](msal-net-initializing-client-applications.md) and [instantiating client applications by using MSAL.js](msal-js-initializing-client-applications.md).
+Learn about [instantiating client applications by using MSAL.NET](/entr).
active-directory Msal Client Applications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-client-applications.md
In MSAL, the client ID, also called the _application ID_ or _app ID_, is passed
For more information about application configuration and instantiating, see: - [Client application configuration options](msal-client-application-configuration.md)-- [Instantiating client applications by using MSAL.NET](msal-net-initializing-client-applications.md)
+- [Instantiating client applications by using MSAL.NET](/entra/msal/dotnet/getting-started/initializing-client-applications)
- [Instantiating client applications by using MSAL.js](msal-js-initializing-client-applications.md)
active-directory Msal Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-migration.md
ADAL to MSAL migration guide for different platforms are available in the follow
- [Migrate to MSAL iOS and macOS](migrate-objc-adal-msal.md)
- [Migrate to MSAL Java](/entra/msal/java/advanced/migrate-adal-msal-java)
- [Migrate to MSAL.js](msal-compare-msal-js-and-adal-js.md)
-- [Migrate to MSAL .NET](msal-net-migration.md)
+- [Migrate to MSAL .NET](/entra/msal/dotnet/how-to/msal-net-migration)
- [Migrate to MSAL Node](msal-node-migration.md)
- [Migrate to MSAL Python](/entra/msal/python/advanced/migrate-python-adal-msal)
active-directory Msal Net System Browser Android Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-net-system-browser-android-considerations.md
If authentication fails (for example, if authentication launches with DuckDuckGo
- **Mitigation**: Ask the user to enable a browser on their device. Recommend a browser that supports custom tabs.

## Next steps
-For more information and code examples, see [Choosing between an embedded web browser and a system browser on Xamarin Android](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/MSAL.NET-uses-web-browser#choosing-between-embedded-web-browser-or-system-browser-on-xamarinandroid) and [Embedded versus system web UI](msal-net-web-browsers.md#embedded-vs-system-web-ui).
+For more information and code examples, see [Choosing between an embedded web browser and a system browser on Xamarin Android](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/MSAL.NET-uses-web-browser#choosing-between-embedded-web-browser-or-system-browser-on-xamarinandroid) and [Embedded versus system web UI](/entra/msal/dotnet/acquiring-tokens/using-web-browsers#embedded-vs-system-web-ui).
active-directory Msal Net Use Brokers With Xamarin Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-net-use-brokers-with-xamarin-apps.md
Here are a few tips on avoiding issues when you implement brokered authenticatio
## Next steps
-Learn about [Considerations for using Universal Windows Platform with MSAL.NET](msal-net-uwp-considerations.md).
+Learn about [Considerations for using Universal Windows Platform with MSAL.NET](/entra/msal/dotnet/acquiring-tokens/desktop-mobile/uwp).
active-directory Msal Net Xamarin Android Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-net-xamarin-android-considerations.md
var authResult = AcquireTokenInteractive(scopes)
    .ExecuteAsync();
```
-For more information, see [Use web browsers for MSAL.NET](msal-net-web-browsers.md) and [Xamarin Android system browser considerations](msal-net-system-browser-android-considerations.md).
+For more information, see [Use web browsers for MSAL.NET](/entr).
## Troubleshooting
active-directory Quickstart Single Page App Javascript Sign In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/quickstart-single-page-app-javascript-sign-in.md
# Quickstart: Sign in users in a single-page app (SPA) and call the Microsoft Graph API using JavaScript
-This quickstart uses a sample JavaScript (JS) single-page app (SPA) to show you how to sign in users by using the [authorization code flow](./v2-oauth2-auth-code-flow.md) with Proof Key for Code Exchange (PKCE) and call the Microsoft Graph API. The sample uses the [Microsoft Authentication Library for JavaScript](/javascript/api/@azure/msal-react) to handle authentication.
+This quickstart uses a sample JavaScript (JS) single-page app (SPA) to show you how to sign in users by using the [authorization code flow](./v2-oauth2-auth-code-flow.md) with Proof Key for Code Exchange (PKCE) and call the Microsoft Graph API. The sample uses the [Microsoft Authentication Library for JavaScript](/javascript/api/%40azure/msal-react/) to handle authentication.
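The PKCE part of the authorization code flow mentioned above can be sketched independently of MSAL (which generates and submits these values for you). Per RFC 7636, the client sends `code_challenge = BASE64URL(SHA256(code_verifier))` with the authorization request, then proves possession of the verifier when redeeming the code:

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

def verify_pkce(verifier: str, challenge: str) -> bool:
    """The check the token endpoint performs when the code is redeemed."""
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode() == challenge
```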
## Prerequisites
active-directory Quickstart V2 Aspnet Core Webapp Calls Graph https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/quickstart-v2-aspnet-core-webapp-calls-graph.md
>
> ## About the code
>
-> This section gives an overview of the code required to sign in users and call the Microsoft Graph API on their behalf. This overview can be useful to understand how the code works, main arguments, and also if you want to add sign-in to an existing ASP.NET Core application and call Microsoft Graph. It uses [Microsoft.Identity.Web](microsoft-identity-web.md), which is a wrapper around [MSAL.NET](msal-overview.md).
+> This section gives an overview of the code required to sign in users and call the Microsoft Graph API on their behalf. This overview can be useful to understand how the code works, main arguments, and also if you want to add sign-in to an existing ASP.NET Core application and call Microsoft Graph. It uses [Microsoft.Identity.Web](/entr).
>
> ### How the sample works
>
>
> The `AddAuthentication()` method configures the service to add cookie-based authentication, which is used in browser scenarios and to set the challenge to OpenID Connect.
>
-> The line containing `.AddMicrosoftIdentityWebApp` adds the Microsoft identity platform authentication to your application. This is provided by [Microsoft.Identity.Web](microsoft-identity-web.md). It's then configured to sign in using the Microsoft identity platform based on the information in the `AzureAD` section of the *appsettings.json* configuration file:
+> The line containing `.AddMicrosoftIdentityWebApp` adds the Microsoft identity platform authentication to your application. This is provided by [Microsoft.Identity.Web](/entra/msal/dotnet/microsoft-identity-web/). It's then configured to sign in using the Microsoft identity platform based on the information in the `AzureAD` section of the *appsettings.json* configuration file:
>
> | *appsettings.json* key | Description |
> |---|---|
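As a hedged sketch, the `AzureAd` section that `.AddMicrosoftIdentityWebApp` reads typically looks something like the following; the GUIDs and domain are placeholders, and exact key names can vary slightly by project template version:

```json
{
  "AzureAd": {
    "Instance": "https://login.microsoftonline.com/",
    "Domain": "contoso.onmicrosoft.com",
    "TenantId": "00000000-0000-0000-0000-000000000000",
    "ClientId": "11111111-1111-1111-1111-111111111111",
    "CallbackPath": "/signin-oidc"
  }
}
```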
active-directory Reference Breaking Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/reference-breaking-changes.md
Check this article regularly to learn about:
> [!TIP]
> To be notified of updates to this page, add this URL to your RSS feed reader:<br/>`https://learn.microsoft.com/api/search/rss?search=%22Azure+Active+Directory+breaking+changes+reference%22&locale=en-us`
+## October 2023
+
+### Updated RemoteConnect UX Prompt
+
+**Effective date**: October 2023
+
+**Endpoints impacted**: v2.0 and v1.0
+
+**Protocol impacted**: RemoteConnect
+
+RemoteConnect is a cross-device flow that is used for Microsoft Authentication Broker and Microsoft Intune-related scenarios involving [Primary Refresh Tokens](../devices/concept-primary-refresh-token.md). To help prevent phishing attacks, the RemoteConnect flow will receive updated UX language calling out that the remote device (the device that initiated the flow) will be able to access any applications used by your organization upon successful completion of the flow.
+
+The prompt that appears will look something like this:
+
## June 2023

### Omission of email claims with an unverified domain owner
You can review the current text of the 50105 error and more on the error lookup
**Change**

For single tenant applications, adding or updating the AppId URI validates that the domain in the HTTPS scheme URI is listed in the verified domain list in the customer tenant or that the value uses the default scheme (`api://{appId}`) provided by Azure AD. This could prevent applications from adding an AppId URI if the domain isn't in the verified domain list or the value doesn't use the default scheme.
-To find more information on verified domains, refer to the [custom domains documentation](../../active-directory/fundamentals/add-custom-domain.md).
+To find more information on verified domains, refer to the [custom domains documentation](../fundamentals/add-custom-domain.md).
The change doesn't affect existing applications using unverified domains in their AppID URI. It validates only new applications or when an existing application updates an identifier URI or adds a new one to the identifierUri collection. The new restrictions apply only to URIs added to an app's identifierUris collection after October 15, 2021. AppId URIs already in an application's identifierUris collection when the restriction takes effect on October 15, 2021 will continue to function even if you add new URIs to that collection.
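The validation rule described above can be sketched as follows. This is a simplified, hypothetical checker (the actual Azure AD logic covers more URI forms); it accepts the default `api://{appId}` value or an HTTPS URI whose host falls under a verified domain:

```python
from urllib.parse import urlparse

def appid_uri_allowed(uri, app_id, verified_domains):
    """Sketch of the single-tenant AppId URI rule: allow the default
    api://{appId} scheme, or an https URI on a verified domain."""
    if uri == f"api://{app_id}":
        return True
    parsed = urlparse(uri)
    if parsed.scheme != "https" or not parsed.hostname:
        return False
    host = parsed.hostname.lower()
    return any(host == d or host.endswith("." + d) for d in verified_domains)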
active-directory Reference Claims Mapping Policy Type https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/reference-claims-mapping-policy-type.md
Based on the method chosen, a set of inputs and outputs is expected. Define the
|-|-|--|-|
| **Join** | string1, string2, separator | output claim | Joins input strings by using a separator in between. For example, string1:`foo@bar.com` , string2:`sandbox` , separator:`.` results in output claim:`foo@bar.com.sandbox`. |
| **ExtractMailPrefix** | Email or UPN | extracted string | Extension attributes 1-15 or any other directory extensions, which store a UPN or email address value for the user. For example, `johndoe@contoso.com`. Extracts the local part of an email address. For example, mail:`foo@bar.com` results in output claim:`foo`. If no \@ sign is present, then the original input string is returned. |
+| **ToLowercase()** | string | output string | Converts the characters of the selected attribute into lowercase characters. |
+| **ToUppercase()** | string | output string | Converts the characters of the selected attribute into uppercase characters. |
+| **RegexReplace()** | | | RegexReplace() transformation accepts as input parameters:<br/>- Parameter 1: a user attribute as regex input<br/>- An option to trust the source as multivalued<br/>- Regex pattern<br/>- Replacement pattern. The replacement pattern may contain static text format along with a reference that points to regex output groups and more input parameters. |
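The transformation semantics in the table above can be sketched in a few lines. These are illustrative re-creations of the documented behavior, not the service's code, and `regex_replace` shows only the basic pattern/replacement case:

```python
import re

def join(string1, string2, separator):
    """Join: string1 + separator + string2."""
    return f"{string1}{separator}{string2}"

def extract_mail_prefix(value):
    """ExtractMailPrefix: local part of an email/UPN; unchanged if no '@'."""
    return value.split("@", 1)[0] if "@" in value else value

def to_lowercase(value):
    """ToLowercase: lowercase the attribute value."""
    return value.lower()

def regex_replace(value, pattern, replacement):
    """RegexReplace (simplified): apply a regex substitution to the attribute."""
    return re.sub(pattern, replacement, value)
```

For example, `join("foo@bar.com", "sandbox", ".")` reproduces the table's `foo@bar.com.sandbox` output, and `extract_mail_prefix("foo@bar.com")` yields `foo`.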
- **InputClaims** - Used to pass the data from a claim schema entry to a transformation. It has three attributes: **ClaimTypeReferenceId**, **TransformationClaimType** and **TreatAsMultiValue**.
  - **ClaimTypeReferenceId** - Joined with the ID element of the claim schema entry to find the appropriate input claim.
active-directory Reference Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/reference-error-codes.md
Previously updated : 06/07/2023
Last updated : 10/17/2023
The `error` field has several possible values - review the protocol documentatio
| Error | Description |
|---|---|
-| AADSTS16000 | SelectUserAccount - This is an interrupt thrown by Microsoft Entra ID, which results in UI that allows the user to select from among multiple valid SSO sessions. This error is fairly common and may be returned to the application if `prompt=none` is specified. |
+| AADSTS16000 | SelectUserAccount - This is an interrupt thrown by Microsoft Entra ID, which results in UI that allows the user to select from among multiple valid SSO sessions. This error is fairly common and might be returned to the application if `prompt=none` is specified. |
| AADSTS16001 | UserAccountSelectionInvalid - You'll see this error if the user selects a tile that the session select logic has rejected. When triggered, this error allows the user to recover by picking from an updated list of tiles/sessions, or by choosing another account. This error can occur because of a code defect or race condition. |
| AADSTS16002 | AppSessionSelectionInvalid - The app-specified SID requirement wasn't met. |
| AADSTS160021 | AppSessionSelectionInvalidSessionNotExist - Application requested a user session which does not exist. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS50000 | TokenIssuanceError - There's an issue with the sign-in service. [Open a support ticket](../fundamentals/how-to-get-support.md) to resolve this issue. |
| AADSTS50001 | InvalidResource - The resource is disabled or doesn't exist. Check your app's code to ensure that you have specified the exact resource URL for the resource you're trying to access. |
| AADSTS50002 | NotAllowedTenant - Sign-in failed because of a restricted proxy access on the tenant. If it's your own tenant policy, you can change your restricted tenant settings to fix this issue. |
-| AADSTS500011 | InvalidResourceServicePrincipalNotFound - The resource principal named {name} was not found in the tenant named {tenant}. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You might have sent your authentication request to the wrong tenant. If you expect the app to be installed, you may need to provide administrator permissions to add it. Check with the developers of the resource and application to understand what the right setup for your tenant is. |
+| AADSTS500011 | InvalidResourceServicePrincipalNotFound - The resource principal named {name} was not found in the tenant named {tenant}. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You might have sent your authentication request to the wrong tenant. If you expect the app to be installed, you might need to provide administrator permissions to add it. Check with the developers of the resource and application to understand what the right setup for your tenant is. |
| AADSTS500021 | Access to '{tenant}' tenant is denied. AADSTS500021 indicates that the tenant restriction feature is configured and that the user is trying to access a tenant that isn't in the list of allowed tenants specified in the header `Restrict-Access-To-Tenant`. For more information, see [Use tenant restrictions to manage access to SaaS cloud applications](../manage-apps/tenant-restrictions.md).| | AADSTS500022 | Access to '{tenant}' tenant is denied. AADSTS500022 indicates that the tenant restriction feature is configured and that the user is trying to access a tenant that isn't in the list of allowed tenants specified in the header `Restrict-Access-To-Tenant`. For more information, see [Use tenant restrictions to manage access to SaaS cloud applications](../manage-apps/tenant-restrictions.md).| | AADSTS50003 | MissingSigningKey - Sign-in failed because of a missing signing key or certificate. This might be because there was no signing key configured in the app. To learn more, see the troubleshooting article for error [AADSTS50003](/troubleshoot/azure/active-directory/error-code-aadsts50003-cert-or-key-not-configured). If you still see issues, contact the app owner or an app admin. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS50014 | GuestUserInPendingState - The user account doesn't exist in the directory. An application likely chose the wrong tenant to sign into, and the currently logged in user was prevented from doing so since they did not exist in your tenant. If this user should be able to log in, add them as a guest. For further information, please visit [add B2B users](/azure/active-directory/b2b/add-users-administrator). |
| AADSTS50015 | ViralUserLegalAgeConsentRequiredState - The user requires legal age group consent. |
| AADSTS50017 | CertificateValidationFailed - Certificate validation failed for the following reasons:<ul><li>Cannot find issuing certificate in trusted certificates list</li><li>Unable to find expected CrlSegment</li><li>Cannot find issuing certificate in trusted certificates list</li><li>Delta CRL distribution point is configured without a corresponding CRL distribution point</li><li>Unable to retrieve valid CRL segments because of a timeout issue</li><li>Unable to download CRL</li></ul>Contact the tenant admin. |
-| AADSTS50020 | UserUnauthorized - Users are unauthorized to call this endpoint. User account '{email}' from identity provider '{idp}' does not exist in tenant '{tenant}' and cannot access the application '{appid}'({appName}) in that tenant. This account needs to be added as an external user in the tenant first. Sign out and sign in again with a different Microsoft Entra user account. If this user should be a member of the tenant, they should be invited via the [B2B system](/azure/active-directory/b2b/add-users-administrator). For additional information, visit [AADSTS50020](/troubleshoot/azure/active-directory/error-code-aadsts50020-user-account-identity-provider-does-not-exist). |
+| AADSTS50020 | UserUnauthorized - Users are unauthorized to call this endpoint. User account '{email}' from identity provider '{idp}' does not exist in tenant '{tenant}' and cannot access the application '{appid}'({appName}) in that tenant. This account needs to be added as an external user in the tenant first. Sign out and sign in again with a different Microsoft Entra user account. If this user should be a member of the tenant, they should be invited via the [B2B system](../external-identities/add-users-administrator.md). For additional information, visit [AADSTS50020](/troubleshoot/azure/active-directory/error-code-aadsts50020-user-account-identity-provider-does-not-exist). |
| AADSTS500208 | The domain is not a valid login domain for the account type - This situation occurs when the user's account does not match the expected account type for the given tenant. For instance, if the tenant is configured to allow only work or school accounts, and the user tries to sign in with a personal Microsoft account, they will receive this error. |
| AADSTS500212 | NotAllowedByOutboundPolicyTenant - The user's administrator has set an outbound access policy that doesn't allow access to the resource tenant. |
| AADSTS500213 | NotAllowedByInboundPolicyTenant - The resource tenant's cross-tenant access policy doesn't allow this user to access this tenant. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS50029 | Invalid URI - domain name contains invalid characters. Contact the tenant admin. |
| AADSTS50032 | WeakRsaKey - Indicates the erroneous user attempt to use a weak RSA key. |
| AADSTS50033 | RetryableError - Indicates a transient error not related to the database operations. |
-| AADSTS50034 | UserAccountNotFound - To sign into this application, the account must be added to the directory. This error can occur because the user mis-typed their username, or isn't in the tenant. An application may have chosen the wrong tenant to sign into, and the currently logged in user was prevented from doing so since they did not exist in your tenant. If this user should be able to log in, add them as a guest. See docs here: [Add B2B users](../external-identities/add-users-administrator.md). |
+| AADSTS50034 | UserAccountNotFound - To sign into this application, the account must be added to the directory. This error can occur because the user mis-typed their username, or isn't in the tenant. An application might have chosen the wrong tenant to sign into, and the currently logged in user was prevented from doing so since they did not exist in your tenant. If this user should be able to log in, add them as a guest. See docs here: [Add B2B users](../external-identities/add-users-administrator.md). |
| AADSTS50042 | UnableToGeneratePairwiseIdentifierWithMissingSalt - The salt required to generate a pairwise identifier is missing in principle. Contact the tenant admin. |
| AADSTS50043 | UnableToGeneratePairwiseIdentifierWithMultipleSalts |
| AADSTS50048 | SubjectMismatchesIssuer - Subject mismatches Issuer claim in the client assertion. Contact the tenant admin. |
| AADSTS50049 | NoSuchInstanceForDiscovery - Unknown or invalid instance. |
| AADSTS50050 | MalformedDiscoveryRequest - The request is malformed. |
| AADSTS50053 | This error can result from two different reasons: <br><ul><li>IdsLocked - The account is locked because the user tried to sign in too many times with an incorrect user ID or password. The user is blocked due to repeated sign-in attempts. See [Remediate risks and unblock users](../identity-protection/howto-identity-protection-remediate-unblock.md).</li><li>Or, sign-in was blocked because it came from an IP address with malicious activity.</li></ul> <br>To determine which failure reason caused this error, sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least a [Cloud Application Administrator](../roles/permissions-reference.md#cloud-application-administrator). Navigate to your Microsoft Entra tenant and then **Monitoring & health** -> **Sign-in logs**. Find the failed user sign-in with **Sign-in error code** 50053 and check the **Failure reason**. |
-| AADSTS50055 | InvalidPasswordExpiredPassword - The password is expired. The user's password is expired, and therefore their login or session was ended. They will be offered the opportunity to reset it, or may ask an admin to reset it via [Reset a user's password using Microsoft Entra ID](../fundamentals/users-reset-password-azure-portal.md). |
+| AADSTS50055 | InvalidPasswordExpiredPassword - The password is expired. The user's password is expired, and therefore their login or session was ended. They will be offered the opportunity to reset it, or can ask an admin to reset it via [Reset a user's password using Microsoft Entra ID](../fundamentals/users-reset-password-azure-portal.md). |
| AADSTS50056 | Invalid or null password: password doesn't exist in the directory for this user. The user should be asked to enter their password again. |
| AADSTS50057 | UserDisabled - The user account is disabled. The user object in Active Directory backing this account has been disabled. An admin can re-enable this account [through PowerShell](/powershell/module/activedirectory/enable-adaccount) |
-| AADSTS50058 | UserInformationNotProvided - Session information isn't sufficient for single-sign-on. This means that a user isn't signed in. This is a common error that's expected when a user is unauthenticated and has not yet signed in.</br>If this error is encountered in an SSO context where the user has previously signed in, this means that the SSO session was either not found or invalid.</br>This error may be returned to the application if prompt=none is specified. |
+| AADSTS50058 | UserInformationNotProvided - Session information isn't sufficient for single-sign-on. This means that a user isn't signed in. This is a common error that's expected when a user is unauthenticated and has not yet signed in.</br>If this error is encountered in an SSO context where the user has previously signed in, this means that the SSO session was either not found or invalid.</br>This error might be returned to the application if prompt=none is specified. |
| AADSTS50059 | MissingTenantRealmAndNoUserInformationProvided - Tenant-identifying information was not found in either the request or implied by any provided credentials. The user can contact the tenant admin to help resolve the issue. |
| AADSTS50061 | SignoutInvalidRequest - Unable to complete sign out. The request was invalid. |
| AADSTS50064 | CredentialAuthenticationError - Credential validation on username or password has failed. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS50071 | SignoutMessageExpired - The logout request has expired. |
| AADSTS50072 | UserStrongAuthEnrollmentRequiredInterrupt - User needs to enroll for second factor authentication (interactive). |
| AADSTS50074 | UserStrongAuthClientAuthNRequiredInterrupt - Strong authentication is required and the user did not pass the MFA challenge. |
-| AADSTS50076 | UserStrongAuthClientAuthNRequired - Due to a configuration change made by the admin such as a Conditional Access policy, per-user enforcement, or because you moved to a new location, the user must use multi-factor authentication to access the resource. Retry with a new authorize request for the resource. |
-| AADSTS50078 | UserStrongAuthExpired- Presented multi-factor authentication has expired due to policies configured by your administrator, you must refresh your multi-factor authentication to access '{resource}'.|
-| AADSTS50079 | UserStrongAuthEnrollmentRequired - Due to a configuration change made by the admin such as a Conditional Access policy, per-user enforcement, or because the user moved to a new location, the user is required to use multi-factor authentication. Either a managed user needs to register security info to complete multi-factor authentication, or a federated user needs to get the multi-factor claim from the federated identity provider. |
+| AADSTS50076 | UserStrongAuthClientAuthNRequired - Due to a configuration change made by the admin such as a Conditional Access policy, per-user enforcement, or because you moved to a new location, the user must use multifactor authentication to access the resource. Retry with a new authorize request for the resource. |
+| AADSTS50078 | UserStrongAuthExpired- Presented multifactor authentication has expired due to policies configured by your administrator, you must refresh your multifactor authentication to access '{resource}'.|
+| AADSTS50079 | UserStrongAuthEnrollmentRequired - Due to a configuration change made by the admin such as a Conditional Access policy, per-user enforcement, or because the user moved to a new location, the user is required to use multifactor authentication. Either a managed user needs to register security info to complete multifactor authentication, or a federated user needs to get the multifactor claim from the federated identity provider. |
| AADSTS50085 | Refresh token needs social IDP login. Have user try signing-in again with username and password. |
| AADSTS50086 | SasNonRetryableError |
| AADSTS50087 | SasRetryableError - A transient error has occurred during strong authentication. Please try again. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS50178 | SessionControlNotSupportedForPassthroughUsers - Session control isn't supported for passthrough users. |
| AADSTS50180 | WindowsIntegratedAuthMissing - Integrated Windows authentication is needed. Enable the tenant for Seamless SSO. |
| AADSTS50187 | DeviceInformationNotProvided - The service failed to perform device authentication. |
-| AADSTS50194 | Application '{appId}'({appName}) isn't configured as a multi-tenant application. Usage of the /common endpoint isn't supported for such applications created after '{time}'. Use a tenant-specific endpoint or configure the application to be multi-tenant. |
+| AADSTS50194 | Application '{appId}'({appName}) isn't configured as a multitenant application. Usage of the /common endpoint isn't supported for such applications created after '{time}'. Use a tenant-specific endpoint or configure the application to be multitenant. |
| AADSTS50196 | LoopDetected - A client loop has been detected. Check the app's logic to ensure that token caching is implemented, and that error conditions are handled correctly. The app has made too many of the same request in too short a period, indicating that it is in a faulty state or is abusively requesting tokens. |
| AADSTS50197 | ConflictingIdentities - The user could not be found. Try signing in again. |
| AADSTS50199 | CmsiInterrupt - For security reasons, user confirmation is required for this request. Interrupt is shown for all scheme redirects in mobile browsers. <br />No action required. The user was asked to confirm that this app is the application they intended to sign into. <br />This is a security feature that helps prevent spoofing attacks. This occurs because a system webview has been used to request a token for a native application. <br />To avoid this prompt, the redirect URI should be part of the following safe list: <br />http://<br />https://<br />chrome-extension:// (desktop Chrome browser only) |
The `error` field has several possible values - review the protocol documentatio
| AADSTS53002 | ApplicationUsedIsNotAnApprovedApp - The app used isn't an approved app for Conditional Access. User needs to use one of the apps from the list of approved apps in order to get access. |
| AADSTS53003 | BlockedByConditionalAccess - Access has been blocked by Conditional Access policies. The access policy does not allow token issuance. If this is unexpected, see the Conditional Access policy that applied to this request or contact your administrator. For additional information, please visit [troubleshooting sign-in with Conditional Access](../conditional-access/troubleshoot-conditional-access.md). |
| AADSTS530035 | BlockedBySecurityDefaults - Access has been blocked by security defaults. This is due to the request using legacy auth or being deemed unsafe by security defaults policies. For additional information, please visit [enforced security policies](../fundamentals/security-defaults.md#enforced-security-policies). |
-| AADSTS53004 | ProofUpBlockedDueToRisk - User needs to complete the multi-factor authentication registration process before accessing this content. User should register for multi-factor authentication. |
-| AADSTS53010 | ProofUpBlockedDueToSecurityInfoAcr - Cannot configure multi-factor authentication methods because the organization requires this information to be set from specific locations or devices. |
+| AADSTS53004 | ProofUpBlockedDueToRisk - User needs to complete the multifactor authentication registration process before accessing this content. User should register for multifactor authentication. |
+| AADSTS53010 | ProofUpBlockedDueToSecurityInfoAcr - Cannot configure multifactor authentication methods because the organization requires this information to be set from specific locations or devices. |
| AADSTS53011 | User blocked due to risk on home tenant. |
| AADSTS530034 | DelegatedAdminBlockedDueToSuspiciousActivity - A delegated administrator was blocked from accessing the tenant due to account risk in their home tenant. |
| AADSTS54000 | MinorUserBlockedLegalAgeGroupRule |
| AADSTS54005 | OAuth2 Authorization code was already redeemed, please retry with a new valid code or use an existing refresh token. |
| AADSTS65001 | DelegationDoesNotExist - The user or administrator has not consented to use the application with ID X. Send an interactive authorization request for this user and resource. |
-| AADSTS65002 | Consent between first party application '{applicationId}' and first party resource '{resourceId}' must be configured via preauthorization - applications owned and operated by Microsoft must get approval from the API owner before requesting tokens for that API. A developer in your tenant may be attempting to reuse an App ID owned by Microsoft. This error prevents them from impersonating a Microsoft application to call other APIs. They must move to another app ID they register.|
+| AADSTS65002 | Consent between first party application '{applicationId}' and first party resource '{resourceId}' must be configured via preauthorization - applications owned and operated by Microsoft must get approval from the API owner before requesting tokens for that API. A developer in your tenant might be attempting to reuse an App ID owned by Microsoft. This error prevents them from impersonating a Microsoft application to call other APIs. They must move to another app ID they register.|
| AADSTS65004 | UserDeclinedConsent - User declined to consent to access the app. Have the user retry the sign-in and consent to the app. |
-| AADSTS65005 | MisconfiguredApplication - The app required resource access list does not contain apps discoverable by the resource or The client app has requested access to resource, which was not specified in its required resource access list or Graph service returned bad request or resource not found. If the app supports SAML, you may have configured the app with the wrong Identifier (Entity). To learn more, see the troubleshooting article for error [AADSTS650056](/troubleshoot/azure/active-directory/error-code-aadsts650056-misconfigured-app). |
+| AADSTS65005 | MisconfiguredApplication - The app required resource access list does not contain apps discoverable by the resource or The client app has requested access to resource, which was not specified in its required resource access list or Graph service returned bad request or resource not found. If the app supports SAML, you might have configured the app with the wrong Identifier (Entity). To learn more, see the troubleshooting article for error [AADSTS650056](/troubleshoot/azure/active-directory/error-code-aadsts650056-misconfigured-app). |
| AADSTS650052 | The app needs access to a service `(\"{name}\")` that your organization `\"{organization}\"` has not subscribed to or enabled. Contact your IT Admin to review the configuration of your service subscriptions. | | AADSTS650054 | The application asked for permissions to access a resource that has been removed or is no longer available. Make sure that all resources the app is calling are present in the tenant you're operating in. | | AADSTS650056 | Misconfigured application. This could be due to one of the following: the client has not listed any permissions for '{name}' in the requested permissions in the client's application registration. Or, the admin has not consented in the tenant. Or, check the application identifier in the request to ensure it matches the configured client application identifier. Or, check the certificate in the request to ensure it's valid. Please contact your admin to fix the configuration or consent on behalf of the tenant. Client app ID: {ID}.| | AADSTS650057 | Invalid resource. The client has requested access to a resource which isn't listed in the requested permissions in the client's application registration. Client app ID: {appId}({appName}). Resource value from request: {resource}. Resource app ID: {resourceAppId}. List of valid resources from app registration: {regList}. | | AADSTS67003 | ActorNotValidServiceIdentity |
-| AADSTS70000 | InvalidGrant - Authentication failed. The refresh token isn't valid. Error may be due to the following reasons:<ul><li>Token binding header is empty</li><li>Token binding hash does not match</li></ul> |
+| AADSTS70000 | InvalidGrant - Authentication failed. The refresh token isn't valid. Error might be due to the following reasons:<ul><li>Token binding header is empty</li><li>Token binding hash does not match</li></ul> |
| AADSTS70001 | UnauthorizedClient - The application is disabled. To learn more, see the troubleshooting article for error [AADSTS70001](/troubleshoot/azure/active-directory/error-code-aadsts70001-app-not-found-in-directory). | | AADSTS700011 | UnauthorizedClientAppNotFoundInOrgIdTenant - Application with identifier {appIdentifier} was not found in the directory. A client application requested a token from your tenant, but the client app doesn't exist in your tenant, so the call failed. | | AADSTS70002 | InvalidClient - Error validating the credentials. The specified client_secret does not match the expected value for this client. Correct the client_secret and try again. For more info, see [Use the authorization code to request an access token](v2-oauth2-auth-code-flow.md#redeem-a-code-for-an-access-token). |
The `error` field has several possible values - review the protocol documentatio
| AADSTS9001023 | The grant type isn't supported over the /common or /consumers endpoints. Please use the /organizations or tenant-specific endpoint.| | AADSTS90012 | RequestTimeout - The request has timed out. | | AADSTS90013 | InvalidUserInput - The input from the user isn't valid. |
-| AADSTS90014 | MissingRequiredField - This error code may appear in various cases when an expected field isn't present in the credential. |
+| AADSTS90014 | MissingRequiredField - This error code might appear in various cases when an expected field isn't present in the credential. |
| AADSTS900144 | The request body must contain the following parameter: '{name}'. Developer error - the app is attempting to sign in without the necessary or correct authentication parameters.| | AADSTS90015 | QueryStringTooLong - The query string is too long. | | AADSTS90016 | MissingRequiredClaim - The access token isn't valid. The required claim is missing. |
The `error` field has several possible values - review the protocol documentatio
| AADSTS90033 | MsodsServiceUnavailable - The Microsoft Online Directory Service (MSODS) isn't available. | | AADSTS90036 | MsodsServiceUnretryableFailure - An unexpected, non-retryable error from the WCF service hosted by MSODS has occurred. [Open a support ticket](../fundamentals/how-to-get-support.md) to get more details on the error. | | AADSTS90038 | NationalCloudTenantRedirection - The specified tenant 'Y' belongs to the National Cloud 'X'. Current cloud instance 'Z' does not federate with X. A cloud redirect error is returned. |
+| AADSTS900384 | JWT token failed signature validation. Actual message content is runtime specific, there are a variety of causes for this error. Please see the returned exception message for details.|
| AADSTS90043 | NationalCloudAuthCodeRedirection - The feature is disabled. | | AADSTS900432 | Confidential Client isn't supported in Cross Cloud request.| | AADSTS90051 | InvalidNationalCloudId - The national cloud identifier contains an invalid cloud identifier. |
active-directory Sample V2 Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/sample-v2-code.md
The following samples show how to build applications for the Python language and
> [!div class="mx-tdCol2BreakAll"] > | App type | Code sample(s) <br/> on GitHub |Auth<br/> libraries |Auth flow | > | -- | -- |-- |-- |
-> | Web application | &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in-b2c) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/2-Authorization-I/call-graph) <br/> &#8226; [Deploy to Azure App Service](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/3-Deployment/deploy-to-azure-app-service)| [MSAL Python](/entra/msal/python) | Authorization code |
+> | Web application | &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in-b2c) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/2-Authorization-I/call-graph) <br/> &#8226; [Deploy to Azure App Service](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/3-Deployment/deploy-to-azure-app-service)| [MSAL Python](/entra/msal/python/) | Authorization code |
### Kotlin
The following samples show how to build applications with Windows Presentation F
> | App type | Code sample(s) <br/> on GitHub |Auth<br/> libraries |Auth flow | > | -- | -- |-- |-- | > | Desktop | [Sign in users and call Microsoft Graph](https://github.com/Azure-Samples/active-directory-dotnet-native-aspnetcore-v2/tree/master/2.%20Web%20API%20now%20calls%20Microsoft%20Graph) | [MSAL.NET](/entra/msal/dotnet) | Authorization code with PKCE |
-> | Desktop | &#8226; [Sign in users and call ASP.NET Core web API](https://github.com/Azure-Samples/active-directory-dotnet-native-aspnetcore-v2/tree/master/1.%20Desktop%20app%20calls%20Web%20API) <br/> &#8226; [Sign in users and call Microsoft Graph](https://github.com/azure-samples/active-directory-dotnet-desktop-msgraph-v2) | [MSAL.NET](/entra/msal/dotnet) | Authorization code with PKCE |
+> | Desktop | &#8226; [Sign in users and call ASP.NET Core web API](https://github.com/Azure-Samples/active-directory-dotnet-native-aspnetcore-v2/tree/master/1.%20Desktop%20app%20calls%20Web%20API) <br/> &#8226; [Sign in users and call Microsoft Graph](https://github.com/azure-samples/active-directory-dotnet-desktop-msgraph-v2) | [MSAL.NET](/entra/msal/dotnet/) | Authorization code with PKCE |
active-directory Scenario Daemon App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-daemon-app-configuration.md
MSAL.NET has two methods to provide signed assertions to the confidential client
- `.WithClientAssertion()` - `.WithClientClaims()`
-When you use `WithClientAssertion`, provide a signed JWT. This advanced scenario is detailed in [Client assertions](msal-net-client-assertions.md).
+When you use `WithClientAssertion`, provide a signed JWT. This advanced scenario is detailed in [Client assertions](/entra/msal/dotnet/acquiring-tokens/msal-net-client-assertions).
```csharp
string signedClientAssertion = ComputeAssertion();
app = ConfidentialClientApplicationBuilder.Create(config.ClientId)
                                          .WithClientAssertion(signedClientAssertion)
                                          .Build();
```
-Again, for details, see [Client assertions](msal-net-client-assertions.md).
+Again, for details, see [Client assertions](/entra/msal/dotnet/acquiring-tokens/msal-net-client-assertions).
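The shape of such an assertion can be sketched without a signing key: it's a JWT whose audience is the tenant's token endpoint and whose issuer and subject are both the app's client ID. The stdlib-only Python sketch below builds only the unsigned `header.payload` portion — a real assertion must additionally be signed (RS256) with your certificate's private key, and the identifiers shown are placeholders:

```python
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    # JWT segments use unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def unsigned_assertion(client_id: str, tenant_id: str) -> str:
    header = {"alg": "RS256", "typ": "JWT"}
    now = int(time.time())
    claims = {
        "aud": f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        "iss": client_id,          # issuer and subject are both the client ID
        "sub": client_id,
        "jti": str(uuid.uuid4()),  # unique per assertion
        "nbf": now,
        "exp": now + 600,          # keep assertions short-lived
    }
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())

# Placeholder GUID and tenant; the signature segment is intentionally absent.
print(unsigned_assertion("00001111-aaaa-2222-bbbb-3333cccc4444", "contoso.onmicrosoft.com"))
```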
active-directory Scenario Daemon Production https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-daemon-production.md
You'll need to explain to your customers how to perform these operations. For mo
- Reference documentation for: - Instantiating [ConfidentialClientApplication](/dotnet/api/microsoft.identity.client.confidentialclientapplicationbuilder).
- - Calling [AcquireTokenForClient](/dotnet/api/microsoft.identity.client.acquiretokenforclientparameterbuilder?view=azure-dotnet&preserve-view=true).
+ - Calling [AcquireTokenForClient](/dotnet/api/microsoft.identity.client.acquiretokenforclientparameterbuilder?preserve-view=true&view=msal-dotnet-latest&viewFallbackFrom=azure-dotnet).
- Other samples/tutorials: - [microsoft-identity-platform-console-daemon](https://github.com/Azure-Samples/microsoft-identity-platform-console-daemon) features a small .NET Core daemon console application that displays the users of a tenant querying Microsoft Graph.
active-directory Scenario Desktop Acquire Token Interactive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-desktop-acquire-token-interactive.md
catch(MsalUiRequiredException)
On both desktop and mobile applications, it's important to specify the parent by using `.WithParentActivityOrWindow`. In many cases it's a requirement, and MSAL throws an exception when no parent is specified.
-For desktop applications, see [Parent window handles](/azure/active-directory/develop/scenario-desktop-acquire-token-wam#parent-window-handles).
+For desktop applications, see [Parent window handles](./scenario-desktop-acquire-token-wam.md#parent-window-handles).
For mobile applications, provide `Activity` (Android) or `UIViewController` (iOS).
The structure defines the following constants:
#### WithUseEmbeddedWebView
-This method enables you to specify if you want to force the usage of an embedded WebView or the system WebView (when available). For more information, see [Usage of web browsers](msal-net-web-browsers.md).
+This method enables you to specify if you want to force the usage of an embedded WebView or the system WebView (when available). For more information, see [Usage of web browsers](/entra/msal/dotnet/acquiring-tokens/using-web-browsers).
```csharp var result = await app.AcquireTokenInteractive(scopes)
active-directory Scenario Mobile App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-mobile-app-configuration.md
For considerations about the browsers on Android, see [Xamarin.Android-specific
On UWP, you can use corporate networks. The following sections explain the tasks that you should complete in the corporate scenario.
-For more information, see [UWP-specific considerations with MSAL.NET](msal-net-uwp-considerations.md).
+For more information, see [UWP-specific considerations with MSAL.NET](/entra/msal/dotnet/acquiring-tokens/desktop-mobile/uwp).
## Configure the application to use the broker
active-directory Scenario Web App Call Api App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-web-app-call-api-app-configuration.md
The following image shows the various possibilities of *Microsoft.Identity.Web*
:::image type="content" source="media/scenarios/microsoft-identity-web-startup-cs.svg" alt-text="Block diagram showing service configuration options in startup dot C S for calling a web API and specifying a token cache implementation"::: > [!NOTE]
-> To fully understand the code examples here, be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
+> To fully understand the code examples here, be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals/), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
Code examples in this article and the following one are extracted from the [ASP.NET Web app sample](https://github.com/Azure-Samples/ms-identity-aspnet-webapp-openidconnect). You might want to refer to that sample for full implementation details.
See `app.py` for the full context of that code.
Instead of a client secret, the confidential client application can also prove its identity by using a client certificate or a client assertion.
-The use of client assertions is an advanced scenario, detailed in [Client assertions](msal-net-client-assertions.md).
+The use of client assertions is an advanced scenario, detailed in [Client assertions](/entra/msal/dotnet/acquiring-tokens/msal-net-client-assertions).
## Token cache > [!IMPORTANT]
-> The token-cache implementation for web apps or web APIs is different from the implementation for desktop applications, which is often [file based](msal-net-token-cache-serialization.md).
+> The token-cache implementation for web apps or web APIs is different from the implementation for desktop applications, which is often [file based](/entra/msal/dotnet/how-to/token-cache-serialization).
> For security and performance reasons, it's important to ensure that for web apps and web APIs there is one token cache per user account. You must serialize the token cache for each account. # [ASP.NET Core](#tab/aspnetcore)
-The ASP.NET core tutorial uses dependency injection to let you decide the token cache implementation in the Startup.cs file for your application. Microsoft.Identity.Web comes with prebuilt token-cache serializers described in [Token cache serialization](msal-net-token-cache-serialization.md). An interesting possibility is to choose ASP.NET Core [distributed memory caches](/aspnet/core/performance/caching/distributed#distributed-memory-cache):
+The ASP.NET core tutorial uses dependency injection to let you decide the token cache implementation in the Startup.cs file for your application. Microsoft.Identity.Web comes with prebuilt token-cache serializers described in [Token cache serialization](/entra/msal/dotnet/how-to/token-cache-serialization). An interesting possibility is to choose ASP.NET Core [distributed memory caches](/aspnet/core/performance/caching/distributed#distributed-memory-cache):
```csharp // Use a distributed token cache by adding:
For details about the token-cache providers, see also Microsoft.Identity.Web's [
# [ASP.NET](#tab/aspnet)
-The ASP.NET tutorial uses dependency injection to let you decide the token cache implementation in the *Startup.Auth.cs* file for your application. *Microsoft.Identity.Web* comes with prebuilt token-cache serializers described in [Token cache serialization](msal-net-token-cache-serialization.md). An interesting possibility is to choose ASP.NET Core [distributed memory caches](/aspnet/core/performance/caching/distributed#distributed-memory-cache):
+The ASP.NET tutorial uses dependency injection to let you decide the token cache implementation in the *Startup.Auth.cs* file for your application. *Microsoft.Identity.Web* comes with prebuilt token-cache serializers described in [Token cache serialization](/entra/msal/dotnet/how-to/token-cache-serialization). An interesting possibility is to choose ASP.NET Core [distributed memory caches](/aspnet/core/performance/caching/distributed#distributed-memory-cache):
```csharp var services = owinTokenAcquirerFactory.Services;
services.AddDistributedSqlServerCache(options =>
For details about the token-cache providers, see also the *Microsoft.Identity.Web* [Token cache serialization](https://aka.ms/ms-id-web/token-cache-serialization) article, and the [ASP.NET Core Web app tutorials | Token caches](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/tree/master/2-WebApp-graph-user/2-2-TokenCache) phase of the web app's tutorial.
-For details see [Token cache serialization for MSAL.NET](./msal-net-token-cache-serialization.md).
+For details, see [Token cache serialization for MSAL.NET](/entra/msal/dotnet/how-to/token-cache-serialization).
# [Java](#tab/java)
active-directory Secure Group Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/secure-group-access-control.md
The following table presents several security best practices for security groups
|--|| |**Ensure resource owner and group owner are the same principal**. Applications should build their own group management experience and create new groups to manage access. For example, an application can create groups with the `Group.Create` permission and add itself as the owner of the group. This way the application has control over its groups without being over privileged to modify other groups in the tenant.|When group owners and resource owners are different entities, group owners can add users to the group who aren't supposed to access the resource but can then access it unintentionally.| |**Build an implicit contract between the resource owner and group owner**. The resource owner and the group owner should align on the group purpose, policies, and members that can be added to the group to get access to the resource. This level of trust is non-technical and relies on human or business contract.|When group owners and resource owners have different intentions, the group owner may add users to the group the resource owner didn't intend on giving access to. This action can result in unnecessary and potentially risky access.|
-|**Use private groups for access control**. Microsoft 365 groups are managed by the [visibility concept](/graph/api/resources/group?view=graph-rest-1.0#group-visibility-options&preserve-view=true). This property controls the join policy of the group and visibility of group resources. Security groups have join policies that either allow anyone to join or require owner approval. On-premises-synced groups can also be public or private. Users joining an on-premises-synced group can get access to cloud resource as well.|When you use a public group for access control, any member can join the group and get access to the resource. The risk of elevation of privilege exists when a public group is used to give access to an external resource.|
+|**Use private groups for access control**. Microsoft 365 groups are managed by the [visibility concept](/graph/api/resources/group?view=graph-rest-1.0&preserve-view=true#group-visibility-options). This property controls the join policy of the group and visibility of group resources. Security groups have join policies that either allow anyone to join or require owner approval. On-premises-synced groups can also be public or private. Users joining an on-premises-synced group can get access to cloud resource as well.|When you use a public group for access control, any member can join the group and get access to the resource. The risk of elevation of privilege exists when a public group is used to give access to an external resource.|
|**Group nesting**. When you use a group for access control and it has other groups as its members, members of the subgroups can get access to the resource. In this case, there are multiple group owners of the parent group and the subgroups.|Aligning with multiple group owners on the purpose of each group and how to add the right members to these groups is more complex and more prone to accidental grant of access. Limit the number of nested groups or don't use them at all if possible.| ## Next steps
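The nesting risk described in the table is easy to see when membership is resolved transitively: anyone reachable through a chain of subgroups ends up with access to the resource. A small sketch with a hypothetical membership map (group and user names are made up):

```python
from collections import deque

def transitive_members(group: str, direct_members: dict[str, list[str]]) -> set[str]:
    """Resolve users who ultimately gain access through nested groups.
    Keys of direct_members are group names; values mix users and subgroups."""
    users: set[str] = set()
    seen = {group}
    queue = deque(direct_members.get(group, []))
    while queue:
        member = queue.popleft()
        if member in direct_members:      # member is itself a group
            if member not in seen:
                seen.add(member)
                queue.extend(direct_members[member])
        else:
            users.add(member)             # member is a user
    return users

# Hypothetical tenant: access is granted via "finance-app-users",
# which nests "finance-interns".
groups = {
    "finance-app-users": ["alice", "finance-interns"],
    "finance-interns": ["bob"],
}
print(sorted(transitive_members("finance-app-users", groups)))  # ['alice', 'bob']
```

Limiting nesting depth, or forbidding it, keeps this closure small and auditable.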
active-directory Support Fido2 Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/support-fido2-authentication.md
# Support passwordless authentication with FIDO2 keys in apps you develop
-These configurations and best practices will help you avoid common scenarios that block [FIDO2 passwordless authentication](../../active-directory/authentication/concept-authentication-passwordless.md) from being available to users of your applications.
+These configurations and best practices will help you avoid common scenarios that block [FIDO2 passwordless authentication](../authentication/concept-authentication-passwordless.md) from being available to users of your applications.
## General best practices ### Domain hints
-Don't use a domain hint to bypass [home-realm discovery](../../active-directory/manage-apps/configure-authentication-for-federated-users-portal.md). This feature is meant to make sign-ins more streamlined, but the federated identity provider may not support passwordless authentication.
+Don't use a domain hint to bypass [home-realm discovery](../manage-apps/configure-authentication-for-federated-users-portal.md). This feature is meant to make sign-ins more streamlined, but the federated identity provider may not support passwordless authentication.
### Requiring specific credentials
The availability of FIDO2 passwordless authentication for applications that run
## Next steps
-[Passwordless authentication options for Microsoft Entra ID](../../active-directory/authentication/concept-authentication-passwordless.md)
+[Passwordless authentication options for Microsoft Entra ID](../authentication/concept-authentication-passwordless.md)
active-directory Troubleshoot Publisher Verification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/troubleshoot-publisher-verification.md
Below are some common issues that may occur during the process.
Your app registrations may have been created using a different user account in this tenant, a personal/consumer account, or in a different tenant. Ensure you're signed in with the correct account in the tenant where your app registrations were created. - **I'm getting an error related to multi-factor authentication. What should I do?**
- Ensure [multi-factor authentication](../fundamentals/concept-fundamentals-mfa-get-started.md) is enabled and **required** for the user you're signing in with and for this scenario. For example, MFA could be:
+ Ensure [multi-factor authentication](../authentication/concept-mfa-licensing.md) is enabled and **required** for the user you're signing in with and for this scenario. For example, MFA could be:
- Always required for the user you're signing in with. - [Required for Azure management](../conditional-access/howto-conditional-access-policy-azure-management.md). - [Required for the type of administrator](../conditional-access/howto-conditional-access-policy-admin-mfa.md) you're signing in with.
Occurs when multi-factor authentication (MFA) hasn't been enabled and performed
The error message displayed will be: "Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to proceed." **Remediation Steps**
-1. Ensure [multi-factor authentication](../fundamentals/concept-fundamentals-mfa-get-started.md) is enabled and **required** for the user you're signing in with and for this scenario
+1. Ensure [multi-factor authentication](../authentication/concept-mfa-licensing.md) is enabled and **required** for the user you're signing in with and for this scenario
1. Retry Publisher Verification ### UserUnableToAddPublisher
active-directory Tutorial V2 Windows Uwp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/tutorial-v2-windows-uwp.md
In this tutorial:
## Prerequisites
-* [Visual Studio 2019](https://visualstudio.microsoft.com/vs/) with the [Universal Windows Platform development](/windows/uwp/get-started/get-set-up) workload installed
+* [Visual Studio 2019](https://visualstudio.microsoft.com/vs/) with the [Universal Windows Platform development](/windows/apps/windows-app-sdk/set-up-your-development-environment) workload installed
## How this guide works
active-directory Userinfo https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/userinfo.md
The claims shown in the response are all those that the UserInfo endpoint can re
You can't add to or customize the information returned by the UserInfo endpoint.
-To customize the information returned by the identity platform during authentication and authorization, use [claims mapping](active-directory-claims-mapping.md) and [optional claims](active-directory-optional-claims.md) to modify security token configuration.
+To customize the information returned by the identity platform during authentication and authorization, use [claims mapping](./saml-claims-customization.md) and [optional claims](./optional-claims.md) to modify security token configuration.
## Next steps
active-directory V2 Admin Consent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/v2-admin-consent.md
Previously updated : 09/15/2023 Last updated : 10/18/2023
http://localhost/myapp/permissions
| `scope` | The set of permissions that were granted access to, for the application.| | `admin_consent` | Will be set to `True`.|
+> [!WARNING]
+> Never use the **tenant ID** value of the `tenant` parameter to authenticate or authorize users. Bad actors can tamper with the tenant ID value and send it to impersonate a response to your app, which can expose your application to security incidents.
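Parsing the redirect is straightforward, but per the warning, treat every value as untrusted input: any party can craft this query string, so confirm the tenant through an authenticated channel (for example, claims in a validated token) before acting on it. A stdlib sketch using an example redirect URI (the tenant GUID is a placeholder):

```python
from urllib.parse import parse_qs, urlparse

# Example admin-consent redirect; treat every value as untrusted input.
redirect = ("http://localhost/myapp/permissions"
            "?admin_consent=True"
            "&tenant=aaaabbbb-0000-cccc-1111-dddd2222eeee"
            "&scope=User.Read")

# Flatten single-valued query parameters for convenience.
params = {k: v[0] for k, v in parse_qs(urlparse(redirect).query).items()}
print(params["admin_consent"], params["scope"])  # True User.Read
```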
+ ### Error response ```none
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/whats-new-docs.md
Welcome to what's new in the Microsoft identity platform documentation. This art
- [Access tokens in the Microsoft identity platform](access-tokens.md) - Improve the explanations on how to validate a token - [Claims mapping policy type](reference-claims-mapping-policy-type.md) - Updates to Restricted Claims Set-- [Migrate confidential client applications from ADAL.NET to MSAL.NET](msal-net-migration-confidential-client.md) - Improving clarity in the content
+- [Migrate confidential client applications from ADAL.NET to MSAL.NET](/entra/msal/dotnet/how-to/migrate-confidential-client) - Improving clarity in the content
- [Single sign-on with MSAL.js](msal-js-sso.md) - Add guidance on using the loginHint claim for SSO - [Tutorial: Create a Blazor Server app that uses the Microsoft identity platform for authentication](tutorial-blazor-server.md) - Simplified and leverage the Microsoft Identity App Sync .NET tool
active-directory Concept Directory Join https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/concept-directory-join.md
The goal of Microsoft Entra joined devices is to simplify:
Microsoft Entra join can be deployed by using any of the following methods: -- [Windows Autopilot](/windows/deployment/windows-autopilot/windows-10-autopilot)-- [Bulk deployment](/intune/windows-bulk-enroll)
+- [Windows Autopilot](/autopilot/windows-autopilot)
+- [Bulk deployment](/mem/intune/enrollment/windows-bulk-enroll)
- [Self-service experience](device-join-out-of-box.md) ## Next steps
active-directory Howto Vm Sign In Azure Ad Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/howto-vm-sign-in-azure-ad-linux.md
There are a few ways to open Cloud Shell:
If you choose to install and use the Azure CLI locally, this article requires you to use version 2.22.1 or later. Run `az --version` to find the version. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli). 1. Create a resource group by running [az group create](/cli/azure/group#az-group-create).
-1. Create a VM by running [az vm create](/cli/azure/vm?preserve-view=true#az-vm-create&preserve-view=true). Use a supported distribution in a supported region.
+1. Create a VM by running [az vm create](/cli/azure/vm#az-vm-create). Use a supported distribution in a supported region.
1. Install the Microsoft Entra login VM extension by using [az vm extension set](/cli/azure/vm/extension#az-vm-extension-set). The following example deploys a VM and then installs the extension to enable Microsoft Entra login for a Linux VM. VM extensions are small applications that provide post-deployment configuration and automation tasks on Azure virtual machines. Customize the example as needed to support your testing requirements.
active-directory Hybrid Join Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/hybrid-join-control.md
If your Microsoft Entra ID is federated with AD FS, you first need to configure
To register Windows down-level devices, organizations must install [Microsoft Workplace Join for non-Windows 10 computers](https://www.microsoft.com/download/details.aspx?id=53554) available on the Microsoft Download Center.
-You can deploy the package by using a software distribution system like [Microsoft Configuration Manager](/configmgr/). The package supports the standard silent installation options with the quiet parameter. The current branch of Configuration Manager offers benefits over earlier versions, like the ability to track completed registrations.
+You can deploy the package by using a software distribution system like [Microsoft Configuration Manager](/mem/configmgr/). The package supports the standard silent installation options with the quiet parameter. The current branch of Configuration Manager offers benefits over earlier versions, like the ability to track completed registrations.
The installer creates a scheduled task on the system that runs in the user context. The task is triggered when the user signs in to Windows. The task silently joins the device with Microsoft Entra ID with the user credentials after authenticating with Microsoft Entra ID.
active-directory Manage Device Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/manage-device-identities.md
From there, you can go to **All devices** to:
- Identify devices, including:
  - Devices that have been joined or registered in Microsoft Entra ID.
- - Devices deployed via [Windows Autopilot](/windows/deployment/windows-autopilot/windows-autopilot).
+ - Devices deployed via [Windows Autopilot](/autopilot/windows-autopilot).
  - Printers that use [Universal Print](/universal-print/fundamentals/universal-print-getting-started).
- Complete device identity management tasks like enable, disable, delete, and manage.
- The management options for [Printers](/universal-print/fundamentals/) and [Windows Autopilot](/autopilot/windows-autopilot) are limited in Microsoft Entra ID. These devices must be managed from their respective admin interfaces.
You must be assigned one of the following roles to manage device settings:
- **Users may join devices to Microsoft Entra ID**: This setting enables you to select the users who can register their devices as Microsoft Entra joined devices. The default is **All**.

> [!NOTE]
- > The **Users may join devices to Microsoft Entra ID** setting is applicable only to Microsoft Entra join on Windows 10 or newer. This setting doesn't apply to Microsoft Entra hybrid joined devices, [Microsoft Entra joined VMs in Azure](./howto-vm-sign-in-azure-ad-windows.md#enable-azure-ad-login-for-a-windows-vm-in-azure), or Microsoft Entra joined devices that use [Windows Autopilot self-deployment mode](/mem/autopilot/self-deploying) because these methods work in a userless context.
+ > The **Users may join devices to Microsoft Entra ID** setting is applicable only to Microsoft Entra join on Windows 10 or newer. This setting doesn't apply to Microsoft Entra hybrid joined devices, [Microsoft Entra joined VMs in Azure](./howto-vm-sign-in-azure-ad-windows.md#enable-azure-ad-login-for-a-windows-vm-in-azure), or Microsoft Entra joined devices that use [Windows Autopilot self-deployment mode](/autopilot/self-deploying) because these methods work in a userless context.
- **Users may register their devices with Microsoft Entra ID**: You need to configure this setting to allow users to register Windows 10 or newer personal, iOS, Android, and macOS devices with Microsoft Entra ID. If you select **None**, devices aren't allowed to register with Microsoft Entra ID. Enrollment with Microsoft Intune or mobile device management for Microsoft 365 requires registration. If you've configured either of these services, **ALL** is selected, and **NONE** is unavailable.
- **Require multifactor authentication to register or join devices with Microsoft Entra ID**:
active-directory Plan Device Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/plan-device-deployment.md
Conditional Access <br>(Require Microsoft Entra hybrid joined devices) | | | ![C
## Microsoft Entra Registration
-Registered devices are often managed with [Microsoft Intune](/mem/intune/enrollment/device-enrollment). Devices are enrolled in Intune in several ways, depending on the operating system.
+Registered devices are often managed with [Microsoft Intune](/mem/intune/fundamentals/deployment-guide-enrollment). Devices are enrolled in Intune in several ways, depending on the operating system.
Microsoft Entra registered devices provide support for Bring Your Own Devices (BYOD) and corporate owned devices to SSO to cloud resources. Access to resources is based on the Microsoft Entra [Conditional Access policies](../conditional-access/concept-conditional-access-grant.md) applied to the device and the user.
Review supported and unsupported platforms for integrated devices:
| Device management tools | Microsoft Entra registered | Microsoft Entra joined | Microsoft Entra hybrid joined |
| --- | :---: | :---: | :---: |
-| [Mobile Device Management (MDM)](/windows/client-management/mdm/azure-active-directory-integration-with-mdm) <br>Example: Microsoft Intune | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
+| [Mobile Device Management (MDM)](/windows/client-management/azure-active-directory-integration-with-mdm) <br>Example: Microsoft Intune | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
| [Co-management with Microsoft Intune and Microsoft Configuration Manager](/mem/configmgr/comanage/overview) <br>(Windows 10 or newer) | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
| [Group policy](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/hh831791(v=ws.11))<br>(Windows only) | | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
active-directory Troubleshoot Mac Sso Extension Plugin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-mac-sso-extension-plugin.md
Finished SSO request.
At this point in the authentication/authorization flow, the PRT has been bootstrapped and it should be visible in the macOS keychain access. See [Checking Keychain Access for PRT](#checking-keychain-access-for-prt). The **MSAL macOS sample** application uses the access token received from the Microsoft SSO Extension Broker to display the user's information.
-Next, examine server-side [Microsoft Entra sign-in logs](../reports-monitoring/reference-basic-info-sign-in-logs.md#correlation-id) based on the correlation ID collected from the client-side SSO extension logs. For more information, see [Sign-in logs in Microsoft Entra ID](../reports-monitoring/concept-sign-ins.md).
+Next, examine server-side [Microsoft Entra sign-in logs](../reports-monitoring/concept-sign-in-log-activity-details.md) based on the correlation ID collected from the client-side SSO extension logs. For more information, see [Sign-in logs in Microsoft Entra ID](../reports-monitoring/concept-sign-ins.md).
<a name='view-azure-ad-sign-in-logs-by-correlation-id-filter'></a>
active-directory Domains Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/domains-manage.md
You must change or delete any such resource in your Microsoft Entra organization
You can **ForceDelete** a domain name in the [Azure portal](https://portal.azure.com) or using [Microsoft Graph API](/graph/api/domain-forcedelete). These options use an asynchronous operation and update all references from the custom domain name like "user@contoso.com" to the initial default domain name such as "user@contoso.onmicrosoft.com."
-To call **ForceDelete** in the Azure portal, you must ensure that there are fewer than 1000 references to the domain name, and any references where Exchange is the provisioning service must be updated or removed in the [Exchange Admin Center](https://outlook.office365.com/ecp/). This includes Exchange Mail-Enabled Security Groups and distribution lists. For more information, see [Removing mail-enabled security groups](/Exchange/recipients/mail-enabled-security-groups#Remove%20mail-enabled%20security%20groups&preserve-view=true). Also, the **ForceDelete** operation won't succeed if either of the following is true:
+To call **ForceDelete** in the Azure portal, you must ensure that there are fewer than 1000 references to the domain name, and any references where Exchange is the provisioning service must be updated or removed in the [Exchange Admin Center](https://outlook.office365.com/ecp/). This includes Exchange Mail-Enabled Security Groups and distribution lists. For more information, see [Removing mail-enabled security groups](/Exchange/recipients/mail-enabled-security-groups#Remove%20mail-enabled%20security%20groups). Also, the **ForceDelete** operation won't succeed if either of the following is true:
* You purchased a domain via Microsoft 365 domain subscription services
* You are a partner administering on behalf of another customer organization
If you find that any of the conditions haven't been met, manually clean up the
Most management tasks for domain names in Microsoft Entra ID can also be completed using Microsoft PowerShell, or programmatically using the Microsoft Graph API.
-* [Using PowerShell to manage domain names in Microsoft Entra ID](/powershell/module/azuread/#domains&preserve-view=true)
+* [Using PowerShell to manage domain names in Microsoft Entra ID](/powershell/module/azuread/?preserve-view=true#domains)
* [Domain resource type](/graph/api/resources/domain)

## Next steps
-* [Add custom domain names](../fundamentals/add-custom-domain.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)
-* [Remove Exchange mail-enabled security groups in Exchange Admin Center on a custom domain name in Microsoft Entra ID](/Exchange/recipients/mail-enabled-security-groups#Remove%20mail-enabled%20security%20groups&preserve-view=true)
+* [Add custom domain names](../fundamentals/add-custom-domain.md?context=azure/active-directory/users-groups-roles/context/ugr-context)
+* [Remove Exchange mail-enabled security groups in Exchange Admin Center on a custom domain name in Microsoft Entra ID](/exchange/recipients/mail-enabled-security-groups?preserve-view=true#Remove%20mail-enabled%20security%20groups)
* [ForceDelete a custom domain name with Microsoft Graph API](/graph/api/domain-forcedelete)
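The **ForceDelete** operation referenced in the list above is exposed as a Microsoft Graph action (`POST /domains/{id}/forceDelete`), where `disableUserAccounts` controls whether affected user accounts are renamed to the initial default domain. As a hedged illustration only (Python with the standard library; the access token is a placeholder, and the request is built but never sent):

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def build_force_delete_request(domain_id: str, token: str) -> urllib.request.Request:
    """Build (without sending) the Graph forceDelete call for a custom domain.

    disableUserAccounts=True asks the service to rename references such as
    user@contoso.com to the initial .onmicrosoft.com domain.
    """
    body = json.dumps({"disableUserAccounts": True}).encode("utf-8")
    return urllib.request.Request(
        url=f"{GRAPH}/domains/{domain_id}/forceDelete",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_force_delete_request("contoso.com", "<access-token>")
print(req.get_method(), req.full_url)
```

Sending the request requires an app or delegated token with the appropriate `Domain.ReadWrite.All` permission; the sketch stops short of the network call on purpose.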
active-directory Domains Verify Custom Subdomain https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/domains-verify-custom-subdomain.md
After a root domain is added to Microsoft Entra ID, part of Microsoft Entra, all subsequent subdomains added to that root in your Microsoft Entra organization automatically inherit the authentication setting from the root domain. However, if you want to manage domain authentication settings independently from the root domain settings, you can now do so with the Microsoft Graph API. For example, if you have a federated root domain such as contoso.com, this article can help you verify a subdomain such as child.contoso.com as managed instead of federated.
-In the Azure portal, when the parent domain is federated and the admin tries to verify a managed subdomain on the **Custom domain names** page, you'll get a 'Failed to add domain' error with the reason "One or more properties contains invalid values." If you try to add this subdomain from the Microsoft 365 admin center, you'll receive a similar error. For more information about the error, see [A child domain doesn't inherit parent domain changes in Office 365, Azure, or Intune](/office365/troubleshoot/administration/child-domain-fails-inherit-parent-domain-changes).
+In the Azure portal, when the parent domain is federated and the admin tries to verify a managed subdomain on the **Custom domain names** page, you'll get a 'Failed to add domain' error with the reason "One or more properties contains invalid values." If you try to add this subdomain from the Microsoft 365 admin center, you'll receive a similar error. For more information about the error, see [A child domain doesn't inherit parent domain changes in Office 365, Azure, or Intune](/microsoft-365/troubleshoot/administration/child-domain-fails-inherit-parent-domain-changes).
Because subdomains inherit the authentication type of the root domain by default, you must promote the subdomain to a root domain in Microsoft Entra ID using the Microsoft Graph so you can set the authentication type to your desired type.
Invoking API with a federated verified subdomain with user references | POST | 4
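The promotion described above (making a verified subdomain its own root so its authentication type can be set independently) is invoked through the Graph `promote` action on the domain resource. A minimal sketch, assuming a Python client with only the standard library and a placeholder bearer token; the request is constructed but deliberately not sent:

```python
import urllib.request

def build_promote_request(domain_id: str, token: str) -> urllib.request.Request:
    """Build (without sending) POST /domains/{id}/promote, which promotes a
    verified subdomain to a root domain in the tenant."""
    return urllib.request.Request(
        url=f"https://graph.microsoft.com/v1.0/domains/{domain_id}/promote",
        data=b"",  # the promote action takes no request body
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_promote_request("child.contoso.com", "<access-token>")
print(req.get_method(), req.full_url)
```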
## Next steps

-- [Add custom domain names](../fundamentals/add-custom-domain.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)
+- [Add custom domain names](../fundamentals/add-custom-domain.md?context=azure/active-directory/users-groups-roles/context/ugr-context)
- [Manage domain names](domains-manage.md)
- [ForceDelete a custom domain name with Microsoft Graph API](/graph/api/domain-forcedelete)
active-directory Groups Dynamic Membership https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-dynamic-membership.md
The following device attributes can be used.
managementType | MDM (for mobile devices) | device.managementType -eq "MDM"
memberOf | Any string value (valid group object ID) | device.memberof -any (group.objectId -in ['value'])
objectId | a valid Microsoft Entra object ID | device.objectId -eq "76ad43c9-32c5-45e8-a272-7b58b58f596d"
- profileType | a valid [profile type](/graph/api/resources/device?view=graph-rest-1.0#properties&preserve-view=true) in Microsoft Entra ID | device.profileType -eq "RegisteredDevice"
+ profileType | a valid [profile type](/graph/api/resources/device?view=graph-rest-1.0&preserve-view=true#properties) in Microsoft Entra ID | device.profileType -eq "RegisteredDevice"
systemLabels | any string matching the Intune device property for tagging Modern Workplace devices | device.systemLabels -contains "M365Managed"

<!-- docutune:enable -->
The following device attributes can be used.
> [!NOTE]
> When using `deviceOwnership` to create Dynamic Groups for devices, you need to set the value equal to `Company`. In Intune, device ownership is instead represented as Corporate. For more information, see [OwnerTypes](/mem/intune/developer/reports-ref-devices#ownertypes).
> When using `deviceTrustType` to create Dynamic Groups for devices, you need to set the value equal to `AzureAD` to represent Microsoft Entra joined devices, `ServerAD` to represent Microsoft Entra hybrid joined devices, or `Workplace` to represent Microsoft Entra registered devices.
-> When using `extensionAttribute1-15` to create Dynamic Groups for devices you need to set the value for `extensionAttribute1-15` on the device. Learn more on [how to write `extensionAttributes` on a Microsoft Entra device object](/graph/api/device-update?view=graph-rest-1.0&tabs=http#example-2--write-extensionattributes-on-a-device&preserve-view=true)
+> When using `extensionAttribute1-15` to create Dynamic Groups for devices you need to set the value for `extensionAttribute1-15` on the device. Learn more on [how to write `extensionAttributes` on a Microsoft Entra device object](/graph/api/device-update?view=graph-rest-1.0&tabs=http&preserve-view=true#example-2--write-extensionattributes-on-a-device)
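Rules like the ones in the table above are supplied in the `membershipRule` property when a dynamic group is created through Microsoft Graph (`POST /groups`). The following Python sketch is an illustration rather than text from the article; the display name and rule are arbitrary examples:

```python
import json

def dynamic_device_group_payload(name: str, rule: str) -> dict:
    """Request body for POST https://graph.microsoft.com/v1.0/groups that
    creates a security group with dynamic device membership."""
    return {
        "displayName": name,
        "mailEnabled": False,
        "mailNickname": name.replace(" ", "").lower(),
        "securityEnabled": True,
        "groupTypes": ["DynamicMembership"],
        "membershipRule": rule,
        # "On" tells the service to start evaluating the rule immediately.
        "membershipRuleProcessingState": "On",
    }

payload = dynamic_device_group_payload(
    "Entra joined devices",
    'device.deviceTrustType -eq "AzureAD"',
)
print(json.dumps(payload, indent=2))
```

The same payload shape works for any of the device attributes listed above, including `extensionAttribute1-15` once the attribute has been written to the device object.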
## Next steps
active-directory Groups Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-lifecycle.md
If the group you're restoring contains documents, SharePoint sites, or other per
## How to retrieve Microsoft 365 group expiration date
-In addition to Access Panel where users can view group details including expiration date and last renewed date, expiration date of a Microsoft 365 group can be retrieved from Microsoft Graph REST API Beta. expirationDateTime as a group property has been enabled in Microsoft Graph Beta. It can be retrieved with a GET request. For more details, please refer to [this example](/graph/api/group-get?view=graph-rest-beta#example&preserve-view=true).
+In addition to Access Panel where users can view group details including expiration date and last renewed date, expiration date of a Microsoft 365 group can be retrieved from Microsoft Graph REST API Beta. expirationDateTime as a group property has been enabled in Microsoft Graph Beta. It can be retrieved with a GET request. For more details, please refer to [this example](/graph/api/group-get?view=graph-rest-beta&preserve-view=true#example).
> [!NOTE]
> In order to manage group memberships on Access Panel, "Restrict access to Groups in Access Panel" needs to be set to "No" in Microsoft Entra groups General Setting.
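The beta GET request for `expirationDateTime` described above can be sketched as a simple URL builder. This is an illustration in Python (the group ID is a placeholder; note that `urlencode` percent-encodes `$select` as `%24select`, which Graph accepts):

```python
import urllib.parse

def expiration_request_url(group_id: str) -> str:
    """URL for a Graph beta GET that returns a group's expiration date."""
    query = urllib.parse.urlencode({"$select": "displayName,expirationDateTime"})
    return f"https://graph.microsoft.com/beta/groups/{group_id}?{query}"

print(expiration_request_url("02bd9fd6-8f93-4758-87c3-1fb73740a315"))
```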
active-directory Groups Settings V2 Cmdlets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-settings-v2-cmdlets.md
$param = @{
mailNickname="Demo"
}
-New-MgGroup -BodyParameter $param
+New-MgGroup @param
```

## Update groups
active-directory Users Close Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/users-close-account.md
Users in an unmanaged organization are often created during self-service sign-up
Before you can close your account, you should confirm the following items:
-* Make sure you are a user of an unmanaged Microsoft Entra organization. You can't close your account if you belong to a managed organization. If you belong to a managed organization and want to close your account, you must contact your administrator. For information about how to determine whether you belong to an unmanaged organization, see [Delete the user from Unmanaged Tenant](/power-automate/gdpr-dsr-delete#delete-the-user-from-unmanaged-tenant).
+* Make sure you are a user of an unmanaged Microsoft Entra organization. You can't close your account if you belong to a managed organization. If you belong to a managed organization and want to close your account, you must contact your administrator. For information about how to determine whether you belong to an unmanaged organization, see [Delete the user from Unmanaged Tenant](/power-automate/privacy-dsr-delete#delete-the-user-from-unmanaged-tenant).
-* Save any data you want to keep. For information about how to submit an export request, see [Accessing and exporting system-generated logs for Unmanaged Tenants](/power-platform/admin/powerapps-gdpr-dsr-guide-systemlogs#accessing-and-exporting-system-generated-logs-for-unmanaged-tenants).
+* Save any data you want to keep. For information about how to submit an export request, see [Accessing and exporting system-generated logs for Unmanaged Tenants](/power-platform/admin/powerapps-privacy-dsr-guide-systemlogs#accessing-and-exporting-system-generated-logs-for-unmanaged-tenants).
> [!WARNING]
> Closing your account is irreversible. When you close your account, all personal data will be removed. You will no longer have access to your account and data associated with your account.
active-directory How To Customize Branding Customers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/customers/how-to-customize-branding-customers.md
You can also create user flows programmatically using the Company Branding Graph
By default, Microsoft offers a neutral branding for your tenant that can be personalized to suit your company's specific requirements. This default branding doesn't include any pre-existing Microsoft branding. In the event that the custom company branding fails to load, the sign-in page will automatically switch back to this neutral branding. Additionally, each custom branding property can be manually added to the custom sign-in page.
-You can customize this neutral branding with a custom background image or color, favicon, layout, header, and footer. You can also customize the sign-in form and add custom text to different instances or upload [custom CSS](/azure/active-directory/fundamentals/reference-company-branding-css-template).
+You can customize this neutral branding with a custom background image or color, favicon, layout, header, and footer. You can also customize the sign-in form and add custom text to different instances or upload [custom CSS](../../fundamentals/reference-company-branding-css-template.md).
The following image displays the neutral default branding of the tenant. You can find the numbered branding elements and their corresponding descriptions after the image.

:::image type="content" source="media/how-to-customize-branding-customers/ciam-neutral-branding.png" alt-text="Screenshot of the CIAM neutral branding." lightbox="media/how-to-customize-branding-customers/ciam-neutral-branding.png":::
The following image displays the neutral default branding of the tenant. You can
## How to customize the default sign-in experience
-Before you customize any settings, the neutral default branding will appear in your sign-in and sign-up pages. You can customize this default experience with a custom background image or color, favicon, layout, header, and footer. You can also upload a [custom CSS](/azure/active-directory/fundamentals/reference-company-branding-css-template).
+Before you customize any settings, the neutral default branding will appear in your sign-in and sign-up pages. You can customize this default experience with a custom background image or color, favicon, layout, header, and footer. You can also upload a [custom CSS](../../fundamentals/reference-company-branding-css-template.md).
-1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least a [Global Administrator](/azure/active-directory/roles/permissions-reference#global-administrator).
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least a [Global Administrator](../../roles/permissions-reference.md#global-administrator).
1. If you have access to multiple tenants, use the **Directories + subscriptions** filter :::image type="icon" source="media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to switch to the customer tenant you created earlier.
1. Browse to **Company Branding** > **Default sign-in** > **Edit**.
Your tenant name replaces the Microsoft banner logo in the neutral default sign-
:::image type="content" source="media/how-to-customize-branding-customers/tenant-name.png" alt-text="Screenshot of the tenant name." lightbox="media/how-to-customize-branding-customers/tenant-name.png":::
-1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com/) as at least a [Global Administrator](/azure/active-directory/roles/permissions-reference#global-administrator).
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com/) as at least a [Global Administrator](../../roles/permissions-reference.md#global-administrator).
1. If you have access to multiple tenants, use the **Directories + subscriptions** filter :::image type="icon" source="media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to switch to the customer tenant you created earlier.
1. In the search bar, type and select **Properties**.
1. Edit the **Name** field.
You can use the Microsoft Graph API to customize a few items programmatically. F
## Next steps

In this article we learned how to customize the look and feel of the customer sign-in and sign-up experience. To learn more about customizing the language of the tenant, see the [Language customization](how-to-customize-languages-customers.md) article.
-For an understanding of the differences in workforce tenant branding, see the article [How to customize branding for your workforce](/azure/active-directory/fundamentals/how-to-customize-branding).
+For an understanding of the differences in workforce tenant branding, see the article [How to customize branding for your workforce](../../fundamentals/how-to-customize-branding.md).
active-directory How To Web App Node Sign In Call Api Sign In Acquire Access Token https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/customers/how-to-web-app-node-sign-in-call-api-sign-in-acquire-access-token.md
The `/signin`, `/signout` and `/redirect` routes are defined in the *routes/auth
- Initiates sign-in flow by triggering the first leg of auth code flow.
- - Initializes a [confidential client application](../../../active-directory/develop/msal-client-applications.md) instance by using `msalConfig` MSAL configuration object.
+ - Initializes a [confidential client application](../../develop/msal-client-applications.md) instance by using `msalConfig` MSAL configuration object.
```javascript
const msalInstance = this.getMsalInstance(this.config.msalConfig);
active-directory How To Web App Node Use Certificate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/customers/how-to-web-app-node-use-certificate.md
# Use client certificate for authentication in your Node.js web app
-Microsoft Entra ID for customers supports two types of authentication for [confidential client applications](../../../active-directory/develop/msal-client-applications.md): password-based authentication (such as client secret) and certificate-based authentication. For a higher level of security, we recommend using a certificate (instead of a client secret) as a credential in your confidential client applications.
+Microsoft Entra ID for customers supports two types of authentication for [confidential client applications](../../develop/msal-client-applications.md): password-based authentication (such as client secret) and certificate-based authentication. For a higher level of security, we recommend using a certificate (instead of a client secret) as a credential in your confidential client applications.
In production, you should purchase a certificate signed by a well-known certificate authority, and use [Azure Key Vault](https://azure.microsoft.com/products/key-vault/) to manage certificate access and lifetime for you. However, for testing purposes, you can create a self-signed certificate and configure your apps to authenticate with it.
active-directory Tutorial Web App Node Sign In Sign Out https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/customers/tutorial-web-app-node-sign-in-sign-out.md
The `/` route is the entry point to the application. It renders the *views/index
- It initiates sign-in flow by triggering the first leg of auth code flow.
- - It initializes a [confidential client application](../../../active-directory/develop/msal-client-applications.md) instance by using MSAL configuration object, `msalConfig`, that you created earlier.
+ - It initializes a [confidential client application](../../develop/msal-client-applications.md) instance by using MSAL configuration object, `msalConfig`, that you created earlier.
```javascript
const msalInstance = this.getMsalInstance(this.config.msalConfig);
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/customers/whats-new-docs.md
Welcome to what's new in Microsoft Entra ID for customers documentation. This ar
## September 2023
-This month, we renamed Azure Active Directory (Azure AD) to Microsoft Entra ID. For more information about the rebranding, see the [New name for Microsoft Entra ID](/azure/active-directory/fundamentals/new-name) article.
+This month, we renamed Azure Active Directory (Azure AD) to Microsoft Entra ID. For more information about the rebranding, see the [New name for Microsoft Entra ID](../../fundamentals/new-name.md) article.
### Updated articles
active-directory External Identities Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/external-identities-overview.md
A multi-tenant organization is an organization that has more than one instance o
- [What is Microsoft Entra B2B collaboration?](what-is-b2b.md)
- [What is Microsoft Entra B2B direct connect?](b2b-direct-connect-overview.md)
- [About Azure AD B2C](/azure/active-directory-b2c/overview)
-- [About Microsoft Entra multi-tenant organizations](../../active-directory/multi-tenant-organizations/overview.md)
+- [About Microsoft Entra multi-tenant organizations](../multi-tenant-organizations/overview.md)
active-directory Hybrid On Premises To Cloud https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/hybrid-on-premises-to-cloud.md
Before Microsoft Entra ID, organizations with on-premises identity systems have traditionally managed partner accounts in their on-premises directory. In such an organization, when you start to move apps to Microsoft Entra ID, you want to make sure your partners can access the resources they need. It shouldn't matter whether the resources are on-premises or in the cloud. Also, you want your partner users to be able to use the same sign-in credentials for both on-premises and Microsoft Entra resources.
-If you create accounts for your external partners in your on-premises directory (for example, you create an account with a sign-in name of "msullivan" for an external user named Maria Sullivan in your partners.contoso.com domain), you can now sync these accounts to the cloud. Specifically, you can use [Microsoft Entra Connect](/azure/active-directory/hybrid/connect/whatis-azure-ad-connect) to sync the partner accounts to the cloud, which creates a user account with UserType = Guest. This enables your partner users to access cloud resources using the same credentials as their local accounts, without giving them more access than they need. For more information about converting local guest accounts see [Convert local guest accounts to Microsoft Entra B2B guest accounts](/azure/active-directory/architecture/10-secure-local-guest).
+If you create accounts for your external partners in your on-premises directory (for example, you create an account with a sign-in name of "msullivan" for an external user named Maria Sullivan in your partners.contoso.com domain), you can now sync these accounts to the cloud. Specifically, you can use [Microsoft Entra Connect](../hybrid/connect/whatis-azure-ad-connect.md) to sync the partner accounts to the cloud, which creates a user account with UserType = Guest. This enables your partner users to access cloud resources using the same credentials as their local accounts, without giving them more access than they need. For more information about converting local guest accounts see [Convert local guest accounts to Microsoft Entra B2B guest accounts](../architecture/10-secure-local-guest.md).
> [!NOTE]
> See also how to [invite internal users to B2B collaboration](invite-internal-users.md). With this feature, you can invite internal guest users to use B2B collaboration, regardless of whether you've synced their accounts from your on-premises directory to the cloud. Once the user accepts the invitation to use B2B collaboration, they'll be able to use their own identities and credentials to sign in to the resources you want them to access. You won't need to maintain passwords or manage account lifecycles.
active-directory Tenant Restrictions V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/tenant-restrictions-v2.md
There are three options for enforcing tenant restrictions v2 for clients:
### Option 1: Universal tenant restrictions v2 as part of Microsoft Entra Global Secure Access (preview)
-Universal tenant restrictions v2 as part of [Microsoft Entra Global Secure Access](/azure/global-secure-access/overview-what-is-global-secure-access) is recommended because it provides authentication and data plane protection for all devices and platforms. This option provides more protection against sophisticated attempts to bypass authentication. For example, attackers might try to allow anonymous access to a malicious tenant's apps, such as anonymous meeting join in Teams. Or, attackers might attempt to import to your organizational device an access token lifted from a device in the malicious tenant. Universal tenant restrictions v2 prevents these attacks by sending tenant restrictions v2 signals on the authentication plane (Microsoft Entra ID and Microsoft Account) and data plane (Microsoft cloud applications).
+Universal tenant restrictions v2 as part of [Microsoft Entra Global Secure Access](/entra/global-secure-access/overview-what-is-global-secure-access) is recommended because it provides authentication and data plane protection for all devices and platforms. This option provides more protection against sophisticated attempts to bypass authentication. For example, attackers might try to allow anonymous access to a malicious tenant's apps, such as anonymous meeting join in Teams. Or, attackers might attempt to import to your organizational device an access token lifted from a device in the malicious tenant. Universal tenant restrictions v2 prevents these attacks by sending tenant restrictions v2 signals on the authentication plane (Microsoft Entra ID and Microsoft Account) and data plane (Microsoft cloud applications).
### Option 2: Set up tenant restrictions v2 on your corporate proxy
active-directory Tutorial Bulk Invite https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/tutorial-bulk-invite.md
Check to see that the guest users you added exist in the directory either in the
### View guest users with PowerShell
-To view guest users with PowerShell, you'll need the [`Microsoft.Graph.Users` PowerShell module](/powershell/module/microsoft.graph.users/?view=graph-powershell-beta&preserve-view=true). Then sign in using the `Connect-MgGraph` command with an admin account to consent to the required scopes:
+To view guest users with PowerShell, you'll need the [`Microsoft.Graph.Users` PowerShell module](/powershell/module/microsoft.graph.users/?view=graph-powershell-1.0&viewFallbackFrom=graph-powershell-beta&preserve-view=true). Then sign in using the `Connect-MgGraph` command with an admin account to consent to the required scopes:
```powershell
Connect-MgGraph -Scopes "User.Read.All"
```
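Once connected, guest accounts can be listed with the standard `userType` filter. A minimal sketch (the property selection is illustrative; some tenants also require the advanced-query parameters shown for this filter):

```powershell
# List guest users; requires the User.Read.All scope consented above
Get-MgUser -Filter "userType eq 'Guest'" -All -ConsistencyLevel eventual -CountVariable guestCount |
    Select-Object DisplayName, Mail, UserPrincipalName
```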
active-directory What Is B2b https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/what-is-b2b.md
B2B collaboration is enabled by default, but comprehensive admin settings let yo
- Use [external collaboration settings](external-collaboration-settings-configure.md) to define who can invite external users, allow or block B2B specific domains, and set restrictions on guest user access to your directory.
-- Use [Microsoft cloud settings](cross-cloud-settings.md) to establish mutual B2B collaboration between the Microsoft Azure global cloud and [Microsoft Azure Government](/azure/azure-government/) or [Microsoft Azure operated by 21Vianet](/azure/china).
+- Use [Microsoft cloud settings](cross-cloud-settings.md) to establish mutual B2B collaboration between the Microsoft Azure global cloud and [Microsoft Azure Government](/azure/azure-government/) or [Microsoft Azure operated by 21Vianet](/azure/china/).
## Easily invite guest users from the Azure portal
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/whats-new-docs.md
Welcome to what's new in Azure Active Directory External Identities documentatio
## September 2023
-This month, we renamed Azure Active Directory (Azure AD) to Microsoft Entra ID. For more information about the rebranding, see the [New name for Azure Active Directory](/azure/active-directory/fundamentals/new-name) article.
+This month, we renamed Azure Active Directory (Azure AD) to Microsoft Entra ID. For more information about the rebranding, see the [New name for Azure Active Directory](../fundamentals/new-name.md) article.
### Updated articles
active-directory Compare https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/compare.md
Most IT administrators are familiar with Active Directory Domain Services concep
|:-|:-|:-|
|**Users**|||
|Provisioning: users | Organizations create internal users manually or use an in-house or automated provisioning system, such as the Microsoft Identity Manager, to integrate with an HR system.|Existing AD organizations use [Microsoft Entra Connect](../hybrid/connect/how-to-connect-sync-whatis.md) to sync identities to the cloud.</br> Microsoft Entra ID adds support to automatically create users from [cloud HR systems](../app-provisioning/what-is-hr-driven-provisioning.md). </br>Microsoft Entra ID can provision identities in [SCIM enabled](../app-provisioning/use-scim-to-provision-users-and-groups.md) SaaS apps to automatically provide apps with the necessary details to allow access for users. |
-|Provisioning: external identities| Organizations create external users manually as regular users in a dedicated external AD forest, resulting in administration overhead to manage the lifecycle of external identities (guest users)| Microsoft Entra ID provides a special class of identity to support external identities. [Microsoft Entra B2B](/azure/active-directory/b2b/) will manage the link to the external user identity to make sure they are valid. |
+|Provisioning: external identities| Organizations create external users manually as regular users in a dedicated external AD forest, resulting in administration overhead to manage the lifecycle of external identities (guest users)| Microsoft Entra ID provides a special class of identity to support external identities. [Microsoft Entra B2B](../external-identities/index.yml) will manage the link to the external user identity to make sure they are valid. |
| Entitlement management and groups| Administrators make users members of groups. App and resource owners then give groups access to apps or resources.| [Groups](./how-to-manage-groups.md) are also available in Microsoft Entra ID and administrators can also use groups to grant permissions to resources. In Microsoft Entra ID, administrators can assign membership to groups manually or use a query to dynamically include users to a group. </br> Administrators can use [Entitlement management](../governance/entitlement-management-overview.md) in Microsoft Entra ID to give users access to a collection of apps and resources using workflows and, if necessary, time-based criteria. |
| Admin management|Organizations will use a combination of domains, organizational units, and groups in AD to delegate administrative rights to manage the directory and resources it controls.| Microsoft Entra ID provides [built-in roles](./how-subscriptions-associated-directory.md) with its Microsoft Entra role-based access control (Microsoft Entra RBAC) system, with limited support for [creating custom roles](../roles/custom-overview.md) to delegate privileged access to the identity system, the apps, and resources it controls.</br>Managing roles can be enhanced with [Privileged Identity Management (PIM)](../privileged-identity-management/pim-configure.md) to provide just-in-time, time-restricted, or workflow-based access to privileged roles. |
| Credential management| Credentials in Active Directory are based on passwords, certificate authentication, and smartcard authentication. Passwords are managed using password policies that are based on password length, expiry, and complexity.|Microsoft Entra ID uses intelligent [password protection](../authentication/concept-password-ban-bad.md) for cloud and on-premises. Protection includes smart lockout plus blocking common and custom password phrases and substitutions. </br>Microsoft Entra ID significantly boosts security [through Multi-factor authentication](../authentication/concept-mfa-howitworks.md) and [passwordless](../authentication/concept-authentication-passwordless.md) technologies, like FIDO2. </br>Microsoft Entra ID reduces support costs by providing users a [self-service password reset](../authentication/concept-sspr-howitworks.md) system. |
Most IT administrators are familiar with Active Directory Domain Services concep
| Mid-tier/Daemon services|Services running in on-premises environments normally use AD service accounts or group Managed Service Accounts (gMSA) to run. These apps will then inherit the permissions of the service account.| Microsoft Entra ID provides [managed identities](../managed-identities-azure-resources/index.yml) to run other workloads in the cloud. The lifecycle of these identities is managed by Microsoft Entra ID and is tied to the resource provider, and it can't be used for other purposes to gain backdoor access.|
| **Devices**|||
| Mobile|Active Directory doesn't natively support mobile devices without third-party solutions.| Microsoft's mobile device management solution, Microsoft Intune, is integrated with Microsoft Entra ID. Microsoft Intune provides device state information to the identity system to evaluate during authentication. |
-| Windows desktops|Active Directory provides the ability to domain join Windows devices to manage them using Group Policy, System Center Configuration Manager, or other third-party solutions.|Windows devices can be [joined to Microsoft Entra ID](../devices/index.yml). Conditional Access can check if a device is Microsoft Entra joined as part of the authentication process. Windows devices can also be managed with [Microsoft Intune](/intune/what-is-intune). In this case, Conditional Access will consider whether a device is compliant (for example, up-to-date security patches and virus signatures) before allowing access to the apps.|
+| Windows desktops|Active Directory provides the ability to domain join Windows devices to manage them using Group Policy, System Center Configuration Manager, or other third-party solutions.|Windows devices can be [joined to Microsoft Entra ID](../devices/index.yml). Conditional Access can check if a device is Microsoft Entra joined as part of the authentication process. Windows devices can also be managed with [Microsoft Intune](/mem/intune/fundamentals/what-is-intune). In this case, Conditional Access will consider whether a device is compliant (for example, up-to-date security patches and virus signatures) before allowing access to the apps.|
| Windows servers| Active Directory provides strong management capabilities for on-premises Windows servers using Group Policy or other management solutions.| Windows Server virtual machines in Azure can be managed with [Microsoft Entra Domain Services](../../active-directory-domain-services/index.yml). [Managed identities](../managed-identities-azure-resources/index.yml) can be used when VMs need access to the identity system directory or resources.|
| Linux/Unix workloads|Active Directory doesn't natively support non-Windows workloads without third-party solutions, although Linux machines can be configured to authenticate with Active Directory as a Kerberos realm.|Linux/Unix VMs can use [managed identities](../managed-identities-azure-resources/index.yml) to access the identity system or resources. Some organizations migrate these workloads to cloud container technologies, which can also use managed identities.|
## Next steps
- [What is Microsoft Entra ID?](./whatis.md)
-- [Compare self-managed Active Directory Domain Services, Microsoft Entra ID, and managed Microsoft Entra Domain Services](../../active-directory-domain-services/compare-identity-solutions.md)
+- [Compare self-managed Active Directory Domain Services, Microsoft Entra ID, and managed Microsoft Entra Domain Services](/entra/identity/domain-services/compare-identity-solutions)
- [Frequently asked questions about Microsoft Entra ID](./active-directory-faq.yml)
- [What's new in Microsoft Entra ID?](./whats-new.md)
active-directory Concept Secure Remote Workers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/concept-secure-remote-workers.md
The following table is intended to highlight the key actions for the following l
## Next steps
- For detailed deployment guidance for individual features of Microsoft Entra ID, review the [Microsoft Entra ID project deployment plans](../architecture/deployment-plans.md).
-- Organizations can use [identity secure score](identity-secure-score.md) to track their progress against other Microsoft recommendations.
+- Organizations can use [identity secure score](../reports-monitoring/concept-identity-secure-score.md) to track their progress against other Microsoft recommendations.
active-directory How To Manage Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/how-to-manage-groups.md
To create a basic group and add members:
### Turn off group welcome email
-A welcome notification is sent to all users when they're added to a new Microsoft 365 group, regardless of the membership type. When an attribute of a user or device changes, all dynamic group rules in the organization are processed for potential membership changes. Users who are added then also receive the welcome notification. You can turn off this behavior in [Exchange PowerShell](/powershell/module/exchange/users-and-groups/Set-UnifiedGroup).
+A welcome notification is sent to all users when they're added to a new Microsoft 365 group, regardless of the membership type. When an attribute of a user or device changes, all dynamic group rules in the organization are processed for potential membership changes. Users who are added then also receive the welcome notification. You can turn off this behavior in [Exchange PowerShell](/powershell/module/exchange/set-unifiedgroup).
## Add or remove members and owners
active-directory How To View Support Access Request Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/how-to-view-support-access-request-logs.md
There are three activities that can be associated with an automated or system-in
## Next steps
- [Manage Microsoft Support access requests](how-to-manage-support-access-requests.md)
-- [Learn about audit logs](../../active-directory/reports-monitoring/concept-audit-logs.md)
+- [Learn about audit logs](../reports-monitoring/concept-audit-logs.md)
active-directory Introduction Identity Access Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/introduction-identity-access-management.md
These are the most well-known and commonly used authentication and authorization
OAuth is an open-standards identity management protocol that provides secure access for websites, mobile apps, and Internet of Things and other devices. It uses tokens that are encrypted in transit and eliminates the need to share credentials. OAuth 2.0, the latest release of OAuth, is a popular framework used by major social media platforms and consumer services, from Facebook and LinkedIn to Google, PayPal, and Netflix. To learn more, read about [OAuth 2.0 protocol](/azure/active-directory/develop/active-directory-v2-protocols).
#### OpenID Connect (OIDC)
-With the release of OpenID Connect (which uses public-key encryption), OpenID became a widely adopted authentication layer for OAuth. Like SAML, OpenID Connect (OIDC) is widely used for single sign-on (SSO), but OIDC uses REST/JSON instead of XML. OIDC was designed to work with both native and mobile apps by using REST/JSON protocols. The primary use case for SAML, however, is web-based apps. To learn more, read about [OpenID Connect protocol](/azure/active-directory/develop/active-directory-v2-protocols).
+With the release of OpenID Connect (which uses public-key encryption), OpenID became a widely adopted authentication layer for OAuth. Like SAML, OpenID Connect (OIDC) is widely used for single sign-on (SSO), but OIDC uses REST/JSON instead of XML. OIDC was designed to work with both native and mobile apps by using REST/JSON protocols. The primary use case for SAML, however, is web-based apps. To learn more, read about [OpenID Connect protocol](../develop/v2-protocols.md).
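Because OIDC's ID tokens are JWTs whose segments are just base64url-encoded JSON, they can be inspected with a few lines of script. The sketch below decodes a token's payload segment for debugging only; it deliberately skips signature verification, so never use it to make trust decisions, and `$token` is a placeholder for an ID token you already hold.

```powershell
# Decode the payload (claims) segment of a JWT - inspection only, no signature check
$segment = $token.Split('.')[1]
# Convert base64url to standard base64 and restore the stripped padding
$segment = $segment.Replace('-', '+').Replace('_', '/')
$segment += '=' * ((4 - $segment.Length % 4) % 4)
$claims = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($segment)) | ConvertFrom-Json
$claims.iss   # the issuing authority's URL, one of the standard OIDC claims
```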
#### JSON web tokens (JWTs)
active-directory New Name https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/new-name.md
The following table lists terminology that is not impacted by the Azure AD renam
| Active Directory <br/><br/>&#8226; Windows Server Active Directory <br/>&#8226; Active Directory Federation Services (AD FS) <br/>&#8226; Active Directory Domain Services (AD DS) <br/>&#8226; Active Directory <br/>&#8226; Any Active Directory feature(s) | Windows Server Active Directory, commonly known as Active Directory, and related features and services associated with Active Directory aren't branded with Microsoft Entra. |
| Authentication library <br/><br/>&#8226; Azure AD Authentication Library (ADAL) <br/>&#8226; Microsoft Authentication Library (MSAL) | Azure Active Directory Authentication Library (ADAL) is deprecated. While existing apps that use ADAL continue to work, Microsoft will no longer release security fixes on ADAL. Migrate applications to the Microsoft Authentication Library (MSAL) to avoid putting your app's security at risk. <br/><br/>[Microsoft Authentication Library (MSAL)](../develop/msal-overview.md) - Provides security tokens from the Microsoft identity platform to authenticate users and access secured web APIs to provide secure access to Microsoft Graph, other Microsoft APIs, third-party web APIs, or your own web API. |
| B2C <br/><br/>&#8226; Azure Active Directory B2C <br/>&#8226; Azure AD B2C | [Azure Active Directory B2C](/azure/active-directory-b2c) isn't being renamed. We're continuing to invest in security, availability, and reliability in Azure AD B2C and our next-generation solution for external identities, [Microsoft Entra External ID](../external-identities/index.yml). |
-| Graph <br/><br/>&#8226; Azure Active Directory Graph <br/>&#8226; Azure AD Graph <br/>&#8226; Microsoft Graph | Azure Active Directory (Azure AD) Graph is deprecated. Going forward, further investment in Azure AD Graph won't be made, and Azure AD Graph APIs have no SLA or maintenance commitment beyond security-related fixes. Investments in new features and functionalities will only be made in Microsoft Graph.<br/><br/>[Microsoft Graph](/graph) - Grants programmatic access to organization, user, and application data stored in Microsoft Entra ID. |
+| Graph <br/><br/>&#8226; Azure Active Directory Graph <br/>&#8226; Azure AD Graph <br/>&#8226; Microsoft Graph | Azure Active Directory (Azure AD) Graph is deprecated. Going forward, further investment in Azure AD Graph won't be made, and Azure AD Graph APIs have no SLA or maintenance commitment beyond security-related fixes. Investments in new features and functionalities will only be made in Microsoft Graph.<br/><br/>[Microsoft Graph](/graph/) - Grants programmatic access to organization, user, and application data stored in Microsoft Entra ID. |
| PowerShell <br/><br/>&#8226; Azure Active Directory PowerShell <br/>&#8226; Azure AD PowerShell <br/>&#8226; Microsoft Graph PowerShell | Azure AD PowerShell for Graph is planned for deprecation on March 30, 2024. For more info on the deprecation plans, see the deprecation update. We encourage you to migrate to Microsoft Graph PowerShell, which is the recommended module for interacting with Azure AD. <br/><br/>[Microsoft Graph PowerShell](/powershell/microsoftgraph/overview) - Acts as an API wrapper for the Microsoft Graph APIs and helps administer every Microsoft Entra ID feature that has an API in Microsoft Graph. |
| Accounts <br/><br/>&#8226; Microsoft account <br/>&#8226; Work or school account | For end user sign-ins and account experiences, follow guidance for work and school accounts in [Sign in with Microsoft branding guidelines](../develop/howto-add-branding-in-apps.md). |
| Microsoft identity platform | The Microsoft identity platform encompasses all our identity and access developer assets. It continues to provide the resources to help you build applications that your users and customers can sign in to using their Microsoft identities or social accounts. |
Only official product names are capitalized, plus Conditional Access and My * ap
- [How to: Rename Azure AD](how-to-rename-azure-ad.md)
- [Stay up-to-date with what's new in Microsoft Entra ID (formerly Azure AD)](./whats-new.md)
- [Get started using Microsoft Entra ID at the Microsoft Entra admin center](https://entra.microsoft.com/)
-- [Learn more about the Microsoft Entra family with content from Microsoft Learn](/entra)
+- [Learn more about the Microsoft Entra family with content from Microsoft Learn](/entra/)
<!-- docutune:ignore "Azure Active Directory" "Azure AD" "AAD" "Entra ID" "Cloud Knox" "Identity Governance" -->
active-directory What Is Deprecated https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/what-is-deprecated.md
Use the following table to learn about changes including deprecations, retiremen
|||:|
|[Azure AD Authentication Library (ADAL)](https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-change-announcements-march-2023-train/ba-p/2967448)|Retirement|Jun 30, 2023|
|[My Apps improvements](https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-change-announcements-march-2023-train/ba-p/2967448)|Feature change|Jun 30, 2023|
-|[Microsoft Authenticator Lite for Outlook mobile](../../active-directory/authentication/how-to-mfa-authenticator-lite.md)|Feature change|Jun 9, 2023|
+|[Microsoft Authenticator Lite for Outlook mobile](../authentication/how-to-mfa-authenticator-lite.md)|Feature change|Jun 9, 2023|
|[My Groups experience](https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-change-announcements-march-2023-train/ba-p/2967448)|Feature change|May 2023|
|[My Apps browser extension](https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-change-announcements-march-2023-train/ba-p/2967448)|Feature change|May 2023|
|Microsoft Authenticator app [Number matching](../authentication/how-to-mfa-number-match.md)|Feature change|May 8, 2023|
Use the definitions in this section to help clarify the state, availability, and su
* **End-of-life** - engineering investments have ended, and the feature is unavailable to any customer
## Next steps
-[What's new in Microsoft Entra ID?](../../active-directory/fundamentals/whats-new.md)
+[What's new in Microsoft Entra ID?](../fundamentals/whats-new.md)
## Resources
* [Microsoft Entra change announcement blog](https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-change-announcements-november-2022-train/ba-p/2967452)
active-directory Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/whats-new-archive.md
For more information, see: [Block users from viewing their BitLocker keys (previ
**Service category:** Identity Protection **Product capability:** Identity Security & Protection
-Identity Protection risk detections (alerts) are now also available in Microsoft 365 Defender to provide a unified investigation experience for security professionals. For more information, see: [Investigate alerts in Microsoft 365 Defender](/microsoft-365/security/defender/investigate-alerts?view=o365-worldwide#alert-sources&preserve-view=true)
+Identity Protection risk detections (alerts) are now also available in Microsoft 365 Defender to provide a unified investigation experience for security professionals. For more information, see: [Investigate alerts in Microsoft 365 Defender](/microsoft-365/security/defender/investigate-alerts?view=o365-worldwide&preserve-view=true#alert-sources)
To learn more about trusts and how to deploy your own, visit [How trust relation
In July 2022 we've added the following 28 new applications in our App gallery with Federation support:
-[Lunni Ticket Service](https://ticket.lunni.io/login), [Spring Health](https://benefits.springhealth.com/care), [Sorbet](https://lite.sorbetapp.com/login), [Planview ID](../saas-apps/planview-id-tutorial.md), [Karbonalpha](https://saas.karbonalpha.com/settings/api), [Headspace](../saas-apps/headspace-tutorial.md), [SeekOut](../saas-apps/seekout-tutorial.md), [Stackby](../saas-apps/stackby-tutorial.md), [Infrascale Cloud Backup](../saas-apps/infrascale-cloud-backup-tutorial.md), [Keystone](../saas-apps/keystone-tutorial.md), [LMS・教育管理システム Leaf](../saas-apps/lms-and-education-management-system-leaf-tutorial.md), [ZDiscovery](../saas-apps/zdiscovery-tutorial.md), [ラインズeライブラリアドバンス (Lines eLibrary Advance)](../saas-apps/lines-elibrary-advance-tutorial.md), [Rootly](../saas-apps/rootly-tutorial.md), [Articulate 360](../saas-apps/articulate360-tutorial.md), [Rise.com](../saas-apps/risecom-tutorial.md), [SevOne Network Monitoring System (NMS)](../saas-apps/sevone-network-monitoring-system-tutorial.md), [PGM](https://ups-pgm.4gfactor.com/azure/), [TouchRight Software](https://app.touchrightsoftware.com/), [Tendium](../saas-apps/tendium-tutorial.md), [Training Platform](../saas-apps/training-platform-tutorial.md), [Znapio](https://app.znapio.com/), [Preset](../saas-apps/preset-tutorial.md), [itslearning MS Teams sync](https://itslearning.com/global/), [Veza](../saas-apps/veza-tutorial.md),
+[Lunni Ticket Service](https://ticket.lunni.io/login), [Spring Health](https://benefits.springhealth.com/care), [Sorbet](https://lite.sorbetapp.com/login), [Planview ID](../saas-apps/planview-admin-tutorial.md), [Karbonalpha](https://saas.karbonalpha.com/settings/api), [Headspace](../saas-apps/headspace-tutorial.md), [SeekOut](../saas-apps/seekout-tutorial.md), [Stackby](../saas-apps/stackby-tutorial.md), [Infrascale Cloud Backup](../saas-apps/infrascale-cloud-backup-tutorial.md), [Keystone](../saas-apps/keystone-tutorial.md), [LMS・教育管理システム Leaf](../saas-apps/lms-and-education-management-system-leaf-tutorial.md), [ZDiscovery](../saas-apps/zdiscovery-tutorial.md), [ラインズeライブラリアドバンス (Lines eLibrary Advance)](../saas-apps/lines-elibrary-advance-tutorial.md), [Rootly](../saas-apps/rootly-tutorial.md), [Articulate 360](../saas-apps/articulate360-tutorial.md), [Rise.com](../saas-apps/risecom-tutorial.md), [SevOne Network Monitoring System (NMS)](../saas-apps/sevone-network-monitoring-system-tutorial.md), [PGM](https://ups-pgm.4gfactor.com/azure/), [TouchRight Software](https://app.touchrightsoftware.com/), [Tendium](../saas-apps/tendium-tutorial.md), [Training Platform](../saas-apps/training-platform-tutorial.md), [Znapio](https://app.znapio.com/), [Preset](../saas-apps/preset-tutorial.md), [itslearning MS Teams sync](https://itslearning.com/global/), [Veza](../saas-apps/veza-tutorial.md),
You can also find the documentation for all the applications at https://aka.ms/AppsTutorial,
Pick a group of up to five members and provision them into your third-party appl
**Product capability:** Identity Security & Protection
-We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta#federatedidpmfabehavior-values&preserve-view=true).
+We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta&preserve-view=true#federatedidpmfabehavior-values).
We highly recommend enabling this new protection when you use Azure AD Multi-Factor Authentication for your federated users. To learn more about the protection and how to enable it, visit [Enable protection to prevent by-passing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD](/windows-server/identity/ad-fs/deployment/best-practices-securing-ad-fs#enable-protection-to-prevent-by-passing-of-cloud-azure-ad-multi-factor-authentication-when-federated-with-azure-ad).
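As a sketch, the setting can be applied through Microsoft Graph. The domain name and `{configuration-id}` below are placeholders, and `rejectMfaByFederatedIdp` is one of the documented behavior values; confirm the current API shape against the internalDomainFederation reference linked above before relying on it.

```powershell
# Hypothetical IDs; requires an account consented to Domain.ReadWrite.All
Invoke-MgGraphRequest -Method PATCH `
    -Uri "https://graph.microsoft.com/beta/domains/contoso.com/federationConfiguration/{configuration-id}" `
    -Body @{ federatedIdpMfaBehavior = "rejectMfaByFederatedIdp" }
```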
For listing your application in the Azure AD app gallery, see the details here h
-We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0#federatedidpmfabehavior-values&preserve-view=true).
+We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-1.0&preserve-view=true#federatedidpmfabehavior-values).
We highly recommend enabling this new protection when you use Azure AD Multi-Factor Authentication for your federated users. To learn more about the protection and how to enable it, visit [Enable protection to prevent by-passing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD](/windows-server/identity/ad-fs/deployment/best-practices-securing-ad-fs#enable-protection-to-prevent-by-passing-of-cloud-azure-ad-multi-factor-authentication-when-federated-with-azure-ad).
active-directory Whats New Sovereign Clouds Archive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/whats-new-sovereign-clouds-archive.md
With Continuous access evaluation (CAE), critical security events and policies a
**Product capability:** Identity Security & Protection
-We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta#federatedidpmfabehavior-values&preserve-view=true).
+We're delighted to announce a new security protection that prevents bypassing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD. When enabled for a federated domain in your Azure AD tenant, it ensures that a compromised federated account can't bypass Azure AD Multi-Factor Authentication by falsely claiming that multifactor authentication has already been performed by the identity provider. The protection can be enabled via a new security setting, [federatedIdpMfaBehavior](/graph/api/resources/internaldomainfederation?view=graph-rest-beta&preserve-view=true#federatedidpmfabehavior-values).
We highly recommend enabling this new protection when you use Azure AD Multi-Factor Authentication for your federated users. To learn more about the protection and how to enable it, visit [Enable protection to prevent by-passing of cloud Azure AD Multi-Factor Authentication when federated with Azure AD](/windows-server/identity/ad-fs/deployment/best-practices-securing-ad-fs#enable-protection-to-prevent-by-passing-of-cloud-azure-ad-multi-factor-authentication-when-federated-with-azure-ad).
active-directory Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/whats-new.md
TRv2 allows organizations to enable safe and productive cross-company collaborat
TRv2 uses the cross-tenant access policy, and offers both authentication and data plane protection. It enforces policies during user authentication, and on data plane access with Exchange Online, SharePoint Online, Teams, and MSGraph. While the data plane support with Windows GPO and Global Secure Access is still in public preview, authentication plane support with proxy is now generally available.
-Visit https://aka.ms/tenant-restrictions-enforcement for more information on tenant restriction V2 and Global Secure Access client-side tagging for TRv2 at [Universal tenant restrictions](/azure/global-secure-access/how-to-universal-tenant-restrictions).
+Visit https://aka.ms/tenant-restrictions-enforcement for more information on tenant restriction V2 and Global Secure Access client-side tagging for TRv2 at [Universal tenant restrictions](/entra/global-secure-access/how-to-universal-tenant-restrictions).
For more information, see: [Require an app protection policy on Windows devices
In July 2023, we added the following new applications with Federation support to our App gallery:
-[Gainsight SAML](../saas-apps/gainsight-saml-tutorial.md), [Dataddo](https://www.dataddo.com/), [Puzzel](https://www.puzzel.com/), [Worthix App](../saas-apps/worthix-app-tutorial.md), [iOps360 IdConnect](https://iops360.com/iops360-id-connect-azuread-single-sign-on/), [Airbase](../saas-apps/airbase-tutorial.md), [Couchbase Capella - SSO](../saas-apps/couchbase-capella-sso-tutorial.md), [SSO for Jama Connect®](../saas-apps/sso-for-jama-connect-tutorial.md), [mediment (メディメント)](https://mediment.jp/), [Netskope Cloud Exchange Administration Console](../saas-apps/netskope-cloud-exchange-administration-console-tutorial.md), [Uber](../saas-apps/uber-tutorial.md), [Plenda](https://app.plenda.nl/), [Deem Mobile](../saas-apps/deem-mobile-tutorial.md), [40SEAS](https://www.40seas.com/), [Vivantio](https://www.vivantio.com/), [AppTweak](https://www.apptweak.com/), [Vbrick Rev Cloud](../saas-apps/vbrick-rev-cloud-tutorial.md), [OptiTurn](../saas-apps/optiturn-tutorial.md), [Application Experience with Mist](https://www.mist.com/), [クラウド勤怠管理システムKING OF TIME](../saas-apps/cloud-attendance-management-system-king-of-time-tutorial.md), [Connect1](../saas-apps/connect1-tutorial.md), [DB Education Portal for Schools](../saas-apps/db-education-portal-for-schools-tutorial.md), [SURFconext](../saas-apps/surfconext-tutorial.md), [Chengliye Smart SMS Platform](../saas-apps/chengliye-smart-sms-platform-tutorial.md), [CivicEye SSO](../saas-apps/civic-eye-sso-tutorial.md), [Colloquial](../saas-apps/colloquial-tutorial.md), [BigPanda](../saas-apps/bigpanda-tutorial.md), [Foreman](https://foreman.mn/)
+[Gainsight SAML](../saas-apps/gainsight-tutorial.md), [Dataddo](https://www.dataddo.com/), [Puzzel](https://www.puzzel.com/), [Worthix App](../saas-apps/worthix-app-tutorial.md), [iOps360 IdConnect](https://iops360.com/iops360-id-connect-azuread-single-sign-on/), [Airbase](../saas-apps/airbase-tutorial.md), [Couchbase Capella - SSO](../saas-apps/couchbase-capella-sso-tutorial.md), [SSO for Jama Connect®](../saas-apps/sso-for-jama-connect-tutorial.md), [mediment (メディメント)](https://mediment.jp/), [Netskope Cloud Exchange Administration Console](../saas-apps/netskope-cloud-exchange-administration-console-tutorial.md), [Uber](../saas-apps/uber-tutorial.md), [Plenda](https://app.plenda.nl/), [Deem Mobile](../saas-apps/deem-mobile-tutorial.md), [40SEAS](https://www.40seas.com/), [Vivantio](https://www.vivantio.com/), [AppTweak](https://www.apptweak.com/), [Vbrick Rev Cloud](../saas-apps/vbrick-rev-cloud-tutorial.md), [OptiTurn](../saas-apps/optiturn-tutorial.md), [Application Experience with Mist](https://www.mist.com/), [クラウド勤怠管理システムKING OF TIME](../saas-apps/cloud-attendance-management-system-king-of-time-tutorial.md), [Connect1](../saas-apps/connect1-tutorial.md), [DB Education Portal for Schools](../saas-apps/db-education-portal-for-schools-tutorial.md), [SURFconext](../saas-apps/surfconext-tutorial.md), [Chengliye Smart SMS Platform](../saas-apps/chengliye-smart-sms-platform-tutorial.md), [CivicEye SSO](../saas-apps/civic-eye-sso-tutorial.md), [Colloquial](../saas-apps/colloquial-tutorial.md), [BigPanda](../saas-apps/bigpanda-tutorial.md), [Foreman](https://foreman.mn/)
You can also find the documentation for all the applications at https://aka.ms/AppsTutorial.
active-directory Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/apps.md
| Category | Application |
| :--- | :--- |
-| HR | [SuccessFactors - User Provisioning](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) |
-| HR | [Workday - User Provisioning](../../active-directory/saas-apps/workday-inbound-cloud-only-tutorial.md)|
-|[LDAP directory](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md)| OpenLDAP<br>Microsoft Active Directory Lightweight Directory Services<br>389 Directory Server<br>Apache Directory Server<br>IBM Tivoli DS<br>Isode Directory<br>NetIQ eDirectory<br>Novell eDirectory<br>Open DJ<br>Open DS<br>Oracle (previously Sun ONE) Directory Server Enterprise Edition<br>RadiantOne Virtual Directory Server (VDS) |
-| [SQL database](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md)| Microsoft SQL Server and Azure SQL<br>IBM DB2 10.x<br>IBM DB2 9.x<br>Oracle 10g and 11g<br>Oracle 12c and 18c<br>MySQL 5.x|
-| Cloud platform| [AWS IAM Identity Center](../../active-directory/saas-apps/aws-single-sign-on-provisioning-tutorial.md) |
-| Cloud platform| [Google Cloud Platform - User Provisioning](../../active-directory/saas-apps/g-suite-provisioning-tutorial.md) |
-| Business applications|[SAP Cloud Identity Platform - Provisioning](../../active-directory/saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) |
-| CRM| [Salesforce - User Provisioning](../../active-directory/saas-apps/salesforce-provisioning-tutorial.md) |
-| ITSM| [ServiceNow](../../active-directory/saas-apps/servicenow-provisioning-tutorial.md)|
+| HR | [SuccessFactors - User Provisioning](../saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) |
+| HR | [Workday - User Provisioning](../saas-apps/workday-inbound-cloud-only-tutorial.md)|
+|[LDAP directory](../app-provisioning/on-premises-ldap-connector-configure.md)| OpenLDAP<br>Microsoft Active Directory Lightweight Directory Services<br>389 Directory Server<br>Apache Directory Server<br>IBM Tivoli DS<br>Isode Directory<br>NetIQ eDirectory<br>Novell eDirectory<br>Open DJ<br>Open DS<br>Oracle (previously Sun ONE) Directory Server Enterprise Edition<br>RadiantOne Virtual Directory Server (VDS) |
+| [SQL database](../app-provisioning/tutorial-ecma-sql-connector.md)| Microsoft SQL Server and Azure SQL<br>IBM DB2 10.x<br>IBM DB2 9.x<br>Oracle 10g and 11g<br>Oracle 12c and 18c<br>MySQL 5.x|
+| Cloud platform| [AWS IAM Identity Center](../saas-apps/aws-single-sign-on-provisioning-tutorial.md) |
+| Cloud platform| [Google Cloud Platform - User Provisioning](../saas-apps/g-suite-provisioning-tutorial.md) |
+| Business applications|[SAP Cloud Identity Platform - Provisioning](../saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) |
+| CRM| [Salesforce - User Provisioning](../saas-apps/salesforce-provisioning-tutorial.md) |
+| ITSM| [ServiceNow](../saas-apps/servicenow-provisioning-tutorial.md)|
<a name='entra-identity-governance-integrations'></a>

## Microsoft Entra ID Governance integrations
-The list below provides key integrations between Microsoft Entra ID Governance and various applications, including both provisioning and SSO integrations. For a full list of applications that Microsoft Entra ID integrates with specifically for SSO, see [here](../../active-directory/saas-apps/tutorial-list.md).
+The list below provides key integrations between Microsoft Entra ID Governance and various applications, including both provisioning and SSO integrations. For a full list of applications that Microsoft Entra ID integrates with specifically for SSO, see [here](../saas-apps/tutorial-list.md).
Microsoft Entra ID Governance can be integrated with many other applications, using standards such as OpenID Connect, SAML, SCIM, SQL and LDAP. If you're using a SaaS application which isn't listed, then [ask the SaaS vendor to onboard](../manage-apps/v2-howto-app-gallery-listing.md). For integration with other applications, see [integrating applications with Microsoft Entra ID](identity-governance-applications-integrate.md).

| Application | Automated provisioning | Single Sign On (SSO) |
| :--- | :-: | :-: |
-| 389 directory server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [4me](../../active-directory/saas-apps/4me-provisioning-tutorial.md) | ● | ●|
-| [8x8](../../active-directory/saas-apps/8x8-provisioning-tutorial.md) | ● | ● |
-| [15five](../../active-directory/saas-apps/15five-provisioning-tutorial.md) | ● | ● |
-| [Acunetix 360](../../active-directory/saas-apps/acunetix-360-provisioning-tutorial.md) | ● | ● |
-| [Adobe Identity Management](../../active-directory/saas-apps/adobe-identity-management-provisioning-tutorial.md) | ● | ● |
-| [Adobe Identity Management (OIDC)](../../active-directory/saas-apps/adobe-identity-management-provisioning-oidc-tutorial.md) | ● | ● |
-| [Airbase](../../active-directory/saas-apps/airbase-provisioning-tutorial.md) | ● | ● |
-| [Aha!](../../active-directory/saas-apps/aha-tutorial.md) | | ● |
-| [Airstack](../../active-directory/saas-apps/airstack-provisioning-tutorial.md) | ● | |
-| [Akamai Enterprise Application Access](../../active-directory/saas-apps/akamai-enterprise-application-access-provisioning-tutorial.md) | ● | ● |
-| [Airtable](../../active-directory/saas-apps/airtable-provisioning-tutorial.md) | ● | ● |
-| [Albert](../../active-directory/saas-apps/albert-provisioning-tutorial.md) | ● | |
-| [AlertMedia](../../active-directory/saas-apps/alertmedia-provisioning-tutorial.md) | ● | ● |
-| [Alexis HR](../../active-directory/saas-apps/alexishr-provisioning-tutorial.md) | ● | ● |
-| [Alinto Protect (renamed Cleanmail)](../../active-directory/saas-apps/alinto-protect-provisioning-tutorial.md) | ● | |
-| [Alvao](../../active-directory/saas-apps/alvao-provisioning-tutorial.md) | ● | |
-| [Amazon Web Services (AWS) - Role Provisioning](../../active-directory/saas-apps/amazon-web-service-tutorial.md) | ● | ● |
-| Apache Directory Server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Appaegis Isolation Access Cloud](../../active-directory/saas-apps/appaegis-isolation-access-cloud-provisioning-tutorial.md) | ● | ● |
-| [Apple School Manager](../../active-directory/saas-apps/apple-school-manager-provision-tutorial.md) | ● | |
-| [Apple Business Manager](../../active-directory/saas-apps/apple-business-manager-provision-tutorial.md) | ● | |
-| [Ardoq](../../active-directory/saas-apps/ardoq-provisioning-tutorial.md) | ● | ● |
-| [Asana](../../active-directory/saas-apps/asana-provisioning-tutorial.md) | ● | ● |
-| [AskSpoke](../../active-directory/saas-apps/askspoke-provisioning-tutorial.md) | ● | ● |
-| [Atea](../../active-directory/saas-apps/atea-provisioning-tutorial.md) | ● | |
-| [Atlassian Cloud](../../active-directory/saas-apps/atlassian-cloud-provisioning-tutorial.md) | ● | ● |
-| [Atmos](../../active-directory/saas-apps/atmos-provisioning-tutorial.md) | ● | |
-| [AuditBoard](../../active-directory/saas-apps/auditboard-provisioning-tutorial.md) | ● | |
-| [Autodesk SSO](../../active-directory/saas-apps/autodesk-sso-provisioning-tutorial.md) | ● | ● |
+| 389 directory server ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [4me](../saas-apps/4me-provisioning-tutorial.md) | ● | ●|
+| [8x8](../saas-apps/8x8-provisioning-tutorial.md) | ● | ● |
+| [15five](../saas-apps/15five-provisioning-tutorial.md) | ● | ● |
+| [Acunetix 360](../saas-apps/acunetix-360-provisioning-tutorial.md) | ● | ● |
+| [Adobe Identity Management](../saas-apps/adobe-identity-management-provisioning-tutorial.md) | ● | ● |
+| [Adobe Identity Management (OIDC)](../saas-apps/adobe-identity-management-provisioning-oidc-tutorial.md) | ● | ● |
+| [Airbase](../saas-apps/airbase-provisioning-tutorial.md) | ● | ● |
+| [Aha!](../saas-apps/aha-tutorial.md) | | ● |
+| [Airstack](../saas-apps/airstack-provisioning-tutorial.md) | ● | |
+| [Akamai Enterprise Application Access](../saas-apps/akamai-enterprise-application-access-provisioning-tutorial.md) | ● | ● |
+| [Airtable](../saas-apps/airtable-provisioning-tutorial.md) | ● | ● |
+| [Albert](../saas-apps/albert-provisioning-tutorial.md) | ● | |
+| [AlertMedia](../saas-apps/alertmedia-provisioning-tutorial.md) | ● | ● |
+| [Alexis HR](../saas-apps/alexishr-provisioning-tutorial.md) | ● | ● |
+| [Alinto Protect (renamed Cleanmail)](../saas-apps/alinto-protect-provisioning-tutorial.md) | ● | |
+| [Alvao](../saas-apps/alvao-provisioning-tutorial.md) | ● | |
+| [Amazon Business](../saas-apps/amazon-business-provisioning-tutorial.md) | ● | ● |
+| [Amazon Web Services (AWS) - Role Provisioning](../saas-apps/amazon-web-service-tutorial.md) | ● | ● |
+| Apache Directory Server ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [Appaegis Isolation Access Cloud](../saas-apps/appaegis-isolation-access-cloud-provisioning-tutorial.md) | ● | ● |
+| [Apple School Manager](../saas-apps/apple-school-manager-provision-tutorial.md) | ● | |
+| [Apple Business Manager](../saas-apps/apple-business-manager-provision-tutorial.md) | ● | |
+| [Ardoq](../saas-apps/ardoq-provisioning-tutorial.md) | ● | ● |
+| [Asana](../saas-apps/asana-provisioning-tutorial.md) | ● | ● |
+| [AskSpoke](../saas-apps/askspoke-provisioning-tutorial.md) | ● | ● |
+| [Atea](../saas-apps/atea-provisioning-tutorial.md) | ● | |
+| [Atlassian Cloud](../saas-apps/atlassian-cloud-provisioning-tutorial.md) | ● | ● |
+| [Atmos](../saas-apps/atmos-provisioning-tutorial.md) | ● | |
+| [AuditBoard](../saas-apps/auditboard-provisioning-tutorial.md) | ● | |
+| [Autodesk SSO](../saas-apps/autodesk-sso-provisioning-tutorial.md) | ● | ● |
| [Azure Databricks SCIM Connector](/azure/databricks/administration-guide/users-groups/scim/aad) | ● | |
-| [AWS IAM Identity Center](../../active-directory/saas-apps/aws-single-sign-on-provisioning-tutorial.md) | ● | ● |
-| [Axiad Cloud](../../active-directory/saas-apps/axiad-cloud-provisioning-tutorial.md) | ● | ● |
-| [BambooHR](../../active-directory/saas-apps/bamboo-hr-tutorial.md) | | ● |
-| [BenQ IAM](../../active-directory/saas-apps/benq-iam-provisioning-tutorial.md) | ● | ● |
-| [Bentley - Automatic User Provisioning](../../active-directory/saas-apps/bentley-automatic-user-provisioning-tutorial.md) | ● | |
-| [Better Stack](../../active-directory/saas-apps/better-stack-provisioning-tutorial.md) | ● | |
-| [BIC Cloud Design](../../active-directory/saas-apps/bic-cloud-design-provisioning-tutorial.md) | ● | ● |
-| [BIS](../../active-directory/saas-apps/bis-provisioning-tutorial.md) | ● | ● |
-| [BitaBIZ](../../active-directory/saas-apps/bitabiz-provisioning-tutorial.md) | ● | ● |
-| [Bizagi Studio for Digital Process Automation](../../active-directory/saas-apps/bizagi-studio-for-digital-process-automation-provisioning-tutorial.md) | ● | ● |
-| [BLDNG APP](../../active-directory/saas-apps/bldng-app-provisioning-tutorial.md) | ● | |
-| [Blink](../../active-directory/saas-apps/blink-provisioning-tutorial.md) | ● | ● |
-| [Blinq](../../active-directory/saas-apps/blinq-provisioning-tutorial.md) | ● | |
-| [BlogIn](../../active-directory/saas-apps/blogin-provisioning-tutorial.md) | ● | ● |
-| [BlueJeans](../../active-directory/saas-apps/bluejeans-provisioning-tutorial.md) | ● | ● |
-| [Bonusly](../../active-directory/saas-apps/bonusly-provisioning-tutorial.md) | ● | ● |
-| [Box](../../active-directory/saas-apps/box-userprovisioning-tutorial.md) | ● | ● |
-| [Boxcryptor](../../active-directory/saas-apps/boxcryptor-provisioning-tutorial.md) | ● | ● |
-| [Bpanda](../../active-directory/saas-apps/bpanda-provisioning-tutorial.md) | ● | |
-| [Brivo Onair Identity Connector](../../active-directory/saas-apps/brivo-onair-identity-connector-provisioning-tutorial.md) | ● | |
-| [Britive](../../active-directory/saas-apps/britive-provisioning-tutorial.md) | ● | ● |
-| [BrowserStack Single Sign-on](../../active-directory/saas-apps/browserstack-single-sign-on-provisioning-tutorial.md) | ● | ● |
-| [BullseyeTDP](../../active-directory/saas-apps/bullseyetdp-provisioning-tutorial.md) | ● | ● |
-| [Cato Networks Provisioning](../../active-directory/saas-apps/cato-networks-provisioning-tutorial.md) | ● | |
-| [Cerner Central](../../active-directory/saas-apps/cernercentral-provisioning-tutorial.md) | ● | ● |
-| [Cerby](../../active-directory/saas-apps/cerby-provisioning-tutorial.md) | ● | ● |
-| [Chaos](../../active-directory/saas-apps/chaos-provisioning-tutorial.md) | ● | |
-| [Chatwork](../../active-directory/saas-apps/chatwork-provisioning-tutorial.md) | ● | ● |
-| [CheckProof](../../active-directory/saas-apps/checkproof-provisioning-tutorial.md) | ● | ● |
-| [Cinode](../../active-directory/saas-apps/cinode-provisioning-tutorial.md) | ● | |
-| [Cisco Umbrella User Management](../../active-directory/saas-apps/cisco-umbrella-user-management-provisioning-tutorial.md) | ● | ● |
-| [Cisco Webex](../../active-directory/saas-apps/cisco-webex-provisioning-tutorial.md) | ● | ● |
-| [Clarizen One](../../active-directory/saas-apps/clarizen-one-provisioning-tutorial.md) | ● | ● |
-| [Cleanmail Swiss](../../active-directory/saas-apps/cleanmail-swiss-provisioning-tutorial.md) | ● | |
-| [Clebex](../../active-directory/saas-apps/clebex-provisioning-tutorial.md) | ● | ● |
-| [Cloud Academy SSO](../../active-directory/saas-apps/cloud-academy-sso-provisioning-tutorial.md) | ● | ● |
-| [Coda](../../active-directory/saas-apps/coda-provisioning-tutorial.md) | ● | ● |
-| [Code42](../../active-directory/saas-apps/code42-provisioning-tutorial.md) | ● | ● |
-| [Cofense Recipient Sync](../../active-directory/saas-apps/cofense-provision-tutorial.md) | ● | |
-| [Comeet Recruiting Software](../../active-directory/saas-apps/comeet-recruiting-software-provisioning-tutorial.md) | ● | ● |
-| [Connecter](../../active-directory/saas-apps/connecter-provisioning-tutorial.md) | ● | |
-| [Contentful](../../active-directory/saas-apps/contentful-provisioning-tutorial.md) | ● | ● |
-| [Concur](../../active-directory/saas-apps/concur-provisioning-tutorial.md) | ● | ● |
-| [Cornerstone OnDemand](../../active-directory/saas-apps/cornerstone-ondemand-provisioning-tutorial.md) | ● | ● |
-| [CybSafe](../../active-directory/saas-apps/cybsafe-provisioning-tutorial.md) | ● | |
-| [Dagster Cloud](../../active-directory/saas-apps/dagster-cloud-provisioning-tutorial.md) | ● | ● |
-| [Datadog](../../active-directory/saas-apps/datadog-provisioning-tutorial.md) | ● | ● |
-| [Documo](../../active-directory/saas-apps/documo-provisioning-tutorial.md) | ● | ● |
-| [DocuSign](../../active-directory/saas-apps/docusign-provisioning-tutorial.md) | ● | ● |
-| [Dropbox Business](../../active-directory/saas-apps/dropboxforbusiness-provisioning-tutorial.md) | ● | ● |
-| [Dialpad](../../active-directory/saas-apps/dialpad-provisioning-tutorial.md) | ● | |
-| [DigiCert](../../active-directory/saas-apps/digicert-tutorial.md) | | ● |
-| [Directprint.io](../../active-directory/saas-apps/directprint-io-provisioning-tutorial.md) | ● | ● |
-| [Druva](../../active-directory/saas-apps/druva-provisioning-tutorial.md) | ● | ● |
-| [Dynamic Signal](../../active-directory/saas-apps/dynamic-signal-provisioning-tutorial.md) | ● | ● |
-| [Embed Signage](../../active-directory/saas-apps/embed-signage-provisioning-tutorial.md) | ● | ● |
-| [Envoy](../../active-directory/saas-apps/envoy-provisioning-tutorial.md) | ● | ● |
-| [Eletive](../../active-directory/saas-apps/eletive-provisioning-tutorial.md) | ● | |
-| [Elium](../../active-directory/saas-apps/elium-provisioning-tutorial.md) | ● | ● |
-| [Exium](../../active-directory/saas-apps/exium-provisioning-tutorial.md) | ● | ● |
-| [Evercate](../../active-directory/saas-apps/evercate-provisioning-tutorial.md) | ● | |
-| [Facebook Work Accounts](../../active-directory/saas-apps/facebook-work-accounts-provisioning-tutorial.md) | ● | ● |
-| [Federated Directory](../../active-directory/saas-apps/federated-directory-provisioning-tutorial.md) | ● | |
-| [Figma](../../active-directory/saas-apps/figma-provisioning-tutorial.md) | ● | ● |
-| [Flock](../../active-directory/saas-apps/flock-provisioning-tutorial.md) | ● | ● |
-| [Foodee](../../active-directory/saas-apps/foodee-provisioning-tutorial.md) | ● | ● |
-| [Fortes Change Cloud](../../active-directory/saas-apps/fortes-change-cloud-provisioning-tutorial.md) | ● | ● |
-| [Frankli.io](../../active-directory/saas-apps/frankli-io-provisioning-tutorial.md) | ● | |
-| [Freshservice Provisioning](../../active-directory/saas-apps/freshservice-provisioning-tutorial.md) | ● | ● |
-| [Funnel Leasing](../../active-directory/saas-apps/funnel-leasing-provisioning-tutorial.md) | ● | ● |
-| [Fuze](../../active-directory/saas-apps/fuze-provisioning-tutorial.md) | ● | ● |
-| [G Suite](../../active-directory/saas-apps/g-suite-provisioning-tutorial.md) | ● | |
-| [Genesys Cloud for Azure](../../active-directory/saas-apps/purecloud-by-genesys-provisioning-tutorial.md) | ● | ● |
-| [getAbstract](../../active-directory/saas-apps/getabstract-provisioning-tutorial.md) | ● | ● |
-| [GHAE](../../active-directory/saas-apps/ghae-provisioning-tutorial.md) | ● | ● |
-| [GitHub](../../active-directory/saas-apps/github-provisioning-tutorial.md) | ● | ● |
-| [GitHub AE](../../active-directory/saas-apps/github-ae-provisioning-tutorial.md) | ● | ● |
-| [GitHub Enterprise Managed User](../../active-directory/saas-apps/github-enterprise-managed-user-provisioning-tutorial.md) | ● | ● |
-| [GitHub Enterprise Managed User (OIDC)](../../active-directory/saas-apps/github-enterprise-managed-user-oidc-provisioning-tutorial.md) | ● | ● |
-| [GoToMeeting](../../active-directory/saas-apps/citrixgotomeeting-provisioning-tutorial.md) | ● | ● |
-| [Global Relay Identity Sync](../../active-directory/saas-apps/global-relay-identity-sync-provisioning-tutorial.md) | ● | |
-| [Gong](../../active-directory/saas-apps/gong-provisioning-tutorial.md) | ● | |
-| [GoLinks](../../active-directory/saas-apps/golinks-provisioning-tutorial.md) | ● | ● |
-| [Grammarly](../../active-directory/saas-apps/grammarly-provisioning-tutorial.md) | ● | ● |
-| [Group Talk](../../active-directory/saas-apps/grouptalk-provisioning-tutorial.md) | ● | |
-| [Gtmhub](../../active-directory/saas-apps/gtmhub-provisioning-tutorial.md) | ● | |
-| [H5mag](../../active-directory/saas-apps/h5mag-provisioning-tutorial.md) | ● | |
-| [Harness](../../active-directory/saas-apps/harness-provisioning-tutorial.md) | ● | ● |
+| [AWS IAM Identity Center](../saas-apps/aws-single-sign-on-provisioning-tutorial.md) | ● | ● |
+| [Axiad Cloud](../saas-apps/axiad-cloud-provisioning-tutorial.md) | ● | ● |
+| [BambooHR](../saas-apps/bamboo-hr-tutorial.md) | | ● |
+| [BenQ IAM](../saas-apps/benq-iam-provisioning-tutorial.md) | ● | ● |
+| [Bentley - Automatic User Provisioning](../saas-apps/bentley-automatic-user-provisioning-tutorial.md) | ● | |
+| [Better Stack](../saas-apps/better-stack-provisioning-tutorial.md) | ● | |
+| [BIC Cloud Design](../saas-apps/bic-cloud-design-provisioning-tutorial.md) | ● | ● |
+| [BIS](../saas-apps/bis-provisioning-tutorial.md) | ● | ● |
+| [BitaBIZ](../saas-apps/bitabiz-provisioning-tutorial.md) | ● | ● |
+| [Bizagi Studio for Digital Process Automation](../saas-apps/bizagi-studio-for-digital-process-automation-provisioning-tutorial.md) | ● | ● |
+| [BLDNG APP](../saas-apps/bldng-app-provisioning-tutorial.md) | ● | |
+| [Blink](../saas-apps/blink-provisioning-tutorial.md) | ● | ● |
+| [Blinq](../saas-apps/blinq-provisioning-tutorial.md) | ● | |
+| [BlogIn](../saas-apps/blogin-provisioning-tutorial.md) | ● | ● |
+| [BlueJeans](../saas-apps/bluejeans-provisioning-tutorial.md) | ● | ● |
+| [Bonusly](../saas-apps/bonusly-provisioning-tutorial.md) | ● | ● |
+| [Box](../saas-apps/box-userprovisioning-tutorial.md) | ● | ● |
+| [Boxcryptor](../saas-apps/boxcryptor-provisioning-tutorial.md) | ● | ● |
+| [Bpanda](../saas-apps/bpanda-provisioning-tutorial.md) | ● | |
+| [Brivo Onair Identity Connector](../saas-apps/brivo-onair-identity-connector-provisioning-tutorial.md) | ● | |
+| [Britive](../saas-apps/britive-provisioning-tutorial.md) | ● | ● |
+| [BrowserStack Single Sign-on](../saas-apps/browserstack-single-sign-on-provisioning-tutorial.md) | ● | ● |
+| [BullseyeTDP](../saas-apps/bullseyetdp-provisioning-tutorial.md) | ● | ● |
+| [Bustle B2B Transport Systems](../saas-apps/bustle-b2b-transport-systems-provisioning-tutorial.md) | ● | |
+| [Canva](../saas-apps/canva-provisioning-tutorial.md) | ● | ● |
+| [Cato Networks Provisioning](../saas-apps/cato-networks-provisioning-tutorial.md) | ● | |
+| [Cerner Central](../saas-apps/cernercentral-provisioning-tutorial.md) | ● | ● |
+| [Cerby](../saas-apps/cerby-provisioning-tutorial.md) | ● | ● |
+| [Chaos](../saas-apps/chaos-provisioning-tutorial.md) | ● | |
+| [Chatwork](../saas-apps/chatwork-provisioning-tutorial.md) | ● | ● |
+| [CheckProof](../saas-apps/checkproof-provisioning-tutorial.md) | ● | ● |
+| [Cinode](../saas-apps/cinode-provisioning-tutorial.md) | ● | |
+| [Cisco Umbrella User Management](../saas-apps/cisco-umbrella-user-management-provisioning-tutorial.md) | ● | ● |
+| [Cisco Webex](../saas-apps/cisco-webex-provisioning-tutorial.md) | ● | ● |
+| [Clarizen One](../saas-apps/clarizen-one-provisioning-tutorial.md) | ● | ● |
+| [Cleanmail Swiss](../saas-apps/cleanmail-swiss-provisioning-tutorial.md) | ● | |
+| [Clebex](../saas-apps/clebex-provisioning-tutorial.md) | ● | ● |
+| [Cloud Academy SSO](../saas-apps/cloud-academy-sso-provisioning-tutorial.md) | ● | ● |
+| [Coda](../saas-apps/coda-provisioning-tutorial.md) | ● | ● |
+| [Code42](../saas-apps/code42-provisioning-tutorial.md) | ● | ● |
+| [Cofense Recipient Sync](../saas-apps/cofense-provision-tutorial.md) | ● | |
+| [Colloquial](../saas-apps/colloquial-provisioning-tutorial.md) | ● | ● |
+| [Comeet Recruiting Software](../saas-apps/comeet-recruiting-software-provisioning-tutorial.md) | ● | ● |
+| [Connecter](../saas-apps/connecter-provisioning-tutorial.md) | ● | |
+| [Contentful](../saas-apps/contentful-provisioning-tutorial.md) | ● | ● |
+| [Concur](../saas-apps/concur-provisioning-tutorial.md) | ● | ● |
+| [Cornerstone OnDemand](../saas-apps/cornerstone-ondemand-provisioning-tutorial.md) | ● | ● |
+| [Cybozu](../saas-apps/cybozu-provisioning-tutorial.md) | ● | ● |
+| [CybSafe](../saas-apps/cybsafe-provisioning-tutorial.md) | ● | |
+| [Dagster Cloud](../saas-apps/dagster-cloud-provisioning-tutorial.md) | ● | ● |
+| [Datadog](../saas-apps/datadog-provisioning-tutorial.md) | ● | ● |
+| [Documo](../saas-apps/documo-provisioning-tutorial.md) | ● | ● |
+| [DocuSign](../saas-apps/docusign-provisioning-tutorial.md) | ● | ● |
+| [Dropbox Business](../saas-apps/dropboxforbusiness-provisioning-tutorial.md) | ● | ● |
+| [Dialpad](../saas-apps/dialpad-provisioning-tutorial.md) | ● | |
+| [Diffchecker](../saas-apps/diffchecker-provisioning-tutorial.md) | ● | ● |
+| [DigiCert](../saas-apps/digicert-tutorial.md) | | ● |
+| [Directprint.io](../saas-apps/directprint-io-provisioning-tutorial.md) | ● | ● |
+| [Druva](../saas-apps/druva-provisioning-tutorial.md) | ● | ● |
+| [Dynamic Signal](../saas-apps/dynamic-signal-provisioning-tutorial.md) | ● | ● |
+| [Embed Signage](../saas-apps/embed-signage-provisioning-tutorial.md) | ● | ● |
+| [Envoy](../saas-apps/envoy-provisioning-tutorial.md) | ● | ● |
+| [Eletive](../saas-apps/eletive-provisioning-tutorial.md) | ● | |
+| [Elium](../saas-apps/elium-provisioning-tutorial.md) | ● | ● |
+| [Exium](../saas-apps/exium-provisioning-tutorial.md) | ● | ● |
+| [Evercate](../saas-apps/evercate-provisioning-tutorial.md) | ● | |
+| [Facebook Work Accounts](../saas-apps/facebook-work-accounts-provisioning-tutorial.md) | ● | ● |
+| [Federated Directory](../saas-apps/federated-directory-provisioning-tutorial.md) | ● | |
+| [Figma](../saas-apps/figma-provisioning-tutorial.md) | ● | ● |
+| [Flock](../saas-apps/flock-provisioning-tutorial.md) | ● | ● |
+| [Foodee](../saas-apps/foodee-provisioning-tutorial.md) | ● | ● |
+| [Forcepoint Cloud Security Gateway - User Authentication](../saas-apps/forcepoint-cloud-security-gateway-tutorial.md) | ● | ● |
+| [Fortes Change Cloud](../saas-apps/fortes-change-cloud-provisioning-tutorial.md) | ● | ● |
+| [Frankli.io](../saas-apps/frankli-io-provisioning-tutorial.md) | ● | |
+| [Freshservice Provisioning](../saas-apps/freshservice-provisioning-tutorial.md) | ● | ● |
+| [Funnel Leasing](../saas-apps/funnel-leasing-provisioning-tutorial.md) | ● | ● |
+| [Fuze](../saas-apps/fuze-provisioning-tutorial.md) | ● | ● |
+| [G Suite](../saas-apps/g-suite-provisioning-tutorial.md) | ● | |
+| [Genesys Cloud for Azure](../saas-apps/purecloud-by-genesys-provisioning-tutorial.md) | ● | ● |
+| [getAbstract](../saas-apps/getabstract-provisioning-tutorial.md) | ● | ● |
+| [GHAE](../saas-apps/ghae-provisioning-tutorial.md) | ● | ● |
+| [GitHub](../saas-apps/github-provisioning-tutorial.md) | ● | ● |
+| [GitHub AE](../saas-apps/github-ae-provisioning-tutorial.md) | ● | ● |
+| [GitHub Enterprise Managed User](../saas-apps/github-enterprise-managed-user-provisioning-tutorial.md) | ● | ● |
+| [GitHub Enterprise Managed User (OIDC)](../saas-apps/github-enterprise-managed-user-oidc-provisioning-tutorial.md) | ● | ● |
+| [GoToMeeting](../saas-apps/citrixgotomeeting-provisioning-tutorial.md) | ● | ● |
+| [Global Relay Identity Sync](../saas-apps/global-relay-identity-sync-provisioning-tutorial.md) | ● | |
+| [Gong](../saas-apps/gong-provisioning-tutorial.md) | ● | |
+| [GoLinks](../saas-apps/golinks-provisioning-tutorial.md) | ● | ● |
+| [Grammarly](../saas-apps/grammarly-provisioning-tutorial.md) | ● | ● |
+| [Group Talk](../saas-apps/grouptalk-provisioning-tutorial.md) | ● | |
+| [Gtmhub](../saas-apps/gtmhub-provisioning-tutorial.md) | ● | |
+| [H5mag](../saas-apps/h5mag-provisioning-tutorial.md) | ● | |
+| [Harness](../saas-apps/harness-provisioning-tutorial.md) | ● | ● |
| HCL Domino | ● | |
-| [Headspace](../../active-directory/saas-apps/headspace-provisioning-tutorial.md) | ● | ● |
-| [HelloID](../../active-directory/saas-apps/helloid-provisioning-tutorial.md) | ● | |
-| [Holmes Cloud](../../active-directory/saas-apps/holmes-cloud-provisioning-tutorial.md) | ● | |
-| [Hootsuite](../../active-directory/saas-apps/hootsuite-provisioning-tutorial.md) | ● | ● |
-| [Hoxhunt](../../active-directory/saas-apps/hoxhunt-provisioning-tutorial.md) | ● | ● |
-| [Howspace](../../active-directory/saas-apps/howspace-provisioning-tutorial.md) | ● | |
-| [Humbol](../../active-directory/saas-apps/humbol-provisioning-tutorial.md) | ● | |
-| IBM DB2 ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
-| IBM Tivoli Directory Server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Ideo](../../active-directory/saas-apps/ideo-provisioning-tutorial.md) | ● | ● |
-| [Ideagen Cloud](../../active-directory/saas-apps/ideagen-cloud-provisioning-tutorial.md) | ● | |
-| [Infor CloudSuite](../../active-directory/saas-apps/infor-cloudsuite-provisioning-tutorial.md) | ● | ● |
-| [InformaCast](../../active-directory/saas-apps/informacast-provisioning-tutorial.md) | ● | ● |
-| [iPass SmartConnect](../../active-directory/saas-apps/ipass-smartconnect-provisioning-tutorial.md) | ● | ● |
-| [Iris Intranet](../../active-directory/saas-apps/iris-intranet-provisioning-tutorial.md) | ● | ● |
-| [Insight4GRC](../../active-directory/saas-apps/insight4grc-provisioning-tutorial.md) | ● | ● |
-| [Insite LMS](../../active-directory/saas-apps/insite-lms-provisioning-tutorial.md) | ● | |
-| [introDus Pre and Onboarding Platform](../../active-directory/saas-apps/introdus-pre-and-onboarding-platform-provisioning-tutorial.md) | ● | |
-| [Invision](../../active-directory/saas-apps/invision-provisioning-tutorial.md) | ● | ● |
-| [InviteDesk](../../active-directory/saas-apps/invitedesk-provisioning-tutorial.md) | ● | |
-| Isode directory server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Jive](../../active-directory/saas-apps/jive-provisioning-tutorial.md) | ● | ● |
-| [Jostle](../../active-directory/saas-apps/jostle-provisioning-tutorial.md) | ● | ● |
-| [Joyn FSM](../../active-directory/saas-apps/joyn-fsm-provisioning-tutorial.md) | ● | |
-| [Juno Journey](../../active-directory/saas-apps/juno-journey-provisioning-tutorial.md) | ● | ● |
-| [Keeper Password Manager & Digital Vault](../../active-directory/saas-apps/keeper-password-manager-digitalvault-provisioning-tutorial.md) | ● | ● |
-| [Keepabl](../../active-directory/saas-apps/keepabl-provisioning-tutorial.md) | ● | ● |
-| [Kintone](../../active-directory/saas-apps/kintone-provisioning-tutorial.md) | ● | ● |
-| [Kisi Physical Security](../../active-directory/saas-apps/kisi-physical-security-provisioning-tutorial.md) | ● | ● |
-| [Klaxoon](../../active-directory/saas-apps/klaxoon-provisioning-tutorial.md) | ● | ● |
-| [Klaxoon SAML](../../active-directory/saas-apps/klaxoon-saml-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Kno2fy](../../active-directory/saas-apps/kno2fy-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [KnowBe4 Security Awareness Training](../../active-directory/saas-apps/knowbe4-security-awareness-training-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Kpifire](../../active-directory/saas-apps/kpifire-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [KPN Grip](../../active-directory/saas-apps/kpn-grip-provisioning-tutorial.md) | ΓùÅ | |
-| [LanSchool Air](../../active-directory/saas-apps/lanschool-air-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Headspace](../saas-apps/headspace-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [HelloID](../saas-apps/helloid-provisioning-tutorial.md) | ΓùÅ | |
+| [Holmes Cloud](../saas-apps/holmes-cloud-provisioning-tutorial.md) | ΓùÅ | |
+| [Hootsuite](../saas-apps/hootsuite-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Hoxhunt](../saas-apps/hoxhunt-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Howspace](../saas-apps/howspace-provisioning-tutorial.md) | ΓùÅ | |
+| [Humbol](../saas-apps/humbol-provisioning-tutorial.md) | ΓùÅ | |
+| [Hypervault](../saas-apps/hypervault-provisioning-tutorial.md) | ΓùÅ | |
+| IBM DB2 ([SQL connector](../app-provisioning/tutorial-ecma-sql-connector.md) ) | ΓùÅ | |
+| IBM Tivoli Directory Server ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ΓùÅ | |
+| [Ideo](../saas-apps/ideo-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Ideagen Cloud](../saas-apps/ideagen-cloud-provisioning-tutorial.md) | ΓùÅ | |
+| [Infor CloudSuite](../saas-apps/infor-cloudsuite-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [InformaCast](../saas-apps/informacast-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [iPass SmartConnect](../saas-apps/ipass-smartconnect-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Iris Intranet](../saas-apps/iris-intranet-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Insight4GRC](../saas-apps/insight4grc-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Insite LMS](../saas-apps/insite-lms-provisioning-tutorial.md) | ΓùÅ | |
+| [introDus Pre and Onboarding Platform](../saas-apps/introdus-pre-and-onboarding-platform-provisioning-tutorial.md) | ΓùÅ | |
+| [Invision](../saas-apps/invision-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [InviteDesk](../saas-apps/invitedesk-provisioning-tutorial.md) | ΓùÅ | |
+| Isode directory server ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ΓùÅ | |
+| [Jive](../saas-apps/jive-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Jostle](../saas-apps/jostle-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Joyn FSM](../saas-apps/joyn-fsm-provisioning-tutorial.md) | ΓùÅ | |
+| [Juno Journey](../saas-apps/juno-journey-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Keeper Password Manager & Digital Vault](../saas-apps/keeper-password-manager-digitalvault-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Keepabl](../saas-apps/keepabl-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Kintone](../saas-apps/kintone-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
+| [Kisi Physical Security](../saas-apps/kisi-physical-security-provisioning-tutorial.md) | ● | ● |
+| [Klaxoon](../saas-apps/klaxoon-provisioning-tutorial.md) | ● | ● |
+| [Klaxoon SAML](../saas-apps/klaxoon-saml-provisioning-tutorial.md) | ● | ● |
+| [Kno2fy](../saas-apps/kno2fy-provisioning-tutorial.md) | ● | ● |
+| [KnowBe4 Security Awareness Training](../saas-apps/knowbe4-security-awareness-training-provisioning-tutorial.md) | ● | ● |
+| [Kpifire](../saas-apps/kpifire-provisioning-tutorial.md) | ● | ● |
+| [KPN Grip](../saas-apps/kpn-grip-provisioning-tutorial.md) | ● | |
+| [LanSchool Air](../saas-apps/lanschool-air-provisioning-tutorial.md) | ● | ● |
| [LawVu](../../active-directory/saas-apps/lawvu-provisioning-tutorial.md) | ● | ● |
-| [LDAP](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) | ● | |
-| [LimbleCMMS](../../active-directory/saas-apps/limblecmms-provisioning-tutorial.md) | ● | |
-| [LinkedIn Elevate](../../active-directory/saas-apps/linkedinelevate-provisioning-tutorial.md) | ● | ● |
-| [LinkedIn Sales Navigator](../../active-directory/saas-apps/linkedinsalesnavigator-provisioning-tutorial.md) | ● | ● |
-| [Lucid (All Products)](../../active-directory/saas-apps/lucid-all-products-provisioning-tutorial.md) | ● | ● |
-| [Lucidchart](../../active-directory/saas-apps/lucidchart-provisioning-tutorial.md) | ● | ● |
-| [LUSID](../../active-directory/saas-apps/LUSID-provisioning-tutorial.md) | ● | ● |
-| [Leapsome](../../active-directory/saas-apps/leapsome-provisioning-tutorial.md) | ● | ● |
-| [LogicGate](../../active-directory/saas-apps/logicgate-provisioning-tutorial.md) | ● | |
-| [Looop](../../active-directory/saas-apps/looop-provisioning-tutorial.md) | ● | |
-| [LogMeIn](../../active-directory/saas-apps/logmein-provisioning-tutorial.md) | ● | ● |
-| [Maptician](../../active-directory/saas-apps/maptician-provisioning-tutorial.md) | ● | ● |
-| [Markit Procurement Service](../../active-directory/saas-apps/markit-procurement-service-provisioning-tutorial.md) | ● | |
-| [MediusFlow](../../active-directory/saas-apps/mediusflow-provisioning-tutorial.md) | ● | |
-| [MerchLogix](../../active-directory/saas-apps/merchlogix-provisioning-tutorial.md) | ● | ● |
-| [Meta Networks Connector](../../active-directory/saas-apps/meta-networks-connector-provisioning-tutorial.md) | ● | ● |
-| MicroFocus Novell eDirectory ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [LDAP](../app-provisioning/on-premises-ldap-connector-configure.md) | ● | |
+| [LimbleCMMS](../saas-apps/limblecmms-provisioning-tutorial.md) | ● | |
+| [LinkedIn Elevate](../saas-apps/linkedinelevate-provisioning-tutorial.md) | ● | ● |
+| [LinkedIn Sales Navigator](../saas-apps/linkedinsalesnavigator-provisioning-tutorial.md) | ● | ● |
+| [Litmos](../saas-apps/litmos-provisioning-tutorial.md) | ● | ● |
+| [Lucid (All Products)](../saas-apps/lucid-all-products-provisioning-tutorial.md) | ● | ● |
+| [Lucidchart](../saas-apps/lucidchart-provisioning-tutorial.md) | ● | ● |
+| [LUSID](../saas-apps/LUSID-provisioning-tutorial.md) | ● | ● |
+| [Leapsome](../saas-apps/leapsome-provisioning-tutorial.md) | ● | ● |
+| [LogicGate](../saas-apps/logicgate-provisioning-tutorial.md) | ● | |
+| [Looop](../saas-apps/looop-provisioning-tutorial.md) | ● | |
+| [LogMeIn](../saas-apps/logmein-provisioning-tutorial.md) | ● | ● |
+| [M-Files](../saas-apps/m-files-provisioning-tutorial.md) | ● | ● |
+| [Maptician](../saas-apps/maptician-provisioning-tutorial.md) | ● | ● |
+| [Markit Procurement Service](../saas-apps/markit-procurement-service-provisioning-tutorial.md) | ● | |
+| [MediusFlow](../saas-apps/mediusflow-provisioning-tutorial.md) | ● | |
+| [MerchLogix](../saas-apps/merchlogix-provisioning-tutorial.md) | ● | ● |
+| [Meta Networks Connector](../saas-apps/meta-networks-connector-provisioning-tutorial.md) | ● | ● |
+| MicroFocus Novell eDirectory ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
| Microsoft 365 | ● | ● |
| Microsoft Active Directory Domain Services | | ● |
| Microsoft Azure | ● | ● |
| [Microsoft Entra Domain Services](/entra/identity/domain-services/synchronization) | ● | ● |
-| Microsoft Azure SQL ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
-| Microsoft Lightweight Directory Server (ADAM) ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| Microsoft Azure SQL ([SQL connector](../app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
+| Microsoft Lightweight Directory Server (ADAM) ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
| Microsoft SharePoint Server (SharePoint) | ● | |
-| Microsoft SQL Server ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
-| [Mixpanel](../../active-directory/saas-apps/mixpanel-provisioning-tutorial.md) | ● | ● |
-| [Mindtickle](../../active-directory/saas-apps/mindtickle-provisioning-tutorial.md) | ● | ● |
-| [Miro](../../active-directory/saas-apps/miro-provisioning-tutorial.md) | ● | ● |
-| [Monday.com](../../active-directory/saas-apps/mondaycom-provisioning-tutorial.md) | ● | ● |
-| [MongoDB Atlas](../../active-directory/saas-apps/mongodb-cloud-tutorial.md) | | ● |
-| [Moqups](../../active-directory/saas-apps/moqups-provisioning-tutorial.md) | ● | ● |
-| [Mural Identity](../../active-directory/saas-apps/mural-identity-provisioning-tutorial.md) | ● | ● |
-| [MX3 Diagnostics](../../active-directory/saas-apps/mx3-diagnostics-connector-provisioning-tutorial.md) | ● | |
-| [myPolicies](../../active-directory/saas-apps/mypolicies-provisioning-tutorial.md) | ● | ● |
-| MySQL ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
-| NetIQ eDirectory ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Netpresenter Next](../../active-directory/saas-apps/netpresenter-provisioning-tutorial.md) | ● | |
-| [Netskope User Authentication](../../active-directory/saas-apps/netskope-administrator-console-provisioning-tutorial.md) | ● | ● |
-| [Netsparker Enterprise](../../active-directory/saas-apps/netsparker-enterprise-provisioning-tutorial.md) | ● | ● |
-| [New Relic by Organization](../../active-directory/saas-apps/new-relic-by-organization-provisioning-tutorial.md) | ● | ● |
-| [NordPass](../../active-directory/saas-apps/nordpass-provisioning-tutorial.md) | ● | ● |
-| [Notion](../../active-directory/saas-apps/notion-provisioning-tutorial.md) | ● | ● |
-| Novell eDirectory ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Office Space Software](../../active-directory/saas-apps/officespace-software-provisioning-tutorial.md) | ● | ● |
-| [Olfeo SAAS](../../active-directory/saas-apps/olfeo-saas-provisioning-tutorial.md) | ● | ● |
-| Open DJ ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| Open DS ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [OpenForms](../../active-directory/saas-apps/openforms-provisioning-tutorial.md) | ● | |
-| [OpenLDAP](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) | ● | |
-| [OpenText Directory Services](../../active-directory/saas-apps/open-text-directory-services-provisioning-tutorial.md) | ● | ● |
-| [Oracle Cloud Infrastructure Console](../../active-directory/saas-apps/oracle-cloud-infrastructure-console-provisioning-tutorial.md) | ● | ● |
-| Oracle Database ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
+| Microsoft SQL Server ([SQL connector](../app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
+| [Mixpanel](../saas-apps/mixpanel-provisioning-tutorial.md) | ● | ● |
+| [Mindtickle](../saas-apps/mindtickle-provisioning-tutorial.md) | ● | ● |
+| [Miro](../saas-apps/miro-provisioning-tutorial.md) | ● | ● |
+| [Monday.com](../saas-apps/mondaycom-provisioning-tutorial.md) | ● | ● |
+| [MongoDB Atlas](../saas-apps/mongodb-cloud-tutorial.md) | | ● |
+| [Moqups](../saas-apps/moqups-provisioning-tutorial.md) | ● | ● |
+| [Mural Identity](../saas-apps/mural-identity-provisioning-tutorial.md) | ● | ● |
+| [MX3 Diagnostics](../saas-apps/mx3-diagnostics-connector-provisioning-tutorial.md) | ● | |
+| [myPolicies](../saas-apps/mypolicies-provisioning-tutorial.md) | ● | ● |
+| MySQL ([SQL connector](../app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
+| NetIQ eDirectory ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [Netpresenter Next](../saas-apps/netpresenter-provisioning-tutorial.md) | ● | |
+| [Netskope User Authentication](../saas-apps/netskope-administrator-console-provisioning-tutorial.md) | ● | ● |
+| [Netsparker Enterprise](../saas-apps/netsparker-enterprise-provisioning-tutorial.md) | ● | ● |
+| [New Relic by Organization](../saas-apps/new-relic-by-organization-provisioning-tutorial.md) | ● | ● |
+| [NordPass](../saas-apps/nordpass-provisioning-tutorial.md) | ● | ● |
+| [Notion](../saas-apps/notion-provisioning-tutorial.md) | ● | ● |
+| Novell eDirectory ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [Office Space Software](../saas-apps/officespace-software-provisioning-tutorial.md) | ● | ● |
+| [Olfeo SAAS](../saas-apps/olfeo-saas-provisioning-tutorial.md) | ● | ● |
+| [Oneflow](../saas-apps/oneflow-provisioning-tutorial.md) | ● | ● |
+| Open DJ ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| Open DS ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [OpenForms](../saas-apps/openforms-provisioning-tutorial.md) | ● | |
+| [OpenLDAP](../app-provisioning/on-premises-ldap-connector-configure.md) | ● | |
+| [OpenText Directory Services](../saas-apps/open-text-directory-services-provisioning-tutorial.md) | ● | ● |
+| [Oracle Cloud Infrastructure Console](../saas-apps/oracle-cloud-infrastructure-console-provisioning-tutorial.md) | ● | ● |
+| Oracle Database ([SQL connector](../app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
| Oracle E-Business Suite | ● | ● |
-| [Oracle Fusion ERP](../../active-directory/saas-apps/oracle-fusion-erp-provisioning-tutorial.md) | ● | ● |
-| [O'Reilly Learning Platform](../../active-directory/saas-apps/oreilly-learning-platform-provisioning-tutorial.md) | ● | ● |
+| [Oracle Fusion ERP](../saas-apps/oracle-fusion-erp-provisioning-tutorial.md) | ● | ● |
+| [O'Reilly Learning Platform](../saas-apps/oreilly-learning-platform-provisioning-tutorial.md) | ● | ● |
| Oracle Internet Directory | ● | |
| Oracle PeopleSoft ERP | ● | ● |
-| Oracle SunONE Directory Server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [PagerDuty](../../active-directory/saas-apps/pagerduty-tutorial.md) | | ● |
-| [Palo Alto Networks Cloud Identity Engine - Cloud Authentication Service](../../active-directory/saas-apps/palo-alto-networks-cloud-identity-engine-provisioning-tutorial.md) | ● | ● |
-| [Palo Alto Networks SCIM Connector](../../active-directory/saas-apps/palo-alto-networks-scim-connector-provisioning-tutorial.md) | ● | ● |
-| [PaperCut Cloud Print Management](../../active-directory/saas-apps/papercut-cloud-print-management-provisioning-tutorial.md) | ● | |
-| [Parsable](../../active-directory/saas-apps/parsable-provisioning-tutorial.md) | ● | |
-| [Peripass](../../active-directory/saas-apps/peripass-provisioning-tutorial.md) | ● | |
-| [Pingboard](../../active-directory/saas-apps/pingboard-provisioning-tutorial.md) | ● | ● |
-| [Plandisc](../../active-directory/saas-apps/plandisc-provisioning-tutorial.md) | ● | |
-| [Playvox](../../active-directory/saas-apps/playvox-provisioning-tutorial.md) | ● | |
-| [Preciate](../../active-directory/saas-apps/preciate-provisioning-tutorial.md) | ● | |
-| [PrinterLogic SaaS](../../active-directory/saas-apps/printer-logic-saas-provisioning-tutorial.md) | ● | ● |
-| [Priority Matrix](../../active-directory/saas-apps/priority-matrix-provisioning-tutorial.md) | ● | |
-| [ProdPad](../../active-directory/saas-apps/prodpad-provisioning-tutorial.md) | ● | ● |
-| [Promapp](../../active-directory/saas-apps/promapp-provisioning-tutorial.md) | ● | |
-| [Proxyclick](../../active-directory/saas-apps/proxyclick-provisioning-tutorial.md) | ● | ● |
-| [Peakon](../../active-directory/saas-apps/peakon-provisioning-tutorial.md) | ● | ● |
-| [Proware](../../active-directory/saas-apps/proware-provisioning-tutorial.md) | ● | ● |
-| RadiantOne Virtual Directory Server (VDS) ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
-| [Real Links](../../active-directory/saas-apps/real-links-provisioning-tutorial.md) | ● | ● |
-| [Reward Gateway](../../active-directory/saas-apps/reward-gateway-provisioning-tutorial.md) | ● | ● |
-| [RFPIO](../../active-directory/saas-apps/rfpio-provisioning-tutorial.md) | ● | ● |
-| [Rhombus Systems](../../active-directory/saas-apps/rhombus-systems-provisioning-tutorial.md) | ● | ● |
-| [Ring Central](../../active-directory/saas-apps/ringcentral-provisioning-tutorial.md) | ● | ● |
-| [Robin](../../active-directory/saas-apps/robin-provisioning-tutorial.md) | ● | ● |
-| [Rollbar](../../active-directory/saas-apps/rollbar-provisioning-tutorial.md) | ● | ● |
-| [Rouse Sales](../../active-directory/saas-apps/rouse-sales-provisioning-tutorial.md) | ● | |
-| [Salesforce](../../active-directory/saas-apps/salesforce-provisioning-tutorial.md) | ● | ● |
-| [SafeGuard Cyber](../../active-directory/saas-apps/safeguard-cyber-provisioning-tutorial.md) | ● | ● |
-| [Salesforce Sandbox](../../active-directory/saas-apps/salesforce-sandbox-provisioning-tutorial.md) | ● | ● |
-| [Samanage](../../active-directory/saas-apps/samanage-provisioning-tutorial.md) | ● | ● |
+| Oracle SunONE Directory Server ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [PagerDuty](../saas-apps/pagerduty-tutorial.md) | | ● |
+| [Palo Alto Networks Cloud Identity Engine - Cloud Authentication Service](../saas-apps/palo-alto-networks-cloud-identity-engine-provisioning-tutorial.md) | ● | ● |
+| [Palo Alto Networks SCIM Connector](../saas-apps/palo-alto-networks-scim-connector-provisioning-tutorial.md) | ● | ● |
+| [PaperCut Cloud Print Management](../saas-apps/papercut-cloud-print-management-provisioning-tutorial.md) | ● | |
+| [Parsable](../saas-apps/parsable-provisioning-tutorial.md) | ● | |
+| [Peripass](../saas-apps/peripass-provisioning-tutorial.md) | ● | |
+| [Pingboard](../saas-apps/pingboard-provisioning-tutorial.md) | ● | ● |
+| [Plandisc](../saas-apps/plandisc-provisioning-tutorial.md) | ● | |
+| [Playvox](../saas-apps/playvox-provisioning-tutorial.md) | ● | |
+| [Postman](../saas-apps/postman-provisioning-tutorial.md) | ● | ● |
+| [Preciate](../saas-apps/preciate-provisioning-tutorial.md) | ● | |
+| [PrinterLogic SaaS](../saas-apps/printer-logic-saas-provisioning-tutorial.md) | ● | ● |
+| [Priority Matrix](../saas-apps/priority-matrix-provisioning-tutorial.md) | ● | |
+| [ProdPad](../saas-apps/prodpad-provisioning-tutorial.md) | ● | ● |
+| [Promapp](../saas-apps/promapp-provisioning-tutorial.md) | ● | |
+| [Proxyclick](../saas-apps/proxyclick-provisioning-tutorial.md) | ● | ● |
+| [Peakon](../saas-apps/peakon-provisioning-tutorial.md) | ● | ● |
+| [Proware](../saas-apps/proware-provisioning-tutorial.md) | ● | ● |
+| RadiantOne Virtual Directory Server (VDS) ([LDAP connector](../app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [Real Links](../saas-apps/real-links-provisioning-tutorial.md) | ● | ● |
+| [Recnice](../saas-apps/recnice-provisioning-tutorial.md) | ● | |
+| [Reward Gateway](../saas-apps/reward-gateway-provisioning-tutorial.md) | ● | ● |
+| [RFPIO](../saas-apps/rfpio-provisioning-tutorial.md) | ● | ● |
+| [Rhombus Systems](../saas-apps/rhombus-systems-provisioning-tutorial.md) | ● | ● |
+| [Ring Central](../saas-apps/ringcentral-provisioning-tutorial.md) | ● | ● |
+| [Robin](../saas-apps/robin-provisioning-tutorial.md) | ● | ● |
+| [Rollbar](../saas-apps/rollbar-provisioning-tutorial.md) | ● | ● |
+| [Rouse Sales](../saas-apps/rouse-sales-provisioning-tutorial.md) | ● | |
+| [Salesforce](../saas-apps/salesforce-provisioning-tutorial.md) | ● | ● |
+| [SafeGuard Cyber](../saas-apps/safeguard-cyber-provisioning-tutorial.md) | ● | ● |
+| [Salesforce Sandbox](../saas-apps/salesforce-sandbox-provisioning-tutorial.md) | ● | ● |
+| [Samanage](../saas-apps/samanage-provisioning-tutorial.md) | ● | ● |
| SAML-based apps | | ● |
-| [SAP Analytics Cloud](../../active-directory/saas-apps/sap-analytics-cloud-provisioning-tutorial.md) | ● | ● |
-| [SAP Cloud Platform](../../active-directory/saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) | ● | ● |
-| [SAP R/3 and ERP](../../active-directory/app-provisioning/on-premises-sap-connector-configure.md) | ● | |
-| [SAP HANA](../../active-directory/saas-apps/saphana-tutorial.md) | ● | ● |
-| [SAP SuccessFactors to Active Directory](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) | ● | ● |
-| [SAP SuccessFactors to Microsoft Entra ID](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md) | ● | ● |
-| [SAP SuccessFactors Writeback](../../active-directory/saas-apps/sap-successfactors-writeback-tutorial.md) | ● | ● |
-| [SchoolStream ASA](../../active-directory/saas-apps/schoolstream-asa-provisioning-tutorial.md) | ● | ● |
+| [SAP Analytics Cloud](../saas-apps/sap-analytics-cloud-provisioning-tutorial.md) | ● | ● |
+| [SAP Cloud Platform](../saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) | ● | ● |
+| [SAP R/3 and ERP](../app-provisioning/on-premises-sap-connector-configure.md) | ● | |
+| [SAP HANA](../saas-apps/saphana-tutorial.md) | ● | ● |
+| [SAP SuccessFactors to Active Directory](../saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) | ● | ● |
+| [SAP SuccessFactors to Microsoft Entra ID](../saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md) | ● | ● |
+| [SAP SuccessFactors Writeback](../saas-apps/sap-successfactors-writeback-tutorial.md) | ● | ● |
+| [SchoolStream ASA](../saas-apps/schoolstream-asa-provisioning-tutorial.md) | ● | ● |
| [SCIM-based apps in the cloud](../app-provisioning/use-scim-to-provision-users-and-groups.md) | ● | |
| [SCIM-based apps on-premises](../app-provisioning/on-premises-scim-provisioning.md) | ● | |
-| [Secure Deliver](../../active-directory/saas-apps/secure-deliver-provisioning-tutorial.md) | ● | ● |
-| [SecureLogin](../../active-directory/saas-apps/secure-login-provisioning-tutorial.md) | ● | |
-| [Sentry](../../active-directory/saas-apps/sentry-provisioning-tutorial.md) | ● | ● |
-| [ServiceNow](../../active-directory/saas-apps/servicenow-provisioning-tutorial.md) | ● | ● |
-| [Segment](../../active-directory/saas-apps/segment-provisioning-tutorial.md) | ● | ● |
-| [Shopify Plus](../../active-directory/saas-apps/shopify-plus-provisioning-tutorial.md) | ● | ● |
-| [Sigma Computing](../../active-directory/saas-apps/sigma-computing-provisioning-tutorial.md) | ● | ● |
-| [Signagelive](../../active-directory/saas-apps/signagelive-provisioning-tutorial.md) | ● | ● |
-| [Slack](../../active-directory/saas-apps/slack-provisioning-tutorial.md) | ● | ● |
-| [Smartfile](../../active-directory/saas-apps/smartfile-provisioning-tutorial.md) | ● | ● |
-| [Smartsheet](../../active-directory/saas-apps/smartsheet-provisioning-tutorial.md) | ● | |
-| [Smallstep SSH](../../active-directory/saas-apps/smallstep-ssh-provisioning-tutorial.md) | ● | |
-| [Snowflake](../../active-directory/saas-apps/snowflake-provisioning-tutorial.md) | ● | ● |
-| [Soloinsight - CloudGate SSO](../../active-directory/saas-apps/soloinsight-cloudgate-sso-provisioning-tutorial.md) | ● | ● |
-| [SoSafe](../../active-directory/saas-apps/sosafe-provisioning-tutorial.md) | ● | ● |
-| [SpaceIQ](../../active-directory/saas-apps/spaceiq-provisioning-tutorial.md) | ● | ● |
-| [Splashtop](../../active-directory/saas-apps/splashtop-provisioning-tutorial.md) | ● | ● |
-| [StarLeaf](../../active-directory/saas-apps/starleaf-provisioning-tutorial.md) | ● | |
-| [Storegate](../../active-directory/saas-apps/storegate-provisioning-tutorial.md) | ● | |
-| [SurveyMonkey Enterprise](../../active-directory/saas-apps/surveymonkey-enterprise-provisioning-tutorial.md) | ● | ● |
-| [Swit](../../active-directory/saas-apps/swit-provisioning-tutorial.md) | ● | ● |
-| [Symantec Web Security Service (WSS)](../../active-directory/saas-apps/symantec-web-security-service.md) | ● | ● |
-| [Tableau Cloud](../../active-directory/saas-apps/tableau-online-provisioning-tutorial.md) | ● | ● |
-| [Tailscale](../../active-directory/saas-apps/tailscale-provisioning-tutorial.md) | ● | |
-| [Talentech](../../active-directory/saas-apps/talentech-provisioning-tutorial.md) | ● | |
-| [Tanium SSO](../../active-directory/saas-apps/tanium-sso-provisioning-tutorial.md) | ● | ● |
-| [Tap App Security](../../active-directory/saas-apps/tap-app-security-provisioning-tutorial.md) | ● | ● |
-| [Taskize Connect](../../active-directory/saas-apps/taskize-connect-provisioning-tutorial.md) | ● | ● |
-| [Teamgo](../../active-directory/saas-apps/teamgo-provisioning-tutorial.md) | ● | ● |
-| [TeamViewer](../../active-directory/saas-apps/teamviewer-provisioning-tutorial.md) | ● | ● |
-| [TerraTrue](../../active-directory/saas-apps/terratrue-provisioning-tutorial.md) | ● | ● |
-| [ThousandEyes](../../active-directory/saas-apps/thousandeyes-provisioning-tutorial.md) | ● | ● |
-| [Tic-Tac Mobile](../../active-directory/saas-apps/tic-tac-mobile-provisioning-tutorial.md) | ● | |
-| [TimeClock 365](../../active-directory/saas-apps/timeclock-365-provisioning-tutorial.md) | ● | ● |
-| [TimeClock 365 SAML](../../active-directory/saas-apps/timeclock-365-saml-provisioning-tutorial.md) | ● | ● |
-| [Templafy SAML2](../../active-directory/saas-apps/templafy-saml-2-provisioning-tutorial.md) | ● | ● |
-| [Templafy OpenID Connect](../../active-directory/saas-apps/templafy-openid-connect-provisioning-tutorial.md) | ● | ● |
-| [TheOrgWiki](../../active-directory/saas-apps/theorgwiki-provisioning-tutorial.md) | ● | |
-| [Thrive LXP](../../active-directory/saas-apps/thrive-lxp-provisioning-tutorial.md) | ● | ● |
-| [Torii](../../active-directory/saas-apps/torii-provisioning-tutorial.md) | ● | ● |
-| [TravelPerk](../../active-directory/saas-apps/travelperk-provisioning-tutorial.md) | ● | ● |
-| [Tribeloo](../../active-directory/saas-apps/tribeloo-provisioning-tutorial.md) | ● | ● |
-| [Twingate](../../active-directory/saas-apps/twingate-provisioning-tutorial.md) | ● | |
-| [Uber](../../active-directory/saas-apps/uber-provisioning-tutorial.md) | ● | |
-| [UNIFI](../../active-directory/saas-apps/unifi-provisioning-tutorial.md) | ● | ● |
-| [uniFlow Online](../../active-directory/saas-apps/uniflow-online-provisioning-tutorial.md) | ● | ● |
-| uni-tel | ● | |
-| [Vault Platform](../../active-directory/saas-apps/vault-platform-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Vbrick Rev Cloud](../../active-directory/saas-apps/vbrick-rev-cloud-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [V-Client](../../active-directory/saas-apps/v-client-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Velpic](../../active-directory/saas-apps/velpic-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Visibly](../../active-directory/saas-apps/visibly-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Visitly](../../active-directory/saas-apps/visitly-provisioning-tutorial.md) | ΓùÅ | ΓùÅ |
-| [Vonage](../../active-directory/saas-apps/vonage-provisioning-tutorial.md) | ● | ● |
-| [WATS](../../active-directory/saas-apps/wats-provisioning-tutorial.md) | ● | |
-| [Webroot Security Awareness Training](../../active-directory/saas-apps/webroot-security-awareness-training-provisioning-tutorial.md) | ● | |
-| [WEDO](../../active-directory/saas-apps/wedo-provisioning-tutorial.md) | ● | ● |
-| [Whimsical](../../active-directory/saas-apps/whimsical-provisioning-tutorial.md) | ● | ● |
-| [Workday to Active Directory](../../active-directory/saas-apps/workday-inbound-tutorial.md) | ● | ● |
-| [Workday to Microsoft Entra ID](../../active-directory/saas-apps/workday-inbound-cloud-only-tutorial.md) | ● | ● |
-| [Workday Writeback](../../active-directory/saas-apps/workday-writeback-tutorial.md) | ● | ● |
-| [Workteam](../../active-directory/saas-apps/workteam-provisioning-tutorial.md) | ● | ● |
-| [Workplace by Facebook](../../active-directory/saas-apps/workplace-by-facebook-provisioning-tutorial.md) | ● | ● |
-| [Workgrid](../../active-directory/saas-apps/workgrid-provisioning-tutorial.md) | ● | ● |
-| [Wrike](../../active-directory/saas-apps/wrike-provisioning-tutorial.md) | ● | ● |
-| [Xledger](../../active-directory/saas-apps/xledger-provisioning-tutorial.md) | ● | |
-| [Yellowbox](../../active-directory/saas-apps/yellowbox-provisioning-tutorial.md) | ● | |
-| [Zapier](../../active-directory/saas-apps/zapier-provisioning-tutorial.md) | ● | |
-| [Zendesk](../../active-directory/saas-apps/zendesk-provisioning-tutorial.md) | ● | ● |
-| [Zenya](../../active-directory/saas-apps/zenya-provisioning-tutorial.md) | ● | ● |
-| [Zero](../../active-directory/saas-apps/zero-provisioning-tutorial.md) | ● | ● |
-| [Zip](../../active-directory/saas-apps/zip-provisioning-tutorial.md) | ● | ● |
-| [Zoom](../../active-directory/saas-apps/zoom-provisioning-tutorial.md) | ● | ● |
-| [Zscaler](../../active-directory/saas-apps/zscaler-provisioning-tutorial.md) | ● | ● |
-| [Zscaler Beta](../../active-directory/saas-apps/zscaler-beta-provisioning-tutorial.md) | ● | ● |
-| [Zscaler One](../../active-directory/saas-apps/zscaler-one-provisioning-tutorial.md) | ● | ● |
-| [Zscaler Private Access](../../active-directory/saas-apps/zscaler-private-access-provisioning-tutorial.md) | ● | ● |
-| [Zscaler Two](../../active-directory/saas-apps/zscaler-two-provisioning-tutorial.md) | ● | ● |
-| [Zscaler Three](../../active-directory/saas-apps/zscaler-three-provisioning-tutorial.md) | ● | ● |
-| [Zscaler ZSCloud](../../active-directory/saas-apps/zscaler-zscloud-provisioning-tutorial.md) | ● | ● |
+| [ScreenSteps](../saas-apps/screensteps-provisioning-tutorial.md) | ● | ● |
+| [Secure Deliver](../saas-apps/secure-deliver-provisioning-tutorial.md) | ● | ● |
+| [SecureLogin](../saas-apps/secure-login-provisioning-tutorial.md) | ● | |
+| [Sentry](../saas-apps/sentry-provisioning-tutorial.md) | ● | ● |
+| [ServiceNow](../saas-apps/servicenow-provisioning-tutorial.md) | ● | ● |
+| [Segment](../saas-apps/segment-provisioning-tutorial.md) | ● | ● |
+| [Shopify Plus](../saas-apps/shopify-plus-provisioning-tutorial.md) | ● | ● |
+| [Sigma Computing](../saas-apps/sigma-computing-provisioning-tutorial.md) | ● | ● |
+| [Signagelive](../saas-apps/signagelive-provisioning-tutorial.md) | ● | ● |
+| [Slack](../saas-apps/slack-provisioning-tutorial.md) | ● | ● |
+| [Smartfile](../saas-apps/smartfile-provisioning-tutorial.md) | ● | ● |
+| [Smartsheet](../saas-apps/smartsheet-provisioning-tutorial.md) | ● | |
+| [Smallstep SSH](../saas-apps/smallstep-ssh-provisioning-tutorial.md) | ● | |
+| [Snowflake](../saas-apps/snowflake-provisioning-tutorial.md) | ● | ● |
+| [Soloinsight - CloudGate SSO](../saas-apps/soloinsight-cloudgate-sso-provisioning-tutorial.md) | ● | ● |
+| [SoSafe](../saas-apps/sosafe-provisioning-tutorial.md) | ● | ● |
+| [SpaceIQ](../saas-apps/spaceiq-provisioning-tutorial.md) | ● | ● |
+| [Splashtop](../saas-apps/splashtop-provisioning-tutorial.md) | ● | ● |
+| [StarLeaf](../saas-apps/starleaf-provisioning-tutorial.md) | ● | |
+| [Storegate](../saas-apps/storegate-provisioning-tutorial.md) | ● | |
+| [SurveyMonkey Enterprise](../saas-apps/surveymonkey-enterprise-provisioning-tutorial.md) | ● | ● |
+| [Swit](../saas-apps/swit-provisioning-tutorial.md) | ● | ● |
+| [Symantec Web Security Service (WSS)](../saas-apps/symantec-web-security-service.md) | ● | ● |
+| [Tableau Cloud](../saas-apps/tableau-online-provisioning-tutorial.md) | ● | ● |
+| [Tailscale](../saas-apps/tailscale-provisioning-tutorial.md) | ● | |
+| [Talentech](../saas-apps/talentech-provisioning-tutorial.md) | ● | |
+| [Tanium SSO](../saas-apps/tanium-sso-provisioning-tutorial.md) | ● | ● |
+| [Tap App Security](../saas-apps/tap-app-security-provisioning-tutorial.md) | ● | ● |
+| [Taskize Connect](../saas-apps/taskize-connect-provisioning-tutorial.md) | ● | ● |
+| [Teamgo](../saas-apps/teamgo-provisioning-tutorial.md) | ● | ● |
+| [TeamViewer](../saas-apps/teamviewer-provisioning-tutorial.md) | ● | ● |
+| [TerraTrue](../saas-apps/terratrue-provisioning-tutorial.md) | ● | ● |
+| [ThousandEyes](../saas-apps/thousandeyes-provisioning-tutorial.md) | ● | ● |
+| [Tic-Tac Mobile](../saas-apps/tic-tac-mobile-provisioning-tutorial.md) | ● | |
+| [TimeClock 365](../saas-apps/timeclock-365-provisioning-tutorial.md) | ● | ● |
+| [TimeClock 365 SAML](../saas-apps/timeclock-365-saml-provisioning-tutorial.md) | ● | ● |
+| [Templafy SAML2](../saas-apps/templafy-saml-2-provisioning-tutorial.md) | ● | ● |
+| [Templafy OpenID Connect](../saas-apps/templafy-openid-connect-provisioning-tutorial.md) | ● | ● |
+| [TheOrgWiki](../saas-apps/theorgwiki-provisioning-tutorial.md) | ● | |
+| [Thrive LXP](../saas-apps/thrive-lxp-provisioning-tutorial.md) | ● | ● |
+| [Torii](../saas-apps/torii-provisioning-tutorial.md) | ● | ● |
+| [TravelPerk](../saas-apps/travelperk-provisioning-tutorial.md) | ● | ● |
+| [Tribeloo](../saas-apps/tribeloo-provisioning-tutorial.md) | ● | ● |
+| [Twingate](../saas-apps/twingate-provisioning-tutorial.md) | ● | |
+| [Uber](../saas-apps/uber-provisioning-tutorial.md) | ● | |
+| [UNIFI](../saas-apps/unifi-provisioning-tutorial.md) | ● | ● |
+| [uniFlow Online](../saas-apps/uniflow-online-provisioning-tutorial.md) | ● | ● |
+| [uni-tel ) | ● | |
+| [Vault Platform](../saas-apps/vault-platform-provisioning-tutorial.md) | ● | ● |
+| [Vbrick Rev Cloud](../saas-apps/vbrick-rev-cloud-provisioning-tutorial.md) | ● | ● |
+| [V-Client](../saas-apps/v-client-provisioning-tutorial.md) | ● | ● |
+| [Velpic](../saas-apps/velpic-provisioning-tutorial.md) | ● | ● |
+| [Visibly](../saas-apps/visibly-provisioning-tutorial.md) | ● | ● |
+| [Visitly](../saas-apps/visitly-provisioning-tutorial.md) | ● | ● |
+| [VMware](../saas-apps/vmware-identity-service-provisioning-tutorial.md) | ● | ● |
+| [Vonage](../saas-apps/vonage-provisioning-tutorial.md) | ● | ● |
+| [WATS](../saas-apps/wats-provisioning-tutorial.md) | ● | |
+| [Webroot Security Awareness Training](../saas-apps/webroot-security-awareness-training-provisioning-tutorial.md) | ● | |
+| [WEDO](../saas-apps/wedo-provisioning-tutorial.md) | ● | ● |
+| [Whimsical](../saas-apps/whimsical-provisioning-tutorial.md) | ● | ● |
+| [Workday to Active Directory](../saas-apps/workday-inbound-tutorial.md) | ● | ● |
+| [Workday to Microsoft Entra ID](../saas-apps/workday-inbound-cloud-only-tutorial.md) | ● | ● |
+| [Workday Writeback](../saas-apps/workday-writeback-tutorial.md) | ● | ● |
+| [Workteam](../saas-apps/workteam-provisioning-tutorial.md) | ● | ● |
+| [Workplace by Facebook](../saas-apps/workplace-by-facebook-provisioning-tutorial.md) | ● | ● |
+| [Workgrid](../saas-apps/workgrid-provisioning-tutorial.md) | ● | ● |
+| [Wrike](../saas-apps/wrike-provisioning-tutorial.md) | ● | ● |
+| [Xledger](../saas-apps/xledger-provisioning-tutorial.md) | ● | |
+| [XM Fax and XM SendSecure](../saas-apps/xm-fax-and-xm-send-secure-provisioning-tutorial.md) | ● | ● |
+| [Yellowbox](../saas-apps/yellowbox-provisioning-tutorial.md) | ● | |
+| [Zapier](../saas-apps/zapier-provisioning-tutorial.md) | ● | |
+| [Zendesk](../saas-apps/zendesk-provisioning-tutorial.md) | ● | ● |
+| [Zenya](../saas-apps/zenya-provisioning-tutorial.md) | ● | ● |
+| [Zero](../saas-apps/zero-provisioning-tutorial.md) | ● | ● |
+| [Zip](../saas-apps/zip-provisioning-tutorial.md) | ● | ● |
+| [Zoho One](../saas-apps/zoho-one-provisioning-tutorial.md) | ● | ● |
+| [Zoom](../saas-apps/zoom-provisioning-tutorial.md) | ● | ● |
+| [Zscaler](../saas-apps/zscaler-provisioning-tutorial.md) | ● | ● |
+| [Zscaler Beta](../saas-apps/zscaler-beta-provisioning-tutorial.md) | ● | ● |
+| [Zscaler One](../saas-apps/zscaler-one-provisioning-tutorial.md) | ● | ● |
+| [Zscaler Private Access](../saas-apps/zscaler-private-access-provisioning-tutorial.md) | ● | ● |
+| [Zscaler Two](../saas-apps/zscaler-two-provisioning-tutorial.md) | ● | ● |
+| [Zscaler Three](../saas-apps/zscaler-three-provisioning-tutorial.md) | ● | ● |
+| [Zscaler ZSCloud](../saas-apps/zscaler-zscloud-provisioning-tutorial.md) | ● | ● |
## Partner driven integrations
-There is also a healthy partner ecosystem, further expanding the breadth and depth of integrations available with Microsoft Entra ID Governance. Explore the [partner integrations](../../active-directory/app-provisioning/partner-driven-integrations.md) available, including connectors for:
+There is also a healthy partner ecosystem, further expanding the breadth and depth of integrations available with Microsoft Entra ID Governance. Explore the [partner integrations](../app-provisioning/partner-driven-integrations.md) available, including connectors for:
* Epic
* Cerner
* IBM RACF
There is also a healthy partner ecosystem, further expanding the breadth and dep
## Next steps
-To learn more about application provisioning, see [What is application provisioning](../../active-directory/app-provisioning/user-provisioning.md).
+To learn more about application provisioning, see [What is application provisioning](../app-provisioning/user-provisioning.md).
active-directory Deploy Access Reviews https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/deploy-access-reviews.md
Access reviews activities are recorded and available from the [Microsoft Entra a
| | Apply decision | | Date range| Seven days |
-For more advanced queries and analysis of access reviews, and to track changes and completion of reviews, export your Microsoft Entra audit logs to [Azure Log Analytics](../reports-monitoring/quickstart-azure-monitor-route-logs-to-storage-account.md) or Azure Event Hubs. When audit logs are stored in Log Analytics, you can use the [powerful analytics language](../reports-monitoring/howto-analyze-activity-logs-log-analytics.md) and build your own dashboards.
+For more advanced queries and analysis of access reviews, and to track changes and completion of reviews, export your Microsoft Entra audit logs to [Azure Log Analytics](../reports-monitoring/howto-archive-logs-to-storage-account.md) or Azure Event Hubs. When audit logs are stored in Log Analytics, you can use the [powerful analytics language](../reports-monitoring/howto-analyze-activity-logs-log-analytics.md) and build your own dashboards.
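To illustrate the kind of analysis this enables, here is a minimal sketch of a Log Analytics query over exported audit logs — the `AuditLogs` table name is the standard one, but verify the exact `LoggedByService` value against the data in your workspace:

```kusto
// Count access review operations recorded in the last 30 days.
// Assumes Microsoft Entra audit logs are being exported to this workspace.
AuditLogs
| where TimeGenerated > ago(30d)
| where LoggedByService == "Access Reviews"
| summarize Count = count() by OperationName
| order by Count desc
```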
## Next steps
active-directory Entitlement Management Logs And Reporting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/entitlement-management-logs-and-reporting.md
Microsoft Entra ID stores audit events for up to 30 days in the audit log. Howev
Before you use the Azure Monitor workbooks, you must configure Microsoft Entra ID to send a copy of its audit logs to Azure Monitor.
-Archiving Microsoft Entra audit logs requires you to have Azure Monitor in an Azure subscription. You can read more about the prerequisites and estimated costs of using Azure Monitor in [Microsoft Entra activity logs in Azure Monitor](../reports-monitoring/concept-activity-logs-azure-monitor.md).
+Archiving Microsoft Entra audit logs requires you to have Azure Monitor in an Azure subscription. You can read more about the prerequisites and estimated costs of using Azure Monitor in [Microsoft Entra activity logs in Azure Monitor](../reports-monitoring/concept-log-monitoring-integration-options-considerations.md).
**Prerequisite role**: Global Administrator
Archiving Microsoft Entra audit logs requires you to have Azure Monitor in an Az
1. Check if there's already a setting to send the audit logs to that workspace.
-1. If there isn't already a setting, select **Add diagnostic setting**. Use the instructions in [Integrate Microsoft Entra logs with Azure Monitor logs](../reports-monitoring/howto-integrate-activity-logs-with-log-analytics.md) to send the Microsoft Entra audit log to the Azure Monitor workspace.
+1. If there isn't already a setting, select **Add diagnostic setting**. Use the instructions in [Integrate Microsoft Entra logs with Azure Monitor logs](../reports-monitoring/howto-integrate-activity-logs-with-azure-monitor-logs.md) to send the Microsoft Entra audit log to the Azure Monitor workspace.
![Diagnostics settings pane](./media/entitlement-management-logs-and-reporting/audit-log-diagnostics-settings.png)
If you would like to know the oldest and newest audit events held in Azure Monit
AuditLogs
| where TimeGenerated > ago(3653d)
| summarize OldestAuditEvent=min(TimeGenerated), NewestAuditEvent=max(TimeGenerated) by Type
```
-For more information on the columns that are stored for audit events in Azure Monitor, see [Interpret the Microsoft Entra audit logs schema in Azure Monitor](../reports-monitoring/overview-reports.md).
+For more information on the columns that are stored for audit events in Azure Monitor, see [Interpret the Microsoft Entra audit logs schema in Azure Monitor](../reports-monitoring/overview-monitoring-health.md).
## Create custom Azure Monitor queries using Azure PowerShell
$subs | ft
You can reauthenticate and associate your PowerShell session to that subscription using a command such as `Connect-AzAccount -Subscription $subs[0].id`. To learn more about how to authenticate to Azure from PowerShell, including non-interactively, see [Sign in with Azure PowerShell](/powershell/azure/authenticate-azureps).
-If you have multiple Log Analytics workspaces in that subscription, then the cmdlet [Get-AzOperationalInsightsWorkspace](/powershell/module/Az.OperationalInsights/Get-AzOperationalInsightsWorkspace) returns the list of workspaces. Then you can find the one that has the Microsoft Entra logs. The `CustomerId` field returned by this cmdlet is the same as the value of the "Workspace ID" displayed in the Microsoft Entra admin center in the Log Analytics workspace overview.
+If you have multiple Log Analytics workspaces in that subscription, then the cmdlet [Get-AzOperationalInsightsWorkspace](/powershell/module/az.operationalinsights/get-azoperationalinsightsworkspace) returns the list of workspaces. Then you can find the one that has the Microsoft Entra logs. The `CustomerId` field returned by this cmdlet is the same as the value of the "Workspace ID" displayed in the Microsoft Entra admin center in the Log Analytics workspace overview.
```powershell
$wks = Get-AzOperationalInsightsWorkspace
$wks | ft CustomerId, Name
```
### Send the query to the Log Analytics workspace
-Finally, once you have a workspace identified, you can use [Invoke-AzOperationalInsightsQuery](/powershell/module/az.operationalinsights/Invoke-AzOperationalInsightsQuery) to send a Kusto query to that workspace. These queries are written in [Kusto query language](/azure/kusto/query/).
+Finally, once you have a workspace identified, you can use [Invoke-AzOperationalInsightsQuery](/powershell/module/az.operationalinsights/invoke-azoperationalinsightsquery) to send a Kusto query to that workspace. These queries are written in [Kusto query language](/azure/data-explorer/kusto/query/).
For example, you can retrieve the date range of the audit event records from the Log Analytics workspace, with PowerShell cmdlets to send a query like:
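As a sketch of what such a call can look like (assuming the `$wks` array from the previous step, and that `$wks[0]` is the workspace that receives the Microsoft Entra logs):

```powershell
# Query the date range of audit events held in the workspace.
# Assumes $wks was returned by Get-AzOperationalInsightsWorkspace above.
$query = "AuditLogs | summarize OldestAuditEvent=min(TimeGenerated), NewestAuditEvent=max(TimeGenerated) by Type"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $wks[0].CustomerId -Query $query
$result.Results | Format-Table
```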
active-directory Identity Governance Applications Deploy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/identity-governance-applications-deploy.md
Microsoft Entra ID with Azure Monitor provides several reports to help you under
* An administrator, or a catalog owner, can [retrieve the list of users who have access package assignments](entitlement-management-access-package-assignments.md), via the Microsoft Entra admin center, Graph or PowerShell.
* You can also send the audit logs to Azure Monitor and view a history of [changes to the access package](entitlement-management-logs-and-reporting.md#view-events-for-an-access-package), in the Microsoft Entra admin center, or via PowerShell.
-* You can view the last 30 days of sign-ins to an application in the [sign-ins report](../reports-monitoring/reference-basic-info-sign-in-logs.md) in the Microsoft Entra admin center, or via [Graph](/graph/api/signin-list?view=graph-rest-1.0&tabs=http&preserve-view=true).
-* You can also send the [sign in logs to Azure Monitor](../reports-monitoring/concept-activity-logs-azure-monitor.md) to archive sign in activity for up to two years.
+* You can view the last 30 days of sign-ins to an application in the [sign-ins report](../reports-monitoring/concept-sign-in-log-activity-details.md) in the Microsoft Entra admin center, or via [Graph](/graph/api/signin-list?view=graph-rest-1.0&tabs=http&preserve-view=true).
+* You can also send the [sign in logs to Azure Monitor](../reports-monitoring/concept-log-monitoring-integration-options-considerations.md) to archive sign in activity for up to two years.
## Monitor to adjust entitlement management policies and access as needed
active-directory Identity Governance Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/identity-governance-overview.md
In addition to the features listed above, additional Microsoft Entra features fr
|Policy and role management|Admin can define Conditional Access policies for run-time access to applications. Resource owners can define policies for user's access via access packages.|[Conditional Access](../conditional-access/overview.md) and [Entitlement management](entitlement-management-overview.md) policies|
|Access certification|Admins can enable recurring access recertification for: SaaS apps, on-premises apps, cloud group memberships, Microsoft Entra ID or Azure Resource role assignments. Automatically remove resource access, block guest access and delete guest accounts.|[Access reviews](access-reviews-overview.md), also surfaced in [PIM](../privileged-identity-management/pim-create-roles-and-resource-roles-review.md)|
|Fulfillment and provisioning|Automatic provisioning and deprovisioning into Microsoft Entra connected apps, including via SCIM, LDAP, SQL and into SharePoint Online sites. |[user provisioning](../app-provisioning/user-provisioning.md)|
-|Reporting and analytics|Admins can retrieve audit logs of recent user provisioning and sign on activity. Integration with Azure Monitor and 'who has access' via access packages.|[Microsoft Entra reports](../reports-monitoring/overview-reports.md) and [monitoring](../reports-monitoring/overview-monitoring.md)|
+|Reporting and analytics|Admins can retrieve audit logs of recent user provisioning and sign on activity. Integration with Azure Monitor and 'who has access' via access packages.|[Microsoft Entra reports](../reports-monitoring/overview-monitoring-health.md) and [monitoring](../reports-monitoring/overview-monitoring-health.md)|
|Privileged access|Just-in-time and scheduled access, alerting, approval workflows for Microsoft Entra roles (including custom roles) and Azure Resource roles.|[Microsoft Entra PIM](../privileged-identity-management/pim-configure.md)|
|Auditing|Admins can be alerted of creation of admin accounts.|[Microsoft Entra PIM alerts](../privileged-identity-management/pim-how-to-configure-security-alerts.md)|
active-directory Lifecycle Workflows Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/lifecycle-workflows-deployment.md
The following information is important information about your organization and t
Before you begin planning a Lifecycle Workflow deployment, you should become familiar with the parts of workflow and the terminology around Lifecycle Workflows.
-The [Understanding Lifecycle Workflows](understanding-lifecycle-workflows.md) document, uses the portal to explain the parts of a workflow. The [Developer API reference Lifecycle Workflows](lifecycle-workflows-developer-reference.md) document, uses a GRAPH example to explain the parts of a workflow.
+The [Understanding Lifecycle Workflows](understanding-lifecycle-workflows.md) document uses the portal to explain the parts of a workflow. The [Developer API reference Lifecycle Workflows](/graph/api/resources/identitygovernance-workflow) document uses a Graph example to explain the parts of a workflow.
You can use this document to become familiar with the parts of workflow prior to deploying them.
active-directory Sap https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/sap.md
SAP likely runs critical functions, such as HR and ERP, for your business. At th
### SuccessFactors
-Customers who use SAP SuccessFactors can easily bring identities into [Microsoft Entra ID](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md) or [on-premises Active Directory](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) by using native connectors. The connectors support the following scenarios:
+Customers who use SAP SuccessFactors can easily bring identities into [Microsoft Entra ID](../saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md) or [on-premises Active Directory](../saas-apps/sap-successfactors-inbound-provisioning-tutorial.md) by using native connectors. The connectors support the following scenarios:
-* **Hiring new employees**: When a new employee is added to SuccessFactors, a user account is automatically created in Microsoft Entra ID and optionally Microsoft 365 and [other software as a service (SaaS) applications that Microsoft Entra ID supports](../../active-directory/app-provisioning/user-provisioning.md). This process includes write-back of the email address to SuccessFactors.
+* **Hiring new employees**: When a new employee is added to SuccessFactors, a user account is automatically created in Microsoft Entra ID and optionally Microsoft 365 and [other software as a service (SaaS) applications that Microsoft Entra ID supports](../app-provisioning/user-provisioning.md). This process includes write-back of the email address to SuccessFactors.
* **Employee attribute and profile updates**: When an employee record is updated in SuccessFactors (such as name, title, or manager), the employee's user account is automatically updated in Microsoft Entra ID and optionally Microsoft 365 and other SaaS applications that Microsoft Entra ID supports.
* **Employee terminations**: When an employee is terminated in SuccessFactors, the employee's user account is automatically disabled in Microsoft Entra ID and optionally Microsoft 365 and other SaaS applications that Microsoft Entra ID supports.
* **Employee rehires**: When an employee is rehired in SuccessFactors, the employee's old account can be automatically reactivated or re-provisioned (depending on your preference) to Microsoft Entra ID and optionally Microsoft 365 and other SaaS applications that Microsoft Entra ID supports.
After you set up provisioning for your SAP applications, you can enable SSO for
After your users are in Microsoft Entra ID, you can provision accounts into the various SaaS and on-premises SAP applications that they need access to. You have three ways to accomplish this:
-* Use the enterprise application in Microsoft Entra ID to configure both single sign-on (SSO) and provisioning to SAP applications such as [SAP Analytics Cloud](../../active-directory/saas-apps/sap-analytics-cloud-provisioning-tutorial.md). With this option, you can apply a consistent set of governance processes across all your applications.
-* Use the [SAP Identity Authentication Service (IAS)](../../active-directory/saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) enterprise application in Microsoft Entra ID to provision identities into SAP IAS. After you bring all the identities into SAP IAS, you can use SAP IPS to provision the accounts from there into your applications when required.
+* Use the enterprise application in Microsoft Entra ID to configure both single sign-on (SSO) and provisioning to SAP applications such as [SAP Analytics Cloud](../saas-apps/sap-analytics-cloud-provisioning-tutorial.md). With this option, you can apply a consistent set of governance processes across all your applications.
+* Use the [SAP Identity Authentication Service (IAS)](../saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md) enterprise application in Microsoft Entra ID to provision identities into SAP IAS. After you bring all the identities into SAP IAS, you can use SAP IPS to provision the accounts from there into your applications when required.
* Use the [SAP IPS](https://help.sap.com/docs/IDENTITY_PROVISIONING/f48e822d6d484fa5ade7dda78b64d9f5/f2b2df8a273642a1bf801e99ecc4a043.html) integration to directly export identities from Microsoft Entra ID into your [applications](https://help.sap.com/docs/IDENTITY_PROVISIONING/f48e822d6d484fa5ade7dda78b64d9f5/ab3f641552464c79b94d10b9205fd721.html). When you're using SAP IPS to pull users into your applications, all provisioning configuration is managed in SAP directly. You can still use the enterprise application in Microsoft Entra ID to manage SSO and use [Microsoft Entra ID as the corporate identity provider](https://help.sap.com/docs/IDENTITY_AUTHENTICATION/6d6d63354d1242d185ab4830fc04feb1/058c7b14209f4f2d8de039da4330a1c1.html).
### Provision identities into on-premises SAP systems that SAP IPS doesn't support
Customers who have yet to transition from applications such as SAP R/3 and SAP ERP Central Component (SAP ECC) to SAP S/4HANA can still rely on the Microsoft Entra provisioning service to provision user accounts. Within SAP R/3 and SAP ECC, you expose the necessary Business Application Programming Interfaces (BAPIs) for creating, updating, and deleting users. Within Microsoft Entra ID, you have two options:
-* Use the lightweight Microsoft Entra provisioning agent and [web services connector](/azure/active-directory/app-provisioning/on-premises-web-services-connector) to [provision users into apps such as SAP ECC](/azure/active-directory/app-provisioning/on-premises-sap-connector-configure).
+* Use the lightweight Microsoft Entra provisioning agent and [web services connector](../app-provisioning/on-premises-web-services-connector.md) to [provision users into apps such as SAP ECC](../app-provisioning/on-premises-sap-connector-configure.md).
* In scenarios where you need to do more complex group and role management, use [Microsoft Identity Manager](/microsoft-identity-manager/reference/microsoft-identity-manager-2016-ma-ws) to manage access to your legacy SAP applications.
## Trigger custom workflows
With separation-of-duties checks in Microsoft Entra ID [entitlement management](
## Next steps
-* [Bring identities from SAP SuccessFactors into Microsoft Entra ID](../../active-directory/saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md)
-* [Provision accounts in SAP IAS](../../active-directory/saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md)
+* [Bring identities from SAP SuccessFactors into Microsoft Entra ID](../saas-apps/sap-successfactors-inbound-provisioning-cloud-only-tutorial.md)
+* [Provision accounts in SAP IAS](../saas-apps/sap-cloud-platform-identity-authentication-provisioning-tutorial.md)
active-directory Tutorial Prepare User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/tutorial-prepare-user-accounts.md
Once your user(s) has been successfully created in Microsoft Entra ID, you may p
## Additional steps for pre-hire scenario
-There are some additional steps that you should be aware of when testing either the [On-boarding users to your organization using Lifecycle workflows with the Microsoft Entra Admin Center](tutorial-onboard-custom-workflow-portal.md) tutorial or the [On-boarding users to your organization using Lifecycle workflows with Microsoft Graph](tutorial-onboard-custom-workflow-graph.md) tutorial.
+There are some additional steps that you should be aware of when testing either the [On-boarding users to your organization using Lifecycle workflows with the Microsoft Entra Admin Center](tutorial-onboard-custom-workflow-portal.md) tutorial or the [On-boarding users to your organization using Lifecycle workflows with Microsoft Graph](/graph/tutorial-lifecycle-workflows-onboard-custom-workflow) tutorial.
### Edit the users attributes using the Microsoft Entra admin center
The manager attribute is used for email notification tasks. It's used by the li
:::image type="content" source="media/tutorial-lifecycle-workflows/graph-get-manager.png" alt-text="Screenshot of getting a manager in Graph explorer." lightbox="media/tutorial-lifecycle-workflows/graph-get-manager.png":::
-For more information about updating manager information for a user in Graph API, see [assign manager](/graph/api/user-post-manager?view=graph-rest-1.0&tabs=http&preserve-view=true) documentation. You can also set this attribute in the Azure Admin center. For more information, see [add or change profile information](../fundamentals/how-to-manage-user-profile-info.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context).
+For more information about updating manager information for a user in Graph API, see [assign manager](/graph/api/user-post-manager?view=graph-rest-1.0&tabs=http&preserve-view=true) documentation. You can also set this attribute in the Azure Admin center. For more information, see [add or change profile information](../fundamentals/how-to-manage-user-profile-info.md?context=azure/active-directory/users-groups-roles/context/ugr-context).
### Enabling the Temporary Access Pass (TAP)
A user with groups and Teams memberships is required before you begin the tutori
## Next steps
- [On-boarding users to your organization using Lifecycle workflows with the Microsoft Entra admin center](tutorial-onboard-custom-workflow-portal.md)
-- [On-boarding users to your organization using Lifecycle workflows with Microsoft Graph](tutorial-onboard-custom-workflow-graph.md)
+- [On-boarding users to your organization using Lifecycle workflows with Microsoft Graph](/graph/tutorial-lifecycle-workflows-onboard-custom-workflow)
- [Tutorial: Off-boarding users from your organization using Lifecycle workflows with The Microsoft Entra Admin Center](tutorial-offboard-custom-workflow-portal.md)
-- [Tutorial: Off-boarding users from your organization using Lifecycle workflows with Microsoft Graph](tutorial-offboard-custom-workflow-graph.md)
+- [Tutorial: Off-boarding users from your organization using Lifecycle workflows with Microsoft Graph](/graph/tutorial-lifecycle-workflows-offboard-custom-workflow)
active-directory Understanding Lifecycle Workflows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/understanding-lifecycle-workflows.md
The following document provides an overview of a workflow created using Lifecycl
For a full list of supported delegated and application permissions required to use Lifecycle Workflows, see: [Lifecycle workflows permissions](/graph/permissions-reference#lifecycle-workflows-permissions).
-For delegated scenarios, the admin needs one of the following [Microsoft Entra roles](/azure/active-directory/users-groups-roles/directory-assign-admin-roles#available-roles):
+For delegated scenarios, the admin needs one of the following [Microsoft Entra roles](../roles/permissions-reference.md):
- Global administrator
- Global reader
active-directory What Is Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/what-is-provisioning.md
For more information, see [What is HR driven provisioning?](../app-provisioning/
In Microsoft Entra ID, the term **[app provisioning](../app-provisioning/user-provisioning.md)** refers to automatically creating copies of user identities in the applications that users need access to, for applications that have their own data store, distinct from Microsoft Entra ID or Active Directory. In addition to creating user identities, app provisioning includes the maintenance and removal of user identities from those apps, as the user's status or roles change. Common scenarios include provisioning a Microsoft Entra user into applications like [Dropbox](../saas-apps/dropboxforbusiness-provisioning-tutorial.md), [Salesforce](../saas-apps/salesforce-provisioning-tutorial.md), [ServiceNow](../saas-apps/servicenow-provisioning-tutorial.md), as each of these applications have their own user repository distinct from Microsoft Entra ID.
-Microsoft Entra ID also supports provisioning users into applications hosted on-premises or in a virtual machine, without having to open up any firewalls. If your application supports [SCIM](https://aka.ms/scimoverview), or you've built a SCIM gateway to connect to your legacy application, you can use the Microsoft Entra provisioning agent to [directly connect](/azure/active-directory/app-provisioning/on-premises-scim-provisioning) with your application and automate provisioning and deprovisioning. If you have legacy applications that don't support SCIM and rely on an [LDAP](/azure/active-directory/app-provisioning/on-premises-ldap-connector-configure) user store or a [SQL](/azure/active-directory/app-provisioning/on-premises-sql-connector-configure) database, or that have a [SOAP or REST API](../app-provisioning/on-premises-web-services-connector.md), Microsoft Entra ID can support those as well.
+Microsoft Entra ID also supports provisioning users into applications hosted on-premises or in a virtual machine, without having to open up any firewalls. If your application supports [SCIM](https://aka.ms/scimoverview), or you've built a SCIM gateway to connect to your legacy application, you can use the Microsoft Entra provisioning agent to [directly connect](../app-provisioning/on-premises-scim-provisioning.md) with your application and automate provisioning and deprovisioning. If you have legacy applications that don't support SCIM and rely on an [LDAP](../app-provisioning/on-premises-ldap-connector-configure.md) user store or a [SQL](../app-provisioning/on-premises-sql-connector-configure.md) database, or that have a [SOAP or REST API](../app-provisioning/on-premises-web-services-connector.md), Microsoft Entra ID can support those as well.
For more information, see [What is app provisioning?](../app-provisioning/user-provisioning.md)
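As a sketch of what a SCIM client such as the provisioning service exchanges with a SCIM-enabled app, here is a minimal SCIM 2.0 user-creation payload following the core User schema (RFC 7643). The attribute values and the endpoint path in the comment are illustrative placeholders, not values from this article:

```python
import json

# Minimal SCIM 2.0 user-creation payload (RFC 7643 core User schema).
# All names and addresses below are placeholders.
payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "alice@contoso.com",
    "active": True,
    "name": {"givenName": "Alice", "familyName": "Smith"},
    "emails": [{"value": "alice@contoso.com", "type": "work", "primary": True}],
}

# A SCIM client sends this as:
#   POST /scim/v2/Users
#   Content-Type: application/scim+json
print(json.dumps(payload, indent=2))
```

Deprovisioning then maps to `PATCH`/`DELETE` calls against the same `/Users` resource as the user's status changes.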
active-directory Custom Attribute Mapping https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/cloud-sync/custom-attribute-mapping.md
If you have extended Active Directory to include custom attributes, you can add
To discover and map attributes, click **Add attribute mapping**. The attributes will automatically be discovered and will be available in the drop-down under **source attribute**. Fill in the type of mapping you want and click **Apply**. [![Custom attribute mapping](media/custom-attribute-mapping/schema-1.png)](media/custom-attribute-mapping/schema-1.png#lightbox)
-For information on new attributes that are added and updated in Microsoft Entra ID see the [user resource type](/graph/api/resources/user?view=graph-rest-1.0#properties&preserve-view=true) and consider subscribing to [change notifications](/graph/webhooks).
+For information on new attributes that are added and updated in Microsoft Entra ID see the [user resource type](/graph/api/resources/user?view=graph-rest-1.0&preserve-view=true#properties) and consider subscribing to [change notifications](/graph/webhooks).
For more information on extension attributes, see [Syncing extension attributes for Microsoft Entra Application Provisioning](../../app-provisioning/user-provisioning-sync-attributes-for-mapping.md)
active-directory How To Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/cloud-sync/how-to-install.md
To use *password writeback* and enable the self-service password reset (SSPR) se
Set-AADCloudSyncPasswordWritebackConfiguration -Enable $true -Credential $(Get-Credential) ```
-For more information about using password writeback with Microsoft Entra Cloud Sync, see [Tutorial: Enable cloud sync self-service password reset writeback to an on-premises environment (preview)](../../../active-directory/authentication/tutorial-enable-cloud-sync-sspr-writeback.md).
+For more information about using password writeback with Microsoft Entra Cloud Sync, see [Tutorial: Enable cloud sync self-service password reset writeback to an on-premises environment (preview)](../../authentication/tutorial-enable-cloud-sync-sspr-writeback.md).
## Install an agent in the US government cloud
active-directory Choose Ad Authn https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/choose-ad-authn.md
The following diagrams outline the high-level architecture components required f
|Is there a health monitoring solution?|Not required|Agent status provided by the [Microsoft Entra admin center](tshoot-connect-pass-through-authentication.md)|[Microsoft Entra Connect Health](how-to-connect-health-adfs.md)| |Do users get single sign-on to cloud resources from domain-joined devices within the company network?|Yes with [Microsoft Entra joined devices](../../devices/concept-directory-join.md), [Microsoft Entra hybrid joined devices](../../devices/how-to-hybrid-join.md), the [Microsoft Enterprise SSO plug-in for Apple devices](../../develop/apple-sso-plugin.md), or [Seamless SSO](how-to-connect-sso.md)|Yes with [Microsoft Entra joined devices](../../devices/concept-directory-join.md), [Microsoft Entra hybrid joined devices](../../devices/how-to-hybrid-join.md), the [Microsoft Enterprise SSO plug-in for Apple devices](../../develop/apple-sso-plugin.md), or [Seamless SSO](how-to-connect-sso.md)|Yes| |What sign-in types are supported?|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](how-to-connect-sso.md)<br><br>[Alternate login ID](how-to-connect-install-custom.md)<br><br>[Microsoft Entra joined Devices](../../devices/concept-directory-join.md)<br><br>[Microsoft Entra hybrid joined devices](../../devices/how-to-hybrid-join.md)<br><br>[Certificate and smart card authentication](../../authentication/concept-certificate-based-authentication-smartcard.md)|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](how-to-connect-sso.md)<br><br>[Alternate login ID](how-to-connect-pta-faq.yml)<br><br>[Microsoft Entra joined Devices](../../devices/concept-directory-join.md)<br><br>[Microsoft Entra hybrid joined devices](../../devices/how-to-hybrid-join.md)<br><br>[Certificate and smart card authentication](../../authentication/concept-certificate-based-authentication-smartcard.md)|UserPrincipalName + password<br><br>sAMAccountName + 
password<br><br>Windows-Integrated Authentication<br><br>[Certificate and smart card authentication](/windows-server/identity/ad-fs/operations/configure-user-certificate-authentication)<br><br>[Alternate login ID](/windows-server/identity/ad-fs/operations/configuring-alternate-login-id)|
-|Is Windows Hello for Business supported?|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)<br><br>*Both require Windows Server 2016 Domain functional level*|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-kerberos-trust)<br><br>[Certificate trust model](/windows/security/identity-protection/hello-for-business/hello-key-trust-adfs)|
-|What are the multifactor authentication options?|[Microsoft Entra multifactor authentication](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|[Microsoft Entra multifactor authentication](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|[Microsoft Entra multifactor authentication](../../authentication/index.yml)<br><br>[Third-party MFA](/windows-server/identity/ad-fs/operations/configure-additional-authentication-methods-for-ad-fs)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|
+|Is Windows Hello for Business supported?|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-kerberos-trust)<br><br>*Both require Windows Server 2016 Domain functional level*|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-kerberos-trust)<br><br>[Certificate trust model](/windows/security/identity-protection/hello-for-business/hello-key-trust-adfs)|
+|What are the multifactor authentication options?|[Microsoft Entra multifactor authentication](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|[Microsoft Entra multifactor authentication](../../authentication/index.yml)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|[Microsoft Entra multifactor authentication](../../authentication/index.yml)<br><br>[Third-party MFA](/windows-server/identity/ad-fs/operations/configure-additional-authentication-methods-for-ad-fs)<br><br>[Custom Controls with Conditional Access*](../../conditional-access/controls.md)|
|What user account states are supported?|Disabled accounts<br>(up to 30-minute delay)|Disabled accounts<br><br>Account locked out<br><br>Account expired<br><br>Password expired<br><br>Sign-in hours|Disabled accounts<br><br>Account locked out<br><br>Account expired<br><br>Password expired<br><br>Sign-in hours| |What are the Conditional Access options?|[Microsoft Entra Conditional Access, with Microsoft Entra ID P1 or P2](../../conditional-access/overview.md)|[Microsoft Entra Conditional Access, with Microsoft Entra ID P1 or P2](../../conditional-access/overview.md)|[Microsoft Entra Conditional Access, with Microsoft Entra ID P1 or P2](../../conditional-access/overview.md)<br><br>[AD FS claim rules](https://adfshelp.microsoft.com/AadTrustClaims/ClaimsGenerator)| |Is blocking legacy protocols supported?|[Yes](../../conditional-access/overview.md)|[Yes](../../conditional-access/overview.md)|[Yes](/windows-server/identity/ad-fs/operations/access-control-policies-w2k12)|
active-directory Four Steps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/four-steps.md
To learn more, go read [Monitor AD FS using Microsoft Entra Connect Health](./ho
### Create custom dashboards for your leadership and your day to day
-Organizations that don't have a SIEM solution can use [Azure Monitor workbooks for Microsoft Entra ID](/azure/active-directory/reports-monitoring/howto-use-workbooks). The integration contains pre-built workbooks and templates to help you understand how your users adopt and use Microsoft Entra features, which allows you to gain insights into all the activities within your directory. You can also create your own workbooks and share with your leadership team to report on day-to-day activities. Workbooks are a great way to monitor your business and see all of your most important metrics at a glance.
+Organizations that don't have a SIEM solution can use [Azure Monitor workbooks for Microsoft Entra ID](../../reports-monitoring/howto-use-workbooks.md). The integration contains pre-built workbooks and templates to help you understand how your users adopt and use Microsoft Entra features, which allows you to gain insights into all the activities within your directory. You can also create your own workbooks and share with your leadership team to report on day-to-day activities. Workbooks are a great way to monitor your business and see all of your most important metrics at a glance.
### Understand your support call drivers
active-directory How To Connect Install Prerequisites https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/how-to-connect-install-prerequisites.md
To read more about securing your Active Directory environment, see [Best practic
### Harden your Microsoft Entra Connect server We recommend that you harden your Microsoft Entra Connect server to decrease the security attack surface for this critical component of your IT environment. Following these recommendations will help to mitigate some security risks to your organization. -- We recommend hardening the Microsoft Entra Connect server as a Control Plane (formerly Tier 0) asset by following the guidance provided in [Secure Privileged Access](/security/privileged-access-workstations/overview) and [Active Directory administrative tier model](/windows-server/identity/securing-privileged-access/securing-privileged-access-reference-material).
+- We recommend hardening the Microsoft Entra Connect server as a Control Plane (formerly Tier 0) asset by following the guidance provided in [Secure Privileged Access](/security/privileged-access-workstations/overview) and [Active Directory administrative tier model](/security/privileged-access-workstations/privileged-access-access-model).
- Restrict administrative access to the Microsoft Entra Connect server to only domain administrators or other tightly controlled security groups. - Create a [dedicated account for all personnel with privileged access](/windows-server/identity/securing-privileged-access/securing-privileged-access). Administrators shouldn't be browsing the web, checking their email, and doing day-to-day productivity tasks with highly privileged accounts. - Follow the guidance provided in [Securing privileged access](/security/privileged-access-workstations/overview).
When you use Microsoft Entra Connect to deploy AD FS or the Web Application Prox
* Ensure the Windows Remote Management/WS-Management (WinRM) service is running via the Services snap-in. * In an elevated PowerShell command window, use the command `Enable-PSRemoting -Force`. * On the machine on which the wizard is running (if the target machine is non-domain joined or is an untrusted domain):
- * In an elevated PowerShell command window, use the command `Set-Item.WSMan:\localhost\Client\TrustedHosts -Value <DMZServerFQDN> -Force -Concatenate`.
+ * In an elevated PowerShell command window, use the command `Set-Item WSMan:\localhost\Client\TrustedHosts -Value "<DMZServerFQDN>" -Force -Concatenate`.
* In the server * Add a DMZ WAP host to a machine pool. In the server manager, select **Manage** > **Add Servers**, and then use the **DNS** tab. * On the **Server Manager All Servers** tab, right-click the WAP server, and select **Manage As**. Enter local (not domain) credentials for the WAP machine.
active-directory How To Connect Sso https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/how-to-connect-sso.md
For more information on how SSO works with Windows 10 using PRT, see: [Primary R
- Microsoft 365 Win32 clients (Outlook, Word, Excel, and others) with versions 16.0.8730.xxxx and above are supported using a non-interactive flow. For OneDrive, you'll have to activate the [OneDrive silent config feature](https://techcommunity.microsoft.com/t5/Microsoft-OneDrive-Blog/Previews-for-Silent-Sync-Account-Configuration-and-Bandwidth/ba-p/120894) for a silent sign-on experience. - It can be enabled via Microsoft Entra Connect. - It's a free feature, and you don't need any paid editions of Microsoft Entra ID to use it.-- It's supported on web browser-based clients and Office clients that support [modern authentication](/office365/enterprise/modern-auth-for-office-2013-and-2016) on platforms and browsers capable of Kerberos authentication:
+- It's supported on web browser-based clients and Office clients that support [modern authentication](/microsoft-365/enterprise/modern-auth-for-office-2013-and-2016) on platforms and browsers capable of Kerberos authentication:
| OS\Browser |Internet Explorer|Microsoft Edge\*\*\*\*|Google Chrome|Mozilla Firefox|Safari| | | | | | | --
active-directory Migrate From Federation To Cloud Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/connect/migrate-from-federation-to-cloud-authentication.md
Install [Microsoft Entra Connect](https://www.microsoft.com/download/details.asp
### Document current federation settings
-To find your current federation settings, run [Get-MgDomainFederationConfiguration](/powershell/module/microsoft.graph.identity.directorymanagement/get-mgdomainfederationconfiguration?view=graph-powershell-beta&preserve-view=true).
+To find your current federation settings, run [Get-MgDomainFederationConfiguration](/powershell/module/microsoft.graph.identity.directorymanagement/get-mgdomainfederationconfiguration?view=graph-powershell-1.0&viewFallbackFrom=graph-powershell-beta&preserve-view=true).
```powershell Get-MgDomainFederationConfiguration -DomainId yourdomain.com
active-directory F5 Bigip Deployment Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-bigip-deployment-guide.md
Confirm you can connect to the BIG-IP VM web config and sign in with the credent
To connect to the CLI: -- [Azure Bastion service](../../bastion/bastion-overview.md): Connect to VMs in a VNet, from any location
+- [Azure Bastion service](/azure/bastion/bastion-overview): Connect to VMs in a VNet, from any location
- SSH client, such as PowerShell with the just-in-time (JIT) approach - Serial Console: In the portal, in the VM menu, Support and troubleshooting section. It doesn't support file transfers. - From the internet: Configure the BIG-IP primary IP with a public IP. Add an NSG rule to allow SSH traffic. Restrict your trusted IP source.
When the BIG-IP system is provisioned, we recommend a full configuration backup.
10. Save the user configuration set (UCS) archive locally. 11. Select **Download**.
-You can create a backup of the entire system disk using [Azure snapshots](../../virtual-machines/windows/snapshot-copy-managed-disk.md). This tool provides contingency for testing between TMOS versions, or rolling back to a fresh system.
+You can create a backup of the entire system disk using [Azure snapshots](/azure/virtual-machines/snapshot-copy-managed-disk). This tool provides contingency for testing between TMOS versions, or rolling back to a fresh system.
```PowerShell # Install modules
active-directory Manage Consent Requests https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/manage-consent-requests.md
# Manage consent to applications and evaluate consent requests
-Microsoft recommends that you [restrict user consent](../../active-directory/manage-apps/configure-user-consent.md) to allow users to consent only for apps from verified publishers, and only for permissions that you select. For apps that don't meet these criteria, the decision-making process is centralized with your organization's security and identity administrator team.
+Microsoft recommends that you [restrict user consent](../manage-apps/configure-user-consent.md) to allow users to consent only for apps from verified publishers, and only for permissions that you select. For apps that don't meet these criteria, the decision-making process is centralized with your organization's security and identity administrator team.
After you've disabled or restricted user consent, you have several important steps to take to help keep your organization secure as you continue to allow business-critical applications to be used. These steps are crucial to minimize impact on your organization's support team and IT administrators, and to help prevent the use of unmanaged accounts in third-party applications.
active-directory Prevent Domain Hints With Home Realm Discovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/prevent-domain-hints-with-home-realm-discovery.md
After step 4 is complete all users, except those in `guestHandlingDomain.com`, c
## Configuring policy through Graph Explorer
-Manage the [Home Realm Discovery policy](/graph/api/resources/homeRealmDiscoveryPolicy) using [Microsoft Graph](https://developer.microsoft.com/graph/graph-explorer).
+Manage the [Home Realm Discovery policy](/graph/api/resources/homerealmdiscoverypolicy) using [Microsoft Graph](https://developer.microsoft.com/graph/graph-explorer).
1. Sign in to Microsoft Graph explorer with one of the roles listed in the prerequisite section. 1. Grant the `Policy.ReadWrite.ApplicationConfiguration` permission.
active-directory How Manage User Assigned Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md
In this article, you create a user-assigned managed identity by using Azure Reso
You can't list and delete a user-assigned managed identity by using a Resource Manager template. See the following articles to create and list a user-assigned managed identity: -- [List user-assigned managed identity](./how-to-manage-ua-identity-cli.md#list-user-assigned-managed-identities)-- [Delete user-assigned managed identity](./how-to-manage-ua-identity-cli.md#delete-a-user-assigned-managed-identity)
+- [List user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-azcli#list-user-assigned-managed-identities)
+- [Delete user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-azcli#delete-a-user-assigned-managed-identity)
## Template creation and editing Resource Manager templates help you deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based. You can:
active-directory Managed Identities Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/managed-identities-status.md
The following Azure services support managed identities for Azure resources:
| Azure Load Testing | [Use managed identities for Azure Load Testing](/azure/load-testing/how-to-use-a-managed-identity) | | Azure Logic Apps | [Authenticate access to Azure resources using managed identities in Azure Logic Apps](/azure/logic-apps/create-managed-service-identity) | | Azure Log Analytics cluster | [Azure Monitor customer-managed key](/azure/azure-monitor/logs/customer-managed-keys)
-| Azure Machine Learning Services | [Use Managed identities with Azure Machine Learning](../../machine-learning/how-to-use-managed-identities.md?tabs=python) |
+| Azure Machine Learning Services | [Use Managed identities with Azure Machine Learning](/azure/machine-learning/how-to-identity-based-service-authentication?tabs=python) |
| Azure Managed Disk | [Use the Azure portal to enable server-side encryption with customer-managed keys for managed disks](/azure/virtual-machines/disks-enable-customer-managed-keys-portal) | | Azure Media services | [Managed identities](/azure/media-services/latest/concept-managed-identities) | | Azure Monitor | [Azure Monitor customer-managed key](/azure/azure-monitor/logs/customer-managed-keys?tabs=portal) | | Azure Policy | [Remediate non-compliant resources with Azure Policy](/azure/governance/policy/how-to/remediate-resources) |
-| Microsoft Purview | [Credentials for source authentication in Microsoft Purview](../../purview/manage-credentials.md) |
+| Microsoft Purview | [Credentials for source authentication in Microsoft Purview](/purview/manage-credentials) |
| Azure Resource Mover | [Move resources across regions (from resource group)](/azure/resource-mover/move-region-within-resource-group) | Azure Site Recovery | [Replicate machines with private endpoints](/azure/site-recovery/azure-to-azure-how-to-enable-replication-private-endpoints#enable-the-managed-identity-for-the-vault) | | Azure Search | [Set up an indexer connection to a data source using a managed identity](/azure/search/search-howto-managed-identities-data-sources) |
The following Azure services support managed identities for Azure resources:
| Azure Stack Edge | [Manage Azure Stack Edge secrets using Azure Key Vault](/azure/databox-online/azure-stack-edge-gpu-activation-key-vault#recover-managed-identity-access) | Azure Static Web Apps | [Securing authentication secrets in Azure Key Vault](/azure/static-web-apps/key-vault-secrets) | Azure Stream Analytics | [Authenticate Stream Analytics to Azure Data Lake Storage Gen1 using managed identities](/azure/stream-analytics/stream-analytics-managed-identities-adls) |
-| Azure Synapse | [Azure Synapse workspace managed identity](../../synapse-analytics/security/synapse-workspace-managed-identity.md) |
+| Azure Synapse | [Azure Synapse workspace managed identity](/azure/data-factory/data-factory-service-identity) |
| Azure VM image builder | [Configure Azure Image Builder Service permissions using Azure CLI](/azure/virtual-machines/linux/image-builder-permissions-cli#using-managed-identity-for-azure-storage-access)| | Azure Virtual Machine Scale Sets | [Configure managed identities on virtual machine scale set - Azure CLI](qs-configure-cli-windows-vmss.md) | | Azure Virtual Machines | [Secure and use policies on virtual machines in Azure](/azure/virtual-machines/windows/security-policy#managed-identities-for-azure-resources) |
active-directory Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/overview.md
There are two types of managed identities:
- You authorize the managed identity to have access to one or more services. - The name of the system-assigned service principal is always the same as the name of the Azure resource it is created for. For a deployment slot, the name of its system-assigned identity is ```<app-name>/slots/<slot-name>```. -- **User-assigned**. You may also create a managed identity as a standalone Azure resource. You can [create a user-assigned managed identity](./how-to-manage-ua-identity-portal.md) and assign it to one or more Azure Resources. When you enable a user-assigned managed identity:
+- **User-assigned**. You may also create a managed identity as a standalone Azure resource. You can [create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-azp) and assign it to one or more Azure Resources. When you enable a user-assigned managed identity:
- A service principal of a special type is created in Microsoft Entra ID for the identity. The service principal is managed separately from the resources that use it. - User-assigned identities can be used by multiple resources. - You authorize the managed identity to have access to one or more services.
Resources that support system assigned managed identities allow you to:
If you choose a user assigned managed identity instead: -- You can [create, read, update, and delete](./how-to-manage-ua-identity-portal.md) the identities.
+- You can [create, read, update, and delete](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-azp) the identities.
- You can use RBAC role assignments to [grant permissions](howto-assign-access-portal.md). - User assigned managed identities can be used on more than one resource. - CRUD operations are available for review in [Azure Activity logs](/azure/azure-monitor/essentials/activity-log).
active-directory Qs Configure Rest Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-rest-vm.md
To assign a user-assigned identity to a VM, your account needs the [Virtual Mach
az account get-access-token ```
-4. Create a user-assigned managed identity using the instructions found here: [Create a user-assigned managed identity](how-to-manage-ua-identity-rest.md#create-a-user-assigned-managed-identity).
+4. Create a user-assigned managed identity using the instructions found here: [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest#create-a-user-assigned-managed-identity).
5. Create a VM using CURL to call the Azure Resource Manager REST endpoint. The following example creates a VM named *myVM* in the resource group *myResourceGroup* with a user-assigned managed identity `ID1`, as identified in the request body by the value `"identity":{"type":"UserAssigned"}`. Replace `<ACCESS TOKEN>` with the value you received in the previous step when you requested a Bearer access token and the `<SUBSCRIPTION ID>` value as appropriate for your environment.
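The `"identity":{"type":"UserAssigned"}` request body described above can be sketched in a few lines. The subscription ID, resource group, and identity name below are placeholders; ARM expects the identity's full resource ID as the key under `userAssignedIdentities`:

```python
# Build the "identity" portion of the ARM request body for a VM that uses
# a user-assigned managed identity. All IDs/names are placeholders.
subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "myResourceGroup"
identity_name = "ID1"

identity_resource_id = (
    f"/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identity_name}"
)

# The full resource ID keys an (empty) object in userAssignedIdentities.
body_fragment = {
    "identity": {
        "type": "UserAssigned",
        "userAssignedIdentities": {identity_resource_id: {}},
    }
}
```

This fragment is merged into the larger VM creation body that the CURL call sends with the Bearer token.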
To assign a user-assigned identity to a VM, your account needs the [Virtual Mach
az account get-access-token ```
-2. Create a user-assigned managed identity using the instructions found here, [Create a user-assigned managed identity](how-to-manage-ua-identity-rest.md#create-a-user-assigned-managed-identity).
+2. Create a user-assigned managed identity using the instructions found here, [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest#create-a-user-assigned-managed-identity).
3. To ensure you don't delete existing user or system-assigned managed identities that are assigned to the VM, you need to list the identity types assigned to the VM by using the following CURL command. If you have managed identities assigned to the virtual machine scale set, they are listed under in the `identity` value.
PATCH https://management.azure.com/subscriptions/<SUBSCRIPTION ID>/resourceGroup
For information on how to create, list, or delete user-assigned managed identities using REST see: -- [Create, list, or delete a user-assigned managed identities using REST API calls](how-to-manage-ua-identity-rest.md)
+- [Create, list, or delete user-assigned managed identities using REST API calls](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest)
active-directory Qs Configure Rest Vmss https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-rest-vmss.md
In this section, you learn how to add and remove user-assigned managed identity
az account get-access-token ```
-4. Create a user-assigned managed identity using the instructions found here: [Create a user-assigned managed identity](how-to-manage-ua-identity-rest.md#create-a-user-assigned-managed-identity).
+4. Create a user-assigned managed identity using the instructions found here: [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest#create-a-user-assigned-managed-identity).
5. Create a virtual machine scale set using CURL to call the Azure Resource Manager REST endpoint. The following example creates a virtual machine scale set named *myVMSS* in the resource group *myResourceGroup* with a user-assigned managed identity `ID1`, as identified in the request body by the value `"identity":{"type":"UserAssigned"}`. Replace `<ACCESS TOKEN>` with the value you received in the previous step when you requested a Bearer access token and the `<SUBSCRIPTION ID>` value as appropriate for your environment.
In this section, you learn how to add and remove user-assigned managed identity
az account get-access-token ```
-2. Create a user-assigned managed identity using the instructions found here, [Create a user-assigned managed identity](how-to-manage-ua-identity-rest.md#create-a-user-assigned-managed-identity).
+2. Create a user-assigned managed identity using the instructions found here, [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest#create-a-user-assigned-managed-identity).
3. To ensure you don't delete existing user or system-assigned managed identities that are assigned to the virtual machine scale set, you need to list the identity types assigned to the virtual machine scale set by using the following CURL command. If you have managed identities assigned to the virtual machine scale set, they are listed in the `identity` value.
PATCH https://management.azure.com/subscriptions/<SUBSCRIPTION ID>/resourceGroup
For information on how to create, list, or delete user-assigned managed identities using REST see: -- [Create, list, or delete a user-assigned managed identity using REST API calls](how-to-manage-ua-identity-rest.md)
+- [Create, list, or delete a user-assigned managed identity using REST API calls](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-rest)
active-directory Qs Configure Template Windows Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-template-windows-vm.md
The following example shows you how to remove a system-assigned managed identity
In this section, you assign a user-assigned managed identity to an Azure VM using an Azure Resource Manager template. > [!NOTE]
-> To create a user-assigned managed identity using an Azure Resource Manager Template, see [Create a user-assigned managed identity](how-to-manage-ua-identity-arm.md#create-a-user-assigned-managed-identity).
+> To create a user-assigned managed identity using an Azure Resource Manager Template, see [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-arm#create-a-user-assigned-managed-identity).
### Assign a user-assigned managed identity to an Azure VM
active-directory Qs Configure Template Windows Vmss https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/qs-configure-template-windows-vmss.md
If you have a virtual machine scale set that no longer needs a system-assigned m
In this section, you assign a user-assigned managed identity to a virtual machine scale set using an Azure Resource Manager template. > [!Note]
-> To create a user-assigned managed identity using an Azure Resource Manager Template, see [Create a user-assigned managed identity](how-to-manage-ua-identity-arm.md#create-a-user-assigned-managed-identity).
+> To create a user-assigned managed identity using an Azure Resource Manager Template, see [Create a user-assigned managed identity](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-arm#create-a-user-assigned-managed-identity).
### Assign a user-assigned managed identity to a virtual machine scale set
active-directory Tutorial Linux Vm Access Arm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/tutorial-linux-vm-access-arm.md
In this quickstart, you learned how to use a system-assigned managed identity to
> [!div class="nextstepaction"] >[Azure Resource Manager](/azure/azure-resource-manager/management/overview)
->[Create, list or delete a user-assigned managed identity using Azure PowerShell](how-to-manage-ua-identity-powershell.md)
+>[Create, list or delete a user-assigned managed identity using Azure PowerShell](./how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-powershell)
active-directory Tutorial Linux Vm Access Storage Access Key https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/tutorial-linux-vm-access-storage-access-key.md
For detailed steps, see [Assign Azure roles using the Azure portal](/azure/role-
For the remainder of the tutorial, we will work from the VM we created earlier.
-To complete these steps, you will need an SSH client. If you are using Windows, you can use the SSH client in the [Windows Subsystem for Linux](/windows/wsl/install-win10). If you need assistance configuring your SSH client's keys, see [How to Use SSH keys with Windows on Azure](/azure/virtual-machines/linux/ssh-from-windows), or [How to create and use an SSH public and private key pair for Linux VMs in Azure](/azure/virtual-machines/linux/mac-create-ssh-keys).
+To complete these steps, you will need an SSH client. If you are using Windows, you can use the SSH client in the [Windows Subsystem for Linux](/windows/wsl/install). If you need assistance configuring your SSH client's keys, see [How to Use SSH keys with Windows on Azure](/azure/virtual-machines/linux/ssh-from-windows), or [How to create and use an SSH public and private key pair for Linux VMs in Azure](/azure/virtual-machines/linux/mac-create-ssh-keys).
1. In the Azure portal, navigate to **Virtual Machines**, go to your Linux virtual machine, then from the **Overview** page click **Connect** at the top. Copy the string to connect to your VM. 2. Connect to your VM using your SSH client.
active-directory Tutorial Windows Vm Access Storage Sas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-storage-sas.md
For this request we'll use the follow HTTP request parameters to create the SAS
} ```
-These parameters are included in the POST body of the request for the SAS credential. For more information on the parameters for creating a SAS credential, see the [List Service SAS REST reference](/rest/api/storagerp/storageaccounts/listservicesas).
+These parameters are included in the POST body of the request for the SAS credential. For more information on the parameters for creating a SAS credential, see the [List Service SAS REST reference](/rest/api/storagerp/storage-accounts/list-service-sas).
First, convert the parameters to JSON, then call the storage `listServiceSas` endpoint to create the SAS credential:
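As an illustration of that conversion step (account and container names are placeholders; the parameter names follow the List Service SAS reference), the POST body might be built like this:

```python
import json
from datetime import datetime, timedelta, timezone

# Service SAS parameters for a container, serialized to JSON for the POST body.
expiry = (datetime.now(timezone.utc) + timedelta(hours=1)).strftime(
    "%Y-%m-%dT%H:%M:%SZ"
)
sas_params = {
    "canonicalizedResource": "/blob/<STORAGE ACCOUNT>/<CONTAINER>",
    "signedResource": "c",       # c = container
    "signedPermission": "rcw",   # read, create, write
    "signedProtocol": "https",
    "signedExpiry": expiry,
}
post_body = json.dumps(sas_params)
```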
active-directory Concept Pim For Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/concept-pim-for-groups.md
Role-assignable groups benefit from extra protections compared to non-role-assi
To learn more about Microsoft Entra built-in roles and their permissions, see [Microsoft Entra built-in roles](../roles/permissions-reference.md).
-Microsoft Entra role-assignable group feature is not part of Microsoft Entra Privileged Identity Management (Microsoft Entra PIM). For more information on licensing, see [Microsoft Entra ID Governance licensing fundamentals](../../active-directory/governance/licensing-fundamentals.md) .
+The Microsoft Entra role-assignable group feature is not part of Microsoft Entra Privileged Identity Management (Microsoft Entra PIM). For more information on licensing, see [Microsoft Entra ID Governance licensing fundamentals](../governance/licensing-fundamentals.md).
## Relationship between role-assignable groups and PIM for Groups
active-directory Groups Assign Member Owner https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/groups-assign-member-owner.md
# Assign eligibility for a group in Privileged Identity Management
-In Azure Active Directory, formerly known as Microsoft Entra ID, you can use Privileged Identity Management (PIM) to manage just-in-time membership in the group or just-in-time ownership of the group.
+In Microsoft Entra ID, formerly known as Azure Active Directory, you can use Privileged Identity Management (PIM) to manage just-in-time membership in the group or just-in-time ownership of the group.
When a membership or ownership is assigned, the assignment:
When a membership or ownership is assigned, the assignment:
- Can't be removed within five minutes of it being assigned >[!NOTE]
->Every user who is eligible for membership in or ownership of a PIM for Groups must have a Microsoft Entra Premuim P2 or Microsoft Entra ID Governance license. For more information, see [License requirements to use Privileged Identity Management](subscription-requirements.md).
+>Every user who is eligible for membership in or ownership of a PIM for Groups must have a Microsoft Entra Premium P2 or Microsoft Entra ID Governance license. For more information, see [License requirements to use Privileged Identity Management](../governance/licensing-fundamentals.md).
## Assign an owner or member of a group
active-directory Pim Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-apis.md
Find more details about APIs that allow you to manage assignments in the documentati
- [PIM for Microsoft Entra roles API reference](/graph/api/resources/privilegedidentitymanagementv3-overview) - [PIM for Azure resource roles API reference](/rest/api/authorization/privileged-role-eligibility-rest-sample) - [PIM for Groups API reference](/graph/api/resources/privilegedidentitymanagement-for-groups-api-overview)-- [PIM Alerts for Microsoft Entra roles API reference](/graph/api/resources/privilegedidentitymanagementv3-overview?view=graph-rest-beta#building-blocks-of-the-pim-alerts-apis&preserve-view=true)
+- [PIM Alerts for Microsoft Entra roles API reference](/graph/api/resources/privilegedidentitymanagementv3-overview?view=graph-rest-beta&preserve-view=true#building-blocks-of-the-pim-alerts-apis)
- [PIM Alerts for Azure Resources API reference](/rest/api/authorization/role-management-alert-rest-sample)
active-directory Pim Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-configure.md
When you use B2B collaboration, you can invite an external user to your organiza
## Next steps -- [License requirements to use Privileged Identity Management](subscription-requirements.md)
+- [License requirements to use Privileged Identity Management](../governance/licensing-fundamentals.md)
- [Securing privileged access for hybrid and cloud deployments in Microsoft Entra ID](../roles/security-planning.md?toc=/azure/active-directory/privileged-identity-management/toc.json) - [Deploy Privileged Identity Management](pim-deployment-plan.md)
active-directory Pim Create Roles And Resource Roles Review https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-create-roles-and-resource-roles-review.md
The need for access to privileged Azure resource and Microsoft Entra roles by em
[!INCLUDE [entra-id-license-pim.md](../../../includes/entra-id-license-pim.md)]
-For more information about licenses for PIM, refer to [License requirements to use Privileged Identity Management](subscription-requirements.md).
+For more information about licenses for PIM, refer to [License requirements to use Privileged Identity Management](../governance/licensing-fundamentals.md).
To create access reviews for Azure resources, you must be assigned to the [Owner](/azure/role-based-access-control/built-in-roles#owner) or the [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) role for the Azure resources. To create access reviews for Microsoft Entra roles, you must be assigned to the [Global Administrator](../roles/permissions-reference.md#global-administrator) or the [Privileged Role Administrator](../roles/permissions-reference.md#privileged-role-administrator) role.
active-directory Pim Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-getting-started.md
To use Privileged Identity Management, you must have one of the following licens
- [!INCLUDE [entra-id-license-pim.md](../../../includes/entra-id-license-pim.md)]
-For more information, see [License requirements to use Privileged Identity Management](subscription-requirements.md).
+For more information, see [License requirements to use Privileged Identity Management](../governance/licensing-fundamentals.md).
> [!Note] > When a user who is active in a privileged role in a Microsoft Entra organization with a Premium P2 license goes to **Roles and administrators** in Microsoft Entra ID and selects a role (or even just visits Privileged Identity Management):
active-directory Pim Resource Roles Assign Roles https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-resource-roles-assign-roles.md
Follow these steps to make a user eligible for an Azure resource role.
## Assign a role using ARM API
-Privileged Identity Management supports Azure Resource Manager (ARM) API commands to manage Azure resource roles, as documented in the [PIM ARM API reference](/rest/api/authorization/roleeligibilityschedulerequests). For the permissions required to use the PIM API, see [Understand the Privileged Identity Management APIs](pim-apis.md).
+Privileged Identity Management supports Azure Resource Manager (ARM) API commands to manage Azure resource roles, as documented in the [PIM ARM API reference](/rest/api/authorization/role-eligibility-schedule-requests). For the permissions required to use the PIM API, see [Understand the Privileged Identity Management APIs](pim-apis.md).
The following example is a sample HTTP request to create an eligible assignment for an Azure role.
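A minimal sketch of what such a request might look like is shown below. Property names follow the `roleEligibilityScheduleRequests` ARM API; the API version, dates, and all IDs are placeholders or assumptions, not the article's literal sample.

```python
import uuid

scope = "/subscriptions/<SUBSCRIPTION ID>/resourceGroups/myResourceGroup"
request_name = str(uuid.uuid4())  # each request gets a new GUID name

url = (
    f"https://management.azure.com{scope}/providers/Microsoft.Authorization"
    f"/roleEligibilityScheduleRequests/{request_name}?api-version=2020-10-01"
)

request_body = {
    "properties": {
        "principalId": "<PRINCIPAL OBJECT ID>",
        "roleDefinitionId": (
            f"{scope}/providers/Microsoft.Authorization"
            "/roleDefinitions/<ROLE DEFINITION ID>"
        ),
        "requestType": "AdminAssign",  # create a new eligible assignment
        "scheduleInfo": {
            "startDateTime": "2023-10-20T00:00:00Z",
            "expiration": {"type": "AfterDuration", "duration": "P365D"},
        },
    }
}
```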
active-directory Pim Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-troubleshoot.md
Assign the User Access Administrator role to the Privileged Identity Management
## Next steps -- [License requirements to use Privileged Identity Management](subscription-requirements.md)
+- [License requirements to use Privileged Identity Management](../governance/licensing-fundamentals.md)
- [Securing privileged access for hybrid and cloud deployments in Microsoft Entra ID](../roles/security-planning.md?toc=/azure/active-directory/privileged-identity-management/toc.json) - [Deploy Privileged Identity Management](pim-deployment-plan.md)
active-directory Concept Diagnostic Settings Logs Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/concept-diagnostic-settings-logs-options.md
The `AuditLogs` report capture changes to applications, groups, users, and licen
### Sign-in logs
-The `SignInLogs` send the interactive sign-in logs, which are logs generated by your users signing in. Sign-in logs are generated by users providing their username and password on a Microsoft Entra sign-in screen or passing an MFA challenge. For more information, see [Interactive user sign-ins](concept-all-sign-ins.md#interactive-user-sign-ins).
+The `SignInLogs` send the interactive sign-in logs, which are logs generated by your users signing in. Sign-in logs are generated by users providing their username and password on a Microsoft Entra sign-in screen or passing an MFA challenge. For more information, see [Interactive user sign-ins](./concept-sign-ins.md#interactive-user-sign-ins).
### Non-interactive sign-in logs
-The `NonInteractiveUserSignInLogs` are sign-ins done on behalf of a user, such as by a client app. The device or client uses a token or code to authenticate or access a resource on behalf of a user. For more information, see [Non-interactive user sign-ins](concept-all-sign-ins.md#non-interactive-user-sign-ins).
+The `NonInteractiveUserSignInLogs` are sign-ins done on behalf of a user, such as by a client app. The device or client uses a token or code to authenticate or access a resource on behalf of a user. For more information, see [Non-interactive user sign-ins](./concept-sign-ins.md#non-interactive-user-sign-ins).
### Service principal sign-in logs
-If you need to review sign-in activity for apps or service principals, the `ServicePrincipalSignInLogs` may be a good option. In these scenarios, certificates or client secrets are used for authentication. For more information, see [Service principal sign-ins](concept-all-sign-ins.md#service-principal-sign-ins).
+If you need to review sign-in activity for apps or service principals, the `ServicePrincipalSignInLogs` may be a good option. In these scenarios, certificates or client secrets are used for authentication. For more information, see [Service principal sign-ins](./concept-sign-ins.md#service-principal-sign-ins).
### Managed identity sign-in logs
-The `ManagedIdentitySignInLogs` provide similar insights as the service principal sign-in logs, but for managed identities, where Azure manages the secrets. For more information, see [Managed identity sign-ins](concept-all-sign-ins.md#managed-identity-for-azure-resources-sign-ins).
+The `ManagedIdentitySignInLogs` provide similar insights as the service principal sign-in logs, but for managed identities, where Azure manages the secrets. For more information, see [Managed identity sign-ins](./concept-sign-ins.md#managed-identity-sign-ins).
### Provisioning logs
The `NetworkAccessTrafficLogs` logs are associated with Microsoft Entra Internet
## Next steps -- [Learn about the sign-in logs](concept-all-sign-ins.md)
+- [Learn about the sign-in logs](./concept-sign-ins.md)
- [Explore how to access the activity logs](howto-access-activity-logs.md)
active-directory Concept Log Monitoring Integration Options Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/concept-log-monitoring-integration-options-considerations.md
With the data sample captured, multiply accordingly to find out how large the fi
To get an idea of how much a log integration could cost for your organization, you can enable an integration for a day or two. Use this option if your budget allows for the temporary increase.
-To enable a log integration, follow the steps in the [Integrate activity logs with Azure Monitor logs](howto-integrate-activity-logs-with-log-analytics.md) article. If possible, create a new resource group for the logs and endpoint you want to try out. Having a dedicated resource group makes it easy to view the cost analysis and then delete it when you're done.
+To enable a log integration, follow the steps in the [Integrate activity logs with Azure Monitor logs](./howto-integrate-activity-logs-with-azure-monitor-logs.md) article. If possible, create a new resource group for the logs and endpoint you want to try out. Having a dedicated resource group makes it easy to view the cost analysis and then delete it when you're done.
With the integration enabled, navigate to **Azure portal** > **Cost Management** > **Cost analysis**. There are several ways to analyze costs. This [Cost Management quickstart](/azure/cost-management-billing/costs/quick-acm-cost-analysis) should help you get started. The figures in the following screenshot are used for example purposes and are not intended to reflect actual amounts.
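The scaling arithmetic behind that estimate can be sketched as follows; all figures are hypothetical, and the per-GB rate should be replaced with the current pricing for your region and tier.

```python
# Scale a captured sample up to a daily volume, then to a rough monthly cost.
sample_size_mb = 120        # log volume captured during the sample window
sample_window_hours = 6
price_per_gb = 2.76         # example pay-as-you-go rate; check current pricing

gb_per_day = (sample_size_mb / 1024) * (24 / sample_window_hours)
monthly_cost_usd = gb_per_day * 30 * price_per_gb

print(f"{gb_per_day:.3f} GB/day, ~${monthly_cost_usd:.2f}/month")
```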
Once you have an estimate for the GB/day that will be sent to an endpoint, enter
## Next steps * [Create a storage account](/azure/storage/common/storage-account-create)
-* [Archive activity logs to a storage account](quickstart-azure-monitor-route-logs-to-storage-account.md)
-* [Route activity logs to an event hub](./tutorial-azure-monitor-stream-logs-to-event-hub.md)
-* [Integrate activity logs with Azure Monitor](howto-integrate-activity-logs-with-log-analytics.md)
+* [Archive activity logs to a storage account](./howto-archive-logs-to-storage-account.md)
+* [Route activity logs to an event hub](./howto-stream-logs-to-event-hub.md)
+* [Integrate activity logs with Azure Monitor](./howto-integrate-activity-logs-with-azure-monitor-logs.md)
active-directory Concept Sign Ins https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/concept-sign-ins.md
Microsoft Entra logs all sign-ins into an Azure tenant, which includes your internal apps and resources. As an IT administrator, you need to know what the values in the sign-in logs mean, so that you can interpret the log values correctly.
-Reviewing sign-in errors and patterns provides valuable insight into how your users access applications and services. The sign-in logs provided by Microsoft Entra ID are a powerful type of [activity log](overview-reports.md) that you can analyze. This article explains how to access and utilize the sign-in logs.
+Reviewing sign-in errors and patterns provides valuable insight into how your users access applications and services. The sign-in logs provided by Microsoft Entra ID are a powerful type of [activity log](./overview-monitoring-health.md) that you can analyze. This article explains how to access and utilize the sign-in logs.
The preview view of the sign-in logs includes interactive and non-interactive user sign-ins as well as service principal and managed identity sign-ins. You can still view the classic sign-in logs, which only include interactive sign-ins.
There are several reports available in **Usage & insights**. Some of these repor
### Microsoft 365 activity logs
-You can view Microsoft 365 activity logs from the [Microsoft 365 admin center](/office365/admin/admin-overview/about-the-admin-center). Microsoft 365 activity and Microsoft Entra activity logs share a significant number of directory resources. Only the Microsoft 365 admin center provides a full view of the Microsoft 365 activity logs.
+You can view Microsoft 365 activity logs from the [Microsoft 365 admin center](/microsoft-365/admin/admin-overview/admin-center-overview). Microsoft 365 activity and Microsoft Entra activity logs share a significant number of directory resources. Only the Microsoft 365 admin center provides a full view of the Microsoft 365 activity logs.
You can access the Microsoft 365 activity logs programmatically by using the [Office 365 Management APIs](/office/office-365-management-api/office-365-management-apis-overview). ## Next steps -- [Basic info in the Microsoft Entra sign-in logs](reference-basic-info-sign-in-logs.md)
+- [Basic info in the Microsoft Entra sign-in logs](./concept-sign-in-log-activity-details.md)
- [How to download logs in Microsoft Entra ID](howto-download-logs.md)
active-directory Howto Access Activity Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/howto-access-activity-logs.md
The SIEM tools you can integrate with your event hub can provide analysis and mo
1. Browse to **Identity** > **Monitoring & health** > **Diagnostic settings**. 1. Choose the logs you want to stream, select the **Stream to an event hub** option, and complete the fields. - [Set up an Event Hubs namespace and an event hub](/azure/event-hubs/event-hubs-create)
- - [Learn more about streaming activity logs to an event hub](tutorial-azure-monitor-stream-logs-to-event-hub.md)
+ - [Learn more about streaming activity logs to an event hub](./howto-stream-logs-to-event-hub.md)
Your independent security vendor should provide you with instructions on how to ingest data from Azure Event Hubs into their tool.
Integrating Microsoft Entra logs with Azure Monitor logs provides a centralized
1. Browse to **Identity** > **Monitoring & health** > **Diagnostic settings**. 1. Choose the logs you want to stream, select the **Send to Log Analytics workspace** option, and complete the fields. 1. Browse to **Identity** > **Monitoring & health** > **Log Analytics** and begin querying the data.
- - [Integrate Microsoft Entra logs with Azure Monitor logs](howto-integrate-activity-logs-with-log-analytics.md)
+ - [Integrate Microsoft Entra logs with Azure Monitor logs](./howto-integrate-activity-logs-with-azure-monitor-logs.md)
- [Learn how to query using Log Analytics](howto-analyze-activity-logs-log-analytics.md) ## Monitor events with Microsoft Sentinel
Use the following basic steps to access the reports in the Microsoft Entra admin
1. Browse to **Identity** > **Monitoring & health** > **Audit logs**/**Sign-in logs**/**Provisioning logs**. 1. Adjust the filter according to your needs.
- - [Learn how to filter activity logs](quickstart-filter-audit-log.md)
+ - [Learn how to filter activity logs](./howto-customize-filter-logs.md)
- [Explore the Microsoft Entra audit log categories and activities](reference-audit-activities.md)
- - [Learn about basic info in the Microsoft Entra sign-in logs](reference-basic-info-sign-in-logs.md)
+ - [Learn about basic info in the Microsoft Entra sign-in logs](./concept-sign-in-log-activity-details.md)
<a name='azure-ad-identity-protection-reports'></a>
The right solution for your long-term storage depends on your budget and what yo
- Download logs for manual storage - Integrate logs with Azure Monitor logs
-[Azure Storage](/azure/storage/common/storage-introduction) is the right solution if you aren't planning on querying your data often. For more information, see [Archive directory logs to a storage account](quickstart-azure-monitor-route-logs-to-storage-account.md).
+[Azure Storage](/azure/storage/common/storage-introduction) is the right solution if you aren't planning on querying your data often. For more information, see [Archive directory logs to a storage account](./howto-archive-logs-to-storage-account.md).
-If you plan to query the logs often to run reports or perform analysis on the stored logs, you should [integrate your data with Azure Monitor logs](howto-integrate-activity-logs-with-log-analytics.md).
+If you plan to query the logs often to run reports or perform analysis on the stored logs, you should [integrate your data with Azure Monitor logs](./howto-integrate-activity-logs-with-azure-monitor-logs.md).
If your budget is tight and you need a cheap method to create a long-term backup of your activity logs, you can [manually download your logs](howto-download-logs.md). The user interface of the activity logs in the portal provides you with an option to download the data as **JSON** or **CSV**. One trade-off of the manual download is that it requires more manual interaction. If you're looking for a more professional solution, use either Azure Storage or Azure Monitor.
Use the following basic steps to archive or download your activity logs.
## Next steps -- [Stream logs to an event hub](tutorial-azure-monitor-stream-logs-to-event-hub.md)-- [Archive logs to a storage account](quickstart-azure-monitor-route-logs-to-storage-account.md)-- [Integrate logs with Azure Monitor logs](howto-integrate-activity-logs-with-log-analytics.md)
+- [Stream logs to an event hub](./howto-stream-logs-to-event-hub.md)
+- [Archive logs to a storage account](./howto-archive-logs-to-storage-account.md)
+- [Integrate logs with Azure Monitor logs](./howto-integrate-activity-logs-with-azure-monitor-logs.md)
active-directory Howto Analyze Activity Logs Log Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/howto-analyze-activity-logs-log-analytics.md
# Analyze Microsoft Entra activity logs with Log Analytics
-After you [integrate Microsoft Entra activity logs with Azure Monitor logs](howto-integrate-activity-logs-with-log-analytics.md), you can use the power of Log Analytics and Azure Monitor logs to gain insights into your environment.
+After you [integrate Microsoft Entra activity logs with Azure Monitor logs](./howto-integrate-activity-logs-with-azure-monitor-logs.md), you can use the power of Log Analytics and Azure Monitor logs to gain insights into your environment.
* Compare your Microsoft Entra sign-in logs against security logs published by Microsoft Defender for Cloud.
For more information on Microsoft Entra built-in roles, see [Microsoft Entra bui
## Access Log Analytics
-To view the Microsoft Entra ID Log Analytics, you must already be sending your activity logs from Microsoft Entra ID to a Log Analytics workspace. This process is covered in the [How to integrate activity logs with Azure Monitor](howto-integrate-activity-logs-with-log-analytics.md) article.
+To view the Microsoft Entra ID Log Analytics, you must already be sending your activity logs from Microsoft Entra ID to a Log Analytics workspace. This process is covered in the [How to integrate activity logs with Azure Monitor](./howto-integrate-activity-logs-with-azure-monitor-logs.md) article.
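Once the logs are flowing to the workspace, a query along these lines summarizes sign-in attempts. Table and column names are assumed to match the Microsoft Entra Log Analytics schema; the query is held as a Python string here so it can be pasted into the portal blade or passed to a query client.

```python
# Kusto query text; run it in the Log Analytics blade or pass it to a
# query client such as azure-monitor-query's LogsQueryClient.
signin_summary_query = """
SigninLogs
| where TimeGenerated > ago(7d)
| summarize attempts = count() by AppDisplayName, ResultType
| order by attempts desc
"""
```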
[!INCLUDE [portal updates](~/articles/active-directory/includes/portal-update.md)]
active-directory Howto Download Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/howto-download-logs.md
The following screenshot shows the download window from the audit and sign-in lo
The following screenshot shows menu options for the provisioning log download process. ![Screenshot of the provisioning log download button options.](./media/howto-download-logs/provisioning-logs-download.png)
-If your tenant has enabled the [sign-in logs preview](concept-all-sign-ins.md), more options are available after selecting **Download**. The sign-in logs preview includes interactive and non-interactive user sign-ins, service principal sign-ins, and managed identity sign-ins.
+If your tenant has enabled the [sign-in logs preview](./concept-sign-ins.md), more options are available after selecting **Download**. The sign-in logs preview includes interactive and non-interactive user sign-ins, service principal sign-ins, and managed identity sign-ins.
![Screenshot of the download options for the sign-in logs preview.](media/howto-download-logs/sign-in-preview-download-options.png) ## Next steps -- [Integrate Microsoft Entra logs with Azure Monitor](howto-integrate-activity-logs-with-log-analytics.md)
+- [Integrate Microsoft Entra logs with Azure Monitor](./howto-integrate-activity-logs-with-azure-monitor-logs.md)
- [Access Microsoft Entra logs using the Graph API](quickstart-access-log-with-graph-api.md)
active-directory Howto Manage Inactive User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/howto-manage-inactive-user-accounts.md
The following details relate to the `lastSignInDateTime` property.
- Each interactive sign-in attempt results in an update of the underlying data store. Typically, sign-ins show up in the related sign-in report within 6 hours. -- To generate a `lastSignInDateTime` timestamp, you must attempt a sign-in. Either a failed or successful sign-in attempt, as long as it's recorded in the [Microsoft Entra sign-in logs](concept-all-sign-ins.md), generates a `lastSignInDateTime` timestamp. The value of the `lastSignInDateTime` property may be blank if:
+- To generate a `lastSignInDateTime` timestamp, you must attempt a sign-in. Either a failed or successful sign-in attempt, as long as it's recorded in the [Microsoft Entra sign-in logs](./concept-sign-ins.md), generates a `lastSignInDateTime` timestamp. The value of the `lastSignInDateTime` property may be blank if:
- The last attempted sign-in of a user took place before April 2020. - The affected user account was never used for a sign-in attempt.
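For example, a Microsoft Graph request filtering on that property could be built as follows; the filter syntax follows the `signInActivity` resource, and the cutoff date is an arbitrary placeholder.

```python
from urllib.parse import quote

# Find users whose last sign-in was before an arbitrary cutoff date.
cutoff = "2023-01-01T00:00:00Z"
odata_filter = f"signInActivity/lastSignInDateTime le {cutoff}"
graph_url = (
    "https://graph.microsoft.com/v1.0/users"
    f"?$filter={quote(odata_filter)}"
)
```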
active-directory Howto Use Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/howto-use-recommendations.md
GET /directory/recommendations/{recommendationId}/impactedResources
## Next steps - [Review the Microsoft Entra recommendations overview](overview-recommendations.md)-- [Learn about Service Health notifications](overview-service-health-notifications.md)
+- [Learn about Service Health notifications](/azure/service-health/service-health-portal-update)
active-directory Overview Monitoring Health https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/overview-monitoring-health.md
Activity logs help you understand the behavior of users in your organization. Th
- [**Audit logs**](concept-audit-logs.md) include the history of every task performed in your tenant. -- [**Sign-in logs**](concept-all-sign-ins.md) capture the sign-in attempts of your users and client applications.
+- [**Sign-in logs**](./concept-sign-ins.md) capture the sign-in attempts of your users and client applications.
- [**Provisioning logs**](concept-provisioning-logs.md) provide information around users provisioned in your tenant through a third party service.
Reviewing Microsoft Entra activity logs is the first step in maintaining and imp
Monitoring Microsoft Entra activity logs requires routing the log data to a monitoring and analysis solution. Endpoints include Azure Monitor logs, Microsoft Sentinel, or a third-party Security Information and Event Management (SIEM) tool. - [Stream logs to an event hub to integrate with third-party SIEM tools.](howto-stream-logs-to-event-hub.md)-- [Integrate logs with Azure Monitor logs.](howto-integrate-activity-logs-with-log-analytics.md)
+- [Integrate logs with Azure Monitor logs.](./howto-integrate-activity-logs-with-azure-monitor-logs.md)
- [Analyze logs with Azure Monitor logs and Log Analytics.](howto-analyze-activity-logs-log-analytics.md)
active-directory Overview Workbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/overview-workbooks.md
Public workbook templates are built, updated, and deprecated to reflect the need
## Next steps
-- Learn [how to use Azure Workbooks for Microsoft Entra ID](howto-use-azure-monitor-workbooks.md)
+- Learn [how to use Azure Workbooks for Microsoft Entra ID](./howto-use-workbooks.md)
- [Create your own workbook](/azure/azure-monitor/visualize/workbooks-create-workbook)
- Create a [Log Analytics workspace](/azure/azure-monitor/logs/quick-create-workspace)
active-directory Reference Powershell Reporting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/reference-powershell-reporting.md
The following image shows an example for this command.
## Next steps
-- [Microsoft Entra reports overview](overview-reports.md).
+- [Microsoft Entra reports overview](./overview-monitoring-health.md).
- [Audit logs report](concept-audit-logs.md).
- [Programmatic access to Microsoft Entra reports](./howto-configure-prerequisites-for-reporting-api.md)
active-directory Reference Reports Data Retention https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/reference-reports-data-retention.md
In this article, you learn about the data retention policies for the different a
| Microsoft Entra Edition | Collection Start |
| :-- | :-- |
| Microsoft Entra ID P1 <br /> Microsoft Entra ID P2 <br /> Microsoft Entra Workload ID Premium | When you sign up for a subscription |
-| Microsoft Entra ID Free| The first time you open [Microsoft Entra ID](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) or use the [reporting APIs](./overview-reports.md) |
+| Microsoft Entra ID Free| The first time you open [Microsoft Entra ID](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) or use the [reporting APIs](./overview-monitoring-health.md) |
If you already have activities data with your free license, then you can see it immediately on upgrade. If you don't have any data, then it will take up to three days for the data to show up in the reports after you upgrade to a premium license. For security signals, the collection process starts when you opt in to use the **Identity Protection Center**.
If you already have activities data with your free license, then you can see it
| Sign-ins | Seven days | 30 days | 30 days |
| Microsoft Entra multifactor authentication usage | 30 days | 30 days | 30 days |
-You can retain the audit and sign-in activity data for longer than the default retention period outlined in the previous table by routing it to an Azure storage account using Azure Monitor. For more information, see [Archive Microsoft Entra logs to an Azure storage account](quickstart-azure-monitor-route-logs-to-storage-account.md).
+You can retain the audit and sign-in activity data for longer than the default retention period outlined in the previous table by routing it to an Azure storage account using Azure Monitor. For more information, see [Archive Microsoft Entra logs to an Azure storage account](./howto-archive-logs-to-storage-account.md).
**Security signals**
You can retain the audit and sign-in activity data for longer than the default r
## Next steps
-- [Stream logs to an event hub](tutorial-azure-monitor-stream-logs-to-event-hub.md)
+- [Stream logs to an event hub](./howto-stream-logs-to-event-hub.md)
- [Learn how to download Microsoft Entra logs](howto-download-logs.md)
active-directory Workbook Legacy Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/workbook-legacy-authentication.md
This workbook supports multiple filters:
- To learn more about identity protection, see [What is identity protection](../identity-protection/overview-identity-protection.md).
-- For more information about Microsoft Entra workbooks, see [How to use Microsoft Entra workbooks](howto-use-azure-monitor-workbooks.md).
+- For more information about Microsoft Entra workbooks, see [How to use Microsoft Entra workbooks](./howto-use-workbooks.md).
active-directory Workbook Risk Analysis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/workbook-risk-analysis.md
Risky Users:
- To learn more about identity protection, see [What is identity protection](../identity-protection/overview-identity-protection.md).
-- For more information about Microsoft Entra workbooks, see [How to use Microsoft Entra workbooks](howto-use-azure-monitor-workbooks.md).
+- For more information about Microsoft Entra workbooks, see [How to use Microsoft Entra workbooks](./howto-use-workbooks.md).
active-directory Custom Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/custom-create.md
Like built-in roles, custom roles are assigned by default at the default organiz
- Feel free to share with us on the [Microsoft Entra administrative roles forum](https://feedback.azure.com/d365community/forum/22920db1-ad25-ec11-b6e6-000d3a4f0789).
- For more about role permissions, see [Microsoft Entra built-in roles](permissions-reference.md).
-- For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2froles%2fcontext%2fugr-context).
+- For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md?context=azure/active-directory/roles/context/ugr-context).
active-directory M365 Workload Docs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/m365-workload-docs.md
All products in Microsoft 365 can be managed with administrative roles in Micros
> [!div class="mx-tableFixed"]
> | Microsoft 365 service | Role content | API content |
> | - | - | -- |
-> | Admin roles in Office 365 and Microsoft 365 business plans | [Microsoft 365 admin roles](/office365/admin/add-users/about-admin-roles) | Not available |
+> | Admin roles in Office 365 and Microsoft 365 business plans | [Microsoft 365 admin roles](/microsoft-365/admin/add-users/about-admin-roles) | Not available |
> | Microsoft Entra ID and Microsoft Entra ID Protection| [Microsoft Entra built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
-> | Exchange Online| [Exchange role-based access control](/exchange/understanding-role-based-access-control-exchange-2013-help) | [PowerShell for Exchange](/powershell/module/exchange/role-based-access-control/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/role-based-access-control/get-rolegroup) |
+> | Exchange Online| [Exchange role-based access control](/exchange/understanding-role-based-access-control-exchange-2013-help) | [PowerShell for Exchange](/powershell/module/exchange/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/get-rolegroup) |
> | SharePoint Online | [Microsoft Entra built-in roles](permissions-reference.md)<br>Also [About the SharePoint admin role in Microsoft 365](/sharepoint/sharepoint-admin-role) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
> | Teams/Skype for Business | [Microsoft Entra built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
> | Security & Compliance Center (Office 365 Advanced Threat Protection, Exchange Online Protection, Information Protection) | [Office 365 admin roles](/microsoft-365/security/office-365-security/scc-permissions) | [Exchange PowerShell](/powershell/module/exchange/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/get-rolegroup) |
active-directory Security Planning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/security-planning.md
Securing privileged access requires changes to:
* Processes, administrative practices, and knowledge management
* Technical components such as host defenses, account protections, and identity management
-Secure your privileged access in a way that is managed and reported in the Microsoft services you care about. If you have on-premises administrator accounts, see the guidance for on-premises and hybrid privileged access in Active Directory at [Securing Privileged Access](/windows-server/identity/securing-privileged-access/securing-privileged-access).
+Secure your privileged access in a way that is managed and reported in the Microsoft services you care about. If you have on-premises administrator accounts, see the guidance for on-premises and hybrid privileged access in Active Directory at [Securing Privileged Access](/security/privileged-access-workstations/overview).
> [!NOTE]
> The guidance in this article refers primarily to features of Microsoft Entra ID that are included in Microsoft Entra ID P1 and P2. Microsoft Entra ID P2 is included in the EMS E5 suite and Microsoft 365 E5 suite. This guidance assumes your organization already has Microsoft Entra ID P2 licenses purchased for your users. If you do not have these licenses, some of the guidance might not apply to your organization. Also, throughout this article, the term Global Administrator means the same thing as "company administrator" or "tenant administrator."
The increase in "bring your own device" and work from home policies and the grow
* Identify the users who have administrative roles and the services they can manage.
* Use Microsoft Entra PIM to find out which users in your organization have administrator access to Microsoft Entra ID.
-* Beyond the roles defined in Microsoft Entra ID, Microsoft 365 comes with a set of administrator roles that you can assign to users in your organization. Each administrator role maps to common business functions, and gives people in your organization permissions to do specific tasks in the [Microsoft 365 admin center](https://admin.microsoft.com). Use the Microsoft 365 admin center to find out which users in your organization have administrator access to Microsoft 365, including via roles not managed in Microsoft Entra ID. For more information, see [About Microsoft 365 administrator roles](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) and [Security practices for Office 365](/office365/servicedescriptions/office-365-platform-service-description/office-365-securitycompliance-center).
+* Beyond the roles defined in Microsoft Entra ID, Microsoft 365 comes with a set of administrator roles that you can assign to users in your organization. Each administrator role maps to common business functions, and gives people in your organization permissions to do specific tasks in the [Microsoft 365 admin center](https://admin.microsoft.com). Use the Microsoft 365 admin center to find out which users in your organization have administrator access to Microsoft 365, including via roles not managed in Microsoft Entra ID. For more information, see [About Microsoft 365 administrator roles](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) and [Security practices for Office 365](/office365/servicedescriptions/microsoft-365-service-descriptions/microsoft-365-tenantlevel-services-licensing-guidance/microsoft-365-security-compliance-licensing-guidance).
* Take inventory of the services your organization relies on, such as Azure, Intune, or Dynamics 365.
* Ensure that accounts used for administration purposes:
active-directory Amazon Web Service Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/amazon-web-service-tutorial.md
Follow these steps to enable Microsoft Entra SSO.
| | | |
| RoleSessionName | user.userprincipalname | `https://aws.amazon.com/SAML/Attributes` |
| Role | user.assignedroles | `https://aws.amazon.com/SAML/Attributes` |
- | SessionDuration | "provide a value between 900 seconds (15 minutes) to 43200 seconds (12 hours)" | `https://aws.amazon.com/SAML/Attributes` |
+ | SessionDuration | user.sessionduration | `https://aws.amazon.com/SAML/Attributes` |
> [!NOTE]
> AWS expects roles for users assigned to the application. Please set up these roles in Microsoft Entra ID so that users can be assigned the appropriate roles. To understand how to configure roles in Microsoft Entra ID, see [here](../develop/howto-add-app-roles-in-azure-ad-apps.md#app-roles-ui).
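The removed claim row above quoted AWS's accepted range for `SessionDuration` (900 seconds to 43200 seconds). A small hedged helper to sanity-check a value before configuring the claim (the function name is illustrative):

```python
def valid_aws_session_duration(seconds: int) -> bool:
    """AWS accepts SessionDuration values from 900 s (15 min) to 43200 s (12 h)."""
    return 900 <= seconds <= 43200

print(valid_aws_session_duration(3600))  # True
print(valid_aws_session_duration(600))   # False
```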
active-directory Configure Cmmc Level 2 Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/configure-cmmc-level-2-access-control.md
The following table provides a list of practice statement and objectives, and Mi
| AC.L2-3.1.9<br><br>**Practice statement:** Provide privacy and security notices consistent with applicable CUI rules.<br><br>**Objectives:**<br>Determine if:<br>[a.] privacy and security notices required by CUI-specified rules are identified, consistent, and associated with the specific CUI category; and<br>[b.] privacy and security notices are displayed. | With Microsoft Entra ID, you can deliver notification or banner messages for all apps that require and record acknowledgment before granting access. You can granularly target these terms of use policies to specific users (Member or Guest). You can also customize them per application via Conditional Access policies.<br><br>**Conditional Access**<br>[What is Conditional Access in Microsoft Entra ID?](../conditional-access/overview.md)<br><br>**Terms of use**<br>[Microsoft Entra terms of use](../conditional-access/terms-of-use.md)<br>[View report of who has accepted and declined](../conditional-access/terms-of-use.md) |
| AC.L2-3.1.10<br><br>**Practice statement:** Use session lock with pattern-hiding displays to prevent access and viewing of data after a period of inactivity.<br><br>**Objectives:**<br>Determine if:<br>[a.] the period of inactivity after which the system initiates a session lock is defined;<br>[b.] access to the system and viewing of data is prevented by initiating a session lock after the defined period of inactivity; and<br>[c.] previously visible information is concealed via a pattern-hiding display after the defined period of inactivity. | Implement device lock by using a Conditional Access policy to restrict access to compliant or Microsoft Entra hybrid joined devices. Configure policy settings on the device to enforce device lock at the OS level with MDM solutions such as Intune. Microsoft Intune, Configuration Manager, or group policy objects can also be considered in hybrid deployments. For unmanaged devices, configure the Sign-In Frequency setting to force users to reauthenticate.<br>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br>[Grant controls in Conditional Access policy - Require Microsoft Entra hybrid joined device](../conditional-access/concept-conditional-access-grant.md)<br>[User sign-in frequency](../conditional-access/howto-conditional-access-session-lifetime.md)<br><br>Configure devices for maximum minutes of inactivity until the screen locks ([Android](/mem/intune/configuration/device-restrictions-android), [iOS](/mem/intune/configuration/device-restrictions-ios), [Windows 10](/mem/intune/configuration/device-restrictions-windows-10)). |
| AC.L2-3.1.11<br><br>**Practice statement:** Terminate (automatically) a user session after a defined condition.<br><br>**Objectives:**<br>Determine if:<br>[a.] conditions requiring a user session to terminate are defined; and<br>[b.] a user session is automatically terminated after any of the defined conditions occur. | Enable Continuous Access Evaluation (CAE) for all supported applications. For applications that don't support CAE, or for conditions not applicable to CAE, implement policies in Microsoft Defender for Cloud Apps to automatically terminate sessions when conditions occur. Additionally, configure Microsoft Entra ID Protection to evaluate user and sign-in risk. Use Conditional Access with Identity Protection to allow users to automatically remediate risk.<br>[Continuous access evaluation in Microsoft Entra ID](../conditional-access/concept-continuous-access-evaluation.md)<br>[Control cloud app usage by creating policies](/defender-cloud-apps/control-cloud-apps-with-policies)<br>[What is Microsoft Entra ID Protection?](../identity-protection/overview-identity-protection.md) |
-|AC.L2-3.1.12<br><br>**Practice statement:** Monitor and control remote access sessions.<br><br>**Objectives:**<br>Determine if:<br>[a.] remote access sessions are permitted;<br>[b.] the types of permitted remote access are identified;<br>[c.] remote access sessions are controlled; and<br>[d.] remote access sessions are monitored. | In today's world, users access cloud-based applications almost exclusively remotely from unknown or untrusted networks. It's critical to securing this pattern of access to adopt zero trust principals. To meet these controls requirements in a modern cloud world we must verify each access request explicitly, implement least privilege and assume breach.<br><br>Configure named locations to delineate internal vs external networks. Configure Conditional Access app control to route access via Microsoft Defender for Cloud Apps. Configure Defender for Cloud Apps to control and monitor all sessions.<br>[Zero Trust Deployment Guide for Microsoft Entra ID](https://www.microsoft.com/security/blog/2020/04/30/zero-trust-deployment-guide-azure-active-directory/)<br>[Location condition in Microsoft Entra Conditional Access](../conditional-access/location-condition.md)<br>[Deploy Cloud App Security Conditional Access App Control for Microsoft Entra apps](/defender-cloud-apps/proxy-deployment-aad)<br>[What is Microsoft Defender for Cloud Apps?](/cloud-app-security/what-is-cloud-app-security)<br>[Monitor alerts raised in Microsoft Defender for Cloud Apps](/microsoft-365/security/defender/investigate-alerts) |
+|AC.L2-3.1.12<br><br>**Practice statement:** Monitor and control remote access sessions.<br><br>**Objectives:**<br>Determine if:<br>[a.] remote access sessions are permitted;<br>[b.] the types of permitted remote access are identified;<br>[c.] remote access sessions are controlled; and<br>[d.] remote access sessions are monitored. | In today's world, users access cloud-based applications almost exclusively remotely from unknown or untrusted networks. To secure this pattern of access, it's critical to adopt Zero Trust principles. To meet these control requirements in a modern cloud world, we must verify each access request explicitly, implement least privilege, and assume breach.<br><br>Configure named locations to delineate internal vs external networks. Configure Conditional Access app control to route access via Microsoft Defender for Cloud Apps. Configure Defender for Cloud Apps to control and monitor all sessions.<br>[Zero Trust Deployment Guide for Microsoft Entra ID](https://www.microsoft.com/security/blog/2020/04/30/zero-trust-deployment-guide-azure-active-directory/)<br>[Location condition in Microsoft Entra Conditional Access](../conditional-access/location-condition.md)<br>[Deploy Cloud App Security Conditional Access App Control for Microsoft Entra apps](/defender-cloud-apps/proxy-deployment-aad)<br>[What is Microsoft Defender for Cloud Apps?](/defender-cloud-apps/what-is-defender-for-cloud-apps)<br>[Monitor alerts raised in Microsoft Defender for Cloud Apps](/microsoft-365/security/defender/investigate-alerts) |
| AC.L2-3.1.13<br><br>**Practice statement:** Employ cryptographic mechanisms to protect the confidentiality of remote access sessions.<br><br>**Objectives:**<br>Determine if:<br>[a.] cryptographic mechanisms to protect the confidentiality of remote access sessions are identified; and<br>[b.] cryptographic mechanisms to protect the confidentiality of remote access sessions are implemented. | All Microsoft Entra customer-facing web services are secured with the Transport Layer Security (TLS) protocol and are implemented using FIPS-validated cryptography.<br>[Microsoft Entra Data Security Considerations (microsoft.com)](https://azure.microsoft.com/resources/azure-active-directory-data-security-considerations/) |
-| AC.L2-3.1.14<br><br>**Practice statement:** Route remote access via managed access control points.<br><br>**Objectives:**<br>Determine if:<br>[a.] managed access control points are identified and implemented; and<br>[b.] remote access is routed through managed network access control points. | Configure named locations to delineate internal vs external networks. Configure Conditional Access app control to route access via Microsoft Defender for Cloud Apps. Configure Defender for Cloud Apps to control and monitor all sessions. Secure devices used by privileged accounts as part of the privileged access story.<br>[Location condition in Microsoft Entra Conditional Access](../conditional-access/location-condition.md)<br>[Session controls in Conditional Access policy](../conditional-access/concept-conditional-access-session.md)<br>[Securing privileged access overview](/security/compass/overview) |
+| AC.L2-3.1.14<br><br>**Practice statement:** Route remote access via managed access control points.<br><br>**Objectives:**<br>Determine if:<br>[a.] managed access control points are identified and implemented; and<br>[b.] remote access is routed through managed network access control points. | Configure named locations to delineate internal vs external networks. Configure Conditional Access app control to route access via Microsoft Defender for Cloud Apps. Configure Defender for Cloud Apps to control and monitor all sessions. Secure devices used by privileged accounts as part of the privileged access story.<br>[Location condition in Microsoft Entra Conditional Access](../conditional-access/location-condition.md)<br>[Session controls in Conditional Access policy](../conditional-access/concept-conditional-access-session.md)<br>[Securing privileged access overview](/security/privileged-access-workstations/overview) |
| AC.L2-3.1.15<br><br>**Practice statement:** Authorize remote execution of privileged commands and remote access to security-relevant information.<br><br>**Objectives:**<br>Determine if:<br>[a.] privileged commands authorized for remote execution are identified;<br>[b.] security-relevant information authorized to be accessed remotely is identified;<br>[c.] the execution of the identified privileged commands via remote access is authorized; and<br>[d.] access to the identified security-relevant information via remote access is authorized. | Conditional Access is the Zero Trust control plane to target policies for access to your apps when combined with authentication context. You can apply different policies in those apps. Secure devices used by privileged accounts as part of the privileged access story. Configure Conditional Access policies to require the use of these secured devices by privileged users when performing privileged commands.<br>[Cloud apps, actions, and authentication context in Conditional Access policy](../conditional-access/concept-conditional-access-cloud-apps.md)<br>[Securing privileged access overview](/security/privileged-access-workstations/overview)<br>[Filter for devices as a condition in Conditional Access policy](../conditional-access/concept-condition-filters-for-devices.md) |
| AC.L2-3.1.18<br><br>**Practice statement:** Control connection of mobile devices.<br><br>**Objectives:**<br>Determine if:<br>[a.] mobile devices that process, store, or transmit CUI are identified;<br>[b.] mobile device connections are authorized; and<br>[c.] mobile device connections are monitored and logged. | Configure device management policies via MDM (such as Microsoft Intune), Configuration Manager, or group policy objects (GPO) to enforce mobile device configuration and connection profile. Configure Conditional Access policies to enforce device compliance.<br><br>**Conditional Access**<br>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br>[Require Microsoft Entra hybrid joined device](../conditional-access/concept-conditional-access-grant.md)<br><br>**Intune**<br>[Device compliance policies in Microsoft Intune](/mem/intune/protect/device-compliance-get-started)<br>[What is app management in Microsoft Intune?](/mem/intune/apps/app-management) |
| AC.L2-3.1.19<br><br>**Practice statement:** Encrypt CUI on mobile devices and mobile computing platforms.<br><br>**Objectives:**<br>Determine if:<br>[a.] mobile devices and mobile computing platforms that process, store, or transmit CUI are identified; and<br>[b.] encryption is employed to protect CUI on identified mobile devices and mobile computing platforms. | **Managed Device**<br>Configure Conditional Access policies to enforce compliant or Microsoft Entra hybrid joined device and to ensure managed devices are configured appropriately via device management solution to encrypt CUI.<br><br>**Unmanaged Device**<br>Configure Conditional Access policies to require app protection policies.<br>[Grant controls in Conditional Access policy - Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br>[Grant controls in Conditional Access policy - Require Microsoft Entra hybrid joined device](../conditional-access/concept-conditional-access-grant.md)<br>[Grant controls in Conditional Access policy - Require app protection policy](../conditional-access/concept-conditional-access-grant.md) |
active-directory Configure Cmmc Level 2 Additional Controls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/configure-cmmc-level-2-additional-controls.md
The following table provides a list of practice statement and objectives, and Mi
| CMMC practice statement and objectives | Microsoft Entra guidance and recommendations |
| - | - |
-| SC.L2-3.13.3<br><br>**Practice statement:** Separate user functionality form system management functionality. <br><br>**Objectives:**<br>Determine if:<br>[a.] user functionality is identified;<br>[b.] system management functionality is identified; and<br>[c.] user functionality is separated from system management functionality. | Maintain separate user accounts in Microsoft Entra ID for everyday productivity use and administrative or system/privileged management. Privileged accounts should be cloud-only or managed accounts and not synchronized from on-premises to protect the cloud environment from on-premises compromise. System/privileged access should only be permitted from a security hardened privileged access workstation (PAW). Configure Conditional Access device filters to restrict access to administrative applications from PAWs that are enabled using Azure Virtual Desktops.<br>[Why are privileged access devices important](/security/compass/privileged-access-devices)<br>[Device Roles and Profiles](/security/compass/privileged-access-devices)<br>[Filter for devices as a condition in Conditional Access policy](../conditional-access/concept-condition-filters-for-devices.md)<br>[Azure Virtual Desktop](https://azure.microsoft.com/products/virtual-desktop/) |
+| SC.L2-3.13.3<br><br>**Practice statement:** Separate user functionality from system management functionality. <br><br>**Objectives:**<br>Determine if:<br>[a.] user functionality is identified;<br>[b.] system management functionality is identified; and<br>[c.] user functionality is separated from system management functionality. | Maintain separate user accounts in Microsoft Entra ID for everyday productivity use and administrative or system/privileged management. Privileged accounts should be cloud-only or managed accounts and not synchronized from on-premises to protect the cloud environment from on-premises compromise. System/privileged access should only be permitted from a security hardened privileged access workstation (PAW). Configure Conditional Access device filters to restrict access to administrative applications from PAWs that are enabled using Azure Virtual Desktops.<br>[Why are privileged access devices important](/security/compass/privileged-access-devices)<br>[Device Roles and Profiles](/security/privileged-access-workstations/privileged-access-devices)<br>[Filter for devices as a condition in Conditional Access policy](../conditional-access/concept-condition-filters-for-devices.md)<br>[Azure Virtual Desktop](https://azure.microsoft.com/products/virtual-desktop/) |
| SC.L2-3.13.4<br><br>**Practice statement:** Prevent unauthorized and unintended information transfer via shared system resources.<br><br>**Objectives:**<br>Determine if:<br>[a.] unauthorized and unintended information transfer via shared system resources is prevented. | Configure device management policies via MDM (such as Microsoft Intune), Configuration Manager, or group policy objects (GPO) to ensure devices are compliant with system hardening procedures. Include compliance with company policy regarding software patches to prevent attackers from exploiting flaws.<br><br>Configure Conditional Access policies to enforce device compliance.<br><br>**Conditional Access**<br>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br>[Require Microsoft Entra hybrid joined device](../conditional-access/concept-conditional-access-grant.md)<br><br>**Intune**<br>[Device compliance policies in Microsoft Intune](/mem/intune/protect/device-compliance-get-started) |
| SC.L2-3.13.13<br><br>**Practice statement:** Control and monitor the use of mobile code.<br><br>**Objectives:**<br>Determine if:<br>[a.] use of mobile code is controlled; and<br>[b.] use of mobile code is monitored. | Configure device management policies via MDM (such as Microsoft Intune), Configuration Manager, or group policy objects (GPO) to disable the use of mobile code. Where use of mobile code is required, monitor its use with endpoint security such as Microsoft Defender for Endpoint.<br><br>Configure Conditional Access policies to enforce device compliance.<br><br>**Conditional Access**<br>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br>[Require Microsoft Entra hybrid joined device](../conditional-access/concept-conditional-access-grant.md)<br><br>**Intune**<br>[Device compliance policies in Microsoft Intune](/mem/intune/protect/device-compliance-get-started)<br><br>**Defender for Endpoint**<br>[Microsoft Defender for Endpoint](/microsoft-365/security/defender-endpoint/microsoft-defender-endpoint?view=o365-worldwide&preserve-view=true) |
active-directory Fedramp Access Controls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/fedramp-access-controls.md
Each row in the following table provides prescriptive guidance to help you devel
| **AC-6(7)**<p><p>**The organization:**<br>**(a.)** Reviews [*FedRAMP Assignment: at a minimum, annually*] the privileges assigned to [*FedRAMP Assignment: all users with privileges*] to validate the need for such privileges; and<br>**(b.)** Reassigns or removes privileges, if necessary, to correctly reflect organizational mission/business needs. | **Review and validate all users with privileged access every year. Ensure privileges are reassigned (or removed if necessary) to align with organizational mission and business requirements.**<p>Use Microsoft Entra entitlement management with access reviews for privileged users to verify if privileged access is required. <p>Access reviews<br><li>[What is Microsoft Entra entitlement management?](../governance/entitlement-management-overview.md)<br><li>[Create an access review of Microsoft Entra roles in Privileged Identity Management](../privileged-identity-management/pim-create-roles-and-resource-roles-review.md)<br><li>[Review access of an access package in Microsoft Entra entitlement management](../governance/entitlement-management-access-reviews-review-access.md) |
| **AC-7 Unsuccessful Login Attempts**<p><p>**The organization:**<br>**(a.)** Enforces a limit of [*FedRAMP Assignment: not more than three (3)*] consecutive invalid logon attempts by a user during a [*FedRAMP Assignment: fifteen (15) minutes*]; and<br>**(b.)** Automatically [Selection: locks the account/node for a [*FedRAMP Assignment: minimum of three (3) hours or until unlocked by an administrator]; delays next logon prompt according to [Assignment: organization-defined delay algorithm*]] when the maximum number of unsuccessful attempts is exceeded. | **Enforce a limit of no more than three consecutive failed login attempts on customer-deployed resources within a 15-minute period. Lock the account for a minimum of three hours or until unlocked by an administrator.**<p>Enable custom smart lockout settings. Configure lockout threshold and lockout duration in seconds to implement these requirements. <p>Smart lockout<br><li>[Protect user accounts from attacks with Microsoft Entra smart lockout](../authentication/howto-password-smart-lockout.md)<br><li>[Manage Microsoft Entra smart lockout values](../authentication/howto-password-smart-lockout.md) |
| **AC-8 System Use Notification**<p><p>**The information system:**<br>**(a.)** Displays to users [*Assignment: organization-defined system use notification message or banner (FedRAMP Assignment: see additional Requirements and Guidance)*] before granting access to the system that provides privacy and security notices consistent with applicable federal laws, Executive Orders, directives, policies, regulations, standards, and guidance and states that:<br>(1.) Users are accessing a U.S. Government information system;<br>(2.) Information system usage may be monitored, recorded, and subject to audit;<br>(3.) Unauthorized use of the information system is prohibited and subject to criminal and civil penalties; and<br>(4.) Use of the information system indicates consent to monitoring and recording;<p><p>**(b.)** Retains the notification message or banner on the screen until users acknowledge the usage conditions and take explicit actions to log on to or further access the information system; and<p><p>**(c.)** For publicly accessible systems:<br>(1.) Displays system use information [*Assignment: organization-defined conditions (FedRAMP Assignment: see additional Requirements and Guidance)*], before granting further access;<br>(2.) Displays references, if any, to monitoring, recording, or auditing that are consistent with privacy accommodations for such systems that generally prohibit those activities; and<br>(3.) Includes a description of the authorized uses of the system.<p><p>**AC-8 Additional FedRAMP Requirements and Guidance:**<br>**Requirement:** The service provider shall determine elements of the cloud environment that require the System Use Notification control. The elements of the cloud environment that require System Use Notification are approved and accepted by the JAB/AO.<br>**Requirement:** The service provider shall determine how System Use Notification is going to be verified and provide appropriate periodicity of the check. The System Use Notification verification and periodicity are approved and accepted by the JAB/AO.<br>**Guidance:** If performed as part of a Configuration Baseline check, then the % of items requiring setting that are checked and that pass (or fail) check can be provided.<br>**Requirement:** If not performed as part of a Configuration Baseline check, then there must be documented agreement on how to provide results of verification and the necessary periodicity of the verification by the service provider. The documented agreement on how to provide verification of the results are approved and accepted by the JAB/AO. | **Display and require user acknowledgment of privacy and security notices before granting access to information systems.**<p>With Microsoft Entra ID, you can deliver notification or banner messages for all apps that require and record acknowledgment before granting access. You can granularly target these terms of use policies to specific users (Member or Guest). You can also customize them per application via Conditional Access policies.<p>Terms of use<br><li>[Microsoft Entra terms of use](../conditional-access/terms-of-use.md)<br><li>[View report of who has accepted and declined](../conditional-access/terms-of-use.md) |
-| **AC-10 Concurrent Session Control**<br>The information system limits the number of concurrent sessions for each [*Assignment: organization-defined account and/or account type*] to [*FedRAMP Assignment: three (3) sessions for privileged access and two (2) sessions for non-privileged access*].|**Limit concurrent sessions to three sessions for privileged access and two for nonprivileged access.** <p>Currently, users connect from multiple devices, sometimes simultaneously. Limiting concurrent sessions leads to a degraded user experience and provides limited security value. A better approach to address the intent behind this control is to adopt a zero-trust security posture. Conditions are explicitly validated before a session is created and continually validated throughout the life of a session. <p>In addition, use the following compensating controls. <p>Use Conditional Access policies to restrict access to compliant devices. Configure policy settings on the device to enforce user sign-in restrictions at the OS level with MDM solutions such as Intune. Endpoint Manager or group policy objects can also be considered in hybrid deployments.<p> Use Privileged Identity Management to further restrict and control privileged accounts. <p> Configure smart account lockout for invalid sign-in attempts.<p>**Implementation guidance** <p>Zero trust<br><li> [Securing identity with Zero Trust](/security/zero-trust/identity)<br><li>[Continuous access evaluation in Microsoft Entra ID](../conditional-access/concept-continuous-access-evaluation.md)<p>Conditional Access<br><li>[What is Conditional Access in Microsoft Entra ID?](../conditional-access/overview.md)<br><li>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br><li>[User sign-in frequency](../conditional-access/howto-conditional-access-session-lifetime.md)<p>Device policies<br><li>[Other smart card Group Policy settings and registry keys](/windows/security/identity-protection/smart-cards/smart-card-group-policy-and-registry-settings)<br><li>[Microsoft Endpoint Manager overview](/mem/endpoint-manager-overview)<p>Resources<br><li>[What is Microsoft Entra Privileged Identity Management?](../privileged-identity-management/pim-configure.md)<br><li>[Protect user accounts from attacks with Microsoft Entra smart lockout](../authentication/howto-password-smart-lockout.md)<p>See AC-12 for more session reevaluation and risk mitigation guidance. |
+| **AC-10 Concurrent Session Control**<br>The information system limits the number of concurrent sessions for each [*Assignment: organization-defined account and/or account type*] to [*FedRAMP Assignment: three (3) sessions for privileged access and two (2) sessions for non-privileged access*].|**Limit concurrent sessions to three sessions for privileged access and two for nonprivileged access.** <p>Currently, users connect from multiple devices, sometimes simultaneously. Limiting concurrent sessions leads to a degraded user experience and provides limited security value. A better approach to address the intent behind this control is to adopt a zero-trust security posture. Conditions are explicitly validated before a session is created and continually validated throughout the life of a session. <p>In addition, use the following compensating controls. <p>Use Conditional Access policies to restrict access to compliant devices. Configure policy settings on the device to enforce user sign-in restrictions at the OS level with MDM solutions such as Intune. Endpoint Manager or group policy objects can also be considered in hybrid deployments.<p> Use Privileged Identity Management to further restrict and control privileged accounts. <p> Configure smart account lockout for invalid sign-in attempts.<p>**Implementation guidance** <p>Zero trust<br><li> [Securing identity with Zero Trust](/security/zero-trust/deploy/identity)<br><li>[Continuous access evaluation in Microsoft Entra ID](../conditional-access/concept-continuous-access-evaluation.md)<p>Conditional Access<br><li>[What is Conditional Access in Microsoft Entra ID?](../conditional-access/overview.md)<br><li>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br><li>[User sign-in frequency](../conditional-access/howto-conditional-access-session-lifetime.md)<p>Device policies<br><li>[Other smart card Group Policy settings and registry keys](/windows/security/identity-protection/smart-cards/smart-card-group-policy-and-registry-settings)<br><li>[Microsoft Endpoint Manager overview](/mem/endpoint-manager-overview)<p>Resources<br><li>[What is Microsoft Entra Privileged Identity Management?](../privileged-identity-management/pim-configure.md)<br><li>[Protect user accounts from attacks with Microsoft Entra smart lockout](../authentication/howto-password-smart-lockout.md)<p>See AC-12 for more session reevaluation and risk mitigation guidance. |
| **AC-11 Session Lock**<br>**The information system:**<br>**(a)** Prevents further access to the system by initiating a session lock after [*FedRAMP Assignment: fifteen (15) minutes*] of inactivity or upon receiving a request from a user; and<br>**(b)** Retains the session lock until the user reestablishes access using established identification and authentication procedures.<p><p>**AC-11(1)**<br>The information system conceals, via the session lock, information previously visible on the display with a publicly viewable image. | **Implement a session lock after a 15-minute period of inactivity or upon receiving a request from a user. Retain the session lock until the user reauthenticates. Conceal previously visible information when a session lock is initiated.**<p> Implement device lock by using a Conditional Access policy to restrict access to compliant devices. Configure policy settings on the device to enforce device lock at the OS level with MDM solutions such as Intune. Endpoint Manager or group policy objects can also be considered in hybrid deployments. For unmanaged devices, configure the Sign-In Frequency setting to force users to reauthenticate.<p>Conditional Access<br><li>[Require device to be marked as compliant](../conditional-access/concept-conditional-access-grant.md)<br><li>[User sign-in frequency](../conditional-access/howto-conditional-access-session-lifetime.md)<p>MDM policy<br><li>Configure devices for maximum minutes of inactivity until the screen locks ([Android](/mem/intune/configuration/device-restrictions-android), [iOS](/mem/intune/configuration/device-restrictions-ios), [Windows 10](/mem/intune/configuration/device-restrictions-windows-10)). |
| **AC-12 Session Termination**<br>The information system automatically terminates a user session after [*Assignment: organization-defined conditions or trigger events requiring session disconnect*].| **Automatically terminate user sessions when organization-defined conditions or trigger events occur.**<p>Implement automatic user session reevaluation with Microsoft Entra features such as risk-based Conditional Access and continuous access evaluation. You can implement inactivity conditions at a device level as described in AC-11.<p>Resources<br><li>[Sign-in risk-based Conditional Access](../conditional-access/howto-conditional-access-policy-risk.md)<br><li>[User risk-based Conditional Access](../conditional-access/howto-conditional-access-policy-risk-user.md)<br><li>[Continuous access evaluation](../conditional-access/concept-continuous-access-evaluation.md) |
| **AC-12(1)**<br>**The information system:**<br>**(a.)** Provides a logout capability for user-initiated communications sessions whenever authentication is used to gain access to [Assignment: organization-defined information resources]; and<br>**(b.)** Displays an explicit logout message to users indicating the reliable termination of authenticated communications sessions.<p><p>**AC-8 Additional FedRAMP Requirements and Guidance:**<br>**Guidance:** Testing for logout functionality (OTG-SESS-006) [Testing for logout functionality](https://owasp.org/www-project-web-security-testing-guide/latest/4-Web_Application_Security_Testing/06-Session_Management_Testing/06-Testing_for_Logout_Functionality) | **Provide a logout capability for all sessions and display an explicit logout message.** <p>All Microsoft Entra ID surfaced web interfaces provide a logout capability for user-initiated communications sessions. When SAML applications are integrated with Microsoft Entra ID, implement single sign-out. <p>Logout capability<br><li>When the user selects [Sign-out everywhere](https://aka.ms/mysignins), all current issued tokens are revoked. <p>Display message<br>Microsoft Entra ID automatically displays a message after user-initiated logout.<br><p>![Screenshot that shows an access control message.](medi) |
active-directory Hipaa Audit Controls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/hipaa-audit-controls.md
The following content provides the safeguard controls guidance from HIPAA. Find
| Recommendation | Action |
| - | - |
-| Scan environment for ePHI data | [Microsoft Purview](../../purview/overview.md) can be enabled in audit mode to scan what ePHI is sitting in the data estate, and the resources that are being used to store that data. This information helps in establishing data classification and labeling the sensitivity of the data.</br>In addition, using [Content Explorer](/microsoft-365/compliance/data-classification-content-explorer) provides visibility into where the sensitive data is located. This information helps start the labeling journey from manually applying labeling or labeling recommendations on the client-side to service-side autolabeling. |
+| Scan environment for ePHI data | [Microsoft Purview](/purview/governance-solutions-overview) can be enabled in audit mode to scan what ePHI is sitting in the data estate, and the resources that are being used to store that data. This information helps in establishing data classification and labeling the sensitivity of the data.</br>In addition, using [Content Explorer](/purview/data-classification-content-explorer) provides visibility into where the sensitive data is located. This information helps start the labeling journey from manually applying labeling or labeling recommendations on the client-side to service-side autolabeling. |
| Enable Priva to safeguard Microsoft 365 data | [Microsoft Priva](/privacy/priva/priva-overview) evaluates ePHI data stored in Microsoft 365, scanning for and evaluating sensitive information. |
| Enable Azure Security benchmark | [Microsoft cloud security benchmark](/security/benchmark/azure/introduction) provides control for data protection across Azure services and provides a baseline for implementation for services that store ePHI. Audit mode provides those recommendations and remediation steps to secure the environment. |
| Enable Defender Vulnerability Management | [Microsoft Defender Vulnerability Management](/azure/defender-for-cloud/remediate-vulnerability-findings-vm) is a built-in module in **Microsoft Defender for Endpoint**. The module helps you identify and discover vulnerabilities and misconfigurations in real time. The module also helps you prioritize findings, presenting them in a dashboard and in reports across devices, VMs, and databases. |
active-directory Memo 22 09 Multi Factor Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/memo-22-09-multi-factor-authentication.md
Some agencies are modernizing their authentication credentials. There are multip
* See, [Overview of Microsoft Entra certificate-based authentication](../authentication/concept-certificate-based-authentication.md)
* **Windows Hello for Business** has phishing-resistant multifactor authentication
* See, [Windows Hello for Business Deployment Overview](/windows/security/identity-protection/hello-for-business/hello-deployment-guide)
- * See, [Windows Hello for Business](/windows/security/identity-protection/hello-for-business/hello-overview)
+ * See, [Windows Hello for Business](/windows/security/identity-protection/hello-for-business/)
### Protection from external phishing
active-directory Nist About Authenticator Assurance Levels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/nist-about-authenticator-assurance-levels.md
The standard includes AAL requirements for the following categories:
In general, AAL1 isn't recommended because it accepts password-only solutions, the most easily compromised authentication. For more information, see the blog post, [Your Pa$$word doesn't matter](https://techcommunity.microsoft.com/t5/azure-active-directory-identity/your-pa-word-doesn-t-matter/ba-p/731984).
-While NIST doesn't require verifier impersonation (credential phishing) resistance until AAL3, we advise you to address this threat at all levels. You can select authenticators that provide verifier impersonation resistance, such as requiring devices are joined to Microsoft Entra ID or hybrid Microsoft Entra ID. If you're using Office 365, you can use Office 365 Advanced Threat Protection, and its [anti-phishing policies](/microsoft-365/security/office-365-security/set-up-anti-phishing-policies).
+While NIST doesn't require verifier impersonation (credential phishing) resistance until AAL3, we advise you to address this threat at all levels. You can select authenticators that provide verifier impersonation resistance, such as requiring devices are joined to Microsoft Entra ID or hybrid Microsoft Entra ID. If you're using Office 365, you can use Office 365 Advanced Threat Protection, and its [anti-phishing policies](/microsoft-365/security/office-365-security/anti-phishing-policies-about).
As you evaluate the needed NIST AAL for your organization, consider whether your entire organization must meet NIST standards. If there are specific user groups and resources that can be segregated, you can apply NIST AAL configurations to those user groups and resources.
active-directory Nist Authenticator Assurance Level 3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/nist-authenticator-assurance-level-3.md
Authenticators are required to be:
Microsoft Entra joined and Microsoft Entra hybrid joined devices meet this requirement when:
-* You run [Windows in a FIPS-140 approved mode](/windows/security/threat-protection/fips-140-validation)
+* You run [Windows in a FIPS-140 approved mode](/windows/security/security-foundations/certification/fips-140-validation)
* On a machine with a TPM that's FIPS 140 Level 1 Overall, or higher, with FIPS 140 Level 3 Physical Security
active-directory Pci Requirement 8 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/pci-requirement-8.md
For more information about Microsoft Entra authentication methods that meet PCI
|**8.3.1** All user access to system components for users and administrators is authenticated via at least one of the following authentication factors: </br> Something you know, such as a password or passphrase. </br> Something you have, such as a token device or smart card. </br> Something you are, such as a biometric element.|[Microsoft Entra ID requires passwordless methods to meet the PCI requirements](https://microsoft.sharepoint-df.com/:w:/t/MicrosoftTechnicalContributorProgram-PCIDSSDocumentation/ETlhHVraW_NPsMGM-mFZlfgB4OPry8BxGizhQ4qItfGCFw?e=glcZ8y) </br> See holistic passwordless deployment. [Plan a passwordless authentication deployment in Microsoft Entra ID](../authentication/howto-authentication-passwordless-deployment.md)|
|**8.3.2** Strong cryptography is used to render all authentication factors unreadable during transmission and storage on all system components.|Cryptography used by Microsoft Entra ID is compliant with [PCI definition of Strong Cryptography](https://www.pcisecuritystandards.org/glossary/#glossary-s). [Microsoft Entra Data protection considerations](../fundamentals/data-protection-considerations.md)|
|**8.3.3** User identity is verified before modifying any authentication factor.|Microsoft Entra ID requires users to authenticate to update their authentication methods using self-service, such as mysecurityinfo portal and the self-service password reset (SSPR) portal. [Set up security info from a sign-in page](https://support.microsoft.com/en-us/topic/28180870-c256-4ebf-8bd7-5335571bf9a8) </br> [Common Conditional Access policy: Securing security info registration](../conditional-access/howto-conditional-access-policy-registration.md) </br> [Microsoft Entra self-service password reset](../authentication/concept-sspr-howitworks.md) </br> Administrators with privileged roles can modify authentication factors: Global, Password, User, Authentication, and Privileged Authentication. [Least privileged roles by task in Microsoft Entra ID](../roles/delegate-by-task.md). Microsoft recommends you enable JIT access and governance for privileged access using [Microsoft Entra Privileged Identity Management](../privileged-identity-management/pim-configure.md)|
-|**8.3.4** Invalid authentication attempts are limited by: </br> Locking out the user ID after not more than 10 attempts. </br> Setting the lockout duration to a minimum of 30 minutes or until the user's identity is confirmed.|Deploy Windows Hello for Business for Windows devices that support hardware Trusted Platform Modules (TPM) 2.0 or higher. </br> For Windows Hello for Business, lockout relates to the device. The gesture, PIN, or biometric, unlocks access to the local TPM. Administrators configure the lockout behavior with GPO or Intune policies. [TPM Group Policy settings](/windows/security/information-protection/tpm/trusted-platform-module-services-group-policy-settings) </br> [Manage Windows Hello for Business on devices at the time devices enroll with Intune](/mem/intune/protect/windows-hello) </br> [TPM fundamentals](/windows/security/information-protection/tpm/tpm-fundamentals) </br> Windows Hello for Business works for on-premises authentication to Active Directory and cloud resources on Microsoft Entra ID. </br> For FIDO2 security keys, brute-force protection is related to the key. The gesture, PIN or biometric, unlocks access to the local key storage. Administrators configure Microsoft Entra ID to allow registration of FIDO2 security keys from manufacturers that align to PCI requirements. [Enable passwordless security key sign-in](../authentication/howto-authentication-passwordless-security-key.md) </br></br> **Microsoft Authenticator App** </br> To mitigate brute force attacks using Microsoft Authenticator app passwordless sign in, enable number matching and more context. </br> Microsoft Entra ID generates a random number in the authentication flow. The user types it in the authenticator app. The mobile app authentication prompt shows the location, the request IP address, and the request application. [How to use number matching in MFA notifications](../authentication/how-to-mfa-number-match.md) </br> [How to use additional context in Microsoft Authenticator notifications](../authentication/how-to-mfa-additional-context.md)|
+|**8.3.4** Invalid authentication attempts are limited by: </br> Locking out the user ID after not more than 10 attempts. </br> Setting the lockout duration to a minimum of 30 minutes or until the user's identity is confirmed.|Deploy Windows Hello for Business for Windows devices that support hardware Trusted Platform Modules (TPM) 2.0 or higher. </br> For Windows Hello for Business, lockout relates to the device. The gesture, PIN, or biometric, unlocks access to the local TPM. Administrators configure the lockout behavior with GPO or Intune policies. [TPM Group Policy settings](/windows/security/hardware-security/tpm/trusted-platform-module-services-group-policy-settings) </br> [Manage Windows Hello for Business on devices at the time devices enroll with Intune](/mem/intune/protect/windows-hello) </br> [TPM fundamentals](/windows/security/hardware-security/tpm/tpm-fundamentals) </br> Windows Hello for Business works for on-premises authentication to Active Directory and cloud resources on Microsoft Entra ID. </br> For FIDO2 security keys, brute-force protection is related to the key. The gesture, PIN or biometric, unlocks access to the local key storage. Administrators configure Microsoft Entra ID to allow registration of FIDO2 security keys from manufacturers that align to PCI requirements. [Enable passwordless security key sign-in](../authentication/howto-authentication-passwordless-security-key.md) </br></br> **Microsoft Authenticator App** </br> To mitigate brute force attacks using Microsoft Authenticator app passwordless sign in, enable number matching and more context. </br> Microsoft Entra ID generates a random number in the authentication flow. The user types it in the authenticator app. The mobile app authentication prompt shows the location, the request IP address, and the request application. [How to use number matching in MFA notifications](../authentication/how-to-mfa-number-match.md) </br> [How to use additional context in Microsoft Authenticator notifications](../authentication/how-to-mfa-additional-context.md)|
|**8.3.5** If passwords/passphrases are used as authentication factors to meet Requirement 8.3.1, they're set and reset for each user as follows: </br> Set to a unique value for first-time use and upon reset. </br> Forced to be changed immediately after the first use.|Not applicable to Microsoft Entra ID.|
|**8.3.6** If passwords/passphrases are used as authentication factors to meet Requirement 8.3.1, they meet the following minimum level of complexity: </br> A minimum length of 12 characters (or IF the system doesn't support 12 characters, a minimum length of eight characters). </br> Contain both numeric and alphabetic characters.|Not applicable to Microsoft Entra ID.|
|**8.3.7** Individuals aren't allowed to submit a new password/passphrase that is the same as any of the last four passwords/passphrases used.|Not applicable to Microsoft Entra ID.|
active-directory Admin Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/admin-api.md
The API is protected through Microsoft Entra ID and uses OAuth2 bearer tokens. T
### User bearer tokens
-The app registration needs to have the API Permission for `Verifiable Credentials Service Admin` and then when acquiring the access token the app should use scope `6a8b4b39-c021-437c-b060-5a14a3fd65f3/full_access`. The access token must be for a user with the [global administrator](../../active-directory/roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../../active-directory/roles/permissions-reference.md#authentication-policy-administrator) role. A user with role [global reader](../../active-directory/roles/permissions-reference.md#global-reader) can perform read-only API calls.
+The app registration needs to have the API Permission for `Verifiable Credentials Service Admin` and then when acquiring the access token the app should use scope `6a8b4b39-c021-437c-b060-5a14a3fd65f3/full_access`. The access token must be for a user with the [global administrator](../roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../roles/permissions-reference.md#authentication-policy-administrator) role. A user with role [global reader](../roles/permissions-reference.md#global-reader) can perform read-only API calls.
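To make the delegated-token requirement concrete, here is a minimal sketch (Python standard library only) of how the documented `full_access` scope would be composed into an OAuth2 authorization-code request URL. The tenant ID, client ID, and redirect URI are hypothetical placeholders; only the scope string comes from the text above.

```python
from urllib.parse import urlencode

# Hypothetical values -- substitute your own tenant and app registration.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"

# The documented Admin API scope: the service app ID plus /full_access.
ADMIN_API_SCOPE = "6a8b4b39-c021-437c-b060-5a14a3fd65f3/full_access"

def build_authorize_url(tenant_id: str, client_id: str, redirect_uri: str) -> str:
    """Compose the OAuth2 authorization-code request URL for a delegated token."""
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "scope": ADMIN_API_SCOPE,
    }
    return (
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/authorize?"
        + urlencode(params)
    )

url = build_authorize_url(TENANT_ID, CLIENT_ID, "http://localhost:5000/callback")
```

The signed-in user completing this flow must hold one of the roles listed above for the resulting token to be accepted.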
### Application bearer tokens
The `Verifiable Credentials Service Admin` service supports the following applic
| VerifiableCredential.Credential.Revoke | Permission to [revoke a previously issued credential](how-to-issuer-revoke.md) |
| VerifiableCredential.Network.Read | Permission to read entries from the [Verified ID Network](vc-network-api.md) |
-The app registration needs to have the API Permission for `Verifiable Credentials Service Admin` and permissions required from the above table. When acquiring the access token, via the [client credentials flow](../../active-directory/develop/v2-oauth2-client-creds-grant-flow.md), the app should use scope `6a8b4b39-c021-437c-b060-5a14a3fd65f3/.default`.
+The app registration needs to have the API Permission for `Verifiable Credentials Service Admin` and permissions required from the above table. When acquiring the access token, via the [client credentials flow](../develop/v2-oauth2-client-creds-grant-flow.md), the app should use scope `6a8b4b39-c021-437c-b060-5a14a3fd65f3/.default`.
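The client credentials request described above can be sketched with the standard library alone. The tenant ID, client ID, and secret below are hypothetical placeholders; the `.default` scope string is the one documented above, and it rolls up whichever application permissions from the table were granted to the app.

```python
from urllib.parse import urlencode

# Hypothetical app registration values -- replace with your own.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "your-client-secret"  # placeholder; keep real secrets out of code

# For application tokens, the scope is the service app ID plus /.default.
SCOPE = "6a8b4b39-c021-437c-b060-5a14a3fd65f3/.default"

def build_token_request(tenant_id: str) -> tuple:
    """Return the token endpoint URL and form-encoded client credentials body."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    }).encode()
    return url, body

# POSTing this body (e.g., with urllib.request) returns JSON containing
# the access_token to present as a bearer token.
url, body = build_token_request(TENANT_ID)
```

In production, prefer a maintained library such as MSAL over hand-rolled token requests.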
## Onboarding
active-directory Get Started Request Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/get-started-request-api.md
To get an access token, your app must be registered with the Microsoft identity
### Get an access token
-Use the [OAuth 2.0 client credentials grant flow](../../active-directory/develop/v2-oauth2-client-creds-grant-flow.md) to acquire the access token by using the Microsoft identity platform. Use a trusted library for this purpose. In this tutorial, we use the Microsoft Authentication Library [MSAL](../../active-directory/develop/msal-overview.md). MSAL simplifies adding authentication and authorization to an app that can call a secure web API.
+Use the [OAuth 2.0 client credentials grant flow](../develop/v2-oauth2-client-creds-grant-flow.md) to acquire the access token by using the Microsoft identity platform. Use a trusted library for this purpose. In this tutorial, we use the Microsoft Authentication Library [MSAL](../develop/msal-overview.md). MSAL simplifies adding authentication and authorization to an app that can call a secure web API.
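Once MSAL returns an access token, the app presents it as a bearer token on each Request Service API call. A minimal standard-library sketch of that step follows; the endpoint URL and payload shown are illustrative assumptions, not the definitive API shape, and the token value is a placeholder.

```python
import json
import urllib.request

# Placeholder token -- in practice this comes from MSAL's client credentials
# flow; never hard-code real tokens.
access_token = "eyJ0eXAi..."

# Illustrative endpoint (an assumption for this sketch).
endpoint = "https://verifiedid.did.msidentity.com/v1.0/verifiableCredentials/createIssuanceRequest"

def build_request(url: str, token: str, payload: dict) -> urllib.request.Request:
    """Attach the bearer token and a JSON body to an API call."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical payload fields for illustration only.
req = build_request(endpoint, access_token, {"registration": {"clientName": "test"}})
```

Sending `req` with `urllib.request.urlopen` would perform the call; the sketch stops before the network round trip.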
# [HTTP](#tab/http)
active-directory Verifiable Credentials Configure Tenant Quick https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/verifiable-credentials-configure-tenant-quick.md
Specifically, you learn how to:
## Prerequisites
-- Ensure that you have the [global administrator](../../active-directory/roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../../active-directory/roles/permissions-reference.md#authentication-policy-administrator) permission for the directory you want to configure. If you're not the global administrator, you need the [application administrator](../../active-directory/roles/permissions-reference.md#application-administrator) permission to complete the app registration including granting admin consent.
+- Ensure that you have the [global administrator](../roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../roles/permissions-reference.md#authentication-policy-administrator) permission for the directory you want to configure. If you're not the global administrator, you need the [application administrator](../roles/permissions-reference.md#application-administrator) permission to complete the app registration including granting admin consent.
- Ensure that you have a custom domain registered for the Microsoft Entra tenant. If you don't have one registered, the setup defaults to the manual setup experience.

## Set up Verified ID
active-directory Verifiable Credentials Configure Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/verifiable-credentials-configure-tenant.md
The following diagram illustrates the Verified ID architecture and the component
## Prerequisites
- You need an Azure tenant with an active subscription. If you don't have an Azure subscription, [create one for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-- Ensure that you have the [global administrator](../../active-directory/roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../../active-directory/roles/permissions-reference.md#authentication-policy-administrator) permission for the directory you want to configure. If you're not the global administrator, you need the [application administrator](../../active-directory/roles/permissions-reference.md#application-administrator) permission to complete the app registration including granting admin consent.
+- Ensure that you have the [global administrator](../roles/permissions-reference.md#global-administrator) or the [authentication policy administrator](../roles/permissions-reference.md#authentication-policy-administrator) permission for the directory you want to configure. If you're not the global administrator, you need the [application administrator](../roles/permissions-reference.md#application-administrator) permission to complete the app registration including granting admin consent.
- Ensure that you have the [contributor](/azure/role-based-access-control/built-in-roles#contributor) role for the Azure subscription or the resource group where you are deploying Azure Key Vault. ## Create a key vault
active-directory Verifiable Credentials Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/verifiable-credentials-faq.md
The tutorials for deploying and running the [samples](verifiable-credentials-con
- Dotnet - [Publish to App Service](/azure/app-service/quickstart-dotnetcore?tabs=net60&pivots=development-environment-vs#2-publish-your-web-app) - Node - [Deploy to App Service](/azure/app-service/quickstart-nodejs?tabs=linux&pivots=development-environment-vscode#deploy-to-azure)-- Java - [Deploy to App Service](../../app-service/quickstart-java.md?tabs=javase&pivots=platform-linux-development-environment-maven#4deploy-the-app). You need to add the maven plugin for Azure App Service to the sample.
+- Java - [Deploy to App Service](/azure/app-service/quickstart-java?tabs=javase&pivots=platform-linux-development-environment-maven#4deploy-the-app). You need to add the maven plugin for Azure App Service to the sample.
- Python - [Deploy using Visual Studio Code](/azure/app-service/quickstart-python?tabs=flask%2Cwindows%2Cazure-cli%2Cvscode-deploy%2Cdeploy-instructions-azportal%2Cterminal-bash%2Cdeploy-instructions-zip-azcli#3deploy-your-application-code-to-azure) Regardless of which language sample you're using, it picks up the Azure App Service hostname `https://something.azurewebsites.net` and uses it as the public endpoint. You don't need to configure anything extra to make it work. If you make changes to the code or configuration, you need to redeploy the sample to Azure App Service. Troubleshooting/debugging won't be as easy as running the sample on your local machine, where traces in the console window show you errors, but you can achieve almost the same by using the [Log Stream](/azure/app-service/troubleshoot-diagnostic-logs#stream-logs).
active-directory Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/whats-new.md
Instructions for setting up place of work verification on LinkedIn available [he
## February 2023 -- *Public preview* - Entitlement Management customers can now create access packages that leverage Microsoft Entra Verified ID [learn more](../../active-directory/governance/entitlement-management-verified-id-settings.md)
+- *Public preview* - Entitlement Management customers can now create access packages that leverage Microsoft Entra Verified ID [learn more](../governance/entitlement-management-verified-id-settings.md)
- The Request Service API can now do revocation check for verifiable credentials presented that was issued with [StatusList2021](https://w3c.github.io/vc-status-list-2021/) or the [RevocationList2020](https://w3c-ccg.github.io/vc-status-rl-2020/) status list types.
Instructions for setting up place of work verification on LinkedIn available [he
## November 2022 -- Microsoft Entra Verified ID now reports events in the [audit log](../../active-directory/reports-monitoring/concept-audit-logs.md). Only management changes made via the Admin API are currently logged. Issuance or presentations of verifiable credentials aren't reported in the audit log. The log entries have a service name of `Verified ID` and the activity will be `Create authority`, `Update contract`, etc.
+- Microsoft Entra Verified ID now reports events in the [audit log](../reports-monitoring/concept-audit-logs.md). Only management changes made via the Admin API are currently logged. Issuance or presentations of verifiable credentials aren't reported in the audit log. The log entries have a service name of `Verified ID` and the activity will be `Create authority`, `Update contract`, etc.
## September 2022
active-directory Workload Identity Federation Create Trust https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/workload-identities/workload-identity-federation-create-trust.md
az rest -m DELETE -u 'https://graph.microsoft.com/applications/f6475511-fd81-49
- To learn how to use workload identity federation for GitHub Actions, see [Configure a GitHub Actions workflow to get an access token](/azure/developer/github/connect-from-azure). - Read the [GitHub Actions documentation](https://docs.github.com/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-azure) to learn more about configuring your GitHub Actions workflow to get an access token from Microsoft identity provider and access Azure resources. - For more information, read about how Microsoft Entra ID uses the [OAuth 2.0 client credentials grant](../develop/v2-oauth2-client-creds-grant-flow.md#third-case-access-token-request-with-a-federated-credential) and a client assertion issued by another IdP to get a token.-- For information about the required format of JWTs created by external identity providers, read about the [assertion format](/azure/active-directory/develop/active-directory-certificate-credentials#assertion-format).
+- For information about the required format of JWTs created by external identity providers, read about the [assertion format](../develop/certificate-credentials.md#assertion-format).
advisor Advisor Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-get-started.md
Previously updated : 09/15/2023 Last updated : 09/16/2023
ai-services Use Your Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/use-your-data.md
There is an [upload limit](../quotas-limits.md), and there are some caveats abou
There are three different sources of data that you can use with Azure OpenAI on your data. * Blobs in an Azure storage container that you provide
-* Local files uploaded using the Azure OpenAI Studio
-* URLs/web addresses.
+* Local files uploaded using the Azure OpenAI Studio
Once data is ingested, an [Azure Cognitive Search](/azure/search/search-what-is-azure-search) index in your search resource gets created to integrate the information with Azure OpenAI models.
Once data is ingested, an [Azure Cognitive Search](/azure/search/search-what-is-
Using the Azure OpenAI Studio, you can upload files from your machine. The service then stores the files to an Azure storage container and performs ingestion from the container.
-**Data ingestion from URLs**
-
-A crawling component first crawls the provided URL and stores its contents to an Azure Storage Container. The service then performs ingestion from the container.
- ### Troubleshooting failed ingestion jobs To troubleshoot a failed job, always look out for errors or warnings specified either in the API response or Azure OpenAI studio. Here are some of the common errors and warnings:
After you approve the request in your search service, you can start using the [c
### Storage accounts
-Storage accounts in virtual networks, firewalls, and private endpoints are currently not supported by Azure OpenAI on your data.
+Storage accounts in virtual networks, firewalls, and private endpoints are supported by Azure OpenAI on your data. To use a storage account in a private network:
+
+1. Ensure the system-assigned managed identity is enabled for your Azure OpenAI and Azure Cognitive Search resources.
+    1. Using the Azure portal, navigate to your resource, and select **Identity** from the navigation menu on the left side of the screen.
+    1. Set **Status** to **On**.
+    1. Perform these steps for both your Azure OpenAI and Azure Cognitive Search resources.
+
+ :::image type="content" source="../media/use-your-data/managed-identity.png" alt-text="A screenshot showing managed identity settings in the Azure portal." lightbox="../media/use-your-data/managed-identity.png":::
+
+1. Navigate back to your storage account. Select **Access Control (IAM)** for your resource. Select **Add**, then **Add role assignment**. In the window that appears, add the **Storage Blob Data Contributor** role on the storage resource for the managed identities of your Azure OpenAI and search resources.
+ 1. Assign access to **Managed Identity**.
    1. If you have multiple search resources, perform this step for each search resource.
+
+ :::image type="content" source="../media/use-your-data/add-role-assignment.png" alt-text="A screenshot showing the role assignment option in the Azure portal." lightbox="../media/use-your-data/add-role-assignment.png":::
+
+1. If your storage account hasn't already been network restricted, go to the **Networking** tab and select **Enabled from selected virtual networks and IP addresses**.
+
+ :::image type="content" source="../media/use-your-data/enable-virtual-network.png" alt-text="A screenshot showing the option for enabling virtual networks in the Azure portal." lightbox="../media/use-your-data/enable-virtual-network.png":::
## Azure Role-based access controls (Azure RBAC)
ai-services Use Your Data Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/use-your-data-quickstart.md
zone_pivot_groups: openai-use-your-data
::: zone-end +
+[Reference](https://platform.openai.com/docs/api-reference?lang=python) | [Source code](https://github.com/openai/openai-python) | [Package (pypi)](https://pypi.org/project/openai/) | [Samples](https://github.com/openai/openai-cookbook/)
+
+The links above reference the OpenAI API for Python. There is no Azure-specific OpenAI Python SDK. [Learn how to switch between the OpenAI services and Azure OpenAI services](/azure/ai-services/openai/how-to/switching-endpoints).
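To make the "on your data" flow concrete, here is a rough sketch of how the request body with the `dataSources` extension can be assembled. The shape follows the service's REST examples; the endpoint, key, and index names are placeholders, and field names may vary by API version.

```python
import json

# Hypothetical helper: build the chat request body for Azure OpenAI "on your data".
# All resource names below are placeholders, not real endpoints.
def build_on_your_data_request(search_endpoint, search_key, index_name, user_message):
    return {
        "dataSources": [
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    "endpoint": search_endpoint,
                    "key": search_key,
                    "indexName": index_name,
                },
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }

body = build_on_your_data_request(
    "https://example.search.windows.net", "<search-key>", "my-index",
    "What are my health plan options?")
print(json.dumps(body, indent=2))
```

This body is then POSTed to the deployment's `extensions/chat/completions` endpoint with your API key; the SDK-based quickstarts below wrap the same payload.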
+++
+[Reference](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go) | [Source code](https://github.com/Azure/azure-sdk-for-go) | [Package (Go)](https://pkg.go.dev/github.com/azure/azure-dev) | [Samples](https://github.com/azure-samples/azure-sdk-for-go-samples)
++ In this quickstart you can use your own data with Azure OpenAI models. Using Azure OpenAI's models on your data can provide you with a powerful conversational AI platform that enables faster and more accurate communication.
In this quickstart you can use your own data with Azure OpenAI models. Using Azu
- Your chat model can use version `gpt-35-turbo (0301)`, `gpt-35-turbo-16k`, `gpt-4`, and `gpt-4-32k`. You can view or change your model version in [Azure OpenAI Studio](./how-to/working-with-models.md#model-updates). -- Be sure that you are assigned at least the [Cognitive Services Contributor](./how-to/role-based-access-control.md#cognitive-services-contributor) role for the Azure OpenAI resource.
+- Be sure that you are assigned at least the [Cognitive Services Contributor](./how-to/role-based-access-control.md#cognitive-services-contributor) role for the Azure OpenAI resource.
::: zone pivot="programming-language-javascript"
In this quickstart you can use your own data with Azure OpenAI models. Using Azu
::: zone-end ++++++ ::: zone pivot="rest-api" [!INCLUDE [REST API quickstart](includes/use-your-data-rest.md)]
aks Load Balancer Standard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/load-balancer-standard.md
The following annotations are supported for Kubernetes services with type `LoadB
> [!NOTE] > `service.beta.kubernetes.io/azure-load-balancer-disable-tcp-reset` was deprecated in Kubernetes 1.18 and removed in 1.20.
+### Customize the load balancer health probe
+| Annotation | Value | Description |
+| - | -- | -- |
+| `service.beta.kubernetes.io/azure-load-balancer-health-probe-interval` | Health probe interval | |
+| `service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe` | The minimum number of unhealthy responses of health probe | |
+| `service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path` | Request path of the health probe | |
+| `service.beta.kubernetes.io/port_{port}_no_lb_rule` | true/false | {port} is the port number in the service. When set to true, no load balancer rule or health probe rule is generated for this port. Use this when a health check service should not be exposed to the public internet (for example, the istio/envoy health check service). |
+| `service.beta.kubernetes.io/port_{port}_no_probe_rule` | true/false | {port} is the port number in the service. When set to true, no health probe rule is generated for this port. |
+| `service.beta.kubernetes.io/port_{port}_health-probe_protocol` | Health probe protocol | {port} is the port number in the service. Explicit protocol for the health probe for the service port {port}, overriding port.appProtocol if set.|
+| `service.beta.kubernetes.io/port_{port}_health-probe_port` | port number or port name in service manifest | {port} is the port number in the service. Explicit port for the health probe for the service port {port}, overriding the default value. |
+| `service.beta.kubernetes.io/port_{port}_health-probe_interval` | Health probe interval | {port} is port number of service. |
+| `service.beta.kubernetes.io/port_{port}_health-probe_num-of-probe` | The minimum number of unhealthy responses of health probe | {port} is port number of service. |
+| `service.beta.kubernetes.io/port_{port}_health-probe_request-path` | Request path of the health probe | {port} is port number of service. |
+
+As documented in [Azure Load Balancer health probes](../load-balancer/load-balancer-custom-probe-overview.md), TCP, HTTP, and HTTPS are the three protocols supported by the load balancer service.
+
+Currently, the default protocol of the health probe varies among services with different transport protocols, app protocols, annotations, and external traffic policies:
+
+1. For local services, HTTP and `/healthz` are used. The health probe queries NodeHealthPort rather than the actual backend service.
+1. For cluster TCP services, TCP is used.
+1. For cluster UDP services, there are no health probes.
+
+> [!NOTE]
+> For local services with PLS integration and PLS proxy protocol enabled, the default HTTP `/healthz` health probe doesn't work. The health probe can instead be customized the same way as for cluster services to support this scenario.
+
+Since v1.20, the service annotation `service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path` has been available to determine the health probe behavior.
+
+* For clusters <=1.23, `spec.ports.appProtocol` would only be used as probe protocol when `service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path` is also set.
+* For clusters >1.24, `spec.ports.appProtocol` would be used as probe protocol and `/` would be used as default probe request path (`service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path` could be used to change to a different request path).
+
+Note that the request path is ignored when the probe protocol is TCP or when `spec.ports.appProtocol` is empty. More specifically:
+
+| loadbalancer sku | `externalTrafficPolicy` | spec.ports.Protocol | spec.ports.AppProtocol | `service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path` | LB Probe Protocol | LB Probe Request Path |
+| - | -- | - | - | -- | | |
+| standard | local | any | any | any | http | `/healthz` |
+| standard | cluster | udp | any | any | null | null |
+| standard | cluster | tcp | | (ignored) | tcp | null |
+| standard | cluster | tcp | tcp | (ignored) | tcp | null |
+| standard | cluster | tcp | http/https | | TCP(<=1.23) or http/https(>=1.24) | null(<=1.23) or `/`(>=1.24) |
+| standard | cluster | tcp | http/https | `/custom-path` | http/https | `/custom-path` |
+| standard | cluster | tcp | unsupported protocol | `/custom-path` | tcp | null |
+| basic | local | any | any | any | http | `/healthz` |
+| basic | cluster | tcp | | (ignored) | tcp | null |
+| basic | cluster | tcp | tcp | (ignored) | tcp | null |
+| basic | cluster | tcp | http | | TCP(<=1.23) or http/https(>=1.24) | null(<=1.23) or `/`(>=1.24) |
+| basic | cluster | tcp | http | `/custom-path` | http | `/custom-path` |
+| basic | cluster | tcp | unsupported protocol | `/custom-path` | tcp | null |
+
+Since v1.21, two service annotations, `service.beta.kubernetes.io/azure-load-balancer-health-probe-interval` and `service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe`, customize the health probe configuration. If `service.beta.kubernetes.io/azure-load-balancer-health-probe-interval` isn't set, a default value of 5 is applied. If `service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe` isn't set, a default value of 2 is applied. The total probe time (interval multiplied by number of probes) should be less than 120 seconds.
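As an illustrative sketch of the global annotations above (the service and app names here are placeholders): on clusters running 1.24 or later, this manifest yields an HTTP probe at `/custom-path` every 15 seconds, with the backend marked unhealthy after 3 consecutive failures (15 × 3 = 45 seconds, under the 120-second limit).

```yaml
apiVersion: v1
kind: Service
metadata:
  name: webapp
  annotations:
    service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path: "/custom-path"
    service.beta.kubernetes.io/azure-load-balancer-health-probe-interval: "15"
    service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe: "3"
spec:
  type: LoadBalancer
  selector:
    app: webapp
  ports:
  - name: http
    protocol: TCP
    appProtocol: http
    port: 80
    targetPort: 8080
```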
++
+### Custom Load Balancer health probe for port
+Different ports in a service may require different health probe configurations. This could be because of service design (such as a single health endpoint controlling multiple ports), or Kubernetes features like the [MixedProtocolLBService](https://kubernetes.io/docs/concepts/services-networking/service/#load-balancers-with-mixed-protocol-types).
+
+The following annotations can be used to customize probe configuration per service port.
+
+| port specific annotation | global probe annotation | Usage |
+| - | | - |
+| service.beta.kubernetes.io/port_{port}_no_lb_rule | N/A (no equivalent globally) | If set to true, no load balancer rules or probe rules are generated |
+| service.beta.kubernetes.io/port_{port}_no_probe_rule | N/A (no equivalent globally) | If set to true, no probe rules are generated |
+| service.beta.kubernetes.io/port_{port}_health-probe_protocol | N/A (no equivalent globally) | Set the health probe protocol for this service port (e.g. Http, Https, Tcp) |
+| service.beta.kubernetes.io/port_{port}_health-probe_port | N/A (no equivalent globally) | Sets the health probe port for this service port (e.g. 15021) |
+| service.beta.kubernetes.io/port_{port}_health-probe_request-path | service.beta.kubernetes.io/azure-load-balancer-health-probe-request-path | For Http or Https, sets the health probe request path. Defaults to / |
+| service.beta.kubernetes.io/port_{port}_health-probe_num-of-probe | service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe | Number of consecutive probe failures before the port is considered unhealthy |
+| service.beta.kubernetes.io/port_{port}_health-probe_interval | service.beta.kubernetes.io/azure-load-balancer-health-probe-interval | The amount of time between probe attempts |
+
+In the following manifest, the probe rule for port httpsserver differs from the one for httpserver because annotations are specified for port httpsserver.
+
+```yaml
+apiVersion: v1
+kind: Service
+metadata:
+ name: appservice
+ annotations:
+ service.beta.kubernetes.io/azure-load-balancer-health-probe-num-of-probe: "5"
+ service.beta.kubernetes.io/port_443_health-probe_num-of-probe: "4"
+spec:
+ type: LoadBalancer
+ selector:
+ app: server
+ ports:
+ - name: httpserver
+ protocol: TCP
+ port: 80
+ targetPort: 30102
+ - name: httpsserver
+ protocol: TCP
+ appProtocol: HTTPS
+ port: 443
+ targetPort: 30104
+```
+
+In this manifest, the https port uses a node port for its probe: an HTTP readiness check at port 10256 on `/healthz` (the healthz endpoint of kube-proxy).
+```yaml
+apiVersion: v1
+kind: Service
+metadata:
+ name: istio
+ annotations:
+ service.beta.kubernetes.io/azure-load-balancer-internal: "true"
+ service.beta.kubernetes.io/port_443_health-probe_protocol: "http"
+ service.beta.kubernetes.io/port_443_health-probe_port: "10256"
+ service.beta.kubernetes.io/port_443_health-probe_request-path: "/healthz"
+spec:
+ ports:
+ - name: https
+ protocol: TCP
+ port: 443
+ targetPort: 8443
+ nodePort: 30104
+ appProtocol: https
+ selector:
+ app: istio-ingressgateway
+ gateway: istio-ingressgateway
+ istio: ingressgateway
+ type: LoadBalancer
+ sessionAffinity: None
+ externalTrafficPolicy: Local
+ ipFamilies:
+ - IPv4
+ ipFamilyPolicy: SingleStack
+ allocateLoadBalancerNodePorts: true
+ internalTrafficPolicy: Cluster
+```
+
+In this manifest, the https port uses a different health probe endpoint: an HTTP readiness check at port 30000 on `/healthz/ready`.
+```yaml
+apiVersion: v1
+kind: Service
+metadata:
+ name: istio
+ annotations:
+ service.beta.kubernetes.io/azure-load-balancer-internal: "true"
+ service.beta.kubernetes.io/port_443_health-probe_protocol: "http"
+ service.beta.kubernetes.io/port_443_health-probe_port: "30000"
+ service.beta.kubernetes.io/port_443_health-probe_request-path: "/healthz/ready"
+spec:
+ ports:
+ - name: https
+ protocol: TCP
+ port: 443
+ targetPort: 8443
+ appProtocol: https
+ selector:
+ app: istio-ingressgateway
+ gateway: istio-ingressgateway
+ istio: ingressgateway
+ type: LoadBalancer
+ sessionAffinity: None
+ externalTrafficPolicy: Local
+ ipFamilies:
+ - IPv4
+ ipFamilyPolicy: SingleStack
+ allocateLoadBalancerNodePorts: true
+ internalTrafficPolicy: Cluster
+```
+ ## Troubleshooting SNAT If you know that you're starting many outbound TCP or UDP connections to the same destination IP address and port, and you observe failing outbound connections or support notifies you that you're exhausting SNAT ports (preallocated ephemeral ports used by PAT), you have several general mitigation options. Review these options and decide what's best for your scenario. It's possible that one or more can help manage your scenario. For detailed information, review the [outbound connections troubleshooting guide](../load-balancer/troubleshoot-outbound-connection.md).
aks Node Auto Repair https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-auto-repair.md
AKS engineers investigate alternative remediations if auto-repair is unsuccessfu
## Node auto-drain
-[Scheduled events][scheduled-events] can occur on the underlying VMs in any of your node pools. For [spot node pools][spot-node-pools], scheduled events may cause a *preempt* node event for the node. Certain node events, such as *preempt*, cause AKS node auto-drain to attempt a cordon and drain of the affected node. This process enables rescheduling for any affected workloads on that node. You might notice the node receives a taint with `"remediator.aks.microsoft.com/unschedulable"`, because of `"kubernetes.azure.com/scalesetpriority: spot"`.
+[Scheduled events][scheduled-events] can occur on the underlying VMs in any of your node pools. For [spot node pools][spot-node-pools], scheduled events may cause a *preempt* node event for the node. Certain node events, such as *preempt*, cause AKS node auto-drain to attempt a cordon and drain of the affected node. This process enables rescheduling for any affected workloads on that node. You might notice the node receives a taint with `"remediator.kubernetes.azure.com/unschedulable"`, because of `"kubernetes.azure.com/scalesetpriority: spot"`.
The following table shows the node events and actions they cause for AKS node auto-drain:
api-management Api Management Gateways Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-gateways-overview.md
The following table compares features available in the managed gateway versus th
| API threat detection with [Defender for APIs](protect-with-defender-for-apis.md) | ✔️ | ❌ | ❌ | <sup>1</sup> Depends on how the gateway is deployed, but is the responsibility of the customer.<br/>
-<sup>2</sup> Connectivity to the self-hosted gateway v2 [configuration endpoint](self-hosted-gateway-overview.md#fqdn-dependencies) requires DNS resolution of the default endpoint hostname; custom domain name is currently not supported.<br/>
-<sup>3</sup> Requires configuration of local CA certificates.<br/>
+<sup>2</sup> Connectivity to the self-hosted gateway v2 [configuration endpoint](self-hosted-gateway-overview.md#fqdn-dependencies) requires DNS resolution of the endpoint hostname.<br/>
### Backend APIs
For estimated maximum gateway throughput in the API Management service tiers, se
* In environments such as [Kubernetes](how-to-self-hosted-gateway-on-kubernetes-in-production.md), add multiple gateway replicas to handle expected usage. * Optionally [configure autoscaling](how-to-self-hosted-gateway-on-kubernetes-in-production.md#autoscaling) to meet traffic demands.
-## Next steps
+## Related content
- Learn more about [API Management in a Hybrid and multicloud World](https://aka.ms/hybrid-and-multi-cloud-api-management) - Learn more about using the [capacity metric](api-management-capacity.md) for scaling decisions
api-management Api Version Retirement Sep 2023 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/breaking-changes/api-version-retirement-sep-2023.md
After 30 September 2023, if you prefer not to update your tools, scripts, and pr
* **ARM, Bicep, or Terraform templates** - Update the template to use API version 2021-08-01 or later.
-* **Azure CLI** - Run `az version` to check your version. If you're running version 2.38.0 or later, no action is required. Use the `az upgrade` command to upgrade the Azure CLI if necessary. For more information, see [How to update the Azure CLI](/cli/azure/update-azure-cli).
+* **Azure CLI** - Run `az version` to check your version. If you're running version 2.42.0 or later, no action is required. Use the `az upgrade` command to upgrade the Azure CLI if necessary. For more information, see [How to update the Azure CLI](/cli/azure/update-azure-cli).
* **Azure PowerShell** - Run `Get-Module -ListAvailable -Name Az` to check your version. If you're running version 8.1.0 or later, no action is required. Use `Update-Module -Name Az -Repository PSGallery` to update the module if necessary. For more information, see [Install the Azure Az PowerShell module](/powershell/azure/install-azure-powershell).
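As a quick sanity check of the Azure CLI requirement above, a dotted version string can be compared numerically against the 2.42.0 minimum. The threshold comes from this notice; the helper itself is illustrative and not part of the Azure CLI — in practice you'd feed it the version reported by `az version`.

```python
# Illustrative helper (not an Azure CLI command): compare a dotted version
# string against the minimum version that already uses API version 2021-08-01.
def meets_minimum(current: str, required: str = "2.42.0") -> bool:
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(current) >= as_tuple(required)

print(meets_minimum("2.38.0"))  # False: older than the minimum, run `az upgrade`
print(meets_minimum("2.50.0"))  # True: no action required
```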
api-management How To Self Hosted Gateway On Kubernetes In Production https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/how-to-self-hosted-gateway-on-kubernetes-in-production.md
Starting with version 2.1.5 or above, the self-hosted gateway provides observabi
- [API Inspector](api-management-howto-api-inspector.md) will show additional steps when HTTP(S) proxy is being used and its related interactions. - Verbose logs are provided to provide indication of the request proxy behavior.
+> [!NOTE]
+> Due to a known issue with HTTP proxies that use basic authentication, certificate revocation list (CRL) validation is not supported. Learn how to configure it appropriately in the [self-hosted gateway settings reference](self-hosted-gateway-settings-reference.md).
+ > [!Warning] > Ensure that the [infrastructure requirements](self-hosted-gateway-overview.md#fqdn-dependencies) have been met and that the self-hosted gateway can still connect to them or certain functionality will not work properly.
api-management Self Hosted Gateway Settings Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/self-hosted-gateway-settings-reference.md
This guidance helps you provide the required information to define how to authen
| certificates.local.ca.enabled | Indication whether or not the self-hosted gateway should use local CA certificates that are mounted. It's required to run the self-hosted gateway as root or with user ID 1001. | No | `false` | v2.0+ | | net.server.tls.ciphers.allowed-suites | Comma-separated list of ciphers to use for TLS connection between API client and the self-hosted gateway. | No | `TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_DHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256,TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_DHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,TLS_DHE_RSA_WITH_AES_256_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,TLS_DHE_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,TLS_DHE_RSA_WITH_AES_256_CBC_SHA,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,TLS_DHE_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_GCM_SHA384,TLS_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA256,TLS_RSA_WITH_AES_128_CBC_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA,TLS_RSA_WITH_AES_128_CBC_SHA` | v2.0+ | | net.client.tls.ciphers.allowed-suites | Comma-separated list of ciphers to use for TLS connection between the self-hosted gateway and the backend. 
| No | `TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_DHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256,TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_DHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,TLS_DHE_RSA_WITH_AES_256_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,TLS_DHE_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,TLS_DHE_RSA_WITH_AES_256_CBC_SHA,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,TLS_DHE_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_GCM_SHA384,TLS_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA256,TLS_RSA_WITH_AES_128_CBC_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA,TLS_RSA_WITH_AES_128_CBC_SHA` | v2.0+ |
+| security.certificate-revocation.validation.enabled | Turns certificate revocation list (CRL) validation on or off | No | `false` | v2.3.6+ |
## Sovereign clouds
api-management V2 Service Tiers Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/v2-service-tiers-overview.md
A: A Standard v2 service instance can be integrated with a VNet to provide secur
A: No, such a deployment is only supported in the Premium tier.
+### Q: Is a Premium v2 tier planned?
+
+A: Yes, a Premium v2 preview is planned and will be announced separately.
+ ## Related content
-* Learn more about the API Management [tiers](api-management-features.md).
+* Learn more about the API Management [tiers](api-management-features.md).
api-management Virtual Network Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/virtual-network-reference.md
Previously updated : 08/29/2023 Last updated : 10/19/2023
When an API Management service instance is hosted in a VNet, the ports in the fo
| * / 6381 - 6383 | Inbound & Outbound | TCP | VirtualNetwork / VirtualNetwork | Access internal Azure Cache for Redis service for [caching](api-management-caching-policies.md) policies between machines (optional) | External & Internal | | * / 4290 | Inbound & Outbound | UDP | VirtualNetwork / VirtualNetwork | Sync Counters for [Rate Limit](rate-limit-policy.md) policies between machines (optional) | External & Internal | | * / 6390 | Inbound | TCP | AzureLoadBalancer / VirtualNetwork | **Azure Infrastructure Load Balancer** | External & Internal |
+| * / 443 | Inbound | TCP | AzureTrafficManager / VirtualNetwork | **Azure Traffic Manager** routing for multi-region deployment | External |
### [stv1](#tab/stv1)
When an API Management service instance is hosted in a VNet, the ports in the fo
| * / 6381 - 6383 | Inbound & Outbound | TCP | VirtualNetwork / VirtualNetwork | Access internal Azure Cache for Redis service for [caching](api-management-caching-policies.md) policies between machines (optional) | External & Internal | | * / 4290 | Inbound & Outbound | UDP | VirtualNetwork / VirtualNetwork | Sync Counters for [Rate Limit](rate-limit-policy.md) policies between machines (optional) | External & Internal | | * / * | Inbound | TCP | AzureLoadBalancer / VirtualNetwork | **Azure Infrastructure Load Balancer** (required for Premium SKU, optional for other SKUs) | External & Internal |
+| * / 443 | Inbound | TCP | AzureTrafficManager / VirtualNetwork | **Azure Traffic Manager** routing for multi-region deployment | External only |
app-service Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/overview.md
App Service Environment v3 is available in the following regions:
| | App Service Environment v3 | App Service Environment v3 | App Service Environment v1/v2 |
| -- | :--: | :-: | :-: |
| US DoD Central | ✅ | | ✅ |
-| US DoD East | | | ✅ |
+| US DoD East | ✅ | | ✅ |
| US Gov Arizona | ✅ | | ✅ |
| US Gov Iowa | | | ✅ |
| US Gov Texas | ✅ | | ✅ |
attestation Attestation Token Examples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/attestation/attestation-token-examples.md
# Examples of an attestation token
-Attestation policy is used to process the attestation evidence and determine whether Azure Attestation will issue an attestation token. Attestation token generation can be controlled with custom policies. Below are some examples of an attestation policy.
+An attestation policy processes the attestation evidence and determines whether Azure Attestation issues an attestation token. Attestation token generation can be controlled with custom policies. Here are some examples of an attestation token.
## Sample JWT generated for SGX attestation
Attestation policy is used to process the attestation evidence and determine whe
}.[Signature] ```
-Some of the claims used above are considered deprecated but are fully supported. It is recommended that all future code and tooling use the non-deprecated claim names. See [claims issued by Azure Attestation](claim-sets.md) for more information.
+Some of the claims used here are considered deprecated but are fully supported. It is recommended that all future code and tooling use the non-deprecated claim names. For more information, see [claims issued by Azure Attestation](claim-sets.md).
-The below claims will appear only in the attestation token generated for Intel® Xeon® Scalable processor-based server platforms. The claims will not appear if the SGX enclave is not configured with [Key Separation and Sharing Support](https://github.com/openenclave/openenclave/issues/3054)
+The following claims appear only in the attestation token generated for Intel® Xeon® Scalable processor-based server platforms. The claims don't appear if the SGX enclave isn't configured with [Key Separation and Sharing Support](https://github.com/openenclave/openenclave/issues/3054).
**x-ms-sgx-config-id**
The below claims will appear only in the attestation token generated for Intel®
} ```
+## Sample JWT generated for TDX attestation
+
+The definitions of the following claims are available in the [Azure Attestation TDX EAT profile](trust-domain-extensions-eat-profile.md).
+
+```json
+{
+ "attester_tcb_status": "UpToDate",
+ "dbgstat": "disabled",
+ "eat_profile": "https://aka.ms/maa-eat-profile-tdxvm",
+ "exp": 1697706287,
+ "iat": 1697677487,
+ "intuse": "generic",
+ "iss": "https://maasand001.eus.attest.azure.net",
+ "jti": "5f65006d573bc1c04f67820348c20f5d8da72ddbbd4d6c03da8de9f11b5cf29b",
+ "nbf": 1697677487,
+ "tdx_mrconfigid": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_mrowner": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_mrownerconfig": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_mrseam": "2fd279c16164a93dd5bf373d834328d46008c2b693af9ebb865b08b2ced320c9a89b4869a9fab60fbe9d0c5a5363c656",
+ "tdx_mrsignerseam": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_mrtd": "5be56d418d33661a6c21da77c9503a07e430b35eb92a0bd042a6b3c4e79b3c82bb1c594e770d0d129a0724669f1e953f",
+ "tdx_report_data": "93c6db49f2318387bcebdad0275e206725d948f9000d900344aa44abaef145960000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_rtmr0": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_rtmr1": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_rtmr2": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_rtmr3": "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
+ "tdx_seam_attributes": "0000000000000000",
+ "tdx_seamsvn": 3,
+ "tdx_td_attributes": "0000000000000000",
+ "tdx_td_attributes_debug": false,
+ "tdx_td_attributes_key_locker": false,
+ "tdx_td_attributes_perfmon": false,
+ "tdx_td_attributes_protection_keys": false,
+ "tdx_td_attributes_septve_disable": false,
+ "tdx_tee_tcb_svn": "03000600000000000000000000000000",
+ "tdx_xfam": "e718060000000000",
+ "x-ms-attestation-type": "tdxvm",
+ "x-ms-compliance-status": "azure-compliant-cvm",
+ "x-ms-policy-hash": "B56nbp5slhw66peoRYkpdq1WykMkEworvdol08hnMXE",
+ "x-ms-runtime": {
+ "test-claim-name": "test-claim-value"
+ },
+ "x-ms-ver": "1.0"
+}
+```
+ ## Next steps - [View examples of an attestation policy](policy-examples.md)
attestation Trust Domain Extensions Eat Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/attestation/trust-domain-extensions-eat-profile.md
+
+ Title: Azure Attestation EAT profile for TDX
+description: Azure Attestation EAT profile for TDX
++++ Last updated : 10/18/2023++++
+# Azure Attestation EAT profile for Intel® Trust Domain Extensions (TDX)
+
+This profile outlines claims for an [Intel® Trust Domain Extensions (TDX)](https://www.intel.com/content/www/us/en/developer/tools/trust-domain-extensions/overview.html) attestation result generated as an Entity Attestation Token (EAT) by Azure Attestation.
+
+The profile includes claims from the IETF [JWT](https://datatracker.ietf.org/doc/html/rfc7519) specification, the [EAT](https://datatracker.ietf.org/doc/html/draft-ietf-rats-eat-21) specification, Intel's TDX specification, and Microsoft-specific claims.
+
+## JWT claims
+
+The complete definitions of the following claims are available in the JWT specification.
+
+**iat** - The "iat" (issued at) claim identifies the time at which the JWT was issued.
+
+**exp** - The "exp" (expiration time) claim identifies the expiration time on or after which the JWT MUST NOT be accepted for processing.
+
+**iss** - The "iss" (issuer) claim identifies the principal that issued the JWT.
+
+**jti** - The "jti" (JWT ID) claim provides a unique identifier for the JWT.
+
+**nbf** - The "nbf" (not before) claim identifies the time before which the JWT MUST NOT be accepted for processing.
+
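The time-based claims above determine whether a token is accepted. The following minimal sketch shows how a relying party might decode a JWT payload and enforce `nbf`/`exp`; it is illustrative only (the token, issuer URL, and helper names are invented), and real validation must also verify the token signature:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying its signature."""
    payload_b64 = token.split(".")[1]
    # Restore the base64url padding that JWTs strip
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def check_time_claims(claims: dict, now: int) -> bool:
    """Reject tokens outside their [nbf, exp) validity window."""
    return claims.get("nbf", 0) <= now < claims.get("exp", float("inf"))

# Build a sample token payload with the claims described above
payload = {"iss": "https://example.attest.azure.net", "iat": 1697677487,
           "nbf": 1697677487, "exp": 1697706287, "jti": "abc123"}
segment = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode().rstrip("=")
token = "eyJhbGciOiJSUzI1NiJ9." + segment + ".sig"

claims = decode_jwt_payload(token)
print(check_time_claims(claims, 1697700000))  # inside the validity window
```

Note that `exp` is exclusive: a token presented at or after its expiration time must be rejected.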
+## EAT claims
+
+The complete definitions of the following claims are available in the EAT specification.
+
+**eat_profile** - The "eat_profile" claim identifies an EAT profile by either a URL or an OID.
+
+**dbgstat** - The "dbgstat" claim applies to entity-wide or submodule-wide debug facilities of the entity, such as JTAG and diagnostic hardware built into chips.
+
+**intuse** - The "intuse" claim provides an indication to an EAT consumer about the intended usage of the token.
+
+## TDX claims
+
+The complete definitions of the claims are available in section A.3.2, TD Quote Body, of the [Intel® TDX DCAP Quoting Library API](https://download.01.org/intel-sgx/latest/dcap-latest/linux/docs/Intel_TDX_DCAP_Quoting_Library_API.pdf) specification.
+
+**tdx_mrsignerseam** - A 96-character hexadecimal string that represents a byte array of length 48 containing the measurement of the TDX module signer.
+
+**tdx_mrseam** - A 96-character hexadecimal string that represents a byte array of length 48 containing the measurement of the Intel TDX module.
+
+**tdx_mrtd** - A 96-character hexadecimal string that represents a byte array of length 48 containing the measurement of the initial contents of the TDX.
+
+**tdx_rtmr0** - A 96-character hexadecimal string that represents a byte array of length 48 containing the runtime extendable measurement register.
+
+**tdx_rtmr1** - A 96-character hexadecimal string that represents a byte array of length 48 containing the runtime extendable measurement register.
+
+**tdx_rtmr2** - A 96-character hexadecimal string that represents a byte array of length 48 containing the runtime extendable measurement register.
+
+**tdx_rtmr3** - A 96-character hexadecimal string that represents a byte array of length 48 containing the runtime extendable measurement register.
+
+**tdx_mrconfigid** - A 96-character hexadecimal string that represents a byte array of length 48 containing the software-defined ID for non-owner-defined configuration of the TDX, e.g., runtime or Operating System (OS) configuration.
+
+**tdx_mrowner** - A 96-character hexadecimal string that represents a byte array of length 48 containing the software-defined ID for the TDX's owner.
+
+**tdx_mrownerconfig** - A 96-character hexadecimal string that represents a byte array of length 48 containing the software-defined ID for owner-defined configuration of the TDX, e.g., specific to the workload rather than the runtime or OS.
+
+**tdx_report_data** - A 128-character hexadecimal string that represents a byte array of length 64. In this context, the TDX has the flexibility to include 64 bytes of custom data in a TDX Report. For instance, this space can be used to hold a nonce, a public key, or a hash of a larger block of data.
+
+**tdx_seam_attributes** - A 16-character hexadecimal string that represents a byte array of length 8 containing additional configuration of the TDX module.
+
+**tdx_tee_tcb_svn** - A 32-character hexadecimal string that represents a byte array of length 16 describing the Trusted Computing Base (TCB) Security Version Numbers (SVNs) of TDX.
+
+**tdx_xfam** - A 16-character hexadecimal string that represents a byte array of length 8 containing a mask of CPU extended features that the TDX is allowed to use.
+
+**tdx_seamsvn** - A number that represents the Intel TDX module SVN. The complete definition of the claim is available in section 3.1, SEAM_SIGSTRUCT: INTEL® TDX MODULE SIGNATURE STRUCTURE, of the [Intel® TDX Loader Interface Specification](https://cdrdv2.intel.com/v1/dl/getContent/733584).
+
+**tdx_td_attributes** - A 16-character hexadecimal string that represents a byte array of length 8. These are the attributes associated with the Trust Domain (TD). The complete definitions of the following claims are available in section A.3.4, TD Attributes, of the [Intel® TDX DCAP Quoting Library API](https://download.01.org/intel-sgx/latest/dcap-latest/linux/docs/Intel_TDX_DCAP_Quoting_Library_API.pdf) specification.
+
+**tdx_td_attributes_debug** - A boolean value that indicates whether the TD runs in TD debug mode (set to 1) or not (set to 0). In TD debug mode, the CPU state and private memory are accessible by the host VMM.
+
+**tdx_td_attributes_key_locker** - A boolean value that indicates whether the TD is allowed to use Key Locker.
+
+**tdx_td_attributes_perfmon** - A boolean value that indicates whether the TD is allowed to use Perfmon and PERF_METRICS capabilities.
+
+**tdx_td_attributes_protection_keys** - A boolean value that indicates whether the TD is allowed to use Supervisor Protection Keys.
+
+**tdx_td_attributes_septve_disable** - A boolean value that determines whether to disable EPT violation conversion to #VE on TD access of PENDING pages.
+
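Because these claims are fixed-length hexadecimal strings, a token consumer can sanity-check them before use. Here is a small sketch; the length table covers only a few claims from this profile, and the helper name is illustrative:

```python
# Expected decoded byte lengths for a few TDX hex-string claims from this profile
EXPECTED_BYTES = {
    "tdx_mrtd": 48,          # 96-character hexadecimal string
    "tdx_report_data": 64,   # 128-character hexadecimal string
    "tdx_xfam": 8,           # 16-character hexadecimal string
    "tdx_tee_tcb_svn": 16,   # 32-character hexadecimal string
}

def validate_hex_claim(name: str, value: str) -> bytes:
    """Decode a hex-encoded claim and check it has the documented length."""
    raw = bytes.fromhex(value)
    expected = EXPECTED_BYTES[name]
    if len(raw) != expected:
        raise ValueError(f"{name}: expected {expected} bytes, got {len(raw)}")
    return raw

xfam = validate_hex_claim("tdx_xfam", "e718060000000000")
print(xfam.hex())  # → e718060000000000
```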
+## Attester claims
+
+**attester_tcb_status** - A string value that represents the TCB level status of the platform being evaluated. See `tcbStatus` in the [Intel® Trusted Services API Management Developer Portal](https://api.portal.trustedservices.intel.com/documentation).
+
+## Microsoft specific claims
+
+**x-ms-attestation-type** - A string value that represents the attestation type.
+
+**x-ms-policy-hash** - Hash of Azure Attestation evaluation policy computed as BASE64URL(SHA256(UTF8(BASE64URL(UTF8(policy text))))).
+
+**x-ms-runtime** - JSON object containing "claims" that are defined and generated within the attested environment. This is a specialization of the "enclave held data" concept, where the "enclave held data" is specifically formatted as a UTF-8 encoding of well-formed JSON.
+
+**x-ms-ver** - JWT schema version (expected to be "1.0").
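The nested `x-ms-policy-hash` construction can be reproduced with standard library primitives. A sketch, assuming the final digest is base64url-encoded without padding (the padding behavior is an assumption, and the policy text here is a placeholder):

```python
import base64
import hashlib

def policy_hash(policy_text: str) -> str:
    """BASE64URL(SHA256(UTF8(BASE64URL(UTF8(policy text)))))"""
    inner = base64.urlsafe_b64encode(policy_text.encode("utf-8"))
    digest = hashlib.sha256(inner).digest()
    # Strip '=' padding, matching the unpadded hash values seen in sample tokens
    return base64.urlsafe_b64encode(digest).decode("utf-8").rstrip("=")

print(len(policy_hash("version=1.0;")))  # 32-byte digest → 43 unpadded characters
```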
azure-functions Durable Functions Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/durable/durable-functions-overview.md
app.get("raiseEventToOrchestration", async function (request, context) {
# [Python](#tab/v1-model) ```python
-import azure.functions as func
import azure.durable_functions as df
-myApp = df.DFApp(http_auth_level=func.AuthLevel.ANONYMOUS)
-# An HTTP-Triggered Function with a Durable Functions Client binding
-@myApp.route(route="orchestrators/{functionName}")
-@myApp.durable_client_input(client_name="client")
-async def main(client):
+async def main(client: str):
+ durable_client = df.DurableOrchestrationClient(client)
is_approved = True
- await client.raise_event(instance_id, "ApprovalEvent", is_approved)
+ await durable_client.raise_event(instance_id, "ApprovalEvent", is_approved)
```
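Conceptually, `raise_event` delivers a payload to an orchestration instance that is awaiting an external event with a matching name. This stdlib-only toy (not Durable Functions code; every name here is invented) mimics that handshake with asyncio futures:

```python
import asyncio

class MiniOrchestrator:
    """Toy stand-in for the Durable Functions external-event handshake."""
    def __init__(self):
        self._events = {}

    def wait_for_external_event(self, instance_id, name):
        # One future per (instance, event-name) pair; awaiting it suspends the flow
        key = (instance_id, name)
        if key not in self._events:
            self._events[key] = asyncio.get_running_loop().create_future()
        return self._events[key]

    async def raise_event(self, instance_id, name, payload):
        # Completing the future resumes the waiting orchestration with the payload
        self.wait_for_external_event(instance_id, name).set_result(payload)

async def demo():
    orch = MiniOrchestrator()

    async def approval_flow(instance_id):
        is_approved = await orch.wait_for_external_event(instance_id, "ApprovalEvent")
        return "approved" if is_approved else "rejected"

    flow = asyncio.ensure_future(approval_flow("instance-1"))
    await asyncio.sleep(0)  # let the flow start waiting
    await orch.raise_event("instance-1", "ApprovalEvent", True)
    return await flow

print(asyncio.run(demo()))  # → approved
```

In the real service, the event name passed to `raise_event` must match the name the orchestrator function is waiting on, and the instance ID identifies which orchestration instance receives it.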
azure-functions Functions Reference Node https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-reference-node.md
The `HttpRequest` object has the following properties:
| **`params`** | `Record<string, string>` | Route parameter keys and values. |
| **`user`** | `HttpRequestUser | null` | Object representing logged-in user, either through Functions authentication, SWA Authentication, or null when no such user is logged in. |
| **`body`** | `Buffer | string | any` | If the media type is "application/octet-stream" or "multipart/*", `body` is a Buffer. If the value is a JSON parse-able string, `body` is the parsed object. Otherwise, `body` is a string. |
-| **`rawBody`** | `Buffer | string` | If the media type is "application/octet-stream" or "multipart/*", `rawBody` is a Buffer. Otherwise, `rawBody` is a string. The only difference between `body` and `rawBody` is that `rawBody` doesn't JSON parse a string body. |
+| **`rawBody`** | `string` | The body as a string. Despite the name, this property doesn't return a Buffer. |
| **`bufferBody`** | `Buffer` | The body as a buffer. | ::: zone-end
azure-monitor Agent Windows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/agent-windows.md
The following steps install and configure the Log Analytics agent in Azure and A
6. On the **Azure Log Analytics** page, perform the following:
    1. Paste the **Workspace ID** and **Workspace Key (Primary Key)** that you copied earlier. If the computer should report to a Log Analytics workspace in Azure Government cloud, select **Azure US Government** from the **Azure Cloud** drop-down list.
    2. If the computer needs to communicate through a proxy server to the Log Analytics service, click **Advanced** and provide the URL and port number of the proxy server. If your proxy server requires authentication, type the username and password to authenticate with the proxy server and then click **Next**.
-7. Click **Next** once you have completed providing the necessary configuration settings.<br><br> :::image type="content" source="media/agent-windows/log-analytics-mma-setup-laworkspace.png" lightbox="media/agent-windows/log-analytics-mma-setup-laworkspace.png" alt-text="paste Workspace ID and Primary Key":::<br><br>
+7. Click **Next** once you have completed providing the necessary configuration settings.
+ <!-- convertborder later -->
+ :::image type="content" source="media/agent-windows/log-analytics-mma-setup-laworkspace.png" lightbox="media/agent-windows/log-analytics-mma-setup-laworkspace.png" alt-text="paste Workspace ID and Primary Key" border="false":::<br><br>
8. On the **Ready to Install** page, review your choices and then click **Install**.
9. On the **Configuration completed successfully** page, click **Finish**.
To retrieve the product code from the agent install package directly, you can us
After installation of the agent is finished, you can verify that it's successfully connected and reporting in two ways.
-From the computer in **Control Panel**, find the item **Microsoft Monitoring Agent**. Select it, and on the **Azure Log Analytics** tab, the agent should display a message stating *The Microsoft Monitoring Agent has successfully connected to the Microsoft Operations Management Suite service.*<br><br> :::image type="content" source="media/agent-windows/log-analytics-mma-laworkspace-status.png" lightbox="media/agent-windows/log-analytics-mma-laworkspace-status.png" alt-text="Screenshot that shows the MMA connection status to Log Analytics message.":::
+From the computer in **Control Panel**, find the item **Microsoft Monitoring Agent**. Select it, and on the **Azure Log Analytics** tab, the agent should display a message stating *The Microsoft Monitoring Agent has successfully connected to the Microsoft Operations Management Suite service.*
+<!-- convertborder later -->
You can also perform a log query in the Azure portal:
azure-monitor Azure Monitor Agent Data Collection Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-data-collection-endpoint.md
Add the data collection endpoints to a new or existing [Azure Monitor Private Li
> Other Azure Monitor resources like the Log Analytics workspaces configured in your data collection rules that you want to send data to must be part of this same AMPLS resource. For your data collection endpoints, ensure the **Accept access from public networks not connected through a Private Link Scope** option is set to **No** on the **Network Isolation** tab of your endpoint resource in the Azure portal. This setting ensures that public internet access is disabled and network communication only happens via private links.-
+<!-- convertborder later -->
### Associate DCEs to target machines Associate the data collection endpoints to the target resources by editing the data collection rule in the Azure portal. On the **Resources** tab, select **Enable Data Collection Endpoints**. Select a DCE for each virtual machine. See [Configure data collection for Azure Monitor Agent](../agents/data-collection-rule-azure-monitor-agent.md).-
+<!-- convertborder later -->
## Next steps
azure-monitor Azure Monitor Agent Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-manage.md
You can choose to use the individual policies from the preceding policy initiati
The initiatives or policies will apply to each virtual machine as it's created. A [remediation task](../../governance/policy/how-to/remediate-resources.md) deploys the policy definitions in the initiative to existing resources, so you can configure Azure Monitor Agent for any resources that were already created. When you create the assignment by using the Azure portal, you have the option of creating a remediation task at the same time. For information on the remediation, see [Remediate non-compliant resources with Azure Policy](../../governance/policy/how-to/remediate-resources.md).-
+<!-- convertborder later -->
## Next steps
azure-monitor Azure Monitor Agent Migration Tools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-migration-tools.md
You can access the workbook **[here](https://portal.azure.com/#view/AppInsightsE
:::image type="content" source="media/azure-monitor-migration-tools/ama-migration-helper.png" lightbox="media/azure-monitor-migration-tools/ama-migration-helper.png" alt-text="Screenshot of the Azure Monitor Agent Migration Helper workbook. The screenshot highlights the Subscription and Workspace dropdowns and shows the Azure Virtual Machines tab, on which you can track which agent is deployed on each virtual machine."::: **Automatic Migration Recommendations**-
+<!-- convertborder later -->
## Installing and using DCR Config Generator Azure Monitor Agent relies only on [data collection rules (DCRs)](../essentials/data-collection-rule-overview.md) for configuration, whereas Log Analytics Agent inherits its configuration from Log Analytics workspaces.
azure-monitor Azure Monitor Agent Windows Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-windows-client.md
# Azure Monitor agent on Windows client devices
-This article provides instructions and guidance for using the client installer for Azure Monitor Agent. It also explains how to leverage Data Collection Rules on Windows client devices.
+This article provides instructions and guidance for using the client installer for Azure Monitor Agent. It also explains how to use Data Collection Rules on Windows client devices.
Using the new client installer described here, you can now collect telemetry data from your Windows client devices in addition to servers and virtual machines. Both the [extension](./azure-monitor-agent-manage.md#virtual-machine-extension-details) and this installer use Data Collection rules to configure the **same underlying agent**. > [!NOTE]
-> This article provides specific guidance for installing the Azure Monitor agent on Windows client devices, subject to [limitations below](#limitations). For standard installation and management guidance for the agent, refer [the agent extension management guidance here](./azure-monitor-agent-manage.md)
+> This article provides specific guidance for installing the Azure Monitor agent on Windows client devices, subject to the [limitations](#limitations). For standard installation and management guidance for the agent, see [the agent extension management guidance](./azure-monitor-agent-manage.md).
### Comparison with virtual machine extension Here is a comparison between client installer and VM extension for Azure Monitor agent:
Here is a comparison between client installer and VM extension for Azure Monitor
| Central configuration | Via Data collection rules | Same | | Associating config rules to agents | DCRs associates directly to individual VM resources | DCRs associate to Monitored Object (MO), which maps to all devices within the Microsoft Entra tenant | | Data upload to Log Analytics | Via Log Analytics endpoints | Same |
-| Feature support | All features documented [here](./azure-monitor-agent-overview.md) | Features dependent on AMA agent extension that don't require additional extensions. This includes support for Sentinel Windows Event filtering |
+| Feature support | All features documented [here](./azure-monitor-agent-overview.md) | Features dependent on AMA agent extension that don't require more extensions. This includes support for Sentinel Windows Event filtering |
| [Networking options](./azure-monitor-agent-overview.md#networking) | Proxy support, Private link support | Proxy support only |
Here is a comparison between client installer and VM extension for Azure Monitor
| On-premises servers | No | [Virtual machine extension](./azure-monitor-agent-manage.md#virtual-machine-extension-details) (with Azure Arc agent) | Installs the agent using Azure extension framework, provided for on-premises by installing Arc agent | ## Limitations
-1. The Windows client installer supports latest Windows machines only that are **Microsoft Entra joined** or Microsoft Entra hybrid joined. More information under [prerequisites](#prerequisites) below
-2. The Data Collection rules can only target the Microsoft Entra tenant scope, i.e. all DCRs associated to the tenant (via Monitored Object) will apply to all Windows client machines within that tenant with the agent installed using this client installer. **Granular targeting using DCRs is not supported** for Windows client devices yet
+1. The Windows client installer supports only the latest Windows machines that are **Microsoft Entra joined** or Microsoft Entra hybrid joined. For more information, see the [prerequisites](#prerequisites).
+2. The Data Collection rules can only target the Microsoft Entra tenant scope. That is, all DCRs associated to the tenant (via Monitored Object) apply to all Windows client machines within that tenant with the agent installed using this client installer. **Granular targeting using DCRs is not supported** for Windows client devices yet.
3. No support for Windows machines connected via **Azure private links** 4. The agent installed using the Windows client installer is designed mainly for Windows desktops or workstations that are **always connected**. While the agent can be installed via this method on laptops, it is not optimized for battery consumption and network limitations on a laptop.
Here is a comparison between client installer and VM extension for Azure Monitor
7. Before using any PowerShell cmdlet, ensure cmdlet related PowerShell module is installed and imported. ## Install the agent
-1. Download the Windows MSI installer for the agent using [this link](https://go.microsoft.com/fwlink/?linkid=2192409). You can also download it from **Monitor** > **Data Collection Rules** > **Create** experience on Azure portal (shown below):
- :::image type="content" source="media/azure-monitor-agent-windows-client/azure-monitor-agent-client-installer-portal.png" lightbox="media/azure-monitor-agent-windows-client/azure-monitor-agent-client-installer-portal.png" alt-text="Diagram shows download agent link on Azure portal.":::
+1. Download the Windows MSI installer for the agent using [this link](https://go.microsoft.com/fwlink/?linkid=2192409). You can also download it from **Monitor** > **Data Collection Rules** > **Create** experience on Azure portal (shown in the following screenshot):
+ <!-- convertborder later -->
+ :::image type="content" source="media/azure-monitor-agent-windows-client/azure-monitor-agent-client-installer-portal.png" lightbox="media/azure-monitor-agent-windows-client/azure-monitor-agent-client-installer-portal.png" alt-text="Diagram shows download agent link on Azure portal." border="false":::
2. Open an elevated admin command prompt window and change directory to the location where you downloaded the installer. 3. To install with **default settings**, run the following command: ```cli msiexec /i AzureMonitorAgentClientSetup.msi /qn ```
-4. To install with custom file paths, [network proxy settings](./azure-monitor-agent-overview.md#proxy-configuration), or on a Non-Public Cloud use the command below with the values from the following table:
+4. To install with custom file paths, [network proxy settings](./azure-monitor-agent-overview.md#proxy-configuration), or on a non-public cloud, use the following command with the values from the following table:
```cli msiexec /i AzureMonitorAgentClientSetup.msi /qn DATASTOREDIR="C:\example\folder"
Here is a comparison between client installer and VM extension for Azure Monitor
## Create and associate a 'Monitored Object' You need to create a 'Monitored Object' (MO) that creates a representation for the Microsoft Entra tenant within Azure Resource Manager (ARM). This ARM entity is what Data Collection Rules are then associated with. **This Monitored Object needs to be created only once for any number of machines in a single Microsoft Entra tenant**. Currently this association is only **limited** to the Microsoft Entra tenant scope, which means configuration applied to the Microsoft Entra tenant will be applied to all devices that are part of the tenant and running the agent installed via the client installer. Agents installed as virtual machine extension will not be impacted by this.
-The image below demonstrates how this works:
+The following image demonstrates how this works:
+<!-- convertborder later -->
-
-Then, proceed with the instructions below to create and associate them to a Monitored Object, using REST APIs or PowerShell commands.
+Then, proceed with the following instructions to create and associate them to a Monitored Object, using REST APIs or PowerShell commands.
### Permissions required
-Since MO is a tenant level resource, the scope of the permission would be higher than a subscription scope. Therefore, an Azure tenant admin may be needed to perform this step. [Follow these steps to elevate Microsoft Entra tenant admin as Azure Tenant Admin](../../role-based-access-control/elevate-access-global-admin.md). It will give the Microsoft Entra admin 'owner' permissions at the root scope. This is needed for all methods described below in this section.
+Since MO is a tenant-level resource, the scope of the permission is higher than a subscription scope. Therefore, an Azure tenant admin may be needed to perform this step. [Follow these steps to elevate Microsoft Entra tenant admin as Azure Tenant Admin](../../role-based-access-control/elevate-access-global-admin.md). It gives the Microsoft Entra admin 'owner' permissions at the root scope. This permission is needed for all methods described in the following section.
### Using REST APIs
PUT https://management.azure.com/providers/microsoft.insights/providers/microsof
After this step is complete, **reauthenticate** your session and **reacquire** your ARM bearer token. #### 2. Create Monitored Object
-This step creates the Monitored Object for the Microsoft Entra tenant scope. It will be used to represent client devices that are signed with that Microsoft Entra tenant identity.
+This step creates the Monitored Object for the Microsoft Entra tenant scope. It's used to represent client devices that are signed in with that Microsoft Entra tenant identity.
**Permissions required**: Anyone who has 'Monitored Object Contributor' at an appropriate scope can perform this operation, as assigned in step 1.
PUT https://management.azure.com/providers/Microsoft.Insights/monitoredObjects/{
| Name | In | Type | Description |
|:|:|:|:|
-| `AADTenantId` | path | string | ID of the Microsoft Entra tenant that the device(s) belong to. The MO will be created with the same ID |
+| `AADTenantId` | path | string | ID of the Microsoft Entra tenant that the device(s) belong to. The MO is created with the same ID |
**Headers** - Authorization: ARM Bearer Token
PUT https://management.azure.com/providers/Microsoft.Insights/monitoredObjects/{
| Name | Description |
|:|:|
-| `location` | The Azure region where the MO object would be stored. It should be the **same region** where you created the Data Collection Rule. This is the location of the region from where agent communications would happen. |
+| `location` | The Azure region where the MO object would be stored. It should be the **same region** where you created the Data Collection Rule. This region is the location where agent communications would happen. |
#### 3. Associate DCR to Monitored Object
$requestURL = "https://management.azure.com$RespondId/providers/microsoft.insigh
``` ## Verify successful setup Check the ΓÇÿHeartbeatΓÇÖ table (and other tables you configured in the rules) in the Log Analytics workspace that you specified as a destination in the data collection rule(s).
-The `SourceComputerId`, `Computer`, `ComputerIP` columns should all reflect the client device information respectively, and the `Category` column should say 'Azure Monitor Agent'. See example below:
-
+The `SourceComputerId`, `Computer`, and `ComputerIP` columns should all reflect the respective client device information, and the `Category` column should say 'Azure Monitor Agent'. See the following example:
+<!-- convertborder later -->
### Using PowerShell for offboarding ```PowerShell
You can use any of the following options to check the installed version of the a
- Open **Control Panel** > **Programs and Features** > **Azure Monitor Agent** and click 'Uninstall' - Open **Settings** > **Apps** > **Apps and Features** > **Azure Monitor Agent** and click 'Uninstall'
-If you face issues during 'Uninstall', refer to [troubleshooting guidance](#troubleshoot) below
+If you face issues during 'Uninstall', refer to the [troubleshooting guidance](#troubleshoot).
### Update the agent In order to update the version, install the new version you wish to update to.
azure-monitor Data Sources Performance Counters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-sources-performance-counters.md
description: Learn how to configure collection of performance counters for Windo
Previously updated : 06/28/2022 Last updated : 10/19/2023
The following table provides different examples of log queries that retrieve per
## Next steps * [Collect performance counters from Linux applications](data-sources-linux-applications.md), including MySQL and Apache HTTP Server. * Learn about [log queries](../logs/log-query-overview.md) to analyze the data collected from data sources and solutions.
-* Export collected data to [Power BI](../logs/log-powerbi.md) for more visualizations and analysis.
+* Export collected data to [Power BI](../logs/log-powerbi.md) for more visualizations and analysis.
azure-monitor Vmext Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/vmext-troubleshoot.md
Title: Troubleshoot the Azure Log Analytics VM extension description: Describe the symptoms, causes, and resolution for the most common issues with the Log Analytics VM extension for Windows and Linux Azure VMs. Previously updated : 06/06/2019 Last updated : 10/19/2023
azure-monitor Alerts Manage Alerts Previous Version https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-manage-alerts-previous-version.md
The current alert rule wizard is different from the earlier experience:
1. To make alerts stateful, select **Automatically resolve alerts (preview)**. 1. Specify if the alert rule should trigger one or more [action groups](./action-groups.md) when the alert condition is met. > [!NOTE]
- > For limits on the actions that can be performed, see [Azure subscription service limits](../../azure-resource-manager/management/azure-subscription-service-limits.md).
+ > * For limits on the actions that can be performed, see [Azure subscription service limits](../../azure-resource-manager/management/azure-subscription-service-limits.md).
+ > * Search results were included in the payload of the triggered alert and its associated notifications. Note that the **email** included only **10 rows** from the unfiltered results, while the **webhook payload** contained **1,000 unfiltered results**.
1. (Optional) Customize actions in log alert rules: - **Custom email subject**: Overrides the *email subject* of email actions. You can't modify the body of the mail and this field *isn't for email addresses*. - **Include custom Json payload for webhook**: Overrides the webhook JSON used by action groups, assuming that the action group contains a webhook action. Learn more about [webhook actions for log alerts](./alerts-log-webhook.md).
azure-monitor Container Insights Onboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-onboard.md
The following table lists the extra firewall configuration required for managed
| `global.handler.control.monitor.azure.us` | Access control service | 443 | | `<cluster-region-name>.handler.control.monitor.azure.us` | Fetch data collection rules for specific AKS cluster | 443 |
+## Troubleshooting
+If you have registered your cluster and/or configured HCI Insights before November 2023, features that use the AMA agent on HCI, such as Arc for Servers Insights, VM Insights, Container Insights, Defender for Cloud, or Sentinel, may not be collecting logs and event data properly. See [Repair AMA agent for HCI](/azure-stack/hci/manage/monitor-hci-single?tabs=22h2-and-later) for steps to reconfigure the AMA agent and HCI Insights.
+ ## Next steps After you've enabled monitoring, you can begin analyzing the performance of your Kubernetes clusters that are hosted on AKS, Azure Stack, or another environment. To learn how to use Container insights, see [View Kubernetes cluster performance](container-insights-analyze.md).
azure-monitor Log Standard Columns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/log-standard-columns.md
Event
The **\_TimeReceived** column contains the date and time that the record was received by the Azure Monitor ingestion point in the Azure cloud. This can be useful for identifying latency issues between the data source and the cloud. An example would be a networking issue causing a delay with data being sent from an agent. See [Log data ingestion time in Azure Monitor](../logs/data-ingestion-time.md) for more details. > [!NOTE]
-> The **\_TimeReceived** column is calculate each time it is used. This process is resource intensive. Refine from using it to filter large number of records. Using this function recurrently can lead to increased query execution duration.
+> The **\_TimeReceived** column is calculated each time it is used. This process is resource intensive. Refrain from using it to filter a large number of records; using it recurrently can lead to increased query execution duration.
The following query gives the average latency by hour for event records from an agent. This includes the time from the agent to the cloud and the total time for the record to be available for log queries.
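The query itself isn't reproduced here, but the latency arithmetic it performs can be sketched in Python. This is an illustration of the concept only, with invented sample timestamps, not the KQL query the article describes:

```python
from datetime import datetime

# Illustration only: the latency described above is the gap between when a
# record was generated on the agent (TimeGenerated) and when the Azure
# Monitor ingestion point received it (_TimeReceived). Timestamps invented.
records = [
    {"TimeGenerated": datetime(2023, 10, 19, 12, 0, 0),
     "_TimeReceived": datetime(2023, 10, 19, 12, 0, 45)},
    {"TimeGenerated": datetime(2023, 10, 19, 12, 5, 0),
     "_TimeReceived": datetime(2023, 10, 19, 12, 6, 15)},
]

# Per-record ingestion latency in seconds, then the average across records
latencies = [(r["_TimeReceived"] - r["TimeGenerated"]).total_seconds()
             for r in records]
avg_latency = sum(latencies) / len(latencies)  # 60.0 for this sample
```

The actual query would group these averages by hour with `bin()` and `summarize`, but the per-record subtraction is the core of it.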
It is always more efficient to use the \_SubscriptionId column than extracting i
## \_SubscriptionId The **\_SubscriptionId** column holds the subscription ID of the resource that the record is associated with. This gives you a standard column to use to scope your query to only records from a particular subscription, or to compare different subscriptions.
-For Azure resources, the value of **__SubscriptionId** is the subscription part of the [Azure resource ID URL](../../azure-resource-manager/templates/template-functions-resource.md). The column is limited to Azure resources, including [Azure Arc](../../azure-arc/overview.md) resources, or to custom logs that indicated the Resource ID during ingestion.
+For Azure resources, the value of **__SubscriptionId** is the subscription part of the [Azure resource ID URL](../../azure-resource-manager/templates/template-functions-resource.md). The column is limited to Azure resources, including [Azure Arc](../../azure-arc/overview.md) resources, or to custom logs that indicated the Subscription ID during ingestion.
> [!NOTE] > Some data types already have fields that contain Azure subscription ID . While these fields are kept for backward compatibility, it is recommended to use the \_SubscriptionId column to perform cross correlation since it will be more consistent.
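To see why the standard column is more efficient, here is a sketch of the string parsing that extracting the subscription ID from a resource ID URL would otherwise require. The resource ID below is hypothetical, and this is illustrative Python, not part of any Azure tooling:

```python
# Sketch only: manual extraction of the subscription ID from an Azure
# resource ID URL, which the _SubscriptionId column saves you from doing.
resource_id = ("/subscriptions/aaaa1111-bb22-cc33-dd44-eeeeee555555"
               "/resourceGroups/my-rg/providers/Microsoft.Compute"
               "/virtualMachines/my-vm")  # hypothetical resource ID

# Resource IDs are '/'-delimited; the ID follows the 'subscriptions' segment.
parts = resource_id.strip("/").split("/")
subscription_id = parts[parts.index("subscriptions") + 1]
```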
azure-monitor Move Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/move-workspace.md
# Move a Log Analytics workspace to a different subscription or resource group
-In this article, you'll learn the steps to move a Log Analytics workspace to another resource group or subscription in the same region.
+In this article, you'll learn the steps to move a Log Analytics workspace to another resource group or subscription in the same region. To move a workspace across regions, see [Move a Log Analytics workspace to another region](./move-workspace-region.md).
> [!TIP] > To learn more about how to move Azure resources through the Azure portal, PowerShell, the Azure CLI, or the REST API, see [Move resources to a new resource group or subscription](../../azure-resource-manager/management/move-resource-group-and-subscription.md).
In this article, you'll learn the steps to move a Log Analytics workspace to ano
## Prerequisites - The subscription or resource group where you want to move your Log Analytics workspace must be located in the same region as the Log Analytics workspace you're moving.
- > [!NOTE]
- > To move a workspace across regions, see [Move a Log Analytics workspace to another region](./move-workspace-region.md).
- The move operation requires that no services can be linked to the workspace. Prior to the move, delete solutions that rely on linked services, including an Azure Automation account. These solutions must be removed before you can unlink your Automation account. Data collection for the solutions will stop and their tables will be removed from the UI, but data will remain in the workspace per the table retention period. When you add solutions after the move, ingestion is restored and tables become visible with data. Linked services include: - Update management - Change tracking
azure-netapp-files Backup Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/backup-introduction.md
Azure NetApp Files backup is supported for the following regions:
* Brazil South * Canada Central * Canada East
+* Central US
* East Asia * East US * East US 2
Azure NetApp Files backup is supported for the following regions:
* North Central US * North Europe * Norway East
+* Norway West
* Qatar Central * South Africa North * South Central US
azure-resource-manager User Defined Data Types https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/user-defined-data-types.md
param serviceConfig ServiceConfig = { type: 'bar', value: true }
output config object = serviceConfig ```
-The parameter value is validated based on the discriminated property value. In the preceeding example, if the *serviceConfig* parameter value is of type *foo*, it undersoes validation using the *FooConfig*type. Likewise, if the parameter value is of type *bar*, validation is performed usin the *BarConfig* type, and this pattern continues for other types as well.
+The parameter value is validated based on the discriminated property value. In the preceding example, if the *serviceConfig* parameter value is of type *foo*, it undergoes validation using the *FooConfig* type. Likewise, if the parameter value is of type *bar*, validation is performed using the *BarConfig* type, and this pattern continues for other types as well.
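Bicep itself performs this validation, but the discriminator mechanics can be sketched in Python. The `FooConfig`/`BarConfig` stand-ins below are modeled on the example, not real Bicep tooling; the `type` property picks which schema applies:

```python
# Conceptual sketch of discriminated-union validation (not Bicep itself).
# The 'type' property is the discriminator that selects the schema.
schemas = {
    "foo": {"value": str},   # stands in for FooConfig
    "bar": {"value": bool},  # stands in for BarConfig
}

def validate(service_config: dict) -> bool:
    schema = schemas[service_config["type"]]  # discriminator picks the schema
    return all(isinstance(service_config.get(k), t) for k, t in schema.items())

assert validate({"type": "bar", "value": True})        # matches BarConfig
assert not validate({"type": "bar", "value": "oops"})  # wrong value type
```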
## Import types between Bicep files (Preview)
azure-vmware Migrate Sql Server Always On Availability Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/migrate-sql-server-always-on-availability-group.md
For any of the following scenarios, ExpressRoute connectivity is recommended for
- Production environments - Workloads with large database sizes-- Any case where there is a need to minimize downtime for migration the ExpressRoute connectivity is recommended for the migration.
+- Any case where there is a need to minimize downtime
Further downtime considerations are discussed in the next section.
The following table indicates the estimated downtime for migration of each SQL S
| **Scenario** | **Downtime expected** | **Notes** | |:|:--|:--|
-| **Standalone instance** | Low | Migration is done using VMware vMotion, the database is available during migration time, but it isn't recommended to commit any critical data during it. |
-| **Always On SQL Server Availability Group** | Low | The primary replica will always be available during the migration of the first secondary replica and the secondary replica will become the primary after the initial failover to Azure. |
-| **Always On SQL Server Failover Cluster Instance** | High | All nodes of the cluster are shutdown and migrated using VMware HCX Cold Migration. Downtime duration depends upon database size and private network speed to Azure cloud. |
+| **SQL Server standalone instance** | Low | Migration is done using VMware vMotion; the database is available during the migration, but it isn't recommended to commit any critical data during it. |
+| **SQL Server Always On Availability Group** | Low | The primary replica is always available during the migration of the first secondary replica, and the secondary replica becomes the primary after the initial failover to Azure. |
+| **SQL Server Always On Failover Cluster Instance** | High | All nodes of the cluster are shut down and migrated using VMware HCX Cold Migration. Downtime duration depends on database size and private network speed to the Azure cloud. |
## Windows Server Failover Cluster quorum considerations
azure-vmware Migrate Sql Server Failover Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/migrate-sql-server-failover-cluster.md
For any of the following scenarios, ExpressRoute connectivity is recommended for
- Production environments - Workloads with large database sizes-- Any case where there is a need to minimize downtime for migration the ExpressRoute connectivity is recommended for the migration.
+- Any case where there is a need to minimize downtime
Further downtime considerations are discussed in the next section.
The following table indicates the estimated downtime for migration of each SQL S
| **Scenario** | **Downtime expected** | **Notes** | |:|:--|:--|
-| **Standalone instance** | Low | Migration is done using VMware vMotion, the database is available during migration time, but it isn't recommended to commit any critical data during it. |
-| **Always On SQL Server Availability Group** | Low | The primary replica will always be available during the migration of the first secondary replica and the secondary replica will become the primary after the initial failover to Azure. |
-| **Always On SQL Server Failover Cluster Instance** | High | All nodes of the cluster are shutdown and migrated using VMware HCX Cold Migration. Downtime duration depends upon database size and private network speed to Azure cloud. |
+| **SQL Server standalone instance** | Low | Migration is done using VMware vMotion; the database is available during the migration, but it isn't recommended to commit any critical data during it. |
+| **SQL Server Always On Availability Group** | Low | The primary replica is always available during the migration of the first secondary replica, and the secondary replica becomes the primary after the initial failover to Azure. |
+| **SQL Server Always On Failover Cluster Instance** | High | All nodes of the cluster are shut down and migrated using VMware HCX Cold Migration. Downtime duration depends on database size and private network speed to the Azure cloud. |
## Windows Server Failover Cluster quorum considerations
azure-vmware Migrate Sql Server Standalone Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/migrate-sql-server-standalone-cluster.md
For any of the following scenarios, ExpressRoute connectivity is recommended for
- Production environments - Workloads with large database sizes-- Any case where there is a need to minimize downtime for migration the ExpressRoute connectivity is recommended for the migration.
+- Any case where there is a need to minimize downtime
Further downtime considerations are discussed in the next section.
The following table indicates the estimated downtime for migration of each SQL S
| **Scenario** | **Downtime expected** | **Notes** | |:|:--|:--|
-| **Standalone instance** | Low | Migration is done using VMware vMotion, the database is available during migration time, but it isn't recommended to commit any critical data during it. |
-| **Always On SQL Server Availability Group** | Low | The primary replica will always be available during the migration of the first secondary replica and the secondary replica will become the primary after the initial failover to Azure. |
-| **Always On SQL Server Failover Cluster Instance** | High | All nodes of the cluster are shutdown and migrated using VMware HCX Cold Migration. Downtime duration depends upon database size and private network speed to Azure cloud. |
+| **SQL Server standalone instance** | Low | Migration is done using VMware vMotion; the database is available during the migration, but it isn't recommended to commit any critical data during it. |
+| **SQL Server Always On Availability Group** | Low | The primary replica is always available during the migration of the first secondary replica, and the secondary replica becomes the primary after the initial failover to Azure. |
+| **SQL Server Always On Failover Cluster Instance** | High | All nodes of the cluster are shut down and migrated using VMware HCX Cold Migration. Downtime duration depends on database size and private network speed to the Azure cloud. |
## Executing the migration
batch Simplified Node Communication Pool No Public Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/batch/simplified-node-communication-pool-no-public-ip.md
In a pool without public IP addresses, your virtual machines won't be able to ac
Another way to provide outbound connectivity is to use a user-defined route (UDR). This method lets you route traffic to a proxy machine that has public internet access, for example [Azure Firewall](../firewall/overview.md). > [!IMPORTANT]
-> There is no extra network resource (load balancer, network security group) created for simplified node communication pools without public IP addresses. Since the compute nodes in the pool are not bound to any load balancer, Azure may provide [Default Outbound Access](../virtual-network/ip-services/default-outbound-access.md). However, Default Outbound Access is not suitable for production workloads, so it is strongly recommended to bring your own Internet outbound access.
+> There is no extra network resource (load balancer, network security group) created for simplified node communication pools without public IP addresses. Since the compute nodes in the pool are not bound to any load balancer, Azure may provide [Default Outbound Access](../virtual-network/ip-services/default-outbound-access.md). However, Default Outbound Access is not suitable for production workloads, and will be retired on September 30, 2025 (see the [official announcement](https://azure.microsoft.com/updates/default-outbound-access-for-vms-in-azure-will-be-retired-transition-to-a-new-method-of-internet-access/)). So if your workloads do require internet outbound access, or your pool doesn't use private endpoint to access Batch node management endpoint, you must provide your own solution to enable internet outbound access.
## Troubleshooting
communication-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/service-limits.md
Title: Service limits for Azure Communication Services
description: Learn how to -+
This sandbox setup is designed to help developers begin building the application
|Get chat message|per Chat thread|250|-| |List chat messages|per User per chat thread|50|200| |List chat messages|per Chat thread|250|400|
-|Get read receipts|per User per chat thread|5|-|
-|Get read receipts|per Chat thread|100|-|
+|Get read receipts (20 participant limit**) |per User per chat thread|5|-|
+|Get read receipts (20 participant limit**) |per Chat thread|100|-|
|List chat thread participants|per User per chat thread|10|-| |List chat thread participants|per Chat thread|250|-| |Send message / update message / delete message|per Chat thread|10|30|
This sandbox setup is designed to help developers begin building the application
|Send typing indicator|per Chat thread|10|30| > [!NOTE]
-> Read receipts and typing indicators are not supported on chat threads with more than 20 participants.
+> ** Read receipts and typing indicators are not supported on chat threads with more than 20 participants.
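On the client side, per-thread limits like those in the table are typically respected with a sliding-window throttle. The following is a minimal, hypothetical sketch, not part of the Communication Services SDK; the 10-second window is an assumption for illustration and isn't taken from the limits table:

```python
# Hypothetical client-side throttle for a per-chat-thread limit such as
# "send message: 10 requests" per window. Window length is assumed.
class ThreadRateLimiter:
    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self.timestamps: list[float] = []

    def allow(self, now: float) -> bool:
        # Drop timestamps that fell out of the window, then check capacity.
        self.timestamps = [t for t in self.timestamps if now - t < self.window]
        if len(self.timestamps) < self.max_requests:
            self.timestamps.append(now)
            return True
        return False

limiter = ThreadRateLimiter(max_requests=10, window_seconds=10.0)
allowed = [limiter.allow(now=float(i) * 0.1) for i in range(12)]
# First 10 calls within the window are allowed; the next 2 are rejected.
```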
### Chat storage Azure Communication Services stores chat messages indefinitely until they are deleted by the customer.
communication-services Sms Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/sms/sms-faq.md
Once you have submitted the short code program brief application in the Azure po
- Lower case letters: a - z - Numbers: 0-9 - Spaces
- - Special characters: *+*, *-*, _ , &
### Is a number purchase required to use alphanumeric sender ID? The use of alphanumeric sender ID does not require purchase of any phone number. Alphanumeric sender ID can be enabled through the Azure portal. See [enable alphanumeric sender ID quickstart](../../quickstarts/sms/enable-alphanumeric-sender-id.md) for instructions.
container-apps Service Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/service-connector.md
# Connect a container app to a cloud service with Service Connector
-Azure Container Apps allows you to use Service Connector to connect to cloud services in just a few steps. Service Connector manages the configuration of the network settings and connection information between different services. To view all supported services, [learn more about Service Connector](../service-connector/overview.md#what-services-are-supported-in-service-connector).
+Azure Container Apps allows you to use Service Connector to connect to cloud services in just a few steps. Service Connector manages the configuration of the network settings and connection information between different services. To view all supported services, [learn more about Service Connector](../service-connector/overview.md#what-services-are-supported-by-service-connector).
In this article, you learn to connect a container app to Azure Blob Storage.
cosmos-db Configure Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/configure-synapse-link.md
The first step to use Synapse Link is to enable it for your Azure Cosmos DB data
1. [Create a new Azure account](create-sql-api-dotnet.md#create-account), or select an existing Azure Cosmos DB account.
-1. Navigate to your Azure Cosmos DB account and open the **Azure Synapse Link** under Intergrations in the left pane.
+1. Navigate to your Azure Cosmos DB account and open the **Azure Synapse Link** under Integrations in the left pane.
1. Select **Enable**. This process can take 1 to 5 minutes to complete.
Please note the following details when enabling Azure Synapse Link on your exist
* You won't be able to query analytical store of an existing container while Synapse Link is being enabled on that container. Your OLTP workload isn't impacted and you can keep on reading data normally. Data ingested after the start of the initial sync will be merged into analytical store by the regular analytical store auto-sync process. > [!NOTE]
-> Currently you can't enable Synapse Link on your existing MongoDB API containers. Synapse Link can be enabled on newly created Mongo DB containers.
+> Now you can enable Synapse Link on your existing MongoDB API collections by using Azure CLI or PowerShell.
### Azure portal
Please note the following details when enabling Azure Synapse Link on your exist
The following options enable Synapse Link in a container by using Azure CLI by setting the `--analytical-storage-ttl` property.
-* [Create an Azure Cosmos DB MongoDB collection](/cli/azure/cosmosdb/mongodb/collection#az-cosmosdb-mongodb-collection-create-examples)
+* [Create or update an Azure Cosmos DB MongoDB collection](/cli/azure/cosmosdb/mongodb/collection#az-cosmosdb-mongodb-collection-create-examples)
* [Create or update an Azure Cosmos DB SQL API container](/cli/azure/cosmosdb/sql/container#az-cosmosdb-sql-container-create) ##### Use Azure CLI to enable Synapse Link for Azure Synapse Link for Gremlin API Graphs
For existing graphs, replace `create` with `update`.
The following options enable Synapse Link in a container by using Azure CLI by setting the `-AnalyticalStorageTtl` property.
-* [Create an Azure Cosmos DB MongoDB collection](/powershell/module/az.cosmosdb/new-azcosmosdbmongodbcollection#description)
+* [Create or update an Azure Cosmos DB MongoDB collection](/powershell/module/az.cosmosdb/new-azcosmosdbmongodbcollection#description)
* [Create or update an Azure Cosmos DB SQL API container](/powershell/module/az.cosmosdb/new-azcosmosdbsqlcontainer)
cosmos-db Continuous Backup Restore Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/continuous-backup-restore-introduction.md
Currently, you can restore an Azure Cosmos DB account (API for NoSQL or MongoDB,
By default, Azure Cosmos DB stores continuous mode backup data in locally redundant storage blobs. For the regions that have zone redundancy configured, the backup is stored in zone-redundant storage blobs. In continuous backup mode, you can't update the backup storage redundancy. ## Different ways to restore
-Continuous backup mode supports two ways to restore deleted containers and databases. They can be restored into a [new account](restore-account-continuous-backup.md) as documented here or can be restored into an existing account as described [here](restore-account-continuous-backup.md). The choice between these two depends on the scenarios and impact. In most cases it is preferred to restore deleted containers and databases into an existing account to prevent the cost of data transfer which is required in the case they are restored to a new account. For scenarios where you have modified the data accidentally restore into new account could be the prefered option.
+Continuous backup mode supports two ways to restore deleted containers and databases: into a [new account](restore-account-continuous-backup.md) or into an [existing account](restore-account-continuous-backup.md). The choice between the two depends on the scenario and impact. In most cases it's preferable to restore deleted containers and databases into an existing account, to avoid the data transfer cost incurred when restoring to a new account. For scenarios where you accidentally modified the data, restoring into a new account could be the preferred option.
## What is restored into a new account?
Currently the point in time restore functionality has the following limitations:
* Multi-regions write accounts aren't supported.
-* Currently Azure Synapse Link isn't fully compatible with continuous backup mode. For more information about backup with analytical store, see [analytical store backup](analytical-store-introduction.md#backup).
+* Currently Azure Synapse Link can be enabled, in preview, on continuous backup database accounts. The opposite isn't supported yet: it isn't possible to turn on continuous backup in Synapse Link enabled database accounts. Analytical store isn't included in backups. For more information about backup and analytical store, see [analytical store backup](analytical-store-introduction.md#backup).
* The restored account is created in the same region where your source account exists. You can't restore an account into a region where the source account didn't exist.
cosmos-db Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/synapse-link.md
Azure Synapse Link isn't recommended if you're looking for traditional data ware
* Accessing the Azure Cosmos DB analytics store with Azure Synapse Dedicated SQL Pool currently isn't supported.
-* Enabling Azure Synapse Link on existing Azure Cosmos DB containers is only supported for API for NoSQL accounts. Azure Synapse Link can be enabled on new containers for both API for NoSQL and MongoDB accounts.
- * Although analytical store data isn't backed up, and therefore can't be restored, you can rebuild your analytical store by reenabling Azure Synapse Link in the restored container. Check the [analytical store documentation](analytical-store-introduction.md) for more information.
-* Currently Azure Synapse Link isn't fully compatible with continuous backup mode. Check the [analytical store documentation](analytical-store-introduction.md) for more information.
+* The capability to turn on Synapse Link in database accounts with continuous backup enabled is now in preview. The opposite, turning on continuous backup in Synapse Link enabled database accounts, isn't supported yet.
* Granular role-based access control isn't supported when querying from Synapse. Users that have access to your Synapse workspace and have access to the Azure Cosmos DB account can access all containers within that account. We currently don't support more granular access to the containers.
cosmos-db Vector Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/vector-search.md
Implement RAG-patterns with Azure Cosmos DB for NoSQL and Azure Cognitive Search
### Code samples -- [.NET retail chatbot reference solution](https://github.com/Azure/Vector-Search-AI-Assistant/tree/cognitive-search-vector)-- [.NET samples - Hackathon project](https://github.com/AzureCosmosDB/OpenAIHackathon)
+- [.NET RAG Pattern retail reference solution](https://github.com/Azure/Vector-Search-AI-Assistant-MongoDBvCore)
+- [.NET samples - Hackathon project](https://github.com/Azure/Build-Modern-AI-Apps-Hackathon)
- [.NET tutorial - recipe chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/C%23/CosmosDB-NoSQL_CognitiveSearch) - [.NET tutorial - recipe chatbot w/ Semantic Kernel](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/C%23/CosmosDB-NoSQL_CognitiveSearch_SemanticKernel) - [Python notebook tutorial - Azure product chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/Python/CosmosDB-NoSQL_CognitiveSearch)
RAG can be applied using the native vector search feature in Azure Cosmos DB for
### Code samples -- [.NET retail chatbot sample](https://github.com/Azure/Vector-Search-AI-Assistant/tree/mongovcorev2)
+- [.NET RAG Pattern retail reference solution](https://github.com/Azure/Vector-Search-AI-Assistant-MongoDBvCore)
- [.NET tutorial - recipe chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/C%23/CosmosDB-MongoDBvCore) - [Python notebook tutorial - Azure product chatbot](https://github.com/microsoft/AzureDataRetrievalAugmentedGenerationSamples/tree/main/Python/CosmosDB-MongoDB-vCore)
You can employ RAG by utilizing native vector search within Azure Cosmos DB for
+
cost-management-billing Pay By Invoice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/pay-by-invoice.md
If you signed up for Azure through a Microsoft representative, then your default
When you switch to pay by wire transfer, you must pay your bill within 30 days of the invoice date by wire transfer.
-Users with a Microsoft Customer Agreement must always submit a request [Submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer) to Azure support to enable pay by wire transfer.
+Users with a Microsoft Customer Agreement must always [submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer) to Azure support to enable pay by wire transfer.
-Customers who have a Microsoft Online Services Program (pay-as-you-go) account can use the Azure portal to [Request to pay by wire transfer](#request-to-pay-by-wire-transfer).
+Customers who have a Microsoft Online Services Program (pay-as-you-go) account can use the Azure portal to [request to pay by wire transfer](#request-to-pay-by-wire-transfer).
> [!IMPORTANT] > * Pay by wire transfer is only available for customers using Azure on behalf of a company.
Customers who have a Microsoft Online Services Program (pay-as-you-go) account c
## Request to pay by wire transfer > [!NOTE]
-> Currently only customers in the United States can get automatically approved to change their payment method to wire transfer. Support for other regions is being evaluated. If you are not in the United States, you must [Submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer) to change your payment method.
+> Currently only customers in the United States can get automatically approved to change their payment method to wire transfer. Support for other regions is being evaluated. If you are not in the United States, you must [submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer) to change your payment method.
1. Sign in to the Azure portal.
1. Navigate to **Subscriptions** and then select the one that you want to set up wire transfer for.
1. On the **Pay by wire transfer** page, you see a message stating that you can request to use wire transfer instead of automatic payment using a credit or debit card. Select **Continue** to start the check.
1. Depending on your approval status:
    - If you're automatically approved, the page shows a message stating that you've been approved to pay by wire transfer. Enter your **Company name** and then select **Save**.
- - If the request couldn't be processed or if you're not approved, you need to follow the steps in the next [Submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer) section.
+ - If the request couldn't be processed or if you're not approved, you need to follow the steps in the next section [Submit a request to set up pay by wire transfer](#submit-a-request-to-set-up-pay-by-wire-transfer).
1. If you've been approved, on the Payment methods page under **Other payment methods**, to the right of **Wire transfer**, select the ellipsis (**...**) symbol and then select **Make default**. You're all set to pay by wire transfer.
On the Payment methods page, select **Pay by wire transfer**.
### Switch billing profile to wire transfer
-Using the following steps to switch a billing profile to wire transfer. Only the person who signed up for Azure can change the default payment method of a billing profile.
+Use the following steps to switch a billing profile to wire transfer. Only the person who signed up for Azure can change the default payment method of a billing profile.
1. Go to the Azure portal to view your billing information. Search for and select **Cost Management + Billing**.
1. In the menu, choose **Billing profiles**.
data-factory Compare Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/compare-versions.md
- Title: Compare Azure Data Factory with Data Factory version 1
-description: This article compares Azure Data Factory with Azure Data Factory version 1.
----- Previously updated : 04/12/2023-
-# Compare Azure Data Factory with Data Factory version 1
--
-This article compares Data Factory with Data Factory version 1. For an introduction to Data Factory, see [Introduction to Data Factory](introduction.md). For an introduction to Data Factory version 1, see [Introduction to Azure Data Factory](v1/data-factory-introduction.md).
-
-## Feature comparison
-The following table compares the features of Data Factory with the features of Data Factory version 1.
-
-| Feature | Version 1 | Current version |
-| - | | |
-| Datasets | A named view of data that references the data that you want to use in your activities as inputs and outputs. Datasets identify data within different data stores, such as tables, files, folders, and documents. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob storage from which the activity should read the data.<br/><br/>**Availability** defines the processing window slicing model for the dataset (for example, hourly, daily, and so on). | Datasets are the same in the current version. However, you do not need to define **availability** schedules for datasets. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. For more information, see [Triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json) and [Datasets](concepts-datasets-linked-services.md). |
-| Linked services | Linked services are much like connection strings, which define the connection information that's necessary for Data Factory to connect to external resources. | Linked services are the same as in Data Factory V1, but with a new **connectVia** property to utilize the Integration Runtime compute environment of the current version of Data Factory. For more information, see [Integration runtime in Azure Data Factory](concepts-integration-runtime.md) and [Linked service properties for Azure Blob storage](connector-azure-blob-storage.md#linked-service-properties). |
-| Pipelines | A data factory can have one or more pipelines. A pipeline is a logical grouping of activities that together perform a task. You use startTime, endTime, and isPaused to schedule and run pipelines. | Pipelines are groups of activities that are performed on data. However, the scheduling of activities in the pipeline has been separated into new trigger resources. You can think of pipelines in the current version of Data Factory more as "workflow units" that you schedule separately via triggers. <br/><br/>Pipelines do not have "windows" of time execution in the current version of Data Factory. The Data Factory V1 concepts of startTime, endTime, and isPaused are no longer present in the current version of Data Factory. For more information, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md) and [Pipelines and activities](concepts-pipelines-activities.md). |
-| Activities | Activities define actions to perform on your data within a pipeline. Data movement (copy activity) and data transformation activities (such as Hive, Pig, and MapReduce) are supported. | In the current version of Data Factory, activities still are defined actions within a pipeline. The current version of Data Factory introduces new [control flow activities](concepts-pipelines-activities.md#control-flow-activities). You use these activities in a control flow (looping and branching). Data movement and data transformation activities that were supported in V1 are supported in the current version. You can define transformation activities without using datasets in the current version. |
-| Hybrid data movement and activity dispatch | Now called Integration Runtime, [Data Management Gateway](v1/data-factory-data-management-gateway.md) supported moving data between on-premises and cloud.| Data Management Gateway is now called Self-Hosted Integration Runtime. It provides the same capability as it did in V1. <br/><br/> The Azure-SSIS Integration Runtime in the current version of Data Factory also supports deploying and running SQL Server Integration Services (SSIS) packages in the cloud. For more information, see [Integration runtime in Azure Data Factory](concepts-integration-runtime.md).|
-| Parameters | NA | Parameters are key-value pairs of read-only configuration settings that are defined in pipelines. You can pass arguments for the parameters when you are manually running the pipeline. If you are using a scheduler trigger, the trigger can pass values for the parameters too. Activities within the pipeline consume the parameter values. |
-| Expressions | Data Factory V1 allows you to use functions and system variables in data selection queries and activity/dataset properties. | In the current version of Data Factory, you can use expressions anywhere in a JSON string value. For more information, see [Expressions and functions in the current version of Data Factory](control-flow-expression-language-functions.md).|
-| Pipeline runs | NA | A single instance of a pipeline execution. For example, say you have a pipeline that executes at 8 AM, 9 AM, and 10 AM. There would be three separate runs of the pipeline (pipeline runs) in this case. Each pipeline run has a unique pipeline run ID. The pipeline run ID is a GUID that uniquely defines that particular pipeline run. Pipeline runs are typically instantiated by passing arguments to parameters that are defined in the pipelines. |
-| Activity runs | NA | An instance of an activity execution within a pipeline. |
-| Trigger runs | NA | An instance of a trigger execution. For more information, see [Triggers](concepts-pipeline-execution-triggers.md). |
-| Scheduling | Scheduling is based on pipeline start/end times and dataset availability. | Scheduler trigger or execution via external scheduler. For more information, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md). |
-
-The following sections provide more information about the capabilities of the current version.
-
-## Control flow
-To support diverse integration flows and patterns in the modern data warehouse, the current version of Data Factory has enabled a new flexible data pipeline model that is no longer tied to time-series data. A few common flows that were previously not possible are now enabled. They are described in the following sections.
-
-### Chaining activities
-In V1, you had to configure the output of an activity as an input of another activity to chain them. In the current version, you can chain activities in a sequence within a pipeline. You can use the **dependsOn** property in an activity definition to chain it with an upstream activity. For more information and an example, see [Pipelines and activities](concepts-pipelines-activities.md#multiple-activities-in-a-pipeline) and [Branching and chaining activities](tutorial-control-flow.md).
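The **dependsOn** chaining described above can be sketched as a pipeline JSON fragment. The pipeline and activity names here are hypothetical, and the `typeProperties` bodies are elided for brevity:

```json
{
  "name": "HypotheticalPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyRawData",
        "type": "Copy",
        "typeProperties": {}
      },
      {
        "name": "TransformData",
        "type": "HDInsightHive",
        "dependsOn": [
          {
            "activity": "CopyRawData",
            "dependencyConditions": [ "Succeeded" ]
          }
        ],
        "typeProperties": {}
      }
    ]
  }
}
```

Here `TransformData` runs only after `CopyRawData` completes with the `Succeeded` condition, without the V1 requirement of wiring one activity's output dataset into the next activity's input.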
-
-### Branching activities
-In the current version, you can branch activities within a pipeline. The [If-condition activity](control-flow-if-condition-activity.md) provides the same functionality that an `if` statement provides in programming languages. It evaluates a set of activities when the condition evaluates to `true` and another set of activities when the condition evaluates to `false`. For examples of branching activities, see the [Branching and chaining activities](tutorial-control-flow.md) tutorial.
-
-### Parameters
-You can define parameters at the pipeline level and pass arguments while you're invoking the pipeline on-demand or from a trigger. Activities can consume the arguments that are passed to the pipeline. For more information, see [Pipelines and triggers](concepts-pipeline-execution-triggers.md).
-
-### Custom state passing
-Activity outputs including state can be consumed by a subsequent activity in the pipeline. For example, in the JSON definition of an activity, you can access the output of the previous activity by using the following syntax: `@activity('NameofPreviousActivity').output.value`. By using this feature, you can build workflows where values can pass through activities.
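The expression syntax above can be used anywhere a JSON string value is accepted. As a minimal sketch, a downstream Web activity might post the output of a previous activity; the activity names and URL are hypothetical:

```json
{
  "name": "NotifyDownstream",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://example.contoso.com/notify",
    "method": "POST",
    "body": "@activity('LookupState').output.value"
  }
}
```

At run time, the `@activity('LookupState').output.value` expression resolves to the prior activity's output, so state flows from one activity to the next without an intermediate store.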
-
-### Looping containers
-The [ForEach activity](control-flow-for-each-activity.md) defines a repeating control flow in your pipeline. This activity iterates over a collection and runs specified activities in a loop. The loop implementation of this activity is similar to the Foreach looping structure in programming languages.
-
-The [Until](control-flow-until-activity.md) activity provides the same functionality that a do-until looping structure provides in programming languages. It runs a set of activities in a loop until the condition that's associated with the activity evaluates to `true`. You can specify a timeout value for the until activity in Data Factory.
-
-### Trigger-based flows
-Pipelines can be triggered on-demand, by an event (for example, a blob being posted), or by wall-clock time. The [pipelines and triggers](concepts-pipeline-execution-triggers.md) article has detailed information about triggers.
-
-### Invoking a pipeline from another pipeline
-The [Execute Pipeline activity](control-flow-execute-pipeline-activity.md) allows a Data Factory pipeline to invoke another pipeline.
-
-### Delta flows
-A key use case in ETL patterns is "delta loads," in which only data that has changed since the last iteration of a pipeline is loaded. New capabilities in the current version, such as [lookup activity](control-flow-lookup-activity.md), flexible scheduling, and control flow, enable this use case in a natural way. For a tutorial with step-by-step instructions, see [Tutorial: Incremental copy](tutorial-incremental-copy-powershell.md).
-
-### Other control flow activities
-Following are a few more control flow activities that are supported by the current version of Data Factory.
-
-Control activity | Description
-- | --
-[ForEach activity](control-flow-for-each-activity.md) | Defines a repeating control flow in your pipeline. This activity is used to iterate over a collection and runs specified activities in a loop. The loop implementation of this activity is similar to Foreach looping structure in programming languages.
-[Web activity](control-flow-web-activity.md) | Calls a custom REST endpoint from a Data Factory pipeline. You can pass datasets and linked services to be consumed and accessed by the activity.
-[Lookup activity](control-flow-lookup-activity.md) | Reads or looks up a record or table name value from any external source. This output can further be referenced by succeeding activities.
-[Get metadata activity](control-flow-get-metadata-activity.md) | Retrieves the metadata of any data in Azure Data Factory.
-[Wait activity](control-flow-wait-activity.md) | Pauses the pipeline for a specified period of time.
-
-## Deploy SSIS packages to Azure
-To move your SSIS workloads to the cloud, create a data factory by using the current version and provision an Azure-SSIS Integration Runtime.
-
-The Azure-SSIS Integration Runtime is a fully managed cluster of Azure VMs (nodes) that are dedicated to running your SSIS packages in the cloud. After you provision Azure-SSIS Integration Runtime, you can use the same tools that you have been using to deploy SSIS packages to an on-premises SSIS environment.
-
-For example, you can use SQL Server Data Tools or SQL Server Management Studio to deploy SSIS packages to this runtime on Azure. For step-by-step instructions, see the tutorial [Deploy SQL Server integration services packages to Azure](./tutorial-deploy-ssis-packages-azure.md).
-
-## Flexible scheduling
-In the current version of Data Factory, you do not need to define dataset availability schedules. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. You can also pass parameters to pipelines from a trigger for a flexible scheduling and execution model.
-
-Pipelines do not have "windows" of time execution in the current version of Data Factory. The Data Factory V1 concepts of startTime, endTime, and isPaused don't exist in the current version of Data Factory. For more information about how to build and then schedule a pipeline in the current version of Data Factory, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md).
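The trigger-based model described above can be sketched as a trigger JSON fragment that both schedules a pipeline and passes it a parameter. The trigger, pipeline, and parameter names are hypothetical:

```json
{
  "name": "HourlyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Hour",
        "interval": 1,
        "startTime": "2023-04-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "type": "PipelineReference",
          "referenceName": "HypotheticalPipeline"
        },
        "parameters": {
          "windowStart": "@trigger().scheduledTime"
        }
      }
    ]
  }
}
```

Because scheduling lives on the trigger rather than on the pipeline or dataset, the same pipeline can be attached to several triggers, each passing different parameter values.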
-
-## Support for more data stores
-The current version supports the copying of data to and from more data stores than V1. For a list of supported data stores, see the following articles:
-
-- [Version 1 - supported data stores](v1/data-factory-data-movement-activities.md#supported-data-stores-and-formats)
-- [Current version - supported data stores](copy-activity-overview.md#supported-data-stores-and-formats)
-
-## Support for on-demand Spark cluster
-The current version supports the creation of an on-demand Azure HDInsight Spark cluster. To create an on-demand Spark cluster, specify the cluster type as Spark in your on-demand, HDInsight linked service definition. Then you can configure the Spark activity in your pipeline to use this linked service.
-
-At runtime, when the activity is executed, the Data Factory service automatically creates the Spark cluster for you. For more information, see the following articles:
-
-- [Spark Activity in the current version of Data Factory](transform-data-using-spark.md)
-- [Azure HDInsight on-demand linked service](compute-linked-services.md#azure-hdinsight-on-demand-linked-service)
-
-## Custom activities
-In V1, you implement (custom) DotNet activity code by creating a .NET class library project with a class that implements the Execute method of the IDotNetActivity interface. Therefore, you need to write your custom code in .NET Framework 4.5.2 and run it on Windows-based Azure Batch Pool nodes.
-
-In a custom activity in the current version, you don't have to implement a .NET interface. You can directly run commands, scripts, and your own custom code compiled as an executable.
-
-For more information, see [Difference between custom activity in Data Factory and version 1](transform-data-using-dotnet-custom-activity.md#compare-v2-v1).
-
-## SDKs
-The current version of Data Factory provides a richer set of SDKs that can be used to author, manage, and monitor pipelines.
-
-- **.NET SDK**: The .NET SDK is updated in the current version.
-
-- **PowerShell**: The PowerShell cmdlets are updated in the current version. The cmdlets for the current version have **DataFactoryV2** in the name, for example: Get-AzDataFactoryV2.
-
-- **Python SDK**: This SDK is new in the current version.
-
-- **REST API**: The REST API is updated in the current version.
-
-The SDKs that are updated in the current version are not backward-compatible with V1 clients.
-
-## Authoring experience
-
-| | Version 2 | Version 1 |
-| | -- | -- |
-| **Azure portal** | [Yes](quickstart-create-data-factory-portal.md) | No |
-| **Azure PowerShell** | [Yes](quickstart-create-data-factory-powershell.md) | [Yes](./v1/data-factory-build-your-first-pipeline-using-powershell.md) |
-| **.NET SDK** | [Yes](quickstart-create-data-factory-dot-net.md) | [Yes](./v1/data-factory-build-your-first-pipeline-using-vs.md) |
-| **REST API** | [Yes](quickstart-create-data-factory-rest-api.md) | [Yes](./v1/data-factory-build-your-first-pipeline-using-rest-api.md) |
-| **Python SDK** | [Yes](quickstart-create-data-factory-python.md) | No |
-| **Resource Manager template** | [Yes](quickstart-create-data-factory-resource-manager-template.md) | [Yes](./v1/data-factory-build-your-first-pipeline-using-arm.md) |
-
-## Roles and permissions
-
-The Data Factory version 1 Contributor role can be used to create and manage the current version of Data Factory resources. For more info, see [Data Factory Contributor](../role-based-access-control/built-in-roles.md#data-factory-contributor).
-
-## Monitoring experience
-In the current version, you can also monitor data factories by using [Azure Monitor](monitor-using-azure-monitor.md). The new PowerShell cmdlets support monitoring of [integration runtimes](monitor-integration-runtime.md). Both V1 and V2 support visual monitoring via a monitoring application that can be launched from the Azure portal.
--
-## Next steps
-Learn how to create a data factory by following step-by-step instructions in the following quickstarts: [PowerShell](quickstart-create-data-factory-powershell.md), [.NET](quickstart-create-data-factory-dot-net.md), [Python](quickstart-create-data-factory-python.md), [REST API](quickstart-create-data-factory-rest-api.md).
data-factory Concepts Datasets Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/concepts-datasets-linked-services.md
Last updated 02/08/2023
# Datasets in Azure Data Factory and Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-create-datasets.md)
-> * [Current version](concepts-datasets-linked-services.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Concepts Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/concepts-linked-services.md
Last updated 10/25/2022
# Linked services in Azure Data Factory and Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
-> * [Version 1](v1/data-factory-create-datasets.md)
-> * [Current version](concepts-linked-services.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article describes what linked services are, how they're defined in JSON format, and how they're used in Azure Data Factory and Azure Synapse Analytics.
data-factory Concepts Pipeline Execution Triggers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/concepts-pipeline-execution-triggers.md
# Pipeline execution and triggers in Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of the Data Factory service that you're using:"]
-> * [Version 1](v1/data-factory-scheduling-and-execution.md)
-> * [Current version](concepts-pipeline-execution-triggers.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] A _pipeline run_ in Azure Data Factory and Azure Synapse defines an instance of a pipeline execution. For example, say you have a pipeline that executes at 8:00 AM, 9:00 AM, and 10:00 AM. In this case, there are three separate runs of the pipeline or pipeline runs. Each pipeline run has a unique pipeline run ID. A run ID is a GUID that uniquely defines that particular pipeline run.
data-factory Concepts Pipelines Activities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/concepts-pipelines-activities.md
Last updated 10/24/2022
# Pipelines and activities in Azure Data Factory and Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
-> * [Version 1](v1/data-factory-create-pipelines.md)
-> * [Current version](concepts-pipelines-activities.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] [!INCLUDE[ML Studio (classic) retirement](../../includes/machine-learning-studio-classic-deprecation.md)]
data-factory Connector Amazon Redshift https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-amazon-redshift.md
Last updated 07/13/2023
# Copy data from Amazon Redshift using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-amazon-redshift-connector.md)
-> * [Current version](connector-amazon-redshift.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Amazon Simple Storage Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-amazon-simple-storage-service.md
Last updated 06/05/2023
# Copy and transform data in Amazon Simple Storage Service using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
->
-> * [Version 1](v1/data-factory-amazon-simple-storage-service-connector.md)
-> * [Current version](connector-amazon-simple-storage-service.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Azure Blob Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-blob-storage.md
Last updated 09/29/2023
# Copy and transform data in Azure Blob Storage by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
-> - [Version 1](v1/data-factory-azure-blob-connector.md)
-> - [Current version](connector-azure-blob-storage.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Azure Blob Storage. It also describes how to use the Data Flow activity to transform data in Azure Blob Storage. To learn more, read the [Azure Data Factory](introduction.md) and the [Azure Synapse Analytics](..\synapse-analytics\overview-what-is.md) introduction articles.
data-factory Connector Azure Cosmos Analytical Store https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-cosmos-analytical-store.md
Last updated 03/31/2023
# Copy and transform data in Azure Cosmos DB analytical store by using Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Current version](connector-azure-cosmos-analytical-store.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Data Flow to transform data in Azure Cosmos DB analytical store. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
When transforming data in mapping data flow, you can read and write to collectio
Settings specific to Azure Cosmos DB are available in the **Source Options** tab of the source transformation.
-**Include system columns:** If true, ```id```, ```_ts```, and other system columns will be included in your data flow metadata from Azure Cosmos DB. When updating collections, it is important to include this so that you can grab the existing row ID.
+**Include system columns:** If true, ```id```, ```_ts```, and other system columns are included in your data flow metadata from Azure Cosmos DB. When updating collections, it's important to include this so that you can grab the existing row ID.
**Page size:** The number of documents per page of the query result. Default is "-1", which uses the service's dynamic page size, up to 1000.
Settings specific to Azure Cosmos DB are available in the **Source Options** tab
**Preferred regions:** Choose the preferred read regions for this process.
-**Change feed:** If true, you will get data from [Azure Cosmos DB change feed](../cosmos-db/change-feed.md) which is a persistent record of changes to a container in the order they occur from last run automatically. When you set it true, do not set both **Infer drifted column types** and **Allow schema drift** as true at the same time. For more details, see [Azure Cosmos DB change feed](#azure-cosmos-db-change-feed).
+**Change feed:** If true, you'll get data from [Azure Cosmos DB change feed](../cosmos-db/change-feed.md), which is a persistent record of changes to a container in the order they occur from last run automatically. When you set it true, don't set both **Infer drifted column types** and **Allow schema drift** as true at the same time. For more information, see [Azure Cosmos DB change feed](#azure-cosmos-db-change-feed).
-**Start from beginning:** If true, you will get initial load of full snapshot data in the first run, followed by capturing changed data in next runs. If false, the initial load will be skipped in the first run, followed by capturing changed data in next runs. The setting is aligned with the same setting name in [Azure Cosmos DB reference](https://github.com/Azure/azure-cosmosdb-spark/wiki/Configuration-references#reading-cosmosdb-collection-change-feed). For more details, see [Azure Cosmos DB change feed](#azure-cosmos-db-change-feed).
+**Start from beginning:** If true, you'll get an initial load of the full snapshot data in the first run, followed by capturing changed data in subsequent runs. If false, the initial load is skipped in the first run, and only changed data is captured in subsequent runs. The setting is aligned with the same setting name in [Azure Cosmos DB reference](https://github.com/Azure/azure-cosmosdb-spark/wiki/Configuration-references#reading-cosmosdb-collection-change-feed). For more information, see [Azure Cosmos DB change feed](#azure-cosmos-db-change-feed).
### Sink transformation
Settings specific to Azure Cosmos DB are available in the **Settings** tab of th
**Update method:** Determines what operations are allowed on your database destination. The default is to only allow inserts. To update, upsert, or delete rows, an alter-row transformation is required to tag rows for those actions. For updates, upserts and deletes, a key column or columns must be set to determine which row to alter. **Collection action:** Determines whether to recreate the destination collection prior to writing.
-* None: No action will be done to the collection.
-* Recreate: The collection will get dropped and recreated
+* None: No action is done to the collection.
+* Recreate: The collection gets dropped and recreated
**Batch size**: An integer that represents how many objects are being written to Azure Cosmos DB collection in each batch. Usually, starting with the default batch size is sufficient. To further tune this value, note:
-- Azure Cosmos DB limits single request's size to 2MB. The formula is "Request Size = Single Document Size * Batch Size". If you hit error saying "Request size is too large", reduce the batch size value.
+- Azure Cosmos DB limits single request's size to 2 MB. The formula is "Request Size = Single Document Size * Batch Size". If you hit error saying "Request size is too large", reduce the batch size value.
- The larger the batch size, the better throughput the service can achieve; make sure you allocate enough RUs to support your workload.

**Partition key:** Enter a string that represents the partition key for your collection. Example: ```/movies/title```
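As a worked example of the request-size formula above, a small helper can estimate the largest batch size that keeps a request under the 2 MB limit. This is an illustrative sketch, not part of any Azure SDK; the function name and the 8 KB document size are assumptions:

```python
# Hypothetical helper: given a single document's size in bytes, compute the
# largest batch size whose total request size (document size * batch size)
# stays within Azure Cosmos DB's 2 MB per-request limit.
def max_batch_size(single_doc_bytes: int, request_limit_bytes: int = 2 * 1024 * 1024) -> int:
    if single_doc_bytes <= 0:
        raise ValueError("document size must be positive")
    # Always allow at least one document per request.
    return max(1, request_limit_bytes // single_doc_bytes)

# For ~8 KB documents, up to 256 documents fit in one 2 MB request.
print(max_batch_size(8 * 1024))
```

If this estimate is still too high in practice (for example, because document sizes vary), reduce the batch size further until the "Request size is too large" error stops.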
Settings specific to Azure Cosmos DB are available in the **Settings** tab of th
## Azure Cosmos DB change feed
-Azure Data Factory can get data from [Azure Cosmos DB change feed](../cosmos-db/change-feed.md) by enabling it in the mapping data flow source transformation. With this connector option, you can read change feeds and apply transformations before loading transformed data into destination datasets of your choice. You do not have to use Azure functions to read the change feed and then write custom transformations. You can use this option to move data from one container to another, prepare change feed driven material views for fit purpose or automate container backup or recovery based on change feed, and enable many more such use cases using visual drag and drop capability of Azure Data Factory.
+Azure Data Factory can get data from [Azure Cosmos DB change feed](../cosmos-db/change-feed.md) by enabling it in the mapping data flow source transformation. With this connector option, you can read change feeds and apply transformations before loading transformed data into destination datasets of your choice. You don't have to use Azure functions to read the change feed and then write custom transformations. You can use this option to move data from one container to another, prepare change feed driven material views for fit purpose or automate container backup or recovery based on change feed, and enable many more such use cases using visual drag and drop capability of Azure Data Factory.
Make sure you keep the pipeline and activity names unchanged, so that ADF can record the checkpoint and automatically get changed data from the last run. If you change your pipeline name or activity name, the checkpoint is reset, and in the next run you start from the beginning or get changes from now on.
-When you debug the pipeline, this feature works the same. Be aware that the checkpoint will be reset when you refresh your browser during the debug run. After you are satisfied with the pipeline result from debug run, you can go ahead to publish and trigger the pipeline. At the moment when you first time trigger your published pipeline, it automatically restarts from the beginning or gets changes from now on.
+When you debug the pipeline, this feature works the same. The checkpoint will be reset when you refresh your browser during the debug run. After you're satisfied with the pipeline result from the debug run, you can publish and trigger the pipeline. The first time you trigger your published pipeline, it automatically restarts from the beginning or gets changes from now on.
-In the monitoring section, you always have the chance to rerun a pipeline. When you are doing so, the changed data is always captured from the previous checkpoint of your selected pipeline run.
+In the monitoring section, you always have the chance to rerun a pipeline. When you're doing so, the changed data is always captured from the previous checkpoint of your selected pipeline run.
In addition, Azure Cosmos DB analytical store now supports Change Data Capture (CDC) for Azure Cosmos DB API for NoSQL and Azure Cosmos DB API for MongoDB (public preview). Azure Cosmos DB analytical store lets you efficiently consume a continuous and incremental feed of changed (inserted, updated, and deleted) data from the analytical store.
data-factory Connector Azure Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-cosmos-db.md
Last updated 03/02/2023
# Copy and transform data in Azure Cosmos DB for NoSQL by using Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-azure-documentdb-connector.md)
-> * [Current version](connector-azure-cosmos-db.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Copy Activity in Azure Data Factory to copy data from and to Azure Cosmos DB for NoSQL, and use Data Flow to transform data in Azure Cosmos DB for NoSQL. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Azure Data Lake Store https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-data-lake-store.md
Last updated 08/10/2023
# Copy data to or from Azure Data Lake Storage Gen1 using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Azure Data Factory that you're using:"]
->
-> * [Version 1](v1/data-factory-azure-datalake-connector.md)
-> * [Current version](connector-azure-data-lake-store.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to copy data to and from Azure Data Lake Storage Gen1. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Azure Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-search.md
Last updated 07/13/2023
# Copy data to an Azure Cognitive Search index using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-azure-search-connector.md)
-> * [Current version](connector-azure-search.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data into Azure Cognitive Search index. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Azure Sql Data Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-sql-data-warehouse.md
Last updated 04/20/2023
# Copy and transform data in Azure Synapse Analytics by using Azure Data Factory or Synapse pipelines
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
->
-> - [Version1](v1/data-factory-azure-sql-data-warehouse-connector.md)
-> - [Current version](connector-azure-sql-data-warehouse.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Copy Activity in Azure Data Factory or Synapse pipelines to copy data from and to Azure Synapse Analytics, and use Data Flow to transform data in Azure Data Lake Storage Gen2. To learn about Azure Data Factory, read the [introductory article](introduction.md).
To use this feature, create an [Azure Blob Storage linked service](connector-azu
## Use PolyBase to load data into Azure Synapse Analytics
-Using [PolyBase](/sql/relational-databases/polybase/polybase-guide) is an efficient way to load a large amount of data into Azure Synapse Analytics with high throughput. You'll see a large gain in the throughput by using PolyBase instead of the default BULKINSERT mechanism. For a walkthrough with a use case, see [Load 1 TB into Azure Synapse Analytics](v1/data-factory-load-sql-data-warehouse.md).
+Using [PolyBase](/sql/relational-databases/polybase/polybase-guide) is an efficient way to load a large amount of data into Azure Synapse Analytics with high throughput. You'll see a large gain in the throughput by using PolyBase instead of the default BULKINSERT mechanism.
- If your source data is in **Azure Blob, Azure Data Lake Storage Gen1 or Azure Data Lake Storage Gen2**, and the **format is PolyBase compatible**, you can use copy activity to directly invoke PolyBase to let Azure Synapse Analytics pull the data from source. For details, see **[Direct copy by using PolyBase](#direct-copy-by-using-polybase)**. - If your source data store and format isn't originally supported by PolyBase, use the **[Staged copy by using PolyBase](#staged-copy-by-using-polybase)** feature instead. The staged copy feature also provides you better throughput. It automatically converts the data into PolyBase-compatible format, stores the data in Azure Blob storage, then calls PolyBase to load data into Azure Synapse Analytics.
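The direct-versus-staged decision above can be summarized in a small sketch. The function and constant names below are illustrative only, not ADF settings, and the compatibility sets are a simplified assumption:

```python
# Illustrative decision logic for direct vs. staged PolyBase copy.
# The names and sets below are simplified assumptions, not ADF settings.

POLYBASE_SOURCES = {"AzureBlob", "AdlsGen1", "AdlsGen2"}
POLYBASE_FORMATS = {"DelimitedText", "Parquet", "Orc"}

def polybase_strategy(source_store, source_format):
    if source_store in POLYBASE_SOURCES and source_format in POLYBASE_FORMATS:
        # Azure Synapse Analytics pulls straight from the source.
        return "direct copy"
    # Otherwise the service converts and stages the data in Blob storage
    # first, then invokes PolyBase to load it.
    return "staged copy"

print(polybase_strategy("AzureBlob", "Parquet"))   # direct copy
print(polybase_strategy("SqlServer", "Table"))     # staged copy
```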
data-factory Connector Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-sql-database.md
Last updated 04/06/2023
# Copy and transform data in Azure SQL Database by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Azure Data Factory that you're using:"]
->
-> - [Version 1](v1/data-factory-azure-sql-connector.md)
-> - [Current version](connector-azure-sql-database.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Copy Activity in Azure Data Factory or Azure Synapse pipelines to copy data from and to Azure SQL Database, and use Data Flow to transform data in Azure SQL Database. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Azure Table Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-azure-table-storage.md
Last updated 07/13/2023
# Copy data to and from Azure Table storage using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-azure-table-connector.md)
-> * [Current version](connector-azure-table-storage.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-cassandra.md
Last updated 01/25/2023
# Copy data from Cassandra using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-cassandra-connector.md)
-> * [Current version](connector-cassandra.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Db2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-db2.md
Last updated 07/13/2023
# Copy data from DB2 using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-db2-connector.md)
-> * [Current version](connector-db2.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector File System https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-file-system.md
# Copy data to or from a file system by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-file-system-connector.md)
-> * [Current version](connector-file-system.md)
+ [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to copy data to and from file system. To learn more read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
+This article outlines how to copy data to and from a file system. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
## Supported capabilities
Specifically, this file system connector supports:
Use the following steps to create a file system linked service in the Azure portal UI.
-1. Browse to the Manage tab in your Azure Data Factory or Synapse workspace and select Linked Services, then click New:
+1. Browse to the Manage tab in your Azure Data Factory or Synapse workspace and select Linked Services, then select New:
# [Azure Data Factory](#tab/data-factory)
The following properties are supported for file system under `location` settings
| Property | Description | Required | | - | | -- | | type | The type property under `location` in dataset must be set to **FileServerLocation**. | Yes |
-| folderPath | The path to folder. If you want to use wildcard to filter folder, skip this setting and specify in activity source settings. Note that you will need to setup the file share location in your Windows or Linux environment to expose the folder for sharing. | No |
+| folderPath | The path to folder. If you want to use wildcard to filter folder, skip this setting and specify in activity source settings. You need to set up the file share location in your Windows or Linux environment to expose the folder for sharing. | No |
| fileName | The file name under the given folderPath. If you want to use wildcard to filter files, skip this setting and specify in activity source settings. | No | **Example:**
The following properties are supported for file system under `storeSettings` set
| ***Locate the files to copy:*** | | | | OPTION 1: static path<br> | Copy from the given folder/file path specified in the dataset. If you want to copy all files from a folder, additionally specify `wildcardFileName` as `*`. | | | OPTION 2: server side filter<br>- fileFilter | File server side native filter, which provides better performance than OPTION 3 wildcard filter. Use `*` to match zero or more characters and `?` to match zero or single character. Learn more about the syntax and notes from the **Remarks** under [this section](/dotnet/api/system.io.directory.getfiles#system-io-directory-getfiles(system-string-system-string-system-io-searchoption)). | No |
-| OPTION 3: client side filter<br>- wildcardFolderPath | The folder path with wildcard characters to filter source folders. Such filter happens within the service, which enumerate the folders/files under the given path then apply the wildcard filter.<br>Allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character); use `^` to escape if your actual folder name has wildcard or this escape char inside. <br>See more examples in [Folder and file filter examples](#folder-and-file-filter-examples). | No |
+| OPTION 3: client side filter<br>- wildcardFolderPath | The folder path with wildcard characters to filter source folders. Such filter happens within the service, which enumerates the folders/files under the given path then applies the wildcard filter.<br>Allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character); use `^` to escape if your actual folder name has wildcard or this escape char inside. <br>See more examples in [Folder and file filter examples](#folder-and-file-filter-examples). | No |
| OPTION 3: client side filter<br>- wildcardFileName | The file name with wildcard characters under the given folderPath/wildcardFolderPath to filter source files. Such filter happens within the service, which enumerates the files under the given path then applies the wildcard filter.<br>Allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character); use `^` to escape if your actual file name has wildcard or this escape char inside.<br>See more examples in [Folder and file filter examples](#folder-and-file-filter-examples). | Yes |
-| OPTION 3: a list of files<br>- fileListPath | Indicates to copy a given file set. Point to a text file that includes a list of files you want to copy, one file per line, which is the relative path to the path configured in the dataset.<br/>When using this option, do not specify file name in dataset. See more examples in [File list examples](#file-list-examples). |No |
+| OPTION 3: a list of files<br>- fileListPath | Indicates to copy a given file set. Point to a text file that includes a list of files you want to copy, one file per line, which is the relative path to the path configured in the dataset.<br/>When using this option, don't specify file name in dataset. See more examples in [File list examples](#file-list-examples). |No |
| ***Additional settings:*** | | |
-| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. Note that when recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink. <br>Allowed values are **true** (default) and **false**.<br>This property doesn't apply when you configure `fileListPath`. |No |
-| deleteFilesAfterCompletion | Indicates whether the binary files will be deleted from source store after successfully moving to the destination store. The file deletion is per file, so when copy activity fails, you will see some files have already been copied to the destination and deleted from source, while others are still remaining on source store. <br/>This property is only valid in binary files copy scenario. The default value: false. |No |
-| modifiedDatetimeStart | Files filter based on the attribute: Last Modified. <br>The files will be selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to UTC time zone in the format of "2018-12-01T05:00:00Z". <br> The properties can be NULL, which means no file attribute filter will be applied to the dataset. When `modifiedDatetimeStart` has datetime value but `modifiedDatetimeEnd` is NULL, it means the files whose last modified attribute is greater than or equal with the datetime value will be selected. When `modifiedDatetimeEnd` has datetime value but `modifiedDatetimeStart` is NULL, it means the files whose last modified attribute is less than the datetime value will be selected.<br/>This property doesn't apply when you configure `fileListPath`. | No |
+| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. When recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink. <br>Allowed values are **true** (default) and **false**.<br>This property doesn't apply when you configure `fileListPath`. |No |
+| deleteFilesAfterCompletion | Indicates whether the binary files will be deleted from the source store after successfully moving to the destination store. The file deletion is per file, so when copy activity fails, you'll see some files have already been copied to the destination and deleted from the source, while others remain in the source store. <br/>This property is only valid in the binary files copy scenario. The default value: false. |No |
+| modifiedDatetimeStart | Files filter based on the attribute: Last Modified. <br>The files are selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to the UTC time zone in the format of "2018-12-01T05:00:00Z". <br> The properties can be NULL, which means no file attribute filter is applied to the dataset. When `modifiedDatetimeStart` has a datetime value but `modifiedDatetimeEnd` is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When `modifiedDatetimeEnd` has a datetime value but `modifiedDatetimeStart` is NULL, the files whose last modified attribute is less than the datetime value are selected.<br/>This property doesn't apply when you configure `fileListPath`. | No |
| modifiedDatetimeEnd | Same as above. | No |
-| enablePartitionDiscovery | For files that are partitioned, specify whether to parse the partitions from the file path and add them as additional source columns.<br/>Allowed values are **false** (default) and **true**. | No |
-| partitionRootPath | When partition discovery is enabled, specify the absolute root path in order to read partitioned folders as data columns.<br/><br/>If it is not specified, by default,<br/>- When you use file path in dataset or list of files on source, partition root path is the path configured in dataset.<br/>- When you use wildcard folder filter, partition root path is the sub-path before the first wildcard.<br/><br/>For example, assuming you configure the path in dataset as "root/folder/year=2020/month=08/day=27":<br/>- If you specify partition root path as "root/folder/year=2020", copy activity will generate two more columns `month` and `day` with value "08" and "27" respectively, in addition to the columns inside the files.<br/>- If partition root path is not specified, no extra column will be generated. | No |
+| enablePartitionDiscovery | For files that are partitioned, specify whether to parse the partitions from the file path and add them as extra source columns.<br/>Allowed values are **false** (default) and **true**. | No |
+| partitionRootPath | When partition discovery is enabled, specify the absolute root path in order to read partitioned folders as data columns.<br/><br/>If it isn't specified, by default,<br/>- When you use file path in dataset or list of files on source, partition root path is the path configured in dataset.<br/>- When you use wildcard folder filter, partition root path is the subpath before the first wildcard.<br/><br/>For example, assuming you configure the path in dataset as "root/folder/year=2020/month=08/day=27":<br/>- If you specify partition root path as "root/folder/year=2020", copy activity generates two more columns `month` and `day` with value "08" and "27" respectively, in addition to the columns inside the files.<br/>- If partition root path isn't specified, no extra column is generated. | No |
| maxConcurrentConnections |The upper limit of concurrent connections established to the data store during the activity run. Specify a value only when you want to limit concurrent connections.| No | **Example:**
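The `partitionRootPath` example above can be checked with a short sketch. The helper below is illustrative only (ADF performs this internally): it derives extra columns from the `key=value` folder segments below the partition root path.

```python
# Illustrative sketch of partition discovery: derive extra source columns
# from "key=value" folder segments below the partition root path.
# This mirrors what the service does internally; it is not an ADF API.

def discover_partitions(file_path, partition_root):
    relative = file_path[len(partition_root):].strip("/")
    columns = {}
    for segment in relative.split("/"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            columns[key] = value
    return columns

path = "root/folder/year=2020/month=08/day=27/data.csv"
print(discover_partitions(path, "root/folder/year=2020"))
# {'month': '08', 'day': '27'}
```

This matches the documented behavior: with root path "root/folder/year=2020", the copy activity generates the extra columns `month` and `day`; without a root path, no extra column is generated.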
This section describes the resulting behavior of the Copy operation for differen
| true |preserveHierarchy | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the same structure as the source:<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5. | | true |flattenHierarchy | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target Folder1 is created with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File5 | | true |mergeFiles | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target Folder1 is created with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1 + File2 + File3 + File4 + File 5 contents are merged into one file with autogenerated file name |
-| false |preserveHierarchy | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
+| false |preserveHierarchy | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/><br/>Subfolder1 with File3, File4, and File5 aren't picked up. |
| false |flattenHierarchy | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;autogenerated name for File2<br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
-| false |mergeFiles | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1 + File2 contents are merged into one file with autogenerated file name. autogenerated name for File1<br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
+| false |mergeFiles | Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5 | The target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1 + File2 contents are merged into one file with autogenerated file name. autogenerated name for File1<br/><br/>Subfolder1 with File3, File4, and File5 aren't picked up. |
## Lookup activity properties
To learn details about the properties, check [Lookup activity](control-flow-look
## GetMetadata activity properties
-To learn details about the properties, check [GetMetadata activity](control-flow-get-metadata-activity.md)
+To learn details about the properties, check [GetMetadata activity](control-flow-get-metadata-activity.md).
## Delete activity properties
-To learn details about the properties, check [Delete activity](delete-activity.md)
+To learn details about the properties, check [Delete activity](delete-activity.md).
## Legacy models
To learn details about the properties, check [Delete activity](delete-activity.m
|: |: |: | | type | The type property of the dataset must be set to: **FileShare** |Yes | | folderPath | Path to the folder. Wildcard filter is supported, allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character); use `^` to escape if your actual folder name has wildcard or this escape char inside. <br/><br/>Examples: rootfolder/subfolder/, see more examples in [Sample linked service and dataset definitions](#sample-linked-service-and-dataset-definitions) and [Folder and file filter examples](#folder-and-file-filter-examples). |No |
-| fileName | **Name or wildcard filter** for the file(s) under the specified "folderPath". If you don't specify a value for this property, the dataset points to all files in the folder. <br/><br/>For filter, allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character).<br/>- Example 1: `"fileName": "*.csv"`<br/>- Example 2: `"fileName": "???20180427.txt"`<br/>Use `^` to escape if your actual file name has wildcard or this escape char inside.<br/><br/>When fileName isn't specified for an output dataset and **preserveHierarchy** isn't specified in the activity sink, the copy activity automatically generates the file name with the following pattern: "*Data.[activity run ID GUID].[GUID if FlattenHierarchy].[format if configured].[compression if configured]*", for example "Data.0a405f8a-93ff-4c6f-b3be-f69616f1df7a.txt.gz"; if you copy from tabular source using table name instead of query, the name pattern is "*[table name].[format].[compression if configured]*", for example "MyTable.csv". |No |
-| modifiedDatetimeStart | Files filter based on the attribute: Last Modified. The files will be selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to UTC time zone in the format of "2018-12-01T05:00:00Z". <br/><br/> Be aware the overall performance of data movement will be impacted by enabling this setting when you want to do file filter from huge amounts of files. <br/><br/> The properties can be NULL, which means no file attribute filter will be applied to the dataset. When `modifiedDatetimeStart` has datetime value but `modifiedDatetimeEnd` is NULL, it means the files whose last modified attribute is greater than or equal with the datetime value will be selected. When `modifiedDatetimeEnd` has datetime value but `modifiedDatetimeStart` is NULL, it means the files whose last modified attribute is less than the datetime value will be selected.| No |
-| modifiedDatetimeEnd | Files filter based on the attribute: Last Modified. The files will be selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to UTC time zone in the format of "2018-12-01T05:00:00Z". <br/><br/> Be aware the overall performance of data movement will be impacted by enabling this setting when you want to do file filter from huge amounts of files. <br/><br/> The properties can be NULL, which means no file attribute filter will be applied to the dataset. When `modifiedDatetimeStart` has datetime value but `modifiedDatetimeEnd` is NULL, it means the files whose last modified attribute is greater than or equal with the datetime value will be selected. When `modifiedDatetimeEnd` has datetime value but `modifiedDatetimeStart` is NULL, it means the files whose last modified attribute is less than the datetime value will be selected.| No |
+| fileName | **Name or wildcard filter** for the files under the specified "folderPath". If you don't specify a value for this property, the dataset points to all files in the folder. <br/><br/>For filter, allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character).<br/>- Example 1: `"fileName": "*.csv"`<br/>- Example 2: `"fileName": "???20180427.txt"`<br/>Use `^` to escape if your actual file name has wildcard or this escape char inside.<br/><br/>When fileName isn't specified for an output dataset and **preserveHierarchy** isn't specified in the activity sink, the copy activity automatically generates the file name with the following pattern: "*Data.[activity run ID GUID].[GUID if FlattenHierarchy].[format if configured].[compression if configured]*", for example "Data.0a405f8a-93ff-4c6f-b3be-f69616f1df7a.txt.gz"; if you copy from tabular source using table name instead of query, the name pattern is "*[table name].[format].[compression if configured]*", for example "MyTable.csv". |No |
+| modifiedDatetimeStart | Files filter based on the attribute: Last Modified. The files are selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to the UTC time zone in the format of "2018-12-01T05:00:00Z". <br/><br/> Be aware the overall performance of data movement is impacted by enabling this setting when you want to filter from huge amounts of files. <br/><br/> The properties can be NULL, which means no file attribute filter is applied to the dataset. When `modifiedDatetimeStart` has a datetime value but `modifiedDatetimeEnd` is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When `modifiedDatetimeEnd` has a datetime value but `modifiedDatetimeStart` is NULL, the files whose last modified attribute is less than the datetime value are selected.| No |
+| modifiedDatetimeEnd | Files filter based on the attribute: Last Modified. The files are selected if their last modified time is greater than or equal to `modifiedDatetimeStart` and less than `modifiedDatetimeEnd`. The time is applied to the UTC time zone in the format of "2018-12-01T05:00:00Z". <br/><br/> Be aware the overall performance of data movement is impacted by enabling this setting when you want to filter from huge amounts of files. <br/><br/> The properties can be NULL, which means no file attribute filter is applied to the dataset. When `modifiedDatetimeStart` has a datetime value but `modifiedDatetimeEnd` is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When `modifiedDatetimeEnd` has a datetime value but `modifiedDatetimeStart` is NULL, the files whose last modified attribute is less than the datetime value are selected.| No |
| format | If you want to **copy files as-is** between file-based stores (binary copy), skip the format section in both input and output dataset definitions.<br/><br/>If you want to parse or generate files with a specific format, the following file format types are supported: **TextFormat**, **JsonFormat**, **AvroFormat**, **OrcFormat**, **ParquetFormat**. Set the **type** property under format to one of these values. For more information, see [Text Format](supported-file-formats-and-compression-codecs-legacy.md#text-format), [Json Format](supported-file-formats-and-compression-codecs-legacy.md#json-format), [Avro Format](supported-file-formats-and-compression-codecs-legacy.md#avro-format), [Orc Format](supported-file-formats-and-compression-codecs-legacy.md#orc-format), and [Parquet Format](supported-file-formats-and-compression-codecs-legacy.md#parquet-format) sections. |No (only for binary copy scenario) | | compression | Specify the type and level of compression for the data. For more information, see [Supported file formats and compression codecs](supported-file-formats-and-compression-codecs-legacy.md#compression-support).<br/>Supported types are: **GZip**, **Deflate**, **BZip2**, and **ZipDeflate**.<br/>Supported levels are: **Optimal** and **Fastest**. |No |
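The `modifiedDatetimeStart`/`modifiedDatetimeEnd` window described above is inclusive at the start and exclusive at the end, and a NULL bound leaves that side open. A minimal sketch of that filter (illustrative only, using UTC timestamps):

```python
from datetime import datetime, timezone

# Illustrative sketch of the Last Modified filter: start is inclusive,
# end is exclusive; a NULL (None) bound leaves that side open.

def in_window(last_modified, start=None, end=None):
    if start is not None and last_modified < start:
        return False
    if end is not None and last_modified >= end:
        return False
    return True

start = datetime(2018, 12, 1, 5, 0, 0, tzinfo=timezone.utc)
files = {
    "a.csv": datetime(2018, 11, 30, 23, 0, tzinfo=timezone.utc),
    "b.csv": datetime(2018, 12, 1, 5, 0, tzinfo=timezone.utc),
    "c.csv": datetime(2018, 12, 2, 8, 0, tzinfo=timezone.utc),
}
selected = [name for name, ts in files.items() if in_window(ts, start=start)]
print(selected)   # ['b.csv', 'c.csv']
```

Note that `b.csv`, modified exactly at `modifiedDatetimeStart`, is selected, because the start bound is inclusive.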
To learn details about the properties, check [Delete activity](delete-activity.m
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the copy activity source must be set to: **FileSystemSource** |Yes |
-| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. Note when recursive is set to true and sink is file-based store, empty folder/sub-folder will not be copied/created at sink.<br/>Allowed values are: **true** (default), **false** | No |
+| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. Note when recursive is set to true and sink is file-based store, empty folder/sub-folder won't be copied/created at sink.<br/>Allowed values are: **true** (default), **false** | No |
| maxConcurrentConnections |The upper limit of concurrent connections established to the data store during the activity run. Specify a value only when you want to limit concurrent connections.| No |

**Example:**
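Assembled from the properties in the table above, a copy activity source payload might look like the following. This is a hedged sketch with illustrative values, built in Python only so the JSON can be generated and checked:

```python
import json

# Property names come from the table above; the values are illustrative.
source = {
    "type": "FileSystemSource",    # required
    "recursive": True,             # default true; false reads only the specified folder
    "maxConcurrentConnections": 4  # optional upper limit on concurrent connections
}
print(json.dumps(source, indent=4))
```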
data-factory Connector Ftp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-ftp.md
# Copy data from FTP server using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
->
-> * [Version 1](v1/data-factory-ftp-connector.md)
> * [Current version](connector-ftp.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Hdfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-hdfs.md
# Copy data from the HDFS server using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of the Data Factory service that you are using:"]
-> * [Version 1](v1/data-factory-hdfs-connector.md)
-> * [Current version](connector-hdfs.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to copy data from the Hadoop Distributed File System (HDFS) server. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Http https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-http.md
# Copy data from an HTTP endpoint by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-http-connector.md)
-> * [Current version](connector-http.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Copy Activity in Azure Data Factory and Azure Synapse to copy data from an HTTP endpoint. The article builds on [Copy Activity](copy-activity-overview.md), which presents a general overview of Copy Activity.
data-factory Connector Mongodb Legacy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-mongodb-legacy.md
Last updated 01/25/2023
# Copy data from MongoDB using Azure Data Factory or Synapse Analytics (legacy)
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-on-premises-mongodb-connector.md)
-> * [Current version](connector-mongodb.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a MongoDB database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-mysql.md
# Copy data from MySQL using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-mysql-connector.md)
-> * [Current version](connector-mysql.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from a MySQL database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Odata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-odata.md
# Copy data from an OData source by using Azure Data Factory or Synapse Analytics [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-odata-connector.md)
-> * [Current version](connector-odata.md)
This article outlines how to use Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from an OData source. The article builds on [Copy Activity](copy-activity-overview.md), which presents a general overview of Copy Activity.
data-factory Connector Odbc https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-odbc.md
Last updated 10/25/2022
# Copy data from and to ODBC data stores using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-odbc-connector.md)
-> * [Current version](connector-odbc.md)
+ [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in Azure Data Factory to copy data from and to an ODBC data store. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Oracle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-oracle.md
# Copy data from and to Oracle by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-oracle-connector.md)
-> * [Current version](connector-oracle.md)
- [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the copy activity in Azure Data Factory to copy data from and to an Oracle database. It builds on the [copy activity overview](copy-activity-overview.md).
data-factory Connector Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-postgresql.md
Last updated 10/25/2022
# Copy data from PostgreSQL using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-postgresql-connector.md)
-> * [Current version](connector-postgresql.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from a PostgreSQL database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-salesforce.md
Last updated 07/13/2023
# Copy data from and to Salesforce using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-salesforce-connector.md)
-> * [Current version](connector-salesforce.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Sap Business Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-sap-business-warehouse.md
Last updated 10/25/2022
# Copy data from SAP Business Warehouse using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-sap-business-warehouse-connector.md)
-> * [Current version](connector-sap-business-warehouse.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from an SAP Business Warehouse (BW). It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-sap-hana.md
Last updated 10/20/2022
# Copy data from SAP HANA using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-sap-hana-connector.md)
-> * [Current version](connector-sap-hana.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from an SAP HANA database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Sftp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-sftp.md
Last updated 04/12/2023
# Copy and transform data in SFTP server using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of the Data Factory service that you are using:"]
-> * [Version 1](v1/data-factory-sftp-connector.md)
-> * [Current version](connector-sftp.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use Copy Activity to copy data from and to a secure FTP (SFTP) server, and how to use Data Flow to transform data in an SFTP server. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-sql-server.md
Last updated 07/13/2023
# Copy and transform data to and from SQL Server by using Azure Data Factory or Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Azure Data Factory that you're using:"]
-> * [Version 1](v1/data-factory-sqlserver-connector.md)
-> * [Current version](connector-sql-server.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to a SQL Server database, and how to use Data Flow to transform data in a SQL Server database. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
data-factory Connector Sybase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-sybase.md
Last updated 01/20/2023
# Copy data from Sybase using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-onprem-sybase-connector.md)
-> * [Current version](connector-sybase.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a Sybase database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Connector Teradata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-teradata.md
# Copy data from Teradata Vantage using Azure Data Factory and Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
->
-> * [Version 1](v1/data-factory-onprem-teradata-connector.md)
> * [Current version](connector-teradata.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Connector Web Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connector-web-table.md
Last updated 01/18/2023
# Copy data from Web table by using Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-web-table-connector.md)
-> * [Current version](connector-web-table.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a Web table database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
data-factory Control Flow Expression Language Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/control-flow-expression-language-functions.md
Last updated 10/25/2022
# Expressions and functions in Azure Data Factory and Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-functions-variables.md)
-> * [Current version/Synapse version](control-flow-expression-language-functions.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] This article provides details about expressions and functions supported by Azure Data Factory and Azure Synapse Analytics.
data-factory Copy Activity Fault Tolerance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/copy-activity-fault-tolerance.md
Last updated 10/25/2022
# Fault tolerance of copy activity in Azure Data Factory and Synapse Analytics pipelines
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-activity-fault-tolerance.md)
-> * [Current version](copy-activity-fault-tolerance.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Copy Activity Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/copy-activity-overview.md
# Copy activity in Azure Data Factory and Azure Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory that you're using:"]
-> * [Version 1](v1/data-factory-data-movement-activities.md)
-> * [Current version](copy-activity-overview.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Copy Activity Performance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/copy-activity-performance.md
Last updated 10/25/2022
# Copy activity performance and scalability guide
-> [!div class="op_single_selector" title1="Select the version of Azure Data Factory that you're using:"]
-> * [Version 1](v1/data-factory-copy-activity-performance.md)
-> * [Current version](copy-activity-performance.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Data Movement Security Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/data-movement-security-considerations.md
Last updated 02/01/2023
# Security considerations for data movement in Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
->
-> * [Version 1](v1/data-factory-data-movement-security-considerations.md)
> * [Current version](data-movement-security-considerations.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory How To Expression Language Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/how-to-expression-language-functions.md
Last updated 07/17/2023
# How to use parameters, expressions and functions in Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-functions-variables.md)
-> * [Current version](how-to-expression-language-functions.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] In this document, we focus on fundamental concepts, with various examples that explore how to create parameterized data pipelines in Azure Data Factory. Parameterization and dynamic expressions are notable additions to ADF because they save a tremendous amount of time and allow for a much more flexible Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) solution, which dramatically reduces the cost of solution maintenance and speeds up the implementation of new features in existing pipelines. These gains come about because parameterization minimizes hard coding and increases the number of reusable objects and processes in a solution.
data-factory Pipeline Trigger Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/pipeline-trigger-troubleshoot-guide.md
Type=Microsoft.DataTransfer.Execution.Core.ExecutionException,Message=There are
**Cause**
-You've reached the integration runtime's capacity limit. You might be running a large amount of data flow by using the same integration runtime at the same time. See [Azure subscription and service limits, quotas, and constraints](../azure-resource-manager/management/azure-subscription-service-limits.md#version-2) for details.
+You've reached the integration runtime's capacity limit. You might be running a large amount of data flow by using the same integration runtime at the same time. See [Azure subscription and service limits, quotas, and constraints](../azure-resource-manager/management/azure-subscription-service-limits.md#azure-data-factory-limits) for details.
**Resolution**
data-factory Quickstart Create Data Factory Dot Net https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/quickstart-create-data-factory-dot-net.md
# Quickstart: Create a data factory and pipeline using .NET SDK
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](quickstart-create-data-factory-dot-net.md)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
data-factory Quickstart Create Data Factory Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/quickstart-create-data-factory-powershell.md
# Quickstart: Create an Azure Data Factory using PowerShell
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](quickstart-create-data-factory-powershell.md)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
data-factory Quickstart Create Data Factory Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/quickstart-create-data-factory-python.md
# Quickstart: Create a data factory and pipeline using Python
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](quickstart-create-data-factory-python.md)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
data-factory Quickstart Create Data Factory Resource Manager Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/quickstart-create-data-factory-resource-manager-template.md
Last updated 10/25/2022
# Quickstart: Create an Azure Data Factory using ARM template
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-build-your-first-pipeline-using-arm.md)
-> * [Current version](quickstart-create-data-factory-resource-manager-template.md)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
data-factory Quickstart Create Data Factory Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/quickstart-create-data-factory-rest-api.md
# Quickstart: Create an Azure Data Factory and pipeline by using the REST API
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](quickstart-create-data-factory-rest-api.md)
[!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
data-factory Self Hosted Integration Runtime Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/self-hosted-integration-runtime-troubleshoot-guide.md
For Azure Data Factory v2 and Azure Synapse customers:
- If automatic update is off and you've already upgraded your .NET Framework Runtime to 4.7.2 or later, you can manually download the latest 5.x and install it on your machine.
- If automatic update is off and you haven't upgraded your .NET Framework Runtime to 4.7.2 or later, you will be required to upgrade your .NET Framework Runtime version first when you try to manually install self-hosted integration runtime 5.x and register the key.
-
-For Azure Data Factory v1 customers:
-- Self-hosted integration runtime 5.X doesn't support Azure Data Factory v1.
-- The self-hosted integration runtime will be automatically upgraded to the latest version of 4.x. And the latest version of 4.x won't expire.
-- If you try to manually install self-hosted integration runtime 5.x and register the key, you'll be notified that self-hosted integration runtime 5.x doesn't support Azure Data Factory v1.
-
-
## Self-hosted IR connectivity issues

### Self-hosted integration runtime can't connect to the cloud service
data-factory Transform Data Using Custom Activity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-custom-activity.md
Last updated 08/10/2023
# Use custom activities in an Azure Data Factory or Azure Synapse Analytics pipeline
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-use-custom-activities.md)
-> * [Current version](transform-data-using-dotnet-custom-activity.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)] There are two types of activities that you can use in an Azure Data Factory or Synapse pipeline.
This serialization is not truly secure, and is not intended to be secure. The in
To access properties of type *SecureString* from a custom activity, read the `activity.json` file, which is placed in the same folder as your .EXE, deserialize the JSON, and then access the JSON property (extendedProperties => [propertyName] => value).
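The lookup path described above (extendedProperties => [propertyName] => value) can be sketched as follows. The docs' sample app is a .NET executable, so this Python version only illustrates the JSON traversal; confirm the exact nesting against your own `activity.json`:

```python
import json

def read_secure_property(activity_json_text, property_name):
    """Deserialize activity.json and walk
    extendedProperties -> [property_name] -> value."""
    activity = json.loads(activity_json_text)
    return activity["extendedProperties"][property_name]["value"]

# Minimal, hypothetical activity.json fragment for illustration only.
sample = '{"extendedProperties": {"connectionString": {"value": "secret"}}}'
```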
-## <a name="compare-v2-v1"></a> Compare v2 Custom Activity and version 1 (Custom) DotNet Activity
-
-In Azure Data Factory version 1, you implement a (Custom) DotNet Activity by creating a .NET Class Library project with a class that implements the `Execute` method of the `IDotNetActivity` interface. The Linked Services, Datasets, and Extended Properties in the JSON payload of a (Custom) DotNet Activity are passed to the execution method as strongly-typed objects. For details about the version 1 behavior, see [(Custom) DotNet in version 1](v1/data-factory-use-custom-activities.md). Because of this implementation, your version 1 DotNet Activity code has to target .NET Framework 4.5.2. The version 1 DotNet Activity also has to be executed on Windows-based Azure Batch Pool nodes.
-
-In the Azure Data Factory V2 and Synapse pipelines Custom Activity, you are not required to implement a .NET interface. You can now directly run commands, scripts, and your own custom code, compiled as an executable. To configure this implementation, you specify the `Command` property together with the `folderPath` property. The Custom Activity uploads the executable and its dependencies to `folderpath` and executes the command for you.
-
-The Linked Services, Datasets (defined in referenceObjects), and Extended Properties defined in the JSON payload of a Data Factory v2 or Synapse pipeline Custom Activity can be accessed by your executable as JSON files. You can access the required properties using a JSON serializer as shown in the preceding SampleApp.exe code sample.
-
-With the changes introduced in the Data Factory V2 and Synapse pipeline Custom Activity, you can write your custom code logic in your preferred language and execute it on the Windows and Linux operating systems supported by Azure Batch.
-
-The following table describes the differences between the Data Factory V2 and Synapse pipeline Custom Activity and the Data Factory version 1 (Custom) DotNet Activity:
-
-|Differences | Custom Activity | version 1 (Custom) DotNet Activity |
-| - | - | - |
-|How custom logic is defined |By providing an executable |By implementing a .NET DLL |
-|Execution environment of the custom logic |Windows or Linux |Windows (.NET Framework 4.5.2) |
-|Executing scripts |Supports executing scripts directly (for example "cmd /c echo hello world" on Windows VM) |Requires implementation in the .NET DLL |
-|Dataset required |Optional |Required to chain activities and pass information |
-|Pass information from activity to custom logic |Through ReferenceObjects (LinkedServices and Datasets) and ExtendedProperties (custom properties) |Through ExtendedProperties (custom properties), Input, and Output Datasets |
-|Retrieve information in custom logic |Parses activity.json, linkedServices.json, and datasets.json stored in the same folder as the executable |Through .NET SDK (.NET Framework 4.5.2) |
-|Logging |Writes directly to STDOUT |Implementing Logger in .NET DLL |
-
-If you have existing .NET code written for a version 1 (Custom) DotNet Activity, you need to modify your code for it to work with the current version of the Custom Activity. Update your code by following these high-level guidelines:
-
- - Change the project from a .NET Class Library to a Console App.
- - Start your application with the `Main` method. The `Execute` method of the `IDotNetActivity` interface is no longer required.
- - Read and parse the Linked Services, Datasets and Activity with a JSON serializer, and not as strongly-typed objects. Pass the values of required properties to your main custom code logic. Refer to the preceding SampleApp.exe code as an example.
- - The Logger object is no longer supported. Output from your executable can be printed to the console and is saved to stdout.txt.
- - The Microsoft.Azure.Management.DataFactories NuGet package is no longer required.
- - Compile your code, upload the executable and its dependencies to Azure Storage, and define the path in the `folderPath` property.
-
-For a complete sample of how the end-to-end DLL and pipeline sample described in the Data Factory version 1 article [Use custom activities in an Azure Data Factory pipeline](./v1/data-factory-use-custom-activities.md) can be rewritten as a Custom Activity for Data Factory v2 and Synapse pipelines, see [Custom Activity sample](https://github.com/Azure/Azure-DataFactory/tree/master/SamplesV1/ADFv2CustomActivitySample).
-
## Auto-scaling of Azure Batch

You can also create an Azure Batch pool with the **autoscale** feature. For example, you could create an Azure Batch pool with 0 dedicated VMs and an autoscale formula based on the number of pending tasks.
data-factory Transform Data Using Data Lake Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-data-lake-analytics.md
Last updated 08/10/2023
# Process data by running U-SQL scripts on Azure Data Lake Analytics with Azure Data Factory and Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-usql-activity.md)
-> * [Current version](transform-data-using-data-lake-analytics.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Hadoop Hive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-hadoop-hive.md
Last updated 08/10/2023
# Transform data using Hadoop Hive activity in Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-hive-activity.md)
-> * [Current version](transform-data-using-hadoop-hive.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Hadoop Map Reduce https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-hadoop-map-reduce.md
Last updated 08/10/2023
# Transform data using Hadoop MapReduce activity in Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-map-reduce.md)
-> * [Current version](transform-data-using-hadoop-map-reduce.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Hadoop Pig https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-hadoop-pig.md
Last updated 08/10/2023
# Transform data using Hadoop Pig activity in Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-pig-activity.md)
-> * [Current version](transform-data-using-hadoop-pig.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Hadoop Streaming https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-hadoop-streaming.md
Last updated 08/10/2023
# Transform data using Hadoop Streaming activity in Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-hadoop-streaming-activity.md)
-> * [Current version](transform-data-using-hadoop-streaming.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-machine-learning.md
Last updated 08/10/2023
# Create a predictive pipeline using Machine Learning Studio (classic) with Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-azure-ml-batch-execution-activity.md)
-> * [Current version](transform-data-using-machine-learning.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Spark https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-spark.md
Last updated 08/10/2023
# Transform data using Spark activity in Azure Data Factory and Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-spark.md)
-> * [Current version](transform-data-using-spark.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Using Stored Procedure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/transform-data-using-stored-procedure.md
Last updated 08/10/2023
# Transform data by using the SQL Server Stored Procedure activity in Azure Data Factory or Synapse Analytics
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-stored-proc-activity.md)
-> * [Current version](transform-data-using-stored-procedure.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Tutorial Copy Data Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-copy-data-tool.md
Last updated 08/10/2023
# Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool
-> [!div class="op_single_selector" title1="Select the version of the Data Factory service that you're using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](tutorial-copy-data-tool.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Tutorial Hybrid Copy Data Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-hybrid-copy-data-tool.md
Last updated 08/10/2023
# Copy data from a SQL Server database to Azure Blob storage by using the Copy Data tool
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
-> * [Current version](tutorial-hybrid-copy-data-tool.md)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Data Factory Amazon Redshift Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/v1/data-factory-amazon-redshift-connector.md
- Title: Move data from Amazon Redshift by using Azure Data Factory
-description: Learn how to move data from Amazon Redshift by using Azure Data Factory Copy Activity.
- Previously updated: 04/12/2023
-# Move data from Amazon Redshift by using Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](data-factory-amazon-redshift-connector.md)
-> * [Version 2 (current version)](../connector-amazon-redshift.md)
-
-> [!NOTE]
-> This article applies to version 1 of Data Factory. If you are using the current version of the Data Factory service, see [Amazon Redshift connector in V2](../connector-amazon-redshift.md).
-
-This article explains how to use the Copy Activity in Azure Data Factory to move data from Amazon Redshift. The article builds on the [Data Movement Activities](data-factory-data-movement-activities.md) article, which presents a general overview of data movement with the copy activity.
-
-Data Factory currently supports only moving data from Amazon Redshift to a [supported sink data store](data-factory-data-movement-activities.md#supported-data-stores-and-formats). Moving data from other data stores to Amazon Redshift is not supported.
-
-> [!TIP]
-> To achieve the best performance when copying large amounts of data from Amazon Redshift, consider using the built-in Redshift **UNLOAD** command through Amazon Simple Storage Service (Amazon S3). For details, see [Use UNLOAD to copy data from Amazon Redshift](#use-unload-to-copy-data-from-amazon-redshift).
-
-## Prerequisites
-* If you are moving data to an on-premises data store, install [Data Management Gateway](data-factory-data-management-gateway.md) on an on-premises machine. Grant access for a gateway to the Amazon Redshift cluster by using the on-premises machine IP address. For instructions, see [Authorize access to the cluster](https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-authorize-cluster-access.html).
-* To move data to an Azure data store, see the [Compute IP address and SQL ranges that are used by the Microsoft Azure Datacenters](https://www.microsoft.com/download/details.aspx?id=41653).
-
-## Getting started
-You can create a pipeline with a copy activity to move data from an Amazon Redshift source by using different tools and APIs.
-
-The easiest way to create a pipeline is to use the Azure Data Factory Copy Wizard. For a quick walkthrough on creating a pipeline by using the Copy Wizard, see the [Tutorial: Create a pipeline by using the Copy Wizard](data-factory-copy-data-wizard-tutorial.md).
-
-You can also create a pipeline by using Visual Studio, Azure PowerShell, or other tools. Azure Resource Manager templates, the .NET API, or the REST API can also be used to create the pipeline. For step-by-step instructions to create a pipeline with a copy activity, see the [Copy Activity tutorial](data-factory-copy-data-from-azure-blob-storage-to-sql-database.md).
-
-Whether you use the tools or APIs, you perform the following steps to create a pipeline that moves data from a source data store to a sink data store:
-
-1. Create linked services to link input and output data stores to your data factory.
-2. Create datasets to represent input and output data for the copy operation.
-3. Create a pipeline with a copy activity that takes a dataset as an input and a dataset as an output.
-
-When you use the Copy Wizard, JSON definitions for these Data Factory entities are automatically created. When you use tools or APIs (except the .NET API), you define the Data Factory entities by using the JSON format. The [JSON example: Copy data from Amazon Redshift to Azure Blob storage](#json-example-copy-data-from-amazon-redshift-to-azure-blob-storage) section shows the JSON definitions for the Data Factory entities that are used to copy data from an Amazon Redshift data store.
-
-The following sections describe the JSON properties that are used to define the Data Factory entities for Amazon Redshift.
-
-## Linked service properties
-
-The following table provides descriptions for the JSON elements that are specific to an Amazon Redshift linked service.
-
-| Property | Description | Required |
-| | | |
-| **type** |This property must be set to **AmazonRedshift**. |Yes |
-| **server** |The IP address or host name of the Amazon Redshift server. |Yes |
-| **port** |The number of the TCP port that the Amazon Redshift server uses to listen for client connections. |No (default is 5439) |
-| **database** |The name of the Amazon Redshift database. |Yes |
-| **username** |The name of the user who has access to the database. |Yes |
-| **password** |The password for the user account. |Yes |
-
-## Dataset properties
-
-For a list of the sections and properties that are available for defining datasets, see the [Creating datasets](data-factory-create-datasets.md) article. The **structure**, **availability**, and **policy** sections are similar for all dataset types. Examples of dataset types include Azure SQL, Azure Blob storage, and Azure Table storage.
-
-The **typeProperties** section is different for each type of dataset and provides information about the location of the data in the store. The **typeProperties** section for a dataset of type **RelationalTable**, which includes the Amazon Redshift dataset, has the following properties:
-
-| Property | Description | Required |
-| | | |
-| **tableName** |The name of the table in the Amazon Redshift database that the linked service refers to. |No (if the **query** property of a copy activity of type **RelationalSource** is specified) |
-
-## Copy Activity properties
-
-For a list of sections and properties that are available for defining activities, see the [Creating Pipelines](data-factory-create-pipelines.md) article. The **name**, **description**, **inputs** table, **outputs** table, and **policy** properties are available for all types of activities. The properties that are available in the **typeProperties** section vary for each activity type. For Copy Activity, the properties vary depending on the types of data sources and sinks.
-
-For Copy Activity, when the source is of type **AmazonRedshiftSource**, the following properties are available in **typeProperties** section:
-
-| Property | Description | Required |
-| | | |
-| **query** | Use the custom query to read the data. |No (if the **tableName** property of a dataset is specified) |
-| **redshiftUnloadSettings** | Contains the property group when using the Redshift **UNLOAD** command. | No |
-| **s3LinkedServiceName** | The Amazon S3 linked service to use as an interim store. Specify the name of an Azure Data Factory linked service of type **AwsAccessKey**. | Required when using the **redshiftUnloadSettings** property |
-| **bucketName** | Indicates the Amazon S3 bucket to use to store the interim data. If this property is not provided, Copy Activity auto-generates a bucket. | Required when using the **redshiftUnloadSettings** property |
-
-Alternatively, you can use the **RelationalSource** type, which includes Amazon Redshift, with the following property in the **typeProperties** section. Note this source type doesn't support the Redshift **UNLOAD** command.
-
-| Property | Description | Required |
-| | | |
-| **query** |Use the custom query to read the data. | No (if the **tableName** property of a dataset is specified) |
-
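-For illustration only, a copy activity source of type **RelationalSource** might be sketched as follows (the query is a placeholder, not a value from this article):
-
-```json
-"source": {
-    "type": "RelationalSource",
-    "query": "select * from MyTable"
-}
-```
-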
-## Use UNLOAD to copy data from Amazon Redshift
-
-The Amazon Redshift [**UNLOAD**](https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html) command unloads the results of a query to one or more files on Amazon S3. This command is recommended by Amazon for copying large datasets from Redshift.
-
-**Example: Copy data from Amazon Redshift to Azure Synapse Analytics**
-
-This example copies data from Amazon Redshift to Azure Synapse Analytics. The example uses the Redshift **UNLOAD** command, staged copy data, and Microsoft PolyBase.
-
-For this sample use case, Copy Activity first unloads the data from Amazon Redshift to Amazon S3 as configured in the **redshiftUnloadSettings** option. Next, the data is copied from Amazon S3 to Azure Blob storage as specified in the **stagingSettings** option. Finally, PolyBase loads the data into Azure Synapse Analytics. All of the interim formats are handled by Copy Activity.
--
-```json
-{
- "name": "CopyFromRedshiftToSQLDW",
- "type": "Copy",
- "typeProperties": {
- "source": {
- "type": "AmazonRedshiftSource",
- "query": "select * from MyTable",
- "redshiftUnloadSettings": {
- "s3LinkedServiceName":"MyAmazonS3StorageLinkedService",
- "bucketName": "bucketForUnload"
- }
- },
- "sink": {
- "type": "SqlDWSink",
- "allowPolyBase": true
- },
- "enableStaging": true,
- "stagingSettings": {
- "linkedServiceName": "MyAzureStorageLinkedService",
- "path": "adfstagingcopydata"
- },
- "cloudDataMovementUnits": 32
- .....
- }
-}
-```
-
-## JSON example: Copy data from Amazon Redshift to Azure Blob storage
-This sample shows how to copy data from an Amazon Redshift database to Azure Blob Storage. Data can be copied directly to any [supported sink](data-factory-data-movement-activities.md#supported-data-stores-and-formats) by using Copy Activity.
-
-The sample has the following data factory entities:
-
-* A linked service of type [AmazonRedshift](#linked-service-properties)
-* A linked service of type [AzureStorage](data-factory-azure-blob-connector.md#linked-service-properties).
-* An input [dataset](data-factory-create-datasets.md) of type [RelationalTable](#dataset-properties)
-* An output [dataset](data-factory-create-datasets.md) of type [AzureBlob](data-factory-azure-blob-connector.md#dataset-properties)
-* A [pipeline](data-factory-create-pipelines.md) with a copy activity that uses the [RelationalSource](#copy-activity-properties) and [BlobSink](data-factory-azure-blob-connector.md#copy-activity-properties) properties
-
-The sample copies data from a query result in Amazon Redshift to an Azure blob hourly. The JSON properties that are used in the sample are described in the sections that follow the entity definitions.
-
-**Amazon Redshift linked service**
-
-```json
-{
- "name": "AmazonRedshiftLinkedService",
- "properties":
- {
- "type": "AmazonRedshift",
- "typeProperties":
- {
- "server": "< The IP address or host name of the Amazon Redshift server >",
- "port": "<The number of the TCP port that the Amazon Redshift server uses to listen for client connections.>",
- "database": "<The database name of the Amazon Redshift database>",
- "username": "<username>",
- "password": "<password>"
- }
- }
-}
-```
-
-**Azure Blob storage linked service**
-
-```json
-{
- "name": "AzureStorageLinkedService",
- "properties": {
- "type": "AzureStorage",
- "typeProperties": {
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
- }
- }
-}
-```
-**Amazon Redshift input dataset**
-
-The **external** property is set to **true** to inform the Data Factory service that the dataset is external to the data factory. Set this property to **true** on any input dataset that is not produced by an activity in the pipeline.
-
-```json
-{
- "name": "AmazonRedshiftInputDataset",
- "properties": {
- "type": "RelationalTable",
- "linkedServiceName": "AmazonRedshiftLinkedService",
- "typeProperties": {
- "tableName": "<Table name>"
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- },
- "external": true
- }
-}
-```
-
-**Azure Blob output dataset**
-
-Data is written to a new blob every hour by setting the **frequency** property to "Hour" and the **interval** property to 1. The **folderPath** property for the blob is dynamically evaluated. The property value is based on the start time of the slice that is being processed. The folder path uses the year, month, day, and hours parts of the start time.
-
-```json
-{
- "name": "AzureBlobOutputDataSet",
- "properties": {
- "type": "AzureBlob",
- "linkedServiceName": "AzureStorageLinkedService",
- "typeProperties": {
- "folderPath": "mycontainer/fromamazonredshift/yearno={Year}/monthno={Month}/dayno={Day}/hourno={Hour}",
- "format": {
- "type": "TextFormat",
- "rowDelimiter": "\n",
- "columnDelimiter": "\t"
- },
- "partitionedBy": [
- {
- "name": "Year",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "yyyy"
- }
- },
- {
- "name": "Month",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "MM"
- }
- },
- {
- "name": "Day",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "dd"
- }
- },
- {
- "name": "Hour",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "HH"
- }
- }
- ]
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- }
- }
-}
-```
-
-**Copy activity in a pipeline with an Amazon Redshift source (of type AmazonRedshiftSource) and an Azure Blob sink**
-
-The pipeline contains a copy activity that is configured to use the input and output datasets. The pipeline is scheduled to run every hour. In the JSON definition for the pipeline, the **source** type is set to **AmazonRedshiftSource** and the **sink** type is set to **BlobSink**. The SQL query specified for the **query** property selects the data to copy from the past hour.
-
-```json
-{
- "name": "CopyAmazonRedshiftToBlob",
- "properties": {
- "description": "pipeline for copy activity",
- "activities": [
- {
- "type": "Copy",
- "typeProperties": {
- "source": {
- "type": "AmazonRedshiftSource",
- "query": "$$Text.Format('select * from MyTable where timestamp >= \\'{0:yyyy-MM-ddTHH:mm:ss}\\' AND timestamp < \\'{1:yyyy-MM-ddTHH:mm:ss}\\'', WindowStart, WindowEnd)",
- "redshiftUnloadSettings": {
- "s3LinkedServiceName":"myS3Storage",
- "bucketName": "bucketForUnload"
- }
- },
- "sink": {
- "type": "BlobSink",
- "writeBatchSize": 0,
- "writeBatchTimeout": "00:00:00"
- },
- "cloudDataMovementUnits": 32
- },
- "inputs": [
- {
- "name": "AmazonRedshiftInputDataset"
- }
- ],
- "outputs": [
- {
- "name": "AzureBlobOutputDataSet"
- }
- ],
- "policy": {
- "timeout": "01:00:00",
- "concurrency": 1
- },
- "scheduler": {
- "frequency": "Hour",
- "interval": 1
- },
- "name": "AmazonRedshiftToBlob"
- }
- ],
- "start": "2014-06-01T18:00:00Z",
- "end": "2014-06-01T19:00:00Z"
- }
-}
-```
-### Type mapping for Amazon Redshift
-As mentioned in the [data movement activities](data-factory-data-movement-activities.md) article, Copy Activity performs automatic type conversions from source type to sink type. The types are converted by using a two-step approach:
-
-1. Convert from a native source type to a .NET type
-2. Convert from a .NET type to a native sink type
-
-The following mappings are used when Copy Activity converts the data from an Amazon Redshift type to a .NET type:
-
-| Amazon Redshift type | .NET type |
-| | |
-| SMALLINT |Int16 |
-| INTEGER |Int32 |
-| BIGINT |Int64 |
-| DECIMAL |Decimal |
-| REAL |Single |
-| DOUBLE PRECISION |Double |
-| BOOLEAN |String |
-| CHAR |String |
-| VARCHAR |String |
-| DATE |DateTime |
-| TIMESTAMP |DateTime |
-| TEXT |String |
-
-## Map source to sink columns
-To learn how to map columns in the source dataset to columns in the sink dataset, see [Mapping dataset columns in Azure Data Factory](data-factory-map-columns.md).
-
-## Repeatable reads from relational sources
-When you copy data from a relational data store, keep repeatability in mind to avoid unintended outcomes. In Azure Data Factory, you can rerun a slice manually. You can also configure the retry **policy** for a dataset so that a slice is rerun when a failure occurs. Make sure that the same data is read no matter how many times the slice is rerun, and regardless of how you rerun it. For more information, see [Repeatable reads from relational sources](data-factory-repeatable-copy.md#repeatable-read-from-relational-sources).
-
-## Performance and tuning
-Learn about key factors that affect the performance of Copy Activity and ways to optimize performance in the [Copy Activity Performance and Tuning Guide](data-factory-copy-activity-performance.md).
-
-## Next steps
-For step-by-step instructions for creating a pipeline with Copy Activity, see the [Copy Activity tutorial](data-factory-copy-data-from-azure-blob-storage-to-sql-database.md).
data-factory Data Factory Amazon Simple Storage Service Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/v1/data-factory-amazon-simple-storage-service-connector.md
- Title: Move data from Amazon Simple Storage Service by using Data Factory
-description: Learn about how to move data from Amazon Simple Storage Service (S3) by using Azure Data Factory.
- Previously updated: 04/12/2023
-# Move data from Amazon Simple Storage Service by using Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](data-factory-amazon-simple-storage-service-connector.md)
-> * [Version 2 (current version)](../connector-amazon-simple-storage-service.md)
-
-> [!NOTE]
-> This article applies to version 1 of Data Factory. If you are using the current version of the Data Factory service, see [Amazon S3 connector in V2](../connector-amazon-simple-storage-service.md).
-
-This article explains how to use the copy activity in Azure Data Factory to move data from Amazon Simple Storage Service (S3). It builds on the [Data movement activities](data-factory-data-movement-activities.md) article, which presents a general overview of data movement with the copy activity.
-
-You can copy data from Amazon S3 to any supported sink data store. For a list of data stores supported as sinks by the copy activity, see the [Supported data stores](data-factory-data-movement-activities.md#supported-data-stores-and-formats) table. Data Factory currently supports only moving data from Amazon S3 to other data stores, but not moving data from other data stores to Amazon S3.
-
-## Required permissions
-To copy data from Amazon S3, make sure you have been granted the following permissions:
-
-* `s3:GetObject` and `s3:GetObjectVersion` for Amazon S3 Object Operations.
-* `s3:ListBucket` for Amazon S3 Bucket Operations. If you are using the Data Factory Copy Wizard, `s3:ListAllMyBuckets` is also required.
-
-For details about the full list of Amazon S3 permissions, see [Specifying Permissions in a Policy](https://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html).
-
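-For illustration only, the object and bucket permissions above could be granted through an IAM policy similar to the following sketch (the bucket name is a placeholder; consult the AWS documentation for the exact policy your environment requires, and add `s3:ListAllMyBuckets` if you use the Copy Wizard):
-
-```json
-{
-    "Version": "2012-10-17",
-    "Statement": [
-        {
-            "Effect": "Allow",
-            "Action": [ "s3:GetObject", "s3:GetObjectVersion" ],
-            "Resource": "arn:aws:s3:::examplebucket/*"
-        },
-        {
-            "Effect": "Allow",
-            "Action": "s3:ListBucket",
-            "Resource": "arn:aws:s3:::examplebucket"
-        }
-    ]
-}
-```
-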
-## Getting started
-You can create a pipeline with a copy activity that moves data from an Amazon S3 source by using different tools or APIs.
-
-The easiest way to create a pipeline is to use the **Copy Wizard**. For a quick walkthrough, see [Tutorial: Create a pipeline using Copy Wizard](data-factory-copy-data-wizard-tutorial.md).
-
-You can also use the following tools to create a pipeline: **Visual Studio**, **Azure PowerShell**, **Azure Resource Manager template**, **.NET API**, and **REST API**. For step-by-step instructions to create a pipeline with a copy activity, see the [Copy activity tutorial](data-factory-copy-data-from-azure-blob-storage-to-sql-database.md).
-
-Whether you use tools or APIs, you perform the following steps to create a pipeline that moves data from a source data store to a sink data store:
-
-1. Create **linked services** to link input and output data stores to your data factory.
-2. Create **datasets** to represent input and output data for the copy operation.
-3. Create a **pipeline** with a copy activity that takes a dataset as an input and a dataset as an output.
-
-When you use the wizard, JSON definitions for these Data Factory entities (linked services, datasets, and the pipeline) are automatically created for you. When you use tools or APIs (except .NET API), you define these Data Factory entities by using the JSON format. For a sample with JSON definitions for Data Factory entities that are used to copy data from an Amazon S3 data store, see the [JSON example: Copy data from Amazon S3 to Azure Blob](#json-example-copy-data-from-amazon-s3-to-azure-blob-storage) section of this article.
-
-> [!NOTE]
-> For details about supported file and compression formats for a copy activity, see [File and compression formats in Azure Data Factory](data-factory-supported-file-and-compression-formats.md).
-
-The following sections provide details about JSON properties that are used to define Data Factory entities specific to Amazon S3.
-
-## Linked service properties
-A linked service links a data store to a data factory. You create a linked service of type **AwsAccessKey** to link your Amazon S3 data store to your data factory. The following table provides descriptions for the JSON elements that are specific to the Amazon S3 (AwsAccessKey) linked service.
-
-| Property | Description | Allowed values | Required |
-| | | | |
-| accessKeyID |ID of the secret access key. |string |Yes |
-| secretAccessKey |The secret access key itself. |Encrypted secret string |Yes |
-
->[!NOTE]
->This connector requires access keys for an IAM account to copy data from Amazon S3. [Temporary security credentials](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html) are not supported.
->
-
-Here is an example:
-
-```json
-{
- "name": "AmazonS3LinkedService",
- "properties": {
- "type": "AwsAccessKey",
- "typeProperties": {
- "accessKeyId": "<access key id>",
- "secretAccessKey": "<secret access key>"
- }
- }
-}
-```
-
-## Dataset properties
-To specify a dataset that represents input data in Amazon S3, set the **type** property of the dataset to **AmazonS3**. Set the **linkedServiceName** property of the dataset to the name of the Amazon S3 linked service. For a full list of sections and properties available for defining datasets, see [Creating datasets](data-factory-create-datasets.md).
-
-Sections such as **structure**, **availability**, and **policy** are similar for all dataset types (such as SQL database, Azure blob, and Azure table). The **typeProperties** section is different for each type of dataset, and provides information about the location of the data in the data store. The **typeProperties** section for a dataset of type **AmazonS3** has the following properties:
-
-| Property | Description | Allowed values | Required |
-| | | | |
-| bucketName |The S3 bucket name. |String |Yes |
-| key |The S3 object key. |String |No |
-| prefix |Prefix for the S3 object key. Objects whose keys start with this prefix are selected. Applies only when key is empty. |String |No |
-| version |The version of the S3 object, if S3 versioning is enabled. |String |No |
-| format | The following format types are supported: **TextFormat**, **JsonFormat**, **AvroFormat**, **OrcFormat**, **ParquetFormat**. Set the **type** property under format to one of these values. For more information, see the [Text format](data-factory-supported-file-and-compression-formats.md#text-format), [JSON format](data-factory-supported-file-and-compression-formats.md#json-format), [Avro format](data-factory-supported-file-and-compression-formats.md#avro-format), [Orc format](data-factory-supported-file-and-compression-formats.md#orc-format), and [Parquet format](data-factory-supported-file-and-compression-formats.md#parquet-format) sections. <br><br> If you want to copy files as-is between file-based stores (binary copy), skip the format section in both input and output dataset definitions. | |No |
-| compression | Specify the type and level of compression for the data. The supported types are: **GZip**, **Deflate**, **BZip2**, and **ZipDeflate**. The supported levels are: **Optimal** and **Fastest**. For more information, see [File and compression formats in Azure Data Factory](data-factory-supported-file-and-compression-formats.md#compression-support). | |No |
--
-> [!NOTE]
-> **bucketName + key** specifies the location of the S3 object, where bucket is the root container for S3 objects, and key is the full path to the S3 object.
-
-### Sample dataset with prefix
-
-```json
-{
- "name": "dataset-s3",
- "properties": {
- "type": "AmazonS3",
-        "linkedServiceName": "link-testS3",
- "typeProperties": {
- "prefix": "testFolder/test",
- "bucketName": "testbucket",
- "format": {
- "type": "OrcFormat"
- }
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- },
- "external": true
- }
-}
-```
-### Sample dataset (with version)
-
-```json
-{
- "name": "dataset-s3",
- "properties": {
- "type": "AmazonS3",
-        "linkedServiceName": "link-testS3",
- "typeProperties": {
- "key": "testFolder/test.orc",
- "bucketName": "testbucket",
- "version": "XXXXXXXXXczm0CJajYkHf0_k6LhBmkcL",
- "format": {
- "type": "OrcFormat"
- }
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- },
- "external": true
- }
-}
-```
-
-### Dynamic paths for S3
-The preceding sample uses fixed values for the **key** and **bucketName** properties in the Amazon S3 dataset.
-
-```json
-"key": "testFolder/test.orc",
-"bucketName": "testbucket",
-```
-
-You can have Data Factory calculate these properties dynamically at runtime, by using system variables such as SliceStart.
-
-```json
-"key": "$$Text.Format('{0:MM}/{0:dd}/test.orc', SliceStart)"
-"bucketName": "$$Text.Format('{0:yyyy}', SliceStart)"
-```
-
-You can do the same for the **prefix** property of an Amazon S3 dataset. For a list of supported functions and variables, see [Data Factory functions and system variables](data-factory-functions-variables.md).
-
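-For instance, a dynamic **prefix** could be sketched as follows (the folder layout is a placeholder):
-
-```json
-"prefix": "$$Text.Format('testFolder/{0:yyyy}/{0:MM}/', SliceStart)"
-```
-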
-## Copy activity properties
-For a full list of sections and properties available for defining activities, see [Creating pipelines](data-factory-create-pipelines.md). Properties such as name, description, input and output tables, and policies are available for all types of activities. Properties available in the **typeProperties** section of the activity vary with each activity type. For the copy activity, properties vary depending on the types of sources and sinks. When a source in the copy activity is of type **FileSystemSource** (which includes Amazon S3), the following property is available in **typeProperties** section:
-
-| Property | Description | Allowed values | Required |
-| | | | |
-| recursive |Specifies whether to recursively list S3 objects under the directory. |true/false |No |
-
-## JSON example: Copy data from Amazon S3 to Azure Blob storage
-This sample shows how to copy data from Amazon S3 to Azure Blob storage. However, data can be copied directly to [any of the sinks that are supported](data-factory-data-movement-activities.md#supported-data-stores-and-formats) by using the copy activity in Data Factory.
-
-The sample provides JSON definitions for the following Data Factory entities. You can use these definitions to create a pipeline to copy data from Amazon S3 to Blob storage by using [Visual Studio](data-factory-copy-activity-tutorial-using-visual-studio.md) or [PowerShell](data-factory-copy-activity-tutorial-using-powershell.md).
-
-* A linked service of type [AwsAccessKey](#linked-service-properties).
-* A linked service of type [AzureStorage](data-factory-azure-blob-connector.md#linked-service-properties).
-* An input [dataset](data-factory-create-datasets.md) of type [AmazonS3](#dataset-properties).
-* An output [dataset](data-factory-create-datasets.md) of type [AzureBlob](data-factory-azure-blob-connector.md#dataset-properties).
-* A [pipeline](data-factory-create-pipelines.md) with copy activity that uses [FileSystemSource](#copy-activity-properties) and [BlobSink](data-factory-azure-blob-connector.md#copy-activity-properties).
-
-The sample copies data from Amazon S3 to an Azure blob every hour. The JSON properties used in these samples are described in sections following the samples.
-
-### Amazon S3 linked service
-
-```json
-{
- "name": "AmazonS3LinkedService",
- "properties": {
- "type": "AwsAccessKey",
- "typeProperties": {
- "accessKeyId": "<access key id>",
- "secretAccessKey": "<secret access key>"
- }
- }
-}
-```
-
-### Azure Storage linked service
-
-```json
-{
- "name": "AzureStorageLinkedService",
- "properties": {
- "type": "AzureStorage",
- "typeProperties": {
- "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
- }
- }
-}
-```
-
-### Amazon S3 input dataset
-
-Setting **"external": true** informs the Data Factory service that the dataset is external to the data factory. Set this property to true on an input dataset that is not produced by an activity in the pipeline.
-
-```json
- {
- "name": "AmazonS3InputDataset",
- "properties": {
- "type": "AmazonS3",
- "linkedServiceName": "AmazonS3LinkedService",
- "typeProperties": {
- "key": "testFolder/test.orc",
- "bucketName": "testbucket",
- "format": {
- "type": "OrcFormat"
- }
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- },
- "external": true
- }
- }
-```
--
-### Azure Blob output dataset
-
-Data is written to a new blob every hour (frequency: hour, interval: 1). The folder path for the blob is dynamically evaluated based on the start time of the slice that is being processed. The folder path uses the year, month, day, and hours parts of the start time.
-
-```json
-{
- "name": "AzureBlobOutputDataSet",
- "properties": {
- "type": "AzureBlob",
- "linkedServiceName": "AzureStorageLinkedService",
- "typeProperties": {
- "folderPath": "mycontainer/fromamazons3/yearno={Year}/monthno={Month}/dayno={Day}/hourno={Hour}",
- "format": {
- "type": "TextFormat",
- "rowDelimiter": "\n",
- "columnDelimiter": "\t"
- },
- "partitionedBy": [
- {
- "name": "Year",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "yyyy"
- }
- },
- {
- "name": "Month",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "MM"
- }
- },
- {
- "name": "Day",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "dd"
- }
- },
- {
- "name": "Hour",
- "value": {
- "type": "DateTime",
- "date": "SliceStart",
- "format": "HH"
- }
- }
- ]
- },
- "availability": {
- "frequency": "Hour",
- "interval": 1
- }
- }
-}
-```
-### Copy activity in a pipeline with an Amazon S3 source and a blob sink
-
-The pipeline contains a copy activity that is configured to use the input and output datasets, and is scheduled to run every hour. In the pipeline JSON definition, the **source** type is set to **FileSystemSource**, and **sink** type is set to **BlobSink**.
-
-```json
-{
- "name": "CopyAmazonS3ToBlob",
- "properties": {
- "description": "pipeline for copy activity",
- "activities": [
- {
- "type": "Copy",
- "typeProperties": {
- "source": {
- "type": "FileSystemSource",
- "recursive": true
- },
- "sink": {
- "type": "BlobSink",
- "writeBatchSize": 0,
- "writeBatchTimeout": "00:00:00"
- }
- },
- "inputs": [
- {
- "name": "AmazonS3InputDataset"
- }
- ],
- "outputs": [
- {
- "name": "AzureBlobOutputDataSet"
- }
- ],
- "policy": {
- "timeout": "01:00:00",
- "concurrency": 1
- },
- "scheduler": {
- "frequency": "Hour",
- "interval": 1
- },
- "name": "AmazonS3ToBlob"
- }
- ],
- "start": "2014-08-08T18:00:00Z",
- "end": "2014-08-08T19:00:00Z"
- }
-}
-```
-> [!NOTE]
-> To map columns from a source dataset to columns from a sink dataset, see [Mapping dataset columns in Azure Data Factory](data-factory-map-columns.md).
-## Next steps
-See the following articles:
-
-* To learn about key factors that impact performance of data movement (copy activity) in Data Factory, and various ways to optimize it, see the [Copy activity performance and tuning guide](data-factory-copy-activity-performance.md).
-
-* For step-by-step instructions for creating a pipeline with a copy activity, see the [Copy activity tutorial](data-factory-copy-data-from-azure-blob-storage-to-sql-database.md).
data-factory Data Factory Api Change Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/v1/data-factory-api-change-log.md
- Title: Data Factory - .NET API Change Log
-description: Describes breaking changes, feature additions, bug fixes, and so on, in a specific version of the .NET API for Azure Data Factory.
- Previously updated: 04/12/2023
-# Azure Data Factory - .NET API change log
-> [!NOTE]
-> This article applies to version 1 of Data Factory.
-
-This article provides information about changes to the Azure Data Factory SDK in a specific version. You can find the latest NuGet package for Azure Data Factory [here](https://www.nuget.org/packages/Microsoft.Azure.Management.DataFactories).
-
-## Version 4.11.0
-Feature Additions:
-
-* The following linked service types have been added:
- * [OnPremisesMongoDbLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.onpremisesmongodblinkedservice)
- * [AmazonRedshiftLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.amazonredshiftlinkedservice)
- * [AwsAccessKeyLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.awsaccesskeylinkedservice)
-* The following dataset types have been added:
- * [MongoDbCollectionDataset](/dotnet/api/microsoft.azure.management.datafactories.models.mongodbcollectiondataset)
- * [AmazonS3Dataset](/dotnet/api/microsoft.azure.management.datafactories.models.amazons3dataset)
-* The following copy source types have been added:
- * [MongoDbSource](/dotnet/api/microsoft.azure.management.datafactories.models.mongodbsource)
-
-## Version 4.10.0
-* The following optional properties have been added to TextFormat:
- * [SkipLineCount](/dotnet/api/microsoft.azure.management.datafactories.models.textformat)
- * [FirstRowAsHeader](/dotnet/api/microsoft.azure.management.datafactories.models.textformat)
- * [TreatEmptyAsNull](/dotnet/api/microsoft.azure.management.datafactories.models.textformat)
-* The following linked service types have been added:
- * [OnPremisesCassandraLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.onpremisescassandralinkedservice)
- * [SalesforceLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.salesforcelinkedservice)
-* The following dataset types have been added:
- * [OnPremisesCassandraTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.onpremisescassandratabledataset)
-* The following copy source types have been added:
- * [CassandraSource](/dotnet/api/microsoft.azure.management.datafactories.models.cassandrasource)
-* Add [WebServiceInputs](/dotnet/api/microsoft.azure.management.datafactories.models.azuremlbatchexecutionactivity) property to AzureMLBatchExecutionActivity
- * Enable passing multiple web service inputs to an Azure Machine Learning experiment
-
-## Version 4.9.1
-### Bug fix
-* Deprecate WebApi-based authentication for [WebLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.weblinkedservice).
-
-## Version 4.9.0
-### Feature Additions
-* Add [EnableStaging](/dotnet/api/microsoft.azure.management.datafactories.models.copyactivity) and [StagingSettings](/dotnet/api/microsoft.azure.management.datafactories.models.stagingsettings) properties to CopyActivity. See [Staged copy](data-factory-copy-activity-performance.md#staged-copy) for details on the feature.
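-
-As a sketch of how the new properties appear in a copy activity's **typeProperties** JSON (the staging linked service name and path below are hypothetical, not part of the change log):
-
-```json
-"typeProperties": {
-    "source": { "type": "SqlSource" },
-    "sink": { "type": "SqlDWSink" },
-    "enableStaging": true,
-    "stagingSettings": {
-        "linkedServiceName": "StagingStorageLinkedService",
-        "path": "stagingcontainer/path",
-        "enableCompression": true
-    }
-}
-```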
-
-### Bug fix
-* Introduce an overload of [ActivityWindowOperationExtensions.List](/dotnet/api/microsoft.azure.management.datafactories.activitywindowoperationsextensions) method, which takes an [ActivityWindowsByActivityListParameters](/dotnet/api/microsoft.azure.management.datafactories.models.activitywindowsbyactivitylistparameters) instance.
-* Mark [WriteBatchSize](/dotnet/api/microsoft.azure.management.datafactories.models.copysink) and [WriteBatchTimeout](/dotnet/api/microsoft.azure.management.datafactories.models.copysink) as optional in CopySink.
-
-## Version 4.8.0
-### Feature Additions
-* The following optional properties have been added to Copy activity type to enable tuning of copy performance:
- * [ParallelCopies](/dotnet/api/microsoft.azure.management.datafactories.models.copyactivity)
- * [CloudDataMovementUnits](/dotnet/api/microsoft.azure.management.datafactories.models.copyactivity)
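-
-As an illustrative sketch (the values shown are examples only), these tuning properties sit in the copy activity's **typeProperties** JSON alongside the source and sink:
-
-```json
-"typeProperties": {
-    "source": { "type": "BlobSource" },
-    "sink": { "type": "SqlSink" },
-    "parallelCopies": 8,
-    "cloudDataMovementUnits": 4
-}
-```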
-
-## Version 4.7.0
-### Feature Additions
-* Added a new StorageFormat type, [OrcFormat](/dotnet/api/microsoft.azure.management.datafactories.models.orcformat), to copy files in optimized row columnar (ORC) format.
-* Add [AllowPolyBase](/dotnet/api/microsoft.azure.management.datafactories.models.sqldwsink) and PolyBaseSettings properties to SqlDWSink.
- * Enables the use of PolyBase to copy data into Azure Synapse Analytics.
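-
-A minimal sketch of a SqlDWSink that uses these properties (the reject settings shown are illustrative values, not prescriptive defaults):
-
-```json
-"sink": {
-    "type": "SqlDWSink",
-    "allowPolyBase": true,
-    "polyBaseSettings": {
-        "rejectType": "percentage",
-        "rejectValue": 10.0,
-        "rejectSampleValue": 100,
-        "useTypeDefault": true
-    }
-}
-```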
-
-## Version 4.6.1
-### Bug Fixes
-* Fixes HTTP request for listing activity windows.
- * Removes the resource group name and the data factory name from the request payload.
-
-## Version 4.6.0
-### Feature Additions
-* The following properties have been added to [PipelineProperties](/dotnet/api/microsoft.azure.management.datafactories.models.pipelineproperties):
- * [PipelineMode](/dotnet/api/microsoft.azure.management.datafactories.models.pipelineproperties)
- * [ExpirationTime](/dotnet/api/microsoft.azure.management.datafactories.models.pipelineproperties)
- * [Datasets](/dotnet/api/microsoft.azure.management.datafactories.models.pipelineproperties)
-* The following properties have been added to [PipelineRuntimeInfo](/dotnet/api/microsoft.azure.management.datafactories.common.models.pipelineruntimeinfo):
- * [PipelineState](/dotnet/api/microsoft.azure.management.datafactories.common.models.pipelineruntimeinfo)
-* Added a new [StorageFormat](/dotnet/api/microsoft.azure.management.datafactories.models.storageformat) type, [JsonFormat](/dotnet/api/microsoft.azure.management.datafactories.models.jsonformat), to define datasets whose data is in JSON format.
-
-## Version 4.5.0
-### Feature Additions
-* Added [list operations for activity window](/dotnet/api/microsoft.azure.management.datafactories.activitywindowoperationsextensions).
- * Added methods to retrieve activity windows with filters based on the entity types (that is, data factories, datasets, pipelines, and activities).
-* The following linked service types have been added:
- * [ODataLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.odatalinkedservice), [WebLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.weblinkedservice)
-* The following dataset types have been added:
- * [ODataResourceDataset](/dotnet/api/microsoft.azure.management.datafactories.models.odataresourcedataset), [WebTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.webtabledataset)
-* The following copy source types have been added:
- * [WebSource](/dotnet/api/microsoft.azure.management.datafactories.models.websource)
-
-## Version 4.4.0
-### Feature additions
-* The following linked service type has been added as a data source and sink for copy activities:
- * [AzureStorageSasLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.azurestoragesaslinkedservice). See [Azure Storage SAS Linked Service](data-factory-azure-blob-connector.md#azure-storage-sas-linked-service) for conceptual information and examples.
-
-## Version 4.3.0
-### Feature additions
-* The following linked service types have been added as data sources for copy activities:
- * [HdfsLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.hdfslinkedservice). See [Move data from HDFS using Data Factory](data-factory-hdfs-connector.md) for conceptual information and examples.
- * [OnPremisesOdbcLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.onpremisesodbclinkedservice). See [Move data From ODBC data stores using Azure Data Factory](data-factory-odbc-connector.md) for conceptual information and examples.
-
-## Version 4.2.0
-### Feature additions
-* The following new activity type has been added: [AzureMLUpdateResourceActivity](/dotnet/api/microsoft.azure.management.datafactories.models.azuremlupdateresourceactivity). For details about the activity, see [Updating Azure ML models using the Update Resource Activity](data-factory-azure-ml-batch-execution-activity.md).
-* A new optional property [updateResourceEndpoint](/dotnet/api/microsoft.azure.management.datafactories.models.azuremllinkedservice) has been added to the [AzureMLLinkedService class](/dotnet/api/microsoft.azure.management.datafactories.models.azuremllinkedservice).
-* [LongRunningOperationInitialTimeout](/dotnet/api/microsoft.azure.management.datafactories.datafactorymanagementclient) and [LongRunningOperationRetryTimeout](/dotnet/api/microsoft.azure.management.datafactories.datafactorymanagementclient) properties have been added to the [DataFactoryManagementClient](/dotnet/api/microsoft.azure.management.datafactories.datafactorymanagementclient) class.
- * Allows configuration of the timeouts for client calls to the Data Factory service.
-
-## Version 4.1.0
-### Feature additions
-* The following linked service types have been added:
- * [AzureDataLakeStoreLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.azuredatalakestorelinkedservice)
- * [AzureDataLakeAnalyticsLinkedService](/dotnet/api/microsoft.azure.management.datafactories.models.azuredatalakeanalyticslinkedservice)
-* The following activity types have been added:
- * [DataLakeAnalyticsUSQLActivity](/dotnet/api/microsoft.azure.management.datafactories.models.datalakeanalyticsusqlactivity)
-* The following dataset types have been added:
- * [AzureDataLakeStoreDataset](/dotnet/api/microsoft.azure.management.datafactories.models.azuredatalakestoredataset)
-* The following source and sink types for Copy Activity have been added:
- * [AzureDataLakeStoreSource](/dotnet/api/microsoft.azure.management.datafactories.models.azuredatalakestoresource)
- * [AzureDataLakeStoreSink](/dotnet/api/microsoft.azure.management.datafactories.models.azuredatalakestoresink)
-
-## Version 4.0.1
-### Breaking changes
-The following classes have been renamed. The new names were the original names of classes before 4.0.0 release.
-
-| Name in 4.0.0 | Name in 4.0.1 |
-|:--- |:--- |
-| AzureSqlDataWarehouseDataset |[AzureSqlDataWarehouseTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.azuresqldatawarehousetabledataset) |
-| AzureSqlDataset |[AzureSqlTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.azuresqltabledataset) |
-| AzureDataset |[AzureTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.azuretabledataset) |
-| OracleDataset |[OracleTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.oracletabledataset) |
-| RelationalDataset |[RelationalTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.relationaltabledataset) |
-| SqlServerDataset |[SqlServerTableDataset](/dotnet/api/microsoft.azure.management.datafactories.models.sqlservertabledataset) |
-
-## Version 4.0.0
-
-### Breaking changes
-
-* The following classes/interfaces have been renamed.
-
-| Old name | New name |
-|:--- |:--- |
-| ITableOperations |[IDatasetOperations](/dotnet/api/microsoft.azure.management.datafactories.idatasetoperations) |
-| Table |[Dataset](/dotnet/api/microsoft.azure.management.datafactories.models.dataset) |
-| TableProperties |[DatasetProperties](/dotnet/api/microsoft.azure.management.datafactories.models.datasetproperties) |
-| TableTypeProprerties |[DatasetTypeProperties](/dotnet/api/microsoft.azure.management.datafactories.models.datasettypeproperties) |
-| TableCreateOrUpdateParameters |[DatasetCreateOrUpdateParameters](/dotnet/api/microsoft.azure.management.datafactories.models.datasetcreateorupdateparameters) |
-| TableCreateOrUpdateResponse |[DatasetCreateOrUpdateResponse](/dotnet/api/microsoft.azure.management.datafactories.models.datasetcreateorupdateresponse) |
-| TableGetResponse |[DatasetGetResponse](/dotnet/api/microsoft.azure.management.datafactories.models.datasetgetresponse) |
-| TableListResponse |[DatasetListResponse](/dotnet/api/microsoft.azure.management.datafactories.models.datasetlistresponse) |
-| CreateOrUpdateWithRawJsonContentParameters |[DatasetCreateOrUpdateWithRawJsonContentParameters](/dotnet/api/microsoft.azure.management.datafactories.models.datasetcreateorupdatewithrawjsoncontentparameters) |
-
-* The **List** methods return paged results now. If the response contains a non-empty **NextLink** property, the client application needs to continue fetching the next page until all pages are returned. Here is an example:
-
- ```csharp
- PipelineListResponse response = client.Pipelines.List("ResourceGroupName", "DataFactoryName");
- var pipelines = new List<Pipeline>(response.Pipelines);
-
- string nextLink = response.NextLink;
- while (!string.IsNullOrEmpty(nextLink))
- {
- PipelineListResponse nextResponse = client.Pipelines.ListNext(nextLink);
- pipelines.AddRange(nextResponse.Pipelines);
-
- nextLink = nextResponse.NextLink;
- }
- ```
-
-* The **List** pipeline API returns only the summary of a pipeline instead of full details. For instance, activities in a pipeline summary contain only the name and type.
-
-### Feature additions
-* The [SqlDWSink](/dotnet/api/microsoft.azure.management.datafactories.models.sqldwsink) class supports two new properties, **SliceIdentifierColumnName** and **SqlWriterCleanupScript**, to support idempotent copy to Azure Synapse Analytics. See the [Azure Synapse Analytics](data-factory-azure-sql-data-warehouse-connector.md) article for details about these properties.
-* We now support running a stored procedure against Azure SQL Database and Azure Synapse Analytics sources as part of the Copy Activity. The [SqlSource](/dotnet/api/microsoft.azure.management.datafactories.models.sqlsource) and [SqlDWSource](/dotnet/api/microsoft.azure.management.datafactories.models.sqldwsource) classes have the following properties: **SqlReaderStoredProcedureName** and **StoredProcedureParameters**. See the [Azure SQL Database](data-factory-azure-sql-connector.md#sqlsource) and [Azure Synapse Analytics](data-factory-azure-sql-data-warehouse-connector.md#sqldwsource) articles for details about these properties.
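-
-As a sketch (the stored procedure name and parameter names are hypothetical), a SqlSource that invokes a stored procedure looks like this in the copy activity JSON:
-
-```json
-"source": {
-    "type": "SqlSource",
-    "sqlReaderStoredProcedureName": "usp_GetChangedRows",
-    "storedProcedureParameters": {
-        "stringData": { "value": "str3" },
-        "sliceYear": { "value": "$$Text.Format('{0:yyyy}', SliceStart)", "type": "Int" }
-    }
-}
-```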
data-factory Data Factory Azure Blob Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/v1/data-factory-azure-blob-connector.md
- Title: Copy data to/from Azure Blob Storage
-description: 'Learn how to copy blob data in Azure Data Factory. Use our sample: How to copy data to and from Azure Blob Storage and Azure SQL Database.'
- Previously updated: 04/12/2023
-# Copy data to or from Azure Blob Storage using Azure Data Factory
-> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
-> * [Version 1](data-factory-azure-blob-connector.md)
-> * [Version 2 (current version)](../connector-azure-blob-storage.md)
-
-> [!NOTE]
-> This article applies to version 1 of Data Factory. If you are using the current version of the Data Factory service, see [Azure Blob Storage connector in V2](../connector-azure-blob-storage.md).
-This article explains how to use the Copy Activity in Azure Data Factory to copy data to and from Azure Blob Storage. It builds on the [Data Movement Activities](data-factory-data-movement-activities.md) article, which presents a general overview of data movement with the copy activity.
-
-## Overview
-You can copy data from any supported source data store to Azure Blob Storage or from Azure Blob Storage to any supported sink data store. The following table provides a list of data stores supported as sources or sinks by the copy activity. For example, you can move data **from** a SQL Server database or a database in Azure SQL Database **to** an Azure blob storage. And, you can copy data **from** Azure blob storage **to** Azure Synapse Analytics or an Azure Cosmos DB collection.
-## Supported scenarios
-You can copy data **from Azure Blob Storage** to the following data stores:
-You can copy data from the following data stores **to Azure Blob Storage**:
-> [!IMPORTANT]
-> Copy Activity supports copying data from/to both general-purpose Azure Storage accounts and Hot/Cool Blob storage. The activity supports **reading from block, append, or page blobs**, but supports **writing to only block blobs**. Azure Premium Storage is not supported as a sink because it is backed by page blobs.
->
-> Copy Activity does not delete data from the source after the data is successfully copied to the destination. If you need to delete source data after a successful copy, create a [custom activity](data-factory-use-custom-activities.md) to delete the data and use the activity in the pipeline. For an example, see the [Delete blob or folder sample on GitHub](https://github.com/Azure/Azure-DataFactory/tree/master/SamplesV1/DeleteBlobFileFolderCustomActivity).
-
-## Get started
-You can create a pipeline with a copy activity that moves data to/from an Azure Blob Storage by using different tools/APIs.
-
-The easiest way to create a pipeline is to use the **Copy Wizard**. This article has a [walkthrough](#walkthrough-use-copy-wizard-to-copy-data-tofrom-blob-storage) for creating a pipeline to copy data from an Azure Blob Storage location to another Azure Blob Storage location. For a tutorial on creating a pipeline to copy data from an Azure Blob Storage to Azure SQL Database, see [Tutorial: Create a pipeline using Copy Wizard](data-factory-copy-data-wizard-tutorial.md).
-
-You can also use the following tools to create a pipeline: **Visual Studio**, **Azure PowerShell**, **Azure Resource Manager template**, **.NET API**, and **REST API**. See [Copy activity tutorial](data-factory-copy-data-from-azure-blob-storage-to-sql-database.md) for step-by-step instructions to create a pipeline with a copy activity.
-
-Whether you use the tools or APIs, you perform the following steps to create a pipeline that moves data from a source data store to a sink data store:
-
-1. Create a **data factory**. A data factory may contain one or more pipelines.
-2. Create **linked services** to link input and output data stores to your data factory. For example, if you are copying data from an Azure blob storage to Azure SQL Database, you create two linked services to link your Azure storage account and Azure SQL Database to your data factory. For linked service properties that are specific to Azure Blob Storage, see the [linked service properties](#linked-service-properties) section.
-3. Create **datasets** to represent input and output data for the copy operation. In the example mentioned in the last step, you create a dataset to specify the blob container and folder that contains the input data. And, you create another dataset to specify the SQL table in Azure SQL Database that holds the data copied from the blob storage. For dataset properties that are specific to Azure Blob Storage, see the [dataset properties](#dataset-properties) section.
-4. Create a **pipeline** with a copy activity that takes a dataset as an input and a dataset as an output. In the example mentioned earlier, you use BlobSource as a source and SqlSink as a sink for the copy activity. Similarly, if you are copying from Azure SQL Database to Azure Blob Storage, you use SqlSource and BlobSink in the copy activity. For copy activity properties that are specific to Azure Blob Storage, see the [copy activity properties](#copy-activity-properties) section. For details on how to use a data store as a source or a sink, click the link in the previous section for your data store.
-
-When you use the wizard, JSON definitions for these Data Factory entities (linked services, datasets, and the pipeline) are automatically created for you. When you use tools/APIs (except .NET API), you define these Data Factory entities by using the JSON format. For samples with JSON definitions for Data Factory entities that are used to copy data to/from an Azure Blob Storage, see [JSON examples](#json-examples-for-copying-data-to-and-from-blob-storage) section of this article.
-
-The following sections provide details about JSON properties that are used to define Data Factory entities specific to Azure Blob Storage.
-
-## Linked service properties
-There are two types of linked services you can use to link Azure Storage to an Azure data factory: the **AzureStorage** linked service and the **AzureStorageSas** linked service. The Azure Storage linked service provides the data factory with global access to the Azure Storage account, whereas the Azure Storage SAS (shared access signature) linked service provides restricted, time-bound access. There are no other differences between these two linked services. Choose the linked service that suits your needs. The following sections provide more details on these two linked services.
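-
-For example, an Azure Storage SAS linked service is defined with a single **sasUri** type property (a sketch; replace the placeholder with a shared access signature URI to the blob container or blob):
-
-```json
-{
-    "name": "AzureStorageSasLinkedService",
-    "properties": {
-        "type": "AzureStorageSas",
-        "typeProperties": {
-            "sasUri": "<Specify SAS URI of the Azure Storage resource>"
-        }
-    }
-}
-```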
-## Dataset properties
-To specify a dataset to represent input or output data in Azure Blob Storage, set the **type** property of the dataset to **AzureBlob**. Set the **linkedServiceName** property of the dataset to the name of the Azure Storage or Azure Storage SAS linked service. The type properties of the dataset specify the **blob container** and the **folder** in the blob storage.
-
-For a full list of JSON sections & properties available for defining datasets, see the [Creating datasets](data-factory-create-datasets.md) article. Sections such as structure, availability, and policy of a dataset JSON are similar for all dataset types (Azure SQL, Azure blob, Azure table, etc.).
-
-Data Factory supports the following CLS-compliant .NET-based type values for providing type information in "structure" for schema-on-read data sources like Azure blob: Int16, Int32, Int64, Single, Double, Decimal, Byte[], Bool, String, Guid, Datetime, Datetimeoffset, Timespan. Data Factory automatically performs type conversions when moving data from a source data store to a sink data store.
-
-The **typeProperties** section is different for each type of dataset and provides information about the location, format, and so on, of the data in the data store. The typeProperties section for a dataset of type **AzureBlob** has the following properties:
-
-| Property | Description | Required |
-| | | |
-| folderPath |Path to the container and folder in the blob storage. Example: myblobcontainer\myblobfolder\ |Yes |
-| fileName |Name of the blob. fileName is optional and case-sensitive.<br/><br/>If you specify a fileName, the activity (including Copy) works on the specific blob.<br/><br/>When fileName is not specified, Copy includes all blobs in the folderPath for the input dataset.<br/><br/>When **fileName** is not specified for an output dataset and **preserveHierarchy** is not specified in the activity sink, the name of the generated file is in the following format: `Data.<Guid>.txt` (for example: `Data.0a405f8a-93ff-4c6f-b3be-f69616f1df7a.txt`). |No |
-| partitionedBy |partitionedBy is an optional property. You can use it to specify a dynamic folderPath and filename for time series data. For example, folderPath can be parameterized for every hour of data. See the [Using partitionedBy property section](#using-partitionedby-property) for details and examples. |No |
-| format | The following format types are supported: **TextFormat**, **JsonFormat**, **AvroFormat**, **OrcFormat**, **ParquetFormat**. Set the **type** property under format to one of these values. For more information, see [Text Format](data-factory-supported-file-and-compression-formats.md#text-format), [Json Format](data-factory-supported-file-and-compression-formats.md#json-format), [Avro Format](data-factory-supported-file-and-compression-formats.md#avro-format), [Orc Format](data-factory-supported-file-and-compression-formats.md#orc-format), and [Parquet Format](data-factory-supported-file-and-compression-formats.md#parquet-format) sections. <br><br> If you want to **copy files as-is** between file-based stores (binary copy), skip the format section in both input and output dataset definitions. |No |
-| compression | Specify the type and level of compression for the data. Supported types are: **GZip**, **Deflate**, **BZip2**, and **ZipDeflate**. Supported levels are: **Optimal** and **Fastest**. For more information, see [File and compression formats in Azure Data Factory](data-factory-supported-file-and-compression-formats.md#compression-support). |No |
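-
-Putting these properties together, a **typeProperties** section for an AzureBlob dataset that reads compressed delimited text might look like the following sketch (the container, folder, and file names are illustrative):
-
-```json
-"typeProperties": {
-    "folderPath": "mycontainer/myfolder/",
-    "fileName": "input.csv.gz",
-    "format": {
-        "type": "TextFormat",
-        "columnDelimiter": ","
-    },
-    "compression": {
-        "type": "GZip",
-        "level": "Optimal"
-    }
-}
-```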
-
-### Using partitionedBy property
-As mentioned in the previous section, you can specify a dynamic folderPath and filename for time series data with the **partitionedBy** property, [Data Factory functions, and the system variables](data-factory-functions-variables.md).
-
-For more information on time series datasets, scheduling, and slices, see [Creating Datasets](data-factory-create-datasets.md) and [Scheduling & Execution](data-factory-scheduling-and-execution.md) articles.
-
-#### Sample 1
-
-```json
-"folderPath": "wikidatagateway/wikisampledataout/{Slice}",
-"partitionedBy":
-[
- { "name": "Slice", "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyyMMddHH" } },
-],
-```
-
-In this example, {Slice} is replaced with the value of the Data Factory system variable SliceStart in the specified format (yyyyMMddHH). SliceStart refers to the start time of the slice, so the folderPath is different for each slice. For example: wikidatagateway/wikisampledataout/2014100103 or wikidatagateway/wikisampledataout/2014100104.
-
-#### Sample 2
-
-```json
-"folderPath": "wikidatagateway/wikisampledataout/{Year}/{Month}/{Day}",
-"fileName": "{Hour}.csv",
-"partitionedBy":
-[
- { "name": "Year", "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
- { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
- { "name": "Day", "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } },
- { "name": "Hour", "value": { "type": "DateTime", "date": "SliceStart", "format": "hh" } }
-],
-```
-
-In this example, year, month, day, and time of SliceStart are extracted into separate variables that are used by folderPath and fileName properties.
-
-## Copy activity properties
-For a full list of sections & properties available for defining activities, see the [Creating Pipelines](data-factory-create-pipelines.md) article. Properties such as name, description, input and output datasets, and policies are available for all types of activities, whereas properties available in the **typeProperties** section of the activity vary with each activity type. For the Copy activity, they vary depending on the types of sources and sinks. If you are moving data from Azure Blob Storage, you set the source type in the copy activity to **BlobSource**. Similarly, if you are moving data to Azure Blob Storage, you set the sink type to **BlobSink**. This section provides a list of properties supported by BlobSource and BlobSink.
-
-**BlobSource** supports the following properties in the **typeProperties** section:
-
-| Property | Description | Allowed values | Required |
-| | | | |
-| recursive |Indicates whether the data is read recursively from the sub folders or only from the specified folder. |True (default value), False |No |
-
-**BlobSink** supports the following properties in the **typeProperties** section:
-
-| Property | Description | Allowed values | Required |
-| | | | |
-| copyBehavior |Defines the copy behavior when the source is BlobSource or FileSystem. |<b>PreserveHierarchy</b>: preserves the file hierarchy in the target folder. The relative path of the source file to the source folder is identical to the relative path of the target file to the target folder.<br/><br/><b>FlattenHierarchy</b>: all files from the source folder are placed in the first level of the target folder. The target files have auto-generated names. <br/><br/><b>MergeFiles</b>: merges all files from the source folder into one file. If the file/blob name is specified, the merged file name is the specified name; otherwise, the file name is auto-generated. |No |
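-
-For example, a sink that merges all source files into a single blob is sketched as:
-
-```json
-"sink": {
-    "type": "BlobSink",
-    "copyBehavior": "MergeFiles"
-}
-```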
-
-**BlobSource** also supports these two properties for backward compatibility.
-
-* **treatEmptyAsNull**: Specifies whether to treat a null or empty string as a null value.
-* **skipHeaderLineCount**: Specifies how many lines need to be skipped. It is applicable only when the input dataset uses TextFormat.
-
-Similarly, **BlobSink** supports the following property for backward compatibility.
-
-* **blobWriterAddHeader**: Specifies whether to add a header of column definitions while writing to an output dataset.
-
-Datasets now support the following properties that implement the same functionality: **treatEmptyAsNull**, **skipLineCount**, **firstRowAsHeader**.
-
-The following table provides guidance on using the new dataset properties in place of these blob source/sink properties.
-
-| Copy Activity property | Dataset property |
-|:--- |:--- |
-| skipHeaderLineCount on BlobSource |skipLineCount and firstRowAsHeader. Lines are skipped first and then the first row is read as a header. |
-| treatEmptyAsNull on BlobSource |treatEmptyAsNull on input dataset |
-| blobWriterAddHeader on BlobSink |firstRowAsHeader on output dataset |
-
-See [Specifying TextFormat](data-factory-supported-file-and-compression-formats.md#text-format) section for detailed information on these properties.
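-
-For example, instead of setting skipHeaderLineCount and treatEmptyAsNull on BlobSource, you can express the same behavior in the input dataset's format section (a sketch):
-
-```json
-"format": {
-    "type": "TextFormat",
-    "columnDelimiter": ",",
-    "skipLineCount": 2,
-    "firstRowAsHeader": true,
-    "treatEmptyAsNull": true
-}
-```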
-
-### recursive and copyBehavior examples
-This section describes the resulting behavior of the Copy operation for different combinations of recursive and copyBehavior values.
-
-| recursive | copyBehavior | Resulting behavior |
-| | | |
-| true |preserveHierarchy |For a source folder Folder1 with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target folder Folder1 is created with the same structure as the source<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5. |
-| true |flattenHierarchy |For a source folder Folder1 with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target Folder1 is created with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File5 |
-| true |mergeFiles |For a source folder Folder1 with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target Folder1 is created with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1 + File2 + File3 + File4 + File 5 contents are merged into one file with auto-generated file name |
-| false |preserveHierarchy |For a source folder Folder1 with the following structure: <br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/><br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
-| false |flattenHierarchy |For a source folder Folder1 with the following structure:<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;auto-generated name for File2<br/><br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
-| false |mergeFiles |For a source folder Folder1 with the following structure:<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File2<br/>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5<br/><br/>the target folder Folder1 is created with the following structure<br/><br/>Folder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;File1 + File2 contents are merged into one file with auto-generated file name. auto-generated name for File1<br/><br/>Subfolder1 with File3, File4, and File5 are not picked up. |
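-In a copy activity definition, these options are set in the activity's type properties. A minimal sketch of the source and sink sections (the surrounding pipeline JSON is omitted):
-
-```json
-"typeProperties": {
-  "source": {
-    "type": "BlobSource",
-    "recursive": true
-  },
-  "sink": {
-    "type": "BlobSink",
-    "copyBehavior": "MergeFiles"
-  }
-}
-```
-
-The other `copyBehavior` values from the table are written as `PreserveHierarchy` and `FlattenHierarchy`.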
-
-## Walkthrough: Use Copy Wizard to copy data to/from Blob Storage
-Let's look at how to quickly copy data to and from Azure Blob storage. In this walkthrough, both the source and destination data stores are of type Azure Blob Storage. The pipeline copies data from one folder to another folder in the same blob container. The walkthrough is intentionally simple to show the settings and properties that apply when you use Blob Storage as a source or sink.
-
-### Prerequisites
-1. Create a general-purpose **Azure Storage account** if you don't have one already. You use the blob storage as both the **source** and **destination** data store in this walkthrough. If you don't have an Azure storage account, see the [Create a storage account](../../storage/common/storage-account-create.md) article for steps to create one.
-2. Create a blob container named **adfblobconnector** in the storage account.
-3. Create a folder named **input** in the **adfblobconnector** container.
-4. Create a file named **emp.txt** with the following content and upload it to the **input** folder by using a tool such as [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/):
-   ```text
-   John, Doe
-   Jane, Doe
-   ```
-
-### Create the data factory
-1. Sign in to the [Azure portal](https://portal.azure.com).
-2. Click **Create a resource** from the top-left corner, click **Intelligence + analytics**, and click **Data Factory**.
-3. In the **New data factory** pane:
- 1. Enter **ADFBlobConnectorDF** for the **name**. The name of the Azure data factory must be globally unique. If you receive the error `Data factory name "ADFBlobConnectorDF" is not available`, change the name of the data factory (for example, yournameADFBlobConnectorDF) and try again. See the [Data Factory - Naming Rules](data-factory-naming-rules.md) topic for naming rules for Data Factory artifacts.
- 2. Select your Azure **subscription**.
- 3. For Resource Group, select **Use existing** to select an existing resource group (or) select **Create new** to enter a name for a resource group.
- 4. Select a **location** for the data factory.
- 5. Select the **Pin to dashboard** check box at the bottom of the blade.
- 6. Click **Create**.
-4. After the creation is complete, you see the **Data Factory** blade as shown in the following image:
- :::image type="content" source="./media/data-factory-azure-blob-connector/data-factory-home-page.png" alt-text="Data factory home page":::
-
-### Copy Wizard
-1. On the Data Factory home page, click the **Copy data** tile to launch **Copy Data Wizard** in a separate tab.
-
- > [!NOTE]
- > If you see that the web browser is stuck at "Authorizing...", disable/uncheck **Block third-party cookies and site data** setting (or) keep it enabled and create an exception for **login.microsoftonline.com** and then try launching the wizard again.
-2. In the **Properties** page:
- 1. Enter **CopyPipeline** for **Task name**. The task name is the name of the pipeline in your data factory.
- 2. Enter a **description** for the task (optional).
- 3. For **Task cadence or Task schedule**, keep the **Run regularly on schedule** option. If you want to run this task only once instead of repeatedly on a schedule, select **Run once now**. If you select the **Run once now** option, a [one-time pipeline](data-factory-create-pipelines.md#onetime-pipeline) is created.
- 4. Keep the settings for **Recurring pattern**. This task runs daily between the start and end times you specify in the next step.
- 5. Change the **Start date time** to **04/21/2017**.
- 6. Change the **End date time** to **04/25/2017**. You may want to type the date instead of browsing through the calendar.
- 7. Click **Next**.
- :::image type="content" source="./media/data-factory-azure-blob-connector/copy-tool-properties-page.png" alt-text="Copy Tool - Properties page":::
-3. On the **Source data store** page, click **Azure Blob Storage** tile. You use this page to specify the source data store for the copy task. You can use an existing data store linked service (or) specify a new data store. To use an existing linked service, you would select **FROM EXISTING LINKED SERVICES** and select the right linked service.
- :::image type="content" source="./media/data-factory-azure-blob-connector/copy-tool-source-data-store-page.png" alt-text="Copy Tool - Source data store page":::
-4. On the **Specify the Azure Blob storage account** page:
- 1. Keep the auto-generated name for **Connection name**. The connection name is the name of the linked service of type: Azure Storage.
- 2. Confirm that **From Azure subscriptions** option is selected for **Account selection method**.
- 3. Select your Azure subscription or keep **Select all** for **Azure subscription**.
- 4. Select an **Azure storage account** from the list of Azure storage accounts available in the selected subscription. You can also choose to enter storage account settings manually by selecting **Enter manually** option for the **Account selection method**.
- 5. Click **Next**.
- :::image type="content" source="./media/data-factory-azure-blob-connector/copy-tool-specify-azure-blob-storage-account.png" alt-text="Copy Tool - Specify the Azure Blob storage account":::
-5. On **Choose the input file or folder** page:
- 1. Double-click **adfblobconnector**.
- 2. Select **input**, and click **Choose**. In this walkthrough, you select the input folder. You could also select the emp.txt file in the folder instead.
- :::image type="content" source="./media/data-factory-azure-blob-connector/copy-tool-choose-input-file-or-folder.png" alt-text="Copy Tool - Choose the input file or folder 1":::
-6. On the **Choose the input file or folder** page:
- 1. Confirm that the **file or folder** is set to **adfblobconnector/input**. If the files are in sub folders, for example, 2017/04/01, 2017/04/02, and so on, enter adfblobconnector/input/{year}/{month}/{day} for file or folder. When you press TAB out of the text box, you see three drop-down lists to select formats for year (yyyy), month (MM), and day (dd).
- 2. Do not select **Copy file recursively**. This option recursively traverses folders for files to be copied to the destination.
- 3. Do not select the **Binary copy** option. This option performs a binary copy of the source file to the destination. Leave it cleared for this walkthrough so that you can see more options on the next pages.
- 4. Confirm that the **Compression type** is set to **None**. Select a value for this option if your source files are compressed in one of the supported formats.
- 5. Click **Next**.
- :::image type="content" source="./media/data-factory-azure-blob-connector/chose-input-file-folder.png" alt-text="Copy Tool - Choose the input file or folder 2":::
-7. On the **File format settings** page, you see the delimiters and the schema that the wizard auto-detected by parsing the file.
- 1. Confirm the following options:
- a. The **file format** is set to **Text format**. You can see all the supported formats in the drop-down list. For example: JSON, Avro, ORC, Parquet.
- b. The **column delimiter** is set to `Comma (,)`. You can see the other column delimiters supported by Data Factory in the drop-down list. You can also specify a custom delimiter.
- c. The **row delimiter** is set to `Carriage Return + Line feed (\r\n)`. You can see the other row delimiters supported by Data Factory in the drop-down list. You can also specify a custom delimiter.
- d. The **skip line count** is set to **0**. If you want a few lines to be skipped at the top of the file, enter the number here.
- e. The **first data row contains column names** option is not selected. If the source files contain column names in the first row, select this option.
- f. The **treat empty column value as null** option is selected.
- 2. Expand **Advanced settings** to see the advanced options available.
- 3. At the bottom of the page, see the **preview** of data from the emp.txt file.
- 4. Click **SCHEMA** tab at the bottom to see the schema that the copy wizard inferred by looking at the data in the source file.
- 5. Click **Next** after you review the delimiters and preview data.
- :::image type="content" source="./media/data-factory-azure-blob-connector/copy-tool-file-format-settings.png" alt-text="Copy Tool - File format settings":::
-8. On the **Destination data store page**, select **Azure Blob Storage**, and click **Next**. You are using the Azure Blob Storage as both the source and destination data stores in this walkthrough.
- :::image type="content" source="media/data-factory-azure-blob-connector/select-destination-data-store.png" alt-text="Copy Tool - select destination data store":::
-9. On **Specify the Azure Blob storage account** page:
- 1. Enter **AzureStorageLinkedService** for the **Connection name** field.
- 2. Confirm that **From Azure subscriptions** option is selected for **Account selection method**.
- 3. Select your Azure **subscription**.
- 4. Select your Azure storage account.
- 5. Click **Next**.
-10. On the **Choose the output file or folder** page:
- 1. Specify **Folder path** as **adfblobconnector/output/{year}/{month}/{day}**, and then press **TAB**.
- 1. For the **year**, select **yyyy**.
- 1. For the **month**, confirm that it is set to **MM**.
- 1. For the **day**, confirm that it is set to **dd**.
- 1. Confirm that the **compression type** is set to **None**.
- 1. Confirm that the **copy behavior** is set to **Merge files**. If the output file with the same name already exists, the new content is added to the same file at the end.
- 1. Click **Next**.
- :::image type="content" source="media/data-factory-azure-blob-connector/choose-the-output-file-or-folder.png" alt-text="Copy Tool - Choose output file or folder":::
-11. On the **File format settings** page, review the settings, and click **Next**. One of the additional options here is to add a header to the output file. If you select that option, a header row is added with names of the columns from the schema of the source. You can rename the default column names when viewing the schema for the source. For example, you could change the first column to First Name and second column to Last Name. Then, the output file is generated with a header with these names as column names.
- :::image type="content" source="media/data-factory-azure-blob-connector/file-format-destination.png" alt-text="Copy Tool - File format settings for destination":::
-12. On the **Performance settings** page, confirm that **cloud units** and **parallel copies** are set to **Auto**, and click Next. For details about these settings, see [Copy activity performance and tuning guide](data-factory-copy-activity-performance.md#parallel-copy).
- :::image type="content" source="media/data-factory-azure-blob-connector/copy-performance-settings.png" alt-text="Copy Tool - Performance settings":::
-13. On the **Summary** page, review all settings (task properties, settings for source and destination, and copy settings), and click **Next**.
- :::image type="content" source="media/data-factory-azure-blob-connector/copy-tool-summary-page.png" alt-text="Copy Tool - Summary page":::
-14. Review the information on the **Summary** page, and click **Finish**. The wizard creates two linked services, two datasets (input and output), and one pipeline in the data factory (from where you launched the Copy Wizard).
- :::image type="content" source="media/data-factory-azure-blob-connector/copy-tool-deployment-page.png" alt-text="Copy Tool - Deployment page":::
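-Behind the scenes, the **{year}/{month}/{day}** folder path you entered in the wizard corresponds to a partitioned folder path on the generated output dataset. The following is an illustrative sketch of what that dataset definition might look like; the dataset and linked service names are placeholders, and the wizard may generate different names.
-
-```json
-{
-  "name": "OutputDataset",
-  "properties": {
-    "type": "AzureBlob",
-    "linkedServiceName": "AzureStorageLinkedService",
-    "typeProperties": {
-      "folderPath": "adfblobconnector/output/{Year}/{Month}/{Day}",
-      "partitionedBy": [
-        { "name": "Year", "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
-        { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
-        { "name": "Day", "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } }
-      ],
-      "format": { "type": "TextFormat", "columnDelimiter": "," }
-    },
-    "availability": {
-      "frequency": "Day",
-      "interval": 1
-    }
-  }
-}
-```
-
-Each daily slice resolves the `{Year}/{Month}/{Day}` variables from the slice start time, which is why the output folders you verify later are named by date.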
-
-### Monitor the pipeline (copy task)
-
-1. Click the link `Click here to monitor copy pipeline` on the **Deployment** page.
-2. You should see the **Monitor and Manage application** in a separate tab.
- :::image type="content" source="media/data-factory-azure-blob-connector/monitor-manage-app.png" alt-text="Monitor and Manage App":::
-3. Change the **start** time at the top to `04/19/2017` and **end** time to `04/27/2017`, and then click **Apply**.
-4. You should see five activity windows in the **ACTIVITY WINDOWS** list. The **WindowStart** times should cover all days from pipeline start to pipeline end times.
-5. Click the **Refresh** button for the **ACTIVITY WINDOWS** list a few times until the status of all the activity windows is **Ready**.
-6. Now, verify that the output files are generated in the output folder of adfblobconnector container. You should see the following folder structure in the output folder:
-
- ```output
- 2017/04/21
- 2017/04/22
- 2017/04/23
- 2017/04/24
- 2017/04/25
- ```
-
- For detailed information about monitoring and managing data factories, see the [Monitor and manage Data Factory pipeline](data-factory-monitor-manage-app.md) article.
-
-### Data Factory entities