Updates from: 09/10/2021 03:10:52
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c User Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/user-migration.md
Migrating from another identity provider to Azure Active Directory B2C (Azure AD B2C) might also require migrating existing user accounts. Two migration methods are discussed here: *pre migration* and *seamless migration*. With either approach, you're required to write an application or script that uses the [Microsoft Graph API](microsoft-graph-operations.md) to create user accounts in Azure AD B2C.
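As a rough, hedged illustration of that step, the sketch below creates one migrated local account with the Microsoft Graph .NET SDK. The tenant name, app credentials, and user values are placeholders, and a real migration app would loop over the accounts exported from the legacy identity provider.

```csharp
using System.Collections.Generic;
using Azure.Identity;
using Microsoft.Graph;

// Placeholders: use an app registration in your Azure AD B2C tenant that has the
// User.ReadWrite.All application permission granted.
var credential = new ClientSecretCredential("<tenantId>", "<clientId>", "<clientSecret>");
var graphClient = new GraphServiceClient(credential);

var migratedUser = new User
{
    DisplayName = "Casey Jensen", // value taken from the legacy identity provider
    Identities = new List<ObjectIdentity>
    {
        new ObjectIdentity
        {
            SignInType = "emailAddress", // local account that signs in with an email address
            Issuer = "<tenantName>.onmicrosoft.com",
            IssuerAssignedId = "casey.jensen@contoso.com"
        }
    },
    PasswordProfile = new PasswordProfile
    {
        Password = "<newOrRandomPassword>", // known password for pre migration, random for seamless migration
        ForceChangePasswordNextSignIn = false
    },
    PasswordPolicies = "DisablePasswordExpiration"
};

// Create the account in the Azure AD B2C directory.
await graphClient.Users.Request().AddAsync(migratedUser);
```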
+Watch this video to learn about Azure AD B2C user migration strategies and steps to consider.
+
+> [!VIDEO https://www.youtube.com/embed/lCWR6PGUgz0]
+ ## Pre migration In the pre migration flow, your migration application performs these steps for each user account:
active-directory On Premises Ecma Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-ecma-configure.md
After waiting, check your data source to see if new users are being provisioned.
1. Use the provisioning logs to determine which users were provisioned successfully or unsuccessfully. 1. Build custom alerts, dashboards, and queries by using the Azure Monitor integration.
-1. If the provisioning configuration seems to be in an unhealthy state, the application goes into quarantine. Learn more about [quarantine states](https://github.com/MicrosoftDocs/azure-docs-pr/compare/application-provisioning-quarantine-status.md?expand=1).
+1. If the provisioning configuration seems to be in an unhealthy state, the application goes into quarantine. Learn more about [quarantine states](https://github.com/MicrosoftDocs/azure-docs-pr/blob/master/articles/active-directory/app-provisioning/application-provisioning-quarantine-status.md).
## Next steps
active-directory Concept Mfa Licensing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-mfa-licensing.md
Previously updated : 08/25/2021 Last updated : 09/08/2021
The following table details the different ways to get Azure AD Multi-Factor Auth
| [All Microsoft 365 plans](https://www.microsoft.com/microsoft-365/compare-microsoft-365-enterprise-plans) | Azure AD Multi-Factor Authentication can be enabled for all users by using [security defaults](../fundamentals/concept-fundamentals-security-defaults.md). Management of Azure AD Multi-Factor Authentication is through the Microsoft 365 portal. For an improved user experience, upgrade to Azure AD Premium P1 or P2 and use Conditional Access. For more information, see [secure Microsoft 365 resources with multi-factor authentication](/microsoft-365/admin/security-and-compliance/set-up-multi-factor-authentication). | | [Office 365 free](https://www.microsoft.com/microsoft-365/enterprise/compare-office-365-plans)<br>[Azure AD free](../verifiable-credentials/how-to-create-a-free-developer-account.md) | You can use [security defaults](../fundamentals/concept-fundamentals-security-defaults.md) to prompt users for multi-factor authentication as needed, but you don't have granular control of enabled users or scenarios; it does, however, provide that additional security step.<br /> Even when security defaults aren't used to enable multi-factor authentication for everyone, users assigned the *Azure AD Global Administrator* role can be configured to use multi-factor authentication. This feature of the free tier makes sure the critical administrator accounts are protected by multi-factor authentication. |
-## Feature comparison of versions
+## Feature comparison based on licenses
The following table provides a list of the features that are available in the various versions of Azure AD Multi-Factor Authentication. Plan out your needs for securing user authentication, then determine which approach meets those requirements. For example, although Azure AD Free provides security defaults that provide Azure AD Multi-Factor Authentication, only the mobile authenticator app can be used for the authentication prompt, not a phone call or SMS. This approach may be a limitation if you can't ensure the mobile authentication app is installed on a user's personal device. See [Azure AD Free tier](#azure-ad-free-tier) later in this topic for more details.
The following table provides a list of the features that are available in the va
| Remember MFA for trusted devices | | ● | ● | ● | | MFA for on-premises applications | | | | ● |
+## Compare multi-factor authentication policies
+
+The following table provides deployment considerations for different MFA policies.
+
+| Policy | Security defaults | Conditional Access | Per-user MFA |
+| --- |:---:|:---:|:---:|
+| **Management** |
+| Standard set of security rules to keep your company safe | ● | | |
+| One-click on/off | ● | | |
+| Included in Office 365 licensing (See [license considerations](#available-versions-of-azure-ad-multi-factor-authentication)) | ● | | ● |
+| Pre-configured templates in Microsoft 365 Admin Center wizard | ● | ● | |
+| Configuration flexibility | | ● | |
+| **Functionality** |
+| Exempt users from the policy | | ● | ● |
+| Authenticate by phone call or SMS | | ● | ● |
+| Authenticate by Microsoft Authenticator and Software tokens | ● | ● | ● |
+| Authenticate by FIDO2, Windows Hello for Business, and Hardware tokens | | ● | ● |
+| Blocks legacy authentication protocols | ● | ● | ● |
+| New employees are automatically protected | ● | ● | |
+| Dynamic MFA triggers based on risk events | | ● | |
+| Authentication and authorization policies | | ● | |
+| Configurable based on location and device state | | ● | |
+| Support for "report only" mode | | ● | |
+| Ability to completely block users/services | | ● | |
+ ## Purchase and enable Azure AD Multi-Factor Authentication To use Azure AD Multi-Factor Authentication, register for or purchase an eligible Azure AD tier. Azure AD comes in four editions: Free, Office 365, Premium P1, and Premium P2.
active-directory Howto Password Ban Bad On Premises Deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-password-ban-bad-on-premises-deploy.md
The following core requirements apply:
|`https://enterpriseregistration.windows.net`|Azure AD Password Protection functionality| > [!NOTE]
-> Some endpoints, such as the CRL endpoint, are not addressed in this article. For a list of all supported endpoints, see [Microsoft 365 URLs and IP address ranges](/microsoft-365/enterprise/urls-and-ip-address-ranges?view=o365-worldwide#microsoft-365-common-and-office-online).
+> Some endpoints, such as the CRL endpoint, are not addressed in this article. For a list of all supported endpoints, see [Microsoft 365 URLs and IP address ranges](/microsoft-365/enterprise/urls-and-ip-address-ranges#microsoft-365-common-and-office-online).
### Azure AD Password Protection DC agent
active-directory How To Accidental Deletes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/how-to-accidental-deletes.md
Previously updated : 01/25/2021 Last updated : 09/10/2021
The following document describes the accidental deletion feature for Azure AD Co
To use this feature, you set a threshold for the number of deleted objects at which synchronization should stop. If this threshold is reached, synchronization stops and a notification is sent to the email address that you specify, so you can investigate what is going on.
+For additional information and an example, see the following video.
+
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWK5mV]
+ ## Configure accidental delete prevention To use the new feature, follow the steps below.
active-directory How To Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/how-to-configure.md
Previously updated : 01/21/2021 Last updated : 09/10/2021
# Create a new configuration for Azure AD Connect cloud sync
-After you've installed the Azure AD Connect provisioning agent, you need to sign in to the Azure portal and configure it. Follow these steps to enable the agent.
+The following document will guide you through configuring Azure AD Connect cloud sync. For additional information and an example of how to configure cloud sync, see the video below.
++
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWKact]
+ ## Configure provisioning To configure provisioning, follow these steps.
active-directory How To Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/how-to-install.md
Previously updated : 11/16/2020 Last updated : 09/10/2021
This article walks you through the installation process for the Azure Active Dir
>[!NOTE] >This article deals with installing the provisioning agent by using the wizard. For information on installing the Azure AD Connect provisioning agent by using a command-line interface (CLI), see [Install the Azure AD Connect provisioning agent by using a CLI and PowerShell](how-to-install-pshell.md).
+For additional information and an example, see the following video.
+
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWK5mR]
+ ## Group Managed Service Accounts A group Managed Service Account (gMSA) is a managed domain account that provides automatic password management, simplified service principal name (SPN) management, and the ability to delegate the management to other administrators. It also extends this functionality over multiple servers. Azure AD Connect cloud sync supports and recommends the use of a group Managed Service Account for running the agent. For more information on a group Managed Service Account, see [Group Managed Service Accounts](/windows-server/security/group-managed-service-accounts/group-managed-service-accounts-overview).
active-directory How To On Demand Provision https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/how-to-on-demand-provision.md
Previously updated : 09/14/2020 Last updated : 09/10/2021
You can use the cloud sync feature of Azure Active Directory (Azure AD) Connect
> [!IMPORTANT] > When you use on-demand provisioning, the scoping filters are not applied to the user that you selected. You can use on-demand provisioning on users who are outside the organization units that you specified.
+For additional information and an example, see the following video.
+
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWK5mW]
+ ## Validate a user To use on-demand provisioning, follow these steps:
active-directory Plan Cloud Sync Topologies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/plan-cloud-sync-topologies.md
Previously updated : 02/26/2020 Last updated : 09/10/2021
This article describes various on-premises and Azure Active Directory (Azure AD)
> [!IMPORTANT] > Microsoft doesn't support modifying or operating Azure AD Connect cloud sync outside of the configurations or actions that are formally documented. Any of these configurations or actions might result in an inconsistent or unsupported state of Azure AD Connect cloud sync. As a result, Microsoft can't provide technical support for such deployments.
+For more information, see the following video.
+
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWJ8l5]
+ ## Things to remember about all scenarios and topologies The following is a list of information to keep in mind when selecting a solution.
active-directory What Is Cloud Sync https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/what-is-cloud-sync.md
Previously updated : 12/11/2020 Last updated : 09/10/2021
Azure AD Connect cloud sync is a new offering from Microsoft designed to meet and
- Multiple provisioning agents can be used to simplify high availability deployments, particularly critical for organizations relying upon password hash synchronization from AD to Azure AD. - Support for large groups with up to 50K members. It is recommended to use only the OU scoping filter when synchronizing large groups. - ![What is Azure AD Connect](media/what-is-cloud-sync/architecture-1.png) ## How is Azure AD Connect cloud sync different from Azure AD Connect sync?
active-directory Concept Conditional Access Cloud Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-conditional-access-cloud-apps.md
Administrators can select published authentication contexts in their Conditional
For more information about authentication context use in applications, see the following articles. -- [Microsoft Information Protection sensitivity labels to protect SharePoint sites](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites?view=o365-worldwide#more-information-about-the-dependencies-for-the-authentication-context-option&preserve-view=true)
+- [Microsoft Information Protection sensitivity labels to protect SharePoint sites](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites#more-information-about-the-dependencies-for-the-authentication-context-option)
- [Microsoft Cloud App Security](/cloud-app-security/session-policy-aad?branch=pr-en-us-2082#require-step-up-authentication-authentication-context) - [Custom applications](../develop/developer-guide-conditional-access-authentication-context.md)
active-directory Msal Net Migration Public Client https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-migration-public-client.md
+
+ Title: Migrate public client applications to MSAL.NET
+
+description: Learn how to migrate a public client application from Azure Active Directory Authentication Library for .NET to Microsoft Authentication Library for .NET.
++++++++ Last updated : 08/31/2021+++
+#Customer intent: As an application developer, I want to migrate my public client app from ADAL.NET to MSAL.NET.
++
+# Migrate public client applications from ADAL.NET to MSAL.NET
+
+This article describes how to migrate a public client application from Azure Active Directory Authentication Library for .NET (ADAL.NET) to Microsoft Authentication Library for .NET (MSAL.NET). Public client applications are desktop apps (including Win32, WPF, and UWP apps) and mobile apps that call another service on the user's behalf. For more information about public client applications, see [Authentication flows and application scenarios](authentication-flows-app-scenarios.md).
+
+## Migration steps
+
+1. Find the code that uses ADAL.NET in your app.
+
+ The code that uses ADAL in a public client application instantiates `AuthenticationContext` and calls an override of `AcquireTokenAsync` with the following parameters:
+
+ - A `resourceId` string. This variable is the app ID URI of the web API that you want to call.
+ - A `clientId` which is the identifier for your application, also known as App ID.
+
+2. After you've identified that you have apps that are using ADAL.NET, install the MSAL.NET NuGet package [Microsoft.Identity.Client](https://www.nuget.org/packages/Microsoft.Identity.Client) and update your project library references. For more information, see [Install a NuGet package](https://www.bing.com/search?q=install+nuget+package).
+
+3. Update the code according to the public client application scenario. Some steps are common and apply across all the public client scenarios. Other steps are unique to each scenario.
+
+ The public client scenarios are:
+
+ - [Web Authentication Manager](scenario-desktop-acquire-token-wam.md), the preferred broker-based authentication on Windows.
+ - [Interactive Authentication](scenario-desktop-acquire-token-interactive.md), where the user is shown a web-based interface to complete the sign-in process.
+ - [Integrated Windows Authentication](scenario-desktop-acquire-token-integrated-windows-authentication.md), where a user signs in by using the same identity they used to sign in to the Windows domain (for domain-joined or Azure AD-joined machines).
+ - [Username/Password](scenario-desktop-acquire-token-username-password.md), where the sign-in occurs by providing a username/password credential.
+ - [Device Code Flow](scenario-desktop-acquire-token-device-code-flow.md), where a device with a limited UX shows you a device code so you can complete the authentication flow on another device.
++
+## [Interactive](#tab/interactive)
+
+Interactive scenarios are where your public client application shows a login user interface hosted in a browser, and the user is required to sign in interactively.
+
+#### Find out if your code uses interactive scenarios
+
+The ADAL code for your app in a public client application that uses interactive authentication instantiates `AuthenticationContext` and includes a call to `AcquireTokenAsync` with the following parameters:
+ - A `clientId` which is a GUID representing your application registration
+ - A `resourceUrl` which indicates the resource you are asking the token for
+ - A Uri that is the reply URL
+ - A `PlatformParameters` object.
+
+ #### Update the code for interactive scenarios
+
+ [!INCLUDE [Common steps](includes/msal-net-adoption-steps-public-clients.md)]
+
+In this case, we replace the call to `AuthenticationContext.AcquireTokenAsync` with a call to `IPublicClientApplication.AcquireTokenInteractive`.
+
+Here's a comparison of ADAL.NET and MSAL.NET code for interactive scenarios:
+
+ADAL:
+
+```csharp
+var ac = new AuthenticationContext("https://login.microsoftonline.com/<tenantId>");
+AuthenticationResult result;
+result = await ac.AcquireTokenAsync("<clientId>",
+ "https://resourceUrl",
+ new Uri("https://ClientReplyUrl"),
+ new PlatformParameters(PromptBehavior.Auto));
+```
+MSAL:
+
+```csharp
+// 1. Configuration - read below about redirect URI
+var pca = PublicClientApplicationBuilder.Create("client_id")
+ .WithBroker()
+ .Build();
+
+// Add a token cache, see https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-token-cache-serialization?tabs=desktop
+
+// 2. GetAccounts
+var accounts = await pca.GetAccountsAsync();
+var accountToLogin = // choose an account, or null, or use PublicClientApplication.OperatingSystemAccount for the default OS account
+
+try
+{
+ // 3. AcquireTokenSilent
+ var authResult = await pca.AcquireTokenSilent(new[] { "User.Read" }, accountToLogin)
+ .ExecuteAsync();
+}
+catch (MsalUiRequiredException) // no change in the pattern
+{
+ // 4. Specific: Switch to the UI thread for the next call. Not required for console apps.
+ await SwitchToUiThreadAsync(); // not actual code, this is different on each platform / tech
+
+ // 5. AcquireTokenInteractive
+ var authResult = await pca.AcquireTokenInteractive(new[] { "User.Read" })
+ .WithAccount(accountToLogin) // this already exists in MSAL, but it is more important for WAM
+ .WithParentActivityOrWindow(myWindowHandle) // to be able to parent WAM's windows to your app (optional, but highly recommended; not needed on UWP)
+ .ExecuteAsync();
+}
+```
+
+The MSAL code shown above uses WAM (Web Authentication Manager), which is the recommended approach. If you want to use interactive authentication without WAM, see [Interactive Authentication](scenario-desktop-acquire-token-interactive.md).
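If you don't use the broker, a minimal sketch of the same pattern with the system browser might look like the following. The client ID, loopback redirect URI, and scope are assumptions; match them to your own app registration.

```csharp
// Sketch without WAM: "client_id", the redirect URI, and "User.Read" are placeholders.
var pca = PublicClientApplicationBuilder.Create("client_id")
    .WithRedirectUri("http://localhost") // system-browser flows on desktop use a loopback redirect
    .Build();

var accounts = await pca.GetAccountsAsync();
AuthenticationResult authResult;
try
{
    // Check the token cache first, as in the example above.
    authResult = await pca.AcquireTokenSilent(new[] { "User.Read" }, accounts.FirstOrDefault())
        .ExecuteAsync();
}
catch (MsalUiRequiredException)
{
    // Fall back to the browser-based interactive experience.
    authResult = await pca.AcquireTokenInteractive(new[] { "User.Read" })
        .ExecuteAsync();
}
```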
+
+## [Integrated Windows Authentication](#tab/iwa)
+
+Integrated Windows authentication is where your public client application signs in by using the same identity the user used to sign in to the Windows domain (for domain-joined or Azure AD-joined machines).
+
+#### Find out if your code uses Integrated Windows Authentication
+
+The ADAL code for your app uses Integrated Windows Authentication if it contains a call to `AcquireTokenAsync`, available as an extension method of the `AuthenticationContextIntegratedAuthExtensions` class, with the following parameters:
+
+- A `resource` which represents the resource you are asking the token for
+- A `clientId` which is a GUID representing your application registration
+- A `UserCredential` object that represents the user you are trying to request the token for.
+
+#### Update the code for integrated windows auth scenarios
+
+ [!INCLUDE [Common steps](includes/msal-net-adoption-steps-public-clients.md)]
+
+In this case, we replace the call to `AuthenticationContext.AcquireTokenAsync` with a call to `IPublicClientApplication.AcquireTokenByIntegratedWindowsAuth`.
+
+Here's a comparison of ADAL.NET and MSAL.NET code for integrated windows auth scenarios:
+
+ADAL:
+
+```csharp
+var ac = new AuthenticationContext("https://login.microsoftonline.com/<tenantId>");
+AuthenticationResult result;
+result = await context.AcquireTokenAsync(resource, clientId,
+ new UserCredential("john@contoso.com"));
+```
+MSAL:
+
+```csharp
+ string authority = "https://login.microsoftonline.com/contoso.com";
+ string[] scopes = new string[] { "user.read" };
+ IPublicClientApplication app = PublicClientApplicationBuilder
+ .Create(clientId)
+ .WithAuthority(authority)
+ .Build();
+
+ var accounts = await app.GetAccountsAsync();
+
+ AuthenticationResult result = null;
+ if (accounts.Any())
+ {
+ result = await app.AcquireTokenSilent(scopes, accounts.FirstOrDefault())
+ .ExecuteAsync();
+ }
+ else
+ {
+ try
+ {
+ result = await app.AcquireTokenByIntegratedWindowsAuth(scopes)
+ .ExecuteAsync(CancellationToken.None);
+ }
+ catch (MsalUiRequiredException ex)
+ {
+ // MsalUiRequiredException: AADSTS65001: The user or administrator has not consented to use the application
+ // with ID '{appId}' named '{appName}'.Send an interactive authorization request for this user and resource.
+
+ // you need to get user consent first. This can be done, if you are not using .NET Core (which does not have any Web UI)
+ // by doing (once only) an AcquireToken interactive.
+
+ // If you are using .NET core or don't want to do an AcquireTokenInteractive, you might want to suggest the user to navigate
+ // to a URL to consent: https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id={clientId}&response_type=code&scope=user.read
+
+ // AADSTS50079: The user is required to use multi-factor authentication.
+ // There is no mitigation - if MFA is configured for your tenant and AAD decides to enforce it,
+ // you need to fallback to an interactive flows such as AcquireTokenInteractive or AcquireTokenByDeviceCode
+ }
+ catch (MsalServiceException ex)
+ {
+ // Kind of errors you could have (in ex.Message)
+
+ // MsalServiceException: AADSTS90010: The grant type is not supported over the /common or /consumers endpoints. Please use the /organizations or tenant-specific endpoint.
+ // you used common.
+ // Mitigation: as explained in the message from Azure AD, the authority needs to be tenanted or otherwise organizations
+
+ // MsalServiceException: AADSTS70002: The request body must contain the following parameter: 'client_secret or client_assertion'.
+ // Explanation: this can happen if your application was not registered as a public client application in Azure AD
+ // Mitigation: in the Azure portal, edit the manifest for your application and set the `allowPublicClient` to `true`
+ }
+ catch (MsalClientException ex)
+ {
+ // Error Code: unknown_user Message: Could not identify logged in user
+ // Explanation: the library was unable to query the current Windows logged-in user or this user is not AD or AAD
+ // joined (work-place joined users are not supported).
+
+ // Mitigation 1: on UWP, check that the application has the following capabilities: Enterprise Authentication,
+ // Private Networks (Client and Server), User Account Information
+
+ // Mitigation 2: Implement your own logic to fetch the username (e.g. john@contoso.com) and use the
+ // AcquireTokenByIntegratedWindowsAuth form that takes in the username
+
+ // Error Code: integrated_windows_auth_not_supported_managed_user
+ // Explanation: This method relies on a protocol exposed by Active Directory (AD). If a user was created in Azure
+ // Active Directory without AD backing ("managed" user), this method will fail. Users created in AD and backed by
+ // AAD ("federated" users) can benefit from this non-interactive method of authentication.
+ // Mitigation: Use interactive authentication
+ }
+ }
+
+ Console.WriteLine(result.Account.Username);
+```
+
+## [Username Password](#tab/up)
+
+Username Password authentication is where the user signs in by providing a username/password credential.
+
+#### Find out if your code uses Username Password authentication
+
+The ADAL code for your app uses username/password authentication if it contains a call to `AcquireTokenAsync`, available as an extension method of the `AuthenticationContextIntegratedAuthExtensions` class, with the following parameters:
+
+- A `resource` which represents the resource you are asking the token for
+- A `clientId` which is a GUID representing your application registration
+- A `UserPasswordCredential` object that contains the username and password for the user you are trying to request the token for.
+
+#### Update the code for username password auth scenarios
+
+In this case, we replace the call to `AuthenticationContext.AcquireTokenAsync` with a call to `IPublicClientApplication.AcquireTokenByUsernamePassword`.
+
+Here's a comparison of ADAL.NET and MSAL.NET code for username password scenarios:
+
+ [!INCLUDE [Common steps](includes/msal-net-adoption-steps-public-clients.md)]
+
+ADAL:
+
+```csharp
+var ac = new AuthenticationContext("https://login.microsoftonline.com/<tenantId>");
+AuthenticationResult result;
+result = await context.AcquireTokenAsync(
+ resource, clientId,
+ new UserPasswordCredential("john@contoso.com", johnsPassword));
+
+```
+MSAL:
+
+```csharp
+ string authority = "https://login.microsoftonline.com/contoso.com";
+ string[] scopes = new string[] { "user.read" };
+ IPublicClientApplication app;
+ app = PublicClientApplicationBuilder.Create(clientId)
+ .WithAuthority(authority)
+ .Build();
+ var accounts = await app.GetAccountsAsync();
+
+ AuthenticationResult result = null;
+ if (accounts.Any())
+ {
+ result = await app.AcquireTokenSilent(scopes, accounts.FirstOrDefault())
+ .ExecuteAsync();
+ }
+ else
+ {
+ try
+ {
+ var securePassword = new SecureString();
+ foreach (char c in "dummy") // you should fetch the password
+ securePassword.AppendChar(c); // keystroke by keystroke
+
+ result = await app.AcquireTokenByUsernamePassword(scopes,
+ "joe@contoso.com",
+ securePassword)
+ .ExecuteAsync();
+ }
+ catch(MsalException)
+ {
+ // See details below
+ }
+ }
+ Console.WriteLine(result.Account.Username);
+```
+
+## [Device Code](#tab/devicecode)
+
+Device code flow authentication is where a device with a limited UX shows you a device code so you can complete the authentication flow on another device.
+
+#### Find out if your code uses Device code flow authentication
+
+The ADAL code for your app uses device code flow scenarios if it contains a call to `AuthenticationContext.AcquireTokenByDeviceCodeAsync` with the following parameters:
+- A `DeviceCodeResult` object instance, which is instantiated with the `resourceID` of the resource you're requesting a token for, and a `clientId`, which is the GUID that represents your application.
+
+#### Update the code for device code flow scenarios
+
+ [!INCLUDE [Common steps](includes/msal-net-adoption-steps-public-clients.md)]
+
+In this case, we replace the call to `AuthenticationContext.AcquireTokenByDeviceCodeAsync` with a call to `IPublicClientApplication.AcquireTokenWithDeviceCode`.
+
+Here's a comparison of ADAL.NET and MSAL.NET code for device code flow scenarios:
+
+ADAL:
+
+```csharp
+static async Task<AuthenticationResult> GetTokenViaCode(AuthenticationContext ctx)
+{
+ AuthenticationResult result = null;
+ try
+ {
+ result = await ctx.AcquireTokenSilentAsync(resource, clientId);
+ }
+ catch (AdalException adalException)
+ {
+ if (adalException.ErrorCode == AdalError.FailedToAcquireTokenSilently
+ || adalException.ErrorCode == AdalError.InteractionRequired)
+ {
+ try
+ {
+ DeviceCodeResult codeResult = await ctx.AcquireDeviceCodeAsync(resource, clientId);
+ Console.WriteLine("You need to sign in.");
+ Console.WriteLine("Message: " + codeResult.Message + "\n");
+ result = await ctx.AcquireTokenByDeviceCodeAsync(codeResult);
+ }
+ catch (Exception exc)
+ {
+ Console.WriteLine("Something went wrong.");
+ Console.WriteLine("Message: " + exc.Message + "\n");
+ }
+ }
+ }
+ return result;
+}
+
+```
+MSAL:
+
+```csharp
+private const string ClientId = "<client_guid>";
+private const string Authority = "https://login.microsoftonline.com/contoso.com";
+private static readonly string[] scopes = new string[] { "user.read" };
+
+static async Task<AuthenticationResult> GetATokenForGraph()
+{
+ IPublicClientApplication pca = PublicClientApplicationBuilder
+ .Create(ClientId)
+ .WithAuthority(Authority)
+ .WithDefaultRedirectUri()
+ .Build();
+
+ var accounts = await pca.GetAccountsAsync();
+
+ // All AcquireToken* methods store the tokens in the cache, so check the cache first
+ try
+ {
+ return await pca.AcquireTokenSilent(scopes, accounts.FirstOrDefault())
+ .ExecuteAsync();
+ }
+ catch (MsalUiRequiredException ex)
+ {
+ // No token found in the cache or AAD insists that a form interactive auth is required (e.g. the tenant admin turned on MFA)
+ // If you want to provide a more complex user experience, check out ex.Classification
+
+ return await AcquireByDeviceCodeAsync(pca);
+ }
+}
+
+private static async Task<AuthenticationResult> AcquireByDeviceCodeAsync(IPublicClientApplication pca)
+{
+ try
+ {
+ var result = await pca.AcquireTokenWithDeviceCode(scopes,
+ deviceCodeResult =>
+ {
+ // This will print the message on the console which tells the user where to go sign-in using
+ // a separate browser and the code to enter once they sign in.
+ // The AcquireTokenWithDeviceCode() method will poll the server after firing this
+ // device code callback to look for the successful login of the user via that browser.
+ // This background polling (whose interval and timeout data is also provided as fields in the
+ // deviceCodeCallback class) will occur until:
+ // * The user has successfully logged in via browser and entered the proper code
+ // * The timeout specified by the server for the lifetime of this code (typically ~15 minutes) has been reached
+ // * The developing application calls the Cancel() method on a CancellationToken sent into the method.
+ // If this occurs, an OperationCanceledException will be thrown (see catch below for more details).
+ Console.WriteLine(deviceCodeResult.Message);
+ return Task.FromResult(0);
+ }).ExecuteAsync();
+
+ Console.WriteLine(result.Account.Username);
+ return result;
+ }
+
+ // TODO: handle or throw all these exceptions depending on your app
+ catch (MsalServiceException ex)
+ {
+ // Kind of errors you could have (in ex.Message)
+
+ // AADSTS50059: No tenant-identifying information found in either the request or implied by any provided credentials.
+ // Mitigation: as explained in the message from Azure AD, the authority needs to be tenanted. You have probably created
+ // your public client application with the following authorities:
+ // https://login.microsoftonline.com/common or https://login.microsoftonline.com/organizations
+
+ // AADSTS90133: Device Code flow is not supported under /common or /consumers endpoint.
+ // Mitigation: as explained in the message from Azure AD, the authority needs to be tenanted
+
+ // AADSTS90002: Tenant <tenantId or domain you used in the authority> not found. This may happen if there are
+ // no active subscriptions for the tenant. Check with your subscription administrator.
+ // Mitigation: if you have an active subscription for the tenant this might be that you have a typo in the
+ // tenantId (GUID) or tenant domain name.
+ }
+ catch (OperationCanceledException ex)
+ {
+ // If you use a CancellationToken, and call the Cancel() method on it, then this *may* be triggered
+ // to indicate that the operation was cancelled.
+ // See https://docs.microsoft.com/dotnet/standard/threading/cancellation-in-managed-threads
+ // for more detailed information on how C# supports cancellation in managed threads.
+ }
+ catch (MsalClientException ex)
+ {
+ // Possible cause - verification code expired before contacting the server
+ // This exception will occur if the user does not manage to sign-in before a time out (15 mins) and the
+ // call to `AcquireTokenWithDeviceCode` is not cancelled in between
+ }
+
+ // The catch blocks above swallow the exceptions; return null so all code paths return a value.
+ return null;
+}
+```
++
+### MSAL benefits
+
+Key benefits of MSAL.NET for your app include:
+
+- **Resilience**. MSAL.NET helps make your app resilient through the following:
+
+ - Azure AD Cached Credential Service (CCS) benefits. CCS operates as an Azure AD backup.
+ - Proactive renewal of tokens if the API that you call enables long-lived tokens through [continuous access evaluation](app-resilience-continuous-access-evaluation.md).
+
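As a brief, hedged illustration of the second point, an app opts in to continuous access evaluation by declaring the `cp1` client capability when it builds the application object. The client ID and authority below are placeholders.

```csharp
// Sketch: declaring the "cp1" capability signals that the app can handle the claims
// challenges that continuous access evaluation (CAE) uses with long-lived tokens.
var app = PublicClientApplicationBuilder.Create("<clientId>")
    .WithAuthority("https://login.microsoftonline.com/contoso.com")
    .WithClientCapabilities(new[] { "cp1" })
    .Build();
```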
+### Troubleshooting
+
+The following troubleshooting information makes two assumptions:
+
+- Your ADAL.NET code was working.
+- You migrated to MSAL by keeping the same client ID.
+
+If you get an exception with either of the following messages:
+
+> `AADSTS90002: Tenant 'cf61953b-e41a-46b3-b500-663d279ea744' not found. This may happen if there are no active`
+> `subscriptions for the tenant. Check to make sure you have the correct tenant ID. Check with your subscription`
+> `administrator.`
+
+You can troubleshoot the exception by using these steps:
+
+1. Confirm that you're using the latest version of MSAL.NET.
+1. Confirm that the authority host that you set when building the public client application and the authority host that you used with ADAL are similar. In particular, is it the same [cloud](msal-national-cloud.md) (Azure Government, Azure China 21Vianet, or Azure Germany)?
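As a sketch of the second check, you can make the authority host explicit when you build the application so it matches the cloud your ADAL code targeted. The enum value and tenant below are assumptions; adjust them for your environment.

```csharp
// Sketch: pin the authority host explicitly. Use AzureUsGovernment, AzureChina, or
// AzureGermany instead of AzurePublic if your ADAL code targeted a national cloud.
var app = PublicClientApplicationBuilder.Create("<clientId>")
    .WithAuthority(AzureCloudInstance.AzurePublic, "<tenantId>")
    .Build();
```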
+
+## Next steps
+
+Learn more about the [differences between ADAL.NET and MSAL.NET apps](msal-net-differences-adal-net.md).
+Learn more about [token cache serialization in MSAL.NET](msal-net-token-cache-serialization.md).
active-directory Msal Net Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-migration.md
For details about the decision tree below, read [MSAL.NET or Microsoft.Identity.
## Next steps - Learn how to [migrate confidential client applications built on top of ASP.NET MVC or .NET classic from ADAL.NET to MSAL.NET](msal-net-migration-confidential-client.md).
+- Learn how to [migrate public client applications built on top of .NET or .NET classic from ADAL.NET to MSAL.NET](msal-net-migration-public-client.md).
- Learn more about the [Differences between ADAL.NET and MSAL.NET apps](msal-net-differences-adal-net.md). - Learn how to migrate confidential client applications built on top of ASP.NET Core from ADAL.NET to Microsoft.Identity.Web: - [Web apps](https://github.com/AzureAD/microsoft-identity-web/wiki/web-apps#migrating-from-previous-versions--adding-authentication)
active-directory Msal Node Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-node-migration.md
var authorityURI = "https://login.microsoftonline.com/common";
var context = new AuthenticationContext(authorityURI, true, cache); ```
-MSAL Node uses an in-memory token cache by default. You do not need to explicitly import it; it is exposed as part of the `ConfidentialClientApplication` and `PublicClientApplication` classes.
+MSAL Node uses an in-memory token cache by default. You do not need to explicitly import it; the in-memory token cache is exposed as part of the `ConfidentialClientApplication` and `PublicClientApplication` classes.
```javascript const msalTokenCache = publicClientApplication.getTokenCache();
active-directory Sample V2 Code https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/sample-v2-code.md
The following samples show public client desktop applications that access the Mi
> | Node.js | [Sign in users](https://github.com/Azure-Samples/ms-identity-javascript-nodejs-desktop) | MSAL Node | Authorization code with PKCE | > | Powershell | [Call Microsoft Graph by signing in users using username/password](https://github.com/azure-samples/active-directory-dotnetcore-console-up-v2) | MSAL.NET | Resource owner password credentials | > | Python | [Sign in users](https://github.com/Azure-Samples/ms-identity-python-desktop) | MSAL Python | Authorization code with PKCE |
-> | Universal Window Platform (UWP) | [Call Microsoft Graph](https://github.com/azure-samples/active-directory-dotnet-native-uwp-wam) | Web account manager API | Integrated windows authentication |
+> | Universal Window Platform (UWP) | [Call Microsoft Graph](https://github.com/Azure-Samples/active-directory-xamarin-native-v2/tree/main/2-With-broker) | MSAL.NET | Web account manager |
> | Windows Presentation Foundation (WPF) | [Sign in users and call Microsoft Graph](https://github.com/Azure-Samples/active-directory-dotnet-native-aspnetcore-v2/tree/master/2.%20Web%20API%20now%20calls%20Microsoft%20Graph) | MSAL.NET | Authorization code with PKCE | > | XAML | &#8226; [Sign in users and call ASP.NET core web API](https://github.com/Azure-Samples/active-directory-dotnet-native-aspnetcore-v2/tree/master/1.%20Desktop%20app%20calls%20Web%20API) <br/> &#8226; [Sign in users and call Microsoft Graph](https://github.com/azure-samples/active-directory-dotnet-desktop-msgraph-v2) | MSAL.NET | Authorization code with PKCE |
active-directory Scenario Desktop Acquire Token Device Code Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-desktop-acquire-token-device-code-flow.md
Interactive authentication with Azure AD requires a web browser. For more inform
This method takes as parameters: - The `scopes` to request an access token for.-- A callback that receives the [`DeviceCodeResult`](https://docs.microsoft.com/dotnet/api/microsoft.identity.client.devicecoderesult).
+- A callback that receives the [`DeviceCodeResult`](/dotnet/api/microsoft.identity.client.devicecoderesult).
The following sample code presents the synopsis of most current cases, with explanations of the kind of exceptions you can get and their mitigation. For a fully functional code sample, see [active-directory-dotnetcore-devicecodeflow-v2](https://github.com/azure-samples/active-directory-dotnetcore-devicecodeflow-v2) on GitHub.
active-directory Scenario Desktop Acquire Token Interactive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-desktop-acquire-token-interactive.md
Remarks:
#### WithPrompt
-`WithPrompt()` is used to control the interactivity with the user by specifying a prompt.
+`WithPrompt()` is used to control the interactivity with the user by specifying a prompt. The exact behavior can be controlled by using the [Microsoft.Identity.Client.Prompt](/dotnet/api/microsoft.identity.client.prompt) structure.
-![Image showing the fields in the Prompt structure. These constant values control interactivity with the user by defining the type of prompt displayed by the WithPrompt() method.](https://user-images.githubusercontent.com/34331512/112267137-3f1c3a00-8c32-11eb-97fb-33604311329a.png)
+The struct defines the following constants:
-The class defines the following constants:
+- `SelectAccount` forces the STS to present the account selection dialog box that contains accounts for which the user has a session. This option is useful when application developers want to let users choose among different identities. This option drives MSAL to send `prompt=select_account` to the identity provider. This option is the default. It does a good job of providing the best possible experience based on the available information, such as account and presence of a session for the user. Don't change it unless you have good reason to do it.
+- `Consent` enables the application developer to force the user to be prompted for consent, even if consent was granted before. In this case, MSAL sends `prompt=consent` to the identity provider. This option can be used in some security-focused applications where the organization governance demands that the user is presented with the consent dialog box each time the application is used.
+- `ForceLogin` enables the application developer to have the user prompted for credentials by the service, even if this user prompt might not be needed. This option can be useful to let the user sign in again if acquiring a token fails. In this case, MSAL sends `prompt=login` to the identity provider. Sometimes it's used in security-focused applications where the organization governance demands that the user re-signs in each time they access specific parts of an application.
+- `Create` triggers a sign-up experience, which is used for External Identities, by sending `prompt=create` to the identity provider. This prompt should not be sent for Azure AD B2C apps. For more information, see [Add a self-service sign-up user flow to an app](../external-identities/self-service-sign-up-user-flow.md).
+- `Never` (for .NET 4.5 and WinRT only) won't prompt the user, but instead tries to use the cookie stored in the hidden embedded web view. For more information, see web views in MSAL.NET. Using this option might fail. In that case, `AcquireTokenInteractive` throws an exception to notify that a UI interaction is needed. You'll need to use another `Prompt` parameter.
+- `NoPrompt` won't send any prompt to the identity provider which therefore will decide to present the best sign-in experience to the user (single-sign-on, or select account). This option is also mandatory for Azure Active Directory (Azure AD) B2C edit profile policies. For more information, see [Azure AD B2C specifics](https://aka.ms/msal-net-b2c-specificities).
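
For example (the scope, the `app` variable, and the surrounding call are assumptions), the prompt value is passed to `AcquireTokenInteractive` like this:

```csharp
// Sketch: force the account picker on each sign-in. "User.Read" is an example scope.
var result = await app.AcquireTokenInteractive(new[] { "User.Read" })
    .WithPrompt(Prompt.SelectAccount)
    .ExecuteAsync();
```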
#### WithUseEmbeddedWebView
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 8/26/2021 Last updated : 9/9/2021
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID >[!NOTE]
->This information last updated on August 26th, 2021.
+>This information last updated on September 9th, 2021.
| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) | | | | | | |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| EXCHANGE ONLINE POP | EXCHANGETELCO | cb0a98a8-11bc-494c-83d9-c1b1ac65327e | EXCHANGE_B_STANDARD (90927877-dcff-4af6-b346-2332c0b15bb7) | EXCHANGE ONLINE POP (90927877-dcff-4af6-b346-2332c0b15bb7) | | INTUNE | INTUNE_A | 061f9ace-7d42-4136-88ac-31dc755f143f | INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | | Microsoft Dynamics AX7 User Trial | AX7_USER_TRIAL | fcecd1f9-a91e-488d-a918-a96cdb6ce2b0 | ERP_TRIAL_INSTANCE (e2f705fd-2468-4090-8c58-fad6e6b1e724)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318) | Dynamics 365 Operations Trial Environment (e2f705fd-2468-4090-8c58-fad6e6b1e724)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318) |
+| Microsoft Azure Multi-Factor Authentication | MFA_STANDALONE | cb2020b1-d8f6-41c0-9acd-8ff3d6d7831b | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0) |
+| Microsoft Defender for Office 365 (Plan 2) | THREAT_INTELLIGENCE | 3dd6cf57-d688-4eed-ba52-9e40b5468c3e | MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70) | Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70) |
| Microsoft 365 A1 | M365EDU_A1 | b17653a4-2443-4e8c-a550-18249dda78bb | AAD_EDU (3a3976ce-de18-4a87-a78e-5e9245e252df)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>WINDOWS_STORE (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) | Azure Active Directory for Education (3a3976ce-de18-4a87-a78e-5e9245e252df)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Windows Store Service (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) | | Microsoft 365 A3 for Faculty | M365EDU_A3_FACULTY | 4b590615-0888-425a-a965-b3bf7789848d | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/> Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 
(4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics ΓÇô Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | MICROSOFT 365 A3 FOR STUDENTS | M365EDU_A3_STUDENT | 7cfd9a2b-e110-4c39-bf20-c6a3f36a3121 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS 
(199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PowerApps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) 
(c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| MICROSOFT 365 BUSINESS STANDARD | O365_BUSINESS_PREMIUM | f245ecc8-75af-4f8e-b61f-27d8114de5f3 | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)| To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | | MICROSOFT 365 BUSINESS STANDARD - PREPAID LEGACY | SMB_BUSINESS_PREMIUM | ac5cef5d-921b-4f97-9ef3-c99076e5470f | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) | To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER 
MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) | | MICROSOFT 365 BUSINESS PREMIUM | SPB | cbdc14ab-d96c-4c30-b9f4-6ada7cdc1d46 | AAD_SMB (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_SMBIZ (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E1 (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINBIZ (8e229017-d77b-43d5-9305-903395523b99)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | AZURE ACTIVE DIRECTORY (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MICROSOFT INTUNE (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFT BOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E1 SKU (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINDOWS 10 BUSINESS (8e229017-d77b-43d5-9305-903395523b99)<br/>YAMMER_ENTERPRISE 
(7547a3fe-08ee-4ccb-b430-5077c5041653) |
+| Microsoft 365 Business Voice (US) | BUSINESS_VOICE_MED2_TELCO | 08d7bce8-6e16-490e-89db-1d508e5e9609 | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOPSTN1 (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Domestic Calling Plan (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) |
+| Microsoft 365 Business Voice (without Calling Plan) for US | BUSINESS_VOICE_DIRECTROUTING_MED | 8330dae3-d349-44f7-9cad-1b23c64baabe | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) |
| MICROSOFT 365 DOMESTIC CALLING PLAN (120 Minutes) | MCOPSTN_5 | 11dee6af-eca8-419f-8061-6864517c1875 | MCOPSTN5 (54a152dc-90de-4996-93d2-bc47e670fc06) | MICROSOFT 365 DOMESTIC CALLING PLAN (120 min) (54a152dc-90de-4996-93d2-bc47e670fc06) | | Microsoft 365 Domestic Calling Plan for GCC | MCOPSTN_1_GOV | 923f58ab-fca1-46a1-92f9-89fda21238a8 | MCOPSTN1_GOV (3c8a8792-7866-409b-bb61-1b20ace0368b)<br/>EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8) | Domestic Calling for Government (3c8a8792-7866-409b-bb61-1b20ace0368b)<br/>Exchange Foundation for Government (922ba911-5694-4e99-a794-73aed9bfeec8) | | MICROSOFT 365 E3 | SPE_E3 | 05e9a617-0261-4cee-bb44-138d3ef5d965 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | AZURE ACTIVE DIRECTORY PREMIUM P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>CLOUD APP SECURITY DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>TO-DO (PLAN 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW FOR OFFICE 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MICROSOFT FORMS (PLAN E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFT AZURE MULTI-FACTOR AUTHENTICATION (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS FOR OFFICE 365(c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINDOWS 10 ENTERPRISE (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>YAMMER ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| WINDOWS 10 ENTERPRISE E3 | WIN10_VDA_E3 | 6a0f6da5-0b87-4190-a6ae-9bb5a2b9546a | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365) | EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>UNIVERSAL PRINT (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WINDOWS 10 ENTERPRISE (NEW) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWS UPDATE FOR BUSINESS DEPLOYMENT SERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365) | | Windows 10 Enterprise E5 | WIN10_VDA_E5 | 488ba24a-39a9-4473-8ee5-19291e71b002 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Defender For Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365) | | Windows 10 Enterprise E5 Commercial (GCC Compatible) | WINE5_GCC_COMPAT | 938fd547-d794-42a4-996c-1cc206619580 | EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef))<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118) | Exchange Foundation for Government (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>Microsoft Defender For Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118) |
+| Windows 365 Business 2 vCPU, 4 GB, 64 GB | CPC_B_2C_4RAM_64GB | 42e6818f-8966-444b-b7ac-0027c83fa8b5 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>CPC_B_2C_4RAM_64GB (a790cd6e-a153-4461-83c7-e127037830b6) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Windows 365 Business 2 vCPU, 4 GB, 64 GB (a790cd6e-a153-4461-83c7-e127037830b6) |
+| Windows 365 Enterprise 2 vCPU, 4 GB, 64 GB | CPC_E_2C_4GB_64GB | 7bb14422-3b90-4389-a7be-f1b745fc037f | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>CPC_E_2C_4GB_64GB (23a25099-1b2f-4e07-84bd-b84606109438) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Windows 365 Enterprise 2 vCPU, 4 GB, 64 GB (23a25099-1b2f-4e07-84bd-b84606109438) |
| WINDOWS STORE FOR BUSINESS | WINDOWS_STORE | 6470687e-a428-4b7a-bef2-8a291ad947c9 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>WINDOWS_STORE (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) | EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>WINDOWS STORE SERVICE (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) | | Microsoft Workplace Analytics | WORKPLACE_ANALYTICS | 3d957427-ecdc-4df2-aacd-01cc9d519da8 | WORKPLACE_ANALYTICS (f477b0f0-3bb1-4890-940c-40fcee6ce05f)<br/>WORKPLACE_ANALYTICS_INSIGHTS_BACKEND (ff7b261f-d98b-415b-827c-42a3fdf015af)<br/>WORKPLACE_ANALYTICS_INSIGHTS_USER (b622badb-1b45-48d5-920f-4b27a2c0996c) | Microsoft Workplace Analytics (f477b0f0-3bb1-4890-940c-40fcee6ce05f)<br/>Microsoft Workplace Analytics Insights Backend (ff7b261f-d98b-415b-827c-42a3fdf015af)<br/>Microsoft Workplace Analytics Insights User (b622badb-1b45-48d5-920f-4b27a2c0996c) |
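To make the SKU and service plan GUIDs above concrete, here is a minimal sketch of assigning one of these licenses through the Microsoft Graph `assignLicense` action. The access token and user identifier are placeholders you must supply; the GUIDs are the skuId for MICROSOFT 365 BUSINESS STANDARD and the Sway service plan ID taken from the table.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token-with-User.ReadWrite.All>"   # placeholder; obtain via MSAL or similar
USER_ID = "<user-object-id-or-upn>"                # placeholder

# skuId and disabledPlans GUIDs come from the table above:
# MICROSOFT 365 BUSINESS STANDARD (O365_BUSINESS_PREMIUM) with the Sway plan turned off.
body = {
    "addLicenses": [
        {
            "skuId": "f245ecc8-75af-4f8e-b61f-27d8114de5f3",
            "disabledPlans": ["a23b959c-7ce8-4e57-9140-b90eb88a9e97"],
        }
    ],
    "removeLicenses": [],
}

resp = requests.post(
    f"{GRAPH}/users/{USER_ID}/assignLicense",
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
    json=body,
)
resp.raise_for_status()
# The response echoes the updated user object, including its assigned licenses.
print(resp.json().get("assignedLicenses", []))
```

The same pattern works for any SKU in the table: swap in the desired skuId and list any service plan IDs you want disabled.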
active-directory Security Operations Applications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-applications.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md)
For more information on consent operations, see the following resources:
* [Managing consent to applications and evaluating consent requests in Azure Active Directory](../manage-apps/manage-consent-requests.md)
-* [Detect and Remediate Illicit Consent Grants - Office 365](/microsoft-365/security/office-365-security/detect-and-remediate-illicit-consent-grants?view=o365-worldwide)
+* [Detect and Remediate Illicit Consent Grants - Office 365](/microsoft-365/security/office-365-security/detect-and-remediate-illicit-consent-grants)
* [Incident response playbook - App consent grant investigation](/security/compass/incident-response-playbook-app-consent)
active-directory Security Operations Devices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-devices.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide.md)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md?tabs=Vault)
active-directory Security Operations Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-infrastructure.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md?tabs=Vault)
Azure AD uses Microsoft SQL Server Data Engine or SQL to store Azure AD Connect
| What to monitor| Where| Notes |
| - | - | - |
-| mms_management_agent| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15) |
-| mms_partition| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15) |
-| mms_run_profile| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15) |
-| mms_server_configuration| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15) |
-| mms_synchronization_rule| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15) |
+| mms_management_agent| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records) |
+| mms_partition| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records) |
+| mms_run_profile| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records) |
+| mms_server_configuration| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records) |
+| mms_synchronization_rule| SQL service audit records| See [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records) |
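As one way to consume those audit records, the following is a minimal sketch that reads a file-based SQL Server audit with `sys.fn_get_audit_file` and surfaces events touching the Azure AD Connect tables listed above. The connection string, audit file path, and the choice of pyodbc are assumptions; adapt them to the instance that hosts your ADSync database.

```python
import pyodbc

# Placeholders: point these at the SQL instance hosting the ADSync database
# and at wherever your SQL Server audit writes its *.sqlaudit files.
CONN_STR = "Driver={ODBC Driver 17 for SQL Server};Server=<sql-server>;Database=master;Trusted_Connection=yes;"
AUDIT_FILES = r"D:\SqlAudit\*.sqlaudit"

# Azure AD Connect configuration tables called out in the table above.
WATCHED = {"mms_management_agent", "mms_partition", "mms_run_profile",
           "mms_server_configuration", "mms_synchronization_rule"}

# Path inlined for brevity in this sketch; parameterize it in production code.
query = f"""
SELECT event_time, server_principal_name, action_id, object_name, statement
FROM sys.fn_get_audit_file(N'{AUDIT_FILES}', DEFAULT, DEFAULT)
ORDER BY event_time DESC
"""

with pyodbc.connect(CONN_STR) as conn:
    for row in conn.cursor().execute(query):
        # Only report audit events against the watched ADSync tables.
        if row.object_name and row.object_name.lower() in WATCHED:
            print(row.event_time, row.server_principal_name, row.action_id, row.object_name)
```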
For information on what and how to monitor configuration information refer to:
-* For SQL server, see [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15).
+* For SQL server, see [SQL Server Audit Records](/sql/relational-databases/security/auditing/sql-server-audit-records).
-* For Azure Sentinel, see [Connect to Windows servers to collect security events](/sql/relational-databases/security/auditing/sql-server-audit-records?view=sql-server-ver15).
+* For Azure Sentinel, see [Connect to Windows servers to collect security events](/sql/relational-databases/security/auditing/sql-server-audit-records).
* For information on configuring and using Azure AD Connect, see [What is Azure AD Connect?](../hybrid/whatis-azure-ad-connect.md)
For information on what and how to monitor configuration information refer to:
-* For more information on logging PowerShell script operations, refer to [Enabling Script Block Logging](/powershell/module/microsoft.powershell.core/about/about_logging_windows?view=powershell-7.1), which is part of the PowerShell reference documentation.
+* For more information on logging PowerShell script operations, refer to [Enabling Script Block Logging](/powershell/module/microsoft.powershell.core/about/about_logging_windows), which is part of the PowerShell reference documentation.
* For more information on configuring PowerShell logging for analysis by Splunk, refer to [Get Data into Splunk User Behavior Analytics](https://docs.splunk.com/Documentation/UBA/5.0.4.1/GetDataIn/AddPowerShell).
See these additional security operations guide articles:
active-directory Security Operations Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-introduction.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md?tabs=Vault)
active-directory Security Operations Privileged Accounts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-privileged-accounts.md
The log files you use for investigation and monitoring are:
* [Azure AD Audit logs](../reports-monitoring/concept-audit-logs.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault insights](../../azure-monitor/insights/key-vault-insights-overview.md)
You can monitor privileged account sign-in events in the Azure AD Sign-in logs.
| Sign-in failure, bad password threshold | High | Azure AD Sign-ins log | Status = Failure<br>-and-<br>error code = 50126 | Define a baseline threshold, and then monitor and adjust to suit your organizational behaviors and limit false alerts from being generated. |
| Failure due to CA requirement | High | Azure AD Sign-ins log | Status = Failure<br>-and-<br>error code = 53003<br>-and-<br>Failure reason = blocked by CA | This can be an indication an attacker is trying to get into the account. |
| Privileged accounts that don't follow naming policy.| | Azure Subscription | [List Azure role assignments using the Azure portal - Azure RBAC](../../role-based-access-control/role-assignments-list-portal.md)| List role assignments for subscriptions and alert where sign-in name doesn't match your organization's format. For example, ADM_ as a prefix. |
-| Interrupt | High/Medium | Azure AD Sign-ins | Status = Interrupted<br>-and-<br>error code = 50074<br>-and-<br>Failure reason = Strong Auth required<br>Status = Interrupted<br>-and-<br>Error code = 500121<br>Failure Reason = Authentication failed during strong authentication request | This can be an indication an attacker has the password for the account but can't pass the MFA challenge. | | |
+| Interrupt | High/Medium | Azure AD Sign-ins | Status = Interrupted<br>-and-<br>error code = 50074<br>-and-<br>Failure reason = Strong Auth required<br>Status = Interrupted<br>-and-<br>Error code = 500121<br>Failure Reason = Authentication failed during strong authentication request | This can be an indication an attacker has the password for the account but can't pass the MFA challenge. |
| Privileged accounts that don't follow naming policy.| High | Azure AD directory | [List Azure AD role assignments](../roles/view-assignments.md)| List roles assignments for Azure AD roles alert where UPN doesn't match your organizations format. For example, ADM_ as a prefix. |
-| Discover privileged accounts not registered for MFA. | High | Azure AD Graph API| Query for IsMFARegistered eq false for administrator accounts. [List credentialUserRegistrationDetails - Microsoft Graph beta](/graph/api/reportroot-list-credentialuserregistrationdetails?view=graph-rest-beta&tabs=http) | Audit and investigate to determine if intentional or an oversight. |
+| Discover privileged accounts not registered for MFA. | High | Azure AD Graph API| Query for IsMFARegistered eq false for administrator accounts. [List credentialUserRegistrationDetails - Microsoft Graph beta](/graph/api/reportroot-list-credentialuserregistrationdetails?view=graph-rest-beta&preserve-view=true&tabs=http) | Audit and investigate to determine if intentional or an oversight. |
| Account lockout | High | Azure AD Sign-ins log | Status = Failure<br>-and-<br>error code = 50053 | Define a baseline threshold, and then monitor and adjust to suit your organizational behaviors and limit false alerts from being generated. |
| Account disabled/blocked for sign-ins | Low | Azure AD Sign-ins log | Status = Failure<br>-and-<br>Target = user UPN<br>-and-<br>error code = 50057 | This could indicate someone is trying to gain access to an account once they have left an organization. Although the account is blocked, it's still important to log and alert on this activity. |
| MFA fraud alert/block | High | Azure AD Sign-ins log/Azure Log Analytics | Succeeded = false<br>-and-<br>Result detail = MFA denied<br>-and-<br>Target = user | Privileged user has indicated they haven't instigated the MFA prompt, which could indicate an attacker has the password for the account. |
-| Privileged account sign-ins outside of expected controls. | | Azure AD Sign-ins log | Status = failure<br>UserPricipalName = <Admin account><br>Location = <unapproved location><br>IP Address = <unapproved IP><br>Device Info= <unapproved Browser, Operating System> | Monitor and alert on any entries that you have defined as unapproved. |
+| Privileged account sign-ins outside of expected controls. | | Azure AD Sign-ins log | Status = failure<br>UserPricipalName = \<Admin account\><br>Location = \<unapproved location\><br>IP Address = \<unapproved IP\><br>Device Info= \<unapproved Browser, Operating System\> | Monitor and alert on any entries that you have defined as unapproved. |
| Outside of normal sign in times | High | Azure AD Sign-ins log | Status = success<br>-and-<br>Location =<br>-and-<br>Time = outside of working hours | Monitor and alert if sign-ins occur outside of expected times. It is important to find the normal working pattern for each privileged account and to alert if there are unplanned changes outside of normal working times. Sign-ins outside of normal working hours could indicate compromise or possible insider threats. |
| Identity protection risk | High | Identity Protection logs | Risk state = at risk<br>-and-<br>Risk level = low/medium/high<br>-and-<br>Activity = Unfamiliar sign-in/TOR, etc. | This indicates there is some abnormality detected with the sign in for the account and should be alerted on. |
| Password change | High | Azure AD Audit logs | Activity Actor = admin/self service<br>-and-<br>Target = user<br>-and-<br>Status = success/failure | Alert on any administrator account password changes, especially for Global admins, user admins, subscription admins, and emergency access accounts. Write a query targeted at all privileged accounts. |
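For teams that pull sign-in data programmatically, a minimal sketch along these lines can watch privileged accounts for the error codes called out in the table above. It assumes a Graph access token with AuditLog.Read.All and a placeholder list of admin accounts; the same filters can equally be expressed in Azure Monitor or your SIEM.

```python
import requests
from datetime import datetime, timedelta, timezone

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token-with-AuditLog.Read.All>"   # placeholder
ADMIN_UPNS = {"admin1@contoso.com"}               # placeholder list of privileged accounts

# Sign-in error codes highlighted in the monitoring table above.
SUSPECT_CODES = {50126, 53003, 50074, 500121, 50053, 50057}

since = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
url = f"{GRAPH}/auditLogs/signIns?$filter=createdDateTime ge {since}&$top=100"
headers = {"Authorization": f"Bearer {TOKEN}"}
admins_lower = {u.lower() for u in ADMIN_UPNS}

while url:
    page = requests.get(url, headers=headers).json()
    for s in page.get("value", []):
        code = (s.get("status") or {}).get("errorCode")
        # Flag only privileged accounts hitting one of the suspect error codes.
        if s.get("userPrincipalName", "").lower() in admins_lower and code in SUSPECT_CODES:
            print(s["createdDateTime"], s["userPrincipalName"], code, s.get("ipAddress"))
    url = page.get("@odata.nextLink")
```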
active-directory Security Operations Privileged Identity Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-privileged-identity-management.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md?tabs=Vault)
active-directory Security Operations User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/security-operations-user-accounts.md
The log files you use for investigation and monitoring are:
* [Sign-in logs](../reports-monitoring/concept-all-sign-ins.md)
-* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide)
+* [Microsoft 365 Audit logs](/microsoft-365/compliance/auditing-solutions-overview)
* [Azure Key Vault logs](../../key-vault/general/logging.md?tabs=Vault)
From the Azure portal you can view the Azure AD Audit logs and download as comma
* **[Azure Event Hubs](../../event-hubs/event-hubs-about.md) integrated with a SIEM**- [Azure AD logs can be integrated to other SIEMs](../reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md) such as Splunk, ArcSight, QRadar and Sumo Logic via the Azure Event Hub integration.
-* **[Microsoft Cloud App Security](/cloud-app-security/what-is-cloud-app-security) (MCAS)** ΓÇô enables you to discover and manage apps, govern across apps and resources, and check your cloud appsΓÇÖ compliance.
+* **[Microsoft Cloud App Security](/cloud-app-security/what-is-cloud-app-security) (MCAS)** ΓÇô enables you to discover and manage apps, govern across apps and resources, and check your cloud apps' compliance.
Much of what you will monitor and alert on are the effects of your Conditional Access policies. You can use the [Conditional Access insights and reporting workbook](../conditional-access/howto-conditional-access-insights-reporting.md) to examine the effects of one or more Conditional Access policies on your sign-ins, as well as the results of policies, including device state. This workbook enables you to view an impact summary, and identify the impact over a specific time period. You can also use the workbook to investigate the sign-ins of a specific user.
Organizations tend to have specific formats and attributes that are used for cre
* User account UPN = Firstname.Lastname@contoso.com
-User accounts also frequently have an attribute that identifies a real user. For example, EMPID = XXXNNN. The following are suggestions to help you think about and define what normal is for your organization, as well as thing to consider when defining your baseline for log entries where accounts donΓÇÖt follow your organizationΓÇÖs naming convention:
+User accounts also frequently have an attribute that identifies a real user. For example, EMPID = XXXNNN. The following are suggestions to help you think about and define what normal is for your organization, as well as things to consider when defining your baseline for log entries where accounts don't follow your organization's naming convention:
-* Accounts that donΓÇÖt follow the naming convention. For example, `nnnnnnn@contoso.com` versus `firstname.lastname@contoso.com`.
+* Accounts that don't follow the naming convention. For example, `nnnnnnn@contoso.com` versus `firstname.lastname@contoso.com`.
-* Accounts that donΓÇÖt have the standard attributes populated or are not in the correct format. For example, not having a valid employee ID.
+* Accounts that don't have the standard attributes populated or are not in the correct format. For example, not having a valid employee ID.
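One lightweight way to baseline this is a naming-convention check over exported account data. The pattern below is purely illustrative (a hypothetical `firstname.lastname@contoso.com` UPN format and an `XXXNNN` employee ID); substitute your organization's actual convention before using it to drive alerts.

```python
import re
from typing import Optional

# Illustrative convention only: firstname.lastname@contoso.com plus an EMPID of the form XXXNNN.
UPN_PATTERN = re.compile(r"^[a-z]+\.[a-z]+@contoso\.com$", re.IGNORECASE)
EMPID_PATTERN = re.compile(r"^[A-Z]{3}\d{3}$")

def violates_baseline(upn: str, emp_id: Optional[str]) -> bool:
    """Flag accounts that miss the naming convention or lack a valid employee ID."""
    return not UPN_PATTERN.match(upn) or not (emp_id and EMPID_PATTERN.match(emp_id))

print(violates_baseline("firstname.lastname@contoso.com", "ABC123"))  # False - conforms
print(violates_baseline("nnnnnnn@contoso.com", None))                 # True  - flag for review
```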
| What to monitor| Risk Level| Where| Filter/sub-filter| Notes |
| - | - | - | - | - |
We recommend that user and privileged accounts only be created following your or
## Unusual sign ins
-Seeing failures for user authentication is normal. But seeing patterns or blocks of failures can be an indicator that something is happening with a userΓÇÖs Identity. For example, in the case of Password spray or Brute Force attacks, or when a user account is compromised. It is critical that you monitor and alert when patterns emerge. This helps ensure you can protect the user and your organizationΓÇÖs data.
+Seeing failures for user authentication is normal. But seeing patterns or blocks of failures can be an indicator that something is happening with a user's Identity. For example, in the case of Password spray or Brute Force attacks, or when a user account is compromised. It is critical that you monitor and alert when patterns emerge. This helps ensure you can protect the user and your organization's data.
-Success appears to say all is well. But it can mean that a bad actor has successfully accessed a service. Monitoring successful logins helps you detect user accounts that are gaining access but are not user accounts that should have access. User authentication successes are normal entries in Azure AD Sign-Ins logs. We recommend you monitor and alert to detect when patterns emerge. This helps ensure you can protect user accounts and your organizationΓÇÖs data.
+Success appears to say all is well. But it can mean that a bad actor has successfully accessed a service. Monitoring successful logins helps you detect user accounts that are gaining access but are not user accounts that should have access. User authentication successes are normal entries in Azure AD Sign-Ins logs. We recommend you monitor and alert to detect when patterns emerge. This helps ensure you can protect user accounts and your organization's data.
As you design and operationalize a log monitoring and alerting strategy, consider the tools available to you through the Azure portal. Identity Protection enables you to automate the detection, protection, and remediation of identity-based risks. Identity protection uses intelligence-fed machine learning and heuristic systems to detect risk and assign a risk score for users and sign ins. Customers can configure policies based on a risk level for when to allow or deny access or allow the user to securely self-remediate from a risk. The following Identity Protection risk detections inform risk levels today:

| What to monitor | Risk Level | Where | Filter/sub-filter | Notes |
| - | - | - | - | - |
-| Leaked credentials user risk detection| High| Azure AD Risk Detection logs| UX: Leaked credentials <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Azure AD Threat Intelligence user risk detection| High| Azure AD Risk Detection logs| UX: Azure AD threat intelligence <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Anonymous IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Anonymous IP address <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Atypical travel sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Atypical travel <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Anomalous Token| Varies| Azure AD Risk Detection logs| UX: Anomalous Token <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Malware linked IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Malware linked IP address <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Suspicious browser sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious browser <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Unfamiliar sign-in properties sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Unfamiliar sign-in properties <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Malicious IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Malicious IP address<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Suspicious inbox manipulation rules sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious inbox manipulation rules<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Password Spray sign-in risk detection| High| Azure AD Risk Detection logs| UX: Password spray<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Impossible travel sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Impossible travel<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| New country sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: New country<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Activity from anonymous IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Activity from Anonymous IP address<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Suspicious inbox forwarding sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious inbox forwarding<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
-| Azure AD threat intelligence sign-in risk detection| High| Azure AD Risk Detection logs| UX: Azure AD threat intelligence<br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta.md)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Leaked credentials user risk detection| High| Azure AD Risk Detection logs| UX: Leaked credentials <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Azure AD Threat Intelligence user risk detection| High| Azure AD Risk Detection logs| UX: Azure AD threat intelligence <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Anonymous IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Anonymous IP address <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Atypical travel sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Atypical travel <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Anomalous Token| Varies| Azure AD Risk Detection logs| UX: Anomalous Token <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Malware linked IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Malware linked IP address <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Suspicious browser sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious browser <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Unfamiliar sign-in properties sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Unfamiliar sign-in properties <br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Malicious IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Malicious IP address<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Suspicious inbox manipulation rules sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious inbox manipulation rules<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Password Spray sign-in risk detection| High| Azure AD Risk Detection logs| UX: Password spray<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Impossible travel sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Impossible travel<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| New country sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: New country<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Activity from anonymous IP address sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Activity from Anonymous IP address<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Suspicious inbox forwarding sign-in risk detection| Varies| Azure AD Risk Detection logs| UX: Suspicious inbox forwarding<br><br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
+| Azure AD threat intelligence sign-in risk detection| High| Azure AD Risk Detection logs| UX: Azure AD threat intelligence<br>API: See [riskDetection resource type - Microsoft Graph beta](/graph/api/resources/riskdetection?view=graph-rest-beta&preserve-view=true)| See [What is risk? Azure AD Identity Protection](../identity-protection/concept-identity-protection-risks.md) |
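If you prefer to pull these detections through the API rather than the portal, a sketch like the following lists recent high-risk detections from Microsoft Graph. The token is a placeholder, and the client-side filtering is just one option; depending on the property, a server-side `$filter` may also be available.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token-with-IdentityRiskEvent.Read.All>"  # placeholder

headers = {"Authorization": f"Bearer {TOKEN}"}
url = f"{GRAPH}/identityProtection/riskDetections?$top=100"

high_risk = []
while url:
    page = requests.get(url, headers=headers).json()
    for d in page.get("value", []):
        # riskEventType maps to the detections listed in the table above,
        # for example leakedCredentials, passwordSpray, anonymizedIPAddress.
        if d.get("riskLevel") == "high":
            high_risk.append((d.get("detectedDateTime"), d.get("userPrincipalName"), d.get("riskEventType")))
    url = page.get("@odata.nextLink")

for when, user, event in sorted(high_risk, reverse=True):
    print(when, user, event)
```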
For more information, visit [What is Identity Protection](../identity-protection/overview-identity-protection.md).

### What to look for
-Configure monitoring on the data within the Azure AD Sign-ins Logs to ensure that alerting occurs and adheres to your organizationΓÇÖs security policies. Some examples of this are:
+Configure monitoring on the data within the Azure AD Sign-ins Logs to ensure that alerting occurs and adheres to your organization's security policies. Some examples of this are:
* **Failed Authentications**: As humans we all get our passwords wrong from time to time. However, many failed authentications can indicate that a bad actor is trying to obtain access. Attacks differ in ferocity but can range from a few attempts per hour to a much higher rate. For example, Password Spray normally preys on easier passwords against many accounts, while Brute Force attempts many passwords against targeted accounts.
* **Interrupted Authentications**: An Interrupt in Azure AD represents an injection of an additional process to satisfy authentication, such as when enforcing a control in a CA policy. This is a normal event and can happen when applications are not configured correctly. But when you see many interrupts for a user account it could indicate something is happening with that account.
- * For example, if you filtered on a user in Sign-in logs and see a large volume of sign in status = Interrupted and Conditional Access = Failure. Digging deeper it may show in authentication details that the password is correct, but that strong authentication is required. This could mean the user is not completing multi-factor authentication (MFA) which could indicate the userΓÇÖs password is compromised and the bad actor is unable to fulfill MFA.
+ * For example, if you filtered on a user in Sign-in logs and see a large volume of sign in status = Interrupted and Conditional Access = Failure. Digging deeper it may show in authentication details that the password is correct, but that strong authentication is required. This could mean the user is not completing multi-factor authentication (MFA) which could indicate the user's password is compromised and the bad actor is unable to fulfill MFA.
* **Smart lock out**: Azure AD provides a smart lockout service which introduces the concept of familiar and non-familiar locations to the authentication process. A user account visiting a familiar location might authenticate successfully while a bad actor unfamiliar with the same location is blocked after several attempts. Look for accounts that have been locked out and investigate further.
The following are listed in order of importance based on the impact and severity
| What to monitor| Risk Level| Where| Filter/sub-filter| Notes |
| - |- |- |- |- |
| Multi-factor authentication (MFA) fraud alerts.| High| Azure AD Sign-ins log| Status = failed<br>-and-<br>Details = MFA Denied<br>| Monitor and alert on any entry. |
-| Failed authentications from countries you do not operate out of.| Medium| Azure AD Sign-ins log| Location = <unapproved location>| Monitor and alert on any entries. |
+| Failed authentications from countries you do not operate out of.| Medium| Azure AD Sign-ins log| Location = \<unapproved location\>| Monitor and alert on any entries. |
| Failed authentications for legacy protocols or protocols that are not used.| Medium| Azure AD Sign-ins log| Status = failure<br>-and-<br>Client app = Other Clients, POP, IMAP, MAPI, SMTP, ActiveSync| Monitor and alert on any entries. |
| Failures blocked by CA.| Medium| Azure AD Sign-ins log| Error code = 53003 <br>-and-<br>Failure reason = blocked by CA| Monitor and alert on any entries. |
-| Increased failed authentications of any type.| Medium| Azure AD Sign-ins log| Capture increases in failures across the board. I.e., total failures for today is >10 % on the same day the previous week.| If you donΓÇÖt have a set threshold, monitor and alert if failures increase by 10% or greater. |
-| Authentication occurring at times and days of the week when countries do not conduct normal business operations.| Low| Azure AD Sign-ins log| Capture interactive authentication occurring outside of normal operating days\time. <br>Status = success<br>-and-<br>Location = <location><br>-and-<br>Day\Time = <not normal working hours>| Monitor and alert on any entries. |
+| Increased failed authentications of any type.| Medium| Azure AD Sign-ins log| Capture increases in failures across the board. I.e., total failures for today is >10 % on the same day the previous week.| If you don't have a set threshold, monitor and alert if failures increase by 10% or greater. |
+| Authentication occurring at times and days of the week when countries do not conduct normal business operations.| Low| Azure AD Sign-ins log| Capture interactive authentication occurring outside of normal operating days\time. <br>Status = success<br>-and-<br>Location = \<location\><br>-and-<br>Day\Time = \<not normal working hours\>| Monitor and alert on any entries. |
| Account disabled/blocked for sign-ins| Low| Azure AD Sign-ins log| Status = Failure<br>-and-<br>error code = 50057, The user account is disabled.| This could indicate someone is trying to gain access to an account once they have left an organization. Although the account is blocked it is still important to log and alert on this activity. |
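The week-over-week threshold heuristic mentioned in the table can be expressed as a small calculation once you have daily failure counts from your workspace or SIEM; the numbers below are made up for illustration.

```python
def failure_increase_pct(today_failures: int, same_day_last_week: int) -> float:
    """Percentage change in failed sign-ins versus the same weekday last week."""
    if same_day_last_week == 0:
        return float("inf") if today_failures else 0.0
    return (today_failures - same_day_last_week) / same_day_last_week * 100

# Example: 462 failures today vs. 400 on the same day last week -> 15.5% increase, alert.
change = failure_increase_pct(462, 400)
if change >= 10:
    print(f"Alert: failed sign-ins up {change:.1f}% versus the same day last week")
```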
The following are listed in order of importance based on the impact and severity
| What to monitor| Risk Level| Where| Filter/sub-filter| Notes |
| - |- |- |- |- |
-| Authentications of privileged accounts outside of expected controls.| High| Azure AD Sign-ins log| Status = success<br>-and-<br>UserPricipalName = <Admin account><br>-and-<br>Location = <unapproved location><br>-and-<br>IP Address = <unapproved IP><br>Device Info= <unapproved Browser, Operating System><br>| Monitor and alert on successful authentication for privileged accounts outside of expected controls. Three common controls are listed. |
+| Authentications of privileged accounts outside of expected controls.| High| Azure AD Sign-ins log| Status = success<br>-and-<br>UserPricipalName = \<Admin account\><br>-and-<br>Location = \<unapproved location\><br>-and-<br>IP Address = \<unapproved IP\><br>Device Info= \<unapproved Browser, Operating System\><br>| Monitor and alert on successful authentication for privileged accounts outside of expected controls. Three common controls are listed. |
| When only single-factor authentication is required.| Low| Azure AD Sign-ins log| Status = success<br>Authentication requirement = Single-factor authentication| Monitor this periodically and ensure this is the expected behavior. |
-| Discover privileged accounts not registered for MFA.| High| Azure Graph API| Query for IsMFARegistered eq false for administrator accounts. <br>[List credentialUserRegistrationDetails - Microsoft Graph beta | Microsoft Docs](/graph/api/reportroot-list-credentialuserregistrationdetails?view=graph-rest-beta&tabs=http)| Audit and investigate to determine if intentional or an oversight. |
-| Successful authentications from countries your organization does not operate out of.| Medium| Azure AD Sign-ins log| Status = success<br>Location = <unapproved country>| Monitor and alert on any entries not equal to the city names you provide. |
+| Discover privileged accounts not registered for MFA.| High| Azure Graph API| Query for IsMFARegistered eq false for administrator accounts. <br>[List credentialUserRegistrationDetails - Microsoft Graph beta \| Microsoft Docs](/graph/api/reportroot-list-credentialuserregistrationdetails?view=graph-rest-beta&preserve-view=true&tabs=http)| Audit and investigate to determine if intentional or an oversight. |
+| Successful authentications from countries your organization does not operate out of.| Medium| Azure AD Sign-ins log| Status = success<br>Location = \<unapproved country\>| Monitor and alert on any entries not equal to the city names you provide. |
| Successful authentication, session blocked by CA.| Medium| Azure AD Sign-ins log| Status = success<br>-and-<br>error code = 53003 – Failure reason, blocked by CA| Monitor and investigate when authentication is successful, but session is blocked by CA. |
| Successful authentication after you have disabled legacy authentication.| Medium| Azure AD Sign-ins log| status = success <br>-and-<br>Client app = Other Clients, POP, IMAP, MAPI, SMTP, ActiveSync| If your organization has disabled legacy authentication, monitor and alert when successful legacy authentication has taken place. |
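To automate the MFA-registration check from the table above, a sketch like this queries the beta `credentialUserRegistrationDetails` report for users who are not registered for MFA. The token is a placeholder, and you would still intersect the results with your own list of privileged accounts before alerting.

```python
import requests

GRAPH_BETA = "https://graph.microsoft.com/beta"
TOKEN = "<access-token-with-Reports.Read.All>"   # placeholder

url = (f"{GRAPH_BETA}/reports/credentialUserRegistrationDetails"
       "?$filter=isMfaRegistered eq false")
headers = {"Authorization": f"Bearer {TOKEN}"}

while url:
    page = requests.get(url, headers=headers).json()
    for user in page.get("value", []):
        # Cross-check these against your list of privileged accounts before alerting.
        print(user.get("userPrincipalName"), "is not registered for MFA")
    url = page.get("@odata.nextLink")
```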
On periodic basis, we recommend you review authentications to medium business im
| What to monitor| Risk Level| Where| Filter/sub-filter| Notes |
| - | - |- |- |- |
-| Authentications to MBI and HBI application using single-factor authentication.| Low| Azure AD Sign-ins log| status = success<br>-and-<br>Application ID = <HBI app> <br>-and-<br>Authentication requirement = single-factor authentication.| Review and validate this configuration is intentional. |
-| Authentications at days and times of the week or year that countries do not conduct normal business operations.| Low| Azure AD Sign-ins log| Capture interactive authentication occurring outside of normal operating days\time. <br>Status = success<br>Location = <location><br>Date\Time = <not normal working hours>| Monitor and alert on authentications days and times of the week or year that countries do not conduct normal business operations. |
-| Measurable increase of successful sign ins.| Low| Azure AD Sign-ins log| Capture increases in successful authentication across the board. I.e., total successes for today is >10 % on the same day the previous week.| If you donΓÇÖt have a set threshold, monitor and alert if successful authentications increase by 10% or greater. |
+| Authentications to MBI and HBI application using single-factor authentication.| Low| Azure AD Sign-ins log| status = success<br>-and-<br>Application ID = \<HBI app\> <br>-and-<br>Authentication requirement = single-factor authentication.| Review and validate this configuration is intentional. |
+| Authentications at days and times of the week or year that countries do not conduct normal business operations.| Low| Azure AD Sign-ins log| Capture interactive authentication occurring outside of normal operating days\time. <br>Status = success<br>Location = \<location\><br>Date\Time = \<not normal working hours\>| Monitor and alert on authentications days and times of the week or year that countries do not conduct normal business operations. |
+| Measurable increase of successful sign ins.| Low| Azure AD Sign-ins log| Capture increases in successful authentication across the board. I.e., total successes for today is >10 % on the same day the previous week.| If you don't have a set threshold, monitor and alert if successful authentications increase by 10% or greater. |
## Next steps

See these security operations guide articles:
active-directory Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/whats-new-archive.md
For more information about the new cookies, see [Cookie settings for accessing o
In January 2019, we've added these 35 new apps with Federation support to the app gallery:
-[Firstbird](../saas-apps/firstbird-tutorial.md), [Folloze](../saas-apps/folloze-tutorial.md), [Talent Palette](../saas-apps/talent-palette-tutorial.md), [Infor CloudSuite](../saas-apps/infor-cloud-suite-tutorial.md), [Cisco Umbrella](../saas-apps/cisco-umbrella-tutorial.md), [Zscaler Internet Access Administrator](../saas-apps/zscaler-internet-access-administrator-tutorial.md), [Expiration Reminder](../saas-apps/expiration-reminder-tutorial.md), [InstaVR Viewer](../saas-apps/instavr-viewer-tutorial.md), [CorpTax](../saas-apps/corptax-tutorial.md), [Verb](https://app.verb.net/login), [OpenLattice](https://openlattice.com/agora), [TheOrgWiki](https://www.theorgwiki.com/signup), [Pavaso Digital Close](../saas-apps/pavaso-digital-close-tutorial.md), [GoodPractice Toolkit](../saas-apps/goodpractice-toolkit-tutorial.md), [Cloud Service PICCO](../saas-apps/cloud-service-picco-tutorial.md), [AuditBoard](../saas-apps/auditboard-tutorial.md), [iProva](../saas-apps/iprova-tutorial.md), [Workable](../saas-apps/workable-tutorial.md), [CallPlease](https://webapp.callplease.com/create-account/create-account.html), [GTNexus SSO System](../saas-apps/gtnexus-sso-module-tutorial.md), [CBRE ServiceInsight](../saas-apps/cbre-serviceinsight-tutorial.md), [Deskradar](../saas-apps/deskradar-tutorial.md), [Coralogixv](../saas-apps/coralogix-tutorial.md), [Signagelive](../saas-apps/signagelive-tutorial.md), [ARES for Enterprise](../saas-apps/ares-for-enterprise-tutorial.md), [K2 for Office 365](https://www.k2.com/O365), [Xledger](https://www.xledger.net/), [iDiD Manager](../saas-apps/idid-manager-tutorial.md), [HighGear](../saas-apps/highgear-tutorial.md), [Visitly](../saas-apps/visitly-tutorial.md), [Korn Ferry ALP](../saas-apps/korn-ferry-alp-tutorial.md), [Acadia](../saas-apps/acadia-tutorial.md), [Adoddle cSaas Platform](../saas-apps/adoddle-csaas-platform-tutorial.md)<!-- , [CaféX Portal (Meetings)](https://docs.microsoft.com/azure/active-directory/saas-apps/cafexportal-meetings-tutorial), [MazeMap Link](https://docs.microsoft.com/azure/active-directory/saas-apps/mazemaplink-tutorial)-->
+[Firstbird](../saas-apps/firstbird-tutorial.md), [Folloze](../saas-apps/folloze-tutorial.md), [Talent Palette](../saas-apps/talent-palette-tutorial.md), [Infor CloudSuite](../saas-apps/infor-cloud-suite-tutorial.md), [Cisco Umbrella](../saas-apps/cisco-umbrella-tutorial.md), [Zscaler Internet Access Administrator](../saas-apps/zscaler-internet-access-administrator-tutorial.md), [Expiration Reminder](../saas-apps/expiration-reminder-tutorial.md), [InstaVR Viewer](../saas-apps/instavr-viewer-tutorial.md), [CorpTax](../saas-apps/corptax-tutorial.md), [Verb](https://app.verb.net/login), [OpenLattice](https://openlattice.com/agora), [TheOrgWiki](https://www.theorgwiki.com/signup), [Pavaso Digital Close](../saas-apps/pavaso-digital-close-tutorial.md), [GoodPractice Toolkit](../saas-apps/goodpractice-toolkit-tutorial.md), [Cloud Service PICCO](../saas-apps/cloud-service-picco-tutorial.md), [AuditBoard](../saas-apps/auditboard-tutorial.md), [iProva](../saas-apps/iprova-tutorial.md), [Workable](../saas-apps/workable-tutorial.md), [CallPlease](https://webapp.callplease.com/create-account/create-account.html), [GTNexus SSO System](../saas-apps/gtnexus-sso-module-tutorial.md), [CBRE ServiceInsight](../saas-apps/cbre-serviceinsight-tutorial.md), [Deskradar](../saas-apps/deskradar-tutorial.md), [Coralogixv](../saas-apps/coralogix-tutorial.md), [Signagelive](../saas-apps/signagelive-tutorial.md), [ARES for Enterprise](../saas-apps/ares-for-enterprise-tutorial.md), [K2 for Office 365](https://www.k2.com/O365), [Xledger](https://www.xledger.net/), [iDiD Manager](../saas-apps/idid-manager-tutorial.md), [HighGear](../saas-apps/highgear-tutorial.md), [Visitly](../saas-apps/visitly-tutorial.md), [Korn Ferry ALP](../saas-apps/korn-ferry-alp-tutorial.md), [Acadia](../saas-apps/acadia-tutorial.md), [Adoddle cSaas Platform](../saas-apps/adoddle-csaas-platform-tutorial.md)
For more information about the apps, see [SaaS application integration with Azure Active Directory](../saas-apps/tutorial-list.md). For more information about listing your application in the Azure AD app gallery, see [List your application in the Azure Active Directory application gallery](../develop/v2-howto-app-gallery-listing.md).
active-directory How To Connect Single Object Sync https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/how-to-connect-single-object-sync.md
The Single Object Sync tool performs the following steps:
9. Sync Object from Active Directory Connector Space.
10. Export Object from Azure Active Directory Connector Space to Azure Active Directory.
-In addition to the JSON output, the tool generates an HTML report that has all the details of the synchronization operation. The HTML report is located in **C:\ProgramData\AADConnect\ADSyncObjectDiagnostics\ ADSyncSingleObjectSyncResult-<date>.htm**. This HTML report can be shared with the support team to do further troubleshooting, if needed.
+In addition to the JSON output, the tool generates an HTML report that has all the details of the synchronization operation. The HTML report is located in **C:\ProgramData\AADConnect\ADSyncObjectDiagnostics\ADSyncSingleObjectSyncResult-\<date\>.htm**. This HTML report can be shared with the support team for further troubleshooting, if needed.
The HTML report has the following:
In order to use the Single Object Sync tool, you will need the following:

 - The March 2021 release ([1.6.4.0](reference-connect-version-history.md#1640)) of Azure AD Connect or later.
+ - [PowerShell 5.0](/powershell/scripting/windows-powershell/whats-new/what-s-new-in-windows-powershell-50)
### Run the Single Object Sync tool
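A hedged sketch of running the tool from an elevated PowerShell 5.0 session on the Azure AD Connect server is shown below; the distinguished name and output path are placeholders you would replace with your own values.

```powershell
# Sketch only: run a single object sync for one troubleshooting target and keep
# the JSON output for later review.
Import-Module ADSync

Invoke-ADSyncSingleObjectSync -DistinguishedName "CN=Jane Doe,OU=Staff,DC=contoso,DC=com" |
    Out-File -FilePath "C:\Temp\SingleObjectSyncResult.json"
```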
active-directory How To Connect Sync Feature Preferreddatalocation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/how-to-connect-sync-feature-preferreddatalocation.md
Title: 'Azure AD Connect: Configure preferred data location for Microsoft 365 resources' description: Describes how to put your Microsoft 365 user resources close to the user with Azure Active Directory Connect sync.- - Last updated 06/09/2021 - # Azure Active Directory Connect sync: Configure preferred data location for Microsoft 365 resources The purpose of this topic is to walk you through how to configure the attribute for preferred data location in Azure Active Directory (Azure AD) Connect sync. When someone uses Multi-Geo capabilities in Microsoft 365, you use this attribute to designate the geo-location of the user's Microsoft 365 data. (The terms *region* and *geo* are used interchangeably.) ## Supported Multi-Geo locations
-For a list of all geos supported by Azure AD Connect see [Microsoft 365 Multi-Geo availability](/microsoft-365/enterprise/microsoft-365-multi-geo?view=o365-worldwide#microsoft-365-multi-geo-availability)
+For a list of all geos supported by Azure AD Connect see [Microsoft 365 Multi-Geo availability](/microsoft-365/enterprise/microsoft-365-multi-geo#microsoft-365-multi-geo-availability)
## Enable synchronization of preferred data location By default, Microsoft 365 resources for your users are located in the same geo as your Azure AD tenant. For example, if your tenant is located in North America, then the users' Exchange mailboxes are also located in North America. For a multinational organization, this might not be optimal.
By setting the attribute **preferredDataLocation**, you can define a user's geo.
> [!IMPORTANT] > Multi-Geo is currently available to customers with an active Enterprise Agreement and a minimum of 250 Microsoft 365 Services subscriptions. Please talk to your Microsoft representative for details. >
-> For a list of all geos supported by Azure AD Connect see [Microsoft 365 Multi-Geo availability](/microsoft-365/enterprise/microsoft-365-multi-geo?view=o365-worldwide#microsoft-365-multi-geo-availability).
+> For a list of all geos supported by Azure AD Connect see [Microsoft 365 Multi-Geo availability](/microsoft-365/enterprise/microsoft-365-multi-geo#microsoft-365-multi-geo-availability).
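Once the attribute is flowing, you may want to confirm the value that reached Azure AD. A minimal, hedged sketch using Microsoft Graph follows; it assumes the Microsoft.Graph module and `User.Read.All` consent, and the user principal name is a placeholder.

```powershell
# Sketch only: read back the preferredDataLocation value for one synchronized user.
Connect-MgGraph -Scopes "User.Read.All"

$uri = 'https://graph.microsoft.com/v1.0/users/jane@contoso.com?$select=displayName,preferredDataLocation'
Invoke-MgGraphRequest -Method GET -Uri $uri -OutputType PSObject |
    Select-Object displayName, preferredDataLocation
```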
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
Required permissions | For permissions required to apply an update, see [account
>[!IMPORTANT]
-> **On 31 August 2022, all 1.x versions of Azure Active Directory (Azure AD) Connect will be retired because they include SQL Server 2012 components that will no longer be supported.** Either upgrade to the most recent version of Azure AD Connect (2.x version) by that date, or [evaluate and switch to Azure AD cloud sync](https://docs.microsoft.com/azure/active-directory/cloud-sync/what-is-cloud-sync).
+> **On 31 August 2022, all 1.x versions of Azure Active Directory (Azure AD) Connect will be retired because they include SQL Server 2012 components that will no longer be supported.** Either upgrade to the most recent version of Azure AD Connect (2.x version) by that date, or [evaluate and switch to Azure AD cloud sync](../cloud-sync/what-is-cloud-sync.md).
> > You need to make sure you are running a recent version of Azure AD Connect to receive an optimal support experience. >
active-directory Whatis Azure Ad Connect V2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/whatis-azure-ad-connect-v2.md
This [article](/windows-server/get-started-19/install-upgrade-migrate-19) descri
This release of Azure AD Connect contains several cmdlets that require PowerShell 5.0, so this requirement is a new prerequisite for Azure AD Connect.
-More details about PowerShell prerequisites can be found [here](/powershell/scripting/windows-powershell/install/windows-powershell-system-requirements?view=powershell-7.1#windows-powershell-50).
+More details about PowerShell prerequisites can be found [here](/powershell/scripting/windows-powershell/install/windows-powershell-system-requirements#windows-powershell-50).
>[!NOTE] >PowerShell 5 is already part of Windows Server 2016 so you probably do not have to take action as long as you are on a recent Window Server version.
You should upgrade to Azure AD Connect V2.0 as soon as you can. **__All Azure AD
Yes, you still need to upgrade to remain in a supported state even if you do not use SQL Server 2012, due to the TLS1.0/1.1 and ADAL deprecation. **After the upgrade of my Azure AD Connect instance to V2.0, will the SQL 2012 components automatically get uninstalled?** </br>
-No, the upgrade to SQL 2019 does not remove any SQL 2012 components from your server. If you no longer need these components then you should follow [the SQL Server uninstallation instructions](https://docs.microsoft.com/sql/sql-server/install/uninstall-an-existing-instance-of-sql-server-setup).
+No, the upgrade to SQL 2019 does not remove any SQL 2012 components from your server. If you no longer need these components then you should follow [the SQL Server uninstallation instructions](/sql/sql-server/install/uninstall-an-existing-instance-of-sql-server-setup).
**What happens if I do not upgrade?** </br> Until one of the components that are being retired are actually deprecated, you will not see any impact. Azure AD Connect will keep on working.
active-directory Access Panel Collections https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/access-panel-collections.md
Previously updated : 02/10/2020 Last updated : 09/02/2021 +
+#customer intent: As an admin, I want to enable and create collections for My Apps portal in Azure AD.
# Create collections on the My Apps portal
Your users can use the My Apps portal to view and start the cloud-based applicat
> [!NOTE] > This article covers how an admin can enable and create collections. For information for the end user about how to use the My Apps portal and collections, see [Access and use collections](../user-help/my-applications-portal-workspaces.md).
+## Prerequisites
+
+To create collections on the My Apps portal, you need:
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- One of the following roles: Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal.
+ ## Enable the latest My Apps features 1. Open the [**Azure portal**](https://portal.azure.com/) and sign in as a user administrator or Global Administrator. 2. Go to **Azure Active Directory** > **User settings**.
-3. Under **User feature previews**, select **Manage user feature preview settings**.
+3. Under **User features**, select **Manage user feature settings**.
4. Under **Users can use preview features for My Apps**, choose one of the following options: * **Selected** - Enables the features for a specific group. Use the **Select a group** option to select the group for which you want to enable the features.
To create a collection, you must have an Azure AD Premium P1 or P2 license.
4. Select **New collection**. On the **New collection** page, enter a **Name** for the collection (we recommend not using "collection" in the name). Then enter a **Description**.
- ![New collection page](media/acces-panel-collections/new-collection.png)
- 5. Select the **Applications** tab. Select **+ Add application**, and then in the **Add applications** page, select all the applications you want to add to the collection, or use the **Search** box to find applications. ![Add an application to the collection](media/acces-panel-collections/add-applications.png)
-6. When you're finished adding applications, select **Add**. The list of selected applications appears. You can use the up arrows to change the order of applications in the list. To move an application down or to delete it from the collection, select the **More** menu (**...**).
+6. When you're finished adding applications, select **Add**. The list of selected applications appears. You can use the up arrows to change the order of applications in the list.
7. Select the **Owners** tab. Select **+ Add users and groups**, and then in the **Add users and groups** page, select the users or groups you want to assign ownership to. When you're finished selecting users and groups, choose **Select**.
-9. Select the **Users and groups** tab. Select **+ Add users and groups**, and then in the **Add users and groups** page, select the users or groups you want to assign the collection to. Or use the **Search** box to find users or groups. When you're finished selecting users and groups, choose **Select**.
-
- ![Add users and groups](media/acces-panel-collections/add-users-and-groups.png)
+8. Select the **Users and groups** tab. Select **+ Add users and groups**, and then in the **Add users and groups** page, select the users or groups you want to assign the collection to. Or use the **Search** box to find users or groups. When you're finished selecting users and groups, choose **Select**.
-11. Select **Review + Create**. The properties for the new collection appear.
+9. Select **Review + Create**. The properties for the new collection appear.
> [!NOTE] > Admin collections are managed through the [Azure portal](https://portal.azure.com), not from [My Apps portal](https://myapps.microsoft.com). For example, if you assign users or groups as an owner, then they can only manage the collection through the Azure portal.
You can access audit logs in the [Azure portal](https://portal.azure.com) by sel
## Get support for My Account pages
-From the My Apps page, a user can select **My account** > **View my account** to open their account settings. On the Azure AD **My Account** page, users can manage their security info, devices, passwords, and more. They can also access their Office account settings.
+From the My Apps page, a user can select **My account** > **View account** to open their account settings. On the Azure AD **My Account** page, users can manage their security info, devices, passwords, and more. They can also access their Office account settings.
In case you need to submit a support request for an issue with the Azure AD account page or the Office account page, follow these steps so your request is routed properly:
active-directory Assign User Or Group Access Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/assign-user-or-group-access-portal.md
To assign users to an app using PowerShell, you need the following:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - One of the following roles: Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal.-- Set up Azure AD PowerShell. See [Azure AD PowerShell](https://docs.microsoft.com/powershell/azure/)
+- Set up Azure AD PowerShell. See [Azure AD PowerShell](/powershell/azure/)
- Optional: Azure Active Directory Premium P1 or P2 for group-based assignment. For more licensing requirements for the features discussed in this article, see the [Azure Active Directory pricing page](https://azure.microsoft.com/pricing/details/active-directory). - Optional: Completion of [Configure an app](add-application-portal-configure.md).
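With those prerequisites in place, a minimal sketch of a PowerShell-based assignment might look like the following. It assumes the AzureAD module, and the user principal name and application display name are placeholders.

```powershell
# Sketch only: assign one user to an enterprise application with the AzureAD module.
Connect-AzureAD

$user = Get-AzureADUser -ObjectId "jane@contoso.com"
$sp   = Get-AzureADServicePrincipal -Filter "displayName eq 'My Sample App'"

# Use the first app role the application defines, or the default access role if none exist.
$appRoleId = if ($sp.AppRoles.Count -gt 0) { $sp.AppRoles[0].Id } else { [Guid]::Empty }

New-AzureADUserAppRoleAssignment -ObjectId $user.ObjectId -PrincipalId $user.ObjectId `
    -ResourceId $sp.ObjectId -Id $appRoleId
```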
active-directory Configure Password Single Sign On Non Gallery Applications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/configure-password-single-sign-on-non-gallery-applications.md
Using Azure AD as your Identity Provider (IdP) and configuring single sign-on (S
> > If the application was registered using **App registrations** then the single sign-on capability is configured to use OIDC OAuth by default. In this case, the **Single sign-on** option won't show in the navigation under **Enterprise applications**. When you use **App registrations** to add your custom app, you configure options in the manifest file. To learn more about the manifest file, see [Azure Active Directory app manifest](../develop/reference-app-manifest.md). To learn more about SSO standards, see [Authentication and authorization using Microsoft identity platform](../develop/authentication-vs-authorization.md#authentication-and-authorization-using-the-microsoft-identity-platform). >
-> Other scenarios where **Single sign-on** will be missing from the navigation include when an application is hosted in another tenant or if your account does not have the required permissions (Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal). Permissions can also cause a scenario where you can open **Single sign-on** but won't be able to save. To learn more about Azure AD administrative roles, see (https://docs.microsoft.com/azure/active-directory/users-groups-roles/directory-assign-admin-roles).
+> Other scenarios where **Single sign-on** will be missing from the navigation include when an application is hosted in another tenant or if your account does not have the required permissions (Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal). Permissions can also cause a scenario where you can open **Single sign-on** but won't be able to save. To learn more about Azure AD administrative roles, see [Azure AD built-in roles](../users-groups-roles/directory-assign-admin-roles.md).
## Basic configuration
active-directory Configure Permission Classifications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/configure-permission-classifications.md
To complete the tasks in this guide, you need the following:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - A Global Administrator role.-- Set up Azure AD PowerShell. See [Azure AD PowerShell](https://docs.microsoft.com/powershell/azure/)
+- Set up Azure AD PowerShell. See [Azure AD PowerShell](/powershell/azure/)
## Manage permission classifications
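As a hedged example of what the PowerShell path can look like, the sketch below classifies the Microsoft Graph `User.Read` delegated permission as low impact. It assumes your installed AzureAD module exposes `Add-AzureADMSServicePrincipalDelegatedPermissionClassification`; verify the cmdlet is available before relying on it.

```powershell
# Sketch only: classify a delegated permission as "low" impact.
Connect-AzureAD

$graphSp    = Get-AzureADServicePrincipal -Filter "displayName eq 'Microsoft Graph'"
$permission = $graphSp.Oauth2Permissions | Where-Object { $_.Value -eq 'User.Read' }

Add-AzureADMSServicePrincipalDelegatedPermissionClassification `
    -ServicePrincipalId $graphSp.ObjectId `
    -PermissionId $permission.Id `
    -Classification "low" `
    -PermissionName $permission.Value
```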
active-directory Configure Saml Single Sign On https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/configure-saml-single-sign-on.md
In the [quickstart series](add-application-portal-setup-sso.md), there's an arti
> > If the application was registered using **App registrations** then the single sign-on capability is configured to use OIDC OAuth by default. In this case, the **Single sign-on** option won't show in the navigation under **Enterprise applications**. When you use **App registrations** to add your custom app, you configure options in the manifest file. To learn more about the manifest file, see [Azure Active Directory app manifest](../develop/reference-app-manifest.md). To learn more about SSO standards, see [Authentication and authorization using Microsoft identity platform](../develop/authentication-vs-authorization.md#authentication-and-authorization-using-the-microsoft-identity-platform). >
-> Other scenarios where **Single sign-on** will be missing from the navigation include when an application is hosted in another tenant or if your account does not have the required permissions (Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal). Permissions can also cause a scenario where you can open **Single sign-on** but won't be able to save. To learn more about Azure AD administrative roles, see (https://docs.microsoft.com/azure/active-directory/users-groups-roles/directory-assign-admin-roles).
+> Other scenarios where **Single sign-on** will be missing from the navigation include when an application is hosted in another tenant or if your account does not have the required permissions (Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal). Permissions can also cause a scenario where you can open **Single sign-on** but won't be able to save. To learn more about Azure AD administrative roles, see [Azure AD built-in roles](../users-groups-roles/directory-assign-admin-roles.md).
## Basic SAML configuration
You should get the values from the application vendor. You can manually enter th
| Basic SAML Configuration setting | SP-Initiated | idP-Initiated | Description | |:--|:--|:--|:--|
-| **Identifier (Entity ID)** | Required for some apps | Required for some apps | Uniquely identifies the application. Azure AD sends the identifier to the application as the Audience parameter of the SAML token. The application is expected to validate it. This value also appears as the Entity ID in any SAML metadata provided by the application. Enter a URL that uses the following pattern: 'https://<subdomain>.contoso.com' *You can find this value as the **Issuer** element in the **AuthnRequest** (SAML request) sent by the application.* |
+| **Identifier (Entity ID)** | Required for some apps | Required for some apps | Uniquely identifies the application. Azure AD sends the identifier to the application as the Audience parameter of the SAML token. The application is expected to validate it. This value also appears as the Entity ID in any SAML metadata provided by the application. Enter a URL that uses the following pattern: `https://<subdomain>.contoso.com` *You can find this value as the **Issuer** element in the **AuthnRequest** (SAML request) sent by the application.* |
| **Reply URL** | Required | Required | Specifies where the application expects to receive the SAML token. The reply URL is also referred to as the Assertion Consumer Service (ACS) URL. You can use the additional reply URL fields to specify multiple reply URLs. For example, you might need additional reply URLs for multiple subdomains. Or, for testing purposes you can specify multiple reply URLs (local host and public URLs) at one time. | | **Sign-on URL** | Required | Don't specify | When a user opens this URL, the service provider redirects to Azure AD to authenticate and sign on the user. Azure AD uses the URL to start the application from Microsoft 365 or Azure AD My Apps. When blank, Azure AD does an IdP-initiated sign-on when a user launches the application from Microsoft 365, Azure AD My Apps, or the Azure AD SSO URL.| | **Relay State** | Optional | Optional | Specifies to the application where to redirect the user after authentication is completed. Typically the value is a valid URL for the application. However, some applications use this field differently. For more information, ask the application vendor.
active-directory Configure User Consent Groups https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/configure-user-consent-groups.md
To complete the tasks in this guide, you need the following:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - A Global Administrator role.-- Set up Azure AD PowerShell. See [Azure AD PowerShell](https://docs.microsoft.com/powershell/azure/)
+- Set up Azure AD PowerShell. See [Azure AD PowerShell](/powershell/azure/)
## Manage group owner consent to apps
active-directory Datawiza With Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/datawiza-with-azure-ad.md
To get started, you'll need:
- An Azure subscription. If you don\'t have a subscription, you can get a [trial account](https://azure.microsoft.com/free/). -- An [Azure AD tenant](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-access-create-new-tenant)
+- An [Azure AD tenant](../fundamentals/active-directory-access-create-new-tenant.md)
that's linked to your Azure subscription. - [Docker](https://docs.docker.com/get-docker/) and
are required to run DAB. Your applications can run on any platform, such as the
Datawiza integration includes the following components: -- [Azure AD](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-whatis) - Microsoft's cloud-based identity and access management service, which helps users sign in and access external and internal resources.
+- [Azure AD](../fundamentals/active-directory-whatis.md) - Microsoft's cloud-based identity and access management service, which helps users sign in and access external and internal resources.
- Datawiza Access Broker (DAB) - The service that handles user sign-on and transparently passes identity to applications through HTTP headers.
header-based application should have SSO enabled with Azure AD. Open a browser a
## Next steps -- [Configure Datawiza with Azure AD B2C](https://docs.microsoft.com/azure/active-directory-b2c/partner-datawiza)
+- [Configure Datawiza with Azure AD B2C](../../active-directory-b2c/partner-datawiza.md)
- [Datawiza documentation](https://docs.datawiza.com)
active-directory Debug Saml Sso Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/debug-saml-sso-issues.md
We recommend installing the [My Apps Secure Sign-in Extension](../user-help/my-a
To download and install the My Apps Secure Sign-in Extension, use one of the following links. - [Chrome](https://go.microsoft.com/fwlink/?linkid=866367)-- [Microsoft Edge](https://go.microsoft.com/fwlink/?linkid=845176)
+- [Microsoft Edge](https://microsoftedge.microsoft.com/addons/detail/my-apps-secure-signin-ex/gaaceiggkkiffbfdpmfapegoiohkiipl)
## Test SAML-based single sign-on
active-directory Manage App Consent Policies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/manage-app-consent-policies.md
Previously updated : 06/01/2020 Last updated : 09/02/2021 +
+#customer intent: As an admin, I want to manage app consent policies for enterprise applications in Azure AD
# Manage app consent policies
active-directory Manage Application Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/manage-application-permissions.md
For more information on consenting to applications, see [Azure Active Directory
To do the following actions, you must sign in as a global administrator, an application administrator, or a cloud application administrator. -- Set up Azure AD PowerShell. See [Azure AD PowerShell](https://docs.microsoft.com/powershell/azure/)
+- Set up Azure AD PowerShell. See [Azure AD PowerShell](/powershell/azure/)
To restrict access to applications, you need to require user assignment and then assign users or groups to the application. For more information, see [Methods for assigning users and groups](./assign-user-or-group-access-portal.md).
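For a quick review of what an application has already been granted, a hedged sketch with the AzureAD module might look like this; the display name is a placeholder.

```powershell
# Sketch only: list the delegated permission grants held by one enterprise application.
Connect-AzureAD

$sp = Get-AzureADServicePrincipal -Filter "displayName eq 'My Sample App'"
Get-AzureADServicePrincipalOAuth2PermissionGrant -ObjectId $sp.ObjectId |
    Select-Object ConsentType, PrincipalId, Scope
```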
active-directory Migrate Applications From Okta To Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-applications-from-okta-to-azure-active-directory.md
Now you can print all the applications in your Okta tenant to a JSON format.
![Image that shows the list of applications](media/migrate-applications-from-okta-to-azure-active-directory/list-of-applications.png)
-It's recommended to copy and convert this JSON list to CSV using a public converter such as <https://konklone.io/json/> or PowerShell using [ConvertFrom-Json](https://docs.microsoft.com/powershell/module/microsoft.powershell.utility/convertfrom-json?view=powershell-7.1)
-and [ConvertTo-CSV.](https://docs.microsoft.com/powershell/module/microsoft.powershell.utility/convertto-csv?view=powershell-7.1)
+It's recommended to copy and convert this JSON list to CSV using a public converter such as <https://konklone.io/json/> or PowerShell using [ConvertFrom-Json](/powershell/module/microsoft.powershell.utility/convertfrom-json)
+and [ConvertTo-CSV.](/powershell/module/microsoft.powershell.utility/convertto-csv)
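A minimal sketch of that conversion in PowerShell is shown below; the file names and selected properties are placeholders you would adjust to match the fields in your Okta export.

```powershell
# Sketch only: convert the exported Okta application list from JSON to CSV.
$apps = Get-Content -Path .\okta-applications.json -Raw | ConvertFrom-Json

$apps |
    Select-Object id, label, status, signOnMode |
    ConvertTo-Csv -NoTypeInformation |
    Out-File -FilePath .\okta-applications.csv
```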
After downloading the CSV, the applications in your Okta tenant have been recorded successfully for future reference.

## Migrate a SAML application to Azure AD
-To migrate a SAML 2.0 application to Azure AD, first configure the application in your Azure AD tenant for application access. In this example, we'll be converting a Salesforce instance. Follow [this tutorial](https://docs.microsoft.com/azure/active-directory/saas-apps/salesforce-tutorial) to onboard the applications.
+To migrate a SAML 2.0 application to Azure AD, first configure the application in your Azure AD tenant for application access. In this example, we'll be converting a Salesforce instance. Follow [this tutorial](../saas-apps/salesforce-tutorial.md) to onboard the applications.
To complete the migration process, repeat configuration steps for all applications discovered in the Okta tenant.
Settings** > **New from Metadata File**
![Image that shows selecting a new identity provider](media/migrate-applications-from-okta-to-azure-active-directory/new-identity-provider.png)
-15. If everything has been correctly configured, the user will land at the Salesforce homepage. If there are any issues follow the [debugging guide](https://docs.microsoft.com/azure/active-directory/manage-apps/debug-saml-sso-issues).
+15. If everything has been correctly configured, the user will land at the Salesforce homepage. If there are any issues follow the [debugging guide](../manage-apps/debug-saml-sso-issues.md).
16. After testing the SSO connection from Azure, return to the enterprise application and assign the remaining users to the Salesforce application with the correct roles.
To complete the migration process, repeat configuration steps for all applicatio
![Image that shows a new OIDC application](media/migrate-applications-from-okta-to-azure-active-directory/new-oidc-application.png)
-3. On the next page, you'll be presented with a choice about tenancy of your application registration. See [this article](https://docs.microsoft.com/azure/active-directory/develop/single-and-multi-tenant-apps) for details.
+3. On the next page, you'll be presented with a choice about tenancy of your application registration. See [this article](../develop/single-and-multi-tenant-apps.md) for details.
In this example, we are selecting **Accounts in any organizational directory**, any Azure AD directory **Multitenant** followed by **Register**.
directory**, any Azure AD directory **Multitenant** followed by **Register**.
4. After registering the application, navigate to the **App registrations** page under **Azure Active Directory**, and open the newly created registration.
- Depending on the [application scenario,](https://docs.microsoft.com/azure/active-directory/develop/authentication-flows-app-scenarios) various configuration actions
+ Depending on the [application scenario,](../develop/authentication-flows-app-scenarios.md) various configuration actions
might be needed. As most scenarios require App client secret, we'll cover those examples. 5. On the **Overview** page, record the Application (client) ID for use in your application later.
directory**, any Azure AD directory **Multitenant** followed by **Register**.
## Migrate a custom authorization server to Azure AD
-Okta authorization servers map one-to-one to application registrations that [expose an API](https://docs.microsoft.com/azure/active-directory/develop/quickstart-configure-app-expose-web-apis#add-a-scope).
+Okta authorization servers map one-to-one to application registrations that [expose an API](../develop/quickstart-configure-app-expose-web-apis.md#add-a-scope).
Default Okta authorization server should be mapped to Microsoft Graph scopes/permissions.
active-directory Migrate Okta Federation To Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-okta-federation-to-azure-active-directory.md
Customers who have federated their Office 365 domains with Okta may not currentl
Use the following methods to determine which method is best suited for your environment: -- **Password hash synchronization** - [Password hash synchronization](https://docs.microsoft.com/azure/active-directory/hybrid/whatis-phs) is an extension to the directory synchronization feature implemented by Azure AD Connect server or Cloud provisioning agents. You can use this feature to sign into Azure AD services like Microsoft 365. You sign in to the service by using the same password you use to sign in to your on-premises Active Directory instance.
+- **Password hash synchronization** - [Password hash synchronization](../hybrid/whatis-phs.md) is an extension to the directory synchronization feature implemented by Azure AD Connect server or Cloud provisioning agents. You can use this feature to sign into Azure AD services like Microsoft 365. You sign in to the service by using the same password you use to sign in to your on-premises Active Directory instance.
-- **Pass-through authentication** - Azure AD [Pass-through authentication](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-pta) allows users to sign in to both on-premises and cloud-based applications using the same passwords. When users sign in using Azure AD, this feature validates users' passwords directly against the on-premises Active Directory via the Pass-through Authentication agent.
+- **Pass-through authentication** - Azure AD [Pass-through authentication](../hybrid/how-to-connect-pta.md) allows users to sign in to both on-premises and cloud-based applications using the same passwords. When users sign in using Azure AD, this feature validates users' passwords directly against the on-premises Active Directory via the Pass-through Authentication agent.
-- **Seamless SSO** - [Azure AD Seamless SSO](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-sso) automatically signs in users when they are on their corporate desktops
+- **Seamless SSO** - [Azure AD Seamless SSO](../hybrid/how-to-connect-sso.md) automatically signs in users when they are on their corporate desktops
that are connected to your corporate network. Seamless SSO provides your users with easy access to your cloud-based applications without needing any other on-premises components. Seamless SSO can also be deployed to Password hash synchronization or Pass-through authentication to create a seamless authentication experience to users in Azure AD.
-Ensure that you deploy all necessary pre-requisites of Seamless SSO to your end users by following the [deployment guide](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-sso-quick-start#step-1-check-the-prerequisites).
+Ensure that you deploy all necessary pre-requisites of Seamless SSO to your end users by following the [deployment guide](../hybrid/how-to-connect-sso-quick-start.md#step-1-check-the-prerequisites).
For our example, we'll be configuring Password hash sync and Seamless SSO.
Follow these steps to enable Seamless SSO:
## Step 2 - Configure staged rollout features
-[Staged rollout of cloud authentication](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-staged-rollout) is a feature of Azure AD that can be used to test de-federating users before de-federating an entire
-domain. Before the deployment review the [pre-requisites](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-staged-rollout#prerequisites).
+[Staged rollout of cloud authentication](../hybrid/how-to-connect-staged-rollout.md) is a feature of Azure AD that can be used to test de-federating users before de-federating an entire
+domain. Before the deployment review the [pre-requisites](../hybrid/how-to-connect-staged-rollout.md#prerequisites).
After enabling Password Hash Sync and Seamless SSO on the Azure AD Connect server, follow these steps to configure staged rollout.
After configuring the Okta app in Azure AD and the Identity Provider in the Okta
## Step 5 - Test-managed authentication on pilot members
-After configuring the Okta reverse federation app, have your users conduct full testing on the Managed authentication experience. Its recommended to set up Company branding to help your users distinguish the proper tenant they are signing into. Get [guidance](https://docs.microsoft.com/azure/active-directory/fundamentals/customize-branding) for setting up company branding.
+After configuring the Okta reverse federation app, have your users conduct full testing on the managed authentication experience. It's recommended to set up company branding to help your users distinguish the proper tenant they are signing in to. Get [guidance](../fundamentals/customize-branding.md) for setting up company branding.
>[!IMPORTANT] >Determine any additional Conditional Access Policies
active-directory Migrate Okta Sign On Policies To Azure Active Directory Conditional Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access.md
Azure AD CA policies can be configured to match Okta's conditions for most scena
In some scenarios, you may need additional setup before you configure the CA policies. The two known scenarios at the time of writing this article are: -- **Okta network locations to named locations in Azure AD** - Follow [this article](https://docs.microsoft.com/azure/active-directory/conditional-access/location-condition#named-locations) to configure named locations in Azure AD.
+- **Okta network locations to named locations in Azure AD** - Follow [this article](../conditional-access/location-condition.md#named-locations) to configure named locations in Azure AD.
- **Okta device trust to device-based CA** - CA offers two possible options when evaluating a user's device.
Enabling hybrid Azure AD join can be done on your Azure AD Connect server by run
>[!NOTE] >Hybrid Azure AD join isn't supported with the Azure AD Connect cloud provisioning agents.
-1. Follow these [instructions](https://docs.microsoft.com/azure/active-directory/devices/hybrid-azuread-join-managed-domains#configure-hybrid-azure-ad-join) to enable Hybrid Azure AD join.
+1. Follow these [instructions](../devices/hybrid-azuread-join-managed-domains.md#configure-hybrid-azure-ad-join) to enable Hybrid Azure AD join.
2. On the SCP configuration page, select the **Authentication Service** drop-down. Choose your Okta federation provider URL followed by **Add**. Enter your on-premises enterprise administrator credentials then select **Next**.
Enabling hybrid Azure AD join can be done on your Azure AD Connect server by run
While hybrid Azure AD join is a direct replacement for Okta device trust on Windows, CA policies can also look at device compliance for devices that have fully enrolled into Microsoft Endpoint Manager. -- **Compliance overview** - Refer to [device compliance policies in Microsoft Intune](https://docs.microsoft.com/mem/intune/protect/device-compliance-get-started#:~:text=Reference%20for%20non-compliance%20and%20Conditional%20Access%20on%20the,applicable%20%20...%20%203%20more%20rows).
+- **Compliance overview** - Refer to [device compliance policies in Microsoft Intune](/mem/intune/protect/device-compliance-get-started#:~:text=Reference%20for%20non-compliance%20and%20Conditional%20Access%20on%20the,applicable%20%20...%20%203%20more%20rows).
-- **Device compliance** - Create [policies in Microsoft Intune](https://docs.microsoft.com/mem/intune/protect/create-compliance-policy).
+- **Device compliance** - Create [policies in Microsoft Intune](/mem/intune/protect/create-compliance-policy).
-- **Windows enrollment** - If you've opted to deploy hybrid Azure AD join, an additional group policy can be deployed to complete the [auto-enrollment process of these devices into Microsoft Intune](https://docs.microsoft.com/windows/client-management/mdm/enroll-a-windows-10-device-automatically-using-group-policy).
+- **Windows enrollment** - If you've opted to deploy hybrid Azure AD join, an additional group policy can be deployed to complete the [auto-enrollment process of these devices into Microsoft Intune](/windows/client-management/mdm/enroll-a-windows-10-device-automatically-using-group-policy).
-- **iOS/iPadOS enrollment** - Before enrolling an iOS device, [additional configurations](https://docs.microsoft.com/mem/intune/enrollment/ios-enroll) must be made in the Endpoint Management Console.
+- **iOS/iPadOS enrollment** - Before enrolling an iOS device, [additional configurations](/mem/intune/enrollment/ios-enroll) must be made in the Endpoint Management Console.
-- **Android enrollment** - Before enrolling an Android device, [additional configurations](https://docs.microsoft.com/mem/intune/enrollment/android-enroll) must be made in the Endpoint Management Console.
+- **Android enrollment** - Before enrolling an Android device, [additional configurations](/mem/intune/enrollment/android-enroll) must be made in the Endpoint Management Console.
## Step 3 - Configure Azure AD Multi-Factor Authentication tenant settings
After you configured the pre-requisites, and established the base settings its t
1. To configure CA policies in Azure AD, navigate to the [Azure portal](https://portal.azure.com). Select **View** on Manage Azure Active Directory. 2. Configuration of CA policies should keep in mind [best
-practices for deploying and designing CA](https://docs.microsoft.com/azure/active-directory/conditional-access/plan-conditional-access#understand-conditional-access-policy-components).
+practices for deploying and designing CA](../conditional-access/plan-conditional-access.md#understand-conditional-access-policy-components).
-3. To mimic global sign-on MFA policy from Okta, [create a policy](https://docs.microsoft.com/azure/active-directory/conditional-access/howto-conditional-access-policy-all-users-mfa).
+3. To mimic global sign-on MFA policy from Okta, [create a policy](../conditional-access/howto-conditional-access-policy-all-users-mfa.md).
-4. Create a [device trust based CA rule](https://docs.microsoft.com/azure/active-directory/conditional-access/require-managed-devices).
+4. Create a [device trust based CA rule](../conditional-access/require-managed-devices.md).
5. This policy as any other in this tutorial can be targeted to a specific application, test group of users or both.
practices for deploying and designing CA](https://docs.microsoft.com/azure/activ
![image shows success in testing user](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/success-test-user.png) 6. After you configured the location-based policy, and device
-trust policy, its time to configure the equivalent [**Block legacy authentication**](https://docs.microsoft.com/azure/active-directory/conditional-access/howto-conditional-access-policy-block-legacy) policy.
+trust policy, it's time to configure the equivalent [**Block legacy authentication**](../conditional-access/howto-conditional-access-policy-block-legacy.md) policy.
With these three CA policies, the original Okta sign on policies experience has been replicated in Azure AD. Next steps involve enrolling the user to Azure MFA and testing the policies.
need to register for Azure MFA methods. Users can be required to register throug
2. User can go to <https://aka.ms/mysecurityinfo> to enter information or manage form of MFA registration.
-See [this guide](https://docs.microsoft.com/azure/active-directory/authentication/howto-registration-mfa-sspr-combined) to fully understand the MFA registration process.
+See [this guide](../authentication/howto-registration-mfa-sspr-combined.md) to fully understand the MFA registration process.
After you navigate to <https://aka.ms/mfasetup> and sign in with Okta MFA, you're instructed to register for MFA with Azure AD.
Navigate to <https://aka.ms/mfasetup> after signing in with Okta MFA, you're ins
>If registration already happened in the past for that user, they'll be taken to **My Security** information page after satisfying the MFA prompt.
-See the [end-user documentation for MFA enrollment](https://docs.microsoft.com/azure/active-directory/user-help/security-info-setup-signin).
+See the [end-user documentation for MFA enrollment](../user-help/security-info-setup-signin.md).
## Step 6 - Enable CA policies
active-directory Migrate Okta Sync Provisioning To Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-okta-sync-provisioning-to-azure-active-directory.md
Migrating synchronization platforms isn't a small change. Each step of the proce
When switching from Okta provisioning to Azure AD, customers have two choices, either Azure AD Connect Server, or Azure AD cloud
-provisioning. It is recommended to read the full [comparison article from Microsoft](https://docs.microsoft.com/azure/active-directory/cloud-sync/what-is-cloud-sync#comparison-between-azure-ad-connect-and-cloud-provisioning) to understand the differences between the two products.
+provisioning. It is recommended to read the full [comparison article from Microsoft](../cloud-sync/what-is-cloud-sync.md#comparison-between-azure-ad-connect-and-cloud-sync) to understand the differences between the two products.
Azure AD cloud provisioning will be the most familiar migration path for Okta customers using Universal or User sync. The cloud provisioning agents are lightweight and can be installed on or near domain controllers, like the Okta directory sync agents. It is not recommended to install them on the same server.
Azure AD Connect server should be chosen if your organization needs to take adva
- Support for writeback >[!NOTE]
->All pre-requisites should be taken into consideration when installing Azure AD Connect or Azure AD cloud provisioning. Refer to [this article to learn more](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-install-prerequisites) before installation.
+>All pre-requisites should be taken into consideration when installing Azure AD Connect or Azure AD cloud provisioning. Refer to [this article to learn more](../hybrid/how-to-connect-install-prerequisites.md) before installation.
## Step 1 - Confirm ImmutableID attribute synchronized by Okta
The example will grab **all** on-premises AD users, and export a list of their o
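The example script itself isn't included in this digest, but a hedged sketch of such an export (assuming the ActiveDirectory RSAT module) could look like the following; it converts each user's objectGUID to the Base64 form that Azure AD stores as the ImmutableID.

```powershell
# Sketch only: export every on-premises user's objectGUID in ImmutableID (Base64) form.
Import-Module ActiveDirectory

Get-ADUser -Filter * |
    Select-Object UserPrincipalName,
                  @{ Name = 'ImmutableID'
                     Expression = { [System.Convert]::ToBase64String($_.ObjectGUID.ToByteArray()) } } |
    Export-Csv -Path .\onprem-immutableids.csv -NoTypeInformation
```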
Once you've prepared your list of source and destination targets, it's time to install Azure AD Connect server. If you've opted to use Azure AD Connect cloud provisioning, skip this section.
-1. Continue with [downloading and installing Azure AD Connect](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-install-custom) to your chosen server.
+1. Continue with [downloading and installing Azure AD Connect](../hybrid/how-to-connect-install-custom.md) to your chosen server.
2. On the **Identifying Users** page, under the **select how users should be identified with Azure AD** select the radial for **Choose a specific attribute**. Then, select **mS-DS-ConsistencyGUID** if you haven't modified the Okta defaults.
Before exiting the staging mode, it's important to verify that the ImmutableID's
## Step 4 - Install Azure AD cloud sync agents
-Once you've prepared your list of source and destination targets, its time to [install and configure Azure AD cloud sync agents](https://docs.microsoft.com/azure/active-directory/cloud-sync/tutorial-single-forest). If you've opted to use Azure AD Connect server, skip this section.
+Once you've prepared your list of source and destination targets, it's time to [install and configure Azure AD cloud sync agents](../cloud-sync/tutorial-single-forest.md). If you've opted to use Azure AD Connect server, skip this section.
## Step 5 - Disable Okta provisioning to Azure AD
active-directory My Apps Deployment Plan https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/my-apps-deployment-plan.md
Previously updated : 07/25/2021 Last updated : 09/02/2021
The My Apps portal is available to users by default and cannot be turned off. It
Any application in the Azure Active Directory enterprise applications list appears when both of the following conditions are met: * The visibility property for the app is set to true.- * The app is assigned to any user or group. It appears for assigned users. Configuring the portal ensures that the right people can easily find the right apps.
Configuring the portal ensures that the right people can easily find the right a
Users access the My Apps portal to: * Discover and access all their organization's Azure AD-connected applications to which they have access.- * It's best to ensure apps are configured for single sign-on (SSO) to provide users the best experience.- * Request access to new apps that are configured for self-service.- * Create personal collections of apps.- * Manage access to apps for others when assigned the role of group owner or delegated control for the group used to grant access to the application(s). Administrators can configure: * [Consent experiences](../manage-apps/configure-user-consent.md) including terms of service.- * [Self-service application discovery and access requests](../manage-apps/access-panel-manage-self-service-access.md).- * [Collections of applications](../manage-apps/access-panel-collections.md).- * Assignment of icons to applications- * User-friendly names for applications- * Company branding shown on My Apps ## Plan consent configuration
It's best if SSO is enabled for all apps in the My Apps portal so that users hav
Azure AD supports multiple SSO options. * To learn more, see [Single sign-on options in Azure AD](sso-options.md).- * To learn more about using Azure AD as an identity provider for an app, see the [Quickstart Series on Application Management](../manage-apps/view-applications-portal.md). ### Use federated SSO if possible
For detailed information on the extension, see [Installing My Apps browser exten
If you must integrate these applications, you should define a mechanism to deploy the extension at scale with [supported browsers](../user-help/my-apps-portal-end-user-access.md). Options include: * [User-driven download and configuration for Chrome, Firefox, Microsoft Edge, or IE](../user-help/my-apps-portal-end-user-access.md)- * [Configuration Manager for Internet Explorer](/mem/configmgr/core/clients/deploy/deploy-clients-to-windows-computers) The extension allows users to launch any app from its search bar, finding access to recently used applications, and having a link to the My Apps page.
Every Azure AD application to which a user has access will appear on My Apps in
End users can also customize their experience by * Creating their own app collections.- * [Hiding and reordering app collections](access-panel-collections.md). ![Screenshot of self-service configuration](./media/my-apps-deployment-plan/collections.png)
See [Set up self-service group management in Azure Active Directory](../enterpri
You can enable users to discover and request access to applications via the My Apps panel. To do so, you must first * enable self-service group management- * enable app for SSO- * create a group for application access ![Screen shot of My Apps self service configuration](./media/my-apps-deployment-plan/my-apps-self-service.png)
active-directory Protect Against Consent Phishing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/protect-against-consent-phishing.md
If your organization has been impacted by an application disabled by Microsoft,
1. Investigate the application activity for the disabled application, including: - The delegated permissions or application permissions requested by the application. - The Azure AD audit logs for activity by the application and sign-in activity for users authorized to use the application.
-1. Review and implement the [guidance on defending against illicit consent grants](/microsoft-365/security/office-365-security/detect-and-remediate-illicit-consent-grants?view=o365-worldwide&preserve-view=true) in Microsoft cloud products, including auditing permissions and consent for the disabled application or any other suspicious apps found during review.
+1. Review and implement the [guidance on defending against illicit consent grants](/microsoft-365/security/office-365-security/detect-and-remediate-illicit-consent-grants) in Microsoft cloud products, including auditing permissions and consent for the disabled application or any other suspicious apps found during review.
1. Implement best practices for hardening against consent phishing, described below.
At Microsoft, we want to put admins in control by providing the right insights a
* Know how to spot and block common consent phishing tactics - Check for poor spelling and grammar. If an email message or the application's consent screen has spelling and grammatical errors, it's likely a suspicious application. In that case, you can report it directly on the [consent prompt](../develop/application-consent-experience.md#building-blocks-of-the-consent-prompt) with the "*Report it here*" link and Microsoft will investigate if it is a malicious application and disable it, if confirmed. - Don't rely on app names and domain URLs as a source of authenticity. Attackers like to spoof app names and domains that make it appear to come from a legitimate service or company to drive consent to a malicious app. Instead validate the source of the domain URL and use applications from [verified publishers](../develop/publisher-verification-overview.md) when possible.
- - Block [consent phishing emails with Microsoft Defender for Office 365](/microsoft-365/security/office-365-security/set-up-anti-phishing-policies?view=o365-worldwide&preserve-view=true#impersonation-settings-in-anti-phishing-policies-in-microsoft-defender-for-office-365) by protecting against phishing campaigns where an attacker is impersonating a known user in your organization.
+ - Block [consent phishing emails with Microsoft Defender for Office 365](/microsoft-365/security/office-365-security/set-up-anti-phishing-policies#impersonation-settings-in-anti-phishing-policies-in-microsoft-defender-for-office-365) by protecting against phishing campaigns where an attacker is impersonating a known user in your organization.
- Configure Microsoft cloud app security policies such as [activity policies](/cloud-app-security/user-activity-policies), [anomaly detection](/cloud-app-security/anomaly-detection-policy), and [OAuth app policies](/cloud-app-security/app-permission-policy) to help manage and take action on abnormal application activity in to your organization.
- - Investigate and hunt for consent phishing attacks by following the guidance on [advanced hunting with Microsoft 365 Defender](/microsoft-365/security/defender/advanced-hunting-overview?view=o365-worldwide&preserve-view=true).
+ - Investigate and hunt for consent phishing attacks by following the guidance on [advanced hunting with Microsoft 365 Defender](/microsoft-365/security/defender/advanced-hunting-overview).
* Allow access to apps you trust and protect against those you don't trust:
 - Use applications that have been publisher verified. [Publisher verification](../develop/publisher-verification-overview.md) helps admins and end users understand the authenticity of application developers through a Microsoft-supported vetting process.
 - [Configure user consent settings](./configure-user-consent.md?tabs=azure-portal) to allow users to only consent to specific applications you trust, such as applications developed by your organization or from verified publishers.
- - Create proactive [app governance](/microsoft-365/compliance/app-governance-manage-app-governance?view=o365-worldwide&preserve-view=true) policies to monitor third-party app behavior on the Microsoft 365 platform to address common suspicious app behaviors.
+ - Create proactive [app governance](/microsoft-365/compliance/app-governance-manage-app-governance) policies to monitor third-party app behavior on the Microsoft 365 platform to address common suspicious app behaviors.
## Next steps
active-directory Secure Hybrid Access Integrations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/secure-hybrid-access-integrations.md
+
+ Title: Secure hybrid access partner integration
+description: Help customers discover and migrate SaaS applications into Azure AD and connect apps that use legacy authentication methods with Azure AD.
+++++++ Last updated : 04/20/2021++++
+# Secure hybrid access partner integration
+
+Azure Active Directory (Azure AD) supports modern authentication protocols that keep applications secure in a highly connected, cloud-based world. However, many business applications were created to work in a protected corporate network, and some of these applications use legacy authentication methods. As companies look to build a Zero Trust strategy and support hybrid and cloud-first work environments, they need solutions that connect apps to Azure AD and provide modern authentication solutions for legacy applications.
+
+Azure AD natively supports modern protocols like SAML, WS-Fed, and OIDC. Azure AD's App Proxy supports Kerberos and header-based authentication. Other protocols and methods, like SSH, NTLM, LDAP, and cookies, aren't yet supported, but ISVs can create solutions to connect these applications with Azure AD to support customers on their journey to Zero Trust.
+
+ISVs have the opportunity to help customers discover and migrate SaaS applications into Azure AD. They can also connect apps that use legacy authentication methods with Azure AD. This will help customers consolidate onto a single platform (Azure AD) to simplify their app management and enable them to implement Zero Trust principles. Supporting apps using legacy authentication makes their users more secure. This solution can be a great stop-gap until the customer modernizes their apps to support modern authentication protocols.
+
+## Solution overview
+
+The solution you build can include the following parts:
+
+1. **App discovery**. Often, customers aren't aware of all the applications they're using. So as a first step you can build application discovery capabilities into your solution and surface discovered applications in the user interface. This enables the customer to prioritize how they want to approach integrating their applications with Azure AD.
+2. **App migration**. Next you can create an in-product workflow where the customer can directly integrate apps with Azure AD without having to go to the Azure AD portal. If you don't implement discovery capabilities in your solution you can start your solution here, integrating the applications customers do know about with Azure AD.
+3. **Legacy authentication support**. You can connect apps using legacy authentication methods to Azure AD so that they get the benefits of single sign-on (SSO) and other features.
+4. **Conditional access**. As an additional feature, you can enable customers to apply Azure AD [Conditional Access](/azure/active-directory/conditional-access/overview/) policies to the applications from within your solution without having to go to the Azure AD portal.
+
+The rest of this guide explains the technical considerations and our recommendations for implementing a solution.
+
+## Publish your application to the Azure AD app gallery
+
+You can pre-integrate your application with Azure AD to support SSO and automated provisioning by following the process to [publish it in the Azure AD app gallery](/azure/active-directory/develop/v2-howto-app-gallery-listing/). The Azure AD app gallery is a trusted source of Azure AD compatible applications for IT admins. Applications listed there have been validated to be compatible with Azure AD. They support SSO, automate user provisioning, and can easily integrate into customer tenants with automated app registration.
+
+In addition, we recommend that you become a [verified publisher](/azure/active-directory/develop/publisher-verification-overview/) so that customers know you are the trusted publisher of the app.
+
+## Enable IT admin single sign-on
+
+You'll want to [choose either OIDC or SAML](/azure/active-directory/manage-apps/sso-options#choosing-a-single-sign-on-method/) to enable SSO for IT administrators to your solution.
+
+The best option is to use OIDC. Microsoft Graph uses [OIDC/OAuth](/azure/active-directory/develop/v2-protocols-oidc/). This means that if your solution uses OIDC with Azure AD for IT administrator SSO, then your customers will have a seamless end-to-end experience. They'll use OIDC to sign in to your solution and that same JSON Web Token (JWT) that was issued by Azure AD can then be used to interact with Microsoft Graph.
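+
+As an illustration, here's a hedged sketch of the OIDC sign-in request your solution might send to Azure AD (the redirect URI, scopes, and state value are placeholders, not values prescribed by this article):
+
+```https
+GET https://login.microsoftonline.com/{Tenant_ID}/oauth2/v2.0/authorize?
+    client_id={Client_ID}
+    &response_type=code
+    &redirect_uri=https%3A%2F%2Fyour-solution.example.com%2Fauth%2Fcallback
+    &response_mode=query
+    &scope=openid%20profile%20offline_access%20User.Read
+    &state={Random_State_Value}
+```
+
+The authorization code returned to the redirect URI is then exchanged at the v2.0 token endpoint for an ID token and an access token that can be used against Microsoft Graph.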
+
+If your solution is instead using [SAML](/azure/active-directory/manage-apps/configure-saml-single-sign-on/) for IT administrator SSO, the SAML token won't enable your solution to interact with Microsoft Graph. You can still use SAML for IT administrator SSO but your solution needs to support OIDC integration with Azure AD so it can get a JWT from Azure AD to properly interact with Microsoft Graph. You can use one of the following approaches:
+
+Recommended SAML Approach: Create a new registration in the Azure AD app gallery, which is [an OIDC app](/azure/active-directory/saas-apps/openidoauth-tutorial/). This provides the most seamless experience for your customer. They'll add both the SAML and OIDC apps to their tenant. If your application isn't in the Azure AD gallery today, you can start with a non-gallery [multi-tenant application](/azure/active-directory/develop/howto-convert-app-to-be-multi-tenant/).
+
+Alternate SAML Approach: Your customer can manually [create an OIDC application registration](/azure/active-directory/saas-apps/openidoauth-tutorial/) in their Azure AD tenant and ensure they set the right URIs, endpoints, and permissions specified later in this document.
+
+You'll want to use the [client_credentials grant type](/azure/active-directory/develop/v2-oauth2-client-creds-grant-flow#get-a-token/), which requires your solution to let the customer enter a client ID and secret into your user interface, and to store this information. Your solution then gets a JWT from Azure AD, which it can use to interact with Microsoft Graph.
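+
+For reference, here's a minimal sketch of the client_credentials token request, assuming the customer has already entered their client ID and secret into your solution (all placeholder values are illustrative):
+
+```https
+Method: POST
+Content-type: application/x-www-form-urlencoded
+
+https://login.microsoftonline.com/{Tenant_ID}/oauth2/v2.0/token
+
+client_id={Client_ID}
+&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
+&client_secret={Client_Secret}
+&grant_type=client_credentials
+```
+
+The access_token returned in the response is the JWT your solution presents to Microsoft Graph.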
+
+If you choose this route, you should have ready-made documentation for your customer about how to create this application registration within their Azure AD tenant, including the endpoints, URIs, and permissions required.
+
+> [!NOTE]
+> Before any applications can be used for either IT administrator or end-user sign-on, the customer's IT administrator will need to [consent to the application in their tenant](/azure/active-directory/manage-apps/grant-admin-consent/).
+
+## Authentication flows
+
+The solution will include three key authentication flows that support the following scenarios:
+
+1. The customer's IT administrator signs in with SSO to administer your solution.
+
+2. The customer's IT administrator uses your solution to integrate applications with Azure AD via Microsoft Graph.
+
+3. End-users sign into legacy applications secured by your solution and Azure AD.
+
+### Your customer's IT administrator does single sign-on to your solution
+
+Your solution can use either SAML or OIDC for SSO when the customer's IT administrator signs in. Either way, it's recommended that the IT administrator be able to sign in to your solution with their Azure AD credentials, which gives them a seamless experience and allows them to use the existing security controls they already have in place. Your solution should be integrated with Azure AD for SSO using either SAML or OIDC.
+
+![image diagram of the IT administrator being redirected by the solution to Azure AD to log in, and then being redirected by Azure AD back to the solution with a SAML token or JWT](./media/secure-hybrid-access-integrations/admin-flow.png)
+
+1. The IT administrator wants to sign in to your solution with their Azure AD credentials.
+
+2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
+
+3. Azure AD will authenticate the IT administrator and then send them back to your solution with either a SAML token or a JWT to be authorized within your solution.
+
+### The IT administrator integrates applications with Azure AD using your solution
+
+The second leg of the IT administrator journey will be to integrate applications with Azure AD by using your solution. To do this, your solution will use Microsoft Graph to create application registrations and Azure AD Conditional Access policies.
+
+Here is a diagram and summary of this user authentication flow:
+
+![image diagram of the IT administrator being redirected by the solution to Azure AD to log in, then being redirected by Azure AD back to the solution with a SAML token or JWT, and finally the solution making a call to Microsoft Graph with the JWT](./media/secure-hybrid-access-integrations/registration-flow.png)
++
+1. The IT administrator wants to sign in to your solution with their Azure AD credentials.
+
+2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
+
+3. Azure AD will authenticate the IT administrator and then send them back to your solution with either a SAML token or JWT for authorization within your solution.
+
+4. When an IT administrator wants to integrate one of their applications with Azure AD, rather than having to go to the Azure AD portal, your solution will call the Microsoft Graph with their existing JWT to register those applications or apply Azure AD Conditional Access policies to them.
+
+### End users sign in to the applications secured by your solution and Azure AD
+
+When end users need to sign in to individual applications secured with your solution and Azure AD, they use either OIDC or SAML. If the applications need to interact with Microsoft Graph or any Azure AD protected API for some reason, it's recommended that the individual applications you register through Microsoft Graph be configured to use OIDC. This ensures that the JWT they get from Azure AD to authenticate them into the applications can also be used to interact with Microsoft Graph. If there is no need for the individual applications to interact with Microsoft Graph or any Azure AD protected API, then SAML will suffice.
+
+Here is a diagram and summary of this user authentication flow:
+
+![image diagram of the end user being redirected by the solution to Azure AD to log in, then being redirected by Azure AD back to the solution with a SAML token or JWT, and finally the solution making a call to another application using the application's preferred authentication type](./media/secure-hybrid-access-integrations/end-user-flow.png)
+
+1. The end user wants to sign in to an application secured by your solution and Azure AD.
+2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
+3. Azure AD will authenticate the end user and then send them back to your solution with either a SAML token or JWT for authorization within your solution.
+4. Once authorized against your solution, your solution will then allow the original request to the application to go through using the preferred protocol of the application.
+
+## Summary of Microsoft Graph APIs you will use
+
+Your solution will need to use these APIs. Azure AD will allow you to configure either the delegated permissions or the application permissions. For this solution, you only need delegated permissions.
+
+[Application Templates API](/graph/application-saml-sso-configure-api#retrieve-the-gallery-application-template-identifier/): If you're interested in searching the Azure AD app gallery, you can use this API to find a matching application template. **Permission required** : Application.Read.All.
+
+[Application Registration API](/graph/api/application-post-applications): You'll use this API to create either OIDC or SAML application registrations so end users can sign in to the applications that the customers have secured with your solution. Doing this will enable these applications to also be secured with Azure AD. **Permissions required** : Application.Read.All, Application.ReadWrite.All
+
+[Service Principal API](/graph/api/serviceprincipal-update): After doing the app registration, you'll need to update the Service Principal Object to set some SSO properties. **Permissions required** : Application.ReadWrite.All, Directory.AccessAsUser.All, AppRoleAssignment.ReadWrite.All (for assignment)
+
+[Conditional Access API](/graph/api/resources/conditionalaccesspolicy): If you want to also apply Azure AD Conditional Access policies to these end-user applications, you can use this API to do so. **Permissions required** : Policy.Read.All, Policy.ReadWrite.ConditionalAccess, and Application.Read.All
+
+## Example Graph API scenarios
+
+This section provides a reference example for using Microsoft Graph APIs to implement application registrations, connect legacy applications, and enable conditional access policies via your solution. In addition, there is guidance on automating admin consent, getting the token signing certificate, and assigning users and groups. This functionality may be useful in your solution.
+
+### Use the Graph API to register apps with Azure AD
+
+#### Apps in the Azure AD app gallery
+
+Some of the applications your customer is using will already be available in the [Azure AD Application Gallery](https://azuremarketplace.microsoft.com/marketplace/apps). You can create a solution that programmatically adds these applications to the customer's tenant. The following is an example of using the Microsoft Graph API to search the Azure AD app gallery for a matching template and then registering the application in the customer's Azure AD tenant.
+
+Search the Azure AD app gallery for a matching application. When using the application templates API, the display name is case-sensitive.
+
+```http
+Authorization: Required with a valid Bearer token
+Method: GET
+
+https://graph.microsoft.com/v1.0/applicationTemplates?$filter=displayName eq 'Salesforce.com'
+```
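+
+A trimmed, illustrative fragment of the response (other properties omitted); the template `id` is the value to capture:
+
+```https
+{
+  "value": [
+    {
+      "id": "cd3ed3de-93ee-400b-8b19-b61ef44a0f29",
+      "displayName": "Salesforce.com"
+    }
+  ]
+}
+```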
+
+If a match is found from the prior API call, capture the ID and then make this API call while providing a user-friendly display name for the application in the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applicationTemplates/cd3ed3de-93ee-400b-8b19-b61ef44a0f29/instantiate
+{
+ "displayname": "Salesforce.com"
+}
+```
+
+When you make the above API call, Azure AD also generates a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
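+
+A trimmed, illustrative fragment of the instantiate response showing where those IDs appear (the values are placeholders); the object `id` values, not the `appId`, are what the later PATCH calls use:
+
+```https
+{
+  "application": {
+    "id": "{Application Object ID}",
+    "appId": "{Application (client) ID}"
+  },
+  "servicePrincipal": {
+    "id": "{Service Principal Object ID}",
+    "appId": "{Application (client) ID}"
+  }
+}
+```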
+
+Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27
+{
+ "preferredSingleSignOnMode":"saml",
+ "loginURL": "https://www.salesforce.com"
+}
+```
+
+And lastly, you'll want to patch the Application Object with the appropriate redirectUris and identifierUris:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-967116983792
+{
+ "web": {
+ "redirectUris":["https://www.salesforce.com"]},
+ "identifierUris":["https://www.salesforce.com"]
+}
+```
+
+#### Applications not in the Azure AD app gallery
+
+If you can't find a match in the Azure AD app gallery or you just want to integrate a custom application, then you have the option of registering a custom application in Azure AD using this template ID:
+
+**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
+
+And then make this API call while providing a user-friendly display name of the application in the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3dc5476c621/instantiate
+{
+ "displayname": "Custom SAML App"
+}
+```
+
+When you make the above API call, Azure AD also generates a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+
+Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27
+{
+ "preferredSingleSignOnMode":"saml",
+ "loginURL": "https://www.samlapp.com"
+}
+```
+
+And lastly, you'll want to patch the Application Object with the appropriate redirectUris and identifierUris:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-967116983792
+{
+ "web": {
+ "redirectUris":["https://www.samlapp.com"]},
+ "identifierUris":["https://www.samlapp.com"]
+}
+```
+
+#### Cut over to Azure AD single sign-on
+
+Once you have these SaaS applications registered inside Azure AD, the applications still need to be cut over to start using Azure AD as their identity provider. There are two ways to do this:
+
+1. If the applications support one-click SSO, then Azure AD can cut over the application for the customer. They just need to go into the Azure AD portal and perform the one-click SSO with the administrative credentials for the supported SaaS application. You can read about this in [one-click, SSO configuration of your Azure Marketplace application](/azure/active-directory/manage-apps/one-click-sso-tutorial/).
+2. If the application doesn't support one-click SSO, then the customer will need to manually cut over the application to start using Azure AD. You can learn more in the [SaaS App Integration Tutorials for use with Azure AD](/azure/active-directory/saas-apps/tutorial-list/).
+
+### Connect apps using legacy authentication methods to Azure AD
+
+Your solution can sit between Azure AD and the application and give the customer the benefits of single sign-on and other Azure Active Directory features, even for applications that aren't natively supported. To do so, your application calls Azure AD to authenticate the user and apply Azure AD Conditional Access policies before the user can access these applications with legacy protocols.
+
+You can enable customers to do this integration directly from your console so that discovery and integration are a seamless end-to-end experience. This involves your platform creating either a SAML or OIDC application registration between your platform and Azure AD.
+
+#### Create a SAML application registration
+
+Use the custom application template ID for this:
+
+**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
+
+And then make this API call while providing a user-friendly display name in the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3dc5476c621/instantiate
+{
+ "displayname": "Custom SAML App"
+}
+```
+
+When you make the above API call, Azure AD also generates a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+
+Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27
+{
+ "preferredSingleSignOnMode":"saml",
+ "loginURL": "https://www.samlapp.com"
+}
+```
+
+And lastly, you'll want to PATCH the Application Object with the appropriate redirectUris and identifierUris:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-967116983792
+{
+ "web": {
+ "redirectUris":["https://www.samlapp.com"]},
+ "identifierUris":["https://www.samlapp.com"]
+}
+```
+
+#### Create an OIDC application registration
+
+You should use the custom application template ID for this:
+
+**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
+
+And then make this API call while providing a user-friendly display name in the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3dc5476c621/instantiate
+{
+ "displayname": "Custom OIDC App"
+}
+```
+
+From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/applications/{Application Object ID}
+{
+ "web": {
+ "redirectUris":["https://www.samlapp.com"]},
+ "identifierUris":["[https://www.samlapp.com"],
+ "requiredResourceAccess": [
+ {
+ "resourceAppId": "00000003-0000-0000-c000-000000000000",
+ "resourceAccess": [
+ {
+ "id": "7427e0e9-2fba-42fe-b0c0-848c9e6a8182",
+ "type": "Scope"
+ },
+ {
+ "id": "e1fe6dd8-ba31-4d61-89e7-88639da4683d",
+ "type": "Scope"
+ },
+ {
+ "id": "37f7f235-527c-4136-accd-4a02d197296e",
+ "type": "Scope"
+ }]
+ }]
+}
+```
+
+> [!NOTE]
+> The API Permissions listed above within the resourceAccess node will grant the application access to OpenID, User.Read, and offline_access, which should be enough to get the user signed in to your solution. You can find more information on permissions on the [permissions reference page](/graph/permissions-reference/).
+
+### Apply conditional access policies
+
+We want to empower customers and partners to also use the Microsoft Graph API to create or apply Conditional Access policies to customers' applications. For partners, this can provide additional value so the customer can apply these policies directly from your solution without having to go to the Azure AD portal. You have two options when applying Azure AD Conditional Access policies:
+
+- You can assign the application to an existing Conditional Access Policy
+- You can create a new Conditional Access policy and assign the application to that new policy
+
+#### An existing conditional access policy
+
+First, query for a list of all Conditional Access policies and grab the Object ID of the policy you want to modify:
+
+```https
+Authorization: Required with a valid Bearer token
+Method:GET
+
+https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies
+```
+
+Next, PATCH the policy, including the Application Object ID in the includeApplications list within the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: PATCH
+
+https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies/{policyid}
+{
+ "displayName":"Existing CA Policy",
+ "state":"enabled",
+ "conditions":
+ {
+ "applications":
+ {
+ "includeApplications":[
+ "00000003-0000-0ff1-ce00-000000000000",
+ "{Application Object ID}"
+ ]
+ },
+ "users": {
+ "includeUsers":[
+ "All"
+ ]
+ }
+ },
+ "grantControls":
+ {
+ "operator":"OR",
+ "builtInControls":[
+ "mfa"
+ ]
+ }
+}
+```
+
+#### Create a new Azure AD conditional access policy
+
+Add the Application Object ID to the includeApplications list within the JSON body:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+
+https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies/
+{
+ "displayName":"New CA Policy",
+ "state":"enabled",
+ "conditions":
+ {
+ "applications": {
+ "includeApplications":[
+ "{Application Object ID}"
+ ]
+ },
+ "users": {
+ "includeUsers":[
+ "All"
+ ]
+ }
+ },
+ "grantControls": {
+ "operator":"OR",
+ "builtInControls":[
+ "mfa"
+ ]
+ }
+}
+```
+
+If you're interested in creating new Azure AD Conditional Access Policies, here are some additional templates that can help get you started using the [Conditional Access API](/azure/active-directory/conditional-access/howto-conditional-access-apis/).
+
+```https
+#Policy Template for Requiring Compliant Device
+
+{
+ "displayName":"Enforce Compliant Device",
+ "state":"enabled",
+ "conditions": {
+ "applications": {
+ "includeApplications":[
+ "{Application Object ID}"
+ ]
+ },
+ "users": {
+ "includeUsers":[
+ "All"
+ ]
+ }
+ },
+ "grantControls": {
+ "operator":"OR",
+ "builtInControls":[
+ "compliantDevice",
+ "domainJoinedDevice"
+ ]
+ }
+}
+
+#Policy Template for Block
+
+{
+ "displayName":"Block",
+ "state":"enabled",
+ "conditions": {
+ "applications": {
+ "includeApplications":[
+ "{Application Object ID}"
+ ]
+ },
+ "users": {
+ "includeUsers":[
+ "All"
+ ]
+ }
+ },
+ "grantControls": {
+ "operator":"OR",
+ "builtInControls":[
+ "block"
+ ]
+ }
+}
+```
+
+### Automate admin consent
+
+If the customer is onboarding numerous applications from your platform to Azure AD, you'll likely want to automate admin consent for them so they don't have to manually consent to lots of applications. This can also be done via Microsoft Graph. You'll need both the Service Principal Object ID of the application you created in previous API calls and the Service Principal Object ID of Microsoft Graph from the customer's tenant.
+
+You can get the Service Principal Object ID of Microsoft Graph by making this API call:
+
+```https
+Authorization: Required with a valid Bearer token
+Method:GET
+
+https://graph.microsoft.com/v1.0/serviceprincipals/?$filter=appid eq '00000003-0000-0000-c000-000000000000'&$select=id,appDisplayName
+```
+
+Then when you're ready to automate admin consent, you can make this API call:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/oauth2PermissionGrants
+{
+ "clientId":"{Service Principal Object ID of Application}",
+ "consentType":"AllPrincipals",
+ "principalId":null,
+ "resourceId":"{Service Principal Object ID Of MicrosofT Graph}",
+ "scope":"openid user.read offline_access}"
+}
+```
+
+### Get the token signing certificate
+
+To get the public portion of the token signing certificate for all these applications, you can GET it from the Azure AD metadata endpoint for the application:
+
+```https
+Method:GET
+
+https://login.microsoftonline.com/{Tenant_ID}/federationmetadata/2007-06/federationmetadata.xml?appid={Application_ID}
+```
+
+### Assign users and groups
+
+Once you've published the applications to Azure AD, you can optionally assign them to users and groups to ensure they show up on the [MyApplications](/azure/active-directory/user-help/my-applications-portal-workspaces/) portal. This assignment is stored on the Service Principal Object that was generated when you created the application:
+
+First, you'll want to get any AppRoles that the application may have associated with it. It's common for SaaS applications to have various AppRoles associated with them. For custom applications, there is typically just one default AppRole. Grab the ID of the AppRole you want to assign:
+
+```https
+Authorization: Required with a valid Bearer token
+Method:GET
+
+https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27
+```
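+
+The response includes an `appRoles` collection; here's an illustrative fragment (the GUID shown is a placeholder) highlighting the `id` you need:
+
+```https
+{
+  "appRoles": [
+    {
+      "id": "00000000-0000-0000-0000-000000000000",
+      "displayName": "User",
+      "isEnabled": true,
+      "value": "User"
+    }
+  ]
+}
+```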
+
+Next, you'll want to get the Object ID of the user or group from Azure AD that you want to assign to the application. Then take the App Role ID from the previous API call and submit it, together with those IDs, in the body of a POST to the appRoleAssignedTo relationship of the Service Principal:
+
+```https
+Authorization: Required with a valid Bearer token
+Method: POST
+Content-type: application/json
+
+https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27/appRoleAssignedTo
+{
+ "principalId":"{Principal Object ID of User -or- Group}",
+ "resourceId":"{Service Principal Object ID}",
+ "appRoleId":"{App Role ID}"
+}
+```
+
+## Existing partners
+
+Microsoft has existing partnerships with these third-party providers to protect legacy applications while using existing networking and delivery controllers.
+
+| **ADC provider** | **Link** |
+| --- | --- |
+| Akamai Enterprise Application Access (EAA) | [https://docs.microsoft.com/azure/active-directory/saas-apps/akamai-tutorial](/azure/active-directory/saas-apps/akamai-tutorial) |
+| Citrix Application Delivery Controller (ADC) | [https://docs.microsoft.com/azure/active-directory/saas-apps/citrix-netscaler-tutorial](/azure/active-directory/saas-apps/citrix-netscaler-tutorial) |
+| F5 Big-IP APM | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-integration](/azure/active-directory/manage-apps/f5-aad-integration) |
+| Kemp | [https://docs.microsoft.com/azure/active-directory/saas-apps/kemp-tutorial](/azure/active-directory/saas-apps/kemp-tutorial) |
+| Pulse Secure Virtual Traffic Manager (VTM) | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial](/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial) |
+
+The following VPN solution providers connect with Azure AD to enable modern authentication and authorization methods like SSO and multi-factor authentication.
+
+| **VPN vendor** | **Link** |
+| --- | --- |
+| Cisco AnyConnect | [https://docs.microsoft.com/azure/active-directory/saas-apps/cisco-anyconnect](/azure/active-directory/saas-apps/cisco-anyconnect) |
+| Fortinet | [https://docs.microsoft.com/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial](/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial) |
+| F5 Big-IP APM | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-password-less-vpn](/azure/active-directory/manage-apps/f5-aad-password-less-vpn) |
+| Palo Alto Networks Global Protect | [https://docs.microsoft.com/azure/active-directory/saas-apps/paloaltoadmin-tutorial](/azure/active-directory/saas-apps/paloaltoadmin-tutorial) |
+| Pulse Secure Pulse Connect Secure (PCS) | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial](/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial) |
+
+The following SDP solution providers connect with Azure AD to enable modern authentication and authorization methods like SSO and multi-factor authentication.
+
+| **SDP vendor** | **Link** |
+| --- | --- |
+| Datawiza Access Broker | [https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso](/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso) |
+| Perimeter 81 | [https://docs.microsoft.com/azure/active-directory/saas-apps/perimeter-81-tutorial](/azure/active-directory/saas-apps/perimeter-81-tutorial) |
+| Silverfort Authentication Platform | [https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso](/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso) |
+| Strata | [https://docs.microsoft.com/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial](/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial) |
+| Zscaler Private Access (ZPA) | [https://docs.microsoft.com/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial](/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial) |
active-directory Secure Hybrid Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/secure-hybrid-access.md
You can bridge the gap and strengthen your security posture across all applicati
## Secure hybrid access through Azure AD Application Proxy
-Using [Application Proxy](https://docs.microsoft.com/azure/active-directory/app-proxy/what-is-application-proxy) you can provide [secure remote access](https://docs.microsoft.com/azure/active-directory/app-proxy/application-proxy-add-on-premises-application) to your on-premises web applications. Your users donΓÇÖt need to use a VPN. Users benefit by easily connecting to their applications from any device after a [SSO](https://docs.microsoft.com/azure/active-directory/app-proxy/application-proxy-config-sso-how-to#how-to-configure-single-sign-on). Application Proxy provides remote access as a service and allows you to [easily publish your on-premise applications](https://docs.microsoft.com/azure/active-directory/app-proxy/application-proxy-add-on-premises-application) to users outside the corporate network. It helps you scale your cloud access management without requiring you to modify your on-premises applications. [Plan an Azure AD Application Proxy](https://docs.microsoft.com/azure/active-directory/app-proxy/application-proxy-deployment-plan) deployment as a next step.
+Using [Application Proxy](../app-proxy/what-is-application-proxy.md), you can provide [secure remote access](../app-proxy/application-proxy-add-on-premises-application.md) to your on-premises web applications. Your users don't need to use a VPN. Users benefit by easily connecting to their applications from any device after [SSO](../app-proxy/application-proxy-config-sso-how-to.md#how-to-configure-single-sign-on). Application Proxy provides remote access as a service and allows you to [easily publish your on-premises applications](../app-proxy/application-proxy-add-on-premises-application.md) to users outside the corporate network. It helps you scale your cloud access management without requiring you to modify your on-premises applications. [Plan an Azure AD Application Proxy](../app-proxy/application-proxy-deployment-plan.md) deployment as a next step.
## Secure hybrid access through Azure AD partner integrations
In addition to [Azure AD Application Proxy](https://aka.ms/whyappproxy), Microso
The following partners offer pre-built solutions to support conditional access policies per application and provide detailed guidance for integrating with Azure AD. -- [Akamai Enterprise Application Access](https://docs.microsoft.com/azure/active-directory/saas-apps/akamai-tutorial)
+- [Akamai Enterprise Application Access](../saas-apps/akamai-tutorial.md)
-- [Citrix Application Delivery Controller (ADC)](https://docs.microsoft.com/azure/active-directory/saas-apps/citrix-netscaler-tutorial)
+- [Citrix Application Delivery Controller (ADC)](../saas-apps/citrix-netscaler-tutorial.md)
- [Datawiza Access Broker](datawiza-with-azure-ad.md) -- [F5 Big-IP APM ADC](https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-integration)
+- [F5 Big-IP APM ADC](../manage-apps/f5-aad-integration.md)
-- [F5 Big-IP APM VPN](https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-password-less-vpn)
+- [F5 Big-IP APM VPN](../manage-apps/f5-aad-password-less-vpn.md)
-- [Kemp](https://docs.microsoft.com/azure/active-directory/saas-apps/kemp-tutorial)
+- [Kemp](../saas-apps/kemp-tutorial.md)
-- [Perimeter 81](https://docs.microsoft.com/azure/active-directory/saas-apps/perimeter-81-tutorial)
+- [Perimeter 81](../saas-apps/perimeter-81-tutorial.md)
-- [Silverfort Authentication Platform](https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso)
+- [Silverfort Authentication Platform](../manage-apps/add-application-portal-setup-oidc-sso.md)
-- [Strata](https://docs.microsoft.com/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial)
+- [Strata](../saas-apps/maverics-identity-orchestrator-saml-connector-tutorial.md)
The following partners offer pre-built solutions and detailed guidance for integrating with Azure AD. -- [Cisco AnyConnect](https://docs.microsoft.com/azure/active-directory/saas-apps/cisco-anyconnect)
+- [Cisco AnyConnect](../saas-apps/cisco-anyconnect.md)
-- [Fortinet](https://docs.microsoft.com/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial)
+- [Fortinet](../saas-apps/fortigate-ssl-vpn-tutorial.md)
-- [Palo Alto Networks Global Protect](https://docs.microsoft.com/azure/active-directory/saas-apps/paloaltoadmin-tutorial)
+- [Palo Alto Networks Global Protect](../saas-apps/paloaltoadmin-tutorial.md)
-- [Pulse Secure Pulse Connect Secure (PCS)](https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial)
+- [Pulse Secure Pulse Connect Secure (PCS)](../saas-apps/pulse-secure-pcs-tutorial.md)
-- [Pulse Secure Virtual Traffic Manager (VTM)](https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial)
+- [Pulse Secure Virtual Traffic Manager (VTM)](../saas-apps/pulse-secure-virtual-traffic-manager-tutorial.md)
-- [Zscaler Private Access (ZPA)](https://docs.microsoft.com/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial)
+- [Zscaler Private Access (ZPA)](../saas-apps/zscalerprivateaccess-tutorial.md)
active-directory Managed Identities Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/managed-identities-faq.md
You can find the list of resources that have a system-assigned managed identity
```azurecli
az resource list --query "[?identity.type=='SystemAssigned'].{Name:name, principalId:identity.principalId}" --output table
```
+### What Azure RBAC permissions are required to work with managed identities?
-### What Azure RBAC permissions are required to managed identity on a resource?
--- System-assigned managed identity: You need write permissions over the resource. For example, for virtual machines you need Microsoft.Compute/virtualMachines/write. This action is included in resource specific built-in roles like [Virtual Machine Contributor](../../role-based-access-control/built-in-roles.md#virtual-machine-contributor).-- User-assigned managed identity: You need write permissions over the resource. For example, for virtual machines you need Microsoft.Compute/virtualMachines/write. In addition to [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) role assignment over the managed identity.
+- System-assigned managed identity: You need write permissions over the resource. For example, for virtual machines you need `Microsoft.Compute/virtualMachines/write`. This action is included in resource specific built-in roles like [Virtual Machine Contributor](../../role-based-access-control/built-in-roles.md#virtual-machine-contributor).
+- Assigning user-assigned managed identities to resources: You need write permissions over the resource. For example, for virtual machines you need `Microsoft.Compute/virtualMachines/write`. You will also need the `Microsoft.ManagedIdentity/userAssignedIdentities/*/assign/action` action over the user-assigned identity. This action is included in the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) built-in role.
+- Managing user-assigned identities: To create or delete user-assigned managed identities, you need the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
+- Managing role assignments for managed identities: You need the [Owner](../../role-based-access-control/built-in-roles.md#all) or [User Access Administrator](../../role-based-access-control/built-in-roles.md#all) role assignment over the resource to which you're granting access. You will need the [Reader](../../role-based-access-control/built-in-roles.md#all) role assignment to the resource with a system-assigned identity, or to the user-assigned identity that is being given the role assignment. If you do not have read access, you can search by "User, group, or service principal" to find the identity's backing service principal, instead of searching by managed identity while adding the role assignment. [Read more about assigning Azure roles](../../role-based-access-control/role-assignments-portal.md).
### How do I prevent the creation of user-assigned managed identities?
active-directory Powershell For Azure Ad Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/privileged-identity-management/powershell-for-azure-ad-roles.md
This article contains instructions for using Azure Active Directory (Azure AD) P
![Find the organization ID in the properties for the Azure AD organization](./media/powershell-for-azure-ad-roles/tenant-id-for-Azure-ad-org.png) > [!Note]
-> The following sections are simple examples that can help get you up and running. You can find more detailed documentation regarding the following cmdlets at [https://docs.microsoft.com/powershell/module/azuread/?view=azureadps-2.0-preview&preserve-view=true#privileged_role_management](/powershell/module/azuread/?view=azureadps-2.0-preview&preserve-view=true#privileged_role_management). However, you must replace "azureResources" in the providerID parameter with "aadRoles". You will also need to remember to use the Tenant ID for your Azure AD organization as the resourceId parameter.
+> The following sections are simple examples that can help get you up and running. You can find more detailed documentation regarding the following cmdlets at [/powershell/module/azuread/?view=azureadps-2.0-preview&preserve-view=true#privileged_role_management](/powershell/module/azuread/?view=azureadps-2.0-preview&preserve-view=true#privileged_role_management). However, you must replace "azureResources" in the providerID parameter with "aadRoles". You will also need to remember to use the Tenant ID for your Azure AD organization as the resourceId parameter.
## Retrieving role definitions
active-directory List Role Assignments Users https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/list-role-assignments-users.md
A role can be assigned to a user directly or transitively via a group. This arti
- AzureADPreview module when using PowerShell - Microsoft.Graph module when using PowerShell-- Admin consent when using Graph explorer for Microsoft Graph API
+- Admin consent when using Graph Explorer for Microsoft Graph API
For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
Follow these steps to list Azure AD roles assigned to a user using the Microsoft
* [List Azure AD role assignments](view-assignments.md). * [Assign Azure AD roles to users](manage-roles-portal.md).
-* [Assign Azure AD roles to groups](groups-assign-role.md)
+* [Assign Azure AD roles to groups](groups-assign-role.md)
active-directory Adobe Echosign Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/adobe-echosign-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Adobe Sign | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Adobe Sign | Microsoft Docs'
description: Learn how to configure single sign-on between Azure Active Directory and Adobe Sign.
Previously updated : 01/19/2021 Last updated : 09/08/2021
-# Tutorial: Azure Active Directory integration with Adobe Sign
+# Tutorial: Azure AD SSO integration with Adobe Sign
In this tutorial, you'll learn how to integrate Adobe Sign with Azure Active Directory (Azure AD). When you integrate Adobe Sign with Azure AD, you can:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Adobe Sign supports **SP** initiated SSO
+* Adobe Sign supports **SP** initiated SSO.
## Add Adobe Sign from the gallery
To configure and test Azure AD single sign-on with Adobe Sign, you need to perfo
1. **[Create Adobe Sign test user](#create-adobe-sign-test-user)** - to have a counterpart of Britta Simon in Adobe Sign that is linked to the Azure AD representation of user. 1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
In this section, you enable Azure AD single sign-on in the Azure portal.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected. 1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure Adobe Sign SSO
+## Configure Adobe Sign SSO
-1. Before configuration, contact the [Adobe Sign Client support team](https://helpx.adobe.com/in/contact/support.html) to add your domain in the Adobe Sign allow list. Here's how to add the domain:
+1. Before configuration, contact the [Adobe Sign Client support team](https://helpx.adobe.com/in/contact/support.html) to add your domain in the Adobe Sign allowlist. Here's how to add the domain:
a. The [Adobe Sign Client support team](https://helpx.adobe.com/in/contact/support.html) sends you a randomly generated token. For your domain, the token will be like the following: **adobe-sign-verification= xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx**
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the SAML menu, select **Account Settings** > **SAML Settings**.
- ![Screenshot of Adobe Sign SAML Settings page](./media/adobe-echosign-tutorial/settings.png "Account")
+ ![Screenshot of Adobe Sign SAML Settings page.](./media/adobe-echosign-tutorial/settings.png "Account")
1. In the **SAML Settings** section, perform the following steps:
- ![Screenshot that highlights the SAML settings, including SAML Mandatory.](./media/adobe-echosign-tutorial/saml1.png "SAML Settings")
+ ![Screenshot that highlights the SAML settings, including SAML Mandatory.](./media/adobe-echosign-tutorial/profile.png "SAML Settings")
- ![Screenshot of SAML Settings](./media/adobe-echosign-tutorial/saml.png "SAML Settings")
+ ![Screenshot of SAML Settings.](./media/adobe-echosign-tutorial/certificate.png "SAML Settings")
a. Under **SAML Mode**, select **SAML Mandatory**.
active-directory Appneta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/appneta-tutorial.md
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Select a single sign-on method** page, select **SAML**. 1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](./media/appneta-tutorial/edit-urls.png)
1. On the **Basic SAML Configuration** section, enter the values for the following fields:

    a. In the **Sign on URL** text box, type a URL using the following pattern: `https://<subdomain>.pm.appneta.com`
+ b. In the Reply URL (Assertion Consumer Service URL) field, enter:
+ `https://sso.connect.pingidentity.com/sso/sp/ACS.saml2`
+ > [!NOTE]
- > The Sign-on URL value is not real. Update this value with the actual Sign-On URL. Contact [AppNeta Performance Manager Client support team](mailto:support@appneta.com) to get this value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > The Sign-on URL value above is an example. Update this value with the actual Sign-On URL. Contact [AppNeta Performance Manager customer support team](mailto:support@appneta.com) to get this value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
-1. AppNeta Performance Manager application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+1. AppNeta Performance Manager application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes:
- ![image](common/edit-attribute.png)
+ ![Screenshot that shows the default attributes for a SAML token.](./media/appneta-tutorial/edit-attribute.png)
-1. In addition to above, AppNeta Performance Manager application expects few more attributes to be passed back in SAML response which are shown below. These attributes are also pre populated but you can review them as per your requirement.
+1. In addition to the above, the AppNeta Performance Manager application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also prepopulated, but you can review them as per your requirement.
| Name | Source Attribute |
| ---- | ---------------- |
Follow these steps to enable Azure AD SSO in the Azure portal.
| | | > [!NOTE]
- > **groups** refers to the security group in Appneta which is mapped to a **Role** in Azure AD. Please refer to [this](../develop/howto-add-app-roles-in-azure-ad-apps.md#app-roles-ui) doc which explains how to create custom roles in Azure AD.
+ > **groups** refers to the security group in AppNeta Performance Manager that is mapped to a **Role** in Azure AD. For more information, see [App roles UI](../develop/howto-add-app-roles-in-azure-ad-apps.md#app-roles-ui), which explains how to create custom roles in Azure AD. Rather than creating custom roles, most customers add a group claim in the AppNeta enterprise application for security groups with the source attribute group ID. To add a group claim:
- 1. Click **Add new claim** to open the **Manage user claims** dialog.
+ 1. Click **Edit** on **User Attributes & Claims**.
- 1. In the **Name** textbox, type the attribute name shown for that row.
+ 1. Click **Add a group claim** at the top of the page.
- 1. Leave the **Namespace** blank.
+ ![Screenshot that shows the Attributes & Claims pane with the add a group claim option selected.](./media/appneta-tutorial/add-a-group-claim.png)
- 1. Select Source as **Attribute**.
+ 1. Select **Security groups**.
- 1. From the **Source attribute** list, type the attribute value shown for that row.
+ 1. Set **Source attribute** as "Group ID".
- 1. Click **Ok**
+ 1. Under **Advanced options**, select **Customize the name of the group claim** and enter "groups" in the **Name** field:
- 1. Click **Save**.
+ ![Screenshot that shows the Group Claims pane with security groups, source attribute, and advanced options selected.](./media/appneta-tutorial/specify-security-groups.png)
-1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+ 1. Click **Save**. This will send Group Object IDs of users when they sign into AppNeta Performance Manager via SSO. Role mappings should be configured using these object IDs and the relevant user role in AppNeta Performance Manager.
- ![The Certificate download link](common/metadataxml.png)
+ ![Screenshot that shows the details of a group claim, with the object ID selected.](./media/appneta-tutorial/object-id.png)
+
+ ![Screenshot that shows the Edit Identity Provider pane, with the security group number selected. ](./media/appneta-tutorial/edit-identity-provider.png)
-1. On the **Set up AppNeta Performance Manager** section, copy the appropriate URL(s) based on your requirement.
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
- ![Copy configuration URLs](common/copy-configuration-urls.png)
+ ![The Certificate download link](common/metadataxml.png)
### Create an Azure AD test user
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the app's overview page, find the **Manage** section and select **Users and groups**. 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog. 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you have setup the roles as explained in the above, you can select it from the **Select a role** dropdown.
+1. If you have set up the roles as explained above, you can select one from the **Select a role** dropdown.
1. In the **Add Assignment** dialog, click the **Assign** button.
+ > [!NOTE]
+ > In practice, you'll add groups to the application rather than individual users.
+ ## Configure AppNeta Performance Manager SSO
-To configure single sign-on on **AppNeta Performance Manager** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [AppNeta Performance Manager support team](mailto:support@appneta.com). They set this setting to have the SAML SSO connection set properly on both sides.
+To configure single sign-on on **AppNeta Performance Manager** side, you need to send the downloaded **Federation Metadata XML** to [AppNeta Performance Manager support team](mailto:support@appneta.com). They set this setting to have the SAML SSO connection set properly on both sides.
### Create AppNeta Performance Manager test user
-In this section, a user called Britta Simon is created in AppNeta Performance Manager. AppNeta Performance Manager supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in AppNeta Performance Manager, a new one is created after authentication.
+In this section, a user called B.Simon is created in AppNeta Performance Manager. AppNeta Performance Manager supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in AppNeta Performance Manager, a new one is created after authentication.
> [!Note] > If you need to create a user manually, contact [AppNeta Performance Manager support team](mailto:support@appneta.com).
In this section, a user called Britta Simon is created in AppNeta Performance Ma
In this section, you test your Azure AD single sign-on configuration with following options. -- Click on **Test this application** in Azure portal. This will redirect to AppNeta Performance Manager Sign-on URL where you can initiate the login flow.
+- In the Azure portal, select **Test this application**. This will redirect to AppNeta Performance Manager Sign-on URL, where you can initiate the login flow.
- Go to AppNeta Performance Manager Sign-on URL directly and initiate the login flow from there. -- You can use Microsoft My Apps. When you click the AppNeta Performance Manager tile in the My Apps, this will redirect to AppNeta Performance Manager Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+- You can use Microsoft My Apps. When you click the AppNeta Performance Manager tile in the My Apps portal, this will redirect to AppNeta Performance Manager Sign-on URL. For more information about the My Apps portal, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
## Next steps
-Once you configure AppNeta Performance Manager you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+After you configure AppNeta Performance Manager, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
active-directory Freshservice Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/freshservice-provisioning-tutorial.md
This tutorial describes the steps you need to perform in both Freshservice Provi
The scenario outlined in this tutorial assumes that you already have the following prerequisites:
-* [An Azure AD tenant](https://docs.microsoft.com/azure/active-directory/develop/quickstart-create-new-tenant)
-* A user account in Azure AD with [permission](https://docs.microsoft.com/azure/active-directory/users-groups-roles/directory-assign-admin-roles) to configure provisioning (for example, Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
+* [An Azure AD tenant](../develop/quickstart-create-new-tenant.md)
+* A user account in Azure AD with [permission](../users-groups-roles/directory-assign-admin-roles.md) to configure provisioning (for example, Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
* A [Freshservice account](https://www.freshservice.com) with the Organizational Admin permissions.

## Step 1. Plan your provisioning deployment
-1. Learn about [how the provisioning service works](https://docs.microsoft.com/azure/active-directory/manage-apps/user-provisioning).
-2. Determine who will be in [scope for provisioning](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
-3. Determine what data to [map between Azure AD and Freshservice Provisioning](https://docs.microsoft.com/azure/active-directory/manage-apps/customize-application-attributes).
+1. Learn about [how the provisioning service works](../manage-apps/user-provisioning.md).
+2. Determine who will be in [scope for provisioning](../manage-apps/define-conditional-rules-for-provisioning-user-accounts.md).
+3. Determine what data to [map between Azure AD and Freshservice Provisioning](../manage-apps/customize-application-attributes.md).
## Step 2. Configure Freshservice Provisioning to support provisioning with Azure AD
The scenario outlined in this tutorial assumes that you already have the followi
## Step 3. Add Freshservice Provisioning from the Azure AD application gallery
-Add Freshservice Provisioning from the Azure AD application gallery to start managing provisioning to Freshservice Provisioning. Learn more about adding an application from the gallery [here](https://docs.microsoft.com/azure/active-directory/manage-apps/add-gallery-app).
+Add Freshservice Provisioning from the Azure AD application gallery to start managing provisioning to Freshservice Provisioning. Learn more about adding an application from the gallery [here](../manage-apps/add-gallery-app.md).
## Step 4. Define who will be in scope for provisioning
-The Azure AD provisioning service allows you to scope who will be provisioned based on assignment to the application and or based on attributes of the user. If you choose to scope who will be provisioned to your app based on assignment, you can use the following [steps](../manage-apps/assign-user-or-group-access-portal.md) to assign users to the application. If you choose to scope who will be provisioned based solely on attributes of the user, you can use a scoping filter as described [here](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
+The Azure AD provisioning service allows you to scope who will be provisioned based on assignment to the application and/or based on attributes of the user. If you choose to scope who will be provisioned to your app based on assignment, you can use the following [steps](../manage-apps/assign-user-or-group-access-portal.md) to assign users to the application. If you choose to scope who will be provisioned based solely on attributes of the user, you can use a scoping filter as described [here](../manage-apps/define-conditional-rules-for-provisioning-user-accounts.md).
-* When assigning users to Freshservice Provisioning, you must select a role other than **Default Access**. Users with the Default Access role are excluded from provisioning and will be marked as not effectively entitled in the provisioning logs. If the only role available on the application is the default access role, you can [update the application manifest](https://docs.microsoft.com/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps) to add additional roles.
+* When assigning users to Freshservice Provisioning, you must select a role other than **Default Access**. Users with the Default Access role are excluded from provisioning and will be marked as not effectively entitled in the provisioning logs. If the only role available on the application is the default access role, you can [update the application manifest](../develop/howto-add-app-roles-in-azure-ad-apps.md) to add additional roles.
-* Start small. Test with a small set of users before rolling out to everyone. When scope for provisioning is set to assigned users, you can control this by assigning one or two users to the app. When scope is set to all users and groups, you can specify an [attribute based scoping filter](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
+* Start small. Test with a small set of users before rolling out to everyone. When scope for provisioning is set to assigned users, you can control this by assigning one or two users to the app. When scope is set to all users and groups, you can specify an [attribute based scoping filter](../manage-apps/define-conditional-rules-for-provisioning-user-accounts.md).
## Step 5. Configure automatic user provisioning to Freshservice Provisioning
This section guides you through the steps to configure the Azure AD provisioning
8. Under the **Mappings** section, select **Synchronize Azure Active Directory Users to Freshservice Provisioning**.
-9. Review the user attributes that are synchronized from Azure AD to Freshservice Provisioning in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in Freshservice Provisioning for update operations. If you choose to change the [matching target attribute](https://docs.microsoft.com/azure/active-directory/manage-apps/customize-application-attributes), you will need to ensure that the Freshservice Provisioning API supports filtering users based on that attribute. Select the **Save** button to commit any changes.
+9. Review the user attributes that are synchronized from Azure AD to Freshservice Provisioning in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in Freshservice Provisioning for update operations. If you choose to change the [matching target attribute](../manage-apps/customize-application-attributes.md), you will need to ensure that the Freshservice Provisioning API supports filtering users based on that attribute. Select the **Save** button to commit any changes.
|Attribute|Type|Supported For Filtering|
|---|---|---|
This operation starts the initial synchronization cycle of all users defined in
## Step 6. Monitor your deployment

Once you've configured provisioning, use the following resources to monitor your deployment:
-1. Use the [provisioning logs](https://docs.microsoft.com/azure/active-directory/reports-monitoring/concept-provisioning-logs) to determine which users have been provisioned successfully or unsuccessfully
-2. Check the [progress bar](https://docs.microsoft.com/azure/active-directory/app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user) to see the status of the provisioning cycle and how close it is to completion
-3. If the provisioning configuration seems to be in an unhealthy state, the application will go into quarantine. Learn more about quarantine states [here](https://docs.microsoft.com/azure/active-directory/manage-apps/application-provisioning-quarantine-status).
+1. Use the [provisioning logs](../reports-monitoring/concept-provisioning-logs.md) to determine which users have been provisioned successfully or unsuccessfully
+2. Check the [progress bar](../app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user.md) to see the status of the provisioning cycle and how close it is to completion
+3. If the provisioning configuration seems to be in an unhealthy state, the application will go into quarantine. Learn more about quarantine states [here](../manage-apps/application-provisioning-quarantine-status.md).
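For the provisioning-log check in step 1 above, you can also query from the command line if you route Azure AD provisioning logs to a Log Analytics workspace through the Azure Monitor integration. The following is a minimal sketch only: it assumes the `AADProvisioningLogs` table is populated in your workspace, that the `az monitor log-analytics` command is available in your CLI (it may require the log-analytics extension), and the workspace GUID is a placeholder.

```azurecli
# Sketch only: list recent provisioning events from a Log Analytics workspace
# that receives Azure AD provisioning logs. The workspace GUID is a placeholder,
# and any further filtering depends on your table schema.
az monitor log-analytics query \
  --workspace "00000000-0000-0000-0000-000000000000" \
  --analytics-query "AADProvisioningLogs | where TimeGenerated > ago(1d) | take 50" \
  --output table
```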
## Additional resources
active-directory Github Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/github-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with a GitHub Enterprise Cloud Organization | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with a GitHub Enterprise Cloud Organization | Microsoft Docs'
description: Learn how to configure single sign-on between Azure Active Directory and a GitHub Enterprise Cloud Organization.
Previously updated : 12/24/2020 Last updated : 09/08/2021
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with a GitHub Enterprise Cloud Organization
+# Tutorial: Azure AD SSO integration with a GitHub Enterprise Cloud Organization
In this tutorial, you'll learn how to integrate a GitHub Enterprise Cloud **Organization** with Azure Active Directory (Azure AD). When you integrate a GitHub Enterprise Cloud Organization with Azure AD, you can:
In this tutorial, you'll learn how to integrate a GitHub Enterprise Cloud **Orga
## Prerequisites
-To configure Azure AD integration with a GitHub Enterprise Cloud Organization, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* A GitHub organization created in [GitHub Enterprise Cloud](https://help.github.com/articles/github-s-products/#github-enterprise), which requires the [GitHub Enterprise billing plan](https://help.github.com/articles/github-s-billing-plans/#billing-plans-for-organizations)
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* A GitHub organization created in [GitHub Enterprise Cloud](https://help.github.com/articles/github-s-products/#github-enterprise), which requires the [GitHub Enterprise billing plan](https://help.github.com/articles/github-s-billing-plans/#billing-plans-for-organizations).
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* GitHub supports **SP** initiated SSO
-
-* GitHub supports [**Automated** user provisioning (organization invitations)](github-provisioning-tutorial.md)
+* GitHub supports **SP** initiated SSO.
+* GitHub supports [**Automated** user provisioning (organization invitations)](github-provisioning-tutorial.md).
## Adding GitHub from the gallery
Follow these steps to enable Azure AD SSO in the Azure portal.
c. In the **Sign on URL** text box, type a URL using the following pattern:
`https://github.com/orgs/<Organization ID>/sso`

- > [!NOTE]
- > Please note that these are not the real values. You have to update these values with the actual Sign on URL, Identifier and Reply URL. Here we suggest you to use the unique value of string in the Identifier. Go to GitHub Admin section to retrieve these values.
+ > Please note that these are not the real values. You have to update these values with the actual Identifier, Reply URL, and Sign on URL. We suggest using a unique string value for the Identifier. Go to the GitHub Admin section to retrieve these values.
5. Your GitHub application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes, whereas **Unique User Identifier (Name ID)** is mapped with **user.userprincipalname**. The GitHub application expects **Unique User Identifier (Name ID)** to be mapped with **user.mail**, so you need to edit the attribute mapping by clicking the **Edit** icon and change the attribute mapping.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)

-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
3. Check the **Enable SAML authentication** box, revealing the Single Sign-on configuration fields, perform the following steps:
- ![Screenshot that shows the "S A M L single sign-on" section with "Enable S A M L authentication" with U R L text boxes highlighted.](./media/github-tutorial/saml-sso.png)
+ ![Screenshot that shows the "S A M L single sign-on" section with "Enable S A M L authentication" with U R L text boxes highlighted.](./media/github-tutorial/authentication.png)
a. Copy **single sign-on URL** value and paste this value into the **Sign on URL** text box in the **Basic SAML Configuration** in the Azure portal.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
e. Update the **assertion consumer service URL (Reply URL)** from the default URL so that it the URL in GitHub matches the URL in the Azure app registration.
- ![image](./media/github-tutorial/tutorial_github_sha.png)
+ ![Screenshot that shows the image.](./media/github-tutorial/certificate.png)
5. Click on **Test SAML configuration** to confirm that no validation failures or errors during SSO.
- ![Settings](./media/github-tutorial/test.png)
+ ![Screenshot that shows the Settings.](./media/github-tutorial/test.png)
6. Click **Save**
The objective of this section is to create a user called Britta Simon in GitHub.
3. Click **Invite member**.
- ![Invite Users](./media/github-tutorial/invite-member.png "Invite Users")
+ ![Screenshot that shows the Invite Users.](./media/github-tutorial/invite-member.png "Invite Users")
4. On the **Invite member** dialog page, perform the following steps:

   a. In the **Email** textbox, type the email address of the Britta Simon account.
- ![Invite People](./media/github-tutorial/email-box.png "Invite People")
+ ![Screenshot that shows the Invite People.](./media/github-tutorial/email-box.png "Invite People")
b. Click **Send Invitation**.
active-directory Paloaltoadmin Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/paloaltoadmin-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Palo Alto Networks - Admin UI | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Palo Alto Networks - Admin UI | Microsoft Docs'
description: Learn how to configure single sign-on between Azure Active Directory and Palo Alto Networks - Admin UI.
Previously updated : 09/10/2020 Last updated : 09/08/2021
-# Tutorial: Azure Active Directory integration with Palo Alto Networks - Admin UI
+# Tutorial: Azure AD SSO integration with Palo Alto Networks - Admin UI
-In this tutorial, you learn how to integrate Palo Alto Networks - Admin UI with Azure Active Directory (Azure AD).
-Integrating Palo Alto Networks - Admin UI with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Palo Alto Networks - Admin UI with Azure Active Directory (Azure AD). When you integrate Palo Alto Networks - Admin UI with Azure AD, you can:
-* You can control in Azure AD who has access to Palo Alto Networks - Admin UI.
-* You can enable your users to be automatically signed-in to Palo Alto Networks - Admin UI (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
+* Control in Azure AD who has access to Palo Alto Networks - Admin UI.
+* Enable your users to be automatically signed-in to Palo Alto Networks - Admin UI with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Palo Alto Networks - Admin UI, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Palo Alto Networks - Admin UI single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Palo Alto Networks - Admin UI single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Palo Alto Networks - Admin UI supports **SP** initiated SSO
-* Palo Alto Networks - Admin UI supports **Just In Time** user provisioning
+* Palo Alto Networks - Admin UI supports **SP** initiated SSO.
+* Palo Alto Networks - Admin UI supports **Just In Time** user provisioning.
## Adding Palo Alto Networks - Admin UI from the gallery
To configure the integration of Palo Alto Networks - Admin UI into Azure AD, you
1. In the **Add from the gallery** section, type **Palo Alto Networks - Admin UI** in the search box.
1. Select **Palo Alto Networks - Admin UI** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD SSO
+## Configure and test Azure AD SSO for Palo Alto Networks - Admin UI
In this section, you configure and test Azure AD single sign-on with Palo Alto Networks - Admin UI based on a test user called **B.Simon**. For single sign-on to work, a link relationship between an Azure AD user and the related user in Palo Alto Networks - Admin UI needs to be established.
For single sign-on to work, a link relationship between an Azure AD user and the
To configure and test Azure AD single sign-on with Palo Alto Networks - Admin UI, perform the following steps:

1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure Palo Alto Networks - Admin UI SSO](#configure-palo-alto-networksadmin-ui-sso)** - to configure the single sign-on settings on application side.
- * **[Create Palo Alto Networks - Admin UI test user](#create-palo-alto-networksadmin-ui-test-user)** - to have a counterpart of B.Simon in Palo Alto Networks - Admin UI that is linked to the Azure AD representation of user.
+ 1. **[Create Palo Alto Networks - Admin UI test user](#create-palo-alto-networksadmin-ui-test-user)** - to have a counterpart of B.Simon in Palo Alto Networks - Admin UI that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Sign-on URL** text box, type a URL using the following pattern:
- `https://<Customer Firewall FQDN>/php/login.php`
-
- b. In the **Identifier** box, type a URL using the following pattern:
+ a. In the **Identifier** box, type a URL using the following pattern:
`https://<Customer Firewall FQDN>:443/SAML20/SP`
- c. In the **Reply URL** text box, type the Assertion Consumer Service (ACS) URL in the following format:
+ b. In the **Reply URL** text box, type the Assertion Consumer Service (ACS) URL in the following format:
`https://<Customer Firewall FQDN>:443/SAML20/SP/ACS`
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<Customer Firewall FQDN>/php/login.php`
+ > [!NOTE]
- > These values are not real. Update these values with the actual Sign-On URL, Identifier and Reply URL. Contact [Palo Alto Networks - Admin UI Client support team](https://support.paloaltonetworks.com/support) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier, Reply URL, and Sign on URL. Contact the [Palo Alto Networks - Admin UI Client support team](https://support.paloaltonetworks.com/support) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
>
> Port 443 is required on the **Identifier** and the **Reply URL** as these values are hardcoded into the Palo Alto Firewall. Removing the port number will result in an error during login.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)

-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
2. Select the **Device** tab.
- ![The Device tab](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_admin1.png)
+ ![Screenshot shows the Device tab.](./media/paloaltoadmin-tutorial/device.png)
3. In the left pane, select **SAML Identity Provider**, and then select **Import** to import the metadata file.
- ![The Import metadata file button](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_admin2.png)
+ ![Screenshot shows the Import metadata file button.](./media/paloaltoadmin-tutorial/admin.png)
4. In the **SAML Identify Provider Server Profile Import** window, do the following:
- ![The "SAML Identify Provider Server Profile Import" window](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_idp.png)
+ ![Screenshot shows the "SAML Identify Provider Server Profile Import" window.](./media/paloaltoadmin-tutorial/profile.png)
a. In the **Profile Name** box, provide a name (for example, **AzureAD Admin UI**).
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
5. In the left pane, select **SAML Identity Provider**, and then select the SAML Identity Provider Profile (for example, **AzureAD Admin UI**) that you created in the preceding step.
- ![The SAML Identity Provider Profile](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_idp_select.png)
+ ![Screenshot shows the SAML Identity Provider Profile](./media/paloaltoadmin-tutorial/azure.png)
6. In the **SAML Identity Provider Server Profile** window, do the following:
- ![The "SAML Identity Provider Server Profile" window](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_slo.png)
+ ![Screenshot shows the "SAML Identity Provider Server Profile" window.](./media/paloaltoadmin-tutorial/server.png)
a. In the **Identity Provider SLO URL** box, replace the previously imported SLO URL with the following URL: `https://login.microsoftonline.com/common/wsfederation?wa=wsignout1.0`
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
9. In the **Admin Role Profile** window, in the **Name** box, provide a name for the administrator role (for example, **fwadmin**). The administrator role name should match the SAML Admin Role attribute name that was sent by the Identity Provider. The administrator role name and value were created in **User Attributes** section in the Azure portal.
- ![Configure Palo Alto Networks Admin Role](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_adminrole.png)
+ ![Configure Palo Alto Networks Admin Role.](./media/paloaltoadmin-tutorial/role.png)
10. On the Firewall's Admin UI, select **Device**, and then select **Authentication Profile**.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
12. In the **Authentication Profile** window, do the following:
- ![The "Authentication Profile" window](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_authentication_profile.png)
+ ![Screenshot shows the "Authentication Profile" window.](./media/paloaltoadmin-tutorial/authentication.png)
a. In the **Name** box, provide a name (for example, **AzureSAML_Admin_AuthProfile**).
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
c. In the **IdP Server Profile** drop-down list, select the appropriate SAML Identity Provider Server profile (for example, **AzureAD Admin UI**).
- c. Select the **Enable Single Logout** check box.
+ d. Select the **Enable Single Logout** check box.
- d. In the **Admin Role Attribute** box, enter the attribute name (for example, **adminrole**).
+ e. In the **Admin Role Attribute** box, enter the attribute name (for example, **adminrole**).
- e. Select the **Advanced** tab and then, under **Allow List**, select **Add**.
+ f. Select the **Advanced** tab and then, under **Allow List**, select **Add**.
- ![The Add button on the Advanced tab](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_allowlist.png)
+ ![Screenshot shows the Add button on the Advanced tab.](./media/paloaltoadmin-tutorial/allowlist.png)
- f. Select the **All** check box, or select the users and groups that can authenticate with this profile.
+ g. Select the **All** check box, or select the users and groups that can authenticate with this profile.
When a user authenticates, the firewall matches the associated username or group against the entries in this list. If you don't add entries, no users can authenticate.
- g. Select **OK**.
+ h. Select **OK**.
13. To enable administrators to use SAML SSO by using Azure, select **Device** > **Setup**. In the **Setup** pane, select the **Management** tab and then, under **Authentication Settings**, select the **Settings** ("gear") button.
- ![The Settings button](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_authsetup.png)
+ ![Screenshot shows the Settings button.](./media/paloaltoadmin-tutorial/setup.png)
14. Select the SAML Authentication profile that you created in the Authentication Profile window (for example, **AzureSAML_Admin_AuthProfile**).
- ![The Authentication Profile field](./media/paloaltoadmin-tutorial/tutorial_paloaltoadmin_authsettings.png)
+ ![Screenshot shows the Authentication Profile field.](./media/paloaltoadmin-tutorial/settings.png)
15. Select **OK**.
In this section, you test your Azure AD single sign-on configuration with follow
* You can use Microsoft My Apps. When you click the Palo Alto Networks - Admin UI tile in the My Apps, you should be automatically signed in to the Palo Alto Networks - Admin UI for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).

-
## Next steps

Once you configure Palo Alto Networks - Admin UI, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
active-directory Readcube Papers Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/readcube-papers-tutorial.md
Follow these steps to enable Azure AD SSO in the Azure portal.
![Edit Basic SAML Configuration](common/edit-urls.png)

1. On the **Basic SAML Configuration** section, perform the following step:
+ 1. In the **Reply URL (ACS URL)** text box, type the URL: `https://connect.liblynx.com/saml/module.php/saml/sp/saml2-acs.php/dsrsi`
+ 2. In the **Sign on URL** text box, type the URL: `https://app.readcube.com`
- a. In the **Sign on URL** text box, type the URL:
- `https://app.readcube.com`
+ ![Screenshot that shows example settings in the SAML Configuration pane.](./media/readcube-papers-tutorial/configure-saml.png)
+
1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.

   ![The Certificate download link](common/copy-metadataurl.png)
To configure single sign-on on the **ReadCube Papers** side, you need to send th
### Create ReadCube Papers test user
-In this section, a user called Britta Simon is created in ReadCube Papers. ReadCube Papers supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in ReadCube Papers, a new one is created after authentication.
+In this section, a user called B.Simon is created in ReadCube Papers. ReadCube Papers supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in ReadCube Papers, a new one is created after authentication.
## Test SSO
In this section, you test your Azure AD single sign-on configuration with follow
* Go to ReadCube Papers Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the ReadCube Papers tile in the My Apps, this will redirect to ReadCube Papers Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ReadCube Papers tile in the My Apps portal, this will redirect to ReadCube Papers Sign-on URL. For more information about the My Apps portal, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
## Next steps
-Once you configure ReadCube Papers you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
+After you configure ReadCube Papers, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sharefile Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/sharefile-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Citrix ShareFile | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Citrix ShareFile | Microsoft Docs'
description: Learn how to configure single sign-on between Azure Active Directory and Citrix ShareFile.
Previously updated : 01/18/2021 Last updated : 09/08/2021
-# Tutorial: Azure Active Directory integration with Citrix ShareFile
+# Tutorial: Azure AD SSO integration with Citrix ShareFile
-In this tutorial, you learn how to integrate Citrix ShareFile with Azure Active Directory (Azure AD).
-Integrating Citrix ShareFile with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Citrix ShareFile with Azure Active Directory (Azure AD). When you integrate Citrix ShareFile with Azure AD, you can:
-* You can control in Azure AD who has access to Citrix ShareFile.
-* You can enable your users to be automatically signed-in to Citrix ShareFile (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
+* Control in Azure AD who has access to Citrix ShareFile.
+* Enable your users to be automatically signed-in to Citrix ShareFile with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Citrix ShareFile, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/).
-* Citrix ShareFile single sign-on enabled subscription.
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Citrix ShareFile single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Citrix ShareFile supports **SP** initiated SSO
+* Citrix ShareFile supports **SP** initiated SSO.
## Adding Citrix ShareFile from the gallery
Follow these steps to enable Azure AD SSO in the Azure portal.
1. In the Azure portal, on the **Citrix ShareFile** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Sign-on URL** text box, type a URL using the following pattern:
- `https://<tenant-name>.sharefile.com/saml/login`
-
- b. In the **Identifier (Entity ID)** textbox, type a URL using the following pattern:
+ a. In the **Identifier (Entity ID)** textbox, type a URL using the following pattern:
- - `https://<tenant-name>.sharefile.com`
- - `https://<tenant-name>.sharefile.com/saml/info`
- - `https://<tenant-name>.sharefile1.com/saml/info`
- - `https://<tenant-name>.sharefile1.eu/saml/info`
- - `https://<tenant-name>.sharefile.eu/saml/info`
+ | **Identifier** |
+ |--|
+ | `https://<tenant-name>.sharefile.com` |
+ | `https://<tenant-name>.sharefile.com/saml/info` |
+ | `https://<tenant-name>.sharefile1.com/saml/info` |
+ | `https://<tenant-name>.sharefile1.eu/saml/info` |
+ | `https://<tenant-name>.sharefile.eu/saml/info` |
- c. In the **Reply URL** textbox, type a URL using the following pattern:
+ b. In the **Reply URL** textbox, type a URL using the following pattern:
- - `https://<tenant-name>.sharefile.com/saml/acs`
- - `https://<tenant-name>.sharefile.eu/saml/<URL path>`
- - `https://<tenant-name>.sharefile.com/saml/<URL path>`
+ | **Reply URL** |
+ |-|
+ | `https://<tenant-name>.sharefile.com/saml/acs` |
+ | `https://<tenant-name>.sharefile.eu/saml/<URL path>` |
+ | `https://<tenant-name>.sharefile.com/saml/<URL path>` |
+
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<tenant-name>.sharefile.com/saml/login`
> [!NOTE]
- > These values are not real. Update these values with the actual Sign-On URL, Identifier and Reply URL. Contact [Citrix ShareFile Client support team](https://www.citrix.co.in/products/citrix-content-collaboration/support.html) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier, Reply URL, and Sign on URL. Contact the [Citrix ShareFile Client support team](https://www.citrix.co.in/products/citrix-content-collaboration/support.html) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
4. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
In this section, you test your Azure AD single sign-on configuration with follow
* You can use Microsoft My Apps. When you click the Citrix ShareFile tile in the My Apps, this will redirect to Citrix ShareFile Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).

-
## Next steps

Once you configure Citrix ShareFile, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
active-directory Decentralized Identifier Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/verifiable-credentials/decentralized-identifier-overview.md
Our digital and physical lives are increasingly linked to the apps, services, and devices we use to access a rich set of experiences. This digital transformation allows us to interact with hundreds of companies and thousands of other users in ways that were previously unimaginable.
-But identity data has too often been exposed in security breaches. These breaches are impactful to people's lives affecting our social, professional, and financial lives. Microsoft believes that there's a better way. Every person has a right to an identity that they own and control, one that securely stores elements of their digital identity and preserves privacy. This primer explains how we are joining hands with a diverse community to build an open, trustworthy, interoperable, and standards-based Decentralized Identity (DID) solution for individuals and organizations.
+But identity data has too often been exposed in security breaches. These breaches affect our social, professional, and financial lives. Microsoft believes that there's a better way. Every person has a right to an identity that they own and control, one that securely stores elements of their digital identity and preserves privacy. This primer explains how we are joining hands with a diverse community to build an open, trustworthy, interoperable, and standards-based Decentralized Identity (DID) solution for individuals and organizations.
## Why we need Decentralized Identity
Today we use our digital identity at work, at home, and across every app, servic
Generally, users grant consent to several apps and devices. This approach requires a high degree of vigilance on the user's part to track who has access to what information. On the enterprise front, collaboration with consumers and partners requires high-touch orchestration to securely exchange data in a way that maintains privacy and security for all involved.
-We believe a standards-based Decentralized Identity system can unlock a new set of experiences that give users and organizations greater control over their data, and deliver a higher degree of trust and security for apps, devices, and service providers
+We believe a standards-based Decentralized Identity system can unlock a new set of experiences that give users and organizations greater control over their data, and deliver a higher degree of trust and security for apps, devices, and service providers.
## Lead with open standards

We're committed to working closely with customers, partners, and the community to unlock the next generation of Decentralized Identity-based experiences, and we're excited to partner with the individuals and organizations that are making incredible contributions in this space. If the DID ecosystem is to grow, standards, technical components, and code deliverables must be open source and accessible to all.
-Microsoft is actively collaborating with members of the Decentralized Identity Foundation (DIF), the W3C Credentials Community Group, and the wider identity community. We're worked with these groups to identify and develop critical standards, and the following standards have been implemented in our services.
+Microsoft is actively collaborating with members of the Decentralized Identity Foundation (DIF), the W3C Credentials Community Group, and the wider identity community. We've worked with these groups to identify and develop critical standards, and the following standards have been implemented in our services.
* [W3C Decentralized Identifiers](https://www.w3.org/TR/did-core/)
* [W3C Verifiable Credentials](https://www.w3.org/TR/vc-data-model/)
Before we can understand DIDs, it helps to compare them with current identity sy
Decentralized Identifiers (DIDs) are different. DIDs are user-generated, self-owned, globally unique identifiers rooted in decentralized systems like ION. They possess unique characteristics, like greater assurance of immutability, censorship resistance, and tamper evasiveness. These attributes are critical for any ID system that is intended to provide self-ownership and user control.
-Microsoft's verifiable credential solution uses decentralized credentials (DIDs) to cryptographically sign as proof that a relying party (verifier) is attesting to information proving they are the owners of a verifiable credential. Therefore, a basic understanding of decentralized identifiers is recommended for anyone creating a verifiable credential solution based on the Microsoft offering.
+Microsoft's verifiable credential solution uses decentralized credentials (DIDs) to cryptographically sign as proof that a relying party (verifier) is attesting to information proving they are the owners of a verifiable credential. A basic understanding of DIDs is recommended for anyone creating a verifiable credential solution based on the Microsoft offering.
+ ## What are Verifiable Credentials?
- We use IDs in our daily lives. We have drivers licenses that we use as evidence of our ability to operate a car. Universities issue diplomas that prove we attained a level of education. We use passports to prove who we are to authorities as we arrive to other countries. The data model describes how we could handle these types of scenarios when working over the internet but in a secure manner that respects user's privacy. You can get additional information in The [Verifiable Credentials Data Model 1.0](https://www.w3.org/TR/vc-data-model/)
+We use IDs in our daily lives. We have driver's licenses that we use as evidence of our ability to operate a car. Universities issue diplomas that prove we attained a level of education. We use passports to prove who we are to authorities as we arrive in other countries. The data model describes how we could handle these types of scenarios when working over the internet but in a secure manner that respects users' privacy. You can get additional information in the [Verifiable Credentials Data Model 1.0](https://www.w3.org/TR/vc-data-model/).
In short, verifiable credentials are data objects consisting of claims made by the issuer attesting information about a subject. These claims are identified by schema and include the DID of the issuer and subject. The issuer's DID creates a digital signature as proof that they attest to this information.

## How does Decentralized Identity work?
-We need a new form of identity. We need an identity that brings together technologies and standards to deliver key identity attributes like self-ownership, and censorship resistance. These capabilities are difficult to achieve using existing systems.
+We need a new form of identity. We need an identity that brings together technologies and standards to deliver key identity attributes like self-ownership and censorship resistance. These capabilities are difficult to achieve using existing systems.
To deliver on these promises, we need a technical foundation made up of seven key innovations. One key innovation is identifiers that are owned by the user, a user agent to manage keys associated with such identifiers, and encrypted, user-controlled datastores.
Now that you know about DIDs and verifiable credentials try them yourself by fol
- [Get started with verifiable credentials](get-started-verifiable-credentials.md)
- [How to customize your credentials](credential-design.md)
-- [Verifiable credentials FAQ](verifiable-credentials-faq.md)
+- [Verifiable credentials FAQ](verifiable-credentials-faq.md)
aks Scale Down Mode https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/scale-down-mode.md
To delete your deallocated nodes, you can change your Scale-down Mode to `Delete
```azurecli-interactive
az aks nodepool update --scale-down-mode Delete --name nodepool2 --cluster-name myAKSCluster --resource-group myResourceGroup
```
+> [!NOTE]
+> Changing your scale-down mode from `Deallocate` to `Delete` then back to `Deallocate` will delete all deallocated nodes while keeping your node pool in `Deallocate` scale-down mode.
+ ## Using Scale-down Mode to delete nodes on scale-down

The default behavior of AKS without using Scale-down Mode is to delete your nodes when you scale-down your cluster. Using Scale-down Mode, this can be explicitly achieved by setting `--scale-down-mode Delete`.
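To see the two modes side by side, the following is a rough sketch that first adds a node pool that deallocates nodes on scale-down and then switches that same pool to delete nodes instead. The resource names are placeholders and mirror the command shown earlier in this section.

```azurecli
# Sketch: create a node pool that deallocates nodes on scale-down, then later
# switch the same pool to delete nodes on scale-down. All names are placeholders.
az aks nodepool add \
  --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --name nodepool2 \
  --node-count 3 \
  --scale-down-mode Deallocate

az aks nodepool update \
  --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --name nodepool2 \
  --scale-down-mode Delete
```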
app-service Creation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/creation.md
description: Learn how to create an App Service Environment.
ms.assetid: 7690d846-8da3-4692-8647-0bf5adfd862a Previously updated : 07/06/2021 Last updated : 09/07/2021
After your ASE is created, you can't change:
The subnet needs to be large enough to hold the maximum size to which you'll scale your ASE. Pick a large enough subnet to support your maximum scale needs, since it can't be changed after creation. The recommended size is a /24 with 256 addresses.
+## Deployment considerations
+
+There are two important things to consider before you deploy your ASE.
+
+- VIP type
+- deployment type
+
+There are two different VIP types, internal and external. With an internal VIP, your apps will be reached on the ASE at an address in your ASE subnet and your apps are not on public DNS. During creation in the portal, there is an option to create an Azure private DNS zone for your ASE. With an external VIP, your apps will be on a public internet facing address and your apps are in public DNS.
+
+There are three different deployment types:
+
+- single zone
+- zone redundant
+- host group
+
+The single zone ASE is available in all regions where ASEv3 is available. When you have a single zone ASE, you have a minimum App Service plan instance charge of one instance of Windows Isolated v2. As soon as you have one or more instances, that charge goes away. It is not an additive charge.
+
+In a zone redundant ASE, your apps spread across three zones in the same region. The zone redundant ASE is available in a subset of ASE capable regions, primarily limited by the regions that support availability zones. When you have a zone redundant ASE, the smallest size for your App Service plan is three instances. That ensures that there is an instance in each availability zone. App Service plans can be scaled up one or more instances at a time. Scaling does not need to be in units of three, but the app is only balanced across all availability zones when the total instances are multiples of three. A zone redundant ASE has triple the infrastructure and is made with zone redundant components so that if even two of the three zones go down for whatever reason, your workloads remain available. Due to the increased system need, the minimum charge for a zone redundant ASE is nine instances. If you have fewer than nine total App Service plan instances in your ASEv3, the difference will be charged as Windows I1v2. If you have nine or more instances, there is no added charge to have a zone redundant ASE. To learn more about zone redundancy, read [Regions and Availability zones][AZoverview].
+
+In a host group deployment, your apps are deployed onto a dedicated host group. The dedicated host group is not zone redundant. Dedicated host group deployment enables your ASE to be deployed on dedicated hardware. There is no minimum instance charge for use of an ASE on a dedicated host group, but you do have to pay for the host group when provisioning the ASE. On top of that you pay a discounted App Service plan rate as you create your plans and scale out. There are a finite number of cores available with a dedicated host deployment that are used by both the App Service plans and the infrastructure roles. Dedicated host deployments of the ASE can't reach the 200 total instance count normally available in an ASE. The number of total instances possible is related to the total number of App Service plan instances plus the load based number of infrastructure roles.
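If you prefer scripting over the portal flow described below, a CLI sketch follows. It assumes a recent Azure CLI in which `az appservice ase create` supports ASEv3 (`--kind`), an internal VIP (`--virtual-ip-type`), and the `--zone-redundant` flag; verify those parameters against your CLI version before relying on them, and treat every name and address range here as a placeholder.

```azurecli
# Sketch only: create a VNet with a /24 subnet, then an internal VIP,
# zone redundant ASEv3 in it. Assumes your CLI exposes the flags shown here.
az network vnet create \
  --resource-group myResourceGroup \
  --name ase-vnet \
  --address-prefixes 10.0.0.0/16 \
  --subnet-name ase-subnet \
  --subnet-prefixes 10.0.1.0/24

az appservice ase create \
  --resource-group myResourceGroup \
  --name contoso \
  --vnet-name ase-vnet \
  --subnet ase-subnet \
  --kind ASEv3 \
  --virtual-ip-type Internal \
  --zone-redundant
```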
## Creating an ASE in the portal

1. To create an ASE, search the marketplace for **App Service Environment v3**.
+
2. Basics: Select the Subscription, select or create the Resource Group, and enter the name of your ASE. Select the Virtual IP type. If you select Internal, your inbound ASE address will be an address in your ASE subnet. If you select External, your inbound ASE address will be a public internet facing address. The ASE name will also be used for the domain suffix of your ASE. If your ASE name is *contoso* and you have an Internal VIP ASE, then the domain suffix will be *contoso.appserviceenvironment.net*. If your ASE name is *contoso* and you have an external VIP, the domain suffix will be *contoso.p.azurewebsites.net*.
-![App Service Environment create basics tab](./media/creation/creation-basics.png)
+
+ ![App Service Environment create basics tab](./media/creation/creation-basics.png)
+ 3. Hosting: Select *Enabled* or *Disabled* for Host Group deployment. Host Group deployment is used to select dedicated hardware. If you select Enabled, your ASE will be deployed onto dedicated hardware. When you deploy onto dedicated hardware, you are charged for the entire dedicated host during ASE creation and then a reduced price for your App Service plan instances.
-![App Service Environment hosting selections](./media/creation/creation-hosting.png)
-4. Networking: Select or create your Virtual Network, select or create your subnet. If you are creating an internal VIP ASE, you will have the option to configure Azure DNS private zones to point your domain suffix to your ASE. Details on how to manually configure DNS are in the DNS section under [Using an App Service Environment][UsingASE].
-![App Service Environment networking selections](./media/creation/creation-networking.png)
+
+ ![App Service Environment hosting selections](./media/creation/creation-hosting.png)
+
+4. Networking: Select or create your Virtual Network, select or create your subnet. If you are creating an internal VIP ASE, you can configure Azure DNS private zones to point your domain suffix to your ASE. Details on how to manually configure DNS are in the DNS section under [Using an App Service Environment][UsingASE].
+
+ ![App Service Environment networking selections](./media/creation/creation-networking.png)
5. Review and Create: Check that your configuration is correct and select **Create**. Your ASE can take nearly two hours to create.

After your ASE creation completes, you can select it as a location when creating your apps. To learn more about creating apps in your new ASE or managing your ASE, read [Using an App Service Environment][UsingASE].
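If you skipped the private DNS zone option in the networking step, you can create the zone and records yourself. The following is a hedged sketch for an internal VIP ASE named *contoso*: the inbound address 10.0.1.11 and all resource names are placeholders, and the wildcard, scm, and root record names follow the DNS guidance in [Using an App Service Environment][UsingASE].

```azurecli
# Sketch: create the private DNS zone for an internal VIP ASE named 'contoso'
# and point the wildcard, scm, and root records at the ASE inbound address.
# Zone name, VNet name, and IP address are placeholders.
az network private-dns zone create \
  --resource-group myResourceGroup \
  --name contoso.appserviceenvironment.net

az network private-dns link vnet create \
  --resource-group myResourceGroup \
  --zone-name contoso.appserviceenvironment.net \
  --name ase-dns-link \
  --virtual-network ase-vnet \
  --registration-enabled false

for record in '*' '*.scm' '@'; do
  az network private-dns record-set a add-record \
    --resource-group myResourceGroup \
    --zone-name contoso.appserviceenvironment.net \
    --record-set-name "$record" \
    --ipv4-address 10.0.1.11
done
```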
The ASE is normally deployed on VMs that are provisioned on a multi-tenant hyper
[ASEWAF]: app-service-app-service-environment-web-application-firewall.md
[AppGW]: ../../web-application-firewall/ag/ag-overview.md
[logalerts]: ../../azure-monitor/alerts/alerts-log.md
+[AZoverview]: ../../availability-zones/az-overview.md
app-service Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/overview.md
description: Overview on the App Service Environment
ms.assetid: 3d37f007-d6f2-4e47-8e26-b844e47ee919 Previously updated : 08/05/2021 Last updated : 09/07/2021
The ASEv3 is available in the following regions.
|Central US |East US 2|
|East Asia | France Central|
|East US | Germany West Central|
-|East US 2| North Europe|
-|France Central | South Central US|
-|Germany West Central | Southeast Asia|
+|East US 2| Japan East|
+|France Central | North Europe|
+|Germany West Central | South Central US|
+|Japan East | Southeast Asia|
|Korea Central | UK South|
-|North Europe | West Europe|
-|Norway East | West US 2 |
+|North Central US | West Europe|
+|North Europe | West US 2|
+|Norway East | |
+|South Africa North | |
|South Central US | |
|Southeast Asia| |
|Switzerland North | |
+|UAE North| |
|UK South| |
|UK West| |
|West Central US | |
|West Europe | |
|West US | |
|West US 2| |
+|West US 3| |
<!--Links-->
-[reservedinstances]: https://docs.microsoft.com/azure/cost-management-billing/reservations/reservation-discount-app-service#how-reservation-discounts-apply-to-isolated-v2-instances
+[reservedinstances]: ../../cost-management-billing/reservations/reservation-discount-app-service.md
[pricing]: https://azure.microsoft.com/pricing/details/app-service/windows/
app-service Operating System Functionality https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/operating-system-functionality.md
description: Learn about the OS functionality in Azure App Service on Windows. F
ms.assetid: 39d5514f-0139-453a-b52e-4a1c06d8d914 Previously updated : 10/30/2018 Last updated : 09/09/2021
Because App Service supports a seamless scaling experience between different tie
## Development frameworks

App Service pricing tiers control the amount of compute resources (CPU, disk storage, memory, and network egress) available to apps. However, the breadth of framework functionality available to apps remains the same regardless of the scaling tiers.
-App Service supports a variety of development frameworks, including ASP.NET, classic ASP, node.js, PHP, and Python - all of which run as extensions within IIS. In order to simplify and normalize security configuration, App Service apps typically run the various development frameworks with their default settings. One approach to configuring apps could have been to customize the API surface area and functionality for each individual development framework. App Service instead takes a more generic approach by enabling a common baseline of operating system functionality regardless of an app's development framework.
+App Service supports a variety of development frameworks, including ASP.NET, classic ASP, Node.js, PHP, and Python - all of which run as extensions within IIS. In order to simplify and normalize security configuration, App Service apps typically run the various development frameworks with their default settings. One approach to configuring apps could have been to customize the API surface area and functionality for each individual development framework. App Service instead takes a more generic approach by enabling a common baseline of operating system functionality regardless of an app's development framework.
The following sections summarize the general kinds of operating system functionality available to App Service apps.
Various drives exist within App Service, including local drives and network driv
### Local drives

At its core, App Service is a service running on top of the Azure PaaS (platform as a service) infrastructure. As a result, the local drives that are "attached" to a virtual machine are the same drive types available to any worker role running in Azure. This includes:

-- An operating system drive (the D:\ drive)
-- An application drive that contains Azure Package cspkg files used exclusively by App Service (and inaccessible to customers)
-- A "user" drive (the C:\ drive), whose size varies depending on the size of the VM.
+- An operating system drive (`%SystemDrive%`), whose size varies depending on the size of the VM.
+- A resource drive (`%ResourceDrive%`) used by App Service internally.
It is important to monitor your disk utilization as your application grows. If the disk quota is reached, it can have adverse effects to your application. For example:
It is important to monitor your disk utilization as your application grows. If t
<a id="NetworkDrives"></a> ### Network drives (UNC shares)
-One of the unique aspects of App Service that makes app deployment and maintenance straightforward is that all user content is stored on a set of UNC shares. This model maps well to the common pattern of content storage used by on-premises web hosting environments that have multiple load-balanced servers.
+One of the unique aspects of App Service that makes app deployment and maintenance straightforward is that all content shares are stored on a set of UNC shares. This model maps well to the common pattern of content storage used by on-premises web hosting environments that have multiple load-balanced servers.
-Within App Service, there is a number of UNC shares created in each data center. A percentage of the user content for all customers in each data center is allocated to each UNC share. Furthermore, all of the file content for a single customer's subscription is always placed on the same UNC share.
+Within App Service, there are a number of UNC shares created in each data center. A percentage of the user content for all customers in each data center is allocated to each UNC share. Each customer's subscription has a reserved directory structure on a specific UNC share within a data center. A customer may have multiple apps created within a specific data center, so all of the directories belonging to a single customer subscription are created on the same UNC share.
-Due to how Azure services work, the specific virtual machine responsible for hosting a UNC share will change over time. It is guaranteed that UNC shares will be mounted by different virtual machines as they are brought up and down during the normal course of Azure operations. For this reason, apps should never make hard-coded assumptions that the machine information in a UNC file path will remain stable over time. Instead, they should use the convenient *faux* absolute path **D:\home\site** that App Service provides. This faux absolute path provides a portable, app-and-user-agnostic method for referring to one's own app. By using **D:\home\site**, one can transfer shared files from app to app without having to configure a new absolute path for each transfer.
+Due to how Azure services work, the specific virtual machine responsible for hosting a UNC share will change over time. It is guaranteed that UNC shares will be mounted by different virtual machines as they are brought up and down during the normal course of Azure operations. For this reason, apps should never make hard-coded assumptions that the machine information in a UNC file path will remain stable over time. Instead, they should use the convenient *faux* absolute path `%HOME%\site` that App Service provides. This faux absolute path provides a portable, app-and-user-agnostic method for referring to one's own app. By using `%HOME%\site`, one can transfer shared files from app to app without having to configure a new absolute path for each transfer.
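To make the portability point concrete, the following C# sketch resolves the path from the `HOME` environment variable at runtime instead of hard-coding a machine-specific UNC path. The fallback and the folder and file names are illustrative assumptions for this sketch, not part of the platform contract.

```csharp
// Minimal sketch (illustrative only): resolve the app's content share from the
// HOME environment variable instead of hard-coding a UNC or drive-letter path.
using System;
using System.IO;

class SharedContentPath
{
    static void Main()
    {
        // On App Service for Windows, HOME typically points to the app's content share root.
        string home = Environment.GetEnvironmentVariable("HOME") ?? Path.GetTempPath();
        string sharedFolder = Path.Combine(home, "site", "shared-files");

        Directory.CreateDirectory(sharedFolder);
        File.WriteAllText(Path.Combine(sharedFolder, "example.txt"), "visible from any instance");
    }
}
```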
<a id="TypesOfFileAccess"></a> ### Types of file access granted to an app
-Each customer's subscription has a reserved directory structure on a specific UNC share within a data center. A customer may have multiple apps created within a specific data center, so all of the directories belonging to a single customer subscription are created on the same UNC share. The share may include directories such as those for content, error and diagnostic logs, and earlier versions of the app created by source control. As expected, a customer's app directories are available for read and write access at runtime by the app's application code.
+The `%HOME%` directory in an app maps to a content share in Azure Storage dedicated for that app, and its size is defined by your [pricing tier](https://azure.microsoft.com/pricing/details/app-service/). It may include directories such as those for content, error and diagnostic logs, and earlier versions of the app created by source control. These directories are available to the app's application code at runtime for read and write access. Because the files are not stored locally, they are persistent across app restarts.
-On the local drives attached to the virtual machine that runs an app, App Service reserves a chunk of space on the C:\ drive for app-specific temporary local storage. Although an app has full read/write access to its own temporary local storage, that storage really isn't intended to be used directly by the application code. Rather, the intent is to provide temporary file storage for IIS and web application frameworks. App Service also limits the amount of temporary local storage available to each app to prevent individual apps from consuming excessive amounts of local file storage.
+On the system drive, App Service reserves `%SystemDrive%\local` for app-specific temporary local storage. Changes to files in this directory are *not* persistent across app restarts. Although an app has full read/write access to its own temporary local storage, that storage really isn't intended to be used directly by the application code. Rather, the intent is to provide temporary file storage for IIS and web application frameworks. App Service also limits the amount of storage in `%SystemDrive%\local` for each app to prevent individual apps from consuming excessive amounts of local file storage. For **Free**, **Shared**, and **Consumption** (Azure Functions) tiers, the limit is 500 MB. See the following table for other tiers:
-Two examples of how App Service uses temporary local storage are the directory for temporary ASP.NET files and the directory for IIS compressed files. The ASP.NET compilation system uses the "Temporary ASP.NET Files" directory as a temporary compilation cache location. IIS uses the "IIS Temporary Compressed Files" directory to store compressed response output. Both of these types of file usage (as well as others) are remapped in App Service to per-app temporary local storage. This remapping ensures that functionality continues as expected.
+| SKU Family | B1/S1/etc. | B2/S2/etc. | B3/S3/etc. |
+| - | - | - | - |
+|Basic, Standard, Premium | 11 GB | 15 GB | 58 GB |
+| PremiumV2, PremiumV3, Isolated | 21 GB | 61 GB | 140 GB |
-Each app in App Service runs as a random unique low-privileged worker process identity called the "application pool identity", described further here: [https://www.iis.net/learn/manage/configuring-security/application-pool-identities](https://www.iis.net/learn/manage/configuring-security/application-pool-identities). Application code uses this identity for basic read-only access to the operating system drive (the D:\ drive). This means application code can list common directory structures and read common files on operating system drive. Although this might appear to be a somewhat broad level of access, the same directories and files are accessible when you provision a worker role in an Azure hosted service and read the drive contents.
+Two examples of how App Service uses temporary local storage are the directory for temporary ASP.NET files and the directory for IIS compressed files. The ASP.NET compilation system uses the `%SystemDrive%\local\Temporary ASP.NET Files` directory as a temporary compilation cache location. IIS uses the `%SystemDrive%\local\IIS Temporary Compressed Files` directory to store compressed response output. Both of these types of file usage (as well as others) are remapped in App Service to per-app temporary local storage. This remapping ensures that functionality continues as expected.
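As a hedged illustration of that remapping, the following C# sketch writes a scratch file through the standard temp-path API. The assumption that the framework temp path resolves to the per-app temporary local storage described above is an inference made for this example, and the file name is hypothetical.

```csharp
// Sketch (assumption, not from the article): standard temp-file APIs are expected to
// land in the app's private temporary local storage described above.
using System;
using System.IO;

class ScratchFile
{
    static void Main()
    {
        string tempDir = Path.GetTempPath();                    // per-app temporary local storage
        string scratch = Path.Combine(tempDir, $"scratch-{Guid.NewGuid():N}.tmp");

        File.WriteAllText(scratch, "transient data - lost on app restart or instance move");
        Console.WriteLine($"Wrote scratch file to {scratch}");
    }
}
```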
+
+Each app in App Service runs as a random unique low-privileged worker process identity called the "application pool identity", described further here: [https://www.iis.net/learn/manage/configuring-security/application-pool-identities](https://www.iis.net/learn/manage/configuring-security/application-pool-identities). Application code uses this identity for basic read-only access to the operating system drive. This means application code can list common directory structures and read common files on operating system drive. Although this might appear to be a somewhat broad level of access, the same directories and files are accessible when you provision a worker role in an Azure hosted service and read the drive contents.
<a name="multipleinstances"></a> ### File access across multiple instances
-The home directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the home directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the home directory, those files are immediately available to all instances.
+The content share (`%HOME%`) directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the `%HOME%` directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the `%HOME%` directory, those files are immediately available to all instances.
+
+The temporary local storage (`%SystemDrive%\local`) directory is not shared between instances, nor is it shared between the app and its [Kudu app](resources-kudu.md).
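The following C# sketch (illustrative only, with hypothetical folder names) contrasts the two locations: files that every instance must see go under the content share, while instance-local scratch data stays in temporary storage.

```csharp
// Sketch: pick a location based on whether a file must be visible to every instance
// (content share) or is purely instance-local scratch data (temporary storage).
using System;
using System.IO;

static class StorageLocations
{
    // Shared across all instances of the app; persists across restarts.
    public static string Shared(string fileName) =>
        Path.Combine(Environment.GetEnvironmentVariable("HOME") ?? ".", "site", "uploads", fileName);

    // Private to the current instance; lost on restart or instance move.
    public static string InstanceLocal(string fileName) =>
        Path.Combine(Path.GetTempPath(), fileName);
}
```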
<a id="NetworkAccess"></a>
app-service Reference App Settings https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/reference-app-settings.md
The following environment variables are related to app deployment. For variables
| `WEBSITE_RUN_FROM_ZIP` | Deprecated. Use `WEBSITE_RUN_FROM_PACKAGE`. |
| `WEBSITE_WEBDEPLOY_USE_SCM` | Set to `false` for WebDeploy to stop using the Kudu deployment engine. The default is `true`. To deploy to Linux apps using Visual Studio (WebDeploy/MSDeploy), set it to `false`. |
| `MSDEPLOY_RENAME_LOCKED_FILES` | Set to `1` to attempt to rename DLLs if they can't be copied during a WebDeploy deployment. This setting is not applicable if `WEBSITE_WEBDEPLOY_USE_SCM` is set to `false`. |
-| `WEBSITE_DISABLE_SCM_SEPARATION` | By default, the main app and the Kudu app run in different sandboxes. When you stop the app, the Kudu app is still running, and you can continue to use Git deploy and MSDeploy. Each app has its own local files. Turning off this separation (setting to `false`) is a legacy mode that's no longer fully supported. |
+| `WEBSITE_DISABLE_SCM_SEPARATION` | By default, the main app and the Kudu app run in different sandboxes. When you stop the app, the Kudu app is still running, and you can continue to use Git deploy and MSDeploy. Each app has its own local files. Turning off this separation (setting to `true`) is a legacy mode that's no longer fully supported. |
| `WEBSITE_ENABLE_SYNC_UPDATE_SITE` | Set to `1` to ensure that REST API calls to update `site` and `siteconfig` are completely applied to all instances before returning. The default is `1` if deploying with an ARM template, to avoid race conditions with subsequent ARM calls. |
| `WEBSITE_START_SCM_ON_SITE_CREATION` | In an ARM template deployment, set to `1` in the ARM template to pre-start the Kudu app as part of app creation. |
| `WEBSITE_START_SCM_WITH_PRELOAD` | For Linux apps, set to `true` to force preloading the Kudu app when Always On is enabled by pinging its URL. The default is `false`. For Windows apps, the Kudu app is always preloaded. |
The following environment variables are related to [App Service authentication](
| `WEBSITE_AUTH_VALIDATE_NONCE`| `true` or `false`. The default value is `true`. This value should never be set to `false` except when temporarily debugging [cryptographic nonce](https://en.wikipedia.org/wiki/Cryptographic_nonce) validation failures that occur during interactive logins. This application setting is intended for use with the V1 (classic) configuration experience. If using the V2 authentication configuration schema, you should instead use the `login.nonce.validateNonce` configuration value. |
| `WEBSITE_AUTH_V2_CONFIG_JSON` | This environment variable is populated automatically by the Azure App Service platform and is used to configure the integrated authentication module. The value of this environment variable corresponds to the V2 (non-classic) authentication configuration for the current app in Azure Resource Manager. It's not intended to be configured explicitly. |
| `WEBSITE_AUTH_ENABLED` | Read-only. Injected into a Windows or Linux app to indicate whether App Service authentication is enabled. |
-| `WEBSITE_AUTH_ENCRYPTION_KEY` | By default, the automatically generated key is used as the encryption key. To override, set to a desired key. This is recommended if you want to share tokens or sessions across multiple apps. If specified, it supercedes the `MACHINEKEY_DecryptionKey` setting. ||
-| `WEBSITE_AUTH_SIGNING_KEY` | By default, the automatically generated key is used as the signing key. To override, set to a desired key. This is recommended if you want to share tokens or sessions across multiple apps. If specified, it supercedes the `MACHINEKEY_ValidationKey` setting. ||
+| `WEBSITE_AUTH_ENCRYPTION_KEY` | By default, the automatically generated key is used as the encryption key. To override, set to a desired key. This is recommended if you want to share tokens or sessions across multiple apps. If specified, it supersedes the `MACHINEKEY_DecryptionKey` setting. |
+| `WEBSITE_AUTH_SIGNING_KEY` | By default, the automatically generated key is used as the signing key. To override, set to a desired key. This is recommended if you want to share tokens or sessions across multiple apps. If specified, it supersedes the `MACHINEKEY_ValidationKey` setting. |
<!-- System settings WEBSITE_AUTH_RUNTIME_VERSION
automanage Automanage Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automanage/automanage-virtual-machines.md
In addition to the standard services we onboard you to, we allow you to configur
You can adjust the settings of a default environment through preferences. Learn how to create a preference [here](virtual-machines-custom-preferences.md). > [!NOTE]
-> You cannot change the enivonrment configuration on your VM while Automanage is enabled. You will need to disable Automanage for that VM and then re-enable Automanage with the desired environment and preferences.
+> You cannot change the environment configuration on your VM while Automanage is enabled. You will need to disable Automanage for that VM and then re-enable Automanage with the desired environment and preferences.
For the complete list of participating Azure services and if they support preferences, see here: - [Automanage for Linux](automanage-windows-server.md)
In this article, you learned that Automanage for virtual machines provides a mea
Try enabling Automanage for virtual machines in the Azure portal. > [!div class="nextstepaction"]
-> [Enable Automanage for virtual machines in the Azure portal](quick-create-virtual-machines-portal.md)
+> [Enable Automanage for virtual machines in the Azure portal](quick-create-virtual-machines-portal.md)
azure-app-configuration Rest Api Authentication Hmac https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/rest-api-authentication-hmac.md
using (var client = new HttpClient())
static class HttpRequestMessageExtensions {
- public static HttpRequestMessage Sign(this HttpRequestMessage request, string credential, byte[] secret)
+ public static HttpRequestMessage Sign(this HttpRequestMessage request, string credential, string secret)
    {
        string host = request.RequestUri.Authority;
        string verb = request.Method.ToString().ToUpper();
static class HttpRequestMessageExtensions
        // Signature
        string signature;
- using (var hmac = new HMACSHA256(secret))
+ using (var hmac = new HMACSHA256(Convert.FromBase64String(secret)))
        {
            signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.ASCII.GetBytes(stringToSign)));
        }
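The snippet below is a hedged sketch of how the `Sign` extension shown above might be called from an async method. The endpoint, `credential` (access key ID), and base64 `secret` values are placeholders you would replace with your App Configuration store's values.

```csharp
// Hedged usage sketch: sign a GET request with the Sign extension shown above.
// Endpoint, credential, and secret values below are placeholders.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Example
{
    static async Task Main()
    {
        string credential = "<access-key-id>";
        string secret = "<base64-access-key-value>";   // passed as a string, decoded inside Sign

        using var client = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Get,
            "https://<store-name>.azconfig.io/kv?api-version=1.0");

        request.Sign(credential, secret);               // adds the HMAC authentication headers
        HttpResponseMessage response = await client.SendAsync(request);
        Console.WriteLine(response.StatusCode);
    }
}
```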
azure-arc Create Data Controller Indirect Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/create-data-controller-indirect-cli.md
Once you have run the command, continue on to [Monitoring the creation status](#
#### Configure storage (Azure Stack HCI with AKS-HCI)
-If you are using Azure Stack HCI with AKS-HCI, do one of the following, depending on your Azure stack HCA AKS-HCI version:
--- For version 1.20 and above, create a custom storage class with `fsGroupPolicy:File` (For details - https://kubernetes-csi.github.io/docs/support-fsgroup.html). -- For version 1.19, use:
+If you are using Azure Stack HCI with AKS-HCI, create a custom storage class with `fsType`.
```json fsType: ext4
By default, the kubeadm deployment profile uses a storage class called `local-st
If you want to customize your deployment profile to specify a specific storage class and/or service type, start by creating a new custom deployment profile file based on the kubeadm deployment profile by running the following command. This command will create a directory `custom` in your current working directory and a custom deployment profile file `control.json` in that directory. ```azurecli
-az arcdata dc config init --source azure-arc-kubeadm --path ./custom --k8s-namespace <namespace> --use-k8s
+az arcdata dc config init --source azure-arc-kubeadm --path ./custom
``` You can look up the available storage classes by running the following command.
azure-arc Limitations Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/limitations-managed-instance.md
+
+ Title: Limitations of Azure Arc-enabled SQL Managed Instance
+description: Limitations of Azure Arc-enabled SQL Managed Instance
+ Last updated : 09/07/2021
+# Limitations of Azure Arc-enabled SQL Managed Instance
+
+This article describes limitations of Azure Arc-enabled SQL Managed Instance.
+
+At this time, the business critical service tier is in public preview. The general purpose service tier is generally available.
+
+## Backup and restore
+
+### Automated backups
+
+- User databases with SIMPLE recovery model are not backed up.
+- System database `model` is not backed up in order to prevent interference with the creation and deletion of databases. The database gets locked when admin operations are performed.
+
+### Point-in-time restore (PITR)
+
+- Doesn't support restore from one Azure Arc-enabled SQL Managed Instance to another Azure Arc-enabled SQL Managed Instance. The database can only be restored to the same Arc-enabled SQL Managed Instance where the backups were created.
+- Renaming a database is currently not supported for point-in-time restore purposes.
+- Restoring a TDE-enabled database is currently not supported.
+- A deleted database cannot currently be restored.
+
+## Other limitations
+
+- Transactional replication is currently not supported.
+- Log shipping is currently blocked.
+- Only SQL Server authentication is supported.
+
+## Roles and responsibilities
+
+The roles and responsibilities between Microsoft and its customers differ between Azure PaaS services (Platform As A Service) and Azure hybrid (like Azure Arc-enabled SQL Managed Instance).
+
+### Frequently asked questions
+
+The table below summarizes answers to frequently asked questions regarding support roles and responsibilities.
+
+| Question | Azure Platform As A Service (PaaS) | Azure Arc hybrid services |
+|:-|::|::|
+| Who provides the infrastructure? | Microsoft | Customer |
+| Who provides the software?* | Microsoft | Microsoft |
+| Who does the operations? | Microsoft | Customer |
+| Does Microsoft provide SLAs? | Yes | No |
+| Who's in charge of SLAs? | Microsoft | Customer |
+
+\* Azure services
+
+__Why doesn't Microsoft provide SLAs on Azure Arc hybrid services?__ Because Microsoft does not own the infrastructure and does not operate it. Customers do.
+
+## Next steps
+
+- **Try it out.** Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
+
+- **Create your own.** Follow these steps to deploy on your own Kubernetes cluster:
+ 1. [Install the client tools](install-client-tools.md)
+ 2. [Create the Azure Arc data controller](create-data-controller.md)
+ 3. [Create an Azure Arc-enabled SQL Managed Instance](create-sql-managed-instance.md)
+
+- **Learn**
+ - [Read more about Azure Arc-enabled data services](https://azure.microsoft.com/services/azure-arc/hybrid-data-services)
+ - [Read about Azure Arc](https://aka.ms/azurearc)
azure-arc Reference Az Postgres Arc Server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/reference/reference-az-postgres-arc-server.md
The path to the source json file for the Azure Arc-enabled PostgreSQL Hyperscale
#### `--k8s-namespace -k` The Kubernetes namespace where the Azure Arc-enabled PostgreSQL Hyperscale server group is deployed. If no namespace is specified, then the namespace defined in the kubeconfig will be used. #### `--cores-limit`
-The maximum number of CPU cores for Azure Arc-enabled PostgreSQL Hyperscale server group that can be used per node. Fractional cores are supported. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The maximum number of CPU cores for Azure Arc-enabled PostgreSQL Hyperscale server group that can be used per node. Fractional cores are supported. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--cores-request`
-The minimum number of CPU cores that must be available per node to schedule the service. Fractional cores are supported. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The minimum number of CPU cores that must be available per node to schedule the service. Fractional cores are supported. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--memory-limit`
-The memory limit of the Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The memory limit of the Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--memory-request`
-The memory request of the Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The memory request of the Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--storage-class-data` The storage class to be used for data persistent volumes. #### `--storage-class-logs`
The storage class to be used for logs persistent volumes.
#### `--storage-class-backups` The storage class to be used for backup persistent volumes. #### `--volume-claim-mounts`
-A comma-separated list of volume claim mounts. A volume claim mount is a pair of an existing persistent volume claim (in the same namespace) and volume type (and optional metadata depending on the volume type) separated by colon.The persistent volume will be mounted in each pod for the PostgreSQL server group. The mount path may depend on the volume type.
+A comma-separated list of volume claim mounts. A volume claim mount is a pair of an existing persistent volume claim (in the same namespace) and volume type (and optional metadata depending on the volume type) separated by colon. The persistent volume will be mounted in each pod for the PostgreSQL server group. The mount path may depend on the volume type.
#### `--extensions` A comma-separated list of the Postgres extensions that should be loaded on startup. Please refer to the postgres documentation for supported values. #### `--volume-size-data`
Optional.
#### `--no-wait` If given, the command will not wait for the instance to be in a ready state before returning. #### `--engine-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2'.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2'.
#### `--coordinator-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'coordinator' node role. When node role specific settings are specified, default settings will be ignored and overridden with the settings provided here.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'coordinator' node role. When node role-specific settings are specified, default settings will be ignored and overridden with the settings provided here.
#### `--worker-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'worker' node role. When node role specific settings are specified, default settings will be ignored and overridden with the settings provided here.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'worker' node role. When node role-specific settings are specified, default settings will be ignored and overridden with the settings provided here.
#### `--use-k8s` Use local Kubernetes APIs to perform this action. ### Global Arguments
The path to the source json file for the Azure Arc-enabled PostgreSQL Hyperscale
#### `--workers -w` The number of worker nodes to provision in a server group. In Preview, reducing the number of worker nodes is not supported. Refer to documentation for additional details. #### `--cores-limit`
-The maximum number of CPU cores for Azure Arc-enabled PostgreSQL Hyperscale server group that can be used per node, fractional cores are supported. To remove the cores_limit, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The maximum number of CPU cores for Azure Arc-enabled PostgreSQL Hyperscale server group that can be used per node, fractional cores are supported. To remove the cores_limit, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--cores-request`
-The minimum number of CPU cores that must be available per node to schedule the service, fractional cores are supported. To remove the cores_request, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The minimum number of CPU cores that must be available per node to schedule the service, fractional cores are supported. To remove the cores_request, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--memory-limit`
-The memory limit for Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). To remove the memory_limit, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The memory limit for Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). To remove the memory_limit, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--memory-request`
-The memory request for Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). To remove the memory_request, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format <role>=<value>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
+The memory request for Azure Arc-enabled PostgreSQL Hyperscale server group as a number followed by Ki (kilobytes), Mi (megabytes), or Gi (gigabytes). To remove the memory_request, specify its value as empty string. Optionally a comma-separated list of roles with values can be specified in format \<role\>=\<value\>. Valid roles are: "coordinator" or "c", "worker" or "w". If no roles are specified, settings will apply to all nodes of the PostgreSQL Hyperscale server group.
#### `--extensions` A comma-separated list of the Postgres extensions that should be loaded on startup. Please refer to the postgres documentation for supported values. #### `--port`
Optional.
#### `--no-wait` If given, the command will not wait for the instance to be in a ready state before returning. #### `--engine-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2'. The provided settings will be merged with the existing settings. To remove a setting, provide an empty value like 'removedKey='. If you change an engine setting that requires a restart, the service will be restarted to apply the settings immediately.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2'. The provided settings will be merged with the existing settings. To remove a setting, provide an empty value like 'removedKey='. If you change an engine setting that requires a restart, the service will be restarted to apply the settings immediately.
#### `--replace-settings` When specified with --engine-settings, will replace all existing custom engine settings with new set of settings and values. #### `--coordinator-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'coordinator' node role. When node role specific settings are specified, default settings will be ignored and overridden with the settings provided here.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'coordinator' node role. When node role-specific settings are specified, default settings will be ignored and overridden with the settings provided here.
#### `--worker-settings`
-A comma separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'worker' node role. When node role specific settings are specified, default settings will be ignored and overridden with the settings provided here.
+A comma-separated list of Postgres engine settings in the format 'key1=val1, key2=val2' to be applied to 'worker' node role. When node role-specific settings are specified, default settings will be ignored and overridden with the settings provided here.
#### `--admin-password` If given, the Azure Arc-enabled PostgreSQL Hyperscale server group's admin password will be set to the value of the AZDATA_PASSWORD environment variable if present and a prompted value otherwise. #### `--use-k8s`
azure-arc Quickstart Connect Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/quickstart-connect-cluster.md
Previously updated : 06/30/2021 Last updated : 09/09/2021 keywords: "Kubernetes, Arc, Azure, cluster"
If your cluster is behind an outbound proxy server, Azure CLI and the Azure Arc-
``` > [!NOTE]
- > * Some network requests such as the ones involving in-cluster service-to-service communication need to be separated from the traffic that is routed via the proxy server for outbound communication. The `--proxy-skip-range` parameter can be used to specify the CIDR range and endpoints in a comma-separated way so that any communication from the agents to these endpoints do not go via the outbound proxy. At a minimum, the CIDR range of the services in the cluster should be specified as value for this parameter. For example, let's say `kubectl get svc -A` returns a list of services where all the services have ClusterIP values in the range `10.0.0.0/16`. Then the value to specify for `--proxy-skip-range` is '10.0.0.0/16,kubernetes.default.svc'.
+ > * Some network requests such as the ones involving in-cluster service-to-service communication need to be separated from the traffic that is routed via the proxy server for outbound communication. The `--proxy-skip-range` parameter can be used to specify the CIDR range and endpoints in a comma-separated way so that any communication from the agents to these endpoints do not go via the outbound proxy. At a minimum, the CIDR range of the services in the cluster should be specified as value for this parameter. For example, let's say `kubectl get svc -A` returns a list of services where all the services have ClusterIP values in the range `10.0.0.0/16`. Then the value to specify for `--proxy-skip-range` is `10.0.0.0/16,kubernetes.default.svc,.svc.cluster.local,.svc`.
> * `--proxy-http`, `--proxy-https`, and `--proxy-skip-range` are expected for most outbound proxy environments. `--proxy-cert` is *only* required if you need to inject trusted certificates expected by proxy into the trusted certificate store of agent pods. ### [Azure PowerShell](#tab/azure-powershell)
azure-cache-for-redis Cache Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-configure.md
When using the Redis Console with a premium clustered cache, you can issue comma
If you attempt to access a key that is stored in a different shard than the connected shard, you receive an error message similar to the following message:
-```
+```console
shard1>get myKey
(error) MOVED 866 13.90.202.154:13000 (shard 0)
```
For information on moving resources from one resource group to another, and from
## Next steps
-* For more information on working with Redis commands, see [How can I run Redis commands?](cache-development-faq.yml#how-can-i-run-redis-commands-)
+* For more information on working with Redis commands, see [How can I run Redis commands?](cache-development-faq.yml#how-can-i-run-redis-commands-)
azure-cache-for-redis Cache How To Multi Replicas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-multi-replicas.md
Last updated 08/11/2020
# Add replicas to Azure Cache for Redis
-In this article, you'll learn how to set up an Azure Cache instance with additional replicas using the Azure portal.
+
+In this article, you'll learn how to set up an Azure Cache for Redis instance with additional replicas using the Azure portal.
Azure Cache for Redis Standard and Premium tiers offer redundancy by hosting each cache on two dedicated virtual machines (VMs). These VMs are configured as primary and replica. When the primary VM becomes unavailable, the replica detects that and takes over as the new primary automatically. You can now increase the number of replicas in a Premium cache up to three, giving you a total of four VMs backing a cache. Having multiple replicas results in higher resilience than what a single replica can provide.

## Prerequisites
+
* Azure subscription - [create one for free](https://azure.microsoft.com/free/)

## Create a cache
+
To create a cache, follow these steps:

1. Sign in to the [Azure portal](https://portal.azure.com) and select **Create a resource**.
To create a cache, follow these steps:
1. On the **New** page, select **Databases** and then select **Azure Cache for Redis**. :::image type="content" source="media/cache-create/new-cache-menu.png" alt-text="Select Azure Cache for Redis.":::
-
+ 1. On the **Basics** page, configure the settings for your new cache.
-
| Setting | Suggested value | Description |
| | - | -- |
- | **Subscription** | Select your subscription. | The subscription under which to create this new Azure Cache for Redis instance. |
- | **Resource group** | Select a resource group, or select **Create new** and enter a new resource group name. | Name for the resource group in which to create your cache and other resources. By putting all your app resources in one resource group, you can easily manage or delete them together. |
- | **DNS name** | Enter a globally unique name. | The cache name must be a string between 1 and 63 characters that contains only numbers, letters, or hyphens. The name must start and end with a number or letter, and can't contain consecutive hyphens. Your cache instance's *host name* will be *\<DNS name>.redis.cache.windows.net*. |
+ | **Subscription** | Select your subscription. | The subscription under which to create this new Azure Cache for Redis instance. |
+ | **Resource group** | Select a resource group, or select **Create new** and enter a new resource group name. | Name for the resource group in which to create your cache and other resources. By putting all your app resources in one resource group, you can easily manage or delete them together. |
+ | **DNS name** | Enter a globally unique name. | The cache name must be a string between 1 and 63 characters that contains only numbers, letters, or hyphens. The name must start and end with a number or letter, and can't contain consecutive hyphens. Your cache instance's *host name* will be *\<DNS name>.redis.cache.windows.net*. |
| **Location** | Select a location. | Select a [region](https://azure.microsoft.com/regions/) near other services that will use your cache. |
| **Cache type** | Select a [Premium tier](https://azure.microsoft.com/pricing/details/cache/) cache. | The pricing tier determines the size, performance, and features that are available for the cache. For more information, see [Azure Cache for Redis Overview](cache-overview.md). |
-
+ 1. On the **Advanced** page, choose **Replica count**.
-
+ :::image type="content" source="media/cache-how-to-multi-replicas/create-multi-replicas.png" alt-text="Replica count.":::
-
> [!NOTE]
> Currently, you can't use Append-only File (AOF) persistence or geo-replication with multiple replicas (more than one replica).
>
-1. Leave other options in their default settings.
+1. Leave other options in their default settings.
1. Select **Create**.
-
+ It takes a while for the cache to create. You can monitor progress on the Azure Cache for Redis **Overview** page. When **Status** shows as **Running**, the cache is ready to use. > [!NOTE]
To create a cache, follow these steps:
>

## Next Steps
+
Learn more about Azure Cache for Redis features.
azure-cache-for-redis Cache Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-migration-guide.md
Last updated 07/22/2020
# Migrate to Azure Cache for Redis
+
This article describes a number of approaches to migrate an existing Redis cache running on-premises or in another cloud service to Azure Cache for Redis.

## Migration scenarios
+
Open-source Redis can run in many compute environments. Common examples include:

- **On-premises** - Redis caches running in private datacenters.
azure-functions Functions Bindings Rabbitmq Output https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-rabbitmq-output.md
When working with C# Script functions:
# [JavaScript](#tab/javascript)
-The queue message is available via context.bindings.<NAME> where <NAME> matches the name defined in function.json. If the payload is JSON, the value is deserialized into an object.
+The queue message is available via context.bindings.\<NAME\> where \<NAME\> matches the name defined in function.json. If the payload is JSON, the value is deserialized into an object.
# [Python](#tab/python)
azure-functions Deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/start-stop-vms/deploy.md
To simplify management and removal, we recommend you deploy Start/Stop VMs v2 (p
> [!NOTE] > Currently this preview does not support specifying an existing Storage account or Application Insights resource. +
+> [!NOTE]
+> The naming format for the function app and storage account has changed. To guarantee global uniqueness, a random and unique string is now appended to the names of these resources.
+
1. Open your browser and navigate to the Start/Stop VMs v2 [GitHub organization](https://github.com/microsoft/startstopv2-deployments/blob/main/README.md).
1. Select the deployment option based on the Azure cloud environment your Azure VMs are created in. This will open the custom Azure Resource Manager deployment page in the Azure portal.
1. If prompted, sign in to the [Azure portal](https://portal.azure.com).
To simplify management and removal, we recommend you deploy Start/Stop VMs v2 (p
:::image type="content" source="media/deploy/deployment-results-resource-list.png" alt-text="Start/Stop VMs template deployment resource list":::
-> [!NOTE]
-> The naming format for the function app and storage account has changed. To guarantee global uniqueness, a random and unique string is now appended to the names of these resource.
- > [!NOTE] > We are collecting operation and heartbeat telemetry to better assist you if you reach the support team for any troubleshooting. We are also collecting virtual machine event history to verify when the service acted on a virtual machine and how long a virtual machine was snoozed in order to determine the efficacy of the service.
For each scenario, you can target the action against one or more subscriptions,
"/subscriptions/11111111-0000-1111-2222-444444444444/resourceGroups/rg2/providers/Microsoft.ClassicCompute/virtualMachines/vm30" ]
+ }
} ```
+1. In the overview pane for the logic app, select **Enable**.
+
## Sequenced start and stop scenario

In an environment that includes two or more components on multiple Azure Resource Manager VMs in a distributed application architecture, it's important to control the sequence in which components are started and stopped.
azure-monitor Alerts Metric https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/alerts-metric.md
description: Learn how to use Azure portal or CLI to create, view, and manage me
Previously updated : 08/02/2021 Last updated : 09/09/2021 # Create, view, and manage metric alerts using Azure Monitor
The previous sections described how to create, view, and manage metric alert rul
6. You can disable a metric alert rule using the following command. ```azurecli
- az monitor metrics alert update -g {ResourceGroup} -n {AlertRuleName} --disabled false
+ az monitor metrics alert update -g {ResourceGroup} -n {AlertRuleName} --enabled false
``` 7. You can delete a metric alert rule using the following command.
azure-monitor Snapshot Collector Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/snapshot-collector-release-notes.md
For bug reports and feedback, open an issue on GitHub at https://github.com/micr
## Release notes
+## [1.4.0](https://www.nuget.org/packages/Microsoft.ApplicationInsights.SnapshotCollector/1.4.0)
+A point release that includes multiple improvements and adds support for Azure Active Directory (AAD) authentication for Application Insights ingestion.
+### Changes
+- Add back MinidumpWithThreadInfo when writing dumps.
+- Add CompatibilityVersion to improve synchronization between Snapshot Collector agent and uploader on breaking changes.
+- Remove the DebugQueryBufferSize detour.
+- Target netstandard2.0 only in Snapshot Collector.
+- Update to latest Azure SDK packages.
+- Change SnapshotUploader LogFile naming algorithm to avoid excessive file I/O in Antares.
+- Add pid, role name, and process start time to blob metadata.
+- Use System.Diagnostics.Process where possible in Snapshot Collector and Snapshot Uploader.
+- Snapshot Collector package size reduced by 60%, from 10.34 MB to 4.11 MB.
+- Update to latest Application Insights 2.18.0.
+### New features
+- Add Azure Active Directory authentication to SnapshotCollector. Learn more about Azure AD authentication in Application Insights [here](./azure-ad-authentication.md).
+### Bug fixes
+- Fix [ObjectDisposedException on shutdown](https://github.com/microsoft/ApplicationInsights-dotnet/issues/2097).
+ ## [1.3.7.5](https://www.nuget.org/packages/Microsoft.ApplicationInsights.SnapshotCollector/1.3.7.5) A point release to backport a fix from 1.4.0-pre. ### Bug fixes
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-data-export.md
Currently, there are no additional charges for the data export feature. Pricing
### Storage account Data is sent to storage accounts as it reaches Azure Monitor and stored in hourly append blobs. The data export configuration creates a container for each table in the storage account with the name *am-* followed by the name of the table. For example, the table *SecurityEvent* would sent to a container named *am-SecurityEvent*.
-The storage account blob path is *WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups/\<resource-group\>/providers/microsoft.operationalinsights/workspaces/\<workspace\>/y=\<four-digit numeric year\>/m=\<two-digit numeric month\>/d=\<two-digit numeric day\>/h=\<two-digit 24-hour clock hour\>/m=00/PT1H.json*. Since append blobs are limited to 50K writes in storage, the number of exported blobs may extend if the number of appends is high. The naming pattern for blobs in such a case would be PT1H_#.json, where # is the incremental blob count.
+Data is sent to storage accounts as it reaches Azure Monitor and stored in append blobs. The data export configuration creates a container for each table in the storage account, with the prefix *am-* followed by the name of the table. For example, *SecurityEvent* is exported to a container named *am-SecurityEvent*.
+
+The storage account blob path currently uses hourly granularity. Starting 15 October 2021, the blob path uses 5-minute granularity: *WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups/\<resource-group\>/providers/microsoft.operationalinsights/workspaces/\<workspace\>/y=\<four-digit numeric year\>/m=\<two-digit numeric month\>/d=\<two-digit numeric day\>/h=\<two-digit 24-hour clock hour\>/m=\<two-digit 60-minute clock minute\>/PT05M.json*. Since append blobs are limited to 50K writes in storage, the number of exported blobs may increase if the number of appends is high. The naming pattern for blobs in that case is *PT05M_#.json*, where # is the incremental blob count.
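As a hedged illustration of the path layout described above, the following C# sketch composes the expected blob path for a given workspace resource ID and UTC timestamp; the helper name and the 5-minute rounding are assumptions made for this example, not part of the export feature itself.

```csharp
// Hypothetical helper (illustrative only) that composes the 5-minute export blob path
// described above for a given workspace resource ID and UTC timestamp.
using System;

class ExportBlobPath
{
    static string Build(string workspaceResourceId, DateTime utc)
    {
        int minute = (utc.Minute / 5) * 5;   // assume paths align to 5-minute boundaries
        return $"WorkspaceResourceId={workspaceResourceId}/" +
               $"y={utc:yyyy}/m={utc:MM}/d={utc:dd}/h={utc:HH}/m={minute:D2}/PT05M.json";
    }

    static void Main()
    {
        string path = Build(
            "/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>",
            DateTime.UtcNow);
        Console.WriteLine(path);
    }
}
```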
The storage account data format is [JSON lines](../essentials/resource-logs-blob-format.md). This means each record is delimited by a newline, with no outer records array and no commas between JSON records.
azure-monitor Logs Dedicated Clusters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-dedicated-clusters.md
Title: Azure Monitor Logs Dedicated Clusters
-description: Customers who ingest more than 1 TB a day of monitoring data may use dedicated rather than shared clusters
+description: Customers meeting the minimum commitment tier could use dedicated clusters
Azure Monitor Logs Dedicated Clusters are a deployment option that enables advanced capabilities for Azure Monitor Logs customers. Customers can select which of their Log Analytics workspaces should be hosted on dedicated clusters.
-Dedicated clusters require customers to commit using a capacity of at least 1 TB of data ingestion per day. You can migrate an existing workspace to a dedicated cluster with no data loss or service interruption.
+Dedicated clusters require customers to commit to at least 500 GB of data ingestion per day. You can migrate an existing workspace to a dedicated cluster with no data loss or service interruption.
-The capabilities that require dedicated clusters:
+Capabilities that require dedicated clusters:
- **[Customer-managed Keys](../logs/customer-managed-keys.md)** - Encrypt the cluster data using keys that are provided and controlled by the customer. - **[Lockbox](../logs/customer-managed-keys.md#customer-lockbox-preview)** - Control Microsoft support engineers access requests to your data.
After you create your cluster resource, you can edit additional properties such
You can have up to 2 active clusters per subscription per region. If the cluster is deleted, it is still reserved for 14 days. You can have up to 4 reserved clusters per subscription per region (active or recently deleted).
-> [!INFORMATION]
+> [!NOTE]
> Cluster creation triggers resource allocation and provisioning. This operation can take a few hours to complete.
> A dedicated cluster is billed once provisioned regardless of data ingestion, so it's recommended to prepare the deployment to expedite provisioning and the linking of workspaces to the cluster. Verify the following:
> - A list of initial workspaces to be linked to the cluster is identified
azure-netapp-files Create Active Directory Connections https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/create-active-directory-connections.md
na ms.devlang: na Previously updated : 09/07/2021 Last updated : 09/09/2021 # Create and manage Active Directory connections for Azure NetApp Files
Additional AADDS considerations apply for Azure NetApp Files:
* Azure NetApp Files supports `user` and `resource forest` types. * For synchronization type, you can select `All` or `Scoped`. If you select `Scoped`, ensure the correct Azure AD group is selected for accessing SMB shares. If you are uncertain, you can use the `All` synchronization type.
+* If you use AADDS with a dual-protocol volume, you must be in a custom OU in order to apply POSIX attributes. See [Manage LDAP POSIX Attributes](create-volumes-dual-protocol.md#manage-ldap-posix-attributes) for details.
When you create an Active Directory connection, note the following specifics for AADDS:
This setting is configured in the **Active Directory Connections** under **NetAp
You can specify users or groups that will be given administrator privileges on the volume. ![Screenshot that shows the Administrators box of Active Directory connections window.](../media/azure-netapp-files/active-directory-administrators.png)
+
+ The **Administrators** feature is currently in preview. If this is your first time using this feature, register the feature before using it:
+
+ ```azurepowershell-interactive
+ Register-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFAdAdministrators
+ ```
+
+ Check the status of the feature registration:
+
+ > [!NOTE]
+ > The **RegistrationState** may be in the `Registering` state for up to 60 minutes before changing to `Registered`. Wait until the status is `Registered` before continuing.
+
+ ```azurepowershell-interactive
+ Get-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFAdAdministrators
+ ```
+
+ You can also use [Azure CLI commands](/cli/azure/feature) `az feature register` and `az feature show` to register the feature and display the registration status.
* Credentials, including your **username** and **password**
azure-netapp-files Snapshots Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/snapshots-introduction.md
The following diagram shows a volume reverting to an earlier snapshot:
![Diagram that shows a volume reverting to an earlier snapshot](../media/azure-netapp-files/snapshot-volume-revert.png) > [!IMPORTANT]
-> Active filesystem data that was written and snapshots that were taken after the selected snapshot was taken will be lost. The snapshot revert operation will replace all the data in the targeted volume with the data in the selected snapshot. You should pay attention to the snapshot contents and creation date when you select a snapshot. You cannot undo the snapshot revert operation.
+> Active filesystem data that was written and snapshots that were taken after the selected snapshot will be lost. The snapshot revert operation will replace all the data in the targeted volume with the data in the selected snapshot. You should pay attention to the snapshot contents and creation date when you select a snapshot. You cannot undo the snapshot revert operation.
See [Revert a volume using snapshot revert](azure-netapp-files-manage-snapshots.md#revert-a-volume-using-snapshot-revert) about how to use this feature.
azure-netapp-files Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/whats-new.md
na ms.devlang: na Previously updated : 09/07/2021 Last updated : 09/08/2021
Azure NetApp Files is updated regularly. This article provides a summary about t
## September 2021
-* [**Administrators**](create-active-directory-connections.md#create-an-active-directory-connection) option in Active Directory connections
+* [**Administrators**](create-active-directory-connections.md#create-an-active-directory-connection) option in Active Directory connections (Preview)
The Active Directory connections page now includes an **Administrators** field. You can specify users or groups that will be given administrator privileges on the volume.
azure-percept Troubleshoot Dev Kit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/troubleshoot-dev-kit.md
In this section, you'll get guidance on which logs to collect and how to collect
|*Module container logs* - records details about specific IoT Edge module containers|Collect when you find issues with a module|```sudo iotedge logs [container name]```|
|*Network logs* - a set of logs covering Wi-Fi services and the network stack.|Collect when you find Wi-Fi or network issues.|```sudo journalctl -u hostapd.service -u wpa_supplicant.service -u ztpd.service -u systemd-networkd > network_log.txt```<br><br>```cat /etc/os-release && cat /etc/os-subrelease && cat /etc/adu-version && rpm -q ztpd > system_ver.txt```<br><br>Run both commands. Each command collects multiple logs and puts them into a single output.|
+> [!WARNING]
+> Output from the `support-bundle` command can contain host, device, and module names, information logged by your modules, and so on. Be aware of this before sharing the output in a public forum.
+ ## Troubleshooting commands Here's a set of commands that can be used for troubleshooting issues you may find with the dev kit. To run these commands, you must first connect to your dev kit [over SSH](./how-to-ssh-into-percept-dk.md).
There are three small LEDs on top of the carrier board housing. A cloud icon is
|LED 2 (Wi-Fi) |Slow blink |Device is ready to be configured by Wi-Fi Easy Connect and is announcing its presence to a configurator. | |LED 2 (Wi-Fi) |Fast blink |Authentication was successful, device association in progress. | |LED 2 (Wi-Fi) |On (solid) |Authentication and association were successful; device is connected to a Wi-Fi network. |
-|LED 3 |NA |LED not in use. |
+|LED 3 |NA |LED not in use. |
azure-resource-manager Add Template To Azure Pipelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/add-template-to-azure-pipelines.md
An Azure CLI task takes the following inputs:
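As a minimal sketch (the service connection, resource group, and file names are assumptions), such a task might look like this:

```yaml
steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'demo-service-connection' # assumed service connection name
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # Deploy the Bicep file to an existing resource group (names are illustrative).
      az deployment group create \
        --resource-group demo-rg \
        --template-file main.bicep
```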
## Next steps
+* To learn more about using Bicep with Azure Pipelines, and for hands-on guidance, see [Build your first Bicep deployment pipeline by using Azure Pipelines](/learn/modules/build-first-bicep-deployment-pipeline-using-azure-pipelines/) on **Microsoft Learn**.
* To use the what-if operation in a pipeline, see [Test ARM templates with What-If in a pipeline](https://4bes.nl/2021/03/06/test-arm-templates-with-what-if/). * To learn about using Bicep file with GitHub Actions, see [Deploy Bicep files by using GitHub Actions](./deploy-github-actions.md).
azure-resource-manager Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/best-practices.md
Last updated 06/01/2021
This article recommends practices to follow when developing your Bicep files. These practices make your Bicep file easier to understand and use.
+### Microsoft Learn
+
+To learn more about Bicep best practices, and for hands-on guidance, see [Structure your Bicep code for collaboration](/learn/modules/structure-bicep-code-collaboration/) on **Microsoft Learn**.
+ ## Parameters * Use good naming for parameter declarations. Good names make your templates easy to read and understand. Make sure you're using clear, descriptive names, and be consistent in your naming.
azure-resource-manager Bicep Functions Files https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-files.md
+
+ Title: Bicep functions - files
+description: Describes the functions to use in a Bicep file to load content from a file.
+ Last updated : 09/09/2021++
+# File functions for Bicep
+
+This article describes the Bicep functions for loading content from external files.
+
+## loadFileAsBase64
+
+`loadFileAsBase64(filePath)`
+
+Loads the content of the specified file as a base64 string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| filePath | Yes | string | The path to the file to load. The path is relative to the deployed Bicep file. |
+
+### Remarks
+
+Use this function when you have base64 content that is stored in a separate file. Rather than duplicating the content into your Bicep file, load the content with this function. The file is loaded when the Bicep file is compiled to a JSON template. During deployment, the JSON template contains the contents of the file as a hard-coded string.
+
+This function requires **Bicep version 0.4.412 or later**.
+
+The maximum allowed size of the file is **96 KB**.
+
+### Return value
+
+The contents of the file as a base64 string.
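
A minimal sketch of using the function, assuming a file named `exampleCert.pfx` stored alongside the Bicep file (a hypothetical name):

```bicep
// Hypothetical file name; the path is relative to this Bicep file.
var certificateBase64 = loadFileAsBase64('exampleCert.pfx')

output certificateLength int = length(certificateBase64)
```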
+
+## loadTextContent
+
+`loadTextContent(filePath, [encoding])`
+
+Loads the content of the specified file as a string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| filePath | Yes | string | The path to the file to load. The path is relative to the deployed Bicep file. |
+| encoding | No | string | The file encoding. The default value is `utf-8`. The available options are: `iso-8859-1`, `us-ascii`, `utf-16`, `utf-16BE`, or `utf-8`. |
+
+### Remarks
+
+Use this function when you have content that is stored in a separate file. Rather than duplicating the content in your Bicep file, load the content with this function. For example, you can load a deployment script from a file. The file is loaded when the Bicep file is compiled to the JSON template. During deployment, the JSON template contains the contents of the file as a hard-coded string.
+
+When loading a JSON file, you can use the [json](bicep-functions-object.md#json) function with the loadTextContent function to create a JSON object. In VS Code, the properties of the loaded object are available through IntelliSense. For example, you can create a file with values to share across many Bicep files. An example is shown in this article.
+
+This function requires **Bicep version 0.4.412 or later**.
+
+The maximum allowed size of the file is **131,072 characters**, including line endings.
+
+### Return value
+
+The contents of the file as a string.
+
+### Examples
+
+The following example loads a script from a file and uses it for a deployment script.
++
+In the next example, you create a JSON file that contains values you want to use for a network security group.
++
+You load that file and convert it to a JSON object. You use the object to assign values to the resource.
++
+You can reuse the file of values in other Bicep files that deploy a network security group.
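
A minimal sketch of that pattern, assuming a sibling file named `shared-rules.json` (a hypothetical name) that contains a `securityRules` array:

```bicep
// Load the shared JSON file at compile time and convert it to an object.
var rules = json(loadTextContent('shared-rules.json'))

resource nsg 'Microsoft.Network/networkSecurityGroups@2021-02-01' = {
  name: 'example-nsg'
  location: resourceGroup().location
  properties: {
    securityRules: rules.securityRules
  }
}
```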
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions.md
Title: Bicep functions description: Describes the functions to use in a Bicep file to retrieve values, work with strings and numerics, and retrieve deployment information. Previously updated : 06/16/2021 Last updated : 09/09/2021 # Bicep functions
The following functions are available for getting values related to the deployme
* [deployment](./bicep-functions-deployment.md#deployment) * [environment](./bicep-functions-deployment.md#environment)
+## File functions
+
+The following functions are available for loading the content from external files into your Bicep file.
+
+* [loadFileAsBase64](bicep-functions-files.md#loadfileasbase64)
+* [loadTextContent](bicep-functions-files.md#loadtextcontent)
+ ## Logical functions The following function is available for working with logical conditions:
azure-resource-manager Child Resource Name Type https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/child-resource-name-type.md
Each parent resource accepts only certain resource types as child resources. The
In Bicep, you can specify the child resource either within the parent resource or outside of the parent resource. The values you provide for the resource name and resource type vary based on whether the child resource is defined inside or outside of the parent resource.
+### Microsoft Learn
+
+To learn more about child resources, and for hands-on guidance, see [Deploy child and extension resources by using Bicep](/learn/modules/child-extension-bicep-templates) on **Microsoft Learn**.
+ ## Within parent resource The following example shows the child resource included within the resources property of the parent resource.
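
A minimal sketch of the pattern (names are illustrative): a file service declared inside its parent storage account.

```bicep
resource storage 'Microsoft.Storage/storageAccounts@2021-02-01' = {
  name: 'store${uniqueString(resourceGroup().id)}'
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }

  // Child resource declared within the parent. The full resource type and
  // API version are inferred from the parent.
  resource fileService 'fileServices' = {
    name: 'default'
  }
}
```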
azure-resource-manager Conditional Resource Deployment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/conditional-resource-deployment.md
Sometimes you need to optionally deploy a resource or module in Bicep. Use the `
> [!NOTE] > Conditional deployment doesn't cascade to [child resources](child-resource-name-type.md). If you want to conditionally deploy a resource and its child resources, you must apply the same condition to each resource type.
+### Microsoft Learn
+
+To learn more about conditions, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Deploy condition You can pass in a parameter value that indicates whether a resource is deployed. The following example conditionally deploys a DNS zone.
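
A minimal sketch of that pattern (the zone name is illustrative):

```bicep
param deployZone bool = false

// The DNS zone is deployed only when deployZone is true.
resource dnsZone 'Microsoft.Network/dnszones@2018-05-01' = if (deployZone) {
  name: 'demo.example.com'
  location: 'global'
}
```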
azure-resource-manager Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-cli.md
The deployment can take a few minutes to complete. When it finishes, you see a m
## Deploy remote Bicep file
-Currently, Azure CLI doesn't support deploying remote Bicep files. Use [Bicep CLI](./install.md#development-environment) to compile the Bicep file to a JSON template, and then load the JSON file to the remote location.
+Currently, Azure CLI doesn't support deploying remote Bicep files. Use [Bicep CLI](./install.md#vs-code-and-bicep-extension) to compile the Bicep file to a JSON template, and then load the JSON file to the remote location.
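
A minimal sketch of that workflow (file names and the storage URL are illustrative):

```azurecli-interactive
# Compile the Bicep file to a JSON template with the Bicep CLI.
bicep build main.bicep

# After uploading main.json to a reachable remote location, deploy it by URI.
az deployment group create \
  --resource-group demo-rg \
  --template-uri "https://examplestorage.blob.core.windows.net/templates/main.json"
```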
## Parameters
azure-resource-manager Deploy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-powershell.md
The deployment can take several minutes to complete.
## Deploy remote Bicep file
-Currently, Azure PowerShell doesn't support deploying remote Bicep files. Use [Bicep CLI](./install.md#development-environment) to compile the Bicep file to a JSON template, and then load the JSON file to the remote location.
+Currently, Azure PowerShell doesn't support deploying remote Bicep files. Use [Bicep CLI](./install.md#vs-code-and-bicep-extension) to compile the Bicep file to a JSON template, and then load the JSON file to the remote location.
## Parameters
azure-resource-manager Deploy To Management Group https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-to-management-group.md
This article describes how to set scope with Bicep when deploying to a managemen
As your organization matures, you can deploy a Bicep file to create resources at the management group level. For example, you may need to define and assign [policies](../../governance/policy/overview.md) or [Azure role-based access control (Azure RBAC)](../../role-based-access-control/overview.md) for a management group. With management group level templates, you can declaratively apply policies and assign roles at the management group level.
+### Microsoft Learn
+
+To learn more about deployment scopes, and for hands-on guidance, see [Deploy resources to subscriptions, management groups, and tenants by using Bicep](/learn/modules/deploy-resources-scopes-bicep/) on **Microsoft Learn**.
+ ## Supported resources Not all resource types can be deployed to the management group level. This section lists which resource types are supported.
azure-resource-manager Deploy To Subscription https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-to-subscription.md
To simplify the management of resources, you can deploy resources at the level o
> [!NOTE] > You can deploy to 800 different resource groups in a subscription level deployment.
+### Microsoft Learn
+
+To learn more about deployment scopes, and for hands-on guidance, see [Deploy resources to subscriptions, management groups, and tenants by using Bicep](/learn/modules/deploy-resources-scopes-bicep/) on **Microsoft Learn**.
+ ## Supported resources Not all resource types can be deployed to the subscription level. This section lists which resource types are supported.
azure-resource-manager Deploy To Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-to-tenant.md
Last updated 07/19/2021
As your organization matures, you may need to define and assign [policies](../../governance/policy/overview.md) or [Azure role-based access control (Azure RBAC)](../../role-based-access-control/overview.md) across your Azure AD tenant. With tenant level templates, you can declaratively apply policies and assign roles at a global level.
+### Microsoft Learn
+
+To learn more about deployment scopes, and for hands-on guidance, see [Deploy resources to subscriptions, management groups, and tenants by using Bicep](/learn/modules/deploy-resources-scopes-bicep/) on **Microsoft Learn**.
+ ## Supported resources Not all resource types can be deployed to the tenant level. This section lists which resource types are supported.
azure-resource-manager Deploy What If https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-what-if.md
Before deploying a Bicep file, you can preview the changes that will happen. Azu
You can use the what-if operation with Azure PowerShell, Azure CLI, or REST API operations. What-if is supported for resource group, subscription, management group, and tenant level deployments.
+### Microsoft Learn
+
+To learn more about the what-if operation, and for hands-on guidance, see [Preview Azure deployment changes by using what-if](/learn/modules/arm-template-whatif/) on **Microsoft Learn**.
+ ## Install Azure PowerShell module To use what-if in PowerShell, you must have version **4.2 or later of the Az module**.
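
After the module is installed, a minimal sketch of previewing a deployment (the resource group and file names are illustrative):

```azurepowershell-interactive
New-AzResourceGroupDeployment `
  -ResourceGroupName demo-rg `
  -TemplateFile main.bicep `
  -WhatIf
```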
azure-resource-manager Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/install.md
Title: Set up Bicep development and deployment environments description: How to configure Bicep development and deployment environments Previously updated : 08/26/2021 Last updated : 09/09/2021
Let's make sure your environment is set up for developing and deploying Bicep files.
-## Development environment
+## VS Code and Bicep extension
To create Bicep files, you need a good Bicep editor. We recommend:
To verify you've installed the extension, open any file with the `.bicep` file e
:::image type="content" source="./media/install/language-mode.png" alt-text="Bicep language mode":::
-## Deployment environment
+After setting up your development environment, install the latest version of Azure CLI. Those steps are shown in the next sections.
-The easiest way to get the commands you need to deploy a Bicep file is to install the latest version of Azure CLI. You can also use PowerShell, but it requires an extra installation.
--- [Azure CLI](#azure-cli)-- [Azure PowerShell](#azure-powershell)-- [Install manually](#install-manually)-
-### Azure CLI
+## Azure CLI
You must have Azure CLI version 2.20.0 or later installed. To install or update Azure CLI, see:
For more commands, see [Bicep CLI](bicep-cli.md).
> [!IMPORTANT] > Azure CLI installs a self-contained instance of the Bicep CLI. This instance doesn't conflict with any versions you may have manually installed. Azure CLI doesn't add Bicep CLI to your PATH.
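
For example, these commands manage the Bicep instance that ships with Azure CLI:

```azurecli-interactive
# Install or upgrade the self-contained Bicep CLI used by Azure CLI.
az bicep install
az bicep upgrade

# Show the Bicep version that Azure CLI uses.
az bicep version
```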
-#### Install on an air-gapped cloud
-
-To install Bicep CLI in an air-gapped environment, you need to download the Bicep CLI executable manually and save it to a certain location.
--- **Linux**-
- 1. Download **bicep-linux-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
- 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
--- **macOS**-
- 1. Download **bicep-osx-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
- 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
--- **Windows**-
- 1. Download **bicep-win-x64.exe** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
- 1. Copy the executable to the **%UserProfile%/.azure/bin** directory on an air-gapped machine.
-
-Note `bicep install` and `bicep upgrade` commands don't not work in an air-gapped environment.
-
-### Azure PowerShell
+## Azure PowerShell
You must have Azure PowerShell version 5.6.0 or later installed. To update or install, see [Install Azure PowerShell](/powershell/azure/install-az-ps).
To deploy Bicep files, use Bicep CLI version 0.3.1 or later. To check your Bicep
bicep --version ```
-### Install manually
+## Install manually
The following methods install the Bicep CLI and add it to your PATH. You must manually install for any use other than Azure CLI.
The following methods install the Bicep CLI and add it to your PATH. You must ma
- [macOS](#macos) - [Windows](#windows)
-#### Linux
+### Linux
```sh # Fetch the latest Bicep CLI binary
bicep --help
> [!NOTE] > For lightweight Linux distributions like [Alpine](https://alpinelinux.org/), use **bicep-linux-musl-x64** instead of **bicep-linux-x64** in the preceding script.
-#### macOS
+### macOS
-##### via homebrew
+#### via homebrew
```sh # Add the tap for bicep
brew tap azure/bicep
brew install bicep ```
-##### via BASH
+#### via BASH
```sh # Fetch the latest Bicep CLI binary
bicep --help
```
-#### Windows
+### Windows
-##### Windows Installer
+#### Windows Installer
Download and run the [latest Windows installer](https://github.com/Azure/bicep/releases/latest/download/bicep-setup-win-x64.exe). The installer doesn't require administrative privileges. After the installation, Bicep CLI is added to your user PATH. Close and reopen any open command shell windows for the PATH change to take effect.
-##### Chocolatey
+#### Chocolatey
```powershell choco install bicep ```
-##### Winget
+#### Winget
```powershell winget install -e --id Microsoft.Bicep ```
-##### Manual with PowerShell
+#### Manual with PowerShell
```powershell # Create the install folder
bicep --help
# Done! ```
-### Install the nightly builds
+## Install on air-gapped cloud
+
+To install Bicep CLI in an air-gapped environment, you need to download the Bicep CLI executable manually and save it to a certain location.
+
+- **Linux**
+
+ 1. Download **bicep-linux-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
+
+- **macOS**
+
+ 1. Download **bicep-osx-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
+
+- **Windows**
+
+ 1. Download **bicep-win-x64.exe** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **%UserProfile%/.azure/bin** directory on an air-gapped machine.
+
+Note that the `bicep install` and `bicep upgrade` commands don't work in an air-gapped environment.
+
+## Install the nightly builds
If you'd like to try the latest pre-release bits of Bicep before they're released, see [Install nightly builds](https://github.com/Azure/bicep/blob/main/docs/installing-nightly.md).
azure-resource-manager Loop Modules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-modules.md
This article shows you how to deploy more than one instance of a [module](module
You can also use a loop with [resources](loop-resources.md), [properties](loop-properties.md), [variables](loop-variables.md), and [outputs](loop-outputs.md).
+### Microsoft Learn
+
+To learn more about loops, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Syntax Loops can be used to declare multiple modules by:
azure-resource-manager Loop Outputs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-outputs.md
This article shows you how to create more than one value for an output in your B
You can also use a loop with [modules](loop-modules.md), [resources](loop-resources.md), [properties in a resource](loop-properties.md), and [variables](loop-variables.md).
+### Microsoft Learn
+
+To learn more about loops, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Syntax Loops can be used to return items during deployment by:
azure-resource-manager Loop Properties https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-properties.md
You can only use a loop with top-level resources, even when applying a loop to a
You can also use a loop with [modules](loop-modules.md), [resources](loop-resources.md), [variables](loop-variables.md), and [outputs](loop-outputs.md).
+### Microsoft Learn
+
+To learn more about loops, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Syntax Loops can be used to declare multiple properties by:
azure-resource-manager Loop Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-resources.md
You can also use a loop with [modules](loop-modules.md), [properties](loop-prope
If you need to specify whether a resource is deployed at all, see [condition element](conditional-resource-deployment.md).
+### Microsoft Learn
+
+To learn more about loops, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Syntax Loops can be used to declare multiple resources by:
azure-resource-manager Loop Variables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-variables.md
This article shows you how to create more than one value for a variable in your
You can also use copy with [modules](loop-modules.md), [resources](loop-resources.md), [properties in a resource](loop-properties.md), and [outputs](loop-outputs.md).
+### Microsoft Learn
+
+To learn more about loops, and for hands-on guidance, see [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) on **Microsoft Learn**.
+ ## Syntax Loops can be used to declare multiple variables by:
azure-resource-manager Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/overview.md
Title: Bicep language for deploying Azure resources description: Describes the Bicep language for deploying infrastructure to Azure. It provides an improved authoring experience over using JSON to develop templates. Previously updated : 08/18/2021 Last updated : 09/09/2021 # What is Bicep?
When you're ready, you can [decompile the JSON files to Bicep](./decompile.md).
## Known limitations -- No support for single-line object and arrays. For example, `['a', 'b', 'c']` isn't supported. For more information, see [Arrays](data-types.md#arrays) and [Objects](data-types.md#objects).-- No support for breaking long lines into multiple lines. For example:
+- Bicep is newline sensitive. For example:
```bicep resource sa 'Microsoft.Storage/storageAccounts@2019-06-01' = if (newOrExisting == 'new') {
azure-resource-manager Parameters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/parameters.md
Resource Manager resolves parameter values before starting the deployment operat
Each parameter must be set to one of the [data types](data-types.md).
+### Microsoft Learn
+
+To learn more about parameters, and for hands-on guidance, see [Build reusable Bicep templates by using parameters](/learn/modules/build-reusable-bicep-templates-parameters) on **Microsoft Learn**.
+ ## Minimal declaration Each parameter needs a name and type. A parameter can't have the same name as a variable, resource, output, or other parameter in the same scope.
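
A minimal sketch of parameter declarations, optionally with a default value:

```bicep
param demoString string
param demoInt int
param location string = resourceGroup().location
```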
azure-resource-manager Scope Extension Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/scope-extension-resources.md
This article shows how to set the scope for an extension resource type when depl
> [!NOTE] > The scope property is only available to extension resource types. To specify a different scope for a resource type that isn't an extension type, use a [module](modules.md).
+### Microsoft Learn
+
+To learn more about extension resources, and for hands-on guidance, see [Deploy child and extension resources by using Bicep](/learn/modules/child-extension-bicep-templates) on **Microsoft Learn**.
+ ## Apply at deployment scope To apply an extension resource type at the target deployment scope, add the resource to your template as you would with any other resource type. The available scopes are [resource group](deploy-to-resource-group.md), [subscription](deploy-to-subscription.md), [management group](deploy-to-management-group.md), and [tenant](deploy-to-tenant.md). The deployment scope must support the resource type.
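
A minimal sketch (the lock name is illustrative): a management lock, which is an extension resource type, deployed at resource group scope.

```bicep
// Deployed at the target deployment scope (the resource group), so no scope property is needed.
resource rgLock 'Microsoft.Authorization/locks@2016-09-01' = {
  name: 'dont-delete'
  properties: {
    level: 'CanNotDelete'
    notes: 'Prevents accidental deletion of resources in this resource group.'
  }
}
```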
azure-resource-manager Resource Name Rules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/resource-name-rules.md
Title: Resource naming restrictions description: Shows the rules and restrictions for naming Azure resources. Previously updated : 09/03/2021 Last updated : 09/09/2021 # Naming rules and restrictions for Azure resources
In the following tables, the term alphanumeric refers to:
* **A** through **Z** (uppercase letters) * **0** through **9** (numbers)
+> [!NOTE]
+> Resources with a public endpoint can't include reserved words or trademarks in their names. For a list of the blocked words, see [Resolve reserved resource name errors](resource-name-rules.md).
+ ## Microsoft.AnalysisServices > [!div class="mx-tableFixed"]
In the following tables, the term alphanumeric refers to:
## Next steps
-For recommendations about how to name resources, see [Ready: Recommended naming and tagging conventions](/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging).
+* For recommendations about how to name resources, see [Ready: Recommended naming and tagging conventions](/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging).
+
+* Resources with a public endpoint can't include reserved words or trademarks in their names. For a list of the blocked words, see [Resolve reserved resource name errors](resource-name-rules.md).
azure-resource-manager Resources Without Resource Group Limit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/resources-without-resource-group-limit.md
By default, you can deploy up to 800 instances of a resource type in each resour
For some resource types, you need to contact support to have the 800 instance limit removed. Those resource types are noted in this article.
+Some resources have a limit on the number of instances per region. This limit is different from the 800 instances per resource group. To check your instances per region, use the Azure portal. Select your subscription and **Usage + quotas** in the left pane. For more information, see [Check resource usage against limits](../../networking/check-usage-against-limits.md).
## Microsoft.AlertsManagement
azure-resource-manager Template Functions Comparison https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-functions-comparison.md
Title: Template functions - comparison description: Describes the functions to use in an Azure Resource Manager template (ARM template) to compare values. Previously updated : 05/11/2021 Last updated : 09/08/2021 # Comparison functions for ARM templates
In Bicep, use the `??` operator instead. See [Coalesce ??](../bicep/operators-lo
| Parameter | Required | Type | Description | |: |: |: |: | | arg1 |Yes |int, string, array, or object |The first value to test for null. |
-| additional args |No |int, string, array, or object |Additional values to test for null. |
+| more args |No |int, string, array, or object | More values to test for null. |
### Return value
The value of the first non-null parameters, which can be a string, int, array, o
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/coalesce.json) shows the output from different uses of coalesce.
+The following example template shows the output from different uses of coalesce.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "objectToTest": {
- "type": "object",
- "defaultValue": {
- "null1": null,
- "null2": null,
- "string": "default",
- "int": 1,
- "object": { "first": "default" },
- "array": [ 1 ]
- }
- }
- },
- "resources": [
- ],
- "outputs": {
- "stringOutput": {
- "type": "string",
- "value": "[coalesce(parameters('objectToTest').null1, parameters('objectToTest').null2, parameters('objectToTest').string)]"
- },
- "intOutput": {
- "type": "int",
- "value": "[coalesce(parameters('objectToTest').null1, parameters('objectToTest').null2, parameters('objectToTest').int)]"
- },
- "objectOutput": {
- "type": "object",
- "value": "[coalesce(parameters('objectToTest').null1, parameters('objectToTest').null2, parameters('objectToTest').object)]"
- },
- "arrayOutput": {
- "type": "array",
- "value": "[coalesce(parameters('objectToTest').null1, parameters('objectToTest').null2, parameters('objectToTest').array)]"
- },
- "emptyOutput": {
- "type": "bool",
- "value": "[empty(coalesce(parameters('objectToTest').null1, parameters('objectToTest').null2))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
The equals function is often used with the `condition` element to test whether a
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/equals.json) checks different types of values for equality. All the default values return True.
+The following example checks different types of values for equality. All the default values return True.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "firstInt": {
- "type": "int",
- "defaultValue": 1
- },
- "secondInt": {
- "type": "int",
- "defaultValue": 1
- },
- "firstString": {
- "type": "string",
- "defaultValue": "a"
- },
- "secondString": {
- "type": "string",
- "defaultValue": "a"
- },
- "firstArray": {
- "type": "array",
- "defaultValue": [ "a", "b" ]
- },
- "secondArray": {
- "type": "array",
- "defaultValue": [ "a", "b" ]
- },
- "firstObject": {
- "type": "object",
- "defaultValue": { "a": "b" }
- },
- "secondObject": {
- "type": "object",
- "defaultValue": { "a": "b" }
- }
- },
- "resources": [
- ],
- "outputs": {
- "checkInts": {
- "type": "bool",
- "value": "[equals(parameters('firstInt'), parameters('secondInt') )]"
- },
- "checkStrings": {
- "type": "bool",
- "value": "[equals(parameters('firstString'), parameters('secondString'))]"
- },
- "checkArrays": {
- "type": "bool",
- "value": "[equals(parameters('firstArray'), parameters('secondArray'))]"
- },
- "checkObjects": {
- "type": "bool",
- "value": "[equals(parameters('firstObject'), parameters('secondObject'))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
The output from the preceding example with the default values is:
| checkArrays | Bool | True | | checkObjects | Bool | True |
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/not-equals.json) uses [not](template-functions-logical.md#not) with **equals**.
+The following example template uses [not](template-functions-logical.md#not) with **equals**.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "resources": [
- ],
- "outputs": {
- "checkNotEquals": {
- "type": "bool",
- "value": "[not(equals(1, 2))]"
- }
- }
-}
-```
The output from the preceding example is:
Returns **True** if the first value is greater than the second value; otherwise,
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/greater.json) checks whether the one value is greater than the other.
+The following example checks whether one value is greater than the other.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "firstInt": {
- "type": "int",
- "defaultValue": 1
- },
- "secondInt": {
- "type": "int",
- "defaultValue": 2
- },
- "firstString": {
- "type": "string",
- "defaultValue": "A"
- },
- "secondString": {
- "type": "string",
- "defaultValue": "a"
- }
- },
- "resources": [
- ],
- "outputs": {
- "checkInts": {
- "type": "bool",
- "value": "[greater(parameters('firstInt'), parameters('secondInt') )]"
- },
- "checkStrings": {
- "type": "bool",
- "value": "[greater(parameters('firstString'), parameters('secondString'))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
Returns **True** if the first value is greater than or equal to the second value
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/greaterorequals.json) checks whether the one value is greater than or equal to the other.
+The following example checks whether one value is greater than or equal to the other.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "firstInt": {
- "type": "int",
- "defaultValue": 1
- },
- "secondInt": {
- "type": "int",
- "defaultValue": 2
- },
- "firstString": {
- "type": "string",
- "defaultValue": "A"
- },
- "secondString": {
- "type": "string",
- "defaultValue": "a"
- }
- },
- "resources": [
- ],
- "outputs": {
- "checkInts": {
- "type": "bool",
- "value": "[greaterOrEquals(parameters('firstInt'), parameters('secondInt') )]"
- },
- "checkStrings": {
- "type": "bool",
- "value": "[greaterOrEquals(parameters('firstString'), parameters('secondString'))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
Returns **True** if the first value is less than the second value; otherwise, **
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/less.json) checks whether the one value is less than the other.
+The following example checks whether one value is less than the other.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "firstInt": {
- "type": "int",
- "defaultValue": 1
- },
- "secondInt": {
- "type": "int",
- "defaultValue": 2
- },
- "firstString": {
- "type": "string",
- "defaultValue": "A"
- },
- "secondString": {
- "type": "string",
- "defaultValue": "a"
- }
- },
- "resources": [
- ],
- "outputs": {
- "checkInts": {
- "type": "bool",
- "value": "[less(parameters('firstInt'), parameters('secondInt') )]"
- },
- "checkStrings": {
- "type": "bool",
- "value": "[less(parameters('firstString'), parameters('secondString'))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
Returns **True** if the first value is less than or equal to the second value; o
### Example
-The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/lessorequals.json) checks whether the one value is less than or equal to the other.
+The following example checks whether one value is less than or equal to the other.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "firstInt": {
- "type": "int",
- "defaultValue": 1
- },
- "secondInt": {
- "type": "int",
- "defaultValue": 2
- },
- "firstString": {
- "type": "string",
- "defaultValue": "A"
- },
- "secondString": {
- "type": "string",
- "defaultValue": "a"
- }
- },
- "resources": [
- ],
- "outputs": {
- "checkInts": {
- "type": "bool",
- "value": "[lessOrEquals(parameters('firstInt'), parameters('secondInt') )]"
- },
- "checkStrings": {
- "type": "bool",
- "value": "[lessOrEquals(parameters('firstString'), parameters('secondString'))]"
- }
- }
-}
-```
The output from the preceding example with the default values is:
The output from the preceding example with the default values is:
## Next steps
-* For a description of the sections in an ARM template, see [Understand the structure and syntax of ARM templates](./syntax.md).
+* For a description of the sections in an ARM template, see [Understand the structure and syntax of ARM templates](./syntax.md).
azure-resource-manager Template Functions Date https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-functions-date.md
Title: Template functions - date description: Describes the functions to use in an Azure Resource Manager template (ARM template) to work with dates. Previously updated : 05/11/2021 Last updated : 09/09/2021 # Date functions for ARM templates
The datetime value that results from adding the duration value to the base value
The following example template shows different ways of adding time values.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "baseTime": {
- "type": "string",
- "defaultValue": "[utcNow('u')]"
- }
- },
- "variables": {
- "add3Years": "[dateTimeAdd(parameters('baseTime'), 'P3Y')]",
- "subtract9Days": "[dateTimeAdd(parameters('baseTime'), '-P9D')]",
- "add1Hour": "[dateTimeAdd(parameters('baseTime'), 'PT1H')]"
- },
- "resources": [],
- "outputs": {
- "add3YearsOutput": {
- "value": "[variables('add3Years')]",
- "type": "string"
- },
- "subtract9DaysOutput": {
- "value": "[variables('subtract9Days')]",
- "type": "string"
- },
- "add1HourOutput": {
- "value": "[variables('add1Hour')]",
- "type": "string"
- },
- }
-}
-```
When the preceding template is deployed with a base time of `2020-04-07 14:53:14Z`, the output is:
When the preceding template is deployed with a base time of `2020-04-07 14:53:14
The next example template shows how to set the start time for an Automation schedule.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "omsAutomationAccountName": {
- "type": "string",
- "defaultValue": "demoAutomation",
- "metadata": {
- "description": "Use an existing Automation account."
- }
- },
- "scheduleName": {
- "type": "string",
- "defaultValue": "demoSchedule1",
- "metadata": {
- "description": "Name of the new schedule."
- }
- },
- "baseTime": {
- "type": "string",
- "defaultValue": "[utcNow('u')]",
- "metadata": {
- "description": "Schedule will start one hour from this time."
- }
- }
- },
- "variables": {
- "startTime": "[dateTimeAdd(parameters('baseTime'), 'PT1H')]"
- },
- "resources": [
- ...
- {
- "type": "Microsoft.Automation/automationAccounts/schedules",
- "apiVersion": "2015-10-31",
- "name": "[concat(parameters('omsAutomationAccountName'), '/', parameters('scheduleName'))]",
-
- "properties": {
- "description": "Demo Scheduler",
- "startTime": "[variables('startTime')]",
- "interval": 1,
- "frequency": "Hour"
- }
- }
- ],
- "outputs": {
- }
-}
-```
## utcNow
The current UTC datetime value.
The following example template shows different formats for the datetime value.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "utcValue": {
- "type": "string",
- "defaultValue": "[utcNow()]"
- },
- "utcShortValue": {
- "type": "string",
- "defaultValue": "[utcNow('d')]"
- },
- "utcCustomValue": {
- "type": "string",
- "defaultValue": "[utcNow('M d')]"
- }
- },
- "resources": [
- ],
- "outputs": {
- "utcOutput": {
- "type": "string",
- "value": "[parameters('utcValue')]"
- },
- "utcShortOutput": {
- "type": "string",
- "value": "[parameters('utcShortValue')]"
- },
- "utcCustomOutput": {
- "type": "string",
- "value": "[parameters('utcCustomValue')]"
- }
- }
-}
-```
The output from the preceding example varies for each deployment but will be similar to:
The output from the preceding example varies for each deployment but will be sim
The next example shows how to use a value from the function when setting a tag value.
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "utcShort": {
- "type": "string",
- "defaultValue": "[utcNow('d')]"
- },
- "rgName": {
- "type": "string"
- }
- },
- "resources": [
- {
- "type": "Microsoft.Resources/resourceGroups",
- "apiVersion": "2020-10-01",
- "name": "[parameters('rgName')]",
- "location": "westeurope",
- "tags": {
- "createdDate": "[parameters('utcShort')]"
- },
- "properties": {}
- }
- ],
- "outputs": {
- "utcShortOutput": {
- "type": "string",
- "value": "[parameters('utcShort')]"
- }
- }
-}
-```
## Next steps
-* For a description of the sections in an ARM template, see [Understand the structure and syntax of ARM templates](./syntax.md).
+* For a description of the sections in an ARM template, see [Understand the structure and syntax of ARM templates](./syntax.md).
azure-sql Auditing Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/auditing-overview.md
To configure writing audit logs to an event hub, select **Event Hub**. Select th
## <a id="subheading-3"></a>Analyze audit logs and reports
-If you chose to write audit logs to Azure Monitor logs:
+If you chose to write audit logs to Log Analytics:
- Use the [Azure portal](https://portal.azure.com). Open the relevant database. At the top of the database's **Auditing** page, select **View audit logs**.
azure-sql Elastic Pool Resource Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/elastic-pool-resource-management.md
Previously updated : 09/16/2020 Last updated : 09/08/2021 # Resource management in dense elastic pools
This approach allows customers to use dense elastic pools to achieve adequate pe
> In dense pools with many active databases, it may not be feasible to increase the number of databases in the pool up to the maximums documented for [DTU](resource-limits-dtu-elastic-pools.md) and [vCore](resource-limits-vcore-elastic-pools.md) elastic pools. > > The number of databases that can be placed in dense pools without causing resource contention and performance problems depends on the number of concurrently active databases, and on resource consumption by user workloads in each database. This number can change over time as user workloads change.
+>
+> Additionally, if the min vCores per database or min DTUs per database setting is set to a value greater than 0, the maximum number of databases in the pool is implicitly limited. For more information, see [Database properties for pooled vCore databases](resource-limits-vcore-elastic-pools.md#database-properties-for-pooled-databases) and [Database properties for pooled DTU databases](resource-limits-dtu-elastic-pools.md#database-properties-for-pooled-databases).
When resource contention occurs in a densely packed pool, customers can choose one or more of the following actions to mitigate it:
In addition to these metrics, Azure SQL Database provides a view that returns ac
|[sys.dm_resource_governor_workload_groups_history_ex](/sql/relational-databases/system-dynamic-management-views/sys-dm-resource-governor-workload-groups-history-ex-azure-sql-database)|Returns workload group utilization statistics for the last 32 minutes. Each row represents a 20-second interval. The `delta_` columns return the change in each statistic during the interval.| |||
+> [!TIP]
+> To query these and other dynamic management views using a principal other than server administrator, add this principal to the `##MS_ServerStateReader##` [server role](security-server-roles.md).
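
A minimal sketch, assuming a SQL login named `monitoring_login` (a hypothetical name) and run in the `master` database of the logical server:

```sql
-- Grant read access to server state views without server administrator rights.
ALTER SERVER ROLE ##MS_ServerStateReader## ADD MEMBER monitoring_login;
```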
+ These views can be used to monitor resource utilization and troubleshoot resource contention in near real-time. User workload on the primary and readable secondary replicas, including geo-replicas, is classified into the `SloSharedPool1` resource pool and `UserPrimaryGroup.DBId[N]` workload group, where `N` stands for the database ID value. In addition to monitoring current resource utilization, customers using dense pools can maintain historical resource utilization data in a separate data store. This data can be used in predictive analysis to proactively manage resource utilization based on historical and seasonal trends.
azure-sql Ledger Append Only Ledger Tables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-append-only-ledger-tables.md
Title: "Azure SQL Database append-only ledger tables" description: This article provides information on append-only ledger table schema and views in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
Append-only ledger tables allow only `INSERT` operations on your tables, which ensures that privileged users such as database administrators can't alter data through traditional [Data Manipulation Language](/sql/t-sql/queries/queries) operations. Append-only ledger tables are ideal for systems that don't update or delete records, such as security information event and management systems or blockchain systems where data needs to be replicated from the blockchain to a database. Because there are no `UPDATE` or `DELETE` operations on an append-only table, there's no need for a corresponding history table as there is with [updatable ledger tables](ledger-updatable-ledger-tables.md).
azure-sql Ledger Audit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-audit.md
Title: "Azure SQL Database audit events with ledger-enabled tables" description: Overview of Azure SQL Database ledger auditing capabilities- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
When you perform forensics activities with ledger-enabled tables, data is captured in the ledger view and database ledger. Other action IDs are added to the SQL audit logs, too. The following tables outline these new audit logging events. The conditions that trigger the events follow each table.
azure-sql Ledger Create A Single Database With Ledger Enabled https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-create-a-single-database-with-ledger-enabled.md
Previously updated : "07/23/2021"- Last updated : "09/09/2021" # Quickstart: Create a database in Azure SQL Database with ledger enabled
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
In this quickstart, you create a [ledger database](ledger-overview.md#ledger-database) in Azure SQL Database and configure [automatic digest storage with Azure Blob Storage](ledger-digest-management-and-database-verification.md#automatic-generation-and-storage-of-database-digests) by using the Azure portal. For more information about ledger, see [Azure SQL Database ledger](ledger-overview.md).
azure-sql Ledger Database Ledger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-database-ledger.md
Title: "Database ledger" description: This article provides information on ledger database tables and associated views in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
The database ledger is part of the ledger feature of Azure SQL Database. The database ledger incrementally captures the state of a database as the database evolves over time, while updates occur on ledger tables. It logically uses a blockchain and [Merkle tree data structures](/archive/msdn-magazine/2018/march/blockchain-blockchain-fundamentals).
azure-sql Ledger Digest Management And Database Verification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-digest-management-and-database-verification.md
Title: "Digest management and database verification" description: This article provides information on digest management and database verification for a ledger database in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
Azure SQL Database ledger provides a form of data integrity called *forward integrity*, which provides evidence of data tampering on data in your ledger tables. For example, if a banking transaction occurs on a ledger table where a balance has been updated to value `x`, and an attacker later modifies the data by changing the balance from `x` to `y`, database verification will detect this tampering activity.
azure-sql Ledger How To Access Acl Digest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-how-to-access-acl-digest.md
Title: "Access the digests stored in Azure Confidential Ledger" description: Access the digests stored in Azure Confidential Ledger with an Azure SQL Database ledger.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
This article shows you how to access an [Azure SQL Database ledger](ledger-overview.md) digest stored in [Azure Confidential Ledger](../../confidential-ledger/index.yml) to get end-to-end security and integrity guarantees. Throughout this article, we'll explain how to access and verify integrity of the stored information.
azure-sql Ledger How To Append Only Ledger Tables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-how-to-append-only-ledger-tables.md
Title: "Create and use append-only ledger tables" description: Learn how to create and use append-only ledger tables in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
This article shows you how to create an [append-only ledger table](ledger-append-only-ledger-tables.md) in Azure SQL Database. Next, you'll insert values in your append-only ledger table and then attempt to make updates to the data. Finally, you'll view the results by using the ledger view. We'll use an example of a card key access system for a facility, which is an append-only system pattern. Our example will give you a practical look at the relationship between the append-only ledger table and its corresponding ledger view.
azure-sql Ledger How To Updatable Ledger Tables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-how-to-updatable-ledger-tables.md
Title: "Create and use updatable ledger tables" description: Learn how to create and use updatable ledger tables in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
This article shows you how to create an [updatable ledger table](ledger-updatable-ledger-tables.md) in Azure SQL Database. Next, you'll insert values in your updatable ledger table and then make updates to the data. Finally, you'll view the results by using the ledger view. We'll use an example of a banking application that tracks banking customers' balances in their accounts. Our example will give you a practical look at the relationship between the updatable ledger table and its corresponding history table and ledger view.
azure-sql Ledger Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-limits.md
Title: "Limitations for Azure SQL Database ledger" description: Limitations of the ledger feature in Azure SQL Database- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
This article provides an overview of the limitations of ledger tables used with Azure SQL Database.
azure-sql Ledger Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-overview.md
Title: "Azure SQL Database ledger overview" description: Learn the basics of the Azure SQL Database ledger feature.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
Establishing trust around the integrity of data stored in database systems has been a longstanding problem for all organizations that manage financial, medical, or other sensitive data. The ledger feature of [Azure SQL Database](sql-database-paas-overview.md) provides tamper-evidence capabilities in your database. You can cryptographically attest to other parties, such as auditors or other business parties, that your data hasn't been tampered with.
azure-sql Ledger Updatable Ledger Tables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-updatable-ledger-tables.md
Title: "Azure SQL Database updatable ledger tables" description: This article provides information on updatable ledger tables, ledger schema, and ledger views in Azure SQL Database.- Previously updated : "07/23/2021" Last updated : "09/09/2021"
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
Updatable ledger tables are system-versioned tables on which users can perform updates and deletes while also providing tamper-evidence capabilities. When updates or deletes occur, all earlier versions of a row are preserved in a secondary table, known as the history table. The history table mirrors the schema of the updatable ledger table. When a row is updated, the latest version of the row remains in the ledger table, while its earlier version is inserted into the history table by the system, transparently to the application.
azure-sql Ledger Verify Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-verify-database.md
Previously updated : "07/23/2021"- Last updated : "09/09/2021" # Verify a ledger table to detect tampering
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in public preview and available in West Europe, Brazil South, and West Central US.
+> Azure SQL Database ledger is currently in public preview.
In this article, you'll verify the integrity of the data in your Azure SQL Database ledger tables. If you selected **Enable automatic digest storage** when you [created your database in SQL Database](ledger-create-a-single-database-with-ledger-enabled.md), follow the Azure portal instructions to automatically generate the Transact-SQL (T-SQL) script needed to verify the database ledger in the [query editor](connect-query-portal.md). Otherwise, follow the T-SQL instructions by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
azure-sql Sql Data Sync Data Sql Server Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/sql-data-sync-data-sql-server-sql-database.md
Previously updated : 08/20/2019 Last updated : 09/09/2021 # What is SQL Data Sync for Azure?
Not directly. You can sync between SQL Server databases indirectly, however, by
### Can I use Data Sync to sync between databases in SQL Database that belong to different subscriptions
-Yes. You can sync between databases that belong to resource groups owned by different subscriptions.
+Yes. You can sync between databases that belong to resource groups owned by different subscriptions, even if the subscriptions belong to different tenants.
- If the subscriptions belong to the same tenant, and you have permission to all subscriptions, you can configure the sync group in the Azure portal.-- Otherwise, you have to use PowerShell to add the sync members that belong to different subscriptions.
+- Otherwise, you have to use PowerShell to add the sync members.
### Can I use Data Sync to sync between databases in SQL Database that belong to different clouds (like Azure Public Cloud and Azure China 21Vianet)
azure-video-analyzer Visualize Ai Events Power Bi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-video-analyzer/video-analyzer-docs/visualize-ai-events-power-bi.md
Last updated 09/08/2021
-# Visualize AI inference events with Power BI
+# Tutorial: Real-time visualization of AI inference events with Power BI
Azure Video Analyzer provides the capability to capture, record, and analyze live video along with publishing the results of video analysis in the form of AI inference events to the [IoT Edge Hub](../../iot-edge/iot-edge-runtime.md?view=iotedge-2020-11&preserve-view=true#iot-edge-hub). These AI inference events can then be routed to other destinations including Visual Studio Code and Azure services such as Time Series Insights and Event Hubs.
cloudfoundry Cloudfoundry Deploy Your First App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloudfoundry/cloudfoundry-deploy-your-first-app.md
There are several options for creating a Cloud Foundry environment on Azure: -- Use the [Pivotal Cloud Foundry offer][pcf-azuremarketplace] in the Azure Marketplace to create a standard environment that includes PCF Ops Manager and the Azure Service Broker. You can find [complete instructions][pcf-azuremarketplace-pivotaldocs] for deploying the marketplace offer in the Pivotal documentation.
+- Use the Pivotal Cloud Foundry offer in the Azure Marketplace to create a standard environment that includes PCF Ops Manager and the Azure Service Broker. You can find [complete instructions][pcf-azuremarketplace-pivotaldocs] for deploying the marketplace offer in the Pivotal documentation.
- Create a customized environment by [deploying Pivotal Cloud Foundry manually][pcf-custom]. - [Deploy the open-source Cloud Foundry packages directly][oss-cf-bosh] by setting up a [BOSH](https://bosh.io) director, a VM that coordinates the deployment of the Cloud Foundry environment.
Running the `cf app` command on the application shows that Cloud Foundry is crea
<!-- LINKS -->
-[pcf-azuremarketplace]: https://azuremarketplace.microsoft.com/marketplace/apps/pivotal.pivotal-cloud-foundry
[pcf-custom]: https://docs.pivotal.io/pivotalcf/1-10/customizing/azure.html [oss-cf-bosh]: https://github.com/cloudfoundry-incubator/bosh-azure-cpi-release/tree/master/docs [pcf-azuremarketplace-pivotaldocs]: https://docs.pivotal.io/ops-manager/2-10/install/pcf_azure.html
cloudfoundry Cloudfoundry Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloudfoundry/cloudfoundry-get-started.md
Cloud Foundry is well suited to agile software development, including the use of
## Next steps -- [Deploy Pivotal Cloud Foundry from the Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps/pivotal.pivotal-cloud-foundry) - [Deploy an app to Cloud Foundry in Azure](./cloudfoundry-deploy-your-first-app.md)
cognitive-services Reference Data Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/reference-data-guidelines.md
- Title: Import and export data reference - QnA Maker
-description: Use this import and export reference to get the best results for your knowledge base backup, storage, and replacement.
--- Previously updated : 01/02/2020--
-# Import and export data reference
-
-Review this import and export reference to get the best results for your knowledge base backup, storage, and replacement.
-
-## Import and export knowledge base
-
-**TSV and XLS files**, from exported knowledge bases, can only be used by importing the files from the **Settings** page in the QnA Maker portal. They can't be used as data sources during knowledge base creation or from the **+ Add file** or **+ Add URL** feature on the **Settings** page.
-
-When you import the Knowledge base through these **TSV and XLS files**, the QnA pairs get added to the editorial source and not the sources from which the QnAs were extracted in the exported Knowledge Base.
cognitive-services Reference Document Type Url https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/reference-document-type-url.md
- Title: URLs types supported for import - QnA Maker
-description: Understand how the types of URLs are used to import and create QnA pairs.
--- Previously updated : 01/02/2020-
-# URLs supported for importing documents
-
-Understand how the types of URLs are used to import and create QnA pairs.
-
-## FAQ URLs
-
-QnA Maker can support FAQ web pages in 3 different forms:
-
-* Plain FAQ pages
-* FAQ pages with links
-* FAQ pages with a Topics Homepage
-
-### Plain FAQ pages
-
-This is the most common type of FAQ page, in which the answers immediately follow the questions in the same page.
-
-Below is an example of a plain FAQ page:
-
-![Plain FAQ page example for a knowledge base](./media/qnamaker-concepts-datasources/plain-faq.png)
--
-### FAQ pages with links
-
-In this type of FAQ page, questions are aggregated together and are linked to answers that are either in different sections of the same page, or in different pages.
-
-Below is an example of an FAQ page with links in sections that are on the same page:
-
- ![Section Link FAQ page example for a knowledge base](./media/qnamaker-concepts-datasources/sectionlink-faq.png)
--
-### Parent Topics page links to child answers pages
-
-This type of FAQ has a Topics page where each topic is linked to a corresponding set of questions and answers on a different page. QnA Maker crawls all the linked pages to extract the corresponding questions & answers.
-
-Below is an example of a Topics page with links to FAQ sections in different pages.
-
- ![Deep link FAQ page example for a knowledge base](./media/qnamaker-concepts-datasources/topics-faq.png)
-
-## Support URLs
-
-QnA Maker can process semi-structured support web pages, such as web articles that would describe how to perform a given task, how to diagnose and resolve a given problem, and what are the best practices for a given process. Extraction works best on content that has a clear structure with hierarchical headings.
-
-> [!NOTE]
-> Extraction for support articles is a new feature and is in early stages. It works best for simple pages, that are well structured, and do not contain complex headers/footers.
-
-![QnA Maker supports extraction from semi-structured web pages where a clear structure is presented with hierarchical headings](./media/qnamaker-concepts-datasources/support-web-pages-with-heirarchical-structure.png)
cognitive-services Reference Question Answer Set https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/reference-question-answer-set.md
- Title: Design knowledge base - QnA Maker
-description: A QnA Maker knowledge base consists of a set of question-and-answer (QnA) pairs and optional metadata associated with each QnA pair.
--- Previously updated : 09/01/2020--
-# Question and answer pair
-
-A knowledge base consists of question and answer (QnA) pairs. Each pair has one answer and a pair contains all the information associated with that _answer_. An answer can loosely resemble a database row or a data structure instance.
-
-## Question and answer pairs
-
-The **required** settings in a question-and-answer (QnA) pair are:
-
-* a **question** - text of user query, used to QnA Maker's machine-learning, to align with text of user's question with different wording but the same answer
-* the **answer** - the pair's answer is the response that's returned when a user query is matched with the associated question
-
-Each pair is represented by an **ID**.
-
-The **optional** settings for a pair include:
-
-* **Alternate forms of the question** - this helps QnA Maker return the correct answer for a wider variety of question phrasings
-* **Metadata**: Metadata are tags associated with a QnA pair and are represented as key-value pairs. Metadata tags are used to filter QnA pairs and limit the set over which query matching is performed.
-* **Multi-turn prompts**, used to continue a multi-turn conversation
-
-![QnA Maker knowledge bases](media/qnamaker-concepts-knowledgebase/knowledgebase.png)
-
-## Editorially add to knowledge base
-
-If you do not have pre-existing content to populate the knowledge base, you can add QnA pairs editorially in the QnA Maker portal. Learn how to update your knowledge base [here](How-To/edit-knowledge-base.md).
-
-## Editing your knowledge base locally
-
-Once a knowledge base is created, it is recommended that you make edits to the knowledge base text in the [QnA Maker portal](https://qnamaker.ai), rather than exporting and reimporting through local files. However, there may be times that you need to edit a knowledge base locally.
-
-Export the knowledge base from the **Settings** page, then edit the knowledge base with Microsoft Excel. If you choose to use another application to edit your exported file, the application may introduce syntax errors because it is not fully TSV compliant. Microsoft Excel's TSV files generally don't introduce any formatting errors.
-
-Once you are done with your edits, reimport the TSV file from the **Settings** page. This will completely replace the current knowledge base with the imported knowledge base.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Knowledge base lifecycle in QnA Maker](Concepts/development-lifecycle-knowledge-base.md)
cognitive-services Reference Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/reference-role-based-access-control.md
- Title: Azure role-based access control (Azure RBAC) - QnA Maker
-description: Control access to QnA Maker with the Azure roles for your QnA Maker resource
--- Previously updated : 05/15/2020--
-# Azure role-based access control (Azure RBAC)
-
-Use the following table to determine your access needs for your QnA Maker resource.
-
cognitive-services Migrate To V3 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/migrate-to-v3.md
# Translator V2 to V3 Migration > [!NOTE]
-> V2 was deprecated on April 30, 2018. Please migrate your applications to V3 in order to take advantage of new functionality available exclusively in V3. V2 will be retired on May 24, 2021.
+> V2 was deprecated on April 30, 2018. Please migrate your applications to V3 in order to take advantage of new functionality available exclusively in V3. V2 was retired on May 24, 2021.
The Microsoft Translator team has released Version 3 (V3) of the Translator. This release includes new features, deprecated methods and a new format for sending to, and receiving data from the Microsoft Translator Service. This document provides information for changing applications to use V3.
cognitive-services Container Image Tags https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/containers/container-image-tags.md
Previously updated : 06/25/2021 Last updated : 09/09/2021
This container image has the following tags available. You can also find a full
# [Latest version](#tab/current) * Release notes for version `1.3.0`:
- * Support for standalone language IDs with `SingleLanguage` and contiuous mode.
+ * Support for standalone language IDs with `SingleLanguage` and continuous mode.
| Image Tags | Notes | ||:|
The [Text Analytics for health][ta-he] container image can be found on the `mcr.
This container image has the following tags available. You can also find a full list of [tags on the MCR](https://mcr.microsoft.com/v2/azure-cognitive-services/textanalytics/healthcare/tags/list).
+# [Latest version](#tab/current)
+
+Release notes for `3.0.017010001-onprem-amd64`:
+
+* You can now use the [Text Analytics for health container with the client library](../text-analytics/how-tos/text-analytics-how-to-install-containers.md?tabs=healthcare#run-the-container-with-client-library-support)
+
+| Image Tags | Notes |
+|-|:|
+| `latest` | |
+| `3.0.017010001-onprem-amd64` | |
+
+# [Previous versions](#tab/previous)
+ Release notes for `3.0.015490002-onprem-amd64`: * new model-version `2021-03-01`
Release notes for `3.0.015490002-onprem-amd64`:
| `latest` | | | `3.0.015490002-onprem-amd64` | | + ## Translator
cognitive-services Text Analytics How To Install Containers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-install-containers.md
Previously updated : 07/21/2021 Last updated : 09/09/2021 keywords: on-premises, Docker, container, sentiment analysis, natural language processing
communication-services Call Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/call-diagnostics.md
+
+ Title: Azure Communication Services Call Diagnostics
+
+description: Provides an overview of the Call Diagnostics feature.
+++++ Last updated : 08/17/2021++++
+# Call diagnostics
+
+When working with calls in Azure Communication Services, problems may arise that affect your customers. To help with that, we have a feature called "Call Diagnostics" that enables you to examine various properties of a call to determine what the issue might be.
+
+**Call diagnostics is currently only supported for our JS / Web SDK.**
+
+## Accessing diagnostics
+
+Call diagnostics is an extended feature of the core `Call` API and allows you to diagnose an active call.
+
+```js
+const callDiagnostics = call.api(Features.Diagnostics);
+```
+
+## Diagnostic values
+
+The following user-facing diagnostics are available:
+
+### Network values
+
+| Name | Description | Possible values | Use cases |
+| - | | | |
+| noNetwork | There is no network available. | - Set to `True` when a call fails to start because there is no network available. <br/> - Set to `False` when there are ICE candidates present. | Device is not connected to a network. |
+| networkRelaysNotReachable | Problems with a network. | - Set to `True` when the network has some constraint that is not allowing you to reach ACS relays. <br/> - Set to `False` upon making a new call. | During a call when the WiFi signal goes on and off. |
+| networkReconnect | The connection was lost and we are reconnecting to the network. | - Set to `Poor` when the media transport connectivity is lost <br/> - Set to `Bad` when the network is disconnected <br/> - Set to `Good` when a new session is connected. | Low bandwidth, no internet |
+| networkReceiveQuality | An indicator regarding incoming stream quality. | - Set to `Bad` when there is a severe problem with receiving the stream quality. <br/> - Set to `Poor` when there is a mild problem with receiving the stream quality. <br/> - Set to `Good` when there is no problem with receiving the stream. | Low bandwidth |
+
+### Audio values
+
+| Name | Description | Possible values | Use cases |
+| | | -- | - |
+| noSpeakerDevicesEnumerated | There is no audio output device (speaker) on the user's system. | - Set to `True` when there are no speaker devices on the system, and speaker selection is supported. <br/> - Set to `False` when there is at least 1 speaker device on the system, and speaker selection is supported. | All speakers are unplugged |
+| speakingWhileMicrophoneIsMuted | Speaking while being on mute. | - Set to `True` when local microphone is muted and the local user is speaking. <br/> - Set to `False` when local user either stops speaking, or un-mutes the microphone. <br/> \* Note: as of today, this isn't supported on Safari yet, as the audio level samples are taken from WebRTC stats. | During a call, mute your microphone and speak into it. |
+| noMicrophoneDevicesEnumerated | No audio capture devices (microphone) on the user's system | - Set to `True` when there are no microphone devices on the system. <br/> - Set to `False` when there is at least 1 microphone device on the system. | All microphones are unplugged during the call. |
+| microphoneNotFunctioning | Microphone is not functioning. | - Set to `True` when we fail to start sending local audio stream because the microphone device may have been disabled in the system or it is being used by another process. This UFD takes about 10 seconds to get raised. <br/> - Set to `False` when microphone starts to successfully send audio stream again. | No microphones available, microphone access disabled in a system |
+| microphoneMuteUnexpectedly | Microphone is muted | - Set to `True` when microphone enters muted state unexpectedly. <br/> - Set to `False` when microphone starts to successfully send audio stream | Microphone is muted from the system. |
+| microphonePermissionDenied | There is low volume from device or it's almost silent on macOS. | - Set to `True` when audio permission is denied by system settings (audio). <br/> - Set to `False` on successful stream acquisition. <br/> Note: This diagnostic only works on macOS. | Microphone permissions are disabled in the Settings. |
+
+### Camera values
+
+| Name | Description | Possible values | Use cases |
+| - | | | - |
+| cameraFreeze | Camera stops producing frames for more than 5 seconds. | - Set to `True` when the local video stream is frozen. This means the remote side is seeing your video frozen on their screen or it means that the remote participants are not rendering your video on their screen. <br/> - Set to `False` when the freeze ends and users can see your video as per normal. | The Camera was lost during the call or bad network caused the camera to freeze. |
+| cameraStartFailed | Generic camera failure. | - Set to `True` when we fail to start sending local video because the camera device may have been disabled in the system or it is being used by another process. <br/> - Set to `False` when selected camera device successfully sends local video again. | Camera failures |
+| cameraStartTimedOut | Common scenario where camera is in bad state. | - Set to `True` when camera device times out to start sending video stream. <br/> - Set to `False` when selected camera device successfully sends local video again. | Camera failures |
+| cameraPermissionDenied | Camera permissions were denied in settings. | - Set to `True` when camera permission is denied by system settings (video). <br/> - Set to `False` on successful stream acquisition. <br> Note: This diagnostic only works on macOS Chrome | Camera permissions are disabled in the settings. |
+
+### Misc values
+
+| Name | Description | Possible values | Use cases |
+| - | | - | -- |
+| screenshareRecordingDisabled | System screen sharing was denied by preferences in Settings. | - Set to `True` when screen sharing permission is denied by system settings (sharing). <br/> - Set to `False` on successful stream acquisition. <br/> Note: This diagnostic only works on macOS Chrome. | Screen recording is disabled in Settings. |
+
+## Diagnostic events
+
+- Subscribe to the `diagnosticChanged` event to monitor when any call diagnostic changes.
+
+```js
+/**
+ * Each diagnostic has the following data:
+ * - diagnostic is the type of diagnostic, e.g. NetworkSendQuality, DeviceSpeakWhileMuted, etc...
+ * - value is DiagnosticQuality or DiagnosticFlag:
+ * - DiagnosticQuality = enum { Good = 1, Poor = 2, Bad = 3 }.
+ * - DiagnosticFlag = true | false.
+ * - valueType = 'DiagnosticQuality' | 'DiagnosticFlag'
+ * - mediaType is the media type associated with the event, e.g. Audio, Video, ScreenShare. These are defined in `CallDiagnosticEventMediaType`.
+ */
+const diagnosticChangedListener = (diagnosticInfo: NetworkDiagnosticChangedEventArgs | MediaDiagnosticChangedEventArgs) => {
+ console.log(`Diagnostic changed: ` +
+ `Diagnostic: ${diagnosticInfo.diagnostic}` +
+ `Value: ${diagnosticInfo.value}` +
+ `Value type: ${diagnosticInfo.valueType}` +
+ `Media type: ${diagnosticInfo.mediaType}`);
+
+ if (diagnosticInfo.valueType === 'DiagnosticQuality') {
+ if (diagnosticInfo.value === DiagnosticQuality.Bad) {
+ console.error(`${diagnosticInfo.diagnostic} is bad quality`);
+
+ } else if (diagnosticInfo.value === DiagnosticQuality.Poor) {
+ console.error(`${diagnosticInfo.diagnostic} is poor quality`);
+ }
+
+ } else if (diagnosticInfo.valueType === 'DiagnosticFlag') {
+ if (diagnosticInfo.value === true) {
+ console.error(`${diagnosticInfo.diagnostic}`);
+ }
+ }
+};
+
+callDiagnostics.network.on('diagnosticChanged', diagnosticChangedListener);
+callDiagnostics.media.on('diagnosticChanged', diagnosticChangedListener);
+```
+
+## Get the latest diagnostics
+
+- Get the latest call diagnostic values that were raised. If a diagnostic is undefined, that is because it was never raised.
+
+```js
+const latestNetworkDiagnostics = callDiagnostics.network.getLatest();
+
+console.log(
+ `noNetwork: ${latestNetworkDiagnostics.noNetwork.value}, ` +
+ `value type = ${latestNetworkDiagnostics.noNetwork.valueType}`
+);
+
+console.log(
+ `networkReconnect: ${latestNetworkDiagnostics.networkReconnect.value}, ` +
+ `value type = ${latestNetworkDiagnostics.networkReconnect.valueType}`
+);
+
+console.log(
+ `networkReceiveQuality: ${latestNetworkDiagnostics.networkReceiveQuality.value}, ` +
+ `value type = ${latestNetworkDiagnostics.networkReceiveQuality.valueType}`
+);
+
+const latestMediaDiagnostics = callDiagnostics.media.getLatest();
+
+console.log(
+ `speakingWhileMicrophoneIsMuted: ${latestMediaDiagnostics.speakingWhileMicrophoneIsMuted.value}, ` +
+ `value type = ${latestMediaDiagnostics.speakingWhileMicrophoneIsMuted.valueType}`
+);
+
+console.log(
+ `cameraStartFailed: ${latestMediaDiagnostics.cameraStartFailed.value}, ` +
+ `value type = ${latestMediaDiagnostics.cameraStartFailed.valueType}`
+);
+
+console.log(
+ `microphoneNotFunctioning: ${latestMediaDiagnostics.microphoneNotFunctioning.value}, ` +
+ `value type = ${latestMediaDiagnostics.microphoneNotFunctioning.valueType}`
+);
+```
communication-services Call Recording https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/call-recording.md
Run-time control APIs can be used to manage recording via internal business logi
An Event Grid notification `Microsoft.Communication.RecordingFileStatusUpdated` is published when a recording is ready for retrieval, typically a few minutes after the recording process has completed (e.g. meeting ended, recording stopped). Recording event notifications include `contentLocation` and `metadataLocation`, which are used to retrieve both recorded media and a recording metadata file. ### Notification Schema Reference
-```
+```typescript
{ "id": string, // Unique guid for event "topic": string, // Azure Communication Services resource id
Many countries and states have laws and regulations that apply to the recording
Regulations around the maintenance of personal data require the ability to export user data. In order to support these requirements, recording metadata files include the participantId for each call participant in the `participants` array. You can cross-reference the MRIs in the `participants` array with your internal user identities to identify participants in a call. An example of a recording metadata file is provided below for reference. ## Next steps
-Check out the [Call Recoding Quickstart Sample](../../quickstarts/voice-video-calling/call-recording-sample.md) to learn more.
+Check out the [Call Recording Quickstart](../../quickstarts/voice-video-calling/call-recording-sample.md) to learn more.
Learn more about [Call Automation APIs](./call-automation-apis.md).
communication-services Call Transcription https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/call-transcription.md
+
+ Title: Display call transcription state on the client
+
+description: Use Azure Communication Services SDKs to display the call transcription state
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-plat-ios-android
+
+#Customer intent: As a developer, I want to display the call transcription state on the client.
++
+# Display call transcription state on the client
++
+When using call transcription, you may want to let your users know that a call is being transcribed. Here's how.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
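+
+The platform-specific steps are covered by the included iOS and Android content below. As an illustrative sketch only, the same pattern in the JavaScript Web SDK would look roughly like the following, assuming an established `call` object from the calling quickstart and that the SDK exposes a `Features.Transcription` feature analogous to the Diagnostics and Recording features used elsewhere in these articles:
+
+```js
+// Hedged sketch: surface the transcription state to the user.
+const callTranscriptionFeature = call.api(Features.Transcription);
+
+// Show the current state once...
+console.log(`Transcription active: ${callTranscriptionFeature.isTranscriptionActive}`);
+
+// ...and update the UI whenever it changes.
+callTranscriptionFeature.on('isTranscriptionActiveChanged', () => {
+    console.log(`Transcription active: ${callTranscriptionFeature.isTranscriptionActive}`);
+});
+```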
+++
+## Next steps
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to record calls](./record-calls.md)
communication-services Dominant Speaker https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/dominant-speaker.md
+
+ Title: Get active speakers
+
+description: Use Azure Communication Services SDKs to render the active speakers in a call.
++++ Last updated : 08/10/2021++
+#Customer intent: As a developer, I want to get a list of active speakers within a call.
++
+# Get active speakers within a call
++
+During an active call, you may want to get a list of active speakers in order to render or display them differently. Here's how.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
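+
+As a hedged sketch of what this can look like with the JavaScript Web SDK (the dominant speakers feature was in preview at the time of writing), assuming an established `call` object from the calling quickstart:
+
+```js
+// Sketch: list the current dominant speakers whenever the set changes.
+const dominantSpeakersFeature = call.api(Features.DominantSpeakers);
+
+const dominantSpeakersChangedHandler = () => {
+    // speakersList is ordered from most to least dominant speaker.
+    const speakersList = dominantSpeakersFeature.dominantSpeakers.speakersList;
+    console.log('Dominant speakers changed:', speakersList);
+};
+
+dominantSpeakersFeature.on('dominantSpeakersChanged', dominantSpeakersChangedHandler);
+```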
++
+## Next steps
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to record calls](./record-calls.md)
communication-services Events https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/events.md
+
+ Title: Subscribe to SDK events
+
+description: Use Azure Communication Services SDKs to subscribe to SDK events.
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-plat-web-ios-android
+
+#Customer intent: As a developer, I want to subscribe to SDK events so that I know when things change.
++
+# Subscribe to SDK events
+
+Azure Communication Services SDKs are dynamic and contain a lot of properties. When these change, as a developer you might want to know when and, more importantly, what changed. Here's how!
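+
+For example, with the JavaScript Web SDK, a minimal sketch of subscribing to collection and property changes (assuming `callAgent` and `call` objects from the calling quickstart) could look like this:
+
+```js
+// Sketch: react to calls being added to or removed from the call agent.
+callAgent.on('callsUpdated', (e) => {
+    e.added.forEach((call) => console.log(`Call added: ${call.id}`));
+    e.removed.forEach((call) => console.log(`Call removed: ${call.id}`));
+});
+
+// Sketch: react to property and collection changes on a single call.
+call.on('stateChanged', () => {
+    console.log(`Call state changed: ${call.state}`);
+});
+
+call.on('remoteParticipantsUpdated', (e) => {
+    e.added.forEach(() => console.log('A participant joined the call'));
+    e.removed.forEach(() => console.log('A participant left the call'));
+});
+```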
++++
+## Next steps
+- [Try our calling quickstart](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
+- [Try our video calling quickstart](../../quickstarts/voice-video-calling/get-started-with-video-calling.md)
+- [Learn how to enable push notifications](./push-notifications.md)
communication-services Manage Calls https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/manage-calls.md
+
+ Title: Manage calls
+
+description: Use Azure Communication Services SDKs to manage calls.
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-plat-web-ios-android-windows
+
+#Customer intent: As a developer, I want to manage calls with the acs sdks so that I can create a calling application that manages calls.
++
+# Manage calls
+
+Learn how to manage calls with the Azure Communication Services SDKs. We'll learn how to place calls and manage their participants and properties.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A `User Access Token` to enable the call client. For more information on [how to get a `User Access Token`](../../quickstarts/access-tokens.md)
+- Optional: Complete the quickstart for [getting started with adding calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
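+
+As a minimal, illustrative JavaScript (Web SDK) sketch of placing and ending a 1:1 call, assuming a `callAgent` created in the calling quickstart and a placeholder Communication Services user ID:
+
+```js
+// Sketch: place a 1:1 call to another Communication Services user.
+// '<ACS_USER_ID>' is a placeholder for a real Communication Services identity.
+const call = callAgent.startCall(
+    [{ communicationUserId: '<ACS_USER_ID>' }],
+    {}
+);
+
+console.log(`Started call with id: ${call.id}`);
+
+// ... later, end the call for the local participant (returns a Promise).
+call.hangUp();
+```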
+++++
+## Next steps
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to record calls](./record-calls.md)
+- [Learn how to transcribe calls](./call-transcription.md)
communication-services Manage Video https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/manage-video.md
+
+ Title: Manage video during calls
+
+description: Use Azure Communication Services SDKs to manage video calls.
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-web-ios-android
+
+#Customer intent: As a developer, I want to manage video calls with the acs sdks so that I can create a calling application that provides video capabilities.
++
+# Manage video during calls
+
+Learn how to manage video calls with the Azure Communication Services SDKs. We'll learn how to manage receiving and sending video within a call.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
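+
+As an illustrative JavaScript (Web SDK) sketch of turning outgoing video on and off during a call, assuming a `callClient` and an active `call` from the calling quickstart (`LocalVideoStream` is imported from `@azure/communication-calling`):
+
+```js
+// Sketch: pick the first available camera and start sending local video.
+// Run inside an async function.
+const deviceManager = await callClient.getDeviceManager();
+const cameras = await deviceManager.getCameras();
+const localVideoStream = new LocalVideoStream(cameras[0]);
+
+await call.startVideo(localVideoStream);
+
+// ... later, stop sending local video.
+await call.stopVideo(localVideoStream);
+```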
++++
+## Next steps
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to record calls](./record-calls.md)
+- [Learn how to transcribe calls](./call-transcription.md)
communication-services Push Notifications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/push-notifications.md
+
+ Title: Enable push notifications for calls.
+
+description: Use Azure Communication Services SDKs to enable push notifications for calls.
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-plat-ios-android
+
+#Customer intent: As a developer, I want to enable push notifications with the acs sdks so that I can create a calling application that provides push notifications to its users.
++
+# Enable push notifications for calls
+
+Here, we'll learn how to enable push notifications for Azure Communication Services calls. Setting these up will let your users know when they have an incoming call which they can then answer.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
+++
+## Next steps
+- [Learn how to subscribe to events](./events.md)
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to manage video](./manage-video.md)
communication-services Record Calls https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/record-calls.md
+
+ Title: Manage call recording on the client
+
+description: Use Azure Communication Services SDKs to manage call recording on the client.
++++ Last updated : 08/10/2021+
+zone_pivot_groups: acs-web-ios-android
+
+#Customer intent: As a developer, I want to manage call recording on the client so that my users can record calls.
++
+# Manage call recording on the client
++
+[Call recording](../../concepts/voice-video-calling/call-recording.md) lets your users record their calls made with Azure Communication Services. Here we'll learn how to manage recording on the client side. Before this can work, you will need to set up [server-side](../../quickstarts/voice-video-calling/call-recording-sample.md) recording.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
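+
+As a hedged JavaScript (Web SDK) sketch, assuming an established `call` object from the calling quickstart, the client can watch the recording state like this:
+
+```js
+// Sketch: let the user know whether the call is being recorded.
+const callRecordingFeature = call.api(Features.Recording);
+
+console.log(`Recording active: ${callRecordingFeature.isRecordingActive}`);
+
+callRecordingFeature.on('isRecordingActiveChanged', () => {
+    console.log(`Recording active: ${callRecordingFeature.isRecordingActive}`);
+});
+```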
++++
+## Next steps
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to transcribe calls](./call-transcription.md)
communication-services Teams Interoperability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/teams-interoperability.md
+
+ Title: Join a Teams meeting
+
+description: Use Azure Communication Services SDKs to join a Teams meeting.
++++ Last updated : 08/10/2021++
+#Customer intent: As a developer, I want to join a Teams meeting.
++
+# Join a Teams meeting
++
+Azure Communication Services SDKs can allow your users to join regular Microsoft Teams meetings. Here's how!
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
+
+> [!NOTE]
+> This API is provided as a preview for developers and may change based on feedback that we receive. Do not use this API in a production environment. To use this API, please use the 'beta' release of the ACS Calling Web SDK.
+
+To join a Teams meeting, use the `join` method and pass a meeting link or a meeting's coordinates.
+
+Join by using a meeting link:
+
+```js
+const locator = { meetingLink: '<MEETING_LINK>'}
+const call = callAgent.join(locator);
+```
+
+Join by using meeting coordinates:
+
+```js
+const locator = {
+ threadId: <thread id>,
+ organizerId: <organizer id>,
+ tenantId: <tenant id>,
+ messageId: <message id>
+}
+const call = callAgent.join(locator);
+```
+## Next steps
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to transfer calls](./transfer-calls.md)
communication-services Transfer Calls https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/calling-sdk/transfer-calls.md
+
+ Title: Transfer calls
+
+description: Use Azure Communication Services SDKs to transfer calls.
++++ Last updated : 08/10/2021++
+#Customer intent: As a developer, I want to learn how to transfer calls so that users have the option to transfer calls.
++
+# Transfer calls
++
+During an active call, you may want to transfer the call to another person or number. Let's learn how.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- A user access token to enable the calling client. For more information, see [Create and manage access tokens](../../quickstarts/access-tokens.md).
+- Optional: Complete the quickstart to [add voice calling to your application](../../quickstarts/voice-video-calling/getting-started-with-calling.md)
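+
+As a hedged JavaScript (Web SDK) sketch only, the transfer feature (in beta at the time of writing) follows the same feature-access pattern used elsewhere in these articles, assuming an active `call` and a placeholder target user ID:
+
+```js
+// Sketch: transfer the active call to another Communication Services user.
+// '<ACS_USER_ID>' is a placeholder for a real Communication Services identity.
+const callTransferFeature = call.api(Features.Transfer);
+const transfer = callTransferFeature.transfer({
+    targetParticipant: { communicationUserId: '<ACS_USER_ID>' }
+});
+
+// Watch the transfer state, for example: Transferring, Transferred, Failed.
+transfer.on('stateChanged', () => {
+    console.log(`Transfer state: ${transfer.state}`);
+});
+```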
++
+## Next steps
+- [Learn how to manage calls](./manage-calls.md)
+- [Learn how to manage video](./manage-video.md)
+- [Learn how to record calls](./record-calls.md)
+- [Learn how to transcribe calls](./call-transcription.md)
communication-services Calling Client Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/calling-client-samples.md
- Title: Quickstart - Use the Azure Communication Services Calling SDK-
-description: Learn about the Communication Services Calling SDK capabilities.
----- Previously updated : 06/30/2021--
-zone_pivot_groups: acs-plat-web-ios-android-windows
--
-# Quickstart: Use the Communication Services Calling SDK
-
-Get started with Azure Communication Services by using the Communication Services Calling SDK to add voice and video calling to your app.
-----
-## Clean up resources
-
-If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it. Learn more about [cleaning up resources](../create-communication-resource.md#clean-up-resources).
-
-## Next steps
-
-For more information, see the following articles:
--- Check out our [calling hero sample](../../samples/calling-hero-sample.md)-- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
connectors Apis List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/connectors/apis-list.md
Although you create connections from within a workflow, connections are separate
### Firewall access for connections
-If you use a firewall that limits traffic, and your logic app workflows need to communicate through that firewall, you have to set up your firewall to allow access for both the [inbound](../logic-apps/logic-apps-limits-and-config.md#inbound) and [outbound](../logic-apps/logic-apps-limits-and-config.md#outbound) IP addresses used by the Logic Apps service or runtime in the Azure region where your logic app workflows exist. If your workflows also use managed connectors, such as the Office 365 Outlook connector or SQL connector, or use custom connectors, your firewall also needs to allow access for *all* the [managed connector outbound IP addresses](../logic-apps/logic-apps-limits-and-config.md#outbound) in your logic app's Azure region. For more information, review [Firewall configuration](../logic-apps/logic-apps-limits-and-config.md#firewall-configuration-ip-addresses-and-service-tags).
+If you use a firewall that limits traffic, and your logic app workflows need to communicate through that firewall, you have to set up your firewall to allow access for both the [inbound](../logic-apps/logic-apps-limits-and-config.md#inbound) and [outbound](../logic-apps/logic-apps-limits-and-config.md#outbound) IP addresses used by the Logic Apps service or runtime in the Azure region where your logic app workflows exist. If your workflows also use managed connectors, such as the Office 365 Outlook connector or SQL connector, or use custom connectors, your firewall also needs to allow access for *all* the [managed connector outbound IP addresses](/connectors/common/outbound-ip-addresses#azure-logic-apps) in your logic app's Azure region. For more information, review [Firewall configuration](../logic-apps/logic-apps-limits-and-config.md#firewall-configuration-ip-addresses-and-service-tags).
## Recurrence behavior
connectors Connectors Create Api Azureblobstorage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/connectors/connectors-create-api-azureblobstorage.md
Other solutions for accessing storage accounts behind firewalls:
### Access storage accounts in other regions
-Logic apps can't directly access storage accounts behind firewalls when they're both in the same region. As a workaround, put your logic apps in a different region than your storage account. Then, give access to the [outbound IP addresses for the managed connectors in your region](../logic-apps/logic-apps-limits-and-config.md#outbound).
+Logic apps can't directly access storage accounts behind firewalls when they're both in the same region. As a workaround, put your logic apps in a different region than your storage account. Then, give access to the [outbound IP addresses for the managed connectors in your region](/connectors/common/outbound-ip-addresses#azure-logic-apps).
> [!NOTE] > This solution doesn't apply to the Azure Table Storage connector and Azure Queue Storage connector. Instead, to access your Table Storage or Queue Storage, [use the built-in HTTP trigger and actions](../logic-apps/logic-apps-http-endpoint.md).
container-registry Container Registry Troubleshoot Login https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-troubleshoot-login.md
Related links:
## Advanced troubleshooting
-If [collection of resource logs](monitor-service.md) is enabled in the registry, review the ContainterRegistryLoginEvents log. This log stores authentication events and status, including the incoming identity and IP address. Query the log for [registry authentication failures](monitor-service.md#registry-authentication-failures).
+If [collection of resource logs](monitor-service.md) is enabled in the registry, review the ContainerRegistryLoginEvents log. This log stores authentication events and status, including the incoming identity and IP address. Query the log for [registry authentication failures](monitor-service.md#registry-authentication-failures).
Related links:
If you don't resolve your problem here, see the following options.
* [Troubleshoot registry performance](container-registry-troubleshoot-performance.md) * [Community support](https://azure.microsoft.com/support/community/) options * [Microsoft Q&A](/answers/products/)
-* [Open a support ticket](https://azure.microsoft.com/support/create-ticket/) - based on information you provide, a quick diagnostic might be run for authentication failures in your registry
+* [Open a support ticket](https://azure.microsoft.com/support/create-ticket/) - based on information you provide, a quick diagnostic might be run for authentication failures in your registry
cosmos-db Concepts Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/concepts-limits.md
After you create an Azure Cosmos account under your subscription, you can manage
### Provisioned throughput
-You can provision throughput at a container-level or a database-level in terms of [request units (RU/s or RUs)](request-units.md). The following table lists the limits for storage and throughput per container/database.
+You can provision throughput at a container-level or a database-level in terms of [request units (RU/s or RUs)](request-units.md). The following table lists the limits for storage and throughput per container/database. Storage refers to the combined amount of data and index storage.
| Resource | Default limit | | | |
cosmos-db Cosmosdb Monitor Resource Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cosmosdb-monitor-resource-logs.md
Platform metrics and the Activity logs are collected automatically, whereas you
|CassandraRequests | Cassandra | Logs user-initiated requests from the front end to serve requests to Azure Cosmos DB's API for Cassandra. When you enable this category, make sure to disable DataPlaneRequests. | `operationName`, `requestCharge`, `piiCommandText` | |GremlinRequests | Gremlin | Logs user-initiated requests from the front end to serve requests to Azure Cosmos DB's API for Gremlin. When you enable this category, make sure to disable DataPlaneRequests. | `operationName`, `requestCharge`, `piiCommandText`, `retriedDueToRateLimiting` | |QueryRuntimeStatistics | SQL | This table details query operations executed against a SQL API account. By default, the query text and its parameters are obfuscated to avoid logging personal data with full text query logging available by request. | `databasename`, `partitionkeyrangeid`, `querytext` |
- |PartitionKeyStatistics | All APIs | Logs the statistics of logical partition keys by representing the storage size (KB) of the partition keys. This table is useful when troubleshooting storage skews. This PartitionKeyStatistics log is only emitted if the following conditions are true: <br/><ul><li> At least 1% of the documents have same logical partition key. </li><li> Out of all the keys, the top 3 keys with largest storage size are captured by the PartitionKeyStatistics log. </li></ul> If the previous conditions are not met, the partition key statistics data is not available. It's okay if the above conditions are not met for your account, which typically indicates you have no logical partition storage skew. | `subscriptionId`, `regionName`, `partitionKey`, `sizeKB` |
+ |PartitionKeyStatistics | All APIs | Logs the statistics of logical partition keys by representing the estimated storage size (KB) of the partition keys. This table is useful when troubleshooting storage skews. This PartitionKeyStatistics log is only emitted if the following conditions are true: <br/><ul><li> At least 1% of the documents in the physical partition have same logical partition key. </li><li> Out of all the keys in the physical partition, the top 3 keys with largest storage size are captured by the PartitionKeyStatistics log. </li></ul> If the previous conditions are not met, the partition key statistics data is not available. It's okay if the above conditions are not met for your account, which typically indicates you have no logical partition storage skew. <br/><br/>Note: The estimated size of the partition keys is calculated using a sampling approach that assumes the documents in the physical partition are roughly the same size. If the document sizes are not uniform in the physical partition, the estimated partition key size may not be accurate. | `subscriptionId`, `regionName`, `partitionKey`, `sizeKB` |
|PartitionKeyRUConsumption | SQL API | Logs the aggregated per-second RU/s consumption of partition keys. This table is useful for troubleshooting hot partitions. Currently, Azure Cosmos DB reports partition keys for SQL API accounts only and for point read/write and stored procedure operations. | `subscriptionId`, `regionName`, `partitionKey`, `requestCharge`, `partitionKeyRangeId` | |ControlPlaneRequests | All APIs | Logs details on control plane operations i.e. creating an account, adding or removing a region, updating account replication settings etc. | `operationName`, `httpstatusCode`, `httpMethod`, `region` | |TableApiRequests | Table API | Logs user-initiated requests from the front end to serve requests to Azure Cosmos DB's API for Table. When you enable this category, make sure to disable DataPlaneRequests. | `operationName`, `requestCharge`, `piiCommandText` |
cosmos-db Scaling Provisioned Throughput Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/scaling-provisioned-throughput-best-practices.md
This means that for 1 TB of data, we'll need 1000 GB / 40 GB = 25 physical parti
If we're using autoscale throughput or a shared throughput database, to get 25 physical partitions, we'd first provision 25 * 10,000 RU/s = 250,000 RU/s. Because we are already at the highest RU/s that can be supported with 25 physical partitions, we would not further increase our provisioned RU/s before the ingestion.
-In theory, with 250,000 RU/s and 1 TB of data, if we assume 1-kb documents and 10 RUs required for write, the ingestion can theoretically complete in: 1000 GB * (1,000,000 kb / 1 GB) * (1 document / 1 kb) * (10 RU / document) * (1 sec / 150,000 RU) * (1 hour / 3600 seconds) = 11.1 hours.
+In theory, with 250,000 RU/s and 1 TB of data, if we assume 1-kb documents and 10 RUs required for write, the ingestion can theoretically complete in: 1000 GB * (1,000,000 kb / 1 GB) * (1 document / 1 kb) * (10 RU / document) * (1 sec / 250,000 RU) * (1 hour / 3600 seconds) = 11.1 hours.
This calculation is an estimate assuming the client performing the ingestion can fully saturate the throughput and distribute writes across all physical partitions. As a best practice, itΓÇÖs recommended to ΓÇ£shuffleΓÇ¥ your data on the client-side. This ensures that each second, the client is writing to many distinct logical (and thus physical) partitions.
Once the migration is over, we can lower the RU/s or enable autoscale as needed.
## Next steps * [Monitor normalized RU/s consumption](monitor-normalized-request-units.md) of your database or container. * [Diagnose and troubleshoot](troubleshoot-request-rate-too-large.md) request rate too large (429) exceptions.
-* [Enable autoscale on a database or container](provision-throughput-autoscale.md).
+* [Enable autoscale on a database or container](provision-throughput-autoscale.md).
data-factory Author Visually https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/author-visually.md
Previously updated : 09/08/2020 Last updated : 09/09/2021 # Visual authoring in Azure Data Factory
data-factory Ci Cd Github Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/ci-cd-github-troubleshoot-guide.md
Previously updated : 09/07/2021 Last updated : 09/09/2021 # Troubleshoot CI-CD, Azure DevOps, and GitHub issues in ADF
data-factory Compute Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/compute-linked-services.md
Previously updated : 05/08/2019 Last updated : 09/09/2021
data-factory Concepts Data Flow Column Pattern https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-column-pattern.md
Previously updated : 05/21/2021 Last updated : 09/09/2021 # Using column patterns in mapping data flow
data-factory Concepts Data Flow Debug Mode https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-debug-mode.md
Previously updated : 04/16/2021 Last updated : 09/09/2021 # Mapping data flow Debug Mode
Once you turn on the slider, you will be prompted to select which integration ru
![Debug IR selection](media/data-flow/debug-new-1.png "Debug IR selection")
-When Debug mode is on, you'll interactively build your data flow with an active Spark cluster. The session will close once you turn debug off in Azure Data Factory. You should be aware of the hourly charges incurred by Azure Databricks during the time that you have the debug session turned on.
+When Debug mode is on, you'll interactively build your data flow with an active Spark cluster. The session will close once you turn debug off in Azure Data Factory. You should be aware of the hourly charges incurred by Azure Data Factory during the time that you have the debug session turned on.
In most cases, it's a good practice to build your Data Flows in debug mode so that you can validate your business logic and view your data transformations before publishing your work in Azure Data Factory. Use the "Debug" button on the pipeline panel to test your data flow in a pipeline.
data-factory Concepts Data Flow Expression Builder https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-expression-builder.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Build expressions in mapping data flow
data-factory Concepts Data Flow Manage Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-manage-graph.md
Previously updated : 09/02/2020 Last updated : 09/09/2021 # Managing the mapping data flow graph
data-factory Concepts Data Flow Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-monitoring.md
Previously updated : 06/18/2021 Last updated : 09/09/2021 # Monitor Data Flows
data-factory Concepts Data Flow Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-performance.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Mapping data flows performance and tuning guide
data-factory Concepts Data Flow Schema Drift https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-schema-drift.md
Previously updated : 04/15/2020 Last updated : 09/09/2021 # Schema drift in mapping data flow
data-factory Concepts Datasets Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-datasets-linked-services.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Datasets in Azure Data Factory and Azure Synapse Analytics
data-factory Concepts Integration Runtime Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-integration-runtime-performance.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Optimizing performance of the Azure Integration Runtime
data-factory Concepts Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-integration-runtime.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Integration runtime in Azure Data Factory
data-factory Concepts Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-linked-services.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Linked services in Azure Data Factory and Azure Synapse Analytics
data-factory Concepts Pipeline Execution Triggers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-pipeline-execution-triggers.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Concepts Pipelines Activities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-pipelines-activities.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Pipelines and activities in Azure Data Factory and Azure Synapse Analytics
data-factory Connect Data Factory To Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connect-data-factory-to-azure-purview.md
Previously updated : 09/02/2021 Last updated : 09/09/2021 # Connect Data Factory to Azure Purview (Preview)
data-factory Connector Amazon Marketplace Web Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-marketplace-web-service.md
Title: Copy data from AWS Marketplace
+description: Learn how to copy data from Amazon Marketplace Web Service to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Amazon Marketplace Web Service to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Amazon Marketplace Web Service using Azure Data Factory
+# Copy data from Amazon Marketplace Web Service using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Amazon Marketplace Web Service. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Amazon Marketplace Web Service. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Amazon Marketplace Web Service connector is supported for the following act
You can copy data from Amazon Marketplace Web Service to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Amazon Marketplace Web Service linked
| endpoint | The endpoint of the Amazon MWS server, (that is, mws.amazonservices.com) | Yes | | marketplaceID | The Amazon Marketplace ID you want to retrieve data from. To retrieve data from multiple Marketplace IDs, separate them with a comma (`,`). (that is, A2EUQ1WTGCTBG2) | Yes | | sellerID | The Amazon seller ID. | Yes |
-| mwsAuthToken | The Amazon MWS authentication token. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| mwsAuthToken | The Amazon MWS authentication token. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| accessKeyId | The access key ID used to access data. | Yes |
-| secretKey | The secret key used to access data. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| secretKey | The secret key used to access data. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
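For illustration, a linked service using these properties might look like the following sketch, assuming the connector's JSON type name is `AmazonMWS`; all names and secret values are placeholders:

```json
{
    "name": "AmazonMWSLinkedService",
    "properties": {
        "type": "AmazonMWS",
        "typeProperties": {
            "endpoint": "mws.amazonservices.com",
            "marketplaceID": "A2EUQ1WTGCTBG2",
            "sellerID": "<your seller ID>",
            "mwsAuthToken": {
                "type": "SecureString",
                "value": "<your MWS auth token>"
            },
            "accessKeyId": "<your access key ID>",
            "secretKey": {
                "type": "SecureString",
                "value": "<your secret key>"
            }
        }
    }
}
```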
To copy data from Amazon Marketplace Web Service, set the source type in the cop
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Amazon Redshift https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-redshift.md
Title: Copy data from Amazon Redshift
+description: Learn how to copy data from Amazon Redshift to supported sink data stores using Azure Data Factory or Synapse Analytics pipelines.
-description: Learn about how to copy data from Amazon Redshift to supported sink data stores by using Azure Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Amazon Redshift using Azure Data Factory
+# Copy data from Amazon Redshift using Azure Data Factory or Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-amazon-redshift-connector.md) > * [Current version](connector-amazon-redshift.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from an Amazon Redshift. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from an Amazon Redshift. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Amazon Redshift linked service:
| port |The number of the TCP port that the Amazon Redshift server uses to listen for client connections. |No, default is 5439 | | database |Name of the Amazon Redshift database. |Yes | | username |Name of user who has access to the database. |Yes |
-| password |Password for the user account. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| password |Password for the user account. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use Azure Integration Runtime or Self-hosted Integration Runtime (if your data store is located in private network). If not specified, it uses the default Azure Integration Runtime. |No | **Example:**
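A minimal linked service sketch based on the properties above; the `server` property and the `AmazonRedshift` type name are assumed from the standard connector pattern, and all values are placeholders:

```json
{
    "name": "AmazonRedshiftLinkedService",
    "properties": {
        "type": "AmazonRedshift",
        "typeProperties": {
            "server": "<Amazon Redshift cluster endpoint>",
            "port": 5439,
            "database": "<database name>",
            "username": "<user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```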
To copy data from Amazon Redshift, set the source type in the copy activity to *
| query |Use the custom query to read data. For example: select * from MyTable. |No (if "tableName" in dataset is specified) | | redshiftUnloadSettings | Property group when using Amazon Redshift UNLOAD. | No | | s3LinkedServiceName | Refers to an Amazon S3 to-be-used as an interim store by specifying a linked service name of "AmazonS3" type. | Yes if using UNLOAD |
-| bucketName | Indicate the S3 bucket to store the interim data. If not provided, Data Factory service generates it automatically. | Yes if using UNLOAD |
+| bucketName | Indicate the S3 bucket to store the interim data. If not provided, the service generates it automatically. | Yes if using UNLOAD |
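As a rough sketch (not the article's own example), the UNLOAD-related properties above might be combined in a copy activity source like this, with illustrative linked service and bucket names:

```json
{
    "source": {
        "type": "AmazonRedshiftSource",
        "query": "select * from MyTable",
        "redshiftUnloadSettings": {
            "s3LinkedServiceName": {
                "referenceName": "AmazonS3LinkedService",
                "type": "LinkedServiceReference"
            },
            "bucketName": "<interim S3 bucket>"
        }
    }
}
```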
**Example: Amazon Redshift source in copy activity using UNLOAD**
For this sample use case, copy activity unloads data from Amazon Redshift to Ama
## Data type mapping for Amazon Redshift
-When copying data from Amazon Redshift, the following mappings are used from Amazon Redshift data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from Amazon Redshift, the following mappings are used from Amazon Redshift data types to interim data types used internally within the service. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-| Amazon Redshift data type | Data factory interim data type |
+| Amazon Redshift data type | Interim service data type |
|: |: | | BIGINT |Int64 | | BOOLEAN |String |
When copying data from Amazon Redshift, the following mappings are used from Ama
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Amazon S3 Compatible Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-s3-compatible-storage.md
Title: Copy data from Amazon Simple Storage Service (S3) Compatible Storage
+description: Learn about how to copy data from Amazon S3 Compatible Storage to supported sink data stores using an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn about how to copy data from Amazon S3 Compatible Storage to supported sink data stores by using Azure Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Amazon S3 Compatible Storage by using Azure Data Factory
+# Copy data from Amazon S3 Compatible Storage by using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to copy data from Amazon Simple Storage Service (Amazon S3) Compatible Storage. To learn about Azure Data Factory, read the [introductory article](introduction.md).
+This article outlines how to copy data from Amazon Simple Storage Service (Amazon S3) Compatible Storage. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Synapse Analytics](../synapse-analytics/overview-what-is.md).
## Supported capabilities
Specifically, this Amazon S3 Compatible Storage connector supports copying files
To copy data from Amazon S3 Compatible Storage, make sure you've been granted the following permissions for Amazon S3 object operations: `s3:GetObject` and `s3:GetObjectVersion`.
-If you use Data Factory UI to author, additional `s3:ListAllMyBuckets` and `s3:ListBucket`/`s3:GetBucketLocation` permissions are required for operations like testing connection to linked service and browsing from root. If you don't want to grant these permissions, you can choose "Test connection to file path" or "Browse from specified path" options from the UI.
+If you use UI to author, additional `s3:ListAllMyBuckets` and `s3:ListBucket`/`s3:GetBucketLocation` permissions are required for operations like testing connection to linked service and browsing from root. If you don't want to grant these permissions, you can choose "Test connection to file path" or "Browse from specified path" options from the UI.
For the full list of Amazon S3 permissions, see [Specifying Permissions in a Policy](https://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html) on the AWS site.
Use the following steps to create a linked service to Amazon S3 Compatible Stora
## Connector configuration details
-The following sections provide details about properties that are used to define Data Factory entities specific to Amazon S3 Compatible Storage.
+The following sections provide details about properties that are used to define entities specific to Amazon S3 Compatible Storage.
## Linked service properties
The following properties are supported for an Amazon S3 Compatible linked servic
|: |: |: | | type | The **type** property must be set to **AmazonS3Compatible**. | Yes | | accessKeyId | ID of the secret access key. |Yes |
-| secretAccessKey | The secret access key itself. Mark this field as a **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| secretAccessKey | The secret access key itself. Mark this field as a **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
| serviceUrl | Specify the custom S3 endpoint `https://<service url>`. | No | | forcePathStyle | Indicates whether to use S3 [path-style access](https://docs.aws.amazon.com/AmazonS3/latest/dev/VirtualHosting.html#path-style-access) instead of virtual hosted-style access. Allowed values are: **false** (default), **true**.<br> Check each data storeΓÇÖs documentation on if path-style access is needed or not. |No | | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. |No |
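For illustration, a linked service definition using these properties might look like the following sketch, assuming the standard `AmazonS3Compatible` type name; credential values are placeholders:

```json
{
    "name": "AmazonS3CompatibleLinkedService",
    "properties": {
        "type": "AmazonS3Compatible",
        "typeProperties": {
            "accessKeyId": "<access key ID>",
            "secretAccessKey": {
                "type": "SecureString",
                "value": "<secret access key>"
            },
            "serviceUrl": "https://<service url>",
            "forcePathStyle": false
        }
    }
}
```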
This section describes the resulting behavior of using a file list path in a Cop
Assume that you have the following source folder structure and want to copy the files in bold:
-| Sample source structure | Content in FileListToCopy.txt | Data Factory configuration |
+| Sample source structure | Content in FileListToCopy.txt | Configuration |
| | | | | bucket<br/>&nbsp;&nbsp;&nbsp;&nbsp;FolderA<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File2.json<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File3.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File5.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;Metadata<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;FileListToCopy.txt | File1.csv<br>Subfolder1/File3.csv<br>Subfolder1/File5.csv | **In dataset:**<br>- Bucket: `bucket`<br>- Folder path: `FolderA`<br><br>**In Copy activity source:**<br>- File list path: `bucket/Metadata/FileListToCopy.txt` <br><br>The file list path points to a text file in the same data store that includes a list of files you want to copy, one file per line, with the relative path to the path configured in the dataset. |
To learn details about the properties, check [Delete activity](delete-activity.m
## Next steps
-For a list of data stores that the Copy activity in Azure Data Factory supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores that the Copy activity supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Amazon Simple Storage Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-simple-storage-service.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from Amazon Simple Storage Service by using Azure Data Factory
data-factory Connector Azure Blob Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-blob-storage.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure Blob storage by using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Azure Cosmos Db Mongodb Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-cosmos-db-mongodb-api.md
Title: Copy data from Azure Cosmos DB's API for MongoDB
+description: Learn how to copy data from supported source data stores to or from Azure Cosmos DB's API for MongoDB to supported sink stores using Azure Data Factory or Synapse Analytics pipelines.
-description: Learn how to copy data from supported source data stores to or from Azure Cosmos DB's API for MongoDB to supported sink stores by using Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data to or from Azure Cosmos DB's API for MongoDB by using Azure Data Factory
+# Copy data to or from Azure Cosmos DB's API for MongoDB using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use Copy Activity in Azure Data Factory to copy data from and to Azure Cosmos DB's API for MongoDB. The article builds on [Copy Activity in Azure Data Factory](copy-activity-overview.md), which presents a general overview of Copy Activity.
+This article outlines how to use Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from and to Azure Cosmos DB's API for MongoDB. The article builds on [Copy Activity](copy-activity-overview.md), which presents a general overview of Copy Activity.
>[!NOTE] >This connector only supports copy data to/from Azure Cosmos DB's API for MongoDB. For SQL API, refer to [Cosmos DB SQL API connector](connector-azure-cosmos-db.md). Other API types are not supported now.
The following properties are supported in the Copy Activity **sink** section:
| Property | Description | Required | |: |: |: | | type | The **type** property of the Copy Activity sink must be set to **CosmosDbMongoDbApiSink**. |Yes |
-| writeBehavior |Describes how to write data to Azure Cosmos DB. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same `_id` already exists; otherwise, insert the document.<br /><br />**Note**: Data Factory automatically generates an `_id` for a document if an `_id` isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an ID. |No<br />(the default is **insert**) |
+| writeBehavior |Describes how to write data to Azure Cosmos DB. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same `_id` already exists; otherwise, insert the document.<br /><br />**Note**: The service automatically generates an `_id` for a document if an `_id` isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an ID. |No<br />(the default is **insert**) |
| writeBatchSize | The **writeBatchSize** property controls the size of documents to write in each batch. You can try increasing the value for **writeBatchSize** to improve performance and decreasing the value if your document size is large. |No<br />(the default is **10,000**) | | writeBatchTimeout | The wait time for the batch insert operation to finish before it times out. The allowed value is timespan. | No<br/>(the default is **00:30:00** - 30 minutes) |
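For illustration, a copy activity sink using these properties might look like the following sketch (values are placeholders):

```json
{
    "sink": {
        "type": "CosmosDbMongoDbApiSink",
        "writeBehavior": "upsert",
        "writeBatchSize": 10000
    }
}
```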
The following properties are supported in the Copy Activity **sink** section:
You can use this Azure Cosmos DB connector to easily: * Copy documents between two Azure Cosmos DB collections as-is.
-* Import JSON documents from various sources to Azure Cosmos DB, including from MongoDB, Azure Blob storage, Azure Data Lake Store, and other file-based stores that Azure Data Factory supports.
+* Import JSON documents from various sources to Azure Cosmos DB, including from MongoDB, Azure Blob storage, Azure Data Lake Store, and other file-based stores that the service supports.
* Export JSON documents from an Azure Cosmos DB collection to various file-based stores. To achieve schema-agnostic copy:
After copy activity execution, below BSON ObjectId is generated in sink:
## Next steps
-For a list of data stores that Copy Activity supports as sources and sinks in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores that Copy Activity supports as sources and sinks, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Azure Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-cosmos-db.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure Cosmos DB (SQL API) by using Azure Data Factory
data-factory Connector Azure Data Explorer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-explorer.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data to or from Azure Data Explorer by using Azure Data Factory
data-factory Connector Azure Data Lake Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-storage.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Azure Data Lake Store https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-store.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data to or from Azure Data Lake Storage Gen1 using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Azure Database For Mariadb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-mariadb.md
Title: Copy data from Azure Database for MariaDB
+description: Learn how to copy data from Azure Database for MariaDB to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Azure Database for MariaDB to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Azure Database for MariaDB using Azure Data Factory
+# Copy data from Azure Database for MariaDB using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Azure Database for MariaDB. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Azure Database for MariaDB. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Azure Database for MariaDB connector is supported for the following activit
You can copy data from Azure Database for MariaDB to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
To copy data from Azure Database for MariaDB, the following properties are suppo
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Azure Database For Mysql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-mysql.md
Title: Copy and transform data in Azure Database for MySQL
+description: Learn how to copy and transform data in Azure Database for MySQL using Azure Data Factory or Synapse Analytics pipelines.
-description: earn how to copy and transform data in Azure Database for MySQL by using Azure Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy and transform data in Azure Database for MySQL by using Azure Data Factory
+# Copy and transform data in Azure Database for MySQL using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use Copy Activity in Azure Data Factory to copy data from and to Azure Database for MySQL, and use Data Flow to transform data in Azure Database for MySQL. To learn about Azure Data Factory, read the [introductory article](introduction.md).
+This article outlines how to use Copy Activity in Azure Data Factory or Synapse Analytics pipelines to copy data from and to Azure Database for MySQL, and use Data Flow to transform data in Azure Database for MySQL. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Synapse Analytics](../synapse-analytics/overview-what-is.md).
This connector is specialized for [Azure Database for MySQL service](../mysql/overview.md). To copy data from generic MySQL database located on-premises or in the cloud, use [MySQL connector](connector-mysql.md).
To learn details about the properties, check [Lookup activity](control-flow-look
## Data type mapping for Azure Database for MySQL
-When copying data from Azure Database for MySQL, the following mappings are used from MySQL data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from Azure Database for MySQL, the following mappings are used from MySQL data types to interim data types used internally within the service. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-| Azure Database for MySQL data type | Data factory interim data type |
+| Azure Database for MySQL data type | Interim service data type |
|: |: | | `bigint` |`Int64` | | `bigint unsigned` |`Decimal` |
When copying data from Azure Database for MySQL, the following mappings are used
| `year` |`Int32` | ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Azure Database For Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-postgresql.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure Database for PostgreSQL by using Azure Data Factory
data-factory Connector Azure Databricks Delta Lake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-databricks-delta-lake.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data to and from Azure Databricks Delta Lake using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Azure File Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-file-storage.md
Previously updated : 03/17/2021 Last updated : 09/09/2021 # Copy data from or to Azure Files by using Azure Data Factory
data-factory Connector Azure Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-search.md
Title: Copy data to Search index
+description: Learn about how to push or copy data to an Azure search index using the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn about how to push or copy data to an Azure search index by using the Copy Activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data to an Azure Cognitive Search index using Azure Data Factory
+# Copy data to an Azure Cognitive Search index using Azure Data Factory or Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-azure-search-connector.md)
Last updated 08/30/2021
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data into Azure Cognitive Search index. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data into Azure Cognitive Search index. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Azure Cognitive Search linked service
|: |: |: | | type | The type property must be set to: **AzureSearch** | Yes | | url | URL for the search service. | Yes |
-| key | Admin key for the search service. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| key | Admin key for the search service. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use Azure Integration Runtime or Self-hosted Integration Runtime (if your data store is located in private network). If not specified, it uses the default Azure Integration Runtime. |No | > [!IMPORTANT]
To copy data into Azure Cognitive Search, the following properties are supported
| Property | Description | Required | |: |: |: | | type | The type property of the dataset must be set to: **AzureSearchIndex** | Yes |
-| indexName | Name of the search index. Data Factory does not create the index. The index must exist in Azure Cognitive Search. | Yes |
+| indexName | Name of the search index. The service does not create the index. The index must exist in Azure Cognitive Search. | Yes |
**Example:**
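A dataset sketch based on the properties above, assuming the usual dataset envelope with a `linkedServiceName` reference; names are placeholders:

```json
{
    "name": "AzureSearchIndexDataset",
    "properties": {
        "type": "AzureSearchIndex",
        "typeProperties": {
            "indexName": "<index name>"
        },
        "linkedServiceName": {
            "referenceName": "<Azure Cognitive Search linked service name>",
            "type": "LinkedServiceReference"
        }
    }
}
```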
The following table specifies whether an Azure Cognitive Search data type is sup
Currently other data types e.g. ComplexType are not supported. For a full list of Azure Cognitive Search supported data types, see [Supported data types (Azure Cognitive Search)](/rest/api/searchservice/supported-data-types). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Azure Sql Data Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-data-warehouse.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure Synapse Analytics by using Azure Data Factory or Synapse pipelines
data-factory Connector Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-database.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure SQL Database by using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Azure Sql Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-managed-instance.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Azure SQL Managed Instance by using Azure Data Factory
data-factory Connector Azure Table Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-table-storage.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data to and from Azure Table storage by using Azure Data Factory
data-factory Connector Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-cassandra.md
Title: Copy data from Cassandra using Azure Data Factory
+ Title: Copy data from Cassandra
+description: Learn how to copy data from Cassandra to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Cassandra to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Cassandra using Azure Data Factory
+# Copy data from Cassandra using Azure Data Factory or Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-onprem-cassandra-connector.md) > * [Current version](connector-cassandra.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from a Cassandra database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a Cassandra database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Cassandra linked service:
| port |The TCP port that the Cassandra server uses to listen for client connections. |No (default is 9042) | | authenticationType | Type of authentication used to connect to the Cassandra database.<br/>Allowed values are: **Basic**, and **Anonymous**. |Yes | | username |Specify user name for the user account. |Yes, if authenticationType is set to Basic. |
-| password |Specify password for the user account. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes, if authenticationType is set to Basic. |
+| password |Specify password for the user account. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes, if authenticationType is set to Basic. |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. Learn more from [Prerequisites](#prerequisites) section. If not specified, it uses the default Azure Integration Runtime. |No | >[!NOTE]
To copy data from Cassandra, set the source type in the copy activity to **Cassa
## Data type mapping for Cassandra
-When copying data from Cassandra, the following mappings are used from Cassandra data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from Cassandra, the following mappings are used from Cassandra data types to interim data types used internally within the service. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-| Cassandra data type | Data factory interim data type |
+| Cassandra data type | Interim service data type |
|: |: | | ASCII |String | | BIGINT |Int64 |
When copying data from Cassandra, the following mappings are used from Cassandra
## Work with collections using virtual table
-Azure Data Factory uses a built-in ODBC driver to connect to and copy data from your Cassandra database. For collection types including map, set and list, the driver renormalizes the data into corresponding virtual tables. Specifically, if a table contains any collection columns, the driver generates the following virtual tables:
+The service uses a built-in ODBC driver to connect to and copy data from your Cassandra database. For collection types including map, set and list, the driver renormalizes the data into corresponding virtual tables. Specifically, if a table contains any collection columns, the driver generates the following virtual tables:
* A **base table**, which contains the same data as the real table except for the collection columns. The base table uses the same name as the real table that it represents. * A **virtual table** for each collection column, which expands the nested data. The virtual tables that represent collections are named using the name of the real table, a separator "*vt*" and the name of the column.
-Virtual tables refer to the data in the real table, enabling the driver to access the denormalized data. See Example section for details. You can access the content of Cassandra collections by querying and joining the virtual tables.
+Virtual tables refer to the data in the real table, enabling the driver to access the de-normalized data. See Example section for details. You can access the content of Cassandra collections by querying and joining the virtual tables.
### Example
The following tables show the virtual tables that renormalize the data from the
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Concur https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-concur.md
Title: Copy data from Concur using Azure Data Factory (Preview)
+ Title: Copy data from Concur (Preview)
+description: Learn how to copy data from Concur to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Concur to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Concur using Azure Data Factory (Preview)
+# Copy data from Concur using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Concur. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Concur. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
The following properties are supported for Concur linked service:
| host | The endpoint of the Concur server, e.g. `implementation.concursolutions.com`. | Yes | | baseUrl | The base URL of your Concur's authorization URL. | Yes for `OAuth_2.0_Bearer` authentication | | clientId | Application client ID supplied by Concur App Management. | Yes |
-| clientSecret | The client secret corresponding to the client ID. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for `OAuth_2.0_Bearer` authentication |
+| clientSecret | The client secret corresponding to the client ID. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for `OAuth_2.0_Bearer` authentication |
| username | The user name that you use to access Concur service. | Yes |
-| password | The password corresponding to the user name that you provided in the username field. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| password | The password corresponding to the user name that you provided in the username field. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
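For illustration, a linked service using the OAuth 2.0 properties above might be sketched as follows; the `Concur` type name and the flat property nesting are assumptions, and all values are placeholders:

```json
{
    "name": "ConcurLinkedService",
    "properties": {
        "type": "Concur",
        "typeProperties": {
            "host": "implementation.concursolutions.com",
            "clientId": "<client ID from Concur App Management>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            },
            "username": "<user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```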
To copy data from Concur, set the source type in the copy activity to **ConcurSo
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Couchbase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-couchbase.md
Title: Copy data from Couchbase using Azure Data Factory (Preview)
+ Title: Copy data from Couchbase (Preview)
+description: Learn how to copy data from Couchbase to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Couchbase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from Couchbase using Azure Data Factory (Preview) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Couchbase. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Couchbase. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Couchbase connector is supported for the following activities:
You can copy data from Couchbase to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Prerequisites
To copy data from Couchbase, set the source type in the copy activity to **Couch
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Db2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-db2.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from DB2 by using Azure Data Factory
data-factory Connector Drill https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-drill.md
Title: Copy data from Drill using Azure Data Factory
+ Title: Copy data from Drill
+description: Learn how to copy data from Drill to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Drill to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Drill using Azure Data Factory
+# Copy data from Drill using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Drill. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Drill. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Drill connector is supported for the following activities:
You can copy data from Drill to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Prerequisites
To copy data from Drill, set the source type in the copy activity to **DrillSour
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Dynamics Ax https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-ax.md
Title: Copy data from Dynamics AX
+description: Learn how to copy data from Dynamics AX to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Dynamics AX to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Dynamics AX by using Azure Data Factory
+# Copy data from Dynamics AX using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use Copy Activity in Azure Data Factory to copy data from Dynamics AX source. The article builds on [Copy Activity in Azure Data Factory](copy-activity-overview.md), which presents a general overview of Copy Activity.
+This article outlines how to use Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from Dynamics AX source. The article builds on [Copy Activity](copy-activity-overview.md), which presents a general overview of Copy Activity.
## Supported capabilities
The following properties are supported for Dynamics AX linked service:
| type | The **type** property must be set to **DynamicsAX**. |Yes | | url | The Dynamics AX (or Dynamics 365 Finance and Operations) instance OData endpoint. |Yes | | servicePrincipalId | Specify the application's client ID. | Yes |
-| servicePrincipalKey | Specify the application's key. Mark this field as a **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| servicePrincipalKey | Specify the application's key. Mark this field as a **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| tenant | Specify the tenant information (domain name or tenant ID) under which your application resides. Retrieve it by hovering the mouse in the top-right corner of the Azure portal. | Yes | | aadResourceId | Specify the AAD resource you are requesting for authorization. For example, if your Dynamics URL is `https://sampledynamics.sandbox.operations.dynamics.com/data/`, the corresponding AAD resource is usually `https://sampledynamics.sandbox.operations.dynamics.com`. | Yes | | connectVia | The [Integration Runtime](concepts-integration-runtime.md) to use to connect to the data store. You can choose Azure Integration Runtime or a self-hosted Integration Runtime (if your data store is located in a private network). If not specified, the default Azure Integration Runtime is used. |No |
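For illustration, a linked service using these properties might look like the following sketch, assuming the `DynamicsAX` type name; URLs and IDs are placeholders:

```json
{
    "name": "DynamicsAXLinkedService",
    "properties": {
        "type": "DynamicsAX",
        "typeProperties": {
            "url": "https://<instance>.sandbox.operations.dynamics.com/data/",
            "servicePrincipalId": "<application client ID>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant domain or ID>",
            "aadResourceId": "https://<instance>.sandbox.operations.dynamics.com"
        }
    }
}
```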
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores that Copy Activity supports as sources and sinks in Azure Data Factory, see [Supported data stores and formats](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores that Copy Activity supports as sources and sinks, see [Supported data stores and formats](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Dynamics Crm Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-crm-office-365.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from and to Dynamics 365 (Microsoft Dataverse) or Dynamics CRM
data-factory Connector File System https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-file-system.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Ftp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-ftp.md
Previously updated : 03/17/2021 Last updated : 09/09/2021
data-factory Connector Github https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-github.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 - # Use GitHub to read Common Data Model entity references [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-The GitHub connector in Azure Data Factory is only used to receive the entity reference schema for the [Common Data Model](format-common-data-model.md) format in mapping data flow.
+The GitHub connector in Azure Data Factory and Synapse Analytics pipelines is only used to receive the entity reference schema for the [Common Data Model](format-common-data-model.md) format in mapping data flow.
## Create a linked service to GitHub using UI
data-factory Connector Google Adwords https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-adwords.md
Title: Copy data from Google AdWords
+description: Learn how to copy data from Google AdWords to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Google AdWords to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Google AdWords using Azure Data Factory
+# Copy data from Google AdWords using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Google AdWords. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Google AdWords. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Google AdWords connector is supported for the following activities:
You can copy data from Google AdWords to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Google AdWords linked service:
|: |: |: | | type | The type property must be set to: **GoogleAdWords** | Yes | | clientCustomerID | The Client customer ID of the AdWords account that you want to fetch report data for. | Yes |
-| developerToken | The developer token associated with the manager account that you use to grant access to the AdWords API. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
+| developerToken | The developer token associated with the manager account that you use to grant access to the AdWords API. You can choose to mark this field as a SecureString to store it securely, or store password in Azure Key Vault and let the copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
| authenticationType | The OAuth 2.0 authentication mechanism used for authentication. ServiceAuthentication can only be used on self-hosted IR. <br/>Allowed values are: **ServiceAuthentication**, **UserAuthentication** | Yes |
-| refreshToken | The refresh token obtained from Google for authorizing access to AdWords for UserAuthentication. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
-| clientId | The client ID of the Google application used to acquire the refresh token. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
-| clientSecret | The client secret of the google application used to acquire the refresh token. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
+| refreshToken | The refresh token obtained from Google for authorizing access to AdWords for UserAuthentication. You can choose to mark this field as a SecureString to store it securely, or store the password in Azure Key Vault and let the copy activity pull it from there when copying data - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
+| clientId | The client ID of the Google application used to acquire the refresh token. You can choose to mark this field as a SecureString to store it securely, or store the password in Azure Key Vault and let the copy activity pull it from there when copying data - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
+| clientSecret | The client secret of the Google application used to acquire the refresh token. You can choose to mark this field as a SecureString to store it securely, or store the password in Azure Key Vault and let the copy activity pull it from there when copying data - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | No |
| email | The service account email ID that is used for ServiceAuthentication and can only be used on self-hosted IR. | No | | keyFilePath | The full path to the .p12 key file that is used to authenticate the service account email address and can only be used on self-hosted IR. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No |
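As an illustration of how these properties fit together, the following is a minimal sketch of a Google AdWords linked service definition that uses UserAuthentication. The linked service name and all placeholder values are illustrative, and optional properties such as `connectVia` are omitted; treat it as a starting point rather than a definitive definition.

```json
{
    "name": "GoogleAdWordsLinkedService",
    "properties": {
        "type": "GoogleAdWords",
        "typeProperties": {
            "clientCustomerID": "<client customer ID>",
            "developerToken": {
                "type": "SecureString",
                "value": "<developer token>"
            },
            "authenticationType": "UserAuthentication",
            "refreshToken": {
                "type": "SecureString",
                "value": "<refresh token>"
            },
            "clientId": {
                "type": "SecureString",
                "value": "<client ID>"
            },
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```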
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Google Bigquery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-bigquery.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from Google BigQuery by using Azure Data Factory
data-factory Connector Google Cloud Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-cloud-storage.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Greenplum https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-greenplum.md
Title: Copy data from Greenplum using Azure Data Factory
+ Title: Copy data from Greenplum
+description: Learn how to copy data from Greenplum to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Greenplum to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Greenplum using Azure Data Factory
+# Copy data from Greenplum using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Greenplum. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Greenplum. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Greenplum connector is supported for the following activities:
You can copy data from Greenplum to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Prerequisites
To copy data from Greenplum, set the source type in the copy activity to **Green
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Hbase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hbase.md
Title: Copy data from HBase using Azure Data Factory
+ Title: Copy data from HBase
+description: Learn how to copy data from HBase to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from HBase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from HBase using Azure Data Factory
+# Copy data from HBase using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from HBase. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from HBase. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This HBase connector is supported for the following activities:
You can copy data from HBase to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Prerequisites
The following properties are supported for HBase linked service:
| httpPath | The partial URL corresponding to the HBase server, e.g. `/hbaserest0` when using HDInsights cluster. | No | | authenticationType | The authentication mechanism to use to connect to the HBase server. <br/>Allowed values are: **Anonymous**, **Basic** | Yes | | username | The user name used to connect to the HBase instance. | No |
-| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No | | allowHostNameCNMismatch | Specifies whether to require a CA-issued TLS/SSL certificate name to match the host name of the server when connecting over TLS. The default value is false. | No |
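The following is a minimal sketch of what an HBase linked service definition might look like with Basic authentication over TLS. The `host` and `port` properties and the `HBase` type value are assumed from the full connector reference rather than shown in the table above, and all values are placeholders.

```json
{
    "name": "HBaseLinkedService",
    "properties": {
        "type": "HBase",
        "typeProperties": {
            "host": "<host name or IP address>",
            "port": "<port>",
            "httpPath": "/hbaserest0",
            "authenticationType": "Basic",
            "username": "<user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            },
            "enableSsl": true
        }
    }
}
```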
To copy data from HBase, set the source type in the copy activity to **HBaseSour
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Hdfs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hdfs.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Hive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hive.md
Title: Copy data from Hive using Azure Data Factory
+ Title: Copy data from Hive
+description: Learn how to copy data from Hive to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Hive to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data from Hive using Azure Data Factory [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Hive. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Hive. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Hive connector is supported for the following activities:
You can copy data from Hive to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Prerequisites
The following properties are supported for Hive linked service:
| zooKeeperNameSpace | The namespace on ZooKeeper under which Hive Server 2 nodes are added. | No | | useNativeQuery | Specifies whether the driver uses native HiveQL queries, or converts them into an equivalent form in HiveQL. | No | | username | The user name that you use to access Hive Server. | No |
-| password | The password corresponding to the user. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password corresponding to the user. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| httpPath | The partial URL corresponding to the Hive server. | No | | enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No |
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Http https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-http.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from an HTTP endpoint by using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Hubspot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hubspot.md
Title: Copy data from HubSpot using Azure Data Factory
+ Title: Copy data from HubSpot
+description: Learn how to copy data from HubSpot to supported sink data stores by using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from HubSpot to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from HubSpot using Azure Data Factory
+# Copy data from HubSpot using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from HubSpot. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from HubSpot. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This HubSpot connector is supported for the following activities:
You can copy data from HubSpot to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Getting started
The following properties are supported for HubSpot linked service:
|: |: |: | | type | The type property must be set to: **Hubspot** | Yes | | clientId | The client ID associated with your HubSpot application. Learn how to create an app in HubSpot from [here](https://developers.hubspot.com/docs/faq/how-do-i-create-an-app-in-hubspot). | Yes |
-| clientSecret | The client secret associated with your HubSpot application. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| accessToken | The access token obtained when initially authenticating your OAuth integration. Learn how to get access token with your client ID and secret from [here](https://developers.hubspot.com/docs/methods/oauth2/get-access-and-refresh-tokens). Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| refreshToken | The refresh token obtained when initially authenticating your OAuth integration. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecret | The client secret associated with your HubSpot application. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| accessToken | The access token obtained when initially authenticating your OAuth integration. Learn how to get an access token with your client ID and secret from [here](https://developers.hubspot.com/docs/methods/oauth2/get-access-and-refresh-tokens). Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| refreshToken | The refresh token obtained when initially authenticating your OAuth integration. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
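Below is a minimal sketch of a HubSpot linked service definition based on the properties above. All token and secret values are placeholders; in practice you would typically reference secrets stored in Azure Key Vault instead of embedding them.

```json
{
    "name": "HubspotLinkedService",
    "properties": {
        "type": "Hubspot",
        "typeProperties": {
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            },
            "accessToken": {
                "type": "SecureString",
                "value": "<access token>"
            },
            "refreshToken": {
                "type": "SecureString",
                "value": "<refresh token>"
            }
        }
    }
}
```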
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Impala https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-impala.md
Title: Copy data from Impala by using Azure Data Factory
+ Title: Copy data from Impala
+description: Learn how to copy data from Impala to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Impala to supported sink data stores by using a copy activity in a data factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Impala by using Azure Data Factory
+# Copy data from Impala using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use Copy Activity in Azure Data Factory to copy data from Impala. It builds on the [Copy Activity overview](copy-activity-overview.md) article that presents a general overview of the copy activity.
+This article outlines how to use Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Impala. It builds on the [Copy Activity overview](copy-activity-overview.md) article that presents a general overview of the copy activity.
## Supported capabilities
This Impala connector is supported for the following activities:
You can copy data from Impala to any supported sink data store. For a list of data stores that are supported as sources or sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Data Factory provides a built-in driver to enable connectivity. Therefore, you don't need to manually install a driver to use this connector.
+The service provides a built-in driver to enable connectivity. Therefore, you don't need to manually install a driver to use this connector.
## Prerequisites
The following properties are supported for Impala linked service.
| port | The TCP port that the Impala server uses to listen for client connections. The default value is 21050. | No | | authenticationType | The authentication type to use. <br/>Allowed values are **Anonymous**, **SASLUsername**, and **UsernameAndPassword**. | Yes | | username | The user name used to access the Impala server. The default value is anonymous when you use SASLUsername. | No |
-| password | The password that corresponds to the user name when you use UsernameAndPassword. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password that corresponds to the user name when you use UsernameAndPassword. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| enableSsl | Specifies whether the connections to the server are encrypted by using TLS. The default value is **false**. | No | | trustedCertPath | The full path of the .pem file that contains trusted CA certificates used to verify the server when you connect over TLS. This property can be set only when you use TLS on Self-hosted Integration Runtime. The default value is the cacerts.pem file installed with the integration runtime. | No | | useSystemTrustStore | Specifies whether to use a CA certificate from the system trust store or from a specified PEM file. The default value is **false**. | No |
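A minimal sketch of an Impala linked service definition using UsernameAndPassword authentication follows. The `host` property and the `Impala` type value are assumed from the full connector reference, and the values shown are placeholders only.

```json
{
    "name": "ImpalaLinkedService",
    "properties": {
        "type": "Impala",
        "typeProperties": {
            "host": "<host name or IP address>",
            "port": 21050,
            "authenticationType": "UsernameAndPassword",
            "username": "<user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```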
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Data Factory, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Informix https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-informix.md
Title: Copy data from and to IBM Informix using Azure Data Factory
+ Title: Copy data from and to IBM Informix
+description: Learn how to copy data from and to IBM Informix using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from and to IBM Informix by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from and to IBM Informix using Azure Data Factory
+# Copy data from and to IBM Informix using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from an IBM Informix data store. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from an IBM Informix data store. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Informix linked service:
| connectionString | The ODBC connection string excluding the credential portion. You can specify the connection string or use the system DSN (Data Source Name) you set up on the Integration Runtime machine (you still need to specify the credential portion in the linked service accordingly). <br> You can also put a password in Azure Key Vault and pull the `password` configuration out of the connection string. Refer to [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) for more details.| Yes | | authenticationType | Type of authentication used to connect to the Informix data store.<br/>Allowed values are: **Basic** and **Anonymous**. | Yes | | userName | Specify user name if you are using Basic authentication. | No |
-| password | Specify password for the user account you specified for the userName. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | Specify password for the user account you specified for the userName. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| credential | The access credential portion of the connection string specified in driver-specific property-value format. Mark this field as a SecureString. | No | | connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. A Self-hosted Integration Runtime is required as mentioned in [Prerequisites](#prerequisites). |Yes |
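The following sketch shows how an Informix linked service might reference a password stored in Azure Key Vault. The `Informix` type value and the Key Vault secret and integration runtime reference shapes are assumed from the general linked service format; all names and values are placeholders.

```json
{
    "name": "InformixLinkedService",
    "properties": {
        "type": "Informix",
        "typeProperties": {
            "connectionString": "<ODBC connection string without credentials>",
            "authenticationType": "Basic",
            "userName": "<user name>",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secret name>"
            }
        },
        "connectVia": {
            "referenceName": "<name of self-hosted integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```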
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Jira https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-jira.md
Title: Copy data from Jira using Azure Data Factory
+ Title: Copy data from Jira
+description: Learn how to copy data from Jira to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Jira to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Jira using Azure Data Factory
+# Copy data from Jira using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Jira. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Jira. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Jira connector is supported for the following activities:
You can copy data from Jira to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Getting started
The following properties are supported for Jira linked service:
| host | The IP address or host name of the Jira service. (for example, jira.example.com) | Yes | | port | The TCP port that the Jira server uses to listen for client connections. The default value is 443 if connecting through HTTPS, or 8080 if connecting through HTTP. | No | | username | The user name that you use to access Jira Service. | Yes |
-| password | The password corresponding to the user name that you provided in the username field. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| password | The password corresponding to the user name that you provided in the username field. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
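A minimal sketch of a Jira linked service definition based on the properties above follows. The `Jira` type value is assumed from the full connector reference, and the host, user name, and password values are placeholders.

```json
{
    "name": "JiraLinkedService",
    "properties": {
        "type": "Jira",
        "typeProperties": {
            "host": "jira.example.com",
            "port": 443,
            "username": "<user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```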
To copy data from Jira, set the source type in the copy activity to **JiraSource
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Magento https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-magento.md
Title: Copy data from Magento using Azure Data Factory (Preview)
+ Title: Copy data from Magento (Preview)
+description: Learn how to copy data from Magento to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Magento to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Magento using Azure Data Factory (Preview)
+# Copy data from Magento using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Magento. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Magento. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Magento connector is supported for the following activities:
You can copy data from Magento to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
## Getting started
The following properties are supported for Magento linked service:
|: |: |: | | type | The type property must be set to: **Magento** | Yes | | host | The URL of the Magento instance. (that is, 192.168.222.110/magento3) | Yes |
-| accessToken | The access token from Magento. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| accessToken | The access token from Magento. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
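The following is a minimal sketch of a Magento linked service definition based on the properties above. The host and access token values are placeholders, and `useEncryptedEndpoints` is shown with its default value.

```json
{
    "name": "MagentoLinkedService",
    "properties": {
        "type": "Magento",
        "typeProperties": {
            "host": "192.168.222.110/magento3",
            "accessToken": {
                "type": "SecureString",
                "value": "<access token>"
            },
            "useEncryptedEndpoints": true
        }
    }
}
```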
To copy data from Magento, set the source type in the copy activity to **Magento
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Mariadb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mariadb.md
Title: Copy data from MariaDB using Azure Data Factory
+ Title: Copy data from MariaDB
+description: Learn how to copy data from MariaDB to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from MariaDB to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from MariaDB using Azure Data Factory
+# Copy data from MariaDB using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from MariaDB. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from MariaDB. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This MariaDB connector is supported for the following activities:
You can copy data from MariaDB to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
This connector currently supports MariaDB versions 10.0 to 10.2.
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Marketo https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-marketo.md
Title: Copy data from Marketo using Azure Data Factory (Preview)
+ Title: Copy data from Marketo (Preview)
+description: Learn how to copy data from Marketo to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Marketo to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Marketo using Azure Data Factory (Preview)
+
+# Copy data from Marketo using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Marketo. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Marketo. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
The following properties are supported for Marketo linked service:
| type | The type property must be set to: **Marketo** | Yes | | endpoint | The endpoint of the Marketo server. (i.e. 123-ABC-321.mktorest.com) | Yes | | clientId | The client Id of your Marketo service. | Yes |
-| clientSecret | The client secret of your Marketo service. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecret | The client secret of your Marketo service. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
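A minimal sketch of a Marketo linked service definition based on the properties above follows; the endpoint, client ID, and client secret values are placeholders.

```json
{
    "name": "MarketoLinkedService",
    "properties": {
        "type": "Marketo",
        "typeProperties": {
            "endpoint": "123-ABC-321.mktorest.com",
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```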
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Microsoft Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-microsoft-access.md
Title: Copy data from and to Microsoft Access
+description: Learn how to copy data from and to Microsoft Access using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from and to Microsoft Access by using a copy activity in an Azure Data Factory pipeline.
--++ Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from and to Microsoft Access using Azure Data Factory
+# Copy data from and to Microsoft Access using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from a Microsoft Access data store. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from a Microsoft Access data store. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Microsoft Access linked service:
| connectionString | The ODBC connection string excluding the credential portion. You can specify the connection string or use the system DSN (Data Source Name) you set up on the Integration Runtime machine (you still need to specify the credential portion in the linked service accordingly).<br> You can also put a password in Azure Key Vault and pull the `password` configuration out of the connection string. Refer to [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) for more details.| Yes | | authenticationType | Type of authentication used to connect to the Microsoft Access data store.<br/>Allowed values are: **Basic** and **Anonymous**. | Yes | | userName | Specify user name if you are using Basic authentication. | No |
-| password | Specify password for the user account you specified for the userName. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | Specify password for the user account you specified for the userName. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| credential | The access credential portion of the connection string specified in driver-specific property-value format. Mark this field as a SecureString. | No | | connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. A Self-hosted Integration Runtime is required as mentioned in [Prerequisites](#prerequisites). |Yes |
To copy data to Microsoft Access, the following properties are supported in the
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Mongodb Atlas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb-atlas.md
Title: Copy data from or to MongoDB Atlas
+description: Learn how to copy data from MongoDB Atlas to supported sink data stores, or from supported source data stores to MongoDB Atlas, using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from MongoDB Atlas to supported sink data stores, or from supported source data stores to MongoDB Atlas, by using a copy activity in an Azure Data Factory pipeline.
--++ Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from or to MongoDB Atlas using Azure Data Factory
+# Copy data from or to MongoDB Atlas using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from and to a MongoDB Atlas database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from and to a MongoDB Atlas database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported in the copy activity **source** section:
| batchSize | Specifies the number of documents to return in each batch of the response from the MongoDB Atlas instance. In most cases, modifying the batch size will not affect the user or the application. Cosmos DB limits each batch to a maximum of 40 MB (the sum of the sizes of the batchSize number of documents), so decrease this value if your documents are large. | No<br/>(the default is **100**) | >[!TIP]
->ADF support consuming BSON document in **Strict mode**. Make sure your filter query is in Strict mode instead of Shell mode. More description can be found at [MongoDB manual](https://docs.mongodb.com/manual/reference/mongodb-extended-json/https://docsupdatetracker.net/index.html).
+>The service supports consuming BSON documents in **Strict mode**. Make sure your filter query is in Strict mode instead of Shell mode. For more details, see the [MongoDB manual](https://docs.mongodb.com/manual/reference/mongodb-extended-json/).
**Example:**
The following properties are supported in the Copy Activity **sink** section:
| Property | Description | Required | |: |: |: | | type | The **type** property of the Copy Activity sink must be set to **MongoDbAtlasSink**. |Yes |
-| writeBehavior |Describes how to write data to MongoDB Atlas. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same `_id` already exists; otherwise, insert the document.<br /><br />**Note**: Data Factory automatically generates an `_id` for a document if an `_id` isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an ID. |No<br />(the default is **insert**) |
+| writeBehavior |Describes how to write data to MongoDB Atlas. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same `_id` already exists; otherwise, insert the document.<br /><br />**Note**: The service automatically generates an `_id` for a document if an `_id` isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an ID. |No<br />(the default is **insert**) |
| writeBatchSize | The **writeBatchSize** property controls the size of documents to write in each batch. You can try increasing the value for **writeBatchSize** to improve performance, and decreasing the value if your documents are large. |No<br />(the default is **10,000**) | | writeBatchTimeout | The wait time for the batch insert operation to finish before it times out. The allowed value is timespan. | No<br/>(the default is **00:30:00** - 30 minutes) |
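As a sketch of how the sink properties above might be used, the following copy activity fragment writes to MongoDB Atlas with upsert behavior. The activity name and source type are placeholders, and the dataset `inputs`/`outputs` references are omitted for brevity.

```json
{
    "name": "CopyToMongoDbAtlas",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "<source type>"
        },
        "sink": {
            "type": "MongoDbAtlasSink",
            "writeBehavior": "upsert",
            "writeBatchSize": 10000,
            "writeBatchTimeout": "00:30:00"
        }
    }
}
```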
The following properties are supported in the Copy Activity **sink** section:
You can use this MongoDB Atlas connector to easily: * Copy documents between two MongoDB Atlas collections as-is.
-* Import JSON documents from various sources to MongoDB Atlas, including from Azure Cosmos DB, Azure Blob storage, Azure Data Lake Store, and other file-based stores that Azure Data Factory supports.
+* Import JSON documents from various sources to MongoDB Atlas, including from Azure Cosmos DB, Azure Blob storage, Azure Data Lake Store, and other supported file-based stores.
* Export JSON documents from a MongoDB Atlas collection to various file-based stores. To achieve such schema-agnostic copy, skip the "structure" (also called *schema*) section in dataset and schema mapping in copy activity.
To achieve such schema-agnostic copy, skip the "structure" (also called *schema*
To copy data from MongoDB Atlas to a tabular sink or vice versa, refer to [schema mapping](copy-activity-schema-and-type-mapping.md#schema-mapping). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Mongodb Legacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb-legacy.md
Title: Copy data from MongoDB using legacy
+description: Learn how to copy data from Mongo DB to supported sink data stores using a copy activity in a legacy Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Mongo DB to supported sink data stores by using a copy activity in a legacy Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from MongoDB using Azure Data Factory (legacy)
+# Copy data from MongoDB using Azure Data Factory or Synapse Analytics (legacy)
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-on-premises-mongodb-connector.md) > * [Current version](connector-mongodb.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from a MongoDB database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a MongoDB database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
>[!IMPORTANT]
->ADF release a new MongoDB connector which provides better native MongoDB support comparing to this ODBC-based implementation, refer to [MongoDB connector](connector-mongodb.md) article on details. This legacy MongoDB connector is kept supported as-is for backward compability, while for any new workload, please use the new connector.
+>The service has released a new MongoDB connector that provides better native MongoDB support compared to this ODBC-based implementation; refer to the [MongoDB connector](connector-mongodb.md) article for details. This legacy MongoDB connector is still supported as-is for backward compatibility, but use the new connector for any new workload.
## Supported capabilities
The following properties are supported for MongoDB linked service:
| databaseName |Name of the MongoDB database that you want to access. |Yes | | authenticationType | Type of authentication used to connect to the MongoDB database.<br/>Allowed values are: **Basic**, and **Anonymous**. |Yes | | username |User account to access MongoDB. |Yes (if basic authentication is used). |
-| password |Password for the user. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes (if basic authentication is used). |
+| password |Password for the user. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes (if basic authentication is used). |
| authSource |Name of the MongoDB database that you want to use to check your credentials for authentication. |No. For basic authentication, default is to use the admin account and the database specified using databaseName property. | | enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | allowSelfSignedServerCert | Specifies whether to allow self-signed certificates from the server. The default value is false. | No |
Azure Data Factory service infers schema from a MongoDB collection by using the
## Data type mapping for MongoDB
-When copying data from MongoDB, the following mappings are used from MongoDB data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from MongoDB, the following mappings are used from MongoDB data types to interim data types used within the service internally. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-| MongoDB data type | Data factory interim data type |
+| MongoDB data type | Interim service data type |
|: |: | | Binary |Byte[] | | Boolean |Boolean |
When copying data from MongoDB, the following mappings are used from MongoDB dat
## Support for complex types using virtual tables
-Azure Data Factory uses a built-in ODBC driver to connect to and copy data from your MongoDB database. For complex types such as arrays or objects with different types across the documents, the driver re-normalizes data into corresponding virtual tables. Specifically, if a table contains such columns, the driver generates the following virtual tables:
+The service uses a built-in ODBC driver to connect to and copy data from your MongoDB database. For complex types such as arrays or objects with different types across the documents, the driver re-normalizes data into corresponding virtual tables. Specifically, if a table contains such columns, the driver generates the following virtual tables:
* A **base table**, which contains the same data as the real table except for the complex type columns. The base table uses the same name as the real table that it represents. * A **virtual table** for each complex type column, which expands the nested data. The virtual tables are named using the name of the real table, a separator ΓÇ£_" and the name of the array or object.
-Virtual tables refer to the data in the real table, enabling the driver to access the denormalized data. You can access the content of MongoDB arrays by querying and joining the virtual tables.
+Virtual tables refer to the data in the real table, enabling the driver to access the de-normalized data. You can access the content of MongoDB arrays by querying and joining the virtual tables.
### Example
The following tables show the virtual tables that represent the original arrays
| 2222 |1 |2 | ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from or to MongoDB by using Azure Data Factory
data-factory Connector Mysql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mysql.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Netezza https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-netezza.md
Title: Copy data from Netezza by using Azure Data Factory
+ Title: Copy data from Netezza
+description: Learn how to copy data from Netezza to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Netezza to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Netezza by using Azure Data Factory
+# Copy data from Netezza by using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use Copy Activity in Azure Data Factory to copy data from Netezza. The article builds on [Copy Activity in Azure Data Factory](copy-activity-overview.md), which presents a general overview of Copy Activity.
+This article outlines how to use Copy Activity in Azure Data Factory or Synapse Analytics pipelines to copy data from Netezza. The article builds on [Copy Activity](copy-activity-overview.md), which presents a general overview of Copy Activity.
>[!TIP]
->For data migration scenario from Netezza to Azure, learn more from [Use Azure Data Factory to migrate data from on-premises Netezza server to Azure](data-migration-guidance-netezza-azure-sqldw.md).
+>For the data migration scenario from Netezza to Azure, learn more from [Migrate data from on-premises Netezza server to Azure](data-migration-guidance-netezza-azure-sqldw.md).
## Supported capabilities
You can copy data from Netezza to any supported sink data store. For a list of d
Netezza connector supports parallel copying from source. See the [Parallel copy from Netezza](#parallel-copy-from-netezza) section for details.
-Azure Data Factory provides a built-in driver to enable connectivity. You don't need to manually install any driver to use this connector.
+The service provides a built-in driver to enable connectivity. You don't need to manually install any driver to use this connector.
## Prerequisites
Use the following steps to create a linked service to Netezza in the Azure porta
## Connector configuration details
-The following sections provide details about properties you can use to define Data Factory entities that are specific to the Netezza connector.
+The following sections provide details about properties you can use to define entities that are specific to the Netezza connector.
## Linked service properties
The Data Factory Netezza connector provides built-in data partitioning to copy d
![Screenshot of partition options](./media/connector-netezza/connector-netezza-partition-options.png)
-When you enable partitioned copy, Data Factory runs parallel queries against your Netezza source to load data by partitions. The parallel degree is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. For example, if you set `parallelCopies` to four, Data Factory concurrently generates and runs four queries based on your specified partition option and settings, and each query retrieves a portion of data from your Netezza database.
+When you enable partitioned copy, the service runs parallel queries against your Netezza source to load data by partitions. The parallel degree is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. For example, if you set `parallelCopies` to four, the service concurrently generates and runs four queries based on your specified partition option and settings, and each query retrieves a portion of data from your Netezza database.
It's recommended that you enable parallel copy with data partitioning, especially when you load a large amount of data from your Netezza database. The following are suggested configurations for different scenarios. When copying data into a file-based data store, it's recommended to write to a folder as multiple files (only specify the folder name), in which case the performance is better than writing to a single file. | Scenario | Suggested settings | | | |
-| Full load from large table. | **Partition option**: Data Slice. <br><br/>During execution, Data Factory automatically partitions the data based on [Netezza's built-in data slices](https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.adm.doc/c_sysadm_data_slices_parts_disks.html), and copies data by partitions. |
-| Load large amount of data by using a custom query. | **Partition option**: Data Slice.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE mod(datasliceid, ?AdfPartitionCount) = ?AdfDataSliceCondition AND <your_additional_where_clause>`.<br>During execution, Data Factory replaces `?AdfPartitionCount` (with parallel copy number set on copy activity) and `?AdfDataSliceCondition` with the data slice partition logic, and sends to Netezza. |
-| Load large amount of data by using a custom query, having an integer column with evenly distributed value for range partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfRangePartitionColumnName <= ?AdfRangePartitionUpbound AND ?AdfRangePartitionColumnName >= ?AdfRangePartitionLowbound AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data. You can partition against the column with integer data type.<br>**Partition upper bound** and **partition lower bound**: Specify if you want to filter against the partition column to retrieve data only between the lower and upper range.<br><br>During execution, Data Factory replaces `?AdfRangePartitionColumnName`, `?AdfRangePartitionUpbound`, and `?AdfRangePartitionLowbound` with the actual column name and value ranges for each partition, and sends to Netezza. <br>For example, if your partition column "ID" set with the lower bound as 1 and the upper bound as 80, with parallel copy set as 4, Data Factory retrieves data by 4 partitions. Their IDs are between [1,20], [21, 40], [41, 60], and [61, 80], respectively. |
+| Full load from large table. | **Partition option**: Data Slice. <br><br/>During execution, the service automatically partitions the data based on [Netezza's built-in data slices](https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.adm.doc/c_sysadm_data_slices_parts_disks.html), and copies data by partitions. |
+| Load a large amount of data by using a custom query. | **Partition option**: Data Slice.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE mod(datasliceid, ?AdfPartitionCount) = ?AdfDataSliceCondition AND <your_additional_where_clause>`.<br>During execution, the service replaces `?AdfPartitionCount` (with the parallel copy number set on the copy activity) and `?AdfDataSliceCondition` with the data slice partition logic, and sends them to Netezza. |
+| Load a large amount of data by using a custom query, with an integer column that has evenly distributed values for range partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfRangePartitionColumnName <= ?AdfRangePartitionUpbound AND ?AdfRangePartitionColumnName >= ?AdfRangePartitionLowbound AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data. You can partition against a column with an integer data type.<br>**Partition upper bound** and **partition lower bound**: Specify if you want to filter against the partition column to retrieve data only between the lower and upper range.<br><br>During execution, the service replaces `?AdfRangePartitionColumnName`, `?AdfRangePartitionUpbound`, and `?AdfRangePartitionLowbound` with the actual column name and value ranges for each partition, and sends them to Netezza. <br>For example, if your partition column "ID" is set with the lower bound as 1 and the upper bound as 80, with parallel copy set as 4, the service retrieves data in 4 partitions. Their IDs are between [1,20], [21, 40], [41, 60], and [61, 80], respectively. |
**Example: query with data slice partition**
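A minimal sketch of what the copy activity source could look like for this scenario, assuming the `NetezzaSource` type name and a `partitionOption` property value of `DataSlice`:

```json
{
    "source": {
        "type": "NetezzaSource",
        "query": "SELECT * FROM <TABLENAME> WHERE mod(datasliceid, ?AdfPartitionCount) = ?AdfDataSliceCondition",
        "partitionOption": "DataSlice"
    }
}
```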
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores that Copy Activity supports as sources and sinks in Azure Data Factory, see [Supported data stores and formats](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores that Copy Activity supports as sources and sinks, see [Supported data stores and formats](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Odata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-odata.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from an OData source by using Azure Data Factory
data-factory Connector Odbc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-odbc.md
Previously updated : 05/10/2021 Last updated : 09/09/2021 # Copy data from and to ODBC data stores using Azure Data Factory
data-factory Connector Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-office-365.md
Previously updated : 10/20/2019 Last updated : 09/09/2021 # Copy data from Office 365 into Azure using Azure Data Factory
data-factory Connector Oracle Cloud Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-cloud-storage.md
Title: Copy data from Oracle Cloud Storage by using Azure Data Factory
+ Title: Copy data from Oracle Cloud Storage
+description: Learn how to copy data from Oracle Cloud Storage to supported sink data stores using an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores by using Azure Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Oracle Cloud Storage by using Azure Data Factory
+# Copy data from Oracle Cloud Storage using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to copy data from Oracle Cloud Storage. To learn about Azure Data Factory, read the [introductory article](introduction.md).
+This article outlines how to copy data from Oracle Cloud Storage. To learn more, read the introductory articles for [Azure Data Factory](introduction.md) and [Synapse Analytics](../synapse-analytics/overview-what-is.md).
## Supported capabilities
Use the following steps to create a linked service to Oracle Cloud Storage in th
## Connector configuration details
-The following sections provide details about properties that are used to define Data Factory entities specific to Oracle Cloud Storage.
+The following sections provide details about properties that are used to define entities specific to Oracle Cloud Storage.
## Linked service properties
The following properties are supported for Oracle Cloud Storage linked
|: |: |: | | type | The **type** property must be set to **OracleCloudStorage**. | Yes | | accessKeyId | ID of the secret access key. To find the access key and secret, see [Prerequisites](#prerequisites). |Yes |
-| secretAccessKey | The secret access key itself. Mark this field as **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| secretAccessKey | The secret access key itself. Mark this field as **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
| serviceUrl | Specify the custom endpoint as `https://<namespace>.compat.objectstorage.<region identifier>.oraclecloud.com`. Refer [here](https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm) for more details | Yes | | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. |No |
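Putting the properties above together, a linked service definition could look like the following sketch (placeholder values only), using the usual linked service JSON layout:

```json
{
    "name": "OracleCloudStorageLinkedService",
    "properties": {
        "type": "OracleCloudStorage",
        "typeProperties": {
            "accessKeyId": "<access key ID>",
            "secretAccessKey": {
                "type": "SecureString",
                "value": "<secret access key>"
            },
            "serviceUrl": "https://<namespace>.compat.objectstorage.<region identifier>.oraclecloud.com"
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```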
This section describes the resulting behavior of using a file list path in the C
Assume that you have the following source folder structure and want to copy the files in bold:
-| Sample source structure | Content in FileListToCopy.txt | Data Factory configuration |
+| Sample source structure | Content in FileListToCopy.txt | Configuration |
| | | | | bucket<br/>&nbsp;&nbsp;&nbsp;&nbsp;FolderA<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File2.json<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File3.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File5.csv**<br/>&nbsp;&nbsp;&nbsp;&nbsp;Metadata<br/>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;FileListToCopy.txt | File1.csv<br>Subfolder1/File3.csv<br>Subfolder1/File5.csv | **In dataset:**<br>- Bucket: `bucket`<br>- Folder path: `FolderA`<br><br>**In copy activity source:**<br>- File list path: `bucket/Metadata/FileListToCopy.txt` <br><br>The file list path points to a text file in the same data store that includes a list of files you want to copy, one file per line, with the relative path to the path configured in the dataset. |
To learn details about the properties, check [Delete activity](delete-activity.m
## Next steps
-For a list of data stores that the Copy activity in Azure Data Factory supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores that the Copy activity supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Oracle Eloqua https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-eloqua.md
Title: Copy data from Oracle Eloqua (Preview)
+description: Learn how to copy data from Oracle Eloqua to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Oracle Eloqua to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Oracle Eloqua using Azure Data Factory (Preview)
+# Copy data from Oracle Eloqua using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Oracle Eloqua. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Oracle Eloqua. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and provide feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Oracle Eloqua connector is supported for the following activities:
You can copy data from Oracle Eloqua to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Oracle Eloqua linked service:
| type | The type property must be set to: **Eloqua** | Yes | | endpoint | The endpoint of the Eloqua server. Eloqua supports multiple data centers. To determine your endpoint, log in to https://login.eloqua.com with your credentials, then copy the **base URL** portion from the redirected URL with the pattern of `xxx.xxx.eloqua.com`. | Yes | | username | The site name and user name of your Eloqua account in the form: `SiteName\Username` e.g. `Eloqua\Alice`. | Yes |
-| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
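As a rough illustration, a linked service using the properties above might be defined as follows; the values are placeholders to adjust for your tenant:

```json
{
    "name": "EloquaLinkedService",
    "properties": {
        "type": "Eloqua",
        "typeProperties": {
            "endpoint": "<xxx.xxx.eloqua.com>",
            "username": "Eloqua\\Alice",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```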
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of supported data stored by Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of supported data stores in the service, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Oracle Responsys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-responsys.md
Title: Copy data from Oracle Responsys (Preview)
+description: Learn how to copy data from Oracle Responsys to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Oracle Responsys to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Oracle Responsys using Azure Data Factory (Preview)
+# Copy data from Oracle Responsys using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Oracle Responsys. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Oracle Responsys. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Oracle Responsys connector is supported for the following activities:
You can copy data from Oracle Responsys to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Oracle Responsys linked service:
| type | The type property must be set to: **Responsys** | Yes | | endpoint | The endpoint of the Responsys server | Yes | | clientId | The client ID associated with the Responsys application. | Yes |
-| clientSecret | The client secret associated with the Responsys application. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecret | The client secret associated with the Responsys application. You can choose to mark this field as a SecureString to store it securely in the service, or store the secret in Azure Key Vault and let the copy activity pull it from there when performing the data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
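A minimal linked service sketch based on the properties above (all values are placeholders):

```json
{
    "name": "OracleResponsysLinkedService",
    "properties": {
        "type": "Responsys",
        "typeProperties": {
            "endpoint": "<endpoint>",
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```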
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Oracle Service Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-service-cloud.md
Title: Copy data from Oracle Service Cloud (Preview)
+description: Learn how to copy data from Oracle Service Cloud to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Oracle Service Cloud to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Oracle Service Cloud using Azure Data Factory (Preview)
+# Copy data from Oracle Service Cloud using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Oracle Service Cloud. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Oracle Service Cloud. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and provide feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Oracle Service Cloud connector is supported for the following activities:
You can copy data from Oracle Service Cloud to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Oracle Service Cloud linked service:
| type | The type property must be set to: **OracleServiceCloud** | Yes | | host | The URL of the Oracle Service Cloud instance. | Yes | | username | The user name that you use to access Oracle Service Cloud server. | Yes |
-| password | The password corresponding to the user name that you provided in the username key. You can choose to mark this field as a SecureString to store it securely in ADF, or store password in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
+| password | The password corresponding to the user name that you provided in the username key. You can choose to mark this field as a SecureString to store it securely in the service, or store the password in Azure Key Vault and let the copy activity pull it from there when performing the data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
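For illustration, a linked service built from the properties above could look like this sketch (placeholder values only):

```json
{
    "name": "OracleServiceCloudLinkedService",
    "properties": {
        "type": "OracleServiceCloud",
        "typeProperties": {
            "host": "<Oracle Service Cloud instance URL>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```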
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Oracle https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-overview.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Paypal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-paypal.md
Title: Copy data from PayPal using Azure Data Factory (Preview)
+ Title: Copy data from PayPal (Preview)
+description: Learn how to copy data from PayPal to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from PayPal to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from PayPal using Azure Data Factory (Preview)
+# Copy data from PayPal using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from PayPal. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from PayPal. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This PayPal connector is supported for the following activities:
You can copy data from PayPal to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for PayPal linked service:
| type | The type property must be set to: **PayPal** | Yes | | host | The URL of the PayPal instance. (that is, api.sandbox.paypal.com) | Yes | | clientId | The client ID associated with your PayPal application. | Yes |
-| clientSecret | The client secret associated with your PayPal application. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecret | The client secret associated with your PayPal application. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
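A minimal sketch of a PayPal linked service using the properties above, with the sandbox host from the table and placeholder credentials:

```json
{
    "name": "PayPalLinkedService",
    "properties": {
        "type": "PayPal",
        "typeProperties": {
            "host": "api.sandbox.paypal.com",
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```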
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Phoenix https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-phoenix.md
Title: Copy data from Phoenix using Azure Data Factory
+ Title: Copy data from Phoenix
+description: Learn how to copy data from Phoenix to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Phoenix to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Phoenix using Azure Data Factory
+# Copy data from Phoenix using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Phoenix. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Phoenix. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Phoenix connector is supported for the following activities:
You can copy data from Phoenix to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Prerequisites
The following properties are supported for Phoenix linked service:
| httpPath | The partial URL corresponding to the Phoenix server. (that is, /gateway/sandbox/phoenix/version). Specify `/hbasephoenix0` if you use an HDInsight cluster. | No | | authenticationType | The authentication mechanism used to connect to the Phoenix server. <br/>Allowed values are: **Anonymous**, **UsernameAndPassword**, **WindowsAzureHDInsightService** | Yes | | username | The user name used to connect to the Phoenix server. | No |
-| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No | | useSystemTrustStore | Specifies whether to use a CA certificate from the system trust store or from a specified PEM file. The default value is false. | No |
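A hedged sketch of a Phoenix linked service for an HDInsight cluster; the `Phoenix` type value and the `host` and `port` properties aren't shown in the excerpt above and are assumptions:

```json
{
    "name": "PhoenixLinkedService",
    "properties": {
        "type": "Phoenix",
        "typeProperties": {
            "host": "<cluster>.azurehdinsight.net",
            "port": 443,
            "httpPath": "/hbasephoenix0",
            "authenticationType": "WindowsAzureHDInsightService",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```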
To copy data from Phoenix, set the source type in the copy activity to **Phoenix
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-postgresql.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from PostgreSQL by using Azure Data Factory
data-factory Connector Presto https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-presto.md
Title: Copy data from Presto using Azure Data Factory
+ Title: Copy data from Presto
+description: Learn how to copy data from Presto to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Presto to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Presto using Azure Data Factory
+# Copy data from Presto using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Presto. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Presto. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Presto connector is supported for the following activities:
You can copy data from Presto to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Presto linked service:
| port | The TCP port that the Presto server uses to listen for client connections. The default value is 8080. | No | | authenticationType | The authentication mechanism used to connect to the Presto server. <br/>Allowed values are: **Anonymous**, **LDAP** | Yes | | username | The user name used to connect to the Presto server. | No |
-| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password corresponding to the user name. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No | | useSystemTrustStore | Specifies whether to use a CA certificate from the system trust store or from a specified PEM file. The default value is false. | No |
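A hedged sketch of a Presto linked service using the properties above; the `Presto` type value and the `host` and `catalog` properties are assumptions not shown in the excerpt, and other required properties may apply:

```json
{
    "name": "PrestoLinkedService",
    "properties": {
        "type": "Presto",
        "typeProperties": {
            "host": "<host>",
            "port": 8080,
            "catalog": "<catalog>",
            "authenticationType": "LDAP",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```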
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Quickbooks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-quickbooks.md
Title: Copy data from QuickBooks Online using Azure Data Factory (Preview)
+ Title: Copy data from QuickBooks Online (Preview)
+description: Learn how to copy data from QuickBooks Online to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from QuickBooks Online to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from QuickBooks Online using Azure Data Factory (Preview)
+# Copy data from QuickBooks Online using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from QuickBooks Online. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from QuickBooks Online. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
The following properties are supported for QuickBooks linked service:
| endpoint | The endpoint of the QuickBooks Online server. (that is, quickbooks.api.intuit.com) | Yes | | companyId | The company ID of the QuickBooks company to authorize. For info about how to find the company ID, see [How do I find my Company ID](https://quickbooks.intuit.com/community/Getting-Started/How-do-I-find-my-Company-ID/m-p/185551). | Yes | | consumerKey | The client ID of your QuickBooks Online application for OAuth 2.0 authentication. Learn more from [here](https://developer.intuit.com/app/developer/qbo/docs/develop/authentication-and-authorization/oauth-2.0#obtain-oauth2-credentials-for-your-app). | Yes |
-| consumerSecret | The client secret of your QuickBooks Online application for OAuth 2.0 authentication. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| refreshToken | The OAuth 2.0 refresh token associated with the QuickBooks application. Learn more from [here](https://developer.intuit.com/app/developer/qbo/docs/develop/authentication-and-authorization/oauth-2.0#obtain-oauth2-credentials-for-your-app). Note refresh token will be expired after 180 days. Customer need to regularly update the refresh token. <br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md).| Yes |
+| consumerSecret | The client secret of your QuickBooks Online application for OAuth 2.0 authentication. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| refreshToken | The OAuth 2.0 refresh token associated with the QuickBooks application. Learn more from [here](https://developer.intuit.com/app/developer/qbo/docs/develop/authentication-and-authorization/oauth-2.0#obtain-oauth2-credentials-for-your-app). Note that the refresh token expires after 180 days, so customers need to regularly update it. <br/>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md).| Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | **Example:**
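A minimal sketch of a QuickBooks Online linked service based on the properties above; the `QuickBooks` type value is an assumption and all other values are placeholders:

```json
{
    "name": "QuickBooksLinkedService",
    "properties": {
        "type": "QuickBooks",
        "typeProperties": {
            "endpoint": "quickbooks.api.intuit.com",
            "companyId": "<company ID>",
            "consumerKey": "<client ID>",
            "consumerSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            },
            "refreshToken": {
                "type": "SecureString",
                "value": "<refresh token>"
            },
            "useEncryptedEndpoints": true
        }
    }
}
```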
To copy data from QuickBooks Online, set the source type in the copy activity to
``` ## Copy data from Quickbooks Desktop
-The Copy Activity in Azure Data Factory cannot copy data directly from Quickbooks Desktop. To copy data from Quickbooks Desktop, export your Quickbooks data to a comma-separated-values (CSV) file and then upload the file to Azure Blob Storage. From there, you can use Data Factory to copy the data to the sink of your choice.
+The Copy Activity in the service cannot copy data directly from Quickbooks Desktop. To copy data from Quickbooks Desktop, export your Quickbooks data to a comma-separated-values (CSV) file and then upload the file to Azure Blob Storage. From there, you can use the service to copy the data to the sink of your choice.
## Lookup activity properties
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Rest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-rest.md
Previously updated : 08/30/2021 Last updated : 09/09/2021
data-factory Connector Salesforce Marketing Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-salesforce-marketing-cloud.md
Title: Copy data from Salesforce Marketing Cloud
+description: Learn how to copy data from Salesforce Marketing Cloud to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Salesforce Marketing Cloud to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Salesforce Marketing Cloud using Azure Data Factory
+# Copy data from Salesforce Marketing Cloud using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Salesforce Marketing Cloud. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in Azure Data Factory or Synapse Analytics pipelines to copy data from Salesforce Marketing Cloud. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Salesforce Marketing Cloud linked ser
| authenticationType | Specifies the authentication method to use. Allowed values are `Enhanced sts OAuth 2.0` or `OAuth_2.0`.<br><br>Salesforce Marketing Cloud legacy package only supports `OAuth_2.0`, while enhanced package needs `Enhanced sts OAuth 2.0`. <br>Since August 1, 2019, Salesforce Marketing Cloud has removed the ability to create legacy packages. All new packages are enhanced packages. | Yes | | host | For enhanced package, the host should be your [subdomain](https://developer.salesforce.com/docs/atlas.en-us.mc-apis.meta/mc-apis/your-subdomain-tenant-specific-endpoints.htm) which is represented by a 28-character string starting with the letters "mc", e.g. `mc563885gzs27c5t9-63k636ttgm`. <br>For legacy package, specify `www.exacttargetapis.com`. | Yes | | clientId | The client ID associated with the Salesforce Marketing Cloud application. | Yes |
-| clientSecret | The client secret associated with the Salesforce Marketing Cloud application. You can choose to mark this field as a SecureString to store it securely in ADF, or store the secret in Azure Key Vault and let ADF copy activity pull from there when performing data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecret | The client secret associated with the Salesforce Marketing Cloud application. You can choose to mark this field as a SecureString to store it securely in the service, or store the secret in Azure Key Vault and let the copy activity pull it from there when performing the data copy - learn more from [Store credentials in Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
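A rough sketch of an enhanced-package linked service using the properties above; the `SalesforceMarketingCloud` type value isn't shown in the excerpt and is an assumption:

```json
{
    "name": "SalesforceMarketingCloudLinkedService",
    "properties": {
        "type": "SalesforceMarketingCloud",
        "typeProperties": {
            "authenticationType": "Enhanced sts OAuth 2.0",
            "host": "<28-character subdomain, e.g. mc563885gzs27c5t9-63k636ttgm>",
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```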
To copy data from Salesforce Marketing Cloud, set the source type in the copy ac
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Salesforce Service Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-salesforce-service-cloud.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from and to Salesforce Service Cloud by using Azure Data Factory
data-factory Connector Sap Business Warehouse Open Hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-business-warehouse-open-hub.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from SAP Business Warehouse via Open Hub using Azure Data Factory
data-factory Connector Sap Business Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-business-warehouse.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from SAP Business Warehouse using Azure Data Factory
data-factory Connector Sap Cloud For Customer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-cloud-for-customer.md
Title: Copy data from/to SAP Cloud for Customer
+description: Learn how to copy data from SAP Cloud for Customer to supported sink data stores, or from supported source data stores to SAP Cloud for Customer, using an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from SAP Cloud for Customer to supported sink data stores (or) from supported source data stores to SAP Cloud for Customer by using Data Factory.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from SAP Cloud for Customer (C4C) using Azure Data Factory
+# Copy data from SAP Cloud for Customer (C4C) using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from/to SAP Cloud for Customer (C4C). It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from/to SAP Cloud for Customer (C4C). It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
>[!TIP]
->To learn ADF's overall support on SAP data integration scenario, see [SAP data integration using Azure Data Factory whitepaper](https://github.com/Azure/Azure-DataFactory/blob/master/whitepaper/SAP%20Data%20Integration%20using%20Azure%20Data%20Factory.pdf) with detailed introduction on each SAP connector, comparsion and guidance.
+>To learn about the service's overall support for the SAP data integration scenario, see the [SAP data integration using Azure Data Factory whitepaper](https://github.com/Azure/Azure-DataFactory/blob/master/whitepaper/SAP%20Data%20Integration%20using%20Azure%20Data%20Factory.pdf), which gives a detailed introduction to each SAP connector, along with comparison and guidance.
## Supported capabilities
This SAP Cloud for Customer connector is supported for the following activities:
You can copy data from SAP Cloud for Customer to any supported sink data store, or copy data from any supported source data store to SAP Cloud for Customer. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Specifically, this connector enables Azure Data Factory to copy data from/to SAP Cloud for Customer including the SAP Cloud for Sales, SAP Cloud for Service, and SAP Cloud for Social Engagement solutions.
+Specifically, this connector enables the service to copy data from/to SAP Cloud for Customer including the SAP Cloud for Sales, SAP Cloud for Service, and SAP Cloud for Social Engagement solutions.
## Getting started
The following properties are supported for SAP Cloud for Customer linked service
| type | The type property must be set to: **SapCloudForCustomer**. | Yes | | url | The URL of the SAP C4C OData service. | Yes | | username | Specify the user name to connect to the SAP C4C. | Yes |
-| password | Specify the password for the user account you specified for the username. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| password | Specify the password for the user account you specified for the username. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No | **Example:**
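A minimal linked service sketch built from the properties above (placeholder values only):

```json
{
    "name": "SapCloudForCustomerLinkedService",
    "properties": {
        "type": "SapCloudForCustomer",
        "typeProperties": {
            "url": "<URL of the SAP C4C OData service>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```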
To copy data to SAP Cloud for Customer, set the sink type in the copy activity t
## Data type mapping for SAP Cloud for Customer
-When copying data from SAP Cloud for Customer, the following mappings are used from SAP Cloud for Customer data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from SAP Cloud for Customer, the following mappings are used from SAP Cloud for Customer data types to interim data types used internally within the service. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-| SAP C4C OData Data Type | Data factory interim data type |
+| SAP C4C OData Data Type | Interim service data type |
|: |: | | Edm.Binary | Byte[] | | Edm.Boolean | Bool |
When copying data from SAP Cloud for Customer, the following mappings are used f
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Sap Ecc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-ecc.md
Previously updated : 10/28/2020 Last updated : 09/09/2021 # Copy data from SAP ECC by using Azure Data Factory
data-factory Connector Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-hana.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from SAP HANA using Azure Data Factory
data-factory Connector Sap Table https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-table.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from an SAP table using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Servicenow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-servicenow.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from ServiceNow using Azure Data Factory
data-factory Connector Sftp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sftp.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from and to the SFTP server using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Sharepoint Online List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sharepoint-online-list.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy data from SharePoint Online List by using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Shopify https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-shopify.md
Title: Copy data from Shopify (Preview)
+description: Learn how to copy data from Shopify to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Shopify to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Shopify using Azure Data Factory (Preview)
+# Copy data from Shopify using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Shopify. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Shopify. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Shopify connector is supported for the following activities:
You can copy data from Shopify to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Shopify linked service:
|: |: |: | | type | The type property must be set to: **Shopify** | Yes | | host | The endpoint of the Shopify server. (that is, mystore.myshopify.com) | Yes |
-| accessToken | The API access token that can be used to access Shopify's data. The token does not expire if it is offline mode. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| accessToken | The API access token that can be used to access Shopify's data. The token does not expire if it is in offline mode. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
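As a rough sketch of how the properties in the table above fit together, a Shopify linked service definition might look like the following JSON; the store host and token are placeholders, and storing the token as a SecureString (or an Azure Key Vault reference) keeps it out of plain text:

```json
{
    "name": "ShopifyLinkedService",
    "properties": {
        "type": "Shopify",
        "typeProperties": {
            "host": "mystore.myshopify.com",
            "accessToken": {
                "type": "SecureString",
                "value": "<access token>"
            }
        }
    }
}
```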
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Snowflake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-snowflake.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data in Snowflake using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Spark https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-spark.md
Title: Copy data from Spark
+description: Learn how to copy data from Spark to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Spark to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Spark using Azure Data Factory
+# Copy data from Spark using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Spark. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Spark. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Spark connector is supported for the following activities:
You can copy data from Spark to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Prerequisites
The following properties are supported for Spark linked service:
| thriftTransportProtocol | The transport protocol to use in the Thrift layer. <br/>Allowed values are: **Binary**, **SASL**, **HTTP** | No | | authenticationType | The authentication method used to access the Spark server. <br/>Allowed values are: **Anonymous**, **Username**, **UsernameAndPassword**, **WindowsAzureHDInsightService** | Yes | | username | The user name that you use to access Spark Server. | No |
-| password | The password corresponding to the user. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| password | The password corresponding to the user. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| httpPath | The partial URL corresponding to the Spark server. | No | | enableSsl | Specifies whether the connections to the server are encrypted using TLS. The default value is false. | No | | trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No |
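For orientation, a Spark linked service that authenticates with a username and password over a self-hosted integration runtime might be sketched as below. The host, port, and runtime name are placeholders, and properties such as `serverType` should be confirmed against the full property table for this connector:

```json
{
    "name": "SparkLinkedService",
    "properties": {
        "type": "Spark",
        "typeProperties": {
            "host": "<spark server host>",
            "port": 10001,
            "serverType": "SparkThriftServer",
            "thriftTransportProtocol": "SASL",
            "authenticationType": "UsernameAndPassword",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            },
            "enableSsl": true
        },
        "connectVia": {
            "referenceName": "<self-hosted integration runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```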
To copy data from Spark, set the source type in the copy activity to **SparkSour
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sql-server.md
Previously updated : 08/30/2021 Last updated : 09/09/2021 # Copy and transform data to and from SQL Server by using Azure Data Factory or Azure Synapse Analytics
data-factory Connector Square https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-square.md
Title: Copy data from Square (Preview)
+description: Learn how to copy data from Square to supported sink data stores by using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Square to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Square using Azure Data Factory (Preview)
+# Copy data from Square using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Square. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Square. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
This Square connector is supported for the following activities:
You can copy data from Square to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Square linked service:
| ***Under `connectionProperties`:*** | | | | host | The URL of the Square instance. (i.e. mystore.mysquare.com) | Yes | | clientId | The client ID associated with your Square application. | Yes |
-| clientSecret | The client secret associated with your Square application. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| accessToken | The access token obtained from Square. Grants limited access to a Square account by asking an authenticated user for explicit permissions. OAuth access tokens expires 30 days after issued, but refresh tokens do not expire. Access tokens can be refreshed by refresh token.<br>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| refreshToken | The refresh token obtained from Square. Used to obtain new access tokens when the current one expires.<br>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| clientSecret | The client secret associated with your Square application. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| accessToken | The access token obtained from Square. Grants limited access to a Square account by asking an authenticated user for explicit permissions. OAuth access tokens expire 30 days after they are issued, but refresh tokens do not expire. Access tokens can be refreshed by using the refresh token.<br>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| refreshToken | The refresh token obtained from Square. Used to obtain new access tokens when the current one expires.<br>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
Square support two types of access token: **personal** and **OAuth**.
- Personal access tokens are used to get unlimited Connect API access to resources in your own Square account. - OAuth access tokens are used to get authenticated and scoped Connect API access to any Square account. Use them when your app accesses resources in other Square accounts on behalf of account owners. OAuth access tokens can also be used to access resources in your own Square account.
-In Data Factory, Authentication via personal access token only needs `accessToken`, while authentication via OAuth requires `accessToken` and `refreshToken`. Learn how to retrieve access token from [here](https://developer.squareup.com/docs/build-basics/access-tokens).
+Authentication via personal access token only needs `accessToken`, while authentication via OAuth requires `accessToken` and `refreshToken`. Learn how to retrieve an access token [here](https://developer.squareup.com/docs/build-basics/access-tokens).
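As a hedged sketch of the OAuth variant described above, a Square linked service nests the credentials under `connectionProperties`; all values here are placeholders, and for personal access token authentication you would supply only `accessToken`:

```json
{
    "name": "SquareLinkedService",
    "properties": {
        "type": "Square",
        "typeProperties": {
            "connectionProperties": {
                "host": "mystore.mysquare.com",
                "clientId": "<client ID>",
                "clientSecret": {
                    "type": "SecureString",
                    "value": "<client secret>"
                },
                "accessToken": {
                    "type": "SecureString",
                    "value": "<access token>"
                },
                "refreshToken": {
                    "type": "SecureString",
                    "value": "<refresh token>"
                },
                "useEncryptedEndpoints": true,
                "useHostVerification": true,
                "usePeerVerification": true
            }
        }
    }
}
```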
**Example:**
To copy data from Square, set the source type in the copy activity to **SquareSo
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Sybase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sybase.md
Title: Copy data from Sybase using Azure Data Factory
+ Title: Copy data from Sybase
+description: Learn how to copy data from Sybase to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Sybase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Sybase using Azure Data Factory
+# Copy data from Sybase using Azure Data Factory or Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-onprem-sybase-connector.md) > * [Current version](connector-sybase.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from a Sybase database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a Sybase database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Sybase linked service:
| database | Name of the Sybase database. |Yes | | authenticationType | Type of authentication used to connect to the Sybase database.<br/>Allowed values are: **Basic**, and **Windows**. |Yes | | username | Specify user name to connect to the Sybase database. |Yes |
-| password | Specify password for the user account you specified for the username. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| password | Specify the password for the user account you specified for the username. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. A Self-hosted Integration Runtime is required as mentioned in [Prerequisites](#prerequisites). |Yes | **Example:**
If you were using `RelationalSource` typed source, it is still supported as-is,
## Data type mapping for Sybase
-When copying data from Sybase, the following mappings are used from Sybase data types to Azure Data Factory interim data types. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
+When copying data from Sybase, the following mappings are used from Sybase data types to interim data types used internally within the service. See [Schema and data type mappings](copy-activity-schema-and-type-mapping.md) to learn about how copy activity maps the source schema and data type to the sink.
-Sybase supports T-SQL types. For a mapping table from SQL types to Azure Data Factory interim data types, see [Azure SQL Database Connector - data type mapping](connector-azure-sql-database.md#data-type-mapping-for-azure-sql-database) section.
+Sybase supports T-SQL types. For a mapping table from SQL types to interim service data types, see [Azure SQL Database Connector - data type mapping](connector-azure-sql-database.md#data-type-mapping-for-azure-sql-database) section.
## Lookup activity properties
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Teradata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-teradata.md
Title: Copy data from Teradata Vantage by using Azure Data Factory
+ Title: Copy data from Teradata Vantage
+description: The Teradata Connector in Azure Data Factory and Synapse Analytics lets you copy data from a Teradata Vantage to supported sink data stores.
-description: The Teradata Connector of the Data Factory service lets you copy data from a Teradata Vantage to data stores supported by Data Factory as sinks.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Teradata Vantage by using Azure Data Factory
+# Copy data from Teradata Vantage using Azure Data Factory and Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] >
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the copy activity in Azure Data Factory to copy data from Teradata Vantage. It builds on the [copy activity overview](copy-activity-overview.md).
+This article outlines how to use the copy activity in Azure Data Factory and Synapse Analytics pipelines to copy data from Teradata Vantage. It builds on the [copy activity overview](copy-activity-overview.md).
## Supported capabilities
To copy data from Teradata, the following properties are supported in the copy a
## Parallel copy from Teradata
-The Data Factory Teradata connector provides built-in data partitioning to copy data from Teradata in parallel. You can find data partitioning options on the **Source** table of the copy activity.
+The Teradata connector provides built-in data partitioning to copy data from Teradata in parallel. You can find data partitioning options on the **Source** table of the copy activity.
![Screenshot of partition options](./media/connector-teradata/connector-teradata-partition-options.png)
-When you enable partitioned copy, Data Factory runs parallel queries against your Teradata source to load data by partitions. The parallel degree is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. For example, if you set `parallelCopies` to four, Data Factory concurrently generates and runs four queries based on your specified partition option and settings, and each query retrieves a portion of data from your Teradata.
+When you enable partitioned copy, the service runs parallel queries against your Teradata source to load data by partitions. The parallel degree is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. For example, if you set `parallelCopies` to four, the service concurrently generates and runs four queries based on your specified partition option and settings, and each query retrieves a portion of data from your Teradata.
-You are suggested to enable parallel copy with data partitioning especially when you load large amount of data from your Teradata. The following are suggested configurations for different scenarios. When copying data into file-based data store, it's recommanded to write to a folder as multiple files (only specify folder name), in which case the performance is better than writing to a single file.
+It is recommended to enable parallel copy with data partitioning, especially when you load a large amount of data from your Teradata. The following are suggested configurations for different scenarios. When copying data into a file-based data store, it's recommended to write to a folder as multiple files (only specify the folder name), in which case the performance is better than writing to a single file.
| Scenario | Suggested settings | | | |
-| Full load from large table. | **Partition option**: Hash. <br><br/>During execution, Data Factory automatically detects the primary index column, applies a hash against it, and copies data by partitions. |
-| Load large amount of data by using a custom query. | **Partition option**: Hash.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfHashPartitionCondition AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used for apply hash partition. If not specified, Data Factory automatically detects the PK column of the table you specified in the Teradata dataset.<br><br>During execution, Data Factory replaces `?AdfHashPartitionCondition` with the hash partition logic, and sends to Teradata. |
-| Load large amount of data by using a custom query, having an integer column with evenly distributed value for range partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfRangePartitionColumnName <= ?AdfRangePartitionUpbound AND ?AdfRangePartitionColumnName >= ?AdfRangePartitionLowbound AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data. You can partition against the column with integer data type.<br>**Partition upper bound** and **partition lower bound**: Specify if you want to filter against the partition column to retrieve data only between the lower and upper range.<br><br>During execution, Data Factory replaces `?AdfRangePartitionColumnName`, `?AdfRangePartitionUpbound`, and `?AdfRangePartitionLowbound` with the actual column name and value ranges for each partition, and sends to Teradata. <br>For example, if your partition column "ID" set with the lower bound as 1 and the upper bound as 80, with parallel copy set as 4, Data Factory retrieves data by 4 partitions. Their IDs are between [1,20], [21, 40], [41, 60], and [61, 80], respectively. |
+| Full load from a large table. | **Partition option**: Hash. <br><br/>During execution, the service automatically detects the primary index column, applies a hash against it, and copies data by partitions. |
+| Load a large amount of data by using a custom query. | **Partition option**: Hash.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfHashPartitionCondition AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to apply hash partitioning. If not specified, the service automatically detects the PK column of the table you specified in the Teradata dataset.<br><br>During execution, the service replaces `?AdfHashPartitionCondition` with the hash partition logic, and sends it to Teradata. |
+| Load a large amount of data by using a custom query, with an integer column that has evenly distributed values for range partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TABLENAME> WHERE ?AdfRangePartitionColumnName <= ?AdfRangePartitionUpbound AND ?AdfRangePartitionColumnName >= ?AdfRangePartitionLowbound AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data. You can partition against a column with an integer data type.<br>**Partition upper bound** and **partition lower bound**: Specify if you want to filter against the partition column to retrieve data only between the lower and upper range.<br><br>During execution, the service replaces `?AdfRangePartitionColumnName`, `?AdfRangePartitionUpbound`, and `?AdfRangePartitionLowbound` with the actual column name and value ranges for each partition, and sends them to Teradata. <br>For example, if your partition column "ID" is set with the lower bound as 1 and the upper bound as 80, with parallel copy set as 4, the service retrieves data by 4 partitions. Their IDs are between [1,20], [21, 40], [41, 60], and [61, 80], respectively. |
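To make the hash scenario concrete, a copy activity source might carry the partition settings roughly as shown below. The `partitionOption` and `partitionSettings` property names follow the pattern used by similar connectors, so treat them as assumptions and confirm them against the Teradata copy activity reference:

```json
"source": {
    "type": "TeradataSource",
    "query": "SELECT * FROM <TABLENAME> WHERE ?AdfHashPartitionCondition AND <your_additional_where_clause>",
    "partitionOption": "Hash",
    "partitionSettings": {
        "partitionColumnName": "<hash partition column name>"
    }
}
```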
**Example: query with hash partition**
You are suggested to enable parallel copy with data partitioning especially when
## Data type mapping for Teradata
-When you copy data from Teradata, the following mappings apply. To learn about how the copy activity maps the source schema and data type to the sink, see [Schema and data type mappings](copy-activity-schema-and-type-mapping.md).
+When you copy data from Teradata, the following mappings apply from Teradata's data types to the internal data types used by the service. To learn about how the copy activity maps the source schema and data type to the sink, see [Schema and data type mappings](copy-activity-schema-and-type-mapping.md).
-| Teradata data type | Data Factory interim data type |
+| Teradata data type | Interim service data type |
|: |: | | BigInt |Int64 | | Blob |Byte[] |
To learn details about the properties, check [Lookup activity](control-flow-look
## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Data Factory, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-troubleshoot-guide.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Connector Vertica https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-vertica.md
Title: Copy data from Vertica using Azure Data Factory
+ Title: Copy data from Vertica
+description: Learn how to copy data from Vertica to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Vertica to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Vertica using Azure Data Factory
+# Copy data from Vertica using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Vertica. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Vertica. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
This Vertica connector is supported for the following activities:
You can copy data from Vertica to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Prerequisites
To copy data from Vertica, set the source type in the copy activity to **Vertica
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Web Table https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-web-table.md
Title: Copy data from Web Table using Azure Data Factory
+ Title: Copy data from Web Table
+description: Learn about Web Table Connector that lets you copy data from a web table to data stores supported as sinks by Azure Data Factory and Synapse Analytics.
-description: Learn about Web Table Connector of Azure Data Factory that lets you copy data from a web table to data stores supported by Data Factory as sinks.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Web table by using Azure Data Factory
+# Copy data from Web table by using Azure Data Factory or Synapse Analytics
> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"] > * [Version 1](v1/data-factory-web-table-connector.md) > * [Current version](connector-web-table.md) [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from a Web table database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from a Web table database. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
The difference among this Web table connector, the [REST connector](connector-rest.md) and the [HTTP connector](connector-http.md) are:
If you are using Excel 2013, use [Microsoft Power Query for Excel](https://www.m
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Xero https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-xero.md
Title: Copy data from Xero using Azure Data Factory
+ Title: Copy data from Xero
+description: Learn how to copy data from Xero to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Xero to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Xero using Azure Data Factory
+# Copy data from Xero using Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Xero. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Xero. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
## Supported capabilities
The following properties are supported for Xero linked service:
| ***Under `connectionProperties`:*** | | | | host | The endpoint of the Xero server (`api.xero.com`). | Yes | | authenticationType | Allowed values are `OAuth_2.0` and `OAuth_1.0`. | Yes |
-| consumerKey | For OAuth 2.0, specify the **client ID** for your Xero application.<br>For OAuth 1.0, specify the consumer key associated with the Xero application.<br>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
-| privateKey | For OAuth 2.0, specify the **client secret** for your Xero application.<br>For OAuth 1.0, specify the private key from the .pem file that was generated for your Xero private application, see [Create a public/private key pair](https://developer.xero.com/documentation/auth-and-limits/create-publicprivate-key). Note to **generate the privatekey.pem with numbits of 512** using `openssl genrsa -out privatekey.pem 512`, 1024 is not supported. Include all the text from the .pem file including the Unix line endings(\n), see sample below.<br/><br>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| consumerKey | For OAuth 2.0, specify the **client ID** for your Xero application.<br>For OAuth 1.0, specify the consumer key associated with the Xero application.<br>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| privateKey | For OAuth 2.0, specify the **client secret** for your Xero application.<br>For OAuth 1.0, specify the private key from the .pem file that was generated for your Xero private application; see [Create a public/private key pair](https://developer.xero.com/documentation/auth-and-limits/create-publicprivate-key). Note that you must **generate the privatekey.pem with numbits of 512** using `openssl genrsa -out privatekey.pem 512`; 1024 is not supported. Include all the text from the .pem file, including the Unix line endings (\n); see the sample below.<br/><br>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| tenantId | The tenant ID associated with your Xero application. Applicable for OAuth 2.0 authentication.<br>Learn how to get the tenant ID from [Check the tenants you're authorized to access section](https://developer.xero.com/documentation/oauth2/auth-flow). | Yes for OAuth 2.0 authentication |
-| refreshToken | Applicable for OAuth 2.0 authentication.<br/>The OAuth 2.0 refresh token is associated with the Xero application and used to refresh the access token; the access token expires after 30 minutes. Learn about how the Xero authorization flow works and how to get the refresh token from [this article](https://developer.xero.com/documentation/oauth2/auth-flow). To get a refresh token, you must request the [offline_access scope](https://developer.xero.com/documentation/oauth2/scopes). <br/>**Know limitation**: Note Xero resets the refresh token after it's used for access token refresh. For operationalized workload, before each copy activity run, you need to set a valid refresh token for ADF to use.<br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for OAuth 2.0 authentication |
+| refreshToken | Applicable for OAuth 2.0 authentication.<br/>The OAuth 2.0 refresh token is associated with the Xero application and used to refresh the access token; the access token expires after 30 minutes. Learn about how the Xero authorization flow works and how to get the refresh token from [this article](https://developer.xero.com/documentation/oauth2/auth-flow). To get a refresh token, you must request the [offline_access scope](https://developer.xero.com/documentation/oauth2/scopes). <br/>**Known limitation**: Xero resets the refresh token after it's used for access token refresh. For operationalized workloads, before each copy activity run, you need to set a valid refresh token for the service to use.<br/>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for OAuth 2.0 authentication |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether the host name is required in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
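Pulling these properties together, an OAuth 2.0 Xero linked service might be sketched as follows, with the rotating refresh token referenced from Azure Key Vault so it can be updated before each run; all names are placeholders:

```json
{
    "name": "XeroLinkedService",
    "properties": {
        "type": "Xero",
        "typeProperties": {
            "connectionProperties": {
                "host": "api.xero.com",
                "authenticationType": "OAuth_2.0",
                "consumerKey": {
                    "type": "SecureString",
                    "value": "<client ID>"
                },
                "privateKey": {
                    "type": "SecureString",
                    "value": "<client secret>"
                },
                "tenantId": "<tenant ID>",
                "refreshToken": {
                    "type": "AzureKeyVaultSecret",
                    "store": {
                        "referenceName": "<Azure Key Vault linked service name>",
                        "type": "LinkedServiceReference"
                    },
                    "secretName": "<secret holding the refresh token>"
                },
                "useEncryptedEndpoints": true,
                "useHostVerification": true,
                "usePeerVerification": true
            }
        }
    }
}
```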
data-factory Connector Zoho https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-zoho.md
Title: Copy data from Zoho using Azure Data Factory (Preview)
+ Title: Copy data from Zoho (Preview)
+description: Learn how to copy data from Zoho to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.
-description: Learn how to copy data from Zoho to supported sink data stores by using a copy activity in an Azure Data Factory pipeline.
Previously updated : 08/30/2021 Last updated : 09/09/2021
-# Copy data from Zoho using Azure Data Factory (Preview)
+# Copy data from Zoho using Azure Data Factory or Synapse Analytics (Preview)
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-This article outlines how to use the Copy Activity in Azure Data Factory to copy data from Zoho. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
+This article outlines how to use the Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy data from Zoho. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.
> [!IMPORTANT] > This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact [Azure support](https://azure.microsoft.com/support/).
You can copy data from Zoho to any supported sink data store. For a list of data
This connector supports Zoho access token authentication and OAuth 2.0 authentication.
-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The service provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
## Getting started
The following properties are supported for Zoho linked service:
| endpoint | The endpoint of the Zoho server (`crm.zoho.com/crm/private`). | Yes | | authenticationType | Allowed values are `OAuth_2.0` and `Access Token`. | Yes | | clientId | The client ID associated with your Zoho application. | Yes for OAuth 2.0 authentication |
-| clientSecrect | The clientsecret associated with your Zoho application. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for OAuth 2.0 authentication |
-| refreshToken | The OAuth 2.0 refresh token associated with your Zoho application, used to refresh the access token when it expires. Refresh token will never expire. To get a refresh token, you must request the `offline` access_type, learn more from [this article](https://www.zoho.com/crm/developer/docs/api/auth-request.html). <br>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md).| Yes for OAuth 2.0 authentication |
-| accessToken | The access token for Zoho authentication. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| clientSecrect | The client secret associated with your Zoho application. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes for OAuth 2.0 authentication |
+| refreshToken | The OAuth 2.0 refresh token associated with your Zoho application, used to refresh the access token when it expires. The refresh token never expires. To get a refresh token, you must request the `offline` access_type; learn more from [this article](https://www.zoho.com/crm/developer/docs/api/auth-request.html). <br>Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md).| Yes for OAuth 2.0 authentication |
+| accessToken | The access token for Zoho authentication. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| useEncryptedEndpoints | Specifies whether the data source endpoints are encrypted using HTTPS. The default value is true. | No | | useHostVerification | Specifies whether to require the host name in the server's certificate to match the host name of the server when connecting over TLS. The default value is true. | No | | usePeerVerification | Specifies whether to verify the identity of the server when connecting over TLS. The default value is true. | No |
To copy data from Zoho, set the source type in the copy activity to **ZohoSource
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Continuous Integration Deployment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/continuous-integration-deployment.md
Below is a sample overview of the CI/CD lifecycle in an Azure data factory that'
The below image highlights the different steps of this lifecycle.
-![Diagram of continuous integration with Azure Pipelines](media/continuous-integration-deployment/continuous-integration-image12.png)
## Automate continuous integration by using Azure Pipelines releases
The following is a guide for setting up an Azure Pipelines release that automate
1. On the left side of the page, select **Pipelines**, and then select **Releases**.
- ![Select Pipelines, Releases](media/continuous-integration-deployment/continuous-integration-image6.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image6.png" alt-text="Select Pipelines, Releases":::
1. Select **New pipeline**, or, if you have existing pipelines, select **New** and then **New release pipeline**. 1. Select the **Empty job** template.
- ![Select Empty job](media/continuous-integration-deployment/continuous-integration-image13.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image13.png" alt-text="Select Empty job":::
1. In the **Stage name** box, enter the name of your environment. 1. Select **Add artifact**, and then select the git repository configured with your development data factory. Select the [publish branch](source-control.md#configure-publishing-settings) of the repository for the **Default branch**. By default, this publish branch is `adf_publish`. For the **Default version**, select **Latest from default branch**.
- ![Add an artifact](media/continuous-integration-deployment/continuous-integration-image7.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image7.png" alt-text="Add an artifact":::
1. Add an Azure Resource Manager Deployment task: a. In the stage view, select **View stage tasks**.
- ![Stage view](media/continuous-integration-deployment/continuous-integration-image14.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image14.png" alt-text="Stage view":::
b. Create a new task. Search for **ARM Template Deployment**, and then select **Add**.
The following is a guide for setting up an Azure Pipelines release that automate
> [!WARNING] > In Complete deployment mode, resources that exist in the resource group but aren't specified in the new Resource Manager template will be **deleted**. For more information, please refer to [Azure Resource Manager Deployment Modes](../azure-resource-manager/templates/deployment-modes.md)
- ![Data Factory Prod Deployment](media/continuous-integration-deployment/continuous-integration-image9.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image9.png" alt-text="Data Factory Prod Deployment":::
1. Save the release pipeline. 1. To trigger a release, select **Create release**. To automate the creation of releases, see [Azure DevOps release triggers](/azure/devops/pipelines/release/triggers)
- ![Select Create release](media/continuous-integration-deployment/continuous-integration-image10.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image10.png" alt-text="Select Create release":::
> [!IMPORTANT] > In CI/CD scenarios, the integration runtime (IR) type in different environments must be the same. For example, if you have a self-hosted IR in the development environment, the same IR must also be of type self-hosted in other environments, such as test and production. Similarly, if you're sharing integration runtimes across multiple stages, you have to configure the integration runtimes as linked self-hosted in all environments, such as development, test, and production.
There are two ways to handle secrets:
1. In the Key Vault task, select the subscription in which you created the key vault. Provide credentials if necessary, and then select the key vault.
- ![Add a Key Vault task](media/continuous-integration-deployment/continuous-integration-image8.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image8.png" alt-text="Add a Key Vault task":::
#### Grant permissions to the Azure Pipelines agent
The data factory team has provided a [sample pre- and post-deployment script](#s
1. Go to **Manage** hub in your data factory, and select **ARM template** in the "Source control" section. Under **ARM template** section, select **Export ARM template** to export the Resource Manager template for your data factory in the development environment.
- ![Export a Resource Manager template](media/continuous-integration-deployment/continuous-integration-image-1.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image-1.png" alt-text="Export a Resource Manager template":::
1. In your test and production data factories, select **Import ARM Template**. This action takes you to the Azure portal, where you can import the exported template. Select **Build your own template in the editor** to open the Resource Manager template editor.
- ![Build your own template](media/continuous-integration-deployment/custom-deployment-build-your-own-template.png)
+ :::image type="content" source="media/continuous-integration-deployment/custom-deployment-build-your-own-template.png" alt-text="Build your own template":::
1. Select **Load file**, and then select the generated Resource Manager template. This is the **arm_template.json** file located in the .zip file exported in step 1.
- ![Edit template](media/continuous-integration-deployment/custom-deployment-edit-template.png)
+ :::image type="content" source="media/continuous-integration-deployment/custom-deployment-edit-template.png" alt-text="Edit template":::
1. In the settings section, enter the configuration values, like linked service credentials. When you're done, select **Purchase** to deploy the Resource Manager template.
- ![Settings section](media/continuous-integration-deployment/continuous-integration-image5.png)
+ :::image type="content" source="media/continuous-integration-deployment/continuous-integration-image5.png" alt-text="Settings section":::
## Use custom parameters with the Resource Manager template
If your development factory has an associated git repository, you can override t
To override the default Resource Manager parameter configuration, go to the **Manage** hub and select **ARM template** in the "Source control" section. Under **ARM parameter configuration** section, click **Edit** icon in "Edit parameter configuration" to open the Resource Manager parameter configuration code editor.
-![Manage custom parameters](media/author-management-hub/management-hub-custom-parameters.png)
> [!NOTE] > **ARM parameter configuration** is only enabled in "GIT mode". Currently it is disabled in "live mode" or "Data Factory" mode. Creating a custom Resource Manager parameter configuration creates a file named **arm-template-parameters-definition.json** in the root folder of your git branch. You must use that exact file name.
-![Custom parameters file](media/continuous-integration-deployment/custom-parameters.png)
When publishing from the collaboration branch, Data Factory will read this file and use its configuration to generate which properties get parameterized. If no file is found, the default template is used.
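For orientation only, a minimal custom **arm-template-parameters-definition.json** might look something like the sketch below, which parameterizes a pipeline wait time, linked service connection strings, and all dataset `typeProperties`. The `<action>:<name>:<type>` value syntax (for example `-::int`) is an assumption based on the custom-parameter documentation, so verify it against the default parameterization template before relying on it:

```json
{
    "Microsoft.DataFactory/factories/pipelines": {
        "properties": {
            "activities": [
                {
                    "typeProperties": {
                        "waitTimeInSeconds": "-::int"
                    }
                }
            ]
        }
    },
    "Microsoft.DataFactory/factories/linkedServices": {
        "*": {
            "properties": {
                "typeProperties": {
                    "connectionString": "="
                }
            }
        }
    },
    "Microsoft.DataFactory/factories/datasets": {
        "*": {
            "properties": {
                "typeProperties": {
                    "*": "="
                }
            }
        }
    }
}
```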
Here's an explanation of how the preceding template is constructed, broken down
* Although type-specific customization is available for datasets, you can provide configuration without explicitly having a \*-level configuration. In the preceding example, all dataset properties under `typeProperties` are parameterized. > [!NOTE]
-> **Azure alerts and matrices** if configured for a pipeline are not currently supported as parameters for ARM deployments. To reapply the alerts and matrices in new environment, please follow [Data Factory Monitoring, Alerts and Matrices.](./monitor-using-azure-monitor.md#data-factory-metrics)
+> If **Azure alerts and metrics** are configured for a pipeline, they are not currently supported as parameters for ARM deployments. To reapply the alerts and metrics in the new environment, please follow [Data Factory Monitoring, Alerts and Metrics](./monitor-metrics-alerts.md).
> ### Default parameterization template
If you've set up CI/CD for your data factories, you might exceed the Azure Resou
If you've configured Git, the linked templates are generated and saved alongside the full Resource Manager templates in the adf_publish branch in a new folder called linkedTemplates:
-![Linked Resource Manager templates folder](media/continuous-integration-deployment/linked-resource-manager-templates.png)
The linked Resource Manager templates usually consist of a master template and a set of child templates that are linked to the master. The parent template is called ArmTemplate_master.json, and child templates are named with the pattern ArmTemplate_0.json, ArmTemplate_1.json, and so on.
When running a post-deployment script, you will need to specify a variation of t
> [!NOTE] > The `-deleteDeployment` flag is used to specify the deletion of the ADF deployment entry from the deployment history in ARM.
-![Azure PowerShell task](media/continuous-integration-deployment/continuous-integration-image11.png)
Here is the script that can be used for pre- and post-deployment. It accounts for deleted resources and resource references.
data-factory Control Flow Append Variable Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-append-variable-activity.md
Previously updated : 10/09/2018 Last updated : 09/09/2021 # Append Variable Activity in Azure Data Factory
data-factory Control Flow Azure Function Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-azure-function-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Azure Function activity in Azure Data Factory
data-factory Control Flow Execute Data Flow Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-execute-data-flow-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Data Flow activity in Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Execute Pipeline Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-execute-pipeline-activity.md
Previously updated : 01/10/2018 Last updated : 09/09/2021 # Execute Pipeline activity in Azure Data Factory
data-factory Control Flow Expression Language Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-expression-language-functions.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Expressions and functions in Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Filter Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-filter-activity.md
Previously updated : 05/04/2018 Last updated : 09/09/2021 # Filter activity in Azure Data Factory
data-factory Control Flow For Each Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-for-each-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # ForEach activity in Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Get Metadata Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-get-metadata-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Control Flow If Condition Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-if-condition-activity.md
Previously updated : 01/10/2018 Last updated : 09/09/2021
data-factory Control Flow Lookup Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-lookup-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Lookup activity in Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Set Variable Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-set-variable-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Control Flow System Variables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-system-variables.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # System variables supported by Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Until Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-until-activity.md
Previously updated : 01/10/2018 Last updated : 09/09/2021
data-factory Control Flow Validation Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-validation-activity.md
Previously updated : 03/25/2019 Last updated : 09/09/2021 # Validation activity in Azure Data Factory
data-factory Control Flow Wait Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-wait-activity.md
Previously updated : 01/12/2018 Last updated : 09/09/2021 # Execute wait activity in Azure Data Factory
data-factory Control Flow Web Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-web-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Web activity in Azure Data Factory and Azure Synapse Analytics
data-factory Control Flow Webhook Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-webhook-activity.md
Previously updated : 03/25/2019 Last updated : 09/09/2021 # Webhook activity in Azure Data Factory
data-factory Copy Activity Data Consistency https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-data-consistency.md
Previously updated : 3/27/2020 Last updated : 09/09/2021 # Data consistency verification in copy activity
data-factory Copy Activity Fault Tolerance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-fault-tolerance.md
Previously updated : 06/22/2020 Last updated : 09/09/2021 # Fault tolerance of copy activity in Azure Data Factory
data-factory Copy Activity Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-monitoring.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Monitor copy activity
data-factory Copy Activity Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-overview.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Copy Activity Performance Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance-features.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Copy activity performance optimization features
data-factory Copy Activity Performance Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance-troubleshooting.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Troubleshoot copy activity performance
data-factory Copy Activity Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Copy activity performance and scalability guide
data-factory Copy Activity Preserve Metadata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-preserve-metadata.md
Title: Preserve metadata and ACLs using copy activity in Azure Data Factory
+ Title: Preserve metadata and ACLs using copy activity
+description: Learn how to preserve metadata and ACLs when using the copy activity in Azure Data Factory and Synapse Analytics pipelines.
-description: 'Learn about how to preserve metadata and ACLs during copy using copy activity in Azure Data Factory.'
Previously updated : 09/23/2020 Last updated : 09/09/2021
-# Preserve metadata and ACLs using copy activity in Azure Data Factory
+# Preserve metadata and ACLs using copy activity in Azure Data Factory or Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
-When you use Azure Data Factory copy activity to copy data from source to sink, in the following scenarios, you can also preserve the metadata and ACLs along.
+When you use the copy activity in Azure Data Factory or Synapse Analytics pipelines to copy data from source to sink, you can also preserve the metadata and ACLs in the following scenarios.
## <a name="preserve-metadata"></a> Preserve metadata for lake migration
Copy activity supports preserving the following attributes during data copy:
- **All the customer specified metadata**
- And the following **five data store built-in system properties**: `contentType`, `contentLanguage` (except for Amazon S3), `contentEncoding`, `contentDisposition`, `cacheControl`.
-**Handle differences in metadata:** Amazon S3 and Azure Storage allow different sets of characters in the keys of customer specified metadata. When you choose to preserve metadata using copy activity, ADF automatically replaces the invalid characters with '_'.
+**Handle differences in metadata:** Amazon S3 and Azure Storage allow different sets of characters in the keys of customer specified metadata. When you choose to preserve metadata using copy activity, the service automatically replaces the invalid characters with '_'.
When you copy files as-is from Amazon S3/Azure Data Lake Storage Gen2/Azure Blob storage/Azure Files to Azure Data Lake Storage Gen2/Azure Blob storage/Azure Files with binary format, you can find the **Preserve** option on the **Copy Activity** > **Settings** tab for activity authoring or the **Settings** page in Copy Data Tool.
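For readers who author the activity JSON directly instead of using the UI, here is a minimal sketch of a copy activity with metadata preservation turned on. This is a hedged example: the dataset reference names are hypothetical, and it assumes the `preserve` setting under `typeProperties` is the JSON counterpart of the **Preserve** option described above.

```json
{
    "name": "CopyWithMetadataPreservation",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceBinaryDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SinkBinaryDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": { "type": "AmazonS3ReadSettings", "recursive": true }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" }
        },
        "preserve": [ "Attributes" ]
    }
}
```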
Copy activity supports preserving the following types of ACLs during data copy.
- **Owner**: Copy and preserve the owning user of files and directories. Super-user access to sink Data Lake Storage Gen2 is required.
- **Group**: Copy and preserve the owning group of files and directories. Super-user access to sink Data Lake Storage Gen2 or the owning user (if the owning user is also a member of the target group) is required.
-If you specify to copy from a folder, Data Factory replicates the ACLs for that given folder and the files and directories under it, if `recursive` is set to true. If you specify to copy from a single file, the ACLs on that file are copied.
+If you specify to copy from a folder, the service replicates the ACLs for that given folder and the files and directories under it, if `recursive` is set to true. If you specify to copy from a single file, the ACLs on that file are copied.
>[!NOTE]
->When you use ADF to preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2, the existing ACLs on sink Gen2's corresponding folder/files will be overwritten.
+>When you use the copy activity to preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2, the existing ACLs on sink Gen2's corresponding folder/files will be overwritten.
>[!IMPORTANT]
->When you choose to preserve ACLs, make sure you grant high enough permissions for Data Factory to operate against your sink Data Lake Storage Gen2 account. For example, use account key authentication or assign the Storage Blob Data Owner role to the service principal or managed identity.
+>When you choose to preserve ACLs, make sure you grant high enough permissions for the service to operate against your sink Data Lake Storage Gen2 account. For example, use account key authentication or assign the Storage Blob Data Owner role to the service principal or managed identity.
When you configure source as Data Lake Storage Gen1/Gen2 with binary format or the binary copy option, and sink as Data Lake Storage Gen2 with binary format or the binary copy option, you can find the **Preserve** option on the **Settings** page in Copy Data Tool or on the **Copy Activity** > **Settings** tab for activity authoring.
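Similarly, a hedged sketch of a copy activity that preserves ACLs between Data Lake Storage Gen2 accounts is shown below. Again, the dataset reference names are hypothetical, and it assumes the same `preserve` list accepts the `ACL`, `Owner`, and `Group` values discussed above.

```json
{
    "name": "CopyWithAclPreservation",
    "type": "Copy",
    "inputs": [ { "referenceName": "Gen2SourceBinaryDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "Gen2SinkBinaryDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": { "type": "AzureBlobFSReadSettings", "recursive": true }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" }
        },
        "preserve": [ "ACL", "Owner", "Group" ]
    }
}
```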
data-factory Copy Activity Schema And Type Mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-schema-and-type-mapping.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Schema and data type mapping in copy activity
data-factory Copy Data Tool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-data-tool.md
Previously updated : 06/04/2021 Last updated : 09/09/2021 # Copy Data tool in Azure Data Factory
data-factory Create Azure Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/create-azure-integration-runtime.md
description: Learn how to create Azure integration runtime in Azure Data Factory
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Create Self Hosted Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/create-self-hosted-integration-runtime.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Data Factory Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-factory-troubleshoot-guide.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Data Flow Aggregate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-aggregate.md
Previously updated : 09/14/2020 Last updated : 09/09/2021 # Aggregate transformation in mapping data flow
data-factory Data Flow Alter Row https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-alter-row.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Alter row transformation in mapping data flow
data-factory Data Flow Conditional Split https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-conditional-split.md
Previously updated : 05/21/2020 Last updated : 09/09/2021 # Conditional split transformation in mapping data flow
data-factory Data Flow Derived Column https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-derived-column.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Derived column transformation in mapping data flow
data-factory Data Flow Exists https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-exists.md
Previously updated : 05/07/2020 Last updated : 09/09/2021 # Exists transformation in mapping data flow
data-factory Data Flow Expression Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Data transformation expressions in mapping data flow
data-factory Data Flow Filter https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-filter.md
Previously updated : 05/26/2020 Last updated : 09/09/2021 # Filter transformation in mapping data flow
data-factory Data Flow Flatten https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-flatten.md
Previously updated : 03/09/2020 Last updated : 09/09/2021 # Flatten transformation in mapping data flow
data-factory Data Flow Join https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-join.md
Previously updated : 05/15/2020 Last updated : 09/09/2021 # Join transformation in mapping data flow
data-factory Data Flow Lookup https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-lookup.md
Previously updated : 02/19/2021 Last updated : 09/09/2021 # Lookup transformation in mapping data flow
data-factory Data Flow New Branch https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-new-branch.md
Previously updated : 04/16/2021 Last updated : 09/09/2021 # Creating a new branch in mapping data flow
data-factory Data Flow Pivot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-pivot.md
Previously updated : 07/17/2020 Last updated : 09/09/2021 # Pivot transformation in mapping data flow
data-factory Data Flow Rank https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-rank.md
Title: Rank transformation in mapping data flow
+description: Learn how to use a mapping data flow rank transformation to generate a ranking column in Azure Data Factory or Synapse Analytics pipelines.
-description: How to use Azure Data Factory's mapping data flow rank transformation generate a ranking column
Previously updated : 10/05/2020 Last updated : 09/09/2021 # Rank transformation in mapping data flow
data-factory Data Flow Select https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-select.md
Previously updated : 06/02/2020 Last updated : 09/09/2021 # Select transformation in mapping data flow
data-factory Data Flow Sink https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-sink.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Sink transformation in mapping data flow
data-factory Data Flow Sort https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-sort.md
Title: Sort transformation in mapping data flow
+description: Learn about the Mapping Data Sort Transformation in Azure Data Factory and Synapse Analytics pipelines.
-description: Azure Data Factory Mapping Data Sort Transformation
Previously updated : 04/14/2020 Last updated : 09/09/2021 # Sort transformation in mapping data flow
data-factory Data Flow Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-source.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Source transformation in mapping data flow
data-factory Data Flow Surrogate Key https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-surrogate-key.md
Previously updated : 10/30/2020 Last updated : 09/09/2021 # Surrogate key transformation in mapping data flow
data-factory Data Flow Transformation Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-transformation-overview.md
Previously updated : 10/27/2020 Last updated : 09/09/2021 # Mapping data flow transformation overview
data-factory Data Flow Union https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-union.md
Previously updated : 04/27/2020 Last updated : 09/09/2021 # Union transformation in mapping data flow
data-factory Data Flow Unpivot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-unpivot.md
Previously updated : 07/14/2020 Last updated : 09/09/2021 # Unpivot transformation in mapping data flow
data-factory Data Flow Window https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-window.md
Previously updated : 11/16/2020 Last updated : 09/09/2021 # Window transformation in mapping data flow
data-factory Delete Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/delete-activity.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Delete Activity in Azure Data Factory and Azure Synapse Analytics
data-factory Format Avro https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-avro.md
Previously updated : 09/15/2020 Last updated : 09/09/2021
data-factory Format Binary https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-binary.md
Previously updated : 10/29/2020 Last updated : 09/09/2021
data-factory Format Common Data Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-common-data-model.md
Previously updated : 02/04/2021 Last updated : 09/09/2021
data-factory Format Delimited Text https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-delimited-text.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Format Excel https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-excel.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Format Json https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-json.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Format Orc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-orc.md
Title: ORC format in Azure Data Factory
+ Title: ORC format support
+description: This topic describes how to deal with ORC format in Azure Data Factory and Synapse Analytics pipelines.
-description: 'This topic describes how to deal with ORC format in Azure Data Factory.'
Previously updated : 09/28/2020 Last updated : 09/09/2021
-# ORC format in Azure Data Factory
+# ORC format in Azure Data Factory and Synapse Analytics
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
OrcSource sink(
> [!IMPORTANT]
> For a copy that runs on a Self-hosted Integration Runtime, for example between on-premises and cloud data stores, if you are not copying ORC files **as-is**, you need to install the **64-bit JRE 8 (Java Runtime Environment) or OpenJDK** and the **Microsoft Visual C++ 2010 Redistributable Package** on your IR machine. See the following paragraph for more details.
-For copy running on Self-hosted IR with ORC file serialization/deserialization, ADF locates the Java runtime by firstly checking the registry *`(SOFTWARE\JavaSoft\Java Runtime Environment\{Current Version}\JavaHome)`* for JRE, if not found, secondly checking system variable *`JAVA_HOME`* for OpenJDK.
+For copy running on a Self-hosted IR with ORC file serialization/deserialization, the service locates the Java runtime by first checking the registry *`(SOFTWARE\JavaSoft\Java Runtime Environment\{Current Version}\JavaHome)`* for JRE and, if that's not found, by then checking the system variable *`JAVA_HOME`* for OpenJDK.
- **To use JRE**: The 64-bit IR requires a 64-bit JRE. You can find it [here](https://go.microsoft.com/fwlink/?LinkId=808605).
- **To use OpenJDK**: It's supported since IR version 3.13. Package the jvm.dll with all the other required assemblies of OpenJDK onto the Self-hosted IR machine, and set the system environment variable JAVA_HOME accordingly.
For copy running on Self-hosted IR with ORC file serialization/deserialization,
![Set JVM heap size on Self-hosted IR](./media/supported-file-formats-and-compression-codecs/set-jvm-heap-size-on-selfhosted-ir.png)
-Example: set variable `_JAVA_OPTIONS` with value `-Xms256m -Xmx16g`. The flag `Xms` specifies the initial memory allocation pool for a Java Virtual Machine (JVM), while `Xmx` specifies the maximum memory allocation pool. This means that JVM will be started with `Xms` amount of memory and will be able to use a maximum of `Xmx` amount of memory. By default, ADF use min 64 MB and max 1G.
+Example: set the variable `_JAVA_OPTIONS` to the value `-Xms256m -Xmx16g`. The flag `Xms` specifies the initial memory allocation pool for a Java Virtual Machine (JVM), while `Xmx` specifies the maximum memory allocation pool. This means that the JVM starts with `Xms` amount of memory and can use at most `Xmx` amount of memory. By default, the service uses a minimum of 64 MB and a maximum of 1 GB.
## Next steps
data-factory Format Parquet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-parquet.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory Format Xml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-xml.md
Previously updated : 04/29/2021 Last updated : 09/09/2021
data-factory How To Create Event Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-event-trigger.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Create a trigger that runs a pipeline in response to a storage event
data-factory How To Create Schedule Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-schedule-trigger.md
Previously updated : 08/24/2021 Last updated : 09/09/2021
data-factory How To Create Tumbling Window Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-tumbling-window-trigger.md
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Create a trigger that runs a pipeline on a tumbling window
data-factory Iterative Development Debugging https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/iterative-development-debugging.md
Title: Iterative development and debugging in Azure Data Factory description: Learn how to develop and debug Data Factory pipelines iteratively in the ADF UX Previously updated : 04/21/2021 Last updated : 09/09/2021
data-factory Load Azure Sql Data Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/load-azure-sql-data-warehouse.md
Title: Load data into Azure Synapse Analytics
-description: Use Azure Data Factory or a Synapse pipeline to copy data into Azure Synapse Analytics.
+description: Use Azure Data Factory or an Azure Synapse pipeline to copy data into Azure Synapse Analytics.
Previously updated : 08/24/2021 Last updated : 09/09/2021 # Load data into Azure Synapse Analytics using Azure Data Factory or a Synapse pipeline
This article shows you how to use the Copy Data tool to _load data from Azure SQ
Advance to the following article to learn about Azure Synapse Analytics support: > [!div class="nextstepaction"]
->[Azure Synapse Analytics connector](connector-azure-sql-data-warehouse.md)
+>[Azure Synapse Analytics connector](connector-azure-sql-data-warehouse.md)
data-factory Load Sap Bw Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/load-sap-bw-data.md
Previously updated : 08/04/2021 Last updated : 09/09/2021 # Copy data from SAP Business Warehouse by using Azure Data Factory
data-factory Monitor Configure Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/monitor-configure-diagnostics.md
+
+ Title: Configure diagnostic settings and workspace
+description: Learn how to configure diagnostic settings and a Log Analytics Workspace to monitor Azure Data Factory.
+ Last updated : 09/02/2021
+# Configure diagnostic settings and workspace
+
+Create or add diagnostic settings for your data factory.
+
+1. In the portal, go to Monitor. Select **Settings** > **Diagnostic settings**.
+
+1. Select the data factory for which you want to set a diagnostic setting.
+
+1. If no settings exist on the selected data factory, you're prompted to create a setting. Select **Turn on diagnostics**.
+
+ :::image type="content" source="media/data-factory-monitor-oms/monitor-oms-image1.png" alt-text="Create a diagnostic setting if no settings exist":::
+
+ If there are existing settings on the data factory, you see a list of settings already configured on the data factory. Select **Add diagnostic setting**.
+
+ :::image type="content" source="media/data-factory-monitor-oms/add-diagnostic-setting.png" alt-text="Add a diagnostic setting if settings exist":::
+
+1. Give your setting a name, select **Send to Log Analytics**, and then select a workspace from **Log Analytics Workspace**.
+
+ * In _Azure-Diagnostics_ mode, diagnostic logs flow into the _AzureDiagnostics_ table.
+
+ * In _Resource-Specific_ mode, diagnostic logs from Azure Data Factory flow into the following tables:
+ - _ADFActivityRun_
+ - _ADFPipelineRun_
+ - _ADFTriggerRun_
+ - _ADFSSISIntegrationRuntimeLogs_
+ - _ADFSSISPackageEventMessageContext_
+ - _ADFSSISPackageEventMessages_
+ - _ADFSSISPackageExecutableStatistics_
+ - _ADFSSISPackageExecutionComponentPhases_
+ - _ADFSSISPackageExecutionDataStatistics_
+
+ You can select various logs relevant to your workloads to send to Log Analytics tables. For example, if you don't use SQL Server Integration Services (SSIS) at all, you need not select any SSIS logs. If you want to log SSIS Integration Runtime (IR) start/stop/maintenance operations, you can select SSIS IR logs. If you invoke SSIS package executions via T-SQL on SQL Server Management Studio (SSMS), SQL Server Agent, or other designated tools, you can select SSIS package logs. If you invoke SSIS package executions via Execute SSIS Package activities in ADF pipelines, you can select all logs.
+
+ * If you select _AllMetrics_, various ADF metrics will be made available for you to monitor or raise alerts on, including the metrics for ADF activity, pipeline, and trigger runs, as well as for SSIS IR operations and SSIS package executions.
+
+ :::image type="content" source="media/data-factory-monitor-oms/monitor-oms-image2.png" alt-text="Name your settings and select a log-an