Updates from: 09/24/2021 03:13:56
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Partner Zscaler https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/partner-zscaler.md
The rest of the steps aren't relevant to this tutorial.
Next, you need to obtain a SAML metadata URL in the following format:
-```https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy-name>/Samlp/metadata```
+`https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy-name>/Samlp/metadata`
Note that `<tenant-name>` is the name of your Azure AD B2C tenant, and `<policy-name>` is the name of the custom SAML policy that you created in the preceding step.
-For example, the URL might be `https://safemarch.b2clogin.com/safemarch.onmicrosoft.com/B2C_1A_signup_signin_saml//Samlp/metadata`.
+For example, the URL might be:
+
+`https://safemarch.b2clogin.com/safemarch.onmicrosoft.com/B2C_1A_signup_signin_saml/Samlp/metadata`.
Open a web browser and go to the SAML metadata URL. Right-click anywhere on the page, select **Save as**, and then save the file to your computer for use in the next step.
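For reference, the metadata URL pattern described above can be assembled programmatically. This is a minimal sketch; the tenant and policy names are the example values from this article:

```python
def saml_metadata_url(tenant_name: str, policy_name: str) -> str:
    """Build the Azure AD B2C SAML metadata URL from tenant and policy names."""
    return (
        f"https://{tenant_name}.b2clogin.com/"
        f"{tenant_name}.onmicrosoft.com/{policy_name}/Samlp/metadata"
    )

print(saml_metadata_url("safemarch", "B2C_1A_signup_signin_saml"))
# https://safemarch.b2clogin.com/safemarch.onmicrosoft.com/B2C_1A_signup_signin_saml/Samlp/metadata
```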
For more information, review the following articles:
- [Get started with custom policies in Azure AD B2C](./tutorial-create-user-flows.md?pivots=b2c-custom-policy)
- [Register a SAML application in Azure AD B2C](./saml-service-provider.md)
- [Step-by-step configuration guide for ZPA](https://help.zscaler.com/zpa/step-step-configuration-guide-zpa)
+- [Configure an IdP for single sign-on](https://help.zscaler.com/zpa/configuring-idp-single-sign)
active-directory-domain-services Tutorial Perform Disaster Recovery Drill https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/tutorial-perform-disaster-recovery-drill.md
+
+ Title: Tutorial - Perform a disaster recovery drill in Azure AD Domain Services | Microsoft Docs
+description: Learn how to perform a disaster recovery drill using replica sets in Azure AD Domain Services
+ Last updated: 09/22/2021
+#Customer intent: As an identity administrator, I want to perform a disaster recovery drill by using replica sets in Azure Active Directory Domain Services to demonstrate resiliency for geographically distributed domain data.
++
+# Tutorial: Perform a disaster recovery drill using replica sets in Azure Active Directory Domain Services
+
+This article shows how to perform a disaster recovery (DR) drill for Azure AD Domain Services (Azure AD DS) using replica sets. The drill simulates one replica set going offline by changing the virtual network properties to block client access to it. It's not a true DR drill, because the replica set itself isn't taken offline.
+
+The DR drill will cover:
+
+1. A client machine is connected to a given replica set. It can authenticate to the domain and perform LDAP queries.
+1. The client's connection to the replica set is terminated by restricting network access.
+1. The client then establishes a new connection with the other replica set. After that happens, the client can authenticate to the domain and perform LDAP queries.
+1. The domain member is rebooted, and a domain user can log in after the reboot.
+1. The network restrictions are removed, and the client can connect to the original replica set.
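The client-side failover exercised in these steps can be modeled as a sketch, assuming hypothetical domain controller names; the real DC locator logic in Windows is more involved:

```python
def pick_domain_controller(dcs, reachable):
    """Return the first reachable DC, mimicking client failover behavior."""
    for dc in dcs:
        if dc in reachable:
            return dc
    raise ConnectionError("no domain controller reachable")

# Hypothetical DC names, one per replica set.
dcs = ["dc1.replica-a.contoso.com", "dc2.replica-b.contoso.com"]

# All replica sets reachable: the client uses the first DC it finds.
print(pick_domain_controller(dcs, set(dcs)))
# dc1.replica-a.contoso.com

# Replica set A blocked: the client fails over to replica set B.
print(pick_domain_controller(dcs, {"dc2.replica-b.contoso.com"}))
# dc2.replica-b.contoso.com
```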
+
+## Prerequisites
+
+The following requirements must be in place to complete the DR drill:
+
+- An active Azure AD DS instance deployed with at least one extra replica set in place. The domain must be in a healthy state.
+- A client machine that is joined to the Azure AD DS hosted domain. The client must be in its own virtual network, with virtual network peering enabled to both replica set virtual networks, and its DNS settings must list the IP addresses of the domain controllers in both replica sets.
+
+## Environment validation
+
+1. Log into the client machine with a domain account.
+1. Install the Active Directory Domain Services RSAT tools.
+1. Start an elevated PowerShell window.
+1. Perform basic domain validation checks:
+ - Run `nslookup [domain]` to ensure that DNS resolution is working properly.
+ - Run `nltest /dsgetdc:` to confirm success and see which domain controller is currently being used.
+ - Run `nltest /dclist:` to return the full list of domain controllers in the directory.
+1. Perform basic domain controller validation on each domain controller in the directory (you can get the full list from the output of `nltest /dclist:`):
+ - Run `nltest /sc_reset:[domain name]\[domain controller name]` to establish a secure connection with the domain controller.
+ - Run `Get-AdDomain` to retrieve the basic directory settings.
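As a rough illustration of the last step, a script could pull the domain controller names out of `nltest /dclist:` output before looping over them. The sample output below is illustrative, not captured from a real domain:

```python
# Illustrative nltest /dclist: output (not from a real domain).
sample = """Get list of DCs in domain 'aaddscontoso.com' from '\\\\dc1'.
    dc1.aaddscontoso.com [PDC] [DS] Site: replica-a
    dc2.aaddscontoso.com      [DS] Site: replica-b
The command completed successfully"""

def parse_dclist(text):
    """Extract DC hostnames, skipping the header and trailer lines."""
    dcs = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("Get list") or line.endswith("successfully"):
            continue
        dcs.append(line.split()[0])
    return dcs

print(parse_dclist(sample))
# ['dc1.aaddscontoso.com', 'dc2.aaddscontoso.com']
```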
+
+## Perform the disaster recovery drill
+
+You'll perform these operations for each replica set in the Azure AD DS instance, simulating an outage of that replica set. When domain controllers aren't reachable, the client automatically fails over to a reachable domain controller, and this experience should be seamless to the end user or workload. It's therefore critical that applications and services don't point to a specific domain controller.
+
+1. Identify the domain controllers in the replica set that you want to simulate going offline.
+1. On the client machine, connect to one of the domain controllers using `nltest /sc_reset:[domain]\[domain controller name]`.
+1. In the Azure portal, go to the client virtual network peering and update the properties so that all traffic between the client and the replica set is blocked.
+ 1. Select the peered network that you want to update.
+ 1. Select to block all network traffic that enters or leaves the virtual network.
+ ![Screenshot of how to block traffic in the Azure portal](./media/tutorial-perform-disaster-recovery-drill/block-traffic.png)
+1. On the client machine, attempt to reestablish a secure connection with the domain controllers identified in step 1 by using the same nltest command. These operations should fail because network connectivity is blocked.
+1. Run `Get-AdDomain` and `Get-AdForest` to get basic directory properties. These calls will succeed because they are automatically going to one of the domain controllers in the other replica set.
+1. Reboot the client and login with the same domain account. This shows that authentication is still working as expected and logins are not blocked.
+1. In the Azure portal, go to the client virtual network peering and update the properties so that all traffic is unblocked. This reverts the changes that were made in step 3.
+1. On the client machine, attempt to reestablish a secure connection with the domain controllers identified in step 1 by using the same nltest command. These operations should succeed because network connectivity is unblocked.
+
+These operations demonstrate that the domain is still available even though one of the replica sets is unreachable by the client. Perform this set of steps for each replica set in the Azure AD DS instance.
+
+## Summary
+
+After you complete these steps, you'll see that domain members can continue to access the directory if one of the replica sets in the Azure AD DS instance isn't reachable. You can simulate the same behavior by blocking all network access for a replica set instead of a client machine, but we don't recommend it. It won't change the behavior from a client perspective, but it will impact the health of your Azure AD DS instance until network access is restored.
+
+## Next steps
+
+In this tutorial, you learned how to:
+
+> [!div class="checklist"]
+> * Validate client connectivity to domain controllers in a replica set
+> * Block network traffic between the client and the replica set
+> * Validate client connectivity to domain controllers in another replica set
+
+For more conceptual information, learn how replica sets work in Azure AD DS.
+
+> [!div class="nextstepaction"]
+> [Replica sets concepts and features][concepts-replica-sets]
+
+<!-- INTERNAL LINKS -->
+[replica-sets]: concepts-replica-sets.md
+[tutorial-create-instance]: tutorial-create-instance-advanced.md
+[create-azure-ad-tenant]: ../active-directory/fundamentals/sign-up-organization.md
+[associate-azure-ad-tenant]: ../active-directory/fundamentals/active-directory-how-subscriptions-associated-directory.md
+[howto-change-sku]: change-sku.md
+[concepts-replica-sets]: concepts-replica-sets.md
active-directory On Premises Application Provisioning Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-application-provisioning-architecture.md
You can also check whether all the required ports are open.
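A quick way to check whether a required port is reachable is a plain TCP connect test. This is a generic sketch, not the agent's own diagnostic; the local listener stands in for a real endpoint:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demonstrate against a throwaway local listener.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]
print(port_open("127.0.0.1", port))  # True: the listener is accepting
listener.close()
```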
## Next steps

- [App provisioning](user-provisioning.md)
- [Azure AD ECMA Connector Host prerequisites](on-premises-ecma-prerequisites.md)
- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
active-directory On Premises Ecma Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-ecma-configure.md
- Title: 'Azure AD ECMA Connector Host configuration'
-description: This article describes how to configure the Azure AD ECMA Connector Host.
- Previously updated: 06/04/2021
-# Configure the Azure AD ECMA Connector Host and the provisioning agent
-
->[!IMPORTANT]
-> The on-premises provisioning preview is currently in an invitation-only preview. To request access to the capability, use the [access request form](https://aka.ms/onpremprovisioningpublicpreviewaccess). We'll open the preview to more customers and connectors over the next few months as we prepare for general availability.
-
-This article provides guidance on how to configure the Azure Active Directory (Azure AD) ECMA Connector Host and the provisioning agent after you've successfully installed them.
-
-This flow guides you through the process of installing and configuring the Azure AD ECMA Connector Host.
-
- ![Diagram that shows the installation flow.](./media/on-premises-ecma-configure/flow-1.png)
-
-For more installation and configuration information, see:
- - [Prerequisites for the Azure AD ECMA Connector Host](on-premises-ecma-prerequisites.md)
- - [Installation of the Azure AD ECMA Connector Host](on-premises-ecma-install.md)
- - [Azure AD ECMA Connector Host generic SQL connector configuration](on-premises-sql-connector-configure.md)
-
-## Configure the Azure AD ECMA Connector Host
-Configuring the Azure AD ECMA Connector Host occurs in two parts:
-
- - **Configure the settings**: Configure the port and certificate for the Azure AD ECMA Connector Host to use. This step is only done the first time the ECMA Connector Host is started.
- - **Create a connector**: Create a connector (for example, SQL or LDAP) to allow the Azure AD ECMA Connector Host to export or import data to a data source.
-
-### Configure the settings
-When you first start the Azure AD ECMA Connector Host, you'll see a port number that's filled with the default **8585**.
-
- ![Screenshot that shows configuring your settings.](.\media\on-premises-ecma-configure\configure-1.png)
-
-For the preview, you'll need to generate a new self-signed certificate.
-
- >[!NOTE]
- >This preview uses a time-sensitive certificate. The autogenerated certificate is self-signed, is part of the trusted root, and has a SAN that matches the hostname.
--
-### Create a connector
-Now you must create a connector for the Azure AD ECMA Connector Host to use. This connector will allow the ECMA Connector Host to export data to the data source for the connector you create. You can also use it to import data if you want.
-
-The configuration steps for each of the individual connectors are longer and are provided in their own documents.
-
-To create and configure a connector, use the [generic SQL connector](on-premises-sql-connector-configure.md). This connector will work with Microsoft SQL databases, such as Azure SQL Database or Azure Database for MySQL.
-
-## Establish connectivity between Azure AD and the Azure AD ECMA Connector Host
-The following sections guide you through establishing connectivity with the on-premises Azure AD ECMA Connector Host and Azure AD.
-
-### Ensure the ECMA2Host service is running
-1. On the server running the Azure AD ECMA Connector Host, select **Start**.
-1. Enter **run**, and enter **services.msc** in the box.
-1. In the **Services** list, ensure that **Microsoft ECMA2Host** is present and running. If not, select **Start**.
-
- ![Screenshot that shows that the service is running.](.\media\on-premises-ecma-configure\configure-2.png)
-
-### Add an enterprise application
-1. Sign in to the Azure portal as an application administrator.
-1. In the portal, go to **Azure Active Directory** > **Enterprise applications**.
-1. Select **New application**.
-
- ![Screenshot that shows Add new application.](.\media\on-premises-ecma-configure\configure-4.png)
-1. Locate the **On-premises provisioning** application from the gallery, and select **Create**.
-
-### Configure the application and test
- 1. After the application is created, select the **Provisioning** page.
- 1. Select **Get started**.
-
- ![Screenshot that shows Get started.](.\media\on-premises-ecma-configure\configure-6.png)
- 1. On the **Provisioning** page, change **Provisioning Mode** to **Automatic**.
-
- ![Screenshot that shows changing the mode.](.\media\on-premises-ecma-configure\configure-7.png)
- 1. In the **On-Premises Connectivity** section, select the agent that you deployed and select **Assign Agent(s)**.
-
- ![Screenshot that shows Assign an agent.](.\media\on-premises-ecma-configure\configure-8.png)</br>
-
- >[!NOTE]
- >After you add the agent, wait 10 to 20 minutes for the registration to complete. The connectivity test won't work until the registration completes.
- >
- >Alternatively, you can force the agent registration to complete by restarting the provisioning agent on your server. Go to your server, search for **services** in the Windows search bar, identify the **Azure AD Connect Provisioning Agent Service**, right-click the service, and restart.
-
- 1. After 10 minutes, under the **Admin Credentials** section, enter the following URL. Replace the `"connectorName"` portion with the name of the connector on the ECMA Host.
-
- |Property|Value|
- |--|--|
- |Tenant URL|https://localhost:8585/ecma2host_connectorName/scim|
-
- 1. Enter the secret token value that you defined when you created the connector.
- 1. Select **Test Connection** and wait one minute.
-
- ![Screenshot that shows Test Connection.](.\media\on-premises-ecma-configure\configure-5.png)
-
- >[!NOTE]
- >Be sure to wait 10 to 20 minutes after you assign the agent to test the connection. The connection will fail if registration hasn't finished.
-
- 1. After the connection test is successful, select **Save**.</br>
-
- ![Screenshot that shows Successful test.](.\media\on-premises-ecma-configure\configure-9.png)
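The tenant URL entered under **Admin Credentials** follows a fixed pattern, so it can be assembled from the connector name. A sketch, assuming the default port 8585 and a hypothetical connector name:

```python
def tenant_url(connector_name: str, port: int = 8585) -> str:
    """Build the ECMA Connector Host SCIM tenant URL for a named connector."""
    return f"https://localhost:{port}/ecma2host_{connector_name}/scim"

# "MySQLConnector" is a placeholder connector name.
print(tenant_url("MySQLConnector"))
# https://localhost:8585/ecma2host_MySQLConnector/scim
```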
-
-## Configure who's in scope for provisioning
-Now that you have the Azure AD ECMA Connector Host talking with Azure AD, you can move on to configuring who's in scope for provisioning. The following sections provide information on how to scope your users.
-
-### Assign users to your application
-By using Azure AD, you can scope who should be provisioned based on assignment to an application or by filtering on a particular attribute. Determine who should be in scope for provisioning, and define your scoping rules, as necessary. For more information, see [Manage user assignment for an app in Azure Active Directory](../../active-directory/manage-apps/assign-user-or-group-access-portal.md).
-
-### Configure your attribute mappings
-Now you map the user attributes in Azure AD to the attributes in the target application. The Azure AD provisioning service relies on the SCIM standard for provisioning. As a result, the attributes surfaced have the SCIM name space. The following example shows how you can map the **mail** and **objectId** attributes in Azure AD to the **Email** and **InternalGUID** attributes in an application.
-
->[!NOTE]
->The default mapping connects **userPrincipalName** to an attribute name *PLACEHOLDER*. You must change the *PLACEHOLDER* attribute to one that's found in your application. For more information, see [Matching users in the source and target systems](customize-application-attributes.md#matching-users-in-the-source-and-target--systems).
-
-|Attribute name in Azure AD|Attribute name in SCIM|Attribute name in target application|
-|--|--|--|
-|mail|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:Email|Email|
-|objectId|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:InternalGUID|InternalGUID|
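The mapping table above can be expressed as a simple lookup for reference. The URNs come from the table; the dictionary shape itself is illustrative only, not a format the provisioning service consumes:

```python
# SCIM extension namespace used by the ECMA2Host, per the table above.
NAMESPACE = "urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User"

# Azure AD attribute -> (SCIM attribute, target application attribute)
mappings = {
    "mail": (f"{NAMESPACE}:Email", "Email"),
    "objectId": (f"{NAMESPACE}:InternalGUID", "InternalGUID"),
}

for azure_ad_attr, (scim_attr, target_attr) in mappings.items():
    print(f"{azure_ad_attr} -> {scim_attr} -> {target_attr}")
```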
-
-### Configure attribute mapping
- 1. In the Azure AD portal, under **Enterprise applications**, select the **Provisioning** page.
- 2. Select **Get started**.
- 3. Expand **Mappings**, and select **Provision Azure Active Directory Users**.
-
- ![Screenshot that shows Provision Azure Active Directory Users.](.\media\on-premises-ecma-configure\configure-10.png)
- 1. Select **Add New Mapping**.
-
- ![Screenshot that shows Add New Mapping.](.\media\on-premises-ecma-configure\configure-11.png)
- 1. Specify the source and target attributes, and select **OK**.</br>
-
- ![Screenshot that shows the Edit Attribute pane.](.\media\on-premises-ecma-configure\configure-12.png)
--
-For more information on mapping user attributes from applications to Azure AD, see [Tutorial - Customize user provisioning attribute-mappings for SaaS applications in Azure Active Directory](customize-application-attributes.md).
-
-### Test your configuration by provisioning users on demand
-To test your configuration, you can use on-demand provisioning of users. For information on provisioning users on-demand, see [On-demand provisioning](provision-on-demand.md).
-
- 1. Go to the single sign-on pane, and then go back to the provisioning pane. On the new provisioning overview pane, select **On-demand**.
- 1. Test provisioning a few users on demand as described in [On-demand provisioning in Azure Active Directory](provision-on-demand.md).
-
- ![Screenshot that shows testing provisioning.](.\media\on-premises-ecma-configure\configure-13.png)
-
-### Start provisioning users
- 1. After on-demand provisioning is successful, go back to the provisioning configuration page. Ensure that the scope is set to only assigned users and groups, turn the provisioning status to **On**, and select **Save**.
-
- ![Screenshot that shows starting provisioning.](.\media\on-premises-ecma-configure\configure-14.png)
-
-1. Wait several minutes for provisioning to start. It might take up to 40 minutes. After the provisioning job has completed, as described in the next section, you can change the provisioning status to **Off**, and select **Save**. This step will stop the provisioning service from running in the future.
-
-### Verify users were successfully provisioned
-After waiting, check your data source to see if new users are being provisioned.
-
- ![Screenshot that shows verifying that users are provisioned.](.\media\on-premises-ecma-configure\configure-15.png)
-
-## Monitor your deployment
-
-1. Use the provisioning logs to determine which users were provisioned successfully or unsuccessfully.
-1. Build custom alerts, dashboards, and queries by using the Azure Monitor integration.
-1. If the provisioning configuration seems to be in an unhealthy state, the application goes into quarantine. Learn more about [quarantine states](https://github.com/MicrosoftDocs/azure-docs-pr/blob/master/articles/active-directory/app-provisioning/application-provisioning-quarantine-status.md).
-
-## Next steps
-- [Azure AD ECMA Connector Host prerequisites](on-premises-ecma-prerequisites.md)
-- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
-- [Generic SQL connector](on-premises-sql-connector-configure.md)
active-directory On Premises Ecma Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-ecma-install.md
- Title: 'Azure AD ECMA Connector Host installation'
-description: This article describes how to install the Azure AD ECMA Connector Host.
- Previously updated: 05/28/2021
-# Installation of the Azure AD ECMA Connector Host
-
->[!IMPORTANT]
-> The on-premises provisioning preview is currently in an invitation-only preview. To request access to the capability, use the [access request form](https://aka.ms/onpremprovisioningpublicpreviewaccess). We'll open the preview to more customers and connectors over the next few months as we prepare for general availability.
-
-The Azure Active Directory (Azure AD) ECMA Connector Host is included as part of the Azure AD Connect Provisioning Agent Package. The provisioning agent and Azure AD ECMA Connector Host are two separate Windows services. They're installed by using one installer, which is deployed on the same machine.
-
-This flow guides you through the process of installing and configuring the Azure AD ECMA Connector Host.
-
- ![Diagram that shows the installation flow.](./media/on-premises-ecma-install/flow-1.png)
-
-For more installation and configuration information, see:
-
- - [Prerequisites for the Azure AD ECMA Connector Host](on-premises-ecma-prerequisites.md)
- - [Configure the Azure AD ECMA Connector Host and the provisioning agent](on-premises-ecma-configure.md)
- - [Azure AD ECMA Connector Host generic SQL connector configuration](on-premises-sql-connector-configure.md)
-
-## Download and install the Azure AD Connect Provisioning Agent Package
-
- 1. Sign in to the Azure portal.
- 1. Go to **Enterprise applications** > **Add a new application**.
- 1. Search for the **On-premises provisioning** application, and add it to your tenant image.
- 1. Go to the **Provisioning** pane.
- 1. Select **On-premises connectivity**.
- 1. Download the agent installer.
- 1. Run the Azure AD Connect provisioning installer **AADConnectProvisioningAgentSetup.msi**.
- 1. On the **Microsoft Azure AD Connect Provisioning Agent Package** screen, accept the licensing terms, and select **Install**.
-
- ![Microsoft Azure AD Connect Provisioning Agent Package screen.](media/on-premises-ecma-install/install-1.png)</br>
- 1. After this operation finishes, the configuration wizard starts. Select **Next**.
-
- ![Screenshot that shows the Welcome screen.](media/on-premises-ecma-install/install-2.png)</br>
-
- 1. On the **Select Extension** screen, select **On-premises application provisioning (Azure AD to application)**. Select **Next**.
-
- ![Screenshot that shows Select extension.](media/on-premises-ecma-install/install-3.png)</br>
- 1. Use your global administrator account to sign in to Azure AD.
-
- ![Screenshot that shows Azure sign-in.](media/on-premises-ecma-install/install-4.png)</br>
- 1. On the **Agent configuration** screen, select **Confirm**.
-
- ![Screenshot that shows Confirm installation.](media/on-premises-ecma-install/install-5.png)</br>
- 1. After the installation is complete, you should see a message at the bottom of the wizard. Select **Exit**.
-
- ![Screenshot that shows finishing.](media/on-premises-ecma-install/install-6.png)</br>
-
-
-Now that the agent package has been successfully installed, you need to configure the Azure AD ECMA Connector Host and create or import connectors.
-
-## Next steps
-- [Azure AD ECMA Connector Host prerequisites](on-premises-ecma-prerequisites.md)
-- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
-- [Generic SQL connector](on-premises-sql-connector-configure.md)
active-directory On Premises Ecma Prerequisites https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-ecma-prerequisites.md
- Title: 'Prerequisites for Azure AD ECMA Connector Host'
-description: This article describes the prerequisites and hardware requirements you need for using the Azure AD ECMA Connector Host.
- Previously updated: 05/28/2021
-# Prerequisites for the Azure AD ECMA Connector Host
-
->[!IMPORTANT]
-> The on-premises provisioning preview is currently in an invitation-only preview. To request access to the capability, use the [access request form](https://aka.ms/onpremprovisioningpublicpreviewaccess). We'll open the preview to more customers and connectors over the next few months as we prepare for general availability.
-
-This article provides guidance on the prerequisites that are needed for using the Azure Active Directory (Azure AD) ECMA Connector Host.
-
-This flow guides you through the process of installing and configuring the Azure AD ECMA Connector Host.
-
- ![Diagram that shows the installation flow.](./media/on-premises-ecma-prerequisites/flow-1.png)
-
-For more installation and configuration information, see:
-
- - [Installation of the Azure AD ECMA Connector Host](on-premises-ecma-install.md)
- - [Configure the Azure AD ECMA Connector Host and the provisioning agent](on-premises-ecma-configure.md)
- - [Azure AD ECMA Connector Host generic SQL connector configuration](on-premises-sql-connector-configure.md)
-
-## On-premises prerequisites
--
-## Cloud requirements
-
-
- [!INCLUDE [active-directory-p1-license.md](../../../includes/active-directory-p1-license.md)]
-
-## Next steps
-- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
-- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
-- [Generic SQL connector](on-premises-sql-connector-configure.md)
-- [Tutorial - ECMA Connector Host generic SQL connector](tutorial-ecma-sql-connector.md)
active-directory On Premises Ecma Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-ecma-troubleshoot.md
See [About anchor attributes and distinguished names](on-premises-application-pr
## Next steps

- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
- [Generic SQL connector](on-premises-sql-connector-configure.md)
- [Tutorial: ECMA Connector Host generic SQL connector](tutorial-ecma-sql-connector.md)
active-directory On Premises Migrate Microsoft Identity Manager https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-migrate-microsoft-identity-manager.md
At this point, the MIM Sync server is no longer needed.
## Next steps

- [App provisioning](user-provisioning.md)
- [Azure AD ECMA Connector Host prerequisites](on-premises-ecma-prerequisites.md)
- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
- [Generic SQL connector](on-premises-sql-connector-configure.md)
active-directory On Premises Scim Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-scim-provisioning.md
To provision users to SCIM-enabled apps:
## Next steps

- [App provisioning](user-provisioning.md)
- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)
- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md)
- [Generic SQL connector](on-premises-sql-connector-configure.md)
- [Tutorial: ECMA Connector Host generic SQL connector](tutorial-ecma-sql-connector.md)
active-directory On Premises Sql Connector Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/on-premises-sql-connector-configure.md
# Azure AD ECMA Connector Host generic SQL connector configuration
+This article provides configuration and tutorial information that shows how to use the generic SQL connector and the ECMA Connector Host with SQL Server.
->[!IMPORTANT]
-> The on-premises provisioning preview is currently in an invitation-only preview. To request access to the capability, use the [access request form](https://aka.ms/onpremprovisioningpublicpreviewaccess). We'll open the preview to more customers and connectors over the next few months as we prepare for general availability.
-This article describes how to create a new SQL connector with the Azure Active Directory (Azure AD) ECMA Connector Host and how to configure it. You'll need to do this task after you've successfully installed the Azure AD ECMA Connector Host.
-
->[!NOTE]
-> This article covers only the configuration of the generic SQL connector. For a step-by-step example of how to set up the generic SQL connector, see [Tutorial: ECMA Connector Host generic SQL connector](tutorial-ecma-sql-connector.md)
-
- This flow guides you through the process of installing and configuring the Azure AD ECMA Connector Host.
-
- ![Diagram that shows the installation flow.](./media/on-premises-sql-connector-configure/flow-1.png)
-
-For more installation and configuration information, see:
- - [Prerequisites for the Azure AD ECMA Connector Host](on-premises-ecma-prerequisites.md)
- - [Installation of the Azure AD ECMA Connector Host](on-premises-ecma-install.md)
- - [Configure the Azure AD ECMA Connector Host and the provisioning agent](on-premises-ecma-configure.md)
-
-Depending on the options you select, some of the wizard screens might not be available and the information might be slightly different. For purposes of this configuration, the user object type is used. Use the following information to guide you in your configuration.
-
-#### Supported systems
-* Microsoft SQL Server and Azure SQL
-* IBM DB2 10.x
-* IBM DB2 9.x
-* Oracle 10 and 11g
-* Oracle 12c and 18c
-* MySQL 5.x
-
-## Create a generic SQL connector
-
-To create a generic SQL connector:
-
- 1. Select the ECMA Connector Host shortcut on the desktop.
- 1. Select **New Connector**.
-
- ![Screenshot that shows Choose new connector.](.\media\on-premises-sql-connector-configure\sql-1.png)
-
- 1. On the **Properties** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows Enter properties.](.\media\on-premises-sql-connector-configure\sql-2.png)
-
- |Property|Description|
- |--|--|
- |Name|The name for this connector.|
- |Autosync timer (minutes)|Minimum allowed is 120 minutes.|
- |Secret Token|The secret token value. It must be a string of 10 to 20 ASCII letters and/or digits.|
- |Description|The description of the connector.|
- |Extension DLL|For a generic SQL connector, select **Microsoft.IAM.Connector.GenericSql.dll**.|
- 1. On the **Connectivity** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows Enter connectivity.](.\media\on-premises-sql-connector-configure\sql-3.png)
-
- |Property|Description|
- |--|--|
- |DSN File|The Data Source Name file used to connect to the SQL Server instance.|
- |User Name|The username of an individual with rights to the SQL Server instance. It must be in the form of hostname\sqladminaccount for standalone servers or domain\sqladminaccount for domain member servers.|
- |Password|The password of the username just provided.|
- |DN is Anchor|Unless your environment is known to require these settings, don't select the **DN is Anchor** and **Export Type:Object Replace** checkboxes.|
- |Export Type:Object Replace||
- 1. On the **Schema 1** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Schema 1 page.](.\media\on-premises-sql-connector-configure\sql-4.png)
-
- |Property|Description|
- |--|--|
- |Object type detection method|The method used to detect the object type the connector will be provisioning.|
- |Fixed value list/Table/View/SP|This box should contain **User**.|
- |Column Name for Table/View/SP||
- |Stored Procedure Parameters||
- |Provide SQL query for detecting object types||
- 1. On the **Schema 2** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes. This schema screen might be slightly different or have additional information depending on the object types you selected in the previous step.
-
- ![Screenshot that shows the Schema 2 page.](.\media\on-premises-sql-connector-configure\sql-5.png)
-
- |Property|Description|
- |--|--|
- |User:Attribute Detection|This property should be set to **Table**.|
- |User:Table/View/SP|This box should contain **Employees**.|
- |User:Name of Multi-Valued Table/Views||
- |User:Store Procedure Parameters||
- |User:Provide SQL query for detecting attributes||
 1. On the **Schema 3** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes. The attributes you see depend on the information you provided in the previous step.
-
- ![Screenshot that shows the Schema 3 page.](.\media\on-premises-sql-connector-configure\sql-6.png)
-
- |Property|Description|
- |--|--|
- |Select DN attribute for User||
- 1. On the **Schema 4** page, review the **DataType** attribute and the direction of flow for the connector. You can adjust them if needed and select **Next**.
-
- ![Screenshot that shows the schema 4 page.](.\media\on-premises-sql-connector-configure\sql-7.png)
- 1. On the **Global** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Global page.](.\media\on-premises-sql-connector-configure\sql-8.png)
-
- |Property|Description|
- |--|--|
- |Water Mark Query||
- |Data Source Time Zone|Select the time zone that the data source is located in.|
- |Data Source Date Time Format|Specify the format for the data source.|
- |Use named parameters to execute a stored procedure||
- |Operation Methods||
- |Extension Name||
- |Set Password SP Name||
- |Set Password SP Parameters||
- 1. On the **Select partition** page, ensure that the correct partitions are selected and select **Next**.
-
- ![Screenshot that shows the Select partition page.](.\media\on-premises-sql-connector-configure\sql-9.png)
-
- 1. On the **Run Profiles** page, select the run profiles that you want to use and select **Next**.
-
- ![Screenshot that shows the Run Profiles page.](.\media\on-premises-sql-connector-configure\sql-10.png)
-
- |Property|Description|
- |--|--|
- |Export|Run profile that will export data to SQL. This run profile is required.|
- |Full import|Run profile that will import all data from SQL sources specified earlier.|
- |Delta import|Run profile that will import only changes from SQL since the last full or delta import.|
-
 1. On the **Export** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows Enter Export information.](.\media\on-premises-sql-connector-configure\sql-11.png)
-
- |Property|Description|
- |--|--|
- |Operation Method||
- |Table/View/SP||
- |Start Index Parameter Name||
- |End Index Parameter Name||
- |Stored Procedure Parameters||
-
- 1. On the **Object Types** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Object Types page.](.\media\on-premises-sql-connector-configure\sql-12.png)
-
- |Property|Description|
- |--|--|
- |Target object|The object that you're configuring.|
- |Anchor|The attribute that will be used as the object's anchor. This attribute should be unique in the target system. The Azure AD provisioning service will query the ECMA host by using this attribute after the initial cycle. This anchor value should be the same as the anchor value in Schema 3.|
- |Query Attribute|Used by the ECMA host to query the in-memory cache. This attribute should be unique.|
- |DN|The attribute that's used for the target object's distinguished name. The **Autogenerated** checkbox should be selected in most cases. If it isn't selected, ensure that the DN attribute is mapped to an attribute in Azure AD that stores the DN in this format: CN = anchorValue, Object = objectType.|
-
- 1. The ECMA host discovers the attributes supported by the target system. You can choose which of those attributes you want to expose to Azure AD. These attributes can then be configured in the Azure portal for provisioning. On the **Select Attributes** page, select attributes from the dropdown list to add.
-
- ![Screenshot that shows the Select Attributes page.](.\media\on-premises-sql-connector-configure\sql-13.png)
-
-1. On the **Deprovisioning** page, review the deprovisioning information and make adjustments as necessary. Attributes selected on the previous page won't be available to select on the **Deprovisioning** page. Select **Finish**.
-
- ![Screenshot that shows the Deprovisioning page.](.\media\on-premises-sql-connector-configure\sql-14.png)
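The Secret Token rule from the **Properties** table (a string of 10 to 20 ASCII letters and/or digits) can be sketched as a quick validation check. This is an illustrative helper, not part of the ECMA Connector Host:

```python
import re

def is_valid_secret_token(token: str) -> bool:
    """Return True if the token is 10 to 20 ASCII letters and/or digits,
    per the generic SQL connector's Secret Token rule."""
    return re.fullmatch(r"[A-Za-z0-9]{10,20}", token) is not None

# A short numeric value such as "123456" is too short to satisfy the rule.
print(is_valid_secret_token("abc123XYZ9"))  # True
print(is_valid_secret_token("123456"))      # False
```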
## Next steps - [App provisioning](user-provisioning.md)-- [Azure AD ECMA Connector Host installation](on-premises-ecma-install.md)-- [Azure AD ECMA Connector Host configuration](on-premises-ecma-configure.md) - [Tutorial: ECMA Connector Host generic SQL connector](tutorial-ecma-sql-connector.md)
active-directory Tutorial Ecma Sql Connector https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/tutorial-ecma-sql-connector.md
# Azure AD ECMA Connector Host generic SQL connector tutorial
->[!IMPORTANT]
-> The on-premises provisioning preview is currently in an invitation-only preview. To request access to the capability, use the [access request form](https://aka.ms/onpremprovisioningpublicpreviewaccess). We'll open the preview to more customers and connectors over the next few months as we prepare for general availability.
-
-This tutorial describes the steps you need to perform to automatically provision and deprovision users from Azure Active Directory (Azure AD) into a SQL database. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../app-provisioning/user-provisioning.md).
-
-This tutorial covers how to set up and use the generic SQL connector with the Azure AD ECMA Connector Host.
-
-## Prepare the sample database
-On a server running SQL Server, run the SQL script found in [Appendix A](#appendix-a). This script creates a sample database with the name CONTOSO. This is the database that you'll be provisioning users into.
--
-## Create the DSN connection file
-The generic SQL connector uses a DSN file to connect to the SQL Server instance. First, you need to create a file with the ODBC connection information.
-
-1. Start the ODBC management utility on your server.
-
- ![Screenshot that shows ODBC management.](./media/tutorial-ecma-sql-connector/odbc.png)
-1. Select the **File DSN** tab, and select **Add**.
-
- ![Screenshot that shows the File DSN tab.](./media/tutorial-ecma-sql-connector/dsn-2.png)
-1. Select **SQL Server Native Client 11.0** and select **Next**.
-
- ![Screenshot that shows choosing a native client.](./media/tutorial-ecma-sql-connector/dsn-3.png)
-1. Give the file a name, such as **GenericSQL**, and select **Next**.
-
- ![Screenshot that shows naming the connector.](./media/tutorial-ecma-sql-connector/dsn-4.png)
-1. Select **Finish**.
-
- ![Screenshot that shows Finish.](./media/tutorial-ecma-sql-connector/dsn-5.png)
-1. Now configure the connection. Enter **APP1** for the name of the server and select **Next**.
-
- ![Screenshot that shows entering a server name.](./media/tutorial-ecma-sql-connector/dsn-6.png)
-1. Keep Windows authentication and select **Next**.
-
- ![Screenshot that shows Windows authentication.](./media/tutorial-ecma-sql-connector/dsn-7.png)
-1. Enter the name of the sample database, which is **CONTOSO**.
-
- ![Screenshot that shows entering a database name.](./media/tutorial-ecma-sql-connector/dsn-8.png)
-1. Keep everything default on this screen, and select **Finish**.
-
- ![Screenshot that shows selecting Finish.](./media/tutorial-ecma-sql-connector/dsn-9.png)
-1. To check that everything is working as expected, select **Test Data Source**.
-
- ![Screenshot that shows Test Data Source.](./media/tutorial-ecma-sql-connector/dsn-10.png)
-1. Make sure the test is successful.
-
- ![Screenshot that shows success.](./media/tutorial-ecma-sql-connector/dsn-11.png)
-1. Select **OK** twice. Close the ODBC Data Source Administrator.
-
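A File DSN created by the wizard is a small INI-style text file. As a rough sketch (the exact keys the wizard writes may differ), the values chosen above would land in a file like the one below, which Python's `configparser` can read back:

```python
import configparser
from io import StringIO

# Illustrative contents of GenericSQL.dsn; the driver name and keys are
# assumptions based on the wizard choices, not a captured file.
dsn_text = """\
[ODBC]
DRIVER=SQL Server Native Client 11.0
SERVER=APP1
DATABASE=CONTOSO
Trusted_Connection=Yes
"""

parser = configparser.ConfigParser()
parser.read_file(StringIO(dsn_text))
odbc = parser["ODBC"]
print(odbc["SERVER"], odbc["DATABASE"])  # APP1 CONTOSO
```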
-## Download and install the Azure AD Connect Provisioning Agent Package
-
- 1. Sign in to the server you'll use with enterprise admin permissions.
- 1. Sign in to the Azure portal, and then go to **Azure Active Directory**.
- 1. On the menu on the left, select **Azure AD Connect**.
- 1. Select **Manage cloud sync** > **Review all agents**.
- 1. Download the Azure AD Connect Provisioning Agent Package from the Azure portal.
- 1. Accept the terms and select **Download**.
- 1. Run the Azure AD Connect provisioning installer AADConnectProvisioningAgentSetup.msi.
- 1. On the **Microsoft Azure AD Connect Provisioning Agent Package** screen, select **Install**.
-
- ![Screenshot that shows the Microsoft Azure AD Connect Provisioning Agent Package screen.](media/on-premises-ecma-install/install-1.png)</br>
- 1. After this operation finishes, the configuration wizard starts. Select **Next**.
-
- ![Screenshot that shows the Welcome screen.](media/on-premises-ecma-install/install-2.png)</br>
- 1. On the **Select Extension** screen, select **On-premises application provisioning (Azure AD to application)** and select **Next**.
-
- ![Screenshot that shows the Select Extension screen.](media/on-premises-ecma-install/install-3.png)</br>
- 1. Use your global administrator account and sign in to Azure AD.
-
- ![Screenshot that shows the Azure sign-in screen.](media/on-premises-ecma-install/install-4.png)</br>
- 1. On the **Agent configuration** screen, select **Confirm**.
-
- ![Screenshot that shows confirming the installation.](media/on-premises-ecma-install/install-5.png)</br>
- 1. After the installation is complete, you should see a message at the bottom of the wizard. Select **Exit**.
-
- ![Screenshot that shows the Exit button.](media/on-premises-ecma-install/install-6.png)</br>
-
-## Configure the Azure AD ECMA Connector Host
-1. On the desktop, select the ECMA shortcut.
-1. After the ECMA Connector Host Configuration starts, leave the default port **8585** and select **Generate** to generate a certificate. The autogenerated certificate will be self-signed as part of the trusted root. The SAN matches the host name.
-
- ![Screenshot that shows configuring your settings.](.\media\on-premises-ecma-configure\configure-1.png)
-1. Select **Save**.
-
-## Create a generic SQL connector
- 1. Select the ECMA Connector Host shortcut on the desktop.
- 1. Select **New Connector**.
-
- ![Screenshot that shows choosing New Connector.](.\media\on-premises-sql-connector-configure\sql-1.png)
-
- 1. On the **Properties** page, fill in the boxes with the values specified in the table that follows the image and select **Next**.
-
- ![Screenshot that shows entering properties.](.\media\tutorial-ecma-sql-connector\conn-1.png)
-
- |Property|Value|
- |--|--|
- |Name|SQL|
- |Autosync timer (minutes)|120|
- |Secret Token|Enter your own key here. It should be 12 characters minimum.|
- |Extension DLL|For a generic SQL connector, select **Microsoft.IAM.Connector.GenericSql.dll**.|
- 1. On the **Connectivity** page, fill in the boxes with the values specified in the table that follows the image and select **Next**.
-
- ![Screenshot that shows the Connectivity page.](.\media\tutorial-ecma-sql-connector\conn-2.png)
-
- |Property|Value|
- |--|--|
- |DSN File|Go to the file created at the beginning of the tutorial in "Create the DSN connection file."|
- |User Name|contoso\administrator|
- |Password|Enter the administrator's password.|
- 1. On the **Schema 1** page, fill in the boxes with the values specified in the table that follows the image and select **Next**.
-
- ![Screenshot that shows the Schema 1 page.](.\media\tutorial-ecma-sql-connector\conn-3.png)
-
- |Property|Value|
- |--|--|
- |Object type detection method|Fixed Value|
- |Fixed value list/Table/View/SP|User|
- 1. On the **Schema 2** page, fill in the boxes with the values specified in the table that follows the image and select **Next**.
-
- ![Screenshot that shows the Schema 2 page.](.\media\tutorial-ecma-sql-connector\conn-4.png)
-
- |Property|Value|
- |--|--|
- |User:Attribute Detection|Table|
- |User:Table/View/SP|Employees|
- 1. On the **Schema 3** page, fill in the boxes with the values specified in the table that follows the image and select **Next**.
-
- ![Screenshot that shows the Schema 3 page.](.\media\tutorial-ecma-sql-connector\conn-5.png)
-
 |Property|Value|
- |--|--|
- |Select Anchor for :User|User:ContosoLogin|
- |Select DN attribute for User|AzureID|
- 1. On the **Schema 4** page, leave the defaults and select **Next**.
-
- ![Screenshot that shows the Schema 4 page.](.\media\tutorial-ecma-sql-connector\conn-6.png)
- 1. On the **Global** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Global page.](.\media\tutorial-ecma-sql-connector\conn-7.png)
-
 |Property|Value|
- |--|--|
- |Data Source Date Time Format|yyyy-MM-dd HH:mm:ss|
- 1. On the **Partitions** page, select **Next**.
-
- ![Screenshot that shows the Partitions page.](.\media\tutorial-ecma-sql-connector\conn-8.png)
-
- 1. On the **Run Profiles** page, keep the **Export** checkbox selected. Select the **Full import** checkbox and select **Next**.
-
- ![Screenshot that shows the Run Profiles page.](.\media\tutorial-ecma-sql-connector\conn-9.png)
-
- 1. On the **Export** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Export page.](.\media\tutorial-ecma-sql-connector\conn-10.png)
-
 |Property|Value|
- |--|--|
- |Operation Method|Table|
- |Table/View/SP|Employees|
-
- 1. On the **Full Import** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- ![Screenshot that shows the Full Import page.](.\media\tutorial-ecma-sql-connector\conn-11.png)
-
 |Property|Value|
- |--|--|
- |Operation Method|Table|
- |Table/View/SP|Employees|
-
- 1. On the **Object Types** page, fill in the boxes and select **Next**. Use the table that follows the image for guidance on the individual boxes.
-
- - **Anchor**: This attribute should be unique in the target system. The Azure AD provisioning service will query the ECMA host by using this attribute after the initial cycle. This anchor value should be the same as the anchor value in schema 3.
- - **Query Attribute**: Used by the ECMA host to query the in-memory cache. This attribute should be unique.
- - **DN**: The **Autogenerated** option should be selected in most cases. If it isn't selected, ensure that the DN attribute is mapped to an attribute in Azure AD that stores the DN in this format: CN = anchorValue, Object = objectType.
-
- ![Screenshot that shows the Object Types page.](.\media\tutorial-ecma-sql-connector\conn-12.png)
-
 |Property|Value|
- |--|--|
- |Target object|User|
- |Anchor|ContosoLogin|
- |Query Attribute|AzureID|
- |DN|AzureID|
- |Autogenerated|Checked|
-
-
- 1. On the **Select Attributes** page, add all the attributes in the dropdown list and select **Next**.
-
- ![Screenshot that shows the Select Attributes page.](.\media\tutorial-ecma-sql-connector\conn-13.png)
-
- The **Attribute** dropdown list shows any attribute that was discovered in the target system and *wasn't* chosen on the previous **Select Attributes** page.
- 1. On the **Deprovisioning** page, under **Disable flow**, select **Delete**. Select **Finish**.
-
- ![Screenshot that shows the Deprovisioning page.](.\media\tutorial-ecma-sql-connector\conn-14.png)
-
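When **Autogenerated** isn't selected, the DN that the service expects follows the `CN = anchorValue, Object = objectType` shape described in the Object Types step. A minimal sketch of composing that string (the helper name is illustrative):

```python
def build_dn(anchor_value: str, object_type: str) -> str:
    """Compose a DN in the CN = anchorValue, Object = objectType format
    that the ECMA host expects when the DN isn't autogenerated."""
    return f"CN = {anchor_value}, Object = {object_type}"

print(build_dn("jdoe@contoso.com", "User"))
# CN = jdoe@contoso.com, Object = User
```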
-## Ensure ECMA2Host service is running
-1. On the server running the Azure AD ECMA Connector Host, select **Start**.
-1. Enter **run**, and then enter **services.msc** in the box.
-1. In the **Services** list, ensure that **Microsoft ECMA2Host** is present and running. If not, select **Start**.
-
- ![Screenshot that shows the service is running.](.\media\on-premises-ecma-configure\configure-2.png)
-
-## Add an enterprise application
-1. Sign in to the Azure portal as an application administrator.
-1. In the portal, go to **Azure Active Directory** > **Enterprise applications**.
-1. Select **New application**.
-
- ![Screenshot that shows adding a new application.](.\media\on-premises-ecma-configure\configure-4.png)
-1. Search the gallery for **On-premises ECMA app** and select **Create**.
-
-## Configure the application and test
-1. After it has been created, select the **Provisioning** page.
-1. Select **Get started**.
-
- ![Screenshot that shows get started.](.\media\on-premises-ecma-configure\configure-1.png)
-1. On the **Provisioning** page, change the mode to **Automatic**.
-
- ![Screenshot that shows changing the mode to Automatic.](.\media\on-premises-ecma-configure\configure-7.png)
-1. In the **On-Premises Connectivity** section, select the agent that you just deployed and select **Assign Agent(s)**.
- >[!NOTE]
- >After you add the agent, wait 10 minutes for the registration to complete. The connectivity test won't work until the registration completes.
- >
- >Alternatively, you can force the agent registration to complete by restarting the provisioning agent on your server. Go to your server, search for **services** in the Windows search bar, identify the **Azure AD Connect Provisioning Agent Service**, right-click the service, and restart.
-
- ![Screenshot that shows restarting an agent.](.\media\on-premises-ecma-configure\configure-8.png)
-1. After 10 minutes, under the **Admin credentials** section, enter the following URL. Replace the `connectorName` portion with the name of the connector on the ECMA host. You can also replace `localhost` with the host name.
-
- |Property|Value|
- |--|--|
- |Tenant URL|https://localhost:8585/ecma2host_connectorName/scim|
-
-1. Enter the **Secret Token** value that you defined when you created the connector.
-1. Select **Test Connection**, and wait one minute.
-
- ![Screenshot that shows assigning an agent.](.\media\on-premises-ecma-configure\configure-5.png)
-1. After the connection test is successful, select **Save**.</br>
-
- ![Screenshot that shows testing an agent.](.\media\on-premises-ecma-configure\configure-9.png)
-
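The Tenant URL entered above follows a fixed pattern: the ECMA host's SCIM endpoint on port 8585 with the connector name embedded in the path. A sketch of assembling it, where the host and connector name are placeholders you replace with your own values:

```python
def ecma_tenant_url(connector_name: str, host: str = "localhost", port: int = 8585) -> str:
    """Build the Tenant URL for the Azure AD provisioning configuration,
    matching the https://host:8585/ecma2host_connectorName/scim pattern."""
    return f"https://{host}:{port}/ecma2host_{connector_name}/scim"

print(ecma_tenant_url("SQL"))
# https://localhost:8585/ecma2host_SQL/scim
```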
-## Assign users to an application
-Now that you have the Azure AD ECMA Connector Host talking with Azure AD, you can move on to configuring who's in scope for provisioning.
-
-1. In the Azure portal, select **Enterprise applications**.
-1. Select the **On-premises provisioning** application.
-1. On the left, under **Manage**, select **Users and groups**.
-1. Select **Add user/group**.
-
- ![Screenshot that shows adding a user.](.\media\tutorial-ecma-sql-connector\app-2.png)
-1. Under **Users**, select **None Selected**.
-
- ![Screenshot that shows None Selected.](.\media\tutorial-ecma-sql-connector\app-3.png)
-1. Select users from the right and select the **Select** button.</br>
-
- ![Screenshot that shows Select users.](.\media\tutorial-ecma-sql-connector\app-4.png)
-1. Now select **Assign**.
-
- ![Screenshot that shows Assign users.](.\media\tutorial-ecma-sql-connector\app-5.png)
-
-## Configure attribute mappings
-Now you need to map attributes between the on-premises application and your SQL server.
-
-#### Configure attribute mapping
- 1. In the Azure AD portal, under **Enterprise applications**, select the **Provisioning** page.
- 1. Select **Get started**.
- 1. Expand **Mappings** and select **Provision Azure Active Directory Users**.
-
- ![Screenshot that shows provisioning a user.](.\media\on-premises-ecma-configure\configure-10.png)
- 1. Select **Add New Mapping**.
-
- ![Screenshot that shows Add New Mapping.](.\media\on-premises-ecma-configure\configure-11.png)
- 1. Specify the source and target attributes, and add all the mappings in the following table.
-
- |Mapping type|Source attribute|Target attribute|
- |--|--|--|
- |Direct|userPrincipalName|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:ContosoLogin|
- |Direct|objectID|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:AzureID|
- |Direct|mail|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:Email|
- |Direct|givenName|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:FirstName|
- |Direct|surName|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:LastName|
- |Direct|mailNickname|urn:ietf:params:scim:schemas:extension:ECMA2Host:2.0:User:textID|
-
- 1. Select **Save**.
-
- ![Screenshot that shows saving the mapping.](.\media\tutorial-ecma-sql-connector\app-6.png)
-
-## Test provisioning
-Now that your attributes are mapped, you can test on-demand provisioning with one of your users.
-
- 1. In the Azure portal, select **Enterprise applications**.
- 1. Select the **On-premises provisioning** application.
- 1. On the left, select **Provisioning**.
- 1. Select **Provision on demand**.
- 1. Search for one of your test users, and select **Provision**.
-
- ![Screenshot that shows testing provisioning.](.\media\on-premises-ecma-configure\configure-13.png)
-
-## Start provisioning users
- 1. After on-demand provisioning is successful, change back to the provisioning configuration page. Ensure that the scope is set to only assigned users and groups, turn provisioning **On**, and select **Save**.
-
- ![Screenshot that shows Start provisioning.](.\media\on-premises-ecma-configure\configure-14.png)
- 1. Wait several minutes for provisioning to start. It might take up to 40 minutes. After the provisioning job has been completed, as described in the next section, you can change the provisioning status to **Off**, and select **Save**. This action stops the provisioning service from running in the future.
-
-## Check that users were successfully provisioned
-After waiting, check the SQL database to ensure users are being provisioned.
-
- ![Screenshot that shows checking that users are provisioned.](.\media\on-premises-ecma-configure\configure-15.png)
-
-## Appendix A
-Use the following SQL script to create the sample database.
-
-```SQL
---Creating the Database--
-Create Database CONTOSO
-Go
---Using the Database--
-Use [CONTOSO]
-Go
---
-/****** Object: Table [dbo].[Employees] Script Date: 1/6/2020 7:18:19 PM ******/
-SET ANSI_NULLS ON
-GO
-
-SET QUOTED_IDENTIFIER ON
-GO
-
-CREATE TABLE [dbo].[Employees](
- [ContosoLogin] [nvarchar](128) NULL,
- [FirstName] [nvarchar](50) NOT NULL,
- [LastName] [nvarchar](50) NOT NULL,
- [Email] [nvarchar](128) NULL,
- [InternalGUID] [uniqueidentifier] NULL,
- [AzureID] [uniqueidentifier] NULL,
- [textID] [nvarchar](128) NULL
-) ON [PRIMARY]
-GO
-
-ALTER TABLE [dbo].[Employees] ADD CONSTRAINT [DF_Employees_InternalGUID] DEFAULT (newid()) FOR [InternalGUID]
-GO
-
-```
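To experiment with this schema without a SQL Server instance, the same Employees table can be approximated in SQLite. This is a stand-in for illustration only: SQLite has no `uniqueidentifier` type, so GUIDs are stored as text here.

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Employees (
        ContosoLogin TEXT,
        FirstName    TEXT NOT NULL,
        LastName     TEXT NOT NULL,
        Email        TEXT,
        InternalGUID TEXT,
        AzureID      TEXT,
        textID       TEXT
    )
""")
# Insert one sample user, similar to what the connector would provision.
conn.execute(
    "INSERT INTO Employees (ContosoLogin, FirstName, LastName, Email, "
    "InternalGUID, AzureID, textID) VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("jdoe@contoso.com", "Jane", "Doe", "jdoe@contoso.com",
     str(uuid.uuid4()), str(uuid.uuid4()), "jdoe"),
)
row = conn.execute("SELECT FirstName, LastName FROM Employees").fetchone()
print(row)  # ('Jane', 'Doe')
```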
-- ## Next steps - [Troubleshoot on-premises application provisioning](on-premises-ecma-troubleshoot.md) - [Review known limitations](known-issues.md)-- [On-premises provisioning prerequisites](on-premises-ecma-prerequisites.md)-- [Review prerequisites for on-premises provisioning](on-premises-ecma-prerequisites.md)+
active-directory Howto Authentication Temporary Access Pass https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-authentication-temporary-access-pass.md
Previously updated : 08/11/2021 Last updated : 09/23/2021
Users can bootstrap Passwordless methods in one of two ways:
A Temporary Access Pass is a time-limited passcode issued by an admin that satisfies strong authentication requirements and can be used to onboard other authentication methods, including Passwordless ones. A Temporary Access Pass also makes recovery easier when a user has lost or forgotten their strong authentication factor like a FIDO2 security key or Microsoft Authenticator app, but needs to sign in to register new strong authentication methods. - This article shows you how to enable and use a Temporary Access Pass in Azure AD using the Azure portal. You can also perform these actions using the REST APIs.
The user is now signed in and can update or register a method such as FIDO2 secu
Users who update their authentication methods due to losing their credentials or device should make sure they remove the old authentication methods. Users can also continue to sign in by using their password; a TAP doesn't replace a user's password.
-Users can also use their Temporary Access Pass to register for Passwordless phone sign-in directly from the Authenticator app. For more information, see [Add your work or school account to the Microsoft Authenticator app](https://support.microsoft.com/account-billing/add-your-work-or-school-account-to-the-microsoft-authenticator-app-43a73ab5-b4e8-446d-9e54-2a4cb8e4e93c).
+### Passwordless phone sign-in
+
+Users can also use their Temporary Access Pass to register for Passwordless phone sign-in directly from the Authenticator app.
+For more information, see [Add your work or school account to the Microsoft Authenticator app](https://support.microsoft.com/account-billing/add-your-work-or-school-account-to-the-microsoft-authenticator-app-43a73ab5-b4e8-446d-9e54-2a4cb8e4e93c).
![Screenshot of how to enter a Temporary Access Pass using work or school account](./media/how-to-authentication-temporary-access-pass/enter-work-school.png)
-## Delete a Temporary Access Pass
+### Guest access
+
+Guest users can sign in to a resource tenant with a Temporary Access Pass that was issued by their home tenant if the Temporary Access Pass meets the home tenant authentication requirement.
+If MFA is required for the resource tenant, the guest user needs to perform MFA in order to gain access to the resource.
+
+### Expiration
+
+An expired or deleted Temporary Access Pass can't be used for interactive or non-interactive authentication.
+Users need to reauthenticate with different authentication methods after the Temporary Access Pass is expired or deleted.
+
+## Delete an expired Temporary Access Pass
-An expired Temporary Access Pass can't be used. Under the **Authentication methods** for a user, the **Detail** column shows when the Temporary Access Pass expired. You can delete an expired Temporary Access Pass using the following steps:
+Under the **Authentication methods** for a user, the **Detail** column shows when the Temporary Access Pass expired. You can delete an expired Temporary Access Pass using the following steps:
1. In the Azure AD portal, browse to **Users**, select a user, such as *Tap User*, then choose **Authentication methods**. 1. On the right-hand side of the **Temporary Access Pass (Preview)** authentication method shown in the list, select **Delete**.
For more information about NIST standards for onboarding and recovery, see [NIST
Keep these limitations in mind: - When using a one-time Temporary Access Pass to register a Passwordless method such as FIDO2 or Phone sign-in, the user must complete the registration within 10 minutes of sign-in with the one-time Temporary Access Pass. This limitation does not apply to a Temporary Access Pass that can be used more than once.-- Guest users can't sign in with a Temporary Access Pass. - Temporary Access Pass is in public preview and currently not available in Azure for US Government. - Users in scope for Self Service Password Reset (SSPR) registration policy *or* [Identity Protection Multi-factor authentication registration policy](../identity-protection/howto-identity-protection-configure-mfa-policy.md) will be required to register authentication methods after they have signed in with a Temporary Access Pass. Users in scope for these policies will get redirected to the [Interrupt mode of the combined registration](concept-registration-mfa-sspr-combined.md#combined-registration-modes). This experience does not currently support FIDO2 and Phone Sign-in registration.
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 9/17/2021 Last updated : 9/22/2021
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included**: A list of service plans in the product that correspond to the string ID and GUID - **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID
->[!NOTE]
->This information last updated on September 17th, 2021.
+>[!NOTE]
+>This information was last updated on September 22nd, 2021.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing%20v9_22_2021.csv).
+><br/>
| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) | | | | | | |
active-directory How To Connect Sync Endpoint Api V2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/how-to-connect-sync-endpoint-api-v2.md
Microsoft has deployed a new endpoint (API) for Azure AD Connect that improves t
> Currently, the new endpoint does not have a configured group size limit for Microsoft 365 groups that are written back. This may have an effect on your Active Directory and sync cycle latencies. It is recommended to increase your group sizes incrementally. >[!NOTE]
-> The Azure AD Connect sync V2 endpoint API is currently only available in these Azure environments:
+> The Azure AD Connect sync V2 endpoint API is Generally Available but currently can only be used in these Azure environments:
> - Azure Commercial > - Azure China cloud > - Azure US Government cloud
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
However, if you'd like all the latest features and updates, the best way to se
## 1.6.14.2 >[!NOTE] >This is an update release of Azure AD Connect. This version is intended to be used by customers who are running an older version of Windows Server and cannot upgrade their server to Windows Server 2016 or newer at this time. You cannot use this version to update an Azure AD Connect V2.0 server.
+>We will begin auto-upgrading eligible tenants when this version is available for download. Auto-upgrade will take a few weeks to complete.
### Release status 9/21/2021: Released for download and auto upgrade.
active-directory Concept Identity Protection Risks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/identity-protection/concept-identity-protection-risks.md
Previously updated : 08/30/2021 Last updated : 09/23/2021
Identity Protection provides organizations access to powerful resources to see a
Risk can be detected at the **User** and **Sign-in** level and two types of detection or calculation **Real-time** and **Offline**.
-Real-time detections may not show up in reporting for five to ten minutes. Offline detections may not show up in reporting for two to twenty-four hours.
+Real-time detections may not show up in reporting for 5 to 10 minutes. Offline detections may not show up in reporting for 48 hours.
### User-linked detections
-Risky activity can be detected for a user that is not linked to a specific malicious sign-in but to the user itself. These risk detections are calculated offline using Microsoft's internal and external threat intelligence sources including security researchers, law enforcement professionals, security teams at Microsoft, and other trusted sources.
+Risky activity can be detected for a user that isn't linked to a specific malicious sign-in but to the user itself. These risk detections are calculated offline using Microsoft's internal and external threat intelligence sources, like security researchers, law enforcement professionals, security teams at Microsoft, and other trusted sources.
These risks are calculated offline using Microsoft's internal and external threat intelligence sources including security researchers, law enforcement professionals, security teams at Microsoft, and other trusted sources.

| Risk detection | Description |
| --- | --- |
-| Leaked credentials | This risk detection type indicates that the user's valid credentials have been leaked. When cybercriminals compromise valid passwords of legitimate users, they often share those credentials. This sharing is typically done by posting publicly on the dark web, paste sites, or by trading and selling the credentials on the black market. When the Microsoft leaked credentials service acquires user credentials from the dark web, paste sites, or other sources, they are checked against Azure AD users' current valid credentials to find valid matches. For more information about leaked credentials, see [Common questions](#common-questions). |
+| Leaked credentials | This risk detection type indicates that the user's valid credentials have been leaked. When cybercriminals compromise valid passwords of legitimate users, they often share those credentials. This sharing is typically done by posting publicly on the dark web, paste sites, or by trading and selling the credentials on the black market. When the Microsoft leaked credentials service acquires user credentials from the dark web, paste sites, or other sources, they're checked against Azure AD users' current valid credentials to find valid matches. For more information about leaked credentials, see [Common questions](#common-questions). |
| Azure AD threat intelligence | This risk detection type indicates user activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |

### Sign-in risk
These risks can be calculated in real-time or calculated offline using Microsoft
| Risk detection | Detection type | Description |
| --- | --- | --- |
-| Anonymous IP address | Real-time | This risk detection type indicates sign-ins from an anonymous IP address (for example, Tor browser or anonymous VPN). These IP addresses are typically used by actors who want to hide their login telemetry (IP address, location, device, etc.) for potentially malicious intent. |
+| Anonymous IP address | Real-time | This risk detection type indicates sign-ins from an anonymous IP address (for example, Tor browser or anonymous VPN). These IP addresses are typically used by actors who want to hide their login telemetry (IP address, location, device, and so on) for potentially malicious intent. |
| Atypical travel | Offline | This risk detection type identifies two sign-ins originating from geographically distant locations, where at least one of the locations may also be atypical for the user, given past behavior. Among several other factors, this machine learning algorithm takes into account the time between the two sign-ins and the time it would have taken for the user to travel from the first location to the second, indicating that a different user is using the same credentials. <br><br> The algorithm ignores obvious "false positives" contributing to the impossible travel conditions, such as VPNs and locations regularly used by other users in the organization. The system has an initial learning period of the earliest of 14 days or 10 logins, during which it learns a new user's sign-in behavior. |
-| Anomalous Token | Offline | This detection indicates that there are abnormal characteristics in the token such as an unusual token lifetime or a token played from an unfamiliar location. This detection covers Session Tokens and Refresh Tokens. |
+| Anomalous Token | Offline | This detection indicates that there are abnormal characteristics in the token such as an unusual token lifetime or a token that is played from an unfamiliar location. This detection covers Session Tokens and Refresh Tokens. |
| Token Issuer Anomaly | Offline | This risk detection indicates the SAML token issuer for the associated SAML token is potentially compromised. The claims included in the token are unusual or match known attacker patterns. |
| Malware linked IP address | Offline | This risk detection type indicates sign-ins from IP addresses infected with malware that is known to actively communicate with a bot server. This detection is determined by correlating IP addresses of the user's device against IP addresses that were in contact with a bot server while the bot server was active. |
| Suspicious browser | Offline | Suspicious browser detection indicates anomalous behavior based on suspicious sign-in activity across multiple tenants from different countries in the same browser. |
-| Unfamiliar sign-in properties | Real-time | This risk detection type considers past sign-in history (IP, Latitude / Longitude and ASN) to look for anomalous sign-ins. The system stores information about previous locations used by a user, and considers these "familiar" locations. The risk detection is triggered when the sign-in occurs from a location that's not already in the list of familiar locations. Newly created users will be in "learning mode" for a period of time in which unfamiliar sign-in properties risk detections will be turned off while our algorithms learn the user's behavior. The learning mode duration is dynamic and depends on how much time it takes the algorithm to gather enough information about the user's sign-in patterns. The minimum duration is five days. A user can go back into learning mode after a long period of inactivity. The system also ignores sign-ins from familiar devices, and locations that are geographically close to a familiar location. <br><br> We also run this detection for basic authentication (or legacy protocols). Because these protocols do not have modern properties such as client ID, there is limited telemetry to reduce false positives. We recommend our customers to move to modern authentication. <br><br> Unfamiliar sign-in properties can be detected on both interactive and non-interactive sign-ins. When this detection is detected on non-interactive sign-ins it deserves increased scrutiny due to the risk of token replay attacks. |
+| Unfamiliar sign-in properties | Real-time | This risk detection type considers past sign-in history (IP, Latitude / Longitude and ASN) to look for anomalous sign-ins. The system stores information about previous locations used by a user, and considers these "familiar" locations. The risk detection is triggered when the sign-in occurs from a location that's not already in the list of familiar locations. Newly created users will be in "learning mode" for a while where unfamiliar sign-in properties risk detections will be turned off while our algorithms learn the user's behavior. The learning mode duration is dynamic and depends on how much time it takes the algorithm to gather enough information about the user's sign-in patterns. The minimum duration is five days. A user can go back into learning mode after a long period of inactivity. The system also ignores sign-ins from familiar devices, and locations that are geographically close to a familiar location. <br><br> We also run this detection for basic authentication (or legacy protocols). Because these protocols don't have modern properties such as client ID, there's limited telemetry to reduce false positives. We recommend our customers to move to modern authentication. <br><br> Unfamiliar sign-in properties can be detected on both interactive and non-interactive sign-ins. When this detection is detected on non-interactive sign-ins, it deserves increased scrutiny due to the risk of token replay attacks. |
| Admin confirmed user compromised | Offline | This detection indicates an admin has selected 'Confirm user compromised' in the Risky users UI or using riskyUsers API. To see which admin has confirmed this user compromised, check the user's risk history (via UI or API). |
| Malicious IP address | Offline | This detection indicates sign-in from a malicious IP address. An IP address is considered malicious based on high failure rates because of invalid credentials received from the IP address or other IP reputation sources. |
| Suspicious inbox manipulation rules | Offline | This detection is discovered by [Microsoft Cloud App Security (MCAS)](/cloud-app-security/anomaly-detection-policy#suspicious-inbox-manipulation-rules). This detection profiles your environment and triggers alerts when suspicious rules that delete or move messages or folders are set on a user's inbox. This detection may indicate that the user's account is compromised, that messages are being intentionally hidden, and that the mailbox is being used to distribute spam or malware in your organization. |
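The atypical travel detection in the table can be illustrated with a simple distance-over-time check. This is a rough sketch only; the real detection is a machine learning model, and the speed threshold, function names, and field names here are assumptions for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def is_atypical_travel(sign_in_a, sign_in_b, max_speed_kmh=900):
    """Flag a pair of sign-ins if the implied travel speed exceeds what an
    airliner could plausibly cover (assumed threshold; the real service
    also learns familiar locations and ignores known VPN egress points)."""
    distance = haversine_km(sign_in_a["lat"], sign_in_a["lon"],
                            sign_in_b["lat"], sign_in_b["lon"])
    hours = abs(sign_in_b["time"] - sign_in_a["time"]) / 3600
    if hours == 0:
        return distance > 0
    return distance / hours > max_speed_kmh

# Sign-in from New York, then from London 30 minutes later (~5,570 km apart).
a = {"lat": 40.71, "lon": -74.01, "time": 0}
b = {"lat": 51.51, "lon": -0.13, "time": 1800}
print(is_atypical_travel(a, b))  # True: implied speed far exceeds 900 km/h
```

The production algorithm additionally weighs familiar locations, VPN usage, and a 14-day/10-login learning period, which a pure speed check cannot capture.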
These risks can be calculated in real-time or calculated offline using Microsoft
| Risk detection | Detection type | Description |
| --- | --- | --- |
-| Additional risk detected | Real-time or Offline | This detection indicates that one of the above premium detections was detected. Since the premium detections are visible only to Azure AD Premium P2 customers, they are titled "additional risk detected" for customers without Azure AD Premium P2 licenses. |
+| Additional risk detected | Real-time or Offline | This detection indicates that one of the above premium detections was detected. Since the premium detections are visible only to Azure AD Premium P2 customers, they're titled "additional risk detected" for customers without Azure AD Premium P2 licenses. |
## Common questions

### Risk levels
-Identity Protection categorizes risk into three tiers: low, medium, and high. When configuring [custom Identity protection policies](./concept-identity-protection-policies.md#custom-conditional-access-policy), you can also configure it to trigger upon **No risk** level. No Risk means there is no active indication that the user's identity has been compromised.
+Identity Protection categorizes risk into three tiers: low, medium, and high. When configuring [custom Identity protection policies](./concept-identity-protection-policies.md#custom-conditional-access-policy), you can also configure it to trigger upon **No risk** level. No Risk means there's no active indication that the user's identity has been compromised.
-While Microsoft does not provide specific details about how risk is calculated, we will say that each level brings higher confidence that the user or sign-in is compromised. For example, something like one instance of unfamiliar sign-in properties for a user might not be as threatening as leaked credentials for another user.
+While Microsoft doesn't provide specific details about how risk is calculated, we'll say that each level brings higher confidence that the user or sign-in is compromised. For example, something like one instance of unfamiliar sign-in properties for a user might not be as threatening as leaked credentials for another user.
### Password hash synchronization
active-directory Howto Identity Protection Configure Notifications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/identity-protection/howto-identity-protection-configure-notifications.md
Previously updated : 11/09/2020 Last updated : 09/23/2021
Azure AD Identity Protection sends two types of automated notification emails to
This article provides you with an overview of both notification emails.
+We don't support sending emails to users in group-assigned roles.
+
## Users at risk detected email

In response to a detected account at risk, Azure AD Identity Protection generates an email alert with **Users at risk detected** as subject. The email includes a link to the **[Users flagged for risk](./overview-identity-protection.md)** report. As a best practice, you should immediately investigate the users at risk.
-The configuration for this alert allows you to specify at what user risk level you want the alert to be generated. The email will be generated when the user's risk level reaches what you have specified. For example, if you set the policy to alert on medium user risk and your user John's user risk score moves to medium risk due to a real-time sign-in risk, you will receive the users at risk detected email. If the user has subsequent risk detections that cause the user risk level calculation to be the specified risk level (or higher), you will receive additional user at risk detected emails when the user risk score is recalculated. For example, if a user moves to medium risk on January 1, you will receive an email notification if your settings are set to alert on medium risk. If that same user then has another risk detection on January 5 that's also medium risk, and the user risk score is recalculated and is still medium, you will receive another email notification.
+The configuration for this alert allows you to specify at what user risk level you want the alert to be generated. The email will be generated when the user's risk level reaches what you have specified. For example, if you set the policy to alert on medium user risk and your user John's user risk score moves to medium risk because of a real-time sign-in risk, you'll receive the users at risk detected email. If the user has subsequent risk detections that cause the user risk level calculation to be the specified risk level (or higher), you'll receive more user at risk detected emails when the user risk score is recalculated. For example, if a user moves to medium risk on January 1, you'll receive an email notification if your settings are set to alert on medium risk. If that same user then has another risk detection on January 5 that's also medium risk, and the user risk score is recalculated and is still medium, you'll receive another email notification.
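The threshold behavior described above amounts to comparing ordered risk levels, which can be sketched in a few lines. This is a hypothetical illustration; the level ordering and function names are assumptions, not the service's implementation:

```python
# Ordered risk levels, lowest to highest (assumed ordering for illustration).
RISK_ORDER = {"none": 0, "low": 1, "medium": 2, "high": 3}

def should_alert(user_risk_level, alert_threshold="medium"):
    """Return True when the recalculated user risk level reaches or
    exceeds the configured alert threshold."""
    return RISK_ORDER[user_risk_level] >= RISK_ORDER[alert_threshold]

print(should_alert("medium"))  # True: meets the configured threshold
print(should_alert("low"))     # False: below the threshold, no email
print(should_alert("high"))    # True: a later recalculation at or above threshold alerts again
```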
-However, an additional email notification will only be sent if the time the risk detection occurred (that caused the change in user risk level) is more recent than when the last email was sent. For example, a user signs in on January 1 at 5 AM and there is no real-time risk (meaning no email would be generated due to that sign-in). Ten minutes later, at 5:10 AM, the same user signs-in again and has high real-time risk, causing the user risk level to move to high and an email to be sent. Then, at 5:15 AM, the offline risk score for the original sign-in at 5 AM changes to high risk due to offline risk processing. An additional user flagged for risk e-mail would not be sent, since the time of the first sign-in was before the second sign-in that already triggered an email notification.
+However, an extra email notification will only be sent if the time the risk detection occurred (that caused the change in user risk level) is more recent than when the last email was sent. For example, a user signs in on January 1 at 5 AM and there's no real-time risk (meaning no email would be generated because of that sign-in). Ten minutes later, at 5:10 AM, the same user signs in again and has high real-time risk, causing the user risk level to move to high and an email to be sent. Then, at 5:15 AM, the offline risk score for the original sign-in at 5 AM changes to high risk because of offline risk processing. Another user flagged for risk email wouldn't be sent, since the time of the first sign-in was before the second sign-in that already triggered an email notification.
-To prevent an overload of e-mails, you will only receive one email within a 5-second time period. This delay means that if multiple users move to the specified risk level during the same 5-second time period, we will aggregate and send one e-mail to represent the change in risk level for all of them.
+To prevent an overload of emails, you'll only receive one email within a 5-second time period. This delay means that if multiple users move to the specified risk level during the same 5-second time period, we'll aggregate and send one email to represent the change in risk level for all of them.
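The five-second aggregation described above can be sketched as a simple windowing step. This is a minimal illustration under assumed data shapes; the field names and function are hypothetical, not the service's code:

```python
from itertools import groupby

def batch_notifications(events, window_seconds=5):
    """Group risk-level changes into one email per time window so that
    multiple users reaching the alert threshold in the same window
    produce a single aggregated notification."""
    window_key = lambda e: e["time"] // window_seconds
    emails = []
    for _, group in groupby(sorted(events, key=window_key), key=window_key):
        emails.append({"users": [e["user"] for e in group]})
    return emails

events = [
    {"user": "alice", "time": 100},
    {"user": "bob", "time": 103},    # same 5-second window as alice
    {"user": "carol", "time": 112},  # later window: separate email
]
print(batch_notifications(events))
# [{'users': ['alice', 'bob']}, {'users': ['carol']}]
```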
-If your organization has enabled self-remediation as described in the article, [User experiences with Azure AD Identity Protection](concept-identity-protection-user-experience.md) there is a chance that the user may remediate their risk before you have the opportunity to investigate. You can see risky users and risky sign-ins that have been remediated by adding "Remediated" to the **Risk state** filter in either the **Risky users** or **Risky sign-ins** reports.
+If your organization has enabled self-remediation as described in the article [User experiences with Azure AD Identity Protection](concept-identity-protection-user-experience.md), there's a chance that the user may remediate their risk before you have the opportunity to investigate. You can see risky users and risky sign-ins that have been remediated by adding "Remediated" to the **Risk state** filter in either the **Risky users** or **Risky sign-ins** reports.
![Users at risk detected email](./media/howto-identity-protection-configure-notifications/01.png)
active-directory Migrate Okta Sign On Policies To Azure Active Directory Conditional Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access.md
Title: Tutorial to migrate Okta sign on policies to Azure Active Directory Conditional Access
+ Title: Tutorial to migrate Okta sign-on policies to Azure Active Directory Conditional Access
-description: Learn how to migrate Okta sign on policies to Azure Active Directory Conditional Access
+description: In this tutorial, you learn how to migrate Okta sign-on policies to Azure Active Directory Conditional Access.
-# Tutorial: Migrate Okta sign on policies to Azure Active Directory Conditional Access
+# Tutorial: Migrate Okta sign-on policies to Azure AD Conditional Access
-In this tutorial, learn how organizations can migrate from global or application-level sign-on policies in Okta to Azure Active Directory (AD) Conditional Access (CA) policies to secure user access in Azure AD and connected applications.
+In this tutorial, you'll learn how your organization can migrate from global or application-level sign-on policies in Okta to Azure Active Directory (Azure AD) Conditional Access policies to secure user access in Azure AD and connected applications.
-This tutorial assumes you have an Office 365 tenant federated to Okta for sign-on and multifactor authentication (MFA). You should also have Azure AD Connect server or Azure AD Connect cloud provisioning agents configured for user provisioning to Azure AD.
+This tutorial assumes you have an Office 365 tenant federated to Okta for sign-in and multi-factor authentication (MFA). You should also have Azure AD Connect server or Azure AD Connect cloud provisioning agents configured for user provisioning to Azure AD.
## Prerequisites
-When switching from Okta sign on to Azure AD CA, it's important to understand licensing requirements. Azure AD CA requires users have an Azure AD Premium P1 License assigned before registration for Azure AD Multi-Factor Authentication.
+When you switch from Okta sign-on to Azure AD Conditional Access, it's important to understand licensing requirements. Azure AD Conditional Access requires users to have an Azure AD Premium P1 License assigned before registration for Azure AD Multi-Factor Authentication.
-Before you do any of the steps for hybrid Azure AD join, you'll need an enterprise administrator credential in the on-premises forest to configure the Service Connection Point (SCP) record.
+Before you do any of the steps for Hybrid Azure AD Join, you'll need an enterprise administrator credential in the on-premises forest to configure the service connection point (SCP) record.
-## Step 1 - Catalog current Okta sign on policies
+## Catalog current Okta sign-on policies
-To complete a successful transition to CA, the existing Okta sign on policies should be evaluated to determine use cases and requirements that will be transitioned to Azure AD.
+To complete a successful transition to Conditional Access, evaluate the existing Okta sign-on policies to determine use cases and requirements that will be transitioned to Azure AD.
-1. Check the global sign-on policies by navigating to **Security**, selecting **Authentication**, and then **Sign On**.
+1. Check the global sign-on policies by selecting **Security** > **Authentication** > **Sign On**.
- ![image shows global sign on policies](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/global-sign-on-policies.png)
+ ![Screenshot that shows global sign-on policies.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/global-sign-on-policies.png)
- In this example, our global sign-on policy is enforcing MFA on all sessions outside of our configured network zones.
+ In this example, the global sign-on policy enforces MFA on all sessions outside of our configured network zones.
- ![image shows global sign on policies enforc mfa](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/global-sign-on-policies-enforce-mfa.png)
+ ![Screenshot that shows global sign-on policies enforcing MFA.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/global-sign-on-policies-enforce-mfa.png)
-2. Next, navigate to **Applications**, and check the application-level sign-on policies. Select **Applications** from the submenu, and then select your Office 365 connected instance from the **Active apps list**.
+1. Go to **Applications**, and check the application-level sign-on policies. Select **Applications** from the submenu, and then select your Office 365 connected instance from the **Active apps list**.
-3. Finally, select **Sign On** and scroll to the bottom of the page.
+1. Select **Sign On** and scroll to the bottom of the page.
-In the following example, our Office 365 application sign-on policy has four separate rules.
+ In the following example, the Office 365 application sign-on policy has four separate rules:
-- **Enforce MFA for mobile sessions** - Requires MFA from every modern authentication or browser session on iOS or Android.
+ - **Enforce MFA for Mobile Sessions**: Requires MFA from every modern authentication or browser session on iOS or Android.
+ - **Allow Trusted Windows Devices**: Prevents your trusted Okta devices from being prompted for more verification or factors.
+ - **Require MFA from Untrusted Windows Devices**: Requires MFA from every modern authentication or browser session on untrusted Windows devices.
+ - **Block Legacy Authentication**: Prevents any legacy authentication clients from connecting to the service.
-- **Allow trusted Windows devices** - Prevents your trusted Okta devices from being prompted for additional verification or factors.
+ ![Screenshot that shows Office 365 sign-on rules.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/sign-on-rules.png)
-- **Require MFA from untrusted Windows devices** - Requires MFA from every modern authentication or browser session on untrusted Windows devices.
+## Configure condition prerequisites
-- **Block legacy authentication** - Prevents any legacy authentication clients from connecting to the service.
+Azure AD Conditional Access policies can be configured to match Okta's conditions for most scenarios without more configuration.
- ![image shows o365 sign on rules](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/sign-on-rules.png)
+In some scenarios, you might need more setup before you configure the Conditional Access policies. The two known scenarios at the time of writing this article are:
-## Step 2 - Configure condition pre-requisites
+- **Okta network locations to named locations in Azure AD**: Follow the instructions in [Using the location condition in a Conditional Access policy](../conditional-access/location-condition.md#named-locations) to configure named locations in Azure AD.
+- **Okta device trust to device-based CA**: Conditional Access offers two possible options when you evaluate a user's device:
-Azure AD CA policies can be configured to match Okta's conditions for most scenarios without additional configuration.
+ - [Use Hybrid Azure AD Join](#hybrid-azure-ad-join-configuration), which is a feature enabled within the Azure AD Connect server that synchronizes Windows current devices, such as Windows 10, Windows Server 2016, and Windows Server 2019, to Azure AD.
+ - [Enroll the device in Endpoint Manager](#configure-device-compliance), and assign a compliance policy.
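The two device options above amount to an either/or check during policy evaluation, which can be sketched as follows. The field names are hypothetical; Conditional Access evaluates these device signals internally:

```python
def device_satisfies_policy(device):
    """A device passes a device-based Conditional Access condition when it's
    either hybrid Azure AD joined or marked compliant by Endpoint Manager
    (simplified either/or sketch; real policies can require one or both)."""
    return device.get("hybrid_joined", False) or device.get("compliant", False)

print(device_satisfies_policy({"hybrid_joined": True}))  # True: domain-joined and synced
print(device_satisfies_policy({"compliant": True}))      # True: enrolled and compliant
print(device_satisfies_policy({}))                       # False: unmanaged device
```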
-In some scenarios, you may need additional setup before you configure the CA policies. The two known scenarios at the time of writing this article are:
+### Hybrid Azure AD Join configuration
-- **Okta network locations to named locations in Azure AD** - Follow [this article](../conditional-access/location-condition.md#named-locations) to configure named locations in Azure AD.
-
-- **Okta device trust to device-based CA** - CA offers two possible options when evaluating a user's device.
-
- - [Hybrid Azure AD join](#hybrid-azure-ad-join-configuration) - A feature enabled within the Azure AD Connect server that synchronizes Windows current devices such as Windows 10, Server 2016 and 2019, to Azure AD.
-
- - [Enroll the device into Microsoft Endpoint Manager](#configure-device-compliance) and assign a compliance policy.
-
-### Hybrid Azure AD join configuration
-
-Enabling hybrid Azure AD join can be done on your Azure AD Connect server by running the configuration wizard. Post configuration, steps will need to be taken to automatically enroll devices.
+To enable Hybrid Azure AD Join on your Azure AD Connect server, run the configuration wizard. You'll need to take steps post-configuration to automatically enroll devices.
>[!NOTE]
->Hybrid Azure AD join isn't supported with the Azure AD Connect cloud provisioning agents.
+>Hybrid Azure AD Join isn't supported with the Azure AD Connect cloud provisioning agents.
-1. Follow these [instructions](../devices/hybrid-azuread-join-managed-domains.md#configure-hybrid-azure-ad-join) to enable Hybrid Azure AD join.
+1. To enable Hybrid Azure AD Join, follow these [instructions](../devices/hybrid-azuread-join-managed-domains.md#configure-hybrid-azure-ad-join).
-2. On the SCP configuration page, select the **Authentication Service** drop-down. Choose your Okta federation provider URL followed by **Add**. Enter your on-premises enterprise administrator credentials then select **Next**.
+1. On the **SCP configuration** page, select the **Authentication Service** dropdown. Choose your Okta federation provider URL, and select **Add**. Enter your on-premises enterprise administrator credentials, and then select **Next**.
- ![image shows scp configuration](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/scp-configuration.png)
+ ![Screenshot that shows SCP configuration.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/scp-configuration.png)
-3. If you have blocked legacy authentication on Windows clients in either the global or app level sign on policy, make a rule to allow the hybrid Azure AD join process to finish.
+1. If you've blocked legacy authentication on Windows clients in either the global or app-level sign-on policy, make a rule to allow the Hybrid Azure AD Join process to finish.
-4. You can either allow the entire legacy authentication stack through for all Windows clients or contact Okta support to enable their custom client string on your existing app policies.
+1. Allow the entire legacy authentication stack through for all Windows clients. You can also contact Okta support to enable its custom client string on your existing app policies.
### Configure device compliance
-While hybrid Azure AD join is direct replacement for Okta device trust on Windows, CA policies can also look at device compliance for devices that have fully enrolled into Microsoft Endpoint Manager.
--- **Compliance overview** - Refer to [device compliance policies in Microsoft Intune](/mem/intune/protect/device-compliance-get-started#:~:text=Reference%20for%20non-compliance%20and%20Conditional%20Access%20on%20the,applicable%20%20...%20%203%20more%20rows).--- **Device compliance** - Create [policies in Microsoft Intune](/mem/intune/protect/create-compliance-policy).
+Hybrid Azure AD Join is a direct replacement for Okta device trust on Windows. Conditional Access policies can also look at device compliance for devices that have fully enrolled in Endpoint Manager.
-- **Windows enrollment** - If you've opted to deploy hybrid Azure AD join, an additional group policy can be deployed to complete the [auto-enrollment process of these devices into Microsoft Intune](/windows/client-management/mdm/enroll-a-windows-10-device-automatically-using-group-policy).
+- **Compliance overview**: Refer to [device compliance policies in Intune](/mem/intune/protect/device-compliance-get-started#:~:text=Reference%20for%20non-compliance%20and%20Conditional%20Access%20on%20the,applicable%20%20...%20%203%20more%20rows).
+- **Device compliance**: Create [policies in Intune](/mem/intune/protect/create-compliance-policy).
+- **Windows enrollment**: If you've opted to deploy Hybrid Azure AD Join, you can deploy another group policy to complete the [auto-enrollment process of these devices in Intune](/windows/client-management/mdm/enroll-a-windows-10-device-automatically-using-group-policy).
+- **iOS/iPadOS enrollment**: Before you enroll an iOS device, you must make [more configurations](/mem/intune/enrollment/ios-enroll) in the Endpoint Management console.
+- **Android enrollment**: Before you enroll an Android device, you must make [more configurations](/mem/intune/enrollment/android-enroll) in the Endpoint Management console.
-- **iOS/iPadOS enrollment** - Before enrolling an iOS device, [additional configurations](/mem/intune/enrollment/ios-enroll) must be made in the Endpoint Management Console.
+## Configure Azure AD Multi-Factor Authentication tenant settings
-- **Android enrollment** - Before enrolling an Android device, [additional configurations](/mem/intune/enrollment/android-enroll) must be made in the Endpoint Management Console.
+Before you convert to Conditional Access, confirm the base Azure AD Multi-Factor Authentication tenant settings for your organization.
-## Step 3 - Configure Azure AD Multi-Factor Authentication tenant settings
+1. Go to the [Azure portal](https://portal.azure.com), and sign in with a global administrator account.
-Before converting to CA, confirm the base Azure AD Multi-Factor Authentication tenant settings for your organization.
+1. Select **Azure Active Directory** > **Users** > **Multi-Factor Authentication** to go to the legacy Azure AD Multi-Factor Authentication portal.
-1. Navigate to the [Azure portal](https://portal.azure.com) and sign in with a global administrator account.
+ ![Screenshot that shows the legacy Azure AD Multi-Factor Authentication portal.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/legacy-azure-ad-portal.png)
-2. Select **Azure Active Directory**, followed by **Users**, and then **Multi-Factor Authentication** this will take you to the Legacy Azure MFA portal.
+ You can also use the legacy link to the [Azure AD Multi-Factor Authentication portal](https://aka.ms/mfaportal).
- ![image shows legacy Azure AD Multi-Factor Authentication portal](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/legacy-azure-ad-portal.png)
+1. On the legacy **multi-factor authentication** menu, set the status view to **Enabled** and then to **Enforced** to confirm that no users are enabled for legacy MFA. If your tenant has users in these views, you must disable them in the legacy menu. Only then will Conditional Access policies take effect on their accounts.
-Instead, you can use **<https://aka.ms/mfaportal>**.
+ ![Screenshot that shows disabling a user in the legacy Azure AD Multi-Factor Authentication portal.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/disable-user-legacy-azure-ad-portal.png)
-4. From the **Legacy Azure MFA** menu, change the status menu through **enabled** and **enforced** to confirm you have no users enabled for Legacy MFA. If your tenant has users in the below views, you must disable them in the legacy menu. Only then CA policies will take effect on their account.
+ The **Enforced** field should also be empty.
- ![image shows disable user in legacy Azure AD Multi-Factor Authentication portal](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/disable-user-legacy-azure-ad-portal.png)
+ ![Screenshot that shows the Enforced field is empty in the legacy Azure AD Multi-Factor Authentication portal.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/enforced-empty-legacy-azure-ad-portal.png)
- **Enforced** field should also be empty.
+1. Select the **Service settings** option. Change the **App passwords** selection to **Do not allow users to create app passwords to sign in to non-browser apps**.
- ![image shows enforced field is empty in legacy Azure AD Multi-Factor Authentication portal](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/enforced-empty-legacy-azure-ad-portal.png)
-
-5. After confirming no users are configured for legacy MFA, select the **Service settings** option. Change the **App passwords** selection to **Do not allow users to create app passwords to sign in to non-browser apps**.
-
-6. Ensure the **Skip multi-factor authentication for requests from federated users on my intranet** and **Allow users to remember multi-factor authentication on devices they trust (between one to 365 days)** boxes are unchecked and then select **Save**.
+1. Ensure the **Skip multi-factor authentication for requests from federated users on my intranet** and **Allow users to remember multi-factor authentication on devices they trust (between one to 365 days)** checkboxes are cleared, and then select **Save**.
>[!NOTE]
- >See [best practices for configuring MFA prompt settings](../authentication/concepts-azure-multi-factor-authentication-prompts-session-lifetime.md).
+ >See [best practices for configuring the MFA prompt settings](../authentication/concepts-azure-multi-factor-authentication-prompts-session-lifetime.md).
- ![image shows uncheck fields in legacy Azure AD Multi-Factor Authentication portal](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/uncheck-fields-legacy-azure-ad-portal.png)
+ ![Screenshot that shows cleared checkboxes in the legacy Azure AD Multi-Factor Authentication portal.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/uncheck-fields-legacy-azure-ad-portal.png)
-## Step 4 - Configure CA policies
+## Configure Conditional Access policies
-After you configured the pre-requisites, and established the base settings its time to build the first CA policy.
+After you've configured the prerequisites and established the base settings, it's time to build the first Conditional Access policy.
-1. To configure CA policies in Azure AD, navigate to the [Azure portal](https://portal.azure.com). Select **View** on Manage Azure Active Directory.
+1. To configure Conditional Access policies in Azure AD, go to the [Azure portal](https://portal.azure.com). On **Manage Azure Active Directory**, select **View**.
-2. Configuration of CA policies should keep in mind [best
-practices for deploying and designing CA](../conditional-access/plan-conditional-access.md#understand-conditional-access-policy-components).
+ Configure Conditional Access policies by following [best
+practices for deploying and designing Conditional Access](../conditional-access/plan-conditional-access.md#understand-conditional-access-policy-components).
-3. To mimic global sign-on MFA policy from Okta, [create a policy](../conditional-access/howto-conditional-access-policy-all-users-mfa.md).
+1. To mimic the global sign-on MFA policy from Okta, [create a policy](../conditional-access/howto-conditional-access-policy-all-users-mfa.md).
-4. Create a [device trust based CA rule](../conditional-access/require-managed-devices.md).
+1. Create a [device trust-based Conditional Access rule](../conditional-access/require-managed-devices.md).
-5. This policy as any other in this tutorial can be targeted to a specific application, test group of users or both.
+ This policy, like any other in this tutorial, can be targeted to a specific application, a test group of users, or both.
- ![image shows testing user](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/test-user.png)
+ ![Screenshot that shows testing a user.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/test-user.png)
- ![image shows success in testing user](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/success-test-user.png)
+ ![Screenshot that shows success in testing a user.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/success-test-user.png)
-6. After you configured the location-based policy, and device
-trust policy, its time to configure the equivalent [**Block legacy authentication**](../conditional-access/howto-conditional-access-policy-block-legacy.md) policy.
+1. After you've configured the location-based policy and device trust policy, it's time to configure the equivalent [block legacy authentication](../conditional-access/howto-conditional-access-policy-block-legacy.md) policy.
-With these three CA policies, the original Okta sign on policies experience has been replicated in Azure AD. Next steps involve enrolling the user to Azure MFA and testing the policies.
+With these three Conditional Access policies, the original Okta sign-on policies experience has been replicated in Azure AD. The next steps are to enroll users in Azure AD Multi-Factor Authentication and test the policies.
-## Step 5 - Enroll pilot members in Azure AD Multi-Factor Authentication
+## Enroll pilot members in Azure AD Multi-Factor Authentication
-Once the CA policies have been configured, users will
-need to register for Azure MFA methods. Users can be required to register through several different methods.
+After you configure the Conditional Access policies, users must register for Azure Multi-Factor Authentication methods. Users can be required to register through several different methods.
-1. For individual registration, you can direct users to
-<https://aka.ms/mfasetup> to manually enter the registration information.
+1. For individual registration, direct users to the [Microsoft Sign-in pane](https://aka.ms/mfasetup) to manually enter the registration information.
-2. User can go to <https://aka.ms/mysecurityinfo> to
-enter information or manage form of MFA registration.
+1. Users can go to the [Microsoft Security info page](https://aka.ms/mysecurityinfo) to enter information or manage the form of MFA registration.
See [this guide](../authentication/howto-registration-mfa-sspr-combined.md) to fully understand the MFA registration process.
-Navigate to <https://aka.ms/mfasetup> after signing in with Okta MFA, you're instructed to register for MFA with Azure AD.
+Go to the [Microsoft Sign-in pane](https://aka.ms/mfasetup). After you sign in with Okta MFA, you're instructed to register for MFA with Azure AD.
>[!NOTE]
->If registration already happened in the past for that user,
-they'll be taken to **My Security** information page after satisfying the MFA prompt.
+>If registration already happened in the past for a user, they're taken to the **My Security** information page after they satisfy the MFA prompt.
-See the [end-user documentation for MFA enrollment](../user-help/security-info-setup-signin.md).
+See the [user documentation for MFA enrollment](../user-help/security-info-setup-signin.md).
-## Step 6 - Enable CA policies
+## Enable Conditional Access policies
1. To roll out testing, change the policies created in the earlier examples to **Enabled test user login**.
- ![image shows enable test user](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/enable-test-user.png)
+ ![Screenshot that shows enabling a test user.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/enable-test-user.png)
-2. On the next sign-in to Office 365, the test user John Smith is prompted to sign in with Okta MFA, and Azure AD Multi-Factor Authentication.
+1. At the next sign-in to Office 365, the test user John Smith is prompted to sign in with Okta MFA and Azure AD Multi-Factor Authentication.
- ![image shows sign-in through okta](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/sign-in-through-okta.png)
+ ![Screenshot that shows the Azure Sign-In pane.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/sign-in-through-okta.png)
-3. Complete the MFA verification through Okta.
+1. Complete the MFA verification through Okta.
- ![image shows mfa verification through okta](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-verification-through-okta.png)
+ ![Screenshot that shows MFA verification through Okta.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-verification-through-okta.png)
-4. After the user completes the Okta MFA prompt, the user will be prompted for CA. Ensure that the policies have been configured appropriately and is within conditions to be triggered for MFA.
+1. After the user completes the Okta MFA prompt, the user is prompted for Conditional Access. Ensure that the policies were configured appropriately and are within conditions to be triggered for MFA.
- ![image shows mfa verification through okta prompted for CA](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-verification-through-okta-prompted-ca.png)
+ ![Screenshot that shows MFA verification through Okta prompted for Conditional Access.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-verification-through-okta-prompted-ca.png)
-## Step 7 - Cutover from sign on to CA policies
+## Cut over from sign-on to Conditional Access policies
-After conducting thorough testing on the pilot members to ensure that CA is in effect as expected, the remaining organization members can be added into CA policies after registration has been completed.
+After you conduct thorough testing on the pilot members to ensure that Conditional Access is in effect as expected, the remaining organization members can be added to Conditional Access policies after registration has been completed.
-To avoid double-prompting between Azure MFA and Okta MFA, you should opt out from Okta MFA by modifying sign-on policies.
+To avoid double-prompting between Azure Multi-Factor Authentication and Okta MFA, opt out from Okta MFA by modifying sign-on policies.
-The final migration step to CA can be done in a staged or cut-over fashion.
+The final migration step to Conditional Access can be done in a staged or cut-over fashion.
-1. Navigate to the Okta admin console, select **Security**, followed by **Authentication**, and then navigate to the **Sign On Policy**.
+1. Go to the Okta admin console, select **Security** > **Authentication**, and then go to **Sign-on Policy**.
->[!NOTE]
->Global policies should be set to inactive only if all applications from Okta are protected by their own application sign on policies.
+ >[!NOTE]
+ > Set global policies to **Inactive** only if all applications from Okta are protected by their own application sign-on policies.
-2. Set the Enforce MFA policy to **Inactive** or assign the policy to a new group that doesn't include our Azure AD users.
+1. Set the **Enforce MFA** policy to **Inactive**. You can also assign the policy to a new group that doesn't include the Azure AD users.
- ![image shows mfa policy to inactive](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-policy-inactive.png)
+ ![Screenshot that shows Global MFA Sign On Policy as Inactive.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/mfa-policy-inactive.png)
-3. On the application-level sign-on policy, update the policies to inactive by selecting the **Disable Rule** option. You can also assign the policy to a new group that doesn't include the Azure AD users.
+1. On the application-level sign-on policy pane, update the policies to **Inactive** by selecting the **Disable Rule** option. You can also assign the policy to a new group that doesn't include the Azure AD users.
-4. Ensure there is at least one application level sign-on policy that is enabled for the application that allows access without MFA.
+1. Ensure there's at least one application-level sign-on policy that's enabled for the application that allows access without MFA.
- ![image shows application access without mfa](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/application-access-without-mfa.png)
+ ![Screenshot that shows application access without MFA.](media/migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access/application-access-without-mfa.png)
-5. After disabling the Okta sign on policies, or excluding the migrated Azure AD users from the enforcement groups, the users should be prompted **only** for CA on their next sign-in.
+1. After you disable the Okta sign-on policies or exclude the migrated Azure AD users from the enforcement groups, users are prompted *only* for Conditional Access the next time they sign in.
## Next steps
-- [Migrate applications from Okta to Azure AD](migrate-applications-from-okta-to-azure-active-directory.md)
+For more information about migrating from Okta to Azure AD, see:
+- [Migrate applications from Okta to Azure AD](migrate-applications-from-okta-to-azure-active-directory.md)
- [Migrate Okta federation to Azure AD](migrate-okta-federation-to-azure-active-directory.md)
-- [Migrate Okta sync provisioning to Azure AD Connect based synchronization](migrate-okta-sync-provisioning-to-azure-active-directory.md)
+- [Migrate Okta sync provisioning to Azure AD Connect-based synchronization](migrate-okta-sync-provisioning-to-azure-active-directory.md)
active-directory Migrate Okta Sync Provisioning To Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-okta-sync-provisioning-to-azure-active-directory.md
Title: Tutorial to migrate Okta sync provisioning to Azure AD Connect based synchronization
+ Title: Tutorial to migrate Okta sync provisioning to Azure AD Connect-based synchronization
-description: Learn how to migrate your Okta sync provisioning to Azure AD Connect based synchronization
+description: In this tutorial, you learn how to migrate your Okta sync provisioning to Azure AD Connect-based synchronization.
-# Tutorial: Migrate Okta sync provisioning to Azure Active Directory Connect based synchronization
+# Tutorial: Migrate Okta sync provisioning to Azure AD Connect-based synchronization
-This article will guide organizations who currently use User provisioning from Okta to Azure Active Directory (Azure AD), migrate either User sync, or Universal sync to Azure AD Connect. This will enable further provisioning into Azure AD and Office 365.
+In this tutorial, you'll learn how organizations that currently use Okta user provisioning to Azure Active Directory (Azure AD) can migrate either User sync or Universal sync to Azure AD Connect. This migration enables further provisioning into Azure AD and Office 365.
-Migrating synchronization platforms isn't a small change. Each step of the process mentioned in this article should be validated against your own environment before you remove the Azure AD Connect from staging mode or enable the Azure AD cloud provisioning agent.
+Migrating synchronization platforms isn't a small change. Each step of the process mentioned in this article should be validated against your own environment before you remove Azure AD Connect from staging mode or enable the Azure AD cloud provisioning agent.
## Prerequisites
-When switching from Okta provisioning to Azure AD, customers have
-two choices, either Azure AD Connect Server, or Azure AD cloud
-provisioning. It is recommended to read the full [comparison article from Microsoft](../cloud-sync/what-is-cloud-sync.md#comparison-between-azure-ad-connect-and-cloud-sync) to understand the differences between the two products.
+When you switch from Okta provisioning to Azure AD, you have two choices. You can use either an Azure AD Connect server or Azure AD cloud provisioning. To understand the differences between the two, read the [comparison article from Microsoft](../cloud-sync/what-is-cloud-sync.md#comparison-between-azure-ad-connect-and-cloud-sync).
-Azure AD cloud provisioning will be most familiar migration path for Okta customers using Universal or User sync. The cloud provisioning agents are lightweight, and can be installed on or near domain controllers like the Okta directory sync agents. It is not recommended to install them on the same server.
+Azure AD cloud provisioning will be the most familiar migration path for Okta customers who use Universal or User sync. The cloud provisioning agents are lightweight. They can be installed on or near domain controllers like the Okta directory sync agents. Don't install them on the same server.
-Azure AD Connect server should be chosen if your organization needs to take advantage of any of the following technologies when synchronizing users.
--- Device synchronization - Hybrid Azure AD join or Hello for
- Business
+Use an Azure AD Connect server if your organization needs to take advantage of any of the following technologies when you synchronize users:
+- Device synchronization: Hybrid Azure AD join or Hello for Business
- Passthrough authentication
-- More than 150k object support
+- More than 150,000 object support
- Support for writeback

>[!NOTE]
->All pre-requisites should be taken into consideration when installing Azure AD Connect or Azure AD cloud provisioning. Refer to [this article to learn more](../hybrid/how-to-connect-install-prerequisites.md) before installation.
+>All prerequisites should be taken into consideration when you install Azure AD Connect or Azure AD cloud provisioning. To learn more before you continue with installation, see [Prerequisites for Azure AD Connect](../hybrid/how-to-connect-install-prerequisites.md).
-## Step 1 - Confirm ImmutableID attribute synchronized by Okta
+## Confirm ImmutableID attribute synchronized by Okta
-ImmutableID is the core attribute used to tie synchronized objects to their on-premises counterparts. Okta takes the Active Directory objectGUID of an on-premises object and converts it to a Base64 encoded string. Then, by default stamps that string to the ImmutableID field in Azure AD.
+ImmutableID is the core attribute used to tie synchronized objects to their on-premises counterparts. Okta takes the Active Directory objectGUID of an on-premises object and converts it to a Base64 encoded string. Then, by default it stamps that string to the ImmutableID field in Azure AD.
-You can connect to Azure AD PowerShell and examine the current ImmutableID value. If you've never used the Azure AD PowerShell module, run an
-`Install-Module AzureAD` in an administrative PowerShell session before you run the following commands.
+You can connect to Azure AD PowerShell and examine the current ImmutableID value. If you've never used the Azure AD PowerShell module, run
+`Install-Module AzureAD` in an administrative PowerShell session before you run the following commands:
```powershell
Import-Module AzureAD
Connect-AzureAD
```
-In case you already have the module, you may receive a warning to update to the latest version if it is out of date.
+If you already have the module, you might receive a warning to update to the latest version if it's out of date.
After the module is installed, import it, and follow these steps to connect to the Azure AD service:

1. Enter your global administrator credentials in the modern authentication window.
- ![image shows import module](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/import-module.png)
+ ![Screenshot that shows import-module.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/import-module.png)
-2. After connecting to the tenant, verify what your ImmutableID's are set as. The example shown is using Okta defaults of objectGUID to ImmutableID.
+1. After you connect to the tenant, verify the settings for your ImmutableID values. The example shown uses Okta defaults of objectGUID to ImmutableID.
- ![image shows Okta defaults of objectGUID to ImmutableID](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/okta-default-objectid.png)
+ ![Screenshot that shows Okta defaults of objectGUID to ImmutableID.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/okta-default-objectid.png)
-3. There are several ways to manually confirm the objectGUID to Base64 conversion on-premises, for individual validation use this example:
+1. There are several ways to manually confirm the objectGUID to Base64 conversion on-premises. For individual validation, use this example:
```PowerShell
Get-ADUser onpremupn | fl objectguid

[system.convert]::ToBase64String(([GUID]$objectGUID).ToByteArray())
```
- ![image shows how manually change Okta objectGUID to ImmutableID](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/manual-objectguid.png)
+ ![Screenshot that shows how to manually change Okta objectGUID to ImmutableID.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/manual-objectguid.png)
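The same conversion can be spot-checked outside PowerShell. The following is a minimal Python sketch (an illustration, not part of the official tooling); it relies on the fact that .NET's `Guid.ToByteArray()` stores the first three GUID fields little-endian, a layout that Python exposes as `UUID.bytes_le`:

```python
import base64
import uuid

def immutable_id(object_guid: str) -> str:
    # bytes_le matches .NET Guid.ToByteArray(): the first three GUID
    # fields are little-endian, the rest big-endian.
    return base64.b64encode(uuid.UUID(object_guid).bytes_le).decode("ascii")

# The all-zero GUID maps to 16 zero bytes, so its ImmutableID is predictable.
print(immutable_id("00000000-0000-0000-0000-000000000000"))
# AAAAAAAAAAAAAAAAAAAAAA==
```

If the value you compute this way doesn't match what's stamped in Azure AD, the tenant isn't using the objectGUID default.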
-## Step 2 - Mass validation methods for objectGUID
+## Mass validation methods for objectGUID
-Before cutting over to Azure AD Connect, it's critical to validate that the ImmutableID's in Azure AD are going to exactly match their on-premises values.
+Before you cut over to Azure AD Connect, it's critical to validate that the ImmutableID values in Azure AD are going to exactly match their on-premises values.
-The example will grab **all** on-premises AD users, and export a list of their objectGUID's and ImmutableID's already calculated to a CSV file.
+The example will grab *all* on-premises Active Directory users and export a list of their objectGUID values and already-calculated ImmutableID values to a CSV file.
-1. Run these commands in PowerShell on a domain controller on-premises.
+1. Run these commands in PowerShell on a domain controller on-premises:
```PowerShell
# Export each user's objectGUID and calculated ImmutableID
Get-ADUser -Filter * -Properties objectGUID |
    Select-Object UserPrincipalName, objectGUID,
        @{Name = 'ImmutableID'; Expression = {
            [system.convert]::ToBase64String(([GUID]$_.objectGUID).ToByteArray())
        }} |
    Export-Csv C:\\Temp\\OnPremIDs.csv
```
- ![image shows domain controller on-premises commands](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/domain-controller.png)
+ ![Screenshot that shows domain controller on-premises commands.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/domain-controller.png)
-2. Run these commands in an Azure AD PowerShell session to gather the already synchronized values:
+1. Run these commands in an Azure AD PowerShell session to gather the already synchronized values:
```powershell
Get-AzureADUser -All $true |
    Select-Object UserPrincipalName,
    ImmutableID | Export-Csv C:\\temp\\AzureADSyncedIDS.csv
```
- ![image shows azure ad powershell session](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/azure-ad-powershell.png)
+ ![Screenshot that shows an Azure AD PowerShell session.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/azure-ad-powershell.png)
- Once you have both exports, confirm that the ImmutableID for each user matches.
+ After you have both exports, confirm that the ImmutableID for each user matches.
>[!IMPORTANT]
- >If your ImmutableIDs in the cloud don;t match objectGUID values, you've modified the defaults for Okta sync. You've
- likely chosen another attribute to determine ImmutableIDs. Before moving onto the next section, it's critical to identify which source attribute is populating ImmutableID's. Ensure that you update the attribute Okta is syncing before disabling Okta sync.
+ >If your ImmutableID values in the cloud don't match objectGUID values, you've modified the defaults for Okta sync. You've likely chosen another attribute to determine ImmutableID values. Before you move on to the next section, it's critical to identify which source attribute is populating ImmutableID values. Ensure that you update the attribute Okta is syncing before you disable Okta sync.
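Rather than comparing the two exports by eye, you can diff them with a short script. The sketch below is Python and assumes the column names `UserPrincipalName` and `ImmutableID`; adjust them to match the actual headers in your CSV files:

```python
import csv

def mismatched_ids(onprem_csv: str, cloud_csv: str) -> list:
    """Return UPNs present in both exports whose ImmutableID differs."""
    def load(path):
        with open(path, newline="") as f:
            # Assumed column names; adjust to your CSV headers.
            return {row["UserPrincipalName"]: row["ImmutableID"]
                    for row in csv.DictReader(f)}
    onprem, cloud = load(onprem_csv), load(cloud_csv)
    return sorted(upn for upn, iid in onprem.items()
                  if upn in cloud and cloud[upn] != iid)
```

Any account this returns needs its source attribute investigated before you continue.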
-## Step 3 - Install Azure AD Connect in staging mode
+## Install Azure AD Connect in staging mode
-Once you've prepared your list of source and destination targets, its time to install Azure AD Connect server. If you've opted to use Azure AD Connect cloud provisioning, skip this section.
+After you've prepared your list of source and destination targets, it's time to install an Azure AD Connect server. If you've opted to use Azure AD Connect cloud provisioning, skip this section.
1. Continue with [downloading and installing Azure AD Connect](../hybrid/how-to-connect-install-custom.md) to your chosen server.
-2. On the **Identifying Users** page, under the **select how users should be identified with Azure AD** select the radial for **Choose a specific attribute**. Then, select **mS-DS-ConsistencyGUID** if you haven't modified the Okta defaults.
+1. On the **Identifying users** page, under the **Select how users should be identified with Azure AD**, select the **Choose a specific attribute** option. Then, select **mS-DS-ConsistencyGUID** if you haven't modified the Okta defaults.
>[!WARNING]
- >This is the most critical step before selecting **next**
- on this page. Ensure that the attribute you're selecting for source anchor is what **currently** populates your existing Azure AD users. If you select the wrong attribute, you must uninstall and reinstall Azure AD Connect to reselect this option.
+ >This is the most critical step on this page. Before you select **Next**, ensure that the attribute you're selecting for a source anchor is what *currently* populates your existing Azure AD users. If you select the wrong attribute, you must uninstall and reinstall Azure AD Connect to reselect this option.
- ![image shows consistency guid](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/consistency-guid.png)
+ ![Screenshot that shows mS-DS-ConsistencyGuid.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/consistency-guid.png)
-3. On the **Configure** page, make sure to select the checkbox for **Enable staging mode** followed by **Install**.
+1. On the **Configure** page, make sure to select the **Enable staging mode** checkbox. Then select **Install**.
- ![image shows enable staging mode](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/enable-staging-mode.png)
+ ![Screenshot that shows the Enable staging mode checkbox.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/enable-staging-mode.png)
-4. After the configuration is complete, select **Exit**.
+1. After the configuration is complete, select **Exit**.
-Before exiting the staging mode, it's important to verify that the ImmutableID's have matched properly.
+ Before you exit the staging mode, verify that the ImmutableID values match properly.
-1. Open the Synchronization service as an **Administrator**.
+1. Open **Synchronization Service** as an administrator.
- ![image shows opening sync service](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/open-sync-service.png)
+ ![Screenshot that shows opening Synchronization Service.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/open-sync-service.png)
-2. First check that the Full Synchronization to the domain.onmicrosoft.com connector space has users displaying under the **Connectors with Flow Updates** tab.
+1. Check that **Full Synchronization** to the domain.onmicrosoft.com connector space has users displaying under the **Connectors with Flow Updates** tab.
- ![image shows connector with flow update](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/connector-flow-update.png)
+ ![Screenshot that shows the Connectors with Flow Updates tab.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/connector-flow-update.png)
-3. Next, verify there are no deletions pending in the export. Select the **Connectors** tab and then highlight the domain.onmicrosoft.com connector space. Then, select **Search Connector Space**.
+1. Verify there are no deletions pending in the export. Select the **Connectors** tab, and then highlight the domain.onmicrosoft.com connector space. Then select **Search Connector Space**.
- ![image shows search connector space](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/search-connector-space.png)
+ ![Screenshot that shows the Search Connector Space action.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/search-connector-space.png)
-4. In the Connector Space search, select the Scope dropdown and select **Pending Export**.
+1. In the **Search Connector Space** dialog, select the **Scope** dropdown and select **Pending Export**.
- ![image shows pending export](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/pending-export.png)
+ ![Screenshot that shows Pending Export.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/pending-export.png)
-5. Select **Delete** followed by **Search** if all objects have matched properly, there should be zero matching records for Deletes. Record any objects pending deletion and their on-premises values.
+1. Select **Delete**, and then select **Search**. If all objects have matched properly, there should be zero matching records for **Deletes**. Record any objects pending deletion and their on-premises values.
- ![image shows deleted matching records](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/delete-matching-records.png)
+ ![Screenshot that shows deleted matching records.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/delete-matching-records.png)
-6. Next, uncheck **Delete**, and select **Add and Modify**, followed by a search. You should see update functions for all users currently being synchronized to Azure AD via Okta. Add any new objects that Okta isn't currently syncing, but exist in the Organizational Unit (OU) structure that was selected during the Azure AD Connect install.
+1. Clear **Delete**, and select **Add** and **Modify**, followed by a search. You should see update functions for all users currently being synchronized to Azure AD via Okta. Add any new objects that Okta isn't currently syncing, but that exist in the organizational unit (OU) structure that was selected during the Azure AD Connect installation.
- ![image shows add new object](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/add-new-object.png)
+ ![Screenshot that shows adding a new object.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/add-new-object.png)
-7. Double-clicking on updates will show what Azure AD Connect will communicate with Azure AD.
+1. Double-click an update to see what Azure AD Connect will communicate to Azure AD.
-8. If there are any Add functions for a user who already exists in Azure AD, their on-premises account isn't matching to their cloud account and AD Connect has determined it will create a new object, record any new adds that are unexpected. Make sure to correct the ImmutableID value in Azure AD before exiting staging mode.
+1. If there are any **add** functions for a user who already exists in Azure AD, their on-premises account doesn't match their cloud account, and AD Connect has determined it will create a new object. Record any new adds that are unexpected, and make sure to correct the ImmutableID value in Azure AD before you exit staging mode.
- In this example, Okta had been stamping the Mail attribute to the user's account, even though the on-premises value wasn't properly filled in. When Azure AD Connect takes over John Smith's account, the Mail attribute is deleted from his object.
+ In this example, Okta stamped the **mail** attribute to the user's account, even though the on-premises value wasn't properly filled in. When Azure AD Connect takes over John Smith's account, the **mail** attribute is deleted from his object.
- Verify that your updates still include all attributes expected in Azure AD. If multiple attributes are being deleted, you may need to manually populate these on-premises AD values before removing staging mode.
+   Verify that your updates still include all attributes expected in Azure AD. If multiple attributes are being deleted, you might need to manually populate these on-premises AD values before you disable staging mode.
- ![image shows populate on-premises ad values](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/on-premises-ad-values.png)
+   ![Screenshot that shows populating on-premises AD values.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/on-premises-ad-values.png)
>[!NOTE]
- >Before you continue to the next step, ensure all user attributes are syncing properly and are showing in the **Pending Export** tab as expected. If they're deleted, make sure their ImmutableID's match and the User is in one of the selected OUs for synchronization.
+ >Before you continue to the next step, ensure all user attributes are syncing properly and show on the **Pending Export** tab as expected. If they're deleted, make sure their ImmutableID values match and the user is in one of the selected OUs for synchronization.
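A quick way to check whether an on-premises objectGUID lines up with a cloud ImmutableID is to convert between the two formats yourself. The ImmutableID is the Base64 encoding of the objectGUID's byte array. The following is a minimal sketch; the GUID value shown is hypothetical:

```python
import base64
import uuid

def immutable_id_from_object_guid(object_guid: str) -> str:
    """Convert an on-premises AD objectGUID to the Base64 ImmutableID
    format. bytes_le matches .NET's Guid.ToByteArray ordering, which is
    what the ImmutableID encoding uses."""
    return base64.b64encode(uuid.UUID(object_guid).bytes_le).decode("ascii")

def object_guid_from_immutable_id(immutable_id: str) -> str:
    """Reverse conversion: Base64 ImmutableID back to a GUID string."""
    return str(uuid.UUID(bytes_le=base64.b64decode(immutable_id)))

# Hypothetical objectGUID for illustration:
guid = "01234567-89ab-cdef-0123-456789abcdef"
iid = immutable_id_from_object_guid(guid)
print(iid)                                 # 24-character Base64 string
print(object_guid_from_immutable_id(iid))  # round-trips to the GUID
```

Comparing the computed value against the ImmutableID shown in Azure AD tells you whether the cloud object is bound to the on-premises object you expect.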
-## Step 4 - Install Azure AD cloud sync agents
+## Install Azure AD cloud sync agents
-Once you've prepared your list of source and destination targets, its time to [install and configure Azure AD cloud sync agents](../cloud-sync/tutorial-single-forest.md). If you've opted to use Azure AD Connect server, skip this section.
+After you've prepared your list of source and destination targets, it's time to [install and configure Azure AD cloud sync agents](../cloud-sync/tutorial-single-forest.md). If you've opted to use an Azure AD Connect server, skip this section.
-## Step 5 - Disable Okta provisioning to Azure AD
+## Disable Okta provisioning to Azure AD
-Once the Azure AD Connect install has been verified and your pending exports are in order, it's time to disable Okta provisioning to Azure AD.
+After you've verified the Azure AD Connect installation and your pending exports are in order, it's time to disable Okta provisioning to Azure AD.
-1. Navigate to your Okta portal, select **Applications**, followed by your Okta app used to provision users to Azure AD. Open provisioning tab and **Integration** section.
+1. Go to your Okta portal, select **Applications**, and then select your Okta app used to provision users to Azure AD. Open the **Provisioning** tab, and select the **Integration** section.
- ![image shows integration section in Okta](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/integration-section.png)
+ ![Screenshot that shows the Integration section in Okta.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/integration-section.png)
-2. Select **Edit**, uncheck **Enable API integration** option and **Save**.
+1. Select **Edit**, clear the **Enable API integration** option, and select **Save**.
- ![image shows edit enable api integration in Okta](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/edit-api-integration.png)
+ ![Screenshot that shows editing the Enable API integration in Okta.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/edit-api-integration.png)
>[!NOTE]
- >If you have multiple Office 365 apps handling provisioning to Azure AD, ensure that all are switched off.
+ >If you have multiple Office 365 apps handling provisioning to Azure AD, ensure they're all switched off.
-## Step 6 - Disable staging mode in Azure AD Connect
+## Disable staging mode in Azure AD Connect
-After disabling Okta Provisioning, the Azure AD Connect server is ready to begin synchronizing objects. If you have chosen to go with Azure AD cloud sync agents, skip this section.
+After you disable Okta provisioning, the Azure AD Connect server is ready to begin synchronizing objects. If you've chosen to go with Azure AD cloud sync agents, skip this section.
1. Run the installation wizard from the desktop again, and select **Configure**.
- ![image shows azure AD connect server](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/azure-ad-connect-server.png)
+ ![Screenshot that shows the Azure AD Connect server.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/azure-ad-connect-server.png)
-2. Select **Configure Staging Mode** followed by **Next** and enter your global administrator credentials.
+1. Select **Configure staging mode**, and then select **Next**. Enter your global administrator credentials.
- ![image shows configure staging mode](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/configure-staging-mode.png)
+ ![Screenshot that shows the Configure staging mode option.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/configure-staging-mode.png)
-3. Uncheck **Enable Staging Mode** followed by next.
+1. Clear the **Enable staging mode** option, and select **Next**.
- ![image shows uncheck enable staging mode](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/uncheck-enable-staging-mode.png)
+ ![Screenshot that shows clearing the Enable staging mode option.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/uncheck-enable-staging-mode.png)
-4. Select **Configure** to continue.
+1. Select **Configure** to continue.
- ![image shows ready to configure](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/ready-to-configure.png)
+ ![Screenshot that shows selecting the Configure button.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/ready-to-configure.png)
-5. After the configuration completes, open the **Synchronization Service** as an administrator. View the Export on the domain.onmicrosoft.com connector. Verify all adds, updates, and deletes are done as expected.
+1. After the configuration completes, open the **Synchronization Service** as an administrator. View the **Export** on the domain.onmicrosoft.com connector. Verify that all additions, updates, and deletions are done as expected.
- ![image shows verify sync service](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/verify-sync-service.png)
+ ![Screenshot that shows verifying the sync service.](./media/migrate-okta-sync-provisioning-to-azure-active-directory-connect-based-synchronization/verify-sync-service.png)
-You've now successfully migrated to Azure AD Connect server based provisioning. Updates and expansions to the feature set
-of Azure AD connect can be done by rerunning to the installation wizard.
+You've now successfully migrated to Azure AD Connect server-based provisioning. Updates and expansions to the feature set of Azure AD Connect can be done by rerunning the installation wizard.
-## Step 7 - Enable Cloud sync agents
+## Enable cloud sync agents
-After disabling Okta Provisioning, the Azure AD cloud sync agent is ready to begin synchronizing objects, return to the [Azure AD Portal](https://aad.portal.azure.com/).
+After you disable Okta provisioning, the Azure AD cloud sync agent is ready to begin synchronizing objects. Return to the [Azure AD portal](https://aad.portal.azure.com/).
-1. Modify the **Configuration profile** to **Enabled**.
+1. Modify the **Configuration** profile to **Enabled**.
-2. After enabling, return to the provisioning menu and select **Logs**.
+1. Return to the provisioning menu, and select **Logs**.
-3. Evaluate that the provisioning connector has properly updated in place objects. The cloud sync agents are non-destructive. They'll fail their updates if a match didn't occur properly.
+1. Evaluate that the provisioning connector has properly updated in-place objects. The cloud sync agents are nondestructive. They'll fail their updates if a match didn't occur properly.
-4. If a user is mismatched, make the necessary updates to bind the immutableID's, then restart the cloud provisioning sync.
+1. If a user is mismatched, make the necessary updates to bind the ImmutableID values. Then restart the cloud provisioning sync.
## Next steps
+For more information about migrating from Okta to Azure AD, see:
- [Migrate applications from Okta to Azure AD](migrate-applications-from-okta-to-azure-active-directory.md)
- [Migrate Okta federation to Azure AD managed authentication](migrate-okta-federation-to-azure-active-directory.md)
-- [Migrate Okta sign on policies to Azure AD Conditional Access](migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access.md)
+- [Migrate Okta sign-on policies to Azure AD Conditional Access](migrate-okta-sign-on-policies-to-azure-active-directory-conditional-access.md)
active-directory Secure Hybrid Access Integrations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/secure-hybrid-access-integrations.md
# Secure hybrid access with Azure Active Directory partner integrations
-Azure Active Directory (Azure AD) supports modern authentication protocols that keep applications secure in a highly connected, cloud-based world. However, many business applications were created to work in a protected corporate network, and some of these applications use legacy authentication methods. As companies look to build a Zero Trust strategy and support hybrid and cloud-first work environments, they need solutions that connect apps to Azure AD and provide modern authentication solutions for legacy applications.
+Azure Active Directory (Azure AD) supports modern authentication protocols that help keep applications secure in a highly connected, cloud-based world. However, many business applications were created to work in a protected corporate network, and some of these applications use legacy authentication methods. As companies look to build a Zero Trust strategy and support hybrid and cloud-first work environments, they need solutions that connect apps to Azure AD and provide modern authentication solutions for legacy applications.
-Azure AD natively supports modern protocols like SAML, WS-Fed, and OIDC. Azure AD's App Proxy supports Kerberos and header-based authentication. Other protocols like SSH, NTLM, LDAP, Cookies, aren't yet supported, but ISVs can create solutions to connect these applications with Azure AD to support customers on their journey to Zero Trust.
+Azure AD natively supports modern protocols like SAML, WS-Fed, and OIDC. App Proxy in Azure AD supports Kerberos and header-based authentication. Other protocols, like SSH, NTLM, LDAP, and cookies, aren't yet supported. But ISVs can create solutions to connect these applications with Azure AD to support customers on their journey to Zero Trust.
-ISVs have the opportunity to help customers discover and migrate SaaS applications into Azure AD. They can also connect apps that use legacy authentication methods with Azure AD. This will help customers consolidate onto a single platform (Azure AD) to simplify their app management and enable them to implement Zero Trust principles. Supporting apps using legacy authentication makes their users more secure. This solution can be a great stop-gap until the customer modernizes their apps to support modern authentication protocols.
+ISVs have the opportunity to help customers discover and migrate software as a service (SaaS) applications into Azure AD. They can also connect apps that use legacy authentication methods with Azure AD. This will help customers consolidate onto a single platform (Azure AD) to simplify their app management and enable them to implement Zero Trust principles. Supporting apps that use legacy authentication makes users more secure. This solution can be a great stopgap until customers modernize their apps to support modern authentication protocols.
## Solution overview
-The solution you build can include the following parts:
+The solution that you build can include the following parts:
-1. **App discovery**. Often, customers aren't aware of all the applications they're using. So as a first step you can build application discovery capabilities into your solution and surface discovered applications in the user interface. This enables the customer to prioritize how they want to approach integrating their applications with Azure AD.
-2. **App migration**. Next you can create an in-product workflow where the customer can directly integrate apps with Azure AD without having to go to the Azure AD portal. If you don't implement discovery capabilities in your solution you can start your solution here, integrating the applications customers do know about with Azure AD.
-3. **Legacy authentication support**. You can connect apps using legacy authentication methods to Azure AD so that they get the benefits of single sign-on (SSO) and other features.
-4. **Conditional access**. As an additional feature, you can enable customers to apply Azure AD [Conditional Access](/azure/active-directory/conditional-access/overview/) policies to the applications from within your solution without having to go the Azure AD portal.
+1. **App discovery**. Often, customers aren't aware of all the applications they're using. So as a first step, you can build application discovery capabilities into your solution and surface discovered applications in the user interface. This enables the customer to prioritize how they want to approach integrating their applications with Azure AD.
+2. **App migration**. Next, you can create an in-product workflow where the customer can directly integrate apps with Azure AD without having to go to the Azure AD portal. If you don't implement discovery capabilities in your solution, you can start your solution here, integrating the applications that customers do know about with Azure AD.
+3. **Legacy authentication support**. You can connect apps that use legacy authentication methods to Azure AD so that they get the benefits of single sign-on (SSO) and other features.
+4. **Conditional Access**. As an additional feature, you can enable customers to apply Azure AD [Conditional Access](/azure/active-directory/conditional-access/overview/) policies to the applications from within your solution without having to go to the Azure AD portal.
The rest of this guide explains the technical considerations and our recommendations for implementing a solution.
-## Publish your application to the Azure AD app gallery
+## Publishing your application to Azure Marketplace
-You can pre-integrate your application with Azure AD to support SSO and automated provisioning by following the process to [publish it in the Azure AD app gallery](/azure/active-directory/develop/v2-howto-app-gallery-listing/). The Azure AD app gallery is a trusted source of Azure AD compatible applications for IT admins. Applications listed there have been validated to be compatible with Azure AD. They support SSO, automate user provisioning, and can easily integrate into customer tenants with automated app registration.
+You can pre-integrate your application with Azure AD to support SSO and automated provisioning by following the process to [publish it in Azure Marketplace](/azure/active-directory/develop/v2-howto-app-gallery-listing/). Azure Marketplace is a trusted source of applications for IT admins. Applications listed there have been validated to be compatible with Azure AD. They support SSO, automate user provisioning, and can easily integrate into customer tenants with automated app registration.
-In addition, we recommend that you become a [verified publisher](/azure/active-directory/develop/publisher-verification-overview/) so that customers know you are the trusted publisher of the app.
+In addition, we recommend that you become a [verified publisher](/azure/active-directory/develop/publisher-verification-overview/) so that customers know you're the trusted publisher of the app.
-## Enable IT admin single sign-on
+## Enabling single sign-on for IT admins
-You'll want to [choose either OIDC or SAML](/azure/active-directory/manage-apps/sso-options#choosing-a-single-sign-on-method/) to enable SSO for IT administrators to your solution.
+[Choose either OIDC or SAML](/azure/active-directory/manage-apps/sso-options#choosing-a-single-sign-on-method/) to enable SSO for IT administrators to your solution. The best option is to use OIDC.
-The best option is to use OIDC. Microsoft Graph uses [OIDC/OAuth](/azure/active-directory/develop/v2-protocols-oidc/). This means that if your solution uses OIDC with Azure AD for IT administrator SSO, then your customers will have a seamless end-to-end experience. They'll use OIDC to sign in to your solution and that same JSON Web Token (JWT) that was issued by Azure AD can then be used to interact with Microsoft Graph.
+Microsoft Graph uses [OIDC/OAuth](/azure/active-directory/develop/v2-protocols-oidc/). If your solution uses OIDC with Azure AD for IT administrator SSO, your customers will have a seamless end-to-end experience. They'll use OIDC to sign in to your solution, and the same JSON Web Token (JWT) that Azure AD issued can then be used to interact with Microsoft Graph.
-If your solution is instead using [SAML](/azure/active-directory/manage-apps/configure-saml-single-sign-on/) for IT administrator SSO, the SAML token won't enable your solution to interact with Microsoft Graph. You can still use SAML for IT administrator SSO but your solution needs to support OIDC integration with Azure AD so it can get a JWT from Azure AD to properly interact with Microsoft Graph. You can use one of the following approaches:
+If your solution instead uses [SAML](/azure/active-directory/manage-apps/configure-saml-single-sign-on/) for IT administrator SSO, the SAML token won't enable your solution to interact with Microsoft Graph. You can still use SAML for IT administrator SSO, but your solution needs to support OIDC integration with Azure AD so it can get a JWT from Azure AD to properly interact with Microsoft Graph. You can use one of the following approaches:
-Recommended SAML Approach: Create a new registration in the Azure AD app gallery, which is [an OIDC app](/azure/active-directory/saas-apps/openidoauth-tutorial/). This provides the most seamless experience for your customer. They'll add both the SAML and OIDC apps to their tenant. If your application isn't in the Azure AD gallery today, you can start with a non-gallery [multi-tenant application](/azure/active-directory/develop/howto-convert-app-to-be-multi-tenant/).
+- **Recommended SAML approach**: Create a new registration in Azure Marketplace, which is [an OIDC app](/azure/active-directory/saas-apps/openidoauth-tutorial/). This provides the most seamless experience for your customers. They'll add both the SAML and OIDC apps to their tenant. If your application isn't in the Azure AD gallery today, you can start with a non-gallery [multi-tenant application](/azure/active-directory/develop/howto-convert-app-to-be-multi-tenant/).
-Alternate SAML Approach: Your customer can manually [create an OIDC application registration](/azure/active-directory/saas-apps/openidoauth-tutorial/) in their Azure AD tenant and ensure they set the right URI's, endpoints, and permissions specified later in this document.
+- **Alternate SAML approach**: Your customers can manually [create an OIDC application registration](/azure/active-directory/saas-apps/openidoauth-tutorial/) in their Azure AD tenant and ensure that they set the right URIs, endpoints, and permissions specified later in this article.
-You'll would want to use the [client_credentials grant type](/azure/active-directory/develop/v2-oauth2-client-creds-grant-flow#get-a-token/), which will require that your solution allows the customer to input a client_ID and secret into your user interface, and that you store this information. Get a JWT from Azure AD, which you can then use to interact with Microsoft Graph.
+You'll want to use the [client_credentials grant type](/azure/active-directory/develop/v2-oauth2-client-creds-grant-flow#get-a-token/). It will require that your solution allows each customer to enter a client ID and secret into your user interface, and that you store this information. Get a JWT from Azure AD, and then use it to interact with Microsoft Graph.
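As a sketch of what that token request looks like, the following shows how a solution might build the client credentials call against the Azure AD v2.0 token endpoint. The tenant ID, client ID, and secret values are placeholders:

```python
import urllib.parse

def build_client_credentials_request(tenant_id: str, client_id: str,
                                     client_secret: str):
    """Build the POST request for the OAuth2 client_credentials grant
    against the Azure AD v2.0 token endpoint. The access token in the
    response can be sent to Microsoft Graph as a Bearer token."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests the permissions already consented to on the
        # app registration.
        "scope": "https://graph.microsoft.com/.default",
    }
    return url, urllib.parse.urlencode(body)

# Placeholder values for illustration:
url, body = build_client_credentials_request(
    "contoso-tenant-id", "app-client-id", "app-secret")
# POST `body` to `url` with Content-Type application/x-www-form-urlencoded;
# use access_token from the JSON response as "Authorization: Bearer ...".
```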
-If you choose this route, you should have ready-made documentation for your customer about how to create this application registration within their Azure AD tenant including the endpoints, URI's, and permissions required.
+If you choose this route, you should have ready-made documentation for your customer about how to create this application registration within their Azure AD tenant. This information includes the endpoints, URIs, and required permissions.
> [!NOTE]
-> Before any applications can be used for either IT administrator or end-user sign-on, the customer's IT administrator will need to [consent to the application in their tenant](/azure/active-directory/manage-apps/grant-admin-consent/).
+> Before any applications can be used for either IT administrator or user SSO, the customer's IT administrator will need to [consent to the application in their tenant](/azure/active-directory/manage-apps/grant-admin-consent/).
## Authentication flows
-The solution will include three key authentication flows that support the following scenarios:
+The solution includes three key authentication flows that support the following scenarios:
-1. The customer's IT administrator signs in with SSO to administer your solution.
+- The customer's IT administrator signs in with SSO to administer your solution.
-2. The customer's IT administrator uses your solution to integrate applications with Azure AD via Microsoft Graph.
+- The customer's IT administrator uses your solution to integrate applications with Azure AD via Microsoft Graph.
-3. End-users sign into legacy applications secured by your solution and Azure AD.
+- Users sign in to legacy applications secured by your solution and Azure AD.
### Your customer's IT administrator does single sign-on to your solution
-Your solution can use either SAML or OIDC for SSO when the customer's IT administrator signs in. Either way, its recommended that the IT administrator can sign in to your solution using their Azure AD credentials, which enables them a seamless experience and allows them to use the existing security controls they already have in place. Your solution should be integrated with Azure AD for SSO using either SAML or OIDC.
+Your solution can use either SAML or OIDC for SSO when the customer's IT administrator signs in. Either way, we recommend letting the IT administrator sign in to your solution by using their Azure AD credentials. This enables a seamless experience and allows them to use the existing security controls that they already have in place. Your solution should be integrated with Azure AD for SSO through either SAML or OIDC.
-![image diagram of the IT administrator being redirected by the solution to Azure AD to log in, and then being redirected by Azure AD back to the solution with a SAML token or JWT](./media/secure-hybrid-access-integrations/admin-flow.png)
+Here's a diagram and summary of this user authentication flow:
-1. The IT administrator wants to sign-in to your solution with their Azure AD credentials.
+![Diagram that shows an I T administrator being redirected by the solution to Azure AD to sign in, and then being redirected by Azure AD back to the solution in a user authentication flow.](./media/secure-hybrid-access-integrations/admin-flow.png)
-2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
+1. The IT administrator wants to sign in to your solution with their Azure AD credentials.
-3. Azure AD will authenticate the IT administrator and then send them back to your solution with either a SAML token or JWT in tow to be authorized within your solution
+2. Your solution redirects the IT administrator to Azure AD with either a SAML or an OIDC sign-in request.
-### The IT administrator integrates applications with Azure AD using your solution
+3. Azure AD authenticates the IT administrator and then sends them back to your solution with either a SAML token or JWT in tow to be authorized within your solution.
-The second leg of the IT administrator journey will be to integrate applications with Azure AD by using your solution. To do this, your solution will use Microsoft Graph to create application registrations and Azure AD Conditional Access policies.
+### The IT administrator integrates applications with Azure AD by using your solution
-Here is a diagram and summary of this user authentication flow:
+The second leg of the IT administrator journey is to integrate applications with Azure AD by using your solution. To do this, your solution will use Microsoft Graph to create application registrations and Azure AD Conditional Access policies.
-![image diagram of the IT administrator being redirected by the solution to Azure AD to log in, then being redirected by Azure AD back to the solution with a SAML token or JWT, and finally the solution making a call to Microsoft Graph with the JWT](./media/secure-hybrid-access-integrations/registration-flow.png)
+Here's a diagram and summary of this user authentication flow:
+![Diagram of redirects and other interactions between the I T administrator, Azure Active Directory, your solution, and Microsoft Graph in a user authentication flow.](./media/secure-hybrid-access-integrations/registration-flow.png)
-1. The IT administrator wants to sign-in to your solution with their Azure AD credentials.
-2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
+1. The IT administrator wants to sign in to your solution with their Azure AD credentials.
-3. Azure AD will authenticate the IT administrator and then send them back to your solution with either a SAML token or JWT for authorization within your solution.
+2. Your solution redirects the IT administrator to Azure AD with either a SAML or an OIDC sign-in request.
-4. When an IT administrator wants to integrate one of their applications with Azure AD, rather than having to go to the Azure AD portal, your solution will call the Microsoft Graph with their existing JWT to register those applications or apply Azure AD Conditional Access policies to them.
+3. Azure AD authenticates the IT administrator and then sends them back to your solution with either a SAML token or JWT for authorization within your solution.
-### End-users sign-in to the applications secured by your solution and Azure AD
+4. When the IT administrator wants to integrate one of their applications with Azure AD, rather than having to go to the Azure AD portal, your solution calls Microsoft Graph with their existing JWT to register those applications or apply Azure AD Conditional Access policies to them.
-When end users need to sign into individual applications secured with your solution and Azure AD, they use either OIDC or SAML. If the applications need to interact with Microsoft Graph or any Azure AD protected API for some reason, its recommended that the individual applications you register with Microsoft Graph be configured to use OIDC. This will ensure that the JWT that they get from Azure AD to authenticate them into the applications can also be applied for interacting with Microsoft Graph. If there is no need for the individual applications to interact with Microsoft Graph or any Azure AD protected API, then SAML will suffice.
+### Users sign in to the applications secured by your solution and Azure AD
-Here is a diagram and summary of this user authentication flow:
+When users need to sign in to individual applications secured with your solution and Azure AD, they use either OIDC or SAML. If the applications need to interact with Microsoft Graph or any Azure AD-protected API, we recommend that you configure them to use OIDC. This configuration will ensure that the JWT that the applications get from Azure AD to authenticate them into the applications can also be applied for interacting with Microsoft Graph. If there's no need for the individual applications to interact with Microsoft Graph or any Azure AD-protected API, then SAML will suffice.
-![image diagram of the end user being redirected by the solution to Azure AD to log in, then being redirected by Azure AD back to the solution with a SAML token or JWT, and finally the solution making a call to another application using the application's preferred authentication type](./media/secure-hybrid-access-integrations/end-user-flow.png)
+Here's a diagram and summary of this user authentication flow:
-1. The end user wants to sign-in to an application secured by your solution and Azure AD.
-2. Your solution will redirect them to Azure AD either with a SAML or OIDC sign-in request.
-3. Azure AD will authenticate the end user and then send them back to your solution with either a SAML token or JWT for authorization within your solution.
-4. Once authorized against your solution, your solution will then allow the original request to the application to go through using the preferred protocol of the application.
+![Diagram of redirects and other interactions between the user, Azure Active Directory, your solution, and the application in a user authentication flow.](./media/secure-hybrid-access-integrations/end-user-flow.png)
-## Summary of Microsoft Graph APIs you will use
+1. The user wants to sign in to an application secured by your solution and Azure AD.
+2. Your solution redirects the user to Azure AD with either a SAML or an OIDC sign-in request.
+3. Azure AD authenticates the user and then sends them back to your solution with either a SAML token or JWT for authorization within your solution.
+4. After authorization, your solution allows the original request to the application to go through by using the preferred protocol of the application.
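As an illustration only, the redirect in step 2 of an OIDC flow can be sketched like this. The tenant name, client ID, and URLs below are hypothetical placeholders, not values from this article:

```python
from urllib.parse import urlencode

# Hypothetical tenant and client IDs -- replace with your own values.
TENANT = "contoso.onmicrosoft.com"
CLIENT_ID = "11111111-2222-3333-4444-555555555555"

def build_oidc_signin_url(redirect_uri: str, state: str) -> str:
    """Build the Azure AD authorize URL that your solution redirects the user to (step 2)."""
    params = {
        "client_id": CLIENT_ID,
        "response_type": "code",                   # authorization-code flow
        "redirect_uri": redirect_uri,
        "scope": "openid profile offline_access",
        "state": state,                            # anti-CSRF value your solution verifies in step 3
    }
    return (
        f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/authorize?"
        + urlencode(params)
    )

url = build_oidc_signin_url("https://solution.example.com/callback", "xyz")
```

A SAML sign-in request follows the same redirect pattern but carries a SAML `AuthnRequest` instead of these query parameters.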
-Your solution will need to use these APIs. Azure AD will allow you to configure either the delegated permissions or the application permissions. For this solution, you only need delegated permissions.
+## Summary of Microsoft Graph APIs
-[Application Templates API](/graph/application-saml-sso-configure-api#retrieve-the-gallery-application-template-identifier/): If you're interested in searching the Azure AD app gallery, you can use this API to find a matching application template. **Permission required** : Application.Read.All.
+Your solution needs to use the following APIs. Azure AD allows you to configure either delegated permissions or application permissions. For this solution, you need only delegated permissions.
-[Application Registration API](/graph/api/application-post-applications): You'll use this API to create either OIDC or SAML application registrations so end users can sign-in to the applications that the customers have secured with your solution. Doing this will enable these applications to also be secured with Azure AD. **Permissions required** : Application.Read.All, Application.ReadWrite.All
+- [Application Templates API](/graph/application-saml-sso-configure-api#retrieve-the-gallery-application-template-identifier/): If you're interested in searching Azure Marketplace, you can use this API to find a matching application template. **Permission required**: Application.Read.All.
-[Service Principal API](/graph/api/serviceprincipal-update): After doing the app registration, you'll need to update the Service Principal Object to set some SSO properties. **Permissions required** : Application.ReadWrite.All, Directory.AccessAsUser.All, AppRoleAssignment.ReadWrite.All (for assignment)
+- [Application Registration API](/graph/api/application-post-applications): You use this API to create either OIDC or SAML application registrations so that users can sign in to the applications that the customers have secured with your solution. Doing this enables these applications to also be secured with Azure AD. **Permissions required**: Application.Read.All, Application.ReadWrite.All.
-[Conditional Access API](/graph/api/resources/conditionalaccesspolicy): If you want to also apply Azure AD Conditional Access policies to these end-user applications, you can use this API to do so. **Permissions required** : Policy.Read.All, Policy.ReadWrite.ConditionalAccess, and Application.Read.All
+- [Service Principal API](/graph/api/serviceprincipal-update): After you register the app, you need to update the service principal object to set some SSO properties. **Permissions required**: Application.ReadWrite.All, Directory.AccessAsUser.All, AppRoleAssignment.ReadWrite.All (for assignment).
+
+- [Conditional Access API](/graph/api/resources/conditionalaccesspolicy): If you want to also apply Azure AD Conditional Access policies to these user applications, you can use this API. **Permissions required**: Policy.Read.All, Policy.ReadWrite.ConditionalAccess, and Application.Read.All.
## Example Graph API scenarios
-This section provides a reference example for using Microsoft Graph APIs to implement application registrations, connect legacy applications, and enable conditional access policies via your solution. In addition, there is guidance on automating admin consent, getting the token signing certificate, and assigning users and groups. This functionality may be useful in your solution.
+This section provides a reference example for using Microsoft Graph APIs to implement application registrations, connect legacy applications, and enable Conditional Access policies via your solution. This section also gives guidance on automating admin consent, getting the token-signing certificate, and assigning users and groups. This functionality might be useful in your solution.
### Use the Graph API to register apps with Azure AD
-#### Apps in the Azure AD app gallery
+#### Add apps that are in Azure Marketplace
-Some of the applications your customer is using will already be available in the [Azure AD Application Gallery](https://azuremarketplace.microsoft.com/marketplace/apps). You can create a solution that programmatically adds these applications to the customer's tenant. The following is an example of using the Microsoft Graph API to search the Azure AD app gallery for a matching template and then registering the application in the customer's Azure AD tenant.
+Some of the applications that your customer is using will already be available in [Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps). You can create a solution that programmatically adds these applications to the customer's tenant. The following code is an example of using the Microsoft Graph API to search Azure Marketplace for a matching template and then registering the application in the customer's Azure AD tenant.
-Search the Azure AD app gallery for a matching application. When using the application templates API, the display name is case-sensitive.
+Search Azure Marketplace for a matching application. When you're using the Application Templates API, the display name is case-sensitive.
```http Authorization: Required with a valid Bearer token
Method: Get
https://graph.microsoft.com/v1.0/applicationTemplates?$filter=displayname eq "Salesforce.com" ```
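For illustration, here's a minimal Python sketch that assembles this search request as data without sending it (the bearer token is a placeholder):

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def gallery_search_request(display_name: str, bearer_token: str) -> dict:
    """Describe the GET request that searches the gallery for a matching template.
    The $filter comparison is case-sensitive, so pass the exact display name."""
    return {
        "method": "GET",
        "url": f'{GRAPH}/applicationTemplates?$filter=displayname eq "{display_name}"',
        "headers": {"Authorization": f"Bearer {bearer_token}"},
    }

req = gallery_search_request("Salesforce.com", "<token>")  # <token> is a placeholder
```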
-If a match is found from the prior API call, capture the ID and then make this API call while providing a user-friendly display name for the application in the JSON body:
+If a match is found from the preceding API call, capture the ID and then make the following API call while providing a user-friendly display name for the application in the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applicationTemplates/cd3ed3de-93ee-400b-8b19-b6
} ```
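A hedged sketch of assembling that call in Python. The template ID and display name are placeholders; capture the real ID from the search response:

```python
import json

# Placeholder -- capture the real template ID from the search call's response.
template_id = "<template-id>"

request = {
    "method": "POST",
    "url": f"https://graph.microsoft.com/v1.0/applicationTemplates/{template_id}/instantiate",
    # The displayName in the JSON body is the user-friendly name shown to your customer.
    "body": json.dumps({"displayName": "Salesforce for Contoso"}),
}
```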
-When you make the above API call, we'll also generate a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+When you make the preceding API call, you'll also generate a service principal object, which might take a few seconds. Be sure to capture the application ID and the service principal ID. You'll use them in the next API calls.
-Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+Next, patch the service principal object with the SAML protocol and the appropriate login URL:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f7
} ```
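For example, the patch body can be assembled like this (the object ID and login URL are hypothetical placeholders):

```python
import json

sp_object_id = "<service-principal-object-id>"  # placeholder, from the instantiate response

request = {
    "method": "PATCH",
    "url": f"https://graph.microsoft.com/v1.0/servicePrincipals/{sp_object_id}",
    "body": json.dumps({
        "preferredSingleSignOnMode": "saml",           # switch the app to SAML-based SSO
        "loginUrl": "https://app.example.com/signin",  # hypothetical sign-in URL
    }),
}
```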
-And lastly, you'll want to patch the Application Object with the appropriate redirecturis and the identifieruris:
+Finally, patch the application object with the appropriate redirect URIs and the identifier URIs:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-9671169837
} ```
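A sketch of assembling the application patch (the object ID and URIs are hypothetical placeholders):

```python
import json

app_object_id = "<application-object-id>"  # placeholder, from the instantiate response

request = {
    "method": "PATCH",
    "url": f"https://graph.microsoft.com/v1.0/applications/{app_object_id}",
    "body": json.dumps({
        "identifierUris": ["https://app.example.com"],                  # hypothetical entity ID
        "web": {"redirectUris": ["https://app.example.com/saml/acs"]},  # hypothetical reply URL
    }),
}
```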
-#### Applications not in the Azure AD app gallery
-
-If you can't find a match in the Azure AD app gallery or you just want to integrate a custom application, then you have the option of registering a custom application in Azure AD using this template ID:
+#### Add apps that are not in Azure Marketplace
-**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
-
-And then make this API call while providing a user-friendly display name of the application in the JSON body:
+If you can't find a match in Azure Marketplace or you just want to integrate a custom application, you can register a custom application in Azure AD by using this template ID: **8adf8e6e-67b2-4cf2-a259-e3dc5476c621**. Then, make the following API call while providing a user-friendly display name of the application in the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3
} ```
-When you make the above API call, we'll also generate a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+When you make the preceding API call, you'll also generate a service principal object, which might take a few seconds. Be sure to capture the application ID and the service principal ID. You'll use them in the next API calls.
-Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+Next, patch the service principal object with the SAML protocol and the appropriate login URL:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f7
} ```
-And lastly, you'll want to patch the Application Object with the appropriate redirecturis and the identifieruris:
+Finally, patch the application object with the appropriate redirect URIs and the identifier URIs:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-9671169837
#### Cut over to Azure AD single sign-on
-Once you have these SaaS applications registered inside Azure AD, the applications still need to be cut over to start us Azure AD as their identity provider. There are two ways to do this:
+After you have the SaaS applications registered inside Azure AD, the applications still need to be cut over to start using Azure AD as their identity provider. There are two ways to do this:
-1. If the applications support one-click SSO, then Azure AD can cut over the application for the customer. They just need to go into the Azure AD portal and perform the one-click SSO with the administrative credentials for the supported SaaS application. You can read about this in [one-click, SSO configuration of your Azure Marketplace application](/azure/active-directory/manage-apps/one-click-sso-tutorial/).
-2. If the application doesn't support one-click SSO, then the customer will need to manually cutover the application to start using Azure AD. You can learn more in the [SaaS App Integration Tutorials for use with Azure AD](/azure/active-directory/saas-apps/tutorial-list/).
+- If the applications support one-click SSO, Azure AD can cut over the applications for the customer. The customer just needs to go into the Azure AD portal and perform the one-click SSO with the administrative credentials for the supported SaaS applications. For more information, see [One-click app configuration of single sign-on](/azure/active-directory/manage-apps/one-click-sso-tutorial/).
+- If the applications don't support one-click SSO, the customer needs to manually cut over the applications to start using Azure AD. For more information, see [Tutorials for integrating SaaS applications with Azure Active Directory](/azure/active-directory/saas-apps/tutorial-list/).
-### Connect apps using legacy authentication methods to Azure AD
+### Connect apps by using legacy authentication methods to Azure AD
-This is where your solution can sit in between Azure AD and the application and enable the customer to get the benefits of Single-Sign On and other Azure Active Directory features even for applications that are not supported. To do so, your application will call Azure AD to authenticate the user and apply Azure AD Conditional Access policies before they can access these applications with legacy protocols.
+This is where your solution can sit between Azure AD and the application and enable the customer to get the benefits of SSO and other Azure Active Directory features, even for applications that are not supported. To do so, your application will call Azure AD to authenticate the user and apply Azure AD Conditional Access policies before the user can access these applications with legacy protocols.
-You can enable customers to do this integration directly from your console so that the discovery and integration is a seamless end-to-end experience. This will involve your platform creating either a SAML or OIDC application registration between your platform and Azure AD.
+You can enable customers to do this integration directly from your console so that the discovery and integration is a seamless end-to-end experience. This will involve your platform creating either a SAML or an OIDC application registration between your platform and Azure AD.
#### Create a SAML application registration
-Use the custom application template ID for this:
-
-**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
-
-And then make this API call while providing a user-friendly display name in the JSON body:
+To create a SAML application registration, use this custom application template ID for a custom application: **8adf8e6e-67b2-4cf2-a259-e3dc5476c621**. Then make the following API call while providing a user-friendly display name in the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3
} ```
-When you make the above API call, we'll also generate a Service Principal object, which might take a few seconds. From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+When you make the preceding API call, you'll also generate a service principal object, which might take a few seconds. Be sure to capture the application ID and the service principal ID. You'll use them in the next API calls.
-Next, you'll want to PATCH the Service Principal Object with the saml protocol and the appropriate login URL:
+Next, patch the service principal object with the SAML protocol and the appropriate login URL:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f7
} ```
-And lastly, you'll want to PATCH the Application Object with the appropriate redirecturis and the identifieruris:
+Finally, patch the application object with the appropriate redirect URIs and the identifier URIs:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applications/54c4806b-b260-4a12-873c-9671169837
#### Create an OIDC application registration
-You should use the custom application template ID for this:
-
-**8adf8e6e-67b2-4cf2-a259-e3dc5476c621**
-
-And then make this API call while providing a user-friendly display name in the JSON body:
+To create an OIDC application registration, use this template ID for a custom application: **8adf8e6e-67b2-4cf2-a259-e3dc5476c621**. Then make the following API call while providing a user-friendly display name in the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applicationTemplates/8adf8e6e-67b2-4cf2-a259-e3
} ```
-From the previous API call, you'll want to capture the Application ID and the Service Principal ID, which you'll use in the next API calls.
+From the API call, capture the application ID and the service principal ID. You'll use them in the next API calls.
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/applications/{Application Object ID}
``` > [!NOTE]
-> The API Permissions listed above within the resourceAccess node will grant the application access to OpenID, User.Read, and offline_access, which should be enough to get the user signed in to your solution. You can find more information on permissions on the [permissions reference page](/graph/permissions-reference/).
+> The API permissions listed within the `resourceAccess` node will grant the application the *openid*, *User.Read*, and *offline_access* permissions, which should be enough to get the user signed in to your solution. For more information about permissions, see the [Microsoft Graph permissions reference](/graph/permissions-reference/).
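As a sketch, a `requiredResourceAccess` value granting those three delegated permissions might look like the following. The GUIDs shown are the well-known Microsoft Graph permission IDs, but verify them against the permissions reference before using them:

```python
GRAPH_APP_ID = "00000003-0000-0000-c000-000000000000"  # Microsoft Graph's well-known app ID

# Delegated ("Scope") permissions; the GUIDs are the published Microsoft Graph
# permission IDs for openid, User.Read, and offline_access.
required_resource_access = [{
    "resourceAppId": GRAPH_APP_ID,
    "resourceAccess": [
        {"id": "37f7f235-527c-4136-accd-4a02d197296e", "type": "Scope"},  # openid
        {"id": "e1fe6dd8-ba31-4d61-89e7-88639da4683d", "type": "Scope"},  # User.Read
        {"id": "7427e0e9-2fba-42fe-b0c0-848c9e6a8182", "type": "Scope"},  # offline_access
    ],
}]
```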
+
+### Apply Conditional Access policies
-### Apply conditional access policies
+Customers and partners can also use the Microsoft Graph API to create or apply Conditional Access policies to customer applications. For partners, this can provide additional value because customers can apply these policies directly from your solution without having to go to the Azure AD portal.
-We want to empower customers and partners to also use the Microsoft Graph API to create or apply Conditional Access policies to customer's applications. For partners, this can provide additional value so the customer can apply these policies directly from your solution without having to go to the Azure AD portal. You have two options when applying Azure AD Conditional Access Policies:
+You have two options when applying Azure AD Conditional Access policies:
-- You can assign the application to an existing Conditional Access Policy-- You can create a new Conditional Access policy and assign the application to that new policy
+- Assign the application to an existing Conditional Access policy.
+- Create a new Conditional Access policy and assign the application to that new policy.
-#### An existing conditional access policy
+#### Use an existing Conditional Access policy
-First, you'll want to query to get a list of all Conditional Access Policies and grab the Object ID of the policy you want to modify:
+First, run the following query to get a list of all Conditional Access policies. Get the object ID of the policy that you want to modify.
```https Authorization: Required with a valid Bearer token
Method:GET
https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies ```
-Next, you'll want to Patch the policy by including the Application Object ID to be in scope of the includeApplications within the JSON body:
+Next, patch the policy by including the application object ID to be in scope of `includeApplications` within the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies/{policyid}
} ```
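A sketch of assembling that patch in Python (all IDs are placeholders). Note that `PATCH` replaces the whole `includeApplications` list, so include the IDs that are already in scope:

```python
import json

policy_id = "<policy-id>"             # placeholder, from GET /identity/conditionalAccess/policies
current_apps = ["<existing-app-id>"]  # IDs already in scope of the policy
new_app_id = "<new-app-id>"           # the application you just registered

request = {
    "method": "PATCH",
    "url": f"https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies/{policy_id}",
    # PATCH replaces the list, so send the existing IDs plus the new one.
    "body": json.dumps({
        "conditions": {
            "applications": {"includeApplications": current_apps + [new_app_id]},
        }
    }),
}
```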
-#### Create a new Azure AD conditional access policy
+#### Create a new Conditional Access policy
-You'll want to add the Application Object ID to be in scope of the includeApplications within the JSON body:
+Add the application object ID to be in scope of `includeApplications` within the JSON body:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies/
} ```
-If you're interested in creating new Azure AD Conditional Access Policies, here are some additional templates that can help get you started using the [Conditional Access API](/azure/active-directory/conditional-access/howto-conditional-access-apis/).
+If you're interested in creating new Azure AD Conditional Access policies, here are some additional templates that can help get you started with using the [Conditional Access API](/azure/active-directory/conditional-access/howto-conditional-access-apis/):
```https #Policy Template for Requiring Compliant Device
If you're interested in creating new Azure AD Conditional Access Policies, here
### Automate admin consent
-If the customer is onboarding numerous applications from your platform to Azure AD, you'll likely want to automate admin consent for them so they don't have to manually consent to lots of applications. This can also be done via Microsoft Graph. You'll need both the Service Principal Object ID of the application you created in previous API calls and the Service Principal Object ID of Microsoft Graph from the customer's tenant.
+If the customer is onboarding numerous applications from your platform to Azure AD, you can automate admin consent for them so they don't have to manually consent to lots of applications. You can also do this automation via Microsoft Graph. You'll need both the service principal object ID of the application that you created in previous API calls and the service principal object ID of Microsoft Graph from the customer's tenant.
-You can get the Service Principal Object ID of Microsoft Graph by making this API call:
+Get the service principal object ID of Microsoft Graph by making this API call:
```https Authorization: Required with a valid Bearer token
Method:GET
https://graph.microsoft.com/v1.0/serviceprincipals/?$filter=appid eq '00000003-0000-0000-c000-000000000000'&$select=id,appDisplayName ```
-Then when you're ready to automate admin consent, you can make this API call:
+When you're ready to automate admin consent, make this API call:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/oauth2PermissionGrants
"clientId":"{Service Principal Object ID of Application}", "consentType":"AllPrincipals", "principalId":null,
- "resourceId":"{Service Principal Object ID Of MicrosofT Graph}",
+ "resourceId":"{Service Principal Object ID Of Microsoft Graph}",
"scope":"openid user.read offline_access}" } ```
-### Get the token signing certificate
+### Get the token-signing certificate
-To get the public portion of the token signing certificate for all these applications, you can GET it from the Azure AD metadata endpoint for the application:
+To get the public portion of the token-signing certificate for all these applications, issue a `GET` request to the Azure AD metadata endpoint for the application:
```https Method:GET
https://login.microsoftonline.com/{Tenant_ID}/federationmetadata/2007-06/federat
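A sketch of building that metadata URL (the tenant and app IDs are placeholders; the `appid` query parameter scopes the metadata to one application):

```python
tenant_id = "<tenant-id>"  # placeholder
app_id = "<app-id>"        # placeholder; scopes the metadata to a single application

# The token-signing certificate appears in the X509Certificate elements of the
# federation metadata XML that this endpoint returns.
metadata_url = (
    f"https://login.microsoftonline.com/{tenant_id}"
    f"/federationmetadata/2007-06/federationmetadata.xml?appid={app_id}"
)
```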
### Assign users and groups
-Once you've published the applications to Azure AD, you can optionally assign it to users and groups to ensure it shows up on the [MyApplications](/azure/active-directory/user-help/my-applications-portal-workspaces/) portal. This assignment is stored on the Service Principal Object that was generated when you created the application:
+After you've published the application to Azure AD, you can optionally assign it to users and groups to ensure that it shows up on the [MyApplications](/azure/active-directory/user-help/my-applications-portal-workspaces/) portal. This assignment is stored on the service principal object that was generated when you created the application.
-First you'll want to get any AppRoles that the application may have associated with it. It's common for SaaS applications to have various AppRoles associated with them. For custom applications, there is typically just the one default AppRole. Grab the ID of the AppRole you want to assign:
+First, get any `AppRole` instances that the application may have associated with it. It's common for SaaS applications to have various `AppRole` instances associated with them. For custom applications, there's typically just the one default `AppRole` instance. Get the ID of the `AppRole` instance that you want to assign:
```https Authorization: Required with a valid Bearer token
Method:GET
https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f71680b27 ```
-Next, you'll want to get the Object ID of the user or group from Azure AD that you'll want to assign to the application. Also take the App Role ID from the previous API call and submit it as part of the PATCH body on the Service Principal:
+Next, get the object ID of the user or group from Azure AD that you want to assign to the application. Also take the app role ID from the previous API call and submit it as part of the patch body on the service principal:
```https Authorization: Required with a valid Bearer token
https://graph.microsoft.com/v1.0/servicePrincipals/3161ab85-8f57-4ae0-82d3-7a1f7
} ```
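One common way to sketch this assignment uses the `appRoleAssignedTo` endpoint on the service principal (all IDs are placeholders; verify the exact endpoint shape against the Microsoft Graph reference):

```python
import json

sp_object_id = "<service-principal-object-id>"  # placeholder

request = {
    "method": "POST",
    "url": f"https://graph.microsoft.com/v1.0/servicePrincipals/{sp_object_id}/appRoleAssignedTo",
    "body": json.dumps({
        "principalId": "<user-or-group-object-id>",  # who gets the app
        "resourceId": sp_object_id,                  # the app's service principal
        "appRoleId": "<app-role-id>",                # from the previous GET call
    }),
}
```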
-## Existing partners
+## Partnerships
-Microsoft has existing partnerships with these third-party providers to protect legacy applications while using existing networking and delivery controllers.
+Microsoft has partnerships with these application delivery controller (ADC) providers to help protect legacy applications while using existing networking and delivery controllers.
| **ADC provider** | **Link** |
| --- | --- |
-| Akamai Enterprise Application Access (EAA) | [https://docs.microsoft.com/azure/active-directory/saas-apps/akamai-tutorial](/azure/active-directory/saas-apps/akamai-tutorial) |
-| Citrix Application Delivery Controller (ADC) | [https://docs.microsoft.com/azure/active-directory/saas-apps/citrix-netscaler-tutorial](/azure/active-directory/saas-apps/citrix-netscaler-tutorial) |
-| F5 Big-IP APM | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-integration](/azure/active-directory/manage-apps/f5-aad-integration) |
-| Kemp | [https://docs.microsoft.com/azure/active-directory/saas-apps/kemp-tutorial](/azure/active-directory/saas-apps/kemp-tutorial) |
-| Pulse Secure Virtual Traffic Manager (VTM) | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial](/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial) |
+| Akamai Enterprise Application Access | [https://docs.microsoft.com/azure/active-directory/saas-apps/akamai-tutorial](/azure/active-directory/saas-apps/akamai-tutorial) |
+| Citrix ADC | [https://docs.microsoft.com/azure/active-directory/saas-apps/citrix-netscaler-tutorial](/azure/active-directory/saas-apps/citrix-netscaler-tutorial) |
+| F5 Big-IP Access Policy Manager | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-integration](/azure/active-directory/manage-apps/f5-aad-integration) |
+| Kemp LoadMaster | [https://docs.microsoft.com/azure/active-directory/saas-apps/kemp-tutorial](/azure/active-directory/saas-apps/kemp-tutorial) |
+| Pulse Secure Virtual Traffic Manager | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial](/azure/active-directory/saas-apps/pulse-secure-virtual-traffic-manager-tutorial) |
-The following VPN solution providers connect with Azure AD to enable modern authentication and authorization methods like SSO and multi-factor authentication.
+The following VPN solution providers connect with Azure AD to enable modern authentication and authorization methods like SSO and multifactor authentication.
| **VPN vendor** | **Link** |
| --- | --- |
| Cisco AnyConnect | [https://docs.microsoft.com/azure/active-directory/saas-apps/cisco-anyconnect](/azure/active-directory/saas-apps/cisco-anyconnect) |
-| Fortinet | [https://docs.microsoft.com/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial](/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial) |
-| F5 Big-IP APM | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-password-less-vpn](/azure/active-directory/manage-apps/f5-aad-password-less-vpn) |
-| Palo Alto Networks Global Protect | [https://docs.microsoft.com/azure/active-directory/saas-apps/paloaltoadmin-tutorial](/azure/active-directory/saas-apps/paloaltoadmin-tutorial) |
-| Pulse Secure Pulse Connect Secure (PCS) | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial](/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial) |
+| Fortinet FortiGate | [https://docs.microsoft.com/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial](/azure/active-directory/saas-apps/fortigate-ssl-vpn-tutorial) |
+| F5 Big-IP Access Policy Manager | [https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-password-less-vpn](/azure/active-directory/manage-apps/f5-aad-password-less-vpn) |
+| Palo Alto Networks GlobalProtect | [https://docs.microsoft.com/azure/active-directory/saas-apps/paloaltoadmin-tutorial](/azure/active-directory/saas-apps/paloaltoadmin-tutorial) |
+| Pulse Connect Secure | [https://docs.microsoft.com/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial](/azure/active-directory/saas-apps/pulse-secure-pcs-tutorial) |
-The following SDP solution providers connect with Azure AD to enable modern authentication and authorization methods like SSO and multi-factor authentication.
+The following providers of software-defined perimeter (SDP) solutions connect with Azure AD to enable modern authentication and authorization methods like SSO and multifactor authentication.
| **SDP vendor** | **Link** |
| --- | --- |
-| Datawiza Access Broker | [https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso](/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso) |
| Datawiza Access Broker | [https://docs.microsoft.com/azure/active-directory/manage-apps/datawiza-with-azure-ad](/azure/active-directory/manage-apps/datawiza-with-azure-ad) |
| Perimeter 81 | [https://docs.microsoft.com/azure/active-directory/saas-apps/perimeter-81-tutorial](/azure/active-directory/saas-apps/perimeter-81-tutorial) |
-| Silverfort Authentication Platform | [https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso](/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso) |
-| Strata | [https://docs.microsoft.com/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial](/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial) |
-| Zscaler Private Access (ZPA) | [https://docs.microsoft.com/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial](/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial) |
+| Silverfort Authentication Platform | [https://docs.microsoft.com/azure/active-directory/manage-apps/silverfort-azure-ad-integration](/azure/active-directory/manage-apps/silverfort-azure-ad-integration) |
+| Strata Maverics Identity Orchestrator | [https://docs.microsoft.com/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial](/azure/active-directory/saas-apps/maverics-identity-orchestrator-saml-connector-tutorial) |
+| Zscaler Private Access | [https://docs.microsoft.com/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial](/azure/active-directory/saas-apps/zscalerprivateaccess-tutorial) |
active-directory Troubleshoot Password Based Sso https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/troubleshoot-password-based-sso.md
The following information explains what each notification item means and provide
- **Copy error**: Enables you to select the **copy icon** to the right of the **Copy error** textbox to copy the notification details to help with support. Example:
- ```{"errorCode":"InternalUrl\_Duplicate","localizedErrorDetails":{"errorDetail":"Internal url 'https://google.com/' is invalid since it is already in use"},"operationResults":\[{"objectId":null,"displayName":null,"status":0,"details":"Internal url 'https://bing.com/' is invalid since it is already in use"}\],"timeStampUtc":"2017-03-23T19:50:26.465743Z","clientRequestId":"302fd775-3329-4670-a9f3-bea37004f0bb","internalTransactionId":"ea5b5475-03b9-4f08-8e95-bbb11289ab65","upn":"tperkins@f128.info","tenantId":"7918d4b5-0442-4a97-be2d-36f9f9962ece","userObjectId":"17f84be4-51f8-483a-b533-383791227a99"}```
+
+ `{"errorCode":"InternalUrl\_Duplicate","localizedErrorDetails":{"errorDetail":"Internal url 'https://google.com/' is invalid since it is already in use"},"operationResults":\[{"objectId":null,"displayName":null,"status":0,"details":"Internal url 'https://bing.com/' is invalid since it is already in use"}\],"timeStampUtc":"2017-03-23T19:50:26.465743Z","clientRequestId":"302fd775-3329-4670-a9f3-bea37004f0bb","internalTransactionId":"ea5b5475-03b9-4f08-8e95-bbb11289ab65","upn":"tperkins@f128.info","tenantId":"7918d4b5-0442-4a97-be2d-36f9f9962ece","userObjectId":"17f84be4-51f8-483a-b533-383791227a99"}`
## Next steps

- [Quickstart Series on Application Management](view-applications-portal.md)
-- [Plan a My Apps deployment](my-apps-deployment-plan.md)
+- [Plan a My Apps deployment](my-apps-deployment-plan.md)
active-directory Appneta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/appneta-tutorial.md
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment. - AppNeta Performance Manager supports **SP** initiated SSO- - AppNeta Performance Manager supports **Just In Time** user provisioning > [!NOTE]
Follow these steps to enable Azure AD SSO in the Azure portal.
![Screenshot that shows the default attributes for a SAML token.](./media/appneta-tutorial/edit-attribute.png)
-1. In addition to above, AppNeta Performance Manager application expects few more attributes to be passed back in SAML response, which are shown below. These attributes are also pre populated but you can review them as per your requirement.
+1. In addition to the above, the AppNeta Performance Manager application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also prepopulated, but you can review them as per your requirements.
- | Name | Source Attribute |
+ | Name | Source attribute |
| | - |
| firstName | user.givenname |
| lastName | user.surname |
Follow these steps to enable Azure AD SSO in the Azure portal.
| title | user.jobtitle |
| | |
- > [!NOTE]
- > **groups** refers to the security group in AppNeta Performance Manager that is mapped to a **Role** in Azure AD. For more information, see [App roles UI](../develop/howto-add-app-roles-in-azure-ad-apps.md#app-roles-ui), which explains how to create custom roles in Azure AD. Rather than creating custom roles, most customers add a group claim in the AppNeta enterprise application for security groups with the source attribute group ID. To add a group claim:
-
- 1. Click **Edit** on **User Attributes & Claims**.
-
- 1. Click **Add a group claim** at the top of the page.
-
- ![Screenshot that shows the Attributes & Claims pane with the add a group claim option selected.](./media/appneta-tutorial/add-a-group-claim.png)
-
- 1. Select **Security groups**.
-
- 1. Set **Source attribute** as "Group ID".
-
- 1. Under **Advanced options**, select **Customize the name of the group claim** and enter "groups" in the **Name** field:
-
- ![Screenshot that shows the Group Claims pane with security groups, source attribute, and advanced options selected.](./media/appneta-tutorial/specify-security-groups.png)
-
- 1. Click **Save**. This will send Group Object IDs of users when they sign into AppNeta Performance Manager via SSO. Role mappings should be configured using these object IDs and the relevant user role in AppNeta Performance Manager.
-
- ![Screenshot that shows the details of a group claim, with the object ID selected.](./media/appneta-tutorial/object-id.png)
-
- ![Screenshot that shows the Edit Identity Provider pane, with the security group number selected. ](./media/appneta-tutorial/edit-identity-provider.png)
-
-1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+1. To properly pass along your "groups" SAML assertions, you need to configure app roles and set the value to match the role mappings that are set within AppNeta Performance Manager. Under **Azure Active Directory** > **App registrations** > **All applications**, select **AppNeta Performance Manager**.
+1. Click **App roles** in the left pane.
+1. Click **Create app role**.
+1. On the **Create app role** pane, complete these steps:
+ 1. In **Display name**, enter a name for the role.
+ 1. In **Allowed member types**, select **Users/Groups**.
+ 1. In **Value**, enter the value of the security group set in your AppNeta Performance Manager role mappings.
+ 1. In **Description**, enter a description for the role.
+ 1. Click **Apply**.
+
+1. After you create the roles, you need to map the roles to your users and groups. Go to **Azure Active Directory** > **Enterprise Applications** > **AppNeta Performance Manager** > **Users and groups**.
+1. Select a user or group, and then assign the relevant app role for the user or group.
+1. After you map the app roles, go to **Azure Active Directory** > **Enterprise Applications** > **AppNeta Performance Manager** > **Single sign-on**.
+1. On the **Set up single sign-on with SAML** pane, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
![The Certificate download link](common/metadataxml.png)
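The **Create app role** steps above can also be scripted by updating the application's `appRoles` collection through Microsoft Graph (`PATCH /applications/{object-id}`). A hedged sketch that only builds the request body — the display name and value below are placeholders, and the authenticated Graph call itself is omitted:

```python
import json
import uuid

def make_app_role(display_name: str, value: str, description: str) -> dict:
    """Build one entry for the application's appRoles collection."""
    return {
        "id": str(uuid.uuid4()),         # each app role needs a unique GUID
        "allowedMemberTypes": ["User"],  # "User" covers both users and groups
        "displayName": display_name,
        "value": value,                  # must match the AppNeta role-mapping value
        "description": description,
        "isEnabled": True,
    }

# Placeholder role; replace with the security-group value from your
# AppNeta Performance Manager role mappings.
body = {"appRoles": [make_app_role(
    "AppNeta Admins",
    "appneta-admins",
    "Maps to the admin role in AppNeta Performance Manager")]}

print(json.dumps(body, indent=2))  # request body for PATCH /applications/{object-id}
```

Doing this in the portal (as in the steps above) is equivalent; the script form is useful when you manage many role mappings.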
active-directory Bcinthecloud Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/bcinthecloud-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with BC in the Cloud | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with BC in the Cloud'
description: Learn how to configure single sign-on between Azure Active Directory and BC in the Cloud.
Previously updated : 02/06/2019 Last updated : 09/20/2021
-# Tutorial: Azure Active Directory integration with BC in the Cloud
+# Tutorial: Azure AD SSO integration with BC in the Cloud
-In this tutorial, you learn how to integrate BC in the Cloud with Azure Active Directory (Azure AD).
-Integrating BC in the Cloud with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate BC in the Cloud with Azure Active Directory (Azure AD). When you integrate BC in the Cloud with Azure AD, you can:
-* You can control in Azure AD who has access to BC in the Cloud.
-* You can enable your users to be automatically signed-in to BC in the Cloud (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to BC in the Cloud.
+* Enable your users to be automatically signed-in to BC in the Cloud with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with BC in the Cloud, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* BC in the Cloud single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* BC in the Cloud single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* BC in the Cloud supports **SP** initiated SSO
-
-## Adding BC in the Cloud from the gallery
-
-To configure the integration of BC in the Cloud into Azure AD, you need to add BC in the Cloud from the gallery to your list of managed SaaS apps.
-
-**To add BC in the Cloud from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
+* BC in the Cloud supports **SP** initiated SSO.
-4. In the search box, type **BC in the Cloud**, select **BC in the Cloud** from result panel then click **Add** button to add the application.
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
- ![BC in the Cloud in the results list](common/search-new-app.png)
+## Add BC in the Cloud from the gallery
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with BC in the Cloud based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in BC in the Cloud needs to be established.
-
-To configure and test Azure AD single sign-on with BC in the Cloud, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure BC in the Cloud Single Sign-On](#configure-bc-in-the-cloud-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create BC in the Cloud test user](#create-bc-in-the-cloud-test-user)** - to have a counterpart of Britta Simon in BC in the Cloud that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of BC in the Cloud into Azure AD, you need to add BC in the Cloud from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **BC in the Cloud** in the search box.
+1. Select **BC in the Cloud** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for BC in the Cloud
-To configure Azure AD single sign-on with BC in the Cloud, perform the following steps:
+Configure and test Azure AD SSO with BC in the Cloud using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in BC in the Cloud.
-1. In the [Azure portal](https://portal.azure.com/), on the **BC in the Cloud** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with BC in the Cloud, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure BC in the Cloud SSO](#configure-bc-in-the-cloud-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create BC in the Cloud test user](#create-bc-in-the-cloud-test-user)** - to have a counterpart of B.Simon in BC in the Cloud that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **BC in the Cloud** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![BC in the Cloud Domain and URLs single sign-on information](common/sp-identifier.png)
+ a. In the **Identifier (Entity ID)** text box, type the URL:
+ `https://app.bcinthecloud.com`
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
`https://app.bcinthecloud.com/router/loginSaml/<customerid>`
- b. In the **Identifier (Entity ID)** text box, type the URL:
- `https://app.bcinthecloud.com`
- > [!NOTE] > This value is not real. Update this value with the actual Sign-On URL. Contact [BC in the Cloud Client support team](https://www.bcinthecloud.com/supportcenter/) to get this value.
To configure Azure AD single sign-on with BC in the Cloud, perform the following
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure BC in the Cloud Single Sign-On
-
-To configure single sign-on on **BC in the Cloud** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [BC in the Cloud support team](https://www.bcinthecloud.com/supportcenter/). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
+In this section, you'll create a test user in the Azure portal called B.Simon.
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to BC in the Cloud.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **BC in the Cloud**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **BC in the Cloud**.
-
- ![The BC in the Cloud link in the Applications list](common/all-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to BC in the Cloud.
-3. In the menu on the left, select **Users and groups**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **BC in the Cloud**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![The "Users and groups" link](common/users-groups-blade.png)
+## Configure BC in the Cloud SSO
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on **BC in the Cloud** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [BC in the Cloud support team](https://www.bcinthecloud.com/supportcenter/). They set this setting to have the SAML SSO connection set properly on both sides.
### Create BC in the Cloud test user

In this section, you create a user called Britta Simon in BC in the Cloud. Work with [BC in the Cloud support team](https://www.bcinthecloud.com/supportcenter/) to add the users in the BC in the Cloud platform. Users must be created and activated before you use single sign-on.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the BC in the Cloud tile in the Access Panel, you should be automatically signed in to the BC in the Cloud for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click on **Test this application** in the Azure portal. This will redirect to the BC in the Cloud Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the BC in the Cloud Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the BC in the Cloud tile in My Apps, this will redirect to the BC in the Cloud Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure BC in the Cloud, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Borrowbox Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/borrowbox-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with BorrowBox | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with BorrowBox'
description: Learn how to configure single sign-on between Azure Active Directory and BorrowBox.
Previously updated : 02/21/2019 Last updated : 09/20/2021
-# Tutorial: Azure Active Directory integration with BorrowBox
+# Tutorial: Azure AD SSO integration with BorrowBox
-In this tutorial, you learn how to integrate BorrowBox with Azure Active Directory (Azure AD).
-Integrating BorrowBox with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate BorrowBox with Azure Active Directory (Azure AD). When you integrate BorrowBox with Azure AD, you can:
-* You can control in Azure AD who has access to BorrowBox.
-* You can enable your users to be automatically signed-in to BorrowBox (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to BorrowBox.
+* Enable your users to be automatically signed-in to BorrowBox with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with BorrowBox, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* BorrowBox single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* BorrowBox single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* BorrowBox supports **SP and IDP** initiated SSO
-* BorrowBox supports **Just In Time** user provisioning
+* BorrowBox supports **SP and IDP** initiated SSO.
+* BorrowBox supports **Just In Time** user provisioning.
-## Adding BorrowBox from the gallery
+## Add BorrowBox from the gallery
To configure the integration of BorrowBox into Azure AD, you need to add BorrowBox from the gallery to your list of managed SaaS apps.
-**To add BorrowBox from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **BorrowBox**, select **BorrowBox** from result panel then click **Add** button to add the application.
-
- ![BorrowBox in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with BorrowBox based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in BorrowBox needs to be established.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **BorrowBox** in the search box.
+1. Select **BorrowBox** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-To configure and test Azure AD single sign-on with BorrowBox, you need to complete the following building blocks:
+## Configure and test Azure AD SSO for BorrowBox
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure BorrowBox Single Sign-On](#configure-borrowbox-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create BorrowBox test user](#create-borrowbox-test-user)** - to have a counterpart of Britta Simon in BorrowBox that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+Configure and test Azure AD SSO with BorrowBox using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in BorrowBox.
-### Configure Azure AD single sign-on
+To configure and test Azure AD SSO with BorrowBox, perform the following steps:
-In this section, you enable Azure AD single sign-on in the Azure portal.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure BorrowBox SSO](#configure-borrowbox-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create BorrowBox test user](#create-borrowbox-test-user)** - to have a counterpart of B.Simon in BorrowBox that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-To configure Azure AD single sign-on with BorrowBox, perform the following steps:
+## Configure Azure AD SSO
-1. In the [Azure portal](https://portal.azure.com/), on the **BorrowBox** application integration page, select **Single sign-on**.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Configure single sign-on link](common/select-sso.png)
+1. In the Azure portal, on the **BorrowBox** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, the user does not have to perform any step as the app is already pre-integrated with Azure.
- ![[Screenshot shows the Basic SAML Configuration.] Domain and URLs single sign-on information](common/preintegrated.png)
- 5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![[Screenshot shows Set additional U R Ls where you can enter a Sign on U R L.] Domain and URLs single sign-on information](common/metadata-upload-additional-signon.png)
- In the **Sign-on URL** text box, type a URL using the following pattern: `https://fe.bolindadigital.com/wldcs_bol_fo/b2i/mainPage.html?b2bSite=<ID>`
To configure Azure AD single sign-on with BorrowBox, perform the following steps
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure BorrowBox Single Sign-On
-
-To configure single sign-on on **BorrowBox** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [BorrowBox support team](mailto:borrowbox@bolinda.com). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
+In this section, you'll create a test user in the Azure portal called B.Simon.
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to BorrowBox.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **BorrowBox**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to BorrowBox.
-2. In the applications list, select **BorrowBox**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **BorrowBox**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![The BorrowBox link in the Applications list](common/all-applications.png)
+## Configure BorrowBox SSO
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on the **BorrowBox** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [BorrowBox support team](mailto:borrowbox@bolinda.com). They use these values to configure the SAML SSO connection properly on both sides.
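Before sending the file, you can sanity-check that the downloaded metadata actually contains an entity ID and a signing certificate. A minimal sketch (the `summarize_metadata` helper and the inline sample are illustrative, not part of the tutorial or of any BorrowBox API):

```python
import xml.etree.ElementTree as ET

# Standard SAML metadata and XML signature namespaces.
MD = "urn:oasis:names:tc:SAML:2.0:metadata"
DS = "http://www.w3.org/2000/09/xmldsig#"

def summarize_metadata(xml_text: str) -> dict:
    """Return the entityID and the number of signing certificates
    found in a SAML federation metadata document."""
    root = ET.fromstring(xml_text)
    certs = root.findall(
        f".//{{{MD}}}KeyDescriptor[@use='signing']/"
        f"{{{DS}}}KeyInfo/{{{DS}}}X509Data/{{{DS}}}X509Certificate"
    )
    return {"entity_id": root.get("entityID"), "signing_certs": len(certs)}

# Tiny inline sample standing in for the downloaded Federation Metadata XML.
sample = (
    f'<EntityDescriptor xmlns="{MD}" entityID="https://sts.windows.net/contoso/">'
    f'<IDPSSODescriptor><KeyDescriptor use="signing">'
    f'<KeyInfo xmlns="{DS}"><X509Data><X509Certificate>MIIC...</X509Certificate>'
    f'</X509Data></KeyInfo></KeyDescriptor></IDPSSODescriptor></EntityDescriptor>'
)
print(summarize_metadata(sample))
```

If the summary shows no signing certificate, re-download the metadata from the Azure portal before contacting support.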
### Create BorrowBox test user
In this section, a user called Britta Simon is created in BorrowBox. BorrowBox s
> [!Note] > If you need to create a user manually, contact [BorrowBox support team](mailto:borrowbox@bolinda.com).
-### Test single sign-on
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the BorrowBox Sign-on URL where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the BorrowBox Sign-on URL directly and initiate the login flow from there.
-When you click the BorrowBox tile in the Access Panel, you should be automatically signed in to the BorrowBox for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the BorrowBox instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the BorrowBox tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the BorrowBox instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure BorrowBox you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Check Point Remote Access Vpn Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/check-point-remote-access-vpn-tutorial.md
There are two options:
1. If you do not want to use an on-premises Active Directory (LDAP), select only External User Profiles and click OK. 2. If you do want to use an on-premises Active Directory (LDAP), select only LDAP users and in the LDAP Lookup Type select email. Then click OK.
- ![screenshot to manual configuration.](./media/check-point-remote-access-vpn-tutorial/manual-configuration.png)
+ ![Screenshot of manual configuration.](./media/check-point-remote-access-vpn-tutorial/manual-configuration.png)
1. Configure the required settings in the management database:
There are two options:
By default, the Windows client uses its embedded browser and the macOS client uses Safari to authenticate on the Identity Provider's portal. To change this behavior for the Windows client to use Internet Explorer instead:
- 1. On the client machine, open a plain-text editor as an Administrator.
- 2. Open the trac.defaults file in the text editor.
- * On 32-bit Windows:
-``%ProgramFiles%\CheckPoint\Endpoint Connect\trac.defaults``
- * On 64-bit Windows:
-``%ProgramFiles(x86)%\CheckPoint\Endpoint Connect\trac.defaults``
- 3. Change the idp_browser_mode attribute value from ΓÇ£embeddedΓÇ¥ to ΓÇ£IEΓÇ¥:
- 4. Save the file.
- 5. Restart the Check Point Endpoint Security VPN client service.
-Open the Windows Command Prompt as an Administrator and run these commands:
+ 1. On the client machine, open a plain-text editor as an Administrator.
- `# net stop TracSrvWrapper `
+ 2. Open the `trac.defaults` file in the text editor.
- `# net start TracSrvWrapper`
-
+ - On 32-bit Windows:
+
+ `%ProgramFiles%\CheckPoint\Endpoint Connect\trac.defaults`
+
+ - On 64-bit Windows:
+
+ `%ProgramFiles(x86)%\CheckPoint\Endpoint Connect\trac.defaults`
+
+ 3. Change the `idp_browser_mode` attribute value from `embedded` to `IE`.
+
+ 4. Save the file.
+
+ 5. Restart the Check Point Endpoint Security VPN client service.
+
+ Open the Windows Command Prompt as an Administrator and run these commands:
+
+ `# net stop TracSrvWrapper`
+
+ `# net start TracSrvWrapper`
1. Start authentication with browser running in background:
- 1. On the client machine, open a plain-text editor as an Administrator.
- 2. Open the trac.defaults file in the text editor.
- * On 32-bit Windows: `%ProgramFiles%\CheckPoint\Endpoint Connect\trac.defaults`
- * On 64-bit Windows: `%ProgramFiles(x86)%\CheckPoint\Endpoint Connect\trac.defaults`
+ 1. On the client machine, open a plain-text editor as an Administrator.
- * On macOS: `/Library/Application Support/Checkpoint/Endpoint Security/Endpoint Connect/Trac.defaults`
+ 2. Open the `trac.defaults` file in the text editor.
- 3. Change the value of **idp_show_browser_primary_auth_flow** to **false**
- 4. Save the file.
- 5. Restart the Check Point Endpoint Security VPN client service
- * On Windows clients
-Open the Windows Command Prompt as an Administrator and run these commands:
+ - On 32-bit Windows:
- `# net stop TracSrvWrapper`
-
- `# net start TracSrvWrapper`
+ `%ProgramFiles%\CheckPoint\Endpoint Connect\trac.defaults`
- * On macOS clients
+ - On 64-bit Windows:
- `sudo launchctl stop com.checkpoint.epc.service`
+ `%ProgramFiles(x86)%\CheckPoint\Endpoint Connect\trac.defaults`
- `sudo launchctl start com.checkpoint.epc.service`
+ - On macOS:
+
+ `/Library/Application Support/Checkpoint/Endpoint Security/Endpoint Connect/trac.defaults`
+ 3. Change the value of `idp_show_browser_primary_auth_flow` to `false`.
+
+ 4. Save the file.
+
+ 5. Restart the Check Point Endpoint Security VPN client service.
+ - On Windows clients, open the Windows Command Prompt as an Administrator and run these commands:
+
+ `# net stop TracSrvWrapper`
+
+ `# net start TracSrvWrapper`
+
+ - On macOS clients, run:
+
+ `sudo launchctl stop com.checkpoint.epc.service`
+
+ `sudo launchctl start com.checkpoint.epc.service`
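The `trac.defaults` edits above can also be scripted when you need to roll the change out to many client machines. A minimal sketch, assuming one attribute definition per line (the `set_trac_attribute` helper and the sample lines are illustrative, not Check Point's documented file format):

```python
def set_trac_attribute(text: str, name: str, old: str, new: str) -> str:
    """On the line that defines `name`, replace the value `old` with `new`,
    leaving every other line untouched."""
    out = []
    for line in text.splitlines():
        if line.lstrip().startswith(name):
            line = line.replace(old, new)
        out.append(line)
    return "\n".join(out)

# Illustrative lines only -- the real trac.defaults may carry extra columns.
sample = (
    "idp_browser_mode STRING embedded GW_USER\n"
    "idp_show_browser_primary_auth_flow BOOL true GW_USER"
)
updated = set_trac_attribute(
    sample, "idp_show_browser_primary_auth_flow", "true", "false"
)
print(updated)
```

After writing the updated file back, restart the Check Point Endpoint Security VPN client service as shown above for the change to take effect.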
### Create Check Point Remote Secure Access VPN test user
In this section, you create a user called Britta Simon in Check Point Remote Sec
## Test SSO
-1. Open the VPN client and click **Connect to…**.
+1. Open the VPN client and click **Connect to...**.
![screenshot for Connect to.](./media/check-point-remote-access-vpn-tutorial/connect.png)
In this section, you create a user called Britta Simon in Check Point Remote Sec
## Next steps Once you configure Check Point Remote Secure Access VPN you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
active-directory Clever Nelly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/clever-nelly-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Clever Nelly | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Clever Nelly'
description: Learn how to configure single sign-on between Azure Active Directory and Clever Nelly.
Previously updated : 06/27/2019 Last updated : 09/20/2021
-# Tutorial: Integrate Clever Nelly with Azure Active Directory
+# Tutorial: Azure AD SSO integration with Clever Nelly
In this tutorial, you'll learn how to integrate Clever Nelly with Azure Active Directory (Azure AD). When you integrate Clever Nelly with Azure AD, you can:
In this tutorial, you'll learn how to integrate Clever Nelly with Azure Active D
* Enable your users to be automatically signed-in to Clever Nelly with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items:
-* An Azure AD subscription. If you don't have a subscription, you can get one-month free trial [here](https://azure.microsoft.com/pricing/free-trial/).
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
* Clever Nelly single sign-on (SSO) enabled subscription. ## Scenario description
-In this tutorial, you configure and test Azure AD SSO in a test environment. Clever Nelly supports **SP and IDP** initiated SSO.
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Clever Nelly supports **SP and IDP** initiated SSO.
+
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding Clever Nelly from the gallery
+## Add Clever Nelly from the gallery
To configure the integration of Clever Nelly into Azure AD, you need to add Clever Nelly from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**. 1. In the **Add from the gallery** section, type **Clever Nelly** in the search box. 1. Select **Clever Nelly** from results panel and then add the app. Wait a few seconds while the app is added to your tenant. -
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for Clever Nelly
Configure and test Azure AD SSO with Clever Nelly using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Clever Nelly.
-To configure and test Azure AD SSO with Clever Nelly, complete the following building blocks:
+To configure and test Azure AD SSO with Clever Nelly, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure Clever Nelly SSO](#configure-clever-nelly-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Clever Nelly test user](#create-clever-nelly-test-user)** - to have a counterpart of Britta Simon in Clever Nelly that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Clever Nelly SSO](#configure-clever-nelly-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Clever Nelly test user](#create-clever-nelly-test-user)** - to have a counterpart of B.Simon in Clever Nelly that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Clever Nelly** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Clever Nelly** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- a. In the **Identifier** text box, type a URL:
+ a. In the **Identifier** text box, type one of the following URLs:
| Environment | URL Pattern | | - | - |
Follow these steps to enable Azure AD SSO in the Azure portal.
| Production | `https://secure.elephantsdontforget.com/plato` | | | |
- b. In the **Reply URL** text box, type a URL:
+ b. In the **Reply URL** text box, type one of the following URLs:
| Environment | URL Pattern | | - | - |
Follow these steps to enable Azure AD SSO in the Azure portal.
1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- In the **Sign-on URL** text box, type a URL:
+ In the **Sign-on URL** text box, type one of the following URLs:
| Environment | URL Pattern | | - | - |
Follow these steps to enable Azure AD SSO in the Azure portal.
| Production | `https://secure.elephantsdontforget.com/plato/sso/microsoft/index.xhtml` | | | |
- > [!NOTE]
- > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Clever Nelly Client support team](mailto:support@elephantsdontforget.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer. ![The Certificate download link](common/copy-metadataurl.png)
-### Configure Clever Nelly SSO
-
-To configure single sign-on on **Clever Nelly** side, you need to send the **App Federation Metadata Url** to [Clever Nelly support team](mailto:support@elephantsdontforget.com). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **Clever Nelly**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen. 1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen. 1. In the **Add Assignment** dialog, click the **Assign** button.
+## Configure Clever Nelly SSO
+
+To configure single sign-on on the **Clever Nelly** side, you need to send the **App Federation Metadata Url** to the [Clever Nelly support team](mailto:support@elephantsdontforget.com). They use this value to configure the SAML SSO connection properly on both sides.
+ ### Create Clever Nelly test user In this section, you create a user called Britta Simon in Clever Nelly. Work with [Clever Nelly support team](mailto:support@elephantsdontforget.com) to add the users in the Clever Nelly platform. Users must be created and activated before you use single sign-on.
-### Test SSO
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Clever Nelly Sign-on URL where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the Clever Nelly Sign-on URL directly and initiate the login flow from there.
-When you click the Clever Nelly tile in the Access Panel, you should be automatically signed in to the Clever Nelly for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the Clever Nelly instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Clever Nelly tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the Clever Nelly instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Clever Nelly you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Clicktime Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/clicktime-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with ClickTime | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with ClickTime'
description: Learn how to configure single sign-on between Azure Active Directory and ClickTime.
Previously updated : 01/21/2019 Last updated : 09/21/2021
-# Tutorial: Azure Active Directory integration with ClickTime
+# Tutorial: Azure AD SSO integration with ClickTime
-In this tutorial, you learn how to integrate ClickTime with Azure Active Directory (Azure AD).
-Integrating ClickTime with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate ClickTime with Azure Active Directory (Azure AD). When you integrate ClickTime with Azure AD, you can:
-* You can control in Azure AD who has access to ClickTime.
-* You can enable your users to be automatically signed-in to ClickTime (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to ClickTime.
+* Enable your users to be automatically signed-in to ClickTime with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with ClickTime, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* ClickTime single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* ClickTime single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* ClickTime supports **IDP** initiated SSO
-
-## Adding ClickTime from the gallery
-
-To configure the integration of ClickTime into Azure AD, you need to add ClickTime from the gallery to your list of managed SaaS apps.
-
-**To add ClickTime from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **ClickTime**, select **ClickTime** from result panel then click **Add** button to add the application.
-
- ![ClickTime in the results list](common/search-new-app.png)
+* ClickTime supports **IDP** initiated SSO.
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with ClickTime based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in ClickTime needs to be established.
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-To configure and test Azure AD single sign-on with ClickTime, you need to complete the following building blocks:
+## Add ClickTime from the gallery
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure ClickTime Single Sign-On](#configure-clicktime-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create ClickTime test user](#create-clicktime-test-user)** - to have a counterpart of Britta Simon in ClickTime that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of ClickTime into Azure AD, you need to add ClickTime from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **ClickTime** in the search box.
+1. Select **ClickTime** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for ClickTime
-To configure Azure AD single sign-on with ClickTime, perform the following steps:
+Configure and test Azure AD SSO with ClickTime using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in ClickTime.
-1. In the [Azure portal](https://portal.azure.com/), on the **ClickTime** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with ClickTime, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure ClickTime SSO](#configure-clicktime-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create ClickTime test user](#create-clicktime-test-user)** - to have a counterpart of B.Simon in ClickTime that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **ClickTime** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Set up Single Sign-On with SAML** page, perform the following steps:
- ![ClickTime Domain and URLs single sign-on information](common/idp-intiated.png)
-
- a. In the **Identifier** text box, type a URL:
+ a. In the **Identifier** text box, type the URL:
`https://app.clicktime.com/sp/`
- b. In the **Reply URL** text box, type a URL using the following pattern:
+ b. In the **Reply URL** text box, type one of the following URLs:
- ```http
- https://app.clicktime.com/Login/
- https://app.clicktime.com/App/Login/Consume.aspx
- ```
+ | **Reply URL** |
+ |-|
+ | `https://app.clicktime.com/Login/` |
+ | `https://app.clicktime.com/App/Login/Consume.aspx` |
4. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with ClickTime, perform the following steps
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure Ad Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to ClickTime.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **ClickTime**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure ClickTime Single Sign-On
+## Configure ClickTime SSO
1. In a different web browser window, log into your ClickTime company site as an administrator.
To configure Azure AD single sign-on with ClickTime, perform the following steps
1. In the **Single Sign-On Preferences** configuration section, perform the following steps:
- ![Security Settings](./media/clicktime-tutorial/tic777280.png "Security Settings")
+ ![Security Settings](./media/clicktime-tutorial/toolbar.png "Security Settings")
a. Select **Allow** sign-in using Single Sign-On (SSO) with **Azure AD**.
To configure Azure AD single sign-on with ClickTime, perform the following steps
d. Click **Save**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to ClickTime.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **ClickTime**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **ClickTime**.
-
- ![The ClickTime link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
-
### Create ClickTime test user

In order to enable Azure AD users to log into ClickTime, they must be provisioned into ClickTime.
In the case of ClickTime, provisioning is a manual task.
1. In the toolbar on the top, click **Company**, and then click **People**.
- ![Screenshot shows the ClickTime tenant with Company and People selected.](./media/clicktime-tutorial/tic777282.png "People")
+ ![Screenshot shows the ClickTime tenant with Company and People selected.](./media/clicktime-tutorial/account.png "People")
1. Click **Add Person**.
- ![Add Person](./media/clicktime-tutorial/tic777283.png "Add Person")
+ ![Add Person](./media/clicktime-tutorial/company.png "Add Person")
1. In the New Person section, perform the following steps:
- ![Screenshot shows the Add Person section where you can add the information in this step.](./media/clicktime-tutorial/tic777284.png "People")
+ ![Screenshot shows the Add Person section where you can add the information in this step.](./media/clicktime-tutorial/information.png "New Person")
   a. In the **full name** textbox, type the full name of the user, like **Britta Simon**.
In the case of ClickTime, provisioning is a manual task.
c. Click **Save**.
-### Test single sign-on
-
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+## Test SSO
-When you click the ClickTime tile in the Access Panel, you should be automatically signed in to the ClickTime for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional resources
+* Click on **Test this application** in the Azure portal. You should be automatically signed in to the ClickTime for which you set up the SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the ClickTime tile in My Apps, you should be automatically signed in to the ClickTime for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure ClickTime you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Cognidox Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/cognidox-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Cognidox | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Cognidox'
description: Learn how to configure single sign-on between Azure Active Directory and Cognidox.
Previously updated : 07/18/2019 Last updated : 09/21/2021
-# Tutorial: Integrate Cognidox with Azure Active Directory
+# Tutorial: Azure AD SSO integration with Cognidox
In this tutorial, you'll learn how to integrate Cognidox with Azure Active Directory (Azure AD). When you integrate Cognidox with Azure AD, you can:
In this tutorial, you'll learn how to integrate Cognidox with Azure Active Direc
* Enable your users to be automatically signed-in to Cognidox with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Cognidox supports **SP and IDP** initiated SSO
-* Cognidox supports **Just In Time** user provisioning
+* Cognidox supports **SP and IDP** initiated SSO.
+* Cognidox supports **Just In Time** user provisioning.
-## Adding Cognidox from the gallery
+## Add Cognidox from the gallery
To configure the integration of Cognidox into Azure AD, you need to add Cognidox from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **Cognidox** in the search box.
1. Select **Cognidox** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for Cognidox
Configure and test Azure AD SSO with Cognidox using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Cognidox.
-To configure and test Azure AD SSO with Cognidox, complete the following building blocks:
+To configure and test Azure AD SSO with Cognidox, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure Cognidox SSO](#configure-cognidox-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
-5. **[Create Cognidox test user](#create-cognidox-test-user)** - to have a counterpart of B.Simon in Cognidox that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Cognidox SSO](#configure-cognidox-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Cognidox test user](#create-cognidox-test-user)** - to have a counterpart of B.Simon in Cognidox that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Cognidox** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Cognidox** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- a. In the **Identifier** text box, type a URL using the following pattern:
+ a. In the **Identifier** text box, type a value using the following pattern:
   `urn:net.cdox.<YOURCOMPANY>`

   b. In the **Reply URL** text box, type a URL using the following pattern:
Follow these steps to enable Azure AD SSO in the Azure portal.
6. In addition to the above, the Cognidox application expects a few more attributes to be passed back in the SAML response. In the **User Claims** section on the **User Attributes** dialog, perform the following steps to add the SAML token attributes shown in the table below:

   | Name | Namespace | Transformation | Parameter 1 |
- | | | |
+ | -- | -- | -- | -- |
   | wanshort | http:\//appinux.com/windowsaccountname2 | ExtractMailPrefix() | user.userprincipalname |
-
   a. Click **Add new claim** to open the **Manage user claims** dialog.

   b. In the **Name** textbox, type the attribute name shown for that row.
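The `ExtractMailPrefix()` transformation in the table keeps the part of the user principal name that comes before the `@` sign. The equivalent logic can be sketched in a few lines of Python (the function name here is illustrative, not part of Azure AD):

```python
def extract_mail_prefix(upn: str) -> str:
    # Equivalent of the ExtractMailPrefix() claim transformation:
    # keep everything before the first "@" of the user principal name.
    return upn.split("@", 1)[0]

# The "wanshort" claim for brittasimon@contoso.com would carry "brittasimon".
print(extract_mail_prefix("brittasimon@contoso.com"))
```
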
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
-### Configure Cognidox SSO
-
-To configure single sign-on on **Cognidox** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Cognidox support team](mailto:support@cognidox.com). They set this setting to have the SAML SSO connection set properly on both sides.
-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Cognidox**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
+## Configure Cognidox SSO
+
+To configure single sign-on on **Cognidox** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Cognidox support team](mailto:support@cognidox.com). They set this setting to have the SAML SSO connection set properly on both sides.
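The Federation Metadata XML you send to the support team is a standard SAML 2.0 metadata document. As a quick sanity check before sending it, you can confirm the file parses and read its `entityID`, which identifies your Azure AD tenant as the identity provider. A stdlib-only sketch, assuming a heavily trimmed illustrative sample in place of the real downloaded file:

```python
import xml.etree.ElementTree as ET

# Trimmed, illustrative stand-in for the Federation Metadata XML downloaded
# from the Azure portal; real metadata also carries the signing certificate
# and SAML endpoint bindings.
SAMPLE_METADATA = (
    '<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" '
    'entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/">'
    '</EntityDescriptor>'
)

def entity_id(metadata_xml: str) -> str:
    # entityID is a plain (un-namespaced) attribute on the root element.
    return ET.fromstring(metadata_xml).attrib["entityID"]

print(entity_id(SAMPLE_METADATA))
```
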
+
### Create Cognidox test user

In this section, a user called B.Simon is created in Cognidox. Cognidox supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Cognidox, a new one is created after authentication.
-### Test SSO
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect you to the Cognidox Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the Cognidox Sign-on URL directly and initiate the login flow from there.
-When you click the Cognidox tile in the Access Panel, you should be automatically signed in to the Cognidox for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal. You should be automatically signed in to the Cognidox for which you set up the SSO.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Cognidox tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you should be automatically signed in to the Cognidox for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Cognidox you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Edubrite Lms Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/edubrite-lms-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with EduBrite LMS | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with EduBrite LMS'
description: Learn how to configure single sign-on between Azure Active Directory and EduBrite LMS.
Previously updated : 04/03/2019 Last updated : 09/21/2021
-# Tutorial: Azure Active Directory integration with EduBrite LMS
+# Tutorial: Azure AD SSO integration with EduBrite LMS
-In this tutorial, you learn how to integrate EduBrite LMS with Azure Active Directory (Azure AD).
-Integrating EduBrite LMS with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate EduBrite LMS with Azure Active Directory (Azure AD). When you integrate EduBrite LMS with Azure AD, you can:
-* You can control in Azure AD who has access to EduBrite LMS.
-* You can enable your users to be automatically signed-in to EduBrite LMS (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to EduBrite LMS.
+* Enable your users to be automatically signed-in to EduBrite LMS with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites

To configure Azure AD integration with EduBrite LMS, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/)
-* EduBrite LMS single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/).
+* EduBrite LMS single sign-on enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* EduBrite LMS supports **SP and IDP** initiated SSO
+* EduBrite LMS supports **SP and IDP** initiated SSO.
-* EduBrite LMS supports **Just In Time** user provisioning
+* EduBrite LMS supports **Just In Time** user provisioning.
-## Adding EduBrite LMS from the gallery
+## Add EduBrite LMS from the gallery
To configure the integration of EduBrite LMS into Azure AD, you need to add EduBrite LMS from the gallery to your list of managed SaaS apps.
-**To add EduBrite LMS from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **EduBrite LMS**, select **EduBrite LMS** from result panel then click **Add** button to add the application.
-
- ![EduBrite LMS in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **EduBrite LMS** in the search box.
+1. Select **EduBrite LMS** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with EduBrite LMS based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in EduBrite LMS needs to be established.
+## Configure and test Azure AD SSO for EduBrite LMS
-To configure and test Azure AD single sign-on with EduBrite LMS, you need to complete the following building blocks:
+Configure and test Azure AD SSO with EduBrite LMS using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in EduBrite LMS.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure EduBrite LMS Single Sign-On](#configure-edubrite-lms-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create EduBrite LMS test user](#create-edubrite-lms-test-user)** - to have a counterpart of Britta Simon in EduBrite LMS that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with EduBrite LMS, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure EduBrite LMS SSO](#configure-edubrite-lms-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create EduBrite LMS test user](#create-edubrite-lms-test-user)** - to have a counterpart of B.Simon in EduBrite LMS that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with EduBrite LMS, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **EduBrite LMS** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **EduBrite LMS** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
-
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- ![Screenshot that shows the "Basic S A M L Configuration" with the "Identifier", "Reply U R L", and "Save" button highlighted.](common/idp-intiated.png)
-
   a. In the **Identifier** text box, type a URL using the following pattern:

   `https://<customer-specific>.edubrite.com`
To configure Azure AD single sign-on with EduBrite LMS, perform the following st
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![EduBrite LMS Domain and URLs single sign-on information](common/metadata-upload-additional-signon.png)
-
   In the **Sign-on URL** text box, type a URL using the following pattern:

   `https://<customer-specific>.edubrite.com/oltpublish/site/samlLoginResponse.do`
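Both the Identifier and the Sign-on URL above are derived from the customer-specific subdomain that EduBrite assigns. A small sketch that builds both values from that one input (the helper name and the `contoso` subdomain are illustrative):

```python
def edubrite_urls(customer: str) -> dict:
    # "customer" stands in for the customer-specific subdomain
    # assigned by EduBrite.
    base = f"https://{customer}.edubrite.com"
    return {
        "identifier": base,  # IDP-initiated mode: Identifier pattern
        "sign_on_url": f"{base}/oltpublish/site/samlLoginResponse.do",  # SP-initiated mode
    }

print(edubrite_urls("contoso")["sign_on_url"])
```
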
To configure Azure AD single sign-on with EduBrite LMS, perform the following st
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
-### Configure EduBrite LMS Single Sign-On
-
-To configure single sign-on on **EduBrite LMS** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [EduBrite LMS support team](mailto:support@edubrite.com). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type `brittasimon@yourcompanydomain.extension`. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to EduBrite LMS.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to EduBrite LMS.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **EduBrite LMS**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **EduBrite LMS**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure EduBrite LMS SSO
-2. In the applications list, select **EduBrite LMS**.
-
- ![The EduBrite LMS link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
+To configure single sign-on on **EduBrite LMS** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [EduBrite LMS support team](mailto:support@edubrite.com). They set this setting to have the SAML SSO connection set properly on both sides.
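The **Certificate (Base64)** file you send is a PEM-armored Base64 encoding of the certificate's DER bytes. Before mailing it, you can verify the file isn't truncated by checking that its body decodes cleanly. A stdlib-only sketch (the tiny sample payload is illustrative, not a real certificate):

```python
import base64

def certificate_der_bytes(pem_text: str) -> bytes:
    # Drop the BEGIN/END armor lines and Base64-decode the body.
    # A decode error here usually means the file was truncated or edited.
    body = "".join(
        line for line in pem_text.splitlines()
        if line and not line.startswith("-----")
    )
    return base64.b64decode(body)

# Illustrative stand-in for a downloaded Certificate (Base64) file.
sample = "-----BEGIN CERTIFICATE-----\nAAEC\n-----END CERTIFICATE-----"
print(len(certificate_der_bytes(sample)))
```
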
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
+### Create EduBrite LMS test user
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
+In this section, a user called Britta Simon is created in EduBrite LMS. EduBrite LMS supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in EduBrite LMS, a new one is created after authentication.
-7. In the **Add Assignment** dialog click the **Assign** button.
+## Test SSO
-### Create EduBrite LMS test user
+In this section, you test your Azure AD single sign-on configuration with the following options.
-In this section, a user called Britta Simon is created in EduBrite LMS. EduBrite LMS supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in EduBrite LMS, a new one is created after authentication.
+#### SP initiated:
-### Test single sign-on
+* Click on **Test this application** in the Azure portal. This will redirect you to the EduBrite LMS Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the EduBrite LMS Sign-on URL directly and initiate the login flow from there.
-When you click the EduBrite LMS tile in the Access Panel, you should be automatically signed in to the EduBrite LMS for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal. You should be automatically signed in to the EduBrite LMS for which you set up the SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the EduBrite LMS tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you should be automatically signed in to the EduBrite LMS for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure EduBrite LMS you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Expensein Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/expensein-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with ExpenseIn | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with ExpenseIn'
description: Learn how to configure single sign-on between Azure Active Directory and ExpenseIn.
Previously updated : 07/17/2020 Last updated : 09/21/2021
-# Tutorial: Integrate ExpenseIn with Azure Active Directory
+# Tutorial: Azure AD SSO integration with ExpenseIn
In this tutorial, you'll learn how to integrate ExpenseIn with Azure Active Directory (Azure AD). When you integrate ExpenseIn with Azure AD, you can:
In this tutorial, you'll learn how to integrate ExpenseIn with Azure Active Dire
* Enable your users to be automatically signed-in to ExpenseIn with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites
To get started, you need the following items:
## Scenario description

In this tutorial, you configure and test Azure AD SSO in a test environment.
-* ExpenseIn supports **SP and IDP** initiated SSO.
-* Once you configure ExpenseIn you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* ExpenseIn supports **SP and IDP** initiated SSO.
-## Adding ExpenseIn from the gallery
+## Add ExpenseIn from the gallery
To configure the integration of ExpenseIn into Azure AD, you need to add ExpenseIn from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
To configure the integration of ExpenseIn into Azure AD, you need to add Expense
Configure and test Azure AD SSO with ExpenseIn using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in ExpenseIn.
-To configure and test Azure AD SSO with ExpenseIn, complete the following building blocks:
+To configure and test Azure AD SSO with ExpenseIn, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with ExpenseIn, complete the following buildi
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **ExpenseIn** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **ExpenseIn** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- In the **Sign-on URL** text box, type a URL:
+ In the **Sign-on URL** text box, type the URL:
    `https://app.expensein.com/saml`

1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and click **Download** to download the **Certificate (Base64)** and save it on your computer.
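If you later need to confirm which signing certificate the App Federation Metadata Url serves, the metadata document is plain SAML XML and can be inspected with standard tooling. A minimal sketch, assuming the usual SAML metadata namespaces; the sample document below is illustrative, not real ExpenseIn or Azure AD output:

```python
import xml.etree.ElementTree as ET

# XML Digital Signature namespace that wraps the certificate in SAML metadata.
DS_NS = "http://www.w3.org/2000/09/xmldsig#"

def extract_signing_certs(metadata_xml: str) -> list:
    """Return the Base64 X509 certificate strings found in federation metadata."""
    root = ET.fromstring(metadata_xml)
    certs = []
    for cert in root.iter(f"{{{DS_NS}}}X509Certificate"):
        # Strip the whitespace/newlines PEM-style wrapping introduces.
        certs.append("".join(cert.text.split()))
    return certs

# Illustrative metadata snippet (not real output from any tenant).
SAMPLE_METADATA = """\
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" entityID="https://example.com/idp">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <KeyDescriptor use="signing">
      <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
        <X509Data><X509Certificate>
          MIIBbase64payload==
        </X509Certificate></X509Data>
      </KeyInfo>
    </KeyDescriptor>
  </IDPSSODescriptor>
</EntityDescriptor>
"""

print(extract_signing_certs(SAMPLE_METADATA))  # ['MIIBbase64payload==']
```

In practice you would fetch the metadata URL over HTTPS and pass the response body to `extract_signing_certs`.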
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **ExpenseIn**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
-
## Configure ExpenseIn SSO

1. To automate the configuration within ExpenseIn, you need to install the **My Apps Secure Sign-in browser extension** by clicking **Install the extension**.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Click **Admin** at the top of the page, then navigate to **Single Sign-On** and click **Add provider**.
- ![Screenshot that shows the "Admin" tab and the "Single Sign-On - Providers" page and "Add Provider" selected.](./media/expenseIn-tutorial/config01.png)
+ ![Screenshot that shows the "Admin" tab and the "Single Sign-On - Providers" page and "Add Provider" selected.](./media/expenseIn-tutorial/admin.png)
1. On the **New Identity Provider** pop-up, perform the following steps:
- ![Screenshot that shows the "Edit Identity Provider" pop-up with values entered.](./media/expenseIn-tutorial/config02.png)
+ ![Screenshot that shows the "Edit Identity Provider" pop-up with values entered.](./media/expenseIn-tutorial/certificate.png)
a. In the **Provider Name** text box, type the name; for example, Azure.
- b. Select **Yes** for **Allow Provider Intitiated Sign-On**.
+ b. Select **Yes** for **Allow Provider Initiated Sign-On**.
    c. In the **Target Url** text box, paste the value of **Login URL**, which you copied from the Azure portal.
To enable Azure AD users to sign in to ExpenseIn, they must be provisioned into
2. Click **Admin** at the top of the page, then navigate to **Users** and click **New User**.
- ![Screenshot that shows the "Admin" tab and the "Manage Users" page with "New User" selected.](./media/expenseIn-tutorial/config03.png)
+ ![Screenshot that shows the "Admin" tab and the "Manage Users" page with "New User" selected.](./media/expenseIn-tutorial/users.png)
3. On the **Details** pop-up, perform the following steps:
- ![ExpenseIn configuration](./media/expenseIn-tutorial/config04.png)
+ ![ExpenseIn configuration](./media/expenseIn-tutorial/details.png)
    a. In the **First Name** text box, enter the user's first name, like **B**.
To enable Azure AD users to sign in to ExpenseIn, they must be provisioned into
## Test SSO
-When you select the ExpenseIn tile in the Access Panel, you should be automatically signed in to the ExpenseIn for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
-## Additional resources
+* Click **Test this application** in the Azure portal. This redirects to the ExpenseIn Sign-on URL, where you can initiate the login flow.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* Go to the ExpenseIn Sign-on URL directly and initiate the login flow from there.
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+#### IDP initiated:
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the ExpenseIn for which you set up SSO.
-- [Try ExpenseIn with Azure AD](https://aad.portal.azure.com/)
+You can also use Microsoft My Apps to test the application in any mode. When you click the ExpenseIn tile in My Apps, if the application is configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the ExpenseIn for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect ExpenseIn with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure ExpenseIn, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Globalone Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/globalone-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with EY GlobalOne | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with EY GlobalOne'
description: Learn how to configure single sign-on between Azure Active Directory and EY GlobalOne.
Previously updated : 08/07/2020 Last updated : 09/22/2021
-# Tutorial: Integrate EY GlobalOne with Azure Active Directory
+# Tutorial: Azure AD SSO integration with EY GlobalOne
In this tutorial, you'll learn how to integrate EY GlobalOne with Azure Active Directory (Azure AD). When you integrate EY GlobalOne with Azure AD, you can:
In this tutorial, you'll learn how to integrate EY GlobalOne with Azure Active D
* Enable your users to be automatically signed-in to EY GlobalOne with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites
To get started, you need the following items:
## Scenario description

In this tutorial, you configure and test Azure AD SSO in a test environment.
-* EY GlobalOne supports **SP and IDP** initiated SSO
+* EY GlobalOne supports **SP and IDP** initiated SSO.
* EY GlobalOne supports **Just In Time** user provisioning.
-* Once you configure EY GlobalOne you can enforce Session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
-## Adding EY GlobalOne from the gallery
+## Add EY GlobalOne from the gallery
To configure the integration of EY GlobalOne into Azure AD, you need to add EY GlobalOne from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
To configure the integration of EY GlobalOne into Azure AD, you need to add EY G
Configure and test Azure AD SSO with EY GlobalOne using a test user called **B. Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in EY GlobalOne.
-To configure and test Azure AD SSO with EY GlobalOne, complete the following building blocks:
+To configure and test Azure AD SSO with EY GlobalOne, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with B. Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable B. Simon to use Azure AD single sign-on.
-1. **[Configure EY GlobalOne](#configure-ey-globalone)** to configure the SSO settings on application side.
- * **[Create EY GlobalOne test user](#create-ey-globalone-test-user)** to have a counterpart of B. Simon in EY GlobalOne that is linked to the Azure AD representation of user.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with B. Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable B. Simon to use Azure AD single sign-on.
+1. **[Configure EY GlobalOne SSO](#configure-ey-globalone-sso)** to configure the SSO settings on application side.
+ 1. **[Create EY GlobalOne test user](#create-ey-globalone-test-user)** to have a counterpart of B. Simon in EY GlobalOne that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **EY GlobalOne** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **EY GlobalOne** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
In this section, you'll enable B. Simon to use Azure single sign-on by granting
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **EY GlobalOne**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B. Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
-## Configure EY GlobalOne
+## Configure EY GlobalOne SSO
To configure single sign-on on the **EY GlobalOne** side, you need to send the downloaded **Certificate (Raw)** and the appropriate copied URLs from the Azure portal to the [EY GlobalOne support team](mailto:globalone.support@ey.com). They configure this setting so the SAML SSO connection is set properly on both sides.
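Before sending the certificate to the support team, it can be useful to double-check you grabbed the right file. A hedged sketch that computes the SHA-1 thumbprint (the value the Azure portal displays for a signing certificate) from a PEM or bare-Base64 certificate; the filename in the commented-out line is hypothetical:

```python
import base64
import hashlib

def cert_thumbprint(pem_or_b64: str) -> str:
    """SHA-1 thumbprint (upper-case hex) of a certificate given as PEM or bare Base64."""
    # Drop PEM header/footer lines and whitespace, keeping only the Base64 body.
    body = "".join(
        line.strip() for line in pem_or_b64.splitlines()
        if line.strip() and not line.startswith("-----")
    )
    der = base64.b64decode(body)  # the thumbprint is computed over the DER bytes
    return hashlib.sha1(der).hexdigest().upper()

# Hypothetical filename; use the certificate you downloaded from the portal:
# print(cert_thumbprint(open("EY_GlobalOne.cer").read()))
```

Comparing this value with the thumbprint shown in the portal's **SAML Signing Certificate** section confirms the file matches the active certificate.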
In this section, a user called Britta Simon is created in EY GlobalOne. EY Globa
## Test SSO
-When you select the EY GlobalOne tile in the Access Panel, you should be automatically signed in to the EY GlobalOne for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the EY GlobalOne Sign-on URL, where you can initiate the login flow.
+
+* Go to the EY GlobalOne Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
-## Additional Resources
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the EY GlobalOne for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the EY GlobalOne tile in My Apps, if the application is configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the EY GlobalOne for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure EY GlobalOne, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Huddle Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/huddle-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Huddle | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Huddle'
description: Learn how to configure single sign-on between Azure Active Directory and Huddle.
Previously updated : 02/15/2019 Last updated : 09/22/2021
-# Tutorial: Azure Active Directory integration with Huddle
+# Tutorial: Azure AD SSO integration with Huddle
-In this tutorial, you learn how to integrate Huddle with Azure Active Directory (Azure AD).
-Integrating Huddle with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Huddle with Azure Active Directory (Azure AD). When you integrate Huddle with Azure AD, you can:
-* You can control in Azure AD who has access to Huddle.
-* You can enable your users to be automatically signed-in to Huddle (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Huddle.
+* Enable your users to be automatically signed-in to Huddle with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Huddle, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Huddle single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Huddle single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Huddle supports **SP and IDP** initiated SSO
-
-## Adding Huddle from the gallery
-
-To configure the integration of Huddle into Azure AD, you need to add Huddle from the gallery to your list of managed SaaS apps.
-
-**To add Huddle from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Huddle**, select **Huddle** from result panel then click **Add** button to add the application.
-
- ![Huddle in the results list](common/search-new-app.png)
+* Huddle supports **SP and IDP** initiated SSO.
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Huddle based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Huddle needs to be established.
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
-To configure and test Azure AD single sign-on with Huddle, you need to complete the following building blocks:
+## Add Huddle from the gallery
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Huddle Single Sign-On](#configure-huddle-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Huddle test user](#create-huddle-test-user)** - to have a counterpart of Britta Simon in Huddle that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of Huddle into Azure AD, you need to add Huddle from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Huddle** in the search box.
+1. Select **Huddle** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for Huddle
-To configure Azure AD single sign-on with Huddle, perform the following steps:
+Configure and test Azure AD SSO with Huddle using a test user called **B. Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Huddle.
-1. In the [Azure portal](https://portal.azure.com/), on the **Huddle** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with Huddle, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with B. Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable B. Simon to use Azure AD single sign-on.
+1. **[Configure Huddle SSO](#configure-huddle-sso)** to configure the SSO settings on application side.
+ 1. **[Create Huddle test user](#create-huddle-test-user)** to have a counterpart of B. Simon in Huddle that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **Huddle** application integration page, find the **Manage** section and select **Single sign-on**.
+1. On the **Select a Single sign-on method** page, select **SAML**.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:

    > [!NOTE]
    > Your Huddle instance will be automatically detected from the domain you enter below.
- ![Screenshot shows the Basic SAML Configuration, where you can enter Identifier, Reply U R L, and select Save.](common/idp-intiated.png)
+    a. In the **Identifier** text box, type one of the following URLs:
- a. In the **Identifier** text box,type a URL:
+ | **Identifier** |
+ ||
+ | `https://login.huddle.net` |
+ | `https://login.huddle.com` |
- ```http
- https://login.huddle.net
- https://login.huddle.com
- ```
+ b. In the **Reply URL** text box, type one of the following URLs:
- b. In the **Reply URL** text box, type a URL:
-
- ```http
- https://login.huddle.net/saml/browser-sso
- https://login.huddle.com/saml/browser-sso
- https://login.huddle.com/saml/idp-initiated-sso
- ```
+ | **Reply URL** |
+ |-|
+ | `https://login.huddle.net/saml/browser-sso` |
+ | `https://login.huddle.com/saml/browser-sso` |
+ | `https://login.huddle.com/saml/idp-initiated-sso` |
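Because the Identifier and Reply URLs above are fixed strings, a deployment script can sanity-check its configuration against them before saving. A small sketch of that check; the allow-list mirrors the table above, and the trailing-slash normalization is an assumption, not a documented Huddle rule:

```python
# Reply (Assertion Consumer Service) URLs Huddle accepts, per the table above.
ALLOWED_REPLY_URLS = {
    "https://login.huddle.net/saml/browser-sso",
    "https://login.huddle.com/saml/browser-sso",
    "https://login.huddle.com/saml/idp-initiated-sso",
}

def is_valid_reply_url(url: str) -> bool:
    # Tolerate an optional trailing slash, otherwise require an exact match.
    return url.rstrip("/") in ALLOWED_REPLY_URLS

assert is_valid_reply_url("https://login.huddle.com/saml/browser-sso")
assert not is_valid_reply_url("http://login.huddle.com/saml/browser-sso")  # https only
```

Running this against the values a script is about to write into **Basic SAML Configuration** catches typos before the portal (or the SP) rejects the assertion.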
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![Screenshot shows Set additional U R Ls where you can enter a Sign on U R L.](common/metadata-upload-additional-signon.png)
-
- In the **Sign-on URL** text box, type a URL using the following pattern:
-
- ```http
- https://<customsubdomain>.huddle.com
- https://us.huddle.com
- ```
+    In the **Sign-on URL** text box, type a URL using one of the following patterns:
+ | **Sign-on URL** |
+ |-|
+ | `https://<customsubdomain>.huddle.com` |
+ | `https://us.huddle.com` |
+
    > [!NOTE]
    > The Sign-on URL value is not real. Update this value with the actual Sign-on URL. Contact the [Huddle Client support team](https://huddle.zendesk.com) to get this value.
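While waiting for the real value from support, the `https://<customsubdomain>.huddle.com` pattern can still be exercised in configuration scripts. A sketch that builds and lightly validates such a URL; the subdomain character check is an assumption for illustration, not Huddle's actual rule:

```python
import re

def huddle_sign_on_url(subdomain: str) -> str:
    """Build a sign-on URL of the form https://<customsubdomain>.huddle.com."""
    # Assumed subdomain shape: lowercase letters, digits, hyphens.
    if not re.fullmatch(r"[a-z0-9-]+", subdomain):
        raise ValueError(f"invalid subdomain: {subdomain!r}")
    return f"https://{subdomain}.huddle.com"

print(huddle_sign_on_url("us"))  # https://us.huddle.com
```

Once support confirms the custom subdomain, substitute it for the placeholder and use the resulting URL as the **Sign-on URL**.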
To configure Azure AD single sign-on with Huddle, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure Huddle Single Sign-On
-
-To configure single sign-on on **Huddle** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [Huddle support team](https://huddle.zendesk.com/). They set this setting to have the SAML SSO connection set properly on both sides.
-
-> [!NOTE]
-> Single sign-on needs to be enabled by the Huddle support team. You get a notification when the configuration has been completed.
-
### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
+In this section, you'll create a test user in the Azure portal called B. Simon.
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B. Simon`.
   1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
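The portal steps above can also be done programmatically via Microsoft Graph (`POST https://graph.microsoft.com/v1.0/users`). A sketch of the request body only — authentication and the HTTP call are omitted, the `mailNickname` derivation is a simple illustrative rule, and the password is a placeholder:

```python
import json

def new_test_user_payload(display_name: str, upn: str, password: str) -> dict:
    """Request body for creating a user via Microsoft Graph v1.0."""
    # Derive a simple alias from the UPN prefix (illustrative normalization).
    mail_nickname = upn.split("@")[0].replace(".", "").replace(" ", "")
    return {
        "accountEnabled": True,
        "displayName": display_name,
        "mailNickname": mail_nickname,
        "userPrincipalName": upn,
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": password,  # placeholder; supply a real initial password
        },
    }

print(json.dumps(new_test_user_payload("B. Simon", "B.Simon@contoso.com", "<password>"), indent=2))
```

Sending this body requires an authenticated Graph client with `User.ReadWrite.All` permission; the payload shape follows the Graph v1.0 user resource.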
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Huddle.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Huddle**.
+In this section, you'll enable B. Simon to use Azure single sign-on by granting access to Huddle.
- ![Enterprise applications blade](common/enterprise-applications.png)
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Huddle**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B. Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-2. In the applications list, select **Huddle**.
+## Configure Huddle SSO
- ![The Huddle link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
+To configure single sign-on on the **Huddle** side, you need to send the downloaded **Certificate (Base64)** and the appropriate copied URLs from the Azure portal to the [Huddle support team](https://huddle.zendesk.com/). They configure this setting so the SAML SSO connection is set properly on both sides.
-7. In the **Add Assignment** dialog click the **Assign** button.
+> [!NOTE]
+> Single sign-on needs to be enabled by the Huddle support team. You get a notification when the configuration has been completed.
### Create Huddle test user
To enable Azure AD users to log in to Huddle, they must be provisioned into Hudd
3. Click **People \> Invite People**.
- ![People](./media/huddle-tutorial/ic787838.png "People")
+ ![People](./media/huddle-tutorial/tasks.png "People")
4. In the **Create a new invitation** section, perform the following steps:
- ![New Invitation](./media/huddle-tutorial/ic787839.png "New Invitation")
+ ![New Invitation](./media/huddle-tutorial/team.png "New Invitation")
a. In the **Choose a team to invite people to join** list, select **team**.
To enable Azure AD users to log in to Huddle, they must be provisioned into Hudd
> [!NOTE] > You can use any other Huddle user account creation tools or APIs provided by Huddle to provision Azure AD user accounts.
-### Test single sign-on
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the Huddle Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the Huddle Sign-on URL directly and initiate the login flow from there.
-When you click the Huddle tile in the Access Panel, you should be automatically signed in to the Huddle for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Huddle for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Huddle tile in My Apps, if the application is configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the Huddle for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Huddle you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Kintone Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/kintone-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Kintone | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Kintone'
description: Learn how to configure single sign-on between Azure Active Directory and Kintone.
Previously updated : 03/26/2019 Last updated : 09/22/2021
-# Tutorial: Azure Active Directory integration with Kintone
+# Tutorial: Azure AD SSO integration with Kintone
-In this tutorial, you learn how to integrate Kintone with Azure Active Directory (Azure AD).
-Integrating Kintone with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Kintone with Azure Active Directory (Azure AD). When you integrate Kintone with Azure AD, you can:
-* You can control in Azure AD who has access to Kintone.
-* You can enable your users to be automatically signed-in to Kintone (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Kintone.
+* Enable your users to be automatically signed-in to Kintone with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites

To configure Azure AD integration with Kintone, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/)
-* Kintone single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/).
+* Kintone single sign-on enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Kintone supports **SP** initiated SSO
+* Kintone supports **SP** initiated SSO.
-## Adding Kintone from the gallery
+## Add Kintone from the gallery
To configure the integration of Kintone into Azure AD, you need to add Kintone from the gallery to your list of managed SaaS apps.
-**To add Kintone from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Kintone**, select **Kintone** from result panel then click **Add** button to add the application.
-
- ![Kintone in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Kintone based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Kintone needs to be established.
-
-To configure and test Azure AD single sign-on with Kintone, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Kintone Single Sign-On](#configure-kintone-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Kintone test user](#create-kintone-test-user)** - to have a counterpart of Britta Simon in Kintone that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Kintone** in the search box.
+1. Select **Kintone** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-### Configure Azure AD single sign-on
+## Configure and test Azure AD SSO for Kintone
-In this section, you enable Azure AD single sign-on in the Azure portal.
+Configure and test Azure AD SSO with Kintone using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Kintone.
-To configure Azure AD single sign-on with Kintone, perform the following steps:
+To configure and test Azure AD SSO with Kintone, perform the following steps:
-1. In the [Azure portal](https://portal.azure.com/), on the **Kintone** application integration page, select **Single sign-on**.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Kintone SSO](#configure-kintone-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Kintone test user](#create-kintone-test-user)** - to have a counterpart of B.Simon in Kintone that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
- ![Configure single sign-on link](common/select-sso.png)
+## Configure Azure AD SSO
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Single sign-on select mode](common/select-saml-option.png)
+1. In the Azure portal, on the **Kintone** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Kintone Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://<companyname>.kintone.com`
-
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ a. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
- ```http
- https://<companyname>.cybozu.com
- https://<companyname>.kintone.com
- ```
+ | **Identifier** |
+ ||
+ | `https://<companyname>.cybozu.com` |
+ | `https://<companyname>.kintone.com` |
+
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://<companyname>.kintone.com`
> [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Kintone Client support team](https://www.kintone.com/contact/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Kintone Client support team](https://www.kintone.com/contact/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Kintone, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
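The Identifier and Sign on URL patterns in the Basic SAML Configuration step above are derived from the Kintone/Cybozu subdomain. As a quick sanity check, the hypothetical helper below builds both values from a subdomain and rejects obviously malformed input; `contoso` is a placeholder, not a real tenant.

```python
# Sketch: build the Basic SAML Configuration values for Kintone from a
# subdomain, following the patterns documented above. Illustrative only.
import re

def kintone_saml_urls(companyname: str, domain: str = "kintone.com") -> dict:
    if not re.fullmatch(r"[a-z0-9][a-z0-9-]*", companyname):
        raise ValueError("subdomain should be lowercase letters, digits, or hyphens")
    if domain not in ("kintone.com", "cybozu.com"):
        raise ValueError("the Identifier uses either kintone.com or cybozu.com")
    return {
        "identifier": f"https://{companyname}.{domain}",
        "sign_on_url": f"https://{companyname}.kintone.com",
    }

urls = kintone_saml_urls("contoso")
print(urls["identifier"])   # https://contoso.kintone.com
print(urls["sign_on_url"])  # https://contoso.kintone.com
```

As the note above says, the authoritative values come from the Kintone support team; this check only catches typos before you paste them into the portal.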
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
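The portal steps above can also be done programmatically: creating a user corresponds to a Microsoft Graph `POST /users` call. The sketch below only builds the request body for a test user like B.Simon; it does not send anything, since an authenticated Graph client is out of scope here, and the UPN and password are placeholders.

```python
# Sketch: assemble the Microsoft Graph POST /users request body for a test
# user. Sending it requires an authenticated Graph client (not shown).
import json

def build_test_user(display_name: str, upn: str, password: str) -> str:
    body = {
        "accountEnabled": True,
        "displayName": display_name,
        # mailNickname must not contain "@" or "." in many tenants
        "mailNickname": upn.split("@")[0].replace(".", ""),
        "userPrincipalName": upn,
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": password,
        },
    }
    return json.dumps(body, indent=2)

payload = build_test_user("B.Simon", "B.Simon@contoso.com", "<choose-a-strong-password>")
print(payload)
```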
+
+### Assign the Azure AD test user
- b. Azure AD Identifier
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Kintone.
- c. Logout URL
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Kintone**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
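The assignment made above through the portal corresponds to a Microsoft Graph app role assignment (`POST /servicePrincipals/{id}/appRoleAssignedTo`). The sketch below only builds the request body; the IDs are placeholders for your user's object ID and the app's service principal ID, and the all-zero `appRoleId` denotes the "Default Access" role mentioned in the steps.

```python
# Sketch: request body for assigning a user to the app's service principal
# via Microsoft Graph. Placeholder IDs; sending the request is not shown.
def build_app_role_assignment(user_id: str, service_principal_id: str,
                              app_role_id: str = "00000000-0000-0000-0000-000000000000") -> dict:
    # POST /servicePrincipals/{service_principal_id}/appRoleAssignedTo
    return {
        "principalId": user_id,            # object ID of the user (or group)
        "resourceId": service_principal_id, # object ID of the app's service principal
        "appRoleId": app_role_id,           # zero GUID = "Default Access"
    }

assignment = build_app_role_assignment("<user-object-id>", "<app-sp-object-id>")
print(assignment)
```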
-### Configure Kintone Single Sign-On
+## Configure Kintone SSO
1. In a different web browser window, sign in to your **Kintone** company site as an administrator.
1. Click **Settings icon**.
- ![Settings](./media/kintone-tutorial/ic785879.png "Settings")
+ ![Settings](./media/kintone-tutorial/icon.png "Settings")
1. Click **Users & System Administration**.
- ![Users & System Administration](./media/kintone-tutorial/ic785880.png "Users & System Administration")
+ ![Users & System Administration](./media/kintone-tutorial/user.png "Users & System Administration")
1. Under **System Administration \> Security** click **Login**.
- ![Login](./media/kintone-tutorial/ic785881.png "Login")
+ ![Login](./media/kintone-tutorial/system.png "Login")
1. Click **Enable SAML authentication**.
- ![Screenshot that shows "Users & System Administration" selected.](./media/kintone-tutorial/ic785882.png "SAML Authentication")
+ ![Screenshot that shows "Users & System Administration" selected.](./media/kintone-tutorial/security.png "SAML Authentication")
1. In the SAML Authentication section, perform the following steps:
- ![SAML Authentication](./media/kintone-tutorial/ic785883.png "SAML Authentication")
+ ![SAML Authentication](./media/kintone-tutorial/certificate.png "SAML Authentication")
a. In the **Login URL** textbox, paste the value of **Login URL** which you have copied from Azure portal.
To configure Azure AD single sign-on with Kintone, perform the following steps:
d. Click **Save**.
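A wrong Login URL pasted into Kintone is a common cause of failed sign-ins. The hypothetical check below tests whether a value matches the usual shape of the Azure AD SAML sign-on endpoint (`https://login.microsoftonline.com/<tenant-id>/saml2`); it is a format check only, not a guarantee the tenant ID is correct.

```python
# Sketch: format check for the Login URL value before saving it in Kintone.
import re

def looks_like_azure_ad_login_url(url: str) -> bool:
    # <tenant-id> is a 36-character GUID in the usual endpoint shape
    return re.fullmatch(
        r"https://login\.microsoftonline\.com/[0-9a-f-]{36}/saml2", url
    ) is not None

print(looks_like_azure_ad_login_url(
    "https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/saml2"
))  # True
```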
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type `brittasimon@yourcompanydomain.extension`
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Kintone.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Kintone**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Kintone**.
-
- ![The Kintone link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
-
### Create Kintone test user

To enable Azure AD users to sign in to Kintone, they must be provisioned into Kintone. In the case of Kintone, provisioning is a manual task.
To enable Azure AD users to sign in to Kintone, they must be provisioned into Kintone.
1. Click **Settings icon**.
- ![Settings](./media/kintone-tutorial/ic785879.png "Settings")
+ ![Settings](./media/kintone-tutorial/icon.png "Settings")
1. Click **Users & System Administration**.
- ![User & System Administration](./media/kintone-tutorial/ic785880.png "User & System Administration")
+ ![User & System Administration](./media/kintone-tutorial/user.png "User & System Administration")
1. Under **User Administration**, click **Departments & Users**.
- ![Department & Users](./media/kintone-tutorial/ic785888.png "Department & Users")
+ ![Department & Users](./media/kintone-tutorial/services.png "Department & Users")
1. Click **New User**.
- ![Screenshot that shows the "Users" section with the "New User" action selected.](./media/kintone-tutorial/ic785889.png "New Users")
+ ![Screenshot that shows the "Users" section with the "New User" action selected.](./media/kintone-tutorial/status.png "New Users")
1. In the **New User** section, perform the following steps:
- ![New Users](./media/kintone-tutorial/ic785890.png "New Users")
+ ![New Users](./media/kintone-tutorial/details.png "New Users")
a. Type a **Display Name**, **Login Name**, **New Password**, **Confirm Password**, **E-mail Address**, and other details of a valid Azure AD account you want to provision into the related textboxes.
To enable Azure AD users to sign in to Kintone, they must be provisioned into Kintone.
> [!NOTE]
> You can use any other Kintone user account creation tools or APIs provided by Kintone to provision Azure AD user accounts.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Kintone tile in the Access Panel, you should be automatically signed in to the Kintone for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click **Test this application** in the Azure portal. This redirects to the Kintone Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the Kintone Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Kintone tile in My Apps, you are redirected to the Kintone Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Kintone you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory N2f Expensereports Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/n2f-expensereports-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with N2F - Expense reports | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with N2F - Expense reports'
description: Learn how to configure single sign-on between Azure Active Directory and N2F - Expense reports.
Previously updated : 03/01/2019 Last updated : 09/22/2021
-# Tutorial: Azure Active Directory integration with N2F - Expense reports
+# Tutorial: Azure AD SSO integration with N2F - Expense reports
-In this tutorial, you learn how to integrate N2F - Expense reports with Azure Active Directory (Azure AD).
-Integrating N2F - Expense reports with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate N2F - Expense reports with Azure Active Directory (Azure AD). When you integrate N2F - Expense reports with Azure AD, you can:
-* You can control in Azure AD who has access to N2F - Expense reports.
-* You can enable your users to be automatically signed-in to N2F - Expense reports (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to N2F - Expense reports.
+* Enable your users to be automatically signed-in to N2F - Expense reports with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with N2F - Expense reports, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* N2F - Expense reports single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* N2F - Expense reports single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* N2F - Expense reports supports **SP** and **IDP** initiated SSO
+* N2F - Expense reports supports **SP** and **IDP** initiated SSO.
-## Adding N2F - Expense reports from the gallery
+## Add N2F - Expense reports from the gallery
To configure the integration of N2F - Expense reports into Azure AD, you need to add N2F - Expense reports from the gallery to your list of managed SaaS apps.
-**To add N2F - Expense reports from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **N2F - Expense reports**, select **N2F - Expense reports** from result panel then click **Add** button to add the application.
-
- ![N2F - Expense reports in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with N2F - Expense reports based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in N2F - Expense reports needs to be established.
-
-To configure and test Azure AD single sign-on with N2F - Expense reports, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure N2F - Expense reports Single Sign-On](#configure-n2fexpense-reports-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create N2F - Expense reports test user](#create-n2fexpense-reports-test-user)** - to have a counterpart of Britta Simon in N2F - Expense reports that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **N2F - Expense reports** in the search box.
+1. Select **N2F - Expense reports** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-### Configure Azure AD single sign-on
+## Configure and test Azure AD SSO for N2F - Expense reports
-In this section, you enable Azure AD single sign-on in the Azure portal.
+Configure and test Azure AD SSO with N2F - Expense reports using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in N2F - Expense reports.
-To configure Azure AD single sign-on with N2F - Expense reports, perform the following steps:
+To configure and test Azure AD SSO with N2F - Expense reports, perform the following steps:
-1. In the [Azure portal](https://portal.azure.com/), on the **N2F - Expense reports** application integration page, select **Single sign-on**.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure N2F - Expense reports SSO](#configure-n2fexpense-reports-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create N2F - Expense reports test user](#create-n2fexpense-reports-test-user)** - to have a counterpart of B.Simon in N2F - Expense reports that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
- ![Configure single sign-on link](common/select-sso.png)
+## Configure Azure AD SSO
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Single sign-on select mode](common/select-saml-option.png)
+1. In the Azure portal, on the **N2F - Expense reports** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, you don't need to perform any steps, as the app is already pre-integrated with Azure.
- ![Screenshot shows the SAML-based Sign-on page with Basic SAML Configuration.](common/preintegrated.png)
- 5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![Screenshot shows the Integrations page where you can add Azure A D Single Sign-On.](common/metadata-upload-additional-signon.png)
-
- In the **Sign-on URL** text box, type a URL:
+ In the **Sign-on URL** text box, type the URL:
   `https://www.n2f.com/app/`

6. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.
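The App Federation Metadata Url copied in this step follows a predictable pattern built from the tenant ID and the application ID. The hypothetical helper below reconstructs it so you can verify the value you saved; both GUID arguments are placeholders for your own directory values.

```python
# Sketch: reconstruct the App Federation Metadata Url from tenant and app IDs.
# Placeholder IDs; compare the result against the value copied from the portal.
def app_federation_metadata_url(tenant_id: str, app_id: str) -> str:
    return (
        f"https://login.microsoftonline.com/{tenant_id}"
        f"/federationmetadata/2007-06/federationmetadata.xml?appid={app_id}"
    )

url = app_federation_metadata_url("<tenant-id>", "<application-id>")
print(url)
```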
To configure Azure AD single sign-on with N2F - Expense reports, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- b. Azure AD Identifier
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to N2F - Expense reports.
- c. Logout URL
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **N2F - Expense reports**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure N2F - Expense reports Single Sign-On
+## Configure N2F - Expense reports SSO
1. In a different web browser window, sign in to your N2F - Expense reports company site as an administrator.
2. Click on **Settings** and then select **Advance Settings** from the dropdown.
- ![Screenshot shows Advanced Settings selected.](./media/n2f-expensereports-tutorial/configure1.png)
+ ![Screenshot shows Advanced Settings selected.](./media/n2f-expensereports-tutorial/profile.png)
3. Select **Account settings** tab.
- ![Screenshot shows Account settings selected.](./media/n2f-expensereports-tutorial/configure2.png)
+ ![Screenshot shows Account settings selected.](./media/n2f-expensereports-tutorial/account.png)
4. Select **Authentication** and then select **+ Add an authentication method** tab.
- ![Screenshot shows Account Setting Authentication where you can add an authentication method.](./media/n2f-expensereports-tutorial/configure3.png)
+ ![Screenshot shows Account Setting Authentication where you can add an authentication method.](./media/n2f-expensereports-tutorial/general.png)
5. Select **SAML Microsoft Office 365** as Authentication method.
- ![Screenshot shows Authentication method with SAML Microsoft Office 365 selected.](./media/n2f-expensereports-tutorial/configure4.png)
+ ![Screenshot shows Authentication method with SAML Microsoft Office 365 selected.](./media/n2f-expensereports-tutorial/method.png)
6. On the **Authentication method** section, perform the following steps:
- ![Screenshot shows Authentication method where you can enter the values described.](./media/n2f-expensereports-tutorial/configure5.png)
+ ![Screenshot shows Authentication method where you can enter the values described.](./media/n2f-expensereports-tutorial/metadata.png)
a. In the **Entity ID** textbox, paste the **Azure AD Identifier** value, which you have copied from the Azure portal.
To configure Azure AD single sign-on with N2F - Expense reports, perform the following steps:
c. Click **Save**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to N2F - Expense reports.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **N2F - Expense reports**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **N2F - Expense reports**.
-
- ![The N2F - Expense reports link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
- ### Create N2F - Expense reports test user

To enable Azure AD users to log in to N2F - Expense reports, they must be provisioned into N2F - Expense reports. In the case of N2F - Expense reports, provisioning is a manual task.
To enable Azure AD users to log in to N2F - Expense reports, they must be provis
2. Click on **Settings** and then select **Advance Settings** from the dropdown.
- ![Screenshot shows Advanced Settings selected.](./media/n2f-expensereports-tutorial/configure1.png)
+ ![Screenshot shows Advanced Settings selected.](./media/n2f-expensereports-tutorial/profile.png)
3. Select **Users** tab from left navigation panel.
- ![Screenshot shows Users selected.](./media/n2f-expensereports-tutorial/user1.png)
+ ![Screenshot shows Users selected.](./media/n2f-expensereports-tutorial/user.png)
4. Select **+ New user** tab.
- ![Screenshot shows the New user option.](./media/n2f-expensereports-tutorial/user2.png)
+ ![Screenshot shows the New user option.](./media/n2f-expensereports-tutorial/create-user.png)
5. On the **User** section, perform the following steps:
- ![Screenshot shows the section where you can enter the values described.](./media/n2f-expensereports-tutorial/user3.png)
+ ![Screenshot shows the section where you can enter the values described.](./media/n2f-expensereports-tutorial/values.png)
a. In the **Email address** textbox, enter the email address of user like **brittasimon\@contoso.com**.
To enable Azure AD users to log in to N2F - Expense reports, they must be provis
> [!NOTE]
> If you are facing any problems while adding the user, please contact the [N2F - Expense reports support team](mailto:support@n2f.com).
-### Test single sign-on
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the N2F - Expense reports Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to N2F - Expense reports Sign-on URL directly and initiate the login flow from there.
-When you click the N2F - Expense reports tile in the Access Panel, you should be automatically signed in to the N2F - Expense reports for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the N2F - Expense reports instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in either mode. When you click the N2F - Expense reports tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the N2F - Expense reports instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure N2F - Expense reports, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Ringcentral Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/ringcentral-provisioning-tutorial.md
The scenario outlined in this tutorial assumes that you already have the followi
A [RingCentral](https://www.ringcentral.com/office/plansandpricing.html) admin account is required to Authorize in the Admin Credentials section in Step 5.
+In the RingCentral admin portal, under **Account Settings** > **Directory Integrations**, set the *Directory Provider* setting to *SCIM*.
+![image](https://user-images.githubusercontent.com/49566142/134523440-20320d8e-3c25-4358-9ace-d4888ce8e4ea.png)
+
+> [!NOTE]
+> To assign licenses to users, refer to the video link [here](https://support.ringcentral.com/s/article/5-10-Adding-Extensions-via-Web?language).
Once you've configured provisioning, use the following resources to monitor your
## Next steps
-* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
+* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
active-directory Taskize Connect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/taskize-connect-tutorial.md
Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Taskize Connect | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Taskize Connect'
description: Learn how to configure single sign-on between Azure Active Directory and Taskize Connect.
Previously updated : 07/16/2021 Last updated : 09/23/2021
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with Taskize Connect
+# Tutorial: Azure AD SSO integration with Taskize Connect
In this tutorial, you'll learn how to integrate Taskize Connect with Azure Active Directory (Azure AD). When you integrate Taskize Connect with Azure AD, you can:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.

* Taskize Connect supports **SP and IDP** initiated SSO.
+* Taskize Connect supports [Automated user provisioning](taskize-connect-provisioning-tutorial.md).
> [!NOTE]
> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
The objective of this section is to create a user called B.Simon in Taskize Conn
>[!Note] >If you need to create a user manually, contact [Taskize Connect support team](mailto:support@taskize.com).
+Taskize Connect also supports automatic user provisioning. You can find more details on how to configure it in the [Taskize Connect provisioning tutorial](./taskize-connect-provisioning-tutorial.md).
+
## Test SSO

In this section, you test your Azure AD single sign-on configuration with the following options.
active-directory Tinfoil Security Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/tinfoil-security-tutorial.md
Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with TINFOIL SECURITY | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with TINFOIL SECURITY'
description: Learn how to configure single sign-on between Azure Active Directory and TINFOIL SECURITY.
Previously updated : 10/16/2019 Last updated : 09/20/2021
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with TINFOIL SECURITY
+# Tutorial: Azure AD SSO integration with TINFOIL SECURITY
In this tutorial, you'll learn how to integrate TINFOIL SECURITY with Azure Active Directory (Azure AD). When you integrate TINFOIL SECURITY with Azure AD, you can:
In this tutorial, you'll learn how to integrate TINFOIL SECURITY with Azure Acti
* Enable your users to be automatically signed-in to TINFOIL SECURITY with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* TINFOIL SECURITY supports **IDP** initiated SSO
+* TINFOIL SECURITY supports **IDP** initiated SSO.
> [!NOTE]
> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding TINFOIL SECURITY from the gallery
+## Add TINFOIL SECURITY from the gallery
To configure the integration of TINFOIL SECURITY into Azure AD, you need to add TINFOIL SECURITY from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **TINFOIL SECURITY** in the search box.
1. Select **TINFOIL SECURITY** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for TINFOIL SECURITY
+## Configure and test Azure AD SSO for TINFOIL SECURITY
Configure and test Azure AD SSO with TINFOIL SECURITY using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in TINFOIL SECURITY.
-To configure and test Azure AD SSO with TINFOIL SECURITY, complete the following building blocks:
+To configure and test Azure AD SSO with TINFOIL SECURITY, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure TINFOIL SECURITY SSO](#configure-tinfoil-security-sso)** - to configure the single sign-on settings on application side.
- * **[Create TINFOIL SECURITY test user](#create-tinfoil-security-test-user)** - to have a counterpart of B.Simon in TINFOIL SECURITY that is linked to the Azure AD representation of user.
+ 1. **[Create TINFOIL SECURITY test user](#create-tinfoil-security-test-user)** - to have a counterpart of B.Simon in TINFOIL SECURITY that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **TINFOIL SECURITY** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **TINFOIL SECURITY** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **TINFOIL SECURITY**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the toolbar on the top, click **My Account**.
- ![Dashboard](./media/tinfoil-security-tutorial/ic798971.png "Dashboard")
+ ![Dashboard](./media/tinfoil-security-tutorial/account.png "Dashboard")
1. Click **Security**.
- ![Security](./media/tinfoil-security-tutorial/ic798972.png "Security")
+ ![Security](./media/tinfoil-security-tutorial/details.png "Security")
1. On the **Single Sign-On** configuration page, perform the following steps:
- ![Single Sign-On](./media/tinfoil-security-tutorial/ic798973.png "Single Sign-On")
+ ![Single Sign-On](./media/tinfoil-security-tutorial/certificate.png "Single Sign-On")
a. Select **Enable SAML**.
In order to enable Azure AD users to sign in to TINFOIL SECURITY, they must be p
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the TINFOIL SECURITY tile in the Access Panel, you should be automatically signed in to the TINFOIL SECURITY for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
-
-## Additional resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the TINFOIL SECURITY instance for which you set up SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the TINFOIL SECURITY tile in My Apps, you should be automatically signed in to the TINFOIL SECURITY instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try TINFOIL SECURITY with Azure AD](https://aad.portal.azure.com/)
+Once you configure TINFOIL SECURITY, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
aks Concepts Network https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/concepts-network.md
The following behavior differences exist between kubenet and Azure CNI:
| Access to resources secured by service endpoints | Supported | Supported |
| Expose Kubernetes services using a load balancer service, App Gateway, or ingress controller | Supported | Supported |
| Default Azure DNS and Private Zones | Supported | Supported |
+| Support for Windows node pools | Not Supported | Supported |
Regarding DNS, with both the kubenet and Azure CNI plugins, DNS is offered by CoreDNS, a deployment running in AKS with its own autoscaler. For more information on CoreDNS on Kubernetes, see [Customizing DNS Service](https://kubernetes.io/docs/tasks/administer-cluster/dns-custom-nameservers/). By default, CoreDNS is configured to forward unknown domains to the DNS functionality of the Azure Virtual Network where the AKS cluster is deployed. Hence, Azure DNS and Private Zones will work for pods running in AKS.
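The forwarding behavior described above corresponds to a CoreDNS configuration along these lines (an illustrative sketch, not the exact Corefile that AKS manages):

```
.:53 {
    kubernetes cluster.local in-addr.arpa ip6.arpa {
        pods insecure
        fallthrough in-addr.arpa ip6.arpa
    }
    # Anything not matched by the kubernetes plugin is forwarded to the
    # node's resolver, which points at the VNet's DNS. This is why Azure DNS
    # and Private Zones resolve for AKS pods.
    forward . /etc/resolv.conf
    cache 30
}
```

Customizations, when needed, go into the `coredns-custom` ConfigMap rather than this managed configuration.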
aks Csi Secrets Store Driver https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/csi-secrets-store-driver.md
spec:
objects: |
  array:
    - |
- objectName: <secret-name> # In this example, 'ExampleSecret'
+ objectName: <secret-name> # In this example, 'ExampleSecret'
+ objectAlias: <secret-alias> # [OPTIONAL] specify the filename of the object when written to disk - defaults to objectName if not provided
        objectType: secret        # Object types: secret, key or cert
        objectVersion: ""         # [OPTIONAL] object versions, default to latest if empty
  tenantId: "<tenant-id>"         # the tenant ID containing the Azure Key Vault instance
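Assembled into a full resource, a `SecretProviderClass` using `objectAlias` might look like the following sketch (the key vault name, tenant ID, and object names are placeholders; check the driver documentation for the `apiVersion` your cluster supports):

```yaml
apiVersion: secrets-store.csi.x-k8s.io/v1
kind: SecretProviderClass
metadata:
  name: azure-kvname
spec:
  provider: azure
  parameters:
    keyvaultName: "<key-vault-name>"
    objects: |
      array:
        - |
          objectName: ExampleSecret
          objectAlias: example-secret.txt   # file name on disk instead of ExampleSecret
          objectType: secret
          objectVersion: ""
    tenantId: "<tenant-id>"
```

With this alias, a pod that mounts the secrets-store volume sees the secret at `<mount-path>/example-secret.txt`.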
aks Custom Node Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/custom-node-configuration.md
The supported Kubelet parameters and accepted values are listed below.
| `allowedUnsafeSysctls` | `kernel.shm*`, `kernel.msg*`, `kernel.sem`, `fs.mqueue.*`, `net.*` | None | Allowed list of unsafe sysctls or unsafe sysctl patterns. |
| `containerLogMaxSizeMB` | Size in megabytes (MB) | 10 MB | The maximum size (for example, 10 MB) of a container log file before it's rotated. |
| `containerLogMaxFiles` | ≥ 2 | 5 | The maximum number of container log files that can be present for a container. |
+| `podMaxPids` | -1 to kernel PID limit | -1 (∞) | The maximum number of process IDs that can run in a pod. |
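For example, a kubelet configuration file passed via `--kubelet-config` when creating a cluster or node pool could combine these settings (the values here are illustrative, not recommendations):

```json
{
  "allowedUnsafeSysctls": ["kernel.msg*", "net.core.somaxconn"],
  "containerLogMaxSizeMB": 20,
  "containerLogMaxFiles": 6,
  "podMaxPids": 100
}
```

Each key matches a parameter in the table above; anything omitted keeps its default.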
### Linux OS custom configuration
aks Private Clusters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/private-clusters.md
As mentioned, virtual network peering is one way to access your private cluster.
> If you are using [Bring Your Own Route Table with kubenet](./configure-kubenet.md#bring-your-own-subnet-and-route-table-with-kubenet) and Bring Your Own DNS with Private Cluster, the cluster creation will fail. You will need to associate the [RouteTable](./configure-kubenet.md#bring-your-own-subnet-and-route-table-with-kubenet) in the node resource group to the subnet after the cluster creation has failed, in order to make the creation successful.

## Limitations
-* AKS-RunCommand does not work on clusters with AKS managed AAD and Private link enabled.
* IP authorized ranges can't be applied to the private API server endpoint; they only apply to the public API server.
* [Azure Private Link service limitations][private-link-service] apply to private clusters.
* No support for Azure DevOps Microsoft-hosted agents with private clusters. Consider using [self-hosted agents](/azure/devops/pipelines/agents/agents?tabs=browser).
aks Use Azure Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/use-azure-policy.md
Custom policy definitions are written in JSON. To learn more about creating a cu
> [!NOTE]
> Azure Policy now utilizes a new property known as *templateInfo* that allows users to define the source type for the constraint template. By defining *templateInfo* in policy definitions, users don't have to define *constraintTemplate* or *constraint* properties. Users still need to define *apiGroups* and *kinds*. For more information on this, see [Understanding Azure Policy effects][azure-policy-effects-audit].
-Once your custom policy definition has been created, see [Assign a policy definition][azure-policy-tutorial-assign] for a step-by-step walkthrough of assigning the policy to your Kubernetes cluster.
+Once your custom policy definition has been created, see [Assign a policy definition][custom-policy-tutorial-assign] for a step-by-step walkthrough of assigning the policy to your Kubernetes cluster.
## Validate an Azure Policy is running
app-service Configure Authentication Provider Apple https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-authentication-provider-apple.md
Apple requires the client secret be the base64-encoding of a JWT token. The deco
- **exp**: No more than six months after **nbf**

The base64-encoded version of the above payload looks like this:
-```eyJhbGciOiJFUzI1NiIsImtpZCI6IlVSS0VZSUQwMDEifQ.eyJzdWIiOiJjb20ueW91cmNvbXBhbnkuYXBwMSIsIm5iZiI6MTU2MDIwMzIwNywiZXhwIjoxNTYwMjg5NjA3LCJpc3MiOiJBQkMxMjNERUZHIiwiYXVkIjoiaHR0cHM6Ly9hcHBsZWlkLmFwcGxlLmNvbSJ9.ABSXELWuTbgqfrIUz7bLi6nXvkXAz5O8vt0jB2dSHTQTib1x1DSP4__4UrlKI-pdzNg1sgeocolPNTmDKazO8-BHAZCsdeeTNlgFEzBytIpMKFfVEQbEtGRkam5IeclUK7S9oOva4EK4jV4VmgDrr-LGWWO3TaAxAvy3_ZoKohvFFkVG```
+`eyJhbGciOiJFUzI1NiIsImtpZCI6IlVSS0VZSUQwMDEifQ.eyJzdWIiOiJjb20ueW91cmNvbXBhbnkuYXBwMSIsIm5iZiI6MTU2MDIwMzIwNywiZXhwIjoxNTYwMjg5NjA3LCJpc3MiOiJBQkMxMjNERUZHIiwiYXVkIjoiaHR0cHM6Ly9hcHBsZWlkLmFwcGxlLmNvbSJ9.ABSXELWuTbgqfrIUz7bLi6nXvkXAz5O8vt0jB2dSHTQTib1x1DSP4__4UrlKI-pdzNg1sgeocolPNTmDKazO8-BHAZCsdeeTNlgFEzBytIpMKFfVEQbEtGRkam5IeclUK7S9oOva4EK4jV4VmgDrr-LGWWO3TaAxAvy3_ZoKohvFFkVG`
_Note: Apple doesn't accept client secret JWTs with an expiration date more than six months after the creation (or nbf) date. That means you'll need to rotate your client secret, at minimum, every six months._
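As a sanity check, you can decode the header and payload of the example token above with a few lines of Python (base64url segments need their stripped `=` padding restored; the signature can't be verified without Apple's public key):

```python
import base64
import json

def decode_segment(segment: str) -> dict:
    """Decode one base64url JWT segment, restoring the '=' padding JWTs strip."""
    padded = segment + "=" * (-len(segment) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# Header and payload segments of the example token shown above.
header = decode_segment("eyJhbGciOiJFUzI1NiIsImtpZCI6IlVSS0VZSUQwMDEifQ")
payload = decode_segment(
    "eyJzdWIiOiJjb20ueW91cmNvbXBhbnkuYXBwMSIsIm5iZiI6MTU2MDIwMzIwNywiZXhwIjoxNTYwMjg5NjA3LCJpc3MiOiJBQkMxMjNERUZHIiwiYXVkIjoiaHR0cHM6Ly9hcHBsZWlkLmFwcGxlLmNvbSJ9"
)

print(header)                           # {'alg': 'ES256', 'kid': 'URKEYID001'}
print(payload["exp"] - payload["nbf"])  # 86400 seconds
```

Note that the example payload's lifetime is one day (86400 seconds); your own client secret can use any **exp** up to the six-month maximum.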
app-service Configure Language Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-nodejs.md
When a working Node.js app behaves differently in App Service or has errors, try
- Certain web frameworks may use custom startup scripts when running in production mode.
- Run your app in App Service in development mode. For example, in [MEAN.js](https://meanjs.org/), you can set your app to development mode in runtime by [setting the `NODE_ENV` app setting](configure-common.md).
+
+#### You do not have permission to view this directory or page
+
+After deploying your Node.js code to a native Windows app in App Service, you may see the message `You do not have permission to view this directory or page.` in the browser when navigating to your app's URL. This is most likely because you don't have a *web.config* file (see the [template](https://github.com/projectkudu/kudu/blob/master/Kudu.Core/Scripts/iisnode.config.template) and an [example](https://github.com/Azure-Samples/nodejs-docs-hello-world/blob/master/web.config)).
+
+If you deploy your files by using Git, or by using ZIP deployment [with build automation enabled](deploy-zip.md#enable-build-automation-for-zip-deploy), the deployment engine generates a *web.config* in the web root of your app (`%HOME%\site\wwwroot`) automatically if one of the following conditions is true:
+
+- Your project root has a *package.json* that defines a `start` script that contains the path of a JavaScript file.
+- Your project root has either a *server.js* or an *app.js*.
+
+The generated *web.config* is tailored to the detected start script. For other deployment methods, add this *web.config* manually. Make sure the file is formatted properly.
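For example, a minimal *package.json* like the following satisfies the first condition, so build automation can generate the *web.config* for you (the app name and script path are placeholders):

```json
{
  "name": "my-node-app",
  "scripts": {
    "start": "node server.js"
  }
}
```

Because the `start` script contains the path of a JavaScript file (`server.js`), the deployment engine knows what to point iisnode at.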
+
+If you use [ZIP deployment](deploy-zip.md) (through Visual Studio Code, for example), be sure to [enable build automation](deploy-zip.md#enable-build-automation-for-zip-deploy) because it's not enabled by default. [`az webapp up`](/cli/azure/webapp#az_webapp_up) uses ZIP deployment with build automation enabled.
+
::: zone pivot="platform-linux"

[!INCLUDE [robots933456](../../includes/app-service-web-configure-robots933456.md)]
app-service Configure Ssl Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-ssl-certificate.md
The free certificate comes with the following limitations:
- If a certificate is for a CNAME-mapped domain, the CNAME must be mapped directly to `<app-name>.azurewebsites.net`.

> [!NOTE]
-> The free certificate is issued by DigiCert. For some top-level domains, you must explicitly allow DigiCert as a certificate issuer by creating a [CAA domain record](https://wikipedia.org/wiki/DNS_Certification_Authority_Authorization) with the value: `0 issue digicert.com`.
+> The free certificate is issued by DigiCert. For some domains, you must explicitly allow DigiCert as a certificate issuer by creating a [CAA domain record](https://wikipedia.org/wiki/DNS_Certification_Authority_Authorization) with the value: `0 issue digicert.com`.
>

In the <a href="https://portal.azure.com" target="_blank">Azure portal</a>, from the left menu, select **App Services** > **\<app-name>**.
Use the following table to help you configure the certificate. When finished, cl
| Legal Terms | Click to confirm that you agree with the legal terms. The certificates are obtained from GoDaddy. |

> [!NOTE]
-> App Service Certificates purchased from Azure are issued by GoDaddy. For some top-level domains, you must explicitly allow GoDaddy as a certificate issuer by creating a [CAA domain record](https://wikipedia.org/wiki/DNS_Certification_Authority_Authorization) with the value: `0 issue godaddy.com`
+> App Service Certificates purchased from Azure are issued by GoDaddy. For some domains, you must explicitly allow GoDaddy as a certificate issuer by creating a [CAA domain record](https://wikipedia.org/wiki/DNS_Certification_Authority_Authorization) with the value: `0 issue godaddy.com`
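In standard DNS zone-file syntax, the CAA record from the note above looks like the following (the domain name is a placeholder):

```
contoso.com.  3600  IN  CAA  0 issue "godaddy.com"
```

The `issue` tag authorizes the named CA to issue certificates for the domain; domains without any CAA record allow all CAs by default.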
>

### Store in Azure Key Vault
To replace an expiring certificate, how you update the certificate binding with
### Renew an App Service certificate
+> [!NOTE]
+> Starting September 23, 2021, App Service certificates require domain validation every 395 days. Unlike App Service Managed Certificates, domain re-validation for App Service Certificates is *not* automated.
> [!NOTE]
> The renewal process requires that [the well-known service principal for App Service has the required permissions on your key vault](deploy-resource-manager-template.md#deploy-web-app-certificate-from-key-vault). This permission is configured for you when you import an App Service Certificate through the portal, and should not be removed from your key vault.
app-service Networking https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/networking.md
Title: App Service Environment Networking description: App Service Environment networking details-+ ms.assetid: 6f262f63-aef5-4598-88d2-2f2c2f2bfc24 Last updated 06/30/2021-+
>
-The App Service Environment (ASE) is a single tenant deployment of the Azure App Service that hosts web apps, api apps, and function apps. When you install an ASE, you pick the Azure Virtual Network (VNet) that you want it to be deployed in. All of the inbound and outbound traffic application will be inside the VNet you specify. The ASE is deployed into a single subnet in your VNet. Nothing else can be deployed into that same subnet. The subnet needs to be delegated to Microsoft.Web/HostingEnvironments
+The App Service Environment (ASE) is a single tenant deployment of the Azure App Service that hosts web apps, api apps, and function apps. When you install an ASE, you pick the Azure Virtual Network (VNet) that you want it to be deployed in. All of the inbound and outbound traffic application will be inside the VNet you specify. The ASE is deployed into a single subnet in your VNet. Nothing else can be deployed into that same subnet.
+
+## Subnet requirements
+
+The subnet must be delegated to Microsoft.Web/hostingEnvironments and must be empty.
+
+The size of the subnet can affect the scaling limits of the App Service Plan instances within the ASE. We recommend using a /24 address space (256 addresses) for your subnet to ensure enough addresses to support production scale.
+
+To use a smaller subnet, you should be aware of the following details of the ASE and network setup.
+
+Any given subnet has five addresses reserved for management purposes. On top of the management addresses, ASE will dynamically scale the supporting infrastructure and will use between 4 and 27 addresses depending on configuration, scale, and load. The remaining addresses can be used for instances in the App Service Plan. The minimal size of your subnet is a /27 address space (32 addresses).
+
+The effect of running out of addresses is that you can be restricted from scaling out your App Service Plans in the ASE, or you can experience increased latency during intensive traffic load if we are not able to scale the supporting infrastructure.
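The address math above can be made concrete with a short sketch (the helper name is ours; the constants come from this section: 5 Azure-reserved addresses per subnet and 4-27 addresses for ASE infrastructure):

```python
def plan_instance_addresses(prefix_length: int, infra_addresses: int = 27) -> int:
    """Addresses left for App Service Plan instances in an ASE subnet after
    subtracting Azure's 5 reserved addresses and the ASE infrastructure
    (worst case 27 addresses)."""
    total = 2 ** (32 - prefix_length)
    return total - 5 - infra_addresses

print(plan_instance_addresses(24))                     # recommended /24: 256 - 5 - 27 = 224
print(plan_instance_addresses(27))                     # minimal /27 worst case: 32 - 5 - 27 = 0
print(plan_instance_addresses(27, infra_addresses=4))  # minimal /27 best case: 23
```

This is why a /27 can technically host an ASE but leaves no headroom once the infrastructure scales up under load.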
## Addresses
The ASE has the following network information at creation:
| ASE virtual network | The VNet the ASE is deployed into |
| ASE subnet | The subnet that the ASE is deployed into |
| Domain suffix | The domain suffix that is used by the apps made in this ASE |
-| Virtual IP | This is the VIP type used by the ASE. The two possible values are internal and external |
+| Virtual IP | This setting is the VIP type used by the ASE. The two possible values are internal and external |
| Inbound address | The inbound address is the address your apps on this ASE are reached at. If you have an internal VIP, it is an address in your ASE subnet. If the address is external, it will be a public-facing address |
| Default outbound addresses | The apps in this ASE will use this address, by default, when making outbound calls to the internet. |
app-service Quickstart Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/quickstart-nodejs.md
Title: 'Quickstart: Create a Node.js web app'
description: Deploy your first Node.js Hello World to Azure App Service in minutes. ms.assetid: 582bb3c2-164b-42f5-b081-95bfcb7a502a Previously updated : 08/01/2020 Last updated : 09/14/2021
-zone_pivot_groups: app-service-platform-windows-linux
+#zone_pivot_groups: app-service-platform-windows-linux
+zone_pivot_groups: app-service-ide-oss
adobe-target: true
adobe-target-activity: DocsExp–386541–A/B–Enhanced-Readability-Quickstarts–2.19.2021
adobe-target-experience: Experience B
adobe-target-content: ./quickstart-nodejs-uiex
# Create a Node.js web app in Azure
+In this quickstart, you'll learn how to create and deploy your first Node.js ([Express](https://www.expressjs.com)) web app to [Azure App Service](overview.md). App Service supports various versions of Node.js on both Linux and Windows.
-Get started with Azure App Service by creating a Node.js/Express app locally using Visual Studio Code and then deploying the app to the cloud. Because you use a free App Service tier, you incur no costs to complete this quickstart.
+This quickstart configures an App Service app in the **Free** tier and incurs no cost for your Azure subscription.
-## Prerequisites
+## Set up your initial environment
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?utm_source=campaign&utm_campaign=vscode-tutorial-app-service-extension&mktingSource=vscode-tutorial-app-service-extension).
- <a href="https://git-scm.com/" target="_blank">Install Git</a>
- [Node.js and npm](https://nodejs.org). Run the command `node --version` to verify that Node.js is installed.
- [Visual Studio Code](https://code.visualstudio.com/).
- The [Azure App Service extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azureappservice) for Visual Studio Code.
-## Clone and run a local Node.js application
-
-1. On your local computer, open a terminal and clone the sample repository:
- ```bash
- git clone https://github.com/Azure-Samples/nodejs-docs-hello-world
- ```
-
-1. Navigate into the new app folder:
+- Have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?utm_source=campaign&utm_campaign=vscode-tutorial-app-service-extension&mktingSource=vscode-tutorial-app-service-extension).
+- Install [Node.js and npm](https://nodejs.org). Run the command `node --version` to verify that Node.js is installed.
+- Install [Visual Studio Code](https://code.visualstudio.com/).
+- The [Azure App Service extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azureappservice) for Visual Studio Code.
- ```bash
- cd nodejs-docs-hello-world
- ```
-1. Start the app to test it locally:
- ```bash
- npm start
- ```
-
-1. Open your browser and navigate to `http://localhost:1337`. The browser should display "Hello World!".
+- Have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?utm_source=campaign&utm_campaign=vscode-tutorial-app-service-extension&mktingSource=vscode-tutorial-app-service-extension).
+- Install [Node.js and npm](https://nodejs.org). Run the command `node --version` to verify that Node.js is installed.
+- Install <a href="/cli/azure/install-azure-cli" target="_blank">Azure CLI</a>, with which you run commands in any shell to provision and configure Azure resources.
-1. Press **Ctrl**+**C** in the terminal to stop the server.
-> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=create-app)
+## Create your Node.js application
-## Deploy the app to Azure
+In this step, you create a starter Node.js application and make sure it runs on your computer.
-In this section, you deploy your Node.js app to Azure using VS Code and the Azure App Service extension.
+> [!TIP]
+> If you have already completed the [Node.js tutorial](https://code.visualstudio.com/docs/nodejs/nodejs-tutorial), you can skip ahead to [Deploy to Azure](#deploy-to-azure).
-1. In the terminal, make sure you're in the *nodejs-docs-hello-world* folder, then start Visual Studio Code with the following command:
+1. Create a simple Node.js application using the [Express Generator](https://expressjs.com/starter/generator.html), which is installed by default with Node.js and NPM.
```bash
- code .
+ npx express-generator myExpressApp --view pug
```
-1. In the VS Code activity bar, select the Azure logo to show the **AZURE APP SERVICE** explorer. Select **Sign in to Azure...** and follow the instructions. (See [Troubleshooting Azure sign-in](#troubleshooting-azure-sign-in) below if you run into errors.) Once signed in, the explorer should show the name of your Azure subscription.
+1. Change to the application's directory and install the NPM packages.
- ![Sign in to Azure](media/quickstart-nodejs/sign-in.png)
-
-1. In the **AZURE APP SERVICE** explorer of VS Code, select the blue up arrow icon to deploy your app to Azure. (You can also invoke the same command from the **Command Palette** (**Ctrl**+**Shift**+**P**) by typing 'deploy to web app' and choosing **Azure App Service: Deploy to Web App**).
-
- :::image type="content" source="media/quickstart-nodejs/deploy.png" alt-text="Screenshot of the Azure App service in VS Code showing the blue arrow icon selected.":::
-
-1. Choose the *nodejs-docs-hello-world* folder.
+ ```bash
+ cd myExpressApp
+ npm install
+ ```
-1. Choose a creation option based on the operating system to which you want to deploy:
+1. Start the development server.
- - Linux: Choose **Create new Web App**
- - Windows: Choose **Create new Web App... Advanced**
+ ```bash
+ npm start
+ ```
-1. Type a globally unique name for your web app and press **Enter**. The name must be unique across all of Azure and use only alphanumeric characters ('A-Z', 'a-z', and '0-9') and hyphens ('-').
+1. In a browser, navigate to `http://localhost:3000`. You should see something like this:
-1. If targeting Linux, select a Node.js version when prompted. An **LTS** version is recommended.
+ ![Running Express Application](./media/quickstart-nodejs/express.png)
-1. If targeting Windows, follow the additional prompts:
- 1. Select **Create a new resource group**, then enter a name for the resource group, such as `AppServiceQS-rg`.
- 1. Select **Windows** for the operating system.
- 1. Select **Create new App Service plan**, then enter a name for the plan (such as `AppServiceQS-plan`), then select **F1 Free** for the pricing tier.
- 1. Choose **Skip for now** when prompted about Application Insights.
- 1. Choose a region near you or near resources you wish to access.
+> [!div class="nextstepaction"]
+> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=create-app)
-1. After you respond to all the prompts, VS Code shows the Azure resources that are being created for your app in its notification popup.
+## Deploy to Azure
- When deploying to Linux, select **Yes** when prompted to update your configuration to run `npm install` on the target Linux server.
+Before you continue, ensure that you have all the prerequisites installed and configured.
- ![Prompt to update configuration on the target Linux server](media/quickstart-nodejs/server-build.png)
+> [!NOTE]
+> For your Node.js application to run in Azure, it needs to listen on the port provided by the `PORT` environment variable. In your generated Express app, this environment variable is already used in the startup script *bin/www* (search for `process.env.PORT`).
+>
-1. Select **Yes** when prompted with **Always deploy the workspace "nodejs-docs-hello-world" to (app name)"**. Selecting **Yes** tells VS Code to automatically target the same App Service Web App with subsequent deployments.
-1. If deploying to Linux, select **Browse Website** in the prompt to view your freshly deployed web app once deployment is complete. The browser should display "Hello World!"
+#### Sign in to Azure
-1. If deploying to Windows, you must first set the Node.js version number for the web app:
+1. In the terminal, make sure you're in the *myExpressApp* directory, then start Visual Studio Code with the following command:
- 1. In VS Code, expand the node for the new app service, right-click **Application Settings**, and select **Add New Setting...**:
+ ```bash
+ code .
+ ```
- ![Add app setting command](media/quickstart-nodejs/add-setting.png)
+1. In Visual Studio Code, in the [Activity Bar](https://code.visualstudio.com/docs/getstarted/userinterface), select the **Azure** logo.
- 1. Enter `WEBSITE_NODE_DEFAULT_VERSION` for the setting key.
- 1. Enter `10.15.2` for the setting value.
- 1. Right-click the node for the app service and select **Restart**
+1. In the **App Service** explorer, select **Sign in to Azure...** and follow the instructions.
- ![Restart app service command](media/quickstart-nodejs/restart.png)
+ In Visual Studio Code, you should see your Azure email address in the Status Bar and your subscription in the **AZURE APP SERVICE** explorer.
- 1. Right-click the node for the app service once more and select **Browse Website**.
+ ![sign in to Azure](./media/quickstart-nodejs/sign-in.png)
> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=deploy-app)
-
-### Troubleshooting Azure sign-in
+> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=getting-started)
-If you see the error **"Cannot find subscription with name [subscription ID]"** when signing into Azure, it might be because you're behind a proxy and unable to reach the Azure API. Configure `HTTP_PROXY` and `HTTPS_PROXY` environment variables with your proxy information in your terminal using `export`.
+#### Configure the App Service app and deploy code
-```bash
-export HTTPS_PROXY=https://username:password@proxy:8080
-export HTTP_PROXY=http://username:password@proxy:8080
-```
+1. In the **App Service** explorer, select the **Deploy to Web App** icon.
-If setting the environment variables doesn't correct the issue, contact us by selecting the **I ran into an issue** button above.
   :::image type="content" source="media/quickstart-nodejs/deploy.png" alt-text="Screenshot of the Azure App Service explorer in Visual Studio Code showing the blue arrow icon selected.":::
+
+1. Choose the *myExpressApp* folder.
-### Update the app
+# [Deploy to Linux](#tab/linux)
-You can deploy changes to this app by making edits in VS Code, saving your files, and then using the same process as before only choosing the existing app rather than creating a new one.
+3. Choose **Create new Web App**. A Linux container is used by default.
+1. Type a globally unique name for your web app and press **Enter**. The name must be unique across all of Azure and use only alphanumeric characters ('A-Z', 'a-z', and '0-9') and hyphens ('-').
+1. In Select a runtime stack, select the Node.js version you want. An **LTS** version is recommended.
+1. In Select a pricing tier, select **Free (F1)** and wait for the resources to be provisioned in Azure.
+1. In the popup **Always deploy the workspace "myExpressApp" to \<app-name>"**, select **Yes**. This way, as long as you're in the same workspace, Visual Studio Code deploys to the same App Service app each time.
-## Viewing Logs
+ While Visual Studio Code provisions the Azure resources and deploys the code, it shows [progress notifications](https://code.visualstudio.com/api/references/extension-guidelines#notifications).
-You can view log output (calls to `console.log`) from the app directly in the VS Code output window.
+1. Once deployment completes, select **Browse Website** in the notification popup. The browser should display the Express default page.
-1. In the **AZURE APP SERVICE** explorer, right-click the app node and choose **Start Streaming Logs**.
+# [Deploy to Windows](#tab/windows)
- ![Start Streaming Logs](media/quickstart-nodejs/view-logs.png)
+3. Choose **Create new Web App... Advanced**.
+1. Type a globally unique name for your web app and press **Enter**. The name must be unique across all of Azure and use only alphanumeric characters ('A-Z', 'a-z', and '0-9') and hyphens ('-').
+1. Select **Create a new resource group**, then enter a name for the resource group, such as *AppServiceQS-rg*.
+1. Select the Node.js version you want. An **LTS** version is recommended.
+1. Select **Windows** for the operating system.
+1. Select the location you want to serve your app from. For example, *West Europe*.
+1. Select **Create new App Service plan**, then enter a name for the plan (such as *AppServiceQS-plan*), then select **F1 Free** for the pricing tier.
+1. For **Select an Application Insights resource for your app**, select **Skip for now** and wait for the resources to be provisioned in Azure.
+1. In the popup **Always deploy the workspace "myExpressApp" to \<app-name>"**, select **Yes**. This way, as long as you're in the same workspace, Visual Studio Code deploys to the same App Service app each time.
-1. When prompted, choose to enable logging and restart the application. Once the app is restarted, the VS Code output window opens with a connection to the log stream.
+ While Visual Studio Code provisions the Azure resources and deploys the code, it shows [progress notifications](https://code.visualstudio.com/api/references/extension-guidelines#notifications).
- :::image type="content" source="media/quickstart-nodejs/enable-restart.png" alt-text="Screenshot of the Visual Studio Code prompt to enable logging and restart the application with the Yes button selected.":::
+ > [!NOTE]
+ > When deployment completes, your Azure app doesn't run yet because your project root doesn't have a *web.config*. Follow the remaining steps to generate it automatically. For more information, see [You do not have permission to view this directory or page](configure-language-nodejs.md#you-do-not-have-permission-to-view-this-directory-or-page).
-1. After a few seconds, the output window shows a message indicating that you're connected to the log-streaming service. You can generate more output activity by refreshing the page in the browser.
+1. In the **App Service** explorer in Visual Studio Code, expand the node for the new app, right-click **Application Settings**, and select **Add New Setting**:
- <pre>
- Connecting to log stream...
- 2020-03-04T19:29:44 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours.
- Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
- </pre>
+ ![Add app setting command](media/quickstart-nodejs/add-setting.png)
-> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=tailing-logs)
+1. Enter `SCM_DO_BUILD_DURING_DEPLOYMENT` for the setting key.
+1. Enter `true` for the setting value.
-## Next steps
+ This app setting enables build automation at deploy time, which automatically detects the start script and generates the *web.config* with it.
-Congratulations, you've successfully completed this quickstart!
+1. In the **App Service** explorer, select the **Deploy to Web App** icon again, and confirm by clicking **Deploy**.
+1. Wait for deployment to complete, then select **Browse Website** in the notification popup. The browser should display the Express default page.
-> [!div class="nextstepaction"]
-> [Tutorial: Node.js app with MongoDB](tutorial-nodejs-mongodb-app.md)
+---
> [!div class="nextstepaction"]
-> [Configure Node.js app](configure-language-nodejs.md)
-
-Check out the other Azure extensions.
-
-* [Cosmos DB](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-cosmosdb)
-* [Azure Functions](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions)
-* [Docker Tools](https://marketplace.visualstudio.com/items?itemName=PeterJausovec.vscode-docker)
-* [Azure CLI Tools](https://marketplace.visualstudio.com/items?itemName=ms-vscode.azurecli)
-* [Azure Resource Manager Tools](https://marketplace.visualstudio.com/items?itemName=msazurermtools.azurerm-vscode-tools)
+> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=deploy-app)
-Or get them all by installing the
-[Node Pack for Azure](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) extension pack.
::: zone-end
-## Prerequisites
-If you don't have an Azure account, [sign up today](https://azure.microsoft.com/free/?utm_source=campaign&utm_campaign=vscode-tutorial-app-service-extension&mktingSource=vscode-tutorial-app-service-extension) for a free account with $200 in Azure credits to try out any combination of services.
+In the terminal, make sure you're in the *myExpressApp* directory, and deploy the code in your local folder (*myExpressApp*) using the `az webapp up` command:
-You need [Visual Studio Code](https://code.visualstudio.com/) installed along with [Node.js and npm](https://nodejs.org/en/download), the Node.js package manager.
+# [Deploy to Linux](#tab/linux)
-You will also need to install the [Azure App Service extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azureappservice), which you can use to create, manage, and deploy Linux Web Apps on the Azure Platform as a Service (PaaS).
+```azurecli
+az webapp up --sku F1 --name <app-name>
+```
-### Sign in
+# [Deploy to Windows](#tab/windows)
-Once the extension is installed, log into your Azure account. In the Activity Bar, select the Azure logo to show the **AZURE APP SERVICE** explorer. Select **Sign in to Azure...** and follow the instructions.
+```azurecli
+az webapp up --sku F1 --name <app-name> --os-type Windows
+```
-![sign in to Azure](./media/quickstart-nodejs/sign-in.png)
+---
-### Troubleshooting
+- If the `az` command isn't recognized, be sure you have the Azure CLI installed as described in [Set up your initial environment](#set-up-your-initial-environment).
+- Replace `<app-name>` with a name that's unique across all of Azure (*valid characters are `a-z`, `0-9`, and `-`*). A good pattern is to use a combination of your company name and an app identifier.
+- The `--sku F1` argument creates the web app on the Free pricing tier, which incurs no cost.
+- You can optionally include the argument `--location <location-name>` where `<location-name>` is an available Azure region. You can retrieve a list of allowable regions for your Azure account by running the [`az account list-locations`](/cli/azure/account#az_account_list_locations) command.
+- The command creates a Linux app for Node.js by default. To create a Windows app instead, use the `--os-type` argument.
+- If you see the error, "Could not auto-detect the runtime stack of your app," make sure you're running the command in the *myExpressApp* directory (see [Troubleshooting auto-detect issues with az webapp up](https://github.com/Azure/app-service-linux-docs/blob/master/AzWebAppUP/runtime_detection.md)).
-If you see the error **"Cannot find subscription with name [subscription ID]"**, it might be because you're behind a proxy and unable to reach the Azure API. Configure `HTTP_PROXY` and `HTTPS_PROXY` environment variables with your proxy information in your terminal using `export`.
+The command may take a few minutes to complete. While running, it provides messages about creating the resource group, the App Service plan, and the app resource, configuring logging, and doing ZIP deployment. It then gives the message, "You can launch the app at http://&lt;app-name&gt;.azurewebsites.net", which is the app's URL on Azure.
-```sh
-export HTTPS_PROXY=https://username:password@proxy:8080
-export HTTP_PROXY=http://username:password@proxy:8080
-```
+<pre>
+The webapp '&lt;app-name>' doesn't exist
+Creating Resource group '&lt;group-name>' ...
+Resource group creation complete
+Creating AppServicePlan '&lt;app-service-plan-name>' ...
+Creating webapp '&lt;app-name>' ...
+Configuring default logging for the app, if not already enabled
+Creating zip with contents of dir /home/cephas/myExpressApp ...
+Getting scm site credentials for zip deployment
+Starting zip deployment. This operation can take a while to complete ...
+Deployment endpoint responded with status code 202
+You can launch the app at http://&lt;app-name>.azurewebsites.net
+{
+ "URL": "http://&lt;app-name>.azurewebsites.net",
+ "appserviceplan": "&lt;app-service-plan-name>",
+ "location": "centralus",
+ "name": "&lt;app-name>",
+ "os": "&lt;os-type>",
+ "resourcegroup": "&lt;group-name>",
+ "runtime_version": "node|10.14",
+ "runtime_version_detected": "0.0",
+ "sku": "FREE",
+ "src_path": "//home//cephas//myExpressApp"
+}
+</pre>
-If setting the environment variables doesn't correct the issue, contact us by selecting the **I ran into an issue** button below.
-### Prerequisite check
-Before you continue, ensure that you have all the prerequisites installed and configured.
+## Redeploy updates
-In VS Code, you should see your Azure email address in the Status Bar and your subscription in the **AZURE APP SERVICE** explorer.
+You can deploy changes to this app by making edits in Visual Studio Code, saving your files, and then redeploying to your Azure app. For example:
-> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=getting-started)
+1. From the sample project, open *views/index.pug* and change
-## Create your Node.js application
+ ```PUG
+ p Welcome to #{title}
+ ```
-Next, create a Node.js application that can be deployed to the Cloud. This quickstart uses an application generator to quickly scaffold out the application from a terminal.
+ to
+
+ ```PUG
+ p Welcome to Azure!
+ ```
-> [!TIP]
-> If you have already completed the [Node.js tutorial](https://code.visualstudio.com/docs/nodejs/nodejs-tutorial), you can skip ahead to [Deploy to Azure](#deploy-to-azure).
-### Scaffold a new application with the Express Generator
+2. In the **App Service** explorer, select the **Deploy to Web App** icon again, and confirm by clicking **Deploy**.
-[Express](https://www.expressjs.com) is a popular framework for building and running Node.js applications. You can scaffold (create) a new Express application using the [Express Generator](https://expressjs.com/en/starter/generator.html) tool. The Express Generator is shipped as an npm module and can be run directly (without installation) by using the npm command-line tool `npx`.
+1. Wait for deployment to complete, then select **Browse Website** in the notification popup. You should see that the `Welcome to Express` message has been changed to `Welcome to Azure!`.
-```bash
-npx express-generator myExpressApp --view pug --git
-```
-The `--view pug --git` parameters tell the generator to use the [pug](https://pugjs.org/api/getting-started.html) template engine (formerly known as `jade`) and to create a `.gitignore` file.
-To install all of the application's dependencies, go to the new folder and run `npm install`.
+2. Save your changes, then redeploy the app using the `az webapp up` command again with no arguments:
-```bash
-cd myExpressApp
-npm install
-```
+ ```azurecli
+ az webapp up
+ ```
+
+ This command uses values that are cached locally in the *.azure/config* file, such as the app name, resource group, and App Service plan.
+
+1. Once deployment is complete, refresh the webpage `http://<app-name>.azurewebsites.net`. You should see that the `Welcome to Express` message has been changed to `Welcome to Azure!`.
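For reference, those cached defaults live in a plain INI file under the project's *.azure* folder. The exact keys can vary by CLI version, so treat this fragment as an illustrative sketch rather than a definitive layout:

```ini
[defaults]
; values written by `az webapp up` on first run (names are placeholders)
group = <group-name>
sku = F1
appserviceplan = <app-service-plan-name>
location = centralus
web = <app-name>
```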
-### Run the application
-Next, ensure that the application runs. From the terminal, start the application using the `npm start` command to start the server.
+## Stream Logs
-```bash
-npm start
-```
-Now, open your browser and navigate to `http://localhost:3000`, where you should see something like this:
+You can stream log output (calls to `console.log()`) from the Azure app directly in the Visual Studio Code output window.
-![Running Express Application](./media/quickstart-nodejs/express.png)
+1. In the **App Service** explorer, right-click the app node and choose **Start Streaming Logs**.
-> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=create-app)
+ ![Start Streaming Logs](media/quickstart-nodejs/view-logs.png)
-## Deploy to Azure
+1. If asked to restart the app, click **Yes**. Once the app is restarted, the Visual Studio Code output window opens with a connection to the log stream.
-In this section, you deploy your Node.js app using VS Code and the Azure App Service extension. This quickstart uses the most basic deployment model where your app is zipped and deployed to an Azure Web App on Linux.
+1. After a few seconds, the output window shows a message indicating that you're connected to the log-streaming service. You can generate more output activity by refreshing the page in the browser.
-### Deploy using Azure App Service
+ <pre>
+ Connecting to log stream...
+ 2020-03-04T19:29:44 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours.
+ Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
+ </pre>
-First, open your application folder in VS Code.
+> [!div class="nextstepaction"]
+> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=tailing-logs)
-```bash
-code .
-```
-In the **AZURE APP SERVICE** explorer, select the blue up arrow icon to deploy your app to Azure.
+You can access the console logs generated from inside the app and the container in which it runs. Logs include any output generated by calls to `console.log()`.
-> [!TIP]
-> You can also deploy from the **Command Palette** (CTRL + SHIFT + P) by typing 'deploy to web app' and running the **Azure App Service: Deploy to Web App** command.
+To stream logs, run the [az webapp log tail](/cli/azure/webapp/log#az_webapp_log_tail) command:
-1. Choose the directory that you currently have open, `myExpressApp`.
+```azurecli
+az webapp log tail
+```
-1. Choose **Create new Web App**, which deploys to App Service on Linux by default.
+The command uses the resource group name cached in the *.azure/config* file.
-1. Type a globally unique name for your Web App and press ENTER. Valid characters for an app name are 'a-z', '0-9', and '-'.
+You can also include the `--logs` parameter with the `az webapp up` command to automatically open the log stream on deployment.
-1. Choose your **Node.js version**, LTS is recommended.
+Refresh the app in the browser to generate console logs, which include messages describing HTTP requests to the app. If no output appears immediately, try again in 30 seconds.
- The notification channel shows the Azure resources that are being created for your app.
+To stop log streaming at any time, press **Ctrl**+**C** in the terminal.
-1. Select **Yes** when prompted to update your configuration to run `npm install` on the target server. Your app is then deployed.
- :::image type="content" source="./media/quickstart-nodejs/server-build.png" alt-text="Screenshot of the prompt to update your configuration on the target server with the yes button selected.":::
+## Clean up resources
-1. When the deployment starts, you're prompted to update your workspace so that later deployments will automatically target the same App Service Web App. Choose **Yes** to ensure your changes are deployed to the correct app.
- :::image type="content" source="./media/quickstart-nodejs/save-configuration.png" alt-text="Screenshot of the prompt to update your workspace with the yes button selected.":::
+In the preceding steps, you created Azure resources in a resource group. All the resources from this quickstart live in that resource group, so to clean up, you only need to delete the resource group.
-> [!TIP]
-> Be sure that your application is listening on the port provided by the PORT environment variable: `process.env.PORT`.
-### Browse the app in Azure
+1. In the Azure extension of Visual Studio Code, expand the **Resource Groups** explorer.
-Once the deployment completes, select **Browse Website** in the prompt to view your freshly deployed web app.
+1. Expand the subscription, right-click the resource group you created earlier, and select **Delete**.
-### Troubleshooting
+ :::image type="content" source="media/quickstart-nodejs/clean-up.png" alt-text="Screenshot of the Visual Studio Code navigation to delete a resource that contains App Service resources.":::
-If you see the error **"You do not have permission to view this directory or page."**, then the application probably failed to start correctly. Head to the next section and view the log output to find and fix the error. If you aren't able to fix it, contact us by selecting the **I ran into an issue** button below. We're happy to help!
+1. When prompted, confirm your deletion by entering the name of the resource group you're deleting. Once you confirm, the resource group is deleted, and you see a [notification](https://code.visualstudio.com/api/references/extension-guidelines#notifications) when it's done.
> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=deploy-app)
+> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=clean-up)
-### Update the app
-You can deploy changes to this app by using the same process and choosing the existing app rather than creating a new one.
-## Viewing Logs
+In the preceding steps, you created Azure resources in a resource group. The resource group has a name like "appsvc_rg_Linux_CentralUS" depending on your location.
-In this section, you learn how to view (or "tail") the logs from the running App Service app. Any calls to `console.log` in the app are displayed in the output window in Visual Studio Code.
+If you don't expect to need these resources in the future, delete the resource group by running the following command:
-Find the app in the **AZURE APP SERVICE** explorer, right-click the app, and choose **Start Streaming Logs**.
+```azurecli
+az group delete --no-wait
+```
-The VS Code output window opens with a connection to the log stream.
+The command uses the resource group name cached in the *.azure/config* file.
-![Start Streaming Logs](./media/quickstart-nodejs/view-logs.png)
+The `--no-wait` argument allows the command to return before the operation is complete.
-After a few seconds, you'll see a message indicating that you're connected to the log-streaming service. Refresh the page a few times to see more activity.
+## Next steps
-<pre>
-2019-09-20 20:37:39.574 INFO - Initiating warmup request to container msdocs-vscode-node_2_00ac292a for site msdocs-vscode-node
-2019-09-20 20:37:55.011 INFO - Waiting for response to warmup request for container msdocs-vscode-node_2_00ac292a. Elapsed time = 15.4373071 sec
-2019-09-20 20:38:08.233 INFO - Container msdocs-vscode-node_2_00ac292a for site msdocs-vscode-node initialized successfully and is ready to serve requests.
-2019-09-20T20:38:21 Startup Request, url: /Default.cshtml, method: GET, type: request, pid: 61,1,7, SCM_SKIP_SSL_VALIDATION: 0, SCM_BIN_PATH: /opt/Kudu/bin, ScmType: None
-</pre>
+Congratulations, you've successfully completed this quickstart!
> [!div class="nextstepaction"]
-> [I ran into an issue](https://www.research.net/r/PWZWZ52?tutorial=node-deployment-azure-app-service&step=tailing-logs)
-
-## Next steps
+> [Tutorial: Node.js app with MongoDB](tutorial-nodejs-mongodb-app.md)
-Congratulations, you've successfully completed this quickstart!
+> [!div class="nextstepaction"]
+> [Configure Node.js app](configure-language-nodejs.md)
-Next, check out the other Azure extensions.
+Check out the other Azure extensions.
* [Cosmos DB](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-cosmosdb) * [Azure Functions](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions)
Next, check out the other Azure extensions.
Or get them all by installing the [Node Pack for Azure](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) extension pack.
app-service Quickstart Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/quickstart-python.md
In this quickstart, you deploy a Python web app to [App Service on Linux](overvi
1. Have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio). 1. Install <a href="https://www.python.org/downloads/" target="_blank">Python 3.6 or higher</a>.
-1. Install the <a href="/cli/azure/install-azure-cli" target="_blank">Azure CLI</a> 2.0.80 or higher, with which you run commands in any shell to provision and configure Azure resources.
+1. Install the <a href="/cli/azure/install-azure-cli" target="_blank">Azure CLI</a>, with which you run commands in any shell to provision and configure Azure resources.
Open a terminal window and check your Python version is 3.6 or higher:
app-service Samples Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/samples-cli.md
Title: CLI Samples
+ Title: Azure CLI Samples for Azure App Service | Microsoft Docs
description: Find Azure CLI samples for some of the common App Service scenarios. Learn how to automate your App Service deployment or management tasks. tags: azure-service-management ms.assetid: 53e6a15a-370a-48df-8618-c6737e26acec Previously updated : 07/07/2020- Last updated : 09/17/2021+
+keywords: azure cli samples, azure cli examples, azure cli code samples
# CLI samples for Azure App Service
app-service Scenario Secure App Access Microsoft Graph As User https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/scenario-secure-app-access-microsoft-graph-as-user.md
Previously updated : 09/14/2021 Last updated : 09/23/2021
Select **Delegated permissions**, and then select **User.Read** from the list. S
## Configure App Service to return a usable access token
-The web app now has the required permissions to access Microsoft Graph as the signed-in user. In this step, you configure App Service authentication and authorization to give you a usable access token for accessing Microsoft Graph. For this step, you need the client/app ID of the downstream service (Microsoft Graph). The app ID for Microsoft Graph is *00000003-0000-0000-c000-000000000000*.
+The web app now has the required permissions to access Microsoft Graph as the signed-in user. In this step, you configure App Service authentication and authorization to give you a usable access token for accessing Microsoft Graph. For this step, you need to add the User.Read scope for the downstream service (Microsoft Graph): `https://graph.microsoft.com/User.Read`.
> [!IMPORTANT]
> If you don't configure App Service to return a usable access token, you receive a `CompactToken parsing failed with error code: 80049217` error when you call Microsoft Graph APIs in your code.
The Azure Resource Explorer is now opened with your web app selected in the reso
In the left browser, drill down to **config** > **authsettingsV2**.
-In the **authsettingsV2** view, select **Edit**. Find the **login** section of **identityProviders** -> **azureActiveDirectory** and add the following **loginParameters** settings: `"loginParameters":[ "response_type=code id_token","resource=00000003-0000-0000-c000-000000000000" ]` .
+In the **authsettingsV2** view, select **Edit**. Find the **login** section of **identityProviders** -> **azureActiveDirectory** and add the following **loginParameters** settings: `"loginParameters":[ "response_type=code id_token","scope=openid offline_access profile https://graph.microsoft.com/User.Read" ]` .
```json
"identityProviders": {
In the **authsettingsV2** view, select **Edit**. Find the **login** section of *
"login": {
    "loginParameters":[
        "response_type=code id_token",
- "resource=00000003-0000-0000-c000-000000000000"
+ "scope=openid offline_access profile https://graph.microsoft.com/User.Read"
    ]
  }
}
Get your existing 'config/authsettingsv2' settings and save to a local *authse
az rest --method GET --url '/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Web/sites/{WEBAPP_NAME}/config/authsettingsv2/list?api-version=2020-06-01' > authsettings.json
```
-Open the authsettings.json file using your preferred text editor. Find the **login** section of **identityProviders** -> **azureActiveDirectory** and add the following **loginParameters** settings: `"loginParameters":[ "response_type=code id_token","resource=00000003-0000-0000-c000-000000000000" ]` .
+Open the authsettings.json file using your preferred text editor. Find the **login** section of **identityProviders** -> **azureActiveDirectory** and add the following **loginParameters** settings: `"loginParameters":[ "response_type=code id_token","scope=openid offline_access profile https://graph.microsoft.com/User.Read" ]` .
```json
"identityProviders": {
Open the authsettings.json file using your preferred text editor. Find the **log
"login": {
    "loginParameters":[
        "response_type=code id_token",
- "resource=00000003-0000-0000-c000-000000000000"
+ "scope=openid offline_access profile https://graph.microsoft.com/User.Read"
    ]
  }
}
az rest --method PUT --url '/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RES
```
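The manual edit of the downloaded file can also be scripted. A hedged sketch (assumes `python3` is available; the placeholder file created here stands in for the one produced by the `az rest --method GET` call):

```shell
# Placeholder standing in for the real downloaded authsettings.json
cat > authsettings.json <<'EOF'
{"properties": {"identityProviders": {"azureActiveDirectory": {"login": {}}}}}
EOF

python3 - <<'EOF'
import json

with open("authsettings.json") as f:
    settings = json.load(f)

# Add the loginParameters described in the article
login = settings["properties"]["identityProviders"]["azureActiveDirectory"]["login"]
login["loginParameters"] = [
    "response_type=code id_token",
    "scope=openid offline_access profile https://graph.microsoft.com/User.Read",
]

with open("authsettings.json", "w") as f:
    json.dump(settings, f, indent=2)
EOF
```

The patched file can then be uploaded with the `az rest --method PUT` call shown above.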
-## Update the issuer URL
-In the [Azure portal](https://portal.azure.com), navigate to your App Service and then the **Authentication** blade.
-
-Click the **Edit** link next to the Microsoft identity provider.
-
-Check the **Issuer URL** in the **Basics** tab. If the **Issuer URL** contains "/v2.0" at the end of it, remove it and click **Save**. If you don't remove "/v2.0", you get an *AADSTS901002: The 'resource' request parameter is not supported* when you sign in to the web app.
## Call Microsoft Graph (.NET)

Your web app now has the required permissions and also adds Microsoft Graph's client ID to the login parameters. Using the [Microsoft.Identity.Web library](https://github.com/AzureAD/microsoft-identity-web/), the web app gets an access token for authentication with Microsoft Graph. In version 1.2.0 and later, the Microsoft.Identity.Web library integrates with and can run alongside the App Service authentication/authorization module. Microsoft.Identity.Web detects that the web app is hosted in App Service and gets the access token from the App Service authentication/authorization module. The access token is then passed along to authenticated requests with the Microsoft Graph API.
app-service Tutorial Auth Aad https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-auth-aad.md
description: Learn how to use App Service authentication and authorization to se
keywords: app service, azure app service, authN, authZ, secure, security, multi-tiered, azure active directory, azure ad ms.devlang: dotnet Previously updated : 04/26/2021 Last updated : 09/23/2021 zone_pivot_groups: app-service-platform-windows-linux
The front-end app now has the required permissions to access the back-end app as
1. Save your settings by clicking **PUT**.
+ ::: zone pivot="platform-linux"
+
+ > [!NOTE]
+ > For Linux apps, there's a temporary requirement to configure a versioning setting for the back-end app registration. In the Cloud Shell, configure it with the following commands. Be sure to replace *\<back-end-client-id>* with your back end's client ID.
+ >
+ > ```azurecli-interactive
+ > id=$(az ad app show --id <back-end-client-id> --query objectId --output tsv)
+ > az rest --method PATCH --url https://graph.microsoft.com/v1.0/applications/$id --body "{'api':{'requestedAccessTokenVersion':2}}"
+ > ```
+
+ ::: zone-end
+
Your apps are now configured. The front end is now ready to access the back end with a proper access token. For information on how to configure the access token for other providers, see [Refresh identity provider tokens](configure-authentication-oauth-tokens.md#refresh-auth-tokens).
application-gateway Key Vault Certs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/key-vault-certs.md
Key Vault integration offers two models for TLS termination:

-- You can explicitly provide TLS/SSL certificates attached to the listener. This model is the traditional way to pass TLS/SSL certificates to Application Gateway for TLS termination.
-- You can optionally provide a reference to an existing Key Vault certificate or secret when you create an HTTPS-enabled listener.
+- Explicitly provide TLS/SSL certificates attached to the listener. This model is the traditional way to pass TLS/SSL certificates to Application Gateway for TLS termination.
+- Optionally provide a reference to an existing Key Vault certificate or secret when you create an HTTPS-enabled listener.
Application Gateway integration with Key Vault offers many benefits, including:
Application Gateway integration with Key Vault offers many benefits, including:
- Support for importing existing certificates into your key vault. Or use Key Vault APIs to create and manage new certificates with any of the trusted Key Vault partners.
- Support for automatic renewal of certificates that are stored in your key vault.
-Application Gateway currently supports software-validated certificates only. Hardware security module (HSM)-validated certificates are not supported. After Application Gateway is configured to use Key Vault certificates, its instances retrieve the certificate from Key Vault and install them locally for TLS termination. The instances also poll Key Vault at 4-hour intervals to retrieve a renewed version of the certificate, if it exists. If an updated certificate is found, the TLS/SSL certificate currently associated with the HTTPS listener is automatically rotated.
+## Supported certificates
-Application Gateway uses Secret Identifier in Key Vault to reference the certificates. For Azure PowerShell, CLI, or ARM it is strongly recommended to use a secret identifier that doesn't specify a version. This way, Azure Application Gateway will automatically rotate the certificate, if a newer version is available in your Key Vault. An example of a secret URI without a version is `https://myvault.vault.azure.net/secrets/mysecret/`.
+Application Gateway currently supports software-validated certificates only. Hardware security module (HSM)-validated certificates are not supported.
-> [!NOTE]
-> The Azure portal only supports KeyVault Certificates, not secrets. Application Gateway still supports referencing secrets from KeyVault, but only through non-Portal resources like PowerShell, CLI, API, ARM templates, etc.
+After Application Gateway is configured to use Key Vault certificates, its instances retrieve the certificate from Key Vault and install them locally for TLS termination. The instances also poll Key Vault at four-hour intervals to retrieve a renewed version of the certificate, if it exists. If an updated certificate is found, the TLS/SSL certificate that's currently associated with the HTTPS listener is automatically rotated.
+
+Application Gateway uses a secret identifier in Key Vault to reference the certificates. For Azure PowerShell, the Azure CLI, or Azure Resource Manager, we strongly recommend that you use a secret identifier that doesn't specify a version. This way, Application Gateway will automatically rotate the certificate if a newer version is available in your key vault. An example of a secret URI without a version is `https://myvault.vault.azure.net/secrets/mysecret/`.
+
+The Azure portal supports only Key Vault certificates, not secrets. Application Gateway still supports referencing secrets from Key Vault, but only through non-portal resources like PowerShell, the Azure CLI, APIs, and Azure Resource Manager templates (ARM templates).
> [!WARNING]
-> Azure Application Gateway currently only supports Key Vault accounts in the same subscription as the Application Gateway resource. Choosing a Key Vault under a different subscription than your Application Gateway will result in a failure.
+> Azure Application Gateway currently supports only Key Vault accounts in the same subscription as the Application Gateway resource. Choosing a key vault under a different subscription than your Application Gateway will result in a failure.
## Certificate settings in Key Vault
-For TLS termination, Application Gateway supports certificates in Personal Information Exchange (PFX) format. You can either import an existing certificate or create a new one in your key vault. Ensure that the certificate's status is set to Enabled in Key Vault to avoid any failures.
+For TLS termination, Application Gateway supports certificates in Personal Information Exchange (PFX) format. You can either import an existing certificate or create a new one in your key vault. To avoid any failures, ensure that the certificate's status is set to **Enabled** in Key Vault.
## How integration works

Application Gateway integration with Key Vault is a three-step configuration process:
- ![Key vault certificates](media/key-vault-certs/ag-kv.png)
+![Diagram that shows three steps for integrating Application Gateway with Key Vault.](media/key-vault-certs/ag-kv.png)
-1. **Create a user-assigned managed identity**
+### Create a user-assigned managed identity
- You create or reuse an existing user-assigned managed identity, which Application Gateway uses to retrieve certificates from Key Vault on your behalf. For more information, see [Create, list, delete, or assign a role to a user-assigned managed identity using the Azure portal](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md). This step creates a new identity in the Azure Active Directory tenant. The identity is trusted by the subscription that's used to create the identity.
+You either create a user-assigned managed identity or reuse an existing one. Application Gateway uses the managed identity to retrieve certificates from Key Vault on your behalf. For more information, see [Create, list, delete, or assign a role to a user-assigned managed identity using the Azure portal](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md).
-1. **Configure your key vault**
+This step creates a new identity in the Azure Active Directory tenant. The identity is trusted by the subscription that's used to create the identity.
- **Access Policies** - After you've created a certificate in key vault by referring to the preceding section, you must define Key Vault's Access Policies to use the user-assigned managed identity with your Key Vault.
+### Configure your key vault
+
+Define access policies to use the user-assigned managed identity with your key vault:
- i) Go to key vault in the Azure portal.<br />
- ii) Open the Access policies pane.<br />
- iii) If using the permission model "Vault access policy": Click "+Add Access Policy". Choose "Get" permission for "Secret permissions" field and choose your user-assigned managed identity for "Select principal" field, and click on "Save".<br />
- iv) If using the permission model "Azure role-based access control": Add a role assignment for the user-managed identity to the Azure Key Vault for the role "Key Vault Secrets User."<br />
-
- **Firewalls and virtual networks** - When using a restricted key vault, you can configure your Application Gateway in the following manner.
-
- > [!IMPORTANT]
- > Starting March 15th 2021, Key Vault recognizes Azure Application Gateway as one of the Trusted Services, thus allowing you to build a secure network boundary in Azure. This gives you an ability to deny access to traffic from all networks (including internet traffic) to Key Vault but still make it accessible for Application Gateway resource under your subscription.
- >
- > Please note, in addition to the **Trusted Services** setting, you will need to allow your **Application Gateway's subnet** with **Service endpoint configuration** to grant access for all the scenarios.
-
 - a) Under Key Vault's Networking blade <br />
 - b) Choose Private endpoint and selected networks in "Firewall and Virtual Networks" tab <br/>
 - c) Then using Virtual Networks, add your Application Gateway's virtual network and Subnet. During the process also configure 'Microsoft.KeyVault' service endpoint by selecting its checkbox. <br/>
 - d) Finally, select "Yes" to allow Trusted Services to bypass Key Vault's firewall. <br/>
+1. In the Azure portal, go to **Key Vault**.
+1. Open the **Access policies** pane.
+1. If you're using the permission model **Vault access policy**: Select **+ Add Access Policy**, select **Get** for **Secret permissions**, and choose your user-assigned managed identity for **Select principal**. Then select **Save**.
+
+ If you're using the permission model **Azure role-based access control**: Add a role assignment for the user-assigned managed identity to the Azure key vault for the role **Key Vault Secrets User**.
+
+As of March 15, 2021, Key Vault recognizes Application Gateway as a trusted service, so you can build a secure network boundary in Azure. You can deny access to traffic from all networks (including internet traffic) to Key Vault but still make Key Vault accessible for an Application Gateway resource under your subscription.
+
+When you're using a restricted key vault, use the following steps to configure Application Gateway to use firewalls and virtual networks:
+
+1. In the Azure portal, in your key vault, select **Networking**.
+1. On the **Firewalls and virtual networks** tab, select **Private endpoint and selected networks**.
+1. For **Virtual networks**, select **+ Add existing virtual networks**, and then add the virtual network and subnet for your Application Gateway instance. During the process, also configure the `Microsoft.KeyVault` service endpoint by selecting its checkbox.
+1. Select **Yes** to allow trusted services to bypass the key vault's firewall.
- ![Key Vault Firewall](media/key-vault-certs/key-vault-firewall.png)
+![Screenshot that shows selections for configuring Application Gateway to use firewalls and virtual networks.](media/key-vault-certs/key-vault-firewall.png)
+
+If you deploy the Application Gateway instance via an ARM template by using either the Azure CLI or PowerShell, or via an Azure application deployed from the Azure portal, the SSL certificate is stored in the key vault as a Base64-encoded PFX file. You must complete the steps in [Use Azure Key Vault to pass secure parameter value during deployment](../azure-resource-manager/templates/key-vault-parameter.md).
+It's particularly important to set `enabledForTemplateDeployment` to `true`. The certificate might or might not have a password. In the case of a certificate with a password, the following example shows a possible configuration for the `sslCertificates` entry in `properties` for the ARM template configuration for Application Gateway.
- > [!NOTE]
- > If you deploy the application gateway via an ARM template, either by using the Azure CLI or PowerShell, or via an Azure application deployed from the Azure portal, the SSL certificate is stored in the key vault as a base64-encoded PFX file. You must complete the steps in [Use Azure Key Vault to pass secure parameter value during deployment](../azure-resource-manager/templates/key-vault-parameter.md).
- >
- > It's particularly important to set `enabledForTemplateDeployment` to `true`. The certificate may be passwordless or it may have a password. In the case of a certificate with a password, the following example shows a possible configuration for the `sslCertificates` entry in the `properties` for the ARM template configuration for an app gateway. The values of `appGatewaySSLCertificateData` and `appGatewaySSLCertificatePassword` are looked up from the key vault as described in the section [Reference secrets with dynamic ID](../azure-resource-manager/templates/key-vault-parameter.md#reference-secrets-with-dynamic-id). Follow the references backward from `parameters('secretName')` to see how the lookup happens. If the certificate is passwordless, omit the `password` entry.
- >
- > ```
- > "sslCertificates": [
- > {
- > "name": "appGwSslCertificate",
- > "properties": {
- > "data": "[parameters('appGatewaySSLCertificateData')]",
- > "password": "[parameters('appGatewaySSLCertificatePassword')]"
- > }
- > }
- > ]
- > ```
+```
+"sslCertificates": [
+ {
+ "name": "appGwSslCertificate",
+ "properties": {
+ "data": "[parameters('appGatewaySSLCertificateData')]",
+ "password": "[parameters('appGatewaySSLCertificatePassword')]"
+ }
+ }
+]
+```
-1. **Configure the application gateway**
+The values of `appGatewaySSLCertificateData` and `appGatewaySSLCertificatePassword` are looked up from the key vault, as described in [Reference secrets with dynamic ID](../azure-resource-manager/templates/key-vault-parameter.md#reference-secrets-with-dynamic-id). Follow the references backward from `parameters('secretName')` to see how the lookup happens. If the certificate is passwordless, omit the `password` entry.
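As a hedged illustration (subscription, resource group, vault, and secret names are all placeholders), the static form of a Key Vault reference in a parameters file looks like the following; the linked article covers the dynamic-ID variant used when the vault ID itself is computed during deployment:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "appGatewaySSLCertificateData": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "appGwSslCertificate"
      }
    },
    "appGatewaySSLCertificatePassword": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "appGwSslCertificatePassword"
      }
    }
  }
}
```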
- After you complete the two preceding steps, you can assign the user-assigned managed identity for your application gateway through its Identity and access management (IAM). For PowerShell, see [Set-AzApplicationGatewayIdentity](/powershell/module/az.network/set-azapplicationgatewayidentity).
+### Configure Application Gateway
+After you create a user-assigned managed identity and configure your key vault, you can assign the managed identity for your Application Gateway instance through identity and access management (IAM). For PowerShell, see [Set-AzApplicationGatewayIdentity](/powershell/module/az.network/set-azapplicationgatewayidentity).
## Investigating and resolving Key Vault errors
-Azure Application Gateway not only polls for the renewed certificate version on Key Vault at every 4-hour interval, but also logs any error and is integrated with Azure Advisor to surface any misconfiguration as a recommendation. The details of the recommendation contain the exact issue and the associated Key Vault resource. You can use this information along with the [troubleshooting guide](../application-gateway/application-gateway-key-vault-common-errors.md) to quickly resolve such configuration error.
+Azure Application Gateway doesn't just poll for the renewed certificate version on Key Vault at every four-hour interval. It also logs any error and is integrated with Azure Advisor to surface any misconfiguration as a recommendation. The recommendation contains details about the problem and the associated Key Vault resource. You can use this information along with the [troubleshooting guide](../application-gateway/application-gateway-key-vault-common-errors.md) to quickly resolve such a configuration error.
-It is strongly recommended that you [configure Advisor Alerts](../advisor/advisor-alerts-portal.md) to stay updated when such a problem is detected. To set an alert for this specific case, use the Recommendation Type as "Resolve Azure Key Vault issue for your Application Gateway".
+We strongly recommend that you [configure Advisor alerts](../advisor/advisor-alerts-portal.md) to stay updated when a problem is detected. To set an alert for this specific case, use **Resolve Azure Key Vault issue for your Application Gateway** as the recommendation type.
## Next steps
applied-ai-services Generate Sas Tokens https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/applied-ai-services/form-recognizer/generate-sas-tokens.md
To get started, you'll need:
## Upload your documents
-1. Go to the [Azure portal](https://ms.portal.azure.com/#home) and navigate as follows: **Your storage account** → **Data storage** → **containers**
+1. Go to the [Azure portal](https://ms.portal.azure.com/#home) and navigate as follows: **Your storage account** → **Data storage** → **Containers**
:::image type="content" source="media/sas-tokens/data-storage-menu.png" alt-text="Screenshot: Data storage menu in the Azure portal.":::
To get started, you'll need:
1. In the [Azure portal](https://ms.portal.azure.com/#home), navigate as follows:
- **Your storage account** → **containers**
+ **Your storage account** → **Containers**
1. Select a container from the list.
1. Navigate to the right of the main window and select the three ellipses associated with your chosen container.
1. Select **Generate SAS** from the drop-down menu to open the **Generate SAS Window**.
To get started, you'll need:
1. Define **Permissions** by checking or clearing the appropriate checkbox. Make sure the **Read**, **Write**, **Delete**, and **List** permissions are selected.
- :::image type="content" source="media/sas-tokens/sas-permissions.png" alt-text="Screenshot (Azure protal): SAS permission fields.":::
+ :::image type="content" source="media/sas-tokens/sas-permissions.png" alt-text="Screenshot (Azure portal): SAS permission fields.":::
>[!IMPORTANT]
>
To get started, you'll need:
1. **Copy and paste the Blob SAS token and URL values in a secure location. They'll only be displayed once and cannot be retrieved once the window is closed.**
-1. To use the **Blob SAS URL**, add it to your REST API call as follows:
-
-```json
-{
- "source":"<BLOB SAS URL>"
-}
-```
## Create a SAS with Azure Command-Line Interface (CLI)

1. To create a user delegation SAS for a container using the Azure CLI, make sure that you have installed version 2.0.78 or later. To check your installed version, use the `az --version` command.
az storage container generate-sas \
  --as-user
```
+## How to use your Blob SAS URL
+
+* To use your Blob SAS URL with the [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/TrainCustomModelAsync), add the SAS URL to the request body:
+
+ ```json
+ {
+ "source":"<BLOB SAS URL>"
+ }
+ ```
+
+* To use your **Blob SAS URL** with the [**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/connections/create), add the SAS URL to the **Connections Settings** → **Azure blob container** → **SAS URI** field:
+
+ :::image type="content" source="media/sas-tokens/fott-add-sas-uri.png" alt-text="Screenshot: SAS URI field in the Form Recognizer labeling tool's connection settings.":::
That's it. You've learned how to generate SAS tokens to authorize how clients access your data.

> [!div class="nextstepaction"]
automation Disable Managed Identity For Automation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/disable-managed-identity-for-automation.md
Perform the following steps.
$sub = Get-AzSubscription -ErrorAction SilentlyContinue
if(-not($sub)) {
- Connect-AzAccount -Subscription
+ Connect-AzAccount
}

# If you have multiple subscriptions, set the one to use
automation Remove User Assigned Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/remove-user-assigned-identity.md
Use PowerShell cmdlet [Set-AzAutomationAccount](/powershell/module/az.automation
$sub = Get-AzSubscription -ErrorAction SilentlyContinue
if(-not($sub)) {
- Connect-AzAccount -Subscription
+ Connect-AzAccount
}
```
avere-vfxt Avere Vfxt Data Ingest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/avere-vfxt/avere-vfxt-data-ingest.md
To use ``msrsync`` to populate an Azure cloud volume with an Avere cluster, foll
For example, this command is designed to move 11,000 files in 64 processes from /test/source-repository to /mnt/vfxt/repository:
- ``msrsync -P --stats -p 64 -f 170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/vfxt/repository && msrsync -P --stats -p 64 -f 170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/vfxt/repository``
+ `msrsync -P --stats -p 64 -f 170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/vfxt/repository && msrsync -P --stats -p 64 -f 170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/vfxt/repository`
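A rough sketch of where `-f 170` comes from: the total file count divided across the worker processes (the command above rounds down to 170):

```shell
# 11,000 files spread over 64 processes, integer division
echo $(( 11000 / 64 ))
```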
## Use the parallel copy script
avere-vfxt Avere Vfxt Mount Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/avere-vfxt/avere-vfxt-mount-clients.md
In addition to the paths, include the [Mount command arguments](#mount-command-a
To ensure a seamless client mount, pass these settings and arguments in your mount command:
-``mount -o hard,proto=tcp,mountproto=tcp,retry=30 ${VSERVER_IP_ADDRESS}:/${NAMESPACE_PATH} ${LOCAL_FILESYSTEM_MOUNT_POINT}``
+`mount -o hard,proto=tcp,mountproto=tcp,retry=30 ${VSERVER_IP_ADDRESS}:/${NAMESPACE_PATH} ${LOCAL_FILESYSTEM_MOUNT_POINT}`
| Required settings | Description | |
azure-app-configuration Rest Api Authentication Hmac https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/rest-api-authentication-hmac.md
HTTP request header names, separated by semicolons, required to sign the request
### Required HTTP request headers
-```x-ms-date```[or ```Date```];```host```;```x-ms-content-sha256```
+`x-ms-date`[or `Date`];`host`;`x-ms-content-sha256`
Any other HTTP request headers can also be added to the signing. Just append them to the `SignedHeaders` argument.
x-ms-date;host;x-ms-content-sha256;`Content-Type`;`Accept`
### Signature
-Base64 encoded HMACSHA256 hash of the String-To-Sign. It uses the access key identified by `Credential`.
-```base64_encode(HMACSHA256(String-To-Sign, Secret))```
+Base64 encoded HMACSHA256 hash of the String-To-Sign. It uses the access key identified by `Credential`. `base64_encode(HMACSHA256(String-To-Sign, Secret))`
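A hedged sketch of the computation using `openssl` (all values below are illustrative placeholders; the real Secret is the base64-decoded access key that pairs with the `Credential` ID):

```shell
# Sample String-To-Sign: verb, path-and-query, and signed header values joined by ';'
STRING_TO_SIGN=$'GET\n/kv?api-version=1.0\nFri, 11 May 2018 18:48:36 GMT;myconfig.azconfig.io;47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='
# Base64 of the placeholder key "very-secret-key" -- not a real access key
SECRET_B64='dmVyeS1zZWNyZXQta2V5'

# Signature = base64_encode(HMACSHA256(String-To-Sign, base64_decode(Secret)))
printf '%s' "$STRING_TO_SIGN" \
  | openssl dgst -sha256 -hmac "$(printf '%s' "$SECRET_B64" | base64 -d)" -binary \
  | base64
```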
### String-To-Sign
azure-arc Quickstart Connect Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/quickstart-connect-cluster.md
In this quickstart, you'll learn the benefits of Azure Arc-enabled Kubernetes an
| `https://management.azure.com` (for Azure Cloud), `https://management.usgovcloudapi.net` (for Azure US Government) | Required for the agent to connect to Azure and register the cluster. |
| `https://<region>.dp.kubernetesconfiguration.azure.com` (for Azure Cloud), `https://<region>.dp.kubernetesconfiguration.azure.us` (for Azure US Government) | Data plane endpoint for the agent to push status and fetch configuration information. |
| `https://login.microsoftonline.com`, `login.windows.net` (for Azure Cloud), `https://login.microsoftonline.us` (for Azure US Government) | Required to fetch and update Azure Resource Manager tokens. |
-| `https://mcr.microsoft.com` | Required to pull container images for Azure Arc agents. |
+| `https://mcr.microsoft.com`, `https://*.data.mcr.microsoft.com` | Required to pull container images for Azure Arc agents. |
| `https://gbl.his.arc.azure.com` | Required to get the regional endpoint for pulling system-assigned Managed Service Identity (MSI) certificates. |
| `https://*.his.arc.azure.com` (for Azure Cloud), `https://usgv.his.arc.azure.us` (for Azure US Government) | Required to pull system-assigned Managed Service Identity (MSI) certificates. |
|`*.servicebus.windows.net`, `guestnotificationservice.azure.com`, `*.guestnotificationservice.azure.com`, `sts.windows.net` | For [Cluster Connect](cluster-connect.md) and for [Custom Location](custom-locations.md) based scenarios. |
azure-functions Disable Function https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/disable-function.md
Functions can be disabled in the same way when running locally. To disable a fun
"Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
- "AzureWebJobs.HttpExample.Disabled": "true"
+ "AzureWebJobs.HttpExample.Disabled": true
  }
}
```
azure-functions Functions Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-cli-samples.md
Title: Azure CLI Samples - Azure Functions
-description: Azure CLI Samples - Azure Functions
+ Title: Azure CLI samples for Azure Functions | Microsoft Docs
+description: Find links to bash scripts for Azure Functions that use the Azure CLI. Learn how to create a function app that allows integration and deployment.
ms.assetid: 577d2f13-de4d-40d2-9dfc-86ecc79f3ab0 Previously updated : 01/09/2018- Last updated : 09/17/2021+
+keywords: functions, azure cli samples, azure cli examples, azure cli code samples
# Azure CLI Samples
azure-functions Functions Versions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-versions.md
Title: Azure Functions runtime versions overview
description: Azure Functions supports multiple versions of the runtime. Learn the differences between them and how to choose the one that's right for you. Previously updated : 05/19/2021 Last updated : 09/22/2021 # Azure Functions runtime versions overview
To migrate an app from 3.x to 4.x:
### Breaking changes between 3.x and 4.x
-The following are some changes to be aware of before upgrading a 3.x app to 4.x. For a full list, see Azure Functions GitHub issues labeled [*Breaking Change: Approved*](https://github.com/Azure/azure-functions/issues?q=is%3Aissue+label%3A%22Breaking+Change%3A+Approved%22+is%3A%22closed+OR+open%22).
+The following are some changes to be aware of before upgrading a 3.x app to 4.x. For a full list, see Azure Functions GitHub issues labeled [*Breaking Change: Approved*](https://github.com/Azure/azure-functions/issues?q=is%3Aissue+label%3A%22Breaking+Change%3A+Approved%22+is%3A%22closed+OR+open%22). More changes are expected during the preview period. Subscribe to [App Service Announcements](https://github.com/Azure/app-service-announcements/issues) for updates.
#### Runtime
azure-functions Performance Reliability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/performance-reliability.md
As part of your solution, you may develop and publish multiple functions. These
Each function that you create has a memory footprint. While this footprint is usually small, having too many functions within a function app can lead to slower startup of your app on new instances. It also means that the overall memory usage of your function app might be higher. It's hard to say how many functions should be in a single app, which depends on your particular workload. However, if your function stores a lot of data in memory, consider having fewer functions in a single app.
-If you run multiple function apps in a single Premium plan or dedicated (App Service) plan, these apps are all scaled together. If you have one function app that has a much higher memory requirement than the others, it uses a disproportionate amount of memory resources on each instance to which the app is deployed. Because this could leave less memory available for the other apps on each instance, you might want to run a high-memory-using function app like this in its own separate hosting plan.
+If you run multiple function apps in a single Premium plan or dedicated (App Service) plan, these apps all share the resources allocated to the plan. If you have one function app that has a much higher memory requirement than the others, it uses a disproportionate amount of memory resources on each instance to which the app is deployed. Because this could leave less memory available for the other apps on each instance, you might want to run a high-memory function app like this in its own separate hosting plan.
> [!NOTE]
> When using the [Consumption plan](./functions-scale.md), we recommend you always put each app in its own plan, since apps are scaled independently anyway.
azure-government Connect With Azure Pipelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/connect-with-azure-pipelines.md
Follow through one of the quickstarts below to set up a Build for your specific
## Generate a service principal
-1. Download or copy and paste the [service principal creation](https://github.com/yujhongmicrosoft/spncreationn/blob/master/spncreation.ps1) powershell script into an IDE or editor.
+1. Download or copy and paste the [service principal creation](https://github.com/yujhongmicrosoft/spncreationn/blob/master/spncreation.ps1) PowerShell script into an IDE or editor.
+ 2. Open the file and navigate to the `param` block. Replace the value of the `$environmentName` variable with `AzureUSGovernment`. This sets the service principal to be created in Azure Government.
-3. Open your Powershell window and run the following command. This command sets a policy that enables running local files.
- ```Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass```
+3. Open your PowerShell window and run the following command. This command sets a policy that enables running local files.
+
+ `Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass`
- When you are asked whether you want to change the execution policy, enter "A" (for "Yes to All").
+ When you are asked whether you want to change the execution policy, enter "A" (for "Yes to All").
+
+4. Navigate to the directory that has the edited script above.
-4. Navigate to the directory that has the edited script above.
5. Edit the following command with the name of your script and run:
- `./<name of script file you saved>`
-6. The "subscriptionName" parameter can be found by logging into your Azure Government subscription with `Connect-AzAccount -EnvironmentName AzureUSGovernment` and then running `Get-AzureSubscription`.
+
+ `./<name of script file you saved>`
+
+6. The "subscriptionName" parameter can be found by signing in to your Azure Government subscription with `Connect-AzAccount -EnvironmentName AzureUSGovernment` and then running `Get-AzSubscription`.
+ 7. When prompted for the "password" parameter, enter your desired password.
+
+ 8. After providing your Azure Government subscription credentials, you should see the following:

> [!NOTE]
- > The Environment variable should be "AzureUSGovernment"
- >
- >
+ > The Environment variable should be `AzureUSGovernment`.
-9. After the script has run, you should see your service connection values. Copy these values as we will need them when setting up our endpoint.
+9. After the script has run, you should see your service connection values. Copy these values as we will need them when setting up our endpoint.
- ![ps4](./media/documentation-government-vsts-img11.png)
+ ![ps4](./media/documentation-government-vsts-img11.png)
## Configure the Azure Pipelines service connection
azure-monitor Api Filtering Sampling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/api-filtering-sampling.md
For apps written by using [ASP.NET Core](asp-net-core.md#adding-telemetryinitial
### JavaScript telemetry initializers *JavaScript*
-Insert a telemetry initializer immediately after the initialization code that you got from the portal:
+Insert a telemetry initializer by using the snippet's `onInit` callback:
-```JS
+```html
<script type="text/javascript">
- // ... initialization code
- ...({
- instrumentationKey: "your instrumentation key"
- });
- window.appInsights = appInsights;
--
- // Adding telemetry initializer.
- // This is called whenever a new telemetry item
- // is created.
-
- appInsights.addTelemetryInitializer(function (envelope) {
- var telemetryItem = envelope.data.baseData;
-
- // To check the telemetry items type - for example PageView:
- if (envelope.name == Microsoft.ApplicationInsights.Telemetry.PageView.envelopeType) {
- // this statement removes url from all page view documents
- telemetryItem.url = "URL CENSORED";
- }
-
- // To set custom properties:
- telemetryItem.properties = telemetryItem.properties || {};
- telemetryItem.properties["globalProperty"] = "boo";
-
- // To set cloud role name / instance
- envelope.tags["ai.cloud.role"] = "your role name";
- envelope.tags["ai.cloud.roleInstance"] = "your role instance";
- });
-
- // End of inserted code.
-
- appInsights.trackPageView();
+!function(T,l,y){/* Removed the snippet code for brevity */}(window,document,{
+src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js",
+crossOrigin: "anonymous",
+onInit: function (sdk) {
+ sdk.addTelemetryInitializer(function (envelope) {
+ envelope.data.someField = 'This item passed through my telemetry initializer';
+ });
+}, // Once the application insights instance has loaded and initialized this method will be called
+cfg: { // Application Insights Configuration
+ instrumentationKey: "YOUR_INSTRUMENTATION_KEY"
+}});
</script>
```
azure-monitor Azure Web Apps Net Core https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/azure-web-apps-net-core.md
Enabling monitoring on your ASP.NET Core based web applications running on [Azur
# [Windows](#tab/Windows)

> [!IMPORTANT]
-> The following versions of ASP.NET Core are supported for auto-instrumentation on windows: ASP.NET Core 2.1, 3.1, and 5.0. Versions 2.0, 2.2, and 3.0 have been retired and are no longer supported. Please upgrade to a [supported version](https://dotnet.microsoft.com/platform/support/policy/dotnet-core) of .NET Core for auto-instrumentation to work.
+> The following versions of ASP.NET Core are supported for auto-instrumentation on Windows: ASP.NET Core 3.1 and 5.0. Versions 2.0, 2.1, 2.2, and 3.0 have been retired and are no longer supported. Please upgrade to a [supported version](https://dotnet.microsoft.com/platform/support/policy/dotnet-core) of .NET Core for auto-instrumentation to work.
Targeting the full framework from ASP.NET Core is **not supported** in Windows. Use [manual instrumentation](./asp-net-core.md) via code instead.
For the latest updates and bug fixes, [consult the release notes](web-app-extens
* [Monitor service health metrics](../data-platform.md) to make sure your service is available and responsive.
* [Receive alert notifications](../alerts/alerts-overview.md) whenever operational events happen or metrics cross a threshold.
* Use [Application Insights for JavaScript apps and web pages](javascript.md) to get client telemetry from the browsers that visit a web page.
-* [Set up Availability web tests](monitor-web-app-availability.md) to be alerted if your site is down.
+* [Set up Availability web tests](monitor-web-app-availability.md) to be alerted if your site is down.
azure-monitor Container Insights Onboard https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/containers/container-insights-onboard.md
Last updated 06/30/2020
This article provides an overview of the options that are available for setting up Container insights to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on:

- [Azure Kubernetes Service (AKS)](../../aks/index.yml)
-- [Azure Red Hat OpenShift](../../openshift/intro-openshift.md) versions 3.x and 4.x
-- [Red Hat OpenShift](https://docs.openshift.com/container-platform/4.3/welcome/index.html) version 4.x
-- An [Arc-enabled Kubernetes cluster](../../azure-arc/kubernetes/overview.md)
+- [Arc-enabled Kubernetes cluster](../../azure-arc/kubernetes/overview.md)
+ - [Azure Stack](/azure-stack/user/azure-stack-kubernetes-aks-engine-overview) or on-premises
+ - [AKS engine](https://github.com/Azure/aks-engine)
+ - [Azure Red Hat OpenShift](../../openshift/intro-openshift.md) version 4.x
+ - [Red Hat OpenShift](https://docs.openshift.com/container-platform/4.3/welcome/index.html) version 4.x
-You can also monitor the performance of workloads that are deployed to self-managed Kubernetes clusters hosted on:
-- Azure, by using the [AKS engine](https://github.com/Azure/aks-engine)
-- [Azure Stack](/azure-stack/user/azure-stack-kubernetes-aks-engine-overview) or on-premises, by using the AKS engine.

You can enable Container insights for a new deployment or for one or more existing deployments of Kubernetes by using any of the following supported methods:
You can enable Container insights for a new deployment or for one or more existi
- The Azure CLI - [Terraform and AKS](/azure/developer/terraform/create-k8s-cluster-with-tf-and-aks)
+For any non-AKS Kubernetes cluster, first connect your cluster to [Azure Arc](../../azure-arc/kubernetes/overview.md) before enabling monitoring.
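A sketch of what that onboarding sequence might look like with the Azure CLI, using hypothetical cluster and resource group names. The commands are composed and printed rather than executed here, and the `connectedk8s`/`k8s-extension` CLI extension usage is an assumption based on the linked Azure Arc guidance:

```shell
cluster="my-cluster"
rg="my-rg"

# Step 1 (assumed): connect the non-AKS cluster to Azure Arc first.
connect_cmd="az connectedk8s connect --name ${cluster} --resource-group ${rg}"

# Step 2 (assumed): enable Container insights through the cluster extension.
enable_cmd="az k8s-extension create --name azuremonitor-containers --cluster-name ${cluster} --resource-group ${rg} --cluster-type connectedClusters --extension-type Microsoft.AzureMonitor.Containers"

echo "$connect_cmd"
echo "$enable_cmd"
```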
+ [!INCLUDE [updated-for-az](../../../includes/updated-for-az.md)]

## Prerequisites
Kubelet secure port (:10250) should be opened in the cluster's virtual network f
Container insights officially supports the following configurations:

- Environments: Azure Red Hat OpenShift, Kubernetes on-premises, and the AKS engine on Azure and Azure Stack. For more information, see [the AKS engine on Azure Stack](/azure-stack/user/azure-stack-kubernetes-aks-engine-overview).
-- The versions of Kubernetes and support policy are the same as those [supported in Azure Kubernetes Service (AKS)](../../aks/supported-kubernetes-versions.md).
+- The versions of Kubernetes and support policy are the same as those [supported in Azure Kubernetes Service (AKS)](../../aks/supported-kubernetes-versions.md).
+- We recommend connecting your cluster to [Azure Arc](../../azure-arc/kubernetes/overview.md) and enabling monitoring through Container Insights via Azure Arc.
## Network firewall requirements
To enable Container insights, use one of the methods that's described in the fol
| | [Create an AKS cluster by using Terraform](container-insights-enable-new-cluster.md#enable-using-terraform)| You can enable monitoring for a new AKS cluster that you create by using the open-source tool Terraform. |
| | [Create an OpenShift cluster by using an Azure Resource Manager template](container-insights-azure-redhat-setup.md#enable-for-a-new-cluster-using-an-azure-resource-manager-template) | You can enable monitoring for a new OpenShift cluster that you create by using a preconfigured Azure Resource Manager template. |
| | [Create an OpenShift cluster by using the Azure CLI](/cli/azure/openshift#az_openshift_create) | You can enable monitoring when you deploy a new OpenShift cluster by using the Azure CLI. |
-| Existing Kubernetes cluster | [Enable monitoring of an AKS cluster by using the Azure CLI](container-insights-enable-existing-clusters.md#enable-using-azure-cli) | You can enable monitoring for an AKS cluster that's already deployed by using the Azure CLI. |
+| Existing AKS cluster | [Enable monitoring of an AKS cluster by using the Azure CLI](container-insights-enable-existing-clusters.md#enable-using-azure-cli) | You can enable monitoring for an AKS cluster that's already deployed by using the Azure CLI. |
| |[Enable for AKS cluster using Terraform](container-insights-enable-existing-clusters.md#enable-using-terraform) | You can enable monitoring for an AKS cluster that's already deployed by using the open-source tool Terraform. |
| | [Enable for AKS cluster from Azure Monitor](container-insights-enable-existing-clusters.md#enable-from-azure-monitor-in-the-portal)| You can enable monitoring for one or more AKS clusters that are already deployed from the multi-cluster page in Azure Monitor. |
| | [Enable from AKS cluster](container-insights-enable-existing-clusters.md#enable-directly-from-aks-cluster-in-the-portal)| You can enable monitoring directly from an AKS cluster in the Azure portal. |
| | [Enable for AKS cluster using an Azure Resource Manager template](container-insights-enable-existing-clusters.md#enable-using-an-azure-resource-manager-template)| You can enable monitoring for an AKS cluster by using a preconfigured Azure Resource Manager template. |
-| | [Enable for hybrid Kubernetes cluster](container-insights-hybrid-setup.md) | You can enable monitoring for the AKS engine that's hosted on Azure Stack or for a Kubernetes cluster that's hosted on-premises. |
-| | [Enable for Arc enabled Kubernetes cluster](container-insights-enable-arc-enabled-clusters.md). | You can enable monitoring for your Kubernetes clusters that are hosted outside of Azure and enabled with Azure Arc. |
-| | [Enable for OpenShift cluster using an Azure Resource Manager template](container-insights-azure-redhat-setup.md#enable-using-an-azure-resource-manager-template) | You can enable monitoring for an existing OpenShift cluster by using a preconfigured Azure Resource Manager template. |
-| | [Enable for OpenShift cluster from Azure Monitor](container-insights-azure-redhat-setup.md#from-the-azure-portal) | You can enable monitoring for one or more OpenShift clusters that are already deployed from the multicluster page in Azure Monitor. |
+| Existing non-AKS Kubernetes cluster | [Enable for non-AKS Kubernetes cluster by using the Azure CLI](container-insights-enable-arc-enabled-clusters.md#create-extension-instance-using-azure-cli) | You can use the Azure CLI to enable monitoring for Kubernetes clusters that are hosted outside of Azure and enabled with Azure Arc, including hybrid, OpenShift, and multicloud clusters. |
+| | [Enable for non-AKS Kubernetes cluster using an Azure Resource Manager template](container-insights-enable-arc-enabled-clusters.md#create-extension-instance-using-azure-resource-manager) | You can enable monitoring for your clusters enabled with Arc by using a preconfigured Azure Resource Manager template. |
+| | [Enable for non-AKS Kubernetes cluster from Azure Monitor](container-insights-enable-arc-enabled-clusters.md#create-extension-instance-using-azure-portal) | You can enable monitoring for one or more clusters enabled with Arc that are already deployed from the multicluster page in Azure Monitor. |
## Next steps
azure-monitor Containers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/containers/containers.md
Last updated 07/06/2020
This article describes how to set up and use the Container Monitoring solution in Azure Monitor, which helps you view and manage your Docker and Windows container hosts in a single location. Docker is a software virtualization system used to create containers that automate software deployment to their IT infrastructure.
+> [!IMPORTANT]
+> The Container Monitoring solution is being phased out. To monitor your Kubernetes environments, we recommend using [Azure Monitor Container insights](container-insights-onboard.md).
+ [!INCLUDE [azure-monitor-log-analytics-rebrand](../../../includes/azure-monitor-log-analytics-rebrand.md)]

The solution shows which containers are running, what container image they're running, and where containers are running. You can view detailed audit information showing commands used with containers. And, you can troubleshoot containers by viewing and searching centralized logs without having to remotely view Docker or Windows hosts. You can find containers that may be noisy and consuming excess resources on a host. And, you can view centralized CPU, memory, storage, and network usage and performance information for containers. On computers running Windows, you can centralize and compare logs from Windows Server, Hyper-V, and Docker containers.

The solution supports the following container orchestrators:

- Docker Swarm
- DC/OS
-- Kubernetes
- Service Fabric
-- Red Hat OpenShift
+
+We recommend using Azure Monitor Container insights for monitoring your Kubernetes and Red Hat OpenShift clusters:
+- AKS ([Configure Container insights for AKS](container-insights-enable-existing-clusters.md))
+- Red Hat OpenShift ([Configure Container insights using Azure Arc](container-insights-enable-arc-enabled-clusters.md))
If you have containers deployed in [Azure Service Fabric](../../service-fabric/service-fabric-overview.md), we recommend enabling both the [Service Fabric solution](../../service-fabric/service-fabric-diagnostics-oms-setup.md) and this solution to include monitoring of cluster events. Before enabling the Service Fabric solution, review [Using the Service Fabric solution](../../service-fabric/service-fabric-diagnostics-event-analysis-oms.md) to understand what it provides and how to use it.
azure-netapp-files Azure Netapp Files Resource Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-resource-limits.md
na ms.devlang: na
Previously updated : 08/24/2021
Last updated : 09/23/2021

# Resource limits for Azure NetApp Files
The following table describes resource limits for Azure NetApp Files:
| Maximum number of export policy rules per volume | 5 | No |
| Minimum assigned throughput for a manual QoS volume | 1 MiB/s | No |
| Maximum assigned throughput for a manual QoS volume | 4,500 MiB/s | No |
-| Number of cross-region replication data protection volumes (destination volumes) | 5 | Yes |
+| Number of cross-region replication data protection volumes (destination volumes) | 10 | Yes |
For more information, see [Capacity management FAQs](azure-netapp-files-faqs.md#capacity-management-faqs).
azure-netapp-files Configure Nfs Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/configure-nfs-clients.md
na ms.devlang: na
Previously updated : 06/17/2021
Last updated : 09/22/2021

# Configure an NFS client for Azure NetApp Files
The following steps are optional. You need to perform the steps only if you use
`krb5_realm = CONTOSO.COM (domain name in caps)`
`krb5_kpasswd = winad2016.contoso.com (same as AD address which is added in /etc/hosts)`
`use_fully_qualified_names = false`
+
+ In the `[domain/contoso-ldap]` configuration above:
+ * `id_provider` is set to `ldap` and not `ad`.
+ * The configuration has specified search bases and user and group classes for searches.
+ * `ldap_sasl_authid` is the machine account name from `klist -kte`.
+ * `use_fully_qualified_names` is set to `false`. This setting means that this configuration is used when short names are used.
+ * `ldap_id_mapping` is NOT specified, which defaults to `false`.
+
+ The `realm join` configuration is generated by the client and looks like this:
`[domain/contoso.com] (Do not edit or remove any of the following information. This information is automatically generated during the realm join process.)`
`ad_domain = contoso.com`
The following steps are optional. You need to perform the steps only if you use
`use_fully_qualified_names = True`
`fallback_homedir = /home/%u@%d`
`access_provider = ad`
+
+ In the `[domain/contoso.com]` configuration above:
+ * `id_provider` is set to `ad`.
+ * `ldap_id_mapping` is set to `true`. It uses the SSSD generated IDs. Alternately, you can set this value to `false` if you want to use POSIX UIDs for ALL styles of usernames. You can determine the value based on your client configuration.
+ * `use_fully_qualified_names` is `true`. This setting means `user@CONTOSO.COM` will use this configuration.
4. Ensure your `/etc/nsswitch.conf` has the `sss` entry:
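A quick way to verify that entry might look like the following sketch, run here against a sample file rather than your live `/etc/nsswitch.conf`:

```shell
# Sample of what the relevant nsswitch.conf lines should contain.
cat > /tmp/nsswitch.sample <<'EOF'
passwd:     files sss
group:      files sss
EOF

# Verify that both the passwd and group databases list the sss source.
grep -E '^(passwd|group):.*sss' /tmp/nsswitch.sample
```

On your client, run the same `grep` against `/etc/nsswitch.conf` itself.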
azure-resource-manager Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/deploy-cli.md
Title: Deploy resources with Azure CLI and Bicep files
+ Title: Deploy resources with Azure CLI and Bicep files | Microsoft Docs
description: Use Azure Resource Manager and Azure CLI to deploy resources to Azure. The resources are defined in a Bicep file.
Previously updated : 07/15/2021
Last updated : 09/17/2021
-# Deploy resources with Bicep and Azure CLI
+# How to deploy resources with Bicep and Azure CLI
This article explains how to use Azure CLI with Bicep files to deploy your resources to Azure. If you aren't familiar with the concepts of deploying and managing your Azure solutions, see [Bicep overview](./overview.md).
azure-resource-manager Loop Properties https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-properties.md
description: Use a Bicep property loop to iterate when creating a resource prope
Previously updated : 08/30/2021
Last updated : 09/23/2021

# Property iteration in Bicep
Loops can be used to declare multiple properties by:
## Loop limits
-The Bicep file's loop iterations can't be a negative number or exceed 800 iterations.
+Bicep loops have these limitations:
+
+- Can't loop on multiple levels of properties.
+- Loop iterations can't be a negative number or exceed 800 iterations.
## Loop array
azure-resource-manager Loop Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/loop-resources.md
description: Use loops and arrays in a Bicep file to deploy multiple instances o
Previously updated : 08/30/2021
Last updated : 09/23/2021

# Resource iteration in Bicep
Loops can be used to declare multiple resources by:
## Loop limits
-The Bicep file's loop iterations can't be a negative number or exceed 800 iterations.
+Bicep loops have these limitations:
+
+- Can't loop a resource with nested child resources. You must change the child resources to top-level resources. See [Iteration for a child resource](#iteration-for-a-child-resource).
+- Can't loop on multiple levels of properties. See [Property iteration in Bicep](./loop-properties.md).
+- Loop iterations can't be a negative number or exceed 800 iterations.
## Loop index
azure-resource-manager Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deploy-cli.md
Title: Deploy resources with Azure CLI and template
-description: Use Azure Resource Manager and Azure CLI to deploy resources to Azure. The resources are defined in a Resource Manager template.
+ Title: Azure deployment templates with Azure CLI ΓÇô Azure Resource Manager | Microsoft Docs
+description: Use Azure Resource Manager and Azure CLI to create and deploy resource groups to Azure. The resources are defined in an Azure deployment template.
Previously updated : 07/15/2021
Last updated : 09/17/2021
+keywords: azure cli deploy arm template, create resource group azure, azure deployment template, deployment resources, arm template, azure arm template
-# Deploy resources with ARM templates and Azure CLI
+# How to use Azure Resource Manager (ARM) deployment templates with Azure CLI
-This article explains how to use Azure CLI with Azure Resource Manager templates (ARM templates) to deploy your resources to Azure. If you aren't familiar with the concepts of deploying and managing your Azure solutions, see [template deployment overview](overview.md).
+This article explains how to use Azure CLI with Azure Resource Manager templates (ARM templates) to deploy your resources to Azure. If you aren't familiar with the concepts of deploying and managing your Azure solutions, see [template deployment overview](overview.md).
The deployment commands changed in Azure CLI version 2.2.0. The examples in this article require [Azure CLI version 2.20.0 or later](/cli/azure/install-azure-cli).
If you don't have Azure CLI installed, you can use Azure Cloud Shell. For more i
## Deployment scope
-You can target your deployment to a resource group, subscription, management group, or tenant. Depending on the scope of the deployment, you use different commands.
+You can target your Azure deployment template to a resource group, subscription, management group, or tenant. Depending on the scope of the deployment, you use different commands.
* To deploy to a **resource group**, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create):
For every scope, the user deploying the template must have the required permissi
## Deploy local template
-You can deploy a template from your local machine or one that is stored externally. This section describes deploying a local template.
+You can deploy an ARM template from your local machine or one that is stored externally. This section describes deploying a local template.
If you're deploying to a resource group that doesn't exist, create the resource group. The name of the resource group can only include alphanumeric characters, periods, underscores, hyphens, and parenthesis. It can be up to 90 characters. The name can't end in a period.
az deployment group create \
--parameters storageAccountType=Standard_GRS ```
-The deployment can take a few minutes to complete. When it finishes, you see a message that includes the result:
+The Azure template deployment can take a few minutes to complete. When it finishes, you see a message that includes the result:
```output "provisioningState": "Succeeded",
az deployment group create \
For more information, see [Use relative path for linked templates](./linked-templates.md#linked-template).
-## Deployment name
+## Azure deployment template name
-When deploying an ARM template, you can give the deployment a name. This name can help you retrieve the deployment from the deployment history. If you don't provide a name for the deployment, the name of the template file is used. For example, if you deploy a template named _azuredeploy.json_ and don't specify a deployment name, the deployment is named `azuredeploy`.
+When deploying an ARM template, you can give the Azure deployment a name. This name can help you retrieve the deployment from the deployment history. If you don't provide a name for the deployment, the name of the template file is used. For example, if you deploy a template named _azuredeploy.json_ and don't specify a deployment name, the deployment is named `azuredeploy`.
Every time you run a deployment, an entry is added to the resource group's deployment history with the deployment name. If you run another deployment and give it the same name, the earlier entry is replaced with the current deployment. If you want to maintain unique entries in the deployment history, give each deployment a unique name.
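The default-name behavior described above can be sketched in shell terms: the deployment name falls back to the template filename minus its extension. The explicit-name command shown in the comment is a hypothetical illustration:

```shell
template_file="azuredeploy.json"

# When no deployment name is given, the name defaults to the
# template filename without its extension.
deployment_name="$(basename "$template_file" .json)"
echo "$deployment_name"   # prints: azuredeploy

# A hypothetical unique name keeps separate entries in the history:
# az deployment group create --name "deploy-$(date +%Y%m%d%H%M%S)" ...
```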
For more information, see [Azure Resource Manager template specs](template-specs
## Preview changes
-Before deploying your template, you can preview the changes the template will make to your environment. Use the [what-if operation](./deploy-what-if.md) to verify that the template makes the changes that you expect. What-if also validates the template for errors.
+Before deploying your ARM template, you can preview the changes the template will make to your environment. Use the [what-if operation](./deploy-what-if.md) to verify that the template makes the changes that you expect. What-if also validates the template for errors.
## Parameters
azure-resource-manager Secure Template With Sas Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/secure-template-with-sas-token.md
Title: Securely deploy template with SAS token
-description: Deploy resources to Azure with an Azure Resource Manager template that is protected by a SAS token. Shows Azure PowerShell and Azure CLI.
+ Title: Deploy ARM template with SAS token - Azure Resource Manager | Microsoft Docs
+description: Learn how to use Azure CLI or Azure PowerShell to securely deploy a private ARM template with a SAS token. Protect and manage access to your templates.
Previously updated : 08/25/2020
Last updated : 09/17/2021
+keywords: private template, sas token template, storage account, template security, azure arm template, azure resource manager template
-# Deploy private ARM template with SAS token
+# How to deploy private ARM template with SAS token
-When your Azure Resource Manager template (ARM template) is located in a storage account, you can restrict access to the template to avoid exposing it publicly. You access a secured template by creating a shared access signature (SAS) token for the template, and providing that token during deployment. This article explains how to use Azure PowerShell or Azure CLI to deploy a template with a SAS token.
+When your Azure Resource Manager template (ARM template) is located in a storage account, you can restrict access to the template to avoid exposing it publicly. You access a secured template by creating a shared access signature (SAS) token for the template, and providing that token during deployment. This article explains how to use Azure PowerShell or Azure CLI to securely deploy an ARM template with a SAS token.
+
+This article shows how to protect and manage access to your private ARM templates, with directions for the following tasks:
+
+* Create storage account with secured container
+* Upload template to storage account
+* Provide SAS token during deployment
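In outline, the final step combines the blob URL and the generated SAS token into a single template URI. A sketch with placeholder values (the URL and token below are hypothetical, for illustration only):

```shell
# Hypothetical blob URL and SAS token; a real token comes from the
# storage account when you generate it before deployment.
blob_url="https://mystorage.blob.core.windows.net/templates/azuredeploy.json"
sas_token="sv=2019-02-02&sr=b&sp=r&se=2021-09-17T20:00:00Z&sig=PLACEHOLDER"

# The secured template URI is the blob URL plus the token as a query string.
template_uri="${blob_url}?${sas_token}"
echo "$template_uri"
```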
> [!IMPORTANT]
-> Instead of securing your template with a SAS token, consider using [template specs](template-specs.md). With template specs, you can share your templates with other users in your organization and manage access to the templates through Azure RBAC.
+> Instead of securing your private template with a SAS token, consider using [template specs](template-specs.md). With template specs, you can share your templates with other users in your organization and manage access to the templates through Azure RBAC.
## Create storage account with secured container
-The following script creates a storage account and container with public access turned off.
+The following script creates a storage account and container with public access turned off for template security.
# [PowerShell](#tab/azure-powershell)
az storage container create \
-## Upload template to storage account
+## Upload private template to storage account
Now, you're ready to upload your template to the storage account. Provide the path to the template you want to use.
az storage blob upload \
To deploy a private template in a storage account, generate a SAS token and include it in the URI for the template. Set the expiry time to allow enough time to complete the deployment.

> [!IMPORTANT]
-> The blob containing the template is accessible to only the account owner. However, when you create a SAS token for the blob, the blob is accessible to anyone with that URI. If another user intercepts the URI, that user is able to access the template. A SAS token is a good way of limiting access to your templates, but you should not include sensitive data like passwords directly in the template.
+> The blob containing the private template is accessible to only the account owner. However, when you create a SAS token for the blob, the blob is accessible to anyone with that URI. If another user intercepts the URI, that user is able to access the template. A SAS token is a good way of limiting access to your templates, but you should not include sensitive data like passwords directly in the template.
> # [PowerShell](#tab/azure-powershell)
azure-sql Authentication Azure Ad Only Authentication Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/authentication-azure-ad-only-authentication-policy-how-to.md
Previously updated : 08/31/2021 Last updated : 09/22/2021 # Using Azure Policy to enforce Azure Active Directory only authentication with Azure SQL
azure-sql Authentication Azure Ad Only Authentication Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/authentication-azure-ad-only-authentication-policy.md
Previously updated : 08/31/2021 Last updated : 09/22/2021 # Azure Policy for Azure Active Directory only authentication with Azure SQL
The Azure Policy can prevent a new logical server or managed instance from being
## Next steps > [!div class="nextstepaction"]
-> [Using Azure Policy to enforce Azure Active Directory only authentication with Azure SQL](authentication-azure-ad-only-authentication-policy-how-to.md)
+> [Using Azure Policy to enforce Azure Active Directory only authentication with Azure SQL](authentication-azure-ad-only-authentication-policy-how-to.md)
azure-sql Az Cli Script Samples Content Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/az-cli-script-samples-content-guide.md
Title: Azure CLI script examples
+ Title: Azure CLI samples for Azure SQL Database & Managed Instances | Microsoft Docs
-description: Azure CLI script examples to create and manage Azure SQL Database and Azure SQL Managed Instance
+description: Find Azure CLI script samples to create and manage Azure SQL Database and Azure SQL Managed Instance.
-+ ms.devlang: azurecli Previously updated : 02/03/2019 Last updated : 09/17/2021
+keywords: sql database, managed instance, azure cli samples, azure cli examples, azure cli code samples, azure cli script examples
# Azure CLI samples for Azure SQL Database and SQL Managed Instance
azure-sql Service Tiers Sql Database Vcore https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/service-tiers-sql-database-vcore.md
Service tier options in the vCore purchase model include General Purpose, Busine
|Best for|Most business workloads. Offers budget-oriented, balanced, and scalable compute and storage options. |Offers business applications the highest resilience to failures by using several isolated replicas, and provides the highest I/O performance per database replica.|Most business workloads with highly scalable storage and read-scale requirements. Offers higher resilience to failures by allowing configuration of more than one isolated database replica. |
|Storage|Uses remote storage.<br/>**SQL Database provisioned compute**:<br/>5 GB – 4 TB<br/>**Serverless compute**:<br/>5 GB - 3 TB|Uses local SSD storage.<br/>**SQL Database provisioned compute**:<br/>5 GB – 4 TB|Flexible autogrow of storage as needed. Supports up to 100 TB of storage. Uses local SSD storage for local buffer-pool cache and local data storage. Uses Azure remote storage as final long-term data store. |
|IOPS and throughput (approximate)|**SQL Database**: See resource limits for [single databases](resource-limits-vcore-single-databases.md) and [elastic pools](resource-limits-vcore-elastic-pools.md).|See resource limits for [single databases](resource-limits-vcore-single-databases.md) and [elastic pools](resource-limits-vcore-elastic-pools.md).|Hyperscale is a multi-tiered architecture with caching at multiple levels. Effective IOPS and throughput will depend on the workload.|
-|Availability|1 replica, no read-scale replicas|3 replicas, 1 [read-scale replica](read-scale-out.md),<br/>zone-redundant high availability (HA)|1 read-write replica, plus 0-4 [read-scale replicas](read-scale-out.md)|
+|Availability|1 replica, no read-scale replicas, <br/>zone-redundant high availability (HA) (preview)|3 replicas, 1 [read-scale replica](read-scale-out.md),<br/>zone-redundant high availability (HA)|1 read-write replica, plus 0-4 [read-scale replicas](read-scale-out.md)|
|Backups|A choice of geo-redundant, zone-redundant\*, or locally-redundant\* backup storage, 1-35 day retention (default 7 days)|A choice of geo-redundant, zone-redundant\*, or locally-redundant\* backup storage, 1-35 day retention (default 7 days)|A choice of geo-redundant, zone-redundant\*\*, or locally-redundant\*\* backup storage, 7 day retention.<p>Snapshot-based backups in Azure remote storage. Restores use snapshots for fast recovery. Backups are instantaneous and don't impact compute I/O performance. Restores are fast and aren't a size-of-data operation (taking minutes rather than hours).|
|In-memory|Not supported|Supported|Partial support. Memory-optimized table types, table variables, and natively compiled modules are supported.|
|||
azure-sql Troubleshoot Common Errors Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/troubleshoot-common-errors-issues.md
To resolve this issue, try the steps (in the order presented) in the [Steps to f
### Error 26: Error Locating server specified
-``System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.(provider: SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified)``
+`System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.(provider: SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified)`
#### Error 40: Could not open a connection to the server
-``A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)``
+`A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)`
#### Error 10053: A transport-level error has occurred when receiving results from the server
-``10053: A transport-level error has occurred when receiving results from the server. (Provider: TCP Provider, error: 0 - An established connection was aborted by the software in your host machine)``
+`10053: A transport-level error has occurred when receiving results from the server. (Provider: TCP Provider, error: 0 - An established connection was aborted by the software in your host machine)`
These issues occur if the application can't connect to the server.
To resolve this issue, make sure that port 1433 is open for outbound connections
### Login failed for user '< User name >'
-``Login failed for user '<User name>'.This session has been assigned a tracing ID of '<Tracing ID>'. Provide this tracing ID to customer support when you need assistance. (Microsoft SQL Server, Error: 18456)``
+`Login failed for user '<User name>'. This session has been assigned a tracing ID of '<Tracing ID>'. Provide this tracing ID to customer support when you need assistance. (Microsoft SQL Server, Error: 18456)`
To resolve this issue, contact your service administrator to provide you with a valid user name and password.
For more information, see [Managing databases and logins in Azure SQL Database](
### System.Data.SqlClient.SqlException (0x80131904): Connection Timeout Expired
-``System.Data.SqlClient.SqlException (0x80131904): Connection Timeout Expired. The timeout period elapsed while attempting to consume the pre-login handshake acknowledgement. This could be because the pre-login handshake failed or the server was unable to respond back in time. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=3; handshake=29995;``
+`System.Data.SqlClient.SqlException (0x80131904): Connection Timeout Expired. The timeout period elapsed while attempting to consume the pre-login handshake acknowledgement. This could be because the pre-login handshake failed or the server was unable to respond back in time. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=3; handshake=29995;`
### System.Data.SqlClient.SqlException (0x80131904): Timeout expired
-``System.Data.SqlClient.SqlException (0x80131904): Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.``
+`System.Data.SqlClient.SqlException (0x80131904): Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.`
### System.Data.Entity.Core.EntityException: The underlying provider failed on Open
-``System.Data.Entity.Core.EntityException: The underlying provider failed on Open. -> System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. -> System.ComponentModel.Win32Exception: The wait operation timed out``
+`System.Data.Entity.Core.EntityException: The underlying provider failed on Open. -> System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. -> System.ComponentModel.Win32Exception: The wait operation timed out`
### Cannot connect to < server name >
-``Cannot connect to <server name>.ADDITIONAL INFORMATION:Connection Timeout Expired. The timeout period elapsed during the post-login phase. The connection could have timed out while waiting for server to complete the login process and respond; Or it could have timed out while attempting to create multiple active connections. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=231; handshake=983; [Login] initialization=0; authentication=0; [Post-Login] complete=13000; (Microsoft SQL Server, Error: -2) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&EvtSrc=MSSQLServer&EvtID=-2&LinkId=20476 The wait operation timed out``
+`Cannot connect to <server name>.ADDITIONAL INFORMATION:Connection Timeout Expired. The timeout period elapsed during the post-login phase. The connection could have timed out while waiting for server to complete the login process and respond; Or it could have timed out while attempting to create multiple active connections. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=231; handshake=983; [Login] initialization=0; authentication=0; [Post-Login] complete=13000; (Microsoft SQL Server, Error: -2) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&EvtSrc=MSSQLServer&EvtID=-2&LinkId=20476 The wait operation timed out`
These exceptions can occur either because of connection or query issues. To confirm that this error is caused by connectivity issues, see [Confirm whether an error is caused by a connectivity issue](#confirm-whether-an-error-is-caused-by-a-connectivity-issue).
Connection timeouts occur because the application can't connect to the server. T
### Error 10928: Resource ID: %d
-``10928: Resource ID: %d. The %s limit for the database is %d and has been reached. See http://go.microsoft.com/fwlink/?LinkId=267637 for assistance. The Resource ID value in error message indicates the resource for which limit has been reached. For sessions, Resource ID = 2.``
+`10928: Resource ID: %d. The %s limit for the database is %d and has been reached. See http://go.microsoft.com/fwlink/?LinkId=267637 for assistance.` The Resource ID value in the error message indicates the resource for which the limit has been reached. For sessions, Resource ID = 2.
To work around this issue, try one of the following methods:
For more information about database limits, see [SQL Database resource limits f
### Error 10929: Resource ID: 1
-``10929: Resource ID: 1. The %s minimum guarantee is %d, maximum limit is %d and the current usage for the database is %d. However, the server is currently too busy to support requests greater than %d for this database. See http://go.microsoft.com/fwlink/?LinkId=267637 for assistance. Otherwise, please try again later.``
+`10929: Resource ID: 1. The %s minimum guarantee is %d, maximum limit is %d and the current usage for the database is %d. However, the server is currently too busy to support requests greater than %d for this database. See http://go.microsoft.com/fwlink/?LinkId=267637 for assistance. Otherwise, please try again later.`
### Error 40501: The service is currently busy
-``40501: The service is currently busy. Retry the request after 10 seconds. Incident ID: %ls. Code: %d.``
+`40501: The service is currently busy. Retry the request after 10 seconds. Incident ID: %ls. Code: %d.`
This is an engine throttling error, an indication that resource limits are being exceeded.
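Because error 40501 is transient and the message itself suggests retrying after 10 seconds, client code typically wraps the call in a retry loop. A minimal illustration of that pattern (the `flaky_query` callable and the `RuntimeError`-based error matching are hypothetical stand-ins for your real data-access call and its SQL exception type):

```python
import time

def with_retry(run_query, max_attempts=5, delay_seconds=10):
    """Retry a callable when it raises a transient error such as 40501."""
    for attempt in range(1, max_attempts + 1):
        try:
            return run_query()
        except RuntimeError:
            # In real code, inspect the SQL error number (e.g. 40501)
            # and only retry errors known to be transient.
            if attempt == max_attempts:
                raise
            time.sleep(delay_seconds)

# Hypothetical query that fails twice with a throttling error, then succeeds.
calls = {"n": 0}
def flaky_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("40501: The service is currently busy.")
    return "rows"

print(with_retry(flaky_query, delay_seconds=0))  # prints: rows
```

In production, keep the delay at the suggested 10 seconds (or use exponential backoff) so retries don't add to the load that triggered the throttling.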
For more information about resource limits, see [Logical SQL server resource lim
### Error 40544: The database has reached its size quota
-``40544: The database has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions. Incident ID: <ID>. Code: <code>.``
+`40544: The database has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions. Incident ID: <ID>. Code: <code>.`
This error occurs when the database has reached its size quota.
The following steps can either help you work around the problem or provide you w
### Error 40549: Session is terminated because you have a long-running transaction
-``40549: Session is terminated because you have a long-running transaction. Try shortening your transaction.``
+`40549: Session is terminated because you have a long-running transaction. Try shortening your transaction.`
If you repeatedly encounter this error, try to resolve the issue by following these steps:
Also consider batching your queries. For information on batching, see [How to us
### Error 40551: The session has been terminated because of excessive TEMPDB usage
-``40551: The session has been terminated because of excessive TEMPDB usage. Try modifying your query to reduce the temporary table space usage.``
+`40551: The session has been terminated because of excessive TEMPDB usage. Try modifying your query to reduce the temporary table space usage.`
To work around this issue, follow these steps:
To work around this issue, follow these steps:
### Error 40552: The session has been terminated because of excessive transaction log space usage
-``40552: The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction.``
+`40552: The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction.`
To resolve this issue, try the following methods:
Try to reduce the number of rows that are operated on immediately by implementin
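Batching here simply means committing the work in fixed-size chunks so that no single transaction modifies an excessive number of rows. A schematic sketch (the chunk size and the `commit_batch` stand-in are hypothetical; substitute your real per-batch transaction):

```python
def chunks(rows, batch_size):
    """Yield successive fixed-size batches so each transaction stays small."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

committed = []
def commit_batch(batch):
    # Hypothetical stand-in for one small INSERT/UPDATE transaction.
    committed.append(list(batch))

rows = list(range(10))
for batch in chunks(rows, batch_size=4):
    commit_batch(batch)

print(committed)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Tuning the batch size trades throughput (fewer, larger transactions) against transaction log pressure (smaller transactions generate and release log space more gradually).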
### Error 40553: The session has been terminated because of excessive memory usage
-``40553 : The session has been terminated because of excessive memory usage. Try modifying your query to process fewer rows.``
+`40553: The session has been terminated because of excessive memory usage. Try modifying your query to process fewer rows.`
To work around this issue, try to optimize the query.
azure-sql Troubleshoot Transaction Log Errors Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/troubleshoot-transaction-log-errors-issues.md
For more information on managing the file space of databases and elastic pools,
### Error 40552: The session has been terminated because of excessive transaction log space usage
-``40552: The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction.``
+`40552: The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction.`
To resolve this issue, try the following methods:
For information on transaction log sizes, see:
- For DTU resource limits for a single database, see [resource limits for single databases using the DTU purchasing model](resource-limits-dtu-single-databases.md)
- For DTU resource limits for elastic pools, see [resource limits for elastic pools using the DTU purchasing model](resource-limits-dtu-elastic-pools.md)
- For resource limits for SQL Managed Instance, see [resource limits for SQL Managed Instance](../managed-instance/resource-limits.md).
azure-sql Connectivity Architecture Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/connectivity-architecture-overview.md
Deploy SQL Managed Instance in a dedicated subnet inside the virtual network. Th
- **Network security group (NSG):** An NSG needs to be associated with the SQL Managed Instance subnet. You can use an NSG to control access to the SQL Managed Instance data endpoint by filtering traffic on port 1433 and ports 11000-11999 when SQL Managed Instance is configured for redirect connections. The service will automatically provision and keep current [rules](#mandatory-inbound-security-rules-with-service-aided-subnet-configuration) required to allow uninterrupted flow of management traffic.
- **User defined route (UDR) table:** A UDR table needs to be associated with the SQL Managed Instance subnet. You can add entries to the route table to route traffic that has on-premises private IP ranges as a destination through the virtual network gateway or virtual network appliance (NVA). The service will automatically provision and keep current [entries](#mandatory-user-defined-routes-with-service-aided-subnet-configuration) required to allow uninterrupted flow of management traffic.
- **Sufficient IP addresses:** The SQL Managed Instance subnet must have at least 32 IP addresses. For more information, see [Determine the size of the subnet for SQL Managed Instance](vnet-subnet-determine-size.md). You can deploy managed instances in [the existing network](vnet-existing-add-subnet.md) after you configure it to satisfy [the networking requirements for SQL Managed Instance](#network-requirements). Otherwise, create a [new network and subnet](virtual-network-subnet-create-arm-template.md).
+- **Unlocked resources:** The virtual network that contains the subnet delegated to SQL Managed Instance must not have any [write or delete locks](../../azure-resource-manager/management/lock-resources.md) placed on the virtual network resource, its parent resource group, or subscription. Placing locks on the virtual network or its parent resources may prevent SQL Managed Instance from completing its regular maintenance, which can cause degraded performance, delayed bug fixes, loss of regulatory compliance, operation outside of SLOs, or an unusable instance.
+- **Allowed by Azure policies:** If you leverage [Azure Policy](../../governance/policy/overview.md) to control the creation, modification and deletion of resources via deny effects in the scope that includes the virtual network whose subnet is delegated to SQL Managed Instance, you need to take steps to ensure that such policies do not prevent SQL Managed Instance from deploying or performing regular maintenance. If resources of those resource types cannot be created or managed by SQL Managed Instance, it may fail to deploy or become unusable following a maintenance operation. The types of resources that need to be excluded from deny effects are:
+ - Microsoft.Network/serviceEndpointPolicies
+ - Microsoft.Network/networkIntentPolicies
+ - Microsoft.Network/virtualNetworks/subnets/contextualServiceEndpointPolicies
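One common way to keep a broad deny policy from blocking these resource types is a `notIn` condition on `type` in the policy rule. The fragment below is a hypothetical sketch, not an official sample; the `allowedLocations` parameter stands in for whatever condition your real policy actually denies on:

```json
{
  "if": {
    "allOf": [
      {
        "field": "type",
        "notIn": [
          "Microsoft.Network/serviceEndpointPolicies",
          "Microsoft.Network/networkIntentPolicies",
          "Microsoft.Network/virtualNetworks/subnets/contextualServiceEndpointPolicies"
        ]
      },
      {
        "field": "location",
        "notIn": "[parameters('allowedLocations')]"
      }
    ]
  },
  "then": {
    "effect": "deny"
  }
}
```

With the `notIn` exclusion in place, the deny effect never evaluates against the resource types SQL Managed Instance needs to create and manage.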
> [!IMPORTANT]
> When you create a managed instance, a network intent policy is applied on the subnet to prevent noncompliant changes to the networking setup. After the last instance is removed from the subnet, the network intent policy is also removed. The rules below are for informational purposes only, and you should not deploy them using an ARM template, PowerShell, or the CLI. If you want to use the latest official template, you can always [retrieve it from the portal](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md).
azure-sql Rhel High Availability Listener Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/linux/rhel-high-availability-listener-tutorial.md
At this point, the resource group has a load balancer that connects to all SQL S
1. Check your cluster resources using the command `sudo pcs resource`, and you should see that the primary instance is now `<VM2>`. > [!NOTE]
- > This article contains references to the term slave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+ > This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
```output
azure-sql Rhel High Availability Stonith Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/linux/rhel-high-availability-stonith-tutorial.md
If the `synchronization_state_desc` lists SYNCHRONIZED for `db1`, this means the
We will be following the guide to [create the availability group resources in the Pacemaker cluster](/sql/linux/sql-server-linux-create-availability-group#create-the-availability-group-resources-in-the-pacemaker-cluster-external-only). > [!NOTE]
-> This article contains references to the term slave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
### Create the AG cluster resource
azure-sql Sql Agent Extension Automatic Registration All Vms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/sql-agent-extension-automatic-registration-all-vms.md
To do so, follow these steps:
1. Save [this script](https://github.com/microsoft/tigertoolbox/blob/master/AzureSQLVM/EnableBySubscription.ps1).
1. Navigate to where you saved the script by using an administrative Command Prompt or PowerShell window.
1. Connect to Azure (`az login`).
-1. Execute the script, passing in SubscriptionIds as parameters such as
- `.\EnableBySubscription.ps1 -SubscriptionList SubscriptionId1,SubscriptionId2`
+1. Execute the script, passing in SubscriptionIds as parameters. If no subscriptions are specified, the script will enable auto-registration for all the subscriptions in the user account.
- For example:
+ The following command will enable auto-registration for two subscriptions:
```console
.\EnableBySubscription.ps1 -SubscriptionList a1a1a-aa11-11aa-a1a1-a11a111a1,b2b2b2-bb22-22bb-b2b2-b2b2b2bb
```
+ The following command will enable auto-registration for all subscriptions:
+
+ ```console
+ .\EnableBySubscription.ps1
+ ```
Failed registration errors are stored in `RegistrationErrors.csv` located in the same directory where you saved and executed the `.ps1` script from.
azure-vmware Configure Vmware Syslogs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/configure-vmware-syslogs.md
+
+ Title: Configure VMware syslogs for Azure VMware Solution
+description: Learn how to configure diagnostic settings to collect VMware syslogs for your Azure VMware Solution private cloud.
+ Last updated : 09/24/2021+
+#Customer intent: As an Azure service administrator, I want to collect VMware syslogs and store them in my storage account so that I can view the vCenter logs and analyze them for diagnostic purposes.
+++
+# Configure VMware syslogs for Azure VMware Solution
+
+Diagnostic settings are used to configure streaming export of platform logs and metrics for a resource to the destination of your choice. You can create up to five different diagnostic settings to send different logs and metrics to independent destinations.
+
+In this topic, you'll configure a diagnostic setting to collect VMware syslogs for your Azure VMware Solution private cloud. You'll store the syslogs in a storage account so that you can view the vCenter logs and analyze them for diagnostic purposes.
+
+## Prerequisites
+
+Make sure you have an Azure VMware Solution private cloud with access to the vCenter and NSX-T Manager interfaces.
+
+## Configure diagnostic settings
+
+1. From your Azure VMware Solution private cloud, select **Diagnostic settings**, and then **Add diagnostic settings**.
+
+ :::image type="content" source="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-1.png" alt-text="Screenshot showing where to configure VMware syslogs." lightbox="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-1.png":::
++
+1. Select the **vmwaresyslog**, **Allmetrics**, and **Archive to storage account** options.
+
+ >[!IMPORTANT]
+ >The **Send to log analytics workspace** option does not currently work.
+
+1. Select the storage account where you want to store the logs and then select **Save**.
+
+ :::image type="content" source="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-2.png" alt-text="Screenshot showing the options to select for storing the syslogs." lightbox="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-2.png":::
+
+1. Go to your storage account, verify that **Insight logs vmwarelog** has been created, and then select it.
+
+ :::image type="content" source="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-3.png" alt-text="Screenshot showing the Insight logs vmwarelog option created and available." lightbox="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-3.png":::
++
+1. Browse to the location and download the JSON file to view the logs.
+
+ :::image type="content" source="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-4.png" alt-text="Screenshot showing the drill-down path to the json file." lightbox="media/diagnostic-settings/diagnostic-settings-log-analytics-workspace-4.png":::
+
azure-vmware Tutorial Configure Networking https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/tutorial-configure-networking.md
In this tutorial, you learn how to:
>[!NOTE]
>Before you create a new vNet, evaluate whether you already have an existing vNet in Azure that you plan to use to connect to Azure VMware Solution, or whether you need to create a new vNet entirely.
->* To use an existing vNet, use the **[Azure vNet connect](#select-an-existing-vnet)** tab under **Connectivity**.
->* To create a new vNet, use the **[Azure vNet connect](#create-a-new-vnet)** tab or create one [manually](#create-a-vnet-manually).
+>* To use an existing vNet in the same Azure subscription as Azure VMware Solution, use the **[Azure vNet connect](#select-an-existing-vnet)** tab under **Connectivity**.
+>* To use an existing vNet in a different Azure subscription than Azure VMware Solution, follow the guidance in **[Connect to the private cloud manually](#connect-to-the-private-cloud-manually)**.
+>* To create a new vNet in the same Azure subscription as Azure VMware Solution, use the **[Azure vNet connect](#create-a-new-vnet)** tab or create one [manually](#create-a-vnet-manually).
## Connect with the Azure vNet connect feature
backup Backup Azure Backup Import Export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-backup-import-export.md
This section describes the offline backup workflow so that your data can be deli
The *AzureOfflineBackupDiskPrep* utility prepares the SATA drives that are sent to the nearest Azure datacenter. This utility is available in the Azure Backup Agent installation directory in the following path:
-```*\Microsoft Azure Recovery Services Agent\Utils\\*```
+`*\Microsoft Azure Recovery Services Agent\Utils\\*`
1. Go to the directory, and copy the *AzureOfflineBackupDiskPrep* directory to another computer where the SATA drives are connected. On the computer with the connected SATA drives, ensure that:
The *AzureOfflineBackupDiskPrep* utility prepares the SATA drives that are sent
1. Open an elevated command prompt on the copy computer with the *AzureOfflineBackupDiskPrep* utility directory as the current directory. Run the following command:
- ```.\AzureOfflineBackupDiskPrep.exe s:<Staging Location Path>```
+ `.\AzureOfflineBackupDiskPrep.exe s:<Staging Location Path>`
| Parameter | Description |
| | |
backup Backup Azure Monitoring Built In Monitor https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-monitoring-built-in-monitor.md
To opt-in to Azure Monitor alerts for backup failure and restore failure scenari
> [!NOTE]
> It may take up to 24 hours for the registration to take effect. To enable this feature for multiple subscriptions, repeat the above process by selecting the relevant subscription at the top of the screen. To continue receiving alerts, we also recommend re-registering the preview flag if a new resource has been created in the subscription after the initial registration.
+4. As a best practice, we also recommend that you register the resource provider to ensure that the feature registration information syncs with the Azure Backup service as expected. To register the resource provider, run the following PowerShell command in the subscription for which you registered the feature flag.
+
+```powershell
+Register-AzResourceProvider -ProviderNamespace <ProviderNamespace>
+```
+
+To receive alerts for Recovery Services vaults, use the value _Microsoft.RecoveryServices_ for the _ProviderNamespace_ parameter. To receive alerts for Backup vaults, use the value _Microsoft.DataProtection_.
+
### Viewing fired alerts in the Azure portal

Once an alert is fired for a vault, you can view the alert in the Azure portal by navigating to Backup center. On the **Overview** tab, you can see a summary of active alerts split by severity. There are two kinds of alerts displayed:
backup Backup Mabs System State And Bmr https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-mabs-system-state-and-bmr.md
You also can run the system state restore at a command prompt:
1. To get the version identifier, at a command prompt, enter:
- ```wbadmin get versions -backuptarget \<servername\sharename\>```
+ `wbadmin get versions -backuptarget \<servername\sharename\>`
1. Use the version identifier to start the system state restore. At the command prompt, enter:
- ```wbadmin start systemstaterecovery -version:<versionidentified> -backuptarget:<servername\sharename>```
+ `wbadmin start systemstaterecovery -version:<versionidentifier> -backuptarget:<servername\sharename>`
1. Confirm that you want to start the recovery. You can see the process in the Command Prompt window. A restore log is created.
backup Selective Disk Backup Restore https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/selective-disk-backup-restore.md
Enable-AzRecoveryServicesBackupProtection -Item $item -ExcludeAllDataDisks -Vau
Enable-AzRecoveryServicesBackupProtection -Item $item -ResetExclusionSettings -VaultId $targetVault.ID
```
+> [!NOTE]
+> If the command fails with an error saying that a policy parameter is required, check the protection status of the backup item. Protection is likely stopped, so a policy is required both to resume protection and to reset any previous disk exclusion settings.
+ ### Restore selective disks with PowerShell

```azurepowershell
bastion Bastion Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/bastion-faq.md
Yes, connectivity via Bastion will continue to work for peered VNets across diff
:::image type="content" source="./media/bastion-faq/global-subscriptions.png" alt-text="Global subscriptions filter." lightbox="./media/bastion-faq/global-subscriptions.png":::
+### Does Bastion support connectivity to Azure Virtual Desktop?
+No, Bastion connectivity to Azure Virtual Desktop is not supported.
+ ### I have access to the peered VNet, but I can't see the VM deployed there. Make sure the user has **read** access to both the VM and the peered VNet. Additionally, check under IAM that the user has **read** access to the following resources:
Make sure the user has **read** access to both the VM, and the peered VNet. Addi
|Microsoft.Network/networkInterfaces/ipconfigurations/read|Gets a network interface IP configuration definition.|Action|
|Microsoft.Network/virtualNetworks/read|Get the virtual network definition|Action|
|Microsoft.Network/virtualNetworks/subnets/virtualMachines/read|Gets references to all the virtual machines in a virtual network subnet|Action|
-|Microsoft.Network/virtualNetworks/virtualMachines/read|Gets references to all the virtual machines in a virtual network|Action|
+|Microsoft.Network/virtualNetworks/virtualMachines/read|Gets references to all the virtual machines in a virtual network|Action|
batch Batch Pool No Public Ip Address https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/batch-pool-no-public-ip-address.md
To restrict access to these nodes and reduce the discoverability of these nodes
- **Authentication**. To use a pool without public IP addresses inside a [virtual network](./batch-virtual-network.md), the Batch client API must use Azure Active Directory (AD) authentication. Azure Batch support for Azure AD is documented in [Authenticate Batch service solutions with Active Directory](batch-aad-auth.md). If you aren't creating your pool within a virtual network, either Azure AD authentication or key-based authentication can be used.
- **An Azure VNet**. If you are creating your pool in a [virtual network](batch-virtual-network.md), follow these requirements and configurations. To prepare a VNet with one or more subnets in advance, you can use the Azure portal, Azure PowerShell, the Azure Command-Line Interface (CLI), or other methods.
  - The VNet must be in the same subscription and region as the Batch account you use to create your pool.
  - The subnet specified for the pool must have enough unassigned IP addresses to accommodate the number of VMs targeted for the pool; that is, the sum of the `targetDedicatedNodes` and `targetLowPriorityNodes` properties of the pool. If the subnet doesn't have enough unassigned IP addresses, the pool partially allocates the compute nodes, and a resize error occurs.
  - You must disable private link service and endpoint network policies. This can be done by using Azure CLI:
- ```az network vnet subnet update --vnet-name <vnetname> -n <subnetname> --resource-group <resourcegroup> --disable-private-endpoint-network-policies --disable-private-link-service-network-policies```
+
+ `az network vnet subnet update --vnet-name <vnetname> -n <subnetname> --resource-group <resourcegroup> --disable-private-endpoint-network-policies --disable-private-link-service-network-policies`
> [!IMPORTANT] > For each 100 dedicated or low-priority nodes, Batch allocates one private link service and one load balancer. These resources are limited by the subscription's [resource quotas](../azure-resource-manager/management/azure-subscription-service-limits.md). For large pools, you might need to [request a quota increase](batch-quota-limit.md#increase-a-quota) for one or more of these resources. Additionally, no resource locks should be applied to any resource created by Batch, since this prevents cleanup of resources as a result of user-initiated actions such as deleting a pool or resizing to zero.
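The sizing rules above (one subnet IP per targeted VM, plus one private link service and one load balancer per 100 nodes) lend themselves to a quick capacity check before requesting quota. The helper below is an illustrative sketch, not part of any Batch SDK:

```python
import math

def pool_network_requirements(target_dedicated, target_low_priority):
    """Estimate subnet and quota needs for a Batch pool without public IPs.

    Per the rules above: the subnet needs one unassigned IP per targeted
    node, and Batch allocates one private link service and one load
    balancer per 100 nodes (rounded up). Illustrative helper only.
    """
    nodes = target_dedicated + target_low_priority
    per_hundred = math.ceil(nodes / 100)
    return {
        "subnet_ips_needed": nodes,
        "private_link_services": per_hundred,
        "load_balancers": per_hundred,
    }
```

For example, a pool with 150 dedicated and 75 low-priority nodes needs 225 unassigned subnet IPs and three of each per-100-node resource.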
batch Batch Cli Sample Add Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-add-application.md
Title: Azure CLI Script Example - Add an Application in Batch
-description: This sample script demonstrates how to add an application for use with an Azure Batch pool or a task.
+ Title: Azure CLI Script Example - Add an Application in Batch | Microsoft Docs
+description: Learn how to add an application for use with an Azure Batch pool or a task using the Azure CLI.
Previously updated : 01/29/2018 -- Last updated : 09/17/2021+
+keywords: batch, azure cli samples, azure cli code samples, azure cli script samples
# CLI example: Add an application to an Azure Batch account
batch Batch Cli Sample Create Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-create-account.md
Title: Azure CLI Script Example - Create Batch account - Batch service
-description: This script creates an Azure Batch account in Batch service mode and shows how to query or update various properties of the account.
+ Title: Azure CLI Script Example - Create Batch account - Batch service | Microsoft Docs
+description: Learn how to create a Batch account in Batch service mode with this Azure CLI script example. This script also shows how to query or update various properties of the account.
Previously updated : 01/29/2018 - Last updated : 09/17/2021+
+keywords: batch, azure cli samples, azure cli code samples, azure cli script samples
# CLI example: Create a Batch account in Batch service mode
batch Batch Cli Sample Create User Subscription Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-create-user-subscription-account.md
Title: Azure CLI Script Example - Create Batch account - user subscription
-description: This script creates an Azure Batch account in user subscription mode. This account allocates compute nodes into your subscription.
+ Title: Azure CLI Script Example - Create Batch account - user subscription | Microsoft Docs
+description: Learn how to create an Azure Batch account in user subscription mode. This account allocates compute nodes into your subscription.
Previously updated : 08/31/2021 - Last updated : 09/17/2021+
+keywords: batch, azure cli samples, azure cli examples, azure cli code samples
# CLI example: Create a Batch account in user subscription mode
batch Batch Cli Sample Manage Linux Pool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-manage-linux-pool.md
Title: Azure CLI Script Example - Linux Pool in Batch
-description: This script demonstrates some of the commands available in the Azure CLI to create and manage a pool of Linux compute nodes in Azure Batch.
+ Title: Azure CLI Script Example - Linux Pool in Batch | Microsoft Docs
+description: Learn the commands available in the Azure CLI to create and manage a pool of Linux compute nodes in Azure Batch.
Previously updated : 01/29/2018 -- Last updated : 09/17/2021 +
+keywords: linux, azure cli samples, azure cli code samples, azure cli script samples
# CLI example: Create and manage a Linux pool in Azure Batch
batch Batch Cli Sample Manage Windows Pool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-manage-windows-pool.md
Title: Azure CLI Script Example - Windows Pool in Batch
-description: This script demonstrates some of the commands available in the Azure CLI to create and manage a pool of Windows compute nodes in Azure Batch.
+ Title: Azure CLI Script Example - Windows Pool in Batch | Microsoft Docs
+description: Learn some of the commands available in the Azure CLI to create and manage a pool of Windows compute nodes in Azure Batch.
Previously updated : 12/12/2019 - Last updated : 09/17/2021+
+keywords: windows pool, azure cli samples, azure cli code samples, azure cli script samples
# CLI example: Create and manage a Windows pool in Azure Batch
batch Batch Cli Sample Run Job https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/scripts/batch-cli-sample-run-job.md
Title: Azure CLI Script Example - Run a Batch job
-description: This script creates a Batch job and adds a series of tasks to the job. It also demonstrates how to monitor a job and its tasks.
+ Title: Azure CLI Script Example - Run a Batch job | Microsoft Docs
+description: Learn how to create a Batch job and add a series of tasks to the job using the Azure CLI. This article also shows how to monitor a job and its tasks.
Previously updated : 12/12/2019 -- Last updated : 09/17/2021+
+keywords: batch, batch job, monitor job, azure cli samples, azure cli code samples, azure cli script samples
# CLI example: Run a job and tasks with Azure Batch
cdn Cdn Azure Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-azure-diagnostic-logs.md
To download the tool, see [Azure Storage Explorer](https://storageexplorer.com/)
6. Each blob *PT1H.json* file represents the analytics logs for one hour for a specific CDN endpoint or its custom domain. 7. The schema of the contents of this JSON file is described in the section schema of the core analytics logs. - #### Blob path format
-Core analytics logs are generated every hour and the data is collected and stored inside a single Azure blob as a JSON payload. Storage explorer tool interprets '/' as a directory separator and shows the hierarchy. The path to the Azure blob appears as if there's a hierarchical structure and represents the blob name. The name of the blob follows the following naming convention:
+Core analytics logs are generated every hour, and the data is collected and stored inside a single Azure blob as a JSON payload. The Storage Explorer tool interprets '/' as a directory separator and shows the hierarchy. The path to the Azure blob appears as if there's a hierarchical structure and represents the blob name. The name of the blob follows this naming convention:
-```resourceId=/SUBSCRIPTIONS/{Subscription Id}/RESOURCEGROUPS/{Resource Group Name}/PROVIDERS/MICROSOFT.CDN/PROFILES/{Profile Name}/ENDPOINTS/{Endpoint Name}/ y={Year}/m={Month}/d={Day}/h={Hour}/m={Minutes}/PT1H.json```
+`resourceId=/SUBSCRIPTIONS/{Subscription Id}/RESOURCEGROUPS/{Resource Group Name}/PROVIDERS/MICROSOFT.CDN/PROFILES/{Profile Name}/ENDPOINTS/{Endpoint Name}/y={Year}/m={Month}/d={Day}/h={Hour}/m={Minutes}/PT1H.json`
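As an illustration, the blob name for a given hour can be assembled in code. The uppercase segment casing, zero-padded time fields, and exact separators below are assumptions inferred from the pattern above:

```python
from datetime import datetime, timezone

def core_analytics_blob_name(subscription_id, resource_group, profile, endpoint, hour_utc):
    """Build the blob name for one hour of CDN core analytics logs,
    following the naming convention shown above (casing and zero-padding
    are assumptions for illustration)."""
    return (
        f"resourceId=/SUBSCRIPTIONS/{subscription_id.upper()}"
        f"/RESOURCEGROUPS/{resource_group.upper()}"
        f"/PROVIDERS/MICROSOFT.CDN/PROFILES/{profile.upper()}"
        f"/ENDPOINTS/{endpoint.upper()}"
        f"/y={hour_utc.year}/m={hour_utc.month:02d}/d={hour_utc.day:02d}"
        f"/h={hour_utc.hour:02d}/m=00/PT1H.json"
    )

name = core_analytics_blob_name(
    "1111-2222", "myrg", "myprofile", "myendpoint",
    datetime(2021, 9, 17, 5, tzinfo=timezone.utc))
```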
**Description of fields:**
cdn Cdn Http Debug Headers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-http-debug-headers.md
The terms used in the above response header syntax are defined as follows:
The following sample response header indicates the cache state of the requested content at the time that it was requested:
-```X-EC-Debug: x-ec-cache-state: max-age=604800 (7d); cache-ts=1341802519 (Mon, 09 Jul 2012 02:55:19 GMT); cache-age=0 (0s); remaining-ttl=604800 (7d); expires-delta=none```
-
+`X-EC-Debug: x-ec-cache-state: max-age=604800 (7d); cache-ts=1341802519 (Mon, 09 Jul 2012 02:55:19 GMT); cache-age=0 (0s); remaining-ttl=604800 (7d); expires-delta=none`
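For illustration, the directive list in this header can be split into name/value pairs with a small parser. This is a best-effort sketch based only on the sample above; the header grammar is an assumption:

```python
def parse_cache_state(value):
    """Split an x-ec-cache-state directive list into a dict, keeping only
    the raw value before any parenthesized human-readable form."""
    out = {}
    for part in value.split(";"):
        # Each part looks like "name=value (readable form)".
        key, _, raw = part.strip().partition("=")
        out[key] = raw.split(" (")[0]
    return out

state = parse_cache_state(
    "max-age=604800 (7d); cache-ts=1341802519 (Mon, 09 Jul 2012 02:55:19 GMT); "
    "cache-age=0 (0s); remaining-ttl=604800 (7d); expires-delta=none")
```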
cdn Cdn Improve Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-improve-performance.md
The standard and premium CDN tiers provide the same compression functionality, b
### Azure CDN Standard from Microsoft profiles For **Azure CDN Standard from Microsoft** profiles, only eligible files are compressed. To be eligible for compression, a file must:-- Be of a MIME type that has been [configured for compression](#enabling-compression).
+- Be of a MIME type that has been [configured for compression](#enabling-compression)
+- Have only "identity" *Content-Encoding* headers in the origin response
- Be larger than 1 KB
- Be smaller than 8 MB
cdn Cdn Token Auth https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-token-auth.md
The following flowchart describes how Azure CDN validates a client request when
The OpenSSL tool has the following syntax:
- ```rand -hex <key length>```
+ `rand -hex <key length>`
For example:
- ```OpenSSL> rand -hex 32```
+ `OpenSSL> rand -hex 32`
To avoid downtime, create both a primary and a backup key. A backup key provides uninterrupted access to your content when your primary key is being updated.
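If you prefer not to use OpenSSL, an equivalent way to generate a primary and a backup key (shown here in Python for illustration) is the standard library's `secrets` module:

```python
import secrets

# Generate 32-byte (64 hex character) random keys, comparable to
# `OpenSSL> rand -hex 32`; create one primary and one backup key.
primary_key = secrets.token_hex(32)
backup_key = secrets.token_hex(32)
```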
The following flowchart describes how Azure CDN validates a client request when
## Azure CDN features and provider pricing
-For information about features, see [Azure CDN product features](cdn-features.md). For information about pricing, see [Content Delivery Network pricing](https://azure.microsoft.com/pricing/details/cdn/).
+For information about features, see [Azure CDN product features](cdn-features.md). For information about pricing, see [Content Delivery Network pricing](https://azure.microsoft.com/pricing/details/cdn/).
cdn Microsoft Pop Abbreviations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/microsoft-pop-abbreviations.md
This article lists Microsoft POP locations, sorted by POP abbreviation, for **Az
## Next steps * View [Azure CDN from Microsoft POP locations by metro](cdn-pop-locations.md#microsoft).
-* To get the latest IP addresses for allow listing, see the [Azure CDN Edge Nodes API](/rest/api/cdn/cdn/edgenodes).
+* To get the latest IP addresses for allow listing, see the [Azure CDN Edge Nodes API](/rest/api/cdn/edge-nodes/list).
* Learn how to [create an Azure CDN profile](cdn-create-new-endpoint.md).
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/whats-new.md
We've also added links to some user-generated content. Those items will be marke
## Release notes
+### September 2021
+* Anomaly Detector (univariate) available in Jio India West.
+ ### August 2021 * Multivariate anomaly detection APIs deployed in five more regions: West US 3, Japan East, Brazil South, Central US, Norway East. Now in total 15 regions are supported.
We've also added links to some user-generated content. Those items will be marke
* August 20, 2019 [Bring Anomaly Detector on-premises with containers support](https://channel9.msdn.com/Shows/AI-Show/Bring-Anomaly-Detector-on-premise-with-containers-support) - AI Show with Qun Ying and Seth Juarez * August 13, 2019 [Introducing Azure Anomaly Detector](https://channel9.msdn.com/Shows/AI-Show/Introducing-Azure-Anomaly-Detector?WT.mc_id=ai-c9-niner) - AI Show with Qun Ying and Seth Juarez
-## Open-source projects
-
-* June 3, 2019 **[UGC]** [Jupyter notebook demonstrating Anomaly Detection and streaming to Power BI](https://github.com/marvinbuss/MS-AnomalyDetector) - Marvin Buss
## Service updates
cognitive-services Deploy Computer Vision On Premises https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/deploy-computer-vision-on-premises.md
spec:
In the same *templates* folder, copy and paste the following helper functions into `helpers.tpl`. `helpers.tpl` defines useful functions to help generate Helm template. > [!NOTE]
-> This article contains references to the term slave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
```yaml {{- define "rabbitmq.hostname" -}}
cognitive-services Create Sas Tokens https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/document-translation/create-sas-tokens.md
In this article, you'll learn how to create shared access signature (SAS) tokens
## Create SAS tokens for blobs in the Azure portal
-> [!NOTE]
-> Creating SAS tokens for containers directly in the Azure portal is currently not supported. However, you can create an SAS token with [**Azure Storage Explorer**](#create-your-sas-tokens-with-azure-storage-explorer) or complete the task [programmatically](../../../storage/blobs/sas-service-create.md).
- <!-- markdownlint-disable MD024 --> ### Prerequisites
cognitive-services Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/migration-guide.md
The key phrase extraction feature has not changed in v3 outside of the endpoint
#### REST API
-If your application uses the REST API, update its request endpoint to the v3 endpoint for key phrase extraction. For example: `https://<your-custom-subdomain>.api.cognitiveservices.azure.com/text/analytics/v3.0/keyPhrases`
+If your application uses the REST API, update its request endpoint to the v3 endpoint for key phrase extraction. For example: `https://<your-custom-subdomain>.api.cognitiveservices.azure.com/text/analytics/v3.1/keyPhrases`
See the reference documentation for examples of the JSON response. * [Version 2.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v2-1/operations/56f30ceeeda5650db055a3c6)
cognitive-services Client Libraries Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/quickstarts/client-libraries-rest-api.md
Previously updated : 08/25/2021 Last updated : 09/23/2021 keywords: text mining, sentiment analysis, text analytics
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/whats-new.md
The Text Analytics API is updated on an ongoing basis. To stay up-to-date with r
## September 2021 * Starting with version `3.0.017010001-onprem-amd64`, the Text Analytics for health container can now be called using the client library. See [How to install and run Text Analytics containers](how-tos/text-analytics-how-to-install-containers.md?tabs=healthcare#run-the-container-with-client-library-support) for more information.
-* Quality improvements for the [Extractive Summarization](how-tos/extractive-summarization.md) feature in model-version `2021-08-01`.
## August 2021
communication-services Privacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/privacy.md
Sent and received SMS messages are ephemerally processed by the service and not
### PSTN voice calling
-Audio and video communication is ephemerally processed by the service and no data is retained in your resource other than Azure Monitor logs.
+Audio and video communication is ephemerally processed by the service and no call processing data is retained in your resource other than Azure Monitor logs.
### Internet voice and video calling
-Audio and video communication is ephemerally processed by the service and no data is retained in your resource other than Azure Monitor logs.
+Audio and video communication is ephemerally processed by the service and no call processing data is retained in your resource other than Azure Monitor logs.
### Call Recording
communication-services Certified Session Border Controllers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/certified-session-border-controllers.md
If you have any questions about the SBC certification program for Communication
|: |: |: |
|[AudioCodes](https://www.audiocodes.com/media/lbjfezwn/mediant-sbc-with-microsoft-azure-communication-services.pdf)|Mediant SBC|7.40A|
|Metaswitch|Perimeta SBC|4.9|
-|[Oracle](https://www.oracle.com/technical-resources/documentation/acme-packet.html)|Acme Packet SBC|8.4.0|
+|[Oracle](https://www.oracle.com/technical-resources/documentation/acme-packet.html)|Oracle Acme Packet SBC|8.4|
|Ribbon Communications|SBC SWe|9.02|
||SBC SWe Lite|9.0|
cosmos-db Choose Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/choose-api.md
If you are migrating from other databases such as Oracle, DynamoDB, HBase etc. a
### Capacity planning for migration to API for MongoDB Trying to do capacity planning for a migration to Azure Cosmos DB SQL API from an existing database cluster? You can use information about your existing database cluster for capacity planning.
- * If all you know is the number of vcores and servers in your existing sharded and replicated database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
- * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+* If all you know is the number of vcores and servers in your existing sharded and replicated database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md).
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md).
## API for MongoDB
API for MongoDB is compatible with the 4.0, 3.6, and 3.2 MongoDB server versions
### Capacity planning for migration to API for MongoDB Trying to do capacity planning for a migration to Azure Cosmos DB API for MongoDB from an existing database cluster? You can use information about your existing database cluster for capacity planning.
- * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
- * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](./mongodb/estimate-ru-capacity-planner.md)
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md).
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](./mongodb/estimate-ru-capacity-planner.md).
## Cassandra API
Azure Cosmos DB's Gremlin API is based on the [Apache TinkerPop](https://tinkerp
This API stores data in key/value format. If you are currently using Azure Table storage, you may see some limitations in latency, scaling, throughput, global distribution, and index management, as well as low query performance. Table API overcomes these limitations, and it's recommended to migrate your app if you want the benefits of Azure Cosmos DB. Table API only supports OLTP scenarios.
-Applications written for Azure Table storage can migrate to the Table API with little code changes and take advantage of premium capabilities. To learn more, see [Table API](introduction.md) article.
+Applications written for Azure Table storage can migrate to the Table API with minimal code changes and take advantage of premium capabilities. To learn more, see the [Table API](table/introduction.md) article.
## Next steps
Applications written for Azure Table storage can migrate to the Table API with l
* [Get started with Azure Cosmos DB Table API](create-table-dotnet.md) * Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning. * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
- * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Connect Using Mongoose https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-using-mongoose.md
After you create the database, you'll use the name in the `COSMOSDB_DBNAME` envi
1. To create a Node.js application in the folder of your choice, run the following command in a node command prompt.
- ```npm init```
+ `npm init`
- Answer the questions and your project will be ready to go.
+ Answer the questions and your project will be ready to go.
2. Add a new file to the folder and name it ```index.js```.
3. Install the necessary packages using one of the ```npm install``` options:
   * Mongoose: ```npm install mongoose@5 --save``` > [!Note]
After you create the database, you'll use the name in the `COSMOSDB_DBNAME` envi
>[!Note] > The ```--save``` flag adds the dependency to the package.json file.
-4. Import the dependencies in your index.js file.
+4. Import the dependencies in your `index.js` file.
```JavaScript
var mongoose = require('mongoose');
cosmos-db Feature Support 36 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/feature-support-36.md
In the $regex queries, left-anchored expressions allow index search. However, us
When there's a need to include '$' or '|', it is best to create two (or more) regex queries. For example, given the following original query: ```find({x:{$regex: /^abc$/}})```, it has to be modified as follows:
-```find({x:{$regex: /^abc/, x:{$regex:/^abc$/}})```
+`find({x:{$regex: /^abc/}, x:{$regex:/^abc$/}})`
The first part will use the index to restrict the search to those documents beginning with ^abc and the second part will match the exact entries. The bar operator '|' acts as an "or" function - the query ```find({x:{$regex: /^abc|^def/}})``` matches the documents in which field 'x' has values that begin with "abc" or "def". To utilize the index, it's recommended to break the query into two different queries joined by the $or operator: ```find({$or: [{x: {$regex: /^abc/}}, {x: {$regex: /^def/}}]})```.
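The rewrites described above can be sketched as plain filter documents (PyMongo-style dicts). The helper names are hypothetical and for illustration only:

```python
def split_exact_anchor(field, value):
    """Rewrite an exact-match regex (/^value$/) into the index-friendly
    pair recommended above: a left-anchored prefix filter (which can use
    the index) combined with the exact anchored filter."""
    return {"$and": [
        {field: {"$regex": f"^{value}"}},   # left-anchored: index-eligible
        {field: {"$regex": f"^{value}$"}},  # exact match on the narrowed set
    ]}

def split_alternation(field, prefixes):
    """Rewrite /^a|^b/ into an $or of separate left-anchored queries so
    each branch can use the index."""
    return {"$or": [{field: {"$regex": f"^{p}"}} for p in prefixes]}
```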
cosmos-db Pre Migration Steps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/pre-migration-steps.md
The following Azure Cosmos DB configuration choices cannot be modified or undone
This command will output a JSON document similar to the following:
- ```{ "_t": "GetRequestStatisticsResponse", "ok": 1, "CommandName": "find", "RequestCharge": 10.1, "RequestDurationInMilliSeconds": 7.2}```
+ `{ "_t": "GetRequestStatisticsResponse", "ok": 1, "CommandName": "find", "RequestCharge": 10.1, "RequestDurationInMilliSeconds": 7.2}`
* You can also use [the diagnostic settings](../cosmosdb-monitor-resource-logs.md) to understand the frequency and patterns of the queries executed against Azure Cosmos DB. The results from the diagnostic logs can be sent to a storage account, an EventHub instance or [Azure Log Analytics](../../azure-monitor/logs/log-analytics-tutorial.md).
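For example, you could collect several such JSON documents and total the `RequestCharge` per command to inform throughput sizing. This is an illustrative helper, not part of any driver:

```python
import json
from collections import defaultdict

def tally_request_charges(stat_lines):
    """Aggregate getLastRequestStatistics output lines (one JSON document
    per command, shaped like the sample above) into total request-unit
    charge per command name."""
    totals = defaultdict(float)
    for line in stat_lines:
        doc = json.loads(line)
        totals[doc["CommandName"]] += doc["RequestCharge"]
    return dict(totals)

sample = ['{ "_t": "GetRequestStatisticsResponse", "ok": 1, "CommandName": "find", '
          '"RequestCharge": 10.1, "RequestDurationInMilliSeconds": 7.2}']
charges = tally_request_charges(sample)
```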
cosmos-db Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql/cli-samples.md
Title: Azure CLI Samples for Azure Cosmos DB Core (SQL) API
-description: Azure CLI Samples for Azure Cosmos DB Core (SQL) API
+ Title: Azure CLI Samples for Azure Cosmos DB | Microsoft Docs
+description: This article lists several Azure CLI code samples available for interacting with Azure Cosmos DB. View API-specific CLI samples.
Previously updated : 10/13/2020 Last updated : 09/17/2021 -+
+keywords: cosmos db, azure cli samples, azure cli code samples, azure cli script samples
# Azure CLI samples for Azure Cosmos DB Core (SQL) API [!INCLUDE[appliesto-sql-api](../includes/appliesto-sql-api.md)]
-The following table includes links to sample Azure CLI scripts for Azure Cosmos DB. Use the links on the right to navigate to API specific samples. Common samples are the same across all APIs. Reference pages for all Azure Cosmos DB CLI commands are available in the [Azure CLI Reference](/cli/azure/cosmosdb). Azure Cosmos DB CLI script samples can also be found in the [Azure Cosmos DB CLI GitHub Repository](https://github.com/Azure-Samples/azure-cli-samples/tree/master/cosmosdb).
+The following table includes links to sample Azure CLI scripts for Azure Cosmos DB. Use the links on the right to navigate to API-specific samples. Common samples are the same across all APIs. Reference pages for all Azure Cosmos DB CLI commands are available in the [Azure CLI Reference](/cli/azure/cosmosdb). Azure Cosmos DB CLI script samples can also be found in the [Azure Cosmos DB CLI GitHub Repository](https://github.com/Azure-Samples/azure-cli-samples/tree/master/cosmosdb).
These samples require Azure CLI version 2.12.1 or later. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI](/cli/azure/install-azure-cli).
These samples apply to all Azure Cosmos DB APIs
|Task | Description | |||
-| [Create an Azure Cosmos account, database and container](../scripts/cli/sql/create.md?toc=%2fcli%2fazure%2ftoc.json)| Creates an Azure Cosmos DB account, database, and container for Core (SQL) API. |
-| [Create an Azure Cosmos account, database and container with autoscale](../scripts/cli/sql/autoscale.md?toc=%2fcli%2fazure%2ftoc.json)| Creates an Azure Cosmos DB account, database, and container with autoscale for Core (SQL) API. |
-| [Throughput operations](../scripts/cli/sql/throughput.md?toc=%2fcli%2fazure%2ftoc.json) | Read, update and migrate between autoscale and standard throughput on a database and container.|
+| [Create an Azure Cosmos account, database, and container](../scripts/cli/sql/create.md?toc=%2fcli%2fazure%2ftoc.json)| Creates an Azure Cosmos DB account, database, and container for Core (SQL) API. |
+| [Create an Azure Cosmos account, database, and container with autoscale](../scripts/cli/sql/autoscale.md?toc=%2fcli%2fazure%2ftoc.json)| Creates an Azure Cosmos DB account, database, and container with autoscale for Core (SQL) API. |
+| [Throughput operations](../scripts/cli/sql/throughput.md?toc=%2fcli%2fazure%2ftoc.json) | Read, update, and migrate between autoscale and standard throughput on a database and container.|
| [Lock resources from deletion](../scripts/cli/sql/lock.md?toc=%2fcli%2fazure%2ftoc.json)| Prevent resources from being deleted with resource locks.| |||
cosmos-db Couchbase Cosmos Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql/couchbase-cosmos-migration.md
The following are the code snippets for CRUD operations:
Where *_repo* is the repository object and *doc* is the POJO class's object. You can use `.save` to insert or upsert (if a document with the specified ID is found). The following code snippet shows how to insert or update a doc object:
-```_repo.save(doc);```
+`_repo.save(doc);`
### Delete Operation Consider the following code snippet, where the doc object must have the ID and partition key required to locate and delete the object:
-```_repo.delete(doc);```
+`_repo.delete(doc);`
### Read Operation
cost-management-billing Review Enterprise Agreement Bill https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/understand/review-enterprise-agreement-bill.md
See [Power BI self-service sign up](https://powerbi.microsoft.com/documentation/
### To access Microsoft Azure Consumption Insights:
-1. Go to [Microsoft Azure Consumption Insights](https://app.powerbi.com/getdata/services/azureconsumption?cpcode=MicrosoftAzureConsumptionInsights&amp;getDataForceConnect=true&amp;WT.mc_id=azurebg_email_Trans_33675_1378_Service_Notice_EA_Customer_Power_BI_EA_Content_Pack_Apr26).
+1. Go to Microsoft Azure Consumption Insights.
1. Select **Get It Now**.
1. Provide an enrollment number and the number of months, and then select **Next**.
1. Provide your API access key to connect. You can find the key for your enrollment in the [Enterprise portal](https://ea.azure.com/?WT.mc_id=azurebg_email_Trans_33675_1378_Service_Notice_EA_Customer_Power_BI_EA_Content_Pack_Apr26).
In this tutorial, you learned how to:
Continue to the next article to learn more using the Azure EA portal. > [!div class="nextstepaction"]
-> [Get started with the Azure EA portal](../manage/ea-portal-get-started.md)
+> [Get started with the Azure EA portal](../manage/ea-portal-get-started.md)
data-factory Concepts Data Flow Expression Builder https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-expression-builder.md
If your data flow uses a defined schema in any of its sources, you can reference
When you have column names that include special characters or spaces, surround the name with curly braces to reference them in an expression.
-```{[dbo].this_is my complex name$$$}```
+`{[dbo].this_is my complex name$$$}`
### Parameters
Below is a list of shortcuts available in the expression builder. Most intellis
### Convert to dates or timestamps
-To include string literals in your timestamp output, wrap your conversion in ```toString()```.
+To include string literals in your timestamp output, wrap your conversion in `toString()`.
-```toString(toTimestamp('12/31/2016T00:12:00', 'MM/dd/yyyy\'T\'HH:mm:ss'), 'MM/dd /yyyy\'T\'HH:mm:ss')```
+`toString(toTimestamp('12/31/2016T00:12:00', 'MM/dd/yyyy\'T\'HH:mm:ss'), 'MM/dd/yyyy\'T\'HH:mm:ss')`
To convert milliseconds from epoch to a date or timestamp, use `toTimestamp(<number of milliseconds>)`. If time is coming in seconds, multiply by 1,000.
-```toTimestamp(1574127407*1000l)```
+`toTimestamp(1574127407*1000l)`
The trailing "l" at the end of the previous expression signifies conversion to a long type as inline syntax.
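The seconds-versus-milliseconds point can be checked in Python terms: `1574127407` is an epoch value in seconds, and multiplying by 1,000 gives the millisecond form that `toTimestamp()` expects. A minimal sketch (Python's `fromtimestamp` takes seconds, so the millisecond value is divided back down):

```python
from datetime import datetime, timezone

epoch_seconds = 1574127407          # epoch value in seconds
epoch_millis = epoch_seconds * 1000  # multiply by 1,000 as the text describes

# Python's fromtimestamp() works in seconds, so convert back for the check.
ts = datetime.fromtimestamp(epoch_millis / 1000, tz=timezone.utc)
```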
data-factory Connector Azure Data Lake Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-storage.md
To move source files to another location post-processing, first select "Move" fo
If you have a source path with wildcard, your syntax will look like this below:
-```/data/sales/20??/**/*.csv```
+`/data/sales/20??/**/*.csv`
You can specify "from" as
-```/data/sales```
+`/data/sales`
And "to" as
-```/backup/priorSales```
+`/backup/priorSales`
In this case, all files that were sourced under /data/sales are moved to /backup/priorSales.
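The wildcard match and the "from"/"to" prefix rewrite can be sketched in Python with `fnmatch` (note that `fnmatch`'s `*` crosses directory separators, unlike some glob engines, which is convenient here). The sample paths are hypothetical; the pattern and prefixes are the ones from the text.

```python
import fnmatch

# Hypothetical source paths.
paths = [
    "/data/sales/2017/q1/report.csv",
    "/data/sales/2017/q1/report.txt",
    "/data/inventory/2017/items.csv",
]
matched = [p for p in paths if fnmatch.fnmatch(p, "/data/sales/20??/**/*.csv")]

# "Move" with from=/data/sales and to=/backup/priorSales rewrites the prefix,
# keeping the folder structure underneath it.
moved = [p.replace("/data/sales", "/backup/priorSales", 1) for p in matched]
```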
Examples:
Parameters are also supported through expression builder, for example:
-```mkdir -p {$tempPath}/commands/c1/c2```
-```mv {$tempPath}/commands/*.* {$tempPath}/commands/c1/c2```
-
+`mkdir -p {$tempPath}/commands/c1/c2`
+`mv {$tempPath}/commands/*.* {$tempPath}/commands/c1/c2`
By default, folders are created as user/root. Refer to the top-level container with '/'.
data-factory Connector Azure Data Lake Store https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-store.md
To move source files to another location post-processing, first select "Move" fo
If you have a source path with wildcard, your syntax will look like this below:
-```/data/sales/20??/**/*.csv```
+`/data/sales/20??/**/*.csv`
You can specify "from" as
-```/data/sales```
+`/data/sales`
And "to" as
-```/backup/priorSales```
+`/backup/priorSales`
In this case, all files that were sourced under /data/sales are moved to /backup/priorSales.
data-factory Connector Azure Database For Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-postgresql.md
To copy data from Azure Database for PostgreSQL, set the source type in the copy
|: |: |: |
| type | The type property of the copy activity source must be set to **AzurePostgreSqlSource** | Yes |
| query | Use the custom SQL query to read data. For example: `SELECT * FROM mytable` or `SELECT * FROM "MyTable"`. Note that in PostgreSQL, the entity name is treated as case-insensitive if not quoted. | No (if the tableName property in the dataset is specified) |
+| partitionOptions | Specifies the data partitioning options used to load data from Azure Database for PostgreSQL. <br>Allowed values are: **None** (default), **PhysicalPartitionsOfTable**, and **DynamicRange**.<br>When a partition option is enabled (that is, not `None`), the degree of parallelism to concurrently load data from Azure Database for PostgreSQL is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. | No |
+| partitionSettings | Specify the group of settings for data partitioning. <br>Apply when the partition option isn't `None`. | No |
+| ***Under `partitionSettings`:*** | | |
+| partitionColumnName | Specify the name of the source column **in integer or date/datetime type** (`int`, `smallint`, `bigint`, `date`, `smalldatetime`, `datetime`, `datetime2`, or `datetimeoffset`) that will be used by range partitioning for parallel copy. If not specified, the index or the primary key of the table is autodetected and used as the partition column.<br>Apply when the partition option is `DynamicRange`. If you use a query to retrieve the source data, hook `?AdfDynamicRangePartitionCondition` in the WHERE clause. For an example, see the [Parallel copy from Azure Database for PostgreSQL](#parallel-copy-from-azure-database-for-postgresql) section. | No |
+| partitionUpperBound | The maximum value of the partition column for partition range splitting. This value is used to decide the partition stride, not for filtering the rows in the table. All rows in the table or query result will be partitioned and copied. If not specified, the copy activity auto-detects the value. <br>Apply when the partition option is `DynamicRange`. For an example, see the [Parallel copy from Azure Database for PostgreSQL](#parallel-copy-from-azure-database-for-postgresql) section. | No |
+| partitionLowerBound | The minimum value of the partition column for partition range splitting. This value is used to decide the partition stride, not for filtering the rows in the table. All rows in the table or query result will be partitioned and copied. If not specified, the copy activity auto-detects the value.<br>Apply when the partition option is `DynamicRange`. For an example, see the [Parallel copy from Azure Database for PostgreSQL](#parallel-copy-from-azure-database-for-postgresql) section. | No |
**Example**:
To copy data to Azure Database for PostgreSQL, the following properties are supp
] ```
+## Parallel copy from Azure Database for PostgreSQL
+
+The Azure Database for PostgreSQL connector in copy activity provides built-in data partitioning to copy data in parallel. You can find data partitioning options on the **Source** tab of the copy activity.
+
+![Screenshot of partition options](./media/connector-azure-database-for-postgresql/connector-postgresql-partition-options.png)
+
+When you enable partitioned copy, copy activity runs parallel queries against your Azure Database for PostgreSQL source to load data by partitions. The parallel degree is controlled by the [`parallelCopies`](copy-activity-performance-features.md#parallel-copy) setting on the copy activity. For example, if you set `parallelCopies` to four, the service concurrently generates and runs four queries based on your specified partition option and settings, and each query retrieves a portion of data from your Azure Database for PostgreSQL.
+
+We suggest that you enable parallel copy with data partitioning, especially when you load large amounts of data from your Azure Database for PostgreSQL. The following are suggested configurations for different scenarios. When copying data into a file-based data store, it's recommended to write to a folder as multiple files (specify only the folder name); performance is better than writing to a single file.
+
+| Scenario | Suggested settings |
+| | |
+| Full load from large table, with physical partitions. | **Partition option**: Physical partitions of table. <br><br/>During execution, the service automatically detects the physical partitions, and copies data by partitions. |
+| Full load from large table, without physical partitions, but with an integer or datetime column for data partitioning. | **Partition options**: Dynamic range partition.<br>**Partition column** (optional): Specify the column used to partition data. If not specified, the index or primary key column is used.<br/>**Partition upper bound** and **partition lower bound** (optional): Specify these if you want to determine the partition stride. They aren't for filtering the rows in the table; all rows in the table will be partitioned and copied. If not specified, the copy activity auto-detects the values.<br><br>For example, if your partition column "ID" has values ranging from 1 to 100, and you set the lower bound as 20 and the upper bound as 80, with parallel copy as 4, the service retrieves data by 4 partitions: IDs in range <=20, [21, 50], [51, 80], and >=81, respectively. |
+| Load a large amount of data by using a custom query, without physical partitions, but with an integer or date/datetime column for data partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data.<br>**Partition upper bound** and **partition lower bound** (optional): Specify these if you want to determine the partition stride. They aren't for filtering the rows in the table; all rows in the query result will be partitioned and copied. If not specified, the copy activity auto-detects the value.<br><br>During execution, the service replaces `?AdfDynamicRangePartitionCondition` with the actual column name and value ranges for each partition, and sends it to Azure Database for PostgreSQL. <br>For example, if your partition column "ID" has values ranging from 1 to 100, and you set the lower bound as 20 and the upper bound as 80, with parallel copy as 4, the service retrieves data by 4 partitions: IDs in range <=20, [21, 50], [51, 80], and >=81, respectively. <br><br>Here are more sample queries for different scenarios:<br> 1. Query the whole table: <br>`SELECT * FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition`<br> 2. Query from a table with column selection and additional where-clause filters: <br>`SELECT <column_list> FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`<br> 3. Query with subqueries: <br>`SELECT <column_list> FROM (<your_sub_query>) AS T WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`<br> 4. Query with partition in subquery: <br>`SELECT <column_list> FROM (SELECT <your_sub_query_column_list> FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition) AS T`
+
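The bounds-and-stride arithmetic in the table above can be sketched as follows. This is an illustration of how the example ranges (<=20, [21, 50], [51, 80], >=81) arise from lower bound 20, upper bound 80, and parallel copy 4, not the service's exact algorithm:

```python
def partition_ranges(lower, upper, parallel_copies):
    """Sketch of how dynamic-range bounds split a partition column.

    The first partition takes everything <= the lower bound, the last takes
    everything > the upper bound, and the middle partitions split the
    [lower+1, upper] interval evenly. None stands for an open bound.
    """
    middle = parallel_copies - 2
    stride = (upper - lower) // middle
    ranges = [(None, lower)]          # values <= lower bound
    start = lower + 1
    for i in range(middle):
        end = upper if i == middle - 1 else start + stride - 1
        ranges.append((start, end))
        start = end + 1
    ranges.append((upper + 1, None))  # values > upper bound
    return ranges


ranges = partition_ranges(20, 80, 4)
```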
+Best practices to load data with partition option:
+
+1. Choose a distinctive column as the partition column (like a primary key or unique key) to avoid data skew.
+2. If the table has a built-in partition, use the partition option "Physical partitions of table" to get better performance.
+3. If you use Azure Integration Runtime to copy data, you can set larger "[Data Integration Units (DIU)](copy-activity-performance-features.md#data-integration-units)" (>4) to utilize more computing resources. Check the applicable scenarios there.
+4. "[Degree of copy parallelism](copy-activity-performance-features.md#parallel-copy)" controls the partition numbers; setting this number too large sometimes hurts performance. We recommend setting it to (DIU or number of Self-hosted IR nodes) * (2 to 4).
+
+**Example: full load from large table with physical partitions**
+
+```json
+"source": {
+ "type": "AzurePostgreSqlSource",
+ "partitionOption": "PhysicalPartitionsOfTable"
+}
+```
+
+**Example: query with dynamic range partition**
+
+```json
+"source": {
+ "type": "AzurePostgreSqlSource",
+    "query": "SELECT * FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>",
+ "partitionOption": "DynamicRange",
+ "partitionSettings": {
+ "partitionColumnName": "<partition_column_name>",
+ "partitionUpperBound": "<upper_value_of_partition_column (optional) to decide the partition stride, not as data filter>",
+ "partitionLowerBound": "<lower_value_of_partition_column (optional) to decide the partition stride, not as data filter>"
+ }
+}
+```
+
## Mapping data flow properties
When transforming data in mapping data flow, you can read and write to tables from Azure Database for PostgreSQL. For more information, see the [source transformation](data-flow-source.md) and [sink transformation](data-flow-sink.md) in mapping data flows. You can choose to use an Azure Database for PostgreSQL dataset or an [inline dataset](data-flow-source.md#inline-datasets) as source and sink type.
data-factory Copy Activity Performance Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance-features.md
+ Last updated 09/09/2021+ # Copy activity performance optimization features
The allowed DIUs to empower a copy activity run is **between 2 and 256**. If not
|: |: |- |
| Between file stores |- **Copy from or to single file**: 2-4 <br>- **Copy from and to multiple files**: 2-256 depending on the number and size of the files <br><br>For example, if you copy data from a folder with 4 large files and choose to preserve hierarchy, the max effective DIU is 16; when you choose to merge files, the max effective DIU is 4. |Between 4 and 32 depending on the number and size of the files |
| From file store to non-file store |- **Copy from single file**: 2-4 <br/>- **Copy from multiple files**: 2-256 depending on the number and size of the files <br/><br/>For example, if you copy data from a folder with 4 large files, the max effective DIU is 16. |- **Copy into Azure SQL Database or Azure Cosmos DB**: between 4 and 16 depending on the sink tier (DTUs/RUs) and source file pattern<br>- **Copy into Azure Synapse Analytics** using PolyBase or COPY statement: 2<br>- Other scenario: 4 |
-| From non-file store to file store |- **Copy from partition-option-enabled data stores** (including [Azure SQL Database](connector-azure-sql-database.md#azure-sql-database-as-the-source), [Azure SQL Managed Instance](connector-azure-sql-managed-instance.md#sql-managed-instance-as-a-source), [Azure Synapse Analytics](connector-azure-sql-data-warehouse.md#azure-synapse-analytics-as-the-source), [Oracle](connector-oracle.md#oracle-as-source), [Netezza](connector-netezza.md#netezza-as-source), [SQL Server](connector-sql-server.md#sql-server-as-a-source), and [Teradata](connector-teradata.md#teradata-as-source)): 2-256 when writing to a folder, and 2-4 when writing to one single file. Note per source data partition can use up to 4 DIUs.<br>- **Other scenarios**: 2-4 |- **Copy from REST or HTTP**: 1<br/>- **Copy from Amazon Redshift** using UNLOAD: 2<br>- **Other scenario**: 4 |
-| Between non-file stores |- **Copy from partition-option-enabled data stores** (including [Azure SQL Database](connector-azure-sql-database.md#azure-sql-database-as-the-source), [Azure SQL Managed Instance](connector-azure-sql-managed-instance.md#sql-managed-instance-as-a-source), [Azure Synapse Analytics](connector-azure-sql-data-warehouse.md#azure-synapse-analytics-as-the-source), [Oracle](connector-oracle.md#oracle-as-source), [Netezza](connector-netezza.md#netezza-as-source), [SQL Server](connector-sql-server.md#sql-server-as-a-source), and [Teradata](connector-teradata.md#teradata-as-source)): 2-256 when writing to a folder, and 2-4 when writing to one single file. Note per source data partition can use up to 4 DIUs.<br/>- **Other scenarios**: 2-4 |- **Copy from REST or HTTP**: 1<br>- **Other scenario**: 4 |
+| From non-file store to file store |- **Copy from partition-option-enabled data stores** (including [Azure Database for PostgreSQL](connector-azure-database-for-postgresql.md#azure-database-for-postgresql-as-source), [Azure SQL Database](connector-azure-sql-database.md#azure-sql-database-as-the-source), [Azure SQL Managed Instance](connector-azure-sql-managed-instance.md#sql-managed-instance-as-a-source), [Azure Synapse Analytics](connector-azure-sql-data-warehouse.md#azure-synapse-analytics-as-the-source), [Oracle](connector-oracle.md#oracle-as-source), [Netezza](connector-netezza.md#netezza-as-source), [SQL Server](connector-sql-server.md#sql-server-as-a-source), and [Teradata](connector-teradata.md#teradata-as-source)): 2-256 when writing to a folder, and 2-4 when writing to one single file. Note that each source data partition can use up to 4 DIUs.<br>- **Other scenarios**: 2-4 |- **Copy from REST or HTTP**: 1<br/>- **Copy from Amazon Redshift** using UNLOAD: 2<br>- **Other scenario**: 4 |
+| Between non-file stores |- **Copy from partition-option-enabled data stores** (including [Azure Database for PostgreSQL](connector-azure-database-for-postgresql.md#azure-database-for-postgresql-as-source), [Azure SQL Database](connector-azure-sql-database.md#azure-sql-database-as-the-source), [Azure SQL Managed Instance](connector-azure-sql-managed-instance.md#sql-managed-instance-as-a-source), [Azure Synapse Analytics](connector-azure-sql-data-warehouse.md#azure-synapse-analytics-as-the-source), [Oracle](connector-oracle.md#oracle-as-source), [Netezza](connector-netezza.md#netezza-as-source), [SQL Server](connector-sql-server.md#sql-server-as-a-source), and [Teradata](connector-teradata.md#teradata-as-source)): 2-256 when writing to a folder, and 2-4 when writing to one single file. Note that each source data partition can use up to 4 DIUs.<br/>- **Other scenarios**: 2-4 |- **Copy from REST or HTTP**: 1<br>- **Other scenario**: 4 |
You can see the DIUs used for each copy run in the copy activity monitoring view or activity output. For more information, see [Copy activity monitoring](copy-activity-monitoring.md). To override this default, specify a value for the `dataIntegrationUnits` property as follows. The *actual number of DIUs* that the copy operation uses at run time is equal to or less than the configured value, depending on your data pattern.
data-factory Data Flow Parse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-parse.md
Here is where you will configure the target output schema from the parsing that
In this example, we have defined parsing of the incoming field "jsonString" which is plain text, but formatted as a JSON structure. We're going to store the parsed results as JSON in a new column called "json" with this schema:
-```(trade as boolean, customers as string[])```
+`(trade as boolean, customers as string[])`
Refer to the inspect tab and data preview to verify your output is mapped properly.
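In Python terms, parsing such a JSON-formatted string column is analogous to `json.loads`; the sample value below is hypothetical but matches the `(trade as boolean, customers as string[])` schema:

```python
import json

# A hypothetical "jsonString" column value: plain text formatted as JSON.
json_string = '{"trade": true, "customers": ["Ann", "Bo"]}'

# Parsing yields the typed structure the schema describes.
parsed = json.loads(json_string)
```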
data-factory Data Flow Script https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-script.md
Previously updated : 02/15/2021 Last updated : 09/22/2021 # Data flow script (DFS)
You can use this script to identify key columns and view the cardinality of all
aggregate(each(match(true()), $$ = countDistinct($$))) ~> KeyPattern ```
+### Compare previous or next row values
+This sample snippet demonstrates how the Window transformation can be used to compare column values from the current row with column values from the rows before and after it. In this example, a Derived Column transformation generates a dummy value to enable a window partition across the entire data set, and a Surrogate Key transformation assigns a unique key value to each row. When you apply this pattern to your own data transformations, you can remove the surrogate key if you already have a column to order by, and you can remove the derived column if you have columns to partition your data by.
+
+```
+source1 keyGenerate(output(sk as long),
+ startAt: 1L) ~> SurrogateKey1
+SurrogateKey1 derive(dummy = 1) ~> DerivedColumn1
+DerivedColumn1 window(over(dummy),
+ asc(sk, true),
+ prevAndCurr = lag(title,1)+'-'+last(title),
+ nextAndCurr = lead(title,1)+'-'+last(title)) ~> leadAndLag
+```
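The lag/lead pairing that the script above produces can be mimicked in Python over an ordered list of titles. This is a sketch of the semantics only, assuming rows are already ordered by the surrogate key; the first and last rows see a missing neighbor:

```python
def lead_and_lag(titles):
    # For each row, pair the previous and next row's title with the current
    # one, mirroring lag(title,1) and lead(title,1) in the Window
    # transformation above. Missing neighbors show up as None.
    rows = []
    for i, current in enumerate(titles):
        prev = titles[i - 1] if i > 0 else None
        nxt = titles[i + 1] if i < len(titles) - 1 else None
        rows.append({"prevAndCurr": f"{prev}-{current}",
                     "nextAndCurr": f"{nxt}-{current}"})
    return rows


rows = lead_and_lag(["a", "b", "c"])
```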
+
## Next steps
Explore Data Flows by starting with the [data flows overview article](concepts-data-flow-overview.md)
data-factory How To Fixed Width https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-fixed-width.md
By using mapping data flows in Microsoft Azure Data Factory, you can transform d
10. In the expression builder, type the following:
- ```substring(Column_1,1,4)```
+ `substring(Column_1,1,4)`
:::image type="content" source="media/data-flow/fwderivedcol1.png" alt-text="derived column":::
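For reference, `substring(Column_1,1,4)` takes four characters starting at the 1-based position 1; the equivalent Python slice against a hypothetical fixed-width line starts at index 0:

```python
line = "1234ABCDEF"  # hypothetical fixed-width record

# substring(Column_1, 1, 4) is 1-based with a length of 4;
# the Python slice equivalent is [0:4].
first_field = line[0:4]
```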
data-factory How To Sqldb To Cosmosdb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-sqldb-to-cosmosdb.md
The resulting CosmosDB container will embed the inner query into a single docume
16. The aggregate transformation will only output columns that are part of aggregate or group by formulas. So, we need to include the columns from the sales header as well. To do that, add a column pattern in that same aggregate transformation. This pattern will include all other columns in the output:
-```instr(name,'OrderQty')==0&&instr(name,'UnitPrice')==0&&instr(name,'SalesOrderID')==0```
+ `instr(name,'OrderQty')==0&&instr(name,'UnitPrice')==0&&instr(name,'SalesOrderID')==0`
17. Use the "this" syntax in the other properties so that we maintain the same column names and use the `first()` function as an aggregate:
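Since `instr(name, s) == 0` in the expression language means the column name does not contain `s`, the column pattern keeps every column outside the aggregate formulas. A Python sketch over a hypothetical column list:

```python
columns = ["SalesOrderID", "OrderQty", "UnitPrice", "CustomerID", "OrderDate"]

# Keep columns whose names contain none of the aggregate/group-by names,
# mirroring instr(name,'...')==0 for each excluded substring.
passthrough = [c for c in columns
               if "OrderQty" not in c
               and "UnitPrice" not in c
               and "SalesOrderID" not in c]
```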
data-factory Source Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/source-control.md
Previously updated : 09/08/2021 Last updated : 09/22/2021 # Source control in Azure Data Factory
Version control systems (also known as _source control_) let developers collabor
### Creating feature branches
-Each Azure Repos Git repository that's associated with a data factory has a collaboration branch. (`main`) is the default collaboration branch). Users can also create feature branches by clicking **+ New Branch** in the branch dropdown. Once the new branch pane appears, enter the name of your feature branch.
+Each Azure Repos Git repository that's associated with a data factory has a collaboration branch. (`main` is the default collaboration branch). Users can also create feature branches by clicking **+ New Branch** in the branch dropdown.
:::image type="content" source="media/author-visually/new-branch.png" alt-text="Create a new branch":::
+Once the new branch pane appears, enter the name of your feature branch and select a branch to base the work off of.
++ When you are ready to merge the changes from your feature branch to your collaboration branch, click on the branch dropdown and select **Create pull request**. This action takes you to Azure Repos Git where you can raise pull requests, do code reviews, and merge changes to your collaboration branch. (`main` is the default). You are only allowed to publish to the Data Factory service from your collaboration branch. :::image type="content" source="media/author-visually/create-pull-request.png" alt-text="Create a new pull request":::
data-factory Tutorial Data Flow Private https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-data-flow-private.md
If you didn't use the hyperlink when you tested the preceding connection, follow
:::image type="content" source="media/tutorial-data-flow-private/filter-years.png" alt-text="Screenshot that shows FilterYears."::: 1. The data flow expression builder lets you interactively build expressions to use in various transformations. Expressions can include built-in functions, columns from the input schema, and user-defined parameters. For more information on how to build expressions, see [Data flow expression builder](./concepts-data-flow-expression-builder.md).
- * In this tutorial, you want to filter movies in the comedy genre that came out between the years 1910 and 2000. Because the year is currently a string, you need to convert it to an integer by using the ```toInteger()``` function. Use the greater than or equal to (>=) and less than or equal to (<=) operators to compare against the literal year values 1910 and 2000. Union these expressions together with the and (&&) operator. The expression comes out as:
+ * In this tutorial, you want to filter movies in the comedy genre that came out between the years 1910 and 2000. Because the year is currently a string, you need to convert it to an integer by using the `toInteger()` function. Use the greater than or equal to (>=) and less than or equal to (<=) operators to compare against the literal year values 1910 and 2000. Union these expressions together with the and (&&) operator. The expression comes out as:
- ```toInteger(year) >= 1910 && toInteger(year) <= 2000```
+ `toInteger(year) >= 1910 && toInteger(year) <= 2000`
- * To find which movies are comedies, you can use the ```rlike()``` function to find the pattern 'Comedy' in the column genres. Union the rlike expression with the year comparison to get:
+ * To find which movies are comedies, you can use the `rlike()` function to find the pattern 'Comedy' in the column genres. Union the `rlike` expression with the year comparison to get:
- ```toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')```
+ `toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')`
* If you have a debug cluster active, you can verify your logic by selecting **Refresh** to see the expression output compared to the inputs used. There's more than one right answer on how you can accomplish this logic by using the data flow expression language.
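The same filter logic can be sketched in Python, with `int()` standing in for `toInteger()` and `re.search` for `rlike()`; the sample rows are hypothetical:

```python
import re

movies = [
    {"year": "1995", "genres": "Comedy|Romance"},
    {"year": "2005", "genres": "Comedy"},
    {"year": "1950", "genres": "Drama"},
]

# Mirrors: toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')
kept = [m for m in movies
        if 1910 <= int(m["year"]) <= 2000 and re.search("Comedy", m["genres"])]
```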
If you didn't use the hyperlink when you tested the preceding connection, follow
:::image type="content" source="media/tutorial-data-flow-private/name-column.png" alt-text="Screenshot that shows the aggregate column name."::: 1. To get the average of column **Rating**, use the `avg()` aggregate function. Because **Rating** is a string and `avg()` takes in a numerical input, we must convert the value to a number via the `toInteger()` function. This expression looks like:
- ```avg(toInteger(Rating))```
+ `avg(toInteger(Rating))`
1. Select **Save and finish** after you're finished.
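The `avg(toInteger(Rating))` aggregation is the plain mean of the integer-converted rating strings; a Python sketch with hypothetical values:

```python
ratings = ["3", "4", "5"]  # hypothetical string Rating values

# Convert each string to an integer, then take the mean,
# mirroring avg(toInteger(Rating)).
average_rating = sum(int(r) for r in ratings) / len(ratings)
```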
If you followed this tutorial correctly, you should have written 83 rows and 2 c
## Summary
-In this tutorial, you used the Data Factory UI to create a pipeline that copies and transforms data from a Data Lake Storage Gen2 source to a Data Lake Storage Gen2 sink (both allowing access to only selected networks) by using mapping data flow in [Data Factory Managed Virtual Network](managed-virtual-network-private-endpoint.md).
+In this tutorial, you used the Data Factory UI to create a pipeline that copies and transforms data from a Data Lake Storage Gen2 source to a Data Lake Storage Gen2 sink (both allowing access to only selected networks) by using mapping data flow in [Data Factory Managed Virtual Network](managed-virtual-network-private-endpoint.md).
data-factory Tutorial Data Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-data-flow.md
Once you create your Data Flow, you'll be automatically sent to the data flow ca
:::image type="content" source="media/tutorial-data-flow/filter1.png" alt-text="Screenshot that shows the Filter on expression box."::: 1. The data flow expression builder lets you interactively build expressions to use in various transformations. Expressions can include built-in functions, columns from the input schema, and user-defined parameters. For more information on how to build expressions, see [Data Flow expression builder](concepts-data-flow-expression-builder.md).
- In this tutorial, you want to filter movies of genre comedy that came out between the years 1910 and 2000. As year is currently a string, you need to convert it to an integer using the ```toInteger()``` function. Use the greater than or equals to (>=) and less than or equals to (<=) operators to compare against literal year values 1910 and 2000. Union these expressions together with the and (&&) operator. The expression comes out as:
+ In this tutorial, you want to filter movies of genre comedy that came out between the years 1910 and 2000. As year is currently a string, you need to convert it to an integer using the `toInteger()` function. Use the greater than or equals to (>=) and less than or equals to (<=) operators to compare against literal year values 1910 and 2000. Union these expressions together with the and (&&) operator. The expression comes out as:
- ```toInteger(year) >= 1910 && toInteger(year) <= 2000```
+ `toInteger(year) >= 1910 && toInteger(year) <= 2000`
- To find which movies are comedies, you can use the ```rlike()``` function to find pattern 'Comedy' in the column genres. Union the rlike expression with the year comparison to get:
+ To find which movies are comedies, you can use the `rlike()` function to find pattern 'Comedy' in the column genres. Union the `rlike` expression with the year comparison to get:
- ```toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')```
+ `toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')`
If you have a debug cluster active, you can verify your logic by clicking **Refresh** to see expression output compared to the inputs used. There's more than one right answer on how you can accomplish this logic using the data flow expression language.
Once you create your Data Flow, you'll be automatically sent to the data flow ca
:::image type="content" source="media/tutorial-data-flow/agg3.png" alt-text="Screenshot that shows the year option in the Aggregates tab under Aggregate Settings."::: 1. To get the average of column **Rating**, use the `avg()` aggregate function. As **Rating** is a string and `avg()` takes in a numerical input, we must convert the value to a number via the `toInteger()` function. The expression looks like:
- ```avg(toInteger(Rating))```
+ `avg(toInteger(Rating))`
Click **Save and Finish** when done.
data-lake-store Data Lake Store Access Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-access-control.md
Title: Overview of access control in Data Lake Storage Gen1 | Microsoft Docs description: Learn about the basics of the access control model of Azure Data Lake Storage Gen1, which derives from HDFS.--- + Last updated 03/26/2018-+ # Access control in Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Archive Eventhub Capture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-archive-eventhub-capture.md
Title: Capture data from Event Hubs to Azure Data Lake Storage Gen1 description: Learn how to use Azure Data Lake Storage Gen1 to capture data received by Azure Event Hubs. Begin by verifying the prerequisites. -+ Last updated 05/29/2018-+ # Use Azure Data Lake Storage Gen1 to capture data from Event Hubs
data-lake-store Data Lake Store Comparison With Blob Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-comparison-with-blob-storage.md
Title: Comparison of Azure Data Lake Storage Gen1 with Blob storage description: Learn about the differences between Azure Data Lake Storage Gen1 and Azure Blob Storage regarding some key aspects of big data processing. -+ Last updated 03/26/2018-+ # Comparing Azure Data Lake Storage Gen1 and Azure Blob Storage
data-lake-store Data Lake Store Compatible Oss Other Applications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-compatible-oss-other-applications.md
Title: Big data applications compatible with Data Lake Storage Gen1 | Microsoft Docs description: List of open source applications that work with Azure Data Lake Storage Gen1 (previously known as Azure Data Lake Store)--- + Last updated 06/27/2018-+ # Open Source Big Data applications that work with Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Copy Data Azure Storage Blob https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-copy-data-azure-storage-blob.md
Title: Copy data from Azure Storage blobs to Data Lake Storage Gen1 description: Use AdlCopy tool to copy data from Azure Storage Blobs to Azure Data Lake Storage Gen1 -+ Last updated 05/29/2018-+ # Copy data from Azure Storage Blobs to Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Copy Data Wasb Distcp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-copy-data-wasb-distcp.md
Title: Copy data to and from WASB into Azure Data Lake Storage Gen1 using DistCp description: Use the DistCp tool to copy data to and from Azure Storage blobs to Azure Data Lake Storage Gen1 -+ Last updated 01/03/2020-+
data-lake-store Data Lake Store Data Operations Net Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-data-operations-net-sdk.md
Title: .NET SDK - Filesystem operations on Data Lake Storage Gen1 - Azure description: Use the Azure Data Lake Storage Gen1 .NET SDK for filesystem operations on Data Lake Storage Gen1 such as create folders, etc. -+ Last updated 01/03/2020-+
data-lake-store Data Lake Store Data Operations Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-data-operations-python.md
Title: 'Python: Filesystem operations on Azure Data Lake Storage Gen1 | Microsoft Docs' description: Learn how to use Python SDK to work with the Data Lake Storage Gen1 file system.-- + Last updated 05/29/2018-+
data-lake-store Data Lake Store Data Operations Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-data-operations-rest-api.md
Title: 'REST API: Filesystem operations on Azure Data Lake Storage Gen1 | Microsoft Docs' description: Use WebHDFS REST APIs to perform filesystem operations on Azure Data Lake Storage Gen1--- + Last updated 05/29/2018-+ # Filesystem operations on Azure Data Lake Storage Gen1 using REST API
data-lake-store Data Lake Store Data Scenarios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-data-scenarios.md
Title: Data scenarios involving Data Lake Storage Gen1 | Microsoft Docs description: Understand the different scenarios and tools using which data can ingested, processed, downloaded, and visualized in Data Lake Storage Gen1 (previously known as Azure Data Lake Store)--- + Last updated 06/27/2018-+ # Using Azure Data Lake Storage Gen1 for big data requirements
data-lake-store Data Lake Store Data Transfer Sql Sqoop https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-data-transfer-sql-sqoop.md
Title: Copy data between Data Lake Storage Gen1 and Azure SQL - Sqoop | Microsoft Docs description: Use Sqoop to copy data between Azure SQL Database and Azure Data Lake Storage Gen1-- + Last updated 07/30/2019-+ # Copy data between Data Lake Storage Gen1 and Azure SQL Database using Sqoop
data-lake-store Data Lake Store Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-diagnostic-logs.md
Title: Viewing diagnostic logs for Azure Data Lake Storage Gen1 | Microsoft Docs description: 'Understand how to setup and access diagnostic logs for Azure Data Lake Storage Gen1 '--- + Last updated 03/26/2018-+ # Accessing diagnostic logs for Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Disaster Recovery Guidance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-disaster-recovery-guidance.md
Title: Disaster recovery guidance for Azure Data Lake Storage Gen1 | Microsoft Docs description: Learn how to further protect your data from region-wide outages or accidental deletions beyond the locally redundant storage of Azure Data Lake Storage Gen1. -+ Last updated 02/21/2018-+ # High availability and disaster recovery guidance for Data Lake Storage Gen1
data-lake-store Data Lake Store Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-encryption.md
Title: Encryption in Azure Data Lake Storage Gen1 | Microsoft Docs description: Encryption in Azure Data Lake Storage Gen1 helps you protect your data, implement enterprise security policies, and meet regulatory compliance requirements. This article provides an overview of the design, and discusses some of the technical aspects of implementation.-- + Last updated 03/26/2018-+ # Encryption of data in Azure Data Lake Storage Gen1
data-lake-store Data Lake Store End User Authenticate Java Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-end-user-authenticate-java-sdk.md
Title: End-user authentication - Java with Data Lake Storage Gen1 - Azure description: Learn how to achieve end-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory with Java -+ Last updated 05/29/2018 -+ # End-user authentication with Azure Data Lake Storage Gen1 using Java
data-lake-store Data Lake Store End User Authenticate Net Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-end-user-authenticate-net-sdk.md
Title: End-user authentication - .NET with Data Lake Storage Gen1 - Azure description: Learn how to achieve end-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory with .NET SDK -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store End User Authenticate Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-end-user-authenticate-python.md
Title: End-user authentication - Python with Data Lake Storage Gen1 - Azure description: Learn how to achieve end-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory with Python -+ Last updated 05/29/2018-+ # End-user authentication with Azure Data Lake Storage Gen1 using Python
data-lake-store Data Lake Store End User Authenticate Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-end-user-authenticate-rest-api.md
Title: End-user authentication - REST with Data Lake Storage Gen1 - Azure description: Learn how to achieve end-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory using REST API -+ Last updated 05/29/2018-+ # End-user authentication with Azure Data Lake Storage Gen1 using REST API
data-lake-store Data Lake Store End User Authenticate Using Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-end-user-authenticate-using-active-directory.md
Title: End-user authentication - Data Lake Storage Gen1 with Azure AD description: Learn how to achieve end-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory -+ Last updated 05/29/2018-+ # End-user authentication with Azure Data Lake Storage Gen1 using Azure Active Directory
data-lake-store Data Lake Store Get Started Cli 2.0 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-cli-2.0.md
Title: Manage Azure Data Lake Storage Gen1 account - Azure CLI description: Use the Azure CLI to create a Data Lake Storage Gen1 account and perform basic operations. -+ Last updated 06/27/2018-+ # Get started with Azure Data Lake Storage Gen1 using the Azure CLI
data-lake-store Data Lake Store Get Started Java Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-java-sdk.md
Title: Java SDK - Filesystem operations on Data Lake Storage Gen1 - Azure description: Use the Java SDK for Azure Data Lake Storage Gen1 to perform filesystem operations on Data Lake Storage Gen1 such as creating folders, and uploading and downloading data files. -+ Last updated 05/29/2018 -+ # Filesystem operations on Azure Data Lake Storage Gen1 using Java SDK
data-lake-store Data Lake Store Get Started Net Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-net-sdk.md
Title: Manage an Azure Data Lake Storage Gen1 account with .NET description: Learn how to use the .NET SDK for Azure Data Lake Storage Gen1 account management operations. -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Get Started Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-portal.md
Title: Get started with Azure Data Lake Storage Gen1 - portal description: Use the Azure portal to create a Data Lake Storage Gen1 account and perform basic operations in the account. -+ Last updated 06/27/2018-+ # Get started with Azure Data Lake Storage Gen1 using the Azure portal
data-lake-store Data Lake Store Get Started Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-powershell.md
Title: Get started with Azure Data Lake Storage Gen1 - PowerShell | Microsoft Docs description: Use Azure PowerShell to create an Azure Data Lake Storage Gen1 account and perform basic operations. -+ Last updated 06/27/2018-+
data-lake-store Data Lake Store Get Started Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-python.md
Title: Manage an Azure Data Lake Storage Gen1 account with Python description: Learn how to use the Python SDK for Azure Data Lake Storage Gen1 account management operations. -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Get Started Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-get-started-rest-api.md
Title: Manage an Azure Data Lake Storage Gen1 account with REST description: Use the WebHDFS REST API to perform account management operations on an Azure Data Lake Storage Gen1 account. -+ Last updated 05/29/2018-+ # Account management operations on Azure Data Lake Storage Gen1 using REST API
data-lake-store Data Lake Store Hdinsight Hadoop Use Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-hdinsight-hadoop-use-portal.md
Title: Create Azure HDInsight clusters with Data Lake Storage Gen1 - portal description: Use the Azure portal to create and use HDInsight clusters with Azure Data Lake Storage Gen1 -+ Last updated 05/29/2018-+ # Create HDInsight clusters with Azure Data Lake Storage Gen1 by using the Azure portal
data-lake-store Data Lake Store Hdinsight Hadoop Use Powershell For Default Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-hdinsight-hadoop-use-powershell-for-default-storage.md
Title: PowerShell - HDInsight cluster with Data Lake Storage Gen1 - Azure description: Use Azure PowerShell to create and use Azure HDInsight clusters with Azure Data Lake Storage Gen1. -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Hdinsight Hadoop Use Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-hdinsight-hadoop-use-powershell.md
Title: PowerShell - HDInsight with Data Lake Storage Gen1 - add-on storage - Azure description: Learn how to use Azure PowerShell to configure an HDInsight cluster with Azure Data Lake Storage Gen1 as additional storage. -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Hdinsight Hadoop Use Resource Manager Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-hdinsight-hadoop-use-resource-manager-template.md
Title: Template - HDInsight cluster with Data Lake Storage Gen1 description: Use Azure Resource Manager templates to create and use Azure HDInsight clusters with Azure Data Lake Storage Gen1. -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Integrate With Other Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-integrate-with-other-services.md
Title: Integrate Data Lake Storage Gen1 with other Azure services description: Understand how you can integrate Azure Data Lake Storage Gen1 with other Azure services. -+ Last updated 05/29/2018-+ # Integrating Azure Data Lake Storage Gen1 with other Azure services
data-lake-store Data Lake Store Migration Cross Region https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-migration-cross-region.md
Title: Azure Data Lake Storage Gen1 cross-region migration | Microsoft Docs description: Learn what to consider as you plan and complete a migration to Azure Data Lake Storage Gen1 as it becomes available in new regions.--- + -+ Last updated 01/27/2017-+ # Migrate Azure Data Lake Storage Gen1 across regions
data-lake-store Data Lake Store Network Security https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-network-security.md
Title: Network security in Azure Data Lake Storage Gen1 | Microsoft Docs description: Understand how virtual network integration works in Azure Data Lake Storage Gen1--- + - Last updated 10/09/2018-+ # Virtual network integration for Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Offline Bulk Data Upload https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-offline-bulk-data-upload.md
Title: Upload large data set to Azure Data Lake Storage Gen1 - offline methods description: Use the Import/Export service to copy data from Azure Blob storage to Azure Data Lake Storage Gen1 -+ Last updated 05/29/2018-+
data-lake-store Data Lake Store Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-overview.md
Title: What is Azure Data Lake Storage Gen1? | Microsoft Docs description: Overview of Data Lake Storage Gen1 (previously known as Azure Data Lake Store), and the value it provides over other data stores ---+ Last updated 04/17/2019-+
data-lake-store Data Lake Store Performance Tuning Guidance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-guidance.md
Title: Azure Data Lake Storage Gen1 - performance tuning description: Learn how using all available throughput in Azure Data Lake Storage Gen1 is important to get the best performance by performing as many reads and writes in parallel as possible. -+ Last updated 06/30/2017-+ # Tune Azure Data Lake Storage Gen1 for performance
data-lake-store Data Lake Store Performance Tuning Hive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-hive.md
Title: Performance tuning - Hive on Azure Data Lake Storage Gen1 description: Learn about performance tuning for Hive on HdInsight and Azure Data Lake Storage Gen1. For I/O intensive queries, tune Hive to get better performance. -+ Last updated 12/19/2016-+ # Performance tuning guidance for Hive on HDInsight and Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Performance Tuning Mapreduce https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-mapreduce.md
Title: Azure Data Lake Storage Gen1 performance tuning - MapReduce description: Learn about performance tuning for MapReduce in Azure Data Lake Storage Gen1, including parameters, guidance, an example calculation, and limitations. -+ Last updated 12/19/2016-+ # Performance tuning guidance for MapReduce on HDInsight and Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Performance Tuning Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-powershell.md
Title: Azure Data Lake Storage Gen1 performance tuning - PowerShell description: Tips on how to improve performance when using Azure PowerShell with Azure Data Lake Storage Gen1. -+ Last updated 01/09/2018-+ # Performance tuning guidance for using PowerShell with Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Performance Tuning Spark https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-spark.md
Title: Performance tuning - Spark with Azure Data Lake Storage Gen1 description: Learn about performance tuning guidelines for Spark on Azure HDInsight and Azure Data Lake Storage Gen1. -+ Last updated 12/19/2016-+ # Performance tuning guidance for Spark on HDInsight and Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Performance Tuning Storm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-performance-tuning-storm.md
Title: Performance tuning - Storm with Azure Data Lake Storage Gen1 description: Understand the factors that should be considered when you tune the performance of an Azure Storm topology, including troubleshooting common issues. -+ Last updated 12/19/2016-+ # Performance tuning guidance for Storm on HDInsight and Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Power Bi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-power-bi.md
Title: Analyze data in Azure Data Lake Storage Gen1 - Power BI description: Learn how to use Power BI Desktop to analyze and visualize data stored in Azure Data Lake Storage Gen1. -+ Last updated 05/29/2018-+ # Analyze data in Azure Data Lake Storage Gen1 by using Power BI
data-lake-store Data Lake Store Secure Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-secure-data.md
Title: Securing data stored in Azure Data Lake Storage Gen1 | Microsoft Docs description: Learn how to secure data in Azure Data Lake Storage Gen1 using groups and access control lists--- + Last updated 03/26/2018-+ # Securing data stored in Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Security Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-security-overview.md
Title: Overview of security in Azure Data Lake Storage Gen1 | Microsoft Docs description: Learn about security capabilities of Azure Data Lake Storage Gen1, including authentication, authorization, network isolation, data protection, and auditing.-- + Last updated 03/11/2020-+ # Security in Azure Data Lake Storage Gen1
data-lake-store Data Lake Store Service To Service Authenticate Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-service-to-service-authenticate-java.md
Title: Service-to-service authentication - Data Lake Storage Gen1 – Java SDK description: Learn how to achieve service-to-service authentication with Azure Data Lake Storage Gen1 using Azure Active Directory with Java -+ Last updated 05/29/2018 -+ # Service-to-service authentication with Azure Data Lake Storage Gen1 using Java
data-lake-store Data Lake Store Service To Service Authenticate Net Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-service-to-service-authenticate-net-sdk.md
Title: .NET - Service-to-service authentication - Data Lake Storage Gen1 description: Learn how to achieve service-to-service authentication with Azure Data Lake Storage Gen1 using Azure Active Directory using .NET SDK -+ Last updated 05/29/2018-+ # Service-to-service authentication with Azure Data Lake Storage Gen1 using .NET SDK
data-lake-store Data Lake Store Service To Service Authenticate Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-service-to-service-authenticate-python.md
Title: Python - Service-to-service authentication - Data Lake Storage Gen1 description: Learn how to achieve service-to-service authentication with Azure Data Lake Storage Gen1 using Azure Active Directory using Python -+ Last updated 05/29/2018-+ # Service-to-service authentication with Azure Data Lake Storage Gen1 using Python
data-lake-store Data Lake Store Service To Service Authenticate Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-service-to-service-authenticate-rest-api.md
Title: REST - Service-to-service authentication - Data Lake Storage Gen1 - Azure description: Learn how to achieve service-to-service authentication with Azure Data Lake Storage Gen1 and Azure Active Directory using the REST API. -+ Last updated 05/29/2018-+ # Service-to-service authentication with Azure Data Lake Storage Gen1 using REST API
data-lake-store Data Lake Store Service To Service Authenticate Using Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-service-to-service-authenticate-using-active-directory.md
Title: Service-to-service authentication - Data Lake Storage Gen1 - Azure description: Learn how to achieve service-to-service authentication with Azure Data Lake Storage Gen1 using Azure Active Directory. -+ Last updated 05/29/2018-+ # Service-to-service authentication with Azure Data Lake Storage Gen1 using Azure Active Directory
data-lake-store Data Lake Store Stream Analytics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-stream-analytics.md
Title: Stream data from Stream Analytics to Data Lake Storage Gen1 - Azure description: Learn how to use Azure Data Lake Storage Gen1 as an output for an Azure Stream Analytics job, with a simple scenario that reads data from an Azure Storage blob. -+ Last updated 05/30/2018-+ # Stream data from Azure Storage Blob into Azure Data Lake Storage Gen1 using Azure Stream Analytics
data-lake-store Data Lake Store With Data Catalog https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lake-store-with-data-catalog.md
Title: Integrate Data Lake Storage Gen1 with Azure Data Catalog description: Learn how to register data from Azure Data Lake Storage Gen1 in Azure Data Catalog to make data discoverable in your organization. -+ Last updated 05/29/2018-+ # Register data from Azure Data Lake Storage Gen1 in Azure Data Catalog
data-lake-store Data Lakes Store Authentication Using Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/data-lakes-store-authentication-using-azure-active-directory.md
Title: Authentication - Data Lake Storage Gen1 with Azure AD description: Learn how to authenticate with Azure Data Lake Storage Gen1 using Azure Active Directory. -+ Last updated 05/29/2018-+ # Authentication with Azure Data Lake Storage Gen1 using Azure Active Directory
data-lake-store Policy Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-store/policy-reference.md
Title: Built-in policy definitions for Azure Data Lake Storage Gen1
description: Lists Azure Policy built-in policy definitions for Azure Data Lake Storage Gen1. These built-in policy definitions provide common approaches to managing your Azure resources. Last updated 09/17/2021 --++
databox-online Azure Stack Edge Gpu Deploy Stateless Application Git Ops Guestbook https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-stateless-application-git-ops-guestbook.md
The deployment is done using GitOps on the Arc enabled Kubernetes cluster on you
This procedure is intended for people who have reviewed the [Kubernetes workloads on Azure Stack Edge Pro device](azure-stack-edge-gpu-kubernetes-workload-management.md) and are familiar with the concepts of [What is Azure Arc enabled Kubernetes (Preview)](../azure-arc/kubernetes/overview.md). > [!NOTE]
-> This article contains references to the term slave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
## Prerequisites Before you can deploy the stateless application, make sure that you have completed the following prerequisites on your device and the client that you will use to access the device:
-> [!NOTE]
-> This article contains references to the term slave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
- ### For device 1. You have sign-in credentials to a 1-node Azure Stack Edge Pro device.
defender-for-iot How To Deploy Edge https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/how-to-deploy-edge.md
Title: Deploy IoT Edge security module description: Learn about how to deploy a Defender for IoT security agent on IoT Edge. Previously updated : 05/26/2021 Last updated : 09/23/2021 # Deploy a security module on your IoT Edge device
Complete each step to complete your IoT Edge deployment for Defender for IoT.
#### Step 1: Modules 1. Select the **AzureSecurityCenterforIoT** module.+ 1. On the **Module Settings** tab, change the **name** to **azureiotsecurity**.+ 1. On the **Environment Variables** tab, add a variable if needed (for example, you can add *debug level* and set it to one of the following values: "Fatal", "Error", "Warning", or "Information").+ 1. On the **Container Create Options** tab, add the following configuration: ``` json
Complete each step to complete your IoT Edge deployment for Defender for IoT.
1. On the **Module Twin Settings** tab, add the following configuration: Module Twin Property:
-
+ ``` json "ms_iotn:urn_azureiot_Security_SecurityAgentConfiguration" ```
- Module Twin Property Content:
+ Module Twin Property Content:
```json { } ```
-
+ For more information about configuring the agent, see [Configure security agents](./how-to-agent-configuration.md). 1. Select **Update**.
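For orientation, the twin property above would typically sit under the module twin's desired properties. The sketch below is illustrative only: the `properties.desired` nesting follows the standard IoT Hub module-twin layout (an assumption here, not shown in this article), and the empty object matches the default property content given above.

```json
{
  "properties": {
    "desired": {
      "ms_iotn:urn_azureiot_Security_SecurityAgentConfiguration": {}
    }
  }
}
```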
If you encounter an issue, container logs are the best way to learn about the st
To learn more about configuration options, continue to the how-to guide for module configuration. > [!div class="nextstepaction"]
-> [Module configuration how-to guide](./how-to-agent-configuration.md)
+> [Module configuration how-to guide](./how-to-agent-configuration.md)
defender-for-iot How To Deploy Linux C https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/how-to-deploy-linux-c.md
This script performs the following function:
For additional help, run the script with the `--help` parameter:
-```./InstallSecurityAgent.sh --help```
+`./InstallSecurityAgent.sh --help`
### Uninstall the agent To uninstall the agent, run the script with the ΓÇô-uninstall parameter:
-```./InstallSecurityAgent.sh -–uninstall```
+`./InstallSecurityAgent.sh --uninstall`
## Troubleshooting Check the deployment status by running:
-```systemctl status ASCIoTAgent.service```
+`systemctl status ASCIoTAgent.service`
## Next steps
defender-for-iot How To Deploy Windows Cs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/how-to-deploy-windows-cs.md
Get-Help example: ```Get-Help .\InstallSecurityAgent.ps1```
Check the agent deployment status by running:
-```sc.exe query "ASC IoT Agent"```
+`sc.exe query "ASC IoT Agent"`
### Uninstall the agent
defender-for-iot How To Send Security Messages https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/how-to-send-security-messages.md
In this guide, you learn how to:
## Defender for IoT capabilities
-Defender for IoT can process and analyze any kind of security message data as long as the data sent conforms to the [Defender for IoT schema](https://aka.ms/iot-security-schemas) and the message is set as a security message.
+Defender for IoT can process and analyze any kind of security message data as long as the data sent conforms to the Defender for IoT schema and the message is set as a security message.
## Security message Defender for IoT defines a security message using the following criteria: - If the message was sent with Azure IoT SDK-- If the message conforms to the [security message schema](https://aka.ms/iot-security-schemas)
+- If the message conforms to the security message schema
- If the message was set as a security message prior to sending Each security message includes the metadata of the sender such as `AgentId`, `AgentVersion`, `MessageSchemaVersion` and a list of security events.
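As a rough illustration of that structure, a payload might be assembled as follows. Only `AgentId`, `AgentVersion`, and `MessageSchemaVersion` plus the list of events come from this article; the event fields and version strings below are hypothetical placeholders, not the authoritative Defender for IoT schema.

```python
import json
import uuid

def build_security_message(events):
    """Assemble an illustrative security-message payload.

    AgentId, AgentVersion, MessageSchemaVersion, and the event list are the
    metadata named in the article; the concrete values and the event shape
    are assumptions for illustration only.
    """
    return {
        "AgentId": str(uuid.uuid4()),    # identifies the sending agent
        "AgentVersion": "0.0.1",         # hypothetical agent version
        "MessageSchemaVersion": "1.0",   # hypothetical schema version
        "Events": list(events),          # the security events being reported
    }

payload = build_security_message([{"Category": "Triggered", "Name": "ProcessCreate"}])
print(json.dumps(payload, indent=2))
```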
Send security messages *without* using Defender for IoT agent, by using the [Azu
To send the device data from your devices for processing by Defender for IoT, use one of the following APIs to mark messages for correct routing to Defender for IoT processing pipeline.
-All data that is sent, even if marked with the correct header, must also comply with the [Defender for IoT message schema](https://aka.ms/iot-security-schemas).
+All data that is sent, even if marked with the correct header, must also comply with the Defender for IoT message schema.
### Send security message API
defender-for-iot Security Edge Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/security-edge-architecture.md
Title: Defender for IoT Defender-IoT-micro-agent for IoT Edge
-description: Understand the architecture and capabilities of Azure Defender for IoT Defender-IoT-micro-agent for IoT Edge.
+ Title: Defender for IoT azureiotsecurity for IoT Edge
+description: Understand the architecture and capabilities of Azure Defender for IoT azureiotsecurity for IoT Edge.
Previously updated : 09/09/2020 Last updated : 09/23/2021
-# Azure Defender for IoT Edge Defender-IoT-micro-agent
+# Azure Defender for IoT Edge azureiotsecurity
[Azure IoT Edge](../../iot-edge/index.yml) provides powerful capabilities to manage and perform business workflows at the edge. The key part that IoT Edge plays in IoT environments make it particularly attractive for malicious actors.
-Defender for IoT Defender-IoT-micro-agent provides a comprehensive security solution for your IoT Edge devices.
+Defender for IoT azureiotsecurity provides a comprehensive security solution for your IoT Edge devices.
Defender for IoT module collects, aggregates and analyzes raw security data from your Operating System and container system into actionable security recommendations and alerts. Similar to Defender for IoT security agents for IoT devices, the Defender for IoT Edge module is highly customizable through its module twin. See [Configure your agent](how-to-agent-configuration.md) to learn more.
-Defender for IoT Defender-IoT-micro-agent for IoT Edge offers the following features:
+Defender for IoT azureiotsecurity for IoT Edge offers the following features:
- Collects raw security events from the underlying Operating System (Linux), and the IoT Edge Container systems.
Defender for IoT Defender-IoT-micro-agent for IoT Edge offers the following feat
- Aggregates raw security events into messages sent through [IoT Edge Hub](../../iot-edge/iot-edge-runtime.md#iot-edge-hub). -- Remove configuration through use of the Defender-IoT-micro-agent twin.
+- Remote configuration through use of the azureiotsecurity twin.
See [Configure a Defender for IoT agent](how-to-agent-configuration.md) to learn more.
-Defender for IoT Defender-IoT-micro-agent for IoT Edge runs in a privileged mode under IoT Edge.
+Defender for IoT azureiotsecurity for IoT Edge runs in a privileged mode under IoT Edge.
Privileged mode is required to allow the module to monitor the Operating System, and other IoT Edge modules. ## Module supported platforms
-Defender for IoT Defender-IoT-micro-agent for IoT Edge is currently only available for Linux.
+Defender for IoT azureiotsecurity for IoT Edge is currently only available for Linux.
## Next steps
-In this article, you learned about the architecture and capabilities of Defender for IoT Defender-IoT-micro-agent for IoT Edge.
+In this article, you learned about the architecture and capabilities of Defender for IoT azureiotsecurity for IoT Edge.
To continue getting started with Defender for IoT deployment, use the following articles: -- Deploy [Defender-IoT-micro-agent for IoT Edge](how-to-deploy-edge.md)
+- Deploy [azureiotsecurity for IoT Edge](how-to-deploy-edge.md)
- Learn how to [configure your Defender-IoT-micro-agent](how-to-agent-configuration.md) - Learn how to [Enable Defender for IoT service in your IoT Hub](quickstart-onboard-iot-hub.md) - Learn more about the service from the [Defender for IoT FAQ](resources-agent-frequently-asked-questions.md)
defender-for-iot How To Install Software https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/how-to-install-software.md
Title: Defender for IoT installation description: Learn how to install a sensor and the on-premises management console for Azure Defender for IoT. Previously updated : 09/12/2021 Last updated : 09/23/2021
To install:
1. Select **SENSOR-RELEASE-\<version\> Enterprise**.
- :::image type="content" source="media/tutorial-install-components/sensor-version-select-screen-v2.png" alt-text="Select your sensor version and enterprise type.":::
+ :::image type="content" source="media/tutorial-install-components/sensor-version-select-screen-v2.png" alt-text="Select your sensor version and enterprise type.":::
1. Define the appliance profile, and network properties:
- :::image type="content" source="media/tutorial-install-components/appliance-profile-screen-v2.png" alt-text="Screenshot that shows the appliance profile, and network properties.":::
+ :::image type="content" source="media/tutorial-install-components/appliance-profile-screen-v2.png" alt-text="Screenshot that shows the appliance profile, and network properties.":::
| Parameter | Configuration | |--|--|
To install:
| -| - | | **Hardware profile** | Select **corporate**. | | **Management interface** | **eno2** |
- | **Default network parameters (provided by the customer)** | **management network IP address:** <br/>**subnet mask:** <br/>**appliance hostname:** <br/>**DNS:** <br/>**the default gateway IP address:**|
+ | **Default network parameters (provided by the customer)** | **management network IP address:** <br> **subnet mask:** <br/>**appliance hostname:** <br/>**DNS:** <br/>**the default gateway IP address:**|
| **input interfaces:** | The system generates a list of input interfaces for you.<br/><br/>To mirror the input interfaces, copy all the items presented in the list with a comma separator.<br/><br/> You do not need to configure the bridge interface. This option is used for special use cases only. | 1. After about 10 minutes, the two sets of credentials appear. One is for a **CyberX** user, and one is for a **support** user.
To install:
## HP EdgeLine 300 installation
-ΓÇó A default administrative user is provided. We recommend that you change the password during the network configuration.
+- A default administrative user is provided. We recommend that you change the password during the network configuration.
-ΓÇó The installation process takes about 20 minutes. After the installation, the system is restarted several times.
+- The installation process takes about 20 minutes. After the installation, the system is restarted several times.
### HP EdgeLine 300 back panel
The following procedure describes how to configure the BIOS for HP EL300 applian
| Parameter | Configuration | |--|--| | **configure hardware profile** | **office** |
- | **configure management network interface** | **enp3s0** <br />or <br />**possible value** |
+ | **configure management network interface** | **enp3s0** <br /> or <br />**possible value** |
| **configure management network IP address:** | **IP address provided by the customer** | | **configure subnet mask:** | **IP address provided by the customer** | | **configure DNS:** | **IP address provided by the customer** | | **configure default gateway IP address:** | **IP address provided by the customer** |
- | **configure input interface(s)** | **enp4s0** <br />or <br />**possible value** |
+ | **configure input interface(s)** | **enp4s0** <br /> or <br />**possible value** |
| **configure bridge interface(s)** | N/A | 1. Accept the settings and continue by entering `Y`.
To install the software:
| Parameter | Configuration | |--|--|
- | **configure management network interface** | For Dell: **eth0, eth1** <br /> For HP: **enu1, enu2** <br /> or <br />**possible value** |
+ | **configure management network interface** | For Dell: **eth0, eth1** <br /> For HP: **enu1, enu2** <br> or <br />**possible value** |
| **configure management network IP address:** | **IP address provided by the customer** | | **configure subnet mask:** | **IP address provided by the customer** | | **configure DNS:** | **IP address provided by the customer** |
The on-premises management console VM supports the following architectures:
| Architecture | Specifications | Usage | |--|--|--|
-| Enterprise <br/>(Default and most common) | CPU: 8 <br/>Memory: 32G RAM<br/> HDD: 1.8 TB | Large production environments |
+| Enterprise <br/>(Default and most common) | CPU: 8 <br/>Memory: 32G RAM<br/> HDD: 1.8 TB | Large production environments |
| Small | CPU: 4 <br/> Memory: 8G RAM<br/> HDD: 500 GB | Large production environments | | Office | CPU: 4 <br/>Memory: 8G RAM <br/> HDD: 100 GB | Small test environments |
This section provides the Nuvo 5006LP installation procedure. Before installing
The following procedure describes how to configure the Nuvo 5006LP BIOS. Make sure the operating system was previously installed on the appliance.
-To configure the BIOS:
+**To configure the BIOS**:
1. Power on the appliance.
To configure the BIOS:
The installation process takes approximately 20 minutes. After installation, the system is restarted several times.
+**To install the software**:
+ 1. Connect the external CD, or disk on key with the ISO image. 1. Boot the appliance.
This section provides the Fitlet2 installation procedure. Before installing the
#### Configure the Fitlet2 BIOS
+**To configure the Fitlet2 BIOS**:
+ 1. Power on the appliance. 1. Navigate to **Main** > **OS Selection**.
This section provides the Fitlet2 installation procedure. Before installing the
1. Navigate to **CSM Configuration** > **CSM Support**. 1. Press **+/-** to select **Enabled**.+ 1. Navigate to **Advanced** > **Boot option filter [Legacy only]** and change setting in the following fields to **Legacy**:+ - Network - Storage - Video
Post-installation validation must include the following tests:
- **Task Manager**: Translates the tasks that appear in the table of processes to the following layers: - Persistent layer (Redis)+ - Cash layer (SQL) - **Network Statistics**: Displays your network statistics.
Post-installation validation must include the following tests:
- **TOP**: Shows the table of processes. It's a Linux command that provides a dynamic real-time view of the running system. - **Backup Memory Check**: Provides the status of the backup memory, checking the following:+ - The location of the backup folder+ - The size of the backup folder+ - The limitations of the backup folder+ - When the last backup happened+ - How much space there is for the extra backup files - **ifconfig**: Displays the parameters for the appliance's physical interfaces.
Post-installation validation must include the following tests:
- **Errors from Core, log**: Displays errors from the core log file.
-To access the tool:
+**To access the tool**:
1. Sign in to the sensor with the **Support** user credentials.
To access the tool:
### Check system health by using the CLI
-**Test 1: Sanity**
Verify that the system is up and running before testing the system's sanity.
-Verify that the system is up and running:
+**To test the system's sanity**:
1. Connect to the CLI with the Linux terminal (for example, PuTTY) and the user **Support**.
Verify that the system is up and running:
1. Verify that **System is UP! (prod)** appears at the bottom.
-**Test 2: Version check**
- Verify that the correct version is used:
+**To check the system's version**:
+ 1. Connect to the CLI with the Linux terminal (for example, PuTTY) and the user **Support**. 1. Enter `system version`. 1. Check that the correct version appears.
-**Test 3: Network validation**
- Verify that all the input interfaces configured during the installation process are running:
+**To validate the system's network status**:
+ 1. Connect to the CLI with the Linux terminal (for example, PuTTY) and the user **Support**. 1. Enter `network list` (the equivalent of the Linux command `ifconfig`).
Verify that all the input interfaces configured during the installation process
:::image type="content" source="media/tutorial-install-components/interface-list-screen.png" alt-text="Screenshot that shows the list of interfaces.":::
-**Test 4: Management access to the UI**
- Verify that you can access the console web GUI:
+**To check that management has access to the UI**:
+ 1. Connect a laptop with an Ethernet cable to the management port (**Gb1**). 1. Define the laptop NIC address to be in the same range as the appliance.
Verify that you can access the console web GUI:
For any other issues, contact [Microsoft Support](https://support.microsoft.com/en-us/supportforbusiness/productselection?sapId=82c88f35-1b8e-f274-ec11-c6efdd6dd099).
-## Appendix A: Mirroring port on vSwitch (ESXi)
+## Configure a SPAN port
-### Configure a SPAN port on an existing vSwitch
+A vSwitch does not have mirroring capabilities, but you can use a workaround to implement a SPAN port. You can implement the workaround with either ESXi or Hyper-V.
-A vSwitch does not have mirroring capabilities, but you can use a workaround to implement a SPAN port.
-To configure a SPAN port:
+### Configure a SPAN port with ESXi
+
+**To configure a SPAN port with ESXi**:
1. Open vSwitch properties.
To configure a SPAN port:
1. Select **OK**.
-1. Connect to the sensor and verify that mirroring works.
+1. Connect to the sensor, and verify that mirroring works.
+
+### Configure a SPAN port with Hyper-V
+
+Before you start, you will need to:
+
+- Ensure that there is no instance of ClearPass VA running.
+
+- Ensure SPAN is enabled on the data port, and not the management port.
+
+- Ensure that the data port SPAN configuration is not configured with an IP address.
+
+**To configure a SPAN port with Hyper-V**:
+
+1. Open the Virtual Switch Manager.
+
+1. In the Virtual Switches list, select **New virtual network switch** > **External** as the dedicated spanned network adapter type.
+
+ :::image type="content" source="media/tutorial-install-components/new-virtual-network.png" alt-text="Screenshot of selecting new virtual network and external before creating the virtual switch.":::
+
+1. Select **Create Virtual Switch**.
+
+1. Under connection type, select **External Network**.
+
+1. Ensure the checkbox for **Allow management operating system to share this network adapter** is checked.
+
+ :::image type="content" source="media/tutorial-install-components/external-network.png" alt-text="Select external network, and allow the management operating system to share the network adapter.":::
+
+1. Select **OK**.
+
+#### Attach a ClearPass SPAN Virtual Interface to the virtual switch
+
+You can attach a ClearPass SPAN Virtual Interface to the virtual switch through Windows PowerShell or through Hyper-V Manager.
+
+**To attach a ClearPass SPAN Virtual Interface to the virtual switch with PowerShell**:
+
+1. Select the newly added SPAN virtual switch, and add a new network adapter with the following command:
+
+ ```powershell
+ ADD-VMNetworkAdapter -VMName VK-C1000V-LongRunning-650 -Name Monitor -SwitchName vSwitch_Span
+ ```
+
+1. Enable port mirroring for the selected interface as the span destination with the following command:
+
+ ```powershell
+ Get-VMNetworkAdapter -VMName VK-C1000V-LongRunning-650 | ? Name -eq Monitor | Set-VMNetworkAdapter -PortMirroring Destination
+ ```
+
+ | Parameter | Description |
+ |--|--|
+ | VK-C1000V-LongRunning-650 | CPPM VA name |
+ |vSwitch_Span |Newly added SPAN virtual switch name |
+ |Monitor |Newly added adapter name |
+
+1. Select **OK**.
+
+These commands set the name of the newly added adapter hardware to be `Monitor`. If you are using Hyper-V Manager, the name of the newly added adapter hardware is set to `Network Adapter`.
+
+**To attach a ClearPass SPAN Virtual Interface to the virtual switch with Hyper-V Manager**:
+
+1. Under the Hardware list, select **Network Adapter**.
+
+1. In the Virtual Switch field, select **vSwitch_Span**.
+
+ :::image type="content" source="media/tutorial-install-components/vswitch-span.png" alt-text="Screenshot of selecting the following options on the virtual switch screen.":::
+
+1. In the Hardware list, under the Network Adapter drop-down list, select **Advanced Features**.
+
+1. In the Port Mirroring section, select **Destination** as the mirroring mode for the new virtual interface.
+
+ :::image type="content" source="media/tutorial-install-components/destination.png" alt-text="Screenshot of the selections needed to configure mirroring mode.":::
+
+1. Select **OK**.
+
+#### Enable Microsoft NDIS capture extensions for the virtual switch
+
+Microsoft NDIS Capture Extensions will need to be enabled for the new virtual switch.
+
+**To enable Microsoft NDIS capture extensions for the newly added virtual switch**:
+
+1. Open the Virtual Switch Manager on the Hyper-V host.
+
+1. In the Virtual Switches list, expand the virtual switch name `vSwitch_Span` and select **Extensions**.
+
+1. In the Switch Extensions field, select **Microsoft NDIS Capture**.
+
+ :::image type="content" source="media/tutorial-install-components/microsoft-ndis.png" alt-text="Screenshot of enabling the Microsoft NDIS by selecting it from the switch extensions menu.":::
+
+1. Select **OK**.
+
+#### Set the Mirroring Mode on the external port
+
+The mirroring mode on the external port of the new virtual switch will need to be set to source.
+
+You will need to configure the Hyper-V virtual switch (vSwitch_Span) to forward any traffic that comes to the external source port to the virtual network adapter that you configured as the destination.
+
+Use the following PowerShell commands to set the external virtual switch port to source mirror mode:
+
+```powershell
+$ExtPortFeature=Get-VMSystemSwitchExtensionPortFeature -FeatureName "Ethernet Switch Port Security Settings"
+$ExtPortFeature.SettingData.MonitorMode=2
+Add-VMSwitchExtensionPortFeature -ExternalPort -SwitchName vSwitch_Span -VMSwitchExtensionFeature $ExtPortFeature
+```
+
+| Parameter | Description |
+|--|--|
+| vSwitch_Span | Newly added SPAN virtual switch name. |
+| MonitorMode=2 | Source |
+| MonitorMode=1 | Destination |
+| MonitorMode=0 | None |
+
+Use the following PowerShell command to verify the monitoring mode status:
+
+```powershell
+Get-VMSwitchExtensionPortFeature -FeatureName "Ethernet Switch Port Security Settings" -SwitchName vSwitch_Span -ExternalPort | select -ExpandProperty SettingData
+```
+
+| Parameter | Description |
+|--|--|
+| vSwitch_Span | Newly added SPAN virtual switch name |
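
As an aside, the `MonitorMode` values returned in `SettingData` decode according to the table above. This tiny helper is purely illustrative; the dictionary and function names are hypothetical and not part of the Hyper-V PowerShell module:

```python
# Illustrative decoder for the Hyper-V port MonitorMode values listed above.
MONITOR_MODES = {0: "None", 1: "Destination", 2: "Source"}

def monitor_role(mode: int) -> str:
    """Return the mirroring role name for a MonitorMode value."""
    return MONITOR_MODES.get(mode, "Unknown")
```

For example, the external port configured earlier should report mode `2`, that is, `Source`.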
-## Appendix B: Access sensors from the on-premises management console
+## Access sensors from the on-premises management console
You can enhance system security by preventing direct user access to the sensor. Instead, use proxy tunneling to let users access the sensor from the on-premises management console with a single firewall rule. This technique narrows the possibility of unauthorized access to the network environment beyond the sensor. The user's experience when signing in to the sensor remains the same. :::image type="content" source="media/tutorial-install-components/sensor-system-graph.png" alt-text="Screenshot that shows access to the sensor.":::
-To enable tunneling:
+**To enable tunneling**:
1. Sign in to the on-premises management console's CLI with the **CyberX**, or the **Support** user credentials.
defender-for-iot How To Manage The On Premises Management Console https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/how-to-manage-the-on-premises-management-console.md
To define:
1. Sign in to the CLI for the on-premises management with administrative credentials. 1. Type ```nano /var/cyberx/properties/remote-interfaces.properties```. 1. Select enter. The following prompts appear.
-```mail.smtp_server= ```
-```mail.port=25 ```
-```mail.sender=```
+ `mail.smtp_server=`
+ `mail.port=25`
+ `mail.sender=`
1. Enter the SMTP server name and sender and select enter. ## See also
defender-for-iot Tutorial Forescout https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/tutorial-forescout.md
description: In this tutorial, you will learn how to integrate Azure Defender fo
Previously updated : 09/14/2021 Last updated : 09/23/2021
In this tutorial, you learn how to:
> - View device attributes in Forescout > - Create Azure Defender for IoT policies in Forescout
+If you do not already have an Azure account, you can [create your Azure free account today](https://azure.microsoft.com/free/).
+ ## Prerequisites - Azure Defender for IoT version 2.4 or above
defender-for-iot Tutorial Fortinet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/tutorial-fortinet.md
description: In this tutorial, you will learn how to integrate Azure Defender fo
Previously updated : 09/20/2021 Last updated : 09/23/2021
In this tutorial, you learn how to:
> - Send Defender for IoT alerts to FortiSIEM > - Block a malicious source using the Fortigate firewall
+If you do not already have an Azure account, you can [create your Azure free account today](https://azure.microsoft.com/free/).
+ ## Prerequisites There are no prerequisites for this tutorial.
An application programming interface (API) key is a uniquely generated code that
When the API key is generated, save it as it will not be provided again. ## Set a forwarding rule to block malware-related alerts
The FortiGate firewall can be used to block suspicious traffic.
1. To configure the FortiGate forwarding rule, set the following parameters:
- :::image type="content" source="media/tutorial-fortinet/configure.png" alt-text="Screenshot of the configure the Create Forwarding Rule window":::
+ :::image type="content" source="media/tutorial-fortinet/configure.png" alt-text="Screenshot of the Create Forwarding Rule window.":::
| Parameter | Description | |--|--|
You can then use Defender for IoT's Forwarding Rules to send alert information t
4. Enter the FortiSIEM server details.
- :::image type="content" source="media/tutorial-fortinet/details.png" alt-text="Screenshot of the add the FortiSIEm details to the forwarding rule":::
+ :::image type="content" source="media/tutorial-fortinet/details.png" alt-text="Screenshot of adding the FortiSIEM details to the forwarding rule.":::
| Parameter | Description | | | -- |
You can set policies to automatically block malicious sources in the FortiGate f
For example, the following alert can block the malicious source: **To set a FortiGate firewall rule that blocks a malicious source**:
defender-for-iot Tutorial Onboarding https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/tutorial-onboarding.md
description: In this tutorial, you will learn how to onboard to Azure Defender f
Previously updated : 09/06/2021 Last updated : 09/23/2021
A vSwitch does not have mirroring capabilities, but you can use a workaround to
1. Select **OK**.
-1. Connect to the sensor and verify that mirroring works.
+1. Connect to the sensor, and verify that mirroring works.
### Configure a SPAN port with Hyper-V
dns Dns Protect Private Zones Recordsets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dns/dns-protect-private-zones-recordsets.md
Title: Protecting private DNS Zones and Records - Azure DNS
description: In this learning path, get started protecting private DNS zones and record sets in Microsoft Azure DNS. --++ Last updated 05/07/2021
event-grid Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/edge/api.md
Event Grid on IoT Edge has the following APIs exposed over HTTP (port 5888) and
### Request query string All API requests require the following query string parameter:
-```?api-version=2019-01-01-preview```
+`?api-version=2019-01-01-preview`
### Request content type All API requests must have a **Content-Type**. In case of **EventGridSchema** or **CustomSchema**, the value of Content-Type can be one of the following values:
-```Content-Type: application/json```
+`Content-Type: application/json`
-```Content-Type: application/json; charset=utf-8```
+`Content-Type: application/json; charset=utf-8`
In case of **CloudEventSchemaV1_0** in structured mode, the value of Content-Type can be one of the following values:
-```Content-Type: application/cloudevents+json```
+`Content-Type: application/cloudevents+json`
-```Content-Type: application/cloudevents+json; charset=utf-8```
+`Content-Type: application/cloudevents+json; charset=utf-8`
-```Content-Type: application/cloudevents-batch+json```
+`Content-Type: application/cloudevents-batch+json`
-```Content-Type: application/cloudevents-batch+json; charset=utf-8```
+`Content-Type: application/cloudevents-batch+json; charset=utf-8`
In case of **CloudEventSchemaV1_0** in binary mode, refer to [documentation](https://github.com/cloudevents/spec/blob/master/http-protocol-binding.md) for details.
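
The schema-to-Content-Type pairing above can be summarized in a small illustrative helper. The dictionary and function names here are hypothetical, not part of any Event Grid SDK:

```python
# Hypothetical helper mapping Event Grid on IoT Edge input schema names to the
# Content-Type header values the API accepts (per the lists above).
VALID_CONTENT_TYPES = {
    "EventGridSchema": [
        "application/json",
        "application/json; charset=utf-8",
    ],
    "CustomSchema": [
        "application/json",
        "application/json; charset=utf-8",
    ],
    # CloudEventSchemaV1_0 in structured mode; binary mode follows the
    # CloudEvents HTTP protocol binding instead.
    "CloudEventSchemaV1_0": [
        "application/cloudevents+json",
        "application/cloudevents+json; charset=utf-8",
        "application/cloudevents-batch+json",
        "application/cloudevents-batch+json; charset=utf-8",
    ],
}

def is_valid_content_type(schema: str, content_type: str) -> bool:
    """Return True if content_type is acceptable for the given input schema."""
    return content_type in VALID_CONTENT_TYPES.get(schema, [])
```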
To publish to a Storage Queue, set the `endpointType` to `storageQueue` and pro
* queueName: Name of the Storage Queue you're publishing to. * connectionString: Connection string for the Storage Account the Storage Queue is in.
- >[!NOTE]
- > Unline Event Hubs, Service Bus Queues, and Service Bus Topics, the connection string used for Storage Queues is not entity specific. Instead, it must but the connection string for the Storage Account.
-
- ```json
- {
- "properties": {
- "destination": {
- "endpointType": "storageQueue",
- "properties": {
- "queueName": "<your-storage-queue-name>",
- "connectionString": "<your-storage-account-connection-string>"
- }
- }
- }
+ >[!NOTE]
+ > Unlike Event Hubs, Service Bus Queues, and Service Bus Topics, the connection string used for Storage Queues is not entity specific. Instead, it must be the connection string for the Storage Account.
+
+ ```json
+ {
+ "properties": {
+ "destination": {
+ "endpointType": "storageQueue",
+ "properties": {
+ "queueName": "<your-storage-queue-name>",
+ "connectionString": "<your-storage-account-connection-string>"
}
- ```
+ }
+ }
+ }
+ ```
event-grid Event Schema Api Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/event-schema-api-management.md
API Management emits the following event types:
| Event type | Description | | - | -- |
-| Microsoft.APIManagement.UserCreated | Raised when a user is created. |
-| Microsoft.APIManagement.UserUpdated | Raised when a user is updated. |
-| Microsoft.APIManagement.UserDeleted | Raised when a user is deleted. |
-| Microsoft.APIManagement.APICreated | Raised when an API is created. |
-| Microsoft.APIManagement.APIUpdated | Raised when an API is updated. |
-| Microsoft.APIManagement.APIDeleted | Raised when an API is deleted. |
-| Microsoft.APIManagement.ProductCreated | Raised when a product is created. |
-| Microsoft.APIManagement.ProductUpdated | Raised when a product is updated. |
-| Microsoft.APIManagement.ProductDeleted | Raised when a product is deleted. |
-| Microsoft.APIManagement.ReleaseCreated | Raised when an API release is created. |
-| Microsoft.APIManagement.ReleaseUpdated | Raised when an API release is updated. |
-| Microsoft.APIManagement.ReleaseDeleted | Raised when an API release is deleted. |
-| Microsoft.APIManagement.SubscriptionCreated | Raised when a subscription is created. |
-| Microsoft.APIManagement.SubscriptionUpdated | Raised when a subscription is updated. |
-| Microsoft.APIManagement.SubscriptionDeleted | Raised when a subscription is deleted. |
+| Microsoft.ApiManagement.UserCreated | Raised when a user is created. |
+| Microsoft.ApiManagement.UserUpdated | Raised when a user is updated. |
+| Microsoft.ApiManagement.UserDeleted | Raised when a user is deleted. |
+| Microsoft.ApiManagement.APICreated | Raised when an API is created. |
+| Microsoft.ApiManagement.APIUpdated | Raised when an API is updated. |
+| Microsoft.ApiManagement.APIDeleted | Raised when an API is deleted. |
+| Microsoft.ApiManagement.ProductCreated | Raised when a product is created. |
+| Microsoft.ApiManagement.ProductUpdated | Raised when a product is updated. |
+| Microsoft.ApiManagement.ProductDeleted | Raised when a product is deleted. |
+| Microsoft.ApiManagement.ReleaseCreated | Raised when an API release is created. |
+| Microsoft.ApiManagement.ReleaseUpdated | Raised when an API release is updated. |
+| Microsoft.ApiManagement.ReleaseDeleted | Raised when an API release is deleted. |
+| Microsoft.ApiManagement.SubscriptionCreated | Raised when a subscription is created. |
+| Microsoft.ApiManagement.SubscriptionUpdated | Raised when a subscription is updated. |
+| Microsoft.ApiManagement.SubscriptionDeleted | Raised when a subscription is deleted. |
## Example event
The following example shows the schema of an API updated event. The schema of ot
"source": "/subscriptions/{subscription-id}/resourceGroups/{your-rg}/providers/Microsoft.ApiManagement/service/{your-APIM-instance}", "subject": "/apis/myapi;Rev=1", "data": {
- "resourceUri": "/subscriptions/subscription-id}/resourceGroups/{your-rg}/providers/Microsoft.ApiManagement/service/{your-APIM-instance}/apis/myapi;Rev=1"
+ "resourceUri": "/subscriptions/{subscription-id}/resourceGroups/{your-rg}/providers/Microsoft.ApiManagement/service/{your-APIM-instance}/apis/myapi;Rev=1"
}, "Type": "Microsoft.ApiManagement.APIUpdated", "Time": "2021-07-12T23:13:44.9048323Z",
frontdoor Edge Locations Abbreviation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/edge-locations-abbreviation.md
This article lists Microsoft edge locations, sorted by location abbreviation, fo
## Next steps * View [Azure Front Door edge locations by metro](edge-locations-by-region.md).
-* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/cdn/edgenodes/list).
+* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/edge-nodes/list).
* Learn how to [create an Azure Front Door profile](quickstart-create-front-door.md).
frontdoor Edge Locations By Region https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/edge-locations-by-region.md
This article lists current metros containing edge locations, sorted by region, f
## Next steps * View [Azure Front Door edge locations by abbreviation](edge-locations-abbreviation.md).
-* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/cdn/edgenodes/list).
+* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/edge-nodes/list).
* Learn how to [create an Azure Front Door profile](quickstart-create-front-door.md).
frontdoor Edge Locations By Abbreviation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/edge-locations-by-abbreviation.md
This article lists Microsoft edge locations, sorted by location abbreviation, fo
## Next steps * View [Azure Front Door edge locations by metro](edge-locations.md).
-* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/cdn/edgenodes/list).
+* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/edge-nodes/list).
* Learn how to [create an Azure Front Door Standard/Premium profile](create-front-door-portal.md).
frontdoor Edge Locations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/edge-locations.md
This article lists current metros containing edge locations, sorted by region, f
## Next steps * View [Azure Front Door edge locations by abbreviation](edge-locations-by-abbreviation.md).
-* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/cdn/edgenodes/list).
+* To get the latest list of edge nodes for Azure Front Door, see [Edge Nodes List - REST API](/rest/api/cdn/edge-nodes/list).
* Learn how to [create an Azure Front Door Standard/Premium profile](create-front-door-portal.md).
fxt-edge-filer Mount Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/fxt-edge-filer/mount-clients.md
In addition to the paths, include the [mount command options](#use-recommended-m
To ensure a seamless client mount, pass these settings and arguments in your mount command:
-``mount -o hard,nointr,proto=tcp,mountproto=tcp,retry=30 ${VSERVER_IP_ADDRESS}:/${NAMESPACE_PATH} ${LOCAL_FILESYSTEM_MOUNT_POINT}``
+`mount -o hard,nointr,proto=tcp,mountproto=tcp,retry=30 ${VSERVER_IP_ADDRESS}:/${NAMESPACE_PATH} ${LOCAL_FILESYSTEM_MOUNT_POINT}`
| Required settings | Description | |
To ensure a seamless client mount, pass these settings and arguments in your mou
``mountproto=netid`` | This option supports appropriate handling of network errors for mount operations. ``retry=n`` | Set ``retry=30`` to avoid transient mount failures. (A different value is recommended in foreground mounts.)
-| Preferred settings | Description |
- |
-``nointr`` | If your clients use older OS kernels (before April 2008) that support this option, use it. The option "intr" is the default.
+| Preferred settings | Description |
+| | |
+| `nointr` | If your clients use older OS kernels (before April 2008) that support this option, use it. The option "intr" is the default. |
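
To make the mount persistent across reboots, the same options can go in `/etc/fstab`. The address and paths in this sketch are placeholders for your vserver IP address, namespace path, and local mount point (add `nointr` only if your clients run the older kernels described above):

```
# Placeholder values - replace with your vserver IP, namespace path, and mount point.
203.0.113.10:/corp  /mnt/fxt  nfs  hard,proto=tcp,mountproto=tcp,retry=30  0 0
```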
## Next steps
After you have mounted clients, you can test your workflow and get started with
If you need to move data to a new cloud core filer, take advantage of the cache structure by using parallel data ingest. Some strategies are described in [Moving data to a vFXT cluster](../avere-vfxt/avere-vfxt-data-ingest.md). (Avere vFXT for Azure is a cloud-based product that uses caching technology very similar to the Azure FXT Edge Filer.)
-Read [Monitor Azure FXT Edge Filer hardware status](monitor.md) if you need to troubleshoot any hardware issues.
+Read [Monitor Azure FXT Edge Filer hardware status](monitor.md) if you need to troubleshoot any hardware issues.
governance Guest Configuration Create Definition https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/how-to/guest-configuration-create-definition.md
$PolicyParameterInfo = @(
} )
-New-GuestConfigurationPolicy
+New-GuestConfigurationPolicy `
-PolicyId 'My GUID' ` -ContentUri '<paste the ContentUri output from the Publish command>' ` -DisplayName 'Audit Windows Service.' `
hdinsight Interactive Query Troubleshoot Migrate 36 To 40 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/interactive-query/interactive-query-troubleshoot-migrate-36-to-40.md
This article provides answers to some of the most common issues that customers f
## Reduce latency when running `DESCRIBE TABLE_NAME` Workaround:+ * Increase maximum number of objects (tables/partitions) that can be retrieved from metastore in one batch. Set it to a large number (default is 300) until satisfactory latency levels are reached. The higher the number, the fewer round trips are needed to the Hive metastore server, but it may also cause higher memory requirement at the client side.
- ```hive.metastore.batch.retrieve.max=2000```
+ `hive.metastore.batch.retrieve.max=2000`
+ * Restart Hive and all stale services

## Unable to query Gzipped text file if skip.header.line.count and skip.footer.line.count are set for table
MetaStoreAuthzAPIAuthorizerEmbedOnly effectively disables security checks becaus
## Permission errors in Hive job after upgrading to HDInsight 4.0

* In HDInsight 4.0, all cluster shapes with Hive components are configured with a new authorization provider:
-```org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider```
+
+ `org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider`
* HDFS file permissions should be assigned to the hive user for the file being accessed. The error message provides the details needed to resolve the issue.
* You can also switch to the `MetaStoreAuthzAPIAuthorizerEmbedOnly` provider used in HDInsight 3.6 Hive clusters.
-```org.apache.hadoop.hive.ql.security.authorization.MetaStoreAuthzAPIAuthorizerEmbedOnly```
- :::image type="content" source="./media/apache-hive-40-migration-guide/hive-job-permission-errors.png" alt-text="Set authorization to MetaStoreAuthzAPIAuthorizerEmbedOnly" border="true":::
+ `org.apache.hadoop.hive.ql.security.authorization.MetaStoreAuthzAPIAuthorizerEmbedOnly`
+
+ :::image type="content" source="./media/apache-hive-40-migration-guide/hive-job-permission-errors.png" alt-text="Set authorization to MetaStoreAuthzAPIAuthorizerEmbedOnly" border="true":::
## Unable to query table with OpenCSVSerde
hpc-cache Hpc Cache Create https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hpc-cache/hpc-cache-create.md
Supply these values:
* Azure region
* Cache subnet, in this format:
- ``--subnet "/subscriptions/<subscription_id>/resourceGroups/<cache_resource_group>/providers/Microsoft.Network/virtualNetworks/<virtual_network_name>/subnets/<cache_subnet_name>"``
+ `--subnet "/subscriptions/<subscription_id>/resourceGroups/<cache_resource_group>/providers/Microsoft.Network/virtualNetworks/<virtual_network_name>/subnets/<cache_subnet_name>"`
The cache subnet needs at least 64 IP addresses (/24), and it can't house any other resources.
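Because the `--subnet` value is a long resource ID, it can help to assemble it from its parts. A minimal sketch (all names below are hypothetical placeholders, not real resources):

```python
# Sketch: assemble the --subnet resource ID expected by the cache creation
# command. Subscription ID, resource group, network, and subnet names are
# placeholders for illustration only.
def cache_subnet_id(subscription_id, resource_group, vnet, subnet):
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Network/virtualNetworks/{vnet}"
        f"/subnets/{subnet}"
    )

subnet_arg = cache_subnet_id(
    "00000000-0000-0000-0000-000000000000", "cache-rg", "cache-vnet", "cache-subnet"
)
print(subnet_arg)
```

Composing the string this way makes it easy to spot a missing segment before passing it to the CLI.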
hpc-cache Hpc Cache Ingest Msrsync https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hpc-cache/hpc-cache-ingest-msrsync.md
Follow these instructions to use ``msrsync`` to populate Azure Blob storage with
For example, this command is designed to move 11,000 files in 64 processes from /test/source-repository to /mnt/hpccache/repository:
- ``mrsync -P --stats -p64 -f170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/hpccache/repository``
+ `msrsync -P --stats -p64 -f170 --rsync "-ahv --inplace" /test/source-repository/ /mnt/hpccache/repository`
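A rough sketch of how the example's `-p64` and `-f170` values relate to the 11,000-file workload (this is illustrative arithmetic, not an official msrsync sizing rule):

```python
import math

# Sketch: choose a files-per-bucket value (-f) so that the file set splits
# roughly evenly across the worker processes (-p). Numbers follow the
# example above: 11,000 files across 64 processes.
total_files = 11_000
processes = 64  # -p64

files_per_bucket = math.ceil(total_files / processes)  # close to the -f170 used above
print(files_per_bucket)
```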
industry Sensor Partner Integration In Azure Farmbeats https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/industry/agriculture/sensor-partner-integration-in-azure-farmbeats.md
Here are the most common request headers that need to be specified when you make
**Header** | **Description and example** |
--- | --- |
-Content-Type | The request format (Content-Type: application/<format>). For FarmBeats Datahub APIs, the format is JSON. Content-Type: application/json
-Authorization | Specifies the access token required to make an API call. Authorization: Bearer <Access-Token>
+Content-Type | The request format (Content-Type: `application/<format>`). For FarmBeats Datahub APIs, the format is JSON. Content-Type: application/json
+Authorization | Specifies the access token required to make an API call. Authorization: Bearer \<Access-Token\>
Accept | The response format. For FarmBeats Datahub APIs, the format is JSON. Accept: application/json

**API requests**
-To make a REST API request, you combine the HTTP (GET, POST, or PUT) method, the URL to the API service, the Uniform Resource Identifier (URI) to a resource to query, submit data to, update, or delete, and one or more HTTP request headers. The URL to the API service is the API endpoint you provide. Here's a sample: https://\<yourdatahub-website-name>.azurewebsites.net
+To make a REST API request, you combine the HTTP (GET, POST, or PUT) method, the URL to the API service, the Uniform Resource Identifier (URI) of the resource to query, submit data to, update, or delete, and one or more HTTP request headers. The URL to the API service is the API endpoint you provide. Here's a sample: `https://<yourdatahub-website-name>.azurewebsites.net`
Optionally, you can include query parameters on GET calls to filter, limit the size of, and sort the data in the responses.
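Putting the pieces above together, a minimal sketch of assembling the headers and a GET URL with query parameters (the endpoint, the `/Sensor` resource path, the parameter name, and the token are placeholders for illustration, not confirmed FarmBeats values):

```python
from urllib.parse import urlencode

# Sketch: build the parts of a FarmBeats Datahub GET request.
# Endpoint, resource path, query parameter, and token are placeholders.
endpoint = "https://yourdatahub-website-name.azurewebsites.net"
access_token = "<Access-Token>"

headers = {
    "Content-Type": "application/json",       # request format
    "Authorization": f"Bearer {access_token}",  # access token for the API call
    "Accept": "application/json",             # response format
}

# Optional query parameters to filter, limit, or sort the response.
params = {"maxPageSize": 50}
url = f"{endpoint}/Sensor?{urlencode(params)}"
print(url)
```

A real request would then pass `url` and `headers` to an HTTP client.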
iot-central Concepts Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/concepts-architecture.md
# Azure IoT Central architecture
-This article provides an overview of the key concepts in the Azure IoT Central architecture.
+This article provides an overview of the key elements in an IoT Central solution architecture.
-## Devices
-Devices exchange data with your Azure IoT Central application. A device can:
+An IoT Central application:
-- Send measurements such as telemetry.-- Synchronize settings with your application.
+- Lets you manage the IoT devices in your solution.
+- Lets you view and analyze the data from your devices.
+- Can export to and integrate with other services that are part of the solution.
-In Azure IoT Central, the data that a device can exchange with your application is specified in a device template. For more information about device templates, see [Device Templates](concepts-device-templates.md).
+## IoT Central
-To learn more about how devices connect to your Azure IoT Central application, see [Device connectivity](concepts-get-connected.md).
+IoT Central is a ready-made environment for IoT solution development. It's a platform as a service (PaaS) IoT solution and its primary interface is a web UI. There's also a [REST API](#rest-api) that lets you interact with your application programmatically.
-### Azure IoT Edge devices
+This section describes the key capabilities of an IoT Central application.
-As well as devices created using the [Azure IoT SDKs](https://github.com/Azure/azure-iot-sdks), you can also connect [Azure IoT Edge devices](../../iot-edge/about-iot-edge.md) to an IoT Central application. IoT Edge lets you run cloud intelligence and custom logic directly on IoT devices managed by IoT Central. You can also use IoT Edge as a gateway to enable other downstream devices to connect to IoT Central.
+### Manage devices
-To learn more, see [Connect Azure IoT Edge devices to an Azure IoT Central application](concepts-iot-edge.md).
+IoT Central lets you manage the fleet of [IoT devices](#devices) that are sending data to your solution. For example, you can:
-## Cloud gateway
+- Control which devices can [connect](concepts-get-connected.md) to your application and how they authenticate.
+- Use [device templates](concepts-device-templates.md) to define the types of device that can connect to your application.
+- Manage devices by setting properties or calling commands on connected devices. For example, set a target temperature property for a thermostat device or call a command to trigger a device to update its firmware. You can set properties and call commands on:
+ - Individual devices through a [customizable](concepts-device-templates.md#views) web UI.
+ - Multiple devices with scheduled or on-demand [jobs](howto-manage-devices-in-bulk.md).
+- Maintain [device metadata](concepts-device-templates.md#cloud-properties) such as customer address or last service date.
-Azure IoT Central uses Azure IoT Hub as a cloud gateway that enables device connectivity. IoT Hub enables:
+### View and analyze data
-- Data ingestion at scale in the cloud.-- Device management.-- Secure device connectivity.
+In an IoT Central application, you can view and analyze data for individual devices or for aggregated data from multiple devices:
-To learn more about IoT Hub, see [Azure IoT Hub](../../iot-hub/index.yml).
+- Use device templates to define [custom views](howto-set-up-template.md#views) for individual devices of specific types. For example, you can plot temperature over time for an individual thermostat or show the live location of a delivery truck.
+- Use the built-in [analytics](tutorial-use-device-groups.md) to view aggregate data for multiple devices. For example, you can see the total occupancy across multiple retail stores or identify the stores with the highest or lowest occupancy rates.
+- Create custom [dashboards](howto-manage-dashboards.md) to help you manage your devices. For example, you can add maps, tiles, and charts to show device telemetry.
-To learn more about device connectivity in Azure IoT Central, see [Device connectivity](concepts-get-connected.md).
+### Secure your solution
-## Data stores
+In an IoT Central application you can manage the following security aspects of your solution:
-Azure IoT Central stores application data in the cloud. Application data stored includes:
+- [Device connectivity](concepts-get-connected.md): Create, revoke, and update the security keys that your devices use to establish a connection to your application.
+- [App integrations](howto-authorize-rest-api.md#get-an-api-token): Create, revoke, and update the security keys that other applications use to establish secure connections with your application.
+- [User management](howto-manage-users-roles.md): Manage the users that can sign in to the application and the roles that determine what permissions those users have.
+- [Organizations](howto-create-organizations.md): Define a hierarchy to manage which users can see which devices in your IoT Central application.
-- Device templates.-- Device identities.-- Device metadata.-- User and role data.
+### REST API
-Azure IoT Central uses a time series store for the measurement data sent from your devices. Time series data from devices used by the analytics service.
+Build integrations that let other applications and services manage your application. For example, programmatically [manage the devices](howto-control-devices-with-rest-api.md) in your application or synchronize [user information](howto-manage-users-roles-with-rest-api.md) with an external system.
-## Data export
+## Devices
+
+Devices collect data from sensors to send as a stream of telemetry to an IoT Central application. For example, a refrigeration unit sends a stream of temperature values or a delivery truck streams its location.
+
+A device can use properties to report its state, such as whether a valve is open or closed. An IoT Central application can also use properties to set device state, for example setting a target temperature for a thermostat.
+
+IoT Central can also control devices by calling commands on the device. For example, instructing a device to download and install a firmware update.
+
+The [telemetry, properties, and commands](concepts-telemetry-properties-commands.md) that a device implements are collectively known as the device capabilities. You define these capabilities in a model that's shared between the device and the IoT Central application. In IoT Central, this model is part of the device template that defines a specific type of device.
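The shared capability model described above can be sketched as a plain data structure (field names are illustrative only; real models use the DTDL format defined by IoT Plug and Play):

```python
import json

# Illustrative sketch of a device's capabilities - telemetry, properties,
# and commands - as shared between the device and the application.
# Names are made up for illustration; real models are written in DTDL.
thermostat_model = {
    "telemetry": ["temperature"],                # streamed readings
    "properties": {
        "reported": ["maxTempSinceLastReboot"],  # device -> application state
        "writable": ["targetTemperature"],       # application -> device state
    },
    "commands": ["getMaxMinReport"],             # callable by the application
}

print(json.dumps(thermostat_model, indent=2))
```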
-In an Azure IoT Central application, you can [continuously export your data](howto-export-data.md) to your own Azure Event Hubs and Azure Service Bus instances. You can also periodically export your data to your Azure Blob storage account. IoT Central can export measurements, devices, and device templates.
+The [device implementation](tutorial-connect-device.md) should follow the [IoT Plug and Play conventions](../../iot-develop/concepts-convention.md) to ensure that it can communicate with IoT Central. For more information, see the various language [SDKs and samples](../../iot-develop/libraries-sdks.md).
+
+Devices connect to IoT Central using one of the supported protocols: [MQTT, AMQP, or HTTP](../../iot-hub/iot-hub-devguide-protocols.md).
+
+## Gateways
+
+Local device gateways are useful in several scenarios, such as:
+
+- Devices may not be able to connect directly to IoT Central because they can't connect to the internet. For example, you may have a collection of Bluetooth-enabled occupancy sensors that need to connect through a gateway.
+- The quantity of data generated by your devices may be high. To reduce costs, you can combine or aggregate the data in a local gateway before it's sent to your IoT Central application.
+- Your solution may require fast responses to anomalies in the data. You can run rules on a gateway that identify anomalies and take an action locally without the need to send data to your IoT Central application.
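The aggregation scenario above could look something like this sketch (pure illustration of the idea, not a gateway API):

```python
from statistics import mean

# Sketch: a local gateway batches raw sensor readings and forwards only a
# per-batch summary upstream, reducing the volume of telemetry sent.
def summarize(readings):
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": mean(readings),
    }

batch = [20.1, 20.4, 21.0, 35.9, 20.2]  # one anomalous spike
summary = summarize(batch)
print(summary)

# A local rule can flag the anomaly without a round trip to the cloud.
alert = summary["max"] - summary["min"] > 10
```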
+
+To learn more, see [Connect Azure IoT Edge devices to an Azure IoT Central application](concepts-iot-edge.md).
+
+## Data export
-## Batch device updates
+Although IoT Central has built-in analytics features, you can export data to other services and applications. Reasons to export data include:
-In an Azure IoT Central application, you can [create and run jobs](howto-manage-devices-in-bulk.md) to manage connected devices. These jobs let you do bulk updates to device properties or settings, or run commands. For example, you can create a job to increase the fan speed for multiple refrigerated vending machines.
+### Storage and analysis
-## Role-based access control (RBAC)
+For long-term storage and control over archiving and retention policies, you can [continuously export your data](howto-export-data.md) to other storage destinations. Separate storage also lets you use other analytics tools to derive insights and view the data in your solution.
-Every IoT Central application has its own built-in RBAC system. An [administrator can define access rules](howto-manage-users-roles.md) for an Azure IoT Central application using one of the predefined roles or by creating a custom role. Roles determine what areas of the application a user has access to and what they can do.
+### Business automation
-## Security
+[Rules](howto-configure-rules-advanced.md) in IoT Central let you trigger external actions, such as sending an email or firing an event, in response to conditions within IoT Central. For example, you can notify an engineer if the ambient temperature for a device reaches a threshold.
-Security features within Azure IoT Central include:
+### Additional computation
-- Data is encrypted in transit and at rest.-- Authentication is provided either by Azure Active Directory or Microsoft Account. Two-factor authentication is supported.-- Full tenant isolation.-- Device level security.
+You may need to [transform or do computations](howto-transform-data.md) on your data before it can be used either in IoT Central or another service. For example, you could add local weather information to the location data reported by a delivery truck.
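The weather-enrichment example might be sketched as follows (the weather lookup is stubbed; a real pipeline would call an external service there):

```python
# Sketch: enrich a truck's reported location with weather data before it's
# used downstream. The lookup is a stub standing in for a weather service.
def lookup_weather(lat, lon):
    return {"condition": "clear", "temp_c": 18.0}  # stubbed value

def enrich(telemetry):
    weather = lookup_weather(telemetry["lat"], telemetry["lon"])
    return {**telemetry, "weather": weather}

record = enrich({"truckId": "truck-17", "lat": 47.6, "lon": -122.3})
print(record["weather"]["condition"])
```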
## Next steps
-Now that you've learned about the architecture of Azure IoT Central, the suggested next step is to learn about [device connectivity](concepts-get-connected.md) in Azure IoT Central.
+Now that you've learned about the architecture of Azure IoT Central, the suggested next step is to learn about [device connectivity](concepts-get-connected.md) in Azure IoT Central.
iot-develop Tutorial Configure Tsi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-develop/tutorial-configure-tsi.md
Select the **Instances** tab. Find the instance that represents the device's wor
![Screenshot showing how to edit an instance.](./media/tutorial-configure-tsi/edit-instance.png)
-Open the **Type** drop-down menu and then select **Temperature Controller**. Enter *defaultComponent, <your device name>* to update the name of the instance that represents all top-level tags associated with your device.
+Open the **Type** drop-down menu and then select **Temperature Controller**. Enter *defaultComponent, \<your device name\>* to update the name of the instance that represents all top-level tags associated with your device.
![Screenshot showing how to change an instance type.](./media/tutorial-configure-tsi/change-type.png)
-Before you select **Save**, first select the **Instance Fields** tab, and then select **Device Fleet**. To group the telemetry together, enter *\<your device name> - Temp Controller*. Then select **Save**.
+Before you select **Save**, first select the **Instance Fields** tab, and then select **Device Fleet**. To group the telemetry together, enter *\<your device name\> - Temp Controller*. Then select **Save**.
![Screenshot showing how to assign an instance to a hierarchy](./media/tutorial-configure-tsi/assign-to-hierarchy.png)
iot-hub-device-update Device Update Agent Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub-device-update/device-update-agent-provisioning.md
The Device Update agent can also be configured without the IoT Identity service
1. You should see a window open with some text in it. Delete the entire string following 'connection_string=' the first time you provision the Device Update agent on the IoT device. It is just placeholder text.
- 1. In the terminal, replace <your-connection-string> with the connection string of the device for your instance of Device Update agent. Select Enter and then **Save.** It should look this example:
+ 1. In the terminal, replace \<your-connection-string\> with the connection string of the device for your instance of the Device Update agent. Select Enter, and then select **Save**. It should look like this example:
```text connection_string=<ADD CONNECTION STRING HERE>
key-vault Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/quick-create-cli.md
Title: Quickstart - Set & view Azure Key Vault certificates with Azure CLI description: Quickstart showing how to set and retrieve a certificate from Azure Key Vault using Azure CLI- tags: azure-resource-manager-
Type the commands below to create a self-signed certificate with default policy
az keyvault certificate create --vault-name "<your-unique-keyvault-name>" -n ExampleCertificate -p "$(az keyvault certificate get-default-policy)" ```
-You can now reference this certificate that you added to Azure Key Vault by using its URI. Use **'https://<your-unique-keyvault-name>.vault.azure.net/certificates/ExampleCertificate'** to get the current version.
+You can now reference this certificate that you added to Azure Key Vault by using its URI. Use **`https://<your-unique-keyvault-name>.vault.azure.net/certificates/ExampleCertificate`** to get the current version.
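The same URI pattern applies across Key Vault object types (certificates, keys, secrets); omitting the version segment returns the current version. A small sketch (vault name hypothetical):

```python
# Sketch: build Key Vault object URIs. Leaving out the trailing version
# segment addresses the current version of the object.
def kv_uri(vault_name, object_type, object_name, version=None):
    uri = f"https://{vault_name}.vault.azure.net/{object_type}/{object_name}"
    return f"{uri}/{version}" if version else uri

current = kv_uri("contoso-kv", "certificates", "ExampleCertificate")
print(current)
```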
To view previously stored certificate:
key-vault Quick Create Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/quick-create-powershell.md
$Policy = New-AzKeyVaultCertificatePolicy -SecretContentType "application/x-pkcs
Add-AzKeyVaultCertificate -VaultName "<your-unique-keyvault-name>" -Name "ExampleCertificate" -CertificatePolicy $Policy ```
-You can now reference this certificate that you added to Azure Key Vault by using its URI. Use **"https://<your-unique-keyvault-name>.vault.azure.net/certificates/ExampleCertificate"** to get the current version.
+You can now reference this certificate that you added to Azure Key Vault by using its URI. Use **`https://<your-unique-keyvault-name>.vault.azure.net/certificates/ExampleCertificate`** to get the current version.
To view previously stored certificate:
key-vault Tutorial Python Virtual Machine https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/general/tutorial-python-virtual-machine.md
Title: Tutorial - Use Azure Key Vault with a virtual machine in Python | Microsoft Docs description: In this tutorial, you configure a virtual machine a Python application to read a secret from your key vault.- -
az keyvault set-policy --name "<your-unique-keyvault-name>" --object-id "<system
To sign in to the virtual machine, follow the instructions in [Connect and sign in to an Azure virtual machine running Linux](../../virtual-machines/linux/login-using-aad.md) or [Connect and sign in to an Azure virtual machine running Windows](../../virtual-machines/windows/connect-logon.md).
-To log into a Linux VM, you can use the ssh command with the "<publicIpAddress>" given in the [Create a virtual machine](#create-a-virtual-machine) step:
+To log into a Linux VM, you can use the ssh command with the \<publicIpAddress\> given in the [Create a virtual machine](#create-a-virtual-machine) step:
```terminal ssh azureuser@<PublicIpAddress>
pip3 install azure.identity
## Create and edit the sample Python script
-On the virtual machine, create a Python file called **sample.py**. Edit the file to contain the following code, replacing "<your-unique-keyvault-name>" with the name of your key vault:
+On the virtual machine, create a Python file called **sample.py**. Edit the file to contain the following code, replacing \<your-unique-keyvault-name\> with the name of your key vault:
```python from azure.keyvault.secrets import SecretClient
key-vault Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/quick-create-cli.md
Title: Create and retrieve attributes of a key in Azure Key Vault - Azure CLI description: Quickstart showing how to set and retrieve a key from Azure Key Vault using Azure CLI- tags: azure-resource-manager
Type the commands below to create a key called **ExampleKey** :
az keyvault key create --vault-name "<your-unique-keyvault-name>" -n ExampleKey --protection software ```
-You can now reference this key that you added to Azure Key Vault by using its URI. Use **'https://<your-unique-keyvault-name>.vault.azure.net/keys/ExampleKey'** to get the current version.
+You can now reference this key that you added to Azure Key Vault by using its URI. Use **`https://<your-unique-keyvault-name>.vault.azure.net/keys/ExampleKey`** to get the current version.
To view previously stored key:
key-vault Quick Create Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/quick-create-powershell.md
Type the commands below to create a called **ExampleKey** :
Add-AzKeyVaultKey -VaultName "<your-unique-keyvault-name>" -Name "ExampleKey" -Destination "Software" ```
-You can now reference this key that you added to Azure Key Vault by using its URI. Use **"https://<your-unique-keyvault-name>.vault.azure.net/keys/ExampleKey"** to get the current version.
+You can now reference this key that you added to Azure Key Vault by using its URI. Use **`https://<your-unique-keyvault-name>.vault.azure.net/keys/ExampleKey`** to get the current version.
To view previously stored key:
key-vault Quick Create Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/quick-create-template.md
[Azure Key Vault](../general/overview.md) is a cloud service that provides a secure store for secrets, such as keys, passwords, certificates, and other secrets. This quickstart focuses on the process of deploying an Azure Resource Manager template (ARM template) to create a key vault and a key.
-> [!NOTE]
-> This feature is not available for Azure Government.
## Prerequisites To complete this article: - If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.-- User would need to have an Azure built-in role assigned eg. contributor. [Learn more here](../../role-based-access-control/role-assignments-portal.md)
+- Users need an Azure built-in role assigned; **Contributor** is recommended. [Learn more here](../../role-based-access-control/role-assignments-portal.md)
- Your Azure AD user object ID is needed by the template to configure permissions. The following procedure gets the object ID (GUID). 1. Run the following Azure PowerShell or Azure CLI command by select **Try it**, and then paste the script into the shell pane. To paste the script, right-click the shell, and then select **Paste**.
key-vault Quick Create Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/managed-hsm/quick-create-powershell.md
Title: Create and retrieve attributes of a managed key in Azure Key Vault ΓÇô Azure PowerShell description: Quickstart showing how to set and retrieve a managed key from Azure Key Vault using Azure PowerShell- Last updated 01/26/2021
Use the Azure PowerShell [New-AzKeyVaultManagedHsm](/powershell/module/az.keyvau
- Managed HSM name: A string of 3 to 24 characters that can contain only numbers (0-9), letters (a-z, A-Z), and hyphens (-) > [!Important]
- > Each managed HSM must have a unique name. Replace <your-unique-managed-hsm-name> with the name of your managed HSM in the following examples.
+ > Each managed HSM must have a unique name. Replace \<your-unique-managed-hsm-name\> with the name of your managed HSM in the following examples.
- Resource group name: **myResourceGroup**. - The location: **EastUS**.
key-vault Overview Storage Keys Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/secrets/overview-storage-keys-powershell.md
The commands in this section complete the following actions:
- ### Set variables
-First, set the variables to be used by the PowerShell cmdlets in the following steps. Be sure to update the <YourStorageAccountName> and <YourKeyVaultName> placeholders.
+First, set the variables to be used by the PowerShell cmdlets in the following steps. Be sure to update the \<YourStorageAccountName\> and \<YourKeyVaultName\> placeholders.
We will also use the Azure PowerShell [New-AzStorageContext](/powershell/module/az.storage/new-azstoragecontext) cmdlets to get the context of your Azure storage account.
key-vault Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/secrets/quick-create-cli.md
Title: Quickstart - Set and retrieve a secret from Azure Key Vault description: Quickstart showing how to set and retrieve a secret from Azure Key Vault using Azure CLI- tags: azure-resource-manager-
az keyvault secret set --vault-name "<your-unique-keyvault-name>" --name "Exampl
## Retrieve a secret from Key Vault
-You can now reference this password that you added to Azure Key Vault by using its URI. Use **'https://<your-unique-keyvault-name>.vault.azure.net/secrets/ExamplePassword'** to get the current version.
+You can now reference this password that you added to Azure Key Vault by using its URI. Use **`https://<your-unique-keyvault-name>.vault.azure.net/secrets/ExamplePassword`** to get the current version.
To view the value contained in the secret as plain text, use the Azure CLI [az keyvault secret show](/cli/azure/keyvault/secret#az_keyvault_secret_show) command:
logic-apps Create Single Tenant Workflows Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/create-single-tenant-workflows-azure-portal.md
ms.suite: integration Previously updated : 05/25/2021 Last updated : 09/25/2021 # Create an integration workflow with single-tenant Azure Logic Apps (Standard) in the Azure portal
As you progress, you'll complete these high-level tasks:
![Screenshot that shows the Azure portal search box with the "logic apps" search term and the "Logic App (Standard)" resource selected.](./media/create-single-tenant-workflows-azure-portal/find-logic-app-resource-template.png)
-1. On the **Logic apps** page, select **Add** > **Standard**.
-
- This step creates a logic app resource that runs in the single-tenant Azure Logic Apps environment and uses the [single-tenant pricing model](logic-apps-pricing.md#standard-pricing).
+1. On the **Logic apps** page, select **Add**.
1. On the **Create Logic App** page, on the **Basics** tab, provide the following information about your logic app resource:
As you progress, you'll complete these high-level tasks:
|-|-|-|-| | **Subscription** | Yes | <*Azure-subscription-name*> | The Azure subscription to use for your logic app. | | **Resource Group** | Yes | <*Azure-resource-group-name*> | The Azure resource group where you create your logic app and related resources. This resource name must be unique across regions and can contain only letters, numbers, hyphens (**-**), underscores (**_**), parentheses (**()**), and periods (**.**). <p><p>This example creates a resource group named `Fabrikam-Workflows-RG`. |
+ | **Type** | Yes | **Standard** | This logic app resource type runs in the single-tenant Azure Logic Apps environment and uses the [Standard usage, billing, and pricing model](logic-apps-pricing.md#standard-pricing). |
| **Logic App name** | Yes | <*logic-app-name*> | The name to use for your logic app. This resource name must be unique across regions and can contain only letters, numbers, hyphens (**-**), underscores (**_**), parentheses (**()**), and periods (**.**). <p><p>This example creates a logic app named `Fabrikam-Workflows`. <p><p>**Note**: Your logic app's name automatically gets the suffix, `.azurewebsites.net`, because the **Logic App (Standard)** resource is powered by Azure Functions, which uses the same app naming convention. | | **Publish** | Yes | <*deployment-environment*> | The deployment destination for your logic app. By default, **Workflow** is selected for deployment to single-tenant Azure Logic Apps. Azure creates an empty logic app resource where you have to add your first workflow. <p><p>**Note**: Currently, the **Docker Container** option requires a [*custom location*](../azure-arc/kubernetes/conceptual-custom-locations.md) on an Azure Arc enabled Kubernetes cluster, which you can use with [Azure Arc enabled Logic Apps (Preview)](azure-arc-enabled-logic-apps-overview.md). The resource locations for your logic app, custom location, and cluster must all be the same. | | **Region** | Yes | <*Azure-region*> | The location to use for creating your resource group and resources. If you selected **Docker Container**, select your custom location. <p><p>This example deploys the sample logic app to Azure and uses **West US**. |
As you progress, you'll complete these high-level tasks:
1. For the **Application Insights** setting, either select an existing Application Insights instance, or if you want to create a new instance, select **Create new** and provide the name that you want to use.
-1. After Azure validates your logic app's settings, on the **Review + create** tab, select **Create**.
-
- For example:
+1. After Azure validates your logic app's settings, on the **Review + create** tab, select **Create**. For example:
![Screenshot that shows the Azure portal and new logic app resource settings.](./media/create-single-tenant-workflows-azure-portal/check-logic-app-resource-settings.png) > [!TIP]
- > If you get a validation error after you select **Create**, open and review the error details.
- > For example, if your selected region reaches a quota for resources that you're trying to create,
- > you might have to try a different region.
+ > If you get a validation error after this step, open and review the error details. For example,
+ > if your selected region reaches a quota for resources that you're trying to create, you might
+ > have to try a different region.
After Azure finishes deployment, your logic app is automatically live and running but doesn't do anything yet because the resource is empty, and you haven't added any workflows yet.
logic-apps Logic Apps Batch Process Send Receive Messages https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/logic-apps-batch-process-send-receive-messages.md
Now create one or more batch sender logic apps that send messages to the batch r
> check that you previously created and deployed your batch receiver to Azure. If you haven't, learn > [how to deploy your batch receiver logic app to Azure](../logic-apps/quickstart-create-logic-apps-with-visual-studio.md#deploy-logic-app-to-azure).
- 1. From the actions list, select this action: **Batch_messages - <*your-logic-app-name*>**
+ 1. From the actions list, select this action: **Batch_messages - \<*your-logic-app-name*\>**
- ![Select this action: "Batch_messages - <your-logic-app>"](./media/logic-apps-batch-process-send-receive-messages/batch-sender-select-batch.png)
+ ![Select this action: "Batch_messages - \<your-logic-app\>"](./media/logic-apps-batch-process-send-receive-messages/batch-sender-select-batch.png)
1. Set the batch sender's properties:
logic-apps Logic Apps Control Flow Run Steps Group Scopes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/logic-apps-control-flow-run-steps-group-scopes.md
so save your work often.
1. In the **Subject** field, enter this text:
- ```Time to leave: Traffic more than 10 minutes```
+ `Time to leave: Traffic more than 10 minutes`
1. In the **Body** field, enter this text with a trailing space:
- ```Travel time:```
+ `Travel time:`
- While your cursor appears in the **Body** field,
- the dynamic content list stays open so that you can
+ While your cursor appears in the **Body** field,
+ the dynamic content list stays open so that you can
select any parameters that are available at this point.

1. In the dynamic content list, choose **Expression**.
visit the [Azure Logic Apps user feedback site](https://aka.ms/logicapps-wish).
* [Run steps based on a condition (conditional statements)](../logic-apps/logic-apps-control-flow-conditional-statement.md) * [Run steps based on different values (switch statements)](../logic-apps/logic-apps-control-flow-switch-statement.md) * [Run and repeat steps (loops)](../logic-apps/logic-apps-control-flow-loops.md)
-* [Run or merge parallel steps (branches)](../logic-apps/logic-apps-control-flow-branches.md)
+* [Run or merge parallel steps (branches)](../logic-apps/logic-apps-control-flow-branches.md)
logic-apps Logic Apps Enterprise Integration Agreements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/logic-apps-enterprise-integration-agreements.md
An agreement requires a *host partner*, which is always your organization, and a
This article shows how to create and manage an agreement, which you can then use to exchange B2B messages with another partner by using the AS2, X12, EDIFACT, or RosettaNet operations.
+If you're new to logic apps, review [What is Azure Logic Apps?](logic-apps-overview.md) For more information about B2B enterprise integration, review [B2B enterprise integration workflows with Azure Logic Apps and Enterprise Integration Pack](logic-apps-enterprise-integration-overview.md).
+ ## Prerequisites

* An Azure account and subscription. If you don't have a subscription yet, [sign up for a free Azure account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
logic-apps Logic Apps Enterprise Integration Certificates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/logic-apps-enterprise-integration-certificates.md
Title: Secure B2B messages with certificates
-description: Add certificates to help secure B2B messages in Azure Logic Apps with the Enterprise Integration Pack
+ Title: Add certificates to secure B2B messages in workflows
+description: Add certificates to secure B2B messages for workflows in Azure Logic Apps using the Enterprise Integration Pack.
ms.suite: integration -- Previously updated : 08/17/2018++ Last updated : 09/23/2021
-# Improve security for B2B messages by using certificates
+# Secure messages using certificates for workflows in Azure Logic Apps
-When you need to keep B2B communication confidential, you can increase security for B2B communication in your enterprise integration apps, specifically logic apps, by adding certificates to your integration account. Certificates are digital documents that check the identities for the participants in electronic communications and help you secure communication in these ways:
+When you need to exchange confidential messages in a logic app business-to-business (B2B) workflow, you can increase the security around this communication by using certificates. A certificate is a digital document that helps secure communication in the following ways:
-* Encrypt message content.
-* Digitally sign messages.
+* Checks the participants' identities in electronic communications.
-You can use these certificates in your enterprise integration apps:
+* Encrypts message content.
-* [Public certificates](https://en.wikipedia.org/wiki/Public_key_certificate),
-which you must purchase from a public internet
-[certificate authority (CA)](https://en.wikipedia.org/wiki/Certificate_authority)
-but don't require any keys.
+* Digitally signs messages.
-* Private certificates or [*self-signed certificates*](https://en.wikipedia.org/wiki/Self-signed_certificate),
-which you create and issue yourself but also require private keys.
+You can use the following certificate types in your workflows:
+* [Public certificates](https://en.wikipedia.org/wiki/Public_key_certificate), which you must purchase from a public internet [certificate authority (CA)](https://en.wikipedia.org/wiki/Certificate_authority). These certificates don't require any keys.
-## Upload a public certificate
+* Private certificates or [*self-signed certificates*](https://en.wikipedia.org/wiki/Self-signed_certificate), which you create and issue yourself. However, these certificates require private keys.
-To use a *public certificate* in logic apps that have B2B capabilities,
-you must first upload the certificate to your integration account.
-After you define the properties in the
-[agreements](logic-apps-enterprise-integration-agreements.md) that you create,
-the certificate is available to help you secure your B2B messages.
+If you're new to logic apps, review [What is Azure Logic Apps?](logic-apps-overview.md) For more information about B2B enterprise integration, review [B2B enterprise integration workflows with Azure Logic Apps and Enterprise Integration Pack](logic-apps-enterprise-integration-overview.md).
-1. Sign in to the [Azure portal](https://portal.azure.com).
-On the main Azure menu, select **All resources**.
-In the search box, enter your integration account name,
-and then select the integration account you want.
+## Prerequisites
- ![Find and select your integration account](media/logic-apps-enterprise-integration-certificates/select-integration-account.png)
+* An Azure account and subscription. If you don't have a subscription yet, [sign up for a free Azure account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-2. Under **Components**, choose the **Certificates** tile.
+* An [integration account resource](logic-apps-enterprise-integration-create-integration-account.md) where you define and store artifacts, such as trading partners, agreements, certificates, and so on, for use in your enterprise integration and B2B workflows. This resource has to meet the following requirements:
- ![Choose "Certificates"](media/logic-apps-enterprise-integration-certificates/add-certificates.png)
+ * Is associated with the same Azure subscription as your logic app resource.
-3. Under **Certificates**, choose **Add**. Under **Add Certificate**,
-provide these details for your certificate. When you're done, choose **OK**.
+ * Exists in the same location or Azure region as your logic app resource.
- | Property | Value | Description |
- |-|-|-|
- | **Name** | <*certificate-name*> | Your certificate's name, which is "publicCert" in this example |
- | **Certificate Type** | Public | Your certificate's type |
- | **Certificate** | <*certificate-file-name*> | To find and select the certificate file you want to upload, choose the folder icon next to the **Certificate** box. |
- ||||
+ * If you use the [**Logic App (Consumption)** resource type](logic-apps-overview.md#resource-type-and-host-environment-differences), you have to [link your integration account to your logic app resource](logic-apps-enterprise-integration-create-integration-account.md#link-account) before you can use your artifacts in your workflow.
- ![Screenshot shows where to select Add to provide certificate details.](media/logic-apps-enterprise-integration-certificates/public-certificate-details.png)
+ To create and add certificates for use in **Logic App (Consumption)** workflows, you don't need a logic app resource yet. However, when you're ready to use those certificates in your workflows, your logic app resource requires a linked integration account that stores those certificates.
- After Azure validates your selection,
- Azure uploads your certificate.
+ * If you're using the [**Logic App (Standard)** resource type](logic-apps-overview.md#resource-type-and-host-environment-differences), your integration account doesn't need a link to your logic app resource but is still required to store other artifacts, such as partners, agreements, and certificates, along with using the [AS2](logic-apps-enterprise-integration-as2.md), [X12](logic-apps-enterprise-integration-x12.md), and [EDIFACT](logic-apps-enterprise-integration-edifact.md) operations. Your integration account still has to meet other requirements, such as using the same Azure subscription and existing in the same location as your logic app resource.
- ![Screenshot that shows where Azure displays the new certificate.](media/logic-apps-enterprise-integration-certificates/new-public-certificate.png)
+ > [!NOTE]
+ > Currently, only the **Logic App (Consumption)** resource type supports [RosettaNet](logic-apps-enterprise-integration-rosettanet.md) operations.
+ > The **Logic App (Standard)** resource type doesn't include [RosettaNet](logic-apps-enterprise-integration-rosettanet.md) operations.
-## Upload a private certificate
+* For private certificates, you must meet the following prerequisites:
-To use a *private certificate* in logic apps that have B2B capabilities,
-you must first upload the certificate to your integration account.
-You also need to have a private key that you first add to
-[Azure Key Vault](../key-vault/general/overview.md).
+ * Add a private key in [Azure Key Vault](../key-vault/general/overview.md) and have the **Key Name**. For more information, review [Add your private key to Azure Key Vault](../key-vault/certificates/certificate-scenarios.md#import-a-certificate).
-After you define the properties in the
-[agreements](logic-apps-enterprise-integration-agreements.md) that you create,
-the certificate is available to help you secure your B2B messages.
+ * Authorize the Azure Logic Apps service to perform operations on your key vault. To grant access to the Azure Logic Apps service principal, use the PowerShell command, [Set-AzKeyVaultAccessPolicy](/powershell/module/az.keyvault/set-azkeyvaultaccesspolicy), for example:
-> [!NOTE]
-> For private certificates, make sure that you add a corresponding
-> public certificate that appears in the
-> [AS2 agreement's](logic-apps-enterprise-integration-as2.md) **Send and Receive** settings
-> for signing and encrypting messages.
+ `Set-AzKeyVaultAccessPolicy -VaultName 'TestcertKeyVault' -ServicePrincipalName '7cd684f4-8a78-49b0-91ec-6a35d38739ba' -PermissionsToKeys decrypt, sign, get, list`
-1. [Add your private key to Azure Key Vault](../key-vault/certificates/certificate-scenarios.md#import-a-certificate)
-and provide a **Key Name**.
-
-2. Authorize Azure Logic Apps to perform operations on Azure Key Vault.
-To grant access to the Logic Apps service principal, use the PowerShell command,
-[Set-AzKeyVaultAccessPolicy](/powershell/module/az.keyvault/set-azkeyvaultaccesspolicy),
-for example:
+ [!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
- `Set-AzKeyVaultAccessPolicy -VaultName 'TestcertKeyVault' -ServicePrincipalName
- '7cd684f4-8a78-49b0-91ec-6a35d38739ba' -PermissionsToKeys decrypt, sign, get, list`
-
-3. Sign in to the [Azure portal](https://portal.azure.com).
-On the main Azure menu, select **All resources**.
-In the search box, enter your integration account name,
-and then select the integration account you want.
+ * [Add a corresponding public certificate](#add-public-certificate) to your key vault. This certificate appears in your [agreement's **Send** and **Receive** settings for signing and encrypting messages](logic-apps-enterprise-integration-agreements.md). For example, review [Reference for AS2 messages settings in Azure Logic Apps](logic-apps-enterprise-integration-as2-message-settings.md).
- ![Find your integration account](media/logic-apps-enterprise-integration-certificates/select-integration-account.png)
+* At least two [trading partners](logic-apps-enterprise-integration-partners.md) and an [agreement between those partners](logic-apps-enterprise-integration-agreements.md) in your integration account. An agreement requires a host partner and a guest partner. Also, an agreement requires that both partners use the same or compatible *business identity* qualifier that's appropriate for an AS2, X12, EDIFACT, or RosettaNet agreement.
-4. Under **Components**, choose the **Certificates** tile.
+* Optionally, the logic app resource and workflow where you want to use the certificate. Your workflow requires a trigger, which can be any trigger that starts the workflow. If you haven't created a logic app workflow before, review [Quickstart: Create your first logic app](quickstart-create-first-logic-app-workflow.md).
- ![Choose the Certificates tile](media/logic-apps-enterprise-integration-certificates/add-certificates.png)
+<a name="add-public-certificate"></a>
-5. Under **Certificates**, choose **Add**. Under **Add Certificate**,
-provide these details for your certificate. When you're done, choose **OK**.
+## Add a public certificate
- | Property | Value | Description |
- |-|-|-|
- | **Name** | <*certificate-name*> | Your certificate's name, which is "privateCert" in this example |
- | **Certificate Type** | Private | Your certificate's type |
- | **Certificate** | <*certificate-file-name*> | To find and select the certificate file you want to upload, choose the folder icon next to the **Certificate** box. When using a key vault for the private key, the uploaded file will be the public certificate. |
- | **Resource Group** | <*integration-account-resource-group*> | Your integration account's resource group, which is "MyResourceGroup" in this example |
- | **Key Vault** | <*key-vault-name*> | Your Azure key vault's name |
- | **Key name** | <*key-name*> | Your key's name |
- ||||
+To use a *public certificate* in your workflow, you have to first add the certificate to your integration account.
- ![Choose "Add", provide certificate details](media/logic-apps-enterprise-integration-certificates/private-certificate-details.png)
+1. In the [Azure portal](https://portal.azure.com) search box, enter `integration accounts`, and select **Integration accounts**.
+
+1. Under **Integration accounts**, select the integration account where you want to add your certificate.
+
+1. On the integration account menu, under **Settings**, select **Certificates**.
+
+1. On the **Certificates** pane, select **Add**.
+
+1. On the **Add Certificate** pane, provide the following information about the certificate:
+
+ | Property | Required | Value | Description |
+ |-|-|-|-|
+ | **Name** | Yes | <*certificate-name*> | Your certificate's name, which is `publicCert` in this example |
+ | **Certificate Type** | Yes | **Public** | Your certificate's type |
+ | **Certificate** | Yes | <*certificate-file-name*> | To browse for the certificate file that you want to add, select the folder icon next to the **Certificate** box. |
+ |||||
+
+ ![Screenshot showing the Azure portal and integration account with "Add" selected and the "Add Certificate" pane with public certificate details.](media/logic-apps-enterprise-integration-certificates/public-certificate-details.png)
+
+1. When you're done, select **OK**.
+
+ After Azure validates your selection, Azure uploads your certificate.
+
+ ![Screenshot showing the Azure portal and integration account with the public certificate in the "Certificates" list.](media/logic-apps-enterprise-integration-certificates/new-public-certificate.png)
+
+<a name="add-private-certificate"></a>
+
+## Add a private certificate
+
+To use a *private certificate* in your workflow, you have to first add the certificate to your integration account. Make sure that you've also met the [prerequisites for private certificates](#prerequisites).
+
+1. In the [Azure portal](https://portal.azure.com) search box, enter `integration accounts`, and select **Integration accounts**.
+
+1. Under **Integration accounts**, select the integration account where you want to add your certificate.
+
+1. On the integration account menu, under **Settings**, select **Certificates**.
+
+1. On the **Certificates** pane, select **Add**.
+
+1. On the **Add Certificate** pane, provide the following information about the certificate:
+
+ | Property | Required | Value | Description |
+ |-|-|-|-|
+ | **Name** | Yes | <*certificate-name*> | Your certificate's name, which is `privateCert` in this example |
+ | **Certificate Type** | Yes | **Private** | Your certificate's type |
+ | **Certificate** | Yes | <*certificate-file-name*> | To browse for the certificate file that you want to add, select the folder icon next to the **Certificate** box. In the key vault that contains your private key, the file you add there is the public certificate. |
+ | **Resource Group** | Yes | <*integration-account-resource-group*> | Your integration account's resource group, which is `Integration-Account-RG` in this example |
+ | **Key Vault** | Yes | <*key-vault-name*> | Your key vault name |
+ | **Key name** | Yes | <*key-name*> | Your key name |
+ |||||
+
+ ![Screenshot showing the Azure portal and integration account with "Add" selected and the "Add Certificate" pane with private certificate details.](media/logic-apps-enterprise-integration-certificates/private-certificate-details.png)
+
+1. When you're done, select **OK**.
After Azure validates your selection, Azure uploads your certificate.
- ![Azure displays new certificate](media/logic-apps-enterprise-integration-certificates/new-private-certificate.png)
+ ![Screenshot showing the Azure portal and integration account with the private certificate in the "Certificates" list.](media/logic-apps-enterprise-integration-certificates/new-private-certificate.png)
## Next steps
-* [Create a B2B agreement](logic-apps-enterprise-integration-agreements.md)
+* [Exchange AS2 messages](logic-apps-enterprise-integration-as2.md)
+* [Exchange EDIFACT messages](logic-apps-enterprise-integration-edifact.md)
+* [Exchange X12 messages](logic-apps-enterprise-integration-x12.md)
+* [Exchange RosettaNet messages](logic-apps-enterprise-integration-rosettanet.md)
logic-apps Manage Logic Apps With Visual Studio https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/logic-apps/manage-logic-apps-with-visual-studio.md
To build logic apps for business-to-business (B2B) enterprise integration scenar
1. In Visual Studio, open the Azure Resource Group project that contains your logic app.
-1. In Solution Explorer, open the **<logic-app-name>.json** file's shortcut menu, and select **Open With Logic App Designer**. (Keyboard: Ctrl + L)
+1. In Solution Explorer, open the **\<logic-app-name\>.json** file's shortcut menu, and select **Open With Logic App Designer**. (Keyboard: Ctrl + L)
![Open logic app's .json file with Logic App Designer](./media/manage-logic-apps-with-visual-studio/open-logic-app-designer.png)
machine-learning Concept Environments https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/concept-environments.md
- Previously updated : 11/16/2020+ Last updated : 09/23/2021 # What are Azure Machine Learning environments?
machine-learning Concept Train Machine Learning Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/concept-train-machine-learning-model.md
Previously updated : 05/13/2020 Last updated : 09/23/2021
machine-learning How To Assign Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-assign-roles.md
If you're an owner of a workspace, you can add and remove roles for the workspac
- [REST API](../role-based-access-control/role-assignments-rest.md) - [Azure Resource Manager templates](../role-based-access-control/role-assignments-template.md)
-If you have installed the [Azure Machine Learning CLI](reference-azure-machine-learning-cli.md), you can use CLI commands to assign roles to users:
-
-```azurecli-interactive
-az ml workspace share -w <workspace_name> -g <resource_group_name> --role <role_name> --user <user_corp_email_address>
-```
-
-The `user` field is the email address of an existing user in the instance of Azure Active Directory where the workspace parent subscription lives. Here is an example of how to use this command:
-
-```azurecli-interactive
-az ml workspace share -w my_workspace -g my_resource_group --role Contributor --user jdoe@contoson.com
-```
-
-> [!NOTE]
-> "az ml workspace share" command does not work for federated account by Azure Active Directory B2B. Please use Azure UI portal instead of command.
- ## Create custom role If the built-in roles are insufficient, you can create custom roles. Custom roles might have read, write, delete, and compute resource permissions in that workspace. You can make the role available at a specific workspace level, a specific resource group level, or a specific subscription level.
To deploy this custom role, use the following Azure CLI command:
az role definition create --role-definition data_scientist_role.json ```
-After deployment, this role becomes available in the specified workspace. Now you can add and assign this role in the Azure portal. Or, you can assign this role to a user by using the `az ml workspace share` CLI command:
-
-```azurecli-interactive
-az ml workspace share -w my_workspace -g my_resource_group --role "Data Scientist Custom" --user jdoe@contoson.com
-```
+After deployment, this role becomes available in the specified workspace. Now you can add and assign this role in the Azure portal.
For more information on custom roles, see [Azure custom roles](../role-based-access-control/custom-roles.md).
machine-learning How To Create Attach Compute Studio https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-create-attach-compute-studio.md
If you created your compute instance or compute cluster with SSH access enabled,
1. Copy the connection string. 1. For Windows, open PowerShell or a command prompt:
- 1. Go into the directory or folder where your key is stored
- 1. Add the -i flag to the connection string to locate the private key and point to where it is stored:
+ 1. Go into the directory or folder where your key is stored
+ 1. Add the -i flag to the connection string to locate the private key and point to where it is stored:
- ```ssh -i <keyname.pem> azureuser@... (rest of connection string)```
+ `ssh -i <keyname.pem> azureuser@... (rest of connection string)`
1. For Linux users, follow the steps from [Create and use an SSH key pair for Linux VMs in Azure](../virtual-machines/linux/mac-create-ssh-keys.md)
machine-learning How To Custom Dns https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-custom-dns.md
If you cannot access the workspace from a virtual machine or jobs fail on comput
Open a command prompt, shell, or PowerShell. Then for each of the workspace FQDNs, run the following command:
- ```nslookup <workspace FQDN>```
+ `nslookup <workspace FQDN>`
The result of each nslookup should return one of the two private IP addresses on the Private Endpoint to the Azure Machine Learning workspace. If it does not, then there is something misconfigured in the custom DNS solution.
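To script this check across all the workspace FQDNs, a small Python sketch can resolve each name and verify that every returned address is private (the helper name is illustrative; `nslookup` remains the canonical check):

```python
import ipaddress
import socket

def resolves_to_private_ip(fqdn: str) -> bool:
    """Resolve an FQDN and report whether every returned address is private."""
    try:
        infos = socket.getaddrinfo(fqdn, None)
    except socket.gaierror:
        return False  # the name didn't resolve at all
    addresses = {info[4][0] for info in infos}
    return all(ipaddress.ip_address(addr).is_private for addr in addresses)
```

If this returns `False` for a workspace FQDN, the custom DNS solution is likely misconfigured, as described above.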
If after running through the above steps you are unable to access the workspace
Open a command prompt, shell, or PowerShell. Then for each of the workspace FQDNs, run the following command:
- ```nslookup <workspace FQDN>```
+ `nslookup <workspace FQDN>`
The result of each nslookup should yield one of the two private IP addresses on the Private Endpoint to the Azure Machine Learning workspace. If it does not, then there is something misconfigured in the custom DNS solution.
machine-learning How To Manage Workspace Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-manage-workspace-cli.md
Previously updated : 04/02/2021 Last updated : 09/23/2021
machine-learning How To Secure Training Vnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-secure-training-vnet.md
When the creation process finishes, you train your model by using the cluster in
For steps on how to create a compute instance deployed in a virtual network, see [Create and manage an Azure Machine Learning compute instance](how-to-create-manage-compute-instance.md).
-To create a no public IP address compute instance (a preview feature) in studio, set **No public IP** checkbox in the virtual network section.
-
-You can also create no public IP compute instance through an ARM template. In the ARM template set enableNodePublicIP parameter to false.
- ### <a name="no-public-ip"></a>No public IP for compute instances (preview) When you enable **No public IP**, your compute instance doesn't use a public IP for communication with any dependencies. Instead, it communicates solely within the virtual network using Azure Private Link ecosystem as well as service/private endpoints, eliminating the need for a public IP entirely. No public IP removes access and discoverability of compute instance node from the internet thus eliminating a significant threat vector. Compute instances will also do packet filtering to reject any traffic from outside virtual network. **No public IP** instances are dependent on [Azure Private Link](how-to-configure-private-link.md) for Azure Machine Learning workspace.
-For outbound connections to work, you need to set up an egress firewall such as Azure firewall with user defined routes. For instance, you can use a firewall set up with [invound/outbound configuration](how-to-access-azureml-behind-firewall.md) and route traffic there by defining a route table on the subnet in which the compute instance is deployed. The route table entry can set up the next hop of the private IP address of the firewall with the address prefix of 0.0.0.0/0.
+For **outbound connections** to work, you need to set up an egress firewall, such as Azure Firewall, with user-defined routes. For instance, you can use a firewall set up with [inbound/outbound configuration](how-to-access-azureml-behind-firewall.md) and route traffic there by defining a route table on the subnet in which the compute instance is deployed. The route table entry can set the next hop to the private IP address of the firewall with the address prefix of 0.0.0.0/0.
-A compute instance with **No public IP** enabled has **no inbound communication requirements** compared to those for public IP compute instance. Specifically, neither inbound NSG rule (`BatchNodeManagement`, `AzureMachineLearning`) is required.
+A compute instance with **No public IP** enabled has **no inbound communication requirements** from the public internet, unlike a compute instance with a public IP. Specifically, neither inbound NSG rule (`BatchNodeManagement`, `AzureMachineLearning`) is required.
A compute instance with **No public IP** also requires you to disable private endpoint network policies and private link service network policies. Follow instruction from [Disable network policies for Private Link service source IP](../private-link/disable-private-link-service-network-policy.md) to set the parameters `disable-private-endpoint-network-policies` and `disable-private-link-service-network-policies` on the virtual network subnet.
+To create a no public IP address compute instance (a preview feature) in studio, select the **No public IP** checkbox in the virtual network section.
You can also create a no public IP compute instance through an ARM template. In the ARM template, set the `enableNodePublicIP` parameter to `false`.
+ [!INCLUDE [no-public-ip-info](../../includes/machine-learning-no-public-ip-availibility.md)] ## Inbound traffic
machine-learning How To Troubleshoot Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-troubleshoot-online-endpoints.md
+
+ Title: Troubleshooting managed online endpoints deployment (preview)
+
+description: Learn how to troubleshoot some common deployment and scoring errors with Managed Online Endpoints.
++++++ Last updated : 05/13/2021++
+#Customer intent: As a data scientist, I want to figure out why my managed online endpoint deployment failed so that I can fix it.
++
+# Troubleshooting managed online endpoints deployment and scoring (preview)
+
+Learn how to resolve common issues in the deployment and scoring of Azure Machine Learning managed online endpoints (preview).
+
+This document is structured in the way you should approach troubleshooting:
+
+1. Use [local deployment](#deploy-locally) to test and debug your models locally before deploying in the cloud.
+1. Use [container logs](#get-container-logs) to help debug issues.
+1. Understand [common deployment errors](#common-deployment-errors) that might arise and how to fix them.
+
+The section [HTTP status codes](#http-status-codes) explains how invocation and prediction errors map to HTTP status codes when scoring endpoints with REST requests.
++
+## Prerequisites
+
+* An **Azure subscription**. Try the [free or paid version of Azure Machine Learning](https://azure.microsoft.com/free/).
+* The [Azure CLI](/cli/azure/install-azure-cli).
+* The Azure Machine Learning CLI (v2) extension. See [Install, set up, and use the CLI (v2) (preview)](how-to-configure-cli.md).
+
+## Deploy locally
+
+Local deployment deploys a model to a local Docker environment, which is useful for testing and debugging before deploying to the cloud.
+
+Local deployment supports creation, update, and deletion of a local endpoint. It also allows you to invoke and get logs from the endpoint. To use local deployment, add `--local` to the appropriate CLI command:
+
+```azurecli
+az ml endpoint create -n <endpoint-name> -f <spec_file.yaml> --local
+```
+As part of local deployment, the following steps take place:
+
+- Docker either builds a new container image or pulls an existing image from the local Docker cache. An existing image is used if there's one that matches the environment part of the specification file.
+- Docker starts a new container with mounted local artifacts such as model and code files.
+
+For more, see [Deploy locally in Deploy and score a machine learning model with a managed online endpoint (preview)](how-to-deploy-managed-online-endpoints.md#deploy-and-debug-locally-by-using-local-endpoints).
+
+## Get container logs
+
+You can't get direct access to the VM where the model is deployed. However, you can get logs from some of the containers that are running on the VM. The amount of information depends on the provisioning status of the deployment. If the specified container is up and running, you'll see its console output; otherwise, you'll get a message to try again later.
+
+To see log output from a container, use the following CLI command:
+
+```azurecli
+az ml endpoint get-logs -n <endpoint-name> -d <deployment-name> -l 100
+```
+
+or
+
+```azurecli
+ az ml endpoint get-logs --name <endpoint-name> --deployment <deployment-name> --lines 100
+```
+
+Add `--resource-group` and `--workspace-name` to the commands above if you have not already set these parameters via `az configure`.
+
+To see information about how to set these parameters, and if current values are already set, run:
+
+```azurecli
+az ml endpoint get-logs -h
+```
+
+By default, the logs are pulled from the inference server. Logs include the console log from the inference server, which contains print/log statements from your `score.py` code.
+
+> [!NOTE]
+> If you use Python logging, ensure you use the correct logging level order for the messages to be published to logs. For example, INFO.
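
For example, a `score.py` can set the logging level explicitly so that INFO messages reach the container console log (a minimal sketch; the handler setup and message text are illustrative, not part of the endpoint contract):

```python
import logging
import sys

# Send log records to stdout so they appear in the container console log.
# Without an explicit level, Python's root logger defaults to WARNING, and
# INFO messages are silently dropped.
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
logger = logging.getLogger(__name__)

def init():
    logger.info("init() called: model loading would happen here")

def run(raw_data):
    logger.info("run() called with %d characters of input", len(raw_data))
    return {"echo": raw_data}
```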
+
+You can also get logs from the storage initializer container by passing `--container storage-initializer`. These logs contain information on whether code and model data were successfully downloaded to the container.
+
+Add `--help` and/or `--debug` to commands to see more information. Include the `x-ms-client-request-id` header to help with troubleshooting.
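
One way to attach that header from Python (a sketch using only the standard library; the endpoint URL and bearer token are placeholders you must replace with your own values):

```python
import urllib.request
import uuid

# Generate a client-side correlation ID. Quoting the same ID in a support
# ticket makes it possible to trace this exact call server-side.
request_id = str(uuid.uuid4())

req = urllib.request.Request(
    url="https://my-endpoint.westus2.inference.ml.azure.com/score",  # placeholder
    data=b'{"data": [[1, 2, 3]]}',
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",  # placeholder
        "x-ms-client-request-id": request_id,
    },
    method="POST",
)
# req is now ready to send with urllib.request.urlopen(req).
```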
+
+## Common deployment errors
+
+Below is a list of common deployment errors that are reported as part of the deployment operation status.
+
+### ERR_1100: Not enough quota
+
+Before deploying a model, you need enough compute quota. This quota defines how many virtual cores are available per subscription, per workspace, per SKU, and per region. Each deployment subtracts from the available quota and adds it back after deletion, based on the SKU type.
+
+A possible mitigation is to check if there are unused deployments that can be deleted. Or you can submit a [request for a quota increase](./how-to-manage-quotas.md).
+
+### ERR_1101: Out of capacity
+
+The specified VM size failed to provision because of a lack of Azure Machine Learning capacity. Retry later, or try deploying to a different region.
+
+### ERR_1102: No more role assignments
+
+Delete some unused role assignments in this subscription. You can check all role assignments in the Azure portal in the Access Control menu.
+
+### ERR_1103: Endpoint quota reached
+
+Delete some unused endpoints in this subscription.
+
+### ERR_1200: Unable to download user container image
+
+During deployment creation, after compute provisioning, Azure tries to pull the user container image from the workspace's private Azure Container Registry (ACR). There are two possible issues.
+
+- The user container image isn't found.
+
+  Make sure the container image is available in the workspace ACR.
+  For example, if the image is `testacr.azurecr.io/azureml/azureml_92a029f831ce58d2ed011c3c42d35acb:latest`, check the repository with:
+  `az acr repository show-tags -n testacr --repository azureml/azureml_92a029f831ce58d2ed011c3c42d35acb --orderby time_desc --output table`
+
+- There's a permission issue accessing ACR.
+
+ To pull the image, Azure uses [managed identities](../active-directory/managed-identities-azure-resources/overview.md) to access ACR.
+
+ - If you created the associated endpoint with SystemAssigned, then Azure role-based access control (RBAC) permission is automatically granted, and no further permissions are needed.
+ - If you created the associated endpoint with UserAssigned, then the user's managed identity must have AcrPull permission for the workspace ACR.
+
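For the user-assigned case, the AcrPull grant can be made with a role assignment along these lines (a sketch using the standard `az role assignment create` command; the principal ID and scope are placeholders you must fill in from your own subscription):

```azurecli
# Give the user-assigned managed identity pull access to the workspace ACR.
az role assignment create \
  --assignee <user-assigned-identity-principal-id> \
  --role AcrPull \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.ContainerRegistry/registries/<acr-name>
```
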
+To get more details about this error, run:
+
+```azurecli
+az ml endpoint get-logs -n <endpoint-name> --deployment <deployment-name> --lines 100
+```
+
+### ERR_1300: Unable to download user model/code artifacts
+
+After provisioning the compute resource, during deployment creation, Azure tries to mount the user model and code artifacts into the user container from the workspace storage account.
+
+- The user model/code artifacts aren't found.
+
+ - Make sure model and code artifacts are registered to the same workspace as the deployment. Use the `show` command to show details for a model or code artifact in a workspace. For example:
+
+ ```azurecli
+ az ml model show --name <model-name>
+ az ml code show --name <code-name> --version <version>
+ ```
+
+ - You can also check if the blobs are present in the workspace storage account.
+
+ For example, if the blob is `https://foobar.blob.core.windows.net/210212154504-1517266419/WebUpload/210212154504-1517266419/GaussianNB.pkl` you can use this command to check if it exists: `az storage blob exists --account-name foobar --container-name 210212154504-1517266419 --name WebUpload/210212154504-1517266419/GaussianNB.pkl --subscription <sub-name>`
+
+- There's a permission issue accessing the workspace storage account.
+
+ To pull blobs, Azure uses [managed identities](../active-directory/managed-identities-azure-resources/overview.md) to access the storage account.
+
+ - If you created the associated endpoint with SystemAssigned, Azure role-based access control (RBAC) permission is automatically granted, and no further permissions are needed.
+
+  - If you created the associated endpoint with UserAssigned, the user's managed identity must have the Storage Blob Data Reader permission on the workspace storage account.
+
+To get more details about this error, run:
+
+```azurecli
+az ml endpoint get-logs -n <endpoint-name> --deployment <deployment-name> --lines 100
+```
+
+### ERR_1350: Unable to download user model, not enough space on the disk
+
+This issue happens when the size of the model is larger than the available disk space. Try a SKU with more disk space.
+
+### ERR_2100: Unable to start user container
+
+To run the `score.py` provided as part of the deployment, Azure creates a container that includes all the resources that `score.py` needs, and runs the scoring script in that container.
+
+This error means that the container couldn't start, so scoring couldn't happen. The container might be requesting more resources than the `instance_type` can support. If so, consider updating the `instance_type` of the online deployment.
+
+To get the exact reason for an error, run:
+
+```azurecli
+az ml endpoint get-logs -n <endpoint-name> -d <deployment-name>
+```
+
+### ERR_2101: Kubernetes unschedulable
+
+The requested CPU or memory can't be satisfied by the cluster. Adjust your resource request, or scale the cluster.
+
+### ERR_2200: User container has crashed\terminated
+
+To run the `score.py` provided as part of the deployment, Azure creates a container that includes all the resources that `score.py` needs, and runs the scoring script in that container. The error in this scenario is that the container crashes while running, so scoring can't happen. This error happens when:
+
+- There's an error in `score.py`. Use `get-logs` to help diagnose common problems:
+ - A package that was imported but is not in the conda environment
+ - A syntax error
+ - A failure in the `init()` method
+- Readiness or liveness probes are not set up correctly.
+- There's an error in the environment setup of the container, such as a missing dependency.
+
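To make `init()` and `run()` failures easier to diagnose from `get-logs` output, the scoring script can log the full traceback before re-raising (a minimal sketch; the stand-in model and input format are illustrative assumptions, not a prescribed contract):

```python
import json
import logging
import traceback

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

model = None

def init():
    global model
    try:
        # Real code would load the registered model here (for example, with joblib).
        model = lambda rows: [sum(r) for r in rows]  # stand-in "model"
    except Exception:
        # Log the full traceback so it's visible in the container logs,
        # then re-raise so the deployment reports the failure.
        logger.error("init() failed:\n%s", traceback.format_exc())
        raise

def run(raw_data):
    try:
        rows = json.loads(raw_data)["data"]
        return json.dumps({"result": model(rows)})
    except Exception:
        logger.error("run() failed:\n%s", traceback.format_exc())
        raise
```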
+### ERR_5000: Internal error
+
+While we do our best to provide a stable and reliable service, sometimes things don't go according to plan. If you get this error, it means something isn't right on our side and we need to fix it. Submit a [customer support ticket](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest) with all related information and we'll address the issue.
+
+## HTTP status codes
+
+When you access managed online endpoints with REST requests, the returned status codes adhere to the standards for [HTTP status codes](https://aka.ms/http-status-codes). Below are details about how managed endpoint invocation and prediction errors map to HTTP status codes.
+
+| Status code| Reason phrase | Why this code might get returned |
+| | | |
+| 200 | OK | Your model executed successfully, within your latency bound. |
+| 401 | Unauthorized | You don't have permission to do the requested action, such as score, or your token is expired. |
+| 404 | Not found | Your URL isn't correct. |
+| 408 | Request timeout | The model execution took longer than the timeout supplied in `request_timeout_ms` under `request_settings` of your model deployment config.|
+| 413 | Payload too large | Your request payload is larger than 1.5 megabytes. |
+| 424 | Model Error; original-code=`<original code>` | If your model container returns a non-200 response, Azure returns a 424. |
+| 424 | Response payload too large | If your container returns a payload larger than 1.5 megabytes, Azure returns a 424. |
+| 429 | Rate-limiting | You attempted to send more than 100 requests per second to your endpoint. |
+| 429 | Too many pending requests | Your model is getting more requests than it can handle. We allow 2*`max_concurrent_requests_per_instance`*`instance_count` requests at any time. Additional requests are rejected. You can confirm these settings in your model deployment config under `request_settings` and `scale_settings`. If you are using auto-scaling, your model is getting requests faster than the system can scale up. With auto-scaling, you can try to resend requests with [exponential backoff](https://aka.ms/exponential-backoff). Doing so can give the system time to adjust. |
+| 500 | Internal server error | Azure ML-provisioned infrastructure is failing. |
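
The exponential backoff suggested for the 429 cases can be sketched as follows (`send_request` is a hypothetical stand-in for your HTTP call; the delays and retry count are illustrative):

```python
import time

def invoke_with_backoff(send_request, max_retries=5, base_delay=0.5):
    """Retry a callable that returns (status_code, body), backing off
    exponentially on 429 (rate-limiting / too many pending requests)."""
    delay = base_delay
    for _ in range(max_retries):
        status, body = send_request()
        if status != 429:
            return status, body
        time.sleep(delay)  # give the system time to scale or drain the queue
        delay *= 2         # exponential backoff: 0.5s, 1s, 2s, ...
    return status, body
```

Adding a small random jitter to each delay further reduces the chance that many clients retry in lockstep.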
+
+## Next steps
+
+- [Deploy and score a machine learning model with a managed online endpoint (preview)](how-to-deploy-managed-online-endpoints.md)
+- [Safe rollout for online endpoints (preview)](how-to-safely-rollout-managed-endpoints.md)
+- [Managed online endpoints (preview) YAML reference](reference-yaml-endpoint-managed-online.md)
+
machine-learning Overview What Is Azure Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/overview-what-is-azure-machine-learning.md
If you're an ML Studio (classic) user, [learn about Studio (classic) deprecation
## Enterprise-readiness and security
-Azure Machine Learning integrates with te Azure cloud platform to add security to ML projects.
+Azure Machine Learning integrates with the Azure cloud platform to add security to ML projects.
Security integrations include:
machine-learning Tutorial Train Models With Aml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/tutorial-train-models-with-aml.md
joblib.dump(value=clf, filename='outputs/sklearn_mnist_model.pkl')
Notice how the script gets data and saves models:
-+ The training script reads an argument to find the directory that contains the data. When you submit the job later, you point to the datastore for this argument:
-```parser.add_argument('--data-folder', type=str, dest='data_folder', help='data directory mounting point')```
+- The training script reads an argument to find the directory that contains the data. When you submit the job later, you point to the datastore for this argument:
-+ The training script saves your model into a directory named **outputs**. Anything written in this directory is automatically uploaded into your workspace. You access your model from this directory later in the tutorial. `joblib.dump(value=clf, filename='outputs/sklearn_mnist_model.pkl')`
+ `parser.add_argument('--data-folder', type=str, dest='data_folder', help='data directory mounting point')`
-+ The training script requires the file `utils.py` to load the dataset correctly. The following code copies `utils.py` into `script_folder` so that the file can be accessed along with the training script on the remote resource.
+- The training script saves your model into a directory named **outputs**. Anything written in this directory is automatically uploaded into your workspace. You access your model from this directory later in the tutorial. `joblib.dump(value=clf, filename='outputs/sklearn_mnist_model.pkl')`
+
+- The training script requires the file `utils.py` to load the dataset correctly. The following code copies `utils.py` into `script_folder` so that the file can be accessed along with the training script on the remote resource.
```python import shutil
managed-instance-apache-cassandra Manage Resources Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/managed-instance-apache-cassandra/manage-resources-cli.md
Title: Manage Azure Managed Instance for Apache Cassandra resources using Azure CLI
+ Title: Manage resources with Azure CLI - Azure Resource Manager | Microsoft Docs
description: Learn about the common commands to automate the management of your Azure Managed Instance for Apache Cassandra using Azure CLI. Previously updated : 03/15/2021 Last updated : 09/17/2021 -+
+keywords: azure resource manager cli
# Manage Azure Managed Instance for Apache Cassandra resources using Azure CLI (Preview)
mariadb Concepts Certificate Rotation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/mariadb/concepts-certificate-rotation.md
Last updated 01/18/2021
Azure Database for MariaDB successfully completed the root certificate change on **February 15, 2021 (02/15/2021)** as part of standard maintenance and security best practices. This article gives you more details about the changes, the resources affected, and the steps needed to ensure that your application maintains connectivity to your database server. > [!NOTE]
-> This article contains references to the term _slave_, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
> ## Why root certificate update is required?
There is no change required on the client side. If you followed our previous recomme
Then replace the original keystore file with the new generated one:
- - System.setProperty("javax.net.ssl.trustStore","path_to_truststore_file");
- - System.setProperty("javax.net.ssl.trustStorePassword","password");
+ - `System.setProperty("javax.net.ssl.trustStore","path_to_truststore_file");`
+ - `System.setProperty("javax.net.ssl.trustStorePassword","password");`
- For .NET (MariaDB Connector/NET, MariaDBConnector) users, make sure **BaltimoreCyberTrustRoot** and **DigiCertGlobalRootG2** both exist in Windows Certificate Store, Trusted Root Certification Authorities. If any certificates don't exist, import the missing certificate.
mariadb Concepts Read Replicas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/mariadb/concepts-read-replicas.md
Replicas are new servers that you manage similar to regular Azure Database for M
To learn more about GTID replication, see the [MariaDB replication documentation](https://mariadb.com/kb/en/library/gtid/). > [!NOTE]
-> This article contains references to the term _slave_, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
## When to use a read replica
mariadb Howto Create Users https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/mariadb/howto-create-users.md
This article describes how you can create users in Azure Database for MariaDB.
When you first created your Azure Database for MariaDB, you provided a server admin login user name and password. For more information, you can follow the [Quickstart](quickstart-create-mariadb-server-database-using-azure-portal.md). You can locate your server admin login user name from the Azure portal. > [!NOTE]
-> This article contains references to the term _slave_, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
The server admin user gets certain privileges for your server as listed: SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, RELOAD, PROCESS, REFERENCES, INDEX, ALTER, SHOW DATABASES, CREATE TEMPORARY TABLES, LOCK TABLES, EXECUTE, REPLICATION SLAVE, REPLICATION CLIENT, CREATE VIEW, SHOW VIEW, CREATE ROUTINE, ALTER ROUTINE, CREATE USER, EVENT, TRIGGER
mariadb Howto Data In Replication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/mariadb/howto-data-in-replication.md
Review the [limitations and requirements](concepts-data-in-replication.md#limita
> If your source server is version 10.2 or newer, we recommend that you set up Data-in Replication by using [Global Transaction ID](https://mariadb.com/kb/en/library/gtid/). > [!NOTE]
-> This article contains references to the term _slave_, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
+> This article contains references to the term *slave*, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this article.
## Create a MariaDB server to use as a replica
mariadb