Updates from: 09/19/2023 01:17:14
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory Application Provisioning Quarantine Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/application-provisioning-quarantine-status.md
Previously updated : 10/06/2022 Last updated : 09/15/2023
While in quarantine:
There are three ways to check whether an application is in quarantine:
-- In the Azure portal, navigate to **Azure Active Directory** > **Enterprise applications** > <*application name*> > **Provisioning** and review the progress bar for a quarantine message.
+- In the Microsoft Entra admin center, navigate to **Identity** > **Applications** > **Enterprise applications** > <*application name*> > **Provisioning** and review the progress bar for a quarantine message.
![Provisioning status bar showing quarantine status](./media/application-provisioning-quarantine-status/progress-bar-quarantined.png)
-- In the Azure portal, navigate to **Azure Active Directory** > **Audit Logs** > filter on **Activity: Quarantine** and review the quarantine history. The view in the progress bar as described above shows whether provisioning is currently in quarantine. The audit logs show the quarantine history for an application.
+- In the Microsoft Entra admin center, navigate to **Identity** > **Monitoring & health** > **Audit Logs** > filter on **Activity: Quarantine** and review the quarantine history. The view in the progress bar as described above shows whether provisioning is currently in quarantine. The audit logs show the quarantine history for an application.
- Use the Microsoft Graph request [Get synchronizationJob](/graph/api/synchronization-synchronizationjob-get?tabs=http&view=graph-rest-beta&preserve-view=true) to programmatically get the status of the provisioning job:
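A minimal sketch of that request, with placeholder values for the service principal's object ID and the provisioning job ID:

```http
GET https://graph.microsoft.com/beta/servicePrincipals/{servicePrincipalId}/synchronization/jobs/{jobId}
Authorization: Bearer {token}
```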
Below are the common reasons your application may go into quarantine
|||
|**SCIM Compliance issue:** An HTTP/404 Not Found response was returned rather than the expected HTTP/200 OK response. In this case, the Azure AD provisioning service has made a request to the target application and received an unexpected response.|Check the admin credentials section. See if the application requires specifying the tenant URL and that the URL is correct. If you don't see an issue, contact the application developer to ensure that their service is SCIM-compliant. https://tools.ietf.org/html/rfc7644#section-3.4.2 |
|**Invalid credentials:** When attempting to authorize access to the target application, we received a response from the target application that indicates the credentials provided are invalid.|Navigate to the admin credentials section of the provisioning configuration UI and authorize access again with valid credentials. If the application is in the gallery, review the application configuration tutorial for any more required steps.|
-|**Duplicate roles:** Roles imported from certain applications like Salesforce and Zendesk must be unique. |Navigate to the application [manifest](../develop/reference-app-manifest.md) in the Azure portal and remove the duplicate role.|
+|**Duplicate roles:** Roles imported from certain applications like Salesforce and Zendesk must be unique. |Navigate to the application [manifest](../develop/reference-app-manifest.md) in the Microsoft Entra admin center and remove the duplicate role.|
A Microsoft Graph request to get the status of the provisioning job shows the following reason for quarantine:
- `EncounteredQuarantineException` indicates that invalid credentials were provided. The provisioning service is unable to establish a connection between the source system and the target system.
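In the job object that **Get synchronizationJob** returns, the quarantine details appear under `status.quarantine`; a trimmed, illustrative fragment of such a response (all values are placeholders) might look like:

```json
{
  "status": {
    "quarantine": {
      "currentBegan": "2023-09-15T19:00:00Z",
      "reason": "EncounteredQuarantineException",
      "seriesBegan": "2023-09-15T19:00:00Z",
      "seriesCount": 1
    }
  }
}
```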
First, resolve the issue that caused the application to be placed in quarantine.
After you've resolved the issue, restart the provisioning job. Certain changes to the application's provisioning settings, such as attribute mappings or scoping filters, will automatically restart provisioning for you. The progress bar on the application's **Provisioning** page indicates when provisioning last started. If you need to restart the provisioning job manually, use one of the following methods:
-- Use the Azure portal to restart the provisioning job. On the application's **Provisioning** page, select **Restart provisioning**. This action fully restarts the provisioning service, which can take some time. A full initial cycle will run again, which clears escrows, removes the app from quarantine, and clears any watermarks. The service will then evaluate all the users in the source system again and determine if they are in scope for provisioning. This can be useful when your application is currently in quarantine, as this article discusses, or you need to make a change to your attribute mappings. Note that the initial cycle takes longer to complete than the typical incremental cycle due to the number of objects that need to be evaluated. You can learn more about the performance of initial and incremental cycles [here](application-provisioning-when-will-provisioning-finish-specific-user.md).
+- Use the Microsoft Entra admin center to restart the provisioning job. On the application's **Provisioning** page, select **Restart provisioning**. This action fully restarts the provisioning service, which can take some time. A full initial cycle will run again, which clears escrows, removes the app from quarantine, and clears any watermarks. The service will then evaluate all the users in the source system again and determine if they are in scope for provisioning. This can be useful when your application is currently in quarantine, as this article discusses, or you need to make a change to your attribute mappings. Note that the initial cycle takes longer to complete than the typical incremental cycle due to the number of objects that need to be evaluated. You can learn more about the performance of initial and incremental cycles [here](application-provisioning-when-will-provisioning-finish-specific-user.md).
- Use Microsoft Graph to [restart the provisioning job](/graph/api/synchronization-synchronizationjob-restart?tabs=http&view=graph-rest-beta&preserve-view=true). You'll have full control over what you restart. You can choose to clear escrows (to restart the escrow counter that accrues toward quarantine status), clear quarantine (to remove the application from quarantine), or clear watermarks. Use the following request:
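A sketch of that restart call, assuming you want to clear escrows and lift the quarantine; `resetScope` also accepts other values, such as `Watermark` and `Full`:

```http
POST https://graph.microsoft.com/beta/servicePrincipals/{servicePrincipalId}/synchronization/jobs/{jobId}/restart
Authorization: Bearer {token}
Content-Type: application/json

{
    "criteria": {
        "resetScope": "Escrows, QuarantineState"
    }
}
```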
active-directory Application Provisioning When Will Provisioning Finish Specific User https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user.md
Previously updated : 06/29/2023 Last updated : 09/15/2023
When you first configure automatic provisioning, the **Current Status** section
After a provisioning cycle is complete, the **Statistics to date** section shows the cumulative numbers of users and groups that have been provisioned to date, along with the completion date and duration of the last cycle. The **Activity ID** uniquely identifies the most recent provisioning cycle. The **Job ID** is a unique identifier for the provisioning job, and is specific to the app in your tenant.
-The provisioning progress is viewed in the Azure portal at **Azure Active Directory > Enterprise Apps > \[application name\] > Provisioning**.
+The provisioning progress is viewed in the Microsoft Entra admin center at **Identity** > **Applications** > **Enterprise applications** > \[*application name*\] > **Provisioning**.
![Provisioning page progress bar](./media/application-provisioning-when-will-provisioning-finish-specific-user/provisioning-progress-bar-section.png)
The provisioning progress is viewed in the Azure portal at **Azure Active Direct
To see the provisioning status for a selected user, consult the [Provisioning logs (preview)](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context) in Azure AD. All operations run by the user provisioning service are recorded in the Azure AD provisioning logs. The logs include read and write operations made to the source and target systems. Associated user data related to read and write operations is also logged.
-You can access the provisioning logs in the Azure portal by selecting **Azure Active Directory** > **Enterprise Apps** > **Provisioning logs (preview)** in the **Activity** section. You can search the provisioning data based on the name of the user or the identifier in either the source system or the target system. For details, see [Provisioning logs (preview)](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context).
+You can access the provisioning logs in the Microsoft Entra admin center by selecting **Identity** > **Applications** > **Enterprise applications** > **Provisioning logs** in the **Activity** section. You can search the provisioning data based on the name of the user or the identifier in either the source system or the target system. For details, see [Provisioning logs (preview)](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context).
The provisioning logs record all the operations performed by the provisioning service, including:
The provisioning logs record all the operations performed by the provisioning se
* Comparing the user objects between the systems
* Adding, updating, or disabling the user account in the target system based on the comparison
-For more information on how to read the provisioning logs in the Azure portal, see [provisioning reporting guide](check-status-user-account-provisioning.md).
+For more information on how to read the provisioning logs in the Microsoft Entra admin center, see [provisioning reporting guide](check-status-user-account-provisioning.md).
## How long will it take to provision users?
When you're using automatic user provisioning with an application, there are some things to keep in mind. First, Azure AD automatically provisions and updates user accounts in an app based on things like [user and group assignment](../manage-apps/assign-user-or-group-access-portal.md). The sync happens at a regularly scheduled time interval, typically every 40 minutes.
active-directory Check Status User Account Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/check-status-user-account-provisioning.md
Previously updated : 05/05/2023 Last updated : 09/15/2023
This article describes how to check the status of provisioning jobs after they h
## Overview
-Provisioning connectors are set up and configured using the [Azure portal](https://portal.azure.com), by following the [provided documentation](../saas-apps/tutorial-list.md) for the supported application. When the connector is configured and running, provisioning jobs can be reported using the following methods:
+Provisioning connectors are set up and configured using the [Microsoft Entra admin center](https://entra.microsoft.com), by following the [provided documentation](../saas-apps/tutorial-list.md) for the supported application. When the connector is configured and running, provisioning jobs can be reported using the following methods:
-- The [Azure portal](https://portal.azure.com)
+- The [Microsoft Entra admin center](https://entra.microsoft.com)
- Streaming the provisioning logs into [Azure Monitor](../app-provisioning/application-provisioning-log-analytics.md). This method allows for extended data retention and building custom dashboards, alerts, and queries.
This article uses the following terms:
* **Source System** - The repository of users that the Azure AD provisioning service synchronizes from. Azure Active Directory is the source system for most preintegrated provisioning connectors; however, there are some exceptions (example: Workday Inbound Synchronization).
* **Target System** - The repository of users that the Azure AD provisioning service synchronizes to. The repository is typically a SaaS application, such as Salesforce, ServiceNow, G Suite, and Dropbox for Business. In some cases, the repository can be an on-premises system such as Active Directory, as with Workday Inbound Synchronization to Active Directory.
-## Getting provisioning reports from the Azure portal
+## Getting provisioning reports from the Microsoft Entra admin center
-To get provisioning report information for a given application, start by launching the [Azure portal](https://portal.azure.com) and **Azure Active Directory** > **Enterprise Apps** > **Provisioning logs** in the **Activity** section. You can also browse to the Enterprise Application for which provisioning is configured. For example, if you're provisioning users to LinkedIn Elevate, the navigation path to the application details is:
+To get provisioning report information for a given application:
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
+1. Browse to **Identity** > **Applications** > **Enterprise applications**.
+1. Select **Provisioning logs** in the **Activity** section. You can also browse to the Enterprise Application for which provisioning is configured. For example, if you're provisioning users to LinkedIn Elevate, the navigation path to the application details is:
-**Azure Active Directory > Enterprise Applications > All applications > LinkedIn Elevate**
+**Identity** > **Applications** > **Enterprise applications** > **All applications** > **LinkedIn Elevate**
From the **All applications** area, you can access both the provisioning progress bar and the provisioning logs.
The **Current Status** should be the first place admins look to check on the ope
## Provisioning logs
-All activities performed by the provisioning service are recorded in the Azure AD [provisioning logs](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context). You can access the provisioning logs in the Azure portal by selecting **Azure Active Directory** > **Enterprise Apps** > **Provisioning logs ** in the **Activity** section. You can search the provisioning data based on the name of the user or the identifier in either the source system or the target system. For details, see [Provisioning logs](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context).
+All activities performed by the provisioning service are recorded in the Azure AD [provisioning logs](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context). You can access the provisioning logs in the Microsoft Entra admin center. You can search the provisioning data based on the name of the user or the identifier in either the source system or the target system. For details, see [Provisioning logs](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context).
## Troubleshooting
active-directory Configure Automatic User Provisioning Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/configure-automatic-user-provisioning-portal.md
Previously updated : 05/02/2023 Last updated : 09/15/2023
-# Managing user account provisioning for enterprise apps in the Azure portal
+# Managing user account provisioning for enterprise apps in the Microsoft Entra admin center
This article describes the general steps for managing automatic user account provisioning and deprovisioning for applications that support it. *User account provisioning* is the act of creating, updating, and/or disabling user account records in an application's local user profile store. Most cloud and SaaS applications store the role and permissions in the user's own local user profile store. The presence of such a user record in the user's local store is *required* for single sign-on and access to work. To learn more about automatic user account provisioning, see [Automate User Provisioning and Deprovisioning to SaaS Applications with Azure Active Directory](user-provisioning.md).
This article describes the general steps for managing automatic user account pro
[!INCLUDE [portal updates](~/articles/active-directory/includes/portal-update.md)]
-Use the Azure portal to view and manage all applications that are configured for single sign-on in a directory. Enterprise apps are apps that are deployed and used within your organization. Follow these steps to view and manage your enterprise applications:
+Use the Microsoft Entra admin center to view and manage all applications that are configured for single sign-on in a directory. Enterprise apps are apps that are deployed and used within your organization. Follow these steps to view and manage your enterprise applications:
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Browse to **Azure Active Directory** > **Enterprise applications**.
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
+1. Browse to **Identity** > **Applications** > **Enterprise applications**.
1. A list of all configured apps is shown, including apps that were added from the gallery.
1. Select any app to load its resource pane, where you can view reports and manage app settings.
1. Select **Provisioning** to manage user account provisioning settings for the selected app.
active-directory Customize Application Attributes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/customize-application-attributes.md
Previously updated : 03/29/2023 Last updated : 09/15/2023
# Tutorial - Customize user provisioning attribute-mappings for SaaS applications in Azure Active Directory
-Microsoft Azure AD provides support for user provisioning to third-party SaaS applications such as Salesforce, G Suite and others. If you enable user provisioning for a third-party SaaS application, the Azure portal controls its attribute values through attribute-mappings.
+Microsoft Azure AD provides support for user provisioning to third-party SaaS applications such as Salesforce, G Suite and others. If you enable user provisioning for a third-party SaaS application, the Microsoft Entra admin center controls its attribute values through attribute-mappings.
Before you get started, make sure you're familiar with app management and **single sign-on (SSO)** concepts. Check out the following links:
- [Quickstart Series on App Management in Azure AD](../manage-apps/view-applications-portal.md)
You can customize the default attribute-mappings according to your business need
Follow these steps to access the **Mappings** feature of user provisioning:
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Browse to **Azure Active Directory** > **Enterprise applications**.
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
+1. Browse to **Identity** > **Applications** > **Enterprise applications**.
1. A list of all configured apps is shown, including apps that were added from the gallery.
1. Select any app to load its app management pane, where you can view reports and manage app settings.
1. Select **Provisioning** to manage user account provisioning settings for the selected app.
The attributes provisioned as part of Group objects can be customized in the sam
The user attributes supported for a given application are preconfigured. Most applications' user management APIs don't support schema discovery. So, the Azure AD provisioning service isn't able to dynamically generate the list of supported attributes by making calls to the application.
-However, some applications support custom attributes, and the Azure AD provisioning service can read and write to custom attributes. To enter their definitions into the Azure portal, select the **Show advanced options** check box at the bottom of the **Attribute Mapping** screen, and then select **Edit attribute list for** your app.
+However, some applications support custom attributes, and the Azure AD provisioning service can read and write to custom attributes. To enter their definitions into the Microsoft Entra admin center, select the **Show advanced options** check box at the bottom of the **Attribute Mapping** screen, and then select **Edit attribute list for** your app.
Applications and systems that support customization of the attribute list include:
Applications and systems that support customization of the attribute list includ
> [!NOTE]
-> Editing the list of supported attributes is only recommended for administrators who have customized the schema of their applications and systems, and have first-hand knowledge of how their custom attributes have been defined or if a source attribute isn't automatically displayed in the Azure portal UI. This sometimes requires familiarity with the APIs and developer tools provided by an application or system. The ability to edit the list of supported attributes is locked down by default, but customers can enable the capability by navigating to the following URL: https://portal.azure.com/?Microsoft_AAD_Connect_Provisioning_forceSchemaEditorEnabled=true . You can then navigate to your application to view the [attribute list](#editing-the-list-of-supported-attributes).
+> Editing the list of supported attributes is only recommended for administrators who have customized the schema of their applications and systems, and have first-hand knowledge of how their custom attributes have been defined or if a source attribute isn't automatically displayed in the Microsoft Entra admin center UI. This sometimes requires familiarity with the APIs and developer tools provided by an application or system. The ability to edit the list of supported attributes is locked down by default, but customers can enable the capability by navigating to the following URL: https://portal.azure.com/?Microsoft_AAD_Connect_Provisioning_forceSchemaEditorEnabled=true . You can then navigate to your application to view the [attribute list](#editing-the-list-of-supported-attributes).
> [!NOTE]
> When a directory extension attribute in Azure AD doesn't show up automatically in your attribute mapping drop-down, you can manually add it to the "Azure AD attribute list". When manually adding Azure AD directory extension attributes to your provisioning app, note that directory extension attribute names are case-sensitive. For example: If you have a directory extension attribute named `extension_53c9e2c0exxxxxxxxxxxxxxxx_acmeCostCenter`, make sure you enter it in the same format as defined in the directory. Provisioning multi-valued directory extension attributes is not supported.
When you're editing the list of supported attributes, the following properties a
- **Multi-value?** - Whether the attribute supports multiple values.
- **Exact case?** - Whether the attribute's values are evaluated in a case-sensitive way.
- **API Expression** - Don't use, unless instructed to do so by the documentation for a specific provisioning connector (such as Workday).
-- **Referenced Object Attribute** - If it's a Reference type attribute, then this menu lets you select the table and attribute in the target application that contains the value associated with the attribute. For example, if you have an attribute named "Department" whose stored value references an object in a separate "Departments" table, you would select "Departments.Name". The reference tables and the primary ID fields supported for a given application are preconfigured and can't be edited using the Azure portal. However, you can edit them using the [Microsoft Graph API](/graph/api/resources/synchronization-configure-with-custom-target-attributes).
+- **Referenced Object Attribute** - If it's a Reference type attribute, then this menu lets you select the table and attribute in the target application that contains the value associated with the attribute. For example, if you have an attribute named "Department" whose stored value references an object in a separate "Departments" table, you would select "Departments.Name". The reference tables and the primary ID fields supported for a given application are preconfigured and can't be edited using the Microsoft Entra admin center. However, you can edit them using the [Microsoft Graph API](/graph/api/resources/synchronization-configure-with-custom-target-attributes).
#### Provisioning a custom extension attribute to a SCIM-compliant application
The SCIM RFC defines a core user and group schema, while also allowing for extensions to the schema to meet your application's needs. To add a custom attribute to a SCIM application:
- 1. Sign in to the [Azure portal](https://portal.azure.com), select **Enterprise Applications**, select your application, and then select **Provisioning**.
- 2. Under **Mappings**, select the object (user or group) for which you'd like to add a custom attribute.
- 3. At the bottom of the page, select **Show advanced options**.
- 4. Select **Edit attribute list for AppName**.
- 5. At the bottom of the attribute list, enter information about the custom attribute in the fields provided. Then select **Add Attribute**.
+ 1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
+ 1. Browse to **Identity** > **Applications** > **Enterprise applications**.
+ 1. Select your application, and then select **Provisioning**.
+ 1. Under **Mappings**, select the object (user or group) for which you'd like to add a custom attribute.
+ 1. At the bottom of the page, select **Show advanced options**.
+ 1. Select **Edit attribute list for AppName**.
+ 1. At the bottom of the attribute list, enter information about the custom attribute in the fields provided. Then select **Add Attribute**.
For SCIM applications, the attribute name must follow the pattern shown in the example. The "CustomExtensionName" and "CustomAttribute" can be customized per your application's requirements, for example: urn:ietf:params:scim:schemas:extension:CustomExtensionName:2.0:User:CustomAttribute
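As an illustration, a user provisioned with such an extension carries the custom attribute under the extension schema URN in the SCIM payload; in this sketch, "CustomExtensionName", "CustomAttribute", and the sample values are placeholders:

```json
{
  "schemas": [
    "urn:ietf:params:scim:schemas:core:2.0:User",
    "urn:ietf:params:scim:schemas:extension:CustomExtensionName:2.0:User"
  ],
  "userName": "user@contoso.com",
  "urn:ietf:params:scim:schemas:extension:CustomExtensionName:2.0:User": {
    "CustomAttribute": "example-value"
  }
}
```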
active-directory Define Conditional Rules For Provisioning User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md
Previously updated : 05/05/2023 Last updated : 09/15/2023 zone_pivot_groups: app-provisioning-cross-tenant-synchronization
Scoping filters are configured as part of the attribute mappings for each Azure
[!INCLUDE [portal updates](~/articles/active-directory/includes/portal-update.md)]
-1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
::: zone pivot="app-provisioning"
-2. Go to the **Azure Active Directory** > **Enterprise applications** > **All applications**.
+
+2. Browse to **Identity** > **Applications** > **Enterprise applications** > **All applications**.
3. Select the application for which you have configured automatic provisioning: for example, "ServiceNow".
+
::: zone-end
::: zone pivot="cross-tenant-synchronization"
-2. Go to **Azure Active Directory** > **Cross-tenant Synchronization** > **Configurations**
+
+2. Browse to **Identity** > **External Identities** > **Cross-tenant Synchronization** > **Configurations**
3. Select your configuration.
+
::: zone-end
4. Select the **Provisioning** tab.
::: zone pivot="app-provisioning"
+
5. In the **Mappings** section, select the mapping that you want to configure a scoping filter for: for example, "Synchronize Azure Active Directory Users to ServiceNow".
+
::: zone-end
::: zone pivot="cross-tenant-synchronization"
+
5. In the **Mappings** section, select the mapping that you want to configure a scoping filter for: for example, "Provision Azure Active Directory Users".
+
::: zone-end
6. Select the **Source object scope** menu.
-
7. Select **Add scoping filter**.
-
8. Define a clause by selecting a source **Attribute Name**, an **Operator**, and an **Attribute Value** to match against. The following operators are supported:
   a. **EQUALS**. Clause returns "true" if the evaluated attribute matches the input string value exactly (case sensitive).
active-directory Export Import Provisioning Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/export-import-provisioning-configuration.md
Previously updated : 05/12/2023 Last updated : 09/15/2023
In this article, you learn how to:
-- Export and import your provisioning configuration from the Azure portal
+- Export and import your provisioning configuration from the Microsoft Entra admin center
- Export and import your provisioning configuration by using the Microsoft Graph API
-## Export and import your provisioning configuration from the Azure portal
+## Export and import your provisioning configuration from the Microsoft Entra admin center
### Export your provisioning configuration
In this article, you learn how to:
To export your configuration:
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. In the left navigation panel, select **Azure Active Directory**.
-1. In the **Azure Active Directory** pane, select **Enterprise applications** and choose your application.
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](../roles/permissions-reference.md#application-administrator).
+1. Browse to **Identity** > **Applications** > **Enterprise applications** and choose your application.
1. In the left navigation pane, select **Provisioning**. From the provisioning configuration page, click on **attribute mappings**, then **show advanced options**, and finally **review your schema**. The schema editor opens.
1. Click on download in the command bar at the top of the page to download your schema.
You can use the Microsoft Graph API and the Microsoft Graph Explorer to export y
### Step 1: Retrieve your Provisioning App Service Principal ID (Object ID)
-1. Sign in to the [Azure portal](https://portal.azure.com), and navigate to the Properties section of your provisioning application. For example, if you want to export your *Workday to AD User Provisioning application* mapping navigate to the Properties section of that app.
+1. Sign in to the [Microsoft Entra admin center](https://entra.microsoft.com), and navigate to the Properties section of your provisioning application. For example, if you want to export your *Workday to AD User Provisioning application* mapping, navigate to the Properties section of that app.
1. In the Properties section of your provisioning app, copy the GUID value associated with the *Object ID* field. This value is also called the **ServicePrincipalId** of your app, and it's used in Microsoft Graph Explorer operations.
![Workday App Service Principal ID](./media/export-import-provisioning-configuration/wd_export_01.png)
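With the **ServicePrincipalId** in hand, you can list the synchronization jobs defined on that service principal to find the **ProvisioningJobId** used in later steps; a minimal sketch of that Graph Explorer call:

```http
GET https://graph.microsoft.com/beta/servicePrincipals/{servicePrincipalId}/synchronization/jobs
```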
Copy the JSON object from the response and save it to a file to create a backup
### Step 5: Import the Provisioning Schema
> [!CAUTION]
-> Perform this step only if you need to modify the schema for configuration that cannot be changed using the Azure portal or if you need to restore the configuration from a previously backed up file with valid and working schema.
+> Perform this step only if you need to modify the schema for configuration that cannot be changed using the Microsoft Entra admin center or if you need to restore the configuration from a previously backed up file with valid and working schema.
In the Microsoft Graph Explorer, configure the following PUT query, replacing [servicePrincipalId] and [ProvisioningJobId] with the ServicePrincipalId and the ProvisioningJobId retrieved in the previous steps.
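A sketch of that PUT, with the previously exported schema JSON as the request body (the `directories` and `synchronizationRules` arrays shown here are placeholders for the saved content):

```http
PUT https://graph.microsoft.com/beta/servicePrincipals/[servicePrincipalId]/synchronization/jobs/[ProvisioningJobId]/schema
Authorization: Bearer {token}
Content-Type: application/json

{
    "directories": [ "..." ],
    "synchronizationRules": [ "..." ]
}
```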
active-directory Expression Builder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/expression-builder.md
Previously updated : 04/26/2023 Last updated : 09/15/2023
active-directory Functions For Customizing Application Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/functions-for-customizing-application-data.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
active-directory How Provisioning Works https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/how-provisioning-works.md
Previously updated : 04/10/2023 Last updated : 09/15/2023
To request an automatic Azure AD provisioning connector for an app that doesn't
## Authorization
-Credentials are required for Azure AD to connect to the application's user management API. While you're configuring automatic user provisioning for an application, you need to enter valid credentials. For gallery applications, you can find credential types and requirements for the application by referring to the app tutorial. For non-gallery applications, you can refer to the [SCIM](./use-scim-to-provision-users-and-groups.md#authorization-to-provisioning-connectors-in-the-application-gallery) documentation to understand the credential types and requirements. In the Azure portal, you're able to test the credentials by having Azure AD attempt to connect to the app's provisioning app using the supplied credentials.
+Credentials are required for Azure AD to connect to the application's user management API. While you're configuring automatic user provisioning for an application, you need to enter valid credentials. For gallery applications, you can find credential types and requirements for the application by referring to the app tutorial. For non-gallery applications, you can refer to the [SCIM](./use-scim-to-provision-users-and-groups.md#authorization-to-provisioning-connectors-in-the-application-gallery) documentation to understand the credential types and requirements. In the Microsoft Entra admin center, you can test the credentials by having Azure AD attempt to connect to the app's provisioning endpoint using the supplied credentials.
## Mapping attributes
-When you enable user provisioning for a third-party SaaS application, the Azure portal controls its attribute values through attribute mappings. Mappings determine the user attributes that flow between Azure AD and the target application when user accounts are provisioned or updated.
+When you enable user provisioning for a third-party SaaS application, the Microsoft Entra admin center controls its attribute values through attribute mappings. Mappings determine the user attributes that flow between Azure AD and the target application when user accounts are provisioned or updated.
There's a preconfigured set of attributes and attribute mappings between Azure AD user objects and each SaaS app's user objects. Some apps manage other types of objects along with Users, such as Groups.
After the initial cycle, all other cycles will:
The provisioning service continues running back-to-back incremental cycles indefinitely, at intervals defined in the [tutorial specific to each application](../saas-apps/tutorial-list.md). Incremental cycles continue until one of these events occurs:
-- The service is manually stopped using the Azure portal, or using the appropriate Microsoft Graph API command.
-- A new initial cycle is triggered using the **Restart provisioning** option in the Azure portal, or using the appropriate Microsoft Graph API command. The action clears any stored watermark and causes all source objects to be evaluated again. Also, the action doesn't break the links between source and target objects. To break the links, use [Restart synchronizationJob](/graph/api/synchronization-synchronizationjob-restart?view=graph-rest-beta&tabs=http&preserve-view=true) with the request:
+- The service is manually stopped using the Microsoft Entra admin center, or using the appropriate Microsoft Graph API command.
+- A new initial cycle is triggered using the **Restart provisioning** option in the Microsoft Entra admin center, or using the appropriate Microsoft Graph API command. The action clears any stored watermark and causes all source objects to be evaluated again. Also, the action doesn't break the links between source and target objects. To break the links, use [Restart synchronizationJob](/graph/api/synchronization-synchronizationjob-restart?view=graph-rest-beta&tabs=http&preserve-view=true) with the request:
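A sketch of that request, assuming the `Full` reset scope, which clears all stored provisioning state, including the source-to-target links:

```http
POST https://graph.microsoft.com/beta/servicePrincipals/{servicePrincipalId}/synchronization/jobs/{jobId}/restart
Authorization: Bearer {token}
Content-Type: application/json

{
    "criteria": {
        "resetScope": "Full"
    }
}
```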
Resolve these failures by adjusting the attribute values for the affected user i
### Quarantine
-If most or all of the calls that are made against the target system consistently fail because of an error (for example invalid admin credentials) the provisioning job goes into a "quarantine" state. This state is indicated in the [provisioning summary report](./check-status-user-account-provisioning.md) and via email if email notifications were configured in the Azure portal.
+If most or all of the calls that are made against the target system consistently fail because of an error (for example invalid admin credentials) the provisioning job goes into a "quarantine" state. This state is indicated in the [provisioning summary report](./check-status-user-account-provisioning.md) and via email if email notifications were configured in the Microsoft Entra admin center.
When in quarantine, the frequency of incremental cycles is gradually reduced to once per day.
Performance depends on whether your provisioning job is running an initial provi
### How to tell if users are being provisioned properly
-All operations run by the user provisioning service are recorded in the Azure AD [Provisioning logs (preview)](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context). The logs include all read and write operations made to the source and target systems, and the user data that was read or written during each operation. For information on how to read the provisioning logs in the Azure portal, see the [provisioning reporting guide](./check-status-user-account-provisioning.md).
+All operations run by the user provisioning service are recorded in the Azure AD [Provisioning logs (preview)](../reports-monitoring/concept-provisioning-logs.md?context=azure/active-directory/manage-apps/context/manage-apps-context). The logs include all read and write operations made to the source and target systems, and the user data that was read or written during each operation. For information on how to read the provisioning logs in the Microsoft Entra admin center, see the [provisioning reporting guide](./check-status-user-account-provisioning.md).
## Deprovisioning
The Azure AD provisioning service keeps source and target systems in sync by deprovisioning accounts when user access is removed.
active-directory Hr Attribute Retrieval Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/hr-attribute-retrieval-issues.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
active-directory Hr Manager Update Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/hr-manager-update-issues.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
The Azure AD provisioning service automatically updates manager information so that the user-manager relationship in Azure AD is always in sync with your HR data. It uses a process called *manager reference resolution* to accurately update the *manager* attribute. Before going into the process details, it is important to understand how manager information is stored in Azure AD and on-premises Active Directory.
* In **on-premises Active Directory**, the *manager* attribute stores the *distinguishedName (dn)* of the manager's account in AD.
-* In **Azure AD**, the *manager* attribute is a DirectoryObject navigation property in Azure AD. When you view the user record in the Azure portal, it shows the *displayName* of the manager record in Azure AD.
+* In **Azure AD**, the *manager* attribute is a DirectoryObject navigation property in Azure AD. When you view the user record in the Microsoft Entra admin center, it shows the *displayName* of the manager record in Azure AD.
The *manager reference resolution* is a two-step process:
* Step 1: Link the manager's HR source record with the manager's target account record using a pair of attributes referred to as *source anchor* and *target anchor*.
active-directory Hr User Creation Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/hr-user-creation-issues.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
active-directory Hr User Update Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/hr-user-update-issues.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
active-directory Hr Writeback Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/hr-writeback-issues.md
Previously updated : 10/20/2022 Last updated : 09/15/2023
active-directory Inbound Provisioning Api Configure App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-configure-app.md
Previously updated : 07/07/2023 Last updated : 09/15/2023
If you're configuring inbound user provisioning to on-premises Active Directory,
## Create your API-driven provisioning app
-1. Log in to the [Microsoft Entra admin center](<https://entra.microsoft.com>).
+1. Log in to the [Microsoft Entra admin center](<https://entra.microsoft.com>) as at least an [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823).
2. Browse to **Azure Active Directory** > **Applications** > **Enterprise applications**.
3. Click on **New application** to create a new provisioning application.
[![Screenshot of Entra Admin Center.](media/inbound-provisioning-api-configure-app/provisioning-entra-admin-center.png)](media/inbound-provisioning-api-configure-app/provisioning-entra-admin-center.png#lightbox)
active-directory Inbound Provisioning Api Curl Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-curl-tutorial.md
## Verify processing of the bulk request payload
-1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) with *global administrator* or *application administrator* login credentials.
+1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823).
1. Browse to **Azure Active Directory -> Applications -> Enterprise applications**.
1. Under all applications, use the search filter text box to find and open your API-driven provisioning application.
1. Open the Provisioning blade. The landing page displays the status of the last run.
active-directory Inbound Provisioning Api Custom Attributes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-custom-attributes.md
Previously updated : 07/24/2023 Last updated : 09/15/2023
You have configured an API-driven provisioning app. Your provisioning app is succ
In this step, we'll add the two attributes "HireDate" and "JobCode" that are not part of the standard SCIM schema to the provisioning app and use them in the provisioning data flow.
-1. Log in to Microsoft Entra admin center with application administrator role.
-1. Go to **Enterprise applications** and open your API-driven provisioning app.
+1. Log in to your [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823).
+1. Browse to **Enterprise applications** and open your API-driven provisioning app.
1. Open the **Provisioning** blade.
1. Click on the **Edit Provisioning** button.
1. Expand the **Mappings** section and click on the attribute mapping link. <br>
active-directory Inbound Provisioning Api Grant Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-grant-access.md
Previously updated : 07/07/2023 Last updated : 09/15/2023
Depending on how your API client authenticates with Azure AD, you can select bet
## Configure a service principal
This configuration registers an app in Azure AD that represents the external API client and grants it permission to invoke the inbound provisioning API. The service principal client id and client secret can be used in the OAuth client credentials grant flow.
-1. Log in to Microsoft Entra admin center (https://entra.microsoft.com) with global administrator or application administrator login credentials.
+1. Log in to Microsoft Entra admin center (https://entra.microsoft.com) with at least [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823) login credentials.
1. Browse to **Azure Active Directory** -> **Applications** -> **App registrations**.
1. Click on the option **New registration**.
1. Provide an app name, select the default options, and click on **Register**.
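The registration's client ID and secret can then be redeemed for a Microsoft Graph token through the client credentials grant mentioned above; a minimal sketch, with placeholder tenant and client values:

```http
POST https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={clientId}
&client_secret={clientSecret}
&grant_type=client_credentials
&scope=https://graph.microsoft.com/.default
```

The access token in the response is then sent as an `Authorization: Bearer` header when the external client invokes the inbound provisioning API.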
active-directory Inbound Provisioning Api Graph Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-graph-explorer.md
Previously updated : 07/18/2023 Last updated : 09/15/2023
This tutorial describes how you can quickly test [API-driven inbound provisionin
You can verify the processing either from the Microsoft Entra admin center or using Graph Explorer.
### Verify processing from Microsoft Entra admin center
-1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) with *global administrator* or *application administrator* login credentials.
+1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) with at least [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823) login credentials.
1. Browse to **Azure Active Directory -> Applications -> Enterprise applications**.
1. Under all applications, use the search filter text box to find and open your API-driven provisioning application.
1. Open the Provisioning blade. The landing page displays the status of the last run.
active-directory Inbound Provisioning Api Postman https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-postman.md
Previously updated : 07/19/2023 Last updated : 09/15/2023
If the API invocation is successful, you see the message `202 Accepted.` Under H
You can verify the processing either from the Microsoft Entra admin center or using Postman.
### Verify processing from Microsoft Entra admin center
-1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) with *global administrator* or *application administrator* login credentials.
+1. Log in to [Microsoft Entra admin center](https://entra.microsoft.com) with at least [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823) level credentials.
1. Browse to **Azure Active Directory -> Applications -> Enterprise applications**.
1. Under all applications, use the search filter text box to find and open your API-driven provisioning application.
1. Open the Provisioning blade. The landing page displays the status of the last run.
active-directory Inbound Provisioning Api Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/inbound-provisioning-api-powershell.md
Previously updated : 07/18/2023 Last updated : 09/15/2023
To illustrate the procedure, let's use the CSV file `Samples/csv-with-2-records.
This section explains how to send the generated bulk request payload to your inbound provisioning API endpoint.
-1. Log in to your Microsoft Entra admin center as *Application Administrator* or *Global Administrator*.
-1. Copy the `ServicePrincipalId` associated with your provisioning app from **Provisioning App** > **Properties** > **Object ID**.
+1. Log in to your [Microsoft Entra admin center](https://entra.microsoft.com) as at least an [Application Administrator](https://go.microsoft.com/fwlink/?linkid=2247823).
+1. Browse to **Provisioning App** > **Properties** > **Object ID** and copy the `ServicePrincipalId` associated with your provisioning app.
:::image type="content" border="true" source="./media/inbound-provisioning-api-powershell/object-id.png" alt-text="Screenshot of the Object ID." lightbox="./media/inbound-provisioning-api-powershell/object-id.png":::
-1. As user with *Global Administrator* role, run the following command by providing the correct values for `ServicePrincipalId` and `TenantId`. It will prompt you for authentication if an authenticated session doesn't already exist for this tenant. Provide your consent to permissions prompted during authentication.
+1. As a user with the Global Administrator role, run the following command by providing the correct values for `ServicePrincipalId` and `TenantId`. It will prompt you for authentication if an authenticated session doesn't already exist for this tenant. Provide your consent to permissions prompted during authentication.
```powershell
.\CSV2SCIM.ps1 -Path '..\Samples\csv-with-2-records.csv' -AttributeMapping $AttributeMapping -ServicePrincipalId <servicePrincipalId> -TenantId "contoso.onmicrosoft.com"
```
This section explains how to send the generated bulk request payload to your inb
$ThumbPrint = $ClientCertificate.ThumbPrint
```
The generated certificate is stored in **Current User\Personal\Certificates**. You can view it using the **Control Panel** -> **Manage user certificates** option.
-1. To associate this certificate with a valid service principal, log in to your Microsoft Entra admin center as *Application Administrator*.
+1. To associate this certificate with a valid service principal, log in to your Microsoft Entra admin center as Application Administrator.
1. Open [the service principal you configured](inbound-provisioning-api-grant-access.md#configure-a-service-principal) under **App Registrations**.
1. Copy the **Object ID** from the **Overview** blade. Use the value to replace the string `<AppObjectId>`. Copy the **Application (client) Id**. We will use it later and it is referenced as `<AppClientId>`.
1. Run the following command to upload your certificate to the registered service principal.
active-directory Isv Automatic Provisioning Multi Tenant Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/isv-automatic-provisioning-multi-tenant-apps.md
Learn more about using Microsoft Graph for provisioning:
* [Microsoft Graph Auth Overview](/graph/auth/)
-* [Getting started with Microsoft Graph](https://developer.microsoft.com/graph/get-started)
+* [Getting started with Microsoft Graph](https://developer.microsoft.com/graph/rest-api/)
## Using SAML JIT for provisioning
active-directory On Premises Powershell Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/on-premises-powershell-connector.md
The connector provides a bridge between the capabilities of the ECMA Connector H
- Connectivity between the hosting server, the connector, and the target system that the PowerShell scripts interact with.
- The execution policy on the server must be configured to allow the connector to run Windows PowerShell scripts. Unless the scripts the connector runs are digitally signed, configure the execution policy by running this command: `Set-ExecutionPolicy -ExecutionPolicy RemoteSigned`
-- Deploying this connector requires one or more PowerShell scripts. Some Microsoft products may provide scripts for use with this connector, and the support statement for those scripts would be provided by that product. If you are developing your own scripts for use with this connector, you'll need to have familiarity with the [Extensible Connectivity Management Agent API](https://msdn.microsoft.com/library/windows/desktop/hh859557.aspx) to develop and maintain those scripts. If you are integrating with third party systems using your own scripts in a production environment, we recommend you work with the third party vendor or a deployment partner for help, guidance and support for this integration.
+- Deploying this connector requires one or more PowerShell scripts. Some Microsoft products may provide scripts for use with this connector, and the support statement for those scripts would be provided by that product. If you are developing your own scripts for use with this connector, you'll need to have familiarity with the [Extensible Connectivity Management Agent API](/previous-versions/windows/desktop/forefront-2010/hh859557(v=vs.100)?redirectedfrom=MSDN) to develop and maintain those scripts. If you are integrating with third party systems using your own scripts in a production environment, we recommend you work with the third party vendor or a deployment partner for help, guidance and support for this integration.
The connectivity tab allows you to supply configuration parameters for connectin
| Password | \<Blank\> | Password of the credential to store for use when the connector is run. |
| Impersonate Connector Account |Unchecked| When true, the synchronization service runs the Windows PowerShell scripts in the context of the credentials supplied. When possible, it is recommended that the **$Credentials** parameter passed to each script is used instead of impersonation.|
| Load User Profile When Impersonating |Unchecked|Instructs Windows to load the user profile of the connector's credentials during impersonation. If the impersonated user has a roaming profile, the connector does not load the roaming profile.|
-| Logon Type When Impersonating |None|Logon type during impersonation. For more information, see the [dwLogonType][dw] documentation. |
+| Logon Type When Impersonating |None|Logon type during impersonation. For more information, see the [dwLogonType](/windows/win32/api/winbase/nf-winbase-logonusera#parameters) documentation. |
|Signed Scripts Only |Unchecked| If true, the Windows PowerShell connector validates that each script has a valid digital signature. If false, ensure that the Synchronization Service server's Windows PowerShell execution policy is RemoteSigned or Unrestricted.|
|Common Module Script Name (with extension)|xADSyncPSConnectorModule.psm1|The connector allows you to store a shared Windows PowerShell module in the configuration. When the connector runs a script, the Windows PowerShell module is extracted to the file system so that it can be imported by each script.|
|Common Module Script|[AD Sync PowerShell Connector Module code](https://github.com/microsoft/MIMPowerShellConnectors/blob/master/src/ECMA2HostCSV/Scripts/CommonModule.psm1) as value. This module will be automatically created by the ECMA2Host when the connector is running.||
active-directory On Premises Scim Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/on-premises-scim-provisioning.md
# Azure AD on-premises application provisioning to SCIM-enabled apps
-The Azure Active Directory (Azure AD) provisioning service supports a [SCIM 2.0](https://techcommunity.microsoft.com/t5/identity-standards-blog/provisioning-with-scim-getting-started/ba-p/880010) client that can be used to automatically provision users into cloud or on-premises applications. This article outlines how you can use the Azure AD provisioning service to provision users into an on-premises application that's SCIM enabled. If you want to provision users into non-SCIM on-premises applications that use SQL as a data store, see the [Azure AD ECMA Connector Host Generic SQL Connector tutorial](tutorial-ecma-sql-connector.md). If you want to provision users into cloud apps such as DropBox and Atlassian, review the app-specific [tutorials](../../active-directory/saas-apps/tutorial-list.md).
+The Azure Active Directory (Azure AD) provisioning service supports a [SCIM 2.0](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/provisioning-with-scim-getting-started/ba-p/880010) client that can be used to automatically provision users into cloud or on-premises applications. This article outlines how you can use the Azure AD provisioning service to provision users into an on-premises application that's SCIM enabled. If you want to provision users into non-SCIM on-premises applications that use SQL as a data store, see the [Azure AD ECMA Connector Host Generic SQL Connector tutorial](tutorial-ecma-sql-connector.md). If you want to provision users into cloud apps such as DropBox and Atlassian, review the app-specific [tutorials](../../active-directory/saas-apps/tutorial-list.md).
![Diagram that shows SCIM architecture.](./media/on-premises-scim-provisioning/scim-4.png)
The following video provides an overview of on-premises provisioning.
> [!VIDEO https://www.youtube.com/embed/QdfdpaFolys]

## Additional requirements
-* Ensure your [SCIM](https://techcommunity.microsoft.com/t5/identity-standards-blog/provisioning-with-scim-getting-started/ba-p/880010) implementation meets the [Azure AD SCIM requirements](use-scim-to-provision-users-and-groups.md).
+* Ensure your [SCIM](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/provisioning-with-scim-getting-started/ba-p/880010) implementation meets the [Azure AD SCIM requirements](use-scim-to-provision-users-and-groups.md).
Azure AD offers open-source [reference code](https://github.com/AzureAD/SCIMReferenceCode/wiki) that developers can use to bootstrap their SCIM implementation. The code is provided as is.
* Support the /schemas endpoint to reduce the configuration required in the Azure portal (see the probe sketch below).
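To see what schema discovery returns, you can probe a SCIM endpoint's /Schemas endpoint directly. A minimal sketch, assuming a placeholder base URL and bearer token (substitute your app's values):

```powershell
# Placeholder SCIM base URL and bearer token; substitute your app's values.
$scimBase = 'https://scim.example.com/scim/v2'
$token    = '<bearer-token>'

# A SCIM 2.0 service that implements discovery returns its resource schemas
# from /Schemas as a ListResponse; each resource carries a schema id.
$response = Invoke-RestMethod -Method Get -Uri "$scimBase/Schemas" `
    -Headers @{ Authorization = "Bearer $token" }

$response.Resources | ForEach-Object { $_.id }
```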
active-directory Sap Successfactors Integration Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/sap-successfactors-integration-reference.md
To further secure the connectivity between Azure AD provisioning service and Suc
1. Copy all IP address ranges listed within the element *addressPrefixes* and use the ranges to build your IP address restriction list.
1. Translate the CIDR values to IP ranges (a conversion sketch follows these steps).
-1. Log in to SuccessFactors admin portal to add IP ranges to the allowlist. Refer to SAP [support note 2253200](https://apps.support.sap.com/sap/support/knowledge/en/2253200). You can now [enter IP ranges](https://answers.sap.com/questions/12882263/whitelisting-sap-cloud-platform-ip-address-range-i.html) in this tool.
+1. Log in to SuccessFactors admin portal to add IP ranges to the allowlist. Refer to SAP [support note 2253200](https://userapps.support.sap.com/sap/support/knowledge/2253200). You can now [enter IP ranges](https://answers.sap.com/questions/12882263/whitelisting-sap-cloud-platform-ip-address-range-i.html) in this tool.
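For the CIDR-to-range translation in the steps above, a small PowerShell helper can compute the first and last address of each prefix. This is a rough sketch for IPv4 only; the sample prefix is illustrative:

```powershell
# Convert an IPv4 CIDR prefix (for example '20.36.0.0/14') to a start/end
# address pair suitable for an IP-range allowlist entry.
function ConvertTo-IPRange {
    param([string] $Cidr)

    $parts  = $Cidr.Split('/')
    $ip     = [System.Net.IPAddress]::Parse($parts[0])
    $prefix = [int]$parts[1]

    # Treat the address as a big-endian 32-bit integer.
    $bytes = $ip.GetAddressBytes()
    [Array]::Reverse($bytes)
    $value = [BitConverter]::ToUInt32($bytes, 0)

    # Host-bit mask: all bits not covered by the network prefix.
    $mask  = [uint32]([math]::Pow(2, 32 - $prefix) - 1)
    $start = $value -band (-bnot $mask)
    $end   = $value -bor $mask

    foreach ($n in @($start, $end)) {
        $b = [BitConverter]::GetBytes([uint32]$n)
        [Array]::Reverse($b)
        [System.Net.IPAddress]::new($b).ToString()
    }
}

ConvertTo-IPRange '20.36.0.0/14'   # outputs 20.36.0.0 and 20.39.255.255
```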
## Supported entities

For every user in SuccessFactors, the Azure AD provisioning service retrieves the following entities. Each entity is expanded using the OData API *$expand* query parameter as outlined in the *Retrieval rule* column. Some entities are expanded by default, while some entities are expanded only if a specific attribute is present in the mapping.
active-directory Concept Continuous Access Evaluation Workload https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-continuous-access-evaluation-workload.md
Continuous access evaluation (CAE) for [workload identities](../workload-identit
Continuous access evaluation doesn't currently support managed identities.
-## Scope of preview
+## Scope of support
Continuous access evaluation for workload identities is supported only on access requests sent to Microsoft Graph as a resource provider. More resource providers will be added over time.
active-directory V2 Protocols Oidc https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/v2-protocols-oidc.md
To sign out a user, perform both of these operations:
* Redirect the user's user-agent to the Microsoft identity platform's logout URI.
* Clear your app's cookies or otherwise end the user's session in your application.
-If you fail to perform either operation, the user may remain authenticated and not be prompted to sign-in the next time they user your app.
+If you fail to perform either operation, the user may remain authenticated and not be prompted to sign in the next time they use your app.
Redirect the user-agent to the `end_session_endpoint` as shown in the OpenID Connect configuration document. The `end_session_endpoint` supports both HTTP GET and POST requests.
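As a sketch of the redirect, assuming the common v2.0 logout endpoint with the optional `post_logout_redirect_uri` parameter (the tenant and post-logout URI below are placeholders), the sign-out URL can be composed like this:

```powershell
# Placeholder tenant and post-logout URI; substitute your app's values.
$tenant  = 'contoso.onmicrosoft.com'
$postUri = [System.Uri]::EscapeDataString('https://localhost:5001/signed-out')

# Sending the browser to this URL (GET or POST) ends the session at the
# Microsoft identity platform; the optional parameter says where to return.
$logoutUrl = "https://login.microsoftonline.com/$tenant/oauth2/v2.0/logout" +
             "?post_logout_redirect_uri=$postUri"
$logoutUrl
```

Your app should clear its own session cookie before issuing this redirect, so both halves of the sign-out happen together.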
active-directory Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/apps.md
Microsoft Entra identity governance can be integrated with many other applicatio
| [Acunetix 360](../../active-directory/saas-apps/acunetix-360-provisioning-tutorial.md) | ● | ● |
| [Adobe Identity Management](../../active-directory/saas-apps/adobe-identity-management-provisioning-tutorial.md) | ● | ● |
| [Adobe Identity Management (OIDC)](../../active-directory/saas-apps/adobe-identity-management-provisioning-oidc-tutorial.md) | ● | ● |
+| [Airbase](../../active-directory/saas-apps/airbase-provisioning-tutorial.md) | ● | ● |
| [Aha!](../../active-directory/saas-apps/aha-tutorial.md) | | ● |
| [Airstack](../../active-directory/saas-apps/airstack-provisioning-tutorial.md) | ● | |
| [Akamai Enterprise Application Access](../../active-directory/saas-apps/akamai-enterprise-application-access-provisioning-tutorial.md) | ● | ● |
+| [Airtable](../../active-directory/saas-apps/airtable-provisioning-tutorial.md) | ● | ● |
+| [Albert](../../active-directory/saas-apps/albert-provisioning-tutorial.md) | ● | |
| [AlertMedia](../../active-directory/saas-apps/alertmedia-provisioning-tutorial.md) | ● | ● |
| [Alexis HR](../../active-directory/saas-apps/alexishr-provisioning-tutorial.md) | ● | ● |
| [Alinto Protect (renamed Cleanmail)](../../active-directory/saas-apps/alinto-protect-provisioning-tutorial.md) | ● | |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Autodesk SSO](../../active-directory/saas-apps/autodesk-sso-provisioning-tutorial.md) | ● | ● |
| [Azure Databricks SCIM Connector](/azure/databricks/administration-guide/users-groups/scim/aad) | ● | |
| [AWS IAM Identity Center](../../active-directory/saas-apps/aws-single-sign-on-provisioning-tutorial.md) | ● | ● |
+| [Axiad Cloud](../../active-directory/saas-apps/axiad-cloud-provisioning-tutorial.md) | ● | ● |
| [BambooHR](../../active-directory/saas-apps/bamboo-hr-tutorial.md) | | ● |
| [BenQ IAM](../../active-directory/saas-apps/benq-iam-provisioning-tutorial.md) | ● | ● |
| [Bentley - Automatic User Provisioning](../../active-directory/saas-apps/bentley-automatic-user-provisioning-tutorial.md) | ● | |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Cisco Umbrella User Management](../../active-directory/saas-apps/cisco-umbrella-user-management-provisioning-tutorial.md) | ● | ● |
| [Cisco Webex](../../active-directory/saas-apps/cisco-webex-provisioning-tutorial.md) | ● | ● |
| [Clarizen One](../../active-directory/saas-apps/clarizen-one-provisioning-tutorial.md) | ● | ● |
+| [Cleanmail Swiss](../../active-directory/saas-apps/cleanmail-swiss-provisioning-tutorial.md) | ● | |
| [Clebex](../../active-directory/saas-apps/clebex-provisioning-tutorial.md) | ● | ● |
| [Cloud Academy SSO](../../active-directory/saas-apps/cloud-academy-sso-provisioning-tutorial.md) | ● | ● |
| [Coda](../../active-directory/saas-apps/coda-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Concur](../../active-directory/saas-apps/concur-provisioning-tutorial.md) | ● | ● |
| [Cornerstone OnDemand](../../active-directory/saas-apps/cornerstone-ondemand-provisioning-tutorial.md) | ● | ● |
| [CybSafe](../../active-directory/saas-apps/cybsafe-provisioning-tutorial.md) | ● | |
+| [Dagster Cloud](../../active-directory/saas-apps/dagster-cloud-provisioning-tutorial.md) | ● | ● |
+| [Datadog](../../active-directory/saas-apps/datadog-provisioning-tutorial.md) | ● | ● |
| [Documo](../../active-directory/saas-apps/documo-provisioning-tutorial.md) | ● | ● |
| [DocuSign](../../active-directory/saas-apps/docusign-provisioning-tutorial.md) | ● | ● |
| [Dropbox Business](../../active-directory/saas-apps/dropboxforbusiness-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Fortes Change Cloud](../../active-directory/saas-apps/fortes-change-cloud-provisioning-tutorial.md) | ● | ● |
| [Frankli.io](../../active-directory/saas-apps/frankli-io-provisioning-tutorial.md) | ● | |
| [Freshservice Provisioning](../../active-directory/saas-apps/freshservice-provisioning-tutorial.md) | ● | ● |
+| [Funnel Leasing](../../active-directory/saas-apps/funnel-leasing-provisioning-tutorial.md) | ● | ● |
| [Fuze](../../active-directory/saas-apps/fuze-provisioning-tutorial.md) | ● | ● |
| [G Suite](../../active-directory/saas-apps/g-suite-provisioning-tutorial.md) | ● | |
| [Genesys Cloud for Azure](../../active-directory/saas-apps/purecloud-by-genesys-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Grammarly](../../active-directory/saas-apps/grammarly-provisioning-tutorial.md) | ● | ● |
| [Group Talk](../../active-directory/saas-apps/grouptalk-provisioning-tutorial.md) | ● | |
| [Gtmhub](../../active-directory/saas-apps/gtmhub-provisioning-tutorial.md) | ● | |
+| [H5mag](../../active-directory/saas-apps/h5mag-provisioning-tutorial.md) | ● | |
| [Harness](../../active-directory/saas-apps/harness-provisioning-tutorial.md) | ● | ● |
| HCL Domino | ● | |
+| [Headspace](../../active-directory/saas-apps/headspace-provisioning-tutorial.md) | ● | ● |
| [HelloID](../../active-directory/saas-apps/helloid-provisioning-tutorial.md) | ● | |
| [Holmes Cloud](../../active-directory/saas-apps/holmes-cloud-provisioning-tutorial.md) | ● | |
| [Hootsuite](../../active-directory/saas-apps/hootsuite-provisioning-tutorial.md) | ● | ● |
| [Hoxhunt](../../active-directory/saas-apps/hoxhunt-provisioning-tutorial.md) | ● | ● |
| [Howspace](../../active-directory/saas-apps/howspace-provisioning-tutorial.md) | ● | |
-| [H5mag](../../active-directory/saas-apps/h5mag-provisioning-tutorial.md) | ● | |
+| [Humbol](../../active-directory/saas-apps/humbol-provisioning-tutorial.md) | ● | |
| IBM DB2 ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
| IBM Tivoli Directory Server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
| [Ideo](../../active-directory/saas-apps/ideo-provisioning-tutorial.md) | ● | ● |
| [Ideagen Cloud](../../active-directory/saas-apps/ideagen-cloud-provisioning-tutorial.md) | ● | |
| [Infor CloudSuite](../../active-directory/saas-apps/infor-cloudsuite-provisioning-tutorial.md) | ● | ● |
+| [InformaCast](../../active-directory/saas-apps/informacast-provisioning-tutorial.md) | ● | ● |
| [iPass SmartConnect](../../active-directory/saas-apps/ipass-smartconnect-provisioning-tutorial.md) | ● | ● |
| [Iris Intranet](../../active-directory/saas-apps/iris-intranet-provisioning-tutorial.md) | ● | ● |
| [Insight4GRC](../../active-directory/saas-apps/insight4grc-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Juno Journey](../../active-directory/saas-apps/juno-journey-provisioning-tutorial.md) | ● | ● |
| [Keeper Password Manager & Digital Vault](../../active-directory/saas-apps/keeper-password-manager-digitalvault-provisioning-tutorial.md) | ● | ● |
| [Keepabl](../../active-directory/saas-apps/keepabl-provisioning-tutorial.md) | ● | ● |
+| [Kintone](../../active-directory/saas-apps/kintone-provisioning-tutorial.md) | ● | ● |
| [Kisi Physical Security](../../active-directory/saas-apps/kisi-physical-security-provisioning-tutorial.md) | ● | ● |
| [Klaxoon](../../active-directory/saas-apps/klaxoon-provisioning-tutorial.md) | ● | ● |
| [Klaxoon SAML](../../active-directory/saas-apps/klaxoon-saml-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [LinkedIn Sales Navigator](../../active-directory/saas-apps/linkedinsalesnavigator-provisioning-tutorial.md) | ● | ● |
| [Lucid (All Products)](../../active-directory/saas-apps/lucid-all-products-provisioning-tutorial.md) | ● | ● |
| [Lucidchart](../../active-directory/saas-apps/lucidchart-provisioning-tutorial.md) | ● | ● |
+| [LUSID](../../active-directory/saas-apps/LUSID-provisioning-tutorial.md) | ● | ● |
| [Leapsome](../../active-directory/saas-apps/leapsome-provisioning-tutorial.md) | ● | ● |
| [LogicGate](../../active-directory/saas-apps/logicgate-provisioning-tutorial.md) | ● | |
| [Looop](../../active-directory/saas-apps/looop-provisioning-tutorial.md) | ● | |
| [LogMeIn](../../active-directory/saas-apps/logmein-provisioning-tutorial.md) | ● | ● |
| [Maptician](../../active-directory/saas-apps/maptician-provisioning-tutorial.md) | ● | ● |
+| [Markit Procurement Service](../../active-directory/saas-apps/markit-procurement-service-provisioning-tutorial.md) | ● | |
| [MediusFlow](../../active-directory/saas-apps/mediusflow-provisioning-tutorial.md) | ● | |
| [MerchLogix](../../active-directory/saas-apps/merchlogix-provisioning-tutorial.md) | ● | ● |
| [Meta Networks Connector](../../active-directory/saas-apps/meta-networks-connector-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Miro](../../active-directory/saas-apps/miro-provisioning-tutorial.md) | ● | ● |
| [Monday.com](../../active-directory/saas-apps/mondaycom-provisioning-tutorial.md) | ● | ● |
| [MongoDB Atlas](../../active-directory/saas-apps/mongodb-cloud-tutorial.md) | | ● |
+| [Moqups](../../active-directory/saas-apps/moqups-provisioning-tutorial.md) | ● | ● |
| [Mural Identity](../../active-directory/saas-apps/mural-identity-provisioning-tutorial.md) | ● | ● |
| [MX3 Diagnostics](../../active-directory/saas-apps/mx3-diagnostics-connector-provisioning-tutorial.md) | ● | |
| [myPolicies](../../active-directory/saas-apps/mypolicies-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Netsparker Enterprise](../../active-directory/saas-apps/netsparker-enterprise-provisioning-tutorial.md) | ● | ● |
| [New Relic by Organization](../../active-directory/saas-apps/new-relic-by-organization-provisioning-tutorial.md) | ● | ● |
| [NordPass](../../active-directory/saas-apps/nordpass-provisioning-tutorial.md) | ● | ● |
+| [Notion](../../active-directory/saas-apps/notion-provisioning-tutorial.md) | ● | ● |
| Novell eDirectory ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
| [Office Space Software](../../active-directory/saas-apps/officespace-software-provisioning-tutorial.md) | ● | ● |
| [Olfeo SAAS](../../active-directory/saas-apps/olfeo-saas-provisioning-tutorial.md) | ● | ● |
| Open DJ ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
| Open DS ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
+| [OpenForms](../../active-directory/saas-apps/openforms-provisioning-tutorial.md) | ● | |
| [OpenLDAP](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) | ● | |
| [OpenText Directory Services](../../active-directory/saas-apps/open-text-directory-services-provisioning-tutorial.md) | ● | ● |
| [Oracle Cloud Infrastructure Console](../../active-directory/saas-apps/oracle-cloud-infrastructure-console-provisioning-tutorial.md) | ● | ● |
| Oracle Database ([SQL connector](../../active-directory/app-provisioning/tutorial-ecma-sql-connector.md) ) | ● | |
| Oracle E-Business Suite | ● | ● |
| [Oracle Fusion ERP](../../active-directory/saas-apps/oracle-fusion-erp-provisioning-tutorial.md) | ● | ● |
+| [O'Reilly Learning Platform](../../active-directory/saas-apps/oreilly-learning-platform-provisioning-tutorial.md) | ● | ● |
| Oracle Internet Directory | ● | |
| Oracle PeopleSoft ERP | ● | ● |
| Oracle SunONE Directory Server ([LDAP connector](../../active-directory/app-provisioning/on-premises-ldap-connector-configure.md) ) | ● | |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Real Links](../../active-directory/saas-apps/real-links-provisioning-tutorial.md) | ● | ● |
| [Reward Gateway](../../active-directory/saas-apps/reward-gateway-provisioning-tutorial.md) | ● | ● |
| [RFPIO](../../active-directory/saas-apps/rfpio-provisioning-tutorial.md) | ● | ● |
+| [Rhombus Systems](../../active-directory/saas-apps/rhombus-systems-provisioning-tutorial.md) | ● | ● |
| [Ring Central](../../active-directory/saas-apps/ringcentral-provisioning-tutorial.md) | ● | ● |
| [Robin](../../active-directory/saas-apps/robin-provisioning-tutorial.md) | ● | ● |
| [Rollbar](../../active-directory/saas-apps/rollbar-provisioning-tutorial.md) | ● | ● |
| [Rouse Sales](../../active-directory/saas-apps/rouse-sales-provisioning-tutorial.md) | ● | |
-| [Salesforce](../../active-directory/saas-apps/salesforce-provisioning-tutorial.md) | ● | ● |
+| [Salesforce](../../active-directory/saas-apps/salesforce-provisioning-tutorial.md) | ● | |
+| [SafeGuard Cyber](../../active-directory/saas-apps/safeguard-cyber-provisioning-tutorial.md) | ● | ● |
| [Salesforce Sandbox](../../active-directory/saas-apps/salesforce-sandbox-provisioning-tutorial.md) | ● | ● |
| [Samanage](../../active-directory/saas-apps/samanage-provisioning-tutorial.md) | ● | ● |
| SAML-based apps | | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Swit](../../active-directory/saas-apps/swit-provisioning-tutorial.md) | ● | ● |
| [Symantec Web Security Service (WSS)](../../active-directory/saas-apps/symantec-web-security-service.md) | ● | ● |
| [Tableau Cloud](../../active-directory/saas-apps/tableau-online-provisioning-tutorial.md) | ● | ● |
+| [Tailscale](../../active-directory/saas-apps/tailscale-provisioning-tutorial.md) | ● | |
| [Talentech](../../active-directory/saas-apps/talentech-provisioning-tutorial.md) | ● | |
+| [Tanium SSO](../../active-directory/saas-apps/tanium-sso-provisioning-tutorial.md) | ● | ● |
| [Tap App Security](../../active-directory/saas-apps/tap-app-security-provisioning-tutorial.md) | ● | ● |
| [Taskize Connect](../../active-directory/saas-apps/taskize-connect-provisioning-tutorial.md) | ● | ● |
| [Teamgo](../../active-directory/saas-apps/teamgo-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Uber](../../active-directory/saas-apps/uber-provisioning-tutorial.md) | ● | |
| [UNIFI](../../active-directory/saas-apps/unifi-provisioning-tutorial.md) | ● | ● |
| [uniFlow Online](../../active-directory/saas-apps/uniflow-online-provisioning-tutorial.md) | ● | ● |
+| uni-tel | ● | |
+| [Vault Platform](../../active-directory/saas-apps/vault-platform-provisioning-tutorial.md) | ● | ● |
+| [Vbrick Rev Cloud](../../active-directory/saas-apps/vbrick-rev-cloud-provisioning-tutorial.md) | ● | ● |
+| [V-Client](../../active-directory/saas-apps/v-client-provisioning-tutorial.md) | ● | ● |
| [Velpic](../../active-directory/saas-apps/velpic-provisioning-tutorial.md) | ● | ● |
| [Visibly](../../active-directory/saas-apps/visibly-provisioning-tutorial.md) | ● | ● |
| [Visitly](../../active-directory/saas-apps/visitly-provisioning-tutorial.md) | ● | ● |
| [Vonage](../../active-directory/saas-apps/vonage-provisioning-tutorial.md) | ● | ● |
+| [WATS](../../active-directory/saas-apps/wats-provisioning-tutorial.md) | ● | |
| [Webroot Security Awareness Training](../../active-directory/saas-apps/webroot-security-awareness-training-provisioning-tutorial.md) | ● | |
| [WEDO](../../active-directory/saas-apps/wedo-provisioning-tutorial.md) | ● | ● |
| [Whimsical](../../active-directory/saas-apps/whimsical-provisioning-tutorial.md) | ● | ● |
Microsoft Entra identity governance can be integrated with many other applicatio
| [Workplace by Facebook](../../active-directory/saas-apps/workplace-by-facebook-provisioning-tutorial.md) | ● | ● |
| [Workgrid](../../active-directory/saas-apps/workgrid-provisioning-tutorial.md) | ● | ● |
| [Wrike](../../active-directory/saas-apps/wrike-provisioning-tutorial.md) | ● | ● |
+| [Xledger](../../active-directory/saas-apps/xledger-provisioning-tutorial.md) | ● | |
| [Yellowbox](../../active-directory/saas-apps/yellowbox-provisioning-tutorial.md) | ● | |
| [Zapier](../../active-directory/saas-apps/zapier-provisioning-tutorial.md) | ● | |
| [Zendesk](../../active-directory/saas-apps/zendesk-provisioning-tutorial.md) | ● | ● |
active-directory Tutorial Prepare User Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/tutorial-prepare-user-accounts.md
Next, we create Britta Simon. This is the account that is used as our manager.
As an alternative, the following PowerShell script may also be used to quickly create the two users needed to execute a lifecycle workflow. One user represents our new employee and the second represents the new employee's manager.
->[!IMPORTANT]
->The following PowerShell script is provided to quickly create the two users required for this tutorial. These users can also be created manually by signing in to the Microsoft Entra Admin center as a global administrator and creating them.
+> [!IMPORTANT]
+> The following PowerShell script is provided to quickly create the two users required for this tutorial. These users can also be created in the Microsoft Entra Admin center.
To complete this step, save the following PowerShell script to a location on a machine that has access to Azure.
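The tutorial's own script follows in the source article. Purely as an illustration of the same task, here is a hedged Microsoft Graph PowerShell sketch that creates a placeholder new hire plus Britta Simon as the manager; the new hire's name, both UPNs, and the password are hypothetical placeholders, not values from the tutorial:

```powershell
# Illustrative sketch only (not the tutorial's script): create a new-hire
# user and a manager with Microsoft Graph PowerShell, then link them.
Connect-MgGraph -Scopes 'User.ReadWrite.All'

$password = @{ Password = '<initial-password>'; ForceChangePasswordNextSignIn = $true }

# Hypothetical new employee.
$employee = New-MgUser -DisplayName 'New Employee' -MailNickname 'newemployee' `
    -UserPrincipalName 'newemployee@contoso.onmicrosoft.com' `
    -PasswordProfile $password -AccountEnabled

# Britta Simon, the account used as the manager.
$manager = New-MgUser -DisplayName 'Britta Simon' -MailNickname 'bsimon' `
    -UserPrincipalName 'bsimon@contoso.onmicrosoft.com' `
    -PasswordProfile $password -AccountEnabled

# Point the employee's manager attribute at Britta Simon.
Set-MgUserManagerByRef -UserId $employee.Id -BodyParameter @{
    '@odata.id' = "https://graph.microsoft.com/v1.0/users/$($manager.Id)"
}
```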
active-directory Lines Elibrary Advance Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lines-elibrary-advance-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Lines eLibrary Advance for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Lines eLibrary Advance tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Lines eLibrary Advance for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Lines eLibrary Advance tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Lines eLibrary Advance for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Liquidfiles Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/liquidfiles-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to LiquidFiles Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the LiquidFiles tile in the My Apps, this will redirect to LiquidFiles Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the LiquidFiles tile in the My Apps, this will redirect to LiquidFiles Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Litmos Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/litmos-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SAP Litmos for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SAP Litmos tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SAP Litmos for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SAP Litmos tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SAP Litmos for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Lms And Education Management System Leaf Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lms-and-education-management-system-leaf-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to LMS and Education Management System Leaf Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the LMS and Education Management System Leaf tile in the My Apps, this will redirect to LMS and Education Management System Leaf Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the LMS and Education Management System Leaf tile in the My Apps, this will redirect to LMS and Education Management System Leaf Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Locus Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/locus-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Locus Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Locus tile in the My Apps, this will redirect to Locus Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Locus tile in the My Apps, this will redirect to Locus Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Lusha Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lusha-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Lusha for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Lusha tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Lusha for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Lusha tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Lusha for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Lusid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lusid-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the LUSID for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the LUSID tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the LUSID for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the LUSID tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the LUSID for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Lynda Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lynda-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Lynda.com Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Lynda.com tile in the My Apps, this will redirect to Lynda.com Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Lynda.com tile in the My Apps, this will redirect to Lynda.com Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Lytx Drivecam Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lytx-drivecam-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the Lytx DriveCam for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Lytx DriveCam tile in the My Apps, you should be automatically signed in to the Lytx DriveCam for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Lytx DriveCam tile in the My Apps, you should be automatically signed in to the Lytx DriveCam for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Lyve Cloud Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lyve-cloud-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the Lyve Cloud for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Lyve Cloud tile in the My Apps, you should be automatically signed in to the Lyve Cloud for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Lyve Cloud tile in the My Apps, you should be automatically signed in to the Lyve Cloud for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mail Luck Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mail-luck-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Mail Luck! Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Mail Luck! tile in the My Apps, this will redirect to Mail Luck! Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Mail Luck! tile in the My Apps, this will redirect to Mail Luck! Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Manabipocket Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/manabipocket-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Manabi Pocket Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Manabi Pocket tile in the My Apps, this will redirect to Manabi Pocket Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Manabi Pocket tile in the My Apps, this will redirect to Manabi Pocket Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Manifestly Checklists Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/manifestly-checklists-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Manifestly Checklists for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Manifestly Checklists tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Manifestly Checklists for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Manifestly Checklists tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Manifestly Checklists for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mapiq Essentials Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mapiq-essentials-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Mapiq Essentials Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Mapiq Essentials tile in the My Apps, this will redirect to Mapiq Essentials Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Mapiq Essentials tile in the My Apps, this will redirect to Mapiq Essentials Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Maximo Application Suite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/maximo-application-suite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal to be taken to the Maximo login page, where you enter your SAML identity as a fully qualified email address. If the user has already authenticated with the IdP, the Maximo Application Suite doesn't prompt for sign-in again, and the browser is redirected to the home page.
-* You can also use Microsoft My Apps to test the application in any mode. When you click the Maximo Application Suite tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Maximo Application Suite for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can also use Microsoft My Apps to test the application in any mode. When you click the Maximo Application Suite tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Maximo Application Suite for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
> [!Note]
> Screenshots are from MAS Continuous-delivery 8.9 and may differ in future versions.
active-directory Maxxpoint Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/maxxpoint-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the MaxxPoint for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the MaxxPoint tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the MaxxPoint for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the MaxxPoint tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the MaxxPoint for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mcm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mcm-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to MCM Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the MCM tile in the My Apps, this will redirect to MCM Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the MCM tile in the My Apps, this will redirect to MCM Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Meta Work Accounts Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/meta-work-accounts-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Meta Work Accounts for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Meta Work Accounts tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Meta Work Accounts for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Meta Work Accounts tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Meta Work Accounts for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mihcm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mihcm-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to MiHCM Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the MiHCM tile in the My Apps, this will redirect to MiHCM Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the MiHCM tile in the My Apps, this will redirect to MiHCM Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mindflash Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mindflash-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Learn Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Trakstar Learn tile in the My Apps, this will redirect to Learn Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Trakstar Learn tile in the My Apps, this will redirect to Learn Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mint Tms Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mint-tms-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the MINT TMS for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the MINT TMS tile in the My Apps, you should be automatically signed in to the MINT TMS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the MINT TMS tile in the My Apps, you should be automatically signed in to the MINT TMS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Mist Cloud Admin Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mist-cloud-admin-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Mist Cloud Admin SSO for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Mist Cloud Admin SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mist Cloud Admin SSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Mist Cloud Admin SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mist Cloud Admin SSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mobilexpense Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mobilexpense-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Mobile Xpense for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Mobile Xpense tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mobile Xpense for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Mobile Xpense tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mobile Xpense for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Momenta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/momenta-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Momenta for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Momenta tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Momenta for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Momenta tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Momenta for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Motus Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/motus-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Motus for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Motus tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Motus for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Motus tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Motus for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Moxiengage Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/moxiengage-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Moxi Engage Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Moxi Engage tile in the My Apps, this will redirect to Moxi Engage Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Moxi Engage tile in the My Apps, this will redirect to Moxi Engage Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Moxtra Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/moxtra-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Moxtra Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Moxtra tile in the My Apps, this will redirect to Moxtra Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Moxtra tile in the My Apps, this will redirect to Moxtra Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mural Identity Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mural-identity-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Mural Identity for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Mural Identity tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mural Identity for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Mural Identity tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Mural Identity for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Change log
active-directory Myaos Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/myaos-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the myAOS for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the myAOS tile in the My Apps, you should be automatically signed in to the myAOS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the myAOS tile in the My Apps, you should be automatically signed in to the myAOS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Myaryaka Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/myaryaka-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to MyAryaka Sign-On URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the MyAryaka tile in the My Apps, this will redirect to MyAryaka Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the MyAryaka tile in the My Apps, this will redirect to MyAryaka Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Myawardpoints Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/myawardpoints-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to My Award Points Top Sub/Top Team Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the My Award Points Top Sub/Top Team tile in the My Apps, this will redirect to My Award Points Top Sub/Top Team Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the My Award Points Top Sub/Top Team tile in the My Apps, this will redirect to My Award Points Top Sub/Top Team Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mymobilityhq Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mymobilityhq-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to myMobilityHQ Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the myMobilityHQ tile in the My Apps, this will redirect to myMobilityHQ Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the myMobilityHQ tile in the My Apps, this will redirect to myMobilityHQ Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Mypolicies Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mypolicies-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the myPolicies for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the myPolicies tile in the My Apps, you should be automatically signed in to the myPolicies for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the myPolicies tile in the My Apps, you should be automatically signed in to the myPolicies for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Mysdworxcom Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/mysdworxcom-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the my.sdworx.com for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the my.sdworx.com tile in the My Apps, you should be automatically signed in to the my.sdworx.com for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the my.sdworx.com tile in the My Apps, you should be automatically signed in to the my.sdworx.com for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory N2f Expensereports Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/n2f-expensereports-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the N2F - Expense reports for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the N2F - Expense reports tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the N2F - Expense reports for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the N2F - Expense reports tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the N2F - Expense reports for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Navan Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/navan-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Navan for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Navan tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Navan for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Navan tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Navan for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Negometrixportal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/negometrixportal-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to NegometrixPortal Single Sign On (SSO) Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the NegometrixPortal Single Sign On (SSO) tile in the My Apps, this will redirect to NegometrixPortal Single Sign On (SSO) Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the NegometrixPortal Single Sign On (SSO) tile in the My Apps, this will redirect to NegometrixPortal Single Sign On (SSO) Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Neotalogicstudio Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/neotalogicstudio-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Neota Studio Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Neota Studio tile in the My Apps, this will redirect to Neota Studio Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Neota Studio tile in the My Apps, this will redirect to Neota Studio Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Netskope Cloud Exchange Administration Console Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/netskope-cloud-exchange-administration-console-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Netskope Cloud Exchange Administration Console Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Netskope Cloud Exchange Administration Console tile in the My Apps, this will redirect to Netskope Cloud Exchange Administration Console Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Netskope Cloud Exchange Administration Console tile in the My Apps, this will redirect to Netskope Cloud Exchange Administration Console Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Netskope Cloud Security Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/netskope-cloud-security-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Netskope Administrator Console for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Netskope Administrator Console tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Netskope Administrator Console for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Netskope Administrator Console tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Netskope Administrator Console for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Netsparker Enterprise Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/netsparker-enterprise-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Invicti for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Invicti tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Invicti for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Invicti tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Invicti for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Newsignature Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/newsignature-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Cloud Management Portal for Microsoft Azure Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Cloud Management Portal for Microsoft Azure tile in the My Apps, this will redirect to Cloud Management Portal for Microsoft Azure Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Cloud Management Portal for Microsoft Azure tile in the My Apps, this will redirect to Cloud Management Portal for Microsoft Azure Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Nice Cxone Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/nice-cxone-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to NICE CXone Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the NICE CXone tile in the My Apps, this will redirect to NICE CXone Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the NICE CXone tile in the My Apps, this will redirect to NICE CXone Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Nodetrax Project Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/nodetrax-project-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Nodetrax Project for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Nodetrax Project tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Nodetrax Project for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Nodetrax Project tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Nodetrax Project for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Nomadesk Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/nomadesk-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Nomadesk Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Nomadesk tile in the My Apps, this will redirect to Nomadesk Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Nomadesk tile in the My Apps, this will redirect to Nomadesk Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Ns1 Sso Azure Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/ns1-sso-azure-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the NS1 SSO for Azure for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the NS1 SSO for Azure tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the NS1 SSO for Azure for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the NS1 SSO for Azure tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the NS1 SSO for Azure for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Onedesk Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/onedesk-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the OneDesk for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the OneDesk tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the OneDesk for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the OneDesk tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the OneDesk for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Oneflow Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oneflow-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Oneflow for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Oneflow tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Oneflow for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Oneflow tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Oneflow for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oneteam Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oneteam-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Oneteam for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Oneteam tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Oneteam for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Oneteam tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Oneteam for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Opal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/opal-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Opal for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Opal tile in the My Apps, you should be automatically signed in to the Opal for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Opal tile in the My Apps, you should be automatically signed in to the Opal for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Openlearning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/openlearning-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to OpenLearning Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the OpenLearning tile in the My Apps, this will redirect to OpenLearning Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the OpenLearning tile in the My Apps, this will redirect to OpenLearning Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Optiturn Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/optiturn-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to OptiTurn Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the OptiTurn tile in the My Apps, this will redirect to OptiTurn Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the OptiTurn tile in the My Apps, this will redirect to OptiTurn Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oracle Access Manager For Oracle Ebs Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oracle-access-manager-for-oracle-ebs-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Oracle Access Manager for Oracle E-Business Suite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Oracle Access Manager for Oracle E-Business Suite tile in the My Apps, this will redirect to Oracle Access Manager for Oracle E-Business Suite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Oracle Access Manager for Oracle E-Business Suite tile in the My Apps, this will redirect to Oracle Access Manager for Oracle E-Business Suite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oracle Access Manager For Oracle Retail Merchandising Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oracle-access-manager-for-oracle-retail-merchandising-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Oracle Access Manager for Oracle Retail Merchandising Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Oracle Access Manager for Oracle Retail Merchandising tile in the My Apps, this will redirect to Oracle Access Manager for Oracle Retail Merchandising Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Oracle Access Manager for Oracle Retail Merchandising tile in the My Apps, this will redirect to Oracle Access Manager for Oracle Retail Merchandising Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oracle Idcs For Ebs Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oracle-idcs-for-ebs-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Oracle IDCS for E-Business Suite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Oracle IDCS for E-Business Suite tile in the My Apps, this will redirect to Oracle IDCS for E-Business Suite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Oracle IDCS for E-Business Suite tile in the My Apps, this will redirect to Oracle IDCS for E-Business Suite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oracle Idcs For Jd Edwards Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oracle-idcs-for-jd-edwards-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Oracle IDCS for JD Edwards Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Oracle IDCS for JD Edwards tile in the My Apps, this will redirect to Oracle IDCS for JD Edwards Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Oracle IDCS for JD Edwards tile in the My Apps, this will redirect to Oracle IDCS for JD Edwards Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oracle Idcs For Peoplesoft Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oracle-idcs-for-peoplesoft-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Oracle IDCS for PeopleSoft Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Oracle IDCS for PeopleSoft tile in the My Apps, this will redirect to Oracle IDCS for PeopleSoft Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Oracle IDCS for PeopleSoft tile in the My Apps, this will redirect to Oracle IDCS for PeopleSoft Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Oreilly Learning Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/oreilly-learning-platform-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the O'Reilly learning platform for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the O'Reilly learning platform tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the O'Reilly learning platform for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the O'Reilly learning platform tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the O'Reilly learning platform for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Pagedna Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pagedna-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to PageDNA Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PageDNA tile in the My Apps, this will redirect to PageDNA Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PageDNA tile in the My Apps, this will redirect to PageDNA Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Palantir Foundry Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/palantir-foundry-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Palantir Foundry for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Palantir Foundry tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Palantir Foundry for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Palantir Foundry tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Palantir Foundry for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Parallels Desktop Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/parallels-desktop-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Parallels Desktop Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Parallels Desktop tile in the My Apps, this will redirect to Parallels Desktop Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Parallels Desktop tile in the My Apps, this will redirect to Parallels Desktop Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Parkable Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/parkable-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Parkable Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Parkable tile in the My Apps, this will redirect to Parkable Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Parkable tile in the My Apps, this will redirect to Parkable Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Parkhere Corporate Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/parkhere-corporate-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the ParkHere Corporate for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the ParkHere Corporate tile in the My Apps, you should be automatically signed in to the ParkHere Corporate for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ParkHere Corporate tile in the My Apps, you should be automatically signed in to the ParkHere Corporate for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Patentsquare Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/patentsquare-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to PatentSQUARE Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PatentSQUARE tile in the My Apps, this will redirect to PatentSQUARE Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PatentSQUARE tile in the My Apps, this will redirect to PatentSQUARE Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pavaso Digital Close Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pavaso-digital-close-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Pavaso Digital Close for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Pavaso Digital Close tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Pavaso Digital Close for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Pavaso Digital Close tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Pavaso Digital Close for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Peakon Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/peakon-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Peakon for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Peakon tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Peakon for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Peakon tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Peakon for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pennylane Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pennylane-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Pennylane Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the Pennylane tile in the My Apps, this will redirect to Pennylane Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the Pennylane tile in the My Apps, this will redirect to Pennylane Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Peoplecart Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/peoplecart-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Peoplecart Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Peoplecart tile in the My Apps, this will redirect to Peoplecart Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Peoplecart tile in the My Apps, this will redirect to Peoplecart Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Perceptionunitedstates Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/perceptionunitedstates-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the UltiPro Perception for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the UltiPro Perception tile in the My Apps, you should be automatically signed in to the UltiPro Perception for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the UltiPro Perception tile in the My Apps, you should be automatically signed in to the UltiPro Perception for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Percolate Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/percolate-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Percolate for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Percolate tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Percolate for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Percolate tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Percolate for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Periscope Data Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/periscope-data-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Periscope Data Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Periscope Data tile in the My Apps, this will redirect to Periscope Data Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Periscope Data tile in the My Apps, this will redirect to Periscope Data Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Phenom Txm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/phenom-txm-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Phenom TXM for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Phenom TXM tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Phenom TXM for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Phenom TXM tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Phenom TXM for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pinpoint Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pinpoint-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Pinpoint (SAML) Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Pinpoint (SAML) tile in the My Apps, this will redirect to Pinpoint (SAML) Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Pinpoint (SAML) tile in the My Apps, this will redirect to Pinpoint (SAML) Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Pksha Chatbot Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pksha-chatbot-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to PKSHA Chatbot Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PKSHA Chatbot tile in the My Apps, this will redirect to PKSHA Chatbot Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PKSHA Chatbot tile in the My Apps, this will redirect to PKSHA Chatbot Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Planview Admin Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/planview-admin-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Planview Admin for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Planview Admin tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Planview Admin for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Planview Admin tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Planview Admin for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Planview Leankit Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/planview-leankit-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Planview LeanKit for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Planview LeanKit tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Planview LeanKit for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Planview LeanKit tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Planview LeanKit for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pluto Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pluto-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Pluto Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Pluto tile in the My Apps, this will redirect to Pluto Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Pluto tile in the My Apps, this will redirect to Pluto Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Policystat Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/policystat-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to PolicyStat Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PolicyStat tile in the My Apps, this will redirect to PolicyStat Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PolicyStat tile in the My Apps, this will redirect to PolicyStat Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Postbeyond Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/postbeyond-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to PostBeyond Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PostBeyond tile in the My Apps, this will redirect to PostBeyond Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PostBeyond tile in the My Apps, this will redirect to PostBeyond Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Predict360 Sso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/predict360-sso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Predict360 SSO for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Predict360 SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Predict360 SSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Predict360 SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Predict360 SSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Predictixordering Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/predictixordering-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Predictix Ordering Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Predictix Ordering tile in the My Apps, this will redirect to Predictix Ordering Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Predictix Ordering tile in the My Apps, this will redirect to Predictix Ordering Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Predictixpricereporting Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/predictixpricereporting-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Predictix Price Reporting Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Predictix Price Reporting tile in the My Apps, this will redirect to Predictix Price Reporting Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Predictix Price Reporting tile in the My Apps, this will redirect to Predictix Price Reporting Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Preset Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/preset-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Preset for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Preset tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Preset for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Preset tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Preset for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Printerlogic Saas Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/printerlogic-saas-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the PrinterLogic for which you set up the SSO.
-* You can also use Microsoft My Apps to test the application in any mode. When you click the PrinterLogic tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the PrinterLogic for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can also use Microsoft My Apps to test the application in any mode. When you click the PrinterLogic tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the PrinterLogic for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Printix Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/printix-tutorial.md
In this section, you test your Azure AD single sign-on configuration with following options.
* Go to Printix Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Printix tile in the My Apps, this will redirect to Printix Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Printix tile in the My Apps, this will redirect to Printix Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Proactis Rego Invoice Capture Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/proactis-rego-invoice-capture-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Proactis Rego Invoice Capture for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Proactis Rego Invoice Capture tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Proactis Rego Invoice Capture for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Proactis Rego Invoice Capture tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Proactis Rego Invoice Capture for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Proactis Rego Source To Contract Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/proactis-rego-source-to-contract-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Proactis Rego Source-to-Contract Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Proactis Rego Source-to-Contract tile in the My Apps, this will redirect to Proactis Rego Source-to-Contract Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Proactis Rego Source-to-Contract tile in the My Apps, this will redirect to Proactis Rego Source-to-Contract Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Proactis Rego Source To Pay Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/proactis-rego-source-to-pay-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Proactis Rego Source-to-Pay Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Proactis Rego Source-to-Pay tile in the My Apps, this will redirect to Proactis Rego Source-to-Pay Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Proactis Rego Source-to-Pay tile in the My Apps, this will redirect to Proactis Rego Source-to-Pay Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Profitco Saml App Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/profitco-saml-app-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Profit.co for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Profit.co tile in the My Apps, you should be automatically signed in to the Profit.co for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Profit.co tile in the My Apps, you should be automatically signed in to the Profit.co for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Projectplace Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/projectplace-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ProjectPlace for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ProjectPlace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProjectPlace for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ProjectPlace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProjectPlace for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Promaster Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/promaster-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ProMaster (by Inlogik) for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ProMaster (by Inlogik) tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProMaster (by Inlogik) for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ProMaster (by Inlogik) tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProMaster (by Inlogik) for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pronovos Ops Manager Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pronovos-ops-manager-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ProNovos Ops Manager for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ProNovos Ops Manager tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProNovos Ops Manager for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ProNovos Ops Manager tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ProNovos Ops Manager for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Proofpoint Ondemand Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/proofpoint-ondemand-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Proofpoint on Demand Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Proofpoint on Demand tile in the My Apps, this will redirect to Proofpoint on Demand Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Proofpoint on Demand tile in the My Apps, this will redirect to Proofpoint on Demand Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Proofpoint Security Awareness Training Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/proofpoint-security-awareness-training-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Proofpoint Security Awareness Training for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Proofpoint Security Awareness Training tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Proofpoint Security Awareness Training for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Proofpoint Security Awareness Training tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Proofpoint Security Awareness Training for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Pwc Identity Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pwc-identity-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to PwC Identity Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PwC Identity tile in the My Apps, this will redirect to PwC Identity Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PwC Identity tile in the My Apps, this will redirect to PwC Identity Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pymetrics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pymetrics-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to pymetrics Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the pymetrics tile in the My Apps, this will redirect to pymetrics Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the pymetrics tile in the My Apps, this will redirect to pymetrics Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Qradar Soar Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/qradar-soar-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the QRadar SOAR for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the QRadar SOAR tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the QRadar SOAR for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the QRadar SOAR tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the QRadar SOAR for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Qreserve Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/qreserve-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the QReserve for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the QReserve tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the QReserve for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the QReserve tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the QReserve for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Qualaroo Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/qualaroo-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Qualaroo for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Qualaroo tile in the My Apps, you should be automatically signed in to the Qualaroo for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Qualaroo tile in the My Apps, you should be automatically signed in to the Qualaroo for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Quantum Workplace Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/quantum-workplace-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Quantum Workplace for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Quantum Workplace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Quantum Workplace for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Quantum Workplace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Quantum Workplace for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Questetra Bpm Suite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/questetra-bpm-suite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Questetra BPM Suite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Questetra BPM Suite tile in the My Apps, this will redirect to Questetra BPM Suite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Questetra BPM Suite tile in the My Apps, this will redirect to Questetra BPM Suite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rackspacesso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rackspacesso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Rackspace SSO for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Rackspace SSO tile in the My Apps, you should be automatically signed in to the Rackspace SSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Rackspace SSO tile in the My Apps, you should be automatically signed in to the Rackspace SSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
You can also use the **Validate** button in the **Rackspace SSO** Single sign-on settings.
active-directory Radancys Employee Referrals Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/radancys-employee-referrals-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Radancy's Employee Referrals for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Radancy's Employee Referrals tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Radancy's Employee Referrals for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Radancy's Employee Referrals tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Radancy's Employee Referrals for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Radiant Iot Portal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/radiant-iot-portal-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Radiant IOT Portal Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Radiant IOT Portal tile in the My Apps, this will redirect to Radiant IOT Portal Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Radiant IOT Portal tile in the My Apps, this will redirect to Radiant IOT Portal Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Raketa Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/raketa-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Raketa Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Raketa tile in the My Apps, this will redirect to Raketa Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Raketa tile in the My Apps, this will redirect to Raketa Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Real Links Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/real-links-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Real Links Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Real Links tile in the My Apps, this will redirect to Real Links Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Real Links tile in the My Apps, this will redirect to Real Links Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Recurly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/recurly-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Recurly for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Recurly tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Recurly for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Recurly tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Recurly for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Redocly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/redocly-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Redocly Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Redocly tile in the My Apps, this will redirect to Redocly Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Redocly tile in the My Apps, this will redirect to Redocly Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Redvector Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/redvector-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to RedVector Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the RedVector tile in the My Apps, this will redirect to RedVector Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the RedVector tile in the My Apps, this will redirect to RedVector Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Renraku Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/renraku-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to PHONE APPLI PEOPLE Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the PHONE APPLI PEOPLE tile in the My Apps, this will redirect to PHONE APPLI PEOPLE Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the PHONE APPLI PEOPLE tile in the My Apps, this will redirect to PHONE APPLI PEOPLE Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Reprints Desk Article Galaxy Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/reprints-desk-article-galaxy-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Reprints Desk - Article Galaxy for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Reprints Desk - Article Galaxy tile in the My Apps, you should be automatically signed in to the Reprints Desk - Article Galaxy for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Reprints Desk - Article Galaxy tile in the My Apps, you should be automatically signed in to the Reprints Desk - Article Galaxy for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Respondent Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/respondent-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Respondent for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Respondent tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Respondent for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Respondent tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Respondent for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Reviewsnap Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/reviewsnap-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Reviewsnap for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Reviewsnap tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Reviewsnap for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Reviewsnap tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Reviewsnap for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rightanswers Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rightanswers-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to RightAnswers Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the RightAnswers tile in the My Apps, this will redirect to RightAnswers Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the RightAnswers tile in the My Apps, this will redirect to RightAnswers Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Risecom Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/risecom-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Rise.com for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Rise.com tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Rise.com for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Rise.com tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Rise.com for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Riskware Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/riskware-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Riskware Sign-On URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Riskware tile in the My Apps, this will redirect to Riskware Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Riskware tile in the My Apps, this will redirect to Riskware Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rocketreach Sso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rocketreach-sso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the RocketReach SSO for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the RocketReach SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the RocketReach SSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the RocketReach SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the RocketReach SSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rolepoint Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rolepoint-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to RolePoint Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the RolePoint tile in the My Apps, this will redirect to RolePoint Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the RolePoint tile in the My Apps, this will redirect to RolePoint Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rootly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rootly-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Rootly for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Rootly tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Rootly for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Rootly tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Rootly for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rsa Archer Suite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rsa-archer-suite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to RSA Archer Suite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the RSA Archer Suite tile in the My Apps, this will redirect to RSA Archer Suite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the RSA Archer Suite tile in the My Apps, this will redirect to RSA Archer Suite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Rstudio Connect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/rstudio-connect-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the RStudio Connect SAML Authentication for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the RStudio Connect SAML Authentication tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the RStudio Connect SAML Authentication for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the RStudio Connect SAML Authentication tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the RStudio Connect SAML Authentication for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Runmyprocess Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/runmyprocess-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to RunMyProcess Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the RunMyProcess tile in the My Apps, this will redirect to RunMyProcess Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the RunMyProcess tile in the My Apps, this will redirect to RunMyProcess Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Safeconnect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/safeconnect-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SafeConnect Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SafeConnect tile in the My Apps, this will redirect to SafeConnect Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SafeConnect tile in the My Apps, this will redirect to SafeConnect Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Safety Culture Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/safety-culture-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically logged in to SafetyCulture for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SafetyCulture tile in My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IdP mode, you should be automatically logged in to SafetyCulture for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SafetyCulture tile in My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IdP mode, you should be automatically logged in to SafetyCulture for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Saml Toolkit Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/saml-toolkit-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Azure AD SAML Toolkit Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Azure AD SAML Toolkit tile in the My Apps, this will redirect to Azure AD SAML Toolkit Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Azure AD SAML Toolkit tile in the My Apps, this will redirect to Azure AD SAML Toolkit Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sauce Labs Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sauce-labs-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Sauce Labs for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Sauce Labs tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Sauce Labs for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Sauce Labs tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Sauce Labs for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Scalex Enterprise Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/scalex-enterprise-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ScaleX Enterprise for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ScaleX Enterprise tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ScaleX Enterprise for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ScaleX Enterprise tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ScaleX Enterprise for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Scilife Azure Ad Sso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/scilife-azure-ad-sso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Scilife Azure AD SSO Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Scilife Azure AD SSO tile in the My Apps, this will redirect to Scilife Azure AD SSO Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Scilife Azure AD SSO tile in the My Apps, this will redirect to Scilife Azure AD SSO Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Sciquest Spend Director Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sciquest-spend-director-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SciQuest Spend Director Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SciQuest Spend Director tile in the My Apps, this will redirect to SciQuest Spend Director Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SciQuest Spend Director tile in the My Apps, this will redirect to SciQuest Spend Director Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Screencast Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/screencast-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Screencast-O-Matic Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Screencast-O-Matic tile in the My Apps, this will redirect to Screencast-O-Matic Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Screencast-O-Matic tile in the My Apps, this will redirect to Screencast-O-Matic Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Screensteps Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/screensteps-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to ScreenSteps Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the ScreenSteps tile in the My Apps, this will redirect to ScreenSteps Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ScreenSteps tile in the My Apps, this will redirect to ScreenSteps Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Scuba Analytics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/scuba-analytics-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Scuba Analytics for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Scuba Analytics tile in the My Apps, you should be automatically signed in to the Scuba Analytics for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Scuba Analytics tile in the My Apps, you should be automatically signed in to the Scuba Analytics for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Seattletimessso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/seattletimessso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the SeattleTimesSSO for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the SeattleTimesSSO tile in the My Apps, you should be automatically signed in to the SeattleTimesSSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SeattleTimesSSO tile in the My Apps, you should be automatically signed in to the SeattleTimesSSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Seculio Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/seculio-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Seculio for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Seculio tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Seculio for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Seculio tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Seculio for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Securedeliver Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/securedeliver-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SECURE DELIVER Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SECURE DELIVER tile in the My Apps, this will redirect to SECURE DELIVER Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SECURE DELIVER tile in the My Apps, this will redirect to SECURE DELIVER Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Securetransport Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/securetransport-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SecureTransport Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SecureTransport tile in the My Apps, this will redirect to SecureTransport Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SecureTransport tile in the My Apps, this will redirect to SecureTransport Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Sedgwickcms Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sedgwickcms-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Sedgwick CMS for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Sedgwick CMS tile in the My Apps, you should be automatically signed in to the Sedgwick CMS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Sedgwick CMS tile in the My Apps, you should be automatically signed in to the Sedgwick CMS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Seekout Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/seekout-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SeekOut for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SeekOut tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SeekOut for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SeekOut tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SeekOut for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sensoscientific Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sensoscientific-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the SensoScientific Wireless Temperature Monitoring System for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the SensoScientific Wireless Temperature Monitoring System tile in the My Apps, you should be automatically signed in to the SensoScientific Wireless Temperature Monitoring System for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SensoScientific Wireless Temperature Monitoring System tile in the My Apps, you should be automatically signed in to the SensoScientific Wireless Temperature Monitoring System for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Servicessosafe Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/servicessosafe-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SoSafe for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SoSafe tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SoSafe for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SoSafe tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SoSafe for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Servusconnect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/servusconnect-tutorial.md
You may test your Azure AD single sign-on configuration using one of the following options.
* Go to [ServusConnect Sign-on URL](https://app.servusconnect.com/) directly and initiate the login flow from there. See **[Sign-on with SSO](#sign-on-with-sso)**, below.
-* You can use Microsoft My Apps. When you click the ServusConnect tile in the My Apps, this will redirect to ServusConnect Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ServusConnect tile in the My Apps, this will redirect to ServusConnect Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Sign-on with SSO
active-directory Settlingmusic Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/settlingmusic-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Settling music Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Settling music tile in the My Apps, this will redirect to Settling music Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Settling music tile in the My Apps, this will redirect to Settling music Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sevone Network Monitoring System Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sevone-network-monitoring-system-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SevOne Network Monitoring System (NMS) Sign-On URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SevOne Network Monitoring System (NMS) tile in the My Apps, this will redirect to SevOne Network Monitoring System (NMS) Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SevOne Network Monitoring System (NMS) tile in the My Apps, this will redirect to SevOne Network Monitoring System (NMS) Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sharefile Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sharefile-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Citrix ShareFile Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Citrix ShareFile tile in the My Apps, this will redirect to Citrix ShareFile Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Citrix ShareFile tile in the My Apps, this will redirect to Citrix ShareFile Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sharevault Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sharevault-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ShareVault for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ShareVault tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ShareVault for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ShareVault tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ShareVault for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Shiftplanning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/shiftplanning-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Humanity Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Humanity tile in the My Apps, this will redirect to Humanity Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Humanity tile in the My Apps, this will redirect to Humanity Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Shiphazmat Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/shiphazmat-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the ShipHazmat for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the ShipHazmat tile in the My Apps, you should be automatically signed in to the ShipHazmat for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ShipHazmat tile in the My Apps, you should be automatically signed in to the ShipHazmat for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Showpad Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/showpad-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Showpad Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Showpad tile in the My Apps, this will redirect to Showpad Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Showpad tile in the My Apps, this will redirect to Showpad Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Shucchonavi Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/shucchonavi-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Shuccho Navi Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Shuccho Navi tile in the My Apps, this will redirect to Shuccho Navi Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Shuccho Navi tile in the My Apps, this will redirect to Shuccho Navi Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Signagelive Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/signagelive-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Signagelive Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Signagelive tile in the My Apps, this will redirect to Signagelive Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Signagelive tile in the My Apps, this will redirect to Signagelive Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Signiant Media Shuttle Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/signiant-media-shuttle-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Signiant Media Shuttle Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Signiant Media Shuttle tile in the My Apps, this will redirect to Signiant Media Shuttle Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Signiant Media Shuttle tile in the My Apps, this will redirect to Signiant Media Shuttle Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Silverback Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/silverback-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Silverback Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Silverback tile in the My Apps, this will redirect to Silverback Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Silverback tile in the My Apps, this will redirect to Silverback Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Sketch Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sketch-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Sketch Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Sketch tile in the My Apps, this will redirect to Sketch Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Sketch tile in the My Apps, this will redirect to Sketch Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skillcast Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skillcast-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Skillcast Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Skillcast tile in the My Apps, this will redirect to Skillcast Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Skillcast tile in the My Apps, this will redirect to Skillcast Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skilljar Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skilljar-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Skilljar Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Skilljar tile in the My Apps, this will redirect to Skilljar Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Skilljar tile in the My Apps, this will redirect to Skilljar Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skills Workflow Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skills-workflow-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Skills Workflow Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Skills Workflow tile in the My Apps, this will redirect to Skills Workflow Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Skills Workflow tile in the My Apps, this will redirect to Skills Workflow Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skillsmanager Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skillsmanager-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Skills Manager for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Skills Manager tile in the My Apps, you should be automatically signed in to the Skills Manager for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Skills Manager tile in the My Apps, you should be automatically signed in to the Skills Manager for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skybreathe Analytics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skybreathe-analytics-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Skybreathe® Analytics for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Skybreathe® Analytics tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Skybreathe® Analytics for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Skybreathe® Analytics tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Skybreathe® Analytics for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skydeskemail Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skydeskemail-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SkyDesk Email Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SkyDesk Email tile in the My Apps, this will redirect to SkyDesk Email Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SkyDesk Email tile in the My Apps, this will redirect to SkyDesk Email Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Skysite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skysite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the SKYSITE for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the SKYSITE tile in the My Apps, you should be automatically signed in to the SKYSITE for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SKYSITE tile in the My Apps, you should be automatically signed in to the SKYSITE for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smallimprovements Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smallimprovements-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Small Improvements Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Small Improvements tile in the My Apps, this will redirect to Small Improvements Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Small Improvements tile in the My Apps, this will redirect to Small Improvements Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smart360 Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smart360-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Smart360 Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Smart360 tile in the My Apps, this will redirect to Smart360 Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Smart360 tile in the My Apps, this will redirect to Smart360 Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smartfile Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smartfile-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SmartFile Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SmartFile tile in the My Apps, this will redirect to SmartFile Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SmartFile tile in the My Apps, this will redirect to SmartFile Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smarthr Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smarthr-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SmartHR Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SmartHR tile in the My Apps, this will redirect to SmartHR Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SmartHR tile in the My Apps, this will redirect to SmartHR Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smartkargo Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smartkargo-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SmartKargo Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SmartKargo tile in the My Apps, this will redirect to SmartKargo Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SmartKargo tile in the My Apps, this will redirect to SmartKargo Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Smartlpa Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/smartlpa-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SmartLPA Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SmartLPA tile in the My Apps, this will redirect to SmartLPA Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SmartLPA tile in the My Apps, this will redirect to SmartLPA Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Snackmagic Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/snackmagic-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Snackmagic for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Snackmagic tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Snackmagic for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Snackmagic tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Snackmagic for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Snowflake Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/snowflake-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Snowflake for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Snowflake tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Snowflake for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Snowflake tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Snowflake for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Soc Sst Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/soc-sst-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SOC SST for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SOC SST tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SOC SST for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SOC SST tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SOC SST for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Softeon Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/softeon-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Softeon WMS for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Softeon WMS tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Softeon WMS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Softeon WMS tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Softeon WMS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Soonr Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/soonr-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Soonr Workplace for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Soonr Workplace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Soonr Workplace for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Soonr Workplace tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Soonr Workplace for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Spedtrack Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/spedtrack-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SpedTrack for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SpedTrack tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SpedTrack for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SpedTrack tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SpedTrack for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Sso For Jama Connect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sso-for-jama-connect-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the SSO for Jama Connect® for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the SSO for Jama Connect® tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SSO for Jama Connect® for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the SSO for Jama Connect® tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the SSO for Jama Connect® for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Stackby Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/stackby-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Stackby for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Stackby tile in the My Apps, you should be automatically signed in to the Stackby for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Stackby tile in the My Apps, you should be automatically signed in to the Stackby for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Standard For Success Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/standard-for-success-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Standard for Success K-12 for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Standard for Success K-12 tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Standard for Success K-12 for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Standard for Success K-12 tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Standard for Success K-12 for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Starmind Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/starmind-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Starmind Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Starmind tile in the My Apps, this will redirect to Starmind Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Starmind tile in the My Apps, this will redirect to Starmind Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Stormboard Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/stormboard-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Stormboard for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Stormboard tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Stormboard for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Stormboard tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Stormboard for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Superannotate Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/superannotate-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SuperAnnotate Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SuperAnnotate tile in the My Apps, this will redirect to SuperAnnotate Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SuperAnnotate tile in the My Apps, this will redirect to SuperAnnotate Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Surfconext Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/surfconext-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SURFconext Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SURFconext tile in the My Apps, this will redirect to SURFconext Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SURFconext tile in the My Apps, this will redirect to SURFconext Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Surfsecureid Azure Mfa Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/surfsecureid-azure-mfa-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SURFsecureID - Azure MFA Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SURFsecureID - Azure MFA tile in the My Apps, this will redirect to SURFsecureID - Azure MFA Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SURFsecureID - Azure MFA tile in the My Apps, this will redirect to SURFsecureID - Azure MFA Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Swit Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/swit-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to Swit Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Swit tile in the My Apps, this will redirect to Swit Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Swit tile in the My Apps, this will redirect to Swit Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Synchronet Click Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/synchronet-click-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to SynchroNet CLICK Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the SynchroNet CLICK tile in the My Apps, this will redirect to SynchroNet CLICK Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the SynchroNet CLICK tile in the My Apps, this will redirect to SynchroNet CLICK Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Syniverse Customer Portal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/syniverse-customer-portal-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Syniverse Customer Portal for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Syniverse Customer Portal tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Syniverse Customer Portal for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Syniverse Customer Portal tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Syniverse Customer Portal for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Talon Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/talon-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on Test this application in Azure portal and you should be automatically signed in to the Talon for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Talon tile in the My Apps, you should be automatically signed in to the Talon for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Talon tile in the My Apps, you should be automatically signed in to the Talon for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Tango Reserve Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tango-reserve-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Tango Reserve by AgilQuest (EU Instance) for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Tango Reserve by AgilQuest (EU Instance) tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Tango Reserve by AgilQuest (EU Instance) for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Tango Reserve by AgilQuest (EU Instance) tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Tango Reserve by AgilQuest (EU Instance) for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tanium Sso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tanium-sso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Tanium SSO for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Tanium SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Tanium SSO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Tanium SSO tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Tanium SSO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Tasc Beta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tasc-beta-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the TASC (beta) for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TASC (beta) tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TASC (beta) for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TASC (beta) tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TASC (beta) for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Teamseer Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/teamseer-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to TeamSeer Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TeamSeer tile in the My Apps, this will redirect to TeamSeer Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TeamSeer tile in the My Apps, this will redirect to TeamSeer Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Teamslide Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/teamslide-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to TeamSlide Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TeamSlide tile in the My Apps, this will redirect to TeamSlide Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TeamSlide tile in the My Apps, this will redirect to TeamSlide Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Teamsticker By Communitio Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/teamsticker-by-communitio-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to TeamSticker by Communitio Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TeamSticker by Communitio tile in the My Apps, this will redirect to TeamSticker by Communitio Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TeamSticker by Communitio tile in the My Apps, this will redirect to TeamSticker by Communitio Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tencent Cloud Idaas Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tencent-cloud-idaas-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the TencentCloud IDaaS for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TencentCloud IDaaS tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TencentCloud IDaaS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TencentCloud IDaaS tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TencentCloud IDaaS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Terratrue Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/terratrue-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click on **Test this application** in Azure portal and you should be automatically signed in to the TerraTrue for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TerraTrue tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TerraTrue for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TerraTrue tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TerraTrue for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tesma Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tesma-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the tesma instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the tesma tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the tesma for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the tesma tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the tesma for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Testim Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/testim-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Testim instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Testim tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Testim for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Testim tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Testim for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Textline Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/textline-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Textline instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Textline tile in the My Apps, you should be automatically signed in to the Textline for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Textline tile in the My Apps, you should be automatically signed in to the Textline for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
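For entries like this one, where the tile signs the user straight in, the browser is delivering an unsolicited (IDP-initiated) SAML response to the application's assertion consumer service. When that silent sign-in fails, decoding the `SAMLResponse` form field is a quick way to see the issuer, audience, and NameID that were actually sent. A debugging-only sketch; it deliberately skips signature and condition validation, so it must never be used to make an authentication decision:

```python
import base64
import xml.etree.ElementTree as ET

NS = {
    "samlp": "urn:oasis:names:tc:SAML:2.0:protocol",
    "saml": "urn:oasis:names:tc:SAML:2.0:assertion",
}

def summarize_saml_response(b64_response: str) -> dict:
    """Decode a POSTed SAMLResponse and extract fields useful when SSO fails.

    Debugging aid only: performs no signature, audience, or timestamp
    validation, so it must never gate authentication.
    """
    root = ET.fromstring(base64.b64decode(b64_response))
    return {
        "issuer": root.findtext("saml:Issuer", namespaces=NS),
        "audience": root.findtext(
            ".//saml:AudienceRestriction/saml:Audience", namespaces=NS
        ),
        "name_id": root.findtext(".//saml:Subject/saml:NameID", namespaces=NS),
    }
```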
## Next steps
active-directory The Funding Portal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/the-funding-portal-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Sign-on URL for The Funding Portal directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the The Funding Portal tile in the My Apps, this will redirect to The Funding Portal Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the The Funding Portal tile in the My Apps, this will redirect to The Funding Portal Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Theom Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/theom-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Theom Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Theom tile in the My Apps, this will redirect to Theom Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Theom tile in the My Apps, this will redirect to Theom Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Thirdlight Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/thirdlight-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the ThirdLight Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the ThirdLight tile in the My Apps, this will redirect to ThirdLight Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ThirdLight tile in the My Apps, this will redirect to ThirdLight Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Threatq Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/threatq-platform-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the ThreatQ Platform Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the ThreatQ Platform tile in the My Apps, this will redirect to ThreatQ Platform Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the ThreatQ Platform tile in the My Apps, this will redirect to ThreatQ Platform Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Tidemark Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tidemark-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Tidemark Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Tidemark tile in the My Apps, this will redirect to Tidemark Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Tidemark tile in the My Apps, this will redirect to Tidemark Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tigergraph Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tigergraph-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TigerGraph instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TigerGraph tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TigerGraph for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TigerGraph tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TigerGraph for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Timelive Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/timelive-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the TimeLive Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TimeLive tile in the My Apps, this will redirect to TimeLive Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TimeLive tile in the My Apps, this will redirect to TimeLive Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Timeoffmanager Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/timeoffmanager-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TimeOffManager instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the TimeOffManager tile in the My Apps, you should be automatically signed in to the TimeOffManager for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TimeOffManager tile in the My Apps, you should be automatically signed in to the TimeOffManager for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Timetabling Solutions Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/timetabling-solutions-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Timetabling Solutions Sign-On URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Timetabling Solutions tile in the My Apps, this will redirect to Timetabling Solutions Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Timetabling Solutions tile in the My Apps, this will redirect to Timetabling Solutions Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Timetrack Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/timetrack-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TimeTrack instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TimeTrack tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TimeTrack for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TimeTrack tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TimeTrack for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tinfoil Security Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tinfoil-security-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TINFOIL SECURITY instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the TINFOIL SECURITY tile in the My Apps, you should be automatically signed in to the TINFOIL SECURITY for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TINFOIL SECURITY tile in the My Apps, you should be automatically signed in to the TINFOIL SECURITY for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tivitz Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tivitz-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the TiViTz Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TiViTz tile in the My Apps, this will redirect to TiViTz Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TiViTz tile in the My Apps, this will redirect to TiViTz Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tonicdm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tonicdm-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TonicDM instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TonicDM tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TonicDM for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TonicDM tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TonicDM for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Torii Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/torii-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Torii instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Torii tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Torii for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Torii tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Torii for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tracker Software Technologies Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tracker-software-technologies-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Tracker Software Technologies instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Tracker Software Technologies tile in the My Apps, you should be automatically signed in to the Tracker Software Technologies for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Tracker Software Technologies tile in the My Apps, you should be automatically signed in to the Tracker Software Technologies for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Trackvia Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/trackvia-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TrackVia instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the TrackVia tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TrackVia for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the TrackVia tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TrackVia for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Training Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/training-platform-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Training Platform instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Training Platform tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Training Platform for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Training Platform tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Training Platform for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tranxfer Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tranxfer-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Tranxfer Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Tranxfer tile in the My Apps, this will redirect to Tranxfer Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Tranxfer tile in the My Apps, this will redirect to Tranxfer Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Trelica Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/trelica-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Trelica instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Trelica tile in the My Apps, you should be automatically signed in to the Trelica for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Trelica tile in the My Apps, you should be automatically signed in to the Trelica for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tripwire Enterprise Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tripwire-enterprise-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Tripwire Enterprise instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Tripwire Enterprise tile in the My Apps, you should be automatically signed in to the Tripwire Enterprise for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Tripwire Enterprise tile in the My Apps, you should be automatically signed in to the Tripwire Enterprise for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Trunarrative Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/trunarrative-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the TruNarrative Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the TruNarrative tile in the My Apps, this will redirect to TruNarrative Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TruNarrative tile in the My Apps, this will redirect to TruNarrative Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Tvu Service Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tvu-service-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the TVU Service instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the TVU Service tile in the My Apps, you should be automatically signed in to the TVU Service for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the TVU Service tile in the My Apps, you should be automatically signed in to the TVU Service for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Twic Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/twic-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Twic instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Twic tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Twic for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Twic tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Twic for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Uber Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/uber-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Uber instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Uber tile in the My Apps, you should be automatically signed in to the Uber for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Uber tile in the My Apps, you should be automatically signed in to the Uber for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Udemy Business Saml Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/udemy-business-saml-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Udemy Business SAML Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Udemy Business SAML tile in the My Apps, this will redirect to Udemy Business SAML Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Udemy Business SAML tile in the My Apps, this will redirect to Udemy Business SAML Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Unite Us Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/unite-us-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Unite Us instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Unite Us tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Unite Us for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Unite Us tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Unite Us for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Us Bank Prepaid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/us-bank-prepaid-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the U.S. Bank Prepaid instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the U.S. Bank Prepaid tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the U.S. Bank Prepaid for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the U.S. Bank Prepaid tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the U.S. Bank Prepaid for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Userecho Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/userecho-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the UserEcho Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the UserEcho tile in the My Apps, this will redirect to UserEcho Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the UserEcho tile in the My Apps, this will redirect to UserEcho Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Usertesting Saml Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/usertesting-saml-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the UserTesting instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the UserTesting tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the UserTesting for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the UserTesting tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the UserTesting for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Userzoom Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/userzoom-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the UserZoom instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the UserZoom tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the UserZoom for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the UserZoom tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the UserZoom for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory V Client Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/v-client-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the V-Client instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the V-Client tile in the My Apps, you should be automatically signed in to the V-Client for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the V-Client tile in the My Apps, you should be automatically signed in to the V-Client for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Valence Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/valence-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Valence Security Platform instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Valence Security Platform tile in the My Apps, you should be automatically signed in to the Valence Security Platform for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Valence Security Platform tile in the My Apps, you should be automatically signed in to the Valence Security Platform for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Valid8me Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/valid8me-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the valid8Me instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the valid8Me tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the valid8Me for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the valid8Me tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the valid8Me for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vault Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vault-platform-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Vault Platform instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Vault Platform tile in the My Apps, you should be automatically signed in to the Vault Platform for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Vault Platform tile in the My Apps, you should be automatically signed in to the Vault Platform for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vbrick Rev Cloud Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vbrick-rev-cloud-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Vbrick Rev Cloud Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Vbrick Rev Cloud tile in the My Apps, this will redirect to Vbrick Rev Cloud Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Vbrick Rev Cloud tile in the My Apps, this will redirect to Vbrick Rev Cloud Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Vecos Releezme Locker Management System Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vecos-releezme-locker-management-system-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the VECOS Releezme Locker management system Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the VECOS Releezme Locker management system tile in the My Apps, this will redirect to VECOS Releezme Locker management system Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the VECOS Releezme Locker management system tile in the My Apps, this will redirect to VECOS Releezme Locker management system Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Veda Cloud Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/veda-cloud-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the VEDA Cloud Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the VEDA Cloud tile in the My Apps, this will redirect to VEDA Cloud Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the VEDA Cloud tile in the My Apps, this will redirect to VEDA Cloud Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Venafi Control Plane Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/venafi-control-plane-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Venafi Control Plane - Datacenter instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Venafi Control Plane - Datacenter tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Venafi Control Plane - Datacenter for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Venafi Control Plane - Datacenter tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Venafi Control Plane - Datacenter for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Vera Suite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vera-suite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Vera Suite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Vera Suite tile in the My Apps, this will redirect to Vera Suite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Vera Suite tile in the My Apps, this will redirect to Vera Suite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Vergesense Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vergesense-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the VergeSense instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the VergeSense tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the VergeSense for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the VergeSense tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the VergeSense for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Verme Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/verme-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Verme instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Verme tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Verme for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Verme tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Verme for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Veza Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/veza-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Veza instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Veza tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Veza for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Veza tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Veza for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vibehcm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vibehcm-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Vibe HCM instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Vibe HCM tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Vibe HCM for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Vibe HCM tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Vibe HCM for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vida Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vida-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the VIDA Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the VIDA tile in the My Apps, this will redirect to VIDA Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the VIDA tile in the My Apps, this will redirect to VIDA Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Virtual Risk Manager Usa Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/virtual-risk-manager-usa-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Virtual Risk Manager - USA instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Virtual Risk Manager - USA tile in the My Apps, you should be automatically signed in to the Virtual Risk Manager - USA for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Virtual Risk Manager - USA tile in the My Apps, you should be automatically signed in to the Virtual Risk Manager - USA for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Visibly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/visibly-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Go to the Visibly Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Visibly tile in the My Apps, this will redirect to Visibly Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Visibly tile in the My Apps, this will redirect to Visibly Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Visitly Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/visitly-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Visitly instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Visitly tile in the My Apps, you should be automatically signed in to the Visitly for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Visitly tile in the My Apps, you should be automatically signed in to the Visitly for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Visitorg Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/visitorg-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Visit.org instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Visit.org tile in the My Apps, you should be automatically signed in to the Visit.org for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Visit.org tile in the My Apps, you should be automatically signed in to the Visit.org for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vmware Identity Service Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vmware-identity-service-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the VMware Identity Service instance for which you set up SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the VMware Identity Service tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the VMware Identity Service for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the VMware Identity Service tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the VMware Identity Service for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Vocoli Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vocoli-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the following options.
* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Vocoli instance for which you set up SSO.
-* You can use Microsoft My Apps. When you click the Vocoli tile in the My Apps, you should be automatically signed in to the Vocoli for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Vocoli tile in the My Apps, you should be automatically signed in to the Vocoli for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vxmaintain Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vxmaintain-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the vxMaintain for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the vxMaintain tile in the My Apps, you should be automatically signed in to the vxMaintain for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the vxMaintain tile in the My Apps, you should be automatically signed in to the vxMaintain for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Vyond Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vyond-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Vyond for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Vyond tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Vyond for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Vyond tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Vyond for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Wan Sign Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wan-sign-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the WAN-Sign for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the WAN-Sign tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WAN-Sign for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the WAN-Sign tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WAN-Sign for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Watch By Colors Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/watch-by-colors-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Watch by Colors for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Watch by Colors tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Watch by Colors for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Watch by Colors tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Watch by Colors for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Wayleadr Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wayleadr-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Wayleadr for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Wayleadr tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Wayleadr for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Wayleadr tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Wayleadr for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Waywedo Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/waywedo-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Way We Do Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Way We Do tile in the My Apps, this will redirect to Way We Do Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Way We Do tile in the My Apps, this will redirect to Way We Do Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Webce Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/webce-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to WebCE Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the WebCE tile in the My Apps, this will redirect to WebCE Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the WebCE tile in the My Apps, this will redirect to WebCE Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Webtma Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/webtma-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the WebTMA for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the WebTMA tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WebTMA for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the WebTMA tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WebTMA for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Wedo Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wedo-tutorial.md
In this section, you test your Azure AD single sign-on configuration with the fo
* Click **Test this application** in Azure portal and you should be automatically signed in to the WEDO for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the WEDO tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WEDO for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the WEDO tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WEDO for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Weekdone Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/weekdone-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Weekdone for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Weekdone tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Weekdone for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Weekdone tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Weekdone for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Whos On Location Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/whos-on-location-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to WhosOnLocation Sign-On URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the WhosOnLocation tile in the My Apps, this will redirect to WhosOnLocation Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the WhosOnLocation tile in the My Apps, this will redirect to WhosOnLocation Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Whosoff Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/whosoff-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the WhosOff for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the WhosOff tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WhosOff for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the WhosOff tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the WhosOff for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Wikispaces Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wikispaces-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Wikispaces Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Wikispaces tile in the My Apps, this will redirect to Wikispaces Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Wikispaces tile in the My Apps, this will redirect to Wikispaces Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Windchill Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/windchill-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
1. Click on **Test this application** in Azure portal and you should be automatically signed in to the Windchill for which you set up the SSO.
-1. You can also use Microsoft My Apps to test the application in any mode. When you click the Windchill tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Windchill for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+1. You can also use Microsoft My Apps to test the application in any mode. When you click the Windchill tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Windchill for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Wisdom By Invictus Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wisdom-by-invictus-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Wisdom by Invictus for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Wisdom by Invictus tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Wisdom by Invictus for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Wisdom by Invictus tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Wisdom by Invictus for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Wistia Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wistia-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Wistia Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Wistia tile in the My Apps, this will redirect to Wistia Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Wistia tile in the My Apps, this will redirect to Wistia Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Wiz Sso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/wiz-sso-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Wiz SSO Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Wiz SSO tile in the My Apps, this will redirect to Wiz SSO Sign-On URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Wiz SSO tile in the My Apps, this will redirect to Wiz SSO Sign-On URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workgrid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workgrid-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Workgrid Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Workgrid tile in the My Apps, this will redirect to Workgrid Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Workgrid tile in the My Apps, this will redirect to Workgrid Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workhub Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workhub-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to workhub Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the workhub tile in the My Apps, this will redirect to workhub Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the workhub tile in the My Apps, this will redirect to workhub Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workpath Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workpath-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Workpath for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Workpath tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Workpath for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Workpath tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Workpath for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workrite Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workrite-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Workrite Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Workrite tile in the My Apps, this will redirect to Workrite Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Workrite tile in the My Apps, this will redirect to Workrite Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workteam Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workteam-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the Workteam for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the Workteam tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Workteam for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the Workteam tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Workteam for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Workware Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workware-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the Workware for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Workware tile in the My Apps, you should be automatically signed in to the Workware for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Workware tile in the My Apps, you should be automatically signed in to the Workware for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Worthix App Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/worthix-app-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the Worthix App for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Worthix App tile in the My Apps, you should be automatically signed in to the Worthix App for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Worthix App tile in the My Apps, you should be automatically signed in to the Worthix App for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Xcarrier Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/xcarrier-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the xCarrier® for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the xCarrier® tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the xCarrier® for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the xCarrier® tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the xCarrier® for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory You At College Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/you-at-college-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to YOU at College Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you select the YOU at College tile in the My Apps, this will redirect to YOU at College Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you select the YOU at College tile in the My Apps, this will redirect to YOU at College Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Additional resources
active-directory Yuhu Property Management Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/yuhu-property-management-platform-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Yuhu Property Management Platform Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Yuhu Property Management Platform tile in the My Apps, this will redirect to Yuhu Property Management Platform Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Yuhu Property Management Platform tile in the My Apps, this will redirect to Yuhu Property Management Platform Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zdiscovery Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zdiscovery-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ZDiscovery for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ZDiscovery tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ZDiscovery for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ZDiscovery tile in the My Apps, if configured in SP mode you would be redirected to the application Sign-On page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ZDiscovery for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zengine Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zengine-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zengine Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zengine tile in the My Apps, this will redirect to Zengine Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zengine tile in the My Apps, this will redirect to Zengine Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zenqms Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zenqms-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on **Test this application** in Azure portal and you should be automatically signed in to the ZenQMS for which you set up the SSO.
-You can also use Microsoft My Apps to test the application in any mode. When you click the ZenQMS tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ZenQMS for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+You can also use Microsoft My Apps to test the application in any mode. When you click the ZenQMS tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the ZenQMS for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zero Networks Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zero-networks-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zero Networks Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zero Networks tile in the My Apps, this will redirect to Zero Networks Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zero Networks tile in the My Apps, this will redirect to Zero Networks Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
active-directory Zest Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zest-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Click on Test this application in Azure portal and you should be automatically signed in to the Zest for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Zest tile in the My Apps, you should be automatically signed in to the Zest for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zest tile in the My Apps, you should be automatically signed in to the Zest for which you set up the SSO. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zola Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zola-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zola Sign on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zola tile in the My Apps, this will redirect to Zola Sign on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zola tile in the My Apps, this will redirect to Zola Sign on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zscaler Beta Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zscaler-beta-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zscaler Beta Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zscaler Beta tile in the My Apps, this will redirect to Zscaler Beta Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zscaler Beta tile in the My Apps, this will redirect to Zscaler Beta Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zscaler Two Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zscaler-two-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zscaler Two Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zscaler Two tile in the My Apps, this will redirect to Zscaler Two Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zscaler Two tile in the My Apps, this will redirect to Zscaler Two Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zscaler Zscloud Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zscaler-zscloud-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zscaler ZSCloud Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zscaler ZSCloud tile in the My Apps, this will redirect to Zscaler ZSCloud Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zscaler ZSCloud tile in the My Apps, this will redirect to Zscaler ZSCloud Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Zwayam Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/zwayam-tutorial.md
In this section, you test your Azure AD single sign-on configuration with follow
* Go to Zwayam Sign-on URL directly and initiate the login flow from there.
-* You can use Microsoft My Apps. When you click the Zwayam tile in the My Apps, this will redirect to Zwayam Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Zwayam tile in the My Apps, this will redirect to Zwayam Sign-on URL. For more information, see [Azure AD My Apps](/azure/active-directory/manage-apps/end-user-experiences#azure-ad-my-apps).
## Next steps
active-directory Pci Dss Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/standards/pci-dss-guidance.md
# Azure Active Directory PCI-DSS guidance
-The Payment Card Industry Security Standards Council (PCI SSC) is responsible for developing and promoting data security standards and resources, including the Payment Card Industry Data Security Standard (PCI-DSS), to ensure the security of payment transactions. To achieve PCI compliance, organizations using Azure Active Directory (Azure AD) can refer to guidance in this document. However, it is the responsibility of the organizations to ensure their PCI compliance. Their IT teams, SecOps teams, and Solutions Architects are responsible for creating and maintaining secure systems, products, and networks that handle, process, and store payment card information.
+The Payment Card Industry Security Standards Council (PCI SSC) is responsible for developing and promoting data security standards and resources, including the Payment Card Industry Data Security Standard (PCI-DSS), to ensure the security of payment transactions. To achieve PCI compliance, organizations using Azure Active Directory (Azure AD) can refer to guidance in this document. However, it's the responsibility of the organizations to ensure their PCI compliance. Their IT teams, SecOps teams, and Solutions Architects are responsible for creating and maintaining secure systems, products, and networks that handle, process, and store payment card information.
-While Azure AD helps meet some PCI-DSS control requirements, and provides modern identity and access protocols for cardholder data environment (CDE) resources, it should not be the sole mechanism for protecting cardholder data. Therefore, review this document set and all PCI-DSS requirements to establish a comprehensive security program that preserves customer trust. For a complete list of requirements, please visit the official PCI Security Standards Council website at pcisecuritystandards.org: [Official PCI Security Standards Council Site](https://docs-prv.pcisecuritystandards.org/PCI%20DSS/Standard/PCI-DSS-v4_0.pdf)
+While Azure AD helps meet some PCI-DSS control requirements, and provides modern identity and access protocols for cardholder data environment (CDE) resources, it shouldn't be the sole mechanism for protecting cardholder data. Therefore, review this document set and all PCI-DSS requirements to establish a comprehensive security program that preserves customer trust. For a complete list of requirements, visit the official PCI Security Standards Council website at pcisecuritystandards.org: [Official PCI Security Standards Council Site](https://docs-prv.pcisecuritystandards.org/PCI%20DSS/Standard/PCI-DSS-v4_0.pdf)
## PCI requirements for controls
-The global PCI-DSS v4.0 establishes a baseline of technical and operational standards for protecting account data. It ΓÇ£was developed to encourage and enhance payment card account data security and facilitate the broad adoption of consistent data security measures, globally. It provides a baseline of technical and operational requirements designed to protect account data. While specifically designed to focus on environments with payment card account data, PCI-DSS can also be used to protect against threats and secure other elements in the payment ecosystem.ΓÇ¥
+The global PCI-DSS v4.0 establishes a baseline of technical and operational standards for protecting account data. It "was developed to encourage and enhance payment card account data security and facilitate the broad adoption of consistent data security measures, globally. It provides a baseline of technical and operational requirements designed to protect account data. While designed to focus on environments with payment card account data, PCI-DSS can also be used to protect against threats and secure other elements in the payment ecosystem."
## Azure AD configuration and PCI-DSS
PCI-DSS requirements **3**, **4**, **9**, and **12** aren't addressed or met by
|PCI Data Security Standard - High Level Overview|Azure AD recommended PCI-DSS controls|
|-|-|
-|Build and Maintain Secure Network and Systems|[1. Install and Maintain Network Security Controls]() </br> [2. Apply Secure Configurations to All System Components]()|
+|Build and Maintain Secure Network and Systems|[1. Install and Maintain Network Security Controls](pci-requirement-1.md) </br> [2. Apply Secure Configurations to All System Components](pci-requirement-2.md)|
|Protect Account Data|3. Protect Stored Account Data </br> 4. Protect Cardholder Data with Strong Cryptography During Transmission Over Public Networks|
-|Maintain a Vulnerability Management Program|[5. Protect All Systems and Networks from Malicious Software]() </br> [6. Develop and Maintain Secure Systems and Software]()|
-|Implement Strong Access Control Measures|[7. Restrict Access to System Components and Cardholder Data by Business Need to Know]() </br> [8. Identify and Authenticate Access to System Components]() </br> 9. Restrict Physical Access to System Components and Cardholder Data|
-|Regularly Monitor and Test Networks|[10. Log and Monitor All Access to System Components and Cardholder Data]() </br> [11. Test Security of Systems and Networks Regularly]()|
+|Maintain a Vulnerability Management Program|[5. Protect All Systems and Networks from Malicious Software](pci-requirement-5.md) </br> [6. Develop and Maintain Secure Systems and Software](pci-requirement-6.md)|
+|Implement Strong Access Control Measures|[7. Restrict Access to System Components and Cardholder Data by Business Need to Know](pci-requirement-7.md) </br> [8. Identify and Authenticate Access to System Components](pci-requirement-8.md) </br> 9. Restrict Physical Access to System Components and Cardholder Data|
+|Regularly Monitor and Test Networks|[10. Log and Monitor All Access to System Components and Cardholder Data](pci-requirement-10.md) </br> [11. Test Security of Systems and Networks Regularly](pci-requirement-11.md)|
|Maintain an Information Security Policy|12. Support Information Security with Organizational Policies and Programs|

## PCI-DSS applicability
CHD consists of:
SAD consists of security-related information used to authenticate cardholders and/or authorize payment card transactions. SAD includes, but isn't limited to:

* **Full track data** - magnetic stripe or chip equivalent
-* **Card verification codes/values** - also referred to as the card validation code (CVC), or value (CVV). ItΓÇÖs the three- or four-digit value on the front or back of the payment card. ItΓÇÖs also referred to as CAV2, CVC2, CVN2, CVV2 or CID, determined by the participating payment brands (PPB).
+* **Card verification codes/values** - also referred to as the card validation code (CVC), or value (CVV). It's the three- or four-digit value on the front or back of the payment card. It's also referred to as CAV2, CVC2, CVN2, CVV2 or CID, determined by the participating payment brands (PPB).
* **PIN** - personal identification number
* **PIN blocks** - an encrypted representation of the PIN used in a debit or credit card transaction. It ensures the secure transmission of sensitive information during a transaction
Protecting the CDE is essential to the security and confidentiality of customer
PCI audit scope relates to the systems, networks, and processes in the storage, processing, or transmission of CHD and/or SAD. If Account Data is stored, processed, or transmitted in a cloud environment, PCI-DSS applies to that environment and compliance typically involves validation of the cloud environment and the usage of it. There are five fundamental elements in scope for a PCI audit:
-* **Cardholder data environment (CDE)** - the area where CHD, and/or SAD, is stored, processed, or transmitted. It includes an organizationΓÇÖs components that touch CHD, such as networks, and network components, databases, servers, applications, and payment terminals.
+* **Cardholder data environment (CDE)** - the area where CHD, and/or SAD, is stored, processed, or transmitted. It includes an organization's components that touch CHD, such as networks, and network components, databases, servers, applications, and payment terminals.
* **People** - with access to the CDE, such as employees, contractors, and third-party service providers, are in the scope of a PCI audit.
* **Processes** - that involve CHD, such as authorization, authentication, encryption, and storage of account data in any format, are within the scope of a PCI audit.
* **Technology** - that processes, stores, or transmits CHD, including hardware such as printers and multi-function devices that scan, print, and fax; end-user devices such as computers, laptops, workstations, administrative workstations, tablets, and mobile devices; software; and other IT systems, are in the scope of a PCI audit.
-* **System components** ΓÇô that might not store, process, or transmit CHD/SAD but have unrestricted connectivity to system components that store, process, or transmit CHD/SAD, or that could effect the security of the CDE.
+* **System components** - that might not store, process, or transmit CHD/SAD but have unrestricted connectivity to system components that store, process, or transmit CHD/SAD, or that could affect the security of the CDE.
If PCI scope is minimized, organizations can effectively reduce the effects of security incidents and lower the risk of data breaches. Segmentation can be a valuable strategy for reducing the size of the PCI CDE, resulting in reduced compliance costs and overall benefits for the organization including but not limited to:
If PCI scope is minimized, organizations can effectively reduce the effects of s
## Strategies to reduce PCI audit scope
-An organizationΓÇÖs definition of its CDE determines PCI audit scope. Organizations document and communicate this definition to the PCI-DSS Qualified Security Assessor (QSA) performing the audit. The QSA assesses controls for the CDE to determine compliance.
+An organization's definition of its CDE determines PCI audit scope. Organizations document and communicate this definition to the PCI-DSS Qualified Security Assessor (QSA) performing the audit. The QSA assesses controls for the CDE to determine compliance.
Adherence to PCI standards and use of effective risk mitigation helps businesses protect customer personal and financial data, which maintains trust in their operations. The following section outlines strategies to reduce risk in PCI audit scope.

### Tokenization
With ongoing processes, organizations respond effectively to changes in the regu
### Implement strong security for shared infrastructure
-Typically, web services such as Azure, have a shared infrastructure wherein customer data might be stored on the same physical server or data storage device. This scenario creates the risk of unauthorized customers accessing data they donΓÇÖt own, and the risk of malicious actors targeting the shared infrastructure. Azure AD security features help mitigate risks associated with shared infrastructure:
+Typically, web services such as Azure, have a shared infrastructure wherein customer data might be stored on the same physical server or data storage device. This scenario creates the risk of unauthorized customers accessing data they don't own, and the risk of malicious actors targeting the shared infrastructure. Azure AD security features help mitigate risks associated with shared infrastructure:
* User authentication to network access technologies that support modern authentication protocols: virtual private network (VPN), remote desktop, and network access points.
* Access control policies that enforce strong authentication methods and device compliance based on signals such as user context, device, location, and risk.
Implement accurate logging and monitoring to detect, and respond to, security in
Learn more:
-ΓÇó [What are Azure AD reports?](../reports-monitoring/overview-reports.md)
-ΓÇó [Azure AD built-in roles](../roles/permissions-reference.md)
+* [What are Azure AD reports?](../reports-monitoring/overview-reports.md)
+* [Azure AD built-in roles](../roles/permissions-reference.md)
### Multi-application environments: host outside the CDE
ai-services Background Removal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/background-removal.md
Start by creating a [VisionServiceOptions](/python/api/azure-ai-vision/azure.ai.
[!code-python[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/python/image-analysis/how-to/main.py?name=vision_service_options)]
+#### [Java](#tab/java)
+
+Start by creating a [VisionServiceOptions](/java/api/com.azure.ai.vision.common.visionserviceoptions) object using one of the constructors. For example:
+
+[!code-java[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/java/image-analysis/how-to/ImageAnalysis.java?name=vision_service_options)]
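Because the referenced sample isn't inlined in this digest, here's a minimal Java sketch of this step. The `(URL, String)` constructor shape, the placeholder endpoint, and the `VISION_KEY` environment variable are illustrative assumptions; consult the linked `VisionServiceOptions` reference for the exact overloads.

```java
import com.azure.ai.vision.common.VisionServiceOptions;

import java.net.URL;

public class CreateServiceOptions {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; use the endpoint of your own Vision resource.
        URL endpoint = new URL("https://<your-resource-name>.cognitiveservices.azure.com/");

        // Assumed: the key is supplied via an environment variable.
        String key = System.getenv("VISION_KEY");

        // Assumed (URL, String) constructor form; see the linked
        // VisionServiceOptions reference for the documented overloads.
        VisionServiceOptions serviceOptions = new VisionServiceOptions(endpoint, key);
        System.out.println("Service options created for " + endpoint);
    }
}
```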
+
#### [C++](#tab/cpp)

At the start of your code, use one of the static constructor methods [VisionServiceOptions::FromEndpoint](/cpp/cognitive-services/vision/service-visionserviceoptions#fromendpoint-1) to create a *VisionServiceOptions* object. For example:
The code in this guide uses remote images referenced by URL. You may want to try
Create a new **VisionSource** object from the URL of the image you want to analyze, using the static constructor [VisionSource.FromUrl](/dotnet/api/azure.ai.vision.common.visionsource.fromurl).
-**VisionSource** implements **IDisposable**, therefore create the object with a **using** statement or explicitly call **Dispose** method after analysis completes.
-
[!code-csharp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/csharp/image-analysis/how-to/program.cs?name=vision_source)]
+**VisionSource** implements **IDisposable**, therefore create the object with a **using** statement or explicitly call **Dispose** method after analysis completes.
+
> [!TIP]
-> You can also analyze a local image by passing in the full-path image file name (see [VisionSource.FromFile](/dotnet/api/azure.ai.vision.common.visionsource.fromfile)), or by copying the image into the SDK's input buffer (see [VisionSource.FromImageSourceBuffer](/dotnet/api/azure.ai.vision.common.visionsource.fromimagesourcebuffer))
+> You can also analyze a local image by passing in the full-path image file name (see [VisionSource.FromFile](/dotnet/api/azure.ai.vision.common.visionsource.fromfile)), or by copying the image into the SDK's input buffer (see [VisionSource.FromImageSourceBuffer](/dotnet/api/azure.ai.vision.common.visionsource.fromimagesourcebuffer)). For more details, see [Call the Analyze API](./call-analyze-image-40.md?pivots=programming-language-csharp#select-the-image-to-analyze).
#### [Python](#tab/python)
In your script, create a new [VisionSource](/python/api/azure-ai-vision/azure.ai
[!code-python[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/python/image-analysis/how-to/main.py?name=vision_source)]

> [!TIP]
-> You can also analyze a local image by passing in the full-path image file name to the **VisionSource** constructor instead of the image URL (see argument name **filename**). Alternatively, you can analyze an image in a memory buffer by constructing **VisionSource** using the argument **image_source_buffer**.
+> You can also analyze a local image by passing in the full-path image file name to the **VisionSource** constructor instead of the image URL (see argument name **filename**). Alternatively, you can analyze an image in a memory buffer by constructing **VisionSource** using the argument **image_source_buffer**. For more details, see [Call the Analyze API](./call-analyze-image-40.md?pivots=programming-language-python#select-the-image-to-analyze).
+
+#### [Java](#tab/java)
+
+Create a new **VisionSource** object from the URL of the image you want to analyze, using the static constructor [VisionSource.fromUrl](/java/api/com.azure.ai.vision.common.visionsource#com-azure-ai-vision-common-visionsource-fromurl(java-net-url)).
+
+[!code-java[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/java/image-analysis/how-to/ImageAnalysis.java?name=vision_source)]
+
+**VisionSource** implements **AutoCloseable**, therefore create the object in a try-with-resources block, or explicitly call the **close** method on this object when you're done analyzing the image.
+
+> [!TIP]
+> You can also analyze a local image by passing in the full-path image file name (see [VisionSource.fromFile](/java/api/com.azure.ai.vision.common.visionsource)), or by copying the image into the SDK's input buffer. For more details, see [Call the Analyze API](./call-analyze-image-40.md?pivots=programming-language-java#select-the-image-to-analyze).
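For readers who want the shape of this step without opening the external sample, here's a small sketch of the try-with-resources pattern described above; the image URL is a placeholder.

```java
import com.azure.ai.vision.common.VisionSource;

import java.net.URL;

public class CreateVisionSource {
    public static void main(String[] args) throws Exception {
        // VisionSource implements AutoCloseable, so try-with-resources
        // releases its underlying resources once analysis is done.
        try (VisionSource imageSource = VisionSource.fromUrl(
                new URL("https://<your-image-host>/sample.jpg"))) { // placeholder URL
            // Hand imageSource to the analyzer inside this block.
        }
    }
}
```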
#### [C++](#tab/cpp)
Create a new **VisionSource** object from the URL of the image you want to analy
[!code-cpp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/cpp/image-analysis/how-to/how-to.cpp?name=vision_source)]

> [!TIP]
-> You can also analyze a local image by passing in the full-path image file name (see [VisionSource::FromFile](/cpp/cognitive-services/vision/input-visionsource#fromfile)), or by copying the image into the SDK's input buffer (see [VisionSource::FromImageSourceBuffer](/cpp/cognitive-services/vision/input-visionsource#fromimagesourcebuffer)).
+> You can also analyze a local image by passing in the full-path image file name (see [VisionSource::FromFile](/cpp/cognitive-services/vision/input-visionsource#fromfile)), or by copying the image into the SDK's input buffer (see [VisionSource::FromImageSourceBuffer](/cpp/cognitive-services/vision/input-visionsource#fromimagesourcebuffer)). For more details, see [Call the Analyze API](./call-analyze-image-40.md?pivots=programming-language-cpp#select-the-image-to-analyze).
#### [REST API](#tab/rest)
To analyze a local image, you'd put the binary image data in the HTTP request bo
### [C#](#tab/csharp)
-<!-- TODO: After C# ref-docs get published, add link to SegmentationMode (/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions.segmentationmode) & ImageSegmentationMode (/dotnet/api/azure.ai.vision.imageanalysis.imagesegmentationmode) -->
-
-Create a new [ImageAnalysisOptions](/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions) object and set the property `SegmentationMode`. This property must be set if you want to do segmentation. See `ImageSegmentationMode` for supported values.
+Create a new [ImageAnalysisOptions](/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions) object and set the property [SegmentationMode](/dotnet/api/azure.ai.vision.imageanalysis.imageanalysisoptions.segmentationmode#azure-ai-vision-imageanalysis-imageanalysisoptions-segmentationmode). This property must be set if you want to do segmentation. See [ImageSegmentationMode](/dotnet/api/azure.ai.vision.imageanalysis.imagesegmentationmode) for supported values.
[!code-csharp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/csharp/image-analysis/segmentation/Program.cs?name=segmentation_mode)]

### [Python](#tab/python)
-<!-- TODO: Where Python ref-docs get published, add link to SegmentationMode (/python/api/azure-ai-vision/azure.ai.vision.imageanalysisoptions#azure-ai-vision-imageanalysisoptions-segmentation-mode) & ImageSegmentationMode (/python/api/azure-ai-vision/azure.ai.vision.enums.imagesegmentationmode)> -->
-
-Create a new [ImageAnalysisOptions](/python/api/azure-ai-vision/azure.ai.vision.imageanalysisoptions) object and set the property `segmentation_mode`. This property must be set if you want to do segmentation. See `ImageSegmentationMode` for supported values.
+Create a new [ImageAnalysisOptions](/python/api/azure-ai-vision/azure.ai.vision.imageanalysisoptions) object and set the property [segmentation_mode](/python/api/azure-ai-vision/azure.ai.vision.imageanalysisoptions#azure-ai-vision-imageanalysisoptions-segmentation-mode). This property must be set if you want to do segmentation. See [ImageSegmentationMode](/python/api/azure-ai-vision/azure.ai.vision.enums.imagesegmentationmode) for supported values.
[!code-python[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/python/image-analysis/segmentation/main.py?name=segmentation_mode)]
+### [Java](#tab/java)
+
+Create a new [ImageAnalysisOptions](/java/api/com.azure.ai.vision.imageanalysis.imageanalysisoptions) object and call the [setSegmentationMode](/java/api/com.azure.ai.vision.imageanalysis.imageanalysisoptions#com-azure-ai-vision-imageanalysis-imageanalysisoptions-setsegmentationmode()) method. You must call this method if you want to do segmentation. See [ImageSegmentationMode](/java/api/com.azure.ai.vision.imageanalysis.imagesegmentationmode) for supported values.
+
+[!code-java[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/java/image-analysis/segmentation/ImageAnalysis.java?name=segmentation_mode)]
+
### [C++](#tab/cpp)

Create a new [ImageAnalysisOptions](/cpp/cognitive-services/vision/imageanalysis-imageanalysisoptions) object and call the [SetSegmentationMode](/cpp/cognitive-services/vision/imageanalysis-imageanalysisoptions#setsegmentationmode) method. You must call this method if you want to do segmentation. See [ImageSegmentationMode](/cpp/cognitive-services/vision/azure-ai-vision-imageanalysis-namespace#enum-imagesegmentationmode) for supported values.
This section shows you how to make the API call and parse the results.
The following code calls the Image Analysis API and saves the resulting segmented image to a file named **output.png**. It also displays some metadata about the segmented image.
-**SegmentationResult** implements **IDisposable**, therefore create the object with a **using** statement or explicitly call **Dispose** method after analysis completes.
- [!code-csharp[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/csharp/image-analysis/segmentation/Program.cs?name=segment)]
+**SegmentationResult** implements **IDisposable**, so create the object with a **using** statement or explicitly call the **Dispose** method after analysis completes.
+
#### [Python](#tab/python)

The following code calls the Image Analysis API and saves the resulting segmented image to a file named **output.png**. It also displays some metadata about the segmented image.

[!code-python[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/python/image-analysis/segmentation/main.py?name=segment)]
+#### [Java](#tab/java)
+
+The following code calls the Image Analysis API and saves the resulting segmented image to a file named **output.png**. It also displays some metadata about the segmented image.
+
+[!code-java[](~/azure-ai-vision-sdk/docs/learn.microsoft.com/java/image-analysis/segmentation/ImageAnalysis.java?name=segment)]
+
+**SegmentationResult** implements **AutoCloseable**, so create the object in a try-with-resources block, or explicitly call the **close** method on the object when you're done analyzing the image.
+
#### [C++](#tab/cpp)

The following code calls the Image Analysis API and saves the resulting segmented image to a file named **output.png**. It also displays some metadata about the segmented image.
ai-services Call Analyze Image 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/call-analyze-image-40.md
This article demonstrates how to call the Image Analysis 4.0 API to return infor
::: zone-end +++ ::: zone pivot="programming-language-rest-api" [!INCLUDE [REST API](../includes/how-to-guides/analyze-image-40-rest.md)]
ai-services Image Analysis Client Library 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/quickstarts-sdk/image-analysis-client-library-40.md
Get started with the Image Analysis 4.0 REST API or client SDK to set up a basic
::: zone-end +++ ::: zone pivot="programming-language-rest-api" [!INCLUDE [REST API quickstart](../includes/image-analysis-curl-quickstart-40.md)]
ai-services Install Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/sdk/install-sdk.md
Last updated 08/01/2023
-zone_pivot_groups: programming-languages-vision-40-sdk
+zone_pivot_groups: programming-languages-vision-40-sdk
# Install the Vision SDK
zone_pivot_groups: programming-languages-vision-40-sdk
[!INCLUDE [Python include](../includes/setup-sdk/python.md)] ::: zone-end + ## Next steps Follow the [Image Analysis quickstart](../quickstarts-sdk/image-analysis-client-library-40.md) to get started.
ai-services Overview Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/sdk/overview-sdk.md
The Vision SDK supports the following languages and platforms:
| C# <sup>1</sup> | [quickstart](../quickstarts-sdk/image-analysis-client-library-40.md?pivots=programming-language-csharp) | [reference](/dotnet/api/azure.ai.vision.imageanalysis) | Windows, UWP, Linux |
| C++ <sup>2</sup> | [quickstart](../quickstarts-sdk/image-analysis-client-library-40.md?pivots=programming-language-cpp) | [reference](/cpp/cognitive-services/vision) | Windows, Linux |
| Python | [quickstart](../quickstarts-sdk/image-analysis-client-library-40.md?pivots=programming-language-python) | [reference](/python/api/azure-ai-vision) | Windows, Linux |
+| Java | [quickstart](../quickstarts-sdk/image-analysis-client-library-40.md?pivots=programming-language-java) | [reference](/java/api/com.azure.ai.vision.imageanalysis) | Windows, Linux |
+
<sup>1 The Vision SDK for C# is based on .NET Standard 2.0. See [.NET Standard](/dotnet/standard/net-standard?tabs=net-standard-2-0#net-implementation-support) documentation.</sup>
aks Azure Csi Files Storage Provision https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-csi-files-storage-provision.md
description: Learn how to create a static or dynamic persistent volume with Azure Files for use with multiple concurrent pods in Azure Kubernetes Service (AKS) Previously updated : 08/16/2023 Last updated : 09/18/2023 # Create and use a volume with Azure Files in Azure Kubernetes Service (AKS)
This section provides guidance for cluster administrators who want to provision
|Name | Meaning | Available Value | Mandatory | Default value |
| --- | --- | --- | --- | --- |
-|skuName | Azure Files storage account type (alias: `storageAccountType`)| `Standard_LRS`, `Standard_ZRS`, `Standard_GRS`, `Standard_RAGRS`, `Standard_RAGZRS`,`Premium_LRS`, `Premium_ZRS` | No | `StandardSSD_LRS`<br> Minimum file share size for Premium account type is 100 GB.<br> ZRS account type is supported in limited regions.<br> NFS file share only supports Premium account type.|
-|protocol | Specify file share protocol. | `smb`, `nfs` | No | `smb` |
-|location | Specify the Azure region of the Azure storage account.| For example, `eastus`. | No | If empty, driver uses the same location name as current AKS cluster.|
-|resourceGroup | Specify the resource group for the Azure Disks.| Existing resource group name | No | If empty, driver uses the same resource group name as current AKS cluster.|
-|shareName | Specify Azure file share name. | Existing or new Azure file share name. | No | If empty, driver generates an Azure file share name. |
-|shareNamePrefix | Specify Azure file share name prefix created by driver. | Share name can only contain lowercase letters, numbers, hyphens, and length should be fewer than 21 characters. | No |
-|folderName | Specify folder name in Azure file share. | Existing folder name in Azure file share. | No | If folder name doesn't exist in file share, the mount fails. |
-|shareAccessTier | [Access tier for file share][storage-tiers] | General purpose v2 account can choose between `TransactionOptimized` (default), `Hot`, and `Cool`. Premium storage account type for file shares only. | No | Empty. Use default setting for different storage account types.|
|accountAccessTier | [Access tier for storage account][access-tiers-overview] | Standard account can choose `Hot` or `Cool`, and Premium account can only choose `Premium`. | No | Empty. Use default setting for different storage account types. |
-|server | Specify Azure storage account server address. | Existing server address, for example `accountname.privatelink.file.core.windows.net`. | No | If empty, driver uses default `accountname.file.core.windows.net` or other sovereign cloud account address. |
-|disableDeleteRetentionPolicy | Specify whether disable DeleteRetentionPolicy for storage account created by driver. | `true` or `false` | No | `false` |
+|accountQuota | Limits the quota for an account. You can specify a maximum quota in GB (102400GB by default). If the account exceeds the specified quota, the driver skips selecting the account. ||No |`102400` |
|allowBlobPublicAccess | Allow or disallow public access to all blobs or containers for storage account created by driver. | `true` or `false` | No | `false` |
+|disableDeleteRetentionPolicy | Specify whether disable DeleteRetentionPolicy for storage account created by driver. | `true` or `false` | No | `false` |
+|enableLargeFileShares |Specify whether to use a storage account with large file shares enabled or not. If this flag is set to `true` and a storage account with large file shares enabled doesn't exist, a new storage account with large file shares enabled is created. This flag should be used with the Standard sku as the storage accounts created with Premium sku have `largeFileShares` option enabled by default. |`true` or `false` |No |false |
+|folderName | Specify folder name in Azure file share. | Existing folder name in Azure file share. | No | If folder name doesn't exist in file share, the mount fails. |
+|getLatestAccount |Determines whether to get the latest account key based on the creation time. This driver gets the first key by default. |`true` or `false` |No |`false` |
+|location | Specify the Azure region of the Azure storage account.| For example, `eastus`. | No | If empty, driver uses the same location name as current AKS cluster.|
+|matchTags | Match tags when driver tries to find a suitable storage account. | `true` or `false` | No | `false` |
|networkEndpointType | Specify network endpoint type for the storage account created by driver. If `privateEndpoint` is specified, a private endpoint is created for the storage account. For other cases, a service endpoint is created by default. | "",`privateEndpoint`| No | "" |
+|protocol | Specify file share protocol. | `smb`, `nfs` | No | `smb` |
|requireInfraEncryption | Specify whether or not the service applies a secondary layer of encryption with platform managed keys for data at rest for storage account created by driver. | `true` or `false` | No | `false` |
+|resourceGroup | Specify the resource group where the Azure file share is created.| Existing resource group name | No | If empty, driver uses the same resource group name as current AKS cluster.|
+|selectRandomMatchingAccount | Determines whether to randomly select a matching account. By default, the driver always selects the first matching account in alphabetical order. (Note: The driver uses an account search cache, which can result in an uneven distribution of file creation across multiple accounts.) | `true` or `false` |No | `false` |
+|server | Specify Azure storage account server address. | Existing server address, for example `accountname.privatelink.file.core.windows.net`. | No | If empty, driver uses default `accountname.file.core.windows.net` or other sovereign cloud account address. |
+|shareAccessTier | [Access tier for file share][storage-tiers] | General purpose v2 account can choose between `TransactionOptimized` (default), `Hot`, and `Cool`. Premium storage account type for file shares only. | No | Empty. Use default setting for different storage account types.|
+|shareName | Specify Azure file share name. | Existing or new Azure file share name. | No | If empty, driver generates an Azure file share name. |
+|shareNamePrefix | Specify Azure file share name prefix created by driver. | Share name can only contain lowercase letters, numbers, hyphens, and length should be fewer than 21 characters. | No |
+|skuName | Azure Files storage account type (alias: `storageAccountType`)| `Standard_LRS`, `Standard_ZRS`, `Standard_GRS`, `Standard_RAGRS`, `Standard_RAGZRS`,`Premium_LRS`, `Premium_ZRS` | No | `Standard_LRS`<br> Minimum file share size for Premium account type is 100 GB.<br> ZRS account type is supported in limited regions.<br> NFS file share only supports Premium account type.|
|storageEndpointSuffix | Specify Azure storage endpoint suffix. | `core.windows.net`, `core.chinacloudapi.cn`, etc. | No | If empty, driver uses default storage endpoint suffix according to cloud environment. For example, `core.windows.net`. |
|tags | [Tags][tag-resources] are created in new storage account. | Tag format: 'foo=aaa,bar=bbb' | No | "" |
-|matchTags | Match tags when driver tries to find a suitable storage account. | `true` or `false` | No | `false` |
| | **Following parameters are only for SMB protocol** | | |
|subscriptionID | Specify Azure subscription ID where Azure file share is created. | Azure subscription ID | No | If not empty, `resourceGroup` must be provided. |
|storeAccountKey | Specify whether to store account key to Kubernetes secret. | `true` or `false`<br>`false` means driver uses kubelet identity to get account key. | No | `true` |
This section provides guidance for cluster administrators who want to provision
|secretNamespace | Specify the namespace of secret to store account key. <br><br> **Note:** <br> If `secretNamespace` isn't specified, the secret is created in the same namespace as the pod. | `default`,`kube-system`, etc. | No | PVC namespace, for example `csi.storage.k8s.io/pvc/namespace` |
|useDataPlaneAPI | Specify whether to use [data plane API][data-plane-api] for file share create/delete/resize, which could solve the SRP API throttling issue because the data plane API has almost no limit, while it would fail when there's firewall or VNet settings on storage account. | `true` or `false` | No | `false` |
| | **Following parameters are only for NFS protocol** | | |
-|rootSquashType | Specify root squashing behavior on the share. The default is `NoRootSquash` | `AllSquash`, `NoRootSquash`, `RootSquash` | No |
|mountPermissions | Mounted folder permissions. The default is `0777`. If set to `0`, driver doesn't perform `chmod` after mount | `0777` | No |
+|rootSquashType | Specify root squashing behavior on the share. The default is `NoRootSquash` | `AllSquash`, `NoRootSquash`, `RootSquash` | No |
| | **Following parameters are only for VNet setting. For example, NFS, private end point** | | |
-|vnetResourceGroup | Specify VNet resource group where virtual network is defined. | Existing resource group name. | No | If empty, driver uses the `vnetResourceGroup` value in Azure cloud config file. |
-|vnetName | Virtual network name | Existing virtual network name. | No | If empty, driver uses the `vnetName` value in Azure cloud config file. |
-|subnetName | Subnet name | Existing subnet name of the agent node. | No | If empty, driver uses the `subnetName` value in Azure cloud config file. |
|fsGroupChangePolicy | Indicates how the driver changes volume's ownership. Pod `securityContext.fsGroupChangePolicy` is ignored. | `OnRootMismatch` (default), `Always`, `None` | No | `OnRootMismatch`|
+|subnetName | Subnet name | Existing subnet name of the agent node. | No | If empty, driver uses the `subnetName` value in Azure cloud config file. |
+|vnetName | Virtual network name | Existing virtual network name. | No | If empty, driver uses the `vnetName` value in Azure cloud config file. |
+|vnetResourceGroup | Specify VNet resource group where virtual network is defined. | Existing resource group name. | No | If empty, driver uses the `vnetResourceGroup` value in Azure cloud config file. |
### Create a storage class
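As a minimal sketch of how a few of these parameters combine, the following creates a dynamic storage class for a premium NFS file share. The class name and parameter choices are illustrative assumptions, not values prescribed by this article.

```azurecli-interactive
# Minimal sketch: a dynamic storage class for the Azure Files CSI driver.
# The metadata name and parameter values are illustrative assumptions.
kubectl apply -f - <<EOF
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: azurefile-csi-nfs-example
provisioner: file.csi.azure.com
parameters:
  skuName: Premium_LRS   # NFS file shares only support the Premium account type
  protocol: nfs          # defaults to smb when omitted
reclaimPolicy: Delete
allowVolumeExpansion: true
EOF
```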
aks Cluster Autoscaler https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-autoscaler.md
To adjust to changing application demands, such as between workdays and evenings
![The cluster autoscaler and horizontal pod autoscaler often work together to support the required application demands](media/autoscaler/cluster-autoscaler.png)
-Both the horizontal pod autoscaler and cluster autoscaler can decrease the number of pods and nodes as needed. The cluster autoscaler decreases the number of nodes when there has been unused capacity for a period of time. Any pods on a node to be removed by the cluster autoscaler are safely scheduled elsewhere in the cluster.
+Both the horizontal pod autoscaler and cluster autoscaler can decrease the number of pods and nodes as needed. The cluster autoscaler decreases the number of nodes when there has been unused capacity for a period of time. Any pods on a node removed by the cluster autoscaler are safely scheduled elsewhere in the cluster.
-If the current node pool size is lower than the specified minimum or greater than the specified maximum when you enable autoscaling, the autoscaler waits to take effect until a new node is needed in the node pool or until a node can be safely deleted from the node pool. For more information, see [How does scale-down work?](https://github.com/kubernetes/autoscaler/blob/master/cluster-autoscaler/FAQ.md#how-does-scale-down-work)
+With autoscaling enabled, the cluster autoscaler applies the scaling rules when the node pool size is lower than the minimum or greater than the maximum. Next, the autoscaler waits to take effect until a new node is needed in the node pool or until a node can be safely deleted from the current node pool. For more information, see [How does scale-down work?](https://github.com/kubernetes/autoscaler/blob/master/cluster-autoscaler/FAQ.md#how-does-scale-down-work)
The cluster autoscaler may be unable to scale down if pods can't move, such as in the following situations:
-* A pod is directly created and isn't backed by a controller object, such as a deployment or replica set.
+* A directly created pod not backed by a controller object, such as a deployment or replica set.
* A pod disruption budget (PDB) is too restrictive and doesn't allow the number of pods to fall below a certain threshold.
* A pod uses node selectors or anti-affinity that can't be honored if scheduled on a different node.
You can re-enable the cluster autoscaler on an existing cluster using the [`az a
> [!IMPORTANT] > If you have multiple node pools in your AKS cluster, skip to the [autoscale with multiple agent pools section](#use-the-cluster-autoscaler-with-multiple-node-pools-enabled). Clusters with multiple agent pools require the `az aks nodepool` command instead of `az aks`.
-In the previous step to create an AKS cluster or update an existing node pool, the cluster autoscaler minimum node count was set to one and the maximum node count was set to three. As your application demands change, you may need to adjust the cluster autoscaler node count.
+In the earlier example to enable cluster autoscaling, the cluster autoscaler's minimum node count was set to one and the maximum node count was set to three. As your application demands change, you might need to adjust the cluster autoscaler node counts to scale efficiently.
* Change the node count using the [`az aks update`][az-aks-update] command and update the cluster autoscaler using the `--update-cluster-autoscaler` parameter and specifying your updated node `--min-count` and `--max-count`.
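    For example, the following sketch updates the autoscaler range on an existing cluster; the resource group and cluster names are placeholders, not values from this article.

    ```azurecli-interactive
    # Sketch: widen the autoscaler range on an existing cluster.
    az aks update \
      --resource-group myResourceGroup \
      --name myAKSCluster \
      --update-cluster-autoscaler \
      --min-count 1 \
      --max-count 5
    ```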
Monitor the performance of your applications and services, and adjust the cluste
You can also configure more granular details of the cluster autoscaler by changing the default values in the cluster-wide autoscaler profile. For example, a scale down event happens after nodes are under-utilized after 10 minutes. If you have workloads that run every 15 minutes, you may want to change the autoscaler profile to scale down under-utilized nodes after 15 or 20 minutes. When you enable the cluster autoscaler, a default profile is used unless you specify different settings. The cluster autoscaler profile has the following settings you can update:
+* The following example profile update sets the scan interval to 5 seconds, scales down unready nodes after 10 minutes, and waits 15 minutes after a node is added before evaluating scale down.
+
+ ```azurecli-interactive
+ az aks update \
+ -g learn-aks-cluster-scalability \
+ -n learn-aks-cluster-scalability \
+ --cluster-autoscaler-profile scan-interval=5s \
+ scale-down-unready-time=10m \
+ scale-down-delay-after-add=15m
+ ```
+
| Setting | Description | Default value |
|----|----|----|
| scan-interval | How often cluster is reevaluated for scale up or down | 10 seconds |
You can also configure more granular details of the cluster autoscaler by changi
| scale-down-delay-after-failure | How long after scale down failure that scale down evaluation resumes | 3 minutes |
| scale-down-unneeded-time | How long a node should be unneeded before it's eligible for scale down | 10 minutes |
| scale-down-unready-time | How long an unready node should be unneeded before it's eligible for scale down | 20 minutes |
-| scale-down-utilization-threshold | Node utilization level, defined as sum of requested resources divided by capacity, below which a node can be considered for scale down | 0.5 |
+| scale-down-utilization-threshold | Node utilization level, defined as sum of requested resources divided by capacity, below which a node can be considered for scale down | 0.5 |
| max-graceful-termination-sec | Maximum number of seconds the cluster autoscaler waits for pod termination when trying to scale down a node | 600 seconds |
| balance-similar-node-groups | Detects similar node pools and balances the number of nodes between them | false |
| expander | Type of node pool [expander](https://github.com/kubernetes/autoscaler/blob/master/cluster-autoscaler/FAQ.md#what-are-expanders) to be used in scale up. Possible values: `most-pods`, `random`, `least-waste`, `priority` | random |
aks Gpu Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/gpu-cluster.md
You can deploy a DaemonSet for the NVIDIA device plugin, which runs a pod on eac
value: "gpu" effect: "NoSchedule" containers:
- - image: mcr.microsoft.com/oss/nvidia/k8s-device-plugin:1.11
+ - image: mcr.microsoft.com/oss/nvidia/k8s-device-plugin:v0.14.1
name: nvidia-device-plugin-ctr securityContext: allowPrivilegeEscalation: false
aks Http Proxy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/http-proxy.md
Previously updated : 02/01/2023 Last updated : 09/18/2023
Some more complex solutions may require creating a chain of trust to establish s
The following scenarios are **not** supported:

- Different proxy configurations per node pool
-- Updating HTTP/HTTPS proxy settings post cluster creation
- User/Password authentication
- Custom CAs for API server communication
- Windows-based clusters
In your template, provide values for *httpProxy*, *httpsProxy*, and *noProxy*. I
## Updating Proxy configurations
-Values for *httpProxy*, and *httpsProxy* can't be changed after cluster creation. However, the values for *trustedCa* and *NoProxy* can be changed and applied to the cluster with the [az aks update][az-aks-update] command. An aks update for *NoProxy* will automatically inject new environment variables into pods with the new *NoProxy* values. Pods must be rotated for the apps to pick it up. For components under kubernetes, like containerd and the node itself, this won't take effect until a node image upgrade is performed.
+> [!NOTE]
+> If switching to a new proxy, the new proxy must already exist for the update to be successful. Then, after the update is completed, the old proxy can be deleted.
+
+Values for *httpProxy*, *httpsProxy*, *trustedCa*, and *NoProxy* can be changed and applied to the cluster with the [az aks update][az-aks-update] command. An update to *httpProxy*, *httpsProxy*, and/or *NoProxy* automatically injects new environment variables into pods with the new values. Pods must be rotated for the apps to pick them up. For components under Kubernetes, like containerd and the node itself, the change doesn't take effect until a node image upgrade is performed.
For example, assuming a new file has been created with the base64 encoded string of the new CA cert called *aks-proxy-config-2.json*, the following action updates the cluster. Alternatively, use the same command when you need to add new endpoint URLs for your applications to *NoProxy*:
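A sketch of that update follows; the resource group and cluster names are placeholders, and `--http-proxy-config` is the `az aks update` parameter that takes the proxy configuration file.

```azurecli-interactive
# Sketch: apply the updated proxy configuration file to an existing cluster.
az aks update \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --http-proxy-config aks-proxy-config-2.json
```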
app-service Configure Vnet Integration Enable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/configure-vnet-integration-enable.md
If the virtual network is in a different subscription than the app, you must ens
## Configure in the Azure portal
-1. Go to **Networking** in the App Service portal. Under **Outbound Traffic**, select **VNet integration**.
+1. Go to **Networking** in the App Service portal. Under **Outbound traffic configuration**, select **Virtual network integration**.
-1. Select **Add VNet**.
+1. Select **Add virtual network integration**.
- :::image type="content" source="./media/configure-vnet-integration-enable/vnetint-app.png" alt-text="Screenshot that shows selecting VNet integration.":::
+ :::image type="content" source="./media/configure-vnet-integration-enable/vnetint-app.png" alt-text="Screenshot that shows selecting Virtual network integration.":::
1. The dropdown list contains all the virtual networks in your subscription in the same region. Select an empty pre-existing subnet or create a new subnet.
app-service Configure Vnet Integration Routing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/configure-vnet-integration-routing.md
Last updated 10/20/2021
# Manage Azure App Service virtual network integration routing
-Through application routing or configuration routing options, you can configure what traffic will be sent through the virtual network integration. See the [overview section](./overview-vnet-integration.md#routes) for more details.
+Through application routing or configuration routing options, you can configure what traffic is sent through the virtual network integration. For more information, see the [overview section](./overview-vnet-integration.md#routes).
## Prerequisites
Your app is already integrated using the regional virtual network integration fe
## Configure application routing
-Application routing defines what traffic is routed from your app and into the virtual network. We recommend that you use the **Route All** site setting to enable routing of all traffic. Using the configuration setting allows you to audit the behavior with [a built-in policy](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F33228571-70a4-4fa1-8ca1-26d0aba8d6ef). The existing `WEBSITE_VNET_ROUTE_ALL` app setting can still be used, and you can enable all traffic routing with either setting.
+Application routing defines what traffic is routed from your app and into the virtual network. We recommend that you use the `vnetRouteAllEnabled` site setting to enable routing of all traffic. Using the configuration setting allows you to audit the behavior with [a built-in policy](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F33228571-70a4-4fa1-8ca1-26d0aba8d6ef). The existing `WEBSITE_VNET_ROUTE_ALL` app setting can still be used, and you can enable all traffic routing with either setting.
### Configure in the Azure portal
-Follow these steps to disable **Route All** in your app through the portal.
+Follow these steps to disable outbound internet traffic routing in your app through the portal.
-1. Go to **Networking** > **VNet integration** in your app portal.
-1. Set **Route All** to **Disabled**.
+1. Go to **Networking** > **Virtual network integration** in your app portal.
+1. Uncheck the **Outbound internet traffic** setting.
- :::image type="content" source="./media/configure-vnet-integration-routing/vnetint-route-all-disabling.png" alt-text="Screenshot that shows disabling Route All.":::
+ :::image type="content" source="./media/configure-vnet-integration-routing/vnetint-route-all-disabling.png" alt-text="Screenshot that shows disabling outbound internet traffic.":::
-1. Select **Yes** to confirm.
+1. Select **Apply** to confirm.
### Configure with the Azure CLI
-You can also configure **Route All** by using the Azure CLI.
+You can also configure **Outbound internet traffic** by using the Azure CLI.
```azurecli-interactive
-az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetRouteAllEnabled [true|false]
+az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetRouteAllEnabled=[true|false]
```

## Configure configuration routing
-When you're using virtual network integration, you can configure how parts of the configuration traffic are managed. By default, configuration traffic will go directly over the public route, but for the mentioned individual components, you can actively configure it to be routed through the virtual network integration.
+When you're using virtual network integration, you can configure how parts of the configuration traffic are managed. By default, configuration traffic goes directly over the public route, but for the mentioned individual components, you can actively configure it to be routed through the virtual network integration.
### Container image pull

Routing container image pull over virtual network integration can be configured using the Azure CLI.

```azurecli-interactive
-az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetImagePullEnabled [true|false]
+az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetImagePullEnabled=[true|false]
```

We recommend that you use the site property to enable routing image pull traffic through the virtual network integration. Using the configuration setting allows you to audit the behavior with Azure Policy. The existing `WEBSITE_PULL_IMAGE_OVER_VNET` app setting with the value `true` can still be used, and you can enable routing through the virtual network with either setting.
We recommend that you use the site property to enable routing image pull traffic
Routing content share over virtual network integration can be configured using the Azure CLI. In addition to enabling the feature, you must also ensure that any firewall or Network Security Group configured on traffic from the subnet allows traffic to ports 443 and 445.

```azurecli-interactive
-az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetContentShareEnabled [true|false]
+az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetContentShareEnabled=[true|false]
```

We recommend that you use the site property to enable content share traffic through the virtual network integration. Using the configuration setting allows you to audit the behavior with Azure Policy. The existing `WEBSITE_CONTENTOVERVNET` app setting with the value `1` can still be used, and you can enable routing through the virtual network with either setting.
+### Backup/restore
+
+Routing backup traffic over virtual network integration can be configured using the Azure CLI. Database backup isn't supported over the virtual network integration.
+
+```azurecli-interactive
+az resource update --resource-group <group-name> --name <app-name> --resource-type "Microsoft.Web/sites" --set properties.vnetBackupRestoreEnabled=[true|false]
+```
+
## Next steps

- [Enable virtual network integration](./configure-vnet-integration-enable.md)
app-service Overview Vnet Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/overview-vnet-integration.md
For Windows App Service plans, the virtual network integration feature supports
Virtual network integration depends on a dedicated subnet. When you create a subnet, the Azure subnet consumes five IPs from the start. One address is used from the integration subnet for each App Service plan instance. If you scale your app to four instances, then four addresses are used.
-When you scale up/down in size or in/out in number of instances, the required address space is doubled for a short period of time. This is because the scale operation adds the same number of new instances and then deletes the existing instances. The scale operation affects the real, available supported instances for a given subnet size. Platform upgrades need free IP addresses to ensure upgrades can happen without interruptions to outbound traffic. Finally, after scale up, down, or in operations complete, there might be a short period of time before IP addresses are released.
+When you scale up/down in size or in/out in number of instances, the required address space is doubled for a short period of time. The scale operation adds the same number of new instances and then deletes the existing instances. The scale operation affects the real, available supported instances for a given subnet size. Platform upgrades need free IP addresses to ensure upgrades can happen without interruptions to outbound traffic. Finally, after scale up, down, or in operations complete, there might be a short period of time before IP addresses are released.
Because subnet size can't be changed after assignment, use a subnet that's large enough to accommodate whatever scale your app might reach. You should also reserve IP addresses for platform upgrades. To avoid any issues with subnet capacity, use a `/26` with 64 addresses. When you're creating subnets in Azure portal as part of integrating with the virtual network, a minimum size of /27 is required. If the subnet already exists before integrating through the portal, you can use a /28 subnet.
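As a sketch of that sizing guidance, the following creates a `/26` integration subnet and connects an app to it. All resource names and the address prefix are placeholder assumptions.

```azurecli-interactive
# Sketch: create a /26 integration subnet and enable virtual network integration.
# Resource names and the address prefix are placeholders.
az network vnet subnet create \
  --resource-group myResourceGroup \
  --vnet-name myVNet \
  --name integration-subnet \
  --address-prefixes 10.0.1.0/26

az webapp vnet-integration add \
  --resource-group myResourceGroup \
  --name myApp \
  --vnet myVNet \
  --subnet integration-subnet
```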
Through application routing or configuration routing options, you can configure
### Application routing
-Application routing applies to traffic that is sent from your app after it has been started. See [configuration routing](#configuration-routing) for traffic during startup. When you configure application routing, you can either route all traffic or only private traffic (also known as [RFC1918](https://datatracker.ietf.org/doc/html/rfc1918#section-3) traffic) into your virtual network. You configure this behavior through the **Route All** setting. If **Route All** is disabled, your app only routes private traffic into your virtual network. If you want to route all your outbound app traffic into your virtual network, make sure that **Route All** is enabled.
+Application routing applies to traffic that is sent from your app after it has been started. See [configuration routing](#configuration-routing) for traffic during startup. When you configure application routing, you can either route all traffic or only private traffic (also known as [RFC1918](https://datatracker.ietf.org/doc/html/rfc1918#section-3) traffic) into your virtual network. You configure this behavior through the outbound internet traffic setting. If outbound internet traffic routing is disabled, your app only routes private traffic into your virtual network. If you want to route all your outbound app traffic into your virtual network, make sure that outbound internet traffic is enabled.
* Only traffic configured in application or configuration routing is subject to the NSGs and UDRs that are applied to your integration subnet.
-* When **Route All** is enabled, the source address for your outbound public traffic from your app is still one of the IP addresses that are listed in your app properties. If you route your traffic through a firewall or a NAT gateway, the source IP address originates from this service.
+* When outbound internet traffic routing is enabled, the source address for your outbound traffic from your app is still one of the IP addresses that are listed in your app properties. If you route your traffic through a firewall or a NAT gateway, the source IP address originates from this service.
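To check how an app is currently configured, the following sketch reads the property back; resource names are placeholders, and the property path mirrors the `az resource update` command used in the companion how-to article.

```azurecli-interactive
# Sketch: inspect whether all outbound traffic is routed into the virtual network.
az resource show \
  --resource-group myResourceGroup \
  --name myApp \
  --resource-type "Microsoft.Web/sites" \
  --query properties.vnetRouteAllEnabled
```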
Learn [how to configure application routing](./configure-vnet-integration-routing.md#configure-application-routing).
In addition to configuring the routing, you must also ensure that any firewall o
#### Container image pull
-When using custom containers, you can pull the container over the virtual network integration. To route the container pull traffic through the virtual network integration, you must ensure that the routing setting is configured. Learn [how to configure image pull routing](./configure-vnet-integration-routing.md#container-image-pull).
+When using custom containers, you can pull the container over the virtual network integration. To route the container pull traffic through the virtual network integration, you must ensure that the routing setting is configured. Learn [how to configure image pull routing](./configure-vnet-integration-routing.md#container-image-pull).
+
+#### Backup/restore
+
+App Service has built-in backup/restore, but if you want to back up to your own storage account, you can use the custom backup/restore feature. If you want to route the traffic to the storage account through the virtual network integration, you must configure the route setting. Database backup isn't supported over the virtual network integration.
#### App settings using Key Vault references

App settings using Key Vault references attempt to get secrets over the public route. If the Key Vault is blocking public traffic and the app is using virtual network integration, an attempt is made to get the secrets through the virtual network integration.

> [!NOTE]
-> * Backup/restore to private storage accounts is currently not supported.
> * Configuring SSL/TLS certificates from private Key Vaults is currently not supported.
> * App Service Logs to private storage accounts is currently not supported. We recommend using Diagnostics Logging and allowing Trusted Services for the storage account.
You can use route tables to route outbound traffic from your app without restric
Route tables and network security groups only apply to traffic routed through the virtual network integration. See [application routing](#application-routing) and [configuration routing](#configuration-routing) for details. Routes don't apply to replies from inbound app requests and inbound rules in an NSG don't apply to your app. Virtual network integration affects only outbound traffic from your app. To control inbound traffic to your app, use the [access restrictions](./overview-access-restrictions.md) feature or [private endpoints](./networking/private-endpoint.md).
-When configuring network security groups or route tables that applies to outbound traffic, you must make sure you consider your application dependencies. Application dependencies include endpoints that your app needs during runtime. Besides APIs and services the app is calling, these endpoints could also be derived endpoints like certificate revocation list (CRL) check endpoints and identity/authentication endpoint, for example Azure Active Directory. If you're using [continuous deployment in App Service](./deploy-continuous-deployment.md), you might also need to allow endpoints depending on type and language. Specifically for [Linux continuous deployment](https://github.com/microsoft/Oryx/blob/main/doc/hosts/appservice.md#network-dependencies), you need to allow `oryx-cdn.microsoft.io:443`. For Python you additionally need to allow `files.pythonhosted.org`, `pypi.org`.
+When configuring network security groups or route tables that apply to outbound traffic, make sure you consider your application dependencies. Application dependencies include endpoints that your app needs during runtime. Besides APIs and services the app is calling, these endpoints could also be derived endpoints like certificate revocation list (CRL) check endpoints and identity/authentication endpoints, for example Microsoft Entra ID. If you're using [continuous deployment in App Service](./deploy-continuous-deployment.md), you might also need to allow endpoints depending on type and language. Specifically for [Linux continuous deployment](https://github.com/microsoft/Oryx/blob/main/doc/hosts/appservice.md#network-dependencies), you need to allow `oryx-cdn.microsoft.io:443`. For Python, you additionally need to allow `files.pythonhosted.org` and `pypi.org`.
When you want to route outbound traffic on-premises, you can use a route table to send outbound traffic to your Azure ExpressRoute gateway. If you do route traffic to a gateway, set routes in the external network to send any replies back. Border Gateway Protocol (BGP) routes also affect your app traffic. If you have BGP routes from something like an ExpressRoute gateway, your app outbound traffic is affected. Similar to user-defined routes, BGP routes affect traffic according to your routing scope setting.
After your app integrates with your virtual network, it uses the same DNS server
There are some limitations with using virtual network integration:

* The feature is available from all App Service deployments in Premium v2 and Premium v3. It's also available in Basic and Standard tier but only from newer App Service deployments. If you're on an older deployment, you can only use the feature from a Premium v2 App Service plan. If you want to make sure you can use the feature in a Basic or Standard App Service plan, create your app in a Premium v3 App Service plan. Those plans are only supported on our newest deployments. You can scale down if you want after the plan is created.
-* The feature can't be used by Isolated plan apps that are in an App Service Environment.
+* The feature isn't available for Isolated plan apps in an App Service Environment.
* You can't reach resources across peering connections with classic virtual networks.
* The feature requires an unused subnet that's an IPv4 `/28` block or larger in an Azure Resource Manager virtual network.
* The app and the virtual network must be in the same region.
automation Automation Hrw Run Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-hrw-run-runbooks.md
Title: Run Azure Automation runbooks on a Hybrid Runbook Worker
description: This article describes how to run runbooks on machines in your local datacenter or other cloud provider with the Hybrid Runbook Worker. Previously updated : 03/27/2023 Last updated : 09/17/2023
# Run Automation runbooks on a Hybrid Runbook Worker > [!IMPORTANT]
-> - Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> - Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
> - Azure Automation Run As Account will retire on September 30, 2023 and will be replaced with Managed Identities. Before that date, you'll need to start migrating your runbooks to use [managed identities](automation-security-overview.md#managed-identities). For more information, see [migrating from an existing Run As accounts to managed identity](migrate-run-as-accounts-managed-identity.md#sample-scripts) to start migrating the runbooks from Run As account to managed identities before September 30, 2023.
automation Automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-hybrid-runbook-worker.md
Title: Azure Automation Hybrid Runbook Worker overview
description: Know about Hybrid Runbook Worker. How to install and run the runbooks on machines in your local datacenter or cloud provider. Previously updated : 04/20/2023 Last updated : 09/17/2023 # Automation Hybrid Runbook Worker overview > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
Runbooks in Azure Automation might not have access to resources in other clouds or in your on-premises environment because they run on the Azure cloud platform. You can use the Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on the machine hosting the role and against resources in the environment to manage those local resources. Runbooks are stored and managed in Azure Automation and then delivered to one or more assigned machines.
automation Automation Linux Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-linux-hrw-install.md
description: This article tells how to install an agent-based Hybrid Runbook Wo
Previously updated : 04/12/2023 Last updated : 09/17/2023 # Deploy an agent-based Linux Hybrid Runbook Worker in Automation > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on the Azure or non-Azure machine, including servers registered with [Azure Arc-enabled servers](../azure-arc/servers/overview.md). From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources.
automation Automation Windows Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-windows-hrw-install.md
Title: Deploy an agent-based Windows Hybrid Runbook Worker in Automation
description: This article tells how to deploy an agent-based Hybrid Runbook Worker that you can use to run runbooks on Windows-based machines in your local datacenter or cloud environment. Previously updated : 04/01/2023 Last updated : 09/17/2023 # Deploy an agent-based Windows Hybrid Runbook Worker in Automation > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on an Azure or non-Azure machine, including servers registered with [Azure Arc-enabled servers](../azure-arc/servers/overview.md). From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources.
automation Enforce Job Execution Hybrid Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/enforce-job-execution-hybrid-worker.md
Title: Enforce job execution on Azure Automation Hybrid Runbook Worker
description: This article tells how to use a custom Azure Policy definition to enforce job execution on an Azure Automation Hybrid Runbook Worker. Previously updated : 03/15/2023 Last updated : 09/17/2023 # Use Azure Policy to enforce job execution on Hybrid Runbook Worker > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
Starting a runbook on a Hybrid Runbook Worker uses a **Run on** option that allows you to specify the name of a Hybrid Runbook Worker group when initiating from the Azure portal, with Azure PowerShell, or through the REST API. When a group is specified, one of the workers in that group retrieves and runs the runbook. If your runbook doesn't specify this option, Azure Automation runs the runbook in the Azure sandbox.
automation Migrate Existing Agent Based Hybrid Worker To Extension Based Workers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md
Title: Migrate an existing agent-based hybrid workers to extension-based-workers
description: This article provides information on how to migrate an existing agent-based hybrid worker to extension based workers. Previously updated : 04/11/2023 Last updated : 09/17/2023 #Customer intent: As a developer, I want to learn about extension so that I can efficiently migrate agent based hybrid workers to extension based workers.
# Migrate the existing agent-based hybrid workers to extension-based hybrid workers > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
This article describes the benefits of Extension-based User Hybrid Runbook Worker and how to migrate existing Agent-based User Hybrid Runbook Workers to Extension-based Hybrid Workers.
automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/troubleshoot/hybrid-runbook-worker.md
Title: Troubleshoot agent-based Hybrid Runbook Worker issues in Azure Automation description: This article tells how to troubleshoot and resolve issues that arise with Azure Automation agent-based Hybrid Runbook Workers. Previously updated : 04/26/2023 Last updated : 09/17/2023
# Troubleshoot agent-based Hybrid Runbook Worker issues in Automation > [!IMPORTANT]
-> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 October 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](../migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md)
+> Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) will retire on **31 August 2024** and wouldn't be supported after that date. You must complete migrating existing Agent-based User Hybrid Runbook Workers to Extension-based Workers before 31 August 2024. Moreover, starting **1 November 2023**, creating new Agent-based Hybrid Workers wouldn't be possible. [Learn more](../migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md)
This article provides information on troubleshooting and resolving issues with Azure Automation agent-based Hybrid Runbook Workers. For troubleshooting extension-based workers, see [Troubleshoot extension-based Hybrid Runbook Worker issues in Automation](./extension-based-hybrid-runbook-worker.md). For general information, see [Hybrid Runbook Worker overview](../automation-hybrid-runbook-worker.md).
automation Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/whats-new.md
description: Significant updates to Azure Automation updated each month.
Previously updated : 05/10/2023 Last updated : 09/17/2023
Azure Automation has expanded Public preview support for PowerShell 7.2 and Python
**Type:** Plan for change
-On **31 August 2024**, Azure Automation will [retire](https://azure.microsoft.com/updates/retirement-azure-automation-agent-user-hybrid-worker/) Agent-based User Hybrid Runbook Worker ([Windows](automation-windows-hrw-install.md) and [Linux](automation-linux-hrw-install.md)). You must migrate all Agent-based User Hybrid Workers to [Extension-based User Hybrid Runbook Worker](extension-based-hybrid-runbook-worker-install.md) (Windows and Linux) before the deprecation date. Moreover, starting **1 October 2023**, creating **new** Agent-based User Hybrid Runbook Worker will not be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
+On **31 August 2024**, Azure Automation will [retire](https://azure.microsoft.com/updates/retirement-azure-automation-agent-user-hybrid-worker/) Agent-based User Hybrid Runbook Worker ([Windows](automation-windows-hrw-install.md) and [Linux](automation-linux-hrw-install.md)). You must migrate all Agent-based User Hybrid Workers to [Extension-based User Hybrid Runbook Worker](extension-based-hybrid-runbook-worker-install.md) (Windows and Linux) before the deprecation date. Moreover, starting **1 November 2023**, creating **new** Agent-based User Hybrid Runbook Workers won't be possible. [Learn more](migrate-existing-agent-based-hybrid-worker-to-extension-based-workers.md).
## January 2023
azure-app-configuration Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/cli-samples.md
Title: Azure CLI samples - Azure App Configuration description: Information about sample scripts provided for Azure App Configuration--++ Last updated 08/09/2022
azure-app-configuration Concept Customer Managed Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-customer-managed-keys.md
Title: Use customer-managed keys to encrypt your configuration data description: Encrypt your configuration data using customer-managed keys--++ Last updated 08/30/2022
azure-app-configuration Concept Enable Rbac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-enable-rbac.md
Title: Authorize access to Azure App Configuration using Azure Active Directory description: Enable Azure RBAC to authorize access to your Azure App Configuration instance--++ Last updated 05/26/2020
azure-app-configuration Concept Feature Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-feature-management.md
Title: Understand feature management using Azure App Configuration description: Turn features on and off using Azure App Configuration --++
azure-app-configuration Concept Geo Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-geo-replication.md
Title: Geo-replication in Azure App Configuration description: Details of the geo-replication feature in Azure App Configuration. --++
azure-app-configuration Concept Github Action https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-github-action.md
Title: Sync your GitHub repository to App Configuration description: Use GitHub Actions to automatically update your App Configuration instance when you update your GitHub repository.--++ Last updated 05/28/2020
azure-app-configuration Concept Key Value https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-key-value.md
Title: Understand Azure App Configuration key-value store description: Understand key-value storage in Azure App Configuration, which stores configuration data as key-values. Key-values are a representation of application settings.--++ Last updated 09/14/2022
azure-app-configuration Concept Point Time Snapshot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-point-time-snapshot.md
Title: Retrieve key-values from a point-in-time
description: Retrieve old key-values using point-in-time revisions in Azure App Configuration, which maintains a record of changes to key-values. --++ Last updated 05/24/2023
azure-app-configuration Concept Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-private-endpoint.md
Title: Using private endpoints for Azure App Configuration description: Secure your App Configuration store using private endpoints --++ Last updated 07/15/2020
azure-app-configuration Enable Dynamic Configuration Dotnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/enable-dynamic-configuration-dotnet-core.md
description: In this tutorial, you learn how to dynamically update the configuration data for .NET apps documentationcenter: ''-+ editor: ''
ms.devlang: csharp
Last updated 07/11/2023-+ #Customer intent: I want to dynamically update my app to use the latest configuration data in App Configuration. # Tutorial: Use dynamic configuration in a .NET app
azure-app-configuration Enable Dynamic Configuration Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/enable-dynamic-configuration-dotnet.md
Title: '.NET Framework Tutorial: dynamic configuration in Azure App Configuration' description: In this tutorial, you learn how to dynamically update the configuration data for .NET Framework apps using Azure App Configuration. -+ ms.devlang: csharp Last updated 03/20/2023-+ #Customer intent: I want to dynamically update my .NET Framework app to use the latest configuration data in App Configuration. # Tutorial: Use dynamic configuration in a .NET Framework app
azure-app-configuration Howto App Configuration Event https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-app-configuration-event.md
Title: Use Event Grid for App Configuration data change notifications description: Learn how to use Azure App Configuration event subscriptions to send key-value modification events to a web endpoint -+ ms.assetid: ms.devlang: csharp Last updated 03/04/2020-+
azure-app-configuration Howto Disable Public Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-disable-public-access.md
Title: How to disable public access in Azure App Configuration description: How to disable public access to your Azure App Configuration store.--++ Last updated 07/12/2022
azure-app-configuration Howto Feature Filters Aspnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-feature-filters-aspnet-core.md
description: Learn how to use feature filters to enable conditional feature flag
ms.devlang: csharp --++ Last updated 3/9/2020
azure-app-configuration Howto Import Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-import-export-data.md
Title: Import or export data with Azure App Configuration description: Learn how to import or export configuration data to or from Azure App Configuration. Exchange data between your App Configuration store and code project. -+ Last updated 08/24/2022-+ # Import or export configuration data
azure-app-configuration Howto Integrate Azure Managed Service Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-integrate-azure-managed-service-identity.md
Title: Use managed identities to access App Configuration description: Authenticate to Azure App Configuration using managed identities--++
azure-app-configuration Howto Labels Aspnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-labels-aspnet-core.md
description: This article describes how to use labels to retrieve app configuration values for the environment in which the app is currently running. ms.devlang: csharp-+ Last updated 07/11/2023-+ # Use labels to provide per-environment configuration values.
azure-app-configuration Howto Move Resource Between Regions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-move-resource-between-regions.md
Title: Move an App Configuration store to another region description: Learn how to move an App Configuration store to a different region. --++ Last updated 03/27/2023
azure-app-configuration Howto Set Up Private Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-set-up-private-access.md
Title: How to set up private access to an Azure App Configuration store description: How to set up private access to an Azure App Configuration store in the Azure portal and in the CLI.--++ Last updated 07/12/2022
azure-app-configuration Howto Targetingfilter Aspnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/howto-targetingfilter-aspnet-core.md
description: Learn how to enable staged rollout of features for targeted audiences ms.devlang: csharp--++ Last updated 11/20/2020
azure-app-configuration Integrate Ci Cd Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/integrate-ci-cd-pipeline.md
Title: Integrate Azure App Configuration using a continuous integration and delivery pipeline description: Learn to implement continuous integration and delivery using Azure App Configuration -+ Last updated 08/30/2022-+ # Customer intent: I want to use Azure App Configuration data in my CI/CD pipeline.
azure-app-configuration Integrate Kubernetes Deployment Helm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/integrate-kubernetes-deployment-helm.md
Title: Integrate Azure App Configuration with Kubernetes Deployment using Helm description: Learn how to use dynamic configurations in Kubernetes deployment with Helm. -+ Last updated 03/27/2023-+ #Customer intent: I want to use Azure App Configuration data in Kubernetes deployment with Helm.
azure-app-configuration Manage Feature Flags https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/manage-feature-flags.md
description: In this tutorial, you learn how to manage feature flags separately from your application by using Azure App Configuration. documentationcenter: ''-+ editor: '' ms.assetid:
ms.devlang: csharp Last updated 04/05/2022-+ #Customer intent: I want to control feature availability in my app by using App Configuration.
azure-app-configuration Monitor App Configuration Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/monitor-app-configuration-reference.md
Title: Monitoring Azure App Configuration data reference description: Important Reference material needed when you monitor App Configuration --++ Last updated 05/05/2021
azure-app-configuration Monitor App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/monitor-app-configuration.md
Title: Monitor Azure App Configuration description: Start here to learn how to monitor App Configuration --++
azure-app-configuration Overview Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/overview-managed-identity.md
Title: Configure managed identities with Azure App Configuration description: Learn how managed identities work in Azure App Configuration and how to configure a managed identity-+ Last updated 02/25/2020-+
azure-app-configuration Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/overview.md
Title: What is Azure App Configuration? description: Read an overview of the Azure App Configuration service. Understand why you would want to use App Configuration, and learn how you can use it.--++ Last updated 03/20/2023
azure-app-configuration Policy Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/policy-reference.md
Title: Built-in policy definitions for Azure App Configuration
description: Lists Azure Policy built-in policy definitions for Azure App Configuration. These built-in policy definitions provide common approaches to managing your Azure resources. Last updated 09/13/2023 --++
azure-app-configuration Powershell Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/powershell-samples.md
Last updated 01/19/2023--++ # PowerShell samples for Azure App Configuration
azure-app-configuration Pull Key Value Devops Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/pull-key-value-devops-pipeline.md
Title: Pull settings from App Configuration with Azure Pipelines description: Learn how to use Azure Pipelines to pull key-values from an App Configuration Store -+ Last updated 11/17/2020-+ # Pull settings from App Configuration with Azure Pipelines
azure-app-configuration Push Kv Devops Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/push-kv-devops-pipeline.md
Title: Push settings to App Configuration with Azure Pipelines description: Learn to use Azure Pipelines to push key-values to an App Configuration Store -+ Last updated 02/23/2021-+ # Push settings to App Configuration with Azure Pipelines
azure-app-configuration Quickstart Azure App Configuration Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-azure-app-configuration-create.md
Title: "Quickstart: Create an Azure App Configuration store"--++ description: "In this quickstart, learn how to create an App Configuration store." ms.devlang: csharp
azure-app-configuration Quickstart Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-bicep.md
Title: Create an Azure App Configuration store using Bicep description: Learn how to create an Azure App Configuration store using Bicep.--++ Last updated 05/06/2022
azure-app-configuration Quickstart Container Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-container-apps.md
Title: "Quickstart: Use Azure App Configuration in Azure Container Apps" description: Learn how to connect a containerized application to Azure App Configuration, using Service Connector. -+ Last updated 03/02/2023-+
azure-app-configuration Quickstart Dotnet App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-dotnet-app.md
Title: Quickstart for Azure App Configuration with .NET Framework | Microsoft Docs
description: In this article, create a .NET Framework app with Azure App Configuration to centralize storage and management of application settings separate from your code. documentationcenter: ''-+ ms.devlang: csharp Last updated 02/28/2023-+ #Customer intent: As a .NET Framework developer, I want to manage all my app settings in one place. # Quickstart: Create a .NET Framework app with Azure App Configuration
azure-app-configuration Quickstart Dotnet Core App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-dotnet-core-app.md
Title: Quickstart for Azure App Configuration with .NET | Microsoft Docs description: In this quickstart, create a .NET app with Azure App Configuration to centralize storage and management of application settings separate from your code. -+ ms.devlang: csharp Last updated 07/11/2023-+ #Customer intent: As a .NET developer, I want to manage all my app settings in one place. # Quickstart: Create a .NET app with App Configuration
azure-app-configuration Quickstart Feature Flag Azure Functions Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-feature-flag-azure-functions-csharp.md
Title: Quickstart for adding feature flags to Azure Functions | Microsoft Docs description: In this quickstart, use Azure Functions with feature flags from Azure App Configuration and test the function locally. -+ ms.devlang: csharp Last updated 3/20/2023-+ # Quickstart: Add feature flags to an Azure Functions app
azure-app-configuration Quickstart Feature Flag Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-feature-flag-dotnet.md
Title: Quickstart for adding feature flags to .NET Framework apps | Microsoft Docs
description: A quickstart for adding feature flags to .NET Framework apps and managing them in Azure App Configuration documentationcenter: ''-+ editor: '' ms.assetid:
.NET Last updated 3/20/2023-+ #Customer intent: As a .NET Framework developer, I want to use feature flags to control feature availability quickly and confidently. # Quickstart: Add feature flags to a .NET Framework app
azure-app-configuration Quickstart Javascript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-javascript.md
Title: Quickstart for using Azure App Configuration with JavaScript apps | Microsoft Docs description: In this quickstart, create a Node.js app with Azure App Configuration to centralize storage and management of application settings separate from your code. -+ ms.devlang: javascript Last updated 03/20/2023-+ #Customer intent: As a JavaScript developer, I want to manage all my app settings in one place. # Quickstart: Create a JavaScript app with Azure App Configuration
azure-app-configuration Quickstart Python Provider https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-python-provider.md
Title: Quickstart for using Azure App Configuration with Python apps | Microsoft Learn description: In this quickstart, create a Python app with the Azure App Configuration to centralize storage and management of application settings separate from your code. -+ ms.devlang: python Last updated 03/20/2023-+ #Customer intent: As a Python developer, I want to manage all my app settings in one place. # Quickstart: Create a Python app with Azure App Configuration
azure-app-configuration Quickstart Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-python.md
Title: Using Azure App Configuration in Python apps with the Azure SDK for Python | Microsoft Learn description: This document shows examples of how to use the Azure SDK for Python to access your data in Azure App Configuration. -+ ms.devlang: python Last updated 11/17/2022-+ #Customer intent: As a Python developer, I want to use the Azure SDK for Python to access my data in Azure App Configuration. # Create a Python app with the Azure SDK for Python
azure-app-configuration Quickstart Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/quickstart-resource-manager.md
Title: Create an Azure App Configuration store by using Azure Resource Manager template (ARM template) description: Learn how to create an Azure App Configuration store by using Azure Resource Manager template (ARM template).--++ Last updated 06/09/2021
azure-app-configuration Rest Api Authentication Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authentication-azure-ad.md
Title: Azure Active Directory REST API - authentication description: Use Azure Active Directory to authenticate to Azure App Configuration by using the REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Authentication Hmac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authentication-hmac.md
Title: Azure App Configuration REST API - HMAC authentication description: Use HMAC to authenticate to Azure App Configuration by using the REST API--++ ms.devlang: csharp, golang, java, javascript, powershell, python
azure-app-configuration Rest Api Authentication Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authentication-index.md
 Title: Azure App Configuration REST API - Authentication description: Reference pages for authentication using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Authorization Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authorization-azure-ad.md
Title: Azure App Configuration REST API - Azure Active Directory authorization description: Use Azure Active Directory for authorization against Azure App Configuration by using the REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Authorization Hmac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authorization-hmac.md
Title: Azure App Configuration REST API - HMAC authorization description: Use HMAC for authorization against Azure App Configuration using the REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Authorization Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-authorization-index.md
 Title: Azure App Configuration REST API - Authorization description: Reference pages for authorization using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Consistency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-consistency.md
 Title: Azure App Configuration REST API - consistency description: Reference pages for ensuring real-time consistency by using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Fiddler https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-fiddler.md
 Title: Azure Active Directory REST API - Test Using Fiddler description: Use Fiddler to test the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Headers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-headers.md
Title: Azure App Configuration REST API - Headers description: Reference pages for headers used with the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Key Value https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-key-value.md
 Title: Azure App Configuration REST API - key-value description: Reference pages for working with key-values by using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-keys.md
Title: Azure App Configuration REST API - Keys description: Reference pages for working with keys using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-labels.md
Title: Azure App Configuration REST API - Labels description: Reference pages for working with labels using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Locks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-locks.md
Title: Azure App Configuration REST API - locks description: Reference pages for working with key-value locks by using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Postman https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-postman.md
 Title: Azure Active Directory REST API - Test by using Postman description: Use Postman to test the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Revisions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-revisions.md
Title: Azure App Configuration REST API - key-value revisions description: Reference pages for working with key-value revisions by using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Throttling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-throttling.md
Title: Azure App Configuration REST API - Throttling description: Reference pages for understanding throttling when using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api Versioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api-versioning.md
Title: Azure App Configuration REST API - versioning description: Reference pages for versioning by using the Azure App Configuration REST API--++ Last updated 08/17/2020
azure-app-configuration Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/rest-api.md
Title: Azure App Configuration REST API description: Reference pages for the Azure App Configuration REST API--++ Last updated 11/28/2022
azure-app-configuration Cli Create Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/cli-create-service.md
Title: Azure CLI Script Sample - Create an Azure App Configuration Store
description: Create an Azure App Configuration store using a sample Azure CLI script. See reference article links to commands used in the script. -+ Last updated 01/18/2023-+
azure-app-configuration Cli Delete Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/cli-delete-service.md
Title: Azure CLI Script Sample - Delete an Azure App Configuration Store
description: Delete an Azure App Configuration store using a sample Azure CLI script. See reference article links to commands used in the script. -+ ms.devlang: azurecli Last updated 02/19/2020-+
azure-app-configuration Cli Export https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/cli-export.md
Title: Azure CLI Script Sample - Export from an Azure App Configuration Store
description: Use Azure CLI script to export configuration from Azure App Configuration -+ ms.devlang: azurecli Last updated 02/19/2020-+
azure-app-configuration Cli Import https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/cli-import.md
Title: Azure CLI script sample - Import to an App Configuration store
description: Use Azure CLI script - Importing configuration to Azure App Configuration -+ ms.devlang: azurecli Last updated 02/19/2020-+
azure-app-configuration Cli Work With Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/cli-work-with-keys.md
Title: Azure CLI Script Sample - Work with key-values in App Configuration Store
description: Use Azure CLI script to create, view, update and delete key values from App Configuration store -+ ms.devlang: azurecli Last updated 02/19/2020-+
azure-app-configuration Powershell Create Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/powershell-create-service.md
Title: PowerShell script sample - Create an Azure App Configuration store
description: Create an Azure App Configuration store using a sample PowerShell script. See reference article links to commands used in the script. -+ Last updated 02/12/2023-+
azure-app-configuration Powershell Delete Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/scripts/powershell-delete-service.md
Title: PowerShell script sample - Delete an Azure App Configuration store
description: Delete an Azure App Configuration store using a sample PowerShell script. See reference article links to commands used in the script. -+ Last updated 02/02/2023-+
azure-app-configuration Security Controls Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/security-controls-policy.md
Title: Azure Policy Regulatory Compliance controls for Azure App Configuration
description: Lists Azure Policy Regulatory Compliance controls available for Azure App Configuration. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources. Last updated 09/14/2023 --++
azure-app-configuration Use Feature Flags Dotnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/use-feature-flags-dotnet-core.md
Title: Tutorial for using feature flags in a .NET app | Microsoft Docs
description: In this tutorial, you learn how to implement feature flags in .NET Core apps. documentationcenter: ''-+ editor: '' ms.assetid:
ms.devlang: csharp Last updated 07/11/2023-+ #Customer intent: I want to control feature availability in my app by using the .NET Core Feature Manager library.
azure-app-configuration Use Key Vault References Dotnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/use-key-vault-references-dotnet-core.md
Title: Tutorial for using Azure App Configuration Key Vault references in an ASP.NET Core app
description: In this tutorial, you learn how to use Azure App Configuration's Key Vault references from an ASP.NET Core app documentationcenter: ''-+ editor: '' ms.assetid:
ms.devlang: csharp Last updated 07/11/2023-+ #Customer intent: I want to update my ASP.NET Core application to reference values stored in Key Vault through App Configuration.
azure-functions Functions Bindings Event Grid Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-event-grid-output.md
def main(eventGridEvent: func.EventGridEvent,
event_time=datetime.datetime.utcnow(), data_version="1.0")) ```-+ ::: zone-end ::: zone pivot="programming-language-csharp" ## Attributes
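The fragment above is the tail of the Python output-binding sample. For reference, here's a minimal self-contained sketch of the same pattern under the v1 programming model; it assumes a function.json that binds an Event Grid trigger to `eventGridEvent` and an Event Grid output to `outputEvent`, and the IDs, subject, and tags are placeholder values:

```python
import datetime
import azure.functions as func


def main(eventGridEvent: func.EventGridEvent,
         outputEvent: func.Out[func.EventGridOutputEvent]) -> None:
    # Forward a new event to the output binding; the target topic endpoint
    # and key are configured in function.json and app settings, not here.
    outputEvent.set(
        func.EventGridOutputEvent(
            id="test-id",                  # placeholder event ID
            subject="name-of-subject",     # placeholder subject
            data={"tag1": "value1"},       # placeholder payload
            event_type="test-event-1",     # placeholder event type
            event_time=datetime.datetime.utcnow(),
            data_version="1.0"))
```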
azure-linux Intro Azure Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-linux/intro-azure-linux.md
Previously updated : 06/01/2023 Last updated : 09/05/2023 # What is the Azure Linux Container Host for AKS?
-The Azure Linux Container Host is an operating system image that's optimized for running container workloads on [Azure Kubernetes Service (AKS)](../../articles/aks/intro-kubernetes.md). It's maintained by Microsoft and based on Microsoft Azure Linux, an open-source Linux distribution created by Microsoft.
+The Azure Linux Container Host is an operating system image that's optimized for running container workloads on [Azure Kubernetes Service (AKS)](../../articles/aks/intro-kubernetes.md). Microsoft maintains the Azure Linux Container Host and bases it on [CBL-Mariner][cbl-mariner], an open-source Linux distribution created by Microsoft.
The Azure Linux Container Host is lightweight, containing only the packages needed to run container workloads. It's hardened based on significant validation tests and internal usage and is compatible with Azure agents. It provides reliability and consistency from cloud to edge across AKS, AKS for Azure Stack HCI, and Azure Arc. You can deploy Azure Linux node pools in a new cluster, add Azure Linux node pools to your existing clusters, or migrate your existing nodes to Azure Linux nodes.
To learn more about Azure Linux, see the [Azure Linux GitHub repository](https://github.com/microsoft/CBL-Mariner).
The Azure Linux Container Host offers the following key benefits: -- **Secure supply chain**: Microsoft builds, signs, and validates the Azure Linux Container Host packages from source, and hosts its packages and sources in Microsoft-owned and secured platforms.-- **Small and lightweight**: The Azure Linux Container Host only includes the necessary set of packages needed to run container workloads. As a result, it consumes limited disk and memory resources.-- **Secure by default**: The Azure Linux Container Host has an emphasis on security and follows the secure-by-default principles, including using a hardened Linux kernel with Azure cloud optimizations and flags tuned for Azure. It also provides a reduced attack surface and eliminates patching and maintenance of unnecessary packages. For more information on Azure Linux Container Host security principles, see the [AKS security concepts](../../articles/aks/concepts-security.md).-- **Extensively validated**: The AKS and Azure Linux teams run a suite of functional and performance regression tests with the Azure Linux Container Host before releasing to customers, which enables earlier issue detection and mitigation.
+- **Small and lightweight**
+ - The Azure Linux Container Host only includes the necessary set of packages needed to run container workloads. As a result, it consumes limited disk and memory resources.
+ - Azure Linux has only 500 packages, and as a result takes up the least disk space on AKS, by up to *5 GB*.
+- **Secure supply chain**
+ - The Linux and AKS teams at Microsoft build, sign, and validate the [Azure Linux Container Host packages][azure-linux-packages] from source, and host packages and sources in Microsoft-owned and secured platforms.
+ - Each package update runs through a full set of unit tests and end-to-end testing on the existing image to prevent regressions. The extensive testing, in combination with the smaller package count, reduces the chances of disruptive updates to applications.
+- **Secure by default**
+ - The Azure Linux Container Host has an emphasis on security. It follows the secure-by-default principles, including using a hardened Linux kernel with Azure cloud optimizations and flags tuned for Azure. It also provides a reduced attack surface and eliminates patching and maintenance of unnecessary packages.
+ - Microsoft monitors the CVE database and releases security patches monthly and critical updates within days if necessary.
+ - Azure Linux passes all the [CIS Level 1 benchmarks][cis-benchmarks], making it the only Linux distribution on AKS that does so.
+ - For more information on Azure Linux Container Host security principles, see the [AKS security concepts](../../articles/aks/concepts-security.md).
+- **Maintains compatibility with existing workloads**
+ - All existing and future AKS extensions, add-ons, and open-source projects on AKS support both Ubuntu and Azure Linux. This includes support for runtime components like Dapr, IaC tools like Terraform, and monitoring solutions like Dynatrace.
+ - Azure Linux ships with containerd as its container runtime and the upstream Linux kernel, which enables existing containers based on Linux images (like Alpine) to work seamlessly on Azure Linux.
> [!NOTE] >
The Azure Linux Container Host offers the following key benefits:
- Learn more about [Azure Linux Container Host core concepts](./concepts-core.md). - Follow our tutorial to [Deploy, manage, and update applications](./tutorial-azure-linux-create-cluster.md). - Get started by [Creating an Azure Linux Container Host for AKS cluster using Azure CLI](./quickstart-azure-cli.md).+
+<!-- LINKS - external -->
+[cbl-mariner]: https://github.com/microsoft/CBL-Mariner
+[azure-linux-packages]: https://packages.microsoft.com/cbl-mariner/2.0/prod/
+
+<!-- LINKS - internal -->
+[cis-benchmarks]: ../aks/cis-azure-linux.md
azure-monitor Agents Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/agents-overview.md
The tables below provide a comparison of Azure Monitor Agent with the legacy monitoring agents.
| | Event Hub | | | X | | **Services and features supported** | | | | | | | Microsoft Sentinel | X ([View scope](./azure-monitor-agent-migration.md#migrate-additional-services-and-features)) | X | |
-| | VM Insights | X (Public preview) | X | |
+| | VM Insights | X | X | |
| | Microsoft Defender for Cloud | X (Public preview) | X | | | | Automation Update Management | | X | | | | Azure Stack HCI | X | | |
The tables below provide a comparison of Azure Monitor Agent with the legacy monitoring agents.
| | Event Hub | | | X | | | **Services and features supported** | | | | | | | | Microsoft Sentinel | X ([View scope](./azure-monitor-agent-migration.md#migrate-additional-services-and-features)) | X | |
-| | VM Insights | X (Public preview) | X | |
+| | VM Insights | X | X | |
| | Microsoft Defender for Cloud | X (Public preview) | X | | | | Automation Update Management | | X | | | | Update Manager | N/A (Public preview, independent of monitoring agents) | | |
azure-monitor Alerts Troubleshoot Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-troubleshoot-log.md
The alert time range is limited to a maximum of two days. Even if the query cont
If the query requires more data than the alert evaluation, you can change the time range manually. If there's an `ago` command in the query, it's changed automatically to 2 days (48 hours). + ## Log alert fired unnecessarily A configured [log alert rule in Azure Monitor](./alerts-log.md) might be triggered unexpectedly. The following sections describe some common reasons.
azure-monitor Custom Operations Tracking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/custom-operations-tracking.md
public class ApplicationInsightsMiddleware : OwinMiddleware
catch (Exception e) { requestTelemetry.Success = false;
+ requestTelemetry.ResponseCode = "500";
telemetryClient.TrackException(e); throw; }
azure-monitor Opentelemetry Enable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/opentelemetry-enable.md
Follow the steps in this section to instrument your application with OpenTelemet
- Python Application using Python 3.7+
-> [!CAUTION]
-> We have not tested the Azure Monitor OpenTelemetry Distro running side-by-side with the OpenTelemetry Community Package. We recommend you uninstall any OpenTelemetry-related packages before installing the Distro.
+
+> [!TIP]
+> We don't recommend using the OTel Community SDK/API with the Azure Monitor OTel Distro, since the Distro automatically loads them as dependencies.
### Install the client library
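As a rough sketch of what enabling the Distro looks like in Python (assuming the `azure-monitor-opentelemetry` package is installed, and with a placeholder connection string):

```python
from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import trace

# One-time setup: the Distro wires up trace, metric, and log exporters to
# Application Insights. The connection string can also be supplied through
# the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable.
configure_azure_monitor(connection_string="<your-connection-string>")

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("sample-request"):
    print("Telemetry for this span is exported to Application Insights.")
```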
Run your application and open your **Application Insights Resource** tab in the
You've now enabled Application Insights for your application. All the following steps are optional and allow for further customization.
-Not working? Check out the troubleshooting page for [ASP.NET Core](/troubleshoot/azure/azure-monitor/app-insights/opentelemetry-troubleshooting-dotnet), [Java](/troubleshoot/azure/azure-monitor/app-insights/opentelemetry-troubleshooting-java), [Node.js](/troubleshoot/azure/azure-monitor/app-insights/opentelemetry-troubleshooting-nodejs), or [Python](/troubleshoot/azure/azure-monitor/app-insights/opentelemetry-troubleshooting-python).
- > [!IMPORTANT] > If you have two or more services that emit telemetry to the same Application Insights resource, you're required to [set Cloud Role Names](opentelemetry-configuration.md#set-the-cloud-role-name-and-the-cloud-role-instance) to represent them properly on the Application Map.
azure-monitor Azure Monitor Workspace Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/azure-monitor-workspace-manage.md
This article shows you how to create and delete an Azure Monitor workspace. When
Use the following command to create an Azure Monitor workspace using Azure CLI. ```azurecli
-az resource create --resource-group <resource-group-name> --namespace microsoft.monitor --resource-type accounts --name <azure-monitor-workspace-name> --location <location> --properties {}
+az resource create --resource-group <resource-group-name> --namespace microsoft.monitor --resource-type accounts --name <azure-monitor-workspace-name> --location <location> --properties "{}"
``` ### [Resource Manager](#tab/resource-manager)
azure-monitor Azure Monitor Data Explorer Proxy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/azure-monitor-data-explorer-proxy.md
union AzureActivity, arg("").Resources
```kusto let CL1 = arg("").Resources ; union AzureActivity, CL1 | take 10
+```
-```sql
When you use the [`join` operator](/azure/data-explorer/kusto/query/joinoperator) instead of union, you need to use a [`hint`](/azure/data-explorer/kusto/query/joinoperator#join-hints) to combine the data in Azure Resource Graph with data in the Log Analytics workspace. Use `Hint.remote={Direction of the Log Analytics Workspace}`. For example:
-kusto
+```kusto
Perf
| where ObjectName == "Memory" and (CounterName == "Available MBytes Memory")
| extend _ResourceId = replace_string(replace_string(replace_string(_ResourceId, 'microsoft.compute', 'Microsoft.Compute'), 'virtualmachines', 'virtualMachines'), "resourcegroups", "resourceGroups")
| join hint.remote=left (arg("").Resources | where type =~ 'Microsoft.Compute/virtualMachines' | project _ResourceId=id, tags) on _ResourceId
| project-away _ResourceId1
| where tostring(tags.env) == "prod"
To create a new alert rule based on a cross-service query, follow the steps in [
* [Private Link](../logs/private-link-security.md) (private endpoints) and [IP restrictions](/azure/data-explorer/security-network-restrict-public-access) don't support cross-service queries. * mv-expand is limited to 2000 records.
-* the following operators do not work with the cross query with ability with Azure Resource Graph:
-
-smv-apply(), rand(), arg_max() , arg_min(), avg() , avg_if(), countif(), sumif(), percentile() , percentiles() , percentilew() , percentilesw(), stdev() , stdevif() , stdevp(), variance() , variancep() , varianceif().
+* The following operators don't work in cross-service queries with Azure Resource Graph: mv-apply(), rand(), arg_max(), arg_min(), avg(), avg_if(), countif(), sumif(), percentile(), percentiles(), percentilew(), percentilesw(), stdev(), stdevif(), stdevp(), variance(), variancep(), varianceif().
## Next steps * [Write queries](/azure/data-explorer/write-queries)
azure-monitor Data Collector Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/data-collector-api.md
Last updated 08/08/2023-
-# Send log data to Azure Monitor by using the HTTP Data Collector API (preview)
+# Send log data to Azure Monitor by using the HTTP Data Collector API (deprecated)
This article shows you how to use the HTTP Data Collector API to send log data to Azure Monitor from a REST API client. It describes how to format data that's collected by your script or application, include it in a request, and have that request authorized by Azure Monitor. We provide examples for Azure PowerShell, C#, and Python. > [!NOTE]
-> The Azure Monitor HTTP Data Collector API is in public preview.
+> The Azure Monitor HTTP Data Collector API has been deprecated and will no longer be functional as of 9/18/2026. It's been replaced by the [Logs ingestion API](logs-ingestion-api-overview.md).
## Concepts You can use the HTTP Data Collector API to send log data to a Log Analytics workspace in Azure Monitor from any client that can call a REST API. The client might be a runbook in Azure Automation that collects management data from Azure or another cloud, or it might be an alternative management system that uses Azure Monitor to consolidate and analyze log data.
Any request to the Azure Monitor HTTP Data Collector API must include an authori
Here's the format for the authorization header:
-```
+
+```sql
Authorization: SharedKey <WorkspaceID>:<Signature> ```
Authorization: SharedKey <WorkspaceID>:<Signature>
Use this format to encode the **SharedKey** signature string:
-```
+
+```ruby
StringToSign = VERB + "\n" + Content-Length + "\n" + Content-Type + "\n" +
StringToSign = VERB + "\n" +
Here's an example of a signature string:
-```
+
+```ruby
POST\n1024\napplication/json\nx-ms-date:Mon, 04 Apr 2016 08:00:00 GMT\n/api/logs ``` When you have the signature string, encode it by using the HMAC-SHA256 algorithm on the UTF-8-encoded string, and then encode the result as Base64. Use this format:
-```
+
+```sql
Signature=Base64(HMAC-SHA256(UTF8(StringToSign))) ```
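
Putting the pieces above together, here's a minimal Python sketch of the SharedKey signing scheme. The `ods.opinsights.azure.com` endpoint and `2016-04-01` API version follow this API's documented request format; the workspace ID, key, body, and log type are placeholders:

```python
import base64
import datetime
import hashlib
import hmac

import requests

# Placeholder workspace values for illustration only.
WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<workspace-primary-or-secondary-key>"  # base64-encoded


def build_signature(date: str, content_length: int) -> str:
    # StringToSign = VERB \n Content-Length \n Content-Type \n x-ms-date \n resource
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    # HMAC-SHA256 over the UTF-8 string, keyed with the decoded workspace key,
    # then Base64-encode the result.
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      digestmod=hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"


def post_data(body: str, log_type: str) -> int:
    payload = body.encode("utf-8")
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(payload)),
        "Log-Type": log_type,  # custom table name; Azure Monitor appends _CL
        "x-ms-date": rfc1123_date,
    }
    uri = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    return requests.post(uri, data=payload, headers=headers).status_code


print(post_data('[{"Computer": "edge-01", "CounterValue": 42.0}]', "MyCustomLog"))
```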
The following properties are reserved and shouldn't be used in a custom record t
- TimeGenerated - RawData - ## Data limits The data posted to the Azure Monitor Data collection API is subject to certain constraints:
public class ApiExample {
``` + ## Alternatives and considerations
Although the Data Collector API should cover most of your needs as you collect f
| [Azure Data Explorer](/azure/data-explorer/ingest-data-overview) | Azure Data Explorer, now generally available to the public, is the data platform that powers Application Insights Analytics and Azure Monitor Logs. By using the data platform in its raw form, you have complete flexibility (but require the overhead of management) over the cluster (Kubernetes role-based access control (RBAC), retention rate, schema, and so on). Azure Data Explorer provides many [ingestion options](/azure/data-explorer/ingest-data-overview#ingestion-methods), including [CSV, TSV, and JSON](/azure/kusto/management/mappings) files. | <ul><li> Data that won't be correlated with any other data under Application Insights or Monitor Logs. </li><li> Data that requires advanced ingestion or processing capabilities that aren't available today in Azure Monitor Logs. </li></ul> | + ## Next steps - Use the [Log Search API](./log-query-overview.md) to retrieve data from the Log Analytics workspace. - Learn more about how to [create a data pipeline with the Data Collector API](create-pipeline-datacollector-api.md) by using a Logic Apps workflow to Azure Monitor.+
azure-netapp-files Azure Netapp Files Solution Architectures https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azure-netapp-files-solution-architectures.md
na Previously updated : 06/12/2023 Last updated : 09/18/2023 # Solution architectures using Azure NetApp Files
This section provides references to SAP on Azure solutions.
* [DB2 Installation Guide on Azure NetApp Files](https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/db2-installation-guide-on-anf/ba-p/3709437) * [Manual Recovery Guide for SAP DB2 on Azure VMs from Azure NetApp Files snapshot with AzAcSnap](https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/manual-recovery-guide-for-sap-db2-on-azure-vms-from-azure-netapp/ba-p/3865379) * [SAP ASE 16.0 on Azure NetApp Files for SAP Workloads on SLES15](https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/sap-ase-16-0-on-azure-netapp-files-for-sap-workloads-on-sles15/ba-p/3729496)
+* [SAP Netweaver 7.5 with MaxDB 7.9 on Azure using Azure NetApp Files](https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/sap-netweaver-7-5-with-maxdb-7-9-on-azure-using-azure-netapp/ba-p/3905041)
### SAP IQ-NLS
azure-portal Azure Portal Safelist Urls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-portal/azure-portal-safelist-urls.md
Title: Allow the Azure portal URLs on your firewall or proxy server description: To optimize connectivity between your network and the Azure portal and its services, we recommend you add these URLs to your allowlist. Previously updated : 08/22/2023 Last updated : 09/14/2023
graph.microsoftazure.us
aadcdn.msauth.cn aadcdn.msftauth.cn login.live.com
+catalogartifact.azureedge.net
+store-images.s-microsoft.com
*.azure.cn *.microsoft.cn *.microsoftonline.cn
azure-sql-edge Backup Restore https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/backup-restore.md
Title: Back up and restore databases - Azure SQL Edge
description: Learn about backup and restore capabilities in Azure SQL Edge. - Previously updated : 05/19/2020 Last updated : 09/14/2023 -
+# Back up and restore databases in Azure SQL Edge
-# Back up and restore databases in Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-Azure SQL Edge is built on the latest versions of the Microsoft SQL Database Engine. It provides similar backup and restore database capabilities as those available in SQL Server on Linux and SQL Server running in containers. The backup and restore component provides an essential safeguard for protecting data stored in your Azure SQL Edge databases.
+Azure SQL Edge is built on the latest versions of the Microsoft SQL Database Engine. It provides similar backup and restore database capabilities to SQL Server on Linux and SQL Server running in containers. The backup and restore component provides an essential safeguard for protecting data stored in your Azure SQL Edge databases.
-To minimize the risk of catastrophic data loss, you should back up your databases periodically to preserve modifications to your data on a regular basis. A well-planned backup and restore strategy helps protect databases against data loss caused by a variety of failures. Test your strategy by restoring a set of backups and then recovering your database, to prepare you to respond effectively to a disaster.
+To minimize the risk of catastrophic data loss, you should back up your databases periodically to preserve modifications to your data regularly. A well-planned backup and restore strategy helps protect databases against data loss caused by various failures. Test your strategy by restoring a set of backups and then recovering your database, to prepare you to respond effectively to a disaster.
To read more about why backups are important, see [Backup and restore of SQL Server databases](/sql/relational-databases/backup-restore/back-up-and-restore-of-sql-server-databases/).
Azure SQL Edge enables you to back up to and restore from both local storage and
Azure SQL Edge supports the same backup types as SQL Server. For a complete list, see [Backup overview](/sql/relational-databases/backup-restore/backup-overview-sql-server/).
-> [!IMPORTANT]
+> [!IMPORTANT]
> Databases created in Azure SQL Edge use the simple recovery model by default. As such, you can't perform log backups on these databases. If you need to do this, you'll need an administrator to change the database recovery model to the full recovery model. For a complete list of recovery models supported by SQL Server, see [Recovery model overview](/sql/relational-databases/backup-restore/recovery-models-sql-server#RMov). ### Back up to local disk
-In the following example, you use the `BACKUP DATABASE` Transact-SQL command to create a database backup in the container. For the purpose of this example, you create a new folder called *backup* to store the backup files.
+In the following example, you use the `BACKUP DATABASE` Transact-SQL command to create a database backup in the container. For this example, you create a new folder called *backup* to store the backup files.
1. Create a folder for the backups. Run this command on the host where your Azure SQL Edge container is running. In the following command, replace **<AzureSQLEdge_Container_Name>** with the name of the Azure SQL Edge container in your deployment.
- ```bash
- sudo docker exec -it <AzureSQLEdge_Container_Name> mkdir /var/opt/mssql/backup
- ```
+ ```bash
+ sudo docker exec -it <AzureSQLEdge_Container_Name> mkdir /var/opt/mssql/backup
+ ```
-2. Connect to the Azure SQL Edge instance by using SQL Server Management Studio (SSMS), or by using Azure Data Studio. Run the `BACKUP DATABASE` command to take the backup of your user database. In the following example, you're taking the backup of the *IronOreSilicaPrediction* database.
+1. Connect to the Azure SQL Edge instance by using SQL Server Management Studio (SSMS), or by using Azure Data Studio. Run the `BACKUP DATABASE` command to take the backup of your user database. In the following example, you're taking the backup of the `IronOreSilicaPrediction` database.
- ```sql
- BACKUP DATABASE [IronOreSilicaPrediction]
- TO DISK = N'/var/opt/mssql/backup/IronOrePredictDB.bak'
- WITH NOFORMAT, NOINIT, NAME = N'IronOreSilicaPrediction-Full Database Backup',
- SKIP, NOREWIND, NOUNLOAD, COMPRESSION, STATS = 10
- GO
- ```
+ ```sql
+ BACKUP DATABASE [IronOreSilicaPrediction] TO DISK = N'/var/opt/mssql/backup/IronOrePredictDB.bak'
+ WITH NOFORMAT,
+ NOINIT,
+ NAME = N'IronOreSilicaPrediction-Full Database Backup',
+ SKIP,
+ NOREWIND,
+ NOUNLOAD,
+ COMPRESSION,
+ STATS = 10;
+ GO
+ ```
-3. After you run the command, if the backup of the database is successful, you'll see messages similar to the following in the results section of SSMS or Azure Data Studio.
+1. After you run the command, if the backup of the database is successful, you'll see messages similar to the following in the results section of SSMS or Azure Data Studio.
- ```txt
+ ```output
10 percent processed. 20 percent processed. 30 percent processed.
In the following example, you use the `BACKUP DATABASE` Transact-SQL command to
### Back up to URL
-Azure SQL Edge supports backups to both page blobs and block blobs. For more information, see [Back up to block blob vs page blob](/sql/relational-databases/backup-restore/sql-server-backup-to-url#blockbloborpageblob). In the following example, the database *IronOreSilicaPrediction* is being backed up to a block blob.
+Azure SQL Edge supports backups to both page blobs and block blobs. For more information, see [Back up to block blob vs page blob](/sql/relational-databases/backup-restore/sql-server-backup-to-url#blockbloborpageblob). In the following example, the database `IronOreSilicaPrediction` is being backed up to a block blob.
1. To configure backups to block blobs, first generate a shared access signature (SAS) token that you can use to create a SQL Server credential on Azure SQL Edge. The script creates a SAS that is associated with a stored access policy. For more information, see [Shared access signatures, part 1: Understanding the SAS model](../storage/common/storage-sas-overview.md). The script also writes the T-SQL command required to create the credential on SQL Server. The following script assumes that you already have an Azure subscription with a storage account, and a storage container for the backups. ```PowerShell
- # Define global variables for the script
+ # Define global variables for the script
$subscriptionName='<your subscription name>' # the name of subscription name you will use $resourcegroupName = '<your resource group name>' # the name of resource group you will use $storageAccountName= '<your storage account name>' # the storage account name you will use for backups
- $containerName= '<your storage container name>' # the storage container name to which you will attach the SAS policy with its SAS token
- $policyName = 'SASPolicy' # the name of the SAS policy
+ $containerName= '<your storage container name>' # the storage container name to which you will attach the SAS policy with its SAS token
+ $policyName = 'SASPolicy' # the name of the SAS policy
# adds an authenticated Azure account for use in the session Login-AzAccount
Azure SQL Edge supports backups to both page blobs and block blobs. For more inf
Select-AzSubscription -Subscription $subscriptionName # Generate the SAS token
- $sa = Get-AzStorageAccount -ResourceGroupName $resourcegroupName -Name $storageAccountName
- $storagekey = Get-AzStorageAccountKey -ResourceGroupName $resourcegroupName -Name $storageAccountName
+ $sa = Get-AzStorageAccount -ResourceGroupName $resourcegroupName -Name $storageAccountName
+ $storagekey = Get-AzStorageAccountKey -ResourceGroupName $resourcegroupName -Name $storageAccountName
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storagekey[0].Value $cbc = Get-AzStorageContainer -Name $containerName -Context $storageContext $policy = New-AzStorageContainerStoredAccessPolicy -Container $containerName -Policy $policyName -Context $storageContext -ExpiryTime $(Get-Date).ToUniversalTime().AddYears(10) -Permission "rwld" $sas = New-AzStorageContainerSASToken -Policy $policyName -Context $storageContext -Container $containerName Write-Host 'Shared Access Signature= '$($sas.Substring(1))''
- # Outputs the Transact SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
- Write-Host 'Credential T-SQL'
+ # Outputs the Transact SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
+ Write-Host 'Credential T-SQL'
$tSql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='Shared Access Signature', SECRET='{1}'" -f $cbc.CloudBlobContainer.Uri.AbsoluteUri,$sas.Substring(1)
- $tSql | clip
+ $tSql | clip
   Write-Host $tSql
   ```

   After successfully running the script, copy the `CREATE CREDENTIAL` command to a query tool. Then connect to an instance of SQL Server, and run the command to create the credential with the SAS.
-2. Connect to the Azure SQL Edge instance by using SSMS or Azure Data Studio, and create the credential by using the command from the previous step. Make sure to replace the `CREATE CREDENTIAL` command with the actual output from the previous step.
+1. Connect to the Azure SQL Edge instance by using SSMS or Azure Data Studio, and create the credential by using the command from the previous step. Make sure to replace the `CREATE CREDENTIAL` command with the actual output from the previous step.
```sql
- IF NOT EXISTS
+ IF NOT EXISTS
(SELECT * FROM sys.credentials
- WHERE name = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>')
+ WHERE name = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>')
CREATE CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
- WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
+ WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
       SECRET = '<SAS_TOKEN>';
   ```
-3. The following command takes a backup of the *IronOreSilicaPrediction* to the Azure storage container.
+1. The following command takes a backup of the `IronOreSilicaPrediction` database to the Azure storage container.
   ```sql
   BACKUP DATABASE IronOreSilicaPrediction
   TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/IronOreSilicaPrediction.bak'
- With MAXTRANSFERSIZE = 4194304,BLOCKSIZE=65536;
+   WITH MAXTRANSFERSIZE = 4194304, BLOCKSIZE = 65536;
   GO
   ```
In Azure SQL Edge, you can restore from a local disk, a network location, or an Azure Blob storage account. For more information about restore and recovery in SQL Server, see [Restore and recovery overview](/sql/relational-databases/backup-restore/restore-and-recovery-overview-sql-server). For an overview of the simple recovery model in SQL Server, see [Complete database restores (simple recovery model)](/sql/relational-databases/backup-restore/complete-database-restores-simple-recovery-model).
-> [!IMPORTANT]
-> Databases created in Azure SQL Edge cannot be restored on an instance of Microsoft SQL Server or Azure SQL. Additionally, a database created on Microsoft SQL Server or Azure SQL can be restored on Azure SQL Edge, provided the database does not contain any of the features not supported by Azure SQL Edge.
+> [!IMPORTANT]
+> Databases created in Azure SQL Edge can't be restored on an instance of Microsoft SQL Server or Azure SQL. Additionally, a database created on Microsoft SQL Server or Azure SQL can be restored on Azure SQL Edge, provided the database doesn't contain any of the features not supported by Azure SQL Edge.
### Restore from a local disk
-This example uses the *IronOreSilicaPrediction* backup that you made in the previous example. Now, you'll restore it as a new database with a different name.
+This example uses the `IronOreSilicaPrediction` backup that you made in the previous example. Now, you'll restore it as a new database with a different name.
1. If the database backup file isn't already present in the container, you can use the following command to copy the file into the container. The following example assumes that the backup file is present on the local host, and is being copied to the /var/opt/mssql/backup folder into an Azure SQL Edge container named *sql1*.
   ```bash
   sudo docker cp IronOrePredictDB.bak sql1:/var/opt/mssql/backup
   ```
-2. Connect to the Azure SQL Edge instance by using SSMS or Azure Data Studio to run the restore command. In the following example, **IronOrePredictDB.bak** is restored to create a new database, **IronOreSilicaPrediction_2**.
+1. Connect to the Azure SQL Edge instance by using SSMS or Azure Data Studio to run the restore command. In the following example, `IronOrePredictDB.bak` is restored to create a new database, `IronOreSilicaPrediction_2`.
   ```sql
   RESTORE FILELISTONLY FROM DISK = N'/var/opt/mssql/backup/IronOrePredictDB.bak';
   RESTORE DATABASE IronOreSilicaPrediction_2
   FROM DISK = N'/var/opt/mssql/backup/IronOrePredictDB.bak'
   WITH MOVE 'IronOreSilicaPrediction' TO '/var/opt/mssql/data/IronOreSilicaPrediction_Primary_2.mdf',
- MOVE 'IronOreSilicaPrediction_log' TO '/var/opt/mssql/data/IronOreSilicaPrediction_Primary_2.ldf'
+ MOVE 'IronOreSilicaPrediction_log' TO '/var/opt/mssql/data/IronOreSilicaPrediction_Primary_2.ldf';
```
-3. After you run the restore command, if the restore operation was successful, you'll see messages similar to the following in the output window.
+1. After you run the restore command, if the restore operation was successful, you'll see messages similar to the following in the output window.
- ```txt
+ ```output
    Processed 51648 pages for database 'IronOreSilicaPrediction_2', file 'IronOreSilicaPrediction' on file 1.
    Processed 2 pages for database 'IronOreSilicaPrediction_2', file 'IronOreSilicaPrediction_log' on file 1.
    RESTORE DATABASE successfully processed 51650 pages in 6.543 seconds (61.670 MB/sec).
### Restore from URL
-Azure SQL Edge also supports restoring a database from an Azure Storage account. You can restore from either the block blobs or page blob backups. In the following example, the *IronOreSilicaPrediction_2020_04_16.bak* database backup file on a block blob is restored to create the database, *IronOreSilicaPrediction_3*.
+Azure SQL Edge also supports restoring a database from an Azure Storage account. You can restore from either block blob or page blob backups. In the following example, the `IronOreSilicaPrediction_2020_04_16.bak` database backup file on a block blob is restored to create the database `IronOreSilicaPrediction_3`.
```sql
RESTORE DATABASE IronOreSilicaPrediction_3
FROM URL = 'https://mystorageaccount.blob.core.windows.net/mysecondcontainer/IronOreSilicaPrediction_2020_04_16.bak'
WITH MOVE 'IronOreSilicaPrediction' TO '/var/opt/mssql/data/IronOreSilicaPrediction_Primary_3.mdf',
    MOVE 'IronOreSilicaPrediction_log' TO '/var/opt/mssql/data/IronOreSilicaPrediction_Primary_3.ldf',
    STATS = 10;
```
azure-sql-edge Configure Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/configure-replication.md
Title: Configure replication to Azure SQL Edge
description: Learn about configuring replication to Azure SQL Edge. - Previously updated : 05/19/2020 Last updated : 09/14/2023 -
+# Configure replication to Azure SQL Edge
-# Configure replication to Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-You can configure an instance of Azure SQL Edge as the push subscriber for one-way transactional replication or snapshot replication. This instance can't act as the publisher or the distributor for a transactional replication configuration. Note that Azure SQL Edge doesn't support merge replication, peer-to-peer replication, or Oracle publishing.
+You can configure an instance of Azure SQL Edge as the push subscriber for one-way transactional replication or snapshot replication. This instance can't act as the publisher or the distributor for a transactional replication configuration. Azure SQL Edge doesn't support merge replication, peer-to-peer replication, or Oracle publishing.
## Supported configurations
-
+
+- The instance of Azure SQL Edge must be a push subscriber for a publisher.
+- The publisher and the distributor can be either:
+  - An instance of SQL Server running on-premises, or an instance of SQL Server running in an Azure virtual machine. For more information, see [SQL Server on Azure Virtual Machines overview](/azure/azure-sql/virtual-machines/index). SQL Server instances must be using a version later than SQL Server 2016.
- - An instance of Azure SQL Managed Instance. SQL Managed Instance can host publisher, distributor, and subscriber databases. For more information, see [Replication with SQL Database Managed Instance](/azure/sql-database/replication-with-sql-database-managed-instance/).
+ - An instance of Azure SQL Managed Instance. SQL Managed Instance can host publisher, distributor, and subscriber databases. For more information, see [Replication with SQL Managed Instance](/azure/sql-database/replication-with-sql-database-managed-instance/).
-- The distribution database and the replication agents can't be placed on an instance of Azure SQL Edge.
+- The distribution database and the replication agents can't be placed on an instance of Azure SQL Edge.
-> [!NOTE]
-> If you attempt to configure replication by using an unsupported version, you might receive the following two errors: MSSQL_REPL20084 ("The process could not connect to Subscriber.") and MSSQL_REPL40532 ("Cannot open server \<name> requested by the login. The login failed.").
+> [!NOTE]
+> If you attempt to configure replication by using an unsupported version, you might receive the following two errors: MSSQL_REPL20084 ("The process could not connect to Subscriber.") and MSSQL_REPL40532 ("Cannot open server \<name> requested by the login. The login failed.").
## Remarks
The following requirements and best practices are important to understand as you configure replication:
- You can configure replication by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms). You can also do so by running Transact-SQL statements on the publisher, by using either SQL Server Management Studio or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
- To replicate to an instance of Azure SQL Edge, you must use SQL Server authentication to sign in.
- Replicated tables must have a primary key.
+- A single publication on SQL Server can support both Azure SQL Edge and SQL Server (on-premises and SQL Server in an Azure virtual machine) subscribers.
+- Replication management, monitoring, and troubleshooting must be performed from the SQL Server instance.
+- Only push subscriptions to Azure SQL Edge are supported.
+- Only `@subscriber_type = 0` is supported in the stored procedure `sp_addsubscription` for Azure SQL Edge.
- Azure SQL Edge doesn't support bi-directional, immediate, updatable, or peer-to-peer replication.
+- Azure SQL Edge only supports a subset of features available in SQL Server or SQL Managed Instance. If you attempt to replicate a database (or objects within the database) that contains one or more unsupported features, the attempt fails. For example, if you attempt to replicate a database that contains objects with spatial data types, you receive an error. For more information, see [Supported features of Azure SQL Edge](features.md).
## Initialize reference data on an instance of Azure SQL Edge

You might want to initialize your instance with reference data that changes over time. For example, you might want to update machine learning models on your instance of Azure SQL Edge, after they have been trained on a SQL Server instance. Here's how to initialize your instance in such a scenario:
-1. Create a transactional replication publication on a SQL Server database.
-2. On the SQL Server instance, use the **New Subscription Wizard** or Transact-SQL statements to create a push to subscription to Azure SQL Edge.
-3. You can initialize the replicated database on Azure SQL Edge by using a snapshot generated by the snapshot agent, and distributed and delivered by the distribution agent. Alternatively, you can initialize by using a backup of the database from the publisher. Remember that if the database backup contains objects or features not supported by Azure SQL Edge, the restore operation fails.
+1. Create a transactional replication publication on a SQL Server database.
+1. On the SQL Server instance, use the **New Subscription Wizard** or Transact-SQL statements to create a push subscription to Azure SQL Edge.
+1. You can initialize the replicated database on Azure SQL Edge by using a snapshot generated by the snapshot agent, and distributed and delivered by the distribution agent. Alternatively, you can initialize by using a backup of the database from the publisher. Remember that if the database backup contains objects or features not supported by Azure SQL Edge, the restore operation fails.
## Limitations

The following options aren't supported for Azure SQL Edge subscriptions:
+- Copy file groups association
+- Copy table partitioning schemes
+- Copy index partitioning schemes
+- Copy user defined statistics
+- Copy default bindings
+- Copy rule bindings
+- Copy fulltext indexes
+- Copy XML XSD
+- Copy XML indexes
+- Copy permissions
+- Copy spatial indexes
+- Copy filtered indexes
+- Copy data compression attribute
+- Copy sparse column attribute
- Copy filestream, `hierarchyid`, or spatial data types
+- Convert `hierarchyid` to MAX data types
+- Convert spatial to MAX data types
+- Copy extended properties
+- Copy permissions
## Examples

Create a publication and a push subscription. For more information, see:
-
+
+- [Create a publication](/sql/relational-databases/replication/publish/create-a-publication)
+- [Create a push subscription](/sql/relational-databases/replication/create-a-push-subscription/) by using the Azure SQL Edge server name and IP as the subscriber (for example, **myEdgeinstance,1433**), and a database name on the Azure SQL Edge instance as the destination database (for example, **AdventureWorks**).
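
A rough T-SQL sketch of the subscription step, run at the publisher in the publication database; the publication, database, and login names mirror the placeholders above, and `@subscriber_type = 0` is the only supported value, as noted in the remarks:

```sql
-- Add the push subscription for the Azure SQL Edge subscriber.
EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'myEdgeinstance,1433',
    @destination_db = N'AdventureWorks',
    @subscription_type = N'Push',
    @subscriber_type = 0; -- only supported value for Azure SQL Edge

-- Create the distribution agent job that pushes changes to the subscriber.
-- SQL Edge subscribers require SQL Server authentication.
EXEC sp_addpushsubscription_agent
    @publication = N'MyPublication',
    @subscriber = N'myEdgeinstance,1433',
    @subscriber_db = N'AdventureWorks',
    @subscriber_security_mode = 0,
    @subscriber_login = N'<sql_login>',
    @subscriber_password = N'<password>';
```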
-## Next steps
+## Next steps
- [Create a publication](/sql/relational-databases/replication/publish/create-a-publication)
- [Create a push subscription](/sql/relational-databases/replication/create-a-push-subscription/)
- [Types of replication](/sql/relational-databases/replication/types-of-replication)
- [Monitoring (replication)](/sql/relational-databases/replication/monitor/monitoring-replication)
+- [Initialize a subscription](/sql/relational-databases/replication/initialize-a-subscription)
azure-sql-edge Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/configure.md
Title: Configure Azure SQL Edge
description: Learn about configuring Azure SQL Edge. - Previously updated : 09/22/2020 Last updated : 09/14/2023

# Configure Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
Azure SQL Edge supports configuration through one of the following two options:

- Environment variables
- An mssql.conf file placed in the /var/opt/mssql folder
-> [!NOTE]
+> [!NOTE]
> Setting environment variables overrides the settings specified in the mssql.conf file.

## Configure by using environment variables

Azure SQL Edge exposes several different environment variables that can be used to configure the SQL Edge container. These environment variables are a subset of the ones available for SQL Server on Linux. For more information on SQL Server on Linux environment variables, see [Environment variables](/sql/linux/sql-server-linux-configure-environment-variables/).
-The following new environment variables were added to Azure SQL Edge.
-
-| Environment variable | Description | Values |
-|--|--| - |
-| **PlanId** | Specifies the Azure SQL Edge SKU to be used during initialization. This environment variable is only required when deploying Azure SQL Edge using Azure IoT Edge. | **asde-developer-on-iot-edge** or **asde-premium-on-iot-edge** |
-| **MSSQL_TELEMETRY_ENABLED** | Enable or disable usage and diagnostics data collection. | TRUE or FALSE |
-| **MSSQL_TELEMETRY_DIR** | Sets the target directory for the usage and diagnostics data collection audit files. | Folder location within SQL Edge container. This folder can be mapped to a host volume using either mount points or data volumes. |
-| **MSSQL_PACKAGE** | Specifies the location of the dacpac or bacpac package to be deployed. | Folder, file, or SAS URL containing the dacpac or bacpac packages. For more information, refer [Deploy SQL Database DACPAC and BACPAC packages in SQL Edge](deploy-dacpac.md). |
+The following new environment variables were added to Azure SQL Edge.
+| Environment variable | Description | Values |
+| | | |
+| **PlanId** | Specifies the Azure SQL Edge SKU to be used during initialization. This environment variable is only required when deploying Azure SQL Edge using Azure IoT Edge. | **asde-developer-on-iot-edge** or **asde-premium-on-iot-edge** |
+| **MSSQL_TELEMETRY_ENABLED** | Enable or disable usage and diagnostics data collection. | TRUE or FALSE |
+| **MSSQL_TELEMETRY_DIR** | Sets the target directory for the usage and diagnostics data collection audit files. | Folder location within SQL Edge container. This folder can be mapped to a host volume using either mount points or data volumes. |
+| **MSSQL_PACKAGE** | Specifies the location of the dacpac or bacpac package to be deployed. | Folder, file, or SAS URL containing the dacpac or bacpac packages. For more information, see [Deploy SQL Database DACPAC and BACPAC packages in SQL Edge](deploy-dacpac.md). |
-The following SQL Server on Linux environment variable isn't supported for Azure SQL Edge. If defined, this environment variable will be ignored during container initialization.
+The following SQL Server on Linux environment variable isn't supported for Azure SQL Edge. If defined, this environment variable is ignored during container initialization.
| Environment variable | Description |
-|--|--|
-| **MSSQL_ENABLE_HADR** | Enable availability group. For example, **1** is enabled, and **0** is disabled. |
+| | |
+| **MSSQL_ENABLE_HADR** | Enable availability group. For example, `1` is enabled, and `0` is disabled. |
-> [!IMPORTANT]
+> [!IMPORTANT]
> The **MSSQL_PID** environment variable for SQL Edge only accepts **Premium** and **Developer** as the valid values. Azure SQL Edge doesn't support initialization using a product key.

### Specify the environment variables
Specify environment variables for SQL Edge when you deploy the service through the Azure portal.
Add values in **Environment Variables**.
-![Set by using environment variables list](media/configure/set-environment-variables.png)
Add values in **Container Create Options**.
-![Set by using container create options](media/configure/set-environment-variables-using-create-options.png)
-> [!NOTE]
+> [!NOTE]
> In the disconnected deployment mode, environment variables can be specified using the `-e` or `--env` or the `--env-file` option of the `docker run` command.
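
For example, a minimal disconnected-mode sketch; the image name matches the volume example later in this article, and the password and edition values are placeholders:

```bash
docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" \
    -e "MSSQL_PID=Developer" -e "MSSQL_TELEMETRY_ENABLED=FALSE" \
    -p 1433:1433 -d mcr.microsoft.com/azure-sql-edge
```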
## Configure by using an `mssql.conf` file
Azure SQL Edge doesn't include the [mssql-conf configuration utility](/sql/linux/sql-server-linux-configure-mssql-conf/) like SQL Server on Linux does. You need to manually configure the mssql.conf file and place it in the persistent storage drive that is mapped to the /var/opt/mssql/ folder in the SQL Edge module. When you're deploying SQL Edge from Azure Marketplace, this mapping is specified as the **Mounts** option in the **Container Create Options**.

```json
+{
+ "Mounts": [
{
- "Mounts": [
- {
- "Type": "volume",
- "Source": "sqlvolume",
- "Target": "/var/opt/mssql"
- }
- ]
- }
+ "Type": "volume",
+ "Source": "sqlvolume",
+ "Target": "/var/opt/mssql"
}
+ ]
+}
```
-The following new mssql.conf options were added for Azure SQL Edge.
+The following new mssql.conf options were added for Azure SQL Edge.
-|Option|Description|
-|:|:|
-|**customerfeedback** | Choose if SQL Server sends feedback to Microsoft. For more information, see [Disable usage and diagnostic data collection](usage-and-diagnostics-data-configuration.md#disable-usage-and-diagnostic-data-collection)|
-|**userrequestedlocalauditdirectory** | Sets the target directory for the usage and diagnostics data collection audit files. For more information, see [Local audit of usage and diagnostic data collection](usage-and-diagnostics-data-configuration.md#local-audit-of-usage-and-diagnostic-data-collection) |
+| Option | Description |
+| : | : |
+| **customerfeedback** | Choose if SQL Server sends feedback to Microsoft. For more information, see [Disable usage and diagnostic data collection](usage-and-diagnostics-data-configuration.md#disable-usage-and-diagnostic-data-collection) |
+| **userrequestedlocalauditdirectory** | Sets the target directory for the usage and diagnostics data collection audit files. For more information, see [Local audit of usage and diagnostic data collection](usage-and-diagnostics-data-configuration.md#local-audit-of-usage-and-diagnostic-data-collection) |
The following mssql.conf options aren't applicable to SQL Edge:
-|Option|Description|
-|:|:|
-|**Customer feedback** | Choose if SQL Server sends feedback to Microsoft. |
-|**Database mail profile** | Set the default database mail profile for SQL Server on Linux. |
-|**High availability** | Enable Availability Groups. |
-|**Microsoft Distributed Transaction Coordinator** | Configure and troubleshoot MSDTC on Linux. Additional distributed transaction-related configuration options aren't supported for SQL Edge. For more information on these additional configuration options, see [Configure MSDTC](/sql/linux/sql-server-linux-configure-mssql-conf#msdtc). |
-|**ML Services EULAs** | Accept R and Python EULAs for Azure Machine Learning packages. Applies to SQL Server 2019 only.|
-|**outboundnetworkaccess** |Enable outbound network access for [Machine Learning Services](/sql/linux/sql-server-linux-setup-machine-learning/) R, Python, and Java extensions.|
+| Option | Description |
+| : | : |
+| **Customer feedback** | Choose if SQL Server sends feedback to Microsoft. |
+| **Database mail profile** | Set the default database mail profile for SQL Server on Linux. |
+| **High availability** | Enable Availability Groups. |
+| **Microsoft Distributed Transaction Coordinator** | Configure and troubleshoot MSDTC on Linux. Additional distributed transaction-related configuration options aren't supported for SQL Edge. For more information on these additional configuration options, see [Configure MSDTC](/sql/linux/sql-server-linux-configure-mssql-conf#msdtc). |
+| **ML Services EULAs** | Accept R and Python EULAs for Azure Machine Learning packages. Applies to SQL Server 2019 only. |
+| **outboundnetworkaccess** | Enable outbound network access for [Machine Learning Services](/sql/linux/sql-server-linux-setup-machine-learning/) R, Python, and Java extensions. |
The following sample mssql.conf file works for SQL Edge. For more information on the format for an `mssql.conf` file, see [mssql.conf format](/sql/linux/sql-server-linux-configure-mssql-conf#mssql-conf-format).
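A minimal sketch of such a file, assuming the documented INI-style sections; the values, including the trace flag, are illustrative:

```ini
[EULA]
accepteula = Y

[telemetry]
customerfeedback = false
userrequestedlocalauditdirectory = /tmp/audit

[traceflag]
traceflag0 = 1204
```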
## Run Azure SQL Edge as non-root user
-By default, the Azure SQL Edge containers run with a non-root user/group. When deployed through the Azure Marketplace (or using docker run), unless a different user/group is specified, SQL Edge containers starts up as the mssql (non-root) user. To specify a different non-root user during deployment, add the `*"User": "<name|uid>[:<group|gid>]"*` key-value pair under container create options. In the example below SQL Edge is configured to start as the user `*IoTAdmin*`.
+By default, the Azure SQL Edge containers run with a non-root user/group. When deployed through the Azure Marketplace (or using `docker run`), unless a different user/group is specified, SQL Edge containers start up as the mssql (non-root) user. To specify a different non-root user during deployment, add the `"User": "<name|uid>[:<group|gid>]"` key-value pair under container create options. In the following example, SQL Edge is configured to start as the user `IoTAdmin`.
```json
{
    "User": "IoTAdmin"
}
```
-To allow the non-root user to access DB files that are on mounted volumes, ensure that the user/group you run the container under, has read & write permissions on the persistent file storage. In the example below we set the non-root user with user_id 10001 as the owner of the files.
+To allow the non-root user to access database files that are on mounted volumes, ensure that the user/group you run the container under has read and write permissions on the persistent file storage. In the following example, we set the non-root user with `user_id` of `10001` as the owner of the files.
```bash
chown -R 10001:0 <database file dir>
```
-### Upgrading from earlier CTP releases
+### Upgrade from earlier CTP releases
Earlier CTPs of Azure SQL Edge were configured to run as the root user. The following options are available when upgrading from earlier CTPs.

- Continue to use the root user - To continue using the root user, add the `"User": "0:0"` key-value pair under container create options.
- - Add a user named mssql on the docker host. In the example below, we add a user mssql with ID 10001. This user is also added to the root group.
+- Use the default mssql user - To use the default mssql user, follow these steps:
+ - Add a user named `mssql` on the Docker host. In the example below, we add a user mssql with ID 10001. This user is also added to the root group.
+
+     ```bash
+     sudo useradd -M -s /bin/bash -u 10001 -g 0 mssql
+     ```
- - Change the permission on the directory/mount volume where the database file resides
+
+ - Change the permission on the directory/mount volume where the database file resides
+
+     ```bash
+     sudo chgrp -R 0 /var/lib/docker/volumes/kafka_sqldata/
+     sudo chmod -R g=u /var/lib/docker/volumes/kafka_sqldata/
+     ```
+
- Use a different non-root user account - To use a different non-root user account:
- - Update the container create options to specify add `*"User": "user_name | user_id*` key-value pair under container create options. Replace user_name or user_id with an actual user_name or user_id from your docker host.
+   - Update the container create options to add the `"User": "<user_name|user_id>"` key-value pair under container create options. Replace `user_name` or `user_id` with an actual user name or user ID from your Docker host.
   - Change the permissions on the directory/mount volume.

## Persist your data

Your Azure SQL Edge configuration changes and database files are persisted in the container even if you restart the container with `docker stop` and `docker start`. However, if you remove the container with `docker rm`, everything in the container is deleted, including Azure SQL Edge and your databases. The following section explains how to use **data volumes** to persist your database files even if the associated containers are deleted.
-> [!IMPORTANT]
-> For Azure SQL Edge, it is critical that you understand data persistence in Docker. In addition to the discussion in this section, see Docker's documentation on [how to manage data in Docker containers](https://docs.docker.com/engine/tutorials/dockervolumes/).
+> [!IMPORTANT]
+> For Azure SQL Edge, it's critical that you understand data persistence in Docker. In addition to the discussion in this section, see Docker's documentation on [how to manage data in Docker containers](https://docs.docker.com/engine/tutorials/dockervolumes/).
### Mount a host directory as data volume
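A minimal sketch of the host-directory mapping, assuming the image name used elsewhere in this article; the host path and password are placeholders:

```bash
docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" \
    -p 1433:1433 -v <host directory>:/var/opt/mssql \
    -d mcr.microsoft.com/azure-sql-edge
```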
This technique also enables you to share and view the files on the host outside of Docker.
-> [!IMPORTANT]
-> Host volume mapping for **Docker on Windows** does not currently support mapping the complete `/var/opt/mssql` directory. However, you can map a subdirectory, such as `/var/opt/mssql/data` to your host machine.
+> [!IMPORTANT]
+> Host volume mapping for **Docker on Windows** doesn't currently support mapping the complete `/var/opt/mssql` directory. However, you can map a subdirectory, such as `/var/opt/mssql/data` to your host machine.
-> [!IMPORTANT]
-> Host volume mapping for **Docker on Mac** with the Azure SQL Edge image is not supported at this time. Use data volume containers instead. This restriction is specific to the `/var/opt/mssql` directory. Reading from a mounted directory works fine. For example, you can mount a host directory using -v on Mac and restore a backup from a .bak file that resides on the host.
+> [!IMPORTANT]
+> Host volume mapping for **Docker on macOS** with the Azure SQL Edge image isn't supported at this time. Use data volume containers instead. This restriction is specific to the `/var/opt/mssql` directory. Reading from a mounted directory works fine. For example, you can mount a host directory using `-v` on macOS and restore a backup from a `.bak` file that resides on the host.
### Use data volume containers
```bash
docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" -p 1433:1433 -v sqlvolume:/var/opt/mssql -d mcr.microsoft.com/azure-sql-edge
```
-> [!NOTE]
-> This technique for implicitly creating a data volume in the run command does not work with older versions of Docker. In that case, use the explicit steps outlined in the Docker documentation, [Creating and mounting a data volume container](https://docs.docker.com/engine/tutorials/dockervolumes/#creating-and-mounting-a-data-volume-container).
+> [!NOTE]
+> This technique for implicitly creating a data volume in the run command doesn't work with older versions of Docker. In that case, use the explicit steps outlined in the Docker documentation, [Creating and mounting a data volume container](https://docs.docker.com/engine/tutorials/dockervolumes/#creating-and-mounting-a-data-volume-container).
Even if you stop and remove this container, the data volume persists. You can view it with the `docker volume ls` command.
If you then create another container with the same volume name, the new container uses the same Azure SQL Edge data contained in the volume.
To remove a data volume container, use the `docker volume rm` command.
-> [!WARNING]
+> [!WARNING]
> If you delete the data volume container, any Azure SQL Edge data in the container is *permanently* deleted.

## Next steps

- [Connect to Azure SQL Edge](connect.md)
azure-sql-edge Connect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/connect.md
Title: Connect and query Azure SQL Edge
description: Learn how to connect to and query Azure SQL Edge. Previously updated : 07/28/2023 Last updated : 09/14/2023

# Connect and query Azure SQL Edge
-In Azure SQL Edge, after you deploy a container, you can connect to the database engine from any of the following locations:
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+In Azure SQL Edge, after you deploy a container, you can connect to the Database Engine from any of the following locations:
- Inside the container
- From another Docker container running on the same host
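
However you connect, queries are standard T-SQL. As a minimal sketch, assuming `sqlcmd` is installed on the machine you're connecting from and the container publishes port 1433 (the server name and password are placeholders):

```bash
sqlcmd -S <host_name_or_IP>,1433 -U sa -P '<YourStrong!Passw0rd>' -Q 'SELECT @@version;'
```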
azure-sql-edge Create External Stream Transact Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/create-external-stream-transact-sql.md
Title: CREATE EXTERNAL STREAM (Transact-SQL) - Azure SQL Edge
description: Learn about the CREATE EXTERNAL STREAM statement in Azure SQL Edge. - Previously updated : 07/27/2020 Last updated : 09/14/2023

# CREATE EXTERNAL STREAM (Transact-SQL)
-The EXTERNAL STREAM object has a dual purpose of both an input and output stream. It can be used as an input to query streaming data from event ingestion services such as Azure Event Hub, Azure IoT Hub (or Edge Hub) or Kafka or it can be used as an output to specify where and how to store results from a streaming query.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+The EXTERNAL STREAM object has a dual purpose of both an input and output stream. It can be used as an input to query streaming data from event ingestion services such as Azure Event Hubs, Azure IoT Hub (or Edge Hub), or Kafka, or it can be used as an output to specify where and how to store results from a streaming query.
-An EXTERNAL STREAM can also be specified and created as both an output and input for services such as Event Hub or Blob storage. This facilitates chaining scenarios where a streaming query is persisting results to the external stream as output and another streaming query reading from the same external stream as input.
+An EXTERNAL STREAM can also be specified and created as both an output and input for services such as Event Hubs or Blob storage. This facilitates chaining scenarios, where one streaming query persists results to the external stream as output and another streaming query reads from the same external stream as input.
Azure SQL Edge currently only supports the following data sources as stream inputs and outputs.

| Data source type | Input | Output | Description |
-||-|--||
-| Azure IoT Edge hub | Y | Y | Data source to read and write streaming data to an Azure IoT Edge hub. For more information, see [IoT Edge Hub](../iot-edge/iot-edge-runtime.md#iot-edge-hub).|
-| SQL Database | N | Y | Data source connection to write streaming data to SQL Database. The database can be a local database in Azure SQL Edge, or a remote database in SQL Server or Azure SQL Database.|
-| Kafka | Y | N | Data source to read streaming data from a Kafka topic. Kafka support is not available for the ARM64 version of Azure SQL Edge.|
--
+| | | | |
+| Azure IoT Edge hub | Y | Y | Data source to read and write streaming data to an Azure IoT Edge hub. For more information, see [IoT Edge Hub](../iot-edge/iot-edge-runtime.md#iot-edge-hub). |
+| SQL Database | N | Y | Data source connection to write streaming data to SQL Database. The database can be a local database in Azure SQL Edge, or a remote database in SQL Server or Azure SQL Database. |
+| Kafka | Y | N | Data source to read streaming data from a Kafka topic. |
## Syntax

```syntaxsql
-CREATE EXTERNAL STREAM {external_stream_name}
-( <column_definition> [, <column_definition> ] * ) -- Used for Inputs - optional
+CREATE EXTERNAL STREAM { external_stream_name }
+( <column_definition> [ , <column_definition> ] * ) -- Used for Inputs - optional
WITH ( <with_options> )

<column_definition> ::=
  [ ( precision [ , scale ] | max ) ]

<with_options> ::=
- DATA_SOURCE = data_source_name,
- LOCATION = location_name,
- [FILE_FORMAT = external_file_format_name], --Used for Inputs - optional
- [<optional_input_options>],
- [<optional_output_options>],
+ DATA_SOURCE = data_source_name ,
+ LOCATION = location_name ,
+ [ FILE_FORMAT = external_file_format_name ] , --Used for Inputs - optional
+ [ <optional_input_options> ] ,
+ [ <optional_output_options> ] ,
  TAGS = <tag_column_value>

<optional_input_options> ::=
- INPUT_OPTIONS = '[<Input_options_data>]'
+ INPUT_OPTIONS = ' [ <input_options_data> ] '
<input_option_data> ::= <input_option_values> [ , <input_option_values> ]

<input_option_values> ::=
- PARTITIONS: [number_of_partitions]
+ PARTITIONS: [ number_of_partitions ]
  | CONSUMER_GROUP: [ consumer_group_name ]
  | TIME_POLICY: [ time_policy ]
  | LATE_EVENT_TOLERANCE: [ late_event_tolerance_value ]
  | OUT_OF_ORDER_EVENT_TOLERANCE: [ out_of_order_tolerance_value ]

<optional_output_options> ::=
- OUTPUT_OPTIONS = '[<output_option_data>]'
+ OUTPUT_OPTIONS = ' [ <output_option_data> ] '
<output_option_data> ::= <output_option_values> [ , <output_option_values> ]

<output_option_values> ::=
  REJECT_POLICY: [ reject_policy ]
  | MINIMUM_ROWS: [ row_value ]
- | MAXIMUM_TIME: [ time_value_minutes]
+ | MAXIMUM_TIME: [ time_value_minutes ]
  | PARTITION_KEY_COLUMN: [ partition_key_column_name ]
  | PROPERTY_COLUMNS: [ ( [ output_col_name ] ) ]
  | SYSTEM_PROPERTY_COLUMNS: [ ( [ output_col_name ] ) ]
```
## Arguments
+#### DATA_SOURCE
+
+For more information, see [DATA_SOURCE](/sql/t-sql/statements/create-external-data-source-transact-sql/).
+
+#### FILE_FORMAT
+
+For more information, see [FILE_FORMAT](/sql/t-sql/statements/create-external-file-format-transact-sql/).
+
+#### LOCATION
+
+Specifies the name for the actual data or location in the data source.
+
+- For Edge Hub or Kafka stream objects, location specifies the name of the Edge Hub or Kafka topic to read from or write to.
+- For SQL stream objects (SQL Server, Azure SQL Database, or Azure SQL Edge), the location specifies the name of the table. If the stream is created in the same database and schema as the destination table, then just the table name suffices. Otherwise, you need to fully qualify the table name (`<database_name>.<schema_name>.<table_name>`).
+- For Azure Blob Storage stream object location refers to the path pattern to use inside the blob container. For more information, see [Outputs from Azure Stream Analytics](../stream-analytics/blob-storage-azure-data-lake-gen2-output.md).
+
+#### INPUT_OPTIONS
+
+Specify options as key-value pairs for services such as Kafka and IoT Edge Hubs, which are inputs to streaming queries.
+
+- PARTITIONS:
+
+ Number of partitions defined for a topic. The maximum number of partitions that can be used is limited to 32 (Applies to Kafka Input Streams).
+
+- CONSUMER_GROUP:
+
+   Event and IoT Hubs limit the number of readers within one consumer group (to 5). Leaving this field empty uses the '$Default' consumer group.
+
+   - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- TIME_POLICY:
+
+   Describes whether to drop events or adjust the event time when late events or out-of-order events pass their tolerance value.
+
+   - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- LATE_EVENT_TOLERANCE:
+
+   The maximum acceptable time delay. The delay represents the difference between the event's timestamp and the system clock.
+
+   - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- OUT_OF_ORDER_EVENT_TOLERANCE:
+
+   Events can arrive out of order after they've made the trip from the input to the streaming query. These events can be accepted as-is, or you can choose to pause for a set period to reorder them.
+
+   - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+#### OUTPUT_OPTIONS
+
+Specify options as key-value pairs for supported services that are outputs to streaming queries
+
+- REJECT_POLICY: DROP | RETRY
+
+   Specifies the data error handling policies when data conversion errors occur.
+
+ - Applies to all supported outputs.
+
+- MINIMUM_ROWS:
+
+ Minimum rows required per batch written to an output. For Parquet, every batch creates a new file.
+
+ - Applies to all supported outputs.
+
+- MAXIMUM_TIME:
+
+ Maximum wait time in minutes per batch. After this time, the batch will be written to the output even if the minimum rows requirement isn't met.
+
+ - Applies to all supported outputs.
+
+- PARTITION_KEY_COLUMN:
+
+ The column that is used for the partition key.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- PROPERTY_COLUMNS:
+
+ A comma-separated list of the names of output columns that are attached to messages as custom properties, if provided.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- SYSTEM_PROPERTY_COLUMNS:
+
+ A JSON-formatted collection of name/value pairs of System Property names and output columns to be populated on Service Bus messages. For example, `{ "MessageId": "column1", "PartitionKey": "column2" }`.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- PARTITION_KEY:
+
+ The name of the output column containing the partition key. The partition key is a unique identifier for the partition within a given table that forms the first part of an entity's primary key. It's a string value that may be up to 1 KB in size.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- ROW_KEY:
+
+ The name of the output column containing the row key. The row key is a unique identifier for an entity within a given partition. It forms the second part of an entity's primary key. The row key is a string value that may be up to 1 KB in size.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- BATCH_SIZE:
+
+ This represents the number of transactions for table storage where the maximum can go up to 100 records. For Azure Functions, this represents the batch size in bytes sent to the function per call - default is 256 kB.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
+
+- MAXIMUM_BATCH_COUNT:
+
+ Maximum number of events sent to the function per call for Azure function - default is 100. For SQL Database, this represents the maximum number of records sent with every bulk insert transaction - default is 10,000.
+
+   - Applies to all SQL-based outputs.
+
+- STAGING_AREA: EXTERNAL DATA SOURCE object to Blob Storage
+
+   The staging area for high-throughput data ingestion into Azure Synapse Analytics.
+
+ - Reserved for future usage. Doesn't apply to Azure SQL Edge.
For more information about supported input and output options corresponding to the data source type, see [Azure Stream Analytics - Input Overview](../stream-analytics/stream-analytics-add-inputs.md) and [Azure Stream Analytics - Outputs Overview](../stream-analytics/stream-analytics-define-outputs.md) respectively.

## Examples
-### Example 1 - EdgeHub
-
-Type: Input or Output<br>
+### Example A: EdgeHub
-Syntax:
+Type: Input or Output.
```sql
CREATE EXTERNAL DATA SOURCE MyEdgeHub
-WITH
-(
- LOCATION = 'edgehub://'
-);
+ WITH (LOCATION = 'edgehub://');
CREATE EXTERNAL FILE FORMAT myFileFormat
-WITH (
- FORMAT_TYPE = JSON
-);
+ WITH (FORMAT_TYPE = JSON);
CREATE EXTERNAL STREAM Stream_A
-WITH
-(
- DATA_SOURCE = MyEdgeHub,
- FILE_FORMAT = myFileFormat,
- LOCATION = '<mytopicname>',
- OUTPUT_OPTIONS =
- 'REJECT_TYPE: Drop'
-);
+ WITH (
+ DATA_SOURCE = MyEdgeHub,
+ FILE_FORMAT = myFileFormat,
+ LOCATION = '<mytopicname>',
+ OUTPUT_OPTIONS = 'REJECT_TYPE: Drop'
+ );
```
-### Example 2 - Azure SQL Database, Azure SQL Edge, SQL Server
-
-Type: Output<br>
+### Example B: Azure SQL Database, Azure SQL Edge, SQL Server
-Syntax:
+Type: Output
```sql
CREATE DATABASE SCOPED CREDENTIAL SQLCredName
-WITH IDENTITY = '<user>',
-SECRET = '<password>';
+ WITH IDENTITY = '<user>',
+ SECRET = '<password>';
-- Azure SQL Database
CREATE EXTERNAL DATA SOURCE MyTargetSQLTable
-WITH
-(
- LOCATION = '<my_server_name>.database.windows.net',
- CREDENTIAL = SQLCredName
-);
+ WITH (
+ LOCATION = '<my_server_name>.database.windows.net',
+ CREDENTIAL = SQLCredName
+ );
--SQL Server or Azure SQL Edge
CREATE EXTERNAL DATA SOURCE MyTargetSQLTable
-WITH
-(
- LOCATION = ' <sqlserver://<ipaddress>,<port>',
- CREDENTIAL = SQLCredName
-);
+ WITH (
+    LOCATION = 'sqlserver://<ipaddress>,<port>',
+ CREDENTIAL = SQLCredName
+ );
CREATE EXTERNAL STREAM Stream_A
-WITH
-(
- DATA_SOURCE = MyTargetSQLTable,
- LOCATION = '<DatabaseName>.<SchemaName>.<TableName>' ,
- --Note: If table is contained in the database, <TableName> should be sufficient
- OUTPUT_OPTIONS =
- 'REJECT_TYPE: Drop'
-);
+ WITH (
+ DATA_SOURCE = MyTargetSQLTable,
+ LOCATION = '<DatabaseName>.<SchemaName>.<TableName>',
+ --Note: If table is contained in the database, <TableName> should be sufficient
+ OUTPUT_OPTIONS = 'REJECT_TYPE: Drop'
+ );
```
-### Example 3 - Kafka
+### Example C: Kafka
-Type: Input<br>
+Type: Input
-Syntax:
```sql
CREATE EXTERNAL DATA SOURCE MyKafka_tweets
-WITH
-(
- --The location maps to KafkaBootstrapServer
- LOCATION = 'kafka://<kafkaserver>:<ipaddress>',
- CREDENTIAL = kafkaCredName
-);
+ WITH (
+ --The location maps to KafkaBootstrapServer
+ LOCATION = 'kafka://<kafkaserver>:<ipaddress>',
+ CREDENTIAL = kafkaCredName
+ );
CREATE EXTERNAL FILE FORMAT myFileFormat
-WITH (
- FORMAT_TYPE = JSON,
- DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec'
-);
-
-CREATE EXTERNAL STREAM Stream_A (user_id VARCHAR, tweet VARCHAR)
-WITH
-(
- DATA_SOURCE = MyKafka_tweets,
- LOCATION = '<KafkaTopicName>',
- FILE_FORMAT = myFileFormat,
- INPUT_OPTIONS =
- 'PARTITIONS: 5'
-);
+ WITH (
+ FORMAT_TYPE = JSON,
+ DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec'
+ );
+
+CREATE EXTERNAL STREAM Stream_A (
+ user_id VARCHAR,
+ tweet VARCHAR
+ )
+ WITH (
+ DATA_SOURCE = MyKafka_tweets,
+ LOCATION = '<KafkaTopicName>',
+ FILE_FORMAT = myFileFormat,
+ INPUT_OPTIONS = 'PARTITIONS: 5'
+ );
```

## See also
azure-sql-edge Create Stream Analytics Job https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/create-stream-analytics-job.md
Title: Create a T-SQL streaming job in Azure SQL Edge
description: Learn about creating Stream Analytics jobs in Azure SQL Edge. - Previously updated : 07/27/2020 Last updated : 09/14/2023 -
+# Create a data streaming job in Azure SQL Edge
-# Create a data streaming job in Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
This article explains how to create a T-SQL streaming job in Azure SQL Edge. You create the external stream input and output objects, and then you define the streaming job query as part of the streaming job creation.
T-SQL streaming uses the external data source functionality of SQL Server to define the data sources associated with the external stream inputs and outputs of the streaming job. Use the following T-SQL commands to create an external stream input or output object:

- [CREATE EXTERNAL FILE FORMAT (Transact-SQL)](/sql/t-sql/statements/create-external-file-format-transact-sql)
- [CREATE EXTERNAL DATA SOURCE (Transact-SQL)](/sql/t-sql/statements/create-external-data-source-transact-sql)
- [CREATE EXTERNAL STREAM (Transact-SQL)](#example-create-an-external-stream-object-to-azure-sql-database)

Additionally, if Azure SQL Edge, SQL Server, or Azure SQL Database is used as an output stream, you need the [CREATE DATABASE SCOPED CREDENTIAL (Transact-SQL)](/sql/t-sql/statements/create-database-scoped-credential-transact-sql). This T-SQL command defines the credentials to access the database.
Azure SQL Edge currently only supports the following data sources as stream inputs and outputs.

| Data source type | Input | Output | Description |
-||-|--||
-| Azure IoT Edge hub | Y | Y | Data source to read and write streaming data to an Azure IoT Edge hub. For more information, see [IoT Edge Hub](../iot-edge/iot-edge-runtime.md#iot-edge-hub).|
-| SQL Database | N | Y | Data source connection to write streaming data to SQL Database. The database can be a local database in Azure SQL Edge, or a remote database in SQL Server or Azure SQL Database.|
-| Kafka | Y | N | Data source to read streaming data from a Kafka topic. This adapter is currently only available for Intel or AMD versions of Azure SQL Edge. It isn't available for the ARM64 version of Azure SQL Edge.|
+| | | | |
+| Azure IoT Edge hub | Y | Y | Data source to read and write streaming data to an Azure IoT Edge hub. For more information, see [IoT Edge Hub](../iot-edge/iot-edge-runtime.md#iot-edge-hub). |
+| SQL Database | N | Y | Data source connection to write streaming data to SQL Database. The database can be a local database in Azure SQL Edge, or a remote database in SQL Server or Azure SQL Database. |
+| Kafka | Y | N | Data source to read streaming data from a Kafka topic. |
### Example: Create an external stream input/output object for Azure IoT Edge hub
The following example creates an external stream object for Azure IoT Edge hub.
1. Create an external file format of the type JSON.
- ```sql
- Create External file format InputFileFormat
- WITH
- (
- format_type = JSON,
- )
- go
- ```
-
-2. Create an external data source for Azure IoT Edge hub. The following T-SQL script creates a data source connection to an IoT Edge hub that runs on the same Docker host as Azure SQL Edge.
-
- ```sql
- CREATE EXTERNAL DATA SOURCE EdgeHubInput
- WITH
- (
- LOCATION = 'edgehub://'
- )
- go
- ```
-
-3. Create the external stream object for Azure IoT Edge hub. The following T-SQL script creates a stream object for the IoT Edge hub. In case of an IoT Edge hub stream object, the LOCATION parameter is the name of the IoT Edge hub topic or channel being read or written to.
-
- ```sql
- CREATE EXTERNAL STREAM MyTempSensors
- WITH
- (
+ ```sql
+    CREATE EXTERNAL FILE FORMAT InputFileFormat
+ WITH (FORMAT_TYPE = JSON);
+ GO
+ ```
+
+1. Create an external data source for Azure IoT Edge hub. The following T-SQL script creates a data source connection to an IoT Edge hub that runs on the same Docker host as Azure SQL Edge.
+
+ ```sql
+ CREATE EXTERNAL DATA SOURCE EdgeHubInput
+ WITH (LOCATION = 'edgehub://');
+ GO
+ ```
+
+1. Create the external stream object for Azure IoT Edge hub. The following T-SQL script creates a stream object for the IoT Edge hub. In case of an IoT Edge hub stream object, the LOCATION parameter is the name of the IoT Edge hub topic or channel being read or written to.
+
+ ```sql
+ CREATE EXTERNAL STREAM MyTempSensors
+ WITH (
        DATA_SOURCE = EdgeHubInput,
        FILE_FORMAT = InputFileFormat,
        LOCATION = N'TemperatureSensors',
        INPUT_OPTIONS = N'',
        OUTPUT_OPTIONS = N''
- );
- go
- ```
+ );
+ GO
+ ```
### Example: Create an external stream object to Azure SQL Database
-The following example creates an external stream object to the local database in Azure SQL Edge.
+The following example creates an external stream object to the local database in Azure SQL Edge.
1. Create a master key on the database. This is required to encrypt the credential secret.
- ```sql
- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<<Strong_Password_For_Master_Key_Encryption>>';
- ```
+ ```sql
+ CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<<Strong_Password_For_Master_Key_Encryption>>';
+ ```
-2. Create a database-scoped credential for accessing the SQL Server source. The following example creates a credential to the external data source, with IDENTITY = username, and SECRET = password.
+1. Create a database-scoped credential for accessing the SQL Server source. The following example creates a credential to the external data source, with IDENTITY = username, and SECRET = password.
- ```sql
- CREATE DATABASE SCOPED CREDENTIAL SQLCredential
- WITH IDENTITY = '<SQL_Login>', SECRET = '<SQL_Login_PASSWORD>'
- go
- ```
+ ```sql
+ CREATE DATABASE SCOPED CREDENTIAL SQLCredential
+ WITH IDENTITY = '<SQL_Login>', SECRET = '<SQL_Login_PASSWORD>';
+ GO
+ ```
-3. Create an external data source with CREATE EXTERNAL DATA SOURCE. The following example:
+1. Create an external data source with CREATE EXTERNAL DATA SOURCE. The following example:
- * Creates an external data source named *LocalSQLOutput*.
- * Identifies the external data source (`LOCATION = '<vendor>://<server>[:<port>]'`). In the example, it points to a local instance of Azure SQL Edge.
- * Uses the credential created previously.
+ - Creates an external data source named *LocalSQLOutput*.
+ - Identifies the external data source (`LOCATION = '<vendor>://<server>[:<port>]'`). In the example, it points to a local instance of Azure SQL Edge.
+ - Uses the credential created previously.
- ```sql
- CREATE EXTERNAL DATA SOURCE LocalSQLOutput
- WITH
- (
+ ```sql
+ CREATE EXTERNAL DATA SOURCE LocalSQLOutput
+ WITH (
        LOCATION = 'sqlserver://tcp:.,1433',
        CREDENTIAL = SQLCredential
- )
- go
- ```
-
-4. Create the external stream object. The following example creates an external stream object pointing to a table *dbo.TemperatureMeasurements*, in the database *MySQLDatabase*.
-
- ```sql
- CREATE EXTERNAL STREAM TemperatureMeasurements
- WITH
- (
- DATA_SOURCE = LocalSQLOutput,
- LOCATION = N'MySQLDatabase.dbo.TemperatureMeasurements',
- INPUT_OPTIONS = N'',
- OUTPUT_OPTIONS = N''
- );
- ```
+ );
+ GO
+ ```
+
+1. Create the external stream object. The following example creates an external stream object pointing to a table *dbo.TemperatureMeasurements*, in the database *MySQLDatabase*.
+
+ ```sql
+ CREATE EXTERNAL STREAM TemperatureMeasurements
+ WITH
+ (
+ DATA_SOURCE = LocalSQLOutput,
+ LOCATION = N'MySQLDatabase.dbo.TemperatureMeasurements',
+ INPUT_OPTIONS = N'',
+ OUTPUT_OPTIONS = N''
+ );
+ ```
### Example: Create an external stream object for Kafka
-The following example creates an external stream object to the local database in Azure SQL Edge. This example assumes that the kafka server is configured for anonymous access.
+The following example creates an external stream object for a Kafka topic. This example assumes that the Kafka server is configured for anonymous access.
1. Create an external data source with CREATE EXTERNAL DATA SOURCE. The following example creates a data source that points to a Kafka bootstrap server:
- ```sql
- Create EXTERNAL DATA SOURCE [KafkaInput]
- With
- (
- LOCATION = N'kafka://<kafka_bootstrap_server_name_ip>:<port_number>'
- )
- GO
- ```
-2. Create an external file format for the kafka input. The following example created a JSON file format with GZipped Compression.
+ ```sql
+ CREATE EXTERNAL DATA SOURCE [KafkaInput]
+ WITH (LOCATION = N'kafka://<kafka_bootstrap_server_name_ip>:<port_number>');
+ GO
+ ```
+
+1. Create an external file format for the Kafka input. The following example creates a JSON file format with gzip compression.
```sql
- CREATE EXTERNAL FILE FORMAT JsonGzipped
- WITH
- (
- FORMAT_TYPE = JSON ,
- DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec'
- )
+ CREATE EXTERNAL FILE FORMAT JsonGzipped
+ WITH (
+ FORMAT_TYPE = JSON,
+ DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec'
+ );
+ GO
```
-3. Create the external stream object. The following example creates an external stream object pointing to Kafka topic `*TemperatureMeasurement*`.
+1. Create the external stream object. The following example creates an external stream object pointing to the Kafka topic `TemperatureMeasurement`.
- ```sql
- CREATE EXTERNAL STREAM TemperatureMeasurement
- WITH
- (
- DATA_SOURCE = KafkaInput,
- FILE_FORMAT = JsonGzipped,
- LOCATION = 'TemperatureMeasurement',
- INPUT_OPTIONS = 'PARTITIONS: 10'
- );
- ```
+ ```sql
+ CREATE EXTERNAL STREAM TemperatureMeasurement
+ WITH
+ (
+ DATA_SOURCE = KafkaInput,
+ FILE_FORMAT = JsonGzipped,
+ LOCATION = 'TemperatureMeasurement',
+ INPUT_OPTIONS = 'PARTITIONS: 10'
+ );
+ GO
+ ```
## Create the streaming job and the streaming queries

Use the `sys.sp_create_streaming_job` system stored procedure to define the streaming queries and create the streaming job. The `sp_create_streaming_job` stored procedure takes the following parameters:

-- `job_name`: The name of the streaming job. Streaming job names are unique across the instance.
-- `statement`: [Stream Analytics Query Language](/stream-analytics-query/stream-analytics-query-language-reference)-based streaming query statements.
+- `@job_name`: The name of the streaming job. Streaming job names are unique across the instance.
+- `@statement`: [Stream Analytics Query Language](/stream-analytics-query/stream-analytics-query-language-reference)-based streaming query statements.
The following example creates a simple streaming job with one streaming query. This query reads the inputs from the IoT Edge hub, and writes to `dbo.TemperatureMeasurements` in the database.

```sql
-EXEC sys.sp_create_streaming_job @name=N'StreamingJob1',
-@statement= N'Select * INTO TemperatureMeasurements from MyEdgeHubInput'
+EXEC sys.sp_create_streaming_job @name = N'StreamingJob1',
+    @statement = N'SELECT * INTO TemperatureMeasurements FROM MyEdgeHubInput'
```

The following example creates a more complex streaming job with multiple different queries. These queries include one that uses the built-in `AnomalyDetection_ChangePoint` function to identify anomalies in the temperature data.

```sql
-EXEC sys.sp_create_streaming_job @name=N'StreamingJob2', @statement=
-N' Select * INTO TemperatureMeasurements1 from MyEdgeHubInput1
-
-Select * Into TemperatureMeasurements2 from MyEdgeHubInput2
-
-Select * Into TemperatureMeasurements3 from MyEdgeHubInput3
-
-SELECT
-Timestamp as [Time],
-[Temperature] As [Temperature],
-GetRecordPropertyValue(AnomalyDetection_ChangePoint(Temperature, 80, 1200) OVER(LIMIT DURATION(minute, 20)), ''Score'') as ChangePointScore,
-GetRecordPropertyValue(AnomalyDetection_ChangePoint(Temperature, 80, 1200) OVER(LIMIT DURATION(minute, 20)), ''IsAnomaly'') as IsChangePointAnomaly
-INTO TemperatureAnomalies FROM MyEdgeHubInput2;
-'
-go
+EXEC sys.sp_create_streaming_job @name = N'StreamingJob2',
+ @statement = N'
+ SELECT *
+ INTO TemperatureMeasurements1
+ FROM MyEdgeHubInput1
+
+ SELECT *
+ INTO TemperatureMeasurements2
+ FROM MyEdgeHubInput2
+
+ SELECT *
+ INTO TemperatureMeasurements3
+ FROM MyEdgeHubInput3
+
+ SELECT timestamp AS [Time],
+ [Temperature] AS [Temperature],
+        GetRecordPropertyValue(AnomalyDetection_ChangePoint(Temperature, 80, 1200) OVER (LIMIT DURATION(minute, 20)), ''Score'') AS ChangePointScore,
+        GetRecordPropertyValue(AnomalyDetection_ChangePoint(Temperature, 80, 1200) OVER (LIMIT DURATION(minute, 20)), ''IsAnomaly'') AS IsChangePointAnomaly
+ INTO TemperatureAnomalies
+ FROM MyEdgeHubInput2;
+';
+GO
```

## Start, stop, drop, and monitor streaming jobs
To start a streaming job in Azure SQL Edge, run the `sys.sp_start_streaming_job` stored procedure. The stored procedure requires the name of the streaming job to start, as input.

```sql
-exec sys.sp_start_streaming_job @name=N'StreamingJob1'
-go
+EXEC sys.sp_start_streaming_job @name = N'StreamingJob1';
+GO
```

To stop a streaming job, run the `sys.sp_stop_streaming_job` stored procedure. The stored procedure requires the name of the streaming job to stop, as input.

```sql
-exec sys.sp_stop_streaming_job @name=N'StreamingJob1'
-go
+EXEC sys.sp_stop_streaming_job @name = N'StreamingJob1';
+GO
```

To drop (or delete) a streaming job, run the `sys.sp_drop_streaming_job` stored procedure. The stored procedure requires the name of the streaming job to drop, as input.

```sql
-exec sys.sp_drop_streaming_job @name=N'StreamingJob1'
-go
+EXEC sys.sp_drop_streaming_job @name = N'StreamingJob1';
+GO
```

To get the current status of a streaming job, run the `sys.sp_get_streaming_job` stored procedure. The stored procedure requires the name of the streaming job, as input. It outputs the name and the current status of the streaming job.

```sql
-exec sys.sp_get_streaming_job @name=N'StreamingJob1'
- WITH RESULT SETS
-(
- (
- name nvarchar(256),
- status nvarchar(256),
- error nvarchar(256)
- )
-)
+EXEC sys.sp_get_streaming_job @name = N'StreamingJob1'
+WITH RESULT SETS (
+ (
+ name NVARCHAR(256),
+ status NVARCHAR(256),
+ error NVARCHAR(256)
+ )
+ );
+GO
```

The streaming job can have any one of the following statuses:

| Status | Description |
-|--| |
+| --- | --- |
| Created | The streaming job was created, but hasn't yet been started. |
| Starting | The streaming job is in the starting phase. |
| Idle | The streaming job is running, but there's no input to process. |
| Processing | The streaming job is running, and is processing inputs. This state indicates a healthy state for the streaming job. |
-| Degraded | The streaming job is running, but there were some non-fatal errors during input processing. The input job will continue to run, but will drop inputs that encounter errors. |
+| Degraded | The streaming job is running, but there were some non-fatal errors during input processing. The input job continues to run, but will drop inputs that encounter errors. |
| Stopped | The streaming job has been stopped. |
| Failed | The streaming job failed. This is generally an indication of a fatal error during processing. |
-> [!NOTE]
-> Since the streaming job is executed asynchronously, the job might encounter errors at runtime. In order to troubleshoot a streaming job failure, use the `sys.sp_get_streaming_job` stored procedure, or review the docker log from the Azure SQL Edge container, which can provide the error details from the streaming job.
+> [!NOTE]
+> Since the streaming job is executed asynchronously, the job might encounter errors at runtime. In order to troubleshoot a streaming job failure, use the `sys.sp_get_streaming_job` stored procedure, or review the Docker log from the Azure SQL Edge container, which can provide the error details from the streaming job.
## Next steps

-- [View metadata associated with streaming jobs in Azure SQL Edge](streaming-catalog-views.md)
+- [View metadata associated with streaming jobs in Azure SQL Edge](streaming-catalog-views.md)
- [Create an external stream](create-external-stream-transact-sql.md)
azure-sql-edge Data Retention Cleanup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/data-retention-cleanup.md
Title: Manage historical data with retention policy - Azure SQL Edge
description: Learn how to manage historical data with retention policy in Azure SQL Edge - Previously updated : 09/04/2020 Last updated : 09/14/2023 keywords: - SQL Edge - data retention- - # Manage historical data with retention policy
-Data Retention can enabled on the database and any of the underlying tables individually, allowing users to create flexible aging policies for their tables and databases. Applying data retention is simple: it requires only one parameter to be set during table creation or as part of an alter table operation.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-After data retention policy is defined for a database and the underlying table, a background timer task runs to remove any obsolete records from the table enabled for data retention. Identification of matching rows and their removal from the table occur transparently, in the background task that is scheduled and run by the system. Age condition for the table rows is checked based on the column used as the `filter_column` in the table definition. If retention period, for example, is set to one week, table rows eligible for cleanup satisfy either of the following condition:
+After the data retention policy is defined for a database and the underlying table, a background timer task runs to remove any obsolete records from the table enabled for data retention. Identification of matching rows and their removal from the table occur transparently, in the background task scheduled and run by the system. The age condition for the table rows is checked based on the `filter_column` column specified in the table definition. If the retention period is set to one week, for instance, table rows eligible for cleanup satisfy either of the following conditions (a preview query follows the list):
-- If the filter column uses DATETIMEOFFSET data type then the condition is `filter_column < DATEADD(WEEK, -1, SYSUTCDATETIME())`-- Else then the condition is `filter_column < DATEADD(WEEK, -1, SYSDATETIME())`
+- If the filter column uses DATETIMEOFFSET data type, then the condition is `filter_column < DATEADD(WEEK, -1, SYSUTCDATETIME())`
+- Otherwise, the condition is `filter_column < DATEADD(WEEK, -1, SYSDATETIME())`
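
As an illustration only, the following query previews the rows that a one-week policy would treat as obsolete, for a hypothetical table `dbo.telemetry` whose filter column is a **datetime2** column named `event_time`:

```sql
-- Hypothetical preview: count the rows a one-week retention policy would
-- consider obsolete. Table and column names are illustrative only.
SELECT COUNT(*) AS eligible_rows
FROM dbo.telemetry
WHERE event_time < DATEADD(WEEK, -1, SYSDATETIME());
```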
## Data retention cleanup phases
-Data retention cleanup operation comprises of two phases.
-- Discovery Phase - In this phase the cleanup operation identifies all the tables within the user databases to build a list for cleanup. Discovery runs once a day.-- Cleanup Phase - In this phase, cleanup is run against all tables with finite data retention, identified in the discovery phase. If the cleanup operation cannot be performed on a table, then that table is skipped in the current run and will be retried in the next iteration. The following principles are used during cleanup
- - If an obsolete row is locked by another transaction, that row is skipped.
- - Cleanup runs with a default 5 seconds lock timeout setting. If the locks cannot be acquired on the tables within the timeout window, the table is skipped in the current run and will be retried in the next iteration.
- - If there is an error during cleanup of a table, that table is skipped and will be picked up in the next iteration.
+The data retention cleanup operation consists of two phases:
+
+1. **Discovery**: In this phase, the cleanup operation identifies all the tables within the user databases to build a list for cleanup. Discovery runs once a day.
+1. **Cleanup**: In this phase, cleanup is run against all tables with finite data retention, identified in the discovery phase. If the cleanup operation can't be performed on a table, then that table is skipped in the current run and will be retried in the next iteration. The following principles are used during cleanup:
+ - If an obsolete row is locked by another transaction, that row is skipped.
+ - Cleanup runs with a default lock timeout of 5 seconds. If the locks can't be acquired on the tables within the timeout window, the table is skipped in the current run and will be retried in the next iteration.
+ - If there's an error during cleanup of a table, that table is skipped and will be picked up in the next iteration.
## Manual cleanup
-Depending on the data retention settings on a table and the nature of the workload on the database, it's possible that the automatic cleanup thread may not completely remove all obsolete rows during its run. To assist with this and allow users to manually remove obsolete rows, the `sys.sp_cleanup_data_retention` stored procedure has been introduced in Azure SQL Edge.
+Depending on the data retention settings on a table and the nature of the workload on the database, it's possible that the automatic cleanup thread may not completely remove all obsolete rows during its run. To allow users to manually remove obsolete rows, the `sys.sp_cleanup_data_retention` stored procedure has been introduced in Azure SQL Edge.
+
+This stored procedure takes three parameters:
+
+- `@schema_name`: Name of the owning schema for the table. Required.
+- `@table_name`: Name of the table for which manual cleanup is being run. Required.
+- `@rowcount`: Output variable. Returns the number of rows cleaned up by the manual cleanup stored procedure. Optional.
-This stored procedure takes three parameters.
- - Schema Name - Name of the owning schema for the table. This is a required parameter.
- - Table Name - Name of the table for which manual cleanup is being run. This is a required parameter.
- - rowcount (Output) - output variable. Returns the number of rows cleaned up by the manual cleanup sp. This is an optional parameter.
+For more information, see [sys.sp_cleanup_data_retention (Transact-SQL)](sys-sp-cleanup-data-retention.md).
The following example shows the execution of the manual cleanup stored procedure for the table `dbo.data_retention_table`.

```sql
-declare @rowcnt bigint
-EXEC sys.sp_cleanup_data_retention 'dbo', 'data_retention_table', @rowcnt output
-select @rowcnt
+DECLARE @rowcnt BIGINT;
+EXEC sys.sp_cleanup_data_retention 'dbo', 'data_retention_table', @rowcnt OUTPUT;
+SELECT @rowcnt;
```

## How obsolete rows are deleted
-The cleanup process depends on the index layout of the table. A background task is created to perform obsolete data cleanup for all tables with finite retention period. Clean up logic for the rowstore (B-tree or Heap) index deletes aged row in smaller chunks (up to 10K) minimizing pressure on database log and IO subsystem. Although cleanup logic utilizes required B-tree index, order of deletions for the rows older than retention period cannot be firmly guaranteed. Hence, do not take any dependency on the cleanup order in your applications.
+The cleanup process depends on the index layout of the table. A background task is created to perform obsolete data cleanup for all tables with a finite retention period. Cleanup logic for the rowstore (heap or B-tree) index deletes aged rows in smaller chunks (up to 10,000), minimizing pressure on the database log and the I/O subsystem. Although the cleanup logic utilizes the required B-tree index, the order of deletions for rows older than the retention period can't be firmly guaranteed. In other words, don't take a dependency on the cleanup order in your applications.
-The cleanup task for the clustered columnstore removes entire row groups at once (typically contain 1 million of rows each), which is very efficient, especially when data is generated and ages out at a high pace.
+> [!WARNING]
+> In the case of heaps and B-tree indexes, data retention runs a delete query on the underlying tables, which can conflict with delete triggers on the tables. You should either remove delete triggers from the tables, or avoid using data retention on tables that have delete DML triggers.
-![Data Retention Cleanup](./media/data-retention-cleanup/data-retention-cleanup.png)
+The cleanup task for clustered columnstore indexes removes entire row groups at once (each typically containing 1 million rows). This approach is efficient, especially when data is generated and ages out at a high pace.
-Excellent data compression and efficient retention cleanup makes clustered columnstore index a perfect choice for scenarios when your workload rapidly generates high amount of data.
-> [!Note]
-> In the case of B-Tree Indexes and heaps, data retention runs a delete query on the underlying tables, which can conflict with delete triggers on the tables. It is recommended to either remove delete triggers from the tables or to not enable data retention on tables that have delete DML trigger.
+Excellent data compression and efficient retention cleanup make clustered columnstore indexes a perfect choice for scenarios where your workload rapidly generates a large amount of data.
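
As a sketch only, the following `CREATE TABLE` statement combines an inline clustered columnstore index with a one-month retention policy; all object names here are hypothetical:

```sql
-- Hypothetical sketch: a retention-enabled telemetry table with a
-- clustered columnstore index, suited to high-volume, fast-aging data.
CREATE TABLE [dbo].[sensor_readings] (
    [event_time] DATETIME2(7),
    [sensor_id] INT,
    [reading] FLOAT,
    INDEX [cci_sensor_readings] CLUSTERED COLUMNSTORE
)
WITH (
    DATA_DELETION = ON (
        FILTER_COLUMN = [event_time],
        RETENTION_PERIOD = 1 month
    )
);
```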
-## Monitoring data retention cleanup
+## Monitor data retention cleanup
-Data retention policy cleanup operations can be monitored using extended events (XEvents) in Azure SQL Edge. For more information on extended events, refer [XEvents Overview](/sql/relational-databases/extended-events/extended-events).
+Data retention policy cleanup operations can be monitored using Extended Events in Azure SQL Edge. For more information on extended events, see [Extended Events Overview](/sql/relational-databases/extended-events/extended-events).
-The following six extended events help track the state of the cleanup operations.
+The following Extended Events help track the state of the cleanup operations.
| Name | Description |
-|| |
-| data_retention_task_started | Occurs when background task for cleanup of tables with retention policy starts. |
-| data_retention_task_completed | Occurs when background task for cleanup of tables with retention policy ends. |
-| data_retention_task_exception | Occurs when background task for cleanup of tables with retention policy fails outside of retention cleanup process specific to table. |
-| data_retention_cleanup_started | Occurs when clean up process of table with data retention policy starts. |
-| data_retention_cleanup_exception | Occurs cleanup process of table with retention policy fails. |
-| data_retention_cleanup_completed | Occurs when clean up process of table with data retention policy ends. |
+| --- | --- |
+| data_retention_task_started | Occurs when the background task for cleanup of tables with a retention policy starts. |
+| data_retention_task_completed | Occurs when the background task for cleanup of tables with a retention policy ends. |
+| data_retention_task_exception | Occurs when the background task for cleanup of tables with a retention policy fails, outside of retention cleanup process specific to those tables. |
+| data_retention_cleanup_started | Occurs when the cleanup process of a table with data retention policy starts. |
+| data_retention_cleanup_exception | Occurs when the cleanup process of a table with retention policy fails. |
+| data_retention_cleanup_completed | Occurs when the cleanup process of a table with data retention policy ends. |
-Additionally, a new ring buffer type named `RING_BUFFER_DATA_RETENTION_CLEANUP` has been added to sys.dm_os_ring_buffers dynamic management view. This view can be used to monitor the data retention cleanup operations.
+Additionally, a new ring buffer type named `RING_BUFFER_DATA_RETENTION_CLEANUP` has been added to the `sys.dm_os_ring_buffers` dynamic management view. This view can be used to monitor the data retention cleanup operations.
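
For example, a query along the following lines returns the raw records from that ring buffer; the exact shape of the XML payload in the `record` column isn't documented here and may vary:

```sql
-- Inspect data retention cleanup records captured in the ring buffer.
SELECT record
FROM sys.dm_os_ring_buffers
WHERE ring_buffer_type = N'RING_BUFFER_DATA_RETENTION_CLEANUP';
```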
+## Next steps
-## Next Steps
- [Data Retention Policy](data-retention-overview.md)
- [Enable and Disable Data Retention Policies](data-retention-enable-disable.md)
azure-sql-edge Data Retention Enable Disable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/data-retention-enable-disable.md
Title: Enable and disable data retention policies - Azure SQL Edge
description: Learn how to enable and disable data retention policies in Azure SQL Edge - Previously updated : 09/04/2020 Last updated : 09/14/2023 keywords: - SQL Edge - data retention- - # Enable and disable data retention policies
-This topic describes how to enable and disable data retention policies for a database and a table.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+This article describes how to enable and disable data retention policies for a database and a table.
## Enable data retention for a database
-The following example shows how to enable data retention by using [Alter Database](/sql/t-sql/statements/alter-database-transact-sql-set-options).
+The following example shows how to enable data retention by using [ALTER DATABASE](/sql/t-sql/statements/alter-database-transact-sql-set-options).
```sql
-ALTER DATABASE [<DatabaseName>] SET DATA_RETENTION ON;
+ALTER DATABASE [<DatabaseName>] SET DATA_RETENTION ON;
```

## Check if data retention is enabled for a database
-The following command can be used to check if data retention is enabled for a database
+The following command can be used to check if data retention is enabled for a database.
+ ```sql
-SELECT is_data_retention_enabled, name
+SELECT is_data_retention_enabled,
+ name
FROM sys.databases; ``` ## Enable data retention for a table
-Data Retention must be enabled for each table for which you want data to be automatically purged. When Data Retention is enabled on the database and the table, a background system task will periodically scan the table to identify and delete any obsolete (aged) rows. Data Retention can be enabled on a table either during table creation using [Create Table](/sql/t-sql/statements/create-table-transact-sql) or by using [Alter Table](/sql/t-sql/statements/alter-table-transact-sql).
+Data Retention must be enabled for each table for which you want data to be automatically purged. When data retention is enabled on the database and the table, a background system task periodically scans the table to identify and delete any obsolete (aged) rows. Data retention can be enabled on a table either during table creation using [CREATE TABLE](/sql/t-sql/statements/create-table-transact-sql) or by using [ALTER TABLE](/sql/t-sql/statements/alter-table-transact-sql).
-The following example shows how to enable data retention for a table by using [Create Table](/sql/t-sql/statements/create-table-transact-sql).
+The following example shows how to enable data retention for a table by using [CREATE TABLE](/sql/t-sql/statements/create-table-transact-sql).
```sql
-CREATE TABLE [dbo].[data_retention_table]
-(
-[dbdatetime2] datetime2(7),
-[product_code] int,
-[value] char(10),
-CONSTRAINT [pk_current_data_retention_table] PRIMARY KEY CLUSTERED ([product_code])
-) WITH (DATA_DELETION = ON ( FILTER_COLUMN = [dbdatetime2], RETENTION_PERIOD = 1 day ) )
+CREATE TABLE [dbo].[data_retention_table] (
+ [dbdatetime2] DATETIME2(7),
+ [product_code] INT,
+ [value] CHAR(10),
+ CONSTRAINT [pk_current_data_retention_table] PRIMARY KEY CLUSTERED ([product_code])
+)
+WITH (
+ DATA_DELETION = ON (
+ FILTER_COLUMN = [dbdatetime2],
+ RETENTION_PERIOD = 1 day
+ )
+ );
```
-The `WITH (DATA_DELETION = ON ( FILTER_COLUMN = [dbdatetime2], RETENTION_PERIOD = 1 day ) )` part of the create table command sets the data retention on the table. The command uses the following required parameters
+The `WITH (DATA_DELETION = ON (FILTER_COLUMN = [dbdatetime2], RETENTION_PERIOD = 1 day))` part of the CREATE TABLE command sets the data retention on the table. The command uses the following required parameters:
+
+- DATA_DELETION: Indicates whether data retention is ON or OFF.
-- DATA_DELETION - Indicates whether data retention is ON or OFF.-- FILTER_COLUMN - Name on the column in the table, which will be used to ascertain if the rows are obsolete or not. The filter column can only be a column with the following data types
- - Date
- - SmallDateTime
- - DateTime
- - DateTime2
- - DateTimeOffset
-- RETENTION_PERIOD - An integer value followed by a unit descriptor. The allowed units are DAY, DAYS, WEEK, WEEKS, MONTH, MONTHS, YEAR and YEARS.
+- FILTER_COLUMN: Name of the column in the table that is used to determine whether rows are obsolete. The filter column can only be a column with the following data types:
-The following example shows how to enable data retention for table by using [Alter Table](/sql/t-sql/statements/alter-table-transact-sql).
+ - **date**
+ - **smalldatetime**
+ - **datetime**
+ - **datetime2**
+ - **datetimeoffset**
+
+- RETENTION_PERIOD: An integer value followed by a unit descriptor. The allowed units are DAY, DAYS, WEEK, WEEKS, MONTH, MONTHS, YEAR and YEARS.
+
+The following example shows how to enable data retention for a table by using [ALTER TABLE](/sql/t-sql/statements/alter-table-transact-sql).
```sql
-Alter Table [dbo].[data_retention_table]
-SET (DATA_DELETION = On (FILTER_COLUMN = [timestamp], RETENTION_PERIOD = 1 day))
+ALTER TABLE [dbo].[data_retention_table]
+SET (
+ DATA_DELETION = ON (
+ FILTER_COLUMN = [timestamp],
+ RETENTION_PERIOD = 1 day
+ )
+)
```

## Check if data retention is enabled for a table
The following command can be used to check the tables for which data retention is enabled.

```sql
-select name, data_retention_period, data_retention_period_unit from sys.tables
+SELECT name,
+ data_retention_period,
+ data_retention_period_unit
+FROM sys.tables;
```
-A value of data_retention_period = -1 and data_retention_period_unit as INFINITE, indicates that data retention is not set on the table.
+A value of `data_retention_period = -1` and `data_retention_period_unit` of INFINITE indicates that data retention isn't set on the table.
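
For example, to list only the tables that have a finite retention policy configured, you can filter on that sentinel value:

```sql
-- List only tables with a finite retention period.
SELECT name,
    data_retention_period,
    data_retention_period_unit
FROM sys.tables
WHERE data_retention_period <> -1;
```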
-The following query can be used to identify the column used as the filter_column for data retention.
+The following query can be used to identify the column used as the `filter_column` for data retention.
```sql
-Select name from sys.columns
-where is_data_deletion_filter_column =1
-and object_id = object_id(N'dbo.data_retention_table', N'U')
+SELECT name
+FROM sys.columns
+WHERE is_data_deletion_filter_column = 1
+ AND object_id = object_id(N'dbo.data_retention_table', N'U');
```
-## Correlating DB and table data retention settings
+## Correlate database and table data retention settings
-The data retention setting on the database and the table, are used in conjunction to determine if autocleanup for aged rows will run on the tables or not.
+The data retention setting on the database and the table are used in conjunction to determine if autocleanup for aged rows runs on the tables.
-|Database Option | Table Option | Behavior |
-|-|--|-|
-| OFF | OFF | Data Retention policy is disabled and both auto and manual cleanup of aged records is disabled.|
-| OFF | ON | Data Retention policy is enabled for the table. Auto cleanup of obsolete records is disabled, however manual cleanup method can be used to cleanup obsolete records. |
-| ON | OFF | Data Retention policy is enabled at the database level. However since the option is disabled at the table level, there is no retention-based cleanup of aged rows.|
-| ON | ON | Data Retention policy is enabled for both the database and tables. Automatic cleanup of obsolete records is enabled. |
+| Database option | Table option | Behavior |
+| --- | --- | --- |
+| OFF | OFF | Data retention policy is disabled and both auto and manual cleanup of aged records is disabled. |
+| OFF | ON | Data retention policy is enabled for the table. Auto cleanup of obsolete records is disabled, however manual cleanup method can be used to clean up obsolete records. |
+| ON | OFF | Data retention policy is enabled at the database level. However since the option is disabled at the table level, there's no retention-based cleanup of aged rows. |
+| ON | ON | Data retention policy is enabled for both the database and tables. Automatic cleanup of obsolete records is enabled. |
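
For example, the following query (a sketch that combines the catalog views shown earlier in this article) displays the database-level setting alongside each table's retention configuration in the current database:

```sql
-- Compare the database-level data retention setting with each
-- table's retention configuration in the current database.
SELECT DB_NAME() AS database_name,
    d.is_data_retention_enabled,
    t.name AS table_name,
    t.data_retention_period,
    t.data_retention_period_unit
FROM sys.databases AS d
CROSS JOIN sys.tables AS t
WHERE d.database_id = DB_ID();
```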
## Disable data retention on a table
-Data Retention can be disabled on a table by using [Alter Table](/sql/t-sql/statements/alter-table-transact-sql). The following command can be used to disable data retention on a table.
+Data retention can be disabled on a table by using [ALTER TABLE](/sql/t-sql/statements/alter-table-transact-sql). The following command can be used to disable data retention on a table.
```sql
-Alter Table [dbo].[data_retention_table]
-Set (DATA_DELETION = OFF)
+ALTER TABLE [dbo].[data_retention_table]
+SET (DATA_DELETION = OFF);
```

## Disable data retention on a database
-Data Retention can be disabled on a table by using [Alter Database](/sql/t-sql/statements/alter-database-transact-sql-set-options). The following command can be used to disable data retention on a database.
+Data retention can be disabled on a table by using [ALTER DATABASE](/sql/t-sql/statements/alter-database-transact-sql-set-options). The following command can be used to disable data retention on a database.
```sql
-ALTER DATABASE [<DatabaseName>] SET DATA_RETENTION OFF;
+ALTER DATABASE [<DatabaseName>] SET DATA_RETENTION OFF;
```

## Next steps

- [Data Retention and Automatic Data Purging](data-retention-overview.md)
- [Manage historical data with retention policy](data-retention-cleanup.md)
azure-sql-edge Data Retention Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/data-retention-overview.md
Title: Data retention policy overview - Azure SQL Edge
description: Learn about the data retention policy in Azure SQL Edge - Previously updated : 09/04/2020 Last updated : 09/14/2023 keywords: - SQL Edge - data retention- - # Data retention overview
-Collection and storage of data from connected IoT devices is important to drive and gain operational and business insights. However given the volume of data originating from these devices, it becomes important for organizations to carefully plan the amount of data they want to retain and at what granularity. While retaining all data at all granularity is desirable, it's not always practical. Additionally, the volume of data that can be retained is constrained by the amount of storage available on the IoT or Edge devices.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+Collection and storage of data from connected IoT devices is important to drive and gain operational and business insights. However, with the volume of data originating from these devices, it becomes important for organizations to carefully plan the amount of data they want to retain and at what granularity. While retaining all data at all granularity is desirable, it's not always practical. Additionally, the volume of data that can be retained is constrained by the amount of storage available on the IoT or Edge devices.
-In Azure SQL Edge database administrators can define data retention policy on a SQL Edge database and its underlying tables. Once the data retention policy is defined, a background system task will run to purge any obsolete (old) data from the user tables.
+In Azure SQL Edge, database administrators can define data retention policy on a SQL Edge database and its underlying tables. Once the data retention policy is defined, a background system task runs to purge any obsolete (old) data from the user tables.
-> [!Note]
-> Data once purged from the table, is not recoverable. The only possible way to recover the purged data is to restore the database from an older backup.
+> [!NOTE]
+> Data once purged from the table, isn't recoverable. The only possible way to recover the purged data is to restore the database from an older backup.
Quickstarts:
## How data retention works
-To configure data retention, you can use DDL statements. For more information, [Enable and Disable Data Retention Policies](data-retention-enable-disable.md). For automatic deletion of the obsolete records, data retention must first be enabled for both the database and the tables that you want to be purged within that database.
+To configure data retention, you can use DDL statements. For more information, see [Enable and Disable Data Retention Policies](data-retention-enable-disable.md). For automatic deletion of the obsolete records, data retention must first be enabled for both the database and the tables that you want to be purged within that database.
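
As a minimal sketch (with hypothetical database, table, and column names), enabling retention at both levels looks like the following; see the linked article for the full syntax:

```sql
-- Enable data retention on the database, then on a table with a
-- one-week aging policy. All names are illustrative.
ALTER DATABASE [MyEdgeDatabase] SET DATA_RETENTION ON;

ALTER TABLE [dbo].[telemetry]
SET (
    DATA_DELETION = ON (
        FILTER_COLUMN = [event_time],
        RETENTION_PERIOD = 1 week
    )
);
```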
-After data retention is configured for a table, a background task runs to identify the obsolete records in a table and delete those records. If for some reason, the automatic cleanup of the tasks is not running or is unable to keep up with the deletes, then a manual cleanup operation can be performed on these tables. For more information on automatic and manual cleanups, refer [Automatic and Manual Cleanup](data-retention-cleanup.md).
+After data retention is configured for a table, a background task runs to identify the obsolete records in a table and delete those records. If, for some reason, the automatic cleanup task isn't running or is unable to keep up with the delete operations, then a manual cleanup operation can be performed on these tables. For more information on automatic and manual cleanups, see [Automatic and Manual Cleanup](data-retention-cleanup.md).
-## Limitations and restrictions
+## Limitations
-- Data Retention, if enabled, is automatically disabled when the database is restored from a full backup or is reattached. -- Data Retention cannot be enabled for a Temporal History Table-- Data Retention filter colomn cannot be altered. To alter the column, disable data retention on the table.
+- Data retention, if enabled, is automatically disabled when the database is restored from a full backup or is reattached.
+- Data retention can't be enabled for a temporal history table.
+- The data retention filter column can't be altered. To alter the column, disable data retention on the table.
-## Next Steps
+## Next steps
- [Machine Learning and Artificial Intelligence with ONNX in SQL Edge](onnx-overview.md)
- [Building an end to end IoT Solution with SQL Edge using IoT Edge](tutorial-deploy-azure-resources.md)
azure-sql-edge Date Bucket Tsql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/date-bucket-tsql.md
Title: Date_Bucket (Transact-SQL) - Azure SQL Edge
-description: Learn about using Date_Bucket in Azure SQL Edge
+ Title: DATE_BUCKET (Transact-SQL) - Azure SQL Edge
+description: Learn about using DATE_BUCKET in Azure SQL Edge
- Previously updated : 09/03/2020 Last updated : 09/14/2023 keywords:
- - Date_Bucket
+ - DATE_BUCKET
- SQL Edge-
+# DATE_BUCKET (Transact-SQL)
-# Date_Bucket (Transact-SQL)
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-This function returns the datetime value corresponding to the start of each datetime bucket, from the timestamp defined by the `origin` parameter or the default origin value of `1900-01-01 00:00:00.000` if the origin parameter is not specified.
+This function returns the datetime value corresponding to the start of each datetime bucket, from the timestamp defined by the `origin` parameter or the default origin value of `1900-01-01 00:00:00.000` if the origin parameter isn't specified.
-See [Date and Time Data Types and Functions &#40;Transact-SQL&#41;](/sql/t-sql/functions/date-and-time-data-types-and-functions-transact-sql/) for an overview of all Transact-SQL date and time data types and functions.
+See [Date and Time Data Types and Functions (Transact-SQL)](/sql/t-sql/functions/date-and-time-data-types-and-functions-transact-sql/) for an overview of all Transact-SQL date and time data types and functions.
[Transact-SQL Syntax Conventions](/sql/t-sql/language-elements/transact-sql-syntax-conventions-transact-sql/)

## Syntax

```syntaxsql
-DATE_BUCKET (datepart, number, date, origin)
+DATE_BUCKET (datePart , number , date , origin)
```

## Arguments
-*datepart*
+#### *datePart*
-The part of *date* that is used with the ΓÇÿnumberΓÇÖ parameter. Ex. Year, month, minute, second etc.
+The part of *date* that is used with the *number* parameter, as shown in the following table. `DATE_BUCKET` doesn't accept user-defined variable equivalents for the *datePart* arguments.
-> [!NOTE]
-> `DATE_BUCKET` does not accept user-defined variable equivalents for the *datePart* arguments.
+| *datePart* | Abbreviations |
+| | |
+| **day** | **dd**, **d** |
+| **week** | **wk**, **ww** |
+| **month** | **mm**, **m** |
+| **quarter** | **qq**, **q** |
+| **year** | **yy**, **yyyy** |
+| **hour** | **hh** |
+| **minute** | **mi**, **n** |
+| **second** | **ss**, **s** |
+| **millisecond** | **ms** |
-|*datePart*|Abbreviations|
-|||
-|**day**|**dd**, **d**|
-|**week**|**wk**, **ww**|
-|**month**|**mm**, **m**|
-|**quarter**|**qq**, **q**|
-|**year**|**yy**, **yyyy**|
-|**hour**|**hh**|
-|**minute**|**mi**, **n**|
-|**second**|**ss**, **s**|
-|**millisecond**|**ms**|
+#### *number*
-*number*
+The **integer** number that decides the width of the bucket, combined with the *datePart* argument. This represents the width of the *datePart* buckets from the origin time. This argument has to be a *positive* integer value.
-The *integer* number that decides the width of the bucket combined with *datepart* argument. This represents the width of the datepart buckets from the origin time. **`This argument has to be a positive integer value`**.
-
-*date*
+#### *date*
An expression that can resolve to one of the following values:
-+ **date**
-+ **datetime**
-+ **datetime2**
-+ **datetimeoffset**
-+ **smalldatetime**
-+ **time**
+- **date**
+- **datetime**
+- **datetime2**
+- **datetimeoffset**
+- **smalldatetime**
+- **time**
-For *date*, `DATE_BUCKET` will accept a column expression, expression, or user-defined variable if they resolve to any of the data types mentioned above.
+For *date*, `DATE_BUCKET` accepts a column expression, expression, or user-defined variable if they resolve to any of the data types mentioned previously.
-**Origin**
+#### *origin*
An optional expression that can resolve to one of the following values:
-+ **date**
-+ **datetime**
-+ **datetime2**
-+ **datetimeoffset**
-+ **smalldatetime**
-+ **time**
+- **date**
+- **datetime**
+- **datetime2**
+- **datetimeoffset**
+- **smalldatetime**
+- **time**
-The data type for `Origin` should match the data type of the `Date` parameter.
+The data type for *origin* should match the data type of the *date* parameter.
-`DATE_BUCKET` uses a default origin date value of `1900-01-01 00:00:00.000` i.e. 12:00 AM on Monday, January 1 1900, if no Origin value is specified for the function.
+`DATE_BUCKET` uses a default origin date value of `1900-01-01 00:00:00.000`, that is, 12:00 AM on Monday, January 1, 1900, if no *origin* value is specified for the function.
-## Return Type
+## Return type
-The return value data type for this method is dynamic. The return type depends on the argument supplied for `date`. If a valid input data type is supplied for `date`, `DATE_BUCKET` returns the same data type. `DATE_BUCKET` raises an error if a string literal is specified for the `date` parameter.
+The return value data type for this method is dynamic. The return type depends on the argument supplied for *date*. If a valid input data type is supplied for *date*, `DATE_BUCKET` returns the same data type. `DATE_BUCKET` raises an error if a string literal is specified for the *date* parameter.
-## Return Values
+## Return values
-### Understanding the output from `DATE_BUCKET`
+### Understand the output from `DATE_BUCKET`
-`Date_Bucket` returns the latest date or time value, corresponding to the datePart and number parameter. For example, in the expressions below, `Date_Bucket` will return the output value of `2020-04-13 00:00:00.0000000`, as the output is calculated based on one week buckets from the default origin time of `1900-01-01 00:00:00.000`. The value `2020-04-13 00:00:00.0000000` is 6276 weeks from the origin value of `1900-01-01 00:00:00.000`.
+`DATE_BUCKET` returns the latest date or time value, corresponding to the *datePart* and *number* parameters. For example, in the following expressions, `DATE_BUCKET` returns the output value of `2020-04-13 00:00:00.0000000`, as the output is calculated based on one week buckets from the default origin time of `1900-01-01 00:00:00.000`. The value `2020-04-13 00:00:00.0000000` is 6276 weeks from the origin value of `1900-01-01 00:00:00.000`.
```sql
-declare @date datetime2 = '2020-04-15 21:22:11';
-Select DATE_BUCKET(WEEK, 1, @date);
+DECLARE @date DATETIME2 = '2020-04-15 21:22:11';
+
+SELECT DATE_BUCKET(WEEK, 1, @date);
```
-For all the expressions below, the same output value of `2020-04-13 00:00:00.0000000` will be returned. This is because `2020-04-13 00:00:00.0000000` is 6276 weeks from the origin date and 6276 is divisible by 2, 3, 4 and 6.
+For all the following expressions, the same output value of `2020-04-13 00:00:00.0000000` is returned. This is because `2020-04-13 00:00:00.0000000` is 6276 weeks from the origin date and 6276 is divisible by 2, 3, 4 and 6.
```sql
-declare @date datetime2 = '2020-04-15 21:22:11';
-Select DATE_BUCKET(WEEK, 2, @date);
-Select DATE_BUCKET(WEEK, 3, @date);
-Select DATE_BUCKET(WEEK, 4, @date);
-Select DATE_BUCKET(WEEK, 6, @date);
+DECLARE @date DATETIME2 = '2020-04-15 21:22:11';
+
+SELECT DATE_BUCKET(WEEK, 2, @date);
+SELECT DATE_BUCKET(WEEK, 3, @date);
+SELECT DATE_BUCKET(WEEK, 4, @date);
+SELECT DATE_BUCKET(WEEK, 6, @date);
```

The output for the expression below is `2020-04-06 00:00:00.0000000`, which is 6275 weeks from the default origin time `1900-01-01 00:00:00.000`.

```sql
-declare @date datetime2 = '2020-04-15 21:22:11';
-Select DATE_BUCKET(WEEK, 5, @date);
+DECLARE @date DATETIME2 = '2020-04-15 21:22:11';
+
+SELECT DATE_BUCKET(WEEK, 5, @date);
```

The output for the expression below is `2020-06-09 00:00:00.0000000`, which is 75 weeks from the specified origin time `2019-01-01 00:00:00`.

```sql
-declare @date datetime2 = '2020-06-15 21:22:11';
-declare @origin datetime2 = '2019-01-01 00:00:00';
-Select DATE_BUCKET(WEEK, 5, @date, @origin);
+DECLARE @date DATETIME2 = '2020-06-15 21:22:11';
+DECLARE @origin DATETIME2 = '2019-01-01 00:00:00';
+
+SELECT DATE_BUCKET(WEEK, 5, @date, @origin);
```
-## datepart Argument
+## Remarks
-**dayofyear**, **day**, and **weekday** return the same value. Each *datepart* and its abbreviations return the same value.
+Use `DATE_BUCKET` in the following clauses:
+
+- `GROUP BY`
+- `HAVING`
+- `ORDER BY`
+- `SELECT <list>`
+- `WHERE`
+
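For example, a query along the following lines (reusing the `dbo.FactInternetSales` table from the examples later in this article) uses `DATE_BUCKET` in the `SELECT` list, `GROUP BY`, `HAVING`, and `ORDER BY` clauses:

```sql
-- Bucket orders by month and keep only the busier buckets.
SELECT DATE_BUCKET(MONTH, 1, CAST(ShipDate AS DATETIME2)) AS ShipMonth,
    COUNT(*) AS OrderCount
FROM dbo.FactInternetSales
GROUP BY DATE_BUCKET(MONTH, 1, CAST(ShipDate AS DATETIME2))
HAVING COUNT(*) > 10
ORDER BY ShipMonth;
```
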
+#### *datePart* argument
-## number Argument
+**dayofyear**, **day**, and **weekday** return the same value. Each *datePart* and its abbreviations return the same value.
-The *number* argument cannot exceed the range of positive **int** values. In the following statements, the argument for *number* exceeds the range of **int** by 1. The following statement returns the following error message: "`Msg 8115, Level 16, State 2, Line 2. Arithmetic overflow error converting expression to data type int."`
+#### *number* argument
+
+The *number* argument can't exceed the range of positive **int** values. In the following statement, the argument for *number* exceeds the range of **int** by 1. The statement returns the following error message: `Msg 8115, Level 16, State 2, Line 2. Arithmetic overflow error converting expression to data type int.`
```sql
-declare @date datetime2 = '2020-04-30 00:00:00';
-Select DATE_BUCKET(DAY, 2147483648, @date);
+DECLARE @date DATETIME2 = '2020-04-30 00:00:00';
+
+SELECT DATE_BUCKET(DAY, 2147483648, @date);
```
-If a negative value for number is passed to the `Date_Bucket` function, the following error will be returned.
+If a negative value for number is passed to the `DATE_BUCKET` function, the following error is returned.
-```txt
+```output
Msg 9834, Level 16, State 1, Line 1
-Invalid bucket width value passed to date_bucket function. Only positive values are allowed.
-````
+Invalid bucket width value passed to DATE_BUCKET function. Only positive values are allowed.
+```
-## date Argument
+#### *date* argument
-`DATE_BUCKET` return the base value corresponding to the data type of the `date` argument. In the following example, an output value with datetime2 datatype is returned.
+`DATE_BUCKET` returns the base value corresponding to the data type of the *date* argument. In the following example, an output value with **datetime2** data type is returned.
```sql
-Select DATE_BUCKET(DAY, 10, SYSUTCDATETIME());
+SELECT DATE_BUCKET(DAY, 10, SYSUTCDATETIME());
```
-## origin Argument
-
-The data type of the `origin` and `date` arguments in must be the same. If different data types are used, an error will be generated.
-
-## Remarks
-
-Use `DATE_BUCKET` in the following clauses:
+#### *origin* argument
-+ GROUP BY
-+ HAVING
-+ ORDER BY
-+ SELECT \<list>
-+ WHERE
+The data type of the *origin* and *date* arguments must be the same. If different data types are used, an error is generated.
## Examples
-### A. Calculating Date_Bucket with a bucket width of 1 from the origin time
+### A. Calculate DATE_BUCKET with a bucket width of 1 from the origin time
-Each of these statements increments *date_bucket* with a bucket width of 1 from the origin time:
+Each of these statements calculates `DATE_BUCKET` with a bucket width of 1 from the origin time:
```sql
-declare @date datetime2 = '2020-04-30 21:21:21'
-Select 'Week', DATE_BUCKET(WEEK, 1, @date)
-Union All
-Select 'Day', DATE_BUCKET(DAY, 1, @date)
-Union All
-Select 'Hour', DATE_BUCKET(HOUR, 1, @date)
-Union All
-Select 'Minutes', DATE_BUCKET(MINUTE, 1, @date)
-Union All
-Select 'Seconds', DATE_BUCKET(SECOND, 1, @date);
+DECLARE @date DATETIME2 = '2020-04-30 21:21:21';
+
+SELECT 'Week', DATE_BUCKET(WEEK, 1, @date)
+UNION ALL SELECT 'Day', DATE_BUCKET(DAY, 1, @date)
+UNION ALL SELECT 'Hour', DATE_BUCKET(HOUR, 1, @date)
+UNION ALL SELECT 'Minutes', DATE_BUCKET(MINUTE, 1, @date)
+UNION ALL SELECT 'Seconds', DATE_BUCKET(SECOND, 1, @date);
```
-Here is the result set.
+Here's the result set.
-```txt
+```output
Week    2020-04-27 00:00:00.0000000
Day     2020-04-30 00:00:00.0000000
Hour    2020-04-30 21:00:00.0000000
Minutes 2020-04-30 21:21:00.0000000
Seconds 2020-04-30 21:21:21.0000000
```
-### B. Using expressions as arguments for the number and date parameters
+### B. Use expressions as arguments for the number and date parameters
-These examples use different types of expressions as arguments for the *number* and *date* parameters. These examples are built using the 'AdventureWorksDW2017' Database.
+These examples use different types of expressions as arguments for the *number* and *date* parameters. These examples are built using the `AdventureWorksDW2019` database.
-#### Specifying user-defined variables as number and date
+#### Specify user-defined variables as number and date
This example specifies user-defined variables as arguments for *number* and *date*:

```sql
-DECLARE @days int = 365,
- @datetime datetime2 = '2000-01-01 01:01:01.1110000'; /* 2000 was a leap year */;
-SELECT Date_Bucket(DAY, @days, @datetime);
+DECLARE @days INT = 365,
+    @datetime DATETIME2 = '2000-01-01 01:01:01.1110000'; /* 2000 was a leap year */
+
+SELECT DATE_BUCKET(DAY, @days, @datetime);
```
-Here is the result set.
+Here's the result set.
-```txt
+```output
1999-12-08 00:00:00.0000000

(1 row affected)
```
-#### Specifying a column as date
+#### Specify a column as date
The following example calculates the sum of OrderQuantity and the sum of UnitPrice, grouped over weekly date buckets.

```sql
-SELECT
- Date_Bucket(WEEK, 1 ,cast(Shipdate as datetime2)) AS ShippedDateBucket
- ,Sum(OrderQuantity) As SumOrderQuantity
- ,Sum(UnitPrice) As SumUnitPrice
+SELECT DATE_BUCKET(WEEK, 1, CAST(Shipdate AS DATETIME2)) AS ShippedDateBucket,
+ Sum(OrderQuantity) AS SumOrderQuantity,
+ Sum(UnitPrice) AS SumUnitPrice
FROM dbo.FactInternetSales FIS
-where Shipdate between '2011-01-03 00:00:00.000' and '2011-02-28 00:00:00.000'
-Group by Date_Bucket(week, 1 ,cast(Shipdate as datetime2))
-order by ShippedDateBucket;
+WHERE Shipdate BETWEEN '2011-01-03 00:00:00.000'
+ AND '2011-02-28 00:00:00.000'
+GROUP BY DATE_BUCKET(week, 1, CAST(Shipdate AS DATETIME2))
+ORDER BY ShippedDateBucket;
```
-Here is the result set.
+Here's the result set.
-```txt
+```output
ShippedDateBucket           SumOrderQuantity SumUnitPrice
--------------------------- ---------------- ------------
2011-01-03 00:00:00.0000000 21               65589.7546
2011-02-28 00:00:00.0000000 9                28968.6982
```
-#### Specifying scalar system function as date
+#### Specify scalar system function as date
This example specifies `SYSDATETIME` for *date*. The exact value returned depends on the day and time of statement execution:

```sql
-SELECT Date_Bucket(WEEK, 10, SYSDATETIME());
+SELECT DATE_BUCKET(WEEK, 10, SYSDATETIME());
```
-Here is the result set.
+Here's the result set.
-```txt
+```output
2020-03-02 00:00:00.0000000

(1 row affected)
```
-#### Specifying scalar subqueries and scalar functions as number and date
+#### Specify scalar subqueries and scalar functions as number and date
This example uses scalar subqueries, `MAX(OrderDate)`, as arguments for *number* and *date*. `(SELECT TOP 1 CustomerKey FROM dbo.DimCustomer WHERE GeographyKey > 100)` serves as an artificial argument for the *number* parameter, to show how to select a *number* argument from a value list.

```sql
-SELECT DATE_BUCKET(WEEK,(SELECT top 1 CustomerKey FROM dbo.DimCustomer where GeographyKey > 100),
- (SELECT MAX(OrderDate) FROM dbo.FactInternetSales));
+SELECT DATE_BUCKET(WEEK,
+ (
+ SELECT TOP 1 CustomerKey
+ FROM dbo.DimCustomer
+ WHERE GeographyKey > 100
+ ),
+ (
+ SELECT MAX(OrderDate)
+ FROM dbo.FactInternetSales
+ )
+ );
```
-#### Specifying numeric expressions and scalar system functions as number and date
+#### Specify numeric expressions and scalar system functions as number and date
This example uses a numeric expression, `(10/2)`, and a scalar system function, `SYSDATETIME`, as arguments for *number* and *date*.

```sql
-SELECT Date_Bucket(WEEK,(10/2), SYSDATETIME());
+SELECT DATE_BUCKET(WEEK, (10 / 2), SYSDATETIME());
```
-#### Specifying an aggregate window function as number
+#### Specify an aggregate window function as number
This example uses an aggregate window function as an argument for *number*.

```sql
-Select
- DISTINCT DATE_BUCKET(DAY, 30, Cast([shipdate] as datetime2)) as DateBucket,
- First_Value([SalesOrderNumber]) OVER (Order by DATE_BUCKET(DAY, 30, Cast([shipdate] as datetime2))) as First_Value_In_Bucket,
- Last_Value([SalesOrderNumber]) OVER (Order by DATE_BUCKET(DAY, 30, Cast([shipdate] as datetime2))) as Last_Value_In_Bucket
- from [dbo].[FactInternetSales]
-Where ShipDate between '2011-01-03 00:00:00.000' and '2011-02-28 00:00:00.000'
-order by DateBucket;
+SELECT DISTINCT DATE_BUCKET(DAY, 30, CAST([shipdate] AS DATETIME2)) AS DateBucket,
+ FIRST_VALUE([SalesOrderNumber]) OVER (
+ ORDER BY DATE_BUCKET(DAY, 30, CAST([shipdate] AS DATETIME2))
+ ) AS FIRST_VALUE_In_Bucket,
+ LAST_VALUE([SalesOrderNumber]) OVER (
+ ORDER BY DATE_BUCKET(DAY, 30, CAST([shipdate] AS DATETIME2))
+ ) AS LAST_VALUE_In_Bucket
+FROM [dbo].[FactInternetSales]
+WHERE ShipDate BETWEEN '2011-01-03 00:00:00.000'
+ AND '2011-02-28 00:00:00.000'
+ORDER BY DateBucket;
GO
```
-### C. Using a non-default origin value
+
+### C. Use a non-default origin value
This example uses a non-default origin value to generate the date buckets.

```sql
-declare @date datetime2 = '2020-06-15 21:22:11';
-declare @origin datetime2 = '2019-01-01 00:00:00';
-Select DATE_BUCKET(HOUR, 2, @date, @origin);
+DECLARE @date DATETIME2 = '2020-06-15 21:22:11';
+DECLARE @origin DATETIME2 = '2019-01-01 00:00:00';
+
+SELECT DATE_BUCKET(HOUR, 2, @date, @origin);
```

## See also
-[CAST and CONVERT &#40;Transact-SQL&#41;](/sql/t-sql/functions/cast-and-convert-transact-sql/)
+- [CAST and CONVERT (Transact-SQL)](/sql/t-sql/functions/cast-and-convert-transact-sql/)
azure-sql-edge Deploy Dacpac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/deploy-dacpac.md
Title: Using SQL Database DACPAC and BACPAC packages - Azure SQL Edge
-description: Learn about using dacpacs and bacpacs in Azure SQL Edge
+description: Learn about using DACPACs and BACPACs in Azure SQL Edge
- Previously updated : 09/03/2020 Last updated : 09/14/2023 keywords: - SQL Edge - sqlpackage- - # SQL Database DACPAC and BACPAC packages in SQL Edge
-Azure SQL Edge is an optimized relational database engine geared for IoT and edge deployments. It's built on the latest versions of the Microsoft SQL Database Engine, which provides industry-leading performance, security, and query processing capabilities. Along with the industry-leading relational database management capabilities of SQL Server, Azure SQL Edge provides in-built streaming capability for real-time analytics and complex event-processing.
-
-Azure SQL Edge provides native mechanism that enable you to deploy a [SQL Database DACPAC and BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications) package during, or after deploying SQL Edge.
-
-SQL Database dacpac and bacpac packages can be deployed to SQL Edge using the `MSSQL_PACKAGE` environment variable. The environment variable can be configured with any of the following.
-- A local folder location within the SQL container containing the dacpac and bacpac files. This folder can be mapped to a host volume using either mount points or data volume containers. -- A local file path within the SQL container mapping to the dacpac or the bacpac file. This file path can be mapped to a host volume using either mount points or data volume containers. -- A local file path within the SQL container mapping to a zip file containing the dacpac or bacpac files. This file path can be mapped to a host volume using either mount points or data volume containers. -- An Azure Blob SAS URL to a zip file containing the dacpac and bacpac files.-- An Azure Blob SAS URL to a dacpac or a bacpac file. -
-## Use a SQL Database DAC package with SQL Edge
-
-To deploy (or import) a SQL Database DAC package `(*.dacpac)` or a BACPAC file `(*.bacpac)` using Azure Blob storage and a zip file, follow the steps below.
-
-1. Create/Extract a DAC package or Export a Bacpac File using one of the mechanism mentioned below.
- - Use [SQL Database Project Extension - Azure Data Studio](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started) to [create a new database project or export an existing database](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started)
- - Create or extract a SQL Database DAC package. See [Extracting a DAC from a database](/sql/relational-databases/data-tier-applications/extract-a-dac-from-a-database/) for information on how to generate a DAC package for an existing SQL Server database.
- - Exporting a deployed DAC package or a database. See [Export a Data-tier Application](/sql/relational-databases/data-tier-applications/export-a-data-tier-application/) for information on how to generate a bacpac file for an existing SQL Server database.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-> [!NOTE]
-> If you are using external streaming jobs as part of the database, please ensure the following:
->
-> - The generated dacpac will capture all the SQL Server Objects corresponding to the inputs/output streams and the streaming jobs. But the jobs will not be automatically started. In order to have the external streaming job automatically started after deployment, add a post-deployment script that restarts the jobs as follows:
->
-> ```
-> exec sys.sp_stop_streaming_job @name=N'<JOB NAME>';
-> GO
-> exec sys.sp_start_streaming_job @name=N'<JOB NAME>';
-> GO
-> ```
->
-> - Ensure any credentials required by the external streaming jobs to access input or output streams are provided as part of the dacpac.
+Azure SQL Edge is an optimized relational database engine geared for IoT and edge deployments. It's built on the latest versions of the Microsoft SQL Database Engine, which provides industry-leading performance, security, and query processing capabilities. Along with the relational database management capabilities of SQL Server, Azure SQL Edge provides built-in streaming capabilities for real-time analytics and complex event processing.
+Azure SQL Edge provides native mechanisms to deploy a [SQL Database DACPAC and BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications) package during or after deploying SQL Edge.
+SQL Database DACPAC and BACPAC packages can be deployed to SQL Edge using the `MSSQL_PACKAGE` environment variable. The environment variable can be configured with any of the following options (a usage sketch follows the list):
-2. Zip the `*.dacpac` or the `*.bacpac` file and upload it to an Azure Blob storage account. For more information on uploading files to Azure Blob storage, see [Upload, download, and list blobs with the Azure portal](../storage/blobs/storage-quickstart-blobs-portal.md).
+- A local folder location within the SQL container containing the DACPAC and BACPAC files. This folder can be mapped to a host volume using either mount points or data volume containers.
+- A local file path within the SQL container mapping to the DACPAC or the BACPAC file. This file path can be mapped to a host volume using either mount points or data volume containers.
+- A local file path within the SQL container mapping to a zip file containing the DACPAC or BACPAC files. This file path can be mapped to a host volume using either mount points or data volume containers.
+- An Azure Blob SAS URL to a zip file containing the DACPAC and BACPAC files.
+- An Azure Blob SAS URL to a DACPAC or a BACPAC file.
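As an illustration, in a plain Docker (non-IoT Edge) deployment the variable might be passed at container startup. This is a minimal sketch, assuming a Blob SAS URL to a zip of packages; the storage account, container, file name, and SAS token are placeholders:

```bash
# Hypothetical example: MSSQL_PACKAGE points at a SAS URL for a zip file of DACPAC/BACPAC packages.
sudo docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=yourStrong(!)Password" \
   -e "MSSQL_PACKAGE=https://<storage-account>.blob.core.windows.net/<container>/<packages.zip>?<sas-token>" \
   -p 1433:1433 --name azuresqledge -d mcr.microsoft.com/azure-sql-edge
```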
-3. Generate a shared access signature for the zip file by using the Azure portal. For more information, see [Delegate access with shared access signatures (SAS)](../storage/common/storage-sas-overview.md).
+## Use a SQL Database DAC package with SQL Edge
-4. Update the SQL Edge module configuration to include the shared access URI for the DAC package. To update the SQL Edge module, take these steps:
+To deploy (or import) a SQL Database DAC package `(*.dacpac)` or a BACPAC file `(*.bacpac)` using Azure Blob storage and a zip file, follow these steps.
- 1. In the Azure portal, go to your IoT Hub deployment.
+1. Create/extract a DAC package or export a BACPAC file using one of the following mechanisms.
+   - Use the [SQL Database Project Extension - Azure Data Studio](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started) to [create a new database project or export an existing database](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started).
+   - Create or extract a SQL Database DAC package. See [Extracting a DAC from a database](/sql/relational-databases/data-tier-applications/extract-a-dac-from-a-database/) for information on how to generate a DAC package for an existing SQL Server database.
+   - Export a deployed DAC package or a database. See [Export a Data-tier Application](/sql/relational-databases/data-tier-applications/export-a-data-tier-application/) for information on how to generate a BACPAC file for an existing SQL Server database.
- 2. In the left pane, select **IoT Edge**.
+ If you're using external streaming jobs as part of the database, ensure that:
- 3. On the **IoT Edge** page, find and select the IoT Edge where the SQL Edge module is deployed.
+   - The generated DACPAC captures all the SQL Server objects corresponding to the input/output streams and the streaming jobs, but the jobs aren't automatically started. To have the external streaming jobs start automatically after deployment, add a post-deployment script that restarts the jobs as follows:
- 4. On the **IoT Edge Device** device page, select **Set Module**.
+ ```sql
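+   -- <JOB NAME> is a placeholder; stop the job first so that starting it always yields a clean restart after deployment.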
+   EXEC sys.sp_stop_streaming_job @name = N'<JOB NAME>';
+   GO
+   EXEC sys.sp_start_streaming_job @name = N'<JOB NAME>';
+ GO
+ ```
- 5. On the **Set modules** page, and click on the Azure SQL Edge module.
+ - Any credentials required by the external streaming jobs to access input or output streams are provided as part of the DACPAC.
- 6. On the **Update IoT Edge Module** pane, select **Environment Variables**. Add the `MSSQL_PACKAGE` environment variable and specify the SAS URL generated in Step 3 above as the value for the environment variable.
+1. Zip the `*.dacpac` or the `*.bacpac` file and upload it to an Azure Blob storage account. For more information on uploading files to Azure Blob storage, see [Upload, download, and list blobs with the Azure portal](../storage/blobs/storage-quickstart-blobs-portal.md).
- 7. Select **Update**.
+1. Generate a shared access signature for the zip file by using the Azure portal. For more information, see [Delegate access with shared access signatures (SAS)](../storage/common/storage-sas-overview.md).
- 8. On the **Set modules** page, select **Review + create**.
+1. Update the SQL Edge module configuration to include the shared access URI for the DAC package. To update the SQL Edge module, take these steps:
- 9. On the **Set modules** page, select **Create**.
+ 1. In the Azure portal, go to your IoT Hub deployment.
+ 1. In the left pane, select **IoT Edge**.
+   1. On the **IoT Edge** page, find and select the IoT Edge device where the SQL Edge module is deployed.
+   1. On the device page, select **Set modules**.
+   1. On the **Set modules** page, select the Azure SQL Edge module.
+ 1. On the **Update IoT Edge Module** pane, select **Environment Variables**. Add the `MSSQL_PACKAGE` environment variable and specify the SAS URL generated in Step 3 above as the value for the environment variable.
+ 1. Select **Update**.
+ 1. On the **Set modules** page, select **Review + create**.
+ 1. On the **Set modules** page, select **Create**.
-5. After the module update, the package files are downloaded, unzipped, and deployed against the SQL Edge instance.
+1. After the module update, the package files are downloaded, unzipped, and deployed against the SQL Edge instance.
-On each restart of the Azure SQL Edge container, SQL Edge attempts to download the zipped file package and evaluate for changes. If a new version of the dacpac file is encountered, the changes are deployed to the database in SQL Edge.
+On each restart of the Azure SQL Edge container, SQL Edge attempts to download the zipped package file and evaluates it for changes. If a new version of the DACPAC file is encountered, the changes are deployed to the database in SQL Edge.
-## Known Issue
+## Known issue
-During some DACPAC or BACPAC deployments users may encounter a command timeouts, resulting in the failure of the dacpac deployment operation. If you encounter this problem, please use the SQLPackage.exe (or SQL Client Tools) to apply the DACPAC or BACPAC maually.
+During some DACPAC or BACPAC deployments, users may encounter a command timeout, resulting in the failure of the DACPAC deployment operation. If you encounter this problem, use SqlPackage.exe (or SQL client tools) to apply the DACPAC or BACPAC manually.
## Next steps
azure-sql-edge Deploy Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/deploy-kubernetes.md
Title: Deploy an Azure SQL Edge container in Kubernetes - Azure SQL Edge
description: Learn about deploying an Azure SQL Edge container in Kubernetes - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - container
- - kubernetes
-
+ - Kubernetes
- # Deploy an Azure SQL Edge container in Kubernetes
-Azure SQL Edge can be deployed on a Kubernetes cluster both as an IoT Edge module through Azure IoT Edge running on Kubernetes or as a standalone container pod. For the remainder of this article, we will focus on the standalone container deployment on a kubernetes cluster. For information on deploying Azure IoT Edge on Kubernetes, refer [Azure IoT Edge on Kubernetes (preview)](https://microsoft.github.io/iotedge-k8s-doc/introduction.html).
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+Azure SQL Edge can be deployed on a Kubernetes cluster, either as an IoT Edge module through Azure IoT Edge running on Kubernetes, or as a standalone container pod. The remainder of this article focuses on the standalone container deployment on a Kubernetes cluster. For information on deploying Azure IoT Edge on Kubernetes, see [Azure IoT Edge on Kubernetes (preview)](https://microsoft.github.io/iotedge-k8s-doc/introduction.html).
-This tutorial demonstrates how to configure a highly available Azure SQL Edge instance in a container on a kubernetes cluster.
+This tutorial demonstrates how to configure a highly available Azure SQL Edge instance in a container on a Kubernetes cluster.
> [!div class="checklist"] > * Create an SA password
This tutorial demonstrates how to configure a highly available Azure SQL Edge in
Kubernetes 1.6 and later has support for [storage classes](https://kubernetes.io/docs/concepts/storage/storage-classes/), [persistent volume claims](https://kubernetes.io/docs/concepts/storage/storage-classes/#persistentvolumeclaims), and the [Azure disk volume type](https://github.com/kubernetes/examples/tree/master/staging/volumes/azure_disk). You can create and manage your Azure SQL Edge instances natively in Kubernetes. The example in this article shows how to create a [deployment](https://kubernetes.io/docs/concepts/workloads/controllers/deployment/) to achieve a high availability configuration similar to a shared disk failover cluster instance. In this configuration, Kubernetes plays the role of the cluster orchestrator. When an Azure SQL Edge instance in a container fails, the orchestrator bootstraps another instance of the container that attaches to the same persistent storage.
-![Azure SQL Edge in a Kubernetes cluster](media/deploy-kubernetes/kubernetes-sql-edge.png)
In the preceding diagram, `azure-sql-edge` is a container in a [pod](https://kubernetes.io/docs/concepts/workloads/pods/pod/). Kubernetes orchestrates the resources in the cluster. A [replica set](https://kubernetes.io/docs/concepts/workloads/controllers/replicaset/) ensures that the pod is automatically recovered after a node failure. Applications connect to the service. In this case, the service represents a load balancer that hosts an IP address that stays the same after failure of the `azure-sql-edge`. In the following diagram, the `azure-sql-edge` container has failed. As the orchestrator, Kubernetes guarantees the correct count of healthy instances in the replica set, and starts a new container according to the configuration. The orchestrator starts a new pod on the same node, and `azure-sql-edge` reconnects to the same persistent storage. The service connects to the re-created `azure-sql-edge`.
-![Azure SQL Edge in a Kubernetes cluster after pod fail](media/deploy-kubernetes/kubernetes-sql-edge-after-pod-fail.png)
In the following diagram, the node hosting the `azure-sql-edge` container has failed. The orchestrator starts the new pod on a different node, and `azure-sql-edge` reconnects to the same persistent storage. The service connects to the re-created `azure-sql-edge`.
-![Azure SQL Edge in a Kubernetes cluster after node fail](media/deploy-kubernetes/kubernetes-sql-edge-after-node-fail.png)
## Prerequisites
-* **Kubernetes cluster**
- - The tutorial requires a Kubernetes cluster. The steps use [kubectl](https://kubernetes.io/docs/reference/kubectl/) to manage the cluster.
+- **Kubernetes cluster**
+ - The tutorial requires a Kubernetes cluster. The steps use [kubectl](https://kubernetes.io/docs/reference/kubectl/) to manage the cluster.
- - For the purpose of this tutorial, we will be using Azure Kubernetes Service to deploy Azure SQL Edge. See [Deploy an Azure Kubernetes Service (AKS) cluster](../aks/tutorial-kubernetes-deploy-cluster.md) to create and connect to a single-node Kubernetes cluster in AKS with `kubectl`.
+ - For the purpose of this tutorial, we are using Azure Kubernetes Service to deploy Azure SQL Edge. See [Deploy an Azure Kubernetes Service (AKS) cluster](../aks/tutorial-kubernetes-deploy-cluster.md) to create and connect to a single-node Kubernetes cluster in AKS with `kubectl`.
- >[!NOTE]
- >To protect against node failure, a Kubernetes cluster requires more than one node.
+ > [!NOTE]
+ > To protect against node failure, a Kubernetes cluster requires more than one node.
-* **Azure CLI**
+- **Azure CLI**
- The instructions in this tutorial have been validated against Azure CLI 2.10.1.
-## Create a kubernetes namespace for SQL Edge deployment
+## Create a Kubernetes namespace for SQL Edge deployment
-Create a new namespace in the kubernetes cluster. This namespace will be used to deploy SQL Edge and all the required artifacts. For more information on Kubernetes namespaces, refer [namespaces](https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces/).
+Create a new namespace in the Kubernetes cluster. This namespace is used to deploy SQL Edge and all the required artifacts. For more information on Kubernetes namespaces, see [namespaces](https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces/).
```azurecli kubectl create namespace <namespace name>
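# Optional check: list namespaces with 'kubectl get namespace' to confirm the new namespace exists.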
- ```
+ ```
## Create an SA password
The following command creates a password for the SA account:
```azurecli kubectl create secret generic mssql --from-literal=SA_PASSWORD="MyC0m9l&xP@ssw0rd" -n <namespace name>
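# Optional check: 'kubectl describe secret mssql -n <namespace name>' shows the secret's keys without revealing the value.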
- ```
+ ```
Replace `MyC0m9l&xP@ssw0rd` with a complex password. ## Create storage
-Configure a [persistent volume](https://kubernetes.io/docs/concepts/storage/persistent-volumes/) and [persistent volume claim](https://kubernetes.io/docs/concepts/storage/persistent-volumes/#persistent-volume-claim-protection) in the Kubernetes cluster. Complete the following steps:
+Configure a [persistent volume](https://kubernetes.io/docs/concepts/storage/persistent-volumes/) and [persistent volume claim](https://kubernetes.io/docs/concepts/storage/persistent-volumes/#persistent-volume-claim-protection) in the Kubernetes cluster. Complete the following steps:
-1. Create a manifest to define the storage class and the persistent volume claim. The manifest specifies the storage provisioner, parameters, and [reclaim policy](https://kubernetes.io/docs/concepts/storage/persistent-volumes/#reclaiming). The Kubernetes cluster uses this manifest to create the persistent storage.
+1. Create a manifest to define the storage class and the persistent volume claim. The manifest specifies the storage provisioner, parameters, and [reclaim policy](https://kubernetes.io/docs/concepts/storage/persistent-volumes/#reclaiming). The Kubernetes cluster uses this manifest to create the persistent storage.
- The following yaml example defines a storage class and persistent volume claim. The storage class provisioner is `azure-disk`, because this Kubernetes cluster is in Azure. The storage account type is `Standard_LRS`. The persistent volume claim is named `mssql-data`. The persistent volume claim metadata includes an annotation connecting it back to the storage class.
+   The following YAML example defines a storage class and persistent volume claim. The storage class provisioner is `azure-disk`, because this Kubernetes cluster is in Azure. The storage account type is `Standard_LRS`. The persistent volume claim is named `mssql-data`. The persistent volume claim metadata includes an annotation connecting it back to the storage class.
```yaml kind: StorageClass
Configure a [persistent volume](https://kubernetes.io/docs/concepts/storage/pers
Save the file (for example, **pvc.yaml**).
-2. Create the persistent volume claim in Kubernetes.
+1. Create the persistent volume claim in Kubernetes.
```azurecli kubectl apply -f <Path to pvc.yaml file> -n <namespace name>
Configure a [persistent volume](https://kubernetes.io/docs/concepts/storage/pers
`<Path to pvc.yaml file>` is the location where you saved the file.
- The persistent volume is automatically created as an Azure storage account, and bound to the persistent volume claim.
+ The persistent volume is automatically created as an Azure storage account, and bound to the persistent volume claim.
- ![Screenshot of persistent volume claim command](medi.png)
+ :::image type="content" source="medi.png" alt-text="Screenshot of persistent volume claim command.":::
-3. Verify the persistent volume claim.
+1. Verify the persistent volume claim.
```azurecli kubectl describe pvc <PersistentVolumeClaim> -n <name of the namespace>
Configure a [persistent volume](https://kubernetes.io/docs/concepts/storage/pers
The returned metadata includes a value called `Volume`. This value maps to the name of the blob.
- ![Screenshot of returned metadata, including Volume](media/deploy-kubernetes/describe-volume.png)
+ :::image type="content" source="media/deploy-kubernetes/describe-volume.png" alt-text="Screenshot of returned metadata, including Volume.":::
-4. Verify the persistent volume.
+1. Verify the persistent volume.
```azurecli kubectl describe pv -n <namespace name> ```
- `kubectl` returns metadata about the persistent volume that was automatically created and bound to the persistent volume claim.
+ `kubectl` returns metadata about the persistent volume that was automatically created and bound to the persistent volume claim.
## Create the deployment
-In this example, the container hosting the Azure SQL Edge instance is described as a Kubernetes deployment object. The deployment creates a replica set. The replica set creates the pod.
+In this example, the container hosting the Azure SQL Edge instance is described as a Kubernetes deployment object. The deployment creates a replica set. The replica set creates the pod.
-In this step, create a manifest to describe the container based on the Azure SQL Edge Docker image. The manifest references the `mssql-data` persistent volume claim, and the `mssql` secret that you already applied to the Kubernetes cluster. The manifest also describes a [service](https://kubernetes.io/docs/concepts/services-networking/service/). This service is a load balancer. The load balancer guarantees that the IP address persists after Azure SQL Edge instance is recovered.
+In this step, create a manifest to describe the container based on the Azure SQL Edge Docker image. The manifest references the `mssql-data` persistent volume claim, and the `mssql` secret that you already applied to the Kubernetes cluster. The manifest also describes a [service](https://kubernetes.io/docs/concepts/services-networking/service/). This service is a load balancer. The load balancer guarantees that the IP address persists after the Azure SQL Edge instance is recovered.
1. Create a manifest (a YAML file) to describe the deployment. The following example describes a deployment, including a container based on the Azure SQL Edge container image.
-```yaml
-apiVersion: apps/v1
-kind: Deployment
-metadata:
- name: sqledge-deployment
-spec:
- replicas: 1
- selector:
- matchLabels:
- app: sqledge
- template:
- metadata:
- labels:
- app: sqledge
- spec:
- volumes:
- - name: sqldata
- persistentVolumeClaim:
- claimName: mssql-data
- containers:
- - name: azuresqledge
- image: mcr.microsoft.com/azure-sql-edge:latest
- ports:
- - containerPort: 1433
- volumeMounts:
- - name: sqldata
- mountPath: /var/opt/mssql
- env:
- - name: MSSQL_PID
- value: "Developer"
- - name: ACCEPT_EULA
- value: "Y"
- - name: SA_PASSWORD
- valueFrom:
- secretKeyRef:
- name: mssql
- key: SA_PASSWORD
- - name: MSSQL_AGENT_ENABLED
- value: "TRUE"
- - name: MSSQL_COLLATION
- value: "SQL_Latin1_General_CP1_CI_AS"
- - name: MSSQL_LCID
- value: "1033"
- terminationGracePeriodSeconds: 30
- securityContext:
- fsGroup: 10001
-
-apiVersion: v1
-kind: Service
-metadata:
- name: sqledge-deployment
-spec:
- selector:
- app: sqledge
- ports:
- - protocol: TCP
- port: 1433
- targetPort: 1433
- name: sql
- type: LoadBalancer
-```
-
- Copy the preceding code into a new file, named `sqldeployment.yaml`. Update the following values:
-
- * MSSQL_PID `value: "Developer"`: Sets the container to run Azure SQL Edge Developer edition. Developer edition is not licensed for production data. If the deployment is for production use, set the edition to `Premium`.
-
- >[!NOTE]
- >For more information, see [How to license Azure SQL Edge](https://azure.microsoft.com/pricing/details/sql-edge/).
-
- * `persistentVolumeClaim`: This value requires an entry for `claimName:` that maps to the name used for the persistent volume claim. This tutorial uses `mssql-data`.
+ ```yaml
+ apiVersion: apps/v1
+ kind: Deployment
+ metadata:
+ name: sqledge-deployment
+ spec:
+ replicas: 1
+ selector:
+ matchLabels:
+ app: sqledge
+ template:
+ metadata:
+ labels:
+ app: sqledge
+ spec:
+ volumes:
+ - name: sqldata
+ persistentVolumeClaim:
+ claimName: mssql-data
+ containers:
+ - name: azuresqledge
+ image: mcr.microsoft.com/azure-sql-edge:latest
+ ports:
+ - containerPort: 1433
+ volumeMounts:
+ - name: sqldata
+ mountPath: /var/opt/mssql
+ env:
+ - name: MSSQL_PID
+ value: "Developer"
+ - name: ACCEPT_EULA
+ value: "Y"
+ - name: SA_PASSWORD
+ valueFrom:
+ secretKeyRef:
+ name: mssql
+ key: SA_PASSWORD
+ - name: MSSQL_AGENT_ENABLED
+ value: "TRUE"
+ - name: MSSQL_COLLATION
+ value: "SQL_Latin1_General_CP1_CI_AS"
+ - name: MSSQL_LCID
+ value: "1033"
+ terminationGracePeriodSeconds: 30
+ securityContext:
+ fsGroup: 10001
+   ---
+ apiVersion: v1
+ kind: Service
+ metadata:
+ name: sqledge-deployment
+ spec:
+ selector:
+ app: sqledge
+ ports:
+ - protocol: TCP
+ port: 1433
+ targetPort: 1433
+ name: sql
+ type: LoadBalancer
+ ```
+
+   Copy the preceding code into a new file named `sqledgedeploy.yaml`. Update the following values:
+
+ * MSSQL_PID `value: "Developer"`: Sets the container to run Azure SQL Edge Developer edition. Developer edition isn't licensed for production data. If the deployment is for production use, set the edition to `Premium`.
+
+ > [!NOTE]
+ > For more information, see [How to license Azure SQL Edge](https://azure.microsoft.com/pricing/details/sql-edge/).
+
+ * `persistentVolumeClaim`: This value requires an entry for `claimName:` that maps to the name used for the persistent volume claim. This tutorial uses `mssql-data`.
* `name: SA_PASSWORD`: Configures the container image to set the SA password, as defined in this section.
spec:
valueFrom: secretKeyRef: name: mssql
- key: SA_PASSWORD
+ key: SA_PASSWORD
```
- When Kubernetes deploys the container, it refers to the secret named `mssql` to get the value for the password.
+ When Kubernetes deploys the container, it refers to the secret named `mssql` to get the value for the password.
- >[!NOTE]
- >By using the `LoadBalancer` service type, the Azure SQL Edge instance is accessible remotely (via the internet) at port 1433.
+ > [!NOTE]
+ > By using the `LoadBalancer` service type, the Azure SQL Edge instance is accessible remotely (via the internet) at port 1433.
- Save the file (for example, **sqledgedeploy.yaml**).
+ Save the file (for example, `sqledgedeploy.yaml`).
-2. Create the deployment.
+1. Create the deployment.
```azurecli kubectl apply -f <Path to sqledgedeploy.yaml file> -n <namespace name>
spec:
`<Path to sqledgedeploy.yaml file>` is the location where you saved the file.
- ![Screenshot of deployment command](medi.png)
+ :::image type="content" source="medi.png" alt-text="Screenshot of deployment command.":::
The deployment and service are created. The Azure SQL Edge instance is in a container, connected to persistent storage. To view the status of the pod, type `kubectl get pod -n <namespace name>`.
- ![Screenshot of get pod command](medi.png)
+ :::image type="content" source="medi.png" alt-text="Screenshot of get pod command.":::
In the preceding image, the pod has a status of `Running`. This status indicates that the container is ready. This may take several minutes.
- >[!NOTE]
- >After the deployment is created, it can take a few minutes before the pod is visible. The delay is because the cluster pulls the Azure SQL Edge container image from the Docker hub. After the image is pulled the first time, subsequent deployments might be faster if the deployment is to a node that already has the image cached on it.
+ > [!NOTE]
+ > After the deployment is created, it can take a few minutes before the pod is visible. The delay is because the cluster pulls the Azure SQL Edge container image from the Docker hub. After the image is pulled the first time, subsequent deployments might be faster if the deployment is to a node that already has the image cached on it.
-3. Verify the services are running. Run the following command:
+1. Verify the services are running. Run the following command:
```azurecli kubectl get services -n <namespace name> ```
- This command returns services that are running, as well as the internal and external IP addresses for the services. Note the external IP address for the `mssql-deployment` service. Use this IP address to connect to Azure SQL Edge.
+   This command returns services that are running, as well as the internal and external IP addresses for the services. Note the external IP address for the `sqledge-deployment` service. Use this IP address to connect to Azure SQL Edge.
- ![Screenshot of get service command](medi.png)
+ :::image type="content" source="medi.png" alt-text="Screenshot of get service command.":::
For more information about the status of the objects in the Kubernetes cluster, run: ```azurecli az aks browse --resource-group <MyResourceGroup> --name <MyKubernetesClustername>
- ```
+ ```
## Connect to the Azure SQL Edge instance
-If you configured the container as described, you can connect with an application from outside the Azure virtual network. Use the `sa` account and the external IP address for the service. Use the password that you configured as the Kubernetes secret. For more information on connecting to an Azure SQL Edge instance, refer [Connect to Azure SQL Edge](connect.md).
+If you configured the container as described, you can connect with an application from outside the Azure virtual network. Use the `sa` account and the external IP address for the service. Use the password that you configured as the Kubernetes secret. For more information on connecting to an Azure SQL Edge instance, see [Connect to Azure SQL Edge](connect.md).
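For example, from an external machine with the SQL command-line tools installed, a connection might look like the following sketch; the IP address placeholder is the external IP of the load-balanced service, and the password is the value stored in the `mssql` secret:

```bash
# Hypothetical connection from outside the cluster; replace both placeholders with your values.
sqlcmd -S <External IP Address>,1433 -U sa -P "<YourSecretPassword>"
```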
## Verify failure and recovery
To verify failure and recovery, you can delete the pod. Do the following steps:
Note the name of the pod running Azure SQL Edge.
-2. Delete the pod.
+1. Delete the pod.
```azurecli kubectl delete pod sqledge-deployment-7df66c9999-rc9xl ```
- `sqledge-deployment-7df66c9999-rc9xl` is the value returned from the previous step for pod name.
-Kubernetes automatically re-creates the pod to recover an Azure SQL Edge instance, and connect to the persistent storage. Use `kubectl get pods` to verify that a new pod is deployed. Use `kubectl get services` to verify that the IP address for the new container is the same.
+ `sqledge-deployment-7df66c9999-rc9xl` is the value returned from the previous step for pod name.
+
+Kubernetes automatically re-creates the pod to recover an Azure SQL Edge instance, and connect to the persistent storage. Use `kubectl get pods` to verify that a new pod is deployed. Use `kubectl get services` to verify that the IP address for the new container is the same.
## Summary
-In this tutorial, you learned how to deploy Azure SQL Edge containers to a Kubernetes cluster for high availability.
+In this tutorial, you learned how to deploy Azure SQL Edge containers to a Kubernetes cluster for high availability.
> [!div class="checklist"] > * Create an SA password
azure-sql-edge Deploy Onnx https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/deploy-onnx.md
description: Learn how to train a model, convert it to ONNX, deploy it to Azure
Previously updated : 06/21/2022 Last updated : 09/14/2023 ms.prod: sql ms.technology: machine-learning keywords: deploy SQL Edge - # Deploy and make predictions with an ONNX model and SQL machine learning
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ In this quickstart, you'll learn how to train a model, convert it to ONNX, deploy it to [Azure SQL Edge](onnx-overview.md), and then run native PREDICT on data using the uploaded ONNX model. This quickstart is based on **scikit-learn** and uses the [Boston Housing dataset](https://scikit-learn.org/0.24/modules/generated/sklearn.datasets.load_boston.html). ## Before you begin
-* If you're using Azure SQL Edge, and you haven't deployed an Azure SQL Edge module, follow the steps of [deploy SQL Edge using the Azure portal](deploy-portal.md).
+- If you're using Azure SQL Edge, and you haven't deployed an Azure SQL Edge module, follow the steps of [deploy SQL Edge using the Azure portal](deploy-portal.md).
-* Install [Azure Data Studio](/sql/azure-data-studio/download).
+- Install [Azure Data Studio](/sql/azure-data-studio/download).
-* Install Python packages needed for this quickstart:
+- Install Python packages needed for this quickstart:
- 1. Open [New Notebook](/sql/azure-data-studio/sql-notebooks) connected to the Python 3 Kernel.
+ 1. Open [New Notebook](/sql/azure-data-studio/sql-notebooks) connected to the Python 3 Kernel.
1. Select **Manage Packages**
- 1. In the **Installed** tab, look for the following Python packages in the list of installed packages. If any of these packages are not installed, select the **Add New** tab, search for the package, and select **Install**.
+ 1. In the **Installed** tab, look for the following Python packages in the list of installed packages. If any of these packages aren't installed, select the **Add New** tab, search for the package, and select **Install**.
- **scikit-learn** - **numpy** - **onnxmltools**
This quickstart is based on **scikit-learn** and uses the [Boston Housing datase
- **skl2onnx** - **sqlalchemy**
-* For each script part below, enter it in a cell in the Azure Data Studio notebook and run the cell.
+- For each script part in the following sections, enter it in a cell in the Azure Data Studio notebook and run the cell.
## Train a pipeline
boston = load_boston()
boston df = pd.DataFrame(data=np.c_[boston['data'], boston['target']], columns=boston['feature_names'].tolist() + ['MEDV'])
-
+ target_column = 'MEDV'
-
+ # Split the data frame into features and target x_train = pd.DataFrame(df.drop([target_column], axis = 1)) y_train = pd.DataFrame(df.iloc[:,df.columns.tolist().index(target_column)])
print(y_train.head())
**Output**:
-```text
+```output
*** Training dataset x CRIM ZN INDUS CHAS NOX RM AGE DIS RAD TAX \
print(y_train.head())
3 0.03237 0.0 2.18 0.0 0.458 6.998 45.8 6.0622 3.0 222.0 4 0.06905 0.0 2.18 0.0 0.458 7.147 54.2 6.0622 3.0 222.0
- PTRATIO B LSTAT
-0 15.3 396.90 4.98
-1 17.8 396.90 9.14
-2 17.8 392.83 4.03
-3 18.7 394.63 2.94
-4 18.7 396.90 5.33
+ PTRATIO B LSTAT
+0 15.3 396.90 4.98
+1 17.8 396.90 9.14
+2 17.8 392.83 4.03
+3 18.7 394.63 2.94
+4 18.7 396.90 5.33
*** Training dataset y
print('*** Scikit-learn MSE: {}'.format(sklearn_mse))
**Output**:
-```text
+```output
*** Scikit-learn r2 score: 0.7406426641094094 *** Scikit-learn MSE: 21.894831181729206 ``` ## Convert the model to ONNX
-Convert the data types to the supported SQL data types. This conversion will be required for other dataframes as well.
+Convert the data types to the supported SQL data types. This conversion is required for other dataframes as well.
```python from skl2onnx.common.data_types import FloatTensorType, Int64TensorType, DoubleTensorType
onnx_model_path = 'boston1.model.onnx'
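# Optional (assumption): if the SQL Edge ONNX runtime expects an older opset, pass target_opset to skl2onnx's convert_sklearn; see the note following this block.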
onnxmltools.utils.save_model(onnx_model, onnx_model_path) ```
-> [!NOTE]
-> You may need to set the `target_opset` parameter for the skl2onnx.convert_sklearn function if there is a mismatch between ONNX runtime version in SQL Edge and skl2onnx packge. For more information, see the [SQL Edge Release notes](release-notes.md) to get the ONNX runtime version corresponding for the release, and pick the `target_opset` for the ONNX runtime based on the [ONNX backward compatibility matrix](https://github.com/microsoft/onnxruntime/blob/master/docs/Versioning.md#version-matrix).
+> [!NOTE]
+> You may need to set the `target_opset` parameter for the `skl2onnx.convert_sklearn` function if there's a mismatch between the ONNX runtime version in SQL Edge and the skl2onnx package. For more information, see the [SQL Edge Release notes](release-notes.md) to get the ONNX runtime version corresponding to the release, and pick the `target_opset` for the ONNX runtime based on the [ONNX backward compatibility matrix](https://github.com/microsoft/onnxruntime/blob/master/docs/Versioning.md#version-matrix).
## Test the ONNX model After converting the model to ONNX format, score the model to show little to no degradation in performance.
-> [!NOTE]
+> [!NOTE]
> ONNX Runtime uses floats instead of doubles, so small discrepancies are possible. ```python
print()
**Output**:
-```text
+```output
*** Onnx r2 score: 0.7406426691136831 *** Onnx MSE: 21.894830759270633
Load the data into SQL.
First, create two tables, **features** and **target**, to store subsets of the Boston housing dataset.
-* **Features** contains all data being used to predict the target, median value.
-* **Target** contains the median value for each record in the dataset.
+- **Features** contains all data being used to predict the target, median value.
+- **Target** contains the median value for each record in the dataset.
```python import sqlalchemy
print(x_train.head())
print(y_train.head()) ```
-Finally, use `sqlalchemy` to insert the `x_train` and `y_train` pandas dataframes into the tables `features` and `target`, respectively.
+Finally, use `sqlalchemy` to insert the `x_train` and `y_train` pandas dataframes into the tables `features` and `target`, respectively.
```python db_connection_string = 'mssql+pyodbc://' + username + ':' + password + '@' + server + '/' + database + '?driver=ODBC+Driver+17+for+SQL+Server'
Now you can view the data in the database.
With the model in SQL, run native PREDICT on the data using the uploaded ONNX model.
-> [!NOTE]
+> [!NOTE]
> Change the notebook kernel to SQL to run the remaining cell. ```sql
DECLARE @model VARBINARY(max) = (
WITH predict_input AS (
- SELECT TOP (1000) [id]
- , CRIM
- , ZN
- , INDUS
- , CHAS
- , NOX
- , RM
- , AGE
- , DIS
- , RAD
- , TAX
- , PTRATIO
- , B
- , LSTAT
+ SELECT TOP (1000) [id],
+ CRIM,
+ ZN,
+ INDUS,
+ CHAS,
+ NOX,
+ RM,
+ AGE,
+ DIS,
+ RAD,
+ TAX,
+ PTRATIO,
+ B,
+ LSTAT
FROM [dbo].[features] )
-SELECT predict_input.id
- , p.variable1 AS MEDV
-FROM PREDICT(MODEL = @model, DATA = predict_input, RUNTIME=ONNX) WITH (variable1 FLOAT) AS p;
+SELECT predict_input.id,
+ p.variable1 AS MEDV
+FROM PREDICT(MODEL = @model, DATA = predict_input, RUNTIME = ONNX) WITH (variable1 FLOAT) AS p;
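+-- PREDICT scores each predict_input row with the ONNX model; WITH (variable1 FLOAT) maps the model output to a column.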
```
-## Next Steps
+## Next steps
-* [Machine Learning and AI with ONNX in SQL Edge](onnx-overview.md)
+- [Machine Learning and AI with ONNX in SQL Edge](onnx-overview.md)
azure-sql-edge Deploy Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/deploy-portal.md
description: Learn how to deploy Azure SQL Edge using the Azure portal
Previously updated : 01/13/2023 Last updated : 09/14/2023 keywords: deploy SQL Edge # Deploy Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Azure SQL Edge is a relational database engine optimized for IoT and Azure IoT Edge deployments. It provides capabilities to create a high-performance data storage and processing layer for IoT applications and solutions. This quickstart shows you how to get started with creating an Azure SQL Edge module through Azure IoT Edge using the Azure portal. ## Before you begin
Azure Marketplace is an online applications and services marketplace where you c
1. On the *Environment Variables* section of the **Update IoT Edge Module** pane, specify the desired values for the environment variables. For a complete list of Azure SQL Edge environment variables, see [Configure using environment variables](configure.md#configure-by-using-environment-variables). The following default environment variables are defined for the module.
- | **Parameter** | **Description** |
+ | Parameter | Description |
| | | | MSSQL_SA_PASSWORD | Change the default value to specify a strong password for the SQL Edge admin account. | | MSSQL_LCID | Change the default value to set the desired language ID to use for SQL Edge. For example, 1036 is French. |
Azure Marketplace is an online applications and services marketplace where you c
- **Binds** and **Mounts**
- If you need to deploy more than one SQL Edge module, ensure that you update the mounts option to create a new source and target pair for the persistent volume. For more information on mounts and volume, refer [Use volumes](https://docs.docker.com/storage/volumes/) on Docker documentation.
+   If you need to deploy more than one SQL Edge module, ensure that you update the mounts option to create a new source and target pair for the persistent volume. For more information on mounts and volumes, see [Use volumes](https://docs.docker.com/storage/volumes/) in the Docker documentation.
```json {
Azure Marketplace is an online applications and services marketplace where you c
``` > [!IMPORTANT]
- > Do not change the `PlanId` environment variable defined in the create config setting. If this value is changed, the Azure SQL Edge container will fail to start.
+ > Don't change the `PlanId` environment variable defined in the create config setting. If this value is changed, the Azure SQL Edge container will fail to start.
> [!WARNING] > If you reinstall the module, remember to remove any existing bindings first, otherwise your environment variables will not be updated. 1. On the **Update IoT Edge Module** pane, select **Update**.
-1. On the **Set modules on device** page select **Next: Routes >** if you need to define routes for your deployment. Otherwise select **Review + Create**. For more information on configuring routes, see [Deploy modules and establish routes in IoT Edge](../iot-edge/module-composition.md).
+1. On the **Set modules on device** page, select **Next: Routes >** if you need to define routes for your deployment. Otherwise select **Review + Create**. For more information on configuring routes, see [Deploy modules and establish routes in IoT Edge](../iot-edge/module-composition.md).
1. On the **Set modules on device** page, select **Create**. ## Connect to Azure SQL Edge
Azure Marketplace is an online applications and services marketplace where you c
The following steps use the Azure SQL Edge command-line tool, **sqlcmd**, inside the container to connect to Azure SQL Edge. > [!NOTE]
-> SQL Server command line tools, including **sqlcmd**, are not available inside the ARM64 version of Azure SQL Edge containers.
+> SQL Server command line tools, including **sqlcmd**, aren't available inside the ARM64 version of Azure SQL Edge containers.
1. Use the `docker exec -it` command to start an interactive bash shell inside your running container. In the following example, `AzureSQLEdge` is name specified by the `Name` parameter of your IoT Edge Module.
The following steps create a new database named `TestDB`.
```sql CREATE DATABASE TestDB;
- Go
+ GO
``` 1. On the next line, write a query to return the name of all of the databases on your server: ```sql
- SELECT Name from sys.databases;
- Go
+    SELECT name FROM sys.databases;
+ GO
``` ### Insert data
Next, create a new table called `Inventory`, and insert two new rows.
1. Insert data into the new table: ```sql
- INSERT INTO Inventory VALUES (1, 'banana', 150); INSERT INTO Inventory VALUES (2, 'orange', 154);
+ INSERT INTO Inventory
+ VALUES (1, 'banana', 150);
+
+ INSERT INTO Inventory
+ VALUES (2, 'orange', 154);
``` 1. Type `GO` to execute the previous commands:
Now, run a query to return data from the `Inventory` table.
## Connect from outside the container
-You can connect and run SQL queries against your Azure SQL Edge instance from any external Linux, Windows, or macOS tool that supports SQL connections. For more information on connecting to a SQL Edge container from outside, refer [Connect and Query Azure SQL Edge](./connect.md).
+You can connect and run SQL queries against your Azure SQL Edge instance from any external Linux, Windows, or macOS tool that supports SQL connections. For more information on connecting to a SQL Edge container from outside, see [Connect and Query Azure SQL Edge](connect.md).
In this quickstart, you deployed a SQL Edge Module on an IoT Edge device.
azure-sql-edge Disconnected Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/disconnected-deployment.md
Title: Deploy Azure SQL Edge with Docker - Azure SQL Edge
description: Learn about deploying Azure SQL Edge with Docker Previously updated : 01/13/2023 Last updated : 09/14/2023 keywords: - SQL Edge - container
- - docker
+ - Docker
# Deploy Azure SQL Edge with Docker
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ In this quickstart, you use Docker to pull and run the Azure SQL Edge container image. Then connect with **sqlcmd** to create your first database and run queries. This image consists of SQL Edge based on Ubuntu 18.04. It can be used with the Docker Engine 1.8+ on Linux.
Azure SQL Edge containers aren't supported on the following platforms for produc
| Parameter | Description | | | | | **-e "ACCEPT_EULA=Y"** | Set the **ACCEPT_EULA** variable to any value to confirm your acceptance of the [End-User Licensing Agreement](https://go.microsoft.com/fwlink/?linkid=2139274). Required setting for the SQL Edge image. |
- | **-e "MSSQL_SA_PASSWORD=yourStrong(!)Password"** | Specify your own strong password that is at least 8 characters and meets the [Azure SQL Edge password requirements](/sql/relational-databases/security/password-policy). Required setting for the SQL Edge image. |
+ | **-e "MSSQL_SA_PASSWORD=yourStrong(!)Password"** | Specify your own strong password that is at least eight characters and meets the [Azure SQL Edge password requirements](/sql/relational-databases/security/password-policy). Required setting for the SQL Edge image. |
| **-p 1433:1433** | Map a TCP port on the host environment (first value) with a TCP port in the container (second value). In this example, SQL Edge is listening on TCP 1433 in the container and this is exposed to the port, 1433, on the host. | | **--name azuresqledge** | Specify a custom name for the container rather than a randomly generated one. If you run more than one container, you can't reuse this same name. | | **-d** | Run the container in the background (daemon) |
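Assembled from the parameters in the preceding table, a complete run command might look like the following sketch (the password is a placeholder):

```bash
# Minimal sketch combining the parameters described in the table above.
sudo docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=yourStrong(!)Password" \
   -p 1433:1433 --name azuresqledge -d mcr.microsoft.com/azure-sql-edge
```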
Azure SQL Edge containers aren't supported on the following platforms for produc
1. To view your Docker containers, use the `docker ps` command. ```bash
- sudo docker ps -a
+ sudo docker ps -a
``` 1. If the **STATUS** column shows a status of **Up**, then SQL Edge is running in the container and listening on the port specified in the **PORTS** column. If the **STATUS** column for your SQL Edge container shows **Exited**, see the Troubleshooting section of Azure SQL Edge documentation.
- The `-h` (host name) parameter is also useful, but it isn't used in this tutorial for simplicity. This changes the internal name of the container to a custom value. This is the name you'll see returned in the following Transact-SQL query:
+ The `-h` (host name) parameter is also useful, but it isn't used in this tutorial for simplicity. This changes the internal name of the container to a custom value. This is the name that is returned in the following Transact-SQL query:
```sql SELECT @@SERVERNAME, SERVERPROPERTY('ComputerNamePhysicalNetBIOS'), SERVERPROPERTY('MachineName'),
- SERVERPROPERTY('ServerName')
+ SERVERPROPERTY('ServerName');
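+    -- All four names reflect the container's host name; set -h and --name to the same value to make them match the container name.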
``` Setting `-h` and `--name` to the same value is a good way to easily identify the target container.
Azure SQL Edge containers aren't supported on the following platforms for produc
## Change the SA password
-The **SA** account is a system administrator on the Azure SQL Edge instance that gets created during setup. After creating your SQL Edge container, the `MSSQL_SA_PASSWORD` environment variable you specified is discoverable by running `echo $MSSQL_SA_PASSWORD` in the container. For security purposes, change your SA password.
+The **SA** account is a system administrator on the Azure SQL Edge instance that gets created during setup. After you create your SQL Edge container, the `MSSQL_SA_PASSWORD` environment variable you specified is discoverable by running `echo $MSSQL_SA_PASSWORD` in the container. For security purposes, change your SA password.
1. Choose a strong password to use for the SA user.
The **SA** account is a system administrator on the Azure SQL Edge instance that
The following steps use the Azure SQL Edge command-line tool, **sqlcmd**, inside the container to connect to SQL Edge.
-> [!NOTE]
-> **sqlcmd** is not available inside the ARM64 version of SQL Edge containers.
- 1. Use the `docker exec -it` command to start an interactive bash shell inside your running container. In the following example, `azuresqledge` is the name specified by the `--name` parameter when you created the container. ```bash sudo docker exec -it azuresqledge "bash" ```
-1. Once inside the container, connect locally with sqlcmd. Sqlcmd isn't in the path by default, so you have to specify the full path.
+1. Once inside the container, connect locally with **sqlcmd**. **sqlcmd** isn't in the path by default, so you have to specify the full path.
```bash /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "<YourNewStrong@Passw0rd>"
The following steps create a new database named `TestDB`.
1. From the **sqlcmd** command prompt, paste the following Transact-SQL command to create a test database: ```sql
- CREATE DATABASE TestDB
- Go
+ CREATE DATABASE TestDB;
+ GO
``` 1. On the next line, write a query to return the name of all of the databases on your server: ```sql
- SELECT Name from sys.Databases
- Go
+    SELECT name FROM sys.databases;
+ GO
``` ### Insert data
Next, create a new table, `Inventory`, and insert two new rows.
1. From the **sqlcmd** command prompt, switch context to the new `TestDB` database: ```sql
- USE TestDB
+ USE TestDB;
``` 1. Create new table named `Inventory`: ```sql
- CREATE TABLE Inventory (id INT, name NVARCHAR(50), quantity INT);
+ CREATE TABLE Inventory (
+ id INT,
+ name NVARCHAR(50),
+ quantity INT
+ );
``` 1. Insert data into the new table: ```sql
- INSERT INTO Inventory VALUES (1, 'banana', 150); INSERT INTO Inventory VALUES (2, 'orange', 154);
+ INSERT INTO Inventory
+ VALUES (1, 'banana', 150);
+
+ INSERT INTO Inventory
+ VALUES (2, 'orange', 154);
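+    -- Nothing has executed yet; the batch runs when you type GO in the next step.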
``` 1. Type `GO` to execute the previous commands:
Now, run a query to return data from the `Inventory` table.
## Connect from outside the container
-You can also connect to the SQL Edge instance on your Docker machine from any external Linux, Windows, or macOS tool that supports SQL connections. For more information on connecting to a SQL Edge container from outside, refer [Connect and Query Azure SQL Edge](connect.md).
+You can also connect to the SQL Edge instance on your Docker machine from any external Linux, Windows, or macOS tool that supports SQL connections. For more information on connecting to a SQL Edge container from outside, see [Connect and Query Azure SQL Edge](connect.md).
## Remove your container
azure-sql-edge Drop External Stream Transact Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/drop-external-stream-transact-sql.md
Title: DROP EXTERNAL STREAM (Transact-SQL) - Azure SQL Edge
description: Learn about the DROP EXTERNAL STREAM statement in Azure SQL Edge - Previously updated : 05/19/2020 Last updated : 09/14/2023 - - # DROP EXTERNAL STREAM (Transact-SQL)
-Drops a streaming job.
+Drops a streaming job.
## Syntax
-```sql
-DROP EXTERNAL STREAM {external_stream_name}
+```syntaxsql
+DROP EXTERNAL STREAM { external_stream_name }
``` ## See also -- [CREATE EXTERNAL STREAM (Transact-SQL)](create-external-stream-transact-sql.md)
+- [CREATE EXTERNAL STREAM (Transact-SQL)](create-external-stream-transact-sql.md)
azure-sql-edge Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/features.md
Title: Supported features of Azure SQL Edge
description: Learn about details of features supported by Azure SQL Edge. Previously updated : 06/29/2023 Last updated : 09/14/2023 keywords:
keywords:
# Supported features of Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Azure SQL Edge is built on the latest version of the SQL Database Engine. It supports a subset of the features supported in SQL Server 2022 on Linux, in addition to some features that are currently not supported or available in SQL Server 2022 on Linux (or in SQL Server on Windows). For a complete list of the features supported in SQL Server on Linux, see [Editions and supported features of SQL Server 2022 on Linux](/sql/linux/sql-server-linux-editions-and-components-2022). For editions and supported features of SQL Server on Windows, see [Editions and supported features of SQL Server 2022 (16.x)](/sql/sql-server/editions-and-components-of-sql-server-2022).
The recommended and supported file system for Azure SQL Edge is EXT4 and XFS. If
## Hardware support
-Azure SQL Edge requires a 64-bit processor (either x64 or ARM64), with a minimum of 1 CPU and 1 GB of RAM on the host. While the startup memory footprint of Azure SQL Edge is close to 450 MB, the additional memory is needed for other IoT Edge modules or processes running on the edge device. The actual memory and CPU requirements for Azure SQL Edge will vary based on the complexity of the workload and volume of data being processed. When choosing hardware for your solution, Microsoft recommends that you run extensive performance tests to ensure that the required performance characteristics for your solution are met.
+Azure SQL Edge requires an x86 64-bit processor, with a minimum of 1 CPU core and 1 GB of RAM on the host. While the startup memory footprint of Azure SQL Edge is close to 450 MB, additional memory is needed for other IoT Edge modules or processes running on the edge device. The actual memory and CPU requirements for Azure SQL Edge vary based on the complexity of the workload and the volume of data being processed. When you choose hardware for your solution, Microsoft recommends that you run extensive performance tests to ensure that the required performance characteristics for your solution are met.
## Azure SQL Edge components
-Azure SQL Edge only supports the database engine. It doesn't include support for other components available with SQL Server 2022 on Windows or with SQL Server 2022 on Linux. Specifically, Azure SQL Edge doesn't support SQL Server components like Analysis Services, Reporting Services, Integration Services, Master Data Services, Machine Learning Services (In-Database), and Machine Learning Server (standalone).
+Azure SQL Edge only supports the Database Engine. It doesn't include support for other components available with SQL Server 2022 on Windows or with SQL Server 2022 on Linux. Specifically, Azure SQL Edge doesn't support SQL Server components like Analysis Services, Reporting Services, Integration Services, Master Data Services, Machine Learning Services (In-Database), and Machine Learning Server (standalone).
## Supported features In addition to supporting a subset of features of SQL Server on Linux, Azure SQL Edge includes support for the following new features: - SQL streaming, which is based on the same engine that powers Azure Stream Analytics, provides real-time data streaming capabilities in Azure SQL Edge.-- The T-SQL function call `Date_Bucket` for Time-Series data analytics.-- Machine learning capabilities through the ONNX runtime, included with the SQL engine.
+- The T-SQL `DATE_BUCKET` function for time series data analytics.
+- Machine learning capabilities through the ONNX runtime, included with the SQL Database Engine.
## Unsupported features
azure-sql-edge High Availability Sql Edge Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/high-availability-sql-edge-containers.md
Title: High availability for Azure SQL Edge containers - Azure SQL Edge
description: Learn about high availability for Azure SQL Edge containers - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - containers - high availability- - # High availability for Azure SQL Edge containers
-Create and manage your Azure SQL Edge instances natively in Kubernetes. Deploy Azure SQL Edge to docker containers managed by [Kubernetes](https://kubernetes.io/). In Kubernetes, a container with an Azure SQL Edge instance can automatically recover in case a cluster node fails. You can configure the SQL Edge container image with a Kubernetes persistent volume claim (PVC). Kubernetes monitors the Azure SQL Edge process in the container. If the process, pod, container, or node fail, Kubernetes automatically bootstraps another instance and reconnects to the storage.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+Create and manage your Azure SQL Edge instances natively in Kubernetes. Deploy Azure SQL Edge to containers managed by [Kubernetes](https://kubernetes.io/). In Kubernetes, a container with an Azure SQL Edge instance can automatically recover in case a cluster node fails. You can configure the SQL Edge container image with a Kubernetes persistent volume claim (PVC). Kubernetes monitors the Azure SQL Edge process in the container. If the process, pod, container, or node fails, Kubernetes automatically bootstraps another instance and reconnects to the storage.
## Azure SQL Edge containers on Kubernetes
-Kubernetes 1.6 and later has support for [*storage classes*](https://kubernetes.io/docs/concepts/storage/storage-classes/), [*persistent volume claims*](https://kubernetes.io/docs/concepts/storage/storage-classes/#persistentvolumeclaims).
+Kubernetes 1.6 and later versions have support for [*storage classes*](https://kubernetes.io/docs/concepts/storage/storage-classes/) and [*persistent volume claims*](https://kubernetes.io/docs/concepts/storage/storage-classes/#persistentvolumeclaims).
-In this configuration, Kubernetes plays the role of the container orchestrator.
+In this configuration, Kubernetes plays the role of the container orchestrator.
-![Diagram of Azure SQL Edge in a Kubernetes cluster](media/deploy-kubernetes/kubernetes-sql-edge.png)
In the preceding diagram, `azure-sql-edge` is a container in a [pod](https://kubernetes.io/docs/concepts/workloads/pods/pod/). Kubernetes orchestrates the resources in the cluster. A [replica set](https://kubernetes.io/docs/concepts/workloads/controllers/replicaset/) ensures that the pod is automatically recovered after a node failure. Applications connect to the service. In this case, the service represents a load balancer that hosts an IP address that stays the same after failure of the `azure-sql-edge`. In the following diagram, the `azure-sql-edge` container has failed. As the orchestrator, Kubernetes guarantees the correct count of healthy instances in the replica set, and starts a new container according to the configuration. The orchestrator starts a new pod on the same node, and `azure-sql-edge` reconnects to the same persistent storage. The service connects to the re-created `azure-sql-edge`.
-![Azure SQL Edge in a Kubernetes cluster after pod fail](media/deploy-kubernetes/kubernetes-sql-edge-after-pod-fail.png)
In the following diagram, the node hosting the `azure-sql-edge` container has failed. The orchestrator starts the new pod on a different node, and `azure-sql-edge` reconnects to the same persistent storage. The service connects to the re-created `azure-sql-edge`.
-![Azure SQL Edge in a Kubernetes cluster after node fail](media/deploy-kubernetes/kubernetes-sql-edge-after-node-fail.png).
To create a container in Kubernetes, see [Deploy an Azure SQL Edge container in Kubernetes](deploy-Kubernetes.md).
To deploy Azure SQL Edge containers in Azure Kubernetes Service (AKS), see the following articles:
- [Deploy an Azure SQL Edge container in Kubernetes](deploy-Kubernetes.md)
- [Machine Learning and Artificial Intelligence with ONNX in SQL Edge](onnx-overview.md)
- [Building an end to end IoT Solution with SQL Edge using IoT Edge](tutorial-deploy-azure-resources.md)
-- [Data Streaming in Azure SQL Edge](stream-data.md)
+- [Data Streaming in Azure SQL Edge](stream-data.md)
azure-sql-edge Imputing Missing Values https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/imputing-missing-values.md
Title: Filling time gaps and imputing missing values - Azure SQL Edge
description: Learn about filling time gaps and imputing missing values in Azure SQL Edge - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - timeseries-
+# Fill time gaps and impute missing values
-# Filling time gaps and imputing missing values
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
When dealing with time series data, it's often possible that the time series data has missing values for the attributes. It's also possible that, because of the nature of the data, or because of interruptions in data collection, there are time *gaps* in the dataset.
-For example, when collecting energy usage statistics for a smart device, whenever the device isn't operational there will be gaps in the usage statistics. Similarly, in a machine telemetry data collection scenario, it's possible that the different sensors are configured to emit data at different frequencies, resulting in missing values for the sensors. For example, if there are two sensors, voltage and pressure, configured at 100 Hz and 10-Hz frequency respectively, the voltage sensor will emit data every one-hundredth of a second, while the pressure sensor will only emit data every one-tenth of a second.
+For example, when collecting energy usage statistics for a smart device, whenever the device isn't operational there are gaps in the usage statistics. Similarly, in a machine telemetry data collection scenario, it's possible that the different sensors are configured to emit data at different frequencies, resulting in missing values for the sensors. For example, if there are two sensors, voltage and pressure, configured at 100 Hz and 10 Hz frequencies, respectively, the voltage sensor emits data every one-hundredth of a second, while the pressure sensor only emits data every one-tenth of a second.
The following table describes a machine telemetry dataset, which was collected at a one-second interval.
-```txt
+```output
timestamp               VoltageReading PressureReading
----------------------- -------------- ---------------
2020-09-07 06:14:41.000 164.990400     97.223600
...
```
There are two important characteristics of the preceding dataset.
Gap filling is a technique that helps create a contiguous, ordered set of timestamps to ease the analysis of time series data. In Azure SQL Edge, the easiest way to fill gaps in the time series dataset is to define a temporary table with the desired time distribution and then do a `LEFT OUTER JOIN` or a `RIGHT OUTER JOIN` operation on the dataset table.
-Taking the `MachineTelemetry` data represented above as an example, the following query can be used to generate contiguous, ordered set of timestamps for analysis.
+Taking the `MachineTelemetry` data represented previously as an example, the following query can be used to generate a contiguous, ordered set of timestamps for analysis.
-> [!NOTE]
-> The query below generates the missing rows, with the timestamp values and `null` values for the attributes.
+> [!NOTE]
+> The following query generates the missing rows, with the timestamp values and `null` values for the attributes.
```sql
-Create Table #SeriesGenerate(dt datetime Primary key Clustered)
+CREATE TABLE #SeriesGenerate (dt DATETIME PRIMARY KEY CLUSTERED)
GO
-Declare @startdate datetime = '2020-09-07 06:14:41.000', @endtime datetime = '2020-09-07 06:14:56.000'
-While (@startdate <= @endtime)
+DECLARE @startdate DATETIME = '2020-09-07 06:14:41.000',
+ @endtime DATETIME = '2020-09-07 06:14:56.000'
+
+WHILE (@startdate <= @endtime)
BEGIN
-Insert into #SeriesGenerate values (@startdate)
-set @startdate = DATEADD(SECOND, 1, @startdate)
+ INSERT INTO #SeriesGenerate
+ VALUES (@startdate)
+
+ SET @startdate = DATEADD(SECOND, 1, @startdate)
END
-Select a.dt as timestamp, b.VoltageReading, b.PressureReading
-From
-#SeriesGenerate a LEFT OUTER JOIN MachineTelemetry b
- on a.dt = b.[timestamp]
+SELECT a.dt AS [timestamp],
+ b.VoltageReading,
+ b.PressureReading
+FROM #SeriesGenerate a
+LEFT JOIN MachineTelemetry b
+ ON a.dt = b.[timestamp];
```

The above query produces the following output, containing all *one-second* timestamps in the specified range.
-Here is the Result Set
+Here's the result set:
-```txt
+```output
timestamp               VoltageReading PressureReading
----------------------- -------------- ---------------
2020-09-07 06:14:41.000 164.990400     97.223600
...
2020-09-07 06:14:56.000 159.183500     100.748200
```
-## Imputing missing values
+## Impute missing values
-The preceding query generated the missing timestamps for data analysis, however it did not replace any of the missing values (represented as null) for `voltage` and `pressure` readings. In Azure SQL Edge, a new syntax was added to the T-SQL `LAST_VALUE()` and `FIRST_VALUE()` functions, which provide mechanisms to impute missing values, based on the preceding or following values in the dataset.
+The preceding query generated the missing timestamps for data analysis; however, it didn't replace any of the missing values (represented as null) for `voltage` and `pressure` readings. In Azure SQL Edge, a new syntax was added to the T-SQL `LAST_VALUE()` and `FIRST_VALUE()` functions, which provide mechanisms to impute missing values, based on the preceding or following values in the dataset.
-The new syntax adds `IGNORE NULLS` and `RESPECT NULLS` clause to the `LAST_VALUE()` and `FIRST_VALUE()` functions. A following query on the `MachineTelemetry` dataset computes the missing values using the last_value function, where missing values are replaced with the last observed value in the dataset.
+The new syntax adds the `IGNORE NULLS` and `RESPECT NULLS` clauses to the `LAST_VALUE()` and `FIRST_VALUE()` functions. The following query on the `MachineTelemetry` dataset computes the missing values using the `LAST_VALUE()` function, where missing values are replaced with the last observed value in the dataset.
```sql
-Select
- timestamp,
- VoltageReading As OriginalVoltageValues,
- LAST_VALUE(VoltageReading) IGNORE NULLS OVER (ORDER BY timestamp) As ImputedUsingLastValue,
- PressureReading As OriginalPressureValues,
- LAST_VALUE(PressureReading) IGNORE NULLS OVER (ORDER BY timestamp) As ImputedUsingLastValue
-From
-MachineTelemetry
-order by timestamp
+SELECT timestamp,
+ VoltageReading AS OriginalVoltageValues,
+ LAST_VALUE(VoltageReading) IGNORE NULLS OVER (
+ ORDER BY timestamp
+ ) AS ImputedUsingLastValue,
+ PressureReading AS OriginalPressureValues,
+ LAST_VALUE(PressureReading) IGNORE NULLS OVER (
+ ORDER BY timestamp
+ ) AS ImputedUsingLastValue
+FROM MachineTelemetry
+ORDER BY timestamp;
```
-Here is the Result Set
-```txt
+Here's the result set:
+
+```output
timestamp               OrigVoltageVals ImputedVoltage OrigPressureVals ImputedPressure
----------------------- --------------- -------------- ---------------- ---------------
2020-09-07 06:14:41.000 164.990400      164.990400     97.223600        97.223600
...
2020-09-07 06:14:56.000 159.183500      159.183500     100.748200       100.748200
```
-The following query imputes the missing values using both the `LAST_VALUE()` and the `FIRST_VALUE` function. For, the output column `ImputedVoltage` the missing values are replaced by the last observed value, while for the output column `ImputedPressure` the missing values are replaced by the next observed value in the dataset.
+The following query imputes the missing values using both the `LAST_VALUE()` and `FIRST_VALUE()` functions. For the output column `ImputedVoltage`, the last observed value replaces the missing values, while for the output column `ImputedPressure`, the missing values are replaced by the next observed value in the dataset.
```sql
-Select
- dt as timestamp,
- VoltageReading As OrigVoltageVals,
- LAST_VALUE(VoltageReading) IGNORE NULLS OVER (ORDER BY dt) As ImputedVoltage,
- PressureReading As OrigPressureVals,
- First_VALUE(PressureReading) IGNORE NULLS OVER (ORDER BY dt ROWS
- BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) As ImputedPressure
-From
-(Select a.dt, b.VoltageReading,b.PressureReading from
- #SeriesGenerate a
- LEFT OUTER JOIN
- MachineTelemetry b
- on a.dt = b.[timestamp]) A
-order by timestamp
+SELECT dt AS [timestamp],
+ VoltageReading AS OrigVoltageVals,
+ LAST_VALUE(VoltageReading) IGNORE NULLS OVER (
+ ORDER BY dt
+ ) AS ImputedVoltage,
+ PressureReading AS OrigPressureVals,
+ FIRST_VALUE(PressureReading) IGNORE NULLS OVER (
+ ORDER BY dt ROWS BETWEEN CURRENT ROW
+ AND UNBOUNDED FOLLOWING
+ ) AS ImputedPressure
+FROM (
+ SELECT a.dt,
+ b.VoltageReading,
+ b.PressureReading
+ FROM #SeriesGenerate a
+ LEFT JOIN MachineTelemetry b
+ ON a.dt = b.[timestamp]
+ ) A
+ORDER BY timestamp;
```
-Here is the Result Set
-```txt
+Here's the result set:
+
+```output
timestamp               OrigVoltageVals ImputedVoltage OrigPressureVals ImputedPressure
----------------------- --------------- -------------- ---------------- ---------------
2020-09-07 06:14:41.000 164.990400      164.990400     97.223600        97.223600
...
2020-09-07 06:14:56.000 159.183500      159.183500     100.748200       100.748200
```
-> [!NOTE]
+> [!NOTE]
> The above query uses the `FIRST_VALUE()` function to replace missing values with the next observed value. The same result can be achieved by using the `LAST_VALUE()` function with an `ORDER BY <ordering_column> DESC` clause.
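As a minimal sketch of that equivalence, reusing the `#SeriesGenerate` temporary table and `MachineTelemetry` table from the earlier queries, `ImputedPressure` could instead be computed with `LAST_VALUE()` over a descending order:

```sql
-- A sketch of the note's equivalence; reuses #SeriesGenerate and MachineTelemetry.
SELECT dt AS [timestamp],
    PressureReading AS OrigPressureVals,
    LAST_VALUE(PressureReading) IGNORE NULLS OVER (
        ORDER BY dt DESC
        ) AS ImputedPressure
FROM (
    SELECT a.dt,
        b.PressureReading
    FROM #SeriesGenerate a
    LEFT JOIN MachineTelemetry b
        ON a.dt = b.[timestamp]
    ) AS A
ORDER BY [timestamp];
```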
## Next steps
- [FIRST_VALUE (Transact-SQL)](/sql/t-sql/functions/first-value-transact-sql?toc=%2fazure%2fazure-sql-edge%2ftoc.json)
- [LAST_VALUE (Transact-SQL)](/sql/t-sql/functions/last-value-transact-sql?toc=%2fazure%2fazure-sql-edge%2ftoc.json)
- [DATE_BUCKET (Transact-SQL)](date-bucket-tsql.md)
-- [Aggregate Functions (Transact-SQL)](/sql/t-sql/functions/aggregate-functions-transact-sql)
+- [Aggregate Functions (Transact-SQL)](/sql/t-sql/functions/aggregate-functions-transact-sql)
azure-sql-edge Onnx Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/onnx-overview.md
description: Machine learning in Azure SQL Edge supports models in the Open Neur
Previously updated : 06/21/2022 Last updated : 09/14/2023 keywords: deploy SQL Edge

# Machine learning and AI with ONNX in SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Machine learning in Azure SQL Edge supports models in the [Open Neural Network Exchange (ONNX)](https://onnx.ai/) format. ONNX is an open format you can use to interchange models between various [machine learning frameworks and tools](https://onnx.ai/supported-tools).

## Overview
-To infer machine learning models in Azure SQL Edge, you will first need to get a model. This can be a pre-trained model or a custom model trained with your framework of choice. Azure SQL Edge supports the ONNX format and you will need to convert the model to this format. There should be no impact on model accuracy, and once you have the ONNX model, you can deploy the model in Azure SQL Edge and use [native scoring with the PREDICT T-SQL function](/sql/advanced-analytics/sql-native-scoring/).
+To infer machine learning models in Azure SQL Edge, you first need to get a model. This can be a pretrained model or a custom model trained with your framework of choice. Azure SQL Edge supports the ONNX format and you need to convert the model to this format. There should be no effect on model accuracy, and once you have the ONNX model, you can deploy the model in Azure SQL Edge and use [native scoring with the PREDICT T-SQL function](/sql/advanced-analytics/sql-native-scoring/).
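As a rough, hedged sketch of what native scoring looks like: the model storage table `dbo.models`, the model name, the input table, and the output column below are all assumptions for illustration, not part of the source article:

```sql
-- A sketch only; table, model, and column names are assumed.
DECLARE @model VARBINARY(MAX) = (
    SELECT model_data
    FROM dbo.models
    WHERE model_name = 'voltage_regressor'
);

SELECT d.*,
    p.predicted_voltage
FROM PREDICT(MODEL = @model, DATA = dbo.MachineTelemetry AS d, RUNTIME = ONNX)
WITH (predicted_voltage FLOAT) AS p;
```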
## Get ONNX models
To obtain a model in the ONNX format:
- **Model Building Services**: Services such as the [automated Machine Learning feature in Azure Machine Learning](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features/auto-ml-classification-bank-marketing-all-features.ipynb) and [Azure Custom Vision Service](../ai-services/custom-vision-service/getting-started-build-a-classifier.md) support directly exporting the trained model in the ONNX format.
-- [**Convert and/or export existing models**](https://github.com/onnx/tutorials#converting-to-onnx-format): Several training frameworks (for example, [PyTorch](https://pytorch.org/docs/stable/onnx.html), Chainer, and Caffe2) support native export functionality to ONNX, which allows you to save your trained model to a specific version of the ONNX format. For frameworks that do not support native export, there are standalone ONNX Converter installable packages that enable you to convert models trained from different machine learning frameworks to the ONNX format.
+- [**Convert and/or export existing models**](https://github.com/onnx/tutorials#converting-to-onnx-format): Several training frameworks (for example, [PyTorch](https://pytorch.org/docs/stable/onnx.html), Chainer, and Caffe2) support native export functionality to ONNX, which allows you to save your trained model to a specific version of the ONNX format. For frameworks that don't support native export, there are standalone ONNX Converter installable packages that enable you to convert models trained from different machine learning frameworks to the ONNX format.
**Supported frameworks**

* [PyTorch](http://pytorch.org/docs/master/onnx.html)
To obtain a model in the ONNX format:
* [Keras](https://github.com/onnx/keras-onnx)
* [Scikit-learn](https://github.com/onnx/sklearn-onnx)
* [CoreML](https://github.com/onnx/onnxmltools)
-
+ For the full list of supported frameworks and examples, see [Converting to ONNX format](https://github.com/onnx/tutorials#converting-to-onnx-format).

## Limitations
Currently, not all ONNX models are supported by Azure SQL Edge. The support is l
- [int and bigint](/sql/t-sql/data-types/int-bigint-smallint-and-tinyint-transact-sql)
- [real and float](/sql/t-sql/data-types/float-and-real-transact-sql)
-
+ Other numeric types can be converted to supported types by using [CAST and CONVERT](/sql/t-sql/functions/cast-and-convert-transact-sql).
-The model inputs should be structured so that each input to the model corresponds to a single column in a table. For example, if you are using a pandas dataframe to train a model, then each input should be a separate column to the model.
+The model inputs should be structured so that each input to the model corresponds to a single column in a table. For example, if you're using a pandas dataframe to train a model, then each input should be a separate column to the model.
## Next steps

- [Deploy SQL Edge through Azure portal](deploy-portal.md)
-- [Deploy an ONNX model on Azure SQL Edge ](deploy-onnx.md)
+- [Deploy an ONNX model on Azure SQL Edge](deploy-onnx.md)
azure-sql-edge Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/overview.md
Title: What is Azure SQL Edge?
description: Learn about Azure SQL Edge - Previously updated : 05/19/2020 Last updated : 09/14/2023 keywords: - introduction to SQL Edge - what is SQL Edge - SQL Edge overview

# What is Azure SQL Edge?
-Azure SQL Edge is an optimized relational database engine geared for IoT and IoT Edge deployments. It provides capabilities to create a high-performance data storage and processing layer for IoT applications and solutions. Azure SQL Edge provides capabilities to stream, process, and analyze relational and non-relational such as JSON, graph and time-series data, which makes it the right choice for a variety of modern IoT applications.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+Azure SQL Edge is an optimized relational database engine geared for IoT and IoT Edge deployments. It provides capabilities to create a high-performance data storage and processing layer for IoT applications and solutions. Azure SQL Edge provides capabilities to stream, process, and analyze relational and nonrelational data such as JSON, graph and time-series data, which makes it the right choice for various modern IoT applications.
Azure SQL Edge is built on the latest versions of the [SQL Server Database Engine](/sql/sql-server/sql-server-technical-documentation), which provides industry-leading performance, security, and query processing capabilities. Since Azure SQL Edge is built on the same engine as [SQL Server](/sql/sql-server/sql-server-technical-documentation) and [Azure SQL](/azure/azure-sql/index), it provides the same Transact-SQL (T-SQL) programming surface area that makes development of applications or solutions easier and faster, and makes application portability between IoT Edge devices, data centers, and the cloud straightforward.

What is Azure SQL Edge video on Channel 9:
+<br />
+ > [!VIDEO https://learn.microsoft.com/shows/Data-Exposed/What-is-Azure-SQL-Edge/player]
-## Deployment Models
+## Deployment models
Azure SQL Edge supports two deployment modes.

- Connected deployment through Azure IoT Edge: Azure SQL Edge is available on the Azure Marketplace and can be deployed as a module for [Azure IoT Edge](../iot-edge/about-iot-edge.md). For more information, see [Deploy Azure SQL Edge](deploy-portal.md).
-![SQL Edge overview diagram](media/overview/overview.png)
-- Disconnected deployment: Azure SQL Edge container images can be pulled from docker hub and deployed either as a standalone docker container or on a kubernetes cluster. For more information, see [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
+- Disconnected deployment: Azure SQL Edge container images can be pulled from Docker hub and deployed either as a standalone container or on a Kubernetes cluster. For more information, see [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
## Editions of SQL Edge

SQL Edge is available with two different editions or software plans. These editions have identical feature sets, and only differ in terms of their usage rights and the amount of CPU and memory they support.
- |**Plan** |**Description** |
- |||
- |Azure SQL Edge Developer | Development only sku, each SQL Edge container is limited to upto 4 cores and 32 GB Memory |
- |Azure SQL Edge | Production sku, each SQL Edge container is limited to upto 8 cores and 64 GB Memory. |
+| Plan | Description |
+| --- | --- |
+| Azure SQL Edge Developer | Development only SKU. Each SQL Edge container is limited to a maximum of 4 CPU cores and 32 GB of memory. |
+| Azure SQL Edge | Production SKU. Each SQL Edge container is limited to a maximum of 8 CPU cores and 64 GB of memory. |
-## Pricing and Availability
+## Price and availability
-Azure SQL Edge is now generally available. For more information on the pricing and availability in specific regions, see [Azure SQL Edge](https://azure.microsoft.com/services/sql-edge/).
+Azure SQL Edge is generally available. For more information on the pricing and availability in specific regions, see [Azure SQL Edge](https://azure.microsoft.com/services/sql-edge/).
-> [!IMPORTANT]
+> [!IMPORTANT]
> To understand the feature differences between Azure SQL Edge and SQL Server, as well as the differences among different Azure SQL Edge options, see [Supported features of Azure SQL Edge](features.md).
-## Streaming Capabilities
+## Streaming capabilities
Azure SQL Edge provides built-in streaming capabilities for real-time analytics and complex event processing. The streaming capability is built using the same constructs as [Azure Stream Analytics](../stream-analytics/stream-analytics-introduction.md) and similar capabilities as [Azure Stream Analytics on IoT Edge](../stream-analytics/stream-analytics-edge.md).
-The streaming engine for Azure SQL Edge is designed for low-latency, resiliency, efficient use of bandwidth and compliance.
+The streaming engine for Azure SQL Edge is designed for low-latency, resiliency, efficient use of bandwidth and compliance.
-For more information on data streaming in SQL Edge, refer [Data Streaming](stream-data.md)
+For more information on data streaming in SQL Edge, see [Data Streaming](stream-data.md).
-## Machine Learning and Artificial Intelligence Capabilities
+## Machine learning and artificial intelligence capabilities
Azure SQL Edge provides built-in machine learning and analytics capabilities by integrating the open format ONNX (Open Neural Network Exchange) runtime, which allows exchange of deep learning and neural network models between different frameworks. For more information on ONNX, see [here](https://onnx.ai/). ONNX runtime provides the flexibility to develop models in a language or tools of your choice, which can then be converted to the ONNX format for execution within SQL Edge. For more information, see [Machine Learning and Artificial Intelligence with ONNX in SQL Edge](onnx-overview.md).
-## Working with Azure SQL Edge
+## Work with Azure SQL Edge
-Azure SQL Edge makes developing and maintaining applications easier and more productive. Users can use all the familiar tools and skills to build great apps and solutions for their IoT Edge needs. User can develop in SQL Edge using tools like the following:
+Azure SQL Edge makes developing and maintaining applications easier and more productive. Users can use all the familiar tools and skills to build great apps and solutions for their IoT Edge needs. You can develop in SQL Edge using the following tools:
- [The Azure portal](https://portal.azure.com/) - A web-based application for managing all Azure services.
- [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms/) - A free, downloadable client application for managing any SQL infrastructure, from SQL Server to SQL Database.
Azure SQL Edge makes developing and maintaining applications easier and more pro
- [Azure Data Studio](/sql/azure-data-studio/what-is/) - A free, downloadable, cross-platform database tool for data professionals using the Microsoft family of on-premises and cloud data platforms on Windows, macOS, and Linux.
- [Visual Studio Code](https://code.visualstudio.com/docs) - A free, downloadable, open-source code editor for Windows, macOS, and Linux. It supports extensions, including the [mssql extension](https://aka.ms/mssql-marketplace) for querying Microsoft SQL Server, Azure SQL Database, and Azure Synapse Analytics.

## Next steps

- [Deploy SQL Edge through Azure portal](deploy-portal.md)
azure-sql-edge Performance Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/performance-best-practices.md
Title: Performance best practices and configuration guidelines - Azure SQL Edge
description: Learn about performance best practices and configuration guidelines in Azure SQL Edge - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - data retention

# Performance best practices and configuration guidelines
-Azure SQL Edge offers several features and capabilities that can be used to improve the performance of your SQL Edge deployment. This article provides some best practices and recommendations to maximize performance.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-## Best practices
+Azure SQL Edge offers several features and capabilities that can be used to improve the performance of your SQL Edge deployment. This article provides some best practices and recommendations to maximize performance.
-### Configure multiple tempdb data files
+## Best practices
-Azure SQL Edge by default creates only one tempdb data file as part of the container initialization. We recommend that you consider creating multiple tempdb data files post deployment. For more information, see the guidance in the article, [Recommendations to reduce allocation contention in SQL Server tempdb database](https://support.microsoft.com/help/2154845/recommendations-to-reduce-allocation-contention-in-sql-server-tempdb-d).
+### Configure multiple `tempdb` data files
+
+Azure SQL Edge by default creates only one `tempdb` data file as part of the container initialization. We recommend that you consider creating multiple `tempdb` data files post deployment. For more information, see the guidance in the article, [Recommendations to reduce allocation contention in SQL Server tempdb database](https://support.microsoft.com/help/2154845/recommendations-to-reduce-allocation-contention-in-sql-server-tempdb-d).
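As a hedged sketch, extra `tempdb` files can be added with `ALTER DATABASE`; the file names, paths, and sizes below are illustrative assumptions, not recommendations:

```sql
-- A sketch; file names, paths, and sizes are assumptions.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = '/var/opt/mssql/data/tempdb2.ndf', SIZE = 8MB, FILEGROWTH = 64MB);

ALTER DATABASE tempdb
ADD FILE (NAME = tempdev3, FILENAME = '/var/opt/mssql/data/tempdb3.ndf', SIZE = 8MB, FILEGROWTH = 64MB);
```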
### Use clustered columnstore indexes where possible

IoT and edge devices tend to generate a high volume of data that is typically aggregated over some time window for analysis. Individual data rows are rarely used for any analysis. Columnstore indexes are ideal for storing and querying such large datasets. This index uses column-based data storage and query processing to achieve gains up to 10 times the query performance over traditional row-oriented storage. You can also achieve gains up to 10 times the data compression over the uncompressed data size. For more information, see [Columnstore Indexes](/sql/relational-databases/indexes/columnstore-indexes-overview).
-Additionally, other Azure SQL Edge features like data streaming and Data retention benefit from the columnstore optimizations around data insertion and data removal.
+Additionally, other Azure SQL Edge features like data streaming and data retention benefit from the columnstore optimizations around data insertion and data removal.
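As a minimal sketch, a telemetry table can be created with an inline clustered columnstore index; the table and column names are assumptions for illustration:

```sql
-- A sketch; table and column names are assumed.
CREATE TABLE dbo.SensorReadings (
    [timestamp] DATETIME2 NOT NULL,
    SensorId INT NOT NULL,
    Reading FLOAT NOT NULL,
    INDEX CCI_SensorReadings CLUSTERED COLUMNSTORE
);
```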
### Simple recovery model

Since storage can be constrained on edge devices, all user databases in Azure SQL Edge use the simple recovery model by default. The simple recovery model automatically reclaims log space to keep space requirements small, essentially eliminating the need to manage the transaction log space. On edge devices with limited storage available, this can be helpful. For more information on the simple recovery model and other recovery models available, see [Recovery Models](/sql/relational-databases/backup-restore/recovery-models-sql-server).
-Operations like Log Shipping and Point-In-time-restores, that require transaction log backups are not supported by the simple recovery model.
+The simple recovery model doesn't support operations that require transaction log backups, such as log shipping and point-in-time restores.
-## Advanced configuration
+## Advanced configuration
-### Setting memory limits
+### Set memory limits
-Azure SQL Edge supports up to 64 GB of memory for the buffer pool, while additional memory may be required by satellite processes running within the SQL Edge container. On smaller edge devices with limited memory, it is advisable to limit the memory available to SQL Edge containers, such that the docker host and other edge processes or modules can function properly. The total memory available for SQL Edge can be controlled using the following mechanisms.
+Azure SQL Edge supports up to 64 GB of memory for the buffer pool, while additional memory may be required by satellite processes running within the SQL Edge container. On smaller edge devices with limited memory, it's advisable to limit the memory available to SQL Edge containers, such that the Docker host and other edge processes or modules can function properly. The total memory available for SQL Edge can be controlled using the following mechanisms.
- Limiting memory available to the SQL Edge containers: The total memory available to the SQL Edge container can be limited by using the container runtime configuration option `--memory`. For more information on limiting memory available to the SQL Edge container, see [Runtime options with Memory, CPUs, and GPUs](https://docs.docker.com/config/containers/resource_constraints/).
-- Limiting memory available to SQL process within the container: By default, the SQL process within the container uses only 80% of the physical RAM available. For majority of the deployments, the default configuration will be fine. However, there can be scenarios where additional memory might be required for the data streaming and the ONNX processes running inside the SQL Edge containers. In such scenarios, the memory available to the SQL process can be controlled using the `memory.memorylimitmb` setting in the mssql-conf file. For more information, see [Configure using mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file).
+- Limiting memory available to SQL process within the container: By default, the SQL process within the container uses only 80% of the physical RAM available. For most deployments, the default configuration is fine. However, there can be scenarios where additional memory might be required for the data streaming and the ONNX processes running inside the SQL Edge containers. In such scenarios, the memory available to the SQL process can be controlled using the `memory.memorylimitmb` setting in the `mssql.conf` file. For more information, see [Configure using mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file).
-When setting the memory limits, be careful to not set this value too low. If you do not set enough memory for the SQL process, it can severely impact SQL performance.
+When setting the memory limits, be careful to not set this value too low. If you don't set enough memory for the SQL process, it can severely affect performance.
### Delayed durability

Transactions in Azure SQL Edge can be either fully durable, the SQL Server default, or delayed durable (also known as lazy commit).
-Fully durable transaction commits are synchronous and report a commit as successful and return control to the client only after the log records for the transaction are written to disk. Delayed durable transaction commits are asynchronous and report a commit as successful before the log records for the transaction are written to disk. Writing the transaction log entries to disk is required for a transaction to be durable. Delayed durable transactions become durable when the transaction log entries are flushed to disk.
+Fully durable transaction commits are synchronous, and report a commit as successful and return control to the client only after the log records for the transaction are written to disk. Delayed durable transaction commits are asynchronous and report a commit as successful before the log records for the transaction are written to disk. Writing the transaction log entries to disk is required for a transaction to be durable. Delayed durable transactions become durable when the transaction log entries are flushed to disk.
In deployments where **some data loss** can be tolerated or on edge devices with slow storage, delayed durability can be used to optimize data ingestion and data retention-based cleanup. For more information, see [Control Transaction Durability](/sql/relational-databases/logs/control-transaction-durability).
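A minimal sketch, assuming a database named `EdgeTelemetryDb` and a table `dbo.SensorReadings` (both names are illustrative): enable delayed durability at the database level, then opt in per transaction:

```sql
-- A sketch; database and table names are assumed.
ALTER DATABASE EdgeTelemetryDb SET DELAYED_DURABILITY = ALLOWED;
GO

BEGIN TRANSACTION;

INSERT INTO dbo.SensorReadings ([timestamp], SensorId, Reading)
VALUES (SYSUTCDATETIME(), 1, 98.6);

-- The commit returns before the log records are flushed to disk.
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```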
+### Linux OS configurations
-### Linux OS configurations
-
-Consider using the following [Linux Operating System configuration](/sql/linux/sql-server-linux-performance-best-practices#linux-os-configuration) settings to experience the best performance for a SQL Installation.
+Consider using the following [Linux Operating System configuration](/sql/linux/sql-server-linux-performance-best-practices#linux-os-configuration) settings to experience the best performance for a SQL installation.
azure-sql-edge Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/release-notes.md
description: Release notes detailing what's new or what has changed in the Azure
Previously updated : 06/29/2023 Last updated : 09/14/2023 keywords: release notes SQL Edge

# Azure SQL Edge release notes
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ This article describes what's new and what has changed with every new build of Azure SQL Edge.

## Azure SQL Edge 2.0.0
azure-sql-edge Resources Partners Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/resources-partners-security.md
description: Providing details about external partners who are working with Azur
Previously updated : 10/09/2020 Last updated : 09/14/2023 keywords: security partners Azure SQL Edge

# Azure SQL Edge security partners
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ This article highlights Microsoft partner companies with security solutions that provide end-to-end security for your IoT Edge deployments using Azure SQL Edge.

## Security partners
-
-| Partner| Description | Links |
-|--|--|--|
-|![DH2i](media/resources/dh2i-logo.png)|DH2i takes an innovative new approach to networking connectivity by enabling organizations with its Software Defined Perimeter (SDP) Always-Secure and Always-On IT Infrastructure. DxOdyssey for IoT extends this to edge devices, allowing seamless access from the edge devices to the data center and cloud. This SDP module runs on any IoT device in a container on x64 and arm64 architecture. Once enabled, organizations can create secure, private application-level tunnels between devices and hubs without the requirement of a VPN or exposing public, open ports. This SDP module is purpose-built for IoT use cases where edge devices must communicate with any other devices, resources, applications, or clouds. Minimum hardware requirements: Linux x64 and arm64 OS, 1 GB of RAM, 100 Mb of storage| [Website](https://dh2i.com/) [Marketplace](https://portal.azure.com/#blade/Microsoft_Azure_Marketplace/MarketplaceOffersBlade/selectedMenuItemId/home) [Documentation](https://dh2i.com/dxodyssey-for-iot/) [Support](https://support.dh2i.com/)
+
+| Partner | Description | Links |
+| --- | --- | --- |
+|:::image type="icon" source="media/resources/dh2i-logo.png"::: |DH2i takes an innovative new approach to networking connectivity by enabling organizations with its Software Defined Perimeter (SDP) Always-Secure and Always-On IT Infrastructure. DxOdyssey for IoT extends this to edge devices, allowing seamless access from the edge devices to the data center and cloud. This SDP module runs on any IoT device in a container on x64 and ARM64 architecture. Once enabled, organizations can create secure, private application-level tunnels between devices and hubs without the requirement of a VPN or exposing public, open ports. This SDP module is purpose-built for IoT use cases where edge devices must communicate with any other devices, resources, applications, or clouds. Minimum hardware requirements: Linux x64 and ARM64 OS, 1 GB of RAM, 100 MB of storage| [Website](https://dh2i.com/) [Marketplace](https://portal.azure.com/#blade/Microsoft_Azure_Marketplace/MarketplaceOffersBlade/selectedMenuItemId/home) [Documentation](https://dh2i.com/dxodyssey-for-iot/) [Support](https://support.dh2i.com/)
## Next steps
azure-sql-edge Security Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/security-overview.md
Title: Secure Azure SQL Edge
description: Learn about security in Azure SQL Edge - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - security-
+# Secure Azure SQL Edge
-# Securing Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-With the increase in adoption of IoT and Edge computing across industries, there is a increase in the number of devices and the data generated from these devices. The increased volume of data and the number of device endpoints poses a significant challenge in terms of security of data and the devices.
+With the increase in adoption of IoT and Edge computing across industries, there's an increase in the number of devices and the data generated from these devices. The increased volume of data and the number of device endpoints pose a significant challenge in terms of security of data and the devices.
-Azure SQL Edge offers multiple features and capabilities that make it relatively easier to secure the IoT data within the SQL Server databases. Azure SQL Edge is built using the same SQL engine that powers Microsoft SQL Server and Azure SQL, sharing the same security capabilities, which makes it easier to extend the same security policies and practices from cloud to the edge.
+Azure SQL Edge offers multiple features and capabilities that make it relatively easier to secure the IoT data within the SQL Server databases. Azure SQL Edge is built using the same Database Engine that powers Microsoft SQL Server and Azure SQL, sharing the same security capabilities, which makes it easier to extend the same security policies and practices from cloud to the edge.
-Just like Microsoft SQL Server and Azure SQL, securing Azure SQL Edge deployments can be viewed as a series of steps involving four areas: the platform, authentication, objects (including data) and applications that access the system.
+Just like Microsoft SQL Server and Azure SQL, securing Azure SQL Edge deployments can be viewed as a series of steps involving four areas: the platform, authentication, objects (including data), and applications that access the system.
## Platform and system security
-The platform for Azure SQL Edge includes the physical docker host, the operating system on the host, and the networking systems connecting the physical device to applications and clients.
+The platform for Azure SQL Edge includes the physical Docker host, the operating system on the host, and the networking systems connecting the physical device to applications and clients.
-Implementing platform security starts with keeping unauthorized users off the network. Some of the best practices include, but is not limited to:
-- Implementing firewall rules to ensure organizational security policy. -- Ensure the operating system on the physical device has all the latest security updates applied.
+Implementing platform security starts with keeping unauthorized users off the network. Some of the best practices include, but aren't limited to:
+- Implementing firewall rules to ensure organizational security policy.
+- Ensure the operating system on the physical device has all the latest security updates applied.
- Specifying and restricting host ports that are used for Azure SQL Edge
-- Ensuring that proper access control is applied to all data volumes which host Azure SQL Edge data.
+- Ensuring that proper access control is applied to all data volumes that host Azure SQL Edge data.
For more information on Azure SQL Edge network protocols and TDS endpoints, see [Network Protocols and TDS Endpoints](/previous-versions/sql/sql-server-2008-r2/ms191220(v=sql.105)).
-## Authentication and authorization
+## Authentication and authorization
+
+### Authentication
-### Authentication
Authentication is the process of proving the user is who they claim to be. Azure SQL Edge currently only supports the `SQL Authentication` mechanism.

- *SQL Authentication*:
- SQL authentication refers to the authentication of a user when connecting to Azure SQL Edge using username and password. The SQL **sa** login password must be specified during SQL Edge deployment. After that, additional SQL logins and users can be created by the server admin, which enable users to connect using username and password.
+ SQL authentication refers to the authentication of a user when connecting to Azure SQL Edge using username and password. The SQL **sa** login password must be specified during SQL Edge deployment. After that, additional SQL logins and users can be created by the server admin, which enable users to connect using username and password.
- For more information on creating and managing logins and users in SQL Edge, refer [Create a Login](/sql/relational-databases/security/authentication-access/create-a-login) and [Create Database User](/sql/relational-databases/security/authentication-access/create-a-database-user).
+ For more information on creating and managing logins and users in SQL Edge, see [Create a Login](/sql/relational-databases/security/authentication-access/create-a-login) and [Create Database User](/sql/relational-databases/security/authentication-access/create-a-database-user).
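A minimal sketch of that flow; the login name, password placeholder, and database name below are assumptions for illustration:

```sql
-- A sketch; login, password placeholder, and database name are assumed.
CREATE LOGIN telemetry_writer WITH PASSWORD = '<strong-password-here>';
GO

USE EdgeTelemetryDb;
GO

CREATE USER telemetry_writer FOR LOGIN telemetry_writer;
```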
-### Authorization
+### Authorization
Authorization refers to the permissions assigned to a user within a database in Azure SQL Edge, and determines what the user is allowed to do. Permissions are controlled by adding user accounts to [database roles](/sql/relational-databases/security/authentication-access/database-level-roles) and assigning database-level permissions to those roles or by granting the user certain [object-level permissions](/sql/relational-databases/security/permissions-database-engine). For more information, see [Logins and users](/azure/azure-sql/database/logins-create-manage).
-As a best practice, create custom roles when needed. Add users to the role with the least privileges required to do their job function. Do not assign permissions directly to users. The server admin account is a member of the built-in db_owner role, which has extensive permissions and should only be granted to few users with administrative duties. For applications, use the [EXECUTE AS](/sql/t-sql/statements/execute-as-clause-transact-sql) to specify the execution context of the called module or use [Application Roles](/sql/relational-databases/security/authentication-access/application-roles) with limited permissions. This practice ensures that the application that connects to the database has the least privileges needed by the application. Following these best practices also fosters separation of duties.
+As a best practice, create custom roles when needed. Add users to the role with the least privileges required to do their job function. Don't assign permissions directly to users. The server admin account is a member of the built-in db_owner role, which has extensive permissions and should only be granted to few users with administrative duties. For applications, use the [EXECUTE AS](/sql/t-sql/statements/execute-as-clause-transact-sql) to specify the execution context of the called module or use [Application Roles](/sql/relational-databases/security/authentication-access/application-roles) with limited permissions. This practice ensures that the application that connects to the database has the least privileges needed by the application. Following these best practices also fosters separation of duties.
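A hedged sketch of that least-privilege pattern; the role, table, and user names are assumptions:

```sql
-- A sketch; role, table, and user names are assumed.
CREATE ROLE telemetry_ingest;

-- Grant only what the job function requires.
GRANT SELECT, INSERT ON dbo.SensorReadings TO telemetry_ingest;

ALTER ROLE telemetry_ingest ADD MEMBER telemetry_writer;
```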
## Database object security
-Principals are the individuals, groups, and processes granted access to SQL Edge. "Securables" are the server, database, and objects the database contains. Each has a set of permissions that can be configured to help reduce the surface area. The following table contains information about principals and securables.
+Principals are the individuals, groups, and processes that are granted access to SQL Edge. "Securables" are the server, database, and objects the database contains. Each has a set of permissions that can be configured to help reduce the surface area. The following table contains information about principals and securables.
-|For information about|See|
-|||
-|Server and database users, roles, and processes|[Principals Database Engine](/sql/relational-databases/security/authentication-access/principals-database-engine)|
-|Server and database objects security|[Securables](/sql/relational-databases/security/securables)|
-| &nbsp; | &nbsp; |
+| For information about | See |
+| --- | --- |
+| Server and database users, roles, and processes | [Principals Database Engine](/sql/relational-databases/security/authentication-access/principals-database-engine) |
+| Server and database objects security | [Securables](/sql/relational-databases/security/securables) |
-### Encryption and certificates
-
-Encryption does not solve access control problems. However, it enhances security by limiting data loss even in the rare occurrence that access controls are bypassed. For example, if the database host computer is misconfigured and a malicious user obtains sensitive data, such as credit card numbers, that stolen information might be useless if it is encrypted. The following table contains more information about encryption in Azure SQL Edge.
-
-|For information about|See|
-|||
-|Implementing secure connections|[Encrypting Connections](/sql/linux/sql-server-linux-encrypted-connections)|
-|Encryption functions|[Cryptographic Functions &#40;Transact-SQL&#41;](/sql/t-sql/functions/cryptographic-functions-transact-sql)|
-|Data Encryption at rest|[Transparent Data Encryption](/sql/relational-databases/security/encryption/transparent-data-encryption)|
-|Always Encrypted|[Always Encrypted](/sql/relational-databases/security/encryption/always-encrypted-database-engine)|
-| &nbsp; | &nbsp; |
+### Encryption and certificates
-> [!NOTE]
-> The security limitations described for [SQL Server on Linux](/sql/linux/sql-server-linux-security-overview) also apply to Azure SQL Edge.
+Encryption doesn't solve access control problems. However, it enhances security by limiting data loss even in the rare occurrence that access controls are bypassed. For example, if the database host computer is misconfigured and a malicious user obtains sensitive data, such as credit card numbers, that stolen information might be useless if it's encrypted. The following table contains more information about encryption in Azure SQL Edge.
+| For information about | See |
+| --- | --- |
+| Implementing secure connections | [Encrypting Connections](/sql/linux/sql-server-linux-encrypted-connections) |
+| Encryption functions | [Cryptographic Functions (Transact-SQL)](/sql/t-sql/functions/cryptographic-functions-transact-sql) |
+| Data Encryption at rest | [Transparent Data Encryption](/sql/relational-databases/security/encryption/transparent-data-encryption) |
+| Always Encrypted | [Always Encrypted](/sql/relational-databases/security/encryption/always-encrypted-database-engine) |
-> [!NOTE]
-> Azure SQL Edge does not include the mssql-conf utility. All configurations including encryption related configuration needs to be performed through the [mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file) or [environment variables](configure.md#configure-by-using-environment-variables).
+> [!NOTE]
+> The security limitations described for [SQL Server on Linux](/sql/linux/sql-server-linux-security-overview) also apply to Azure SQL Edge.
+> [!NOTE]
+> Azure SQL Edge doesn't include the **mssql-conf** utility. All configurations, including encryption-related configuration, need to be performed through the [mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file) or [environment variables](configure.md#configure-by-using-environment-variables).
Similar to Azure SQL and Microsoft SQL Server, Azure SQL Edge provides the same mechanism to create and use certificates to enhance object and connection security. For more information, see [CREATE CERTIFICATE (Transact-SQL)](/sql/t-sql/statements/create-certificate-transact-sql).

## Application security

### Client programs

Azure SQL Edge security best practices include writing secure client applications. For more information about how to help secure client applications at the networking layer, see [Client Network Configuration](/sql/database-engine/configure-windows/client-network-configuration).
-### Security catalog views and functions
-Security information is exposed in several views and functions that are optimized for performance and utility. The following table contains information about security views and functions in Azure SQL Edge.
-
-|Functions and views|Links|
-|||
-|Security catalog views, which return information about database-level and server-level permissions, principals, roles, and so on. In addition, there are catalog views that provide information about encryption keys, certificates, and credentials.|[Security Catalog Views &#40;Transact-SQL&#41;](/sql/relational-databases/system-catalog-views/security-catalog-views-transact-sql)|
-|Security functions, which return information about the current user, permissions and schemas.|[Security Functions &#40;Transact-SQL&#41;](/sql/t-sql/functions/security-functions-transact-sql)|
-|Security dynamic management views.|[Security-Related Dynamic Management Views and Functions &#40;Transact-SQL&#41;](/sql/relational-databases/system-dynamic-management-views/security-related-dynamic-management-views-and-functions-transact-sql)|
-| &nbsp; | &nbsp; |
+### Security catalog views and functions
-### Auditing
+Security information is exposed in several views and functions that are optimized for performance and utility. The following table contains information about security views and functions in Azure SQL Edge.
-Azure SQL Edge provides the same Auditing mechanisms as SQL Server. For more information, see [SQL Server Audit (Database Engine)](/sql/relational-databases/security/auditing/sql-server-audit-database-engine).
+| Functions and views | Links |
+| --- | --- |
+| Security catalog views, which return information about database-level and server-level permissions, principals, roles, and so on. In addition, there are catalog views that provide information about encryption keys, certificates, and credentials. | [Security Catalog Views (Transact-SQL)](/sql/relational-databases/system-catalog-views/security-catalog-views-transact-sql) |
+| Security functions, which return information about the current user, permissions and schemas. | [Security Functions (Transact-SQL)](/sql/t-sql/functions/security-functions-transact-sql) |
+| Security dynamic management views. | [Security-Related Dynamic Management Views and Functions (Transact-SQL)](/sql/relational-databases/system-dynamic-management-views/security-related-dynamic-management-views-and-functions-transact-sql) |
+### Auditing
+
+Azure SQL Edge provides the same Auditing mechanisms as SQL Server. For more information, see [SQL Server Audit (Database Engine)](/sql/relational-databases/security/auditing/sql-server-audit-database-engine).
## Next steps
azure-sql-edge Stream Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/stream-data.md
Title: Data streaming in Azure SQL Edge
description: Learn about data streaming in Azure SQL Edge. - Previously updated : 07/08/2022 Last updated : 09/14/2023 - - # Data streaming in Azure SQL Edge
-Azure SQL Edge provides a native implementation of data streaming capabilities called T-SQL Streaming. It provides real-time data streaming, analytics, and event-processing to analyze and process high volumes of fast-streaming data from multiple sources, simultaneously. T-SQL streaming is built by using the same high-performance streaming engine that powers [Azure Stream Analytics](../stream-analytics/stream-analytics-introduction.md) in Microsoft Azure. The feature supports a similar set of capabilities offered by Azure Stream Analytics running on the edge.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+Azure SQL Edge provides a native implementation of data streaming capabilities called Transact-SQL (T-SQL) streaming. It provides real-time data streaming, analytics, and event-processing to analyze and process high volumes of fast-streaming data from multiple sources, simultaneously. T-SQL streaming is built by using the same high-performance streaming engine that powers [Azure Stream Analytics](../stream-analytics/stream-analytics-introduction.md) in Microsoft Azure. The feature supports a similar set of capabilities offered by Azure Stream Analytics running on the edge.
-As with Stream Analytics, T-SQL Streaming recognizes patterns and relationships in information extracted from a number of IoT input sources, including devices, sensors, and applications. You can use these patterns to trigger actions and initiate workflows. For example, you can create alerts, feed information to a reporting or visualization solution, or store the data for later use.
+As with Stream Analytics, T-SQL Streaming recognizes patterns and relationships in information extracted from several IoT input sources, including devices, sensors, and applications. You can use these patterns to trigger actions and initiate workflows. For example, you can create alerts, feed information to a reporting or visualization solution, or store the data for later use.
T-SQL streaming can help you:
-* Analyze real-time telemetry streams from IoT devices.
-* Use real-time analytics of data generated from autonomous and driverless vehicles.
-* Use remote monitoring and predictive maintenance of high-value industrial or manufacturing assets.
-* Use anomaly detection and pattern recognition of IoT sensor readings in an agriculture or an energy farm.
+- Analyze real-time telemetry streams from IoT devices.
+- Use real-time analytics of data generated from autonomous and driverless vehicles.
+- Use remote monitoring and predictive maintenance of high-value industrial or manufacturing assets.
+- Use anomaly detection and pattern recognition of IoT sensor readings in an agriculture or an energy farm.
## How does T-SQL streaming work?
-T-SQL streaming works in exactly the same manner as [Azure Stream Analytics](../stream-analytics/stream-analytics-introduction.md). For example, it uses the concept of streaming *jobs* for processing of real-time data streaming.
+T-SQL streaming works in exactly the same manner as [Azure Stream Analytics](../stream-analytics/stream-analytics-introduction.md). For example, it uses the concept of streaming *jobs* for processing of real-time data streaming.
A stream analytics job consists of the following parts; a short job-creation sketch follows this list.

- **Stream input**: This defines the connections to a data source to read the data stream from. Azure SQL Edge currently supports the following stream input types:
- - Edge Hub
- - Kafka (Support for Kafka inputs is currently only available on Intel/AMD64 versions of Azure SQL Edge.)
+ - Edge Hub
+ - Kafka (Support for Kafka inputs is currently only available on Intel/AMD64 versions of Azure SQL Edge.)
- **Stream output**: This defines the connections to a data source to write the data stream to. Azure SQL Edge currently supports the following stream output types:
- - Edge Hub
- - SQL (The SQL output can be a local database within the instance of Azure SQL Edge, or a remote SQL Server or Azure SQL Database.)
+ - Edge Hub
+ - SQL (The SQL output can be a local database within the instance of Azure SQL Edge, or a remote SQL Server or Azure SQL Database.)
- **Stream query**: This defines the transformations, aggregations, filters, sorting, and joins to be applied to the input stream, before it's written to the stream output. The stream query is based on the same query language as that used by Stream Analytics. For more information, see [Stream Analytics Query Language](/stream-analytics-query/stream-analytics-query-language-reference).
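As a minimal sketch, assuming input and output external streams named `InputStream` and `OutputStream` were already created with `CREATE EXTERNAL STREAM` (the job and stream names are illustrative), a job can be created and started with the streaming stored procedures:

```sql
-- A sketch; assumes external streams InputStream and OutputStream exist.
EXEC sys.sp_create_streaming_job
    @name = N'TelemetryJob',
    @statement = N'SELECT * INTO OutputStream FROM InputStream';

EXEC sys.sp_start_streaming_job @name = N'TelemetryJob';
```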
-> [!IMPORTANT]
+> [!IMPORTANT]
> T-SQL streaming, unlike Stream Analytics, doesn't currently support [using reference data for lookups](../stream-analytics/stream-analytics-use-reference-data.md) or [using UDF's and UDA's in a stream job](../stream-analytics/streaming-technologies.md#you-want-to-write-udfs-udas-and-custom-deserializers-in-a-language-other-than-javascript-or-c).
-> [!NOTE]
+> [!NOTE]
> T-SQL streaming only supports a subset of the language surface area supported by Stream Analytics. For more information, see [Stream Analytics Query Language](/stream-analytics-query/stream-analytics-query-language-reference).
-## Limitations and restrictions
+## Limitations
-The following limitations and restrictions apply to T-SQL streaming.
+The following limitations and restrictions apply to T-SQL streaming.
- Only one streaming job can be active at any specific time. Jobs that are already running must be stopped before starting another job.
- Each streaming job execution is single-threaded. If the streaming job contains multiple queries, each query is evaluated in serial order.
-- When you stopped a streaming job in Azure SQL Edge, there may be some delay before the next streaming job can be started. This delay is introduced because the underlying streaming process needs to be stopped in response to the stop job request and then restarted in response to the start job request.
-- T-SQL Streaming upto 32 partitions for a kafka stream. Attempts to configure a higher partition count will result in an error.
+- When you stop a streaming job in Azure SQL Edge, there might be some delay before the next streaming job can be started. This delay is introduced because the underlying streaming process needs to be stopped in response to the stop job request, and then restarted in response to the start job request.
+- T-SQL streaming supports up to 32 partitions for a Kafka stream. Attempts to configure a higher partition count result in an error.
## Next steps

-- [Create a Stream Analytics job in Azure SQL Edge ](create-stream-analytics-job.md)
-- [Viewing metadata associated with stream jobs in Azure SQL Edge ](streaming-catalog-views.md)
+- [Create a Stream Analytics job in Azure SQL Edge](create-stream-analytics-job.md)
+- [Viewing metadata associated with stream jobs in Azure SQL Edge](streaming-catalog-views.md)
- [Create external stream](create-external-stream-transact-sql.md)
azure-sql-edge Streaming Catalog Views https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/streaming-catalog-views.md
Title: Streaming catalog views (Transact-SQL) - Azure SQL Edge
description: Learn about the available streaming catalog views and dynamic management views in Azure SQL Edge - Previously updated : 05/19/2019 Last updated : 09/14/2023 keywords: - sys.external_streams - SQL Edge-
+# Streaming catalog views (Transact-SQL)
-# Streaming Catalog Views (Transact-SQL)
-
-This section contains the available catalog views and functions that are related to T-SQL Streaming.
-
-## In This Section
-
-|View|Description|
-|:|:|
-|[sys.external_streams](sys-external-streams.md) |Returns a row for each external stream object created within the scope of the database.|
-|[sys.external_streaming_jobs](sys-external-streaming-jobs.md) |Returns a row for each external streaming job created within the scope of the database.|
-|[sys.external_job_streams](sys-external-job-streams.md)|Returns a row each for the input or output external stream object mapped to an external streaming job.|
-
-## See also
-- [Catalog Views (Transact-SQL)](/sql/relational-databases/system-catalog-views/catalog-views-transact-sql/)
-- [System Views (Transact-SQL)](/sql/t-sql/language-reference/)
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+This section contains the available catalog views and functions that are related to Transact-SQL streaming.
+## In this section
+| View | Description |
+| :-- | :-- |
+| [sys.external_streams](sys-external-streams.md) | Returns a row for each external stream object created within the scope of the database. |
+| [sys.external_streaming_jobs](sys-external-streaming-jobs.md) | Returns a row for each external streaming job created within the scope of the database. |
+| [sys.external_job_streams](sys-external-job-streams.md) | Returns one row for each input or output external stream object mapped to an external streaming job. |
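As a quick orientation, the following query sketch lists what's currently defined in a database; the columns used here come from the view reference pages linked in the table above.

```sql
-- List every external stream and streaming job in the current database.
SELECT name, type_desc, create_date FROM sys.external_streams;
SELECT name, status, create_date FROM sys.external_streaming_jobs;
```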
+## See also
+- [Catalog views (Transact-SQL)](/sql/relational-databases/system-catalog-views/catalog-views-transact-sql/)
+- [System views (Transact-SQL)](/sql/t-sql/language-reference/)
azure-sql-edge Sys External Job Streams https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/sys-external-job-streams.md
Title: sys.external_job_streams (Transact-SQL) - Azure SQL Edge
description: Learn about using sys.external_job_streams in Azure SQL Edge - Previously updated : 05/19/2019 Last updated : 09/14/2023 keywords: - sys.external_job_streams - SQL Edge-

# sys.external_job_streams (Transact-SQL)
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Returns one row for each input or output external stream object mapped to an external streaming job.
-|Column name|Data type|Description|
-|--||--|
-|**job_id**|**int**| Object identification number for the streaming job object. This column maps to the object_id column of sys.external_streaming_jobs.|
-|**stream_id**|**int**| Object identification number for the stream object. This column maps to the object_id column of sys.external_streams. |
-|**is_input**|**bit**| 1 if the stream object is used an input for the streaming job, otherwise 0.|
-|**is_output**|**bit**| 1 if the stream object is used an output for the streaming job, otherwise 0.|
+| Column name | Data type | Description |
+| --- | --- | --- |
+| **job_id** | **int** | Object identification number for the streaming job object. This column maps to the object_id column of `sys.external_streaming_jobs`. |
+| **stream_id** | **int** | Object identification number for the stream object. This column maps to the object_id column of `sys.external_streams`. |
+| **is_input** | **bit** | `1` if the stream object is used as an input for the streaming job; otherwise, `0`. |
+| **is_output** | **bit** | `1` if the stream object is used as an output for the streaming job; otherwise, `0`. |
-## Example
+## Examples
-This catalog view is used together with `sys.external_streams` and `sys.external_streaming_jobs` catalog views. A sample query is shown below
+This catalog view is used together with the `sys.external_streams` and `sys.external_streaming_jobs` catalog views, as the following sample query shows:
```sql
-Select
- sj.Name as Job_Name,
- sj.Create_date as Job_Create_date,
- sj.modify_date as Job_Modify_date,
- sj.statement as Stream_Job_Query,
- Input_Stream_Name =
- Case js.is_input
- when 1 then s.Name
- else null
+SELECT sj.Name AS Job_Name,
+ sj.Create_date AS Job_Create_Date,
+ sj.modify_date AS Job_Modify_Date,
+ sj.statement AS Stream_Job_Query,
+ Input_Stream_Name = CASE js.is_input
+ WHEN 1 THEN s.Name
+ ELSE NULL
END,
- output_Stream_Name =
- case js.is_output
- when 1 then s.Name
- else null
+ output_Stream_Name = CASE js.is_output
+ WHEN 1 THEN s.Name
+ ELSE NULL
END,
- s.location as Stream_Location
-from sys.external_job_streams js
-inner join sys.external_streams s on s.object_id = js.stream_id
-inner join sys.external_streaming_jobs sj on sj.object_id = js.job_id
+ s.location AS Stream_Location
+FROM sys.external_job_streams js
+INNER JOIN sys.external_streams s
+ ON s.object_id = js.stream_id
+INNER JOIN sys.external_streaming_jobs sj
+ ON sj.object_id = js.job_id;
```

## Permissions
azure-sql-edge Sys External Streaming Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/sys-external-streaming-jobs.md
Title: sys.external_streaming_jobs (Transact-SQL) - Azure SQL Edge
-description: Learn about using sys.external_streaming_jobs in Azure SQL Edge
+description: sys.external_streaming_jobs returns a row for each external streaming job created within the scope of the database.
- Previously updated : 05/19/2019 Last updated : 09/14/2023 keywords: - sys.external_streaming_jobs - SQL Edge-

# sys.external_streaming_jobs (Transact-SQL)
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Returns a row for each external streaming job created within the scope of the database.
-|Column name|Data type|Description|
-|--||--|
-|**name**|**sysname**|Name of the stream. Is unique within the database.|
-|**object_id**|**int**|object identification number for the stream object. Is unique within the database.|
-|**principal_id**|**int**|ID of the principal that owns this assembly|
-|**schema_id**|**int**| ID of the schema that contains the object.|
-|**parent_object_id**|**id**| object identification number for the parent object for this stream. In the current implementation, this value is always null|
-|**type**|**char(2)**|Object type. For stream objects, the type is always 'EJ'|
-|**type_desc**|**nvarchar(60)**| Description of the object type. For stream objects, the type is always 'EXTERNAL_STREAMING_JOB'|
-|**create_date**|**datetime**| Date the object was created.|
-|**modify_date**|**datetime**| In the current implementation, this value is the same as the create_date for the stream object |
-|**is_ms_shipped**|**bit**| Object created by an internal component.|
-|**is_published**|**bit**| Object is published.|
-|**is_schema_published**|**bit**|Only the schema of the object is published.|
-|**uses_ansi_nulls**|**bit**| Stream object was created with the SET ANSI_NULLS database option ON|
-|**statement**|**nvarchar(max)**| The stream analytics query text for the streaming job. For more information, see [sp_create_streaming_job](overview.md) |
-|**status**|**int**| The current status of the streaming job. The possible values are <br /><br /> **Created** = 0. The streaming job was created, but has not yet been started. <br /><br /> **Starting** = 1. The streaming job is in the starting phase. <br /><br /> **Failed** = 6. The streaming job Failed. This is generally an indication of a fatal error during processing. <br /><br /> **Stopped** = 4. The streaming job has been stopped. <br /><br /> **Idle** = 7. The streaming job is running, however there is no input to process. <br /><br /> **Processing** = 8. The streaming job is running, and is processing inputs. This state indicates a healthy state for the streaming job. <br /><br /> **Degraded** = 9. The streaming job is running, however there were some non-fatal input/output serialization/de-serialization errors during input processing. The input job will continue to run, but will drop inputs that encounter errors.|
+| Column name | Data type | Description |
+| --- | --- | --- |
+| **name** | **sysname** | Name of the stream. Is unique within the database. |
+| **object_id** | **int** | Object identification number for the stream object. Is unique within the database. |
+| **principal_id** | **int** | ID of the principal that owns this object. |
+| **schema_id** | **int** | ID of the schema that contains the object. |
+| **parent_object_id** | **int** | Object identification number for the parent object for this stream. In the current implementation, this value is always null. |
+| **type** | **char(2)** | Object type. For stream objects, the type is always `EJ`. |
+| **type_desc** | **nvarchar(60)** | Description of the object type. For stream objects, the type is always `EXTERNAL_STREAMING_JOB`. |
+| **create_date** | **datetime** | Date the object was created. |
+| **modify_date** | **datetime** | In the current implementation, this value is the same as the create_date for the stream object. |
+| **is_ms_shipped** | **bit** | Object created by an internal component. |
+| **is_published** | **bit** | Object is published. |
+| **is_schema_published** | **bit** | Only the schema of the object is published. |
+| **uses_ansi_nulls** | **bit** | Stream object was created with the `SET ANSI_NULLS` database option `ON`. |
+| **statement** | **nvarchar(max)** | The stream analytics query text for the streaming job. For more information, see [sp_create_streaming_job](overview.md). |
+| **status** | **int** | The current status of the streaming job. The possible values are:<br /><br />**Created** = 0. The streaming job was created, but hasn't yet been started.<br /><br />**Starting** = 1. The streaming job is in the starting phase.<br /><br />**Failed** = 6. The streaming job failed. This is generally an indication of a fatal error during processing.<br /><br />**Stopped** = 4. The streaming job has been stopped.<br /><br />**Idle** = 7. The streaming job is running, however there's no input to process.<br /><br />**Processing** = 8. The streaming job is running, and is processing inputs. This state indicates a healthy state for the streaming job.<br /><br />**Degraded** = 9. The streaming job is running, however there were some nonfatal input/output serialization/deserialization errors during input processing. The input job continues to run, but drops inputs that encounter errors. |
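Because `status` is exposed as an integer, decoding it with a `CASE` expression makes job health easier to read at a glance. The following sketch is based solely on the status values listed in the table above:

```sql
SELECT name,
    status,
    CASE status
        WHEN 0 THEN 'Created'
        WHEN 1 THEN 'Starting'
        WHEN 4 THEN 'Stopped'
        WHEN 6 THEN 'Failed'
        WHEN 7 THEN 'Idle'
        WHEN 8 THEN 'Processing'
        WHEN 9 THEN 'Degraded'
    END AS status_desc
FROM sys.external_streaming_jobs;
```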
## Permissions
The visibility of the metadata in catalog views is limited to securables that a
- [T-SQL Streaming Catalog Views](overview.md)
- [Catalog Views (Transact-SQL)](/sql/relational-databases/system-catalog-views/catalog-views-transact-sql/)
- [System Views (Transact-SQL)](/sql/t-sql/language-reference/)
azure-sql-edge Sys External Streams https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/sys-external-streams.md
Title: sys.external_streams (Transact-SQL) - Azure SQL Edge
-description: Learn about using sys.external_streams in Azure SQL Edge
+description: sys.external_streams returns a row for each external stream object created within the scope of the database.
- Previously updated : 05/19/2019 Last updated : 09/14/2023 keywords: - sys.external_streams - SQL Edge-

# sys.external_streams (Transact-SQL)
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Returns a row for each external stream object created within the scope of the database.
-|Column name|Data type|Description|
-|--||--|
-|**name**|**sysname**|Name of the stream. Is unique within the database.|
-|**object_id**|**int**|object identification number for the stream object. Is unique within the database.|
-|**principal_id**|**int**|ID of the principal that owns this assembly|
-|**schema_id**|**int**| ID of the schema that contains the object.|
-|**parent_object_id**|**id**| object identification number for the parent object for this stream. In the current implementation, this value is always null|
-|**type**|**char(2)**|Object type. For stream objects, the type is always 'ES'|
-|**type_desc**|**nvarchar(60)**| Description of the object type. For stream objects, the type is always 'EXTERNAL_STREAM'|
-|**create_date**|**datetime**| Date the object was created.|
-|**modify_date**|**datetime**| Date the object was last modified by using an ALTER statement.|
-|**is_ms_shipped**|**bit**| Object created by an internal component.|
-|**is_published**|**bit**|Object is published.|
-|**is_schema_published**|**bit**|Only the schema of the object is published.|
-|**max_column_id_used**|**bit**| This column is used for internal purposes and will be removed in future|
-|**uses_ansi_nulls**|**bit**| Stream object was created with the SET ANSI_NULLS database option ON|
-|**data_source_id**|**int**| The object ID for the external data source represented by the stream object |
-|**file_format_id**|**int**| The object ID for the external file format used by the stream object. The external file format is required to specify the actual layout of the data referenced by an external stream|
-|**location**|**varchar(max)**| The target for the external stream object. For more information, refer [Create External Stream](overview.md) |
-|**input_option**|**varchar(max)**| The input options used during the creation of the external stream. For more information, refer [Create External Stream](overview.md) |
-|**output_option**|**varchar(max)**| The output options used during the creation of the external stream. For more information, refer [Create External Stream](overview.md) |
+| Column name | Data type | Description |
+| --- | --- | --- |
+| **name** | **sysname** | Name of the stream. Is unique within the database. |
+| **object_id** | **int** | Object identification number for the stream object. Is unique within the database. |
+| **principal_id** | **int** | ID of the principal that owns this object. |
+| **schema_id** | **int** | ID of the schema that contains the object. |
+| **parent_object_id** | **int** | Object identification number for the parent object for this stream. In the current implementation, this value is always null. |
+| **type** | **char(2)** | Object type. For stream objects, the type is always `ES`. |
+| **type_desc** | **nvarchar(60)** | Description of the object type. For stream objects, the type is always `EXTERNAL_STREAM`. |
+| **create_date** | **datetime** | Date the object was created. |
+| **modify_date** | **datetime** | Date the object was last modified by using an ALTER statement. |
+| **is_ms_shipped** | **bit** | Object created by an internal component. |
+| **is_published** | **bit** | Object is published. |
+| **is_schema_published** | **bit** | Only the schema of the object is published. |
+| **max_column_id_used** | **bit** | This column is used for internal purposes, and will be removed in the future. |
+| **uses_ansi_nulls** | **bit** | Stream object was created with the `SET ANSI_NULLS` database option `ON`. |
+| **data_source_id** | **int** | The object ID for the external data source represented by the stream object. |
+| **file_format_id** | **int** | The object ID for the external file format used by the stream object. The external file format is required to specify the actual layout of the data referenced by an external stream. |
+| **location** | **varchar(max)** | The target for the external stream object. For more information, see [Create External Stream](overview.md). |
+| **input_option** | **varchar(max)** | The input options used during the creation of the external stream. For more information, see [Create External Stream](overview.md). |
+| **output_option** | **varchar(max)** | The output options used during the creation of the external stream. For more information, see [Create External Stream](overview.md). |
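For example, the following query sketch lists each stream with its target and configured options, joining to `sys.external_data_sources` (the standard catalog view for external data sources) to resolve `data_source_id` to a name:

```sql
SELECT s.name,
    ds.name AS data_source_name,
    s.location,
    s.input_option,
    s.output_option
FROM sys.external_streams AS s
LEFT JOIN sys.external_data_sources AS ds
    ON ds.data_source_id = s.data_source_id;
```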
## Permissions
-The visibility of the metadata in catalog views is limited to securables that a user either owns or on which the user has been granted some permission. For more information, see [Metadata Visibility Configuration](/sql/relational-databases/security/metadata-visibility-configuration/).
+The visibility of the metadata in catalog views is limited to securables that a user either owns, or on which the user has been granted some permission. For more information, see [Metadata Visibility Configuration](/sql/relational-databases/security/metadata-visibility-configuration/).
## See also
azure-sql-edge Sys Sp Cleanup Data Retention https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/sys-sp-cleanup-data-retention.md
Title: sys.sp_cleanup_data_retention (Transact-SQL) - Azure SQL Edge
-description: Learn about using sys.sp_cleanup_data_retention (Transact-SQL) in Azure SQL Edge
+description: sys.sp_cleanup_data_retention performs cleanup of obsolete records from tables that have data retention policies enabled.
- Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - sys.sp_cleanup_data_retention (Transact-SQL) - SQL Edge-

# sys.sp_cleanup_data_retention (Transact-SQL)
-**Applies to:** Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
Performs cleanup of obsolete records from tables that have data retention policies enabled. For more information, see [Data Retention](data-retention-overview.md).
Performs cleanup of obsolete records from tables that have data retention polici
```syntaxsql
sys.sp_cleanup_data_retention
- { [@schema_name = ] 'schema_name' },
- { [@table_name = ] 'table_name' },
- [ [@rowcount =] rowcount OUTPUT ]
-
+ { [ @schema_name = ] 'schema_name' } ,
+ { [ @table_name = ] 'table_name' } ,
+ [ [ @rowcount = ] rowcount OUTPUT ]
```

## Arguments
-`[ @schema_name = ] schema_name`
- Is the name of the owning schema for the table on which cleanup needs to be performed. *schema_name* is a required parameter of type **sysname**.
-`[ @table_name = ] 'table_name'`
- Is the name of the table on which cleanup operation needs to be performed. *table_name* is a required parameter of type **sysname**.
+#### [ @schema_name = ] '*schema_name*'
+
+The name of the owning schema for the table on which cleanup needs to be performed. *schema_name* is a required parameter of type **sysname**.
+
+#### [ @table_name = ] '*table_name*'
+
+The name of the table on which cleanup operation needs to be performed. *table_name* is a required parameter of type **sysname**.
## Output parameter
-`[ @rowcount = ] rowcount OUTPUT`
- rowcount is an optional OUTPUT parameter that represents the number of records cleanup from the table. *rowcount* is int.
+#### [ @rowcount = ] rowcount OUTPUT
+
+An optional OUTPUT parameter that represents the number of records cleaned up from the table. *rowcount* is **int**.
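A minimal usage sketch follows; `dbo.SensorReadings` is a hypothetical table that's assumed to already have a data retention policy enabled:

```sql
DECLARE @deleted_rows int;

EXEC sys.sp_cleanup_data_retention
    @schema_name = N'dbo',
    @table_name = N'SensorReadings',  -- hypothetical table with a retention policy
    @rowcount = @deleted_rows OUTPUT;

-- Report how many obsolete records the cleanup removed.
SELECT @deleted_rows AS obsolete_records_removed;
```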
## Permissions
- Requires db_owner permissions.
+
+Requires **db_owner** permissions.
## Next steps
+
- [Data Retention and Automatic Data Purging](data-retention-overview.md)
- [Manage historical data with retention policy](data-retention-cleanup.md)
- [Enable and disable data retention](data-retention-enable-disable.md)
azure-sql-edge Track Data Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/track-data-changes.md
Title: Track data changes in Azure SQL Edge
description: Learn about change tracking and change data capture in Azure SQL Edge. - Previously updated : 05/19/2020 Last updated : 09/14/2023 - -

# Track data changes in Azure SQL Edge
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ Azure SQL Edge supports the two SQL Server features that track changes to data in a database: [change tracking](/sql/relational-databases/track-changes/track-data-changes-sql-server#Tracking) and [change data capture](/sql/relational-databases/track-changes/track-data-changes-sql-server#Capture). These features enable applications to determine the data modification language changes (insert, update, and delete operations) that were made to user tables in a database. You can enable change data capture and change tracking on the same database. No special considerations are required. The ability to query for data that has changed in a database is an important requirement for some applications to be efficient. Typically, to determine data changes, application developers must implement a custom tracking method in their applications by using a combination of triggers, timestamp columns, and additional tables. Creating these applications usually involves a lot of work to implement, leads to schema updates, and often carries a high performance overhead.
-In the case of an IoT solution, where you need to periodically move data from the edge to a cloud or datacenter, change tracking can be very useful. Users can quickly and effectively query only the changes from the last sync, and upload those changes to the cloud or datacenter target. For additional details, see [Benefits of using change data capture or change tracking](/sql/relational-databases/track-changes/track-data-changes-sql-server#benefits-of-using-change-data-capture-or-change-tracking).
+In the case of an IoT solution, where you need to periodically move data from the edge to a cloud or datacenter, change tracking can be useful. Users can quickly and effectively query only the changes from the last sync, and upload those changes to the cloud or datacenter target. For more information, see [Benefits of using change data capture or change tracking](/sql/relational-databases/track-changes/track-data-changes-sql-server#benefits-of-using-change-data-capture-or-change-tracking).
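The following sketch shows that sync pattern with change tracking. The database, table, and column names are hypothetical, and `SensorId` is assumed to be the table's primary key:

```sql
-- One-time setup on the edge database and table.
ALTER DATABASE MyEdgeDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.SensorReadings ENABLE CHANGE_TRACKING;

-- At sync time: fetch only the rows changed since the last uploaded version.
DECLARE @last_sync_version bigint = 0;  -- persisted from the previous sync

SELECT ct.SensorId, ct.SYS_CHANGE_OPERATION
FROM CHANGETABLE(CHANGES dbo.SensorReadings, @last_sync_version) AS ct;

-- Store this value as the watermark for the next sync.
SELECT CHANGE_TRACKING_CURRENT_VERSION() AS version_to_persist;
```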
These two features aren't the same. For more information, see [Feature differences between change data capture and change tracking](/sql/relational-databases/track-changes/track-data-changes-sql-server#feature-differences-between-change-data-capture-and-change-tracking)
To administer and monitor this feature, see [Administer and monitor change data
To understand how to query and work with the changed data, see [Work with change data](/sql/relational-databases/track-changes/work-with-change-data-sql-server).
-> [!NOTE]
-> Change Data Capture functions which are dependent on CLR are not supported on Azure SQL Edge.
+> [!NOTE]
+> Change Data Capture functions that depend on CLR aren't supported on Azure SQL Edge.
## Change tracking
To understand how to query and work with the changed data, see [Work with change
## Temporal tables
-Azure SQL Edge also supports the temporal tables feature of SQL Server. This feature (also known as *system-versioned temporal tables*) brings built-in support for providing information about data stored in the table at any point in time. The feature doesn't simply provide information about only the data that is correct at the current moment in time.
+Azure SQL Edge also supports the temporal tables feature of SQL Server. This feature (also known as *system-versioned temporal tables*) brings built-in support for providing information about data stored in the table at any point in time. The feature provides more than just the data that is correct at the current moment in time.
-A system-versioned temporal table is a type of user table designed to keep a full history of data changes, and to allow easy point-in-time analysis. This type of temporal table is referred to as a system-versioned temporal table because the period of validity for each row is managed by the system (that is, the database engine).
+A system-versioned temporal table is a type of user table designed to keep a full history of data changes, and to allow easy point-in-time analysis. This type of temporal table is referred to as a system-versioned temporal table because the system (that is, the Database Engine) manages the period of validity for each row.
-Every temporal table has two explicitly defined columns, each with a `datetime2` data type. These columns are referred to as *period* columns. The system uses these period columns exclusively, to record the period of validity for each row whenever a row is modified.
+Every temporal table has two explicitly defined columns, each with a **datetime2** data type. These columns are referred to as *period* columns. The system uses these period columns exclusively, to record the period of validity for each row whenever a row is modified.
In addition to these period columns, a temporal table also contains a reference to another table with a mirrored schema. The system uses this table to automatically store the previous version of the row each time a row in the temporal table gets updated or deleted. This additional table is referred to as the *history* table, while the main table that stores current (actual) row versions is referred to as the *current* table, or simply as the temporal table. During temporal table creation, users can specify an existing history table (it must be compliant with the schema), or let the system create the default history table.
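A minimal sketch of such a table follows; all names are hypothetical. The two `datetime2` period columns and the history table are exactly the pieces described above:

```sql
CREATE TABLE dbo.Equipment
(
    EquipmentId int NOT NULL PRIMARY KEY CLUSTERED,
    Temperature decimal(5, 2) NOT NULL,
    -- Period columns: populated by the system, never by the application.
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EquipmentHistory));

-- Point-in-time analysis across current and history rows.
SELECT *
FROM dbo.Equipment
FOR SYSTEM_TIME AS OF '2023-09-01T00:00:00';
```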
For more information, see [Temporal tables](/sql/relational-databases/tables/tem
## Next steps

-- [Data streaming in Azure SQL Edge ](stream-data.md)
-- [Machine learning and AI with ONNX in Azure SQL Edge ](onnx-overview.md)
+- [Data streaming in Azure SQL Edge](stream-data.md)
+- [Machine learning and AI with ONNX in Azure SQL Edge](onnx-overview.md)
- [Configure replication to Azure SQL Edge](configure-replication.md) - [Backup and restore databases in Azure SQL Edge](backup-restore.md)
azure-sql-edge Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/troubleshoot.md
Title: Troubleshooting Azure SQL Edge deployments
+ Title: Troubleshoot Azure SQL Edge deployments
description: Learn about possible errors when deploying Azure SQL Edge - Previously updated : 09/22/2020 Last updated : 09/14/2023 keywords: - SQL Edge - troubleshooting - deployment errors-
+# Troubleshoot Azure SQL Edge deployments
-# Troubleshooting Azure SQL Edge deployments
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
-This article provides information about possible errors seen when deploying and using Azure SQL Edge containers, and provides troubleshooting techniques to help resolve these issues.
+This article provides information about possible errors seen when deploying and using Azure SQL Edge containers, and provides troubleshooting techniques to help resolve these issues.
+
+Azure SQL Edge supports two deployment models:
-Azure SQL Edge supports two deployment models:
- Connected deployment through Azure IoT Edge: Azure SQL Edge is available on the Azure Marketplace and can be deployed as a module for [Azure IoT Edge](../iot-edge/about-iot-edge.md). For more information, see [Deploy Azure SQL Edge](deploy-portal.md).
-- Disconnected deployment: Azure SQL Edge container images can be pulled from docker hub and deployed either as a standalone docker container or on a kubernetes cluster. For more information, see [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
+- Disconnected deployment: Azure SQL Edge container images can be pulled from Docker hub and deployed either as a standalone container or on a Kubernetes cluster. For more information, see [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
-## Troubleshooting IoT Edge device and deployments
+## Troubleshoot IoT Edge devices and deployments
If you get an error while deploying SQL Edge through Azure IoT Edge, make sure that the `iotedge` service is properly configured and running. The following documents can be helpful when troubleshooting issues related to Azure IoT Edge:
+
- [Common issues and resolutions for Azure IoT Edge](../iot-edge/troubleshoot-common-errors.md)
- [Troubleshoot your IoT Edge device](../iot-edge/troubleshoot.md)

## Docker command errors
-If you get errors for any `docker` commands, make sure that the docker service is running, and try to run with elevated permissions.
+If you get errors for any `docker` commands, make sure that the Docker service is running, and try to run with elevated permissions.
For example, on Linux, you might get the following error when running `docker` commands:
For example, on Linux, you might get the following error when running `docker` c
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
```
-If you get this error on Linux, try running the same commands prefaced with `sudo`. If that fails, verify the docker service is running, and start it if necessary.
+If you get this error on Linux, try running the same commands prefaced with `sudo`. If that fails, verify the Docker service is running, and start it if necessary.
```bash
sudo systemctl status docker
sudo systemctl start docker
```
-On Windows, verify that you are launching PowerShell or your command-prompt as an Administrator.
+On Windows, verify that you're launching PowerShell or your command-prompt as an Administrator.
## Azure SQL Edge container startup errors

If the SQL Edge container fails to run, try the following tests:

-- If you are using Azure IoT Edge, make sure that the module images were downloaded successfully and that the environment variables and container create options are correctly specified in the module manifest.
+- If you're using Azure IoT Edge, make sure that the module images were downloaded successfully, and that the environment variables and container create options are correctly specified in the module manifest.
-- If you are using docker or kubernetes based deployment, make sure that the `docker run` command is correctly formed. For more information, refer [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
+- If you're using Docker or Kubernetes based deployment, make sure that the `docker run` command is correctly formed. For more information, see [Deploy Azure SQL Edge with Docker](disconnected-deployment.md) and [Deploy an Azure SQL Edge container in Kubernetes](deploy-kubernetes.md).
-- If you get an error such as `failed to create endpoint CONTAINER_NAME on network bridge. Error starting proxy: listen tcp 0.0.0.0:1433 bind: address already in use.`, you are attempting to map the container port 1433 to a port that is already in use. This can happen if you're running SQL Edge locally on the host machine. It can also happen if you start two SQL Edge containers and try to map them both to the same host port. If this happens, use the `-p` parameter to map the container port 1433 to a different host port. For example:
+- If you get an error such as `failed to create endpoint CONTAINER_NAME on network bridge. Error starting proxy: listen tcp 0.0.0.0:1433 bind: address already in use.`, you're attempting to map the container port 1433 to a port that is already in use. This can happen if you're running SQL Edge locally on the host machine. It can also happen if you start two SQL Edge containers and try to map them both to the same host port. If this happens, use the `-p` parameter to map the container port 1433 to a different host port. For example:
- ```bash
- sudo docker run --cap-add SYS_PTRACE -e 'ACCEPT_EULA=1' -e 'MSSQL_SA_PASSWORD=yourStrong(!)Password' -p 1433:1433 --name azuresqledge -d mcr.microsoft.com/azure-sql-edge-developer.
- ```
+ ```bash
+ sudo docker run --cap-add SYS_PTRACE -e 'ACCEPT_EULA=1' -e 'MSSQL_SA_PASSWORD=yourStrong(!)Password' -p 1445:1433 --name azuresqledge -d mcr.microsoft.com/azure-sql-edge-developer
+ ```
-- If you get an error such as `Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get http://%2Fvar%2Frun%2Fdocker.sock/v1.30tdout=1&tail=all: dial unix /var/run/docker.sock: connect: permission denied` when trying to start a container, then add your user to the docker group in Ubuntu. Then logout and login again as this change will affect new sessions.
+- If you get an error such as `Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get http://%2Fvar%2Frun%2Fdocker.sock/v1.30tdout=1&tail=all: dial unix /var/run/docker.sock: connect: permission denied` when trying to start a container, then add your user to the docker group in Ubuntu. Then sign out and sign back in, as this change affects new sessions.
- ```bash
- usermod -aG docker $USER
- ```
+ ```bash
+ sudo usermod -aG docker $USER
+ ```
- Check to see if there are any error messages from the container.
- ```bash
- docker logs e69e056c702d
- ```
+ ```bash
+ docker logs e69e056c702d
+ ```
-- If you are using any container management software, make sure it supports container processes running as root. The sqlservr process in the container runs as root.
+- If you're using any container management software, make sure it supports container processes running as root. The sqlservr process in the container runs as root.
-- By default Azure SQL Edge containers run as a non-root user named `mssql`. If you are using mount points or data volumes to persist data, ensure that the `mssql` user has appropriate permissions on the volume. For more information, see [Run as non-root user](configure.md#run-azure-sql-edge-as-non-root-user) and [Persist Data](configure.md#persist-your-data).
+- By default Azure SQL Edge containers run as a non-root user named `mssql`. If you're using mount points or data volumes to persist data, ensure that the `mssql` user has appropriate permissions on the volume. For more information, see [Run as non-root user](configure.md#run-azure-sql-edge-as-non-root-user) and [Persist Data](configure.md#persist-your-data).
-- If your SQL Edge Docker container exits immediately after starting, check your docker logs. If you are using PowerShell on Windows with the `docker run` command, use double quotes instead of single quotes. With PowerShell Core, use single quotes.
+- If your SQL Edge Docker container exits immediately after starting, check your docker logs. If you're using PowerShell on Windows with the `docker run` command, use double quotes instead of single quotes. With PowerShell Core, use single quotes.
- Review the [SQL Edge error logs](#errorlogs).
If the SQL Edge container fails to run, try the following tests:
If you can't connect to the SQL Edge instance running in your container, try the following tests:

-- Make sure that your SQL Edge container is running by looking at the **STATUS** column of the `docker ps -a` output. If not, use `docker start <Container ID>` to start it.
+- Make sure that your SQL Edge container is running by looking at the `STATUS` column of the `docker ps -a` output. If not, use `docker start <Container ID>` to start it.
-- If you mapped to a non-default host port (not 1433), make sure you are specifying the port in your connection string. You can see your port mapping in the **PORTS** column of the `docker ps -a` output. For more information on connecting to Azure SQL Edge, refer [Connect and query Azure SQL Edge](connect.md)
+- If you mapped to a non-default host port (not 1433), make sure you're specifying the port in your connection string. You can see your port mapping in the `PORTS` column of the `docker ps -a` output. For more information on connecting to Azure SQL Edge, see [Connect and query Azure SQL Edge](connect.md).
-- If you had previously deployed SQL Edge with mapped data volume or data volume container, and are now using the existing mapped data volume or data volume container, SQL Edge ignores the value of `MSSQL_SA_PASSWORD` environment variable. Instead, the previously configured SA user password is used. This happens because SQL Edge reuses the existing master databases files in the mapped volume or data volume container. If you run into this issue, you can use the following options:
+- If you previously deployed SQL Edge with a mapped data volume or data volume container, and now use the existing mapped data volume or data volume container, SQL Edge ignores the value of `MSSQL_SA_PASSWORD` environment variable. Instead, the previously configured SA user password is used. This happens because SQL Edge reuses the existing `master` databases files in the mapped volume or data volume container. If you run into this issue, you can use the following options:
- - Connect using the previously used password, if it's still available.
- - Configure SQL Edge to use a different mapped volume or data volume container.
- - Remove the existing master database files (master.mdf and mastlog.mdf) from the mapped volume or data volume container.
+ - Connect using the previously used password, if it's still available.
+ - Configure SQL Edge to use a different mapped volume or data volume container.
+ - Remove the existing `master` database files (`master.mdf` and `mastlog.mdf`) from the mapped volume or data volume container.
- Review the [SQL Edge error logs](#errorlogs).

## <a id="errorlogs"></a> SQL Edge setup and error logs
-By default, SQL Edge error logs are present in the **/var/opt/mssql/log** directory within the container and can be accessed using any of the following ways:
+By default, SQL Edge error logs are present in the `/var/opt/mssql/log` directory within the container and can be accessed using any of the following ways:
+
+- If you mounted a host directory to `/var/opt/mssql` when you created your container, you can instead look in the `log` subdirectory on the mapped path on the host.
-- If you mounted a host directory to **/var/opt/mssql** when you created your container, you can instead look in the **log** subdirectory on the mapped path on the host.
-- By using an interactive command-prompt to connect to the container. If the container is not running, first start the container. Then use an interactive command-prompt to inspect the logs. You can get the container ID by running the command `docker ps`.
+- By using an interactive command-prompt to connect to the container. If the container isn't running, first start the container. Then use an interactive command-prompt to inspect the logs. You can get the container ID by running the command `docker ps`.
- ```bash
- docker start <ContainerID>
- docker exec -it <ContainerID> "/bin/bash"
- ```
+ ```bash
+ docker start <ContainerID>
+ docker exec -it <ContainerID> "/bin/bash"
+ ```
- From the bash session inside your container, run the following commands:
+ From the bash session inside your container, run the following commands:
- ```bash
- cd /var/opt/mssql/log
- cat errorlog
- ```
-- If the SQL Edge container is up and running and you are able to connect to the instance using client tools, then you can use the stored procedure `sp_readerrorlog` to read the contents of the SQL Edge error log.
+ ```bash
+ cd /var/opt/mssql/log
+ cat errorlog
+ ```
+
+- If the SQL Edge container is up and running and you're able to connect to the instance using client tools, then you can use the stored procedure `sp_readerrorlog` to read the contents of the SQL Edge error log.
## Execute commands in a container
To start a bash terminal in the container run:
docker exec -it <Container ID> /bin/bash
```
-Now you can run commands as though you are running them at the terminal inside the container. When finished, type `exit`. This exits in the interactive command session, but your container continues to run.
-
-### Enabling verbose logging
+Now you can run commands as though you're running them at the terminal inside the container. When finished, type `exit`. This exits the interactive command session, but your container continues to run.
-If the default log level for the streaming engine does not provide enough information, debug logging for the streaming engine can be enabled in SQL Edge. To enable debug logging add the `RuntimeLogLevel=debug` environment variable to your SQL Edge deployment. After enabling debug logging, attempt to reproduce the problem and check the logs for any relevant messages or exceptions.
+### Enable verbose logging
-> [!NOTE]
-> The Verbose Logging option should only be used for troubleshooting and not for regular production workload.
+If the default log level for the streaming engine doesn't provide enough information, debug logging for the streaming engine can be enabled in SQL Edge. To enable debug logging, add the `RuntimeLogLevel=debug` environment variable to your SQL Edge deployment. After enabling debug logging, attempt to reproduce the problem and check the logs for any relevant messages or exceptions.
+> [!NOTE]
+> The Verbose Logging option should only be used for troubleshooting and not for regular production workload.
## Next steps
If the default log level for the streaming engine does not provide enough inform
- [Data Streaming in Azure SQL Edge](stream-data.md)
- [Data Retention and cleanup](data-retention-overview.md)
- [Filling time gaps and imputing missing values](imputing-missing-values.md)
azure-sql-edge Tutorial Deploy Azure Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-deploy-azure-resources.md
description: In part one of this three-part Azure SQL Edge tutorial for predicti
Previously updated : 05/19/2020 Last updated : 09/14/2023 - -+
+ - devx-track-azurepowershell
+ - devx-track-azurecli
# Install software and set up resources for the tutorial
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
In this three-part tutorial, you'll create a machine learning model to predict iron ore impurities as a percentage of Silica, and then deploy the model in Azure SQL Edge. In part one, you'll install the required software and deploy Azure resources.

## Prerequisites

1. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/).
-2. Install Visual Studio 2019 with
- * Azure IoT Edge tools
- * .NET core cross-platform development
- * Container development tools
-3. Install [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio/)
-4. Open Azure Data Studio and configure Python for notebooks. For details, see [Configure Python for Notebooks](/sql/azure-data-studio/sql-notebooks#configure-python-for-notebooks). This step can take several minutes.
-5. Install the latest version of [Azure CLI](https://github.com/Azure/azure-powershell/releases/tag/v3.5.0-February2020). The following scripts require that AZ PowerShell be the latest version (3.5.0, Feb 2020).
-6. Set up the environment to debug, run, and test IoT Edge solution by installing [Azure IoT EdgeHub Dev Tool](https://pypi.org/project/iotedgehubdev/).
-7. Install Docker.
+1. Install Visual Studio 2019 with:
+ - Azure IoT Edge tools
+ - .NET core cross-platform development
+ - Container development tools
+1. Install [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio/).
+1. Open Azure Data Studio and configure Python for notebooks. For details, see [Configure Python for Notebooks](/sql/azure-data-studio/sql-notebooks#configure-python-for-notebooks). This step can take several minutes.
+1. Install the latest version of [Azure PowerShell](https://github.com/Azure/azure-powershell/releases/tag/v3.5.0-February2020). The following scripts require that Az PowerShell be the latest version (3.5.0, Feb 2020).
+1. Set up the environment to debug, run, and test IoT Edge solution by installing [Azure IoT EdgeHub Dev Tool](https://pypi.org/project/iotedgehubdev/).
+1. Install Docker.
## Deploy Azure resources using PowerShell Script
-Deploy the Azure resources required by this Azure SQL Edge tutorial. These can be deployed either by using a PowerShell script or through the Azure portal. This tutorial uses a PowerShell script.
+Deploy the Azure resources required by this Azure SQL Edge tutorial. These resources can be deployed either by using a PowerShell script or through the Azure portal. This tutorial uses a PowerShell script.
+ 1. Import the modules needed to run the PowerShell script in this tutorial.
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
az extension add --name azure-cli-ml
```
- [!INCLUDE [iot-hub-cli-version-info](../../includes/iot-hub-cli-version-info.md)]
--
-2. Declare the variables required by the PowerShell script.
+1. Declare the variables required by the PowerShell script.
```powershell
$ResourceGroup = "<name_of_the_resource_group>"
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
$StorageAccountName = "<name_of_your_storage_account>"
```
-3. Declare the rest of the variables.
+1. Declare the rest of the variables.
```powershell
$IoTHubSkuName = "S1"
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
$containerRegistryName = $ResourceGroup + "ContRegistry"
```
-4. To begin creating assets, log in to Azure.
+1. To begin creating assets, sign in to Azure.
```powershell
Login-AzAccount
az login
```
-5. Set the Azure subscription ID.
+1. Set the Azure subscription ID.
```powershell
Select-AzSubscription -Subscription $SubscriptionName
az account set --subscription $SubscriptionName
```
-6. Create the resource group if it doesn't already exist.
+1. Create the resource group if it doesn't already exist.
```powershell $rg = Get-AzResourceGroup -Name $ResourceGroup
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
} ```
-7. Create the storage account and storage account container in the resource group.
+1. Create the storage account and storage account container in the resource group.
```powershell
- $sa = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName
+ $sa = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName
if ($sa -eq $null) { New-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName -SkuName Standard_LRS -Location $location -Kind Storage
- $sa = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName
- $storagekey = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -Name $StorageAccountName
- $storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey[0].Value
- New-AzStorageContainer -Name "sqldatabasedacpac" -Context $storageContext
+ $sa = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName
+ $storageKey = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -Name $StorageAccountName
+ $storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storageKey[0].Value
+ New-AzStorageContainer -Name "sqldatabasedacpac" -Context $storageContext
} else {
- Write-Output ("Storage Account $StorageAccountName exists in Resource Group $ResourceGroup")
+ Write-Output ("Storage Account $StorageAccountName exists in Resource Group $ResourceGroup")
} ```
-8. Upload the database `dacpac` file to the storage account and generate a SAS URL for the blob. **Make a note of the SAS URL for the database `dacpac` blob.**
+1. Upload the database `dacpac` file to the storage account and generate a SAS URL for the blob. **Make a note of the SAS URL for the database `dacpac` blob.**
```powershell
$file = Read-Host "Please Enter the location to the zipped Database DacPac file:"
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
$DacpacFileSASURL = New-AzStorageBlobSASToken -Container "sqldatabasedacpac" -Blob "SQLDatabasedacpac.zip" -Context $sa.Context -Permission r -StartTime (Get-Date).DateTime -ExpiryTime (Get-Date).AddMonths(12) -FullUri ```
-9. Create an Azure container registry within this resource group.
+1. Create an Azure container registry within this resource group.
```powershell
- $containerRegistry = Get-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName
+ $containerRegistry = Get-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName
if ($containerRegistry -eq $null) {
- New-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName -Sku Standard -Location $location -EnableAdminUser
- $containerRegistry = Get-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName
+ New-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName -Sku Standard -Location $location -EnableAdminUser
+ $containerRegistry = Get-AzContainerRegistry -ResourceGroupName $ResourceGroup -Name $containerRegistryName
} else {
Deploy the Azure resources required by this Azure SQL Edge tutorial. These can b
} ```
-10. Create the network security group within the resource group.
-
- ```powershell
- $nsg = Get-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup -Name $NetworkSecGroup
- if($nsg -eq $null)
- {
- Write-Output("Network Security Group $NetworkSecGroup does not exist in the resource group $ResourceGroup")
-
- $rule1 = New-AzNetworkSecurityRuleConfig -Name "SSH" -Description "Allow SSH" -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 -SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 22
- $rule2 = New-AzNetworkSecurityRuleConfig -Name "SQL" -Description "Allow SQL" -Access Allow -Protocol Tcp -Direction Inbound -Priority 101 -SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 1600
- New-AzNetworkSecurityGroup -Name $NetworkSecGroup -ResourceGroupName $ResourceGroup -Location $location -SecurityRules $rule1, $rule2
-
- $nsg = Get-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup -Name $NetworkSecGroup
- }
- else
- {
- Write-Output ("Network Security Group $NetworkSecGroup exists in the resource group $ResourceGroup")
- }
- ```
-
-11. Create an Azure virtual machine enabled with SQL Edge. This VM will act as an Edge device.
-
- ```powershell
- $AzVM = Get-AzVM -ResourceGroupName $ResourceGroup -Name $EdgeDeviceId
- If($AzVM -eq $null)
- {
- Write-Output("The Azure VM with Name- $EdgeVMName is not present in the Resource Group- $ResourceGroup ")
-
- $SingleSubnet = New-AzVirtualNetworkSubnetConfig -Name $SubnetName -AddressPrefix $SubnetAddressPrefix
- $Vnet = New-AzVirtualNetwork -Name $NetworkName -ResourceGroupName $ResourceGroup -Location $location -AddressPrefix $VnetAddressPrefix -Subnet $SingleSubnet
- $publicIp = New-AzPublicIpAddress -Name $publicIpName -ResourceGroupName $ResourceGroup -AllocationMethod Static -Location $location
- $NIC = New-AzNetworkInterface -Name $NICName -ResourceGroupName $ResourceGroup -Location $location -SubnetId $Vnet.Subnets[0].Id -NetworkSecurityGroupId $nsg.Id -PublicIpAddressId $publicIp.Id
-
- ##Set-AzNetworkInterfaceIpConfig -Name "ipconfig1" -NetworkInterface $NIC -PublicIpAddress $publicIp
-
- $Credential = New-Object System.Management.Automation.PSCredential ($AdminAcc, $AdminPassword);
-
- $VirtualMachine = New-AzVMConfig -VMName $EdgeDeviceId -VMSize $VMSize
- $VirtualMachine = Set-AzVMOperatingSystem -VM $VirtualMachine -Linux -ComputerName $EdgeDeviceId -Credential $Credential
- $VirtualMachine = Add-AzVMNetworkInterface -VM $VirtualMachine -Id $NIC.Id
- $VirtualMachine = Set-AzVMSourceImage -VM $VirtualMachine -PublisherName $imagePublisher -Offer $imageOffer -Skus $imageSku -Version latest
- $VirtualMachine = Set-AzVMPlan -VM $VirtualMachine -Name $imageSku -Publisher $imagePublisher -Product $imageOffer
-
- $AzVM = New-AzVM -ResourceGroupName $ResourceGroup -Location $location -VM $VirtualMachine -Verbose
- $AzVM = Get-AzVM -ResourceGroupName $ResourceGroup -Name $EdgeDeviceId
-
- }
- else
- {
- Write-Output ("The Azure VM with Name- $EdgeDeviceId is present in the Resource Group- $ResourceGroup ")
- }
- ```
-
-12. Create an IoT hub within the resource group.
-
- ```powershell
- $iotHub = Get-AzIotHub -ResourceGroupName $ResourceGroup -Name $IoTHubName
- If($iotHub -eq $null)
- {
- Write-Output("IoTHub $IoTHubName does not exists, creating The IoTHub in the resource group $ResourceGroup")
- New-AzIotHub -ResourceGroupName $ResourceGroup -Name $IoTHubName -SkuName $IoTHubSkuName -Units $IoTHubUnits -Location $location -Verbose
- }
- else
- {
- Write-Output ("IoTHub $IoTHubName present in the resource group $ResourceGroup")
- }
- ```
-
-13. Add an Edge device to the IoT hub. This step only creates the device digital identity.
-
- ```powershell
- $deviceIdentity = Get-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId
- If($deviceIdentity -eq $null)
- {
- Write-Output("The Edge Device with DeviceId- $EdgeDeviceId is not registered to the IoTHub- $IoTHubName ")
- Add-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId -EdgeEnabled
- }
- else
- {
- Write-Output ("The Edge Device with DeviceId- $EdgeDeviceId is registered to the IoTHub- $IoTHubName")
- }
- $deviceIdentity = Get-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId
- ```
-
-14. Get the device primary connection string. This will be needed later for the VM. The following command uses Azure CLI for deployments.
-
- ```powershell
- $deviceConnectionString = az iot hub device-identity connection-string show --device-id $EdgeDeviceId --hub-name $IoTHubName --resource-group $ResourceGroup --subscription $SubscriptionName
- $connString = $deviceConnectionString[1].Substring(23,$deviceConnectionString[1].Length-24)
- $connString
- ```
-
-15. Update the connection string in the IoT Edge configuration file on the Edge device. The following commands use Azure CLI for deployments.
-
- ```azurecli
- $script = "/etc/iotedge/configedge.sh '" + $connString + "'"
- az vm run-command invoke -g $ResourceGroup -n $EdgeDeviceId --command-id RunShellScript --script $script
- ```
-
-16. Create an Azure Machine Learning workspace within the resource group.
-
- ```azurecli
- az ml workspace create -w $MyWorkSpace -g $ResourceGroup
- ```
--
-## Next Steps
-
-* [Set up IoT Edge modules and connections](tutorial-set-up-iot-edge-modules.md)
+1. Create the network security group within the resource group.
+
+ ```powershell
+ $nsg = Get-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup -Name $NetworkSecGroup
+ if($nsg -eq $null)
+ {
+ Write-Output("Network Security Group $NetworkSecGroup does not exist in the resource group $ResourceGroup")
+
+ $rule1 = New-AzNetworkSecurityRuleConfig -Name "SSH" -Description "Allow SSH" -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 -SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 22
+ $rule2 = New-AzNetworkSecurityRuleConfig -Name "SQL" -Description "Allow SQL" -Access Allow -Protocol Tcp -Direction Inbound -Priority 101 -SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 1600
+ New-AzNetworkSecurityGroup -Name $NetworkSecGroup -ResourceGroupName $ResourceGroup -Location $location -SecurityRules $rule1, $rule2
+
+ $nsg = Get-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup -Name $NetworkSecGroup
+ }
+ else
+ {
+ Write-Output ("Network Security Group $NetworkSecGroup exists in the resource group $ResourceGroup")
+ }
+ ```
+
+1. Create an Azure virtual machine enabled with SQL Edge. This VM acts as an Edge device.
+
+ ```powershell
+ $AzVM = Get-AzVM -ResourceGroupName $ResourceGroup -Name $EdgeDeviceId
+ If($AzVM -eq $null)
+ {
+ Write-Output("The Azure VM with Name- $EdgeVMName is not present in the Resource Group- $ResourceGroup ")
+
+ $SingleSubnet = New-AzVirtualNetworkSubnetConfig -Name $SubnetName -AddressPrefix $SubnetAddressPrefix
+ $Vnet = New-AzVirtualNetwork -Name $NetworkName -ResourceGroupName $ResourceGroup -Location $location -AddressPrefix $VnetAddressPrefix -Subnet $SingleSubnet
+ $publicIp = New-AzPublicIpAddress -Name $publicIpName -ResourceGroupName $ResourceGroup -AllocationMethod Static -Location $location
+ $NIC = New-AzNetworkInterface -Name $NICName -ResourceGroupName $ResourceGroup -Location $location -SubnetId $Vnet.Subnets[0].Id -NetworkSecurityGroupId $nsg.Id -PublicIpAddressId $publicIp.Id
+
+ ##Set-AzNetworkInterfaceIpConfig -Name "ipconfig1" -NetworkInterface $NIC -PublicIpAddress $publicIp
+
+ $Credential = New-Object System.Management.Automation.PSCredential ($AdminAcc, $AdminPassword);
+
+ $VirtualMachine = New-AzVMConfig -VMName $EdgeDeviceId -VMSize $VMSize
+ $VirtualMachine = Set-AzVMOperatingSystem -VM $VirtualMachine -Linux -ComputerName $EdgeDeviceId -Credential $Credential
+ $VirtualMachine = Add-AzVMNetworkInterface -VM $VirtualMachine -Id $NIC.Id
+ $VirtualMachine = Set-AzVMSourceImage -VM $VirtualMachine -PublisherName $imagePublisher -Offer $imageOffer -Skus $imageSku -Version latest
+ $VirtualMachine = Set-AzVMPlan -VM $VirtualMachine -Name $imageSku -Publisher $imagePublisher -Product $imageOffer
+
+ $AzVM = New-AzVM -ResourceGroupName $ResourceGroup -Location $location -VM $VirtualMachine -Verbose
+ $AzVM = Get-AzVM -ResourceGroupName $ResourceGroup -Name $EdgeDeviceId
+
+ }
+ else
+ {
+ Write-Output ("The Azure VM with Name- $EdgeDeviceId is present in the Resource Group- $ResourceGroup ")
+ }
+ ```
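+
+   After the VM is created, you can look up its public IP address, which later parts of this tutorial use as the server name in Azure Data Studio. A minimal sketch, reusing the `$publicIpName` variable from the script above:
+
+   ```powershell
+   # Retrieve the static public IP address assigned to the Edge VM.
+   (Get-AzPublicIpAddress -ResourceGroupName $ResourceGroup -Name $publicIpName).IpAddress
+   ```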
+
+1. Create an IoT hub within the resource group.
+
+ ```powershell
+ $iotHub = Get-AzIotHub -ResourceGroupName $ResourceGroup -Name $IoTHubName
+ If($iotHub -eq $null)
+ {
+    Write-Output("IoTHub $IoTHubName does not exist; creating the IoT hub in the resource group $ResourceGroup")
+ New-AzIotHub -ResourceGroupName $ResourceGroup -Name $IoTHubName -SkuName $IoTHubSkuName -Units $IoTHubUnits -Location $location -Verbose
+ }
+ else
+ {
+    Write-Output ("IoTHub $IoTHubName is present in the resource group $ResourceGroup")
+ }
+ ```
+
+1. Add an Edge device to the IoT hub. This step only creates the device digital identity.
+
+ ```powershell
+ $deviceIdentity = Get-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId
+ If($deviceIdentity -eq $null)
+ {
+ Write-Output("The Edge Device with DeviceId- $EdgeDeviceId is not registered to the IoTHub- $IoTHubName ")
+ Add-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId -EdgeEnabled
+ }
+ else
+ {
+ Write-Output ("The Edge Device with DeviceId- $EdgeDeviceId is registered to the IoTHub- $IoTHubName")
+ }
+ $deviceIdentity = Get-AzIotHubDevice -ResourceGroupName $ResourceGroup -IotHubName $IoTHubName -DeviceId $EdgeDeviceId
+ ```
+
+1. Get the device primary connection string, which is needed later for the VM. The following command uses Azure CLI for deployments.
+
+ ```powershell
+ $deviceConnectionString = az iot hub device-identity connection-string show --device-id $EdgeDeviceId --hub-name $IoTHubName --resource-group $ResourceGroup --subscription $SubscriptionName
+   # The CLI prints JSON; these fixed offsets strip the surrounding "connectionString": "..." wrapper.
+   $connString = $deviceConnectionString[1].Substring(23,$deviceConnectionString[1].Length-24)
+ $connString
+ ```
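+
+   The fixed `Substring` offsets above are brittle because they depend on the exact JSON layout the CLI prints. A hedged alternative is to ask the CLI for just the value:
+
+   ```powershell
+   # --query extracts the connectionString property; --output tsv strips the JSON quoting.
+   $connString = az iot hub device-identity connection-string show --device-id $EdgeDeviceId --hub-name $IoTHubName --resource-group $ResourceGroup --subscription $SubscriptionName --query connectionString --output tsv
+   ```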
+
+1. Update the connection string in the IoT Edge configuration file on the Edge device. The following commands use Azure CLI for deployments.
+
+ ```azurecli
+ $script = "/etc/iotedge/configedge.sh '" + $connString + "'"
+ az vm run-command invoke -g $ResourceGroup -n $EdgeDeviceId --command-id RunShellScript --script $script
+ ```
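+
+   To confirm that the IoT Edge runtime picked up the new connection string, you can run a quick check on the device. This is a hedged example; it assumes the marketplace VM image includes the `iotedge` command-line tool:
+
+   ```azurecli
+   az vm run-command invoke -g $ResourceGroup -n $EdgeDeviceId --command-id RunShellScript --script "sudo iotedge list"
+   ```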
+
+1. Create an Azure Machine Learning workspace within the resource group.
+
+ ```azurecli
+ az ml workspace create -w $MyWorkSpace -g $ResourceGroup
+ ```
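+
+   The `-w`/`-g` syntax above comes from the older `azure-cli-ml` CLI extension. If the command reports that `az ml` isn't found, the extension can be added first (a hedged note; extension packaging has changed across CLI versions):
+
+   ```azurecli
+   az extension add --name azure-cli-ml
+   ```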
+
+## Next steps
+
+- [Set up IoT Edge modules and connections](tutorial-set-up-iot-edge-modules.md)
azure-sql-edge Tutorial Renewable Energy Demo https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-renewable-energy-demo.md
description: This tutorial shows you how to use Azure SQL Edge for wake-detectio
Previously updated : 05/11/2023 Last updated : 09/14/2023 # Use Azure SQL Edge to build smarter renewable resources
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ The [Wind Turbine Demo](https://github.com/microsoft/sql-server-samples/tree/master/samples/demos/azure-sql-edge-demos/Wind%20Turbine%20Demo) for Azure SQL Edge is based on Contoso Renewable Energy, a wind turbine farm that uses SQL Edge for data processing onboard the generator. This demo walks you through resolving an alert raised when wind turbulence is detected at the device. You'll train a model and deploy it to SQL Edge, which corrects the detected wind wake and ultimately optimizes power output.
azure-sql-edge Tutorial Run Ml Model On Sql Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-run-ml-model-on-sql-edge.md
description: In part three of this three-part Azure SQL Edge tutorial for predic
Previously updated : 05/19/2020 Last updated : 09/14/2023 -
+# Deploy ML model on Azure SQL Edge using ONNX
-# Deploy ML model on Azure SQL Edge using ONNX
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
In part three of this three-part tutorial for predicting iron ore impurities in Azure SQL Edge, you'll:

1. Use Azure Data Studio to connect to SQL Database in the Azure SQL Edge instance.
-2. Predict iron ore impurities with ONNX in Azure SQL Edge.
+1. Predict iron ore impurities with ONNX in Azure SQL Edge.
-## Key Components
+## Key components
-1. The solution uses a default 500 milliseconds between each message sent to the Edge Hub. This can be changed in the **Program.cs** file
- ```json
+1. The solution uses a default delay of 500 milliseconds between each message sent to the Edge Hub. This delay can be changed in the **Program.cs** file:
+
+ ```csharp
TimeSpan messageDelay = configuration.GetValue("MessageDelay", TimeSpan.FromMilliseconds(500));
```
-2. The solution generated a message, with the following attributes. Add or remove the attributes as per requirements.
-```json
-{
- timestamp
- cur_Iron_Feed
- cur_Silica_Feed
- cur_Starch_Flow
- cur_Amina_Flow
- cur_Ore_Pulp_pH
- cur_Flotation_Column_01_Air_Flow
- cur_Flotation_Column_02_Air_Flow
- cur_Flotation_Column_03_Air_Flow
- cur_Flotation_Column_04_Air_Flow
- cur_Flotation_Column_01_Level
- cur_Flotation_Column_02_Level
- cur_Flotation_Column_03_Level
- cur_Flotation_Column_04_Level
- cur_Iron_Concentrate
-}
-```
+
+1. The solution generates a message with the following attributes. Add or remove attributes as needed.
+
+ ```json
+ {
+ timestamp
+ cur_Iron_Feed
+ cur_Silica_Feed
+ cur_Starch_Flow
+ cur_Amina_Flow
+ cur_Ore_Pulp_pH
+ cur_Flotation_Column_01_Air_Flow
+ cur_Flotation_Column_02_Air_Flow
+ cur_Flotation_Column_03_Air_Flow
+ cur_Flotation_Column_04_Air_Flow
+ cur_Flotation_Column_01_Level
+ cur_Flotation_Column_02_Level
+ cur_Flotation_Column_03_Level
+ cur_Flotation_Column_04_Level
+ cur_Iron_Concentrate
+ }
+ ```
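+
+   For context, here's a hypothetical sketch (not the sample's actual code) of how such a message could be serialized and sent to the module's `IronOreMeasures` output with the Azure IoT Device SDK; the field values are illustrative:
+
+   ```csharp
+   using System;
+   using System.Text;
+   using System.Threading.Tasks;
+   using Microsoft.Azure.Devices.Client;
+   using Newtonsoft.Json;
+
+   static async Task SendReadingAsync(ModuleClient moduleClient)
+   {
+       // Build one reading with the attributes listed above (values are made up).
+       var reading = new
+       {
+           timestamp = DateTime.UtcNow,
+           cur_Iron_Feed = 57.2,
+           cur_Silica_Feed = 12.4
+       };
+
+       using var message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(reading)))
+       {
+           ContentType = "application/json",
+           ContentEncoding = "utf-8"
+       };
+
+       // "IronOreMeasures" matches the output name referenced by the routes in part two.
+       await moduleClient.SendEventAsync("IronOreMeasures", message);
+   }
+   ```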
## Connect to the SQL Database in the Azure SQL Edge instance to train, deploy, and test the ML model

1. Open Azure Data Studio.
-2. In the **Welcome** tab, start a new connection with the following details:
-
- |_Field_|_Value_|
- |-|-|
- |Connection type| Microsoft SQL Server|
- |Server|Public IP address mentioned in the VM that was created for this demo|
- |Username|sa|
- |Password|The strong password that was used while creating the Azure SQL Edge instance|
- |Database|Default|
- |Server group|Default|
- |Name (optional)|Provide an optional name|
+1. In the **Welcome** tab, start a new connection with the following details:
-3. Click **Connect**
+ | Field | Value |
+ | | |
+ | Connection type | Microsoft SQL Server |
+ | Server | Public IP address mentioned in the VM that was created for this demo |
+ | Username | sa |
+ | Password | The strong password that was used while creating the Azure SQL Edge instance |
+ | Database | Default |
+ | Server group | Default |
+ | Name (optional) | Provide an optional name |
-4. In the **File** section, open **/DeploymentScripts/MiningProcess_ONNX.jpynb** from the folder where you have cloned the project files on your machine.
+1. Select **Connect**.
-5. Set the kernel to Python 3.
+1. In the **File** section, open `/DeploymentScripts/MiningProcess_ONNX.jpynb` from the folder in which you cloned the project files on your machine.
+1. Set the kernel to Python 3.
## Next steps
-For more information on using ONNX models in Azure SQL Edge, see [Machine learning and AI with ONNX in SQL Edge](onnx-overview.md).
+- For more information on using ONNX models in Azure SQL Edge, see [Machine learning and AI with ONNX in SQL Edge](onnx-overview.md).
azure-sql-edge Tutorial Set Up Iot Edge Modules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-set-up-iot-edge-modules.md
description: In part two of this three-part Azure SQL Edge tutorial for predicti
Previously updated : 09/22/2020 Last updated : 09/14/2023 - - # Set up IoT Edge modules and connections
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ In part two of this three-part tutorial for predicting iron ore impurities in Azure SQL Edge, you'll set up the following IoT Edge modules:

- Azure SQL Edge
In part two of this three-part tutorial for predicting iron ore impurities in Az
## Specify container registry credentials
-The credentials to the container registries hosting module images need to be specified. These can be found in the container registry that was created in your resource group. Navigate to the **Access Keys** section. Make note of the following fields:
+The credentials for the container registries that host the module images need to be specified. You can find these credentials in the container registry that was created in your resource group: navigate to the **Access Keys** section and make note of the following fields:
- Registry name
- Login server
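
If you prefer the command line, the same values can be retrieved with the Azure CLI. A minimal sketch, assuming the registry's admin account is enabled and `$RegistryName` holds your registry name (a placeholder):

```azurecli
az acr show --name $RegistryName --query loginServer --output tsv
az acr credential show --name $RegistryName --query "[username, passwords[0].value]" --output tsv
```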
Now, specify the container credentials in the IoT Edge module.
1. Navigate to the IoT hub that was created in your resource group.
-2. In the **IoT Edge** section under **Automatic Device Management**, click **Device ID**. For this tutorial, the ID is `IronOrePredictionDevice`.
+1. In the **IoT Edge** section under **Automatic Device Management**, select **Device ID**. For this tutorial, the ID is `IronOrePredictionDevice`.
-3. Select the **Set Modules** section.
+1. Select the **Set Modules** section.
-4. Under **Container Registry Credentials**, enter the following values:
+1. Under **Container Registry Credentials**, enter the following values:
- | _Field_ | _Value_ |
- | - | - |
- | Name | Registry name |
- | Address | Login server |
- | User Name | Username |
- | Password | Password |
+ | Field | Value |
+ | | |
+ | Name | Registry name |
+ | Address | Login server |
+ | User Name | Username |
+ | Password | Password |
## Build, push, and deploy the Data Generator Module 1. Clone the project files to your machine.
-2. Open the file **IronOre_Silica_Predict.sln** using Visual Studio 2019
-3. Update the container registry details in the **deployment.template.json**
+1. Open the file **IronOre_Silica_Predict.sln** using Visual Studio 2019.
+1. Update the container registry details in the **deployment.template.json** file:
+ ```json
- "registryCredentials":{
- "RegistryName":{
- "username":"",
- "password":""
- "address":""
+ "registryCredentials": {
+ "RegistryName": {
+ "username": "",
+ "password": "",
+ "address": ""
}
- }
+ }
```
-4. Update the **modules.json** file to specify the target container registry (or repository for the module)
+
+1. Update the **modules.json** file to specify the target container registry (or repository) for the module:
+ ```json
- "image":{
+ "image": {
"repository":"samplerepo.azurecr.io/ironoresilicapercent", "tag":
- }
+ }
```
-5. Execute the project in either debug or release mode to ensure the project runs without any issues
-6. Push the project to your container registry by right-clicking the project name and then selecting **Build and Push IoT Edge Modules**.
-7. Deploy the Data Generator module as an IoT Edge module to your Edge device.
+
+1. Execute the project in either debug or release mode to ensure the project runs without any issues.
+1. Push the project to your container registry by right-clicking the project name and then selecting **Build and Push IoT Edge Modules**.
+1. Deploy the Data Generator module as an IoT Edge module to your Edge device.
## Deploy the Azure SQL Edge module
-1. Deploy the Azure SQL Edge module by clicking on **+ Add** and then **Marketplace Module**.
+1. Deploy the Azure SQL Edge module by selecting **+ Add** and then **Marketplace Module**.
-2. On the **IoT Edge Module Marketplace** blade, search for *Azure SQL Edge* and pick *Azure SQL Edge Developer*.
+1. On the **IoT Edge Module Marketplace** pane, search for *Azure SQL Edge* and pick *Azure SQL Edge Developer*.
-3. Click on the newly added *Azure SQL Edge* module under **IoT Edge Modules** to configure the Azure SQL Edge module. For more information on the configuration options, see [Deploy Azure SQL Edge](./deploy-portal.md).
+1. Select the newly added *Azure SQL Edge* module under **IoT Edge Modules** to configure the Azure SQL Edge module. For more information on the configuration options, see [Deploy Azure SQL Edge](./deploy-portal.md).
-4. Add the `MSSQL_PACKAGE` environment variable to the *Azure SQL Edge* module deployment, and specify the SAS URL of the database dacpac file created in step 8 of [Part one](tutorial-deploy-azure-resources.md) of this tutorial.
+1. Add the `MSSQL_PACKAGE` environment variable to the *Azure SQL Edge* module deployment, and specify the SAS URL of the database dacpac file created in step 8 of [Part one](tutorial-deploy-azure-resources.md) of this tutorial.
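+
+   In the deployment manifest, that environment variable ends up looking something like the following sketch; the storage account, container, and SAS token are placeholders:
+
+   ```json
+   "env": {
+     "MSSQL_PACKAGE": {
+       "value": "https://<storage-account>.blob.core.windows.net/<container>/<database>.dacpac?<sas-token>"
+     }
+   }
+   ```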
-5. Click **update**
+1. Select **Update**.
-6. On the **Set modules on device** page, click **Next: Routes >**.
+1. On the **Set modules on device** page, select **Next: Routes >**.
-7. On the routes pane of the **Set modules on device** page, specify the routes for module to IoT Edge hub communication as described below. Make sure to update the module names in the route definitions below.
+1. On the routes pane of the **Set modules on device** page, specify the routes for module-to-IoT Edge hub communication, as described below. Make sure to update the module names in the following route definitions.
- ```syntax
- FROM /messages/modules/<your_data_generator_module>/outputs/IronOreMeasures INTO
- BrokeredEndpoint("/modules/<your_azure_sql_edge_module>/inputs/IronOreMeasures")
+ ```sql
+ FROM /messages/modules/<your_data_generator_module>/outputs/IronOreMeasures
+ INTO BrokeredEndpoint("/modules/<your_azure_sql_edge_module>/inputs/IronOreMeasures")
``` For example:
- ```syntax
- FROM /messages/modules/ASEDataGenerator/outputs/IronOreMeasures INTO BrokeredEndpoint("/modules/AzureSQLEdge/inputs/IronOreMeasures")
+ ```sql
+ FROM /messages/modules/ASEDataGenerator/outputs/IronOreMeasures
+ INTO BrokeredEndpoint("/modules/AzureSQLEdge/inputs/IronOreMeasures")
```
+1. On the **Set modules on device** page, select **Next: Review + create >**.
-7. On the **Set modules on device** page, click **Next: Review + create >**
-
-8. On the **Set modules on device** page, click **Create**
+1. On the **Set modules on device** page, select **Create**.
## Create and start the T-SQL streaming job in Azure SQL Edge

1. Open Azure Data Studio.
-2. In the **Welcome** tab, start a new connection with the following details:
+1. In the **Welcome** tab, start a new connection with the following details:
- |_Field_|_Value_|
- |-|-|
- |Connection type| Microsoft SQL Server|
- |Server|Public IP address mentioned in the VM that was created for this demo|
- |Username|sa|
- |Password|The strong password that was used while creating the Azure SQL Edge instance|
- |Database|Default|
- |Server group|Default|
- |Name (optional)|Provide an optional name|
+ | Field | Value |
+ | | |
+ | Connection type | Microsoft SQL Server |
+ | Server | Public IP address mentioned in the VM that was created for this demo |
+ | Username | sa |
+ | Password | The strong password that was used while creating the Azure SQL Edge instance |
+ | Database | Default |
+ | Server group | Default |
+ | Name (optional) | Provide an optional name |
-3. Click **Connect**
+1. Select **Connect**.
-4. In the **File** menu tab, open a new notebook or use the keyboard shortcut Ctrl + N.
+1. In the **File** menu tab, open a new notebook or use the keyboard shortcut **Ctrl + N**.
-5. In the new Query window, execute the script below to create the T-SQL Streaming job. Before executing the script, make sure to change the following variables.
- - *SQL_SA_Password:* The MSSQL_SA_PASSWORD value specified while deploy the Azure SQL Edge Module.
+1. In the new Query window, execute the script below to create the T-SQL streaming job. Before executing the script, make sure to change the following variables:
+
+ - `@SQL_SA_Password`: The `MSSQL_SA_PASSWORD` value specified while deploying the Azure SQL Edge Module.
```sql
- Use IronOreSilicaPrediction
- Go
+ USE IronOreSilicaPrediction;
+ GO
- Declare @SQL_SA_Password varchar(200) = '<SQL_SA_Password>'
- declare @query varchar(max)
+ DECLARE @SQL_SA_Password VARCHAR(200) = '<SQL_SA_Password>';
+ DECLARE @query VARCHAR(MAX);
- /*
- Create Objects Required for Streaming
- */
+ /* Create objects required for streaming */
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'MyStr0ng3stP@ssw0rd';
- If NOT Exists (select name from sys.external_file_formats where name = 'JSONFormat')
- Begin
- CREATE EXTERNAL FILE FORMAT [JSONFormat]
- WITH ( FORMAT_TYPE = JSON)
- End
--
- If NOT Exists (select name from sys.external_data_sources where name = 'EdgeHub')
- Begin
- Create EXTERNAL DATA SOURCE [EdgeHub]
- With(
- LOCATION = N'edgehub://'
- )
- End
-
- If NOT Exists (select name from sys.external_streams where name = 'IronOreInput')
- Begin
- CREATE EXTERNAL STREAM IronOreInput WITH
- (
- DATA_SOURCE = EdgeHub,
- FILE_FORMAT = JSONFormat,
- LOCATION = N'IronOreMeasures'
- )
- End
--
- If NOT Exists (select name from sys.database_scoped_credentials where name = 'SQLCredential')
- Begin
- set @query = 'CREATE DATABASE SCOPED CREDENTIAL SQLCredential
- WITH IDENTITY = ''sa'', SECRET = ''' + @SQL_SA_Password + ''''
- Execute(@query)
- End
-
- If NOT Exists (select name from sys.external_data_sources where name = 'LocalSQLOutput')
- Begin
- CREATE EXTERNAL DATA SOURCE LocalSQLOutput WITH (
- LOCATION = 'sqlserver://tcp:.,1433',CREDENTIAL = SQLCredential)
- End
-
- If NOT Exists (select name from sys.external_streams where name = 'IronOreOutput')
- Begin
- CREATE EXTERNAL STREAM IronOreOutput WITH
- (
- DATA_SOURCE = LocalSQLOutput,
- LOCATION = N'IronOreSilicaPrediction.dbo.IronOreMeasurements'
- )
- End
-
- EXEC sys.sp_create_streaming_job @name=N'IronOreData',
- @statement= N'Select * INTO IronOreOutput from IronOreInput'
-
- exec sys.sp_start_streaming_job @name=N'IronOreData'
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.external_file_formats
+ WHERE name = 'JSONFormat'
+ )
+ BEGIN
+ CREATE EXTERNAL FILE FORMAT [JSONFormat]
+ WITH (FORMAT_TYPE = JSON)
+ END
+
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.external_data_sources
+ WHERE name = 'EdgeHub'
+ )
+ BEGIN
+ CREATE EXTERNAL DATA SOURCE [EdgeHub]
+ WITH (LOCATION = N'edgehub://')
+ END
+
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.external_streams
+ WHERE name = 'IronOreInput'
+ )
+ BEGIN
+ CREATE EXTERNAL STREAM IronOreInput
+ WITH (
+ DATA_SOURCE = EdgeHub,
+ FILE_FORMAT = JSONFormat,
+ LOCATION = N'IronOreMeasures'
+ )
+ END
+
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.database_scoped_credentials
+ WHERE name = 'SQLCredential'
+ )
+ BEGIN
+ SET @query = 'CREATE DATABASE SCOPED CREDENTIAL SQLCredential
+ WITH IDENTITY = ''sa'', SECRET = ''' + @SQL_SA_Password + ''''
+
+ EXECUTE (@query)
+ END
+
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.external_data_sources
+ WHERE name = 'LocalSQLOutput'
+ )
+ BEGIN
+ CREATE EXTERNAL DATA SOURCE LocalSQLOutput
+ WITH (
+ LOCATION = 'sqlserver://tcp:.,1433',
+ CREDENTIAL = SQLCredential
+ )
+ END
+
+ IF NOT EXISTS (
+ SELECT name
+ FROM sys.external_streams
+ WHERE name = 'IronOreOutput'
+ )
+ BEGIN
+ CREATE EXTERNAL STREAM IronOreOutput
+ WITH (
+ DATA_SOURCE = LocalSQLOutput,
+ LOCATION = N'IronOreSilicaPrediction.dbo.IronOreMeasurements'
+ )
+ END
+
+ EXEC sys.sp_create_streaming_job @name = N'IronOreData',
+        @statement = N'SELECT * INTO IronOreOutput FROM IronOreInput';
+
+ EXEC sys.sp_start_streaming_job @name = N'IronOreData';
```
-6. Use the following query to verify that the data from the data generation module is being streamed into the database.
+1. Use the following query to verify that the data from the data generation module is being streamed into the database.
```sql
- Select Top 10 * from dbo.IronOreMeasurements
- order by timestamp desc
+ SELECT TOP 10 *
+ FROM dbo.IronOreMeasurements
+ ORDER BY timestamp DESC;
```

In this tutorial, we deployed the data generator module and the SQL Edge module. Then we created a streaming job to stream the data generated by the data generation module to the SQL Edge database.
-## Next Steps
+## Next steps
- [Deploy ML model on Azure SQL Edge using ONNX](tutorial-run-ml-model-on-sql-edge.md)
azure-sql-edge Tutorial Sync Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-sync-data-factory.md
Title: Sync data from Azure SQL Edge by using Azure Data Factory
description: Learn about syncing data between Azure SQL Edge and Azure Blob storage - Previously updated : 05/19/2020 Last updated : 09/14/2023 keywords: - SQL Edge - sync data from SQL Edge - SQL Edge data factory- - # Tutorial: Sync data from SQL Edge to Azure Blob storage by using Azure Data Factory
-In this tutorial, you'll use Azure Data Factory to incrementally sync data to Azure Blob storage from a table in an instance of Azure SQL Edge.
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+This tutorial shows you how to use Azure Data Factory to incrementally sync data to Azure Blob storage from a table in an instance of Azure SQL Edge.
## Before you begin

If you haven't already created a database or table in your Azure SQL Edge deployment, use one of these methods to create one:
-* Use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms/) or [Azure Data Studio](/sql/azure-data-studio/download/) to connect to SQL Edge. Run a SQL script to create the database and table.
-* Create a database and table by using [SQLCMD](/sql/tools/sqlcmd-utility/) by directly connecting to the SQL Edge module. For more information, see [Connect to the Database Engine by using sqlcmd](/sql/ssms/scripting/sqlcmd-connect-to-the-database-engine/).
-* Use SQLPackage.exe to deploy a DAC package file to the SQL Edge container. You can automate this process by specifying the SqlPackage file URI as part of the module's desired properties configuration. You can also directly use the SqlPackage.exe client tool to deploy a DAC package to SQL Edge.
+- Use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms/) or [Azure Data Studio](/sql/azure-data-studio/download/) to connect to SQL Edge. Run a SQL script to create the database and table.
+- Create a database and table by using [sqlcmd](/sql/tools/sqlcmd-utility/) by directly connecting to the SQL Edge module. For more information, see [Connect to the Database Engine by using sqlcmd](/sql/ssms/scripting/sqlcmd-connect-to-the-database-engine/).
+- Use SQLPackage.exe to deploy a DAC package file to the SQL Edge container. You can automate this process by specifying the SqlPackage file URI as part of the module's desired properties configuration. You can also directly use the SqlPackage.exe client tool to deploy a DAC package to SQL Edge.
- For information about how to download SqlPackage.exe, see [Download and install sqlpackage](/sql/tools/sqlpackage-download/). Following are some sample commands for SqlPackage.exe. For more information, see the SqlPackage.exe documentation.
+ For information about how to download SqlPackage.exe, see [Download and install sqlpackage](/sql/tools/sqlpackage-download/). Following are some sample commands for SqlPackage.exe. For more information, see the SqlPackage.exe documentation.
- **Create a DAC package**
+ **Create a DAC package**
- ```cmd
- sqlpackage /Action:Extract /SourceConnectionString:"Data Source=<Server_Name>,<port>;Initial Catalog=<DB_name>;User ID=<user>;Password=<password>" /TargetFile:<dacpac_file_name>
- ```
+ ```cmd
+ sqlpackage /Action:Extract /SourceConnectionString:"Data Source=<Server_Name>,<port>;Initial Catalog=<DB_name>;User ID=<user>;Password=<password>" /TargetFile:<dacpac_file_name>
+ ```
- **Apply a DAC package**
+ **Apply a DAC package**
- ```cmd
- sqlpackage /Action:Publish /Sourcefile:<dacpac_file_name> /TargetServerName:<Server_Name>,<port> /TargetDatabaseName:<DB_Name> /TargetUser:<user> /TargetPassword:<password>
- ```
+ ```cmd
+ sqlpackage /Action:Publish /Sourcefile:<dacpac_file_name> /TargetServerName:<Server_Name>,<port> /TargetDatabaseName:<DB_Name> /TargetUser:<user> /TargetPassword:<password>
+ ```
## Create a SQL table and procedure to store and update the watermark levels
A watermark table is used to store the last timestamp up to which data has alrea
Run these commands on the SQL Edge instance:

```sql
- Create table [dbo].[watermarktable]
- (
- TableName varchar(255),
- WatermarkValue datetime,
- )
- GO
-
- CREATE PROCEDURE usp_write_watermark @timestamp datetime, @TableName varchar(50)
- AS
- BEGIN
-
+CREATE TABLE [dbo].[watermarktable] (
+ TableName VARCHAR(255),
+ WatermarkValue DATETIME,
+);
+GO
+
+CREATE PROCEDURE usp_write_watermark @timestamp DATETIME,
+ @TableName VARCHAR(50)
+AS
+BEGIN
UPDATE [dbo].[watermarktable] SET [WatermarkValue] = @timestamp
- WHERE [TableName] = @TableName
-
- END
- Go
+ WHERE [TableName] = @TableName;
+END
+GO
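+
+/* Illustrative addition, not part of the original script: usp_write_watermark only
+   updates an existing row, so seed the table once for each table you plan to sync.
+   'TemperatureSensor' matches the sample query used later in this tutorial. */
+INSERT INTO [dbo].[watermarktable] (TableName, WatermarkValue)
+VALUES ('TemperatureSensor', '1900-01-01');
+GO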
```

## Create a Data Factory pipeline
Create a data factory by following the instructions in [this tutorial](../data-f
1. On the **Let's get started** page of the Data Factory UI, select **Create pipeline**.
- ![Create a Data Factory pipeline](media/tutorial-sync-data-factory/data-factory-get-started.png)
+ :::image type="content" source="media/tutorial-sync-data-factory/data-factory-get-started.png" alt-text="Screenshot of the create a Data Factory pipeline.":::
-2. On the **General** page of the **Properties** window for the pipeline, enter **PeriodicSync** for the name.
+1. On the **General** page of the **Properties** window for the pipeline, enter **PeriodicSync** for the name.
-3. Add the Lookup activity to get the old watermark value. In the **Activities** pane, expand **General** and drag the **Lookup** activity to the pipeline designer surface. Change the name of the activity to **OldWatermark**.
+1. Add the Lookup activity to get the old watermark value. In the **Activities** pane, expand **General** and drag the **Lookup** activity to the pipeline designer surface. Change the name of the activity to **OldWatermark**.
- ![Add the old watermark lookup](media/tutorial-sync-data-factory/create-old-watermark-lookup.png)
+ :::image type="content" source="media/tutorial-sync-data-factory/create-old-watermark-lookup.png" alt-text="Screenshot of adding the old watermark lookup.":::
-4. Switch to the **Settings** tab and select **New** for **Source Dataset**. You'll now create a dataset to represent data in the watermark table. This table contains the old watermark that was used in the previous copy operation.
+1. Switch to the **Settings** tab and select **New** for **Source Dataset**. You'll now create a dataset to represent data in the watermark table. This table contains the old watermark that was used in the previous copy operation.
-5. In the **New Dataset** window, select **Azure SQL Server**, and then select **Continue**.
+1. In the **New Dataset** window, select **Azure SQL Server**, and then select **Continue**.
-6. In the **Set properties** window for the dataset, under **Name**, enter **WatermarkDataset**.
+1. In the **Set properties** window for the dataset, under **Name**, enter **WatermarkDataset**.
-7. For **Linked Service**, select **New**, and then complete these steps:
+1. For **Linked Service**, select **New**, and then complete these steps:
- 1. Under **Name**, enter **SQLDBEdgeLinkedService**.
+ 1. Under **Name**, enter **SQLDBEdgeLinkedService**.
- 2. Under **Server name**, enter your SQL Edge server details.
+ 1. Under **Server name**, enter your SQL Edge server details.
- 3. Select your **Database name** from the list.
+ 1. Select your **Database name** from the list.
- 4. Enter your **User name** and **Password**.
+ 1. Enter your **User name** and **Password**.
- 5. To test the connection to the SQL Edge instance, select **Test connection**.
+ 1. To test the connection to the SQL Edge instance, select **Test connection**.
- 6. Select **Create**.
+ 1. Select **Create**.
- ![Create a linked service](media/tutorial-sync-data-factory/create-linked-service.png)
+ :::image type="content" source="media/tutorial-sync-data-factory/create-linked-service.png" alt-text="Screenshot of creating a linked service.":::
- 7. Select **OK**.
+ 1. Select **OK**.
-8. On the **Settings** tab, select **Edit**.
+1. On the **Settings** tab, select **Edit**.
-9. On the **Connection** tab, select **[dbo].[watermarktable]** for **Table**. If you want to preview data in the table, select **Preview data**.
+1. On the **Connection** tab, select `[dbo].[watermarktable]` for **Table**. If you want to preview data in the table, select **Preview data**.
-10. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left. In the properties window for the Lookup activity, confirm that **WatermarkDataset** is selected in the **Source dataset** list.
+1. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left. In the properties window for the Lookup activity, confirm that **WatermarkDataset** is selected in the **Source dataset** list.
-11. In the **Activities** pane, expand **General** and drag another **Lookup** activity to the pipeline designer surface. Set the name to **NewWatermark** on the **General** tab of the properties window. This Lookup activity gets the new watermark value from the table that contains the source data so it can be copied to the destination.
+1. In the **Activities** pane, expand **General** and drag another **Lookup** activity to the pipeline designer surface. Set the name to **NewWatermark** on the **General** tab of the properties window. This Lookup activity gets the new watermark value from the table that contains the source data so it can be copied to the destination.
-12. In the properties window for the second Lookup activity, switch to the **Settings** tab and select **New** to create a dataset to point to the source table that contains the new watermark value.
+1. In the properties window for the second Lookup activity, switch to the **Settings** tab and select **New** to create a dataset to point to the source table that contains the new watermark value.
-13. In the **New Dataset** window, select **SQL Edge instance**, and then select **Continue**.
+1. In the **New Dataset** window, select **SQL Edge instance**, and then select **Continue**.
- 1. In the **Set properties** window, under **Name**, enter **SourceDataset**. Under **Linked service**, select **SQLDBEdgeLinkedService**.
+ 1. In the **Set properties** window, under **Name**, enter **SourceDataset**. Under **Linked service**, select **SQLDBEdgeLinkedService**.
- 2. Under **Table**, select the table that you want to synchronize. You can also specify a query for this dataset, as described later in this tutorial. The query takes precedence over the table you specify in this step.
+ 1. Under **Table**, select the table that you want to synchronize. You can also specify a query for this dataset, as described later in this tutorial. The query takes precedence over the table you specify in this step.
- 3. Select **OK**.
+ 1. Select **OK**.
-14. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left. In the properties window for the Lookup activity, confirm that **SourceDataset** is selected in the **Source dataset** list.
+1. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left. In the properties window for the Lookup activity, confirm that **SourceDataset** is selected in the **Source dataset** list.
-15. Select **Query** under **Use query**. Update the table name in the following query and then enter the query. You're selecting only the maximum value of `timestamp` from the table. Be sure to select **First row only**.
+1. Select **Query** under **Use query**. Update the table name in the following query and then enter the query. You're selecting only the maximum value of `timestamp` from the table. Be sure to select **First row only**.
- ```sql
- select MAX(timestamp) as NewWatermarkvalue from [TableName]
- ```
+ ```sql
+ SELECT MAX(timestamp) AS NewWatermarkValue
+ FROM [TableName];
+ ```
- ![select query](media/tutorial-sync-data-factory/select-query-data-factory.png)
+ :::image type="content" source="media/tutorial-sync-data-factory/select-query-data-factory.png" alt-text="Screenshot of a select query.":::
-16. In the **Activities** pane, expand **Move & Transform** and drag the **Copy** activity from the **Activities** pane to the designer surface. Set the name of the activity to **IncrementalCopy**.
+1. In the **Activities** pane, expand **Move & Transform** and drag the **Copy** activity from the **Activities** pane to the designer surface. Set the name of the activity to **IncrementalCopy**.
-17. Connect both Lookup activities to the Copy activity by dragging the green button attached to the Lookup activities to the Copy activity. Release the mouse button when you see the border color of the Copy activity change to blue.
+1. Connect both Lookup activities to the Copy activity by dragging the green button attached to the Lookup activities to the Copy activity. Release the mouse button when you see the border color of the Copy activity change to blue.
-18. Select the Copy activity and confirm that you see the properties for the activity in the **Properties** window.
+1. Select the Copy activity and confirm that you see the properties for the activity in the **Properties** window.
-19. Switch to the **Source** tab in the **Properties** window and complete these steps:
+1. Switch to the **Source** tab in the **Properties** window and complete these steps:
- 1. In the **Source dataset** box, select **SourceDataset**.
+ 1. In the **Source dataset** box, select **SourceDataset**.
- 2. Under **Use query**, select **Query**.
+ 1. Under **Use query**, select **Query**.
- 3. Enter the SQL query in the **Query** box. Here's a sample query:
+ 1. Enter the SQL query in the **Query** box. Here's a sample query:
- ```sql
- select * from TemperatureSensor where timestamp > '@{activity('OldWaterMark').output.firstRow.WatermarkValue}' and timestamp <= '@{activity('NewWaterMark').output.firstRow.NewWatermarkvalue}'
- ```
+ ```sql
+ SELECT *
+ FROM TemperatureSensor
+       WHERE timestamp > '@{activity('OldWaterMark').output.firstRow.WatermarkValue}'
+           AND timestamp <= '@{activity('NewWaterMark').output.firstRow.NewWatermarkvalue}';
+ ```
-20. On the **Sink** tab, select **New** under **Sink Dataset**.
+1. On the **Sink** tab, select **New** under **Sink Dataset**.
-21. In this tutorial, the sink data store is an Azure Blob storage data store. Select **Azure Blob storage**, and then select **Continue** in the **New Dataset** window.
+1. In this tutorial, the sink data store is an Azure Blob storage data store. Select **Azure Blob storage**, and then select **Continue** in the **New Dataset** window.
-22. In the **Select Format** window, select the format of your data, and then select **Continue**.
+1. In the **Select Format** window, select the format of your data, and then select **Continue**.
-23. In the **Set Properties** window, under **Name**, enter **SinkDataset**. Under **Linked service**, select **New**. You'll now create a connection (a linked service) to your Azure Blob storage.
+1. In the **Set Properties** window, under **Name**, enter **SinkDataset**. Under **Linked service**, select **New**. You'll now create a connection (a linked service) to your Azure Blob storage.
-24. In the **New Linked Service (Azure Blob storage)** window, complete these steps:
+1. In the **New Linked Service (Azure Blob storage)** window, complete these steps:
- 1. In the **Name** box, enter **AzureStorageLinkedService**.
+ 1. In the **Name** box, enter **AzureStorageLinkedService**.
- 2. Under **Storage account name**, select the Azure storage account for your Azure subscription.
+ 1. Under **Storage account name**, select the Azure storage account for your Azure subscription.
- 3. Test the connection and then select **Finish**.
+ 1. Test the connection and then select **Finish**.
-25. In the **Set Properties** window, confirm that **AzureStorageLinkedService** is selected under **Linked service**. Select **Create** and **OK**.
+1. In the **Set Properties** window, confirm that **AzureStorageLinkedService** is selected under **Linked service**. Select **Create** and **OK**.
+1. On the **Sink** tab, select **Edit**.
+1. On **Sink** tab, select **Edit**.
-27. Go to the **Connection** tab of SinkDataset and complete these steps:
+1. Go to the **Connection** tab of SinkDataset and complete these steps:
- 1. Under **File path**, enter *asdedatasync/incrementalcopy*, where *asdedatasync* is the blob container name and *incrementalcopy* is the folder name. Create the container if it doesn't exist, or use the name of an existing one. Azure Data Factory automatically creates the output folder *incrementalcopy* if it doesn't exist. You can also use the **Browse** button for the **File path** to navigate to a folder in a blob container.
+ 1. Under **File path**, enter `asdedatasync/incrementalcopy`, where `asdedatasync` is the blob container name and `incrementalcopy` is the folder name. Create the container if it doesn't exist, or use the name of an existing one. Azure Data Factory automatically creates the output folder `incrementalcopy` if it doesn't exist. You can also use the **Browse** button for the **File path** to navigate to a folder in a blob container.
- 2. For the **File** part of the **File path**, select **Add dynamic content [Alt+P]**, and then enter **@CONCAT('Incremental-', pipeline().RunId, '.txt')** in the window that opens. Select **Finish**. The file name is dynamically generated by the expression. Each pipeline run has a unique ID. The Copy activity uses the run ID to generate the file name.
+ 1. For the **File** part of the **File path**, select **Add dynamic content [Alt+P]**, and then enter `@CONCAT('Incremental-', pipeline().RunId, '.txt')` in the window that opens. Select **Finish**. The file name is dynamically generated by the expression. Each pipeline run has a unique ID. The Copy activity uses the run ID to generate the file name.
-28. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left.
+1. Switch to the pipeline editor by selecting the pipeline tab at the top or by selecting the name of the pipeline in the tree view on the left.
-29. In the **Activities** pane, expand **General** and drag the **Stored Procedure** activity from the **Activities** pane to the pipeline designer surface. Connect the green (success) output of the Copy activity to the Stored Procedure activity.
+1. In the **Activities** pane, expand **General** and drag the **Stored Procedure** activity from the **Activities** pane to the pipeline designer surface. Connect the green (success) output of the Copy activity to the Stored Procedure activity.
-30. Select **Stored Procedure Activity** in the pipeline designer and change its name to **SPtoUpdateWatermarkActivity**.
+1. Select **Stored Procedure Activity** in the pipeline designer and change its name to `SPtoUpdateWatermarkActivity`.
-31. Switch to the **SQL Account** tab, and select ***QLDBEdgeLinkedService** under **Linked service**.
+1. Switch to the **SQL Account** tab, and select **SQLDBEdgeLinkedService** under **Linked service**.
-32. Switch to the **Stored Procedure** tab and complete these steps:
+1. Switch to the **Stored Procedure** tab and complete these steps:
- 1. Under **Stored procedure name**, select **[dbo].[usp_write_watermark]**.
+ 1. Under **Stored procedure name**, select `[dbo].[usp_write_watermark]`.
- 2. To specify values for the stored procedure parameters, select **Import parameter** and enter these values for the parameters:
+ 1. To specify values for the stored procedure parameters, select **Import parameter** and enter these values for the parameters:
- |Name|Type|Value|
- |--|-|--|
- |LastModifiedtime|DateTime|@{activity('NewWaterMark').output.firstRow.NewWatermarkvalue}|
- |TableName|String|@{activity('OldWaterMark').output.firstRow.TableName}|
+ | Name | Type | Value |
+ | | | |
+    | timestamp | DateTime | `@{activity('NewWaterMark').output.firstRow.NewWatermarkvalue}` |
+ | TableName | String | `@{activity('OldWaterMark').output.firstRow.TableName}` |
-33. To validate the pipeline settings, select **Validate** on the toolbar. Confirm that there are no validation errors. To close the **Pipeline Validation Report** window, select **>>**.
+1. To validate the pipeline settings, select **Validate** on the toolbar. Confirm that there are no validation errors. To close the **Pipeline Validation Report** window, select **>>**.
-34. Publish the entities (linked services, datasets, and pipelines) to the Azure Data Factory service by selecting the **Publish All** button. Wait until you see a message confirming that the publish operation has succeeded.
+1. Publish the entities (linked services, datasets, and pipelines) to the Azure Data Factory service by selecting the **Publish All** button. Wait until you see a message confirming that the publish operation has succeeded.
## Trigger a pipeline based on a schedule

1. On the pipeline toolbar, select **Add Trigger**, select **New/Edit**, and then select **New**.
-2. Name your trigger **HourlySync**. Under **Type**, select **Schedule**. Set the **Recurrence** to every 1 hour.
+1. Name your trigger **HourlySync**. Under **Type**, select **Schedule**. Set the **Recurrence** to every 1 hour.
-3. Select **OK**.
+1. Select **OK**.
-4. Select **Publish All**.
+1. Select **Publish All**.
-5. Select **Trigger Now**.
+1. Select **Trigger Now**.
-6. Switch to the **Monitor** tab on the left. You can see the status of the pipeline run triggered by the manual trigger. Select **Refresh** to refresh the list.
+1. Switch to the **Monitor** tab on the left. You can see the status of the pipeline run triggered by the manual trigger. Select **Refresh** to refresh the list.
## Next steps
-The Azure Data Factory pipeline in this tutorial copies data from a table on a SQL Edge instance to a location in Azure Blob storage once every hour. To learn about using Data Factory in other scenarios, see these [tutorials](../data-factory/tutorial-copy-data-portal.md).
+- The Azure Data Factory pipeline in this tutorial copies data from a table on a SQL Edge instance to a location in Azure Blob storage once every hour. To learn about using Data Factory in other scenarios, see these [tutorials](../data-factory/tutorial-copy-data-portal.md).
azure-sql-edge Tutorial Sync Data Sync https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/tutorial-sync-data-sync.md
Title: Sync data from Azure SQL Edge by using SQL Data Sync
description: Learn about syncing data from Azure SQL Edge by using Azure SQL Data Sync - Previously updated : 05/19/2020 Last updated : 09/14/2023 keywords: - SQL Edge - sync data from SQL Edge - SQL Edge data sync- - # Tutorial: Sync data from SQL Edge to Azure SQL Database by using SQL Data Sync
-In this tutorial, you'll learn how to use an Azure SQL Data Sync *sync group* to incrementally sync data from Azure SQL Edge to Azure SQL Database. SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple databases in Azure SQL Database and SQL Server instances. For more information on SQL Data Sync, see [Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-data-sql-server-sql-database).
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+
+This tutorial shows you how to use an Azure SQL Data Sync *sync group* to incrementally sync data from Azure SQL Edge to Azure SQL Database. SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple databases in Azure SQL Database and SQL Server instances. For more information on SQL Data Sync, see [Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-data-sql-server-sql-database).
-Because SQL Edge is built on the latest versions of the [SQL Server Database Engine](/sql/sql-server/sql-server-technical-documentation/), any data synchronization mechanism that's applicable to a SQL Server instance can also be used to sync data to or from a SQL Edge instance running on an edge device.
+Because SQL Edge is built on the latest versions of the [SQL Server Database Engine](/sql/sql-server/sql-server-technical-documentation/), any data synchronization mechanism that's applicable to a SQL Server instance can also be used to sync data to or from a SQL Edge instance running on an edge device.
## Prerequisites
This tutorial requires a Windows computer configured with the [Data Sync Agent f
## Before you begin
-* Create a database in Azure SQL Database. For information on how to create a database by using the Azure portal, see [Create a single database in Azure SQL Database](/azure/azure-sql/database/single-database-create-quickstart?tabs=azure-portal).
+- Create a database in Azure SQL Database. For information on how to create a database by using the Azure portal, see [Create a single database in Azure SQL Database](/azure/azure-sql/database/single-database-create-quickstart?tabs=azure-portal).
-* Create the tables and other necessary objects in your Azure SQL Database deployment.
+- Create the tables and other necessary objects in your Azure SQL Database deployment.
-* Create the necessary tables and objects in your Azure SQL Edge deployment. For more information, see [Using SQL Database DAC packages with SQL Edge](deploy-dacpac.md).
+- Create the necessary tables and objects in your Azure SQL Edge deployment. For more information, see [Using SQL Database DAC packages with SQL Edge](deploy-dacpac.md).
-* Register the Azure SQL Edge instance with the Data Sync Agent for Azure SQL Data Sync. For more information, see [Add a SQL Server database](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-on-prem).
+- Register the Azure SQL Edge instance with the Data Sync Agent for Azure SQL Data Sync. For more information, see [Add a SQL Server database](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-on-prem).
## Sync data between a database in Azure SQL Database and SQL Edge
-Setting up synchronization between a database in Azure SQL Database and a SQL Edge instance by using SQL Data Sync involves three key steps:
+Setting up synchronization between a database in Azure SQL Database and a SQL Edge instance by using SQL Data Sync involves three key steps:
+1. Use the Azure portal to create a sync group. For more information, see [Create a sync group](/azure/azure-sql/database/sql-data-sync-sql-server-configure#create-sync-group). You can use a single *hub* database to create multiple sync groups to synchronize data from various SQL Edge instances to one or more databases in Azure SQL Database.
-1. Use the Azure portal to create a sync group. For more information, see [Create a sync group](/azure/azure-sql/database/sql-data-sync-sql-server-configure#create-sync-group). You can use a single *hub* database to create multiple sync groups to synchronize data from various SQL Edge instances to one or more databases in Azure SQL Database.
+1. Add sync members to the sync group. For more information, see [Add sync members](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-sync-members).
-2. Add sync members to the sync group. For more information, see [Add sync members](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-sync-members).
-
-3. Set up the sync group to select the tables that will be part of the synchronization. For more information, see [Configure a sync group](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-sync-members).
+1. Set up the sync group to select the tables that will be part of the synchronization. For more information, see [Configure a sync group](/azure/azure-sql/database/sql-data-sync-sql-server-configure#add-sync-members). A PowerShell sketch of creating the sync group follows this list.
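+
+As a reference point, the sync group itself can also be created with the Az PowerShell module. The following is a hedged sketch, not a verified recipe: every name is a placeholder, and the parameters should be checked against the Az.Sql documentation for your module version.
+
+```powershell
+# Sketch: create a sync group on the hub database (all names are placeholders).
+$cred = Get-Credential   # credentials for the hub database
+
+New-AzSqlSyncGroup -ResourceGroupName "myResourceGroup" -ServerName "myServer" `
+    -DatabaseName "myHubDatabase" -Name "mySyncGroup" `
+    -ConflictResolutionPolicy "HubWin" -IntervalInSeconds 300 `
+    -DatabaseCredential $cred `
+    -SyncDatabaseResourceGroupName "myResourceGroup" `
+    -SyncDatabaseServerName "myServer" -SyncDatabaseName "mySyncMetadataDb"
+```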
After you complete the preceding steps, you'll have a sync group that includes a database in Azure SQL Database and a SQL Edge instance. For more information about SQL Data Sync, see these articles:
-* [Data Sync Agent for Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-agent-overview)
+- [Data Sync Agent for Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-agent-overview)
-* [Best practices](/azure/azure-sql/database/sql-data-sync-best-practices) and [How to troubleshoot issues with Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-troubleshoot)
+- [Best practices](/azure/azure-sql/database/sql-data-sync-best-practices) and [How to troubleshoot issues with Azure SQL Data Sync](/azure/azure-sql/database/sql-data-sync-troubleshoot)
-* [Monitor SQL Data Sync with Azure Monitor logs](/azure/azure-sql/database/monitor-tune-overview)
+- [Monitor SQL Data Sync with Azure Monitor logs](/azure/azure-sql/database/monitor-tune-overview)
-* [Update the sync schema with Transact-SQL](/azure/azure-sql/database/sql-data-sync-update-sync-schema) or [PowerShell](/azure/azure-sql/database/scripts/update-sync-schema-in-sync-group)
+- [Update the sync schema with Transact-SQL](/azure/azure-sql/database/sql-data-sync-update-sync-schema) or [PowerShell](/azure/azure-sql/database/scripts/update-sync-schema-in-sync-group)
## Next steps -
-* [Use PowerShell to sync between Azure SQL Database and Azure SQL Edge](/azure/azure-sql/database/scripts/sql-data-sync-sync-data-between-azure-onprem). In this tutorial, replace the `OnPremiseServer` database details with the Azure SQL Edge details.
+- [Use PowerShell to sync between Azure SQL Database and Azure SQL Edge](/azure/azure-sql/database/scripts/sql-data-sync-sync-data-between-azure-onprem). In this tutorial, replace the `OnPremiseServer` database details with the Azure SQL Edge details.
azure-sql-edge Usage And Diagnostics Data Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql-edge/usage-and-diagnostics-data-configuration.md
Title: Azure SQL Edge usage and diagnostics data configuration
description: Learn how to configure usage and diagnostics data in Azure SQL Edge. - Previously updated : 08/04/2020 Last updated : 09/14/2023 - - # Azure SQL Edge usage and diagnostics data configuration
+> [!IMPORTANT]
+> Azure SQL Edge no longer supports the ARM64 platform.
+ By default, Azure SQL Edge collects information about how its customers are using the application. Specifically, Azure SQL Edge collects information about the deployment experience, usage, and performance. This information helps Microsoft improve the product to better meet customer needs. For example, Microsoft collects information about what kinds of error codes customers encounter so that we can fix related bugs, improve our documentation about how to use Azure SQL Edge, and determine whether features should be added to the product to better serve customers.
-Specifically, Microsoft does not send any of the following types of information through this mechanism:
+Specifically, Microsoft doesn't send any of the following types of information through this mechanism:
- Any values from inside user tables.-- Any logon credentials or other authentication information.
+- Any sign-in credentials or other authentication information.
- Any personal or customer data. The following sample scenario includes feature usage information that helps improve the product.
-An example query from the queries used for the usage and diagnostics data collection is provided below. The query identifies the count and types of different streaming data sources being used in Azure SQL Edge. This data helps Microsoft identify which streaming data sources are being used commonly such that Microsoft can improve the performance and user experience associated with these data sources.
+The following example query, one of the queries used for usage and diagnostics data collection, identifies the count and types of the different streaming data sources used in Azure SQL Edge. This data helps Microsoft identify which streaming data sources are commonly used, so that Microsoft can improve the performance and user experience associated with those data sources.
```sql
-select
-count(*) as [count], sum(inputs) as inputs, sum(outputs) as outputs, sum(linked_to_job)
-as linked_to_job, data_source_type
-from (
-select isnull(value,'unknown') as data_source_type, inputs, outputs, linked_to_job
-from
- (
- select
- convert(sysname, lower(substring(ds.location, 0, charindex('://', ds.location))), 1) as data_source_type,
- isnull(inputs, 0) as inputs, isnull(outputs, 0) as outputs, isnull(js.stream_id/js.stream_id, 0) as linked_to_job
- from sys.external_streams es
- join sys.external_data_sources ds
- on es.data_source_id = ds.data_source_id
- left join
- (
- select stream_id, max(cast(is_input as int)) inputs, max(cast(is_output as int)) outputs
- from sys.external_job_streams group by stream_id
+SELECT count(*) AS [count],
+ sum(inputs) AS inputs,
+ sum(outputs) AS outputs,
+ sum(linked_to_job) AS linked_to_job,
+ data_source_type
+FROM (
+ SELECT ISNULL(value, 'unknown') AS data_source_type,
+ inputs,
+ outputs,
+ linked_to_job
+ FROM (
+ SELECT convert(SYSNAME, LOWER(SUBSTRING(ds.location, 0, CHARINDEX('://', ds.location))), 1) AS data_source_type,
+ ISNULL(inputs, 0) AS inputs,
+ ISNULL(outputs, 0) AS outputs,
+ ISNULL(js.stream_id / js.stream_id, 0) AS linked_to_job
+ FROM sys.external_streams es
+ INNER JOIN sys.external_data_sources ds
+ ON es.data_source_id = ds.data_source_id
+ LEFT JOIN (
+ SELECT stream_id,
+ MAX(CAST(is_input AS INT)) inputs,
+ MAX(CAST(is_output AS INT)) outputs
+ FROM sys.external_job_streams
+ GROUP BY stream_id
) js
- on js.stream_id = es.object_id
- ) ds
-left join
- (
- select value from string_split('edgehub,sqlserver,kafka', ',')) as known_ep on data_source_type = value
+ ON js.stream_id = es.object_id
+ ) ds
+ LEFT JOIN (
+ SELECT value
+ FROM string_split('edgehub,sqlserver,kafka', ',')
+ ) AS known_ep
+ ON data_source_type = value
) known_ds
-group by data_source_type
+GROUP BY data_source_type;
```

## Disable usage and diagnostic data collection

Usage and diagnostic data collection on Azure SQL Edge can be disabled by using either of the following methods.
-> [!NOTE]
-> Usage and diagnostic data cannot be disabled for the Developer version.
+> [!NOTE]
+> Usage and diagnostic data can't be disabled for the Developer version.
### Disable usage and diagnostics using environment variables
-To disable Usage and Diagnostics data collection on Azure SQL Edge, add the following environment variable and set its value to `*False*`. For more information on configuring Azure SQL Edge using environment variables, refer [Configure using Environment Variables](configure.md#configure-by-using-environment-variables).
+To disable usage and diagnostics data collection on Azure SQL Edge, add the following environment variable and set its value to `FALSE`. For more information on configuring Azure SQL Edge using environment variables, see [Configure using Environment Variables](configure.md#configure-by-using-environment-variables).
-`MSSQL_TELEMETRY_ENABLED = TRUE | FALSE`
+#### MSSQL_TELEMETRY_ENABLED = TRUE | FALSE
-- TRUE - Enables collection of usage and diagnostics data. This is the default configuration.-- FALSE - Disables collection of usage and diagnostics data.
+- `TRUE` - Enables collection of usage and diagnostics data. This is the default configuration.
+- `FALSE` - Disables collection of usage and diagnostics data.
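
As an illustration, here's a minimal sketch of disabling collection by setting this variable when starting the SQL Edge container with Docker. The container name, password, and port mapping are placeholder values; adjust them for your deployment.

```bash
# Start Azure SQL Edge with usage and diagnostics data collection disabled
# (illustrative values; ACCEPT_EULA and MSSQL_SA_PASSWORD are required by the image)
docker run -d --name azuresqledge \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" \
  -e "MSSQL_TELEMETRY_ENABLED=FALSE" \
  -p 1433:1433 \
  mcr.microsoft.com/azure-sql-edge
```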
### Disable usage and diagnostics using mssql.conf file
-To disable Usage and Diagnostics data collection on Azure SQL Edge, add the following lines in the mssql.conf file on the persistent storage drive that is mapped to the /var/opt/mssql/ folder in the SQL Edge module. For more information on configuring Azure SQL Edge using mssql.conf file, refer [Configure using mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file).
+To disable usage and diagnostics data collection on Azure SQL Edge, add the following lines to the mssql.conf file on the persistent storage drive that is mapped to the /var/opt/mssql/ folder in the SQL Edge module. For more information on configuring Azure SQL Edge using the mssql.conf file, see [Configure using mssql.conf file](configure.md#configure-by-using-an-mssqlconf-file).
```ini
[telemetry]
customerfeedback = false
```
## Local audit of usage and diagnostic data collection
-The Local Audit component of Azure SQL Edge Usage and Diagnostic Data collection can write data collected by the service to a designated folder, representing the data (logs) that will be sent to Microsoft. The purpose of the Local Audit is to allow customers to see all data Microsoft collects with this feature, for compliance, regulatory or privacy validation reasons.
+The Local Audit component of Azure SQL Edge Usage and Diagnostic Data collection can write data collected by the service to a designated folder, representing the data (logs) that is sent to Microsoft. The purpose of the Local Audit is to allow customers to see all data Microsoft collects with this feature, for compliance, regulatory or privacy validation reasons.
### Enable local audit of usage and diagnostics data
-To enable Local Audit usage and diagnostics data on Azure SQL Edge
+To enable Local Audit usage and diagnostics data on Azure SQL Edge:
-1. Create a target directory for new Local Audit log storage. This target directory can either be on the host or within the container. In the example below, target directory is created in the same mount volume that is mapped to /var/opt/mssql/ path on SQL Edge.
+1. Create a target directory for new Local Audit log storage. This target directory can either be on the host or within the container. In the following example, the target directory is created in the same mount volume that is mapped to the /var/opt/mssql/ path on SQL Edge.
   ```bash
   sudo mkdir <host mount path>/audit
   ```
-2. Configure audit of usage and diagnostics data using either environment variables or mssql.conf file.
+1. Configure audit of usage and diagnostics data using either environment variables or the mssql.conf file (a combined container example follows these options).
+
+ - Using environment variables:
+
+ - Add the following environment variable to your SQL Edge deployment and specify the target directory for the audit files.
+
+      `MSSQL_TELEMETRY_DIR = <host mount path>/audit`
+
+ - Using `mssql.conf` file:
+
+ - Add the following lines in the mssql.conf file and specify the target directory for the audit files.
- - Using environment variables - Add the following environment variable to your SQL Edge deployment and specify the target directory for the audit files.
-
- `*MSSQL_TELEMETRY_DIR = <host mount path>/audit*`
-
- - Using mssql.conf file - Add the following lines in the mssql.conf file and specify the target directory for the audit files.
   ```ini
   [telemetry]
   userrequestedlocalauditdirectory = <host mount path>/audit
   ```
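
Putting the pieces together, here's a minimal sketch of a Docker deployment that enables Local Audit, assuming the host directory from step 1 is mounted at /var/opt/mssql inside the container. The names and paths are placeholders; your deployment options may differ.

```bash
# Start Azure SQL Edge with Local Audit writing to the mounted audit directory
# (illustrative values; <host mount path>/audit must already exist, per step 1)
docker run -d --name azuresqledge \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" \
  -e "MSSQL_TELEMETRY_DIR=/var/opt/mssql/audit" \
  -v <host mount path>:/var/opt/mssql \
  -p 1433:1433 \
  mcr.microsoft.com/azure-sql-edge
```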
## Next steps
backup Azure File Share Support Matrix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/azure-file-share-support-matrix.md
You can use the [Azure Backup service](./backup-overview.md) to back up Azure fi
## Supported regions
-Azure file shares backup is available in all regions, **except** for Germany Central (Sovereign), Germany Northeast (Sovereign), China East, China East 2, China North, China North 2, China North 3, France South, and US Gov Iowa.
+Azure file shares backup is available in all regions, **except** for Germany Central (Sovereign), Germany Northeast (Sovereign), China East, China North, France South, and US Gov Iowa.
## Supported storage accounts
Azure file shares backup is available in all regions, **except** for Germany Cen
* Learn how to [Back up Azure file shares](backup-afs.md)
* Learn how to [Restore Azure file shares](restore-afs.md)
* Learn how to [Manage Azure file share backups](manage-afs-backup.md)
backup Backup Support Matrix Iaas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-support-matrix-iaas.md
Title: Support matrix for Azure VM backups description: Get a summary of support settings and limitations for backing up Azure VMs by using the Azure Backup service. Previously updated : 09/08/2023 Last updated : 09/18/2023
NVMe/[ephemeral disks](../virtual-machines/ephemeral-os-disks.md) | Not supporte
Dynamic disk with spanned or striped volumes | Supported, unless you enable the selective disk feature on an Azure VM.
VMs with encryption at host | Supported
Disks with enabled Data Access with Azure Active Directory Authentication for disk upload/download | Not Supported
+Storage Replicas | Not supported
## VM network support
backup Blob Backup Support Matrix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/blob-backup-support-matrix.md
Operational backup for blobs is available in all public cloud regions, except Fr
# [Vaulted backup](#tab/vaulted-backup)
-Vaulted backup (preview) for blobs is currently available in the following regions: France Central, Canada Central, Canada East, US East, South Central US, Germany West Central, Germany North, Australia Central, Australia Central 2, India South, India West, Korea Central and Korea South.
+Vaulted backup (preview) for blobs is currently available in all public regions **except** South Africa West, Sweden Central, Sweden South, Israel Central, Poland Central, India Central, Italy North and Malaysia South.
+
Vaulted backup (preview) for blobs is currently available in the following regio
# [Operational backup](#tab/operational-backup)
-Operational backup of blobs uses blob point-in-time restore, blob versioning, soft delete for blobs, change feed for blobs and delete lock to provide a local backup solution. So limitations that apply to these capabilities also apply to operational backup.
+Operational backup of blobs uses blob point-in-time restore, blob versioning, soft delete for blobs, change feed for blobs and delete lock to provide a local backup solution. Hence, the limitations that apply to these capabilities also apply to operational backup.
**Supported scenarios:** Operational backup supports block blobs in standard general-purpose v2 storage accounts only. Storage accounts with hierarchical namespace enabled (that is, ADLS Gen2 accounts) aren't supported. <br><br> Also, any page blobs, append blobs, and premium blobs in your storage account won't be restored and only block blobs will be restored.
Operational backup of blobs uses blob point-in-time restore, blob versioning, so
- If you stop protection (vaulted backup) on a storage account, it doesn't delete the object replication policy created on the storage account. In these scenarios, you need to manually delete the *OR policies*.
- Cool and archived blobs are currently not supported.

## Next steps

[Overview of Azure Blobs backup](blob-backup-overview.md)
baremetal-infrastructure About Nc2 On Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/baremetal-infrastructure/workloads/nc2-on-azure/about-nc2-on-azure.md
We offer two SKUs: AN36 and AN36P. For specifications, see [SKUs](skus.md).
* Microsoft Azure Consumption Contract (MACC) credits
-## **Azure Hybrid Benefits (AHUB) for NC2 on Azure**
+## Azure Hybrid Benefits (AHUB) for NC2 on Azure
### Azure Commercial benefits
cloud-shell Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-shell/troubleshooting.md
This article covers troubleshooting Cloud Shell common scenarios.
### Storage Dialog - Error: 400 DisallowedOperation

-- **Details**: When using an Azure Active Directory subscription, you can't create storage.
-- **Resolution**: Use an Azure subscription capable of creating storage resources. Azure AD
-  subscriptions aren't able to create Azure resources.
+- **Details**: When using a Microsoft Entra ID subscription, you can't create storage.
+- **Resolution**: Use an Azure subscription capable of creating storage resources. Microsoft Entra
+ ID subscriptions aren't able to create Azure resources.
### Terminal output - Error: Failed to connect terminal: websocket can't be established
This article covers troubleshooting Cloud Shell common scenarios.
- **Details**: Cloud Shell uses a container to host your shell environment; as a result, running the daemon is disallowed.
-- **Resolution**: Utilize [docker-machine][04], which is installed by default, to manage docker
- containers from a remote Docker host.
+- **Resolution**: Use the [docker CLI][04], which is installed by default, to remotely manage docker
+ containers.
## PowerShell troubleshooting
Cloud Shell supports the latest versions of following browsers:
### Copy and paste

-- Windows: <kbd>Ctrl</kbd>-<kbd>C</kbd> to copy is supported but use
- <kbd>Shift</kbd>-<kbd>Insert</kbd> to paste.
+- Windows: <kbd>Ctrl</kbd>+<kbd>c</kbd> to copy is supported but use
+ <kbd>Shift</kbd>+<kbd>Insert</kbd> to paste.
- FireFox/IE may not support clipboard permissions properly.
-- macOS: <kbd>Cmd</kbd>-<kbd>C</kbd> to copy and <kbd>Cmd</kbd>-<kbd>V</kbd> to paste.
+- macOS: <kbd>Cmd</kbd>+<kbd>c</kbd> to copy and <kbd>Cmd</kbd>+<kbd>v</kbd> to paste.
+- Linux: <kbd>CTRL</kbd>+<kbd>c</kbd> to copy and <kbd>CTRL</kbd>+<kbd>Shift</kbd>+<kbd>v</kbd> to paste.
+
+> [!NOTE]
+> If no text is selected when you type <kbd>Ctrl</kbd>+<kbd>C</kbd>, Cloud Shell sends the `Ctrl C`
+> character to the shell. This could terminate the currently running command.
### Usage limits
-Cloud Shell is intended for interactive use cases. As a result, any long-running non-interactive
-sessions are ended without warning.
+Cloud Shell is intended for interactive use cases. Cloud Shell sessions time out after 20 minutes
+without interactive activity. As a result, any long-running non-interactive sessions are ended
+without warning.
### User permissions
directory isn't persisted.
### Supported entry point limitations
-Cloud Shell entry points beside the Azure portal, such as Visual Studio Code & Windows Terminal,
+Cloud Shell entry points beside the Azure portal, such as Visual Studio Code and Windows Terminal,
don't support various Cloud Shell functionalities:

- Use of commands that modify UX components in Cloud Shell, such as `Code`
-- Fetching non-arm access tokens
+- Fetching non-ARM access tokens
## Bash limitations
Azure Cloud Shell in Azure Government is only accessible through the Azure porta
> Connecting to GCC-High or Government DoD Clouds for Exchange Online is currently not supported.

<!-- link references -->
-[04]: https://docs.docker.com/machine/overview/
+[04]: https://docs.docker.com/desktop/
[05]: persisting-shell-storage.md#mount-a-new-clouddrive
[06]: /powershell/microsoftgraph/migration-steps
cloud-shell Using The Shell Window https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-shell/using-the-shell-window.md
select restore.
  <kbd>Shift</kbd>-<kbd>Insert</kbd> to paste.
- FireFox/IE may not support clipboard permissions properly.
- macOS: <kbd>Cmd</kbd>-<kbd>C</kbd> to copy and <kbd>Cmd</kbd>-<kbd>V</kbd> to paste.
+- Linux: <kbd>CTRL</kbd>-<kbd>C</kbd> to copy and <kbd>CTRL</kbd>-<kbd>SHIFT</kbd>-<kbd>V</kbd> to paste.
+
+> [!NOTE]
+> If no text is selected when you type <kbd>Ctrl</kbd>-<kbd>C</kbd>, Cloud Shell sends the `Ctrl C`
+> character to the shell. This could terminate the currently running command.
## Resize Cloud Shell window
communication-services Number Lookup Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/numbers/number-lookup-concept.md
# Number Lookup overview

Azure Communication Services enables you to retrieve insights and look up a specific phone number using the Azure Communication Services Number Lookup SDK. It is part of the Phone Numbers SDK and can be used to support customer service scenarios, appointment reminders, two-factor authentication, and other real-time communication needs. Azure Communication Services Number Lookup allows you to reliably retrieve number insights before engaging with end users.
The main benefits the solution will provide to ACS customers can be summarized o
![Diagram showing call recording architecture using calling client sdk.](../numbers/mvp-use-case.png)
+## Pricing
++
+| Request | Price per API query |
+| -- | -- |
+| Get Number Type and Carrier details, query per phone number | $0.005 |
## Next steps

> [!div class="nextstepaction"]
The main benefits the solution will provide to ACS customers can be summarized o
The following documents may be interesting to you: -- Familiarize yourself with the [Number Lookup SDK](../numbers/number-lookup-sdk.md)
+- Familiarize yourself with the [Number Lookup SDK](../numbers/number-lookup-sdk.md)
communication-services Number Lookup Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/numbers/number-lookup-sdk.md
# Number Lookup SDK overview

Azure Communication Services Number Lookup is part of the Phone Numbers SDK. Your applications can use it to add additional checks before sending an SMS or placing a call.
The following list presents the set of features which are currently available in
| Group of features | Capability | .NET | JS | Java | Python |
| -- | -- | -- | -- | -- | -- |
-| Core Capabilities | Get Number Type | ✔️ | ❌ | ❌ | ❌ |
-| | Get Carrier registered name | ✔️ | ❌ | ❌ | ❌ |
-| | Get associated Mobile Network Code, if available(two or three decimal digits used to identify network operator within a country) | ✔️ | ❌ | ❌ | ❌ |
-| | Get associated Mobile Country Code, if available(three decimal digits used to identify the country of a mobile operator) | ✔️ | ❌ | ❌ | ❌ |
-| | Get associated ISO Country Code | ✔️ | ❌ | ❌ | ❌ |
-| Phone Number | All number types in E164 format | ✔️ | ❌ | ❌ | ❌ |
+| Core Capabilities | Get Number Type | ✔️ | ✔️ | ✔️ | ✔️ |
+| | Get Carrier registered name | ✔️ | ✔️ | ✔️ | ✔️ |
+| | Get associated Mobile Network Code, if available (two or three decimal digits used to identify the network operator within a country) | ✔️ | ✔️ | ✔️ | ✔️ |
+| | Get associated Mobile Country Code, if available (three decimal digits used to identify the country of a mobile operator) | ✔️ | ✔️ | ✔️ | ✔️ |
+| | Get associated ISO Country Code | ✔️ | ✔️ | ✔️ | ✔️ |
+| Phone Number | All number types in E164 format | ✔️ | ✔️ | ✔️ | ✔️ |
## Next steps
communication-services Number Lookup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/telephony/number-lookup.md
Previously updated : 05/30/2023 Last updated : 08/10/2023
+zone_pivot_groups: acs-js-csharp-java-python
# Quickstart: Look up operator information for a phone number using Azure Communication Services
-Get started with the Phone Numbers client library for C# to look up operator information for phone numbers, which can be used to determine whether and how to communicate with that phone number. Follow these steps to install the package and look up operator information about a phone number.
-> [!NOTE]
-> Find the code for this quickstart on [GitHub](https://github.com/Azure/communication-preview/tree/master/samples/NumberLookup).
-## Prerequisites
-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).-- To enable Number Lookup service on your Azure Communication Services subscription, please complete this [form](https://forms.microsoft.com/pages/responsepage.aspx?id=v4j5cvGGr0GRqy180BHbR058xZQ9HIBLikwspEUN6t5URUVDTTdWMEg5VElQTFpaMVMyM085ODkwVS4u) for us to allow-list your subscription. -- The latest version of [.NET Core client library](https://dotnet.microsoft.com/download/dotnet-core) for your operating system.-- An active Communication Services resource and connection string. [Create a Communication Services resource](../create-communication-resource.md).-
-### Prerequisite check
--- In a terminal or command window, run the `dotnet` command to check that the .NET SDK is installed.-
-## Setting up
-
-To set up an environment for sending lookup queries, take the steps in the following sections.
-
-### Create a new C# application
-
-In a console window, such as cmd, PowerShell, or Bash, use the `dotnet new` command to create a new console app with the name `NumberLookupQuickstart`. This command creates a simple "Hello World" C# project with a single source file, **Program.cs**.
-
-```console
-dotnet new console -o NumberLookupQuickstart
-```
-
-Change your directory to the newly created app folder and use the `dotnet build` command to compile your application.
-
-```console
-cd NumberLookupQuickstart
-dotnet build
-```
-
-### Connect to dev package feed
-The private preview version of the SDK is published to a dev package feed. You can add the dev feed using the [NuGet CLI](/nuget/reference/nuget-exe-cli-reference), which will add it to the NuGet.Config file.
-
-```console
-nuget sources add -Name "Azure SDK for .NET Dev Feed" -Source "https://pkgs.dev.azure.com/azure-sdk/public/_packaging/azure-sdk-for-net/nuget/v3/index.json"
-```
-
-More detailed information and other options for connecting to the dev feed can be found in the [contributing guide](https://github.com/Azure/azure-sdk-for-net/blob/main/CONTRIBUTING.md#nuget-package-dev-feed).
-
-### Install the package
-
-While still in the application directory, install the Azure Communication Services PhoneNumbers client library for .NET package by using the following command.
-
-```console
-dotnet add package Azure.Communication.PhoneNumbers --version 1.2.0-alpha.20230531.2
-```
-
-Add a `using` directive to the top of **Program.cs** to include the `Azure.Communication` namespace.
-
-```csharp
-using System;
-using System.Threading.Tasks;
-using Azure.Communication.PhoneNumbers;
-```
-
-Update `Main` function signature to be async.
-
-```csharp
-internal class Program
-{
- static async Task Main(string[] args)
- {
- ...
- }
-}
-```
-
-## Code examples
-
-### Authenticate the client
-
-Phone Number clients can be authenticated using connection string acquired from an Azure Communication Services resource in the [Azure portal](https://portal.azure.com).
-It's recommended to use a `COMMUNICATION_SERVICES_CONNECTION_STRING` environment variable to avoid putting your connection string in plain text within your code.
-
-```csharp
-// This code retrieves your connection string from an environment variable.
-string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
-
-PhoneNumbersClient client = new PhoneNumbersClient(connectionString, new PhoneNumbersClientOptions(PhoneNumbersClientOptions.ServiceVersion.V2023_05_01_Preview));
-```
-
-Phone Number clients can also authenticate with Azure Active Directory Authentication. With this option,
-`AZURE_CLIENT_SECRET`, `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` environment variables need to be set up for authentication.
-
-```csharp
-// Get an endpoint to our Azure Communication Services resource.
-Uri endpoint = new Uri("<endpoint_url>");
-TokenCredential tokenCredential = new DefaultAzureCredential();
-client = new PhoneNumbersClient(endpoint, tokenCredential);
-```
-
-### Look up operator information for a number
-
-To search for a phone number's operator information, call `SearchOperatorInformationAsync` from the `PhoneNumbersClient`.
-
-```csharp
-OperatorInformationResult searchResult = await client.SearchOperatorInformationAsync(new[] { "<target-phone-number>" });
-OperatorInformation operatorInformation = searchResult.Results[0];
-```
-
-Replace `<target-phone-number>` with the phone number you're looking up, usually a number you'd like to send a message to.
-
-> [!WARNING]
-> Provide phone numbers in E.164 international standard format, for example, +14255550123.
-
-### Use operator information
-
-You can now use the operator information. For this quickstart guide, we can print some of the details to the console.
-
-```csharp
-Console.WriteLine($"{operatorInformation.PhoneNumber} is a {operatorInformation.NumberType ?? "unknown"} number, operated by {operatorInformation.OperatorDetails.Name ?? "an unknown operator"}");
-```
-
-You may also use the operator information to determine whether to send an SMS. For more information on sending an SMS, see the [SMS Quickstart](../sms/send.md).
-
-## Run the code
-
-Run the application from your application directory with the `dotnet run` command.
-
-```console
-dotnet run
-```
-
-## Sample code
-
-You can download the sample app from [GitHub](https://github.com/Azure/communication-preview/tree/master/samples/NumberLookup).
## Troubleshooting
communication-services Get Started Teams Call Queue https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/voice-video-calling/get-started-teams-call-queue.md
If you'd like to skip ahead to the end, you can download this quickstart as a sa
Teams Call Queue is a feature in Microsoft Teams that efficiently distributes incoming calls among a group of designated users or agents. It's useful for customer support or call center scenarios. Calls are placed in a queue and assigned to the next available agent based on a predetermined routing method. Agents receive notifications and can handle calls using Teams' call controls. The feature offers reporting and analytics for performance tracking. It simplifies call handling, ensures a consistent customer experience, and optimizes agent productivity. You can select an existing Call Queue or create a new one via the [Teams Admin Center](https://aka.ms/teamsadmincenter).
-Learn more about how to create Auto Attendant using Teams Admin Center [here](/microsoftteams/create-a-phone-system-auto-attendant?tabs=general-info).
+Learn more about how to create Call Queue using Teams Admin Center [here](/microsoftteams/create-a-phone-system-call-queue?tabs=general-info).
## Find Object ID for Call Queue
container-apps Service Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/service-connector.md
Title: Connect a container app to a cloud service with Service Connector description: Learn to connect a container app to an Azure service using the Azure portal or the CLI.--++ Last updated 06/16/2022
container-registry Container Registry Tutorial Sign Build Push https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-tutorial-sign-build-push.md
Title: Build, Sign and Verify a container image using notation and certificate in Azure Key Vault
-description: In this tutorial you'll learn to create a signing certificate, build a container image, remote sign image with notation and Azure Key Vault, and then verify the container image using the Azure Container Registry.
--
+ Title: Sign container images with Notation and Azure Key Vault using a self-signed certificate (Preview)
+description: In this tutorial you'll learn to create a self-signed certificate in Azure Key Vault (AKV), build and sign a container image stored in Azure Container Registry (ACR) with notation and AKV, and then verify the container image with notation.
++ Last updated 4/23/2023
-# Build, sign, and verify container images using Notary and Azure Key Vault (Preview)
+# Sign container images with Notation and Azure Key Vault using a self-signed certificate (Preview)
-The Azure Key Vault (AKV) is used to store a signing key that can be utilized by [notation](http://notaryproject.dev/) with the notation AKV plugin (azure-kv) to sign and verify container images and other artifacts. The Azure Container Registry (ACR) allows you to attach these signatures using the **az** or **oras** CLI commands.
-
-The signed image enables users to assure deployments are built from a trusted entity and verify artifact hasn't been tampered with since their creation. The signed artifact ensures integrity and authenticity before the user pulls an artifact into any environment and avoid attacks.
+Signing container images is a process that ensures their authenticity and integrity. This is achieved by adding a digital signature to the container image, which can be validated during deployment. The signature helps to verify that the image is from a trusted publisher and has not been modified. [Notation](https://github.com/notaryproject/notation) is an open source supply chain tool developed by the [Notary Project](https://notaryproject.dev/), which supports signing and verifying container images and other artifacts. The Azure Key Vault (AKV) is used to store certificates with signing keys that can be used by Notation with the Notation AKV plugin (azure-kv) to sign and verify container images and other artifacts. The Azure Container Registry (ACR) allows you to attach signatures to container images and other artifacts as well as view those signatures.
+> [!IMPORTANT]
+> This feature is currently in preview. Previews are made available to you on the condition that you agree to the [supplemental terms of use][terms-of-use]. Some aspects of this feature may change prior to general availability (GA).
In this tutorial:

> [!div class="checklist"]
-> * Store a signing certificate in Azure Key Vault
-> * Sign a container image with notation
-> * Verify a container image signature with notation
+> * Install Notation CLI and AKV plugin
+> * Create a self-signed certificate in AKV
+> * Build and push a container image with [ACR Tasks](container-registry-tasks-overview.md)
+> * Sign a container image with Notation CLI and AKV plugin
+> * Validate a container image against the signature with Notation CLI
## Prerequisites
-> * Create and sign in ACR with OCI artifact enabled
-> * Create or use an [Azure Key Vault](../key-vault/general/quick-create-cli.md)
->* This tutorial can be run in the [Azure Cloud Shell](https://portal.azure.com/#cloudshell/)
+* Create or use an [Azure Container Registry](../container-registry/container-registry-get-started-azure-cli.md) for storing container images and signatures
+* Create or use an [Azure Key Vault](../key-vault/general/quick-create-cli.md) for managing certificates
+* Install and configure the latest [Azure CLI](/cli/azure/install-azure-cli), or run commands in the [Azure Cloud Shell](https://portal.azure.com/#cloudshell/)
-## Install the notation CLI and AKV plugin
+## Install Notation CLI and AKV plugin
-1. Install notation v1.0.0-rc.7 on a Linux environment. You can also download the package for other environments by following the [Notation installation guide](https://notaryproject.dev/docs/installation/cli/).
+1. Install Notation v1.0.0 on a Linux amd64 environment. You can also download the package for other environments by following the [Notation installation guide](https://notaryproject.dev/docs/user-guides/installation/).
   ```bash
   # Download, extract and install
- curl -Lo notation.tar.gz https://github.com/notaryproject/notation/releases/download/v1.0.0-rc.7/notation_1.0.0-rc.7_linux_amd64.tar.gz
+ curl -Lo notation.tar.gz https://github.com/notaryproject/notation/releases/download/v1.0.0/notation_1.0.0_linux_amd64.tar.gz
tar xvzf notation.tar.gz
- # Copy the notation cli to the desired bin directory in your PATH
+ # Copy the Notation binary to the desired bin directory in your $PATH, for example
   cp ./notation /usr/local/bin
   ```
-2. Install the notation Azure Key Vault plugin on a Linux environment for remote signing and verification. You can also download the package for other environments by following the [Notation AKV plugin installation guide](https://github.com/Azure/notation-azure-kv#installation-the-akv-plugin).
+2. Install the Notation Azure Key Vault plugin on a Linux amd64 environment. You can also download the package for other environments by following the [Notation AKV plugin installation guide](https://github.com/Azure/notation-azure-kv#installation-the-akv-plugin).
> [!NOTE]
- > The plugin directory varies depending upon the operating system being used. The directory path below assumes Ubuntu. Please read the [Notation directory structure for system configuration](https://notaryproject.dev/docs/concepts/directory-structure/) for more information.
+ > The plugin directory varies depending upon the operating system being used. The directory path below assumes Ubuntu. Please read the [Notation directory structure for system configuration](https://notaryproject.dev/docs/user-guides/how-to/directory-structure/) for more information.
   ```bash
   # Create a directory for the plugin
In this tutorial:
   # Download the plugin
   curl -Lo notation-azure-kv.tar.gz \
- https://github.com/Azure/notation-azure-kv/releases/download/v1.0.0-rc.2/notation-azure-kv_1.0.0-rc.2_linux_amd64.tar.gz
+ https://github.com/Azure/notation-azure-kv/releases/download/v1.0.1/notation-azure-kv_1.0.1_linux_amd64.tar.gz
# Extract to the plugin directory
- tar xvzf notation-azure-kv.tar.gz -C ~/.config/notation/plugins/azure-kv notation-azure-kv
+ tar xvzf notation-azure-kv.tar.gz -C ~/.config/notation/plugins/azure-kv
   ```

3. List the available plugins.
In this tutorial:
1. Configure AKV resource names.

   ```bash
- # Name of the existing Azure Key Vault used to store the signing keys
- AKV_NAME=<your-unique-keyvault-name>
- # New desired key name used to sign and verify
- KEY_NAME=wabbit-networks-io
- CERT_SUBJECT="CN=wabbit-networks.io,O=Notary,L=Seattle,ST=WA,C=US"
- CERT_PATH=./${KEY_NAME}.pem
+ # Name of the existing AKV used to store the signing keys
+ AKV_NAME=myakv
+ # Name of the certificate created in AKV
+ CERT_NAME=wabbit-networks-io
+ CERT_SUBJECT="CN=wabbit-networks.io,O=Notation,L=Seattle,ST=WA,C=US"
+ CERT_PATH=./${CERT_NAME}.pem
   ```

2. Configure ACR and image resource names.
In this tutorial:
   IMAGE_SOURCE=https://github.com/wabbit-networks/net-monitor.git#main
   ```
-## Store the signing certificate in AKV
+## Sign in with Azure CLI
+
+```bash
+az login
+```
+
+To learn more about Azure CLI and how to sign in with it, see [Sign in with Azure CLI](/cli/azure/authenticate-azure-cli).
+
+## Assign access policy in AKV (Azure CLI)
+
+A user principal with the correct access policy permissions is needed to create a self-signed certificate and sign artifacts. This principal can be a user principal, service principal, or managed identity. At a minimum, this principal needs the following permissions:
+
+- `Create` permissions for certificates
+- `Get` permissions for certificates
+- `Sign` permissions for keys
+
+In this tutorial, the access policy is assigned to a signed-in Azure user. To learn more about assigning policy to a principal, see [Assign Access Policy](/azure/key-vault/general/assign-access-policy).
-If you have an existing certificate, upload it to AKV. For more information on how to use your own signing key, see the [signing certificate requirements.](https://github.com/Azure/notation-azure-kv/blob/release-0.6/docs/ca-signed-workflow.md)
-Otherwise create an x509 self-signed certificate storing it in AKV for remote signing using the steps below.
+### Set the subscription that contains the AKV resource
-### Create a self-signed certificate (Azure CLI)
+```bash
+az account set --subscription <your_subscription_id>
+```
+
+### Set the access policy in AKV
+
+```bash
+USER_ID=$(az ad signed-in-user show --query id -o tsv)
+az keyvault set-policy -n $AKV_NAME --certificate-permissions create get --key-permissions sign --object-id $USER_ID
+```
+
+> [!IMPORTANT]
+> This example shows the minimum permissions needed for creating a certificate and signing a container image. Depending on your requirements, you may need to grant additional permissions.
+
+## Create a self-signed certificate in AKV (Azure CLI)
+
+The following steps show how to create a self-signed certificate for testing purposes.
1. Create a certificate policy file.
- Once the certificate policy file is executed as below, it creates a valid signing certificate compatible with **notation** in AKV. The EKU listed is for code-signing, but isn't required for notation to sign artifacts. The subject is used later as trust identity that user trust during verification.
+   Once the certificate policy file is executed as below, it creates a valid certificate compatible with the [Notary Project certificate requirements](https://github.com/notaryproject/specifications/blob/v1.0.0/specs/signature-specification.md#certificate-requirements) in AKV. The value for `ekus` is for code-signing, but isn't required for notation to sign artifacts. The subject is used later as the trust identity that users trust during verification.
   ```bash
   cat <<EOF > ./my_policy.json
Otherwise create an x509 self-signed certificate storing it in AKV for remote si
"certificateTransparency": null, "name": "Self" },
+ "keyProperties": {
+ "exportable": false,
+ "keySize": 2048,
+ "keyType": "RSA",
+ "reuseKey": true
+ },
"x509CertificateProperties": { "ekus": [ "1.3.6.1.5.5.7.3.3"
Otherwise create an x509 self-signed certificate storing it in AKV for remote si
2. Create the certificate.
- ```azure-cli
- az keyvault certificate create -n $KEY_NAME --vault-name $AKV_NAME -p @my_policy.json
+ ```bash
+ az keyvault certificate create -n $CERT_NAME --vault-name $AKV_NAME -p @my_policy.json
```
-3. Get the Key ID for the certificate.
+## Sign a container image with Notation CLI and AKV plugin
+
+1. Authenticate to your ACR by using your individual Azure identity.
```bash
- KEY_ID=$(az keyvault certificate show -n $KEY_NAME --vault-name $AKV_NAME --query 'kid' -o tsv)
+ az acr login --name $ACR_NAME
```
-
-4. Download public certificate.
+
+> [!IMPORTANT]
+> If you have Docker installed on your system and used `az acr login` or `docker login` to authenticate to your ACR, your credentials are already stored and available to notation. In this case, you don't need to run `notation login` again to authenticate to your ACR. To learn more about authentication options for notation, see [Authenticate with OCI-compliant registries](https://notaryproject.dev/docs/user-guides/how-to/registry-authentication/).
+
+2. Build and push a new image with ACR Tasks. Always use the digest value to identify the image for signing since tags are mutable and can be overwritten.
```bash
- CERT_ID=$(az keyvault certificate show -n $KEY_NAME --vault-name $AKV_NAME --query 'id' -o tsv)
- az keyvault certificate download --file $CERT_PATH --id $CERT_ID --encoding PEM
+ DIGEST=$(az acr build -r $ACR_NAME -t $REGISTRY/${REPO}:$TAG $IMAGE_SOURCE --no-logs --query "outputImages[0].digest" -o tsv)
+ IMAGE=$REGISTRY/${REPO}@$DIGEST
```
-5. Add a signing key referencing the key ID.
+ In this tutorial, if the image has already been built and is stored in the registry, the tag serves as an identifier for that image for convenience.
```bash
- notation key add $KEY_NAME --plugin azure-kv --id $KEY_ID
+ IMAGE=$REGISTRY/${REPO}@$TAG
```
-6. List the keys to confirm.
+3. Get the Key ID of the signing key. A certificate in AKV can have multiple versions; the following command gets the Key ID of the latest version.
- ```bash
- notation key ls
- ```
+ ```bash
+ KEY_ID=$(az keyvault certificate show -n $CERT_NAME --vault-name $AKV_NAME --query 'kid' -o tsv)
+ ```
-7. Add the downloaded public certificate to named trust store for signature verification.
+4. Sign the container image with the [COSE](https://datatracker.ietf.org/doc/html/rfc9052) signature format using the signing key ID. To sign with a self-signed certificate, you need to set the plugin configuration value `self_signed=true`.
- ```bash
- STORE_TYPE="ca"
- STORE_NAME="wabbit-networks.io"
- notation cert add --type $STORE_TYPE --store $STORE_NAME $CERT_PATH
- ```
+ ```bash
+ notation sign --signature-format cose --id $KEY_ID --plugin azure-kv --plugin-config self_signed=true $IMAGE
+ ```
-8. List the certificate to confirm.
+5. View the graph of signed images and associated signatures.
```bash
- notation cert ls
+ notation ls $IMAGE
```
-## Build and sign a container image
-
-1. Build and push a new image with ACR Tasks.
+## Verify a container image with Notation CLI
- ```azure-cli
- az acr build -r $ACR_NAME -t $IMAGE $IMAGE_SOURCE
- ```
+To verify the container image, add the root certificate that signs the leaf certificate to the trust store and create trust policies for verification. For the self-signed certificate used in this tutorial, the root certificate is the self-signed certificate itself.
-2. Authenticate with your individual Azure AD identity to use an ACR token.
+1. Download public certificate.
- ```azure-cli
- export USER_NAME="00000000-0000-0000-0000-000000000000"
- export PASSWORD=$(az acr login --name $ACR_NAME --expose-token --output tsv --query accessToken)
- notation login -u $USER_NAME -p $PASSWORD $REGISTRY
+ ```bash
+ az keyvault certificate download --name $CERT_NAME --vault-name $AKV_NAME --file $CERT_PATH
```
-> [!NOTE]
-> Currently, `notation` relies on [Docker Credential Store](https://docs.docker.com/engine/reference/commandline/login/#credentials-store) for authentication. Notation requires additional configuration on Linux. If `notation login` is failing, you can configure the Docker Credential Store or Notation environment variables by following the guide [Authenticate with OCI-compliant registries](https://notaryproject.dev/docs/how-to/registry-authentication/).
+2. Add the downloaded public certificate to named trust store for signature verification.
-3. Sign the container image with the [COSE](https://datatracker.ietf.org/doc/html/rfc8152) signature format using the signing key added in previous step.
-
- ```bash
- notation sign --signature-format cose --key $KEY_NAME $IMAGE
- ```
+ ```bash
+ STORE_TYPE="ca"
+ STORE_NAME="wabbit-networks.io"
+ notation cert add --type $STORE_TYPE --store $STORE_NAME $CERT_PATH
+ ```
-4. View the graph of signed images and associated signatures.
+3. List the certificate to confirm.
```bash
- notation ls $IMAGE
+ notation cert ls
```
+
+4. Configure trust policy before verification.
-## Verify the container image
-
-1. Configure trust policy before verification.
-
- The trust policy is a JSON document named `trustpolicy.json`, which is stored under the notation configuration directory. Users who verify signed artifacts from a registry use the trust policy to specify trusted identities that sign the artifacts, and the level of signature verification to use.
-
- Use the following command to configure trust policy. Upon successful execution of the command, one trust policy named `wabbit-networks-images` is created. This trust policy applies to all the artifacts stored in repositories defined in `$REGISTRY/$REPO`. The trust identity that user trusts has the x509 subject `$CERT_SUBJECT` from previous step, and stored under trust store named `$STORE_NAME` of type `$STORE_TYPE`. See [Trust store and trust policy specification](https://github.com/notaryproject/notaryproject/blob/main/specs/trust-store-trust-policy.md) for details.
+ Trust policies allow users to specify fine-tuned verification policies. The following example configures a trust policy named `wabbit-networks-images`, which applies to all artifacts in `$REGISTRY/$REPO` and uses the named trust store `$STORE_NAME` of type `$STORE_TYPE`. It also assumes that the user trusts a specific identity with the X.509 subject `$CERT_SUBJECT`. For more details, see [Trust store and trust policy specification](https://github.com/notaryproject/notaryproject/blob/v1.0.0/specs/trust-store-trust-policy.md).
   ```bash
   cat <<EOF > ./trustpolicy.json
Otherwise create an x509 self-signed certificate storing it in AKV for remote si
   EOF
   ```
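
   For reference, the file created by the command above follows the Notary Project trust policy schema. Here's a sketch of what it might contain, based on the description in this step; the tutorial's actual contents may differ slightly:

   ```bash
   # Illustrative trust policy; variables are expanded by the shell when the file is written
   cat <<EOF > ./trustpolicy.json
   {
       "version": "1.0",
       "trustPolicies": [
           {
               "name": "wabbit-networks-images",
               "registryScopes": [ "$REGISTRY/$REPO" ],
               "signatureVerification": {
                   "level": "strict"
               },
               "trustStores": [ "$STORE_TYPE:$STORE_NAME" ],
               "trustedIdentities": [
                   "x509.subject: $CERT_SUBJECT"
               ]
           }
       ]
   }
   EOF
   ```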
-3. Use `notation policy` to import the trust policy configuration from a JSON file that we created previously.
+5. Use `notation policy` to import the trust policy configuration from a JSON file that we created previously.
   ```bash
   notation policy import ./trustpolicy.json
   notation policy show
   ```
-4. The notation command can also help to ensure the container image hasn't been tampered with since build time by comparing the `sha` with what is in the registry.
+6. Use `notation verify` to verify the container image hasn't been altered since build time.
   ```bash
   notation verify $IMAGE
   ```

   Upon successful verification of the image using the trust policy, the sha256 digest of the verified image is returned in a successful output message.

## Next steps

See [Ratify on Azure: Allow only signed images to be deployed on AKS with Notation and Ratify](https://github.com/deislabs/ratify/blob/main/docs/quickstarts/ratify-on-azure.md).
+[terms-of-use]: https://azure.microsoft.com/support/legal/preview-supplemental-terms/
cosmos-db Choose Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/choose-model.md
Here are a few key factors to help you decide which is the right option for you.
- Your workload has more long-running queries, complex aggregation pipelines, distributed transactions, joins, etc.
- You prefer high-capacity vertical and horizontal scaling with familiar vCore-based cluster tiers such as M30, M40, M50 and more.
- You're running applications requiring 99.995% availability.
+- You need native support for storing and searching vector embeddings.
[**Get started with Azure Cosmos DB for MongoDB vCore**](./vcore/quickstart-portal.md)
data-factory Tutorial Data Flow Adventure Works Retail Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-data-flow-adventure-works-retail-template.md
With the databases created, ensure the dataflows are pointing to the correct tab
## Troubleshoot the pipelines

If the pipeline fails to run successfully, there are a few main things to check for errors.
-* Dataset schema. Make sure the data settings for the CSV files are accurate. If you included row headers, make sure the how headers option is checked on the database table.
+* Dataset schema. Make sure the data settings for the CSV files are accurate. If you included row headers, make sure the row headers option is checked on the database table.
* Data flow sources. If you used different column or table names than what were provided in the example schema, you'll need to step through the data flows to verify that the columns are mapped correctly.
* Data flow sink. The schema and data format configurations on the target database need to match the data flow template. As above, if any changes were made, those items need to be aligned.
defender-for-cloud Connect Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/connect-azure-subscription.md
Within minutes of launching Defender for Cloud for the first time, you might see
## Enable all paid plans on your subscription
-To enable all of the Defender for Cloud's protections, you need to enable the other paid plans for each of the workloads that you want to protect.
+To enable all of Defender for Cloud's protections, you need to enable the plans for the workloads that you want to protect.
> [!NOTE]
>
To enable all of the Defender for Cloud's protections, you need to enable the ot
> - You can enable **Microsoft Defender for open-source relational databases** at the resource level only.
> - The Microsoft Defender plans available at the workspace level are: **Microsoft Defender for Servers**, **Microsoft Defender for SQL servers on machines**.
-When you enabled Defender plans on an entire Azure subscription, the protections are applied to all other resources in the subscription.
+When you enable Defender plans on an entire Azure subscription, the protections are applied to all other resources in the subscription.
**To enable additional paid plans on a subscription**:
defender-for-cloud Integration Defender For Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/integration-defender-for-endpoint.md
Use the [Defender for Endpoint status workbook](https://aka.ms/MDEStatus) to ver
##### Enable for multiple subscriptions with a PowerShell script
-Use our [PowerShell script](https://github.com/Azure/Microsoft-Defender-for-Cloud/tree/main/Powershell%20scripts/Enable%20MDE%20Integration%20for%20Linux) from the Defender for Cloud GitHub repository to enable endpoint protection on Linux machines that are in multiple subscriptions.
+Use our [PowerShell script](https://github.com/Azure/Microsoft-Defender-for-Cloud/tree/main/Powershell%20scripts/MDE%20Integration/Enable%20MDE%20Integration%20for%20Linux) from the Defender for Cloud GitHub repository to enable endpoint protection on Linux machines that are in multiple subscriptions.
##### Manage automatic updates configuration for Linux
defender-for-cloud Subassessment Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/subassessment-rest-api.md
Title: Container vulnerability assessments powered by Microsoft Defender Vulnera
description: Learn about container vulnerability assessments powered by Microsoft Defender Vulnerability Management subassessments Previously updated : 09/11/2023 Last updated : 09/18/2023
-# Container vulnerability assessments powered by Microsoft Defender Vulnerability Management subassessments
-
-API Version: 2019-01-01-preview
-
-Get security subassessments on all your scanned resources inside a scope.
+# Container vulnerability assessments REST API
## Overview
-You can access vulnerability assessment results pragmatically for both registry and runtime recommendations using the subassessments rest API.
-
-For more information on how to get started with our REST API, see [Azure REST API reference](/rest/api/azure/). Use the following information for specific information for the container vulnerability assessment results powered by Microsoft Defender Vulnerability Management.
-
-## HTTP Requests
-
-### Get
-
-#### GET
-
-`https://management.azure.com/{scope}/providers/Microsoft.Security/assessments/{assessmentName}/subAssessments/{subAssessmentName}?api-version=2019-01-01-preview`
-
-#### URI Parameters
-
-| Name | In | Required | Type | Description |
-| -- | -- | -- | | |
-| assessmentName | path | True | string | The Assessment Key - Unique key for the assessment type |
-| scope | path | True | string | Scope of the query. Can be subscription (/subscriptions/{SubscriptionID}) or management group (/providers/Microsoft.Management/managementGroups/mgName). |
-| subAssessmentName | path | True | string | The Sub-Assessment Key - Unique key for the subassessment type |
-| api-version | query | True | string | API version for the operation |
-
-#### Responses
-
-| Name | Type | Description |
-| - | | - |
-| 200 OK | [SecuritySubAssessment](/rest/api/defenderforcloud/sub-assessments/get#securitysubassessment) | OK |
-| Other Status Codes | [CloudError](/rest/api/defenderforcloud/sub-assessments/get#clouderror) | Error response describing why the operation failed. |
-
-### List
-
-#### GET
-
-`https://management.azure.com/{scope}/providers/Microsoft.Security/assessments/{assessmentName}/subAssessments?api-version=2019-01-01-preview`
-
-#### URI parameters
-
-| **Name** | **In** | **Required** | **Type** | **Description** |
-| | | | -- | |
-| **assessmentName** | path | True | string | The Assessment Key - Unique key for the assessment type |
-| **scope** | path | True | string | Scope of the query. The scope for AzureContainerVulnerability is the registry itself. |
-| **api-version** | query | True | string | API version for the operation |
-
-#### Responses
-
-| Name | Type | Description |
-| | | |
-| 200 OK | [SecuritySubAssessmentList](/rest/api/defenderforcloud/sub-assessments/list#securitysubassessmentlist) | OK |
-| Other Status Codes | [CloudError](/rest/api/defenderforcloud/sub-assessments/list#clouderror) | Error response describing why the operation failed. |
-
-## Security
+Azure Resource Graph (ARG) provides a REST API that can be used to programmatically access vulnerability assessment results for both Azure registry and runtime vulnerability recommendations.
+Learn more about [ARG references and query examples](/azure/governance/resource-graph/overview).
-### azure_auth
+Azure container registry vulnerability subassessments are published to ARG as part of the security resources. For more information, see:
+- [Security Resources ARG Query Samples](/azure/governance/resource-graph/samples/samples-by-category?tabs=azure-cli#list-container-registry-vulnerability-assessment-results)
+- [Generic Security Sub Assessment Query](/azure/governance/resource-graph/samples/samples-by-category?tabs=azure-cli#list-container-registry-vulnerability-assessment-results)
-Azure Active Directory OAuth2 Flow
+## ARG query examples
-Type: oauth2
-Flow: implicit
-Authorization URL: `https://login.microsoftonline.com/common/oauth2/authorize`
-
-Scopes
-
-| Name | Description |
-| | -- |
-| user_impersonation | impersonate your user account |
-
-### Example
-
-### HTTP
-
-#### GET
-
-`https://management.azure.com/subscriptions/{SubscriptionID}/resourceGroups/myResourceGroup/providers/Microsoft.ContainerRegistry/registries/myRegistry/providers/Microsoft.Security/assessments/{SubscriptionID}/subAssessments?api-version=2019-01-01-preview`
-
-#### Sample Response
+To pull specific subassessments, you need the assessment key. For container vulnerability assessment powered by MDVM, the key is `c0b7cfc6-3172-465a-b378-53c7ff2cc0d5`.
+The following generic security subassessment query can be used as a template for building your own queries. This query pulls subassessments generated in the last hour.
+```kql
+securityresources
+| where type =~ "microsoft.security/assessments/subassessments" and properties.additionalData.assessedResourceType == "AzureContainerRegistryVulnerability"
+| extend assessmentKey=extract(@"(?i)providers/Microsoft.Security/assessments/([^/]*)", 1, id)
+| where assessmentKey == "c0b7cfc6-3172-465a-b378-53c7ff2cc0d5"
+| extend timeGenerated = properties.timeGenerated
+| where timeGenerated > ago(1h)
+```
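
If you prefer to run ARG queries from a shell, here's a sketch using the Azure CLI with the Resource Graph extension; the `take 10` limit is an illustrative addition, and your environment may already have the extension installed:

```bash
# Install the Resource Graph extension for the Azure CLI (one-time step)
az extension add --name resource-graph

# Pull up to 10 container registry vulnerability subassessments
az graph query -q "securityresources \
| where type =~ 'microsoft.security/assessments/subassessments' and properties.additionalData.assessedResourceType == 'AzureContainerRegistryVulnerability' \
| extend assessmentKey = extract('(?i)providers/Microsoft.Security/assessments/([^/]*)', 1, id) \
| where assessmentKey == 'c0b7cfc6-3172-465a-b378-53c7ff2cc0d5' \
| take 10" --output json
```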
+### Query result
```json
-{
- "value": [
- {
- "type": "Microsoft.Security/assessments/subAssessments",
- "id": "/subscriptions/{SubscriptionID}/resourceGroups/PytorchEnterprise/providers/Microsoft.ContainerRegistry/registries/ptebic/providers/Microsoft.Security/assessments/c0b7cfc6-3172-465a-b378-53c7ff2cc0d5/subassessments/3f069764-2777-3731-9698-c87f23569a1d",
- "name": "{name}",
- "properties": {
- "id": "CVE-2021-39537",
- "displayName": "CVE-2021-39537",
- "status": {
- "code": "NotApplicable",
- "severity": "High",
- "cause": "Exempt",
- "description": "Disabled parent assessment"
- },
- "remediation": "Create new image with updated package libncursesw5 with version 6.2-0ubuntu2.1 or higher.",
- "description": "This vulnerability affects the following vendors: Gnu, Apple, Red_Hat, Ubuntu, Debian, Suse, Amazon, Microsoft, Alpine. To view more details about this vulnerability please visit the vendor website.",
- "timeGenerated": "2023-08-08T08:14:13.742742Z",
- "resourceDetails": {
- "source": "Azure",
- "id": "/repositories/public/azureml/aifx/stable-ubuntu2004-cu116-py39-torch1121/images/sha256:7f107db187ff32acfbc47eaa262b44d13d725f14dd08669a726a81fba87a12d6"
- },
- "additionalData": {
- "assessedResourceType": "AzureContainerRegistryVulnerability",
- "artifactDetails": {
- "repositoryName": "public/azureml/aifx/stable-ubuntu2004-cu116-py39-torch1121",
- "registryHost": "ptebic.azurecr.io",
- "digest": "sha256:7f107db187ff32acfbc47eaa262b44d13d725f14dd08669a726a81fba87a12d6",
- "tags": [
- "biweekly.202305.2"
- ],
- "artifactType": "ContainerImage",
- "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
- "lastPushedToRegistryUTC": "2023-05-15T16:00:40.2938142Z"
- },
- "softwareDetails": {
- "osDetails": {
- "osPlatform": "linux",
- "osVersion": "ubuntu_linux_20.04"
- },
- "packageName": "libncursesw5",
- "category": "OS",
- "fixReference": {
- "id": "USN-6099-1",
- "url": "https://ubuntu.com/security/notices/USN-6099-1",
- "description": "USN-6099-1: ncurses vulnerabilities 2023 May 23",
- "releaseDate": "2023-05-23T00:00:00+00:00"
- },
- "vendor": "ubuntu",
- "version": "6.2-0ubuntu2",
- "evidence": [
- "dpkg-query -f '${Package}:${Source}:\\n' -W | grep -e ^libncursesw5:.* -e .*:libncursesw5: | cut -f 1 -d ':' | xargs dpkg-query -s",
- "dpkg-query -f '${Package}:${Source}:\\n' -W | grep -e ^libncursesw5:.* -e .*:libncursesw5: | cut -f 1 -d ':' | xargs dpkg-query -s"
- ],
- "language": "",
- "fixedVersion": "6.2-0ubuntu2.1",
- "fixStatus": "FixAvailable"
- },
- "vulnerabilityDetails": {
- "cveId": "CVE-2021-39537",
- "references": [
- {
- "title": "CVE-2021-39537",
- "link": "https://nvd.nist.gov/vuln/detail/CVE-2021-39537"
- }
- ],
- "cvss": {
- "2.0": null,
- "3.0": {
- "base": 7.8,
- "cvssVectorString": "CVSS:3.0/AV:L/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H/E:P/RL:U/RC:R"
- }
- },
- "workarounds": [],
- "publishedDate": "2020-08-04T00:00:00",
- "lastModifiedDate": "2023-07-07T00:00:00",
- "severity": "High",
- "cpe": {
- "uri": "cpe:2.3:a:ubuntu:libncursesw5:*:*:*:*:*:ubuntu_linux_20.04:*:*",
- "part": "Applications",
- "vendor": "ubuntu",
- "product": "libncursesw5",
- "version": "*",
- "update": "*",
- "edition": "*",
- "language": "*",
- "softwareEdition": "*",
- "targetSoftware": "ubuntu_linux_20.04",
- "targetHardware": "*",
- "other": "*"
- },
- "weaknesses": {
- "cwe": [
- {
- "id": "CWE-787"
- }
- ]
- },
- "exploitabilityAssessment": {
- "exploitStepsVerified": false,
- "exploitStepsPublished": false,
- "isInExploitKit": false,
- "types": [],
- "exploitUris": []
- }
- },
- "cvssV30Score": 7.8
- }
+[
+ {
+ "id": "/subscriptions/{SubscriptionId}/resourceGroups/{ResourceGroup}/providers/Microsoft.ContainerRegistry/registries/{Registry Name}/providers/Microsoft.Security/assessments/c0b7cfc6-3172-465a-b378-53c7ff2cc0d5/subassessments/{SubAssessmentId}",
+ "name": "{SubAssessmentId}",
+ "type": "microsoft.security/assessments/subassessments",
+ "tenantId": "{TenantId}",
+ "kind": "",
+ "location": "global",
+ "resourceGroup": "{ResourceGroup}",
+ "subscriptionId": "{SubscriptionId}",
+ "managedBy": "",
+ "sku": null,
+ "plan": null,
+ "properties": {
+ "id": "CVE-2022-42969",
+ "additionalData": {
+ "assessedResourceType": "AzureContainerRegistryVulnerability",
+ "vulnerabilityDetails": {
+ "severity": "High",
+ "exploitabilityAssessment": {
+ "exploitStepsPublished": false,
+ "exploitStepsVerified": false,
+ "isInExploitKit": false,
+ "exploitUris": [],
+ "types": [
+ "Remote"
+ ]
+ },
+ "lastModifiedDate": "2023-09-12T00:00:00Z",
+ "publishedDate": "2022-10-16T06:15:00Z",
+ "workarounds": [],
+ "references": [
+ {
+ "title": "CVE-2022-42969",
+ "link": "https://nvd.nist.gov/vuln/detail/CVE-2022-42969"
+ },
+ {
+ "title": "oval:org.opensuse.security:def:202242969",
+ "link": "https://ftp.suse.com/pub/projects/security/oval/suse.linux.enterprise.server.15.xml.gz"
+ },
+ {
+ "title": "oval:com.microsoft.cbl-mariner:def:11166",
+ "link": "https://raw.githubusercontent.com/microsoft/CBL-MarinerVulnerabilityData/main/cbl-mariner-1.0-oval.xml"
+ },
+ {
+ "title": "ReDoS in py library when used with subversion ",
+ "link": "https://github.com/advisories/GHSA-w596-4wvx-j9j6"
+ }
+ ],
+ "weaknesses": {
+ "cwe": [
+ {
+ "id": "CWE-1333"
+ }
+ ]
+ },
+ "cveId": "CVE-2022-42969",
+ "cvss": {
+ "2.0": null,
+ "3.0": {
+ "cvssVectorString": "CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H",
+ "base": 7.5
}
- }
- ]
-}
+ },
+ "cpe": {
+ "language": "*",
+ "softwareEdition": "*",
+ "version": "*",
+ "targetHardware": "*",
+ "targetSoftware": "python",
+ "vendor": "py",
+ "edition": "*",
+ "product": "py",
+ "update": "*",
+ "other": "*",
+ "part": "Applications",
+ "uri": "cpe:2.3:a:py:py:*:*:*:*:*:python:*:*"
+ }
+ },
+ "artifactDetails": {
+ "lastPushedToRegistryUTC": "2023-09-04T16:05:32.8223098Z",
+ "repositoryName": "public/azureml/aifx/stable-ubuntu2004-cu117-py39-torch200",
+ "registryHost": "ptebic.azurecr.io",
+ "artifactType": "ContainerImage",
+ "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
+ "digest": "sha256:4af8e6f002401a965bbe753a381af308b40d8947fad2b9e1f6a369aa81abee59",
+ "tags": [
+ "biweekly.202309.1"
+ ]
+ },
+ "softwareDetails": {
+ "category": "Language",
+ "language": "python",
+ "fixedVersion": "",
+ "version": "1.11.0.0",
+ "vendor": "py",
+ "packageName": "py",
+ "osDetails": {
+ "osPlatform": "linux",
+ "osVersion": "ubuntu_linux_20.04"
+ },
+ "fixStatus": "FixAvailable",
+ "evidence": []
+ },
+ "cvssV30Score": 7.5
+ },
+ "description": "This vulnerability affects the following vendors: Pytest, Suse, Microsoft, Py. To view more details about this vulnerability please visit the vendor website.",
+ "displayName": "CVE-2022-42969",
+ "resourceDetails": {
+ "id": "/repositories/public/azureml/aifx/stable-ubuntu2004-cu117-py39-torch200/images/sha256:4af8e6f002401a965bbe753a381af308b40d8947fad2b9e1f6a369aa81abee59",
+ "source": "Azure"
+ },
+ "timeGenerated": "2023-09-12T13:36:15.0772799Z",
+ "remediation": "No remediation exists",
+ "status": {
+ "description": "Disabled parent assessment",
+ "severity": "High",
+ "code": "NotApplicable",
+ "cause": "Exempt"
+ }
+ },
+ "tags": null,
+ "identity": null,
+ "zones": null,
+ "extendedLocation": null,
+ "assessmentKey": "c0b7cfc6-3172-465a-b378-53c7ff2cc0d5",
+ "timeGenerated": "2023-09-12T13:36:15.0772799Z"
+ }
+]
```

## Definitions
Scopes
| Name | Description |
| -- | -- |
| AzureResourceDetails | Details of the Azure resource that was assessed |
-| CloudError | Common error response for all Azure Resource Manager APIs to return error details for failed operations. (This definition also follows the OData error response format.). |
-| CloudErrorBody | The error detail |
| AzureContainerVulnerability | More context fields for container registry Vulnerability assessment |
| CVE | CVE Details |
| CVSS | CVSS Details |
-| ErrorAdditionalInfo | The resource management error additional info. |
| SecuritySubAssessment | Security subassessment on a resource |
| SecuritySubAssessmentList | List of security subassessments |
| ArtifactDetails | Details for the affected container image |
### AzureContainerRegistryVulnerability (MDVM)
-Additional context fields for Azure container registry vulnerability assessment
+Other context fields for Azure container registry vulnerability assessment
| **Name** | **Type** | **Description** |
| -- | -- | -- |

### AzureResourceDetails

Details of the Azure resource that was assessed

| **Name** | **Type** | **Description** |
| -- | -- | -- |
| ID | string | Azure resource ID of the assessed resource |
| source | string: Azure | The platform where the assessed resource resides |
-### CloudError
-
-Common error response for all Azure Resource Manager APIs to return error details for failed operations. (This response also follows the OData error response format.).
-
-| **Name** | **Type** | **Description** |
-| -- | | -- |
-| error.additionalInfo | [ErrorAdditionalInfo](/rest/api/defenderforcloud/sub-assessments/list#erroradditionalinfo) | The error additional info. |
-| error.code | string | The error code. |
-| error.details | [CloudErrorBody](/rest/api/defenderforcloud/sub-assessments/list?tabs=HTTP#clouderrorbody) | The error details. |
-| error.message | string | The error message. |
-| error.target | string | The error target. |
-
-### CloudErrorBody
-
-The error detail.
-
-| **Name** | **Type** | **Description** |
-| -- | | -- |
-| additionalInfo | [ErrorAdditionalInfo](/rest/api/defenderforcloud/sub-assessments/list#erroradditionalinfo) | The error additional info. |
-| code | string | The error code. |
-| details | [CloudErrorBody](/rest/api/defenderforcloud/sub-assessments/list#clouderrorbody) | The error details. |
-| message | string | The error message. |
-| target | string | The error target. |
-
-### ErrorAdditionalInfo
-
-The resource management error additional info.
-
-| **Name** | **Type** | **Description** |
-| -- | -- | - |
-| info | object | The additional info. |
-| type | string | The additional info type. |
-
### SecuritySubAssessment

Security subassessment on a resource
deployment-environments How To Authenticate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/deployment-environments/how-to-authenticate.md
+
+ Title: Authenticate to Azure Deployment Environments REST APIs
+description: Learn how to authenticate to Azure Deployment Environments REST APIs.
+++++ Last updated : 09/07/2023+
+# Authenticating to Azure Deployment Environments REST APIs
+
+> [!TIP]
+> Before authenticating, ensure that the user or identity has the appropriate permissions to perform the desired action. For more information, see [configuring project admins](./how-to-configure-project-admin.md) and [configuring environment users](./how-to-configure-deployment-environments-user.md).
++
+## Using Azure AD authentication for REST APIs
+
+Use the following procedures to authenticate with Azure AD. You can follow along in [Azure Cloud Shell](../../articles/cloud-shell/quickstart.md), on an Azure virtual machine, or on your local machine.
+
+### Sign in to the user's Azure subscription
+
+Start by authenticating with Azure AD by using the Azure CLI. This step isn't required in Azure Cloud Shell.
+
+```azurecli
+az login
+```
+
+The command opens a browser window to the Azure AD authentication page, where you enter your Azure AD user ID and password.
+
+Next, set the correct subscription context. If you authenticate from an incorrect subscription or tenant, you might receive unexpected 403 Forbidden errors.
+
+```azurecli
+az account set --subscription <subscription_id>
+```
++
+### Retrieve the Azure AD access token
+
+Use the Azure CLI to acquire an access token for the Azure AD authenticated user.
+The resource ID differs depending on whether you're accessing administrator (control plane) APIs or developer (data plane) APIs.
+
+For administrator APIs, use the following command:
+```azurecli-interactive
+az account get-access-token
+```
+
+For developer APIs, use the following command:
+```azurecli-interactive
+az account get-access-token --resource https://devcenter.azure.com
+```
+
+After authentication succeeds, Azure AD returns an access token for the current Azure subscription:
+
+```json
+{
+ "accessToken": "[TOKEN]",
+ "expiresOn": "[expiration_date_and_time]",
+ "subscription": "[subscription_id]",
+ "tenant": "[tenant_id]",
+ "tokenType": "Bearer"
+}
+```
+
+The token is a Base64 string. It's valid for at least 5 minutes and at most 90 minutes; the `expiresOn` value defines the actual token expiration time.
+
+> [!TIP]
+> Developer API tokens for the service are encrypted and cannot be decoded using JWT decoding tools. They can only be processed by the service.
++
+### Using a bearer token to access REST APIs
+To access REST APIs, you must set the Authorization header on your request. The header value should be the string `Bearer` followed by a space and the token you received in the previous step.
+
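+As a minimal sketch, a data-plane request with `curl` might look like the following example. The endpoint, project name, API version, and the list-environments path are placeholders and assumptions, not confirmed by this article; substitute the values for your own dev center.
+
+```azurecli
+# Illustrative only: <devcenter-endpoint>, <project-name>, and <api-version> are placeholders
+ACCESS_TOKEN=$(az account get-access-token --resource https://devcenter.azure.com --query accessToken -o tsv)
+curl -H "Authorization: Bearer $ACCESS_TOKEN" \
+  "https://<devcenter-endpoint>/projects/<project-name>/environments?api-version=<api-version>"
+```
+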
+## Next steps
+- Review [Azure Active Directory fundamentals](../../articles/active-directory/fundamentals/whatis.md).
dev-box How To Authenticate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dev-box/how-to-authenticate.md
+
+ Title: Authenticate to Microsoft Dev Box REST APIs
+description: Learn how to authenticate to Microsoft Dev Box REST APIs.
+++++ Last updated : 09/07/2023+
+# Authenticating to Microsoft Dev Box REST APIs
+
+> [!TIP]
+> Before authenticating, ensure that the user or identity has the appropriate permissions to perform the desired action. For more information, see [configuring project admins](./how-to-project-admin.md) and [configuring Dev Box users](./how-to-dev-box-user.md).
++
+## Using Azure AD authentication for REST APIs
+
+Use the following procedures to authenticate with Azure AD. You can follow along in [Azure Cloud Shell](../../articles/cloud-shell/quickstart.md), on an Azure virtual machine, or on your local machine.
+
+### Sign in to the user's Azure subscription
+
+Start by authenticating with Azure AD by using the Azure CLI. This step isn't required in Azure Cloud Shell.
+
+```azurecli
+az login
+```
+
+The command opens a browser window to the Azure AD authentication page, where you enter your Azure AD user ID and password.
+
+Next, set the correct subscription context. If you authenticate from an incorrect subscription or tenant, you might receive unexpected 403 Forbidden errors.
+
+```azurecli
+az account set --subscription <subscription_id>
+```
++
+### Retrieve the Azure AD access token
+
+Use the Azure CLI to acquire an access token for the Azure AD authenticated user.
+The resource ID differs depending on whether you're accessing administrator (control plane) APIs or developer (data plane) APIs.
+
+For administrator APIs, use the following command:
+```azurecli-interactive
+az account get-access-token
+```
+
+For developer APIs, use the following command:
+```azurecli-interactive
+az account get-access-token --resource https://devcenter.azure.com
+```
+
+After authentication succeeds, Azure AD returns an access token for the current Azure subscription:
+
+```json
+{
+ "accessToken": "[TOKEN]",
+ "expiresOn": "[expiration_date_and_time]",
+ "subscription": "[subscription_id]",
+ "tenant": "[tenant_id]",
+ "tokenType": "Bearer"
+}
+```
+
+The token is a Base64 string. It's valid for at least 5 minutes and at most 90 minutes; the `expiresOn` value defines the actual token expiration time.
+
+> [!TIP]
+> Developer API tokens for the service are encrypted and cannot be decoded using JWT decoding tools. They can only be processed by the service.
++
+### Using a bearer token to access REST APIs
+To access REST APIs, you must set the Authorization header on your request. The header value should be the string `Bearer` followed by a space and the token you received in the previous step.
+
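+As a sketch, you can also let `az rest` acquire the token and set the header for you; the `--resource` flag tells the CLI which audience to request. The endpoint, project name, API version, and the `/pools` path are placeholders and assumptions, not confirmed by this article.
+
+```azurecli
+# Illustrative only: <devcenter-endpoint>, <project-name>, and <api-version> are placeholders
+az rest --method get \
+  --resource https://devcenter.azure.com \
+  --url "https://<devcenter-endpoint>/projects/<project-name>/pools?api-version=<api-version>"
+```
+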
+## Next steps
+- Review [Azure Active Directory fundamentals](../../articles/active-directory/fundamentals/whatis.md).
devtest-labs Deliver Proof Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/deliver-proof-concept.md
# Deliver a proof of concept for Azure DevTest Labs enterprise deployment
+Enterprises are rapidly adopting the cloud because of [benefits](/azure/architecture/cloud-adoption/business-strategy/cloud-migration-business-case) that include agility, flexibility, and economics. The first steps are often development and test workloads. Azure DevTest Labs provides [features](devtest-lab-concepts.md) that benefit the enterprise and support [key dev/test scenarios](devtest-lab-guidance-get-started.md).
 This article describes how an enterprise can deliver a successful proof of concept or pilot for an Azure DevTest Labs deployment. A proof of concept uses a concentrated effort from a single team to establish organizational value. Every enterprise has different requirements for incorporating Azure DevTest Labs into their organization. A proof of concept is a first step toward a successful end-to-end deployment.
Learn about Azure and DevTest Labs by using the following resources:
- [Understand the Azure portal](https://azure.microsoft.com/features/azure-portal)
- [DevTest Labs overview](devtest-lab-overview.md)
- [DevTest Labs scenarios](devtest-lab-guidance-get-started.md)
-- [DevTest Labs in the enterprise](devtest-lab-guidance-prescriptive-adoption.md)
- [DevTest Labs enterprise reference architecture](devtest-lab-reference-architecture.md)
+### Understand enterprise focus areas
+
+Common concerns for enterprises that migrate workloads to the cloud include:
+
+- [Securing development/testing resources](devtest-lab-guidance-governance-policy-compliance.md)
+- [Managing and understanding costs](devtest-lab-guidance-governance-cost-ownership.md)
+- Enabling self-service for developers without compromising enterprise security and compliance
+- Automating and extending DevTest Labs to cover additional scenarios
+- [Scaling a DevTest Labs-based solution to thousands of resources](devtest-lab-guidance-scale.md)
+- [Large-scale deployments of DevTest Labs](devtest-lab-guidance-orchestrate-implementation.md)
+- [Getting started with a proof of concept](devtest-lab-guidance-orchestrate-implementation.md)
### Get an Azure subscription

- Enterprises with an existing [Enterprise Agreement](https://azure.microsoft.com/pricing/purchase-options/enterprise-agreement) that enables access to Azure can use an existing or new subscription for DevTest Labs. If there's an Enterprise Agreement in place, an [Enterprise Dev/Test subscription](https://azure.microsoft.com/offers/ms-azr-0148p/) gives you access to Windows 10/Windows 8.1 client operating systems, and discounted rates for development and testing workloads.
devtest-labs Devtest Lab Guidance Governance Application Migration Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-governance-application-migration-integration.md
- Title: Application migration and integration
-description: This article provides governance guidance for Azure DevTest Labs infrastructure. The context is application migration and integration.
--- Previously updated : 06/26/2020----
-# Governance of Azure DevTest Labs infrastructure - Application migration and integration
-Once your development/test lab environment has been established, you need to think about the following questions:
-
-- How do you use the environment within your project team?
-- How do you ensure that you follow any required organizational policies, and maintain the agility to add value to your application?
-
-## Azure Marketplace images vs. custom images
-
-### Question
-When should I use an Azure Marketplace image vs. my own custom organizational image?
-
-### Answer
-Azure Marketplace should be used by default unless you have specific concerns or organizational requirements. Some common examples include;
-
-- Complex software setup that requires an application to be included as part of the base image.
-- Installation and setup of an application could take many hours, which aren't an efficient use of compute time to be added on an Azure Marketplace image.
-- Developers and testers require access to a virtual machine quickly, and want to minimize the setup time of a new virtual machine.
-- Compliance or regulatory conditions (for example, security policies) that must be in place for all machines.
-
-Consider using custom images carefully. Custom images introduce extra complexity, as you now have to manage VHD files for those underlying base images. You also need to routinely patch those base images with software updates. These updates include new operating system (OS) updates, and any updates or configuration changes needed for the software package itself.
-
-## Formula vs. custom image
-
-### Question
-When should I use a formula vs. custom image?
-
-### Answer
-Typically, the deciding factor in this scenario is cost and reuse.
-
-You can reduce cost by creating a custom image if:
-- Many users or labs require the image.
-- The required image has a lot of software on top of the base image.
-
-This solution means that you create the image once. A custom image reduces the setup time of the virtual machine. You don't incur costs from running the virtual machine during setup.
-
-Another factor is the frequency of changes to your software package. If you run daily builds and require that software to be on your users' virtual machines, consider using a formula instead of a custom image.
-
-## Use custom organizational images
-
-This scenario is an advanced scenario, and the scripts provided are sample scripts only. If any changes are required, you need to manage and maintain the scripts used in your environment.
---
-## Patterns to set up network configuration
-
-### Question
-How do I ensure that development and test virtual machines are unable to reach the public internet? Are there any recommended patterns to set up network configuration?
-
-### Answer
-Yes. There are two aspects to consider – inbound and outbound traffic.
-
-**Inbound traffic** – If the virtual machine doesn't have a public IP address, then the internet can't reach it. A common approach is to set a subscription-level policy that no user can create a public IP address.
-
-**Outbound traffic** – If you want to prevent virtual machines from going directly to the public internet, and force traffic through a corporate firewall, you can route traffic on-premises via Azure ExpressRoute or VPN, by using forced routing.
-
-> [!NOTE]
-> If you have a proxy server that blocks traffic without proxy settings, do not forget to add exceptions to the lab's artifact storage account.
-
-You could also use network security groups for virtual machines or subnets. This step adds another layer of protection to allow or block traffic.
-
-## New vs. existing virtual network
-
-### Question
-When should I create a new virtual network for my DevTest Labs environment vs. using an existing virtual network?
-
-### Answer
-If your VMs need to interact with existing infrastructure, you should consider using an existing virtual network inside your DevTest Labs environment. If you use ExpressRoute, minimize the number of virtual networks and subnets so you don't fragment the IP address space assigned to your subscriptions. Also consider using the virtual network peering pattern here (Hub-Spoke model). This approach enables virtual network and subnet communication across subscriptions within a given region.
-
-Each DevTest Labs environment could have its own virtual network, but there are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md) on the number of virtual networks per subscription. The default amount is 50, though this limit can be raised to 100.
-
-## Shared, public, or private IP
-
-### Question
-When should I use a shared IP vs. public IP vs. private IP?
-
-### Answer
-If you use a site-to-site VPN or Express Route, consider using private IPs so that your machines are accessible via your internal network, and inaccessible over public internet.
-
-> [!NOTE]
-> Lab owners can change this subnet policy to ensure that no one accidentally creates public IP addresses for their VMs. The subscription owner should create a subscription policy preventing public IPs from being created.
-
-When using shared public IPs, the virtual machines in a lab share a public IP address. This approach can be helpful when you need to avoid breaching the limits on public IP addresses for a given subscription.
-
-## Limits of labs per subscription
-### Question
-How many labs can I create under the same subscription?
-
-### Answer
-
-There isn't a specific limit on the number of labs that can be created per subscription. However, the amount of resources used per subscription is limited. You can read about the [limits and quotas for Azure subscriptions](../azure-resource-manager/management/azure-subscription-service-limits.md) and [how to increase these limits](https://azure.microsoft.com/blog/azure-limits-quotas-increase-requests).
-
-## Limits of VMs per lab
-### Question
-How many VMs can I create per lab?
-
-### Answer:
-There is no specific limit on the number of VMs that can be created per lab. However, the resources (VM cores, public IP addresses, and so on) that are used are limited per subscription. You can read about the [limits and quotas for Azure subscriptions](../azure-resource-manager/management/azure-subscription-service-limits.md) and [how to increase these limits](https://azure.microsoft.com/blog/azure-limits-quotas-increase-requests).
-
-## Limits of number of virtual machines per user or lab
-
-### Question
-Is there a rule for how many virtual machines I should set per user, or per lab?
-
-### Answer
-When considering the number of virtual machines per user or per lab, there are three main concerns:
-
-- The **overall cost** that the team can spend on resources in the lab. It's easy to spin up many machines. To control costs, one mechanism is to limit the number of VMs per user or per lab.
-- The total number of virtual machines in a lab is impacted by the [subscription level quotas](../azure-resource-manager/management/azure-subscription-service-limits.md) available. One of the upper limits is 800 resource groups per subscription. DevTest Labs currently creates a new resource group for each VM (unless shared public IPs are used). If there are 10 labs in a subscription, labs could fit approximately 79 virtual machines in each lab (800 upper limit – 10 resource groups for the 10 labs themselves) = 79 virtual machines per lab.
-- If the lab is connected to on-premises via Express Route (for example), there are **defined IP address spaces available** for the VNet/Subnet. To ensure that VMs in the lab don't fail to be created (error: can't get IP address), lab owners can specify the max VMs per lab aligned with the IP address space available.
-
-## Use Resource Manager templates
-
-### Question
-How can I use Resource Manager templates in my DevTest Labs Environment?
-
-### Answer
-Deploy your Resource Manager templates by using the steps in [Use Azure DevTest Labs for test environments](devtest-lab-test-env.md). Basically, you check your Resource Manager templates into an Azure Repos or GitHub Git repository, and add a [private repository for your templates](devtest-lab-test-env.md) to the lab.
-
-This scenario may not be useful if you're using DevTest Labs to host development machines. Use this scenario to build a staging environment that's representative of production.
-
-The number of virtual machines per lab or per user option only limits the number of machines natively created in the lab itself. This option doesn't limit creation by any environments with Resource Manager templates.
-
-## Next steps
-See [Use environments in DevTest Labs](devtest-lab-test-env.md).
devtest-labs Devtest Lab Guidance Governance Cost Ownership https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-governance-cost-ownership.md
- Title: Manage cost and ownership
-description: This article provides information that helps you optimize for cost and align ownership across your environment.
--- Previously updated : 06/26/2020----
-# Governance of Azure DevTest Labs infrastructure - Manage cost and ownership
-Cost and ownership are primary concerns when you consider building your development and test environments. In this section, you find information that helps you optimize for cost and align ownership across your environment.
-
-## Optimize for cost
-
-### Question
-How can I optimize for cost within my DevTest Labs environment?
-
-### Answer
-Several built-in features of DevTest Labs help you optimize for cost. See [cost management, thresholds](devtest-lab-configure-cost-management.md) [,and policies](devtest-lab-set-lab-policy.md) articles to limit activities of your users.
-
-If you use DevTest Labs for development and test workloads, consider using the [Enterprise Dev/Test Subscription Benefit](https://azure.microsoft.com/offers/ms-azr-0148p/) that's part of your Enterprise Agreement. Or if you're a Pay as you Go customer, consider the [Pay-as-you go DevTest offer](https://azure.microsoft.com/offers/ms-azr-0023p/).
-
-This approach provides several advantages:
-
-- Special lower Dev/Test rates on Windows virtual machines, cloud services, HDInsight, App Service, and Logic Apps
-- Great Enterprise Agreement (EA) rates on other Azure services
-- Access to exclusive Dev/Test images in the Gallery, including Windows 8.1 and Windows 10
-
-Only active Visual Studio subscribers (standard subscriptions, annual cloud subscriptions, and monthly cloud subscriptions) can use Azure resources running within an enterprise Dev/Test subscription. However, end users can access the application to provide feedback or do acceptance testing. You can use resources within this subscription only for developing and testing applications. There's no uptime guarantee.
-
-If you decide to use the DevTest offer, use this benefit exclusively for development and testing your applications. Usage within the subscription doesn't carry a financially backed SLA, except for the use of Azure DevOps and HockeyApp.
-
-## Define role-based access across your organization
-### Question
-How do I define Azure role-based access control for my DevTest Labs environments to ensure that IT can govern while developers/test can do their work?
-
-### Answer
-There's a broad pattern, but the detail depends on your organization.
-
-Central IT should own only what's necessary, and enable the project and application teams to have the needed level of control. Typically, it means that central IT owns the subscription and handles core IT functions such as networking configurations. The set of **owners** for a subscription should be small. These owners can nominate other owners when there's a need, or apply subscription-level policies, for example "No Public IP".
-
-There may be a subset of users that require access across a subscription, such as Tier1 or Tier 2 support. In this case, we recommend that you give these users the **contributor** access so that they can manage the resources, but not provide user access or adjust policies.
-
-DevTest Labs resource owners should be close to the project or application team. These owners understand machine and software requirements. In most organizations, the owner of the DevTest Labs resource is the project or development lead. This owner can manage users and policies within the lab environment and can manage all virtual machines in the DevTest Labs environment.
-
-Add project and application team members to the DevTest Labs Users role. These users can create virtual machines, in line with lab and subscription-level policies. Users can also manage their own virtual machines, but can't manage virtual machines that belong to other users.
-
-For more information, see [Azure enterprise scaffold – prescriptive subscription governance](/azure/architecture/cloud-adoption/appendix/azure-scaffold).
--
-## Next steps
-See [Corporate policy and compliance](devtest-lab-guidance-governance-policy-compliance.md).
devtest-labs Devtest Lab Guidance Governance Policy Compliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-governance-policy-compliance.md
- Title: Company policy and compliance
-description: This article provides guidance on governing company policy and compliance for Azure DevTest Labs infrastructure.
--- Previously updated : 06/26/2020----
-# Governance of Azure DevTest Labs infrastructure - Company policy and compliance
-This article provides guidance on governing company policy and compliance for Azure DevTest Labs infrastructure.
-
-## Public vs. private artifact repository
-
-### Question
-When should an organization use a public artifact repository vs. private artifact repository in DevTest Labs?
-
-### Answer
-The [public artifact repository](https://github.com/Azure/azure-devtestlab/tree/master/Artifacts) provides an initial set of software packages that are most commonly used. It helps with rapid deployment without having to invest time to reproduce common developer tools and add-ins. You can choose to deploy their own private repository. You can use a public and a private repository in parallel. You may also choose to disable the public repository. The criteria to deploy a private repository should be driven by the following questions and considerations:
-
-- Does the organization have a requirement to have corporate licensed software as part of their DevTest Labs offering? If the answer is yes, then a private repository should be created.
-- Does the organization develop custom software that provides a specific operation, which is required as part of the overall provisioning process? If the answer is yes, then a private repository should be deployed.
-- If organization's governance policy requires isolation, and external repositories are not under direct configuration management by the organization, a private artifact repository should be deployed. As part of this process, an initial copy of the public repository can be copied and integrated with the private repository. Then, the public repository can be disabled so that no one within the organization can access it anymore. This approach forces all users within the organization to have only a single repository that is approved by the organization and minimize configuration drift.
-
-### Single repository or multiple repositories
-
-### Question
-Should an organization plan for a single repository or allow multiple repositories?
-
-### Answer
-As part of your organization's overall governance and configuration management strategy, we recommend that you use a centralized repository. When you use multiple repositories, they may become silos of unmanaged software over the time. With a central repository, multiple teams can consume artifacts from this repository for their projects. It enforces standardization, security, ease of management, and eliminates the duplication of efforts. As part of the centralization, the following actions are recommended practices for long-term management and sustainability:
-
-- Associate the Azure Repos with the same Azure Active Directory tenant that the Azure subscription is using for authentication and authorization.
-- Create a group named **All DevTest Labs Developers** in Azure Active Directory that is centrally managed. Any developer who contributes to artifact development should be placed in this group.
-- The same Azure Active Directory group can be used to provide access to the Azure Repos repository and to the lab.
-- In Azure Repos, branching or forking should be used to a separate an in-development repository from the primary production repository. Content is only added to the main branch with a pull request after a proper code review. Once the code reviewer approves the change, a lead developer, who is responsible for maintenance of the main branch, merges the updated code.
-
-## Corporate security policies
-
-### Question
-How can an organization ensure corporate security policies are in place?
-
-### Answer
-An organization may achieve it by doing the following actions:
-
-1. Developing and publishing a comprehensive security policy. The policy articulates the rules of acceptable use associated with the using software, cloud assets. It also defines what clearly violates the policy.
-2. Develop a custom image, custom artifacts, and a deployment process that allows for orchestration within the security realm that is defined with active directory. This approach enforces the corporate boundary and sets a common set of environmental controls. These controls against the environment a developer can consider as they develop and follow a secure development lifecycle as part of their overall process. The objective also is to provide an environment that is not overly restrictive that may hinder development, but a reasonable set of controls. The group policies at the organization unit (OU) that contains lab virtual machines could be a subset of the total group policies that are found in production. Alternatively, they can be an additional set to properly mitigate any identified risks.
-
-## Data integrity
-
-### Question
-How can an organization ensure data integrity to ensure that remoting developers can't remove code or introduce malware or unapproved software?
-
-### Answer
-There are several layers of control to mitigate the threat from external consultants, contractors, or employees that are remoting in to collaborate in DevTest Labs.
-
-As stated previously, the first step must have an acceptable use policy drafted and defined that clearly outlines the consequences when someone violates the policy.
-
-The first layer of controls for remote access is to apply a remote access policy through a VPN connection that is not directly connected to the lab.
-
-The second layer of controls is to apply a set of group policy objects that prevent copy and paste through remote desktop. A network policy could be implemented to not allow outbound services from the environment such as FTP and RDP services out of the environment. User-defined routing could force all Azure network traffic back to on-premises, but the routing could not account for all URLs that might allow uploading of data unless controlled through a proxy to scan content and sessions. Public IPs could be restricted within the virtual network supporting DevTest Labs to not allow bridging of an external network resource.
-
-Ultimately, the same type of restrictions needs to be applied across the organization, which would have to also account for all possible methods of removable media or external URLs that could accept a post of content. Consult with your security professional to review and implement a security policy. For more recommendations, see [Microsoft Cyber Security](https://www.microsoft.com/security/default.aspx?&WT.srch=1&wt.mc_id=AID623240_SEM_sNYnsZDs).
--
-## Next steps
-See [Application migration and integration](devtest-lab-guidance-governance-application-migration-integration.md).
devtest-labs Devtest Lab Guidance Governance Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-governance-resources.md
description: This article addresses the alignment and management of resources fo
Previously updated : 06/26/2020 Last updated : 09/15/2023
-# Governance of Azure DevTest Labs infrastructure - Resources
+# Governance of Azure DevTest Labs infrastructure
This article addresses the alignment and management of resources for DevTest Labs within your organization.
-## Align within an Azure subscription
+## Resources
+### Align DevTest Labs resources within an Azure subscription
-### Question
-How do I align DevTest Labs resources within an Azure subscription?
-
-### Answer
Before an organization begins to use Azure for general application development, IT planners should first review how to introduce the capability as part of their overall portfolio of services. Areas for review should address the following concerns:

- How to measure the cost associated with the application development lifecycle?
Before an organization begins to use Azure for general application development,
The **first recommended practice** is to review organizations' Azure taxonomy where the divisions between production and development subscriptions are outlined. In the following diagram, the suggested taxonomy allows for a logical separation of development/testing and production environments. With this approach, an organization can introduce billing codes to track costs associated with each environment separately. For more information, see [Prescriptive subscription governance](/azure/architecture/cloud-adoption/appendix/azure-scaffold). Additionally, you can use [Azure tags](../azure-resource-manager/management/tag-resources.md) to organize resources for tracking and billing purposes.
-The **second recommended practice** is to enable the DevTest subscription within the Azure Enterprise portal. It allows an organization to run client operating systems that are not typically available in an Azure enterprise subscription. Then, use enterprise software where you pay only for the compute and don't worry about licensing. It ensures that the billing for designated services, including gallery images in IaaS such as Microsoft SQL Server, is based on consumption only. Details about the Azure DevTest subscription can be found [here](https://azure.microsoft.com/offers/ms-azr-0148p/) for Enterprise Agreement (EA) customers and [here](https://azure.microsoft.com/offers/ms-azr-0023p/) for Pay as you Go customers.
+The **second recommended practice** is to enable the DevTest subscription within the Azure Enterprise portal. It allows an organization to run client operating systems that aren't typically available in an Azure enterprise subscription. Then, use enterprise software where you pay only for the compute and don't worry about licensing. It ensures that the billing for designated services, including gallery images in IaaS such as Microsoft SQL Server, is based on consumption only. Details about the Azure DevTest subscription can be found [here](https://azure.microsoft.com/offers/ms-azr-0148p/) for Enterprise Agreement (EA) customers and [here](https://azure.microsoft.com/offers/ms-azr-0023p/) for Pay as you Go customers.
-![Resource alignment with subscriptions](./media/devtest-lab-guidance-governance/resource-alignment-with-subscriptions.png)
This model provides an organization the flexibility to deploy Azure DevTest Labs at scale. An organization can support hundreds of labs for various business units with 100 to 1000 virtual machines running in parallel. It promotes the notion of a centralized enterprise lab solution that can share the same principles of configuration management and security controls.
-This model also ensures that the organization does not exhaust their resource limits associated with their Azure subscription. For details about subscription and service limits, see [Azure subscription and service limits, quotas, and constraints](../azure-resource-manager/management/azure-subscription-service-limits.md). The DevTest Labs provisioning process can consume large number of resource groups. You can request for limits to be increased through a support request in the Azure DevTest subscription. The resources within the production subscription are not affected as the development subscription grows in use. For more information on scaling DevTest Labs, see [Scale quotas and limits in DevTest Labs](devtest-lab-scale-lab.md).
+This model also ensures that the organization doesn't exhaust their resource limits associated with their Azure subscription. For details about subscription and service limits, see [Azure subscription and service limits, quotas, and constraints](../azure-resource-manager/management/azure-subscription-service-limits.md). The DevTest Labs provisioning process can consume a large number of resource groups. You can request limit increases through a support request in the Azure DevTest subscription. The resources within the production subscription aren't affected as the development subscription grows in use. For more information on scaling DevTest Labs, see [Scale quotas and limits in DevTest Labs](devtest-lab-scale-lab.md).
+
+A common subscription level limit that needs to be accounted for is how the network IP range assignments are allocated to support both production and development subscriptions. These assignments should account for growth over time (assuming on-premises connectivity or another networking topology that requires the enterprise to manage their networking stack instead of defaulting to Azure's implementation). The recommended practice is to have a few virtual networks that have a large IP address prefix assigned and divided with many large subnets rather than to have multiple virtual networks with small subnets. For example, with 10 subscriptions, you can define 10 virtual networks (one for each subscription). All labs that don't require isolation can share the same subnet on the subscription's virtual network.
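+
+As a minimal sketch of that practice, assuming hypothetical resource group, network, and address-range names, you might create one large virtual network per subscription and carve out a single large shared subnet for labs:
+
+```azurecli
+# Hypothetical names and address ranges; adjust to your own IP plan
+az network vnet create \
+  --resource-group rg-devtest-network \
+  --name vnet-devtest \
+  --address-prefix 10.100.0.0/16
+az network vnet subnet create \
+  --resource-group rg-devtest-network \
+  --vnet-name vnet-devtest \
+  --name snet-shared-labs \
+  --address-prefixes 10.100.0.0/18
+```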
+
+### Number of users per lab and labs per organization
+
+Business units and development groups that are associated with the same development project should be associated with the same lab. It allows the same types of policies, images, and shutdown policies to be applied to both groups.
+
+You may also need to consider geographic boundaries. For example, developers in the north east United States (US) may use a lab provisioned in East US2. And, developers in Dallas, Texas, and Denver, Colorado may be directed to use a resource in US South Central. If there's a collaborative effort with an external third party, they could be assigned to a lab that isn't used by internal developers.
+
+You may also use a lab for a specific project within Azure DevOps Projects. Then, you apply security through a specified Azure Active Directory group, which allows access to both sets of resources. The virtual network assigned to the lab can be another boundary to consolidate users.
+
+### Preventing the deletion of resources
+
+Set permissions at the lab level so that only authorized users can delete resources or change lab policies. Developers should be placed within the **DevTest Labs Users** group. The lead developer or the infrastructure lead should be the **DevTest Labs Owner**. We recommend that you have only two lab owners. This policy extends towards the code repository to avoid corruption. Lab users have rights to use resources but can't update lab policies. See the following article that lists the roles and rights that each built-in group has within a lab: [Add owners and users in Azure DevTest Labs](devtest-lab-add-devtest-user.md).
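+
+As a sketch, assuming a hypothetical user, resource group, and lab name, the built-in role can be granted at lab scope with the Azure CLI:
+
+```azurecli
+# Hypothetical values; grant a developer the built-in DevTest Labs User role on one lab
+az role assignment create \
+  --assignee dev-user@contoso.com \
+  --role "DevTest Labs User" \
+  --scope "/subscriptions/<subscription-id>/resourceGroups/<lab-rg>/providers/Microsoft.DevTestLab/labs/<lab-name>"
+```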
+
+## Manage cost and ownership
+Cost and ownership are primary concerns when you consider building your development and test environments. In this section, you find information that helps you optimize for cost and align ownership across your environment.
+
+### Optimize for cost
+
+Several built-in features of DevTest Labs help you optimize for cost. See the [cost management and thresholds](devtest-lab-configure-cost-management.md) and [policies](devtest-lab-set-lab-policy.md) articles to limit the activities of your users.
+
+If you use DevTest Labs for development and test workloads, consider using the [Enterprise Dev/Test Subscription Benefit](https://azure.microsoft.com/offers/ms-azr-0148p/) that's part of your Enterprise Agreement. Or if you're a Pay as you Go customer, consider the [Pay-as-you go DevTest offer](https://azure.microsoft.com/offers/ms-azr-0023p/).
+
+This approach provides several advantages:
+
+- Special lower Dev/Test rates on Windows virtual machines, cloud services, HDInsight, App Service, and Logic Apps
+- Great Enterprise Agreement (EA) rates on other Azure services
+- Access to exclusive Dev/Test images in the Gallery, including Windows 8.1 and Windows 10
+
+Only active Visual Studio subscribers (standard subscriptions, annual cloud subscriptions, and monthly cloud subscriptions) can use Azure resources running within an enterprise Dev/Test subscription. However, end users can access the application to provide feedback or do acceptance testing. You can use resources within this subscription only for developing and testing applications. There's no uptime guarantee.
+
+If you decide to use the DevTest offer, use this benefit exclusively for development and testing your applications. Usage within the subscription doesn't carry a financially backed SLA, except for the use of Azure DevOps and HockeyApp.
+
+### Define role-based access across your organization
+
+Central IT should own only what's necessary, and enable the project and application teams to have the needed level of control. Typically, it means that central IT owns the subscription and handles core IT functions such as networking configurations. The set of **owners** for a subscription should be small. These owners can nominate other owners when there's a need, or apply subscription-level policies, for example "No Public IP".
+
+There may be a subset of users that require access across a subscription, such as Tier 1 or Tier 2 support. In this case, we recommend that you give these users the **contributor** access so that they can manage the resources, but not provide user access or adjust policies.
+
+DevTest Labs resource owners should be close to the project or application team. These owners understand machine and software requirements. In most organizations, the owner of the DevTest Labs resource is the project or development lead. This owner can manage users and policies within the lab environment and can manage all virtual machines in the DevTest Labs environment.
+
+Add project and application team members to the DevTest Labs Users role. These users can create virtual machines, in line with lab and subscription-level policies. Users can also manage their own virtual machines, but can't manage virtual machines that belong to other users.
+
+For more information, see [Azure enterprise scaffold – prescriptive subscription governance](/azure/architecture/cloud-adoption/appendix/azure-scaffold).
+
+## Company policy and compliance
+This section provides guidance on governing company policy and compliance for Azure DevTest Labs infrastructure.
+
+### Public vs. private artifact repository
+
+The [public artifact repository](https://github.com/Azure/azure-devtestlab/tree/master/Artifacts) provides an initial set of software packages that are most commonly used. It helps with rapid deployment without having to invest time to reproduce common developer tools and add-ins. You can choose to deploy your own private repository. You can use a public and a private repository in parallel. You may also choose to disable the public repository. The criteria to deploy a private repository should be driven by the following questions and considerations:
+
+- Does the organization have a requirement to have corporate licensed software as part of their DevTest Labs offering? If the answer is yes, then a private repository should be created.
+- Does the organization develop custom software that provides a specific operation, which is required as part of the overall provisioning process? If the answer is yes, then a private repository should be deployed.
+- If the organization's governance policy requires isolation, and external repositories aren't under direct configuration management by the organization, a private artifact repository should be deployed. As part of this process, an initial copy of the public repository can be copied and integrated with the private repository. Then, the public repository can be disabled so that no one within the organization can access it anymore. This approach forces all users within the organization to use a single repository that is approved by the organization, and minimizes configuration drift.
+
+#### Single repository or multiple repositories
+
+As part of your organization's overall governance and configuration management strategy, we recommend that you use a centralized repository. When you use multiple repositories, they may become silos of unmanaged software over time. With a central repository, multiple teams can consume artifacts from this repository for their projects. It enforces standardization and security, eases management, and eliminates duplicated effort. As part of the centralization, the following actions are recommended practices for long-term management and sustainability:
+
+- Associate the Azure Repos with the same Azure Active Directory tenant that the Azure subscription is using for authentication and authorization.
+- Create a group named **All DevTest Labs Developers** in Azure Active Directory that is centrally managed. Any developer who contributes to artifact development should be placed in this group.
+- The same Azure Active Directory group can be used to provide access to the Azure Repos repository and to the lab.
+- In Azure Repos, branching or forking should be used to separate an in-development repository from the primary production repository. Content is only added to the main branch with a pull request after a proper code review. Once the code reviewer approves the change, a lead developer, who is responsible for maintenance of the main branch, merges the updated code.
+
+### Corporate security policies
+
+An organization may apply corporate security policies by:
+
+- Developing and publishing a comprehensive security policy. The policy articulates the rules of acceptable use associated with using software and cloud assets. It also defines what clearly violates the policy.
+- Developing a custom image, custom artifacts, and a deployment process that allows for orchestration within the security realm that is defined with Active Directory. This approach enforces the corporate boundary and sets a common set of environmental controls. Developers can consider these controls as they develop and follow a secure development lifecycle as part of their overall process. The objective is to provide an environment with a reasonable set of controls that isn't so restrictive that it hinders development. The group policies at the organization unit (OU) that contains lab virtual machines could be a subset of the total group policies that are found in production. Alternatively, they can be another set to properly mitigate any identified risks.
+
+### Data integrity
+
+An organization can ensure data integrity so that remote developers can't remove code or introduce malware or unapproved software. There are several layers of control to mitigate the threat from external consultants, contractors, or employees who connect remotely to collaborate in DevTest Labs.
+
+As stated previously, the first step is to draft and define an acceptable use policy that clearly outlines the consequences when someone violates the policy.
+
+The first layer of controls for remote access is to apply a remote access policy through a VPN connection that isn't directly connected to the lab.
+
+The second layer of controls is to apply a set of group policy objects that prevent copy and paste through remote desktop. A network policy could be implemented to block outbound services, such as FTP and RDP, from the environment. User-defined routing could force all Azure network traffic back to on-premises, but the routing couldn't account for all URLs that might allow uploading of data unless controlled through a proxy to scan content and sessions. Public IPs could be restricted within the virtual network supporting DevTest Labs to not allow bridging of an external network resource.
+
+Ultimately, the same type of restrictions needs to be applied across the organization, which would have to also account for all possible methods of removable media or external URLs that could accept a post of content. Consult with your security professional to review and implement a security policy. For more recommendations, see [Microsoft Cyber Security](https://www.microsoft.com/security/default.aspx?&WT.srch=1&wt.mc_id=AID623240_SEM_sNYnsZDs).
+
+## Application migration and integration
+Once your development/test lab environment has been established, you need to think about the following questions:
+
+- How do you use the environment within your project team?
+- How do you ensure that you follow any required organizational policies, and maintain the agility to add value to your application?
+
+### Azure Marketplace images vs. custom images
+
+Azure Marketplace should be used by default unless you have specific concerns or organizational requirements. Some common examples include:
+
+- Complex software setup that requires an application to be included as part of the base image.
+- Installation and setup of an application could take many hours, which isn't an efficient use of compute time on top of an Azure Marketplace image.
+- Developers and testers require access to a virtual machine quickly, and want to minimize the setup time of a new virtual machine.
+- Compliance or regulatory conditions (for example, security policies) that must be in place for all machines.
+
+Consider using custom images carefully. Custom images introduce extra complexity, as you now have to manage VHD files for those underlying base images. You also need to routinely patch those base images with software updates. These updates include new operating system (OS) updates, and any updates or configuration changes needed for the software package itself.
+
+### Formula vs. custom image
+
+Typically, the deciding factor in this scenario is cost and reuse.
+
+You can reduce cost by creating a custom image if:
+- Many users or labs require the image.
+- The required image has a lot of software on top of the base image.
+
+This solution means that you create the image once. A custom image reduces the setup time of the virtual machine. You don't incur costs from running the virtual machine during setup.
+
+Another factor is the frequency of changes to your software package. If you run daily builds and require that software to be on your users' virtual machines, consider using a formula instead of a custom image.
+
+### Patterns to set up network configuration
+
+To ensure that development and test virtual machines are unable to reach the public internet, there are two aspects to consider: inbound and outbound traffic.
+
+**Inbound traffic** – If the virtual machine doesn't have a public IP address, then the internet can't reach it. A common approach is to set a subscription-level policy that no user can create a public IP address.
+
+**Outbound traffic** – If you want to prevent virtual machines from going directly to the public internet, and force traffic through a corporate firewall, you can route traffic on-premises via Azure ExpressRoute or VPN, by using forced routing.
+
+> [!NOTE]
+> If you have a proxy server that blocks traffic without proxy settings, don't forget to add exceptions to the lab's artifact storage account.
-A common subscription level limit that needs to be accounted for is how the network IP range assignments are allocated to support both production and development subscriptions. These assignments should account for growth over time (assuming on-premises connectivity or another networking topology that requires the enterprise to manage their networking stack instead of defaulting to Azure's implementation). The recommended practice is to have a few virtual networks that have a large IP address prefix assigned and divided with many large subnets rather than to have multiple virtual networks with small subnets. For example, with 10 subscriptions, you can define 10 virtual networks (one for each subscription). All labs that don't require isolation can share the same subnet on the subscription's vnet.
+You could also use network security groups for virtual machines or subnets. This step adds another layer of protection to allow or block traffic.
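+
+As a minimal sketch of that extra layer, reusing the hypothetical network names from the earlier example, an NSG rule that denies outbound internet traffic from a shared lab subnet might look like this:
+
+```azurecli
+# Hypothetical names; deny outbound internet from the shared lab subnet
+az network nsg create \
+  --resource-group rg-devtest-network \
+  --name nsg-lab-no-internet
+az network nsg rule create \
+  --resource-group rg-devtest-network \
+  --nsg-name nsg-lab-no-internet \
+  --name DenyInternetOutbound \
+  --priority 4096 \
+  --direction Outbound \
+  --access Deny \
+  --destination-address-prefixes Internet \
+  --destination-port-ranges "*"
+az network vnet subnet update \
+  --resource-group rg-devtest-network \
+  --vnet-name vnet-devtest \
+  --name snet-shared-labs \
+  --network-security-group nsg-lab-no-internet
+```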
-## Maintain naming conventions
+### New vs. existing virtual network
-### Question
-How do I maintain a naming convention across my DevTest Labs environment?
+If your VMs need to interact with existing infrastructure, you should consider using an existing virtual network inside your DevTest Labs environment. If you use ExpressRoute, minimize the number of virtual networks and subnets so you don't fragment the IP address space assigned to your subscriptions. Also consider using the virtual network peering pattern (Hub-Spoke model). This approach enables virtual network and subnet communication across subscriptions within a given region.
-### Answer
-You may want to extend current enterprise naming conventions to Azure operations and make them consistent across the DevTest Labs environment.
+Each DevTest Labs environment could have its own virtual network, but there are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md) on the number of virtual networks per subscription. The default amount is 50, though this limit can be raised to 100.
-When deploying DevTest Labs, we recommend that you have specific starting policies. You deploy these policies by a central script and JSON templates to enforce consistency. Naming policies can be implemented through Azure policies applied at the subscription level. For JSON samples for Azure Policy, see [Azure Policy samples](../governance/policy/samples/index.md).
+### Shared, public, or private IP
-## Number of users per lab and labs per organization
+If you use a site-to-site VPN or Express Route, consider using private IPs so that your machines are accessible via your internal network, and inaccessible over the public internet.
-### Question
-How do I determine the ratio of users per lab and the overall number of labs needed across an organization?
+> [!NOTE]
+> Lab owners can change this subnet policy to ensure that no one accidentally creates public IP addresses for their VMs. The subscription owner should create a subscription policy preventing public IPs from being created.
-### Answer
-We recommend that business units and development groups that are associated with the same development project are associated with the same lab. It allows for same types of policies, images, and shutdown policies to be applied to both groups.
+When using shared public IPs, the virtual machines in a lab share a public IP address. This approach can be helpful when you need to avoid breaching the limits on public IP addresses for a given subscription.
-You may also need to consider geographic boundaries. For example, developers in the north east United States (US) may use a lab provisioned in East US2. And, developers in Dallas, Texas, and Denver, Colorado may be directed to use a resource in US South Central. If there is a collaborative effort with an external third party, they could be assigned to a lab that is not used by internal developers.
+### Lab limits
-You may also use a lab for a specific project within Azure DevOps projects. Then, you apply security through a specified Azure Active Directory group, which allows access to both set of resources. The virtual network assigned to the lab can be another boundary to consolidate users.
+There are several lab limits that you should be aware of. These limits are described in the following sections.
+#### Limits of labs per subscription
-## Deletion of resources
+There isn't a specific limit on the number of labs that can be created per subscription. However, the amount of resources used per subscription is limited. You can read about the [limits and quotas for Azure subscriptions](../azure-resource-manager/management/azure-subscription-service-limits.md) and [how to increase these limits](https://azure.microsoft.com/blog/azure-limits-quotas-increase-requests).
+
+#### Limits of VMs per lab
+
+There's no specific limit on the number of VMs that can be created per lab. However, the resources (VM cores, public IP addresses, and so on) that are used are limited per subscription. You can read about the [limits and quotas for Azure subscriptions](../azure-resource-manager/management/azure-subscription-service-limits.md) and [how to increase these limits](https://azure.microsoft.com/blog/azure-limits-quotas-increase-requests).
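To see how close a subscription is to these limits, you can query current usage; a quick sketch:

```azurepowershell
# Compute quotas: cores, VMs, and so on, per region.
Get-AzVMUsage -Location "East US"

# Network quotas: virtual networks, public IP addresses, and so on.
Get-AzNetworkUsage -Location "eastus" | Where-Object { $_.CurrentValue -gt 0 }
```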
+
+#### Limits of VMs per user or lab
-### Question
-How can we prevent the deletion of resources within a lab?
+When you consider the number of virtual machines per user or per lab, there are three main concerns:
-### Answer
-We recommend that you set proper permissions at the lab level so that only authorized users can delete resources or change lab policies. Developers should be placed within the **DevTest Labs Users** group. The lead developer or the infrastructure lead should be the **DevTest Labs Owner**. We recommend that you have only two lab owners. This policy extends towards the code repository to avoid corruption. Lab uses have rights to use resources but cannot update lab policies. See the following article that lists the roles and rights that each built-in group has within a lab: [Add owners and users in Azure DevTest Labs](devtest-lab-add-devtest-user.md).
+- The **overall cost** that the team can spend on resources in the lab. It's easy to spin up many machines. To control costs, one mechanism is to limit the number of VMs per user or per lab.
+- The total number of virtual machines in a lab is affected by the [subscription level quotas](../azure-resource-manager/management/azure-subscription-service-limits.md) available. One of the upper limits is 800 resource groups per subscription. DevTest Labs currently creates a new resource group for each VM (unless shared public IPs are used). If there are 10 labs in a subscription, the 800-resource-group upper limit minus the 10 resource groups for the labs themselves leaves 790 resource groups, which spread across the 10 labs gives approximately 79 virtual machines per lab.
+- If the lab is connected to on-premises via ExpressRoute (for example), there are **defined IP address spaces available** for the VNet/Subnet. To ensure that VMs in the lab don't fail to be created (error: can't get IP address), lab owners can specify the max VMs per lab aligned with the IP address space available (see the sketch after this list).
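As an illustration, the per-user VM limit can be set on the lab's policy resource with the generic resource cmdlets. This is a hedged sketch: the lab and resource group names are hypothetical, and the resource path, fact name, and API version follow the DevTest Labs REST API, so verify them against your environment:

```azurepowershell
# Hypothetical lab and resource group names; limits each user to 5 VMs.
New-AzResource -ResourceGroupName "lab-rg" `
    -ResourceType "Microsoft.DevTestLab/labs/policysets/policies" `
    -ResourceName "MyLab/default/MaxVmsAllowedPerUser" `
    -Properties @{
        factName      = "UserOwnedLabVmCount"  # assumed fact name for per-user VM count
        threshold     = "5"
        evaluatorType = "MaxValuePolicy"
        status        = "Enabled"
    } -ApiVersion "2018-09-15" -Force
```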
-## Move lab to another resource group
+### Use Resource Manager templates
-### Question
-Is it supported to move a lab into another Resource Group?
+Deploy your Resource Manager templates by using the steps in [Use Azure DevTest Labs for test environments](devtest-lab-test-env.md). Basically, you check your Resource Manager templates into an Azure Repos or GitHub Git repository, and add a [private repository for your templates](devtest-lab-test-env.md) to the lab.
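If you want to validate a template against a resource group before checking it into the repository, here's a minimal sketch with hypothetical file and resource group names:

```azurepowershell
# Validate the template and parameters without deploying.
Test-AzResourceGroupDeployment -ResourceGroupName "lab-env-rg" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json"
```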
-### Answer
-Yes. Navigate to the Resource Group page from the home page for your lab. Then, select **Move** on the toolbar, and select the lab you want to move to a different resource group. When you create a lab, a resource group is automatically created for you. However, you may want to move the lab to a different resource group that follows the enterprise naming conventions.
+This scenario may not be useful if you're using DevTest Labs to host development machines. Use this scenario to build a staging environment that's representative of production.
-## Next steps
-See [Manage cost and ownership](devtest-lab-guidance-governance-cost-ownership.md).
+The **virtual machines per lab or per user** option only limits the number of machines natively created in the lab itself. It doesn't limit the machines that environments create from Resource Manager templates.
devtest-labs Devtest Lab Guidance Prescriptive Adoption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-prescriptive-adoption.md
- Title: Adopt Azure DevTest Labs for your enterprise
-description: This article provides prescriptive guidance for using Azure DevTest Labs in your enterprise.
--- Previously updated : 06/26/2020----
-# DevTest Labs in the enterprise
-Enterprises are rapidly adopting the cloud because of [benefits](/azure/architecture/cloud-adoption/business-strategy/cloud-migration-business-case) that include agility, flexibility, and economics. The first steps are often development and test workloads. Azure DevTest Labs provides [features](devtest-lab-concepts.md) that benefit the enterprise and support [key dev/test scenarios](devtest-lab-guidance-get-started.md).
-
-Common concerns for enterprises that migrate workloads to the cloud include:
--- [Securing development/testing resources](devtest-lab-guidance-governance-policy-compliance.md)-- [Managing and understanding costs](devtest-lab-guidance-governance-cost-ownership.md)-- Enabling self-service for developers without compromising enterprise security and compliance-- Automating and extending DevTest Labs to cover additional scenarios-- [Scaling a DevTest Labs-based solution to thousands of resources](devtest-lab-guidance-scale.md)-- [Large-scale deployments of DevTest Labs](devtest-lab-guidance-orchestrate-implementation.md)-- [Getting started with a proof of concept](devtest-lab-guidance-orchestrate-implementation.md)-
-## Intended audience
-This documentation is for enterprise IT planners, architects, and managers who are responsible for establishing and reviewing deployments and overseeing operations. These articles emphasize the overall process and recommended design principles. The goal is to promote a secure and stable development/testing environment, which ultimately drives adoption of Azure DevTest Labs within an organization.
-
-## Enterprise customers
-
-Many current DevTest Labs enterprise customers successfully use DevTest Labs for development and for testing workloads in their organizations. [Learn more](https://azure.microsoft.com/case-studies/?term=DevTest+labs).
-
-## Next steps
-- [Reference architecture for an enterprise](devtest-lab-reference-architecture.md)
devtest-labs Devtest Lab Guidance Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-guidance-scale.md
A separate subscription per user provides equal opportunities to the alternative
- **Chargebacks** to groups or individual developers become much easier allowing organizations to account for costs using their current model. - **Ownership & permissions** of the DevTest Labs environments are simple. You give developers the subscription-level access and they are 100% responsible for everything including the networking configuration, lab policies, and VM management.
-In the Enterprise, there may be enough constraints on the extremes of the spectrum. Therefore, you may need to set up subscriptions in a way that falls in the middle of these extremes. As a best practice, the goal of an organization should be to use the minimum number of subscriptions possible. Keep in mind the forcing functions that increase the total number of subscriptions. To reiterate, subscription topology is critical for an enterprise deployment of DevTest Labs but shouldn't delay a proof of concept. There are more details in the [Governance](devtest-lab-guidance-governance-policy-compliance.md) article on how to decide on subscription and lab granularity in the organization.
+In an enterprise, constraints may rule out both extremes of the spectrum, so you may need to set up subscriptions in a way that falls in the middle. As a best practice, an organization's goal should be to use the minimum number of subscriptions possible, keeping in mind the forcing functions that increase the total number of subscriptions. To reiterate, subscription topology is critical for an enterprise deployment of DevTest Labs but shouldn't delay a proof of concept. There are more details in the [Governance](./devtest-lab-guidance-governance-resources.md#align-devtest-labs-resources-within-an-azure-subscription) article on how to decide on subscription and lab granularity in the organization.
## Roles and responsibilities A DevTest Labs proof of concept has three primary roles with defined responsibilities: Subscription owner, DevTest Labs owner, DevTest Labs user, and optionally a Contributor.
event-grid How To Event Domains https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/how-to-event-domains.md
This article shows how to: * Create an Event Grid domain
-* Subscribe to event grid topics
+* Subscribe to Event Grid topics
* List keys * Publish events to a domain
New-AzEventGridSubscription `
-If you need a test endpoint to subscribe your events to, you can always deploy a [pre-built web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the incoming events. You can send your events to your test website at `https://<your-site-name>.azurewebsites.net/api/updates`.
+If you need a test endpoint to subscribe your events to, you can always deploy a [prebuilt web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the incoming events. You can send your events to your test website at `https://<your-site-name>.azurewebsites.net/api/updates`.
<a href="https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fazure-event-grid-viewer%2Fmaster%2Fazuredeploy.json" target="_blank"><img src="../media/template-deployments/deploy-to-azure.svg" alt="Button to deploy to Azure."></a>
-Permissions that are set for a topic are stored in Azure Active Directory and must be deleted explicitly. Deleting an event subscription won't revoke a users access to create event subscriptions if they've write access on a topic.
+Permissions that are set for a topic are stored in Microsoft Entra ID and must be deleted explicitly. Deleting an event subscription doesn't revoke a user's access to create event subscriptions if they have write access on a topic.
-## Publish events to an Event Grid Domain
+## Publish events to an Event Grid domain
Publishing events to a domain is the same as [publishing to a custom topic](./post-to-custom-topic.md). However, instead of publishing to the custom topic, you publish all events to the domain endpoint. In the JSON event data, you specify the topic you wish the events to go to. The following array of events would result in the event with `"id": "1111"` going to topic `demotopic1`, while the event with `"id": "2222"` would be sent to topic `demotopic2`:
Get-AzEventGridDomainKey `
And then use your favorite method of making an HTTP POST to publish your events to your Event Grid domain.
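For example, here's a minimal Azure PowerShell sketch; the endpoint and key are placeholders, and the `topic` field in each event routes it to the corresponding domain topic:

```azurepowershell
# Placeholder endpoint and access key for the domain.
$endpoint = "https://<domain-name>.<region>-1.eventgrid.azure.net/api/events"
$key      = "<domain-access-key>"

# Two events; the "topic" field routes each one to a different domain topic.
$events = @(
    @{ id = "1111"; topic = "demotopic1"; eventType = "recordInserted"; subject = "app/orders";
       eventTime = (Get-Date).ToUniversalTime().ToString("o"); data = @{ value = 1 }; dataVersion = "1.0" },
    @{ id = "2222"; topic = "demotopic2"; eventType = "recordInserted"; subject = "app/orders";
       eventTime = (Get-Date).ToUniversalTime().ToString("o"); data = @{ value = 2 }; dataVersion = "1.0" }
)

Invoke-RestMethod -Method Post -Uri $endpoint `
    -Headers @{ "aeg-sas-key" = $key } `
    -Body (ConvertTo-Json $events -Depth 5) `
    -ContentType "application/json"
```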
+> [!NOTE]
+> For samples that use programming language SDKs to publish events to an Event Grid domain, use the following links:
+> - [.NET](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/eventgrid/Azure.Messaging.EventGrid/samples/Sample2_PublishEventsToDomain.md#publishing-events-to-an-event-grid-domain)
+> - [Python](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventgrid/azure-eventgrid/samples/sync_samples/sample_publish_eg_events_to_a_domain.py)
+> - [Java](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/eventgrid/azure-messaging-eventgrid-cloudnative-cloudevents/src/samples/java/com/azure/messaging/eventgrid/cloudnative/cloudevents/samples/PublishNativeCloudEventToDomainAsync.java)
+ ## Search lists of topics or subscriptions To search and manage large numbers of topics or subscriptions, the Event Grid APIs support listing and pagination.
event-grid Mqtt Event Grid Namespace Terminology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/mqtt-event-grid-namespace-terminology.md
Client group is a collection of clients. Clients can be grouped together using
## Topic space
-Topic space is a set of topic templates. It's used to simplify access control management by enabling you to grant publish or subscribe access to a group of topics at once instead of individual topics. For more information about topic spaces configuration, see [MQTT topic spaces](mqtt-topic-spaces.md).
+Topic space is a set of topic templates. It's used to simplify access control management by enabling you to scope publish or subscribe access for a client group to a group of topics at once, instead of to individual topics. For more information about topic spaces configuration, see [MQTT topic spaces](mqtt-topic-spaces.md).
## Topic filter
expressroute Expressroute Locations Providers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-locations-providers.md
The following table shows connectivity locations and the service providers for e
| **Cape Town** | [Teraco CT1](https://www.teraco.co.za/data-centre-locations/cape-town/) | 3 | South Africa West | Supported | BCX<br/>Internet Solutions - Cloud Connect<br/>Liquid Telecom<br/>MTN Global Connect<br/>Teraco<br/>Vodacom | | **Chennai** | Tata Communications | 2 | South India | Supported | BSNL<br/>DE-CIX<br/>Global CloudXchange (GCX)<br/>Lightstorm<br/>SIFY<br/>Tata Communications<br/>VodafoneIdea | | **Chennai2** | Airtel | 2 | South India | Supported | Airtel |
-| **Chicago** | [Equinix CH1](https://www.equinix.com/locations/americas-colocation/united-states-colocation/chicago-data-centers/ch1/) | 1 | North Central US | Supported | Aryaka Networks<br/>AT&T Dynamic Exchange<br/>AT&T NetBond<br/>British Telecom<br/>CenturyLink Cloud Connect<br/>Cologix<br/>Colt<br/>Comcast<br/>Coresite<br/>Equinix<br/>InterCloud<br/>Internet2<br/>Level 3 Communications<br/>Megaport<br/>PacketFabric<br/>PCCW Global Limited<br/>Sprint<br/>Tata Communications<br/>Telia Carrier<br/>Verizon<br/>Vodafone<br/>Zayo |
+| **Chicago** | [Equinix CH1](https://www.equinix.com/locations/americas-colocation/united-states-colocation/chicago-data-centers/ch1/) | 1 | North Central US | Supported | Aryaka Networks<br/>AT&T Dynamic Exchange<br/>AT&T NetBond<br/>British Telecom<br/>CenturyLink Cloud Connect<br/>Cologix<br/>Colt<br/>Comcast<br/>Coresite<br/>Equinix<br/>InterCloud<br/>Internet2<br/>Level 3 Communications<br/>Megaport<br/>Momentum Telecom<br/>PacketFabric<br/>PCCW Global Limited<br/>Sprint<br/>Tata Communications<br/>Telia Carrier<br/>Verizon<br/>Vodafone<br/>Zayo |
| **Chicago2** | [CoreSite CH1](https://www.coresite.com/data-center/ch1-chicago-il) | 1 | North Central US | Supported | CoreSite<br/>DE-CIX | | **Copenhagen** | [Interxion CPH1](https://www.interxion.com/Locations/copenhagen/) | 1 | n/a | Supported | GlobalConnect<br/>Interxion | | **Dallas** | [Equinix DA3](https://www.equinix.com/locations/americas-colocation/united-states-colocation/dallas-data-centers/da3/) | 1 | n/a | Supported | Aryaka Networks<br/>AT&T Dynamic Exchange<br/>AT&T NetBond<br/>Cologix<br/>Cox Business Cloud Port<br/>Equinix<br/>Intercloud<br/>Internet2<br/>Level 3 Communications<br/>Megaport<br/>Neutrona Networks<br/>Orange<br/>PacketFabric<br/>Telmex Uninet<br/>Telia Carrier<br/>Transtelco<br/>Verizon<br/>Vodafone<br/>Zayo |
The following table shows connectivity locations and the service providers for e
| **Mumbai** | Tata Communications | 2 | West India | Supported | BSNL<br/>British Telecom<br/>DE-CIX<br/>Global CloudXchange (GCX)<br/>Reliance Jio<br/>Sify<br/>Tata Communications<br/>Verizon | | **Mumbai2** | Airtel | 2 | West India | Supported | Airtel<br/>Sify<br/>Orange<br/>Vodafone Idea | | **Munich** | [EdgeConneX](https://www.edgeconnex.com/locations/europe/munich/) | 1 | n/a | Supported | Colt<br/>DE-CIX<br/>Megaport |
-| **New York** | [Equinix NY5](https://www.equinix.com/locations/americas-colocation/united-states-colocation/new-york-data-centers/ny5/) | 1 | n/a | Supported | CenturyLink Cloud Connect<br/>Coresite<br/>Crown Castle<br/>DE-CIX<br/>Equinix<br/>InterCloud<br/>Lightpath<br/>Megaport<br/>NTT Communications<br/>Packet<br/>Zayo |
+| **New York** | [Equinix NY5](https://www.equinix.com/locations/americas-colocation/united-states-colocation/new-york-data-centers/ny5/) | 1 | n/a | Supported | CenturyLink Cloud Connect<br/>Coresite<br/>Crown Castle<br/>DE-CIX<br/>Equinix<br/>InterCloud<br/>Lightpath<br/>Megaport<br/>Momentum Telecom<br/>NTT Communications<br/>Packet<br/>Zayo |
| **Newport(Wales)** | [Next Generation Data](https://www.nextgenerationdata.co.uk) | 1 | UK West | Supported | British Telecom<br/>Colt<br/>Jisc<br/>Level 3 Communications<br/>Next Generation Data | | **Osaka** | [Equinix OS1](https://www.equinix.com/locations/asia-colocation/japan-colocation/osaka-data-centers/os1/) | 2 | Japan West | Supported | AT TOKYO<br/>BBIX<br/>Colt<br/>Equinix<br/>Internet Initiative Japan Inc. - IIJ<br/>Megaport<br/>NTT Communications<br/>NTT SmartConnect<br/>Softbank<br/>Tokai Communications | | **Oslo** | [DigiPlex Ulven](https://www.digiplex.com/locations/oslo-datacentre) | 1 | Norway East | Supported | GlobalConnect<br/>Megaport<br/>Telenor<br/>Telia Carrier |
The following table shows connectivity locations and the service providers for e
| **Vancouver** | [Cologix VAN1](https://www.cologix.com/data-centers/vancouver/van1/) | 1 | n/a | Supported | Bell Canada<br/>Cologix<br/>Megaport<br/>Telus<br/>Zayo | | **Warsaw** | [Equinix WA1](https://www.equinix.com/data-centers/europe-colocation/poland-colocation/warsaw-data-centers/wa1) | 1 | Poland Central | Supported | Equinix, Orange Poland, T-mobile Poland | | **Washington DC** | [Equinix DC2](https://www.equinix.com/locations/americas-colocation/united-states-colocation/washington-dc-data-centers/dc2/)<br/>[Equinix DC6](https://www.equinix.com/data-centers/americas-colocation/united-states-colocation/washington-dc-data-centers/dc6) | 1 | East US<br/>East US 2 | Supported | Aryaka Networks<br/>AT&T NetBond<br/>British Telecom<br/>CenturyLink Cloud Connect<br/>Cologix<br/>Colt<br/>Comcast<br/>Coresite<br/>Cox Business Cloud Port<br/>Crown Castle<br/>Equinix<br/>Internet2<br/>InterCloud<br/>Iron Mountain<br/>IX Reach<br/>Level 3 Communications<br/>Lightpath<br/>Megaport<br/>Neutrona Networks<br/>NTT Communications<br/>Orange<br/>PacketFabric<br/>SES<br/>Sprint<br/>Tata Communications<br/>Telia Carrier<br/>Verizon<br/>Zayo |
-| **Washington DC2** | [Coresite VA2](https://www.coresite.com/data-center/va2-reston-va) | 1 | East US<br/>East US 2 | n/a | CenturyLink Cloud Connect<br/>Coresite<br/>Intelsat<br/>Megaport<br/>Viasat<br/>Zayo |
+| **Washington DC2** | [Coresite VA2](https://www.coresite.com/data-center/va2-reston-va) | 1 | East US<br/>East US 2 | n/a | CenturyLink Cloud Connect<br/>Coresite<br/>Intelsat<br/>Megaport<br/>Momentum Telecom<br/>Viasat<br/>Zayo |
| **Zurich** | [Interxion ZUR2](https://www.interxion.com/Locations/zurich/) | 1 | Switzerland North | Supported | Colt<br/>Equinix<br/>Intercloud<br/>Interxion<br/>Megaport<br/>Swisscom<br/>Zayo |
If you're remote and don't have fiber connectivity or want to explore other conn
| Location | Exchange | Connectivity providers | |--|--|--| | **Amsterdam** | Equinix<br/>Interxion<br/>Level 3 Communications | BICS<br/>CloudXpress<br/>Eurofiber<br/>Fastweb S.p.A<br/>Gulf Bridge International<br/>Kalaam Telecom Bahrain B.S.C<br/>MainOne<br/>Nianet<br/>POST Telecom Luxembourg<br/>Proximus<br/>RETN<br/>TDC Erhverv<br/>Telecom Italia Sparkle<br/>Telekom Deutschland GmbH<br/>Telia |
-| **Atlanta** | Equinix | Crown Castle |
+| **Atlanta** | Equinix | Crown Castle<br/>Momentum Telecom |
| **Cape Town** | Teraco | MTN | | **Chennai** | Tata Communications | Tata Teleservices | | **Chicago** | Equinix | Crown Castle<br/>Spectrum Enterprise<br/>Windstream |
-| **Dallas** | Equinix<br/>Megaport | Axtel<br/>C3ntro Telecom<br/>Cox Business<br/>Crown Castle<br/>Data Foundry<br/>Spectrum Enterprise<br/>Transtelco |
+| **Dallas** | Equinix<br/>Megaport | Axtel<br/>C3ntro Telecom<br/>Cox Business<br/>Crown Castle<br/>Data Foundry<br/>Momentum Telecom<br/>Spectrum Enterprise<br/>Transtelco |
| **Frankfurt** | Interxion | BICS<br/>Cinia<br/>Equinix<br/>Nianet<br/>QSC AG<br/>Telekom Deutschland GmbH | | **Hamburg** | Equinix | Cinia | | **Hong Kong** | Equinix | Chief<br/>Macroview Telecom | | **Johannesburg** | Teraco | MTN | | **London** | BICS<br/>Equinix<br/>euNetworks | Bezeq International Ltd.<br/>CoreAzure<br/>Epsilon Telecommunications Limited<br/>Exponential E<br/>HSO<br/>NexGen Networks<br/>Proximus<br/>Tamares Telecom<br/>Zain |
-| **Los Angeles** | Equinix | Crown Castle<br/>Spectrum Enterprise<br/>Transtelco |
+| **Los Angeles** | Equinix | Crown Castle<br/>Momentum Telecom<br/>Spectrum Enterprise<br/>Transtelco |
| **Madrid** | Level3 | Zertia |
+| **Miami** | Equinix<br/>Megaport | Momentum Telecom<br/>Zertia |
| **Montreal** | Cologix | Airgate Technologies<br/>Inc. Aptum Technologies<br/>Oncore Cloud Services Inc.<br/>Rogers<br/>Zirro | | **Mumbai** | Tata Communications | Tata Teleservices | | **New York** | Equinix<br/>Megaport | Altice Business<br/>Crown Castle<br/>Spectrum Enterprise<br/>Webair | | **Paris** | Equinix | Proximus | | **Quebec City** | Megaport | Fibrenoire | | **Sao Paulo** | Equinix | Venha Pra Nuvem |
-| **Seattle** | Equinix | Alaska Communications |
-| **Silicon Valley** | Coresite<br/>Equinix | Cox Business<br/>Spectrum Enterprise<br/>Windstream<br/>X2nsat Inc. |
+| **Seattle** | Equinix | Alaska Communications<br/>Momentum Telecom |
+| **Silicon Valley** | Coresite<br/>Equinix<br/>Megaport | Cox Business<br/>Momentum Telecom<br/>Spectrum Enterprise<br/>Windstream<br/>X2nsat Inc. |
| **Singapore** | Equinix | 1CLOUDSTAR<br/>BICS<br/>CMC Telecom<br/>Epsilon Telecommunications Limited<br/>LGA Telecom<br/>United Information Highway (UIH) | | **Slough** | Equinix | HSO | | **Sydney** | Megaport | Macquarie Telecom Group | | **Tokyo** | Equinix | ARTERIA Networks Corporation<br/>BroadBand Tower<br/>Inc. | | **Toronto** | Equinix<br/>Megaport | Airgate Technologies Inc.<br/>Beanfield Metroconnect<br/>Aptum Technologies<br/>IVedha Inc<br/>Oncore Cloud Services Inc.<br/>Rogers<br/>Thinktel<br/>Zirro |
-| **Washington DC** | Equinix | Altice Business<br/>BICS<br/>Cox Business<br/>Crown Castle<br/>Gtt Communications Inc<br/>Epsilon Telecommunications Limited<br/>Masergy<br/>Windstream |
+| **Washington DC** | Equinix | Altice Business<br/>BICS<br/>Cox Business<br/>Crown Castle<br/>Gtt Communications Inc<br/>Epsilon Telecommunications Limited<br/>Masergy<br/>Momentum Telecom<br/>Windstream |
## ExpressRoute system integrators Enabling private connectivity to fit your needs can be challenging, based on the scale of your network. You can work with any of the system integrators listed in the following table to assist you with onboarding to ExpressRoute.
expressroute Expressroute Locations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-locations.md
The following table shows locations by service provider. If you want to view ava
| **[Liquid Intelligent Technologies](https://liquidcloud.africa/connect/)** | Supported | Supported | Cape Town<br/>Johannesburg | | **[LGUplus](http://www.uplus.co.kr/)** |Supported |Supported | Seoul | | **[Megaport](https://www.megaport.com/services/microsoft-expressroute/)** | Supported | Supported | Amsterdam<br/>Atlanta<br/>Auckland<br/>Chicago<br/>Dallas<br/>Denver<br/>Dubai2<br/>Dublin<br/>Frankfurt<br/>Geneva<br/>Hong Kong<br/>Hong Kong2<br/>Las Vegas<br/>London<br/>London2<br/>Los Angeles<br/>Madrid<br/>Melbourne<br/>Miami<br/>Minneapolis<br/>Montreal<br/>Munich<br/>New York<br/>Osaka<br/>Oslo<br/>Paris<br/>Perth<br/>Phoenix<br/>Quebec City<br/>Queretaro (Mexico)<br/>San Antonio<br/>Seattle<br/>Silicon Valley<br/>Singapore<br/>Singapore2<br/>Stavanger<br/>Stockholm<br/>Sydney<br/>Sydney2<br/>Tokyo<br/>Tokyo2 Toronto<br/>Vancouver<br/>Washington DC<br/>Washington DC2<br/>Zurich |
+| **[Momentum Telecom](https://gomomentum.com/)** | Supported | Supported | Chicago<br/>New York<br/>Washington DC2 |
| **[MTN](https://www.mtnbusiness.co.za/en/Cloud-Solutions/Pages/microsoft-express-route.aspx)** | Supported | Supported | London | | **MTN Global Connect** | Supported | Supported | Cape Town<br/>Johannesburg| | **[National Telecom](https://www.nc.ntplc.co.th/cat/category/264/855/CAT+Direct+Cloud+Connect+for+Microsoft+ExpressRoute?lang=en_EN)** | Supported | Supported | Bangkok |
If you're remote and don't have fiber connectivity, or you want to explore other
| **[Macquarie Telecom Group](https://macquariegovernment.com/secure-cloud/secure-cloud-exchange/)** | Megaport | Sydney | | **[MainOne](https://www.mainone.net/services/connectivity/cloud-connect/)** |Equinix | Amsterdam | | **[Masergy](https://www.masergy.com/sd-wan/multi-cloud-connectivity)** | Equinix | Washington DC |
+| **[Momentum Telecom](https://gomomentum.com/)** | Equinix<br/>Megaport | Atlanta<br/>Dallas<br/>Los Angeles<br/>Miami<br/>Seattle<br/>Silicon Valley<br/>Washington DC |
| **[MTN](https://www.mtnbusiness.co.za/en/Cloud-Solutions/Pages/microsoft-express-route.aspx)** | Teraco | Cape Town<br/>Johannesburg | | **[NexGen Networks](https://www.nexgen-net.com/nexgen-networks-direct-connect-microsoft-azure-expressroute.html)** | Interxion | London | | **[Nianet](https://www.globalconnect.dk/)** |Equinix | Amsterdam<br/>Frankfurt |
If you're remote and don't have fiber connectivity, or you want to explore other
| **[Proximus](https://www.proximus.be/en/id_cl_explore/companies-and-public-sector/networks/corporate-networks/explore.html)**| Equinix | Amsterdam<br/>Dublin<br/>London<br/>Paris | | **[QSC AG](https://www2.qbeyond.de/en/)** |Interxion | Frankfurt | | **[RETN](https://retn.net/products/cloud-connect)** | Equinix | Amsterdam |
-| **Rogers** | Cologix<br/>Equinix | Montreal<br/>Toronto |
+| **Rogers** | Cologix<br/>Equinix | Montreal<br/>Toronto |
| **[Spectrum Enterprise](https://enterprise.spectrum.com/services/internet-networking/wan/cloud-connect.html)** | Equinix | Chicago<br/>Dallas<br/>Los Angeles<br/>New York<br/>Silicon Valley | | **[Tamares Telecom](http://www.tamarestelecom.com/our-services/#Connectivity)** | Equinix | London | | **[Tata Teleservices](https://www.tatatelebusiness.com/data-services/ez-cloud-connect/)** | Tata Communications | Chennai<br/>Mumbai |
hdinsight Enable Private Link On Kafka Rest Proxy Hdi Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/enable-private-link-on-kafka-rest-proxy-hdi-cluster.md
Previously updated : 08/30/2022 Last updated : 09/19/2023 # Enable Private Link on an HDInsight Kafka Rest Proxy cluster
hdinsight Enterprise Security Package https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/enterprise-security-package.md
Title: Enterprise Security Package for Azure HDInsight
description: Learn the Enterprise Security Package components and versions in Azure HDInsight. Previously updated : 08/12/2022 Last updated : 09/19/2023 # Enterprise Security Package for Azure HDInsight
hdinsight Apache Hadoop Run Samples Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hadoop/apache-hadoop-run-samples-linux.md
description: Get started using MapReduce samples in jar files included in HDInsi
Previously updated : 08/26/2022 Last updated : 09/14/2023 # Run the MapReduce examples included in HDInsight
The following samples are contained in this archive:
||| |aggregatewordcount|Counts the words in the input files.| |aggregatewordhist|Computes the histogram of the words in the input files.|
-|bbp|Uses Bailey-Borwein-Plouffe to compute exact digits of Pi.|
+|`bbp`|Uses Bailey-Borwein-Plouffe to compute exact digits of Pi.|
|dbcount|Counts the pageview logs stored in a database.| |distbbp|Uses a BBP-type formula to compute exact bits of Pi.| |grep|Counts the matches of a regex in the input.|
The following samples are contained in this archive:
|pentomino|Tile laying program to find solutions to pentomino problems.| |pi|Estimates Pi using a quasi-Monte Carlo method.| |randomtextwriter|Writes 10 GB of random textual data per node.|
-|randomwriter|Writes 10 GB of random data per node.|
-|secondarysort|Defines a secondary sort to the reduce phase.|
+|`randomwriter`|Writes 10 GB of random data per node.|
+|`secondarysort`|Defines a secondary sort to the reduce phase.|
|sort|Sorts the data written by the random writer.| |sudoku|A sudoku solver.| |teragen|Generate data for the terasort.| |terasort|Run the terasort.| |teravalidate|Checking results of terasort.| |wordcount|Counts the words in the input files.|
-|wordmean|Counts the average length of the words in the input files.|
-|wordmedian|Counts the median length of the words in the input files.|
+|`wordmean`|Counts the average length of the words in the input files.|
+|`wordmedian`|Counts the median length of the words in the input files.|
|wordstandarddeviation|Counts the standard deviation of the length of the words in the input files.| ## Run the wordcount example
The following samples are contained in this archive:
* Each column can contain either a number or `?` (which indicates a blank cell) * Cells are separated by a space
-There is a certain way to construct Sudoku puzzles; you can't repeat a number in a column or row. There's an example on the HDInsight cluster that is properly constructed. It is located at `/usr/hdp/*/hadoop/src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples/dancing/puzzle1.dta` and contains the following text:
+There is a certain way to construct Sudoku puzzles; you can't repeat a number in a column or row. There is an example on the HDInsight cluster that is properly constructed. It is located at `/usr/hdp/*/hadoop/src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples/dancing/puzzle1.dta` and contains the following text:
```output 8 5 ? 3 9 ? ? ? ?
The results appear similar to the following text:
## Pi (π) example
-The pi sample uses a statistical (quasi-Monte Carlo) method to estimate the value of pi. Points are placed at random in a unit square. The square also contains a circle. The probability that the points fall within the circle is equal to the area of the circle, pi/4. The value of pi can be estimated from the value of 4R. R is the ratio of the number of points that are inside the circle to the total number of points that are within the square. The larger the sample of points used, the better the estimate is.
+The pi sample uses a statistical (quasi-Monte Carlo) method to estimate the value of pi. Points are placed at random in a unit square. The square also contains a circle. The probability that the points fall within the circle is equal to the area of the circle, pi/4. The value of pi can be estimated from the value of `4R`. R is the ratio of the number of points that are inside the circle to the total number of points that are within the square. The larger the sample of points used, the better the estimate is.
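Written as a formula, the estimate just described is:

$$\pi \approx 4R = 4 \cdot \frac{N_{\text{inside}}}{N_{\text{total}}}$$

where $N_{\text{inside}}$ is the number of points that fall inside the circle and $N_{\text{total}}$ is the total number of points placed in the square.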
Use the following command to run this sample. This command uses 16 maps with 10,000,000 samples each to estimate the value of pi:
yarn jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar
The value returned by this command is similar to **3.14159155000000000000**. For reference, the first 10 decimal places of pi are 3.1415926535.
-## 10 GB GraySort example
+## 10-GB GraySort example
GraySort is a benchmark sort. The metric is the sort rate (TB/minute) that is achieved while sorting large amounts of data, usually a 100 TB minimum.
-This sample uses a modest 10 GB of data so that it can be run relatively quickly. It uses the MapReduce applications developed by Owen O'Malley and Arun Murthy. These applications won the annual general-purpose ("Daytona") terabyte sort benchmark in 2009, with a rate of 0.578 TB/min (100 TB in 173 minutes). For more information on this and other sorting benchmarks, see the [Sort Benchmark](https://sortbenchmark.org/) site.
+This sample uses a modest 10 GB of data so that it can be run relatively quickly. It uses the MapReduce applications developed by Owen O'Malley and Arun Murthy. These applications won the annual general-purpose ("Daytona") terabyte sort benchmark in 2009, with a rate of 0.578 TB/min (100 TB in 173 minutes). For more information on this and other sorting benchmarks, see the [Sort Benchmark](https://sortbenchmark.org/) site.
This sample uses three sets of MapReduce programs:
This sample uses three sets of MapReduce programs:
* **TeraSort**: Samples the input data and uses MapReduce to sort the data into a total order
- TeraSort is a standard MapReduce sort, except for a custom partitioner. The partitioner uses a sorted list of N-1 sampled keys that define the key range for each reduce. In particular, all keys such that sample[i-1] <= key < sample[i] are sent to reduce i. This partitioner guarantees that the outputs of reduce i are all less than the output of reduce i+1.
+ TeraSort is a standard MapReduce sort, except for a custom partitioner. The partitioner uses a sorted list of `N-1` sampled keys that define the key range for each reduce. In particular, all keys such that `sample[i-1] <= key < sample[i]` are sent to reduce `i`. This partitioner guarantees that the outputs of reduce `i` are all less than the outputs of reduce `i+1` (see the sketch after this list).
* **TeraValidate**: A MapReduce program that validates that the output is globally sorted
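To make the partitioning rule concrete, here's a small illustrative sketch (not the actual Java implementation) of how a key is routed to a reduce:

```azurepowershell
# Given the sorted sample keys, return the reduce index for a key:
# all keys with sample[i-1] <= key < sample[i] go to reduce i.
function Get-ReduceIndex([string[]]$sample, [string]$key) {
    for ($i = 0; $i -lt $sample.Length; $i++) {
        if ($key -lt $sample[$i]) { return $i }  # first sample key greater than this key
    }
    return $sample.Length  # key is >= every sample, so it goes to the last reduce
}

Get-ReduceIndex -sample @("g", "n", "t") -key "k"  # returns 1
```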
hdinsight Apache Hadoop Visual Studio Tools Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hadoop/apache-hadoop-visual-studio-tools-get-started.md
keywords: hadoop tools,hive query,visual studio,visual studio hadoop
Previously updated : 09/13/2023 Last updated : 09/19/2023 # Use Data Lake Tools for Visual Studio to connect to Azure HDInsight and run Apache Hive queries
hdinsight Hdinsight Use Sqoop https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hadoop/hdinsight-use-sqoop.md
description: Learn how to use Azure PowerShell from a workstation to run Sqoop i
Previously updated : 08/28/2022 Last updated : 09/18/2023 # Use Apache Sqoop with Hadoop in HDInsight
Although Apache Hadoop is a natural choice for processing unstructured and semi-
[Apache Sqoop](https://sqoop.apache.org/docs/1.99.7/user.html) is a tool designed to transfer data between Hadoop clusters and relational databases. You can use it to import data from a relational database management system (RDBMS) such as SQL Server, MySQL, or Oracle into the Hadoop distributed file system (HDFS), transform the data in Hadoop with MapReduce or Apache Hive, and then export the data back into an RDBMS. In this article, you're using Azure SQL Database for your relational database. > [!IMPORTANT]
-> This article sets up a test environment to perform the data transfer. You then choose a data transfer method for this environment from one of the methods in section [Run Sqoop jobs](#run-sqoop-jobs), further below.
+> This article sets up a test environment to perform the data transfer. You then choose a data transfer method for this environment from the [Run Sqoop jobs](#run-sqoop-jobs) section.
For Sqoop versions that are supported on HDInsight clusters, see [What's new in the cluster versions provided by HDInsight?](../hdinsight-component-versioning.md)
For Sqoop versions that are supported on HDInsight clusters, see [What's new in
HDInsight cluster comes with some sample data. You use the following two samples:
-* An Apache Log4j log file, which is located at `/example/data/sample.log`. The following logs are extracted from the file:
+* An Apache `Log4j` log file, which is located at `/example/data/sample.log`. The following logs are extracted from the file:
```text 2012-02-03 18:35:34 SampleClass6 [INFO] everything normal for id 577725851
In this article, you use these two datasets to test Sqoop import and export.
## <a name="create-cluster-and-sql-database"></a>Set up test environment
-The cluster, SQL database, and other objects are created through the Azure portal using an Azure Resource Manager template. The template can be found in [Azure quickstart templates](https://azure.microsoft.com/resources/templates/hdinsight-linux-with-sql-database/). The Resource Manager template calls a bacpac package to deploy the table schemas to a SQL database. If you want to use a private container for the bacpac files, use the following values in the template:
+The cluster, SQL database, and other objects are created through the Azure portal using an Azure Resource Manager template. The template can be found in [Azure quickstart templates](https://azure.microsoft.com/resources/templates/hdinsight-linux-with-sql-database/). The Resource Manager template calls a bacpac package to deploy the table schemas to a SQL database. If you want to use a private container for the bacpac files, use the following values in the template:
```json "storageKeyType": "Primary",
The cluster, SQL database, and other objects are created through the Azure porta
|Bacpac File Name |Use the default value unless you want to use your own bacpac file.| |Location |Use the default value.|
- The [logical SQL server](/azure/azure-sql/database/logical-servers) name will be `<ClusterName>dbserver`. The database name will be `<ClusterName>db`. The default storage account name will be `e6qhezrh2pdqu`.
+ The [logical SQL server](/azure/azure-sql/database/logical-servers) name is `<ClusterName>dbserver`. The database name is `<ClusterName>db`. The default storage account name is `e6qhezrh2pdqu`.
3. Select **I agree to the terms and conditions stated above**.
The cluster, SQL database, and other objects are created through the Azure porta
## Run Sqoop jobs
-HDInsight can run Sqoop jobs by using a variety of methods. Use the following table to decide which method is right for you, then follow the link for a walkthrough.
+HDInsight can run Sqoop jobs by using various methods. Use the following table to decide which method is right for you, then follow the link for a walkthrough.
| **Use this** if you want... | ...an **interactive** shell | ...**batch** processing | ...from this **client operating system** | |: |::|::|: |: |
hdinsight Apache Hbase Advisor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hbase/apache-hbase-advisor.md
Previously updated : 07/20/2022 Last updated : 09/15/2023 #Customer intent: The azure advisories help to tune the cluster/query. This doc gives a much deeper understanding of the various advisories including the recommended configuration tunings. # Apache HBase advisories in Azure HDInsight
This article describes several advisories to help you optimize the Apache HBase
## Optimize HBase to read most recently written data
-If your usecase involves reading the most recently written data from HBase, this advisory can help you. For high performance, it's optimal that HBase reads are to be served from memstore, instead of the remote storage.
+If your use case involves reading the most recently written data from HBase, this advisory can help you. For high performance, it's optimal for HBase reads to be served from the `memstore` instead of from remote storage.
-The query advisory indicates that for a given column family in a table > 75% reads that are getting served from memstore. This indicator suggests that even if a flush happens on the memstore the recent file needs to be accessed and that needs to be in cache. The data is first written to memstore the system accesses the recent data there. There's a chance that the internal HBase flusher threads detect that a given region has reached 128M (default) size and can trigger a flush. This scenario happens to even the most recent data that was written when the memstore was around 128M in size. Therefore, a later read of those recent records may require a file read rather than from memstore. Hence it is best to optimize that even recent data that is recently flushed can reside in the cache.
+The query advisory indicates that, for a given column family in a table, more than 75% of reads are being served from the `memstore`. This indicator suggests that even if a flush happens on the `memstore`, the recent file needs to be accessed, and it needs to be in the cache. Data is first written to the `memstore`, and the system accesses the recent data there. There's a chance that the internal HBase flusher threads detect that a given region has reached 128M (default) size and trigger a flush. This scenario applies even to the most recent data, written when the `memstore` was around 128M in size. Therefore, a later read of those recent records may require a file read rather than a `memstore` read. Hence, it's best to optimize so that even recently flushed data can reside in the cache.
To optimize the recent data in cache, consider the following configuration settings:
To optimize the recent data in cache, consider the following configuration setti
## Optimize the flush queue
-This advisory indicates that HBase flushes may need tuning. The current configuration for flush handlers may not be high enough to handle with write traffic which may lead to slow down of flushes .
+This advisory indicates that HBase flushes may need tuning. The current number of flush handlers may not be high enough to keep up with write traffic, which can slow down flushes.
In the region server UI, notice if the flush queue grows beyond 100. This threshold indicates the flushes are slow and you may have to tune the `hbase.hstore.flusher.count` configuration. By default, the value is 2. Ensure that the max flusher threads don't increase beyond 6.
-Additionally, see if you have a recommendation for region count tuning. If we yes, we suggest you to try the region tuning to see if that helps in faster flushes. Otherwise, tuning the flusher threads may help you.
+Additionally, see if you have a recommendation for region count tuning. If yes, we suggest you try region tuning to see whether it helps speed up flushes. Otherwise, tuning the flusher threads may help.
## Region count tuning
-The region count tuning advisory indicates that HBase has blocked updates, and the region count may be more than the optimally supported heap size. You can tune the heap size, memstore size, and the region count.
+The region count tuning advisory indicates that HBase has blocked updates, and the region count may be more than the optimally supported heap size. You can tune the heap size, `memstore` size, and the region count.
As an example scenario: -- Assume the heap size for the region server is 10 GB. By default the `hbase.hregion.memstore.flush.size` is `128M`. The default value for `hbase.regionserver.global.memstore.size` is `0.4`. Which means that out of the 10 GB, 4 GB is allocated for memstore (globally).
+- Assume the heap size for the region server is 10 GB. By default, the `hbase.hregion.memstore.flush.size` is `128M`. The default value for `hbase.regionserver.global.memstore.size` is `0.4`. This means that out of the 10 GB, 4 GB is allocated for the `memstore` (globally).
- Assume there's an even distribution of the write load on all the regions, and every region grows up to 128 MB; then the max number of regions in this setup is `32` regions. If a given region server is configured to have 32 regions, the system can better avoid blocking updates. -- With these settings in place, the number of regions is 100. The 4-GB global memstore is now split across 100 regions. So effectively each region gets only 40 MB for memstore. When the writes are uniform, the system does frequent flushes and smaller size of the order < 40 MB. Having many flusher threads might increase the flush speed `hbase.hstore.flusher.count`.
+- With these settings in place, the number of regions is 100. The 4-GB global `memstore` is now split across 100 regions, so effectively each region gets only 40 MB for its `memstore`. When the writes are uniform, the system performs frequent flushes of smaller sizes (under 40 MB). Increasing the number of flusher threads (`hbase.hstore.flusher.count`) might increase the flush speed.
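The arithmetic in this example can be summarized as:

$$\text{supported region count} \approx \frac{\text{heap size} \times \texttt{global.memstore.size}}{\texttt{memstore.flush.size}} = \frac{10\,\text{GB} \times 0.4}{128\,\text{MB}} = 32$$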
-The advisory means that it would be good to reconsider the number of regions per server, the heap size, and the global memstore size configuration along with the tuning of flush threads to avoid updates getting blocked.
+The advisory means that it would be good to reconsider the number of regions per server, the heap size, and the global `memstore` size configuration along with the tuning of flush threads to avoid updates getting blocked.
## Compaction queue tuning
hdinsight Troubleshoot Data Retention Issues Expired Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hbase/troubleshoot-data-retention-issues-expired-data.md
Title: Troubleshoot data retention (TTL) issues with expired data not being dele
description: Troubleshoot various data-retention (TTL) issues with expired data not being deleted from storage on Azure HDInsight Previously updated : 05/06/2022 Last updated : 09/14/2023 # Troubleshoot data retention (TTL) issues with expired data not being deleted from storage on Azure HDInsight
-In HBase cluster, you may decide that you would like to remove data after it ages either to free some storage and save on costs as the older data is no longer needed, either to comply with regulations. When that is needed, you'll usually set TTL in a table at the ColumnFamily level to expire and automatically delete older data. While TTL can be set as well at cell level, setting it at ColumnFamily level is usually a more convenient option because the ease of administration and because a cell TTL (expressed in ms) can't extend the effective lifetime of a cell beyond a ColumnFamily level TTL setting (expressed in seconds), so only required shorter retention times at cell level could benefit from setting cell level TTL.
+In an HBase cluster, you may decide to remove data after it ages, either to free some storage and save on costs because the older data is no longer needed, or to comply with regulations. When that's needed, you set TTL in a table at the ColumnFamily level to expire and automatically delete older data. While TTL can also be set at the cell level, setting it at the ColumnFamily level is usually more convenient: it's easier to administer, and a cell TTL (expressed in ms) can't extend the effective lifetime of a cell beyond a ColumnFamily level TTL setting (expressed in seconds), so cell level TTL is useful only when shorter retention times are required for specific cells.
Despite setting TTL, you may notice sometimes that you don't obtain the desired effect, i.e. some data hasn't expired and/or storage size hasn't decreased. ## Prerequisites
-To prepare to follow the steps and commands below, open two ssh connections to HBase cluster:
+To follow the steps and commands in this article, open two SSH connections to the HBase cluster:
-* In one of the ssh sessions keep the default bash shell.
+* In one of the SSH sessions, keep the default bash shell.
-* In the second ssh session launch HBase shell by running the command below.
+* In the second SSH session, launch the HBase shell by running the following command.
``` hbase shell
To prepare to follow the steps and commands below, open two ssh connections to H
### Check if desired TTL is configured and if expired data is removed from query result
-Follow the steps below to understand where is the issue. Start by checking if the behavior occurs for a specific table or for all the tables. If you're unsure whether the issue impacts all the tables or a specific table, just consider as example a specific table name for the start.
+Follow these steps to understand where the issue is. Start by checking whether the behavior occurs for a specific table or for all tables. If you're unsure, start with a specific table name as an example.
-1. Check first that TTL has been configured for ColumnFamily for the target tables. Run the command below in the ssh session where you launched HBase shell and observe example and output below. One column family has TTL set to 50 seconds, the other ColumnFamily has no value configured for TTL, thus it appears as "FOREVER" (data in this column family isn't configured to expire).
+1. First, check that TTL has been configured for the ColumnFamily of the target tables. Run the following command in the SSH session where you launched the HBase shell, and observe the example output below. One column family has TTL set to 50 seconds; the other ColumnFamily has no value configured for TTL, so it appears as "FOREVER" (data in this column family isn't configured to expire).
``` describe 'table_name'
Follow the steps below to understand where is the issue. Start by checking if th
![Screenshot showing describe table name command.](media/troubleshoot-data-retention-issues-expired-data/image-1.png)
-1. If not configured, default TTL is set to 'FOREVER'. There are two possibilities why data is not expired as expected and removed from query result.
+1. If not configured, the default TTL is 'FOREVER'. There are two possible reasons why data isn't expired as expected and removed from the query result.
- 1. If TTL has any other value then 'FOREVER', observe the value for column family and note down the value in seconds(pay special attention to value correlated with the unit measure as cell TTL is in ms, but column family TTL is in seconds) to confirm if it is the expected one. If the observed value isn't correct, fix that first.
+ 1. If TTL has any value other than 'FOREVER', observe the value for the column family and note down the value in seconds (pay special attention to the unit of measure: cell TTL is in ms, but column family TTL is in seconds) to confirm it's the expected one. If the observed value isn't correct, fix that first.
1. If TTL value is 'FOREVER' for all column families, configure TTL as first step and afterwards monitor if data is expired as expected. 1. If you establish that TTL is configured and has the correct value for the ColumnFamily, next step is to confirm that the expired data no longer shows up when doing table scans. When data expires, it should be removed and not show up in the scan table results. Run the below command in HBase shell to check.
Follow the steps below to understand where is the issue. Start by checking if th
### Check the number and size of StoreFiles per table per region to observe if any changes are visible after the compaction operation
-1. Before moving to next step, from ssh session with bash shell, run the following command to check the current number of StoreFiles and size for each StoreFile currently showing up for the ColumnFamily for which the TTL has been configured. Note first the table and ColumnFamily for which you'll be doing the check, then run the following command in ssh session (bash).
+1. Before moving to the next step, from the SSH session with the bash shell, run the following command to check the current number of StoreFiles and the size of each StoreFile currently showing up for the ColumnFamily for which the TTL has been configured. First note the table and ColumnFamily you're checking, then run the following command in the SSH session (bash).
``` hdfs dfs -ls -R /hbase/data/default/table_name/ | grep "column_family_name"
Follow the steps below to understand where is the issue. Start by checking if th
![Screenshot showing check size of store file command.](media/troubleshoot-data-retention-issues-expired-data/image-2.png)
-1. Likely, there will be more results shown in the output, one result for each region ID that is part of the table and between 0 and more results for StoreFiles present under each region name, for the selected ColumnFamily. To count the overall number of rows in the result output above, run the following command.
+1. The output likely shows multiple results: one for each region ID that is part of the table, and zero or more results for StoreFiles present under each region name, for the selected ColumnFamily. To count the overall number of rows in the result output above, run the following command.
``` hdfs dfs -ls -R /hbase/data/default/table_name/ | grep "column_family_name" | wc -l
Follow the steps below to understand where is the issue. Start by checking if th
hdfs dfs -ls -R /hbase/data/default/table_name/ | grep "column_family_name" ```
+1. Compared to the previous result output, an additional StoreFile is created for each region where data is modified; the StoreFile includes the current content of the MemStore for that region.
+1. An additional store file is created compared to previous result output for each region where data is modified, the StoreFile includes current content of MemStore for that region.
![Screenshot showing memory store for the region.](media/troubleshoot-data-retention-issues-expired-data/image-3.png) ### Check the number and size of StoreFiles per table per region after major compaction
-1. At this point, the data from MemStore has been written to StoreFile, in storage, but expired data may still exist in one or more of the current StoreFiles. Although minor compactions can help delete some of the expired entries, it isn't guaranteed that it will remove all of them as minor compaction will usually not select all the StoreFiles for compaction, while major compaction will select all the StoreFiles for compaction in that region.
+1. At this point, the data from the MemStore has been written to a StoreFile in storage, but expired data may still exist in one or more of the current StoreFiles. Although minor compactions can help delete some of the expired entries, there's no guarantee that they remove all of them, because a minor compaction usually doesn't select all the StoreFiles for compaction, while a major compaction selects all the StoreFiles in that region.
Also, there's another situation when minor compaction may not remove cells with expired TTL. There's a property named MIN_VERSIONS, which defaults to 0 (see the property MIN_VERSIONS=>'0' in the output from describe 'table_name' above). If this property is set to 0, minor compaction removes the cells with expired TTL. If this value is greater than 0, minor compaction may not remove the cells with expired TTL even if it touches the corresponding file as part of compaction. This property configures the minimum number of versions of a cell to keep, even if those versions have expired TTL.
Follow the steps below to understand where is the issue. Start by checking if th
major_compact 'table_name' ```
-1. Depending on the table size, major compaction operation can take some time. Use the command below in HBase shell to monitor progress. If the compaction is still running when you execute the command below, you'll see the output "MAJOR", but if the compaction is completed, you will see the output "NONE".
+1. Depending on the table size, the major compaction operation can take some time. Use the following command in the HBase shell to monitor progress. If the compaction is still running when you execute the command, the output is "MAJOR"; if the compaction is completed, the output is "NONE".
``` compaction_state 'table_name' ```
-1. When the compaction status appears as "NONE" in hbase shell, if you switch quickly to bash and run command
+1. When the compaction status appears as "NONE" in the HBase shell, switch quickly to bash and run the following command.
``` hdfs dfs -ls -R /hbase/data/default/table_name/ | grep "column_family_name"
If you didn't see your problem or are unable to solve your issue, visit one of t
* Get answers from Azure experts through [Azure Community Support](https://azure.microsoft.com/support/community/).
-* Connect with [@AzureSupport](https://twitter.com/azuresupport) - the official Microsoft Azure account for improving customer experience. Connecting the Azure community to the right resources: answers, support, and experts.
+* Connect with [@AzureSupport](https://twitter.com/azuresupport) - the official Microsoft Azure account for improving customer experience. Connecting the Azure community to the right resources: answers, support, and experts.
* If you need more help, you can submit a support request from the [Azure portal](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade/). Select **Support** from the menu bar or open the **Help + support** hub. For more detailed information, review [How to create an Azure support request](../../azure-portal/supportability/how-to-create-azure-support-request.md). Access to Subscription Management and billing support is included with your Microsoft Azure subscription, and Technical Support is provided through one of the [Azure Support Plans](https://azure.microsoft.com/support/plans/).
hdinsight Hdinsight Create Virtual Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hdinsight-create-virtual-network.md
description: Learn how to create an Azure Virtual Network to connect HDInsight t
Previously updated : 08/16/2022 Last updated : 09/19/2023 # Create virtual networks for Azure HDInsight clusters
hdinsight Hdinsight Delete Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hdinsight-delete-cluster.md
description: Information on the various ways that you can delete an Azure HDInsi
Previously updated : 08/26/2022 Last updated : 09/19/2023 # Delete an HDInsight cluster using your browser, PowerShell, or the Azure CLI
hdinsight Hdinsight Hadoop Create Linux Clusters Azure Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hdinsight-hadoop-create-linux-clusters-azure-powershell.md
ms.tool: azure-powershell Previously updated : 08/05/2022 Last updated : 09/19/2023 # Create Linux-based clusters in HDInsight using Azure PowerShell
hdinsight Hdinsight Multiple Clusters Data Lake Store https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hdinsight-multiple-clusters-data-lake-store.md
description: Learn how to use more than one HDInsight cluster with a single Data
Previously updated : 08/16/2022 Last updated : 09/15/2023 # Use multiple HDInsight clusters with an Azure Data Lake Storage account
Last updated 08/16/2022
Starting with HDInsight version 3.5, you can create HDInsight clusters with Azure Data Lake Storage accounts as the default filesystem. Data Lake Storage supports unlimited storage, which makes it ideal not only for hosting large amounts of data, but also for hosting multiple HDInsight clusters that share a single Data Lake Storage Account. For instructions on how to create an HDInsight cluster with Data Lake Storage as the storage, see [Quickstart: Set up clusters in HDInsight](./hdinsight-hadoop-provision-linux-clusters.md).
-This article provides recommendations to the Data Lake Storage administrator for setting up a single and shared Data Lake Storage Account that can be used across multiple **active** HDInsight clusters. These recommendations apply to hosting multiple secure as well as non-secure Apache Hadoop clusters on a shared Data Lake Storage account.
+This article provides recommendations to the Data Lake Storage administrator for setting up a single, shared Data Lake Storage Account that can be used across multiple **active** HDInsight clusters. These recommendations apply to hosting multiple secure and non-secure Apache Hadoop clusters on a shared Data Lake Storage account.
## Data Lake Storage file and folder level ACLs
To enable this folder structure to be effectively used by HDInsight clusters, th
||||||||| |/ | rwxr-x--x |admin |admin |Service principal |--x |FINGRP |r-x | |/clusters | rwxr-x--x |admin |admin |Service principal |--x |FINGRP |r-x |
-|/clusters/finance | rwxr-x--t |admin |FINGRP |Service principal |rwx |- |- |
+|/clusters/finance | rwxr-x--t |admin |FINGRP |Service principal |rwx |- |- |
In the table,
We recommend that input data to a job, and the outputs from a job be stored in a
## Limit on clusters sharing a single storage account
-The limit on the number of clusters that can share a single Data Lake Storage account depends on the workload being run on those clusters. Having too many clusters or very heavy workloads on the clusters that share a storage account might cause the storage account ingress/egress to get throttled.
+The limit on the number of clusters that can share a single Data Lake Storage account depends on the workload being run on those clusters. Having too many clusters or heavy workloads on the clusters that share a storage account might cause the storage account ingress/egress to get throttled.
## Support for Default-ACLs
-When creating a Service Principal with named-user access (as shown in the table above), we recommend **not** adding the named-user with a default-ACL. Provisioning named-user access using default-ACLs results in the assignment of 770 permissions for owning-user, owning-group, and others. While this default value of 770 doesn't take away permissions from owning-user (7) or owning-group (7), it takes away all permissions for others (0). This results in a known issue with one particular use-case that is discussed in detail in the [Known issues and workarounds](#known-issues-and-workarounds) section.
+When creating a Service Principal with named-user access (as shown in the table), we recommend **not** adding the named-user with a default-ACL. Provisioning named-user access using default-ACLs results in the assignment of 770 permissions for owning-user, owning-group, and others. While this default value of 770 doesn't take away permissions from owning-user (7) or owning-group (7), it takes away all permissions for others (0). This results in a known issue with one particular use-case that is discussed in detail in the [Known issues and workarounds](#known-issues-and-workarounds) section.
## Known issues and workarounds
This section lists the known issues for using HDInsight with Data Lake Storage,
### Publicly visible localized Apache Hadoop YARN resources
-When a new Azure Data Lake Storage account is created, the root directory is automatically provisioned with Access-ACL permission bits set to 770. The root folderΓÇÖs owning user is set to the user that created the account (the Data Lake Storage admin) and the owning group is set to the primary group of the user that created the account. No access is provided for "others".
+When a new Azure Data Lake Storage account is created, the root directory is automatically provisioned with Access-ACL permission bits set to 770. The root folder's owning user is set to the user that created the account (the Data Lake Storage admin) and the owning group is set to the primary group of the user that created the account. No access is provided for "others."
-These settings are known to affect one specific HDInsight use-case captured in [YARN 247](https://hwxmonarch.atlassian.net/browse/YARN-247). Job submissions could fail with an error message similar to this:
+These settings are known to affect one specific HDInsight use-case captured in [YARN 247](https://hwxmonarch.atlassian.net/browse/YARN-247). Job submissions could fail with an error message:
```output Resource XXXX is not publicly accessible and as such cannot be part of the public cache. ```
-As stated in the YARN JIRA linked earlier, while localizing public resources, the localizer validates that all the requested resources are indeed public by checking their permissions on the remote file-system. Any LocalResource that doesn't fit that condition is rejected for localization. The check for permissions, includes read-access to the file for "others". This scenario doesn't work out-of-the-box when hosting HDInsight clusters on Azure Data Lake, since Azure Data Lake denies all access to "others" at root folder level.
+As stated in the YARN JIRA linked earlier, while localizing public resources, the localizer validates that all the requested resources are indeed public by checking their permissions on the remote file-system. Any LocalResource that doesn't fit that condition is rejected for localization. The check for permissions includes read-access to the file for "others." This scenario doesn't work out-of-the-box when hosting HDInsight clusters on Azure Data Lake, since Azure Data Lake denies all access to "others" at root folder level.
#### Workaround
-Set read-execute permissions for **others** through the hierarchy, for example, at **/**, **/clusters** and **/clusters/finance** as shown in the table above.
+Set read-execute permissions for **others** through the hierarchy, for example, at **/**, **/clusters** and **/clusters/finance** as shown in the table.
## See also
hdinsight Hdinsight Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/hdinsight-troubleshoot-guide.md
Title: Azure HDInsight troubleshooting guides
description: Troubleshoot Azure HDInsight. Step-by-step documentation shows you how to use HDInsight to solve common problems with Apache Hive, Apache Spark, Apache YARN, Apache HBase, and HDFS. Previously updated : 08/05/2022 Last updated : 09/19/2023 # Troubleshoot Azure HDInsight
hdinsight Hive Migration Across Storage Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/interactive-query/hive-migration-across-storage-accounts.md
Previously updated : 07/19/2022 Last updated : 09/19/2023 # Hive workload migration to new account in Azure Storage
hdinsight Hive Warehouse Connector Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/interactive-query/hive-warehouse-connector-apis.md
Previously updated : 07/19/2022 Last updated : 09/19/2023 # Hive Warehouse Connector APIs in Azure HDInsight
hdinsight Interactive Query Troubleshoot Hive Logs Diskspace Full Headnodes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/interactive-query/interactive-query-troubleshoot-hive-logs-diskspace-full-headnodes.md
Previously updated : 07/19/2022 Last updated : 09/18/2023 # Scenario: Apache Hive logs are filling up the disk space on the head nodes in Azure HDInsight
This article describes troubleshooting steps and possible resolutions for proble
## Issue
-In a HDI 4.0 Apache Hive/LLAP cluster, unwanted logs are taking up the entire disk space on the head nodes. This condition could cause the following problems:
+In an HDI 4.0 Apache Hive/LLAP cluster, unwanted logs are taking up the entire disk space on the head nodes. This condition could cause the following problems:
- SSH access fails because no space is left on the head node. - HiveServer2 Interactive fails to restart. ## Cause
-Automatic hive log deletion is not configured in the advanced hive-log4j2 configurations. The default size limit of 60GB takes too much space for the customer's usage pattern. By default, the amount of logs kept is defined by this equation `MB logs/day = appender.RFA.strategy.max * 10MB`.
+Automatic hive log deletion is not configured in the advanced hive-log4j2 configurations. The default size limit of 60 GB takes too much space for the customer's usage pattern. By default, the amount of logs kept per day is defined by the equation `MB logs/day = appender.RFA.strategy.max * 10MB`. For example, if `appender.RFA.strategy.max` were 30, that would allow up to 300 MB of logs per day.
## Resolution
Automatic hive log deletion is not configured in the advanced hive-log4j2 config
#appender.RFA.strategy.action.condition.nested_condition.lastMod.age = 30D ```
-4. We will go through three basic options with deletion based on:
+4. Choose from three basic options for deletion, based on:
- **Total Size** - Change `appender.RFA.strategy.action.condition.nested_condition.fileSize.exceeds` to a size limit of your choice.
Automatic hive log deletion is not configured in the advanced hive-log4j2 config
``` - **Combination of Total Size and Date**
- - You can combine both options by uncommenting like below. The log4j2 will then behave as so: Start deleting logs when either condition is met.
+ - You can combine both options by uncommenting both conditions. Log4j2 then behaves as follows: it starts deleting logs when either condition is met.
``` # Deletes logs based on total accumulated size, keeping the most recent
iot-edge How To Configure Api Proxy Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/how-to-configure-api-proxy-module.md
Title: Configure API proxy module for Azure IoT Edge
+ Title: Configure API proxy module
+ description: Learn how to customize the API proxy module for IoT Edge gateway hierarchies.
Currently, the default environment variables include:
| Environment variable | Description | | -- | -- |
-| `PROXY_CONFIG_ENV_VAR_LIST` | List all the variables that you intend to update in a comma-separated list. This step prevents accidentally modifying the wrong configuration settings.
-| `NGINX_DEFAULT_TLS` | Specifies the list of TLS protocol(s) to be enabled. See NGINX's [ssl_protocols](https://nginx.org/docs/http/ngx_http_ssl_module.html#ssl_protocols).<br><br>Default is 'TLSv1.2'. |
+| `PROXY_CONFIG_ENV_VAR_LIST` | List all the variables that you intend to update in a comma-separated list. This step prevents accidentally modifying the wrong configuration settings. |
+| `NGINX_DEFAULT_TLS` | Specifies the list of TLS protocols to be enabled. See NGINX's [ssl_protocols](https://nginx.org/docs/http/ngx_http_ssl_module.html#ssl_protocols).<br><br>Default is 'TLSv1.2'. |
| `NGINX_DEFAULT_PORT` | Changes the port that the nginx proxy listens to. If you update this environment variable, you must expose the port in the module dockerfile and declare the port binding in the deployment manifest. For more information, see [Expose proxy port](#expose-proxy-port).<br><br>Default is 443.<br><br>When deployed from the Azure Marketplace, the default port is updated to 8000, to prevent conflicts with the edgeHub module. For more information, see [Minimize open ports](#minimize-open-ports). | | `DOCKER_REQUEST_ROUTE_ADDRESS` | Address to route docker requests. Modify this variable on the top layer device to point to the registry module.<br><br>Default is the parent hostname. | | `BLOB_UPLOAD_ROUTE_ADDRESS` | Address to route blob registry requests. Modify this variable on the top layer device to point to the blob storage module.<br><br>Default is the parent hostname. |
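As a sketch, these environment variables are set in the API proxy module's `env` section of the deployment manifest; the variable values shown here are illustrative only:

```json
"env": {
  "PROXY_CONFIG_ENV_VAR_LIST": { "value": "NGINX_DEFAULT_PORT" },
  "NGINX_DEFAULT_PORT": { "value": "8001" }
}
```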
Configure the following modules at the **top layer**:
Configure the following module on any **lower layer** for this scenario:
-* An API proxy module
+* API proxy module. The API proxy module is required on all lower layer devices except the bottom layer device.
* Configure the following environment variables: | Name | Value |
Configure the following module on any **lower layer** for this scenario:
Port 8000 is exposed by default from the docker image. If a different nginx proxy port is used, add the **ExposedPorts** section declaring the port in the deployment manifest. For example, if you change the nginx proxy port to 8001, add the following to the deployment manifest:
-```
+```json
{ "ExposedPorts": { "8001/tcp": {}
Port 8000 is exposed by default from the docker image. If a different nginx prox
Another use case for the API proxy module is to enable IoT Edge devices in lower layers to upload blobs. This use case enables troubleshooting functionality on lower layer devices like uploading module logs or uploading the support bundle.
-This scenario uses the [Azure Blob Storage on IoT Edge](https://azuremarketplace.microsoft.com/marketplace/apps/azure-blob-storage.edge-azure-blob-storage) module at the top layer to handle blob creation and upload.
-
-In a nested scenario, up to five layers are supported. Each upstream IoT Edge device in the nested hierarchy requires the *Azure Blob Storage on IoT Edge* module. For a sample multi-layer deployment, see the [Azure IoT Edge for Industrial IoT](https://github.com/Azure-Samples/iot-edge-for-iiot) sample.
+This scenario uses the [Azure Blob Storage on IoT Edge](https://azuremarketplace.microsoft.com/marketplace/apps/azure-blob-storage.edge-azure-blob-storage) module at the top layer to handle blob creation and upload. In a nested scenario, up to five layers are supported. The *Azure Blob Storage on IoT Edge* module is required on the top layer device and optional for lower layer devices. For a sample multi-layer deployment, see the [Azure IoT Edge for Industrial IoT](https://github.com/Azure-Samples/iot-edge-for-iiot) sample.
Configure the following modules at the **top layer**:
key-vault Overview Vnet Service Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/general/overview-vnet-service-endpoints.md
Here's a list of trusted services that are allowed to access a key vault if the
| Azure Application Gateway |[Using Key Vault certificates for HTTPS-enabled listeners](../../application-gateway/key-vault-certs.md) | Azure Backup|Allow backup and restore of relevant keys and secrets during Azure Virtual Machines backup, by using [Azure Backup](../../backup/backup-overview.md).| | Azure Batch | [Configure customer-managed keys for Batch accounts](../../batch/batch-customer-managed-key.md) and [Key Vault for User Subscription Batch accounts](../../batch/batch-account-create-portal.md) |
+| Azure Bot Service | [Azure AI Bot Service encryption for data at rest](/azure/bot-service/bot-service-encryption#grant-azure-bot-service-access-to-a-key-vault) |
| Azure CDN | [Configure HTTPS on an Azure CDN custom domain: Grant Azure CDN access to your key vault](../../cdn/cdn-custom-ssl.md?tabs=option-2-enable-https-with-your-own-certificate#grant-azure-cdn-access-to-your-key-vault)| | Azure Container Registry|[Registry encryption using customer-managed keys](../../container-registry/tutorial-enable-customer-managed-keys.md) | Azure Data Factory|[Fetch data store credentials in Key Vault from Data Factory](https://go.microsoft.com/fwlink/?linkid=2109491)|
load-balancer Load Balancer Custom Probe Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/load-balancer-custom-probe-overview.md
The protocol used by the health probe can be configured to one of the following
| Single instance probes down | New TCP connections succeed to remaining healthy backend endpoint. Established TCP connections to this backend endpoint continue. | Existing UDP flows move to another healthy instance in the backend pool.| | All instances probe down | No new flows are sent to the backend pool. Standard Load Balancer allows established TCP flows to continue given that a backend pool has more than one backend instance. Basic Load Balancer terminates all existing TCP flows to the backend pool. | All existing UDP flows terminate. |
-## Probe interval
+## Probe interval & timeout
The interval value determines how frequently the health probe checks for a response from your backend pool instances. If the health probe fails, your backend pool instances are immediately marked as unhealthy. If the next health probe succeeds, Azure Load Balancer marks your backend pool instances as healthy again. The health probe attempts to check the configured health probe port every 5 seconds by default, but it can be explicitly set to another value.
-It is important to note that probes also have a timeout period. For example, if a health probe interval is set to 15 seconds, the total time it takes for your health probe to reflect your application would be 20 seconds (interval + timeout period). Assume the reaction to a timeout response takes a minimum of 5 seconds and a maximum of 10 seconds to react to the change. This example is provided to illustrate what is taking place.
+In order to ensure a timely response is received, health probes have built-in timeouts. The following are the timeout durations for TCP and HTTP/S probes:
+* TCP probe timeout duration: 60 seconds
+* HTTP/S probe timeout duration: 30 seconds (60 seconds for establishing a connection)
+
+If the configured interval is longer than this timeout period, the health probe will time out and fail if no response is received during the timeout period. For example, if a TCP health probe is configured with a probe interval of 120 seconds (every 2 minutes), and no probe response is received within the first 60 seconds, the probe reaches its timeout period and fails.
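+For example, to keep the interval below the timeout, you can set it explicitly with the Azure CLI; this is a sketch, and the resource names are placeholders:
+
+```
+# Set the health probe interval to 15 seconds, well below the 60-second TCP probe timeout
+az network lb probe update \
+  --resource-group MyResourceGroup \
+  --lb-name MyLoadBalancer \
+  --name MyHealthProbe \
+  --interval 15
+```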
## Design guidance
load-balancer Load Balancer Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/load-balancer-overview.md
# Customer intent: As an IT administrator, I want to learn more about the Azure Load Balancer service and what I can use it for. Previously updated : 11/30/2022 Last updated : 09/15/2023 # What is Azure Load Balancer?
-*Load balancing* refers to evenly distributing load (incoming network traffic) across a group of backend resources or servers.
+*Load balancing* refers to efficiently distributing incoming network traffic across a group of backend servers or resources.
Azure Load Balancer operates at layer 4 of the Open Systems Interconnection (OSI) model. It's the single point of contact for clients. Load balancer distributes inbound flows that arrive at the load balancer's front end to backend pool instances. These flows are according to configured load-balancing rules and health probes. The backend pool instances can be Azure Virtual Machines or instances in a Virtual Machine Scale Set.
machine-learning Concept What Is Managed Feature Store https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-what-is-managed-feature-store.md
Materialization is the process of computing feature values for a given feature w
- **Managed spark support for materialization** - materialization jobs are run using Azure Machine Learning managed Spark (in serverless compute instances), so that you're freed from setting up and managing the Spark infrastructure. > [!NOTE]
-> Only offline store (ADLS Gen2) materialization is currently supported.
+> Both offline store (ADLS Gen2) and online store (Redis) materialization are currently supported.
### Feature retrieval
machine-learning How To Manage Inputs Outputs Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-manage-inputs-outputs-pipeline.md
The following screenshots provide an example of how inputs and outputs are displ
In the pipeline job page of studio, the asset type inputs/output of a component is shown as a small circle in the corresponding component, known as the Input/Output port. These ports represent the data flow in a pipeline.
-The pipeline level input/output is displayed as a purple box for easy identification.
+The pipeline level output is displayed as a purple box for easy identification.
:::image type="content" source="./media/how-to-manage-pipeline-input-output/input-output-port.png" lightbox="./media/how-to-manage-pipeline-input-output/input-output-port.png" alt-text="Screenshot highlighting the pipeline input and output port.":::
The end to end notebook example in [azureml-example repo](https://github.com/Azu
### Studio
-You can promote a component's input to pipeline level input in designer authoring page. Go to the component's setting panel by double click the component -> find the input you'd like to promote -> Select the three dots on the right -> Select Add to pipeline input.
+You can promote a component's input to a pipeline-level input on the designer authoring page. Go to the component's settings panel by double-clicking the component -> find the input you'd like to promote -> Select the three dots on the right -> Select Add to pipeline input.
:::image type="content" source="./media/how-to-manage-pipeline-input-output/promote-pipeline-input.png" lightbox="./media/how-to-manage-pipeline-input-output/promote-pipeline-input.png" alt-text="Screenshot highlighting how to promote to pipeline input in designer.":::
You can promote a component's input to pipeline level input in designer authorin
By default, all inputs are required and must be assigned a value (or a default value) each time you submit a pipeline job. However, there may be instances where you need optional inputs. In such cases, you have the flexibility to not assign a value to the input when submitting a pipeline job.
-For example, if you have an optional input with data/model type and you don't assign a value when submitting the pipeline job, there will be a component in the pipeline that doesn't have upstream data dependency (the input port isn't connected to any component or data/model node). The pipeline service invokes this component directly rather than waiting upstream dependency to be ready.
+Optional input can be useful in the following two scenarios:
+
+- If you have an optional data/model type input and don't assign a value to it when submitting the pipeline job, there will be a component in the pipeline that lacks a preceding data dependency. In other words, the input port isn't linked to any component or data/model node. This causes the pipeline service to invoke this component directly, instead of waiting for the preceding dependency to be ready.
+- The screenshot below provides a clear example of the second scenario. If you set `continue_on_step_failure = True` for the pipeline and have a second node (node2) that uses the output from the first node (node1) as an optional input, node2 is still executed even if node1 fails. However, if node2 uses a required input from node1, it isn't executed if node1 fails.
+
+ :::image type="content" source="./media/how-to-manage-pipeline-input-output/continue-on-failure-optional-input.png" lightbox="./media/how-to-manage-pipeline-input-output/continue-on-failure-optional-input.png" alt-text="Screenshot to show the orchestration logic of optional input and continue on failure.":::
The following are examples of how to define optional input.
machine-learning How To Troubleshoot Kubernetes Compute https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-troubleshoot-kubernetes-compute.md
If the proxy and workspace with private link is configured correctly, you can se
## Other known issues
-### Kubernetes compute upadte does not take effect
+### Kubernetes compute update does not take effect
At this time, the CLI v2 and SDK v2 do not allow updating any configuration of an existing Kubernetes compute. For example, changing the namespace will not take effect.
A common cause of the "InternalServerError" failure when creating workloads such
- [How to troubleshoot kubernetes extension](how-to-troubleshoot-kubernetes-extension.md) - [How to troubleshoot online endpoints](how-to-troubleshoot-online-endpoints.md)-- [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)
+- [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)
machine-learning How To Troubleshoot Kubernetes Extension https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-troubleshoot-kubernetes-extension.md
This table shows how to troubleshoot the error codes returned by the HealthCheck
|--|--|--| |E40001 | LOAD_BALANCER_NOT_SUPPORT | Load balancer isn't supported in your cluster. You need to configure the load balancer in your cluster or consider setting `inferenceRouterServiceType` to `nodePort` or `clusterIP`. | |E40002 | INSUFFICIENT_NODE | You have enabled `inferenceRouterHA` that requires at least three nodes in your cluster. Disable the HA if you have fewer than three nodes. |
-|E40003 | INTERNAL_LOAD_BALANCER_NOT_SUPPORT | Currently, only AKS support the internal load balancer. Don't set `internalLoadBalancerProvider` if you don't have an AKS cluster.|
+|E40003 | INTERNAL_LOAD_BALANCER_NOT_SUPPORT | Currently, only AKS supports the internal load balancer, and we only support the `azure` type. Don't set `internalLoadBalancerProvider` if you don't have an AKS cluster. |
|E40007 | INVALID_SSL_SETTING | The SSL key or certificate isn't valid. The CNAME should be compatible with the certificate. | |E45002 | PROMETHEUS_CONFLICT | The Prometheus Operator installed conflicts with your existing Prometheus Operator. For more information, see [Prometheus operator](#prometheus-operator) | |E45003 | BAD_NETWORK_CONNECTIVITY | You need to meet [network-requirements](./how-to-access-azureml-behind-firewall.md#scenario-use-kubernetes-compute).|
machine-learning Get Started Prompt Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/get-started-prompt-flow.md
This article walks you through the main user journey of using Prompt flow in Azu
> Prompt flow is **not supported** in the workspace which has data isolation enabled. The enableDataIsolation flag can only be set at the workspace creation phase and can't be updated. > >Prompt flow is **not supported** in the project workspace which was created with a workspace hub. The workspace hub is a private preview feature.
->
->Prompt flow is **not supported** in workspaces that enable managed VNet. Managed VNet is a private preview feature.
->
->Prompt flow is **not supported** if you secure your Azure AI services account(Azure openAI, Azure cognitive search, Azure content safety) with virtual networks. If you want to use these as connection in prompt flow please allow access from all networks.
In your Azure Machine Learning workspace, you can enable Prompt flow by turning on **Build AI solutions with Prompt flow** in the **Manage preview features** panel.
machine-learning How To Create Manage Runtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/how-to-create-manage-runtime.md
Go to runtime detail page and select update button at the top. You can change ne
> [!NOTE] > If you used a custom environment, you need to rebuild it using latest prompt flow image first, and then update your runtime with the new custom environment.
-## Troubleshooting guide for runtime
-
-### Common issues
-
-#### My runtime is failed with a system error **runtime not ready** when using a custom environment
--
-First, go to the Compute Instance terminal and run `docker ps` to find the root cause.
-
-Use `docker images` to check if the image was pulled successfully. If your image was pulled successfully, check if the Docker container is running. If it's already running, locate this runtime, which will attempt to restart the runtime and compute instance.
-
-#### Run failed due to "No module named XXX"
-
-This type error usually related to runtime lack required packages. If you're using default environment, make sure image of your runtime is using the latest version, learn more: [runtime update](#update-runtime-from-ui), if you're using custom image and you're using conda environment, make sure you have installed all required packages in your conda environment, learn more: [customize Prompt flow environment](how-to-customize-environment-runtime.md#customize-environment-with-docker-context-for-runtime).
-
-#### Request timeout issue
-
-##### Request timeout error shown in UI
-
-**MIR runtime request timeout error in the UI:**
--
-Error in the example says "UserError: Upstream request timeout".
-
-**Compute instance runtime request timeout error:**
--
-Error in the example says "UserError: Invoking runtime gega-ci timeout, error message: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing".
-
-#### How to identify which node consume the most time
-
-1. Check the runtime logs
-
-2. Trying to find below warning log format
-
- {node_name} has been running for {duration} seconds.
-
- For example:
-
- - Case 1: Python script node running for long time.
-
- :::image type="content" source="./media/how-to-create-manage-runtime/runtime-timeout-running-for-long-time.png" alt-text="Screenshot of a timeout run logs in the studio UI. " lightbox = "./media/how-to-create-manage-runtime/runtime-timeout-running-for-long-time.png":::
-
- In this case, you can find that the `PythonScriptNode` was running for a long time (almost 300s), then you can check the node details to see what's the problem.
-
- - Case 2: LLM node running for long time.
-
- :::image type="content" source="./media/how-to-create-manage-runtime/runtime-timeout-by-language-model-timeout.png" alt-text="Screenshot of a timeout logs caused by LLM timeout in the studio UI. " lightbox = "./media/how-to-create-manage-runtime/runtime-timeout-by-language-model-timeout.png":::
-
- In this case, if you find the message `request canceled` in the logs, it may be due to the OpenAI API call taking too long and exceeding the runtime limit.
-
- An OpenAI API Timeout could be caused by a network issue or a complex request that requires more processing time. For more information, see [OpenAI API Timeout](https://help.openai.com/en/articles/6897186-timeout).
-
- You can try waiting a few seconds and retrying your request. This usually resolves any network issues.
-
- If retrying doesn't work, check whether you're using a long context model, such as ΓÇÿgpt-4-32kΓÇÖ, and have set a large value for `max_tokens`. If so, it's expected behavior because your prompt may generate a very long response that takes longer than the interactive mode upper threshold. In this situation, we recommend trying 'Bulk test', as this mode doesn't have a timeout setting.
-
-3. If you can't find anything in runtime logs to indicate it's a specific node issue
-
- Contact the Prompt Flow team ([promptflow-eng](mailto:aml-pt-eng@microsoft.com)) with the runtime logs. We'll try to identify the root cause.
-
-### Compute instance runtime related
-
-#### How to find the compute instance runtime log for further investigation?
-
-Go to the compute instance terminal and run `docker logs -<runtime_container_name>`
-
-#### User doesn't have access to this compute instance. Please check if this compute instance is assigned to you and you have access to the workspace. Additionally, verify that you are on the correct network to access this compute instance.
--
-This because you're cloning a flow from others that is using compute instance as runtime. As compute instance runtime is user isolated, you need to create your own compute instance runtime or select a managed online deployment/endpoint runtime, which can be shared with others.
## Next steps
machine-learning How To Customize Environment Runtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/how-to-customize-environment-runtime.md
Follow [this document to add custom application](../how-to-create-compute-instan
:::image type="content" source="./media/how-to-customize-environment-runtime/runtime-creation-add-custom-application-ui.png" alt-text="Screenshot of compute showing custom applications. " lightbox = "./media/how-to-customize-environment-runtime/runtime-creation-add-custom-application-ui.png":::
-## Create managed online deployment that can be used as Prompt flow runtime (deprecated)
-
-> [!IMPORTANT]
-> Managed online endpoint/deployment as runtime is **deprecated**. Please use [Migrate guide for managed online endpoint/deployment runtime](./migrate-managed-inference-runtime.md).
-
-### Create managed online deployment that can be used as Prompt flow runtime via CLI v2
-
-Learn more about [deploy and score a machine learning model by using an online endpoint](../how-to-deploy-online-endpoints.md)
-
-#### Create managed online endpoint
-
-To define a managed online endpoint, you can use the following yaml template. Make sure to replace the `ENDPOINT_NAME` with the desired name for your endpoint.
-
-```yaml
-$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
-name: <ENDPOINT_NAME>
-description: this is a sample promptflow endpoint
-auth_mode: key
-```
-
-Use following CLI command `az ml online-endpoint create -f <yaml_file> -g <resource_group> -w <workspace_name>` to create managed online endpoint. To learn more, see [Deploy and score a machine learning model by using an online endpoint](../how-to-deploy-online-endpoints.md).
-
-#### Create Prompt flow runtime image config file
-
-To configure your Prompt flow runtime, place the following config file in your model folder. This config file provides the necessary information for the runtime to work properly.
-
-For the `mt_service_endpoint` parameter, follow this format: `https://<region>.api.azureml.ms`. For example, if your region is eastus, then your service endpoint should be `https://eastus.api.azureml.ms`
-
-```yaml
-storage:
- storage_account: <WORKSPACE_LINKED_STORAGE>
-deployment:
- subscription_id: <SUB_ID>
- resource_group: <RG_NAME>
- workspace_name: <WORKSPACE_NAME>
- endpoint_name: <ENDPOINT_NAME>
- deployment_name: blue
- mt_service_endpoint: <PROMPT_FLOW_SERVICE_ENDPOINT>
-```
-
-#### Create managed online endpoint
-
-You need to replace the following placeholders with your own values:
--- `ENDPOINT_NAME`: the name of the endpoint you created in the previous step-- `PRT_CONFIG_FILE`: the name of the config file that contains the port and runtime settings. Include the parent model folder name, for example, if model folder name is `model`, then the config file name should be `model/config.yaml`.-- `IMAGE_NAME` to name of your own image, for example: `mcr.microsoft.com/azureml/promptflow/promptflow-runtime:<newest_version>`, you can also follow [Customize environment with docker context for runtime](#customize-environment-with-docker-context-for-runtime) to create your own environment.-
-```yaml
-$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
-name: blue
-endpoint_name: <ENDPOINT_NAME>
-type: managed
-model:
- path: ./
- type: custom_model
-instance_count: 1
-# 4core, 32GB
-instance_type: Standard_E4s_v3
-request_settings:
- max_concurrent_requests_per_instance: 10
- request_timeout_ms: 90000
-environment_variables:
- PRT_CONFIG_FILE: <PRT_CONFIG_FILE>
-environment:
- name: promptflow-runtime
- image: <IMAGE_NAME>
- inference_config:
- liveness_route:
- port: 8080
- path: /health
- readiness_route:
- port: 8080
- path: /health
- scoring_route:
- port: 8080
- path: /score
-
-```
-
-Use following CLI command `az ml online-deployment create -f <yaml_file> -g <resource_group> -w <workspace_name>` to create managed online deployment that can be used as a Prompt flow runtime.
## Next steps
machine-learning How To Deploy For Real Time Inference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/how-to-deploy-for-real-time-inference.md
If you didn't complete the tutorial, you need to build a flow. Testing the flow
We'll use the sample flow **Web Classification** as example to show how to deploy the flow. This sample flow is a standard flow. Deploying chat flows is similar. Evaluation flow doesn't support deployment.
+## Define the environment used by deployment
+
+When you deploy a prompt flow to a managed online endpoint in the UI, you need to define the environment used by the flow. By default, it uses the latest prompt flow image version. You can specify extra packages you need in `requirements.txt`, a system-generated file that you can find in the root folder of your flow folder. An illustrative sketch follows.
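+For example, a `requirements.txt` with extra packages might look like the following sketch; the package names are placeholders for whatever your flow actually imports:
+
+```
+# Extra packages required by this flow (illustrative examples)
+langchain
+azure-identity
+```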
++ ## Create an online endpoint Now that you have built a flow and tested it properly, it's time to create your online endpoint for real-time inference.
machine-learning How To Evaluate Semantic Kernel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel.md
+
+ Title: Evaluate your Semantic Kernel with Prompt flow (preview)
+
+description: Learn how to evaluate Semantic Kernel in Prompt flow with Azure Machine Learning studio.
+++++++ Last updated : 09/15/2023++
+# Evaluate your Semantic Kernel with Prompt flow (preview)
+
+In the rapidly evolving landscape of AI orchestration, a comprehensive evaluation of your plugins and planners is paramount for optimal performance. This article introduces how to evaluate your **Semantic Kernel** [plugins](/semantic-kernel/ai-orchestration/plugins) and [planners](/semantic-kernel/ai-orchestration/planners) with Prompt flow. Furthermore, you can learn how Prompt flow and Semantic Kernel integrate seamlessly.
++
+The integration of Semantic Kernel with Prompt flow is a significant milestone.
+* It allows you to harness the powerful AI orchestration capabilities of Semantic Kernel to enhance the efficiency and effectiveness of your Prompt flow.
+* More importantly, it enables you to utilize Prompt flow's powerful evaluation and experiment management to assess the quality of your Semantic Kernel plugins and planners comprehensively.
+
+## What is Semantic Kernel?
+
+[Semantic Kernel](/semantic-kernel/overview/) is an open-source SDK that lets you easily combine AI services with conventional programming languages like C# and Python. By doing so, you can create AI apps that combine the best of both worlds. It provides plugins and planners, which are powerful tools that use AI capabilities to optimize operations, thereby driving efficiency and accuracy in planning.
+
+## Using prompt flow for plugin and planner evaluation
+
+As you build plugins and add them to planners, it's important to make sure they work as intended. This becomes crucial as more plugins are added, increasing the potential for errors.
+
+Previously, testing plugins and planners was a manual, time-consuming process. Now, you can automate this with Prompt flow.
+
+In our comprehensive updated documentation, we provide step-by-step guidance on how to:
+1. Create a flow with Semantic Kernel.
+1. Execute batch tests.
+1. Conduct evaluations to quantitatively ascertain the accuracy of your planners and plugins.
+
+### Create a flow with Semantic Kernel
+
+Similar to the integration of Langchain with Prompt flow, Semantic Kernel, which also supports Python, can operate within Prompt flow in the **Python node**.
++
+#### Prerequisites: Set up runtime and connection
+
+> [!IMPORTANT]
+> Prior to developing the flow, it's essential to install the [Semantic Kernel package](/semantic-kernel/get-started/quick-start-guide/?toc=%2Fsemantic-kernel%2Ftoc.json&tabs=python) in the runtime environment used by the executor.
+
+To learn more, see [Customize environment for runtime](./how-to-customize-environment-runtime.md) for guidance.
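+As a minimal sketch, assuming a pip-based environment, the installation is typically:
+
+```
+# Install the Semantic Kernel SDK into the runtime environment
+pip install semantic-kernel
+```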
+
+> [!IMPORTANT]
+> The approach to consuming OpenAI or Azure OpenAI in Semantic Kernel is to obtain the keys you have specified in environment variables or stored in a `.env` file.
+
+In Prompt flow, you need to use a **Connection** to store the keys. You can convert these keys from environment variables to key-value pairs in a custom connection in Prompt flow.
++
+You can then utilize this custom connection to invoke your OpenAI or Azure OpenAI model within the flow.
++
+#### Create and develop a flow
+Once the setup is complete, you can conveniently convert your existing Semantic Kernel planner to a Prompt flow by following the steps below:
+1. Create a standard flow.
+1. Select a runtime with Semantic Kernel installed.
+1. Select the *+ Python* icon to create a new Python node.
+1. Name it after your planner (for example, *math_planner*).
+1. Select the **+** button on the *Files* tab to upload any other reference files (for example, *plugins*).
+1. Update the code in the *__.py* file with your planner's code.
+1. Define the input and output of the planner node.
+1. Set the flow input and output.
+1. Click *Run* for a single test.
+
+For example, we can create a flow with a Semantic Kernel planner that solves math problems. Follow this [documentation](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/create-a-prompt-flow-with-semantic-kernel) with steps necessary to create a simple Prompt flow with Semantic Kernel at its core.
++
+Set up the connection in Python code.
++
+Select the connection object in the node input, and set the model name of OpenAI or deployment name of Azure OpenAI.
++
+### Batch testing your plugins and planners
+
+Instead of manually testing different scenarios one-by-one, you can now automatically run large batches of tests using Prompt flow and benchmark data.
++
+Once the flow has passed the single test run in the previous step, you can effortlessly create a batch test in Prompt flow by adhering to the following steps:
+1. Create benchmark data in a *jsonl* file that contains one JSON object per line, each with the input and the correct ground truth (see the sketch after these steps).
+1. Click *Batch run* to create a batch test.
+1. Complete the batch run settings, especially the data part.
+1. Submit run without evaluation (for this specific batch test, the *Evaluation step* can be skipped).
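+A minimal sketch of one such benchmark file, with hypothetical field names, could look like this (one JSON object per line):
+
+```
+{"question": "What is 5 plus 3?", "ground_truth": "8"}
+{"question": "What is 120 divided by 4?", "ground_truth": "30"}
+```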
+
+In our [Running batches with Prompt flow](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/running-batches-with-prompt-flow?tabs=gpt-35-turbo), we demonstrate how you can use this functionality to run batch tests on a planner that uses a math plugin. By defining a bunch of word problems, we can quickly test any changes we make to our plugins or planners so we can catch regressions early and often.
++
+In your workspace, you can go to the **Run list** in Prompt flow, select **Details** button, and then select **Output** tab to view the batch run result.
++++
+### Evaluating the accuracy
+
+Once a batch run is completed, you then need an easy way to determine the adequacy of the test results. This information can then be used to develop accuracy scores, which can be incrementally improved.
++
+Evaluation flows in Prompt flow enable this functionality. Using the sample evaluation flows offered by prompt flow, you can assess various metrics such as **classification accuracy**, **perceived intelligence**, **groundedness**, and more.
++
+There's also the flexibility to develop **your own custom evaluators** if needed.
+
+In Prompt flow, you can quickly create an evaluation run based on a completed batch run by following the steps below:
+1. Prepare the evaluation flow and complete a batch run.
+1. Click the *Run* tab on the home page to go to the run list.
+1. Open the previously completed batch run.
+1. Click *Evaluate* at the top to create an evaluation run.
+1. Complete the evaluation settings, especially the evaluation flow and the input mapping.
+1. Submit the run and wait for the result.
+++++
+Follow this [documentation](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/evaluating-plugins-and-planners-with-prompt-flow?tabs=gpt-35-turbo) for Semantic Kernel to learn more about how to use the [math accuracy evaluation flow](https://github.com/microsoft/promptflow/tree/main/examples/flows/evaluation/eval-accuracy-maths-to-code) to test our planner to see how well it solves word problems.
+
+After running the evaluator, you'll get back a summary of your metrics. Initial runs may yield less than ideal results, which can be used as motivation for immediate improvement.
+
+To check the metrics, you can go back to the batch run detail page, click the **Details** button, click the **Output** tab, and then select the evaluation run name in the dropdown list to view the evaluation result.
++
+You can check the aggregated metric in the **Metrics** tab.
+++
+### Experiments for quality improvement
+
+If you find that your plugins and planners aren't performing as well as they should, there are steps you can take to make them better. In this documentation, we provide an in-depth guide on practical strategies to bolster the effectiveness of your plugins and planners. We recommend the following high-level considerations:
+
+1. Use a more advanced model like GPT-4 instead of GPT-3.5-turbo.
+1. [Improve the description of your plugins](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/evaluating-plugins-and-planners-with-prompt-flow?tabs=gpt-35-turbo#improving-the-descriptions-of-your-plugin) so they're easier for the planner to use.
+1. [Inject additional help to the planner](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/evaluating-plugins-and-planners-with-prompt-flow?tabs=gpt-35-turbo#improving-the-descriptions-of-your-plugin) when sending the user's ask.
+
+By doing a combination of these three things, we demonstrate how you can take a failing planner and turn it into a winning one! At the end of the walkthrough, you should have a planner that can correctly answer all of the benchmark data.
+
+Throughout the process of enhancing your plugins and planners in Prompt flow, you can **utilize the runs to monitor your experimental progress**. Each iteration allows you to submit a batch run with an evaluation run at the same time.
++
+This enables you to conveniently compare the results of various runs, assisting you in identifying which modifications are beneficial and which are not.
+
+To compare, select the runs you wish to analyze, then select the **Visualize outputs** button at the top.
++
+This presents you with a detailed, line-by-line comparison table of the results from the selected runs.
++
+## Next steps
+
+> [!TIP]
+> Follow along with our documentation to get started!
+> And keep an eye out for more integrations.
+
+If you're interested in learning more about how you can use Prompt flow to test and evaluate Semantic Kernel, we recommend following along with the articles we created. At each step, we provide sample code and explanations so you can use Prompt flow successfully with Semantic Kernel.
+
+* [Using Prompt flow with Semantic Kernel](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/)
+* [Create a Prompt flow with Semantic Kernel](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/create-a-prompt-flow-with-semantic-kernel)
+* [Running batches with Prompt flow](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/running-batches-with-prompt-flow)
+* [Evaluate your plugins and planners](/semantic-kernel/ai-orchestration/planners/evaluate-and-deploy-planners/)
+
+When your planner is fully prepared, it can be deployed as an online endpoint in Azure Machine Learning. This allows it to be easily integrated into your application for consumption. Learn more about how to [deploy a flow as a managed online endpoint for real-time inference](./how-to-deploy-for-real-time-inference.md).
++
machine-learning How To Secure Prompt Flow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md
Workspace managed virtual network is the recommended way to support network isol
:::image type="content" source="./media/how-to-secure-prompt-flow/outbound-rule-non-azure-resources.png" alt-text="Screenshot of user defined outbound rule for non Azure resource." lightbox = "./media/how-to-secure-prompt-flow/outbound-rule-non-azure-resources.png":::
+4. In a workspace that enables managed VNet, you can only deploy prompt flow to a managed online endpoint. You can follow [Secure your managed online endpoints with network isolation](../how-to-secure-kubernetes-inferencing-environment.md) to secure your managed online endpoint.
+ ## Secure prompt flow using your own virtual network - To set up Azure Machine Learning related resources as private, see [Secure workspace resources](../how-to-secure-workspace-vnet.md). - Meanwhile, you can follow [private Azure Cognitive Services](../../ai-services/cognitive-services-virtual-networks.md) to make them private.
+- If you want to deploy prompt flow in a workspace secured by your own virtual network, you can deploy it to an AKS cluster in the same virtual network. You can follow [Secure your RAG workflows with network isolation](../how-to-secure-rag-workflows.md) to secure your AKS cluster.
- You can either create private endpoint to the same virtual network or leverage virtual network peering to make them communicate with each other.
-## Limitations
+## Known limitations
-- Only public access enable storage account is supported. You can't use private storage account now.
+- Only storage accounts with public access enabled are supported. You can't use a private storage account now. Find the workaround here: [Why I can't create or upgrade my flow when I disable public network access of storage account?](./tools-reference/troubleshoot-guidance.md#why-i-cant-create-or-upgrade-my-flow-when-i-disable-public-network-access-of-storage-account)
- Workspace hub / lean workspace and AI studio don't support bring your own virtual network. - Managed online endpoint only supports workspace managed virtual network. If you want to use your own virtual network, you may need one workspace for prompt flow authoring with your virtual network and another workspace for prompt flow deployment using managed online endpoint with workspace managed virtual network.
-## FAQ
-
-### Why I can't create or upgrade my flow when I disable public network access of storage account?
-Prompt flow rely on fileshare to store snapshot of flow. Prompt flow didn't support private storage account now. Here are some workarounds you can try:
-- Make the storage account as public access enabled if there is no security concern. -- If you are only use UI to authoring promptflow, you can add following flights (flight=PromptFlowCodeFirst=false) to use our old UI.-- You can use our CLI/SDK to authoring promptflow, CLI/SDK authong didn't rely on fileshare. See [Integrate Prompt Flow with LLM-based application DevOps ](how-to-integrate-with-llm-app-devops.md).- ## Next steps - [Secure workspace resources](../how-to-secure-workspace-vnet.md)-- [Workspace managed network isolation](../how-to-managed-network.md)
+- [Workspace managed network isolation](../how-to-managed-network.md)
+- [Secure Azure Kubernetes Service inferencing environment](../how-to-secure-online-endpoint.md)
+- [Secure your managed online endpoints with network isolation](../how-to-secure-kubernetes-inferencing-environment.md)
+- [Secure your RAG workflows with network isolation](../how-to-secure-rag-workflows.md)
machine-learning Troubleshoot Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/prompt-flow/tools-reference/troubleshoot-guidance.md
Last updated 09/05/2023
This article addresses frequent questions about tool usage.
-## Error "package tool is not found" occurs when updating the flow for code first experience.
+## Error "package tool isn't found" occurs when updating the flow for code first experience.
When you update flows for the code first experience, if the flow utilized these tools (Faiss Index Lookup, Vector Index Lookup, Vector DB Lookup, Content Safety (Text)), you may encounter an error message like the one below:
To resolve the issue, you have two options:
- **Option 1** - Update your runtime to latest version. ![how-to-switch-to-raw-file-mode](../media/faq/switch-to-raw-file-mode.png) - Update the tool names. ![how-to-update-tool-name](../media/faq/update-tool-name.png)
To resolve the issue, you have two options:
- **Option 2** - Update your runtime to latest version.
- - Remove the old tool and re-create a new tool.
+ - Remove the old tool and re-create a new tool.
+
+## Why I can't create or upgrade my flow when I disable public network access of storage account?
+Prompt flow relies on fileshare to store the snapshot of a flow. Prompt flow currently doesn't support private storage accounts. Here are some workarounds you can try:
+- Make the storage account public access enabled if there's no security concern.
+- If you're only using the UI to author prompt flow, you can add the following flight (flight=PromptFlowCodeFirst=false) to use our old UI.
+- You can use our CLI/SDK to author prompt flow; CLI/SDK authoring doesn't rely on fileshare. See [Integrate Prompt Flow with LLM-based application DevOps](../how-to-integrate-with-llm-app-devops.md).
++
+## Why can't I upgrade my old flow?
+Prompt flow relies on a file share to store snapshots of your flows. If the file share has issues, you may encounter this problem. Here are some workarounds you can try:
+- If you're using a private storage account, see [Why can't I create or upgrade my flow when public network access to my storage account is disabled?](#why-cant-i-create-or-upgrade-my-flow-when-public-network-access-to-my-storage-account-is-disabled)
+- If the storage account has public access enabled, check whether there's a datastore named `workspaceworkingdirectory` in your workspace; it should be of type fileshare.
![workspaceworkingdirectory](../media/faq/working-directory.png)
+ - If this datastore doesn't exist, you need to add it to your workspace (see the sketch after this list):
+ - Create a file share named `code-391ff5ac-6576-460f-ba4d-7e03433c68b6`.
+ - Create a datastore named `workspaceworkingdirectory`. See [Create datastores](../../how-to-datastore.md).
+ - If the `workspaceworkingdirectory` datastore exists but its type is `blob` instead of `fileshare`, create a new workspace, and use a storage account that doesn't have ADLS Gen2 hierarchical namespaces enabled as the workspace default storage account. See [Create workspace](../../how-to-manage-workspace.md#create-a-workspace)
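The two repair steps above can also be scripted. A minimal sketch, assuming the `azure-storage-file-share` and `azure-ai-ml` packages; the connection string, account name, and key are placeholders:

```python
# Sketch: create the file share and register it as the workspaceworkingdirectory
# datastore. Assumptions: azure-ai-ml and azure-storage-file-share are installed;
# connection string, account name, and key below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.fileshare import ShareClient
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AzureFileDatastore, AccountKeyConfiguration

SHARE_NAME = "code-391ff5ac-6576-460f-ba4d-7e03433c68b6"

# Step 1: create the file share on the workspace default storage account.
share = ShareClient.from_connection_string(
    conn_str="<STORAGE-CONNECTION-STRING>", share_name=SHARE_NAME
)
share.create_share()

# Step 2: register the share as a fileshare-type datastore.
ml_client = MLClient(
    DefaultAzureCredential(), "<SUBSCRIPTION-ID>", "<RESOURCE-GROUP>", "<WORKSPACE-NAME>"
)
datastore = AzureFileDatastore(
    name="workspaceworkingdirectory",
    account_name="<STORAGE-ACCOUNT-NAME>",
    file_share_name=SHARE_NAME,
    credentials=AccountKeyConfiguration(account_key="<STORAGE-ACCOUNT-KEY>"),
)
ml_client.datastores.create_or_update(datastore)
```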
+
+
+## Runtime related issues
+
+### My runtime failed with a system error **runtime not ready** when using a custom environment
++
+First, go to the Compute Instance terminal and run `docker ps` to find the root cause.
+
+Use `docker images` to check whether the image was pulled successfully. If it was, check whether the Docker container is running. If it's already running, locate this runtime and restart it, which attempts to restart the runtime and the compute instance.
+
+### Run failed due to "No module named XXX"
+
+This type of error usually means the runtime lacks required packages. If you're using the default environment, make sure the image of your runtime is the latest version. Learn more: [runtime update](../how-to-create-manage-runtime.md#update-runtime-from-ui). If you're using a custom image with a conda environment, make sure you've installed all required packages in your conda environment. Learn more: [customize Prompt flow environment](../how-to-customize-environment-runtime.md#customize-environment-with-docker-context-for-runtime).
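To confirm which packages are actually importable inside the runtime, a quick diagnostic like the following can help (the package names are examples only, not a required list):

```python
# Run inside the runtime to check whether required packages are importable.
# The package names here are examples only.
import importlib.util

for pkg in ["promptflow", "faiss", "openai"]:
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'MISSING'}")
```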
+
+### Request timeout issue
+
+#### Request timeout error shown in UI
+
+**MIR runtime request timeout error in the UI:**
++
+Error in the example says "UserError: Upstream request timeout".
+
+**Compute instance runtime request timeout error:**
++
+Error in the example says "UserError: Invoking runtime gega-ci timeout, error message: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing".
+
+### How to identify which node consumes the most time
+
+1. Check the runtime logs
+
+2. Try to find the following warning log format:
+
+ {node_name} has been running for {duration} seconds.
+
+ For example:
+
+ - Case 1: Python script node runs for a long time.
+
+ :::image type="content" source="../media/how-to-create-manage-runtime/runtime-timeout-running-for-long-time.png" alt-text="Screenshot of a timeout run logs in the studio UI. " lightbox = "../media/how-to-create-manage-runtime/runtime-timeout-running-for-long-time.png":::
+
+ In this case, you can see that `PythonScriptNode` ran for a long time (almost 300 seconds). You can then check the node details to find the problem.
+
+ - Case 2: LLM node runs for a long time.
+
+ :::image type="content" source="../media/how-to-create-manage-runtime/runtime-timeout-by-language-model-timeout.png" alt-text="Screenshot of a timeout logs caused by LLM timeout in the studio UI. " lightbox = "../media/how-to-create-manage-runtime/runtime-timeout-by-language-model-timeout.png":::
+
+ In this case, if you find the message `request canceled` in the logs, it may be due to the OpenAI API call taking too long and exceeding the runtime limit.
+
+ An OpenAI API Timeout could be caused by a network issue or a complex request that requires more processing time. For more information, see [OpenAI API Timeout](https://help.openai.com/en/articles/6897186-timeout).
+
+ You can try waiting a few seconds and retrying your request; this usually resolves transient network issues (see the retry sketch after this list).
+
+ If retrying doesn't work, check whether you're using a long context model, such as `gpt-4-32k`, and have set a large value for `max_tokens`. If so, it's expected behavior because your prompt may generate a long response that takes longer than the interactive mode upper threshold. In this situation, we recommend trying 'Bulk test', as this mode doesn't have a timeout setting.
+
+3. If you can't find anything in the runtime logs that indicates a specific node issue
+
+ Contact the Prompt Flow team ([promptflow-eng](mailto:aml-pt-eng@microsoft.com)) with the runtime logs. We'll try to identify the root cause.
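For the transient timeouts described in case 2, a simple wait-and-retry wrapper is often enough. A minimal sketch, assuming the pre-1.0 `openai` package; the model name, timeout, and backoff values are placeholders:

```python
# Sketch: retry transient OpenAI timeouts with a short linear backoff.
# Assumptions: pre-1.0 openai package; model, key, and timeout values are placeholders.
import time
import openai

openai.api_key = "<YOUR-API-KEY>"

def chat_with_retry(messages, retries=3, backoff=2):
    for attempt in range(retries):
        try:
            return openai.ChatCompletion.create(
                model="gpt-35-turbo",
                messages=messages,
                request_timeout=60,  # fail fast rather than hanging
            )
        except openai.error.Timeout:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))  # wait, then retry

response = chat_with_retry([{"role": "user", "content": "Hello"}])
print(response["choices"][0]["message"]["content"])
```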
+
+### How to find the compute instance runtime log for further investigation?
+
+Go to the compute instance terminal and run `docker logs <runtime_container_name>`.
+
+### User doesn't have access to this compute instance. Check if this compute instance is assigned to you and you have access to the workspace. Additionally, verify that you are on the correct network to access this compute instance.
++
+This error occurs because you're cloning a flow from someone else that uses a compute instance as its runtime. Because a compute instance runtime is user-isolated, you need to create your own compute instance runtime, or select a managed online deployment/endpoint runtime, which can be shared with others.
machine-learning Reference Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/reference-kubernetes.md
spec:
storage: 1Gi ``` > [!IMPORTANT]
-> Only training job pods and batch-deployment pods will have access to the PVC(s); managed and Kubernetes online-deployment pods will not. In addition, only the pods in the same Kubernetes namespace with the PVC(s) will be mounted the volume. Data scientist is able to access the `mount path` specified in the PVC annotation in the job.
+> * Only command jobs/components, hyperdrive jobs/components, and batch deployments support custom data storage from PVC(s).
+> * Real-time online endpoints, AutoML jobs, and PRS jobs don't support custom data storage from PVC(s).
+> * In addition, only pods in the same Kubernetes namespace as the PVC(s) are mounted with the volume. Data scientists can access the `mount path` specified in the PVC annotation in the job.
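As a sketch, such a PVC could be created with the Kubernetes Python client. The `ml.azure.com/pvc` label and `ml.azure.com/mountpath` annotation keys here reflect the PVC annotation this note describes, but treat them as assumptions and verify them against the full example in the source article:

```python
# Sketch: create a PVC for custom data storage with the kubernetes client.
# Assumptions: the ml.azure.com/pvc label and ml.azure.com/mountpath annotation
# keys mirror the PVC annotation described above; verify in the source article.
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(
        name="example-workspace-pvc",
        namespace="default",  # must match the namespace of the job pods
        labels={"ml.azure.com/pvc": "true"},
        annotations={"ml.azure.com/mountpath": "/mnt/data"},
    ),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],
        resources=client.V1ResourceRequirements(requests={"storage": "1Gi"}),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim("default", pvc)
```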
## Supported Azure Machine Learning taints and tolerations
machine-learning Reference Yaml Deployment Managed Online https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/reference-yaml-deployment-managed-online.md
The source JSON schema can be found at https://azuremlschemas.azureedge.net/late
| Key | Type | Description | Default value | | | - | -- | - |
-| `request_timeout_ms` | integer | The scoring timeout in milliseconds. | `5000` |
+| `request_timeout_ms` | integer | The scoring timeout in milliseconds. The maximum allowed value is `90000` milliseconds. For more information, see [Managed online endpoint quotas](how-to-manage-quotas.md#azure-machine-learning-managed-online-endpoints). | `5000` |
| `max_concurrent_requests_per_instance` | integer | The maximum number of concurrent requests per instance allowed for the deployment. <br><br> **Note:** If you're using [Azure Machine Learning Inference Server](how-to-inference-server-http.md) or [Azure Machine Learning Inference Images](concept-prebuilt-docker-images-inference.md), your model must be configured to handle concurrent requests. To do so, pass `WORKER_COUNT: <int>` as an environment variable. For more information about `WORKER_COUNT`, see [Azure Machine Learning Inference Server Parameters](how-to-inference-server-http.md#server-parameters) <br><br> **Note:** Set to the number of requests that your model can process concurrently on a single node. Setting this value higher than your model's actual concurrency can lead to higher latencies. Setting this value too low may lead to under utilized nodes. Setting too low may also result in requests being rejected with a 429 HTTP status code, as the system will opt to fail fast. For more information, see [Troubleshooting online endpoints: HTTP status codes](how-to-troubleshoot-online-endpoints.md#http-status-codes). | `1` | | `max_queue_wait_ms` | integer | The maximum amount of time in milliseconds a request will stay in the queue. | `500` |
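For reference, the same request settings can be expressed through the Python SDK v2 rather than YAML. A minimal sketch, assuming the `azure-ai-ml` package; the endpoint, model, and instance values are placeholders:

```python
# Sketch: the YAML request settings above, expressed with the azure-ai-ml SDK.
# Assumptions: endpoint, model, and instance values below are placeholders.
from azure.ai.ml.entities import ManagedOnlineDeployment, OnlineRequestSettings

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="my-endpoint",
    model="azureml:my-model:1",
    instance_type="Standard_DS3_v2",
    instance_count=1,
    request_settings=OnlineRequestSettings(
        request_timeout_ms=90000,                # maximum allowed value
        max_concurrent_requests_per_instance=1,  # match your model's concurrency
        max_queue_wait_ms=500,
    ),
)
# Submit with: ml_client.online_deployments.begin_create_or_update(deployment)
```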
managed-grafana Concept Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/concept-whats-new.md
Title: What's new in Azure Managed Grafana description: Recent updates for Azure Managed Grafana--++ Last updated 02/06/2023
managed-grafana Encryption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/encryption.md
Title: Encryption in Azure Managed Grafana description: Learn how data is encrypted in Azure Managed Grafana.--++ Last updated 03/23/2023
managed-grafana Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/faq.md
Title: Azure Managed Grafana FAQ description: Frequently asked questions about Azure Managed Grafana--++ Last updated 07/17/2023
managed-grafana Find Help Open Support Ticket https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/find-help-open-support-ticket.md
Title: Find help or open a support ticket for Azure Managed Grafana description: Learn how to find help or open a support ticket for Azure Managed Grafana-+ Last updated 01/23/2023-+ # Find help or open a support ticket for Azure Managed Grafana
managed-grafana Grafana App Ui https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/grafana-app-ui.md
Title: Grafana UI description: Learn about the Grafana UI components--panels, visualizations and dashboards.--++ Last updated 3/23/2022
managed-grafana High Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/high-availability.md
Title: Azure Managed Grafana service reliability description: Learn about service reliability and availability options provided by Azure Managed Grafana--++ Last updated 3/23/2023
managed-grafana How To Api Calls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-api-calls.md
Title: 'Call Grafana APIs programmatically with Azure Managed Grafana' description: Learn how to call Grafana APIs programmatically with Azure Active Directory and an Azure service principal--++ Last updated 04/05/2023
managed-grafana How To Authentication Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-authentication-permissions.md
Title: How to set up authentication and permissions in Azure Managed Grafana
description: Learn how to set up Azure Managed Grafana authentication permissions using a system-assigned Managed identity or a Service Principal --++ Last updated 12/13/2022
managed-grafana How To Connect To Data Source Privately https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-connect-to-data-source-privately.md
Title: How to connect to a data source privately in Azure Managed Grafana
description: Learn how to connect an Azure Managed Grafana instance to a data source using Managed Private Endpoint --++ Last updated 5/18/2023
managed-grafana How To Create Api Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-create-api-keys.md
Title: Create and manage Grafana API keys in Azure Managed Grafana description: Learn how to generate and manage Grafana API keys, and start making API calls for Azure Managed Grafana.--++
managed-grafana How To Create Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-create-dashboard.md
Title: Create a Grafana dashboard with Azure Managed Grafana description: Learn how to create and configure Azure Managed Grafana dashboards.--++ Last updated 03/07/2023
managed-grafana How To Data Source Plugins Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-data-source-plugins-managed-identity.md
Title: How to configure data sources for Azure Managed Grafana description: In this how-to guide, discover how you can configure data sources for Azure Managed Grafana using Managed Identity.--++ Last updated 1/12/2023
managed-grafana How To Deterministic Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-deterministic-ip.md
Title: How to set up and use deterministic outbound APIs in Azure Managed Grafan
description: Learn how to set up and use deterministic outbound APIs in Azure Managed Grafana --++ Last updated 03/23/2022
managed-grafana How To Enable Zone Redundancy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-enable-zone-redundancy.md
Title: How to enable zone redundancy in Azure Managed Grafana
description: Learn how to create a zone-redundant Managed Grafana instance. --++ Last updated 02/28/2023
managed-grafana How To Grafana Enterprise https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-grafana-enterprise.md
Title: Subscribe to Grafana Enterprise description: Activate Grafana Enterprise to access Grafana Enterprise plugins within Azure Managed Grafana--++ Last updated 01/09/2023
managed-grafana How To Monitor Managed Grafana Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-monitor-managed-grafana-workspace.md
Title: 'How to monitor your Azure Managed Grafana instance with logs' description: Learn how to monitor your Azure Managed Grafana instance with logs.--++
managed-grafana How To Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-permissions.md
Title: How to modify access permissions to Azure Monitor description: Learn how to manually set up permissions that allow your Azure Managed Grafana instance to access a data source--++
managed-grafana How To Service Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-service-accounts.md
Title: How to use service accounts in Azure Managed Grafana description: In this guide, learn how to use service accounts in Azure Managed Grafana.--++
managed-grafana How To Set Up Private Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-set-up-private-access.md
Title: How to set up private access (preview) in Azure Managed Grafana description: How to disable public access to your Azure Managed Grafana workspace and configure private endpoints.--++ Last updated 02/16/2023
managed-grafana How To Share Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-share-dashboard.md
Title: Share an Azure Managed Grafana dashboard or panel description: Learn how to share a Grafana dashboard with internal and external stakeholders, such as customers or partners.--++ Last updated 03/01/2023
managed-grafana How To Share Grafana Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-share-grafana-workspace.md
Title: How to share an Azure Managed Grafana instance description: 'Learn how you can share access permissions to Azure Grafana Managed.' --++
managed-grafana How To Smtp Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-smtp-settings.md
Title: 'How to configure SMTP settings within Azure Managed Grafana' description: Learn how to configure SMTP settings to generate email notifications for Azure Managed Grafana--++ Last updated 02/01/2023
managed-grafana How To Sync Teams With Azure Ad Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-sync-teams-with-azure-ad-groups.md
Title: Sync Grafana teams with Azure Active Directory groups
description: Learn how to set up Grafana teams using Azure Active Directory groups in Azure Managed Grafana --++ Last updated 9/11/2023
managed-grafana How To Use Azure Monitor Alerts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-use-azure-monitor-alerts.md
Title: Use Azure Monitor alerts with Azure Managed Grafana
description: Learn how to set up Azure Monitor alerts and use them with Azure Managed Grafana --++ Last updated 6/8/2023
managed-grafana How To Use Reporting And Image Rendering https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/how-to-use-reporting-and-image-rendering.md
Title: How to use reporting and image rendering in Azure Managed Grafana
description: Learn how to create reports in Azure Managed Grafana and understand performance and limitations of image rendering --++ Last updated 5/6/2023
managed-grafana Known Limitations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/known-limitations.md
description: Learn about current limitations in Azure Managed Grafana.
Last updated 03/13/2023-+ -+ # Limitations of Azure Managed Grafana
managed-grafana Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/overview.md
Title: What is Azure Managed Grafana? description: Read an overview of Azure Managed Grafana. Understand why and how to use Managed Grafana. --++ Last updated 3/23/2023
managed-grafana Quickstart Managed Grafana Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/quickstart-managed-grafana-cli.md
Title: 'Quickstart: create an Azure Managed Grafana instance using the Azure CLI
description: Learn how to create a Managed Grafana instance using the Azure CLI --++ Last updated 12/13/2022 ms.devlang: azurecli
managed-grafana Quickstart Managed Grafana Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/quickstart-managed-grafana-portal.md
Title: 'Quickstart: create an Azure Managed Grafana instance using the Azure por
description: Learn how to create a Managed Grafana workspace to generate a new Managed Grafana instance in the Azure portal --++ Last updated 03/23/2022
managed-grafana Troubleshoot Managed Grafana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/troubleshoot-managed-grafana.md
Title: 'Troubleshoot Azure Managed Grafana' description: Troubleshoot Azure Managed Grafana issues related to fetching data, managing Managed Grafana dashboards, speed and more.--++ Last updated 09/13/2022
orbital Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/orbital/resource-graph-samples.md
+
+ Title: Azure Resource Graph Sample Queries for Azure Orbital Ground Station
+description: Provides a collection of Azure Resource Graph sample queries for Azure Orbital Ground Station.
++++ Last updated : 09/08/2023+++
+# Azure Resource Graph sample queries for Azure Orbital Ground Station
+
+This page is a collection of [Azure Resource Graph](../governance/resource-graph/overview.md)
+sample queries for Azure Orbital Ground Station. For a complete list of Azure Resource Graph samples, see
+[Resource Graph samples by Category](../governance/resource-graph/samples/samples-by-category.md)
+and [Resource Graph samples by Table](../governance/resource-graph/samples/samples-by-table.md).
+
+## Sample queries
+
+### List Upcoming Contacts
+#### Sorted by reservation start time
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts' and todatetime(properties.reservationStartTime) >= now()
+| sort by todatetime(properties.reservationStartTime)
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
+
+#### Sorted by ground station
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts' and todatetime(properties.reservationStartTime) >= now()
+| sort by tostring(properties.groundStationName)
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
+
+#### Sorted by contact profile
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts'
+| where todatetime(properties.reservationStartTime) >= now()
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| sort by Contact_Profile
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
+
+### List Contacts from Past 'x' Days - sorted by reservation start time
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts' and todatetime(properties.reservationStartTime) >= now(-1d)
+| sort by todatetime(properties.reservationStartTime)
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
+
+#### On a specified ground station
+
+This query helps you track all past contacts, sorted by reservation start time, for the specified ground station.
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts' and todatetime(properties.reservationStartTime) >= now(-1d) and properties.groundStationName == 'Microsoft_Gavle'
+| sort by todatetime(properties.reservationStartTime)
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
+
+#### On specified contact profile
+
+This query helps you track all past contacts, sorted by reservation start time, for the specified contact profile.
+
+```kusto
+OrbitalResources
+| where type == 'microsoft.orbital/spacecrafts/contacts'
+| extend Contact_Profile = tostring(split(properties.contactProfile.id, "/")[-1])
+| where todatetime(properties.reservationStartTime) >= now(-1d) and Contact_Profile == 'test-CP'
+| sort by todatetime(properties.reservationStartTime)
+| extend Spacecraft = tostring(split(id, "/")[-3])
+| project Contact = tostring(name), Groundstation = tostring(properties.groundStationName), Spacecraft, Contact_Profile, Reservation_Start_Time = todatetime(properties.reservationStartTime), Reservation_End_Time = todatetime(properties.reservationEndTime), Status=properties.status, Provisioning_Status=properties.provisioningState
+```
reliability Migrate App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/reliability/migrate-app-configuration.md
Title: Migrate App Configuration to a region with availability zone support description: Learn how to migrate Azure App Configuration to availability zone support.-+ Last updated 09/10/2022-+
search Resource Demo Sites https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/resource-demo-sites.md
Previously updated : 08/22/2023 Last updated : 09/18/2023 # Demos - Azure Cognitive Search Demos are hosted apps that showcase search and AI enrichment functionality in Azure Cognitive Search. Several of these demos include source code on GitHub so that you can see how they were made.
-The following demos are built and hosted by Microsoft.
+Microsoft built and hosts the following demos.
| Demo name | Description | Source code | |--| |-|
-| [Chat with your data](https://entgptsearch.azurewebsites.net/) | An Azure web app that uses ChatGPT in Azure OpenAI with fictitous health plan data in a search index. | [https://github.com/Azure-Samples/azure-search-openai-demo/](https://github.com/Azure-Samples/azure-search-openai-demo/) |
-| [AzSearchLab](https://azuresearchlab.azurewebsites.net/) | A web front end that makes calls to a search index. | [https://github.com/Azure-Samples/azure-search-lab](https://github.com/Azure-Samples/azure-search-lab) |
+| [Chat with your data](https://entgptsearch.azurewebsites.net/) | An Azure web app that uses ChatGPT in Azure OpenAI with fictitious health plan data in a search index. | [https://github.com/Azure-Samples/azure-search-openai-demo/](https://github.com/Azure-Samples/azure-search-openai-demo/) |
| [NYC Jobs demo](https://azjobsdemo.azurewebsites.net/) | An ASP.NET app with facets, filters, details, geo-search (map controls). | [https://github.com/Azure-Samples/search-dotnet-asp-net-mvc-jobs](https://github.com/Azure-Samples/search-dotnet-asp-net-mvc-jobs) | | [JFK files demo](https://jfk-demo-2019.azurewebsites.net/#/) | An ASP.NET web app built on a public data set, transformed with custom and predefined skills to extract searchable content from scanned document (JPEG) files. [Learn more...](https://www.microsoft.com/ai/ai-lab-jfk-files) | [https://github.com/Microsoft/AzureSearch_JFK_Files](https://github.com/Microsoft/AzureSearch_JFK_Files) | | [Semantic search for retail](https://brave-meadow-0f59c9b1e.1.azurestaticapps.net/) | Web app for a fictitious online retailer, "Terra" | Not available |
search Resource Tools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/resource-tools.md
Productivity tools are built by engineers at Microsoft, but aren't part of the A
| Tool name | Description | Source code | |--| |-|
-| [Azure Cognitive Search Lab readme](https://github.com/Azure-Samples/azure-search-lab/blob/main/README.md) | Connects to your search service with a Web UI that exercises the full REST API, including the ability to edit a live search index. | [https://github.com/Azure-Samples/azure-search-lab](https://github.com/Azure-Samples/azure-search-lab) |
| [Back up and Restore readme](https://github.com/liamc) | Download a populated search index to your local device and then upload the index and its content to a new search service. | [https://github.com/liamca/azure-search-backup-restore](https://github.com/liamca/azure-search-backup-restore) | | [Knowledge Mining Accelerator readme](https://github.com/Azure-Samples/azure-search-knowledge-mining/blob/main/README.md) | Code and docs to jump start a knowledge store using your data. | [https://github.com/Azure-Samples/azure-search-knowledge-mining](https://github.com/Azure-Samples/azure-search-knowledge-mining) | | [Performance testing readme](https://github.com/Azure-Samples/azure-search-performance-testing/blob/main/README.md) | This solution helps you load test Azure Cognitive Search. It uses Apache JMeter as an open source load and performance testing tool and Terraform to dynamically provision and destroy the required infrastructure on Azure. | [https://github.com/Azure-Samples/azure-search-performance-testing](https://github.com/Azure-Samples/azure-search-performance-testing) |
search Search Dotnet Mgmt Sdk Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-dotnet-mgmt-sdk-migration.md
Title: Upgrade to the Azure Search .NET Management SDK
+ Title: Upgrade management SDKs
-description: Upgrade to the Azure Search .NET Management SDK from previous versions. Learn about new features and the code changes necessary for migration.
+description: Learn about the management libraries and packages used for control plane operations in Azure Cognitive Search.
ms.devlang: csharp Previously updated : 10/03/2022 Last updated : 09/15/2023
-# Upgrading versions of the Azure Search .NET Management SDK
+# Upgrade versions of the Azure Search .NET Management SDK
-This article explains how to migrate to successive versions of the Azure Search .NET Management SDK, used to provision or deprovision search services, adjust capacity, and manage API keys.
+This article points you to libraries in the Azure SDK for .NET for managing a search service. These libraries provide the APIs used to create, configure, and delete search services. They also provide APIs used to adjust capacity, manage API keys, and configure network security.
-Management SDKs target a specific version of the Management REST API. For more information about concepts and operations, see [Search Management (REST)](/rest/api/searchmanagement/).
+Management SDKs target a specific version of the Management REST API. Release notes for each library indicate which REST API version is the target for each package. For more information about concepts and operations, see [Search Management (REST)](/rest/api/searchmanagement/).
## Versions
-Microsoft.Azure.Management.Search is now deprecated. We recommend [Azure.ResourceManager.Search](https://github.com/Azure/azure-sdk-for-net/blob/Azure.ResourceManager.Search_1.0.0/sdk/search/Azure.ResourceManager.Search/README.md) instead.
+The following table lists the client libraries used to provision a search service.
-| SDK version | Corresponding REST API version | Feature addition or behavior change |
-|-|--|-|
-| [1.0](https://www.nuget.org/packages/Azure.ResourceManager.Search/) | api-version=2020-08-01 | This is a new package from the Azure SDK team that implements approaches and standards that are common to resource management in Azure. There's no migration path. If you've used the previous client library for service administration in Azure Cognitive Search, you should redesign your solution to use the new `Azure.ResourceManager.Search` package. See the [readme](https://github.com/Azure/azure-sdk-for-net/blob/Azure.ResourceManager.Search_1.0.0/sdk/search/Azure.ResourceManager.Search/README.md) for links and next steps.|
-| [3.0](https://www.nuget.org/packages/Microsoft.Azure.Management.Search/3.0.0) | api-version=2020-30-20 | Adds endpoint security (IP firewalls and integration with [Azure Private Link](../private-link/private-endpoint-overview.md)) |
-| [2.0](https://www.nuget.org/packages/Microsoft.Azure.Management.Search/2.0.0) | api-version=2019-10-01 | Usability improvements. Breaking change on [List Query Keys](/rest/api/searchmanagement/2021-04-01-preview/query-keys/list-by-search-service) (GET is discontinued). |
-| [1.0](https://www.nuget.org/packages/Microsoft.Azure.Management.Search/1.0.1) | api-version=2015-08-19 | First version |
+| Namespace | Version| Status | Change log |
+|--|--|--||
+| [Azure.ResourceManager.Search](/dotnet/api/overview/azure/resourcemanager.search-readme?view=azure-dotnet&preserve-view=true) | [Package versions](https://www.nuget.org/packages/Azure.ResourceManager.Search/1.0.0) | **Current** | [Release notes](https://github.com/Azure/azure-sdk-for-net/blob/Azure.ResourceManager.Search_1.2.0-beta.1/sdk/search/Azure.ResourceManager.Search/CHANGELOG.md) |
+| [Microsoft.Azure.Management.Search](/dotnet/api/overview/azure/search/management/management-cognitivesearch(deprecated)?view=azure-dotnet&preserve-view=true) | [Package versions](https://www.nuget.org/packages/Microsoft.Azure.Management.Search#versions-body-tab) | **Deprecated** | [Release notes](https://www.nuget.org/packages/Microsoft.Azure.Management.Search#release-body-tab) |
-## How to upgrade
+## Checklist for upgrade
-1. Review the [client library changelist](https://github.com/Azure/azure-sdk-for-net/blob/Azure.ResourceManager.Search_1.0.0/sdk/search/Azure.ResourceManager.Search/CHANGELOG.md) for insight into the scope of changes.
+1. Review the [client library change list](https://github.com/Azure/azure-sdk-for-net/blob/Azure.ResourceManager.Search_1.0.0/sdk/search/Azure.ResourceManager.Search/CHANGELOG.md) for insight into the scope of changes.
1. In your application code, delete the reference to `Microsoft.Azure.Management.Search` and its dependencies.
Microsoft.Azure.Management.Search is now deprecated. We recommend [Azure.Resourc
1. Once NuGet has downloaded the new packages and their dependencies, replace the API calls.
-<!-- | Old API | New API |
-|||
-| [CreateOrUpdateWithHttpMessagesAsync Method](/dotnet/api/microsoft.azure.management.search.iservicesoperations.createorupdatewithhttpmessagesasync) | TBD |
-| [CheckNameAvailabilityWithHttpMessagesAsync Method](/dotnet/api/microsoft.azure.management.search.iservicesoperations.checknameavailabilitywithhttpmessagesasync) | TBD |
-| [IAdminKeysOperations.GetWithHttpMessagesAsync Method](/dotnet/api/microsoft.azure.management.search.iadminkeysoperations.getwithhttpmessagesasync) | TBD | -->
-
-## Upgrade to 3.0
-
-Version 3.0 adds private endpoint protection by restricting access to IP ranges, and by optionally integrating with Azure Private Link for search services that shouldn't be visible on the public internet.
-
-### New APIs
-
-| API | Category| Details |
-|--|--||
-| [NetworkRuleSet](/rest/api/searchmanagement/2021-04-01-preview/services/create-or-update#networkruleset) | IP firewall | Restrict access to a service endpoint to a list of allowed IP addresses. See [Configure IP firewall](service-configure-firewall.md) for concepts and portal instructions. |
-| [Shared Private Link Resource](/rest/api/searchmanagement/2021-04-01-preview/shared-private-link-resources) | Private Link | Create a shared private link resource to be used by a search service. |
-| [Private Endpoint Connections](/rest/api/searchmanagement/2021-04-01-preview/private-endpoint-connections) | Private Link | Establish and manage connections to a search service through private endpoint. See [Create a private endpoint](service-create-private-endpoint.md) for concepts and portal instructions.|
-| [Private Link Resources](/rest/api/searchmanagement/2021-04-01-preview/private-link-resources) | Private Link | For a search service that has a private endpoint connection, get a list of all services used in the same virtual network. If your search solution includes indexers that pull from Azure data sources (such as Azure Storage, Azure Cosmos DB, Azure SQL), or uses Azure AI services or Key Vault, then all of those resources should have endpoints in the virtual network, and this API should return a list. |
-| [PublicNetworkAccess](/rest/api/searchmanagement/2021-04-01-preview/services/create-or-update#publicnetworkaccess)| Private Link | This is a property on Create or Update Service requests. When disabled, private link is the only access modality. |
-
-### Breaking changes
-
-You can no longer use GET on a [List Query Keys](/rest/api/searchmanagement/2021-04-01-preview/query-keys/list-by-search-service) request. In previous releases you could use either GET or POST, in this release and in all releases moving forward, only POST is supported.
-
-## Upgrade to 2.0
-
-Version 2 of the Azure Search .NET Management SDK is a minor upgrade, so changing your code should require only minimal effort. The changes to the SDK are strictly client-side changes to improve the usability of the SDK itself. These changes include the following:
-
-* `Services.CreateOrUpdate` and its asynchronous versions now automatically poll the provisioning `SearchService` and don't return until service provisioning is complete. This saves you from having to write such polling code yourself.
-
-* If you still want to poll service provisioning manually, you can use the new `Services.BeginCreateOrUpdate` method or one of its asynchronous versions.
-
-* New methods `Services.Update` and its asynchronous versions have been added to the SDK. These methods use HTTP PATCH to support incremental updating of a service. For example, you can now scale a service by passing a `SearchService` instance to these methods that contains only the desired `partitionCount` and `replicaCount` properties. The old way of calling `Services.Get`, modifying the returned `SearchService`, and passing it to `Services.CreateOrUpdate` is still supported, but is no longer necessary.
- ## Next steps
-If you encounter problems, the best forum for posting questions is [Stack Overflow](https://stackoverflow.com/questions/tagged/azure-cognitive-search?tab=Newest). If you find a bug, you can file an issue in the [Azure .NET SDK GitHub repository](https://github.com/Azure/azure-sdk-for-net/issues). Make sure to label your issue title with "[search]".
+If you encounter problems, the best forum for posting questions is [Stack Overflow](https://stackoverflow.com/questions/tagged/azure-cognitive-search?tab=Newest). If you find a bug, you can file an issue in the [Azure .NET SDK GitHub repository](https://github.com/Azure/azure-sdk-for-net/issues). Make sure to label your issue title with *search*.
search Tutorial Javascript Create Load Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-javascript-create-load-index.md
Previously updated : 08/29/2023 Last updated : 09/13/2023 ms.devlang: javascript
Continue to build your search-enabled website by following these steps:
* Create a search resource * Create a new index
-* Import data with JavaScript using the [bulk_insert_books script](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/main/search-website-functions-v4/bulk-insert-v4/bulk_insert_books.js) and Azure SDK [@azure/search-documents](https://www.npmjs.com/package/@azure/search-documents).
+* Import data with JavaScript using the [bulk_insert_books script](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/main/search-website-functions-v4/bulk-insert/bulk_insert_books.js) and Azure SDK [@azure/search-documents](https://www.npmjs.com/package/@azure/search-documents).
## Create an Azure Cognitive Search resource
The ESM script uses the Azure SDK for Cognitive Search:
* [npm package @azure/search-documents](https://www.npmjs.com/package/@azure/search-documents) * [Reference Documentation](/javascript/api/overview/azure/search-documents-readme)
-1. In Visual Studio Code, open the `bulk_insert_books.js` file in the subdirectory, `search-website-functions-v4/bulk-insert-v4`, replace the following variables with your own values to authenticate with the Azure Search SDK:
+1. In Visual Studio Code, open the `bulk_insert_books.js` file in the subdirectory, `search-website-functions-v4/bulk-insert`, replace the following variables with your own values to authenticate with the Azure Search SDK:
* YOUR-SEARCH-RESOURCE-NAME * YOUR-SEARCH-ADMIN-KEY
- :::code language="javascript" source="~/azure-search-javascript-samples/search-website-functions-v4/bulk-insert-v4/bulk_insert_books.js" :::
+ :::code language="javascript" source="~/azure-search-javascript-samples/search-website-functions-v4/bulk-insert/bulk_insert_books.js" :::
-1. Open an integrated terminal in Visual Studio for the project directory's subdirectory, `search-website-functions-v4/bulk-insert-v4`, and run the following command to install the dependencies.
+1. Open an integrated terminal in Visual Studio for the project directory's subdirectory, `search-website-functions-v4/bulk-insert`, and run the following command to install the dependencies.
```bash npm install
The ESM script uses the Azure SDK for Cognitive Search:
## Run the bulk import script for Search
-1. Continue using the integrated terminal in Visual Studio for the project directory's subdirectory, `search-website-functions-v4/bulk-insert-v4`, to run the following bash command to run the `bulk_insert_books.js` script:
+1. Continue using the integrated terminal in Visual Studio for the project directory's subdirectory, `search-website-functions-v4/bulk-insert`, to run the `bulk_insert_books.js` script:
```javascript npm start
search Tutorial Javascript Deploy Static Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-javascript-deploy-static-web-app.md
Previously updated : 08/29/2023 Last updated : 09/13/2023 ms.devlang: javascript
search Tutorial Javascript Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-javascript-overview.md
Previously updated : 08/29/2023 Last updated : 09/13/2023 ms.devlang: javascript
The [sample](https://github.com/Azure-Samples/azure-search-javascript-samples/tr
|App|Purpose|GitHub<br>Repository<br>Location| |--|--|--|
-|Client|React app (presentation layer) to display books, with search. It calls the Azure Function app. |[/search-website-functions-v4/client-v4](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/client-v4)|
-|Server|Azure Function app (business layer) - calls the Azure Cognitive Search API using JavaScript SDK |[/search-website-functions-v4/api-v4](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/api-v4)|
-|Bulk insert|JavaScript file to create the index and add documents to it.|[/search-website-functions-v4/bulk-insert-v4](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/bulk-insert-v4)|
+|Client|React app (presentation layer) to display books, with search. It calls the Azure Function app. |[/search-website-functions-v4/client](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/client)|
+|Server|Azure Function app (business layer) - calls the Azure Cognitive Search API using JavaScript SDK |[/search-website-functions-v4/api](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/api)|
+|Bulk insert|JavaScript file to create the index and add documents to it.|[/search-website-functions-v4/bulk-insert](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/master/search-website-functions-v4/bulk-insert)|
## Set up your development environment
search Tutorial Javascript Search Query Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-javascript-search-query-integration.md
Previously updated : 08/29/2023 Last updated : 09/13/2023 ms.devlang: javascript
The Function app authenticates through the SDK to the cloud-based Cognitive Sear
## Configure secrets in a configuration file ## Azure Function: Search the catalog
-The [Search API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api-v4/src/functions/search.js) takes a search term and searches across the documents in the search index, returning a list of matches.
+The [Search API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api/src/functions/search.js) takes a search term and searches across the documents in the search index, returning a list of matches.
The Azure Function pulls in the search configuration information, and fulfills the query. ## Client: Search from the catalog Call the Azure Function in the React client with the following code. +
+## Client: Facets from the catalog
+
+This React component includes the search textbox and the [**facets**](search-faceted-navigation.md) associated with the search results. Facets need to be thought out and designed as part of the search schema when the search data is loaded. Then the facets are used in the search query, along with the search text, to provide the faceted navigation experience.
++
+## Client: Pagination from the catalog
+
+When the search results expand beyond a trivial few (8), the `@mui/material/TablePagination` component provides **pagination** across the results.
++
+When the user changes the page, that value is sent to the parent `Search.js` page from the `handleChangePage` function. The function sends a new request to the search API for the same query and the new page. The API response updates the facets, results, and pager components.
## Azure Function: Suggestions from the catalog
-The [Suggest API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api-v4/src/functions/suggest.js) takes a search term while a user is typing and suggests search terms such as book titles and authors across the documents in the search index, returning a small list of matches.
+The [Suggest API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api/src/functions/suggest.js) takes a search term while a user is typing and suggests search terms such as book titles and authors across the documents in the search index, returning a small list of matches.
-The search suggester, `sg`, is defined in the [schema file](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/bulk-insert-v4/good-books-index.json) used during bulk upload.
+The search suggester, `sg`, is defined in the [schema file](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/bulk-insert/good-books-index.json) used during bulk upload.
## Client: Suggestions from the catalog The Suggest function API is called in the React app at `\src\components\SearchBar\SearchBar.js` as part of component initialization: +
+This React component uses the `@mui/material/Autocomplete` component to provide a search textbox, which also supports displaying suggestions (using the `renderInput` function). Autocomplete starts after the first several characters are entered. As each new character is entered, it's sent as a query to the search engine. The results are displayed as a short list of suggestions.
+
+This autocomplete functionality is a common feature, but this specific implementation has an additional use case: the customer can enter text and select from the suggestions _or_ submit their entered text. The input from the suggestion list and the input from the textbox must both be tracked for changes, which impacts how the form is rendered and what is sent to the **search** API when the form is submitted.
+
+If your use case for search allows users to select only from the suggestions, that reduces the complexity of the control but limits the user experience.
## Azure Function: Get specific document
-The [Lookup API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api-v4/src/functions/lookup.js) takes an ID and returns the document object from the search index.
+The [Lookup API](https://github.com/Azure-Samples/azure-search-javascript-samples/blob/master/search-website-functions-v4/api/src/functions/lookup.js) takes an ID and returns the document object from the search index.
## Client: Get specific document This function API is called in the React app at `\src\pages\Details\Detail.js` as part of component initialization: +
+If your client app can use pregenerated content, this page is a good candidate for autogeneration because the content is static, pulled directly from the search index.
## Next steps
search Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/whats-new.md
Learn about the latest updates to Azure Cognitive Search functionality, docs, an
| November | [Query performance dashboard](https://github.com/Azure-Samples/azure-samples-search-evaluation). This Application Insights sample demonstrates an approach for deep monitoring of query usage and performance of an Azure Cognitive Search index. It includes a JSON template that creates a workbook and dashboard in Application Insights and a Jupyter Notebook that populates the dashboard with simulated data. | | October | [Compliance risk analysis using Azure Cognitive Search](/azure/architecture/guide/ai/compliance-risk-analysis). On Azure Architecture Center, this guide covers the implementation of a compliance risk analysis solution that uses Azure Cognitive Search. | | October | [Beiersdorf customer story using Azure Cognitive Search](https://customers.microsoft.com/story/1552642769228088273-Beiersdorf-consumer-goods-azure-cognitive-search). This customer story showcases semantic search and document summarization to provide researchers with ready access to institutional knowledge. |
-| September | [Azure Cognitive Search Lab](https://github.com/Azure-Samples/azure-search-lab/blob/main/README.md). This C# sample provides the source code for building a web front-end that accesses all of the REST API calls against an index. This tool is used by support engineers to investigate customer support issues. You can try this [demo site](https://azuresearchlab.azurewebsites.net/) before building your own copy. |
| September | [Event-driven indexing for Cognitive Search](https://github.com/aditmer/Event-Driven-Indexing-For-Cognitive-Search/blob/main/README.md). This C# sample is an Azure Function app that demonstrates event-driven indexing in Azure Cognitive Search. If you've used indexers and skillsets before, you know that indexers can run on demand or on a schedule, but not in response to events. This demo shows you how to set up an indexing pipeline that responds to data update events. | | August | [Tutorial: Index large data from Apache Spark](search-synapseml-cognitive-services.md). This tutorial explains how to use the SynapseML open-source library to push data from Apache Spark into a search index. It also shows you how to make calls to Azure AI services to get AI enrichment without skillsets and indexers. | | June | [Semantic search (preview)](semantic-search-overview.md). New support for Storage Optimized tiers (L1, L2). |
sentinel Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/whats-new.md
The WindowsEvent schema has been expanded to include new fields, such as `Keywor
These additions allow for more comprehensive analysis and for more information to be extracted and parsed from the event.
-If you aren't interested in ingesting the new fields, use ingest-time transformation in the AMA DCR to filter and drop the fields, while still ingesting the events. To ingest the events, add the following to your DCRs: 
+If you aren't interested in ingesting the new fields, use ingest-time transformation in the AMA DCR to filter and drop the fields, while still ingesting the events. To ingest the events without the new fields, add the following to your DCRs: 
```kusto "transformKql": "source | project-away TimeCreated, SystemThreadId, EventRecordId, SystemProcessId, Correlation, Keywords, Opcode, SystemUserId, Version"
service-connector Concept Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/concept-availability.md
Title: High availability for Service Connector description: This article covers availability zones, zone redundancy, disaster recovery, and cross-region failover for Service Connector.--++ Last updated 05/24/2022
service-connector Concept Region Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/concept-region-support.md
Title: Service Connector Region Support description: Service Connector region availability and region support list--++ Last updated 09/19/2022
service-connector Concept Service Connector Internals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/concept-service-connector-internals.md
Title: Service Connector internals description: Learn about Service Connector internals, the architecture, the connections and how data is transmitted.--++
service-connector How To Get Configurations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-get-configurations.md
Title: Get connection configurations added by Service Connector description: Get connection configurations added by Service Connector--++ Last updated 07/04/2023
service-connector How To Integrate App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-app-configuration.md
Title: Integrate Azure App Configuration with Service Connector description: Integrate Azure App Configuration into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Confluent Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-confluent-kafka.md
Title: Integrate Apache kafka on Confluent Cloud with Service Connector description: Integrate Apache kafka on Confluent Cloud into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Cosmos Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-cassandra.md
Title: Integrate the Azure Cosmos DB for Apache Cassandra with Service Connector description: Integrate the Azure Cosmos DB for Apache Cassandra into your application with Service Connector--++ Last updated 09/19/2022
service-connector How To Integrate Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-db.md
Title: Integrate Azure Cosmos DB for MongoDB with Service Connector description: Integrate Azure Cosmos DB for MongoDB into your application with Service Connector--++ Last updated 09/19/2022
service-connector How To Integrate Cosmos Gremlin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-gremlin.md
Title: Integrate the Azure Cosmos DB for Apache Gremlin with Service Connector description: Integrate the Azure Cosmos DB for Apache Gremlin into your application with Service Connector--++ Last updated 09/19/2022
service-connector How To Integrate Cosmos Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-sql.md
Title: Integrate the Azure Cosmos DB for NoSQL with Service Connector description: Integrate the Azure Cosmos DB SQL into your application with Service Connector--++ Last updated 09/19/2022
service-connector How To Integrate Cosmos Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-table.md
Title: Integrate the Azure Cosmos DB for Table with Service Connector description: Integrate the Azure Cosmos DB for Table into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Event Hubs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-event-hubs.md
Title: Integrate Azure Event Hubs with Service Connector description: Integrate Azure Event Hubs into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Key Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-key-vault.md
Title: Integrate Azure Key Vault with Service Connector description: Integrate Azure Key Vault into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-mysql.md
Title: Integrate Azure Database for MySQL with Service Connector description: Integrate Azure Database for MySQL into your application with Service Connector--++ Last updated 11/29/2022
service-connector How To Integrate Postgres https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-postgres.md
Title: Integrate Azure Database for PostgreSQL with Service Connector description: Integrate Azure Database for PostgreSQL into your application with Service Connector--++ Last updated 11/29/2022
service-connector How To Integrate Redis Cache https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-redis-cache.md
Title: Integrate Azure Cache for Redis and Azure Cache Redis Enterprise with Service Connector description: Integrate Azure Cache for Redis and Azure Cache Redis Enterprise into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Service Bus https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-service-bus.md
Title: Integrate Azure Service Bus with Service Connector description: Integrate Service Bus into your application with Service Connector--++
service-connector How To Integrate Signalr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-signalr.md
Title: Integrate Azure SignalR Service with Service Connector description: Integrate Azure SignalR Service into your application with Service Connector. Learn about authentication types and client types of Azure SignalR Service.--++ Last updated 08/11/2022
service-connector How To Integrate Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-sql-database.md
Title: Integrate Azure SQL Database with Service Connector description: Integrate SQL into your application with Service Connector--++ Last updated 11/29/2022
service-connector How To Integrate Storage Blob https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-blob.md
Title: Integrate Azure Blob Storage with Service Connector description: Integrate Azure Blob Storage into your application with Service Connector--++
service-connector How To Integrate Storage File https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-file.md
Title: Integrate Azure Files with Service Connector description: Integrate Azure Files into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Storage Queue https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-queue.md
Title: Integrate Azure Queue Storage with Service Connector description: Integrate Azure Queue Storage into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Storage Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-table.md
Title: Integrate Azure Table Storage with Service Connector description: Integrate Azure Table Storage into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Integrate Web Pubsub https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-web-pubsub.md
Title: Integrate Azure Web PubSub with Service Connector description: Integrate Azure Web PubSub into your application with Service Connector--++ Last updated 08/11/2022
service-connector How To Manage Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-manage-authentication.md
Title: Manage authentication in Service Connector description: Learn how to select and manage authentication parameters in Service Connector. -+ Last updated 03/07/2023-+ # Manage authentication within Service Connector
service-connector How To Troubleshoot Front End Error https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-troubleshoot-front-end-error.md
Title: Service Connector troubleshooting guidance description: This article lists Service Connector error messages and suggested actions for troubleshooting issues.--++ Last updated 5/25/2022
service-connector Known Limitations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/known-limitations.md
Last updated 03/02/2023--++ # Known limitations of Service Connector
service-connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/overview.md
Title: What is Service Connector? description: Understand typical use case scenarios for Service Connector, and learn the key benefits of Service Connector.--++
service-connector Quickstart Cli App Service Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-cli-app-service-connection.md
Title: Quickstart - Create a service connection in App Service with the Azure CLI description: Quickstart showing how to create a service connection in App Service with the Azure CLI--++ Last updated 04/13/2023
service-connector Quickstart Cli Container Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-cli-container-apps.md
Title: Quickstart - Create a service connection in Container Apps using the Azure CLI description: Quickstart showing how to create a service connection in Azure Container Apps using the Azure CLI--++ Last updated 04/13/2023
service-connector Quickstart Cli Spring Cloud Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-cli-spring-cloud-connection.md
Title: Quickstart - Create a service connection in Azure Spring Apps with the Azure CLI description: Quickstart showing how to create a service connection in Azure Spring Apps with the Azure CLI displayName: --++ Last updated 04/13/2022
service-connector Quickstart Portal App Service Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-portal-app-service-connection.md
Title: Quickstart - Create a service connection in App Service from the Azure portal description: Quickstart showing how to create a service connection in App Service from the Azure portal--++
service-connector Quickstart Portal Container Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-portal-container-apps.md
Title: Quickstart - Create a service connection in Container Apps from the Azure portal description: Quickstart showing how to create a service connection in Azure Container Apps from the Azure portal--++
service-connector Quickstart Portal Spring Cloud Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-portal-spring-cloud-connection.md
Title: Create a service connection in Azure Spring Apps from the Azure portal description: This quickstart shows you how to create a service connection in Azure Spring Apps from the Azure portal.--++ Last updated 08/10/2022
service-connector Tutorial Connect Web App App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-connect-web-app-app-configuration.md
Title: 'Tutorial: Connect a web app to Azure App Configuration with Service Connector' description: Learn how you can connect an ASP.NET Core application hosted in Azure Web Apps to App Configuration using Service Connector--++ Last updated 10/24/2022
service-connector Tutorial Csharp Webapp Storage Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-csharp-webapp-storage-cli.md
Title: 'Tutorial: Deploy a web application connected to Azure Blob Storage with Service Connector' description: Create a web app connected to Azure Blob Storage with Service Connector.--++ Last updated 05/03/2022
service-connector Tutorial Django Webapp Postgres Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-django-webapp-postgres-cli.md
Title: 'Tutorial: Using Service Connector to build a Django app with Postgres on Azure App Service'
description: Create a Python web app with a PostgreSQL database and deploy it to Azure. The tutorial uses the Django framework, the app is hosted on Azure App Service on Linux, and the App Service and database are connected with Service Connector. ms.devlang: python --++ Last updated 05/03/2022
service-connector Tutorial Java Spring Confluent Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-java-spring-confluent-kafka.md
Title: 'Tutorial: Deploy a Spring Boot app connected to Apache Kafka on Confluent Cloud'
description: Create a Spring Boot app connected to Apache Kafka on Confluent Cloud with Service Connector in Azure Spring Apps. ms.devlang: java --++ Last updated 05/03/2022
service-connector Tutorial Java Spring Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-java-spring-mysql.md
Title: 'Tutorial: Deploy an application to Azure Spring Apps and connect it to Azure Database for MySQL Flexible Server using Service Connector' description: Create a Spring Boot application connected to Azure Database for MySQL Flexible Server with Service Connector.--++ Last updated 11/02/2022
service-connector Tutorial Passwordless https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-passwordless.md
Title: 'Tutorial: Create a passwordless connection with Service Connector' description: Create a passwordless connection with Service Connector --++ Last updated 07/17/2023
service-connector Tutorial Portal Key Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-portal-key-vault.md
Title: Tutorial - Create a service connection and store secrets into Key Vault description: Tutorial showing how to create a service connection and store secrets into Key Vault--++
storage Storage Blob Dotnet Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-dotnet-get-started.md
# Get started with Azure Blob Storage and .NET + This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library for .NET. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service. [API reference](/dotnet/api/azure.storage.blobs) | [Library source code](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/storage/Azure.Storage.Blobs) | [Package (NuGet)](https://www.nuget.org/packages/Azure.Storage.Blobs) | [Samples](../common/storage-samples-dotnet.md?toc=/azure/storage/blobs/toc.json#blob-samples) | [Give feedback](https://github.com/Azure/azure-sdk-for-net/issues)
storage Storage Blob Download Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download-java.md
# Download a blob with Java + This article shows how to download a blob using the [Azure Storage client library for Java](/java/api/overview/azure/storage-blob-readme). You can download blob data to various destinations, including a local file path, stream, or text string. You can also open a blob stream and read from it. ## Prerequisites
storage Storage Blob Download Javascript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download-javascript.md
# Download a blob with JavaScript + This article shows how to download a blob using the [Azure Storage client library for JavaScript](https://www.npmjs.com/package/@azure/storage-blob). You can download blob data to various destinations, including a local file path, stream, or text string. ## Prerequisites
storage Storage Blob Download Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download-python.md
# Download a blob with Python + This article shows how to download a blob using the [Azure Storage client library for Python](/python/api/overview/azure/storage). You can download blob data to various destinations, including a local file path, stream, or text string. You can also open a blob stream and read from it. ## Prerequisites
storage Storage Blob Download Typescript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download-typescript.md
# Download a blob with TypeScript + This article shows how to download a blob using the [Azure Storage client library for JavaScript](/javascript/api/overview/azure/storage-blob-readme). You can download blob data to various destinations, including a local file path, stream, or text string. ## Prerequisites
storage Storage Blob Download https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download.md
# Download a blob with .NET + This article shows how to download a blob using the [Azure Storage client library for .NET](/dotnet/api/overview/azure/storage). You can download blob data to various destinations, including a local file path, stream, or text string. You can also open a blob stream and read from it. ## Prerequisites
storage Storage Blob Java Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-java-get-started.md
# Get started with Azure Blob Storage and Java + This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library for Java. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service. [API reference](/java/api/overview/azure/storage-blob-readme) | [Samples](../common/storage-samples-java.md?toc=/azure/storage/blobs/toc.json#blob-samples) | [Give feedback](https://github.com/Azure/azure-sdk-for-java/issues)
storage Storage Blob Javascript Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-javascript-get-started.md
# Get started with Azure Blob Storage and JavaScript + This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library v12 for JavaScript. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service. The [sample code snippets](https://github.com/Azure-Samples/AzureStorageSnippets/tree/master/blobs/howto/JavaScript/NodeJS-v12/dev-guide) are available in GitHub as runnable Node.js files.
storage Storage Blob Python Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-python-get-started.md
# Get started with Azure Blob Storage and Python + This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library for Python. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service. [API reference](/python/api/azure-storage-blob) | [Package (PyPi)](https://pypi.org/project/azure-storage-blob/) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-blob) | [Samples](../common/storage-samples-python.md?toc=/azure/storage/blobs/toc.json#blob-samples) | [Give feedback](https://github.com/Azure/azure-sdk-for-python/issues)
storage Storage Blob Typescript Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-typescript-get-started.md
# Get started with Azure Blob Storage and TypeScript + This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library for JavaScript. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service. [Package (npm)](https://www.npmjs.com/package/@azure/storage-blob) | [API reference](/javascript/api/preview-docs/@azure/storage-blob) | [Library source code](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/storage/storage-blob) | [Give Feedback](https://github.com/Azure/azure-sdk-for-js/issues)
storage Storage Blob Upload Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-upload-java.md
# Upload a block blob with Java + This article shows how to upload a block blob using the [Azure Storage client library for Java](/java/api/overview/azure/storage-blob-readme). You can upload data to a block blob from a file path, a stream, a binary object, or a text string. You can also upload blobs with index tags. ## Prerequisites
storage Storage Blob Upload Javascript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-upload-javascript.md
# Upload a blob with JavaScript + This article shows how to upload a blob using the [Azure Storage client library for JavaScript](https://www.npmjs.com/package/@azure/storage-blob). You can upload data to a block blob from a file path, a stream, a buffer, or a text string. You can also upload blobs with index tags. ## Prerequisites
storage Storage Blob Upload Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-upload-python.md
# Upload a block blob with Python + This article shows how to upload a blob using the [Azure Storage client library for Python](/python/api/overview/azure/storage). You can upload data to a block blob from a file path, a stream, a binary object, or a text string. You can also upload blobs with index tags. ## Prerequisites
storage Storage Blob Upload Typescript https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-upload-typescript.md
# Upload a blob with TypeScript + This article shows how to upload a blob using the [Azure Storage client library for JavaScript](/javascript/api/overview/azure/storage-blob-readme). You can upload data to a block blob from a file path, a stream, a buffer, or a text string. You can also upload blobs with index tags. ## Prerequisites
storage Storage Blob Upload https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-upload.md
# Upload a blob with .NET + This article shows how to upload a blob using the [Azure Storage client library for .NET](/dotnet/api/overview/azure/storage). You can upload data to a block blob from a file path, a stream, a binary object, or a text string. You can also open a blob stream and write to it, or upload large blobs in blocks. ## Prerequisites
synapse-analytics Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/known-issues.md
Synapse workspaces created from an existing dedicated SQL Pool report query fail
**Workaround**: If you encounter a similar error, engage the Microsoft support team for assistance.
+### Tag updates appear to fail
+
+When making a change to the [tags](../azure-resource-manager/management/tag-resources-portal.md) of a dedicated SQL pool through the Azure portal or other methods, an error message might appear even though the change is made successfully.
+
+**Workaround**: You can confirm that the change to the tags was successful and ignore or suppress the error message as needed.
+ ## Azure Synapse workspace active known issues summary The following are known issues with the Synapse workspace.
virtual-machines Agent Dependency Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/agent-dependency-linux.md
Title: Azure Monitor Dependency virtual machine extension for Linux description: Deploy the Azure Monitor Dependency agent on a Linux virtual machine by using a virtual machine extension. --++ --++ Previously updated : 06/01/2021 Last updated : 08/29/2023 # Azure Monitor Dependency virtual machine extension for Linux
-The Azure Monitor for VMs Map feature gets its data from the Microsoft Dependency agent. The Azure VM Dependency agent virtual machine extension for Linux is published and supported by Microsoft. The extension installs the Dependency agent on Azure virtual machines. This document details the supported platforms, configurations, and deployment options for the Azure VM Dependency agent virtual machine extension for Linux.
+The Azure Monitor for VMs Map feature gets its data from the Microsoft Dependency agent. The Azure VM Dependency agent virtual machine extension for Linux installs the Dependency agent on Azure virtual machines. This document details the supported platforms, configurations, and deployment options for the Azure VM Dependency agent virtual machine extension for Linux.
## Prerequisites
The following JSON shows the schema for the Azure VM Dependency agent extension
{ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#", "contentVersion": "1.0.0.0",
- "parameters": {
- "vmName": {
- "type": "string",
- "metadata": {
- "description": "The name of existing Linux Azure VM."
+ "parameters": {
+ "vmName": {
+ "type": "string",
+ "metadata": {
+ "description": "The name of existing Linux Azure VM."
} } }, "variables": {
- "vmExtensionsApiVersion": "2017-03-30"
+ "vmExtensionsApiVersion": "2017-03-30"
}, "resources": [ {
The following JSON shows the schema for the Azure VM Dependency agent extension
"name": "[concat(parameters('vmName'),'/DAExtension')]", "apiVersion": "[variables('vmExtensionsApiVersion')]", "location": "[resourceGroup().location]",
- "dependsOn": [
- ],
+ "dependsOn": [],
"properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
- "type": "DependencyAgentLinux",
- "typeHandlerVersion": "9.5",
- "autoUpgradeMinorVersion": true
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "type": "DependencyAgentLinux",
+ "typeHandlerVersion": "9.5",
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
} } ],
The following JSON shows the schema for the Azure VM Dependency agent extension
| publisher | Microsoft.Azure.Monitoring.DependencyAgent | | type | DependencyAgentLinux | | typeHandlerVersion | 9.5 |
+| settings | "enableAMA": "true" |
+
+> [!IMPORTANT]
+> Be sure to add `enableAMA` to your template if you're using Azure Monitor Agent; otherwise, the Dependency agent attempts to send data to the legacy Log Analytics agent.
## Template deployment
The following example assumes the Dependency agent extension is nested inside th
"apiVersion": "[variables('apiVersion')]", "location": "[resourceGroup().location]", "dependsOn": [
- "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
+ "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
], "properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
"type": "DependencyAgentLinux", "typeHandlerVersion": "9.5",
- "autoUpgradeMinorVersion": true
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
} } ```
When you place the extension JSON at the root of the template, the resource name
"apiVersion": "[variables('apiVersion')]", "location": "[resourceGroup().location]", "dependsOn": [
- "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
+ "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
], "properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
"type": "DependencyAgentLinux", "typeHandlerVersion": "9.5",
- "autoUpgradeMinorVersion": true
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
} } ```
To enable automatic extension upgrade for an extension, you must ensure the prop
When automatic extension upgrade is enabled on a VM or VM scale set, the extension is upgraded automatically whenever the extension publisher releases a new version for that extension. The upgrade is applied safely following availability-first principles as described [here](../automatic-extension-upgrade.md#how-does-automatic-extension-upgrade-work).
-The `enableAutomaticUpgrade` attribute's functionality is different from that of the `autoUpgradeMinorVersion`. The `autoUpgradeMinorVersion` attributes does not automatically trigger a minor version update when the extension publisher releases a new version. The `autoUpgradeMinorVersion` attribute indicates whether the extension should use a newer minor version if one is available at deployment time. Once deployed, however, the extension will not upgrade minor versions unless redeployed, even with this property set to true.
+The `enableAutomaticUpgrade` attribute's functionality is different from that of `autoUpgradeMinorVersion`. The `autoUpgradeMinorVersion` attribute doesn't automatically trigger a minor version update when the extension publisher releases a new version. It only indicates whether the extension should use a newer minor version if one is available at deployment time. Once deployed, the extension won't upgrade minor versions unless redeployed, even with this property set to true.
To keep your extension version updated, we recommend using `enableAutomaticUpgrade` with your extension deployment.
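For reference, a minimal sketch of the extension's `properties` block with both upgrade attributes set follows; `enableAutomaticUpgrade` is the addition, and the other values mirror the schema shown earlier:

```json
"properties": {
    "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
    "type": "DependencyAgentLinux",
    "typeHandlerVersion": "9.5",
    "autoUpgradeMinorVersion": true,
    "enableAutomaticUpgrade": true,
    "settings": {
        "enableAMA": "true"
    }
}
```

With this combination, the platform upgrades the extension whenever the publisher releases a new version, while `autoUpgradeMinorVersion` only influences which minor version is selected at deployment time.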
virtual-machines Agent Dependency Windows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/extensions/agent-dependency-windows.md
Title: Azure Monitor Dependency virtual machine extension for Windows description: Deploy the Azure Monitor Dependency agent on a Windows virtual machine by using a virtual machine extension. ----++++ Previously updated : 03/27/2023 Last updated : 08/29/2023 # Azure Monitor Dependency virtual machine extension for Windows
-The Azure Monitor for VMs Map feature gets its data from the Microsoft Dependency agent. The Azure VM Dependency agent virtual machine extension for Windows is published and supported by Microsoft. The extension installs the Dependency agent on Azure virtual machines. This document details the supported platforms, configurations, and deployment options for the Azure VM Dependency agent virtual machine extension for Windows.
+The Azure Monitor for VMs Map feature gets its data from the Microsoft Dependency agent. The Azure VM Dependency agent virtual machine extension for Windows installs the Dependency agent on Azure virtual machines. This document details the supported platforms, configurations, and deployment options for the Azure VM Dependency agent virtual machine extension for Windows.
## Operating system
The following JSON shows the schema for the Azure VM Dependency agent extension
```json {
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
+ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
"parameters": { "vmName": { "type": "string",
The following JSON shows the schema for the Azure VM Dependency agent extension
"name": "[concat(parameters('vmName'),'/DAExtension')]", "apiVersion": "[variables('vmExtensionsApiVersion')]", "location": "[resourceGroup().location]",
- "dependsOn": [
- ],
+ "dependsOn": [],
"properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
- "type": "DependencyAgentWindows",
- "typeHandlerVersion": "9.10",
- "autoUpgradeMinorVersion": true
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "type": "DependencyAgentWindows",
+ "typeHandlerVersion": "9.10",
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
} } ],
The following JSON shows the schema for the Azure VM Dependency agent extension
| publisher | Microsoft.Azure.Monitoring.DependencyAgent | | type | DependencyAgentWindows | | typeHandlerVersion | 9.10 |
+| autoUpgradeMinorVersion | true |
+| settings | "enableAMA": "true" |
+
+> [!IMPORTANT]
+> Be sure to add `enableAMA` to your template if you're using Azure Monitor Agent; otherwise, the Dependency agent attempts to send data to the legacy Log Analytics agent.
## Template deployment You can deploy the Azure VM extensions with Azure Resource Manager templates. You can use the JSON schema detailed in the previous section in an Azure Resource Manager template to run the Azure VM Dependency agent extension during an Azure Resource Manager template deployment.
The following example assumes the Dependency agent extension is nested inside th
"[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]" ], "properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
- "type": "DependencyAgentWindows",
- "typeHandlerVersion": "9.10",
- "autoUpgradeMinorVersion": true
- }
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "type": "DependencyAgentWindows",
+ "typeHandlerVersion": "9.10",
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
+ }
} ```
When you place the extension JSON at the root of the template, the resource name
"[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]" ], "properties": {
- "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
- "type": "DependencyAgentWindows",
- "typeHandlerVersion": "9.10",
- "autoUpgradeMinorVersion": true
+ "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
+ "type": "DependencyAgentWindows",
+ "typeHandlerVersion": "9.10",
+ "autoUpgradeMinorVersion": true,
+ "settings": {
+ "enableAMA": "true"
+ }
} } ```
To enable automatic extension upgrade for an extension, you must ensure the prop
When automatic extension upgrade is enabled on a VM or VM scale set, the extension is upgraded automatically whenever the extension publisher releases a new version for that extension. The upgrade is applied safely following availability-first principles as described [here](../automatic-extension-upgrade.md#how-does-automatic-extension-upgrade-work).
-The `enableAutomaticUpgrade` attribute's functionality is different from that of the `autoUpgradeMinorVersion`. The `autoUpgradeMinorVersion` attributes does not automatically trigger a minor version update when the extension publisher releases a new version. The `autoUpgradeMinorVersion` attribute indicates whether the extension should use a newer minor version if one is available at deployment time. Once deployed, however, the extension will not upgrade minor versions unless redeployed, even with this property set to true.
+The `enableAutomaticUpgrade` attribute's functionality is different from that of `autoUpgradeMinorVersion`. The `autoUpgradeMinorVersion` attribute doesn't automatically trigger a minor version update when the extension publisher releases a new version. It only indicates whether the extension should use a newer minor version if one is available at deployment time. Once deployed, the extension won't upgrade minor versions unless redeployed, even with this property set to true.
To keep your extension version updated, we recommend using `enableAutomaticUpgrade` with your extension deployment.
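As with the Linux extension, a minimal sketch of the Windows extension's `properties` block with both upgrade attributes set (values mirror the schema shown earlier):

```json
"properties": {
    "publisher": "Microsoft.Azure.Monitoring.DependencyAgent",
    "type": "DependencyAgentWindows",
    "typeHandlerVersion": "9.10",
    "autoUpgradeMinorVersion": true,
    "enableAutomaticUpgrade": true,
    "settings": {
        "enableAMA": "true"
    }
}
```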
virtual-machines Image Builder Json https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-json.md
description: Learn how to create a Bicep file or ARM template JSON template to u
Previously updated : 07/17/2023 Last updated : 09/18/2023
The `PowerShell` customizer supports running PowerShell scripts and inline comma
"name": "<name>", "scriptUri": "<path to script>", "runElevated": <true false>,
+ "runAsSystem": <true false>,
"sha256Checksum": "<sha256 checksum>" }, {
The `PowerShell` customizer supports running PowerShell scripts and inline comma
"name": "<name>", "inline": "<PowerShell syntax to run>", "validExitCodes": [<exit code>],
- "runElevated": <true or false>
+ "runElevated": <true or false>,
+ "runAsSystem": <true or false>
} ] ```
customize: [
name: '<name>' scriptUri: '<path to script>' runElevated: <true or false>
+ runAsSystem: <true or false>
sha256Checksum: '<sha256 checksum>' } {
customize: [
inline: '<PowerShell syntax to run>' validExitCodes: [<exit code>] runElevated: <true or false>
+ runAsSystem: <true or false>
} ] ```
Customize properties:
- **inline** – Inline commands to be run, separated by commas. - **validExitCodes** – Optional, valid exit codes that the script or inline command can return. Listing a code here prevents the build from reporting a failure when the script or inline command returns it. - **runElevated** – Optional, boolean, support for running commands and scripts with elevated permissions.
+- **runAsSystem** – Optional, boolean, determines whether the PowerShell script runs as the System user (see the sketch after this list).
- **sha256Checksum** - generate the SHA256 checksum of the file locally, update the checksum value to lowercase, and Image Builder will validate the checksum during the deployment of the image template. To generate the sha256Checksum, use the [Get-FileHash](/powershell/module/microsoft.powershell.utility/get-filehash) cmdlet in PowerShell.
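To make `runAsSystem` concrete, here's a minimal sketch of an inline PowerShell customizer that runs elevated as the System user. The customizer name, commands, and output path are illustrative assumptions, not taken from the article:

```json
{
    "type": "PowerShell",
    "name": "runAsSystemExample",
    "inline": [
        "New-Item -ItemType Directory -Path 'C:\\buildArtifacts' -Force",
        "whoami | Out-File 'C:\\buildArtifacts\\whoami.txt'"
    ],
    "validExitCodes": [0],
    "runElevated": true,
    "runAsSystem": true
}
```

Because `runAsSystem` switches execution to the System account, it's paired here with `runElevated: true`; if the setting takes effect, the captured `whoami` output reads `nt authority\system`.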
virtual-machines Image Builder Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-troubleshoot.md
description: This article helps you troubleshoot common problems and errors you
Previously updated : 07/31/2023 Last updated : 09/18/2023
Increase the build VM size.
#### Warning ```text
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT Build 'azure-arm' errored: Future#WaitForCompletion: context has been cancelled: StatusCode=200 -- Original Error: context deadline exceeded
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR ==> Some builds didn't complete successfully and had errors:
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR 2020/04/30 22:29:23 machine readable: azure-arm,error []string{"Future#WaitForCompletion: context has been cancelled: StatusCode=200 -- Original Error: context deadline exceeded"}
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT ==> Some builds didn't complete successfully and had errors:
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR ==> Builds finished but no artifacts were created.
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT --> azure-arm: Future#WaitForCompletion: context has been cancelled: StatusCode=200 -- Original Error: context deadline exceeded
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR 2020/04/30 22:29:23 Cancelling builder after context cancellation context canceled
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR 2020/04/30 22:29:23 [INFO] (telemetry) Finalizing.
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT ==> Builds finished but no artifacts were created.
-[a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER ERR 2020/04/30 22:29:24 waiting for all plugin processes to complete...
-Done exporting Packer logs to Azure for Packer prefix: [a170b40d-2d77-4ac3-8719-72cdc35cf889] PACKER OUT
+[<log_id>] PACKER 2023/09/14 19:01:18 ui: Build 'azure-arm' finished after 3 minutes 13 seconds.
+[<log_id>] PACKER 2023/09/14 19:01:18 ui:
+[<log_id>] PACKER ==> Wait completed after 3 minutes 13 seconds
+[<log_id>] PACKER 2023/09/14 19:01:18 ui:
+[<log_id>] PACKER ==> Builds finished but no artifacts were created.
+[<log_id>] PACKER 2023/09/14 19:01:18 [INFO] (telemetry) Finalizing.
+[<log_id>] PACKER 2023/09/14 19:01:19 waiting for all plugin processes to complete...
+[<log_id>] PACKER 2023/09/14 19:01:19 /aib/packerInput/packer-plugin-azure: plugin process exited
+[<log_id>] PACKER 2023/09/14 19:01:19 /aib/packerInput/packer: plugin process exited
+[<log_id>] PACKER 2023/09/14 19:01:19 /aib/packerInput/packer: plugin process exited
+[<log_id>] PACKER 2023/09/14 19:01:19 /aib/packerInput/packer: plugin process exited
+[<log_id>] PACKER Done exporting Packer logs to Azure Storage.
```
+#### Solution
+
+You can safely ignore this warning.
+
+### Skipping image creation
+
+#### Warning
+
+```text
+[<log_id>] PACKER 2023/09/14 19:00:18 ui: ==> azure-arm: -> Snapshot ID : '/subscriptions/<subscription_id>/resourceGroups/<resourcegroup_name>/providers/Microsoft.Compute/snapshots/<snapshot_name>'
+[<log_id>] PACKER 2023/09/14 19:00:18 ui: ==> azure-arm: Skipping image creation...
+[<log_id>] PACKER 2023/09/14 19:00:18 ui: ==> azure-arm:
+[<log_id>] PACKER ==> azure-arm: Deleting individual resources ...
+[<log_id>] PACKER 2023/09/14 19:00:18 packer-plugin-azure plugin: 202
+```
+ #### Solution You can safely ignore this warning.
virtual-network Associate Public Ip Address Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/associate-public-ip-address-vm.md
# Associate a public IP address to a virtual machine
-In this article, you learn how to associate a public IP address to an existing virtual machine (VM). To do so, you associate the public IP address to an IP configuration of a network interface attached to a VM. You can use the [Azure portal](#azure-portal), the [Azure CLI](#azure-cli), or [Azure PowerShell](#azure-powershell).
+In this article, you learn how to associate a public IP address to an existing virtual machine (VM). To do so, you associate the public IP address to an IP configuration of a network interface attached to a VM. You can use the Azure portal, the Azure CLI, or Azure PowerShell by selecting the tab for the method you want to use.
If you want to instead create a new VM with a public IP address, you can use the [Azure portal](virtual-network-deploy-static-pip-arm-portal.md), the [Azure CLI](virtual-network-deploy-static-pip-arm-cli.md), or [Azure PowerShell](virtual-network-deploy-static-pip-arm-ps.md).
Public IP addresses have a nominal fee. For details, see [pricing](https://azure
- An Azure account with an active subscription. You can [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-## Azure portal
+# [Azure portal](#tab/azure-portal)
1. Sign in to the [Azure portal](https://portal.azure.com).
Public IP addresses have a nominal fee. For details, see [pricing](https://azure
> [!NOTE] > Public IP addresses are associated to the IP configurations for a network interface. In this screenshot, the network interface has only one IP configuration. If the network interface had multiple IP configurations, they would all appear in the list, and you'd select the IP configuration that you want to associate the public IP address to.
-1. Select **Associate**, then select **Public IP address** to choose an existing public IP address from the drop-down list. If no public IP addresses are listed, you need to create one. To learn how, see [Create a public IP address](virtual-network-public-ip-address.md#create-a-public-ip-address).
+1. In the **Edit IP configuration** window, select **Associate public IP address**, then select **Public IP address** to choose an existing public IP address from the drop-down list. If no public IP addresses are listed, you need to create one. To learn how, see [Create a public IP address](virtual-network-public-ip-address.md#create-a-public-ip-address).
- :::image type="content" source="./media/associate-public-ip-address-vm/choose-public-ip-address.png" alt-text="Screenshot showing how to select and associate an existing public I P.":::
+ :::image type="content" source="./media/associate-public-ip-address-vm/choose-public-ip-address.png" alt-text="Screenshot showing how to select, create, and associate a new public IP address.":::
-1. Select **Save**, and then close the IP configuration window.
-
- :::image type="content" source="./media/associate-public-ip-address-vm/enable-public-ip-address.png" alt-text="Screenshot showing the selected public I P.":::
> [!NOTE] > The public IP addresses that appear in the drop-down list are those that exist in the same region as the VM. If you have multiple public IP addresses created in the region, all will appear here. Any address that's already associated to a different resource is grayed out.
-1. From the **Network interface** window, view the public IP address assigned to the IP configuration. It might take a few seconds for a newly associated IP address to appear.
+1. Select **Save**.
+
+1. In the **IP Configurations** window, view the public IP address assigned to the IP configuration. It might take a few seconds for a newly associated IP address to appear.
:::image type="content" source="./media/associate-public-ip-address-vm/view-assigned-public-ip-address.png" alt-text="Screenshot showing the newly assigned public I P.":::
Public IP addresses have a nominal fee. For details, see [pricing](https://azure
1. Open the necessary ports in your security groups by adjusting the security rules in the network security groups. For information, see [Allow network traffic to the VM](#allow-network-traffic-to-the-vm).
-## Azure CLI
+# [Azure CLI](#tab/azure-cli)
Install the [Azure CLI](/cli/azure/install-azure-cli?toc=%2fazure%2fvirtual-network%2ftoc.json) on your machine, or use Azure Cloud Shell. Cloud Shell is a free Bash shell that you can run directly within the Azure portal. It includes the Azure CLI preinstalled and configured to use with your Azure account. Select the **Open Cloudshell** button in the Azure CLI code examples that follow. When you select **Open Cloudshell**, Cloud Shell loads in your browser and prompts you to sign into your Azure account.
Install the [Azure CLI](/cli/azure/install-azure-cli?toc=%2fazure%2fvirtual-netw
1. Open the necessary ports in your security groups by adjusting the security rules in the network security groups. For information, see [Allow network traffic to the VM](#allow-network-traffic-to-the-vm).
-## Azure PowerShell
+# [Azure PowerShell](#tab/azure-powershell)
Install [Azure PowerShell](/powershell/azure/install-azure-powershell) on your machine, or use Cloud Shell. Cloud Shell is a free Bash shell that you can run directly within the Azure portal. It includes Azure PowerShell preinstalled and configured to use with your Azure account. Select the **Open Cloudshell** button in the Azure PowerShell code examples that follow. When you select **Open Cloudshell**, Cloud Shell loads in your browser and prompts you to sign into your Azure account.
Install [Azure PowerShell](/powershell/azure/install-azure-powershell) on your m
1. Open the necessary ports in your security groups by adjusting the security rules in the network security groups. For information, see [Allow network traffic to the VM](#allow-network-traffic-to-the-vm). ++ ## Allow network traffic to the VM Before you can connect to a public IP address from the internet, you must open the necessary ports in your security groups. These ports must be open in any network security group that you might have associated to the network interface, the subnet of the network interface, or both. Although security groups filter traffic to the private IP address of the network interface, after inbound internet traffic arrives at the public IP address, Azure translates the public address to the private IP address. Therefore, if a network security group prevents the traffic flow, the communication with the public IP address fails.
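For illustration, a minimal sketch of an inbound NSG security rule that would admit RDP traffic follows; the rule name, priority, and port are placeholder assumptions:

```json
{
    "name": "AllowRdpInbound",
    "properties": {
        "priority": 300,
        "direction": "Inbound",
        "access": "Allow",
        "protocol": "Tcp",
        "sourceAddressPrefix": "*",
        "sourcePortRange": "*",
        "destinationAddressPrefix": "*",
        "destinationPortRange": "3389"
    }
}
```

A rule like this, with the appropriate port, must exist in every network security group on the path before connections to the public IP address succeed.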
virtual-network Create Public Ip Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/create-public-ip-portal.md
Follow these steps to create a public IPv4 address with a Standard SKU named myS
> [!NOTE] > In regions with [availability zones](../../availability-zones/az-overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json#availability-zones), you have the option to select **No Zone** (default), a specific zone, or **Zone-redundant**. The choice depends on your specific domain failure requirements. In regions without availability zones, this field doesn't appear.
-You can associate the public IP address you created with a Windows or Linux [virtual machine](../../virtual-machines/overview.md). For more information, see [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md#azure-cli). You can also associate a public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer front-end configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
+You can associate the public IP address you created with a Windows or Linux [virtual machine](../../virtual-machines/overview.md). For more information, see [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md). You can also associate a public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer front-end configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
# [**Basic SKU**](#tab/option-1-create-public-ip-basic)
Follow these steps to create a public IPv4 address with a Standard SKU and routi
> [!NOTE] > In regions with [availability zones](../../availability-zones/az-overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json#availability-zones), you have the option to select **No Zone** (default), a specific zone, or **Zone-redundant**. The choice depends on your specific domain failure requirements. In regions without availability zones, this field doesn't appear.
-You can associate the public IP address you created with a Windows or Linux [virtual machine](../../virtual-machines/overview.md). For more information, see [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md#azure-cli). You can also associate a public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer front-end configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
+You can associate the public IP address you created with a Windows or Linux [virtual machine](../../virtual-machines/overview.md). For more information, see [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md). You can also associate a public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer front-end configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
# [**Tier**](#tab/option-1-create-public-ip-tier)
virtual-network Create Public Ip Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/create-public-ip-template.md
To use a standard global public IPv4 address, the template section should look s
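As a rough sketch of what such a section can contain (the API version and resource name are assumptions, not taken from the elided template):

```json
{
    "type": "Microsoft.Network/publicIPAddresses",
    "apiVersion": "2022-07-01",
    "name": "myStandardGlobalPublicIP",
    "location": "eastus",
    "sku": {
        "name": "Standard",
        "tier": "Global"
    },
    "properties": {
        "publicIPAllocationMethod": "Static",
        "publicIPAddressVersion": "IPv4"
    }
}
```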
For more information on the public IP properties listed in this article, see [Manage public IP addresses](virtual-network-public-ip-address.md#create-a-public-ip-address). ## Next steps-- Associate a [public IP address to a Virtual Machine](./associate-public-ip-address-vm.md#azure-portal)
+- Associate a [public IP address to a Virtual Machine](./associate-public-ip-address-vm.md)
- Learn more about [public IP addresses](public-ip-addresses.md#public-ip-addresses) in Azure. - Learn more about all [public IP address settings](virtual-network-public-ip-address.md#create-a-public-ip-address).
virtual-network Routing Preference Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/routing-preference-cli.md
az network public-ip create \
> [!NOTE] > Currently, routing preference only supports IPv4 public IP addresses.
-You can associate the above created public IP address with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. Use the CLI section on the tutorial page: [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md#azure-cli) to associate the Public IP to your VM. You can also associate the public IP address created above with with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md), by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
+You can associate the public IP address you created above with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. To associate the public IP address with your VM, use the CLI section of the tutorial [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md). You can also associate the public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
## Next steps
virtual-network Routing Preference Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/routing-preference-portal.md
If you don't have an Azure subscription, create a [free account](https://azure.m
> [!NOTE] > Public IP addresses are created with an IPv4 or IPv6 address. However, routing preference currently only supports IPv4.
-You can associate the above created public IP address with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. Use the CLI section on the tutorial page: [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md#azure-cli) to associate the public IP to your VM. You can also associate the public IP address created above with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md), by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
+You can associate the public IP address you created above with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. To associate the public IP address with your VM, use the CLI section of the tutorial [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md). You can also associate the public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
## Next steps - Learn more about [public IP with routing preference](routing-preference-overview.md).
virtual-network Routing Preference Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/routing-preference-powershell.md
$publicIp = New-AzPublicIpAddress `
-IpAddressVersion IPv4 ```
-You can associate the above created public IP address with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. Use the CLI section on the tutorial page: [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md#azure-cli) to associate the Public IP to your VM. You can also associate the public IP address created above with with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md), by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
+You can associate the public IP address you created above with a [Windows](../../virtual-machines/windows/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) or [Linux](../../virtual-machines/linux/overview.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machine. To associate the public IP address with your VM, use the CLI section of the tutorial [Associate a public IP address to a virtual machine](./associate-public-ip-address-vm.md). You can also associate the public IP address with an [Azure Load Balancer](../../load-balancer/load-balancer-overview.md) by assigning it to the load balancer **frontend** configuration. The public IP address serves as a load-balanced virtual IP address (VIP).
## Clean up resources
virtual-network Service Tags Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/service-tags-overview.md
By default, service tags reflect the ranges for the entire cloud. Some service t
| **ChaosStudio** | Azure Chaos Studio. <br/><br/>**Note**: If you have enabled Application Insights integration on the Chaos Agent, the AzureMonitor tag is also required. | Both | No | Yes | | **CognitiveServicesFrontend** | The address ranges for traffic for Azure AI services frontend portals. | Both | No | Yes | | **CognitiveServicesManagement** | The address ranges for traffic for Azure AI services. | Both | No | Yes |
-| **DataFactory** | Azure Data Factory | Both | No | Yes |
+| **DataFactory** | Azure Data Factory | Both | Yes | Yes |
| **DataFactoryManagement** | Management traffic for Azure Data Factory. | Outbound | No | Yes | | **Dynamics365ForMarketingEmail** | The address ranges for the marketing email service of Dynamics 365. | Both | Yes | Yes | | **Dynamics365BusinessCentral** | This tag or the IP addresses covered by this tag can be used to restrict access from/to the Dynamics 365 Business Central Services. | Both | No | Yes |